
Agricultural and Biosystems Engineering Publications, Agricultural and Biosystems Engineering

2014

Automated Tracking and Behavior Quantification of Laying Hens Using 3D Computer Vision and Radio Frequency Identification Technologies

Akash D. Nakarmi, Iowa State University, [email protected]
Lie Tang, Iowa State University, [email protected]
Hongwei Xin, Iowa State University, [email protected]

Part of the Agriculture Commons, Bioresource and Agricultural Engineering Commons, and the Poultry or Avian Science Commons.

The complete bibliographic information for this item can be found at http://lib.dr.iastate.edu/abe_eng_pubs/612. For information on how to cite this item, please visit http://lib.dr.iastate.edu/howtocite.html.



Transactions of the ASABE

Vol. 57(5): 1455-1472 © 2014 American Society of Agricultural and Biological Engineers ISSN 2151-0032 DOI 10.13031/trans.57.10505

AUTOMATED TRACKING AND BEHAVIOR QUANTIFICATION OF LAYING HENS USING 3D COMPUTER VISION AND RADIO FREQUENCY IDENTIFICATION TECHNOLOGIES

A. D. Nakarmi, L. Tang, H. Xin

ABSTRACT. Housing design and management schemes (e.g., bird stocking density) in egg production can impact hens' ability to perform natural behaviors and production economic efficiency. It is therefore of socio-economic importance to quantify the effects of such schemes on laying-hen behaviors, which may in turn have implications for the animals' well-being. Video recording and manual video analysis is the most common approach used to track and register laying-hen behaviors. However, such manual video analyses are labor intensive and prone to human error, and the number of target objects that can be tracked simultaneously is small. In this study, we developed a novel method for automated quantification of certain behaviors of individual laying hens in a group-housed setting (1.2 m × 1.2 m pen), such as locomotion, perching, feeding, drinking, and nesting. Image processing techniques were employed on top-view images captured with a state-of-the-art time-of-flight (ToF) of light based 3D vision camera for identification as well as tracking of individual birds in the group, with support from a passive radio-frequency identification (RFID) system. Each hen was tagged with a unique RFID transponder attached to the lower part of her leg. An RFID sensor grid consisting of 20 antennas installed underneath the pen floor was used as a recovery system in situations where the imaging system failed to maintain identities of the birds. Spatial as well as temporal data were used to extract the aforementioned behaviors of each bird. To test the performance of the tracking system, we examined the effects of two stocking densities (2880 vs. 1440 cm2 hen-1) and two perching spaces (24.4 vs. 12.2 cm of perch per hen) on bird behaviors, corresponding to five hens vs. ten hens, respectively, in the 1.2 m × 1.2 m pen. The system was able to discern the impact of the physical environment (space allocation) on behaviors of the birds, with 95% agreement between the automated measurement and human labeling in tracking the movement trajectories of the hens. This system enables researchers to more effectively assess the impact of housing and/or management factors or health status on bird behaviors.

Keywords. 3D vision, Behavior monitoring, Laying hen, RFID, Stocking density, Tracking.

The spatial requirement for laying hens and its impact on their welfare remains one of the most debated topics among egg producers and advocates of animal welfare. With the 2012 European Union (EU) ban on conventional cages for laying hens and recent developments in the U.S., non-cage or alternative housing systems are likely to become more predominant (Zimmerman et al., 2006). The United Egg Producers (UEP) and consumer food chain McDonald's put forward welfare guidelines in 2000. The UEP guidelines recommended that cage floor space be increased over a five-year period, ending in 2008, from the U.S. industry standard of 348 cm2 hen-1 to a range of 432 to 555 cm2 hen-1 (UEP, 2000), whereas the McDonald's recommended welfare practices call for cage floor space of 465 cm2 hen-1 (McDonald's, 2000). The EU, on the other hand, recommended cage floor space for conventional cages to be 550 cm2 hen-1 until 2012 (Hy-Line, 2000). Without well-controlled, large-scale experiments, it is difficult to assert if and how increasing space allocation actually improves the welfare of laying hens. Different potential indicators of welfare status should be considered before the effect of stocking density (SD) can be assessed.

Researchers have explored many possible indicators of welfare and methods of measurement. Behavior is one such important indicator of animal welfare. Xin and Ikeguchi (2001) developed a measurement system to quantify feeding behavior of individual poultry in order to study effects of biophysical factors such as light, ration, noise, and thermal variables. Gates and Xin (2001) developed and tested algorithms for determining individual feeding statistics and pecking behavior from time-series recordings of feeder weight. Puma et al. (2001) developed an instrumentation system to study dynamic feeding and drinking behaviors of individual birds. Persyn et al. (2004) used the measurement system and computational algorithm developed by Xin and Ikeguchi (2001) to quantify feeding behaviors of pullets and laying hens with or without beak trimming.

Submitted for review in November 2013 as manuscript number SE 10505; approved for publication by the Structures & Environment Division of ASABE in July 2014.

The authors are Akash D. Nakarmi, Postdoctoral Fellow, Lie Tang, ASABE Member, Associate Professor, and Hongwei Xin, ASABE Fellow, Director of Egg Industry Center and Distinguished Professor, Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, Iowa. Corresponding authors: Lie Tang, 2346 Elings Hall, Ames, IA 50011; phone: 515-294-9778; e-mail: [email protected]; and Hongwei Xin, 1202 NSRIC, Ames, IA 50011; phone: 515-294-4240; e-mail: [email protected].



Cook et al. (2006) adapted and expanded the behavior measurement system and analytical algorithm developed by Persyn et al. (2004) to investigate stocking density effects on feeding behavior of group-housed laying hens.

Behavioral characteristics are usually evaluated using audiovisual tools by a human observer, which is time and labor intensive, subject to human judgment, and only applicable for a limited observation period (Abrahamsson, 1996). Quantification of animal behaviors, and hence animal welfare, in livestock using image processing brings specific problems. The appearance of animals varies according to their posture, which makes processing and interpretation of images difficult (Van der Stuyft et al., 1991). Researchers have used visual monitoring to study group behaviors of animals. Image processing techniques have been used to monitor the weight distribution in poultry flocks (De Wet et al., 2003; Chedad et al., 2003), the spatial distribution of pigs (Shao et al., 1998; Hu and Xin, 2000), and the trajectory of a flock of poultry (Vaughan et al., 2000). Monitoring the behavior of an individual animal within a group requires tracking of that animal. This problem can be alleviated by constraining the animal of interest so that it is in a standard position with no other animals around. This has been applied to pigs to monitor weight (Schofield et al., 1999) and back fat (Frost et al., 2004). Leroy et al. (2006) developed an automatic computer vision technique to track an individual laying hen and detect six different behavior phenotypes: standing, sitting, sleeping, grooming, scratching, and pecking. The system, however, monitored behaviors of individually caged hens. For freely moving animals, such as a group of laying hens in a cage or a pen, constraints are impractical.

Tracking multiple laying hens for behavior monitoring is a challenging task with interesting features from a computer vision perspective. Segmenting laying hens from the background can be difficult, as the litter floor on which the hens live can often be of similar intensity to that of their feathers. Laying hens tend to flock together, and because laying hens are not highly mobile animals, difficulty in separating individual hens can persist for a prolonged time. Conversely, certain hens may make sudden and quick moves, thereby creating a discontinuous trajectory, which can create difficulties in tracking as well.

The literature on classical multi-target tracking is based on the use of data association after foreground detection in the image. Uchida et al. (2000) proposed a robust method for tracking many pedestrians viewed from an upper oblique angle. They extracted individuals by background subtraction. When pedestrians overlapped one another, they tracked targets robustly based on their trajectories. However, the movement of poultry in group settings can be rather complex and random.

Computer vision has been applied to tracking animals. Sumpter et al. (1997) tracked a group of ducks at high frame rate. Sergeant et al. (1998) developed a poultry tracking system in which a camera was placed above the poultry. They detected poultry silhouettes based on color information and segmented the silhouettes using information on their contours. The identities of the animals between two subsequent images were maintained using a set of simple heuristics. These techniques were further enhanced as model-based tracking, which allows for more robust and accurate shape tracking, including locations on the animal body that are not detectable through image features (Tillett et al., 1997). Fujii et al. (2009) used a computer vision technique based on particle filters to track multiple laying hens. However, like other developed systems, their system was not able to track laying hens for a prolonged period of time, as the particle filters were not able to track the hens when they made sudden quick movements.

The objective of this study was to develop an automated tracking and behavior quantification system for individual hens housed in groups at different stocking densities (SDs). For experimental purposes, the hens were housed in groups of five or ten (SD5 and SD10, respectively), in which each hen was tracked and each hen's perching, nesting, feeding/drinking, and movement behaviors were monitored and delineated.

MATERIALS AND METHODS

The developed laying-hen tracking system consisted of hardware and software subsystems (fig. 1). The hardware subsystem consisted of a structural framework of the experimental pen, electronic devices (imaging system, RFID components, and communication modules), and a computer. The software subsystem consisted of a data acquisition component and an offline data processing component.

EXPERIMENTAL PEN DESIGN AND SETUP

A 1.2 m × 1.2 m pen was constructed to house multiple laying hens (fig. 2); 0.6 m and 1.2 m long feeders were attached outside the north sidewall when the hens were housed at SD5 and SD10, respectively. A water source (two nipple drinkers) was mounted on the inside of the south sidewall. A 1.2 m × 0.31 m nest box was placed just outside the east sidewall. Entrances (exits) to the nest box were located at the north and south sides. The nest box entrances were 15 cm above the floor. A perch was placed inside the pen 20 cm from the west wall and 25 cm above the floor. Sawdust was used as bedding material for the pen floor.

Figure 1. Schematic diagram of the laying-hen tracking and behavior monitoring system.


An identical pen was made to house hens before moving them into the test (primary) pen for data collection. Fluorescent lighting at an intensity of 10 to 12 lux in the open area and 1 to 2 lux in the nest box was on at 6:00 h and off at 22:00 h, i.e., a 16L:8D photoperiod. The resource allowance for the hens in the experiment is shown in table 1.

The laying hens used in this study were 32-week-old White Leghorns from aviary housing, weighing approximately 1.4 kg at procurement. A total of 15 hens were housed in groups of five and ten in two identical pens. First, five birds (SD5) were housed in the primary pen, and ten birds (SD10) were housed in the holding pen. After three days of data collection, five other birds from the holding pen were moved into the primary pen, and data were collected for three more days with ten hens in the primary pen (i.e., a total of 15 hens). The hens were acclimatized for at least five days between data collection periods. The hens were fed twice a day at 9:00 h and 17:00 h. Eggs were collected once a day at 17:00 h. The litter was cleaned every two weeks during the experiment.

RFID ANTENNA NETWORK DESIGN AND INTERFACING

A total of 20 antennas (RI-ANT-G02E-30, Texas Instruments, Dallas, Tex.) were used to create an antenna grid, with 18 antennas laid underneath the floor and the other two mounted beneath the entrances to the nest box. The 18 antennas on the floor were 30 cm apart on center. Due to their close proximity, the antennas severely interfered with one another. Walls wrapped with aluminum foil were created around each antenna to reduce the interference. However, this foil wrapping significantly reduced the readable range of each antenna, from 28 to 9 cm, which resulted in dead regions between antennas where the antennas did not detect any tags. Figure 3 shows the layout of the 18 antennas installed under the floor. The inner circles represent the readable range of the antennas during operation, and the dead regions are shown in black.

A 4-antenna cluster was created, which was then connected to an RFID reader (RI-STU-251B, Texas Instruments) via a 4-channel multiplexer (RI-MOD-TX8A, Texas Instruments). Five such clusters were created. Figure 3 shows the layout of the clusters, and figure 4 shows the interfacing of the clusters with other devices used in the RFID system. The communication protocol between the 4-channel multiplexers and the RFID readers was RS485. The readers were configured to work in a master/slave synchronization scheme, with the first reader working as the master and all others as slaves. This configuration allowed the system to read all 20 antennas in less than 0.5 s. With the use of five 4-channel multiplexers, five antennas (one from each 4-antenna cluster) could be read simultaneously.
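The resulting scan order can be sketched as follows; this is a minimal illustration of the cluster-parallel scheme, with cluster letters A through E from figure 3 and hypothetical antenna names standing in for the real wiring:

```python
# Five 4-antenna clusters (A-E); in each of four multiplexer rounds,
# one antenna per cluster is read simultaneously (names are hypothetical).
CLUSTERS = {c: [f"{c}{i}" for i in range(1, 5)] for c in "ABCDE"}

def scan_rounds():
    """Yield the groups of five antennas (one per cluster) read together."""
    for channel in range(4):                      # 4-channel multiplexer
        yield [antennas[channel] for antennas in CLUSTERS.values()]

for round_no, group in enumerate(scan_rounds(), start=1):
    print(f"round {round_no}: reading {group} in parallel")
```

Four such rounds cover all 20 antennas, which is consistent with the reported full-grid read time of under 0.5 s.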

The RFID readers were connected to serial-to-Ethernet servers (VESR901, B&B Electronics, Ottawa, Ill.) and finally interfaced to the computer using an off-the-shelf Ethernet hub.

Table 1. Resource allowance for hens in the experimental pen compared to conventional cage, aviary, and enriched colony houses (Hongwei Xin, personal communication, 5 November 2013).

Parameter                            Experimental         Conv.    Aviary   Enriched
                                     SD5       SD10       Cage              Colony
Wire mesh floor space (cm2 hen-1)    -         -          568      547      763
Litter floor space (cm2 hen-1)       2880      1440       -        516      -
Nest space (cm2 hen-1)               743.2     371.6      -        86       63
Perch space (cm hen-1)               24.4      12.2       -        15.2     17.7
Feed trough space (cm hen-1)         12.0      12.0       10.2     10.2     12.1
Nipple drinker (hens drinker-1)      2.5       5          6        8.9      7.5

Figure 2. Schematic and photograph of the experimental pen.


Each serial-to-Ethernet server was assigned a unique IP address. The communication protocol between the RFID readers and the serial-to-Ethernet servers was RS485, while TCP/IP was the Ethernet protocol used for interfacing the RFID clusters with the computer. Figure 5 shows the instrumentation used in the RFID network.

IMAGING DEVICE AND INTERFACING

A state-of-the-art 3D imaging sensor (CamCube3, PMDTec, Siegen, Germany) based on the time-of-flight (ToF) of light principle was adopted in this research. This sensor is robust to illumination conditions and is rapidly gaining popularity in agricultural applications. Nakarmi and Tang (2010, 2012) used this technology for sensing inter-plant spacing for corn at early growth stages. This 3D sensor proved to be particularly advantageous over conventional color-based cameras, as this study involved continuous tracking of laying hens during dark hours. In addition to intensity imaging, the camera provides distance or depth information, which is particularly useful when tracking objects of similar size, shape, and color. The camera was mounted ~1.85 m above the floor to cover the 1.2 m × 1.2 m pen area. The imaging sensor was connected to the computer using the USB 2.0 communication protocol.

SOFTWARE SUBSYSTEM

As previously stated, the software subsystem consisted of two components: data acquisition and offline data processing. The data acquisition component included two independently running threads: one for image acquisition and the other for RFID data acquisition.

Data Acquisition System
The images were captured for 18 h per day, with 10 h of light time and 8 h of dark time. Images were not captured while feeding the hens and collecting eggs from the pen. The hens were given enough time to settle down before image capture resumed. During the capture of each frame, tags read by the RFID sensor network were also recorded. The records were stored in the database and accessed later during the image processing phase to determine hen locations and identities.

Multithreaded programming allows a data acquisition system to handle multiple tasks simultaneously, and this technique was implemented in this application to ensure maximum data acquisition speed in reading the multiple RFID antennas. Multithreaded programming with uniquely configured device IPs enabled the computer to scan data from the RFID readers in different threads. The RFID readers kept transmitting RFID tag numbers to the TCP/IP socket, and the computer did not have to poll each RFID reader, which essentially enabled the system to operate at maximum sampling speed. The image acquisition thread acquired images at ~5 frames per second (fps). Each frame was sequentially numbered and stored in the user-specified file path.

The RFID data acquisition threads were run first, and we manually ensured that all the devices were working correctly. The image acquisition thread was then run. The main program thread then created a record for each RFID tag, which consisted of ImagePath (user-specified path where the images were stored), ImageNo (frame number), TagID (RFID tag number), AntennaID (RFID antenna that read the tag), and TimeStamp (time at which the frame was captured). The records were stored in the RFID data table in the database.
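A minimal sketch of this producer/consumer arrangement is shown below; the thread bodies, queue, and values are illustrative assumptions of the sketch, and only the record field names come from the system described above:

```python
import queue
import threading
import time

tag_queue = queue.Queue()     # RFID reader threads push (TagID, AntennaID) here

def rfid_listener(reader_id):
    """Stand-in for one reader thread; the real readers stream tag numbers
    over TCP/IP, so the main program never has to poll them."""
    for n in range(3):                          # placeholder for a socket loop
        tag_queue.put((f"TAG{reader_id}{n}", f"ANT{reader_id}"))
        time.sleep(0.05)

def records_for_frame(image_path, image_no):
    """Build one RFID record per tag read while the current frame was captured;
    the field names follow the RFID data table described above."""
    records = []
    while not tag_queue.empty():
        tag_id, antenna_id = tag_queue.get()
        records.append({"ImagePath": image_path, "ImageNo": image_no,
                        "TagID": tag_id, "AntennaID": antenna_id,
                        "TimeStamp": time.time()})
    return records

threads = [threading.Thread(target=rfid_listener, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(records_for_frame("/data/run1", image_no=0))
```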

Data Processing System
The offline data processing component primarily consisted of image processing algorithms. The images were read from the user-specified folder and were processed for hen detection. For each frame, the corresponding RFID data were retrieved from the RFID data table in the database. The centroid of a detected hen was used to locate the closest RFID antenna, which in turn was used to associate the hen with her corresponding RFID tag. For each processed frame, the system created a tracking record that consisted of ImagePath, ImageNo, HenID (1 through 5 for SD5, and 1 through 10 for SD10), TagID, CentroidX (x-coordinate of hen pixel mass), CentroidY (y-coordinate of hen pixel mass), MajorAxisLength (major axis of the ellipse fitted on the hen pixel mass), MinorAxisLength (minor axis of the ellipse fitted on the hen pixel mass), Heading (heading direction of the hen, 0° to 359°), and TimeStamp. The records were then stored in the Tracking data table in the database.

Figure 3. RFID antenna grid: (a) antenna layout with five clusters labeled A through E, and (b) 18 antennas installed under the floor.


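A minimal sketch of the centroid-to-antenna association follows, with hypothetical antenna coordinates standing in for the grid of figure 3:

```python
import numpy as np

# Hypothetical antenna centers (cm, pen coordinates) on the 30 cm grid;
# the real layout follows figure 3 rather than this regular arrangement.
ANTENNA_POS = {f"ANT{i}": (30 * (i % 4) + 15, 30 * (i // 4) + 15)
               for i in range(18)}

def nearest_antenna(centroid):
    """Return the antenna whose center is closest to a detected hen centroid,
    which is then used to look up the tag read by that antenna."""
    cx, cy = centroid
    return min(ANTENNA_POS,
               key=lambda a: np.hypot(ANTENNA_POS[a][0] - cx,
                                      ANTENNA_POS[a][1] - cy))

print(nearest_antenna((52.0, 40.0)))   # associates this centroid with 'ANT5'
```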

IMAGE PROCESSING ALGORITHM OVERVIEW

The images were subjected to a background subtraction method for foreground detection. The foreground image was filtered using an anisotropic diffusion filter, which essentially helped in enhancing object edges. The filtered image was then segmented using a modified watershed algorithm. Regions in close proximity were merged to form laying hens in the first frame. In subsequent frames, overlaps between the previously identified hen regions and currently segmented watershed regions were used to detect laying hens. As the frames were captured at 5 fps, between-frame movements of the hens were limited; therefore, the algorithm sufficiently tracked the hens.

Noise Reduction
In order to alleviate over-segmentation caused by noise contamination in the watershed transform, a filter is needed that can effectively reduce noise while preserving important edge information. Although linear filtering can reduce noise in an image, it usually causes blurring and possibly fusing of important edges. Perona and Malik (1990) and Gilboa et al. (2001) reported that diffusion filters were more effective in smoothing noise while preserving necessary edge information. In this study, a diffusion filter (Gilboa et al., 2001) was adopted to reduce the noise effect (fig. 6). The algorithm generalizes the linear and nonlinear scale spaces in a complex domain by combining the diffusion equation (eq. 1) with the simplified Schrödinger equation (eq. 2).

Figure 5. RFID system instrumentation.

Figure 4. Schematic of 4-antenna clusters with master/slave synchronization scheme between RFID readers.



The imaginary part of the fundamental solution for the linear complex diffusion is regarded as an edge detector (a smoothed second derivative), which helps preserve edge information while effectively removing noise (eq. 3):

$$I_t = c\,\Delta I, \qquad I\big|_{t=0} = I_0, \qquad c \in \mathbb{C} \tag{1}$$

where $I$ is the noise-free image describing the real scene ($I = I_R + iI_I$), $I_t$ is its time derivative, $I_0$ is the observed or initial image with some degradation due to noise, $c$ is the complex diffusion coefficient, and $\Delta$ is the Laplacian operator.

$$i\hbar\,\frac{\partial \psi}{\partial t} = -\frac{\hbar^2}{2m}\,\Delta\psi + V(x)\,\psi \tag{2}$$

where $\psi = \psi(t, x)$ is the wave function of a quantum particle, $m$ is the mass of the particle, $\hbar$ is Planck's constant, $V(x)$ is the external field potential, and $i = \sqrt{-1}$.

$$I_{R_t} = c_R I_{R_{xx}} - c_I I_{I_{xx}}, \qquad I_R\big|_{t=0} = I_0$$
$$I_{I_t} = c_I I_{R_{xx}} + c_R I_{I_{xx}}, \qquad I_I\big|_{t=0} = 0 \tag{3}$$

where $c = c_R + ic_I$ is the diffusion coefficient with real component $c_R = \cos\theta$ and imaginary component $c_I = \sin\theta$, and $I_{R_{xx}}$ and $I_{I_{xx}}$ are the second derivatives of $I_R$ and $I_I$, respectively.
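As an illustration, equation 3 can be time-stepped explicitly on a 2D image; the 2D Laplacian, step size, θ, and iteration count below are assumptions of this sketch rather than the authors' parameters:

```python
import numpy as np

def complex_diffusion(img, theta=np.pi / 30, dt=0.2, n_iter=20):
    """Explicit sketch of eq. 3: complex diffusion with c = cos(theta) +
    i*sin(theta). The imaginary channel approximates a smoothed second
    derivative, i.e., an edge map; the real channel is the denoised image."""
    c = np.cos(theta) + 1j * np.sin(theta)
    I = img.astype(np.complex128)             # I_R = img, I_I = 0 at t = 0
    for _ in range(n_iter):
        # Discrete 4-neighbor Laplacian (periodic boundaries for brevity).
        lap = (np.roll(I, 1, 0) + np.roll(I, -1, 0) +
               np.roll(I, 1, 1) + np.roll(I, -1, 1) - 4 * I)
        I = I + dt * c * lap
    return I.real, I.imag                     # smoothed image, edge detector

smoothed, edges = complex_diffusion(np.random.rand(64, 64))
```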

Foreground Detection and Gradient Computation
A background subtraction technique was used to detect foreground objects. An image of the pen was taken without laying hens in it and was subtracted from the image with hens to segment out the foreground. A median filter was used to eliminate smaller regions from the foreground image. Foreground pixels were grouped to form connected components. In the first frame, an area threshold was used to decide if a connected component contained one or multiple hens. For subsequent frames, the vicinity of each connected component was scanned to see if there were other hens around that region in the previous frame. Figure 7 shows the foreground objects detected after background subtraction.
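A compact sketch of this detection chain follows; the depth-difference threshold and minimum-area value are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def detect_foreground(depth, background, thresh=3.0, min_area=50):
    """Background subtraction on depth images, followed by median filtering
    and connected-component grouping (threshold and area are assumptions)."""
    diff = np.abs(depth.astype(float) - background.astype(float))
    mask = diff > thresh                            # foreground pixels
    mask = ndimage.median_filter(mask, size=3)      # suppress speckle
    labels, n = ndimage.label(mask)                 # connected components
    areas = ndimage.sum(mask, labels, range(1, n + 1))
    for lbl, area in enumerate(areas, start=1):     # drop small blobs
        if area < min_area:
            labels[labels == lbl] = 0
    return labels

labels = detect_foreground(np.random.rand(120, 120) * 10,
                           np.full((120, 120), 9.0))
```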

The components that were larger and could be formed from multiple hens were selected for further processing. In the next step, the Sobel gradient operator was used to compute a gradient magnitude image. The operator used two 3 × 3 kernels, which were convolved with the original image to calculate approximations of the derivatives in the horizontal (eq. 4) and vertical (eq. 5) directions, respectively.

Figure 7. Foreground detection: (a) background image, (b) image with laying hens, and (c) detected foreground objects.

Figure 6. Noise reduction: (a) original distance image, and (b) noise-reduced image after application of diffusion filter.


The resulting gradient approximations were then combined to compute the gradient magnitude (eq. 6):

$$G_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} * I \tag{4}$$

$$G_y = \begin{bmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{bmatrix} * I \tag{5}$$

$$G = \sqrt{G_x^2 + G_y^2} \tag{6}$$

where $G_x$ and $G_y$ are gradient approximations in the horizontal and vertical directions, respectively, $I$ is the original image, and $G$ is the gradient magnitude. The gradient magnitude image was further used for watershed transformation. Figure 8 shows the gradient magnitude computed on selected foreground objects that were then subjected to watershed transformation.
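Equations 4 through 6 translate directly into array code; this sketch uses SciPy's convolution rather than the authors' implementation:

```python
import numpy as np
from scipy import ndimage

# Sobel kernels from eqs. 4 and 5; eq. 6 combines the two responses
# into the gradient magnitude fed to the watershed step.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
KY = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]])

def gradient_magnitude(img):
    gx = ndimage.convolve(img.astype(float), KX)   # horizontal derivative
    gy = ndimage.convolve(img.astype(float), KY)   # vertical derivative
    return np.sqrt(gx**2 + gy**2)

G = gradient_magnitude(np.random.rand(64, 64))
```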

Foreground Segmentation
When multiple hens flocked together, it was challenging to separate them before they could be tracked. Edge-based segmentation methods require strong edge information for good segmentation results, which was not always the case when hens came in contact with each other, due to the texture of their feathers. Watershed transformation, on the other hand, works well in such situations but is plagued with slower computation and over-segmentation problems.

Watershed transformation is an image processing operation based on mathematical morphology that is analogous to rain falling on a landscape, with each drop flowing down the steepest path toward a body of water called the catchment basin. Classic watershed algorithms are based on successive complete scans of the image. At each step, all the pixels are scanned one after another in a predetermined order, generally with a progressive scan or an interlaced scan. These algorithms do not run in a fixed number of iterations, and the number of iterations is often very large. On the other hand, the fast watershed algorithm, proposed by Vincent and Soille (1991), is designed such that it does not require scanning the entire image at every iteration. Rather, it allows random access to the pixels of an image and direct access to the neighbors of a given pixel, thereby significantly increasing the efficiency. The fast watershed algorithm is summarized below.

Employing the previously described analogy, when a water drop flows down along a relief, it will flow into the regional minimum. The Vincent and Soille (1991) watershed segmentation method is based on immersion simulations: starting from the lowest altitude, the water progressively fills the different catchment basins of the image.

Figure 8. Gradient computation: (a) foreground objects, (b) objects selected for gradient computation based on size, and (c) gradient magnitude image. Top row is for hens at SD5, and bottom row is for hens at SD10.


Two steps are involved in the immersion algorithm: sorting and flooding. In the sorting step, the image pixels are sorted in ascending order according to their grayscale values, which enables direct access to the pixels at a certain gray level. The minimum and maximum grayscale values ($h_{\min}$ and $h_{\max}$, respectively) are also computed. In the flooding step, the algorithm progressively floods the catchment basins of the image. The algorithm is composed of fast computation of geodesic influence zones and breadth-first scanning of all pixels in order of altitude (their grayscale values), thereby assigning a distinct label to each minimum and its associated catchment basin. This process is implemented level by level using a FIFO (first in, first out) queue of pixels. The output is an image demarcated by the labels of the catchment basins. A dam is built to prevent the basins from merging when two floods originating from different catchment basins meet.

Let $I: D_I \to \mathbb{N}$ be a grayscale image, with $h_{\min}$ and $h_{\max}$ the minimum and maximum gray levels, respectively. Starting at the gray level $h = h_{\min}$, the catchment basins with the minima of $I$ are successively expanded up until $h = h_{\max}$. Let $X_h$ denote the union of the set of catchment basins computed at level $h$. A connected component of the threshold set $T_{h+1}$ at level $h + 1$ can either be a new minimum or an extension of a catchment basin in $X_h$. In the latter case, the geodesic influence zone of $X_h$ within $T_{h+1}$, $IZ_{T_{h+1}}(X_h)$, is computed, resulting in an update $X_{h+1}$. Let $\mathrm{MIN}_h$ denote the union of all regional minima at altitude $h$. The recursive algorithm explained above is defined in equations 7 and 8:

$$X_{h_{\min}} = \{\, p \in D_I \mid I(p) = h_{\min} \,\} = T_{h_{\min}} \tag{7}$$

$$X_{h+1} = \mathrm{MIN}_{h+1} \cup IZ_{T_{h+1}}(X_h), \qquad h \in [h_{\min}, h_{\max}) \tag{8}$$

The watershed transform of $I$, $W(I)$, is the complement of $X_{h_{\max}}$ in $D_I$, i.e., the set of points of $D_I$ that do not belong to any catchment basin, and is given by equation 9:

$$W(I) = D_I \setminus X_{h_{\max}} \tag{9}$$

According to recursive equations 7 and 8, at level $h + 1$ all non-basin pixels (i.e., all pixels in $T_{h+1}$ except those in $X_h$) are potential candidates to be assigned to a catchment basin in step $h + 1$. Therefore, the pixels with gray level $h' \le h$ that are not yet part of a basin after processing level $h$ are merged with some basin at the higher level $h + 1$.

Pixels that, in a given iteration, are equidistant to at least the two nearest basins may provisionally be labeled as "watershed" pixels. However, in the next iteration, this label may change again. A definitive labeling of a pixel as a "watershed" pixel can only happen after all levels have been processed. Figure 9 shows watershed transformation results, with watershed lines in black and catchment basins in color.

Figure 9. Segmentation using watershed transformation: (a) gradient magnitude image, and (b) image after watershed transformation, with watershed lines in black and catchment basins in color. Top row is for hens at SD5, and bottom row is for hens at SD10.
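As a sketch only, the segmentation step can be reproduced with an off-the-shelf watershed routine standing in for the authors' Vincent-Soille implementation; the minimum-filter marker seeding below is an assumption, not the paper's method:

```python
import numpy as np
from scipy import ndimage
from skimage.segmentation import watershed

def segment_foreground(gradient, fg_mask):
    """Flood the gradient-magnitude image from its regional minima,
    restricted to the detected foreground (cf. Vincent and Soille, 1991)."""
    minima = gradient == ndimage.minimum_filter(gradient, size=5)
    markers, _ = ndimage.label(minima & fg_mask)        # one seed per minimum
    return watershed(gradient, markers, mask=fg_mask)   # labeled basins

gradient = np.random.rand(64, 64)
basins = segment_foreground(gradient, np.ones((64, 64), dtype=bool))
```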


The fast watershed algorithm still suffers from an over-segmentation problem. Therefore, the segmented watershed partitions need to be merged to form individual hen regions.

Vision-Based Tracking
After watershed transformation of the foreground image, in the very first frame, the regions in close proximity were merged to form laying hen regions. Large partitions were considered as probable hen regions, and merging of such partitions was avoided. Area and orientation information, along with mean height, were used during the process. In subsequent images, overlaps between the previously identified hen regions and watershed regions were used to merge regions and form individual hens. Because the images were acquired at ~5 fps, the relative movements of the hens in consecutive frames were limited, and the algorithm, in most cases, was able to track individual hens. When the hens made sudden quick movements, it was difficult to associate watershed regions with previously identified hens. In such situations, information from the RFID antenna network was used to recover the identities of lost birds. The RFID network was also used to recover hen identities when multiple hens were in the nest box and one hen exited the nest box. The vision system was unable to maintain hen identities in such situations. The system then maintained a separate data list to store information for the hens without identities. As soon as the RFID system picked up their tags and their identities were recovered, the corresponding information saved in the data list was merged into the main list that stored the tracking information. Figure 10 shows hens detected and identified in groups of 5 and 10, respectively. Figure 11 shows laying hens identified in different frames in groups of 5 and 10, respectively.
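The frame-to-frame correspondence described above reduces to a maximum-overlap assignment; a minimal sketch follows, assuming hens and watershed regions are encoded as integer label images (the full system additionally falls back on the RFID network):

```python
import numpy as np

def propagate_ids(prev_labels, curr_regions):
    """Assign each watershed region in the current frame to the previous hen
    whose pixel mask overlaps it most; unmatched regions keep no identity
    until the RFID network resolves them."""
    hen_of_region = {}
    for r in np.unique(curr_regions):
        if r == 0:                               # 0 encodes background
            continue
        overlap = prev_labels[curr_regions == r]
        overlap = overlap[overlap > 0]           # ignore background pixels
        hen_of_region[int(r)] = (int(np.bincount(overlap).argmax())
                                 if overlap.size else None)
    return hen_of_region

prev = np.zeros((6, 6), int); prev[1:4, 1:4] = 2   # hen 2 in previous frame
curr = np.zeros((6, 6), int); curr[2:5, 2:5] = 1   # region 1 in current frame
print(propagate_ids(prev, curr))                   # {1: 2}
```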

Hen Identity Recovery Using RFID Antenna Network
A passive RFID glass transponder (RI-TRP-WEHP-30, Texas Instruments) with a unique number was taped onto the lower part of each hen's leg (fig. 12). When a hen stood within the readable range of an antenna, the tag number was read and her approximate position was known based on the location of the antenna that read the tag. The RFID antenna network was therefore helpful in locating hens in situations when the visual tracking failed to correctly track them. The RFID antenna network was also used to track hens moving in and out of the nest box. When multiple hens were in the nest box and one of them appeared in the camera view, it was not possible to maintain the identity of that hen until her tag was read by one of the antennas.

EXPERIMENTS AND RESULTS

Figure 13 shows the error distribution between the manually located centroids (i.e., manual labeling) and those generated automatically by the tracking algorithm, where a total of 600 images from each stocking density were used for the comparison.

Figure 10. Hen detection and identification: (a) segmented hens and (b) hens labeled for tracking. Top row is at SD5; bottom row is at SD10.


Frames where the software detected a movement greater than 5 cm were used, which accounted for 95 centroids in the case of SD5 and 176 centroids in the case of SD10. From this distribution, 95% of the centroids lie within 4 pixels, i.e., less than 4 cm, of the manually selected centroids. It was also noted that the manual selection of centroids was expected to exhibit an error of ±3 pixels.

Figures 14 and 15 show comparisons of manually extracted trajectories and software-generated trajectories for laying hens housed in groups of 5 and 10, respectively. In the case of SD10, one of the hens was in the nest box throughout the image sequence. The filled circles represent the positions of the hens in the first frames, while the filled diamonds represent the positions in the last frames. It can be seen in these figures that the manual and software-generated trajectories closely resemble each other.

Visual tracking of the laying hens generally worked by simply using frame-to-frame correspondence based on the overlap of pixels between consecutive frames, given that the hens were correctly identified in the first frame. There were situations, however, when visual tracking was unable to maintain the identities of the hens. One such case is shown in figure 16, in which one of the hens exited the nest box and appeared on the pen floor. In this scenario, the visual tracking system did not know which hen, among those in the nest box, had exited, although it still kept track of the hen without identifying which hen it was. When the RFID system detected the tag attached to the hen, the visual tracking system recovered the hen's identity. The tracking data associated with the hen, between the frame when the system lost the hen's identity and the frame when the RFID system recovered it, were temporarily stored in a separate table in the database.

Figure 12. (a) RFID glass transponder and (b) hen with RFID glass transponder taped onto the lower part of her leg.

Figure 11. Laying hen identification at different times: (a) from left to right, five hens identified in frames 0, 1000, and 4000, respectively; and (b) ten hens identified in frames 0, 500, and 1000, respectively.


When the hen's identity was recovered, this series of data was moved into the main data table, where the tracking data for all the hens were correctly associated with their identities.

In another case, when certain hens made sudden quick movements, the visual system failed to keep track of the hens by simply using frame-to-frame correspondence based on overlapped pixels. In the scenario shown in figure 17, hen 6 appeared to make a sudden quick movement between two consecutive frames. When frame-to-frame correspondence was used, hen 8 was misidentified as hen 6. In this case, the RFID system was once again used to recover the identities of the hens, as it was able to read the tags attached to the hens at their correct positions.

The system was able to track individual hens and extract their behaviors, such as perching, nesting, feeding, drinking, and movement. The SD effect was examined by comparing behavioral data for the same 5 hens used at both SD levels. Figure 18 shows the time spent by the hens in the feeding area on different days. The graph clearly indicates that the hens (B1 to B5) spent more time in the feeding area when housed in a group of 5 than when housed in a group of 10.

Figures 19, 20, 21, and 22 show the time budgets of the hens' perching, nesting, feeding, and drinking behaviors, respectively. The shaded block along the horizontal axis indicates the dark hours of the day. Data collection started at 10:00 h and continued until 16:00 h. Between 16:00 h and 20:00 h, the feed trough was refilled, the water supply was checked, and eggs were collected from the nest box and also from the pen floor, if any. The next round of data collection ran from 20:00 h until 8:00 h the next morning. The feed trough was refilled and the water supply was checked before data collection started again at 10:00 h.

Figure 19 clearly shows that the hens spent more time on the perch at night than during the day.

Figure 13. Error distribution between manually extracted and software-detected centroids at (a) SD5 and (b) SD10.

Figure 14. Manually extracted vs. software-generated trajectories of laying hens at SD5.



Figure 15. Manually extracted vs. software-generated trajectories of laying hens at SD10.

Figure 16. Recovering hen identity when a hen exited the nest box: (a) in frame 401, hen 6 was about to enter the nest box; (b) in frame 402, hen 6 entered the nest box and was outside the camera view; (c) in frame 475, a hen whose identity was unknown exited the nest box; and (d) in frame 484, the hen was identified as hen 9.



The data also show that the hens spent 348 ±240 min hen-1 d-1 and 265 ±158 min hen-1 d-1 on the perch when housed at SD5 and SD10, respectively, presumably due to the available perch space. Similarly, figure 20 shows the time budget of nesting behavior. The hens spent more time in the nest box between 10:00 h and 11:00 h, and the time spent in the nest box slowly declined thereafter. It was observed that only 3 or 4 hens spent most of their time on the perch at night, while some hens spent the entire night in the nest box or on the floor despite having enough perch space. The data revealed that the hens spent 99 ±165 min hen-1 d-1 and 78 ±142 min hen-1 d-1 in the nest box when housed at SD5 and SD10, respectively.

Figure 21 shows the time budget of feeding behavior. The feeding behavior was consistent throughout the day, with nearly zero activity at night. The hens spent 87 ±21 min hen-1 d-1 and 60 ±17 min hen-1 d-1 in the feeding area when housed at SD5 and SD10, respectively. It should be noted that the time spent at the feeder, as determined by the data, does not necessarily represent the amount of time spent feeding. Similarly, as shown in figure 22, drinking behavior was consistent throughout the day and nearly zero at night. The hens spent 32 ±12 min hen-1 d-1 and 27 ±11 min hen-1 d-1 in the drinking area when housed at SD5 and SD10, respectively. Figure 23 shows the time budget of movement. The hens' movement averaged 499 ±236 m d-1 and 540 ±160 m d-1 when housed at SD5 and SD10, respectively.

Figure 24 shows a comparison between the distributions of movement by the hens housed at SD5 and SD10, filtered at 5 cm to ignore smaller movements, which could be the result of erroneous centroid extraction or body stretching (hence centroid shifting) without real locomotion. The total idle time and/or movements smaller than 5 cm accounted for about 96% and 97% of the time at SD5 and SD10, respectively. About 92% of the movements during the day (10 h d-1) were between 5 and 10 cm long when the hens were housed at SD5, and about 89% were between 5 and 10 cm long when the hens were housed at SD10.
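The filtering and binning behind this comparison can be sketched as follows, assuming per-frame centroid positions in centimeters; the bin edges follow the histogram bins of figure 24:

```python
import numpy as np

def movement_histogram(centroids_cm, min_step=5.0,
                       bins=(5, 7.5, 10, 12.5, 15, 20, 25, 50, 100, np.inf)):
    """Bin frame-to-frame displacements, ignoring steps under 5 cm that may
    reflect centroid noise or body stretching rather than locomotion."""
    steps = np.linalg.norm(np.diff(centroids_cm, axis=0), axis=1)
    steps = steps[steps >= min_step]          # drop idle/noise displacements
    counts, _ = np.histogram(steps, bins=np.asarray(bins))
    return counts

track = np.cumsum(np.random.randn(500, 2) * 4, axis=0)   # synthetic trajectory
print(movement_histogram(track))
```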

Figure 17. Maintaining hen identities when hens made a sudden quick movement: (a) frame 651, prior to hen 6 making a sudden quick movement; (b) in frame 652, hen 8 appeared at hen 6's position from frame 651. The identities of the hens were maintained with the RFID network.

Figure 18. Time spent at feeder by five hens (B1 to B5) on different days when housed at different stocking densities (SD5 and SD10).



The average travel speed was found to be 0.38 and 0.45 m s-1 at SD5 and SD10, respectively.

Figure 25 shows the average time spent by the hens performing different activities. The same 5 hens on average spent 32% and 25% of their time on the perch when housed in groups of 5 and 10, respectively. The difference presumably arose from the difference in available perch space. Similarly, the hens on average spent 9% and 7% of their time in the nest box, and 8% and 6% of their time in the feeding area, when housed in groups of 5 and 10, respectively. The hens spent 3% of their time in the drinking area at both stocking densities. For the rest of the time (48% and 60% at SD5 and SD10, respectively), the hens performed activities such as standing, walking, and sitting.

Figure 19. Perching behavior time budget for hens housed in groups of 5 or 10 (SD5 or SD10). Vertical bars represent standard errors.

Figure 20. Nesting behavior time budget for hens housed in groups of 5 or 10 (SD5 or SD10). Vertical bars represent standard errors.



Multi-regression statistical analysis with t-tests using R showed that the SD effect was significant on the perching behavior of the laying hens (p = 0.0023). The hens spent more time on the perch at SD5 (348 min) than at SD10 (265 min). This is not surprising because of the limited perch space at SD10. Similarly, the SD effect was prominent on time spent at the feeder (p < 0.0001): 87 min at SD5 and 60 min at SD10. On the other hand, the SD effect was insignificant on nesting or drinking behaviors (p = 0.3597 and 0.1366, respectively). The results also show that SD did not affect movement of the hens for the given floor space of 1.2 m × 1.2 m (p = 0.2422). It should be pointed out that it was not the purpose of this study to examine effects of resource allocation on hen behaviors or time budgets.

Figure 21. Feeding behavior time budget for hens housed in groups of 5 or 10 (SD5 or SD10). Vertical bars represent standard errors.

Figure 22. Drinking behavior time budget for hens housed in groups of 5 or 10 (SD5 or SD10). Vertical bars represent standard errors.



Rather, the two stocking densities were used to test the functionality of the automated tracking and behavior quantification system. Follow-up studies will be conducted to delineate such effects.

CONCLUSIONS

A sensor fusion approach to tracking laying hens housed in groups of 5 and 10 has been developed and tested. Due to the varying nature of the hens' appearance related to their posture, and their social behavior of performing activities in groups, detecting individual laying hens housed in groups was a challenging task. However, the image processing techniques developed based on the depth images were able to satisfactorily detect and identify individual hens, with occasional help from the developed RFID system when necessary. The developed tracking system was used to automatically extract behaviors, such as locomotion, perching, nesting, feeding, and drinking, of hens housed in groups of 5 and 10, thereby quantifying the effects of resource allocation (e.g., stocking density) on their behaviors. The system has been demonstrated to be capable of tracking and maintaining identities of individual hens, which is critical for extracting the time budgets of individual hen behaviors.

Figure 23. Movement time budget for hens housed in groups of 5 or 10 (SD5 or SD10). Vertical bars represent standard errors.

Figure 24. Comparison of movement by hens housed at 5 hens per group (SD5) or 10 hens per group (SD10).



This unique tracking system will enhance researchers' ability to examine the impact of physical and management factors on the behaviors and well-being of group-housed animals.

ACKNOWLEDGEMENTS

This project was supported in part by the USDA-NIFA Agriculture and Food Research Initiative Program (Grant Award 2011-67021-20223).

REFERENCES

Abrahamsson, P. (1996). Furnished cages and aviaries for laying hens: Effects on production, health, and use of facilities. Report 234. Uppsala, Sweden: Swedish University of Agricultural Sciences, Department of Animal Nutrition and Management.

Chedad, A., Aerts, J.-M., Vranken, E., Lippens, M., Zoons, J., & Berckmans, D. (2003). Do heavy broiler chickens visit automatic weighing systems less than lighter birds? British Poultry Sci., 44(5), 663-668. http://dx.doi.org/10.1080/00071660310001643633.

Cook, R. N., Xin, H., & Nettleton, D. (2006). Effects of cage stocking density on feeding behaviors of group-housed laying hens. Trans. ASABE, 49(1), 187-192.

De Wet, L., Vranken, E., & Berckmans, D. (2003). Computer-assisted image analysis to quantify daily growth rates of broiler chickens. British Poultry Sci., 44(4), 524-532. http://dx.doi.org/10.1080/00071660310001616192.

Frost, A. R., French, A. P., Tillett, R. D., Pridmore, T. P., & Welch, S. K. (2004). A vision-guided robot for tracking a live, loosely constrained pig. Comp. Elect. Agric., 44(2), 93-106. http://dx.doi.org/10.1016/j.compag.2004.03.003.

Fujii, T., Yokoi, H., Tada, T., Suzuki, K., & Tsukamoto, K. (2009). Poultry tracking system with camera using particle filters. In Proc. IEEE Intl. Conf. on Robotics and Biometrics (pp. 1888-1893). Piscataway, N.J.: IEEE.

Gates, R. S., & Xin, H. (2001). Comparative analysis of measurement techniques of feeding behavior of individual poultry. ASAE Paper No. 014033. St. Joseph, Mich.: ASAE.

Gilboa, G., Zeevi, Y. Y., & Sochen, N. A. (2001). Complex diffusion processes for image filtering. In Scale-Space and Morphology in Computer Vision: Lecture Notes in Computer Science 2106, 299-307.

Hu, J., & Xin, H. (2000). Image-processing algorithms for behavior analysis of group-housed pigs. Behavior Res. Methods Instrum. Comp., 32(1), 72-85. http://dx.doi.org/10.3758/BF03200790.

Hy-Line. (2000). Hy-Line Variety W-36 commercial management guide. West Des Moines, Iowa: Hy-Line International.

Leroy, T., Vranken, E., Van Brecht, A., Struelens, E., Sonck, B., & Berckmans, D. (2006). A computer vision method for on-line behavioral quantification of individually caged poultry. Trans. ASABE, 49(3), 795-802. http://dx.doi.org/10.13031/2013.20462.

McDonald's. (2000). Recommended welfare practices: Egg-laying hen guidelines. Oak Brook, Ill.: McDonald's Corporation.

Nakarmi, A. D., & Tang, L. (2010). Inter-plant spacing sensing at early growth stages using a time-of-flight of light based 3D vision sensor. ASABE Paper No. 1009216. St. Joseph, Mich.: ASABE.

Nakarmi, A. D., & Tang, L. (2012). Automatic inter-plant spacing sensing at early growth stages using a 3D vision sensor. Comp. and Elect. Agric., 82, 23-31. http://dx.doi.org/10.1016/j.compag.2011.12.011.

Perona, P., & Malik, J. (1990). Scale-space and edge detection using anisotropic diffusion. IEEE Trans. Pattern Anal. Machine Intel., 12(7), 629-639. http://dx.doi.org/10.1109/34.56205.

Persyn, K. E., Xin, H., Nettleton, D., Ikeguchi, A., & Gates, R. S. (2004). Feeding behaviors of laying hens with or without beak trimming. Trans. ASAE, 47(2), 591-596. http://dx.doi.org/10.13031/2013.16040.

Puma, M. C., Xin, H., Gates, R. S., & Burnham, D. H. (2001). An instrumentation system for studying feeding and drinking behavior of individual poultry. Appl. Eng. Agric., 17(3), 365-374.

Schofield, C. P., Marchant, J. A., White, R. P., Brandl, N., & Wilson, M. (1999). Monitoring pig growth using a prototype imaging system. J. Agric. Eng. Res., 72(3), 205-210. http://dx.doi.org/10.1006/jaer.1998.0365.

Sergeant, D., Boyle, R., & Forbes, M. (1998). Computer visual tracking of poultry. Comp. Elect. Agric., 21(1), 1-18. http://dx.doi.org/10.1016/S0168-1699(98)00025-8.

Shao, J., Xin, H., & Harmon, J. D. (1998). Comparison of image feature extraction for classification of swine thermal comfort behaviour. Comp. Elect. Agric., 19(3), 223-232. http://dx.doi.org/10.1016/S0168-1699(97)00048-3.

Figure 25. Time spent by hens performing different activities.



Sumpter, N., Boyle, R. D., & Tillett, R. (1997). Modelling collective animal behaviour using extended point-distribution models. Leeds, U.K.: University of Leeds, School of Computer Studies.

Tillett, R. D., Onyango, C. M., & Marchant, J. A. (1997). Using model-based image processing to track animal movements. Comp. Elect. Agric., 17(2), 249-261. http://dx.doi.org/10.1016/S0168-1699(96)01308-7.

Uchida, K., Miura, J., & Shirai, Y. (2000). Tracking multiple pedestrians in crowd. Proc. IAPR Workshop of Machine Vision Applications (pp. 533-536). International Association for Pattern Recognition.

UEP. (2000). Animal husbandry guidelines for U.S. egg-laying flocks. Atlanta, Ga.: United Egg Producers.

Van der Stuyft, E., Schofield, C. P., Randall, J. M., Wambacq, P., & Goedseels, V. (1991). Development and application of computer vision for use in livestock production. Comp. Elect. Agric., 6(3), 243-265. http://dx.doi.org/10.1016/0168-1699(91)90006-U.

Vaughan, R., Sumpter, N., Henderson, J., Frost, A., & Cameron, S. (2000). Experiments in automatic flock control. Robotics and Auton. Systems, 31(1-2), 109-117. http://dx.doi.org/10.1016/S0921-8890(99)00084-6.

Vincent, L., & Soille, P. (1991). Watersheds in digital spaces: An efficient algorithm based on immersion simulations. IEEE Trans. Pattern Anal. Machine Intel., 13(6), 583-598. http://dx.doi.org/10.1109/34.87344.

Xin, H., & Ikeguchi, A. (2001). Characterization of feeding behavior of growing broilers: A research report. Ames, Iowa: Iowa State University.

Zimmerman, P. H., Linderberg, A. C., Pope, S. J., Glen, E., Bolhuis, J. E., & Nicol, C. J. (2006). The effect of stocking density, flock size, and modified management on laying hen behavior and welfare in a non-cage system. Appl. Animal Behavior Sci., 101(1-2), 111-124. http://dx.doi.org/10.1016/j.applanim.2006.01.005.