
Chick feather pattern recognition

Y. Tao, Z. Chen and C.L. Griffis

Abstract: A crescent model is proposed for chick wing image processing and feather pattern recognition, thereby implementing chick sex separation by machine vision technology. The crescent shape delineates the region of interest in a wing image by an arc of large radius and an arc of small radius at two off-centred circles. Wing feathers are divergently distributed in the crescent region, manifesting as an oriented stripe pattern. Male chick feathers gradually change in length from short to long and then to short in accordance with the crescent envelope. Female chick feathers alternate the stripe lengths, following a long–short–long stripe pattern. Based on this knowledge, a chick feather pattern can be numerically characterised by a stripe length sequence and a stripe endpoint sequence. For pattern classification, the first-order differences of these two sequences are used. The mean value of the stripe endpoint difference sequence is the most efficient feature in male–female chick classification. Experimental results justified the model and feature selection strategy, and showed the feasibility of automatic chick sex separation.

1 Introduction

Chick wing feather patterns, which differ in sex as the result of genetic encoding, have been widely used for chick sex identification [1–3]. Research indicates that the two sexes of broiler chicks have different nutritional requirements [1] and sex segregation of baby chicks provides significant benefits to the poultry industry. Commonly, sex separation is accomplished by manually opening the wings of each chick and inspecting the feather patterns to determine the sex. A female chick’s wing consists of alternating long and short feathers with an uneven distribution of feather endpoints; in contrast, the feather lengths in a male chick’s wing gradually change from short to long and then to short, from one side to the other [1, 4]. Overall, the feathers are confined in a fan-like region in the wing image, modelled as a crescent shape in this paper. In terms of image recognition, a feather pattern is an oriented pattern [5], which bears some resemblance to a fingerprint [6], a vasculature [7], or a texture pattern [8]. However, the feather pattern has its peculiarities: (1) its region of interest (ROI) is confined to a crescent region; (2) its orientations are divergently distributed. Consequently, feather pattern recognition cannot be efficiently achieved by existing techniques commonly used for fingerprint identification and texture analysis [5–8]. In this work, we report a crescent model dedicated to chick feather pattern recognition, and thereby show the feasibility of automatic chick sex separation by machine vision technology.

Automatic feather-based sex separation involves chick wing image acquisition and feather recognition. Evans [1] and Jones et al. [2] used visible light to acquire wing feather images. Tao and Walker [4] proposed using near-ultraviolet light for wing image acquisition. Following image acquisition are tasks of image processing and pattern recognition. Since a local region in a divergent stripe pattern can be considered approximately as parallel stripes, directional filtering [6] can be used for image enhancement. For feature extraction, we calculate feather lengths and feather endpoints and collect these data in two sequences, hence numerically representing the feather-based chick sex separation knowledge that is used by human visual inspection. Based on the numerical features, chicks can be automatically classified into male and female classes.

2 Crescent model

The diagram for automatic chick sex separation is depicted in Fig. 1. It consists of an electro-optic imaging system for wing image acquisition. Baby chicks on the conveyer are illuminated with an ultraviolet light source, and their wing images are captured using a CCD camera that senses the near-ultraviolet spectrum. Since the feathers in a wing image acquired in ultraviolet lighting have strong contrast with respect to the background or surroundings, the feather patterns can easily be segmented using a simple thresholding operation. For pattern recognition, the feathers in a wing image form an oriented pattern [5] because of the local orientation of feather stripes. Therefore, directional filtering may enhance the image, as will be reported later. The simplest way to segment feather stripes in a wing image is through thresholding. In the binary image, feather stripes are characterised in terms of locations, orientations, and lengths. To characterise the partial fan-like region and the divergent feather stripes, we propose a crescent model. This model defines a crescent region that is enclosed by two off-centred arcs: an outer arc associated with a small radius (r) and an inner arc associated with a large radius (R), as shown in Fig. 2. In this figure, Fig. 2a and 2b illustrate the feather patterns for male and female chicks, respectively. The length of a feather stripe is confined by the crescent region, and the stripe orientation is divergently distributed

© IEE, 2004

IEE Proceedings online no. 20040730

doi: 10.1049/ip-vis:20040730

Y. Tao is with the Department of Biological Resource Engineering, University of Maryland, College Park, MD 20742, USA

Z. Chen was with the University of Arkansas and is now with the Department of Radiology, University of Rochester, Box 648, Rochester, NY 14642, USA

C.L. Griffis is with the Department of Biological and Agricultural Engineering, University of Arkansas, 203 Engineering Hall, Fayetteville, AR 72701, USA

Paper first received 6th January and in revised form 29th April 2004

IEE Proc.-Vis. Image Signal Process., Vol. 151, No. 5, October 2004 337

with respect to a common point (O) on the line connecting the two arc centres (O_r and O_R).

The crescent model is very convenient for chick wing image processing and feather pattern recognition. First, it serves as a guide to locate the region of interest (ROI) in a wing image. The ROI can be localised at fast speed using image decimation and multiresolution techniques [9, 10]. For efficient digital image analysis, it is convenient to crop the ROI in a bounding box big enough to contain all feathers with ample margins. Second, the crescent model describes the divergent orientation of feathers, resulting from the interception of radial lines, by a crescent region. Specifically, feather lengths and locations are confined by the two arcs of the crescent envelope, and feather orientation is determined by the location of the emanating centre. In general, this model captures the conspicuous features of chick feathers. In practice, the chick’s wing presentation to image capture cannot guarantee consistent and precise alignment, due to a living chick’s instinctive movements on the moving conveyer. Thus, image rotation is needed during wing image processing. Figure 2b, for example, shows the crescent shape of Fig. 2a rotated by θ. With this crescent model, the line connecting the two arc centres deviates from the vertical direction due to rotation.

Besides confining the feathers to the crescent region, the model also portrays the gradual change of feather orientation when sweeping from one side to the other in a wing image.

Each feather is a line segment within the crescent region, defined by an upside endpoint and a downside endpoint. The upside endpoint can be used as the reference point for measuring feather location and feather length, and is referred to as the feather endpoint. To extract conspicuous features for feather pattern recognition, we use only two feather parameters: feather length and feather endpoint coordinates. Figure 3a illustrates a wing image consisting of eight feathers with lengths {l_i, i = 1, 2, …, 8} and feather endpoint coordinates {(x_i, y_i), i = 1, 2, …, 8}. The simplest way to designate the feather sequence is based on the x coordinates of the feather endpoints. However, the feather sequence may be out of order due to image rotation resulting from chick movement. One case is illustrated in Fig. 3b, where the feather sequence appears to be {l1, l3, l2, l4, …, l8} since x3 < x2. If the image is appropriately rotated back, the correct order {l1, l2, l3, l4, …, l8} may be obtained. It would be time-consuming to find the rotation angle θ if we strictly followed the crescent model in Fig. 2b. In practice, the image rotation angle can be determined by a technique illustrated in Fig. 3b, which seeks a chord line for the crescent shape in the wing image. The image rotation is carried out as follows. First, we find the down-endpoint coordinates of the outermost feathers on both the left and right sides, for example (x1, y1) and (x8, y8) in Fig. 3b, which form a chord on the arcs. The image rotation angle φ is defined as the angle formed by the chord line and the x axis, i.e.

$$\phi = \arctan\frac{y_8 - y_1}{x_8 - x_1} \qquad (1)$$

Then, we define the rotation centre (O′) as the middle point of the chord. Finally, we rotate the image around the rotation centre by an angle −φ, a counteraction to remove the obliqueness of the wing image. In the rotated image, the feathers can be labelled by a sequence in the order of feather tip coordinates.
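The de-rotation step described above can be sketched in a few lines (a minimal NumPy sketch; the function names are ours, and arctan2 is used in place of arctan so the quadrant is handled automatically):

```python
import numpy as np

def chord_rotation_angle(p_left, p_right):
    """Angle phi of the chord joining the outermost feather down-endpoints,
    per (1): phi = arctan[(y8 - y1) / (x8 - x1)]."""
    (x1, y1), (x8, y8) = p_left, p_right
    return np.arctan2(y8 - y1, x8 - x1)

def derotate_points(points, p_left, p_right):
    """Rotate endpoint coordinates by -phi about the chord midpoint O',
    removing the obliqueness of the wing image."""
    phi = chord_rotation_angle(p_left, p_right)
    centre = (np.asarray(p_left, dtype=float) + np.asarray(p_right, dtype=float)) / 2.0
    c, s = np.cos(-phi), np.sin(-phi)
    rot = np.array([[c, -s], [s, c]])           # 2-D rotation matrix for -phi
    pts = np.asarray(points, dtype=float)
    return (pts - centre) @ rot.T + centre
```

After de-rotation the chord is horizontal, so sorting feather endpoints by their x coordinate yields the correct order {l1, l2, …, l8}.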

Concerning the crescent model for chick feather pattern recognition, we should point out that we do not need to determine the arcs and emanating centre of the crescent model from a feather pattern. The usefulness of the crescent model is that it suggests a ROI shape and an oriented stripe pattern.

Fig. 1 Diagram of feather-based chick sex separation system

Fig. 2 Crescent model for chick feather patterns
a Male chick feather pattern
b Female chick feather pattern

3 Feather image processing

With the digital image acquired by the electro-optical system in Fig. 1, our next task is to perform wing image processing and feather pattern recognition using the flowchart shown in Fig. 4.

The raw image from a CCD camera is a large image because the CCD captures the wing as only part of its field of view. With a large raw image, the immediate task is to locate the ROI and to crop it. Since feathers have high-contrast intensities against surroundings, it is easy to define the ROI by simple thresholding, with the threshold determined from the intensity histogram. In the binary image, the ROI of chick feathers assumes a crescent region, which can be cropped from the raw image using a bounding box. Fast ROI localisation can be implemented using image decimation and the ‘from presence to classification’ object-recognition model [9, 10]. It is convenient to define a bounding box that is large enough to enclose the ROI with an ample margin. With the ROI, we can determine the rotation angle using (1) and then perform the image rotation. In the rotated image, we update the ROI and then crop it. Henceforth, the cropped ROI represents the wing image that will be used for feather pattern recognition.
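This ROI-localisation step can be sketched as follows (a simplified illustration: the mean-plus-one-standard-deviation threshold and the margin value are our assumptions, not the paper's histogram-derived threshold):

```python
import numpy as np

def crop_roi(img, threshold=None, margin=8):
    """Binarise the image and crop the bounding box of the bright
    (feather) pixels with an ample margin. Returns the crop and the
    (row, col) offset of its top-left corner in the raw image."""
    if threshold is None:
        # Assumption: a simple global heuristic stands in for the
        # histogram-derived threshold used in the paper.
        threshold = img.mean() + img.std()
    mask = img > threshold
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    r0 = max(rows[0] - margin, 0)
    r1 = min(rows[-1] + margin + 1, img.shape[0])
    c0 = max(cols[0] - margin, 0)
    c1 = min(cols[-1] + margin + 1, img.shape[1])
    return img[r0:r1, c0:c1], (r0, c0)
```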

Since the feather pattern is an oriented pattern, it can be enhanced by directional filtering or directional smoothing [5, 6], which can be performed in either the space domain or the frequency domain. In the space domain, the filter is a digital window, and the filtering process is described by convolution between the wing image and the window. In accordance with the divergence of the feather stripes, we design directional filters as shown in Fig. 5a. Specifically, a bank of nine filters, {h_k, k = 1, 2, …, 9}, is used to accommodate the divergent orientation. We illustrate their digital implementation in a 7 × 7 window in Fig. 5b, where a digital filter is a 7 × 7 window containing a digital line. The pixels on a digital line are assigned the same non-zero value, or the pixel in question (at the centre) may be overweighted by a bigger value. To maintain image energy after convolution, it is necessary to normalise the filter, i.e. to ensure that the sum of the entries is 1. In digital geometry, the pixels constituting a digital line may populate in a ‘jagged’ manner, such as h3 (the filled pixels) in Fig. 5b.

Directional filtering with the directional filter bank is expressed by

$$g_i(x, y) = f(x, y) * h_i(x, y), \qquad i = 1, 2, \ldots, 9 \qquad (2)$$

and

$$\hat{f}(x, y) = \max\{g_i(x, y),\ i = 1, 2, \ldots, 9\} \qquad (3)$$

where ‘∗’ represents convolution and max{·} denotes selecting the maximum pixel value. Since directional filtering produces a bank of images in (2), it costs time and memory in image processing. In the pursuit of speed, we should reduce the number of filters and the window size.
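Equations (2) and (3) can be realised as sketched below; the digital-line construction only approximates the filter layout of Fig. 5b, and the uniform angle spacing is our assumption:

```python
import numpy as np
from scipy.ndimage import convolve

def line_filter(size, angle):
    """Normalised 'digital line' window: pixels nearest the line through
    the centre at the given angle get equal weight summing to 1."""
    h = np.zeros((size, size))
    c = size // 2
    d, s = np.cos(angle), np.sin(angle)
    for t in range(-c, c + 1):
        r = int(np.rint(c - t * s))   # row index (y grows downward)
        q = int(np.rint(c + t * d))   # column index
        h[r, q] = 1.0
    return h / h.sum()

def directional_enhance(img, n_filters=9, size=7):
    """f_hat(x, y) = max_i (f * h_i)(x, y) over the oriented filter bank,
    as in (2)-(3)."""
    angles = np.linspace(0.0, np.pi, n_filters, endpoint=False)
    responses = [convolve(img.astype(float), line_filter(size, a), mode="nearest")
                 for a in angles]
    return np.max(responses, axis=0)
```

Taking the pixelwise maximum keeps, at each point, the response of the filter best aligned with the local stripe, which is what lets one filter bank cover the divergent orientations.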

Fig. 3 Wing image processing
a Illustration of feather endpoint coordinates {(x_i, y_i), i = 1, 2, …, 8} and feather lengths {l_i, i = 1, 2, …, 8}
b Determination of wing image rotation angle φ and rotation centre O′

Fig. 4 Flowchart of wing image processing and recognition

Fig. 5 Directional filters for wing image enhancement
a Orientations of the filter bank {h_i, i = 1, 2, …, 9}
b Digital implementations in a 7 × 7 window

Following directional filtering comes the thresholding operation, which produces a binary feather pattern. The threshold can easily be determined from the histogram of the wing image. For the sake of stripe computation, we extract stripe skeletons from the binary image by a thinning algorithm [8]. Owing to interference from the soft fluffy feathers known as ‘down’, and other clutter during wing image acquisition, breaks and bridges exist in the resultant binary feather image. Therefore, a trimming procedure [7] is used to remove spurs from, and to bridge the breaks in, the skeletons. Figure 6 shows the image processing applied to a typical female chick wing image, and Fig. 7 its application to a male chick wing image.
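The trimming procedure of [7] is not reproduced here, but a crude endpoint-pruning pass conveys the spur-removal idea (a sketch under our own simplifications; a full implementation would also bridge breaks and restore the trimmed stripe tips):

```python
import numpy as np

def neighbour_count(sk):
    """Number of 8-connected skeleton neighbours at every pixel."""
    p = np.pad(sk.astype(int), 1)
    return sum(p[1 + dr:p.shape[0] - 1 + dr, 1 + dc:p.shape[1] - 1 + dc]
               for dr in (-1, 0, 1) for dc in (-1, 0, 1)
               if (dr, dc) != (0, 0))

def prune_spurs(sk, passes=3):
    """Iteratively delete endpoint pixels (exactly one neighbour);
    short spurs vanish while long stripes only lose a few tip pixels."""
    sk = sk.copy()
    for _ in range(passes):
        sk &= ~(sk & (neighbour_count(sk) == 1))
    return sk
```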

Fig. 6 Typical example of feather image processing applied to a female chick wing

a Wing image cropped from raw image after image rotation
b Binary image resulting from thresholding
c Skeletons generated by thinning
d Trimmed skeletons and feather endpoints (marked by ‘+’)

Fig. 7 Typical example of male chick wing image processing

a Wing image cropped from raw image after image rotation
b Binary image resulting from thresholding
c Skeletons generated by thinning
d Trimmed skeletons and feather endpoints (marked by ‘+’)


4 Feather feature extraction

As suggested by the knowledge of human visual chick sex separation, we choose feather endpoints and feather lengths to characterise a feather pattern. A feather length sequence, l = {l1, l2, …, lN}, can be constructed from a feather endpoint sequence, {(x1, y1), (x2, y2), …, (xN, yN)}, in the order of x coordinates from one side to the other, i.e. x1 < x2 < … < xN, where N denotes the number of feathers and li denotes the length of the ith feather. Each feather length is calculated as the path length along a skeleton, in units of pixels. From the endpoint sequence, we use the sequence of y-coordinate components, y = {y1, y2, …, yN}, to describe the uneven distribution of the feather endpoints. As a result, we use these two sequences, y and l, to describe a feather pattern with the crescent model. For each sequence we can calculate the mean and variance. Specifically, the mean m(l) and variance var(l) of l, for example, are given by

$$m(l) = \frac{1}{N}\sum_{n=1}^{N} l_n \qquad (4)$$

$$\mathrm{var}(l) = \frac{1}{N}\sum_{n=1}^{N} \left[l_n - m(l)\right]^2 \qquad (5)$$

In a similar way, we calculate the mean m(y) and variance var(y) for y.
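Constructing the ordered sequences and evaluating (4) and (5) is straightforward (a sketch; the function names are ours, and note the population-style 1/N variance of (5)):

```python
import numpy as np

def feather_sequences(endpoints, lengths):
    """Order feathers by endpoint x coordinate (x1 < x2 < ... < xN)
    and return the y sequence and the length sequence."""
    endpoints = np.asarray(endpoints, dtype=float)
    lengths = np.asarray(lengths, dtype=float)
    order = np.argsort(endpoints[:, 0])
    return endpoints[order, 1], lengths[order]   # y, l

def mean_and_var(seq):
    """m(.) and var(.) per (4)-(5): note the 1/N (not 1/(N-1)) variance."""
    seq = np.asarray(seq, dtype=float)
    m = seq.sum() / seq.size
    return m, ((seq - m) ** 2).sum() / seq.size
```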

From the crescent model in Fig. 2, one can observe that the changes in either feather endpoints or feather lengths between adjacent feathers in a female chick wing are larger than those in a male chick wing, but the magnitude of change is relatively constant for each sex. This observation reveals that the feather pattern can be more efficiently characterised by the difference sequences Δy and Δl than by the primary sequences themselves (y and l). The difference sequences Δy and Δl are calculated by

$$\Delta y = \{|\Delta y_1|, |\Delta y_2|, \ldots, |\Delta y_{N-1}|\}, \qquad \Delta y_i = y_{i+1} - y_i \qquad (6)$$

$$\Delta l = \{|\Delta l_1|, |\Delta l_2|, \ldots, |\Delta l_{N-1}|\}, \qquad \Delta l_i = l_{i+1} - l_i \qquad (7)$$

where i = 1, 2, …, N − 1. Using (4) and (5), one can obtain the means and variances of Δy and Δl, i.e. m(Δy), var(Δy), m(Δl) and var(Δl). According to the crescent model, it is expected that a female chick yields larger values of m(Δy) and m(Δl) than a male chick. Although var(Δy) and var(Δl) differ between the two sexes, they describe the relatively constant characteristics of Δy and Δl for each sex. Considering these statistical entries as features, one can perform chick feather classification. The appropriateness of a single feature can be estimated from its classification performance when applied to the two-class problem (male and female). In principle, pattern classification can be improved as more features are utilised. However, more features lead to high-dimensional problems, i.e. the feature space is spanned by high-dimensional basis vectors. For feather pattern recognition, the feature space could be spanned by four features: m(Δy), var(Δy), m(Δl) and var(Δl). In practice, by choosing the best feature, we can accomplish feather pattern recognition using a single feature, m(Δy), as will be reported in the experimental results.
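The difference sequences (6)-(7) and their four statistics reduce to a few lines (a sketch with assumed function names):

```python
import numpy as np

def difference_features(y, l):
    """Return [m(dy), var(dy), m(dl), var(dl)] computed from the absolute
    first-order differences of the endpoint and length sequences, per (6)-(7).
    np.var uses the same 1/N normalisation as (5)."""
    dy = np.abs(np.diff(np.asarray(y, dtype=float)))
    dl = np.abs(np.diff(np.asarray(l, dtype=float)))
    return [dy.mean(), dy.var(), dl.mean(), dl.var()]
```

On an alternating (female-like) sequence the absolute differences are uniformly large, so m(Δy) and m(Δl) come out large; on a gradually varying (male-like) sequence they stay small.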

To show the features extracted from feather patterns, Fig. 8 provides one example, where the numerical features y, l, Δy and Δl are extracted from the female chick feather image in Fig. 6. Figure 9 shows another example, using the male chick feather image of Fig. 7. The values of [m(Δy), var(Δy), m(Δl), var(Δl)] corresponding to the female chick (in Fig. 8) and the male chick (in Fig. 9) were [28.77, 11.52, 28.92, 14.70] and [4.91, 3.91, 4.46, 4.81], respectively.

Fig. 8 Numerical feather features of the wing image in Fig. 6
a Feather tip sequence
b Difference version of a
c Feather length sequence
d Difference version of c

5 Experimental results

Experiments were conducted at Tyson Foods, Incorporated. A total of 88 chick samples were randomly divided into two groups: a training group consisting of 20 samples and a test group consisting of 68 samples. Figures 10 and 11 show some experimental images of the chick samples, which were cropped to a ROI of 180 × 120 pixels. During the training process, various features of the wing images were investigated for chick sex separation. The best feature for chick sex separation determined in the training stage was used to classify the test group. In both the training and test stages, sex separation results were confirmed by human visual inspection of the chick wings.

The feather classification results are shown in Fig. 12, which allows the classification performance of each feature to be easily evaluated. The feature m(Δy) is the most appropriate feature for chick sex separation, with which the two classes can be separated by a linear partition associated with a threshold value. For the training set, the m(Δy) threshold for chick sex separation was determined to be 15.4, as indicated by the partition line in Fig. 12f. The next most appropriate feature is m(Δl), as shown in Fig. 12c. However, it cannot dichotomise the feature space with a straight boundary line. The worst cases are associated with the features m(y), var(Δy), m(l) and var(Δl); with these, no linear partition for chick wing recognition is possible. In Fig. 12f it is seen that the 20 training images from 11 female chicks and 9 male chicks are reliably separated by the feature m(Δy). This conclusion was verified by human visual inspection. Figures 12h–12k show some two-dimensional feature spaces for chick feather classification. Once again, Fig. 12k shows that m(Δy) is the best feature, able to yield a linear partition.

Fig. 9 Numerical feather features of the wing image in Fig. 7
a Feather tip sequence
b Difference version of a
c Feather length sequence
d Difference version of c

Fig. 10 Experimental images of female chick wings (cropped by 180 × 120 ROI)

Fig. 11 Experimental images of male chick wings (cropped by 180 × 120 ROI)

Fig. 12 Results of chick feather pattern recognition generated from a training set of 20 samples (11 female and 9 male) and a test set of 68 samples (38 female and 30 male)
The numerical features plotted are: a m(l); b var(l); c m(Δl); d var(Δl); e m(y); f m(Δy); g var(Δy); h [var(l), m(l)]; i [var(Δl), m(Δl)]; j [var(y), m(y)]; k [var(Δy), m(Δy)]. The lines in f and k represent the classification boundaries. The last plot, l, shows the classification result for the 68 test samples using the m(Δy) feature. In these plots, distinct markers represent male and female chicks, and the abscissa k values in a–g and l represent label numbers for chicks

In the test stage, the m(Δy) feature was extracted to classify the 68 test images. The results are shown in Fig. 12l. With the threshold value of 15.4 obtained in the training stage, the 68 test images were classified into 38 female chicks and 30 male chicks. One of the female chicks was so close to the decision line that ambiguity or uncertainty was possible; this was due to feather underdevelopment and down intervention. Again, the classifications of Fig. 12l were confirmed by human visual inspection.
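The resulting decision rule is a single threshold on m(Δy); a sketch (the convention that larger m(Δy) indicates a female chick follows the paper's observation, and 15.4 is the training-stage threshold reported above):

```python
def classify_chick(m_dy, threshold=15.4):
    """Linear partition on the m(dy) feature: values above the
    training-stage threshold are labelled female, below it male."""
    return "female" if m_dy > threshold else "male"
```

Applied to the example feature values of Figs. 8 and 9 (m(Δy) of 28.77 and 4.91), the rule reproduces the female and male labels, respectively.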

In general, the crescent model appropriately describes the feather pattern. In practice, feather overlapping, feather underdevelopment and down intervention may cause feature fluctuations in feature space. During wing image processing, feather stripe segmentation plays the key role in automatic implementation. In this paper, we have adopted a thresholding-based technique. More robust feather segmentation may be achieved by a ridgeline (or valley course) tracking technique [7]. For feather feature extraction, the appropriateness of a feature can be evaluated in terms of classification confidence or scores with reference to manual classification results. The experiments justified the crescent model for feather pattern recognition and showed the importance of the modelling and feature extraction strategy in pattern recognition.

6 Summary

Baby chicks can be separated according to sex by wing feather patterns. Chick sex separation is then a pattern recognition problem, which is essentially a special case of oriented pattern recognition. The feather pattern in a chick wing can be appropriately modelled by a crescent model, which delineates the wing shape by a crescent region in which the feathers manifest as divergently oriented stripes. Each feather is characterised by its upside endpoint and its stripe length. The endpoint sequence and the length sequence provide the primary data to characterise a feather pattern. In terms of feather-based sex discrimination, it was found that the differences of the two data sequences are more appropriate than the primary sequences themselves for feather pattern classification. Chick sex separation may be reduced to a simple linear classification issue by using the mean values of the difference sequences, thus demonstrating that an appropriate model and feature selection strategy may simplify the nonlinear transformations and high dimensionality in pattern classification. Experiments justified the feather pattern model and the feature selection strategy.

7 References

1 Evans, M.D.: ‘Feather sexing of broiler chicks by machine vision’. Proc. American Society of Agricultural Engineering and Canadian Society of Agricultural Engineering Meeting, Paper no. 90-3008, 1989, pp. 25–28

2 Jones, P.T., Shearer, S.A., and Gates, R.S.: ‘Edge extraction for feather sexing poultry chicks’, Trans. ASAE, 1991, 34, pp. 635–640

3 Swatland, H.J., and Leeson, S.: ‘Reflectance of chicken feathers in relation to sex-linked coloration’, Poultry Sci., 1988, 67, pp. 1680–1683

4 Tao, Y., and Walker, J.: ‘Automatic feather sexing of poultry baby chicks’. U.S. patent, file no. 60/076,342, 1997

5 Kass, M., and Witkin, A.: ‘Analyzing oriented patterns’, Comput. Vis. Graph. Image Process., 1987, 37, pp. 362–385

6 Sherlock, B.G., Monro, D.M., and Millard, K.: ‘Fingerprint enhancement by directional Fourier filtering’, IEE Proc., Vis. Image Signal Process., 1994, 141, pp. 87–94

7 Chen, Z., and Sabee, M.: ‘Multiresolution vessel tracking in angiographic images using valley courses’, Opt. Eng., 2003, 42, (6), pp. 1673–1682

8 Russ, J.C.: ‘The image processing handbook’ (CRC Press, 2002, 4th edn.)

9 Chen, Z., Karim, M., and Hayat, M.: ‘Locating target at high speed using image decimation decomposition processing’, Pattern Recognit., 2001, 34, (3), pp. 685–694

10 Chen, Z., and Tao, Y.: ‘Food safety inspection using “from presence to classification” object-recognition model’, Pattern Recognit., 2001, 34, pp. 2331–2338


