


Joint Tracking and Recognition of the Locomotion State of C. elegans from Time-Lapse Image Sequences

Yu Wang, Badrinath Roysam
Rensselaer Polytechnic Institute, Troy, NY 12180

Abstract

Problem and Its Challenges

Implementation of Proposed Model

Extended Model for Tracking Worms in Physical Contact

Interactive Observation Models

Feature Computation and Analysis

Conclusion and Future Work

References

The nematode Caenorhabditis elegans (C. elegans) is a simple and well-studied organism used for studying genetic influences at the morphological and behavioral levels. To facilitate quantitative analysis of C. elegans, we developed a novel and fully automated worm tracking system. Most worm analysis systems in the literature treat worm tracking/detection and locomotion state recognition as two separate tasks; our proposed simultaneous tracking and locomotion state recognition model, in contrast, is designed to exploit the synergy between the two. The model can handle worms with complex behaviors such as omega bends, as well as worms in physical contact. Based on the resulting worm structures and motion state labels, morphological, locomotory, and other features are calculated, which serve as the basis for behavioral phenotype classification, event analysis and discovery, and other quantitative analyses of C. elegans.

The problem: we are interested in tracking worm structure and measuring morphological, locomotory, event, and interaction features from time-lapse microscopy image sequences.
Application: these quantitative measurements are needed in fundamental research, drug discovery, toxicology, and agricultural science.
Challenges: 1) Worms are non-rigid, highly deformable, and textureless, which makes worm tracking quite different from other object tracking problems. 2) Additional issues arise when tracking multiple worms; for example, when worms come into physical contact, it is very difficult to distinguish them because they produce merged blobs in the binary image (Figure 1(b)). 3) Other challenges include complex behaviors and deformations (coiling, omega bends, etc.) and cluttered or dynamic backgrounds.

[1] K. Hoshi and R. Shingai, "Computer-driven automatic identification of locomotion states in Caenorhabditis elegans," Journal of Neuroscience Methods, 2006.
[2] K. M. Huang, P. Cosman, and W. R. Schafer, "Machine vision based detection of omega bends and reversals in C. elegans," Journal of Neuroscience Methods, vol. 158, no. 2, pp. 323-336, December 2006.
[3] N. Roussel, C. A. Morton, F. P. Finger, and B. Roysam, "A computational model for C. elegans locomotory behavior: Application to multiworm tracking," IEEE Transactions on Biomedical Engineering, pp. 1786-1797, 2007.
[4] K. M. Huang, P. C. Cosman, and W. Schafer, "Automated tracking of multiple C. elegans with articulated models," IEEE International Symposium on Biomedical Imaging (ISBI'07), pp. 1240-1243.
[5] M. Isard and A. Blake, "A mixed-state CONDENSATION tracker with automatic model-switching," Sixth International Conference on Computer Vision, pp. 107-112, January 1998.
[6] http://farsight-toolkit.org/wiki/Worm_Features%26Events
[7] http://www.rpi.edu/~wangy15/PaperExperiment/
[8] G. J. Stephens, B. Johnson-Kerner, W. Bialek, and W. S. Ryu, "Dimensionality and dynamics in the behavior of C. elegans," PLoS Computational Biology, 4(4), 2008.

Proposed Model

Results

Our automated system quantitates features describing the worm shape, motion, events (reversal or omega bend), and interaction [6]. It outputs 42 features in total.

Table 1. Some features (event and interaction features) extracted from the image sequences used in [4]

[Figure 10 image: "Movie 17, Visual Summary of Tracks and Motion States," showing Start/End markers on each worm track and locomotion-pattern timelines for Worm1 and Worm2 over frames 50-300 (F=Forward, R=Reversal, S=SharpTurn, O=OmegaBend).]

[Figure 11 image: (a) skeleton points before and after rotation and offset, head position, track wavelength, amplitude, and the best-fit bounding rectangle/minimum enclosing rectangle; (b) curvature map of the skeleton points over time.]

Figure 11. (a) Illustration of the computation of worm shape features. (b) Curvature map used for computing bending frequency.
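As a concrete illustration of the shape-feature computation in Figure 11(a), here is a minimal sketch that rotates the skeleton points onto their principal axis and reads amplitude and wavelength off the resulting bounding rectangle. The function name and the exact amplitude/wavelength conventions are illustrative assumptions; the system's feature definitions are documented in [6].

```python
import numpy as np

def amplitude_and_wavelength(skeleton):
    """Estimate wave amplitude and track wavelength from (N, 2) skeleton points.

    The points are centered and rotated so that their principal axis is
    horizontal ("skeleton points after rotation and offset" in Figure 11(a)),
    and the best-fit bounding rectangle of the rotated points is measured.
    """
    pts = np.asarray(skeleton, dtype=float)
    pts = pts - pts.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(pts.T))       # principal axis of the skeleton
    major = evecs[:, np.argmax(evals)]
    angle = np.arctan2(major[1], major[0])
    c, s = np.cos(-angle), np.sin(-angle)
    rotated = pts @ np.array([[c, -s], [s, c]]).T      # rotate skeleton by -angle
    # Height of the rectangle ~ wave amplitude; its width relates to the track
    # wavelength (the exact convention depends on how many waves the body spans).
    amplitude = rotated[:, 1].max() - rotated[:, 1].min()
    wavelength = rotated[:, 0].max() - rotated[:, 0].min()
    return amplitude, wavelength
```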


Figure 9. Illustration of the tracking result for different scenarios (touching, crossing, coiling, etc.).

Figure 10. Visual summary of the worm tracks and state labels output by our automated system

Different from existing tracking systems in the literature, which usually treat worm tracking and locomotion state recognition as two independent tasks, the model proposed here is designed to exploit the synergy between the two tasks. The experiments demonstrate that our automated system advances the state of the art with: 1) better performance in tracking interacting worms and worms with complex behavioral patterns; 2) simultaneous recognition of locomotion labels; 3) new features quantifying worms in terms of morphology, locomotion, behavioral events, and interactions. Our future investigations include: 1) to automatically model worm dynamics, we plan to study worm dynamics in a lower-dimensional space as in [8]; 2) to deal with more complex worm interactions, we will investigate bi-directional tracking or "smoothing" algorithms.

Worm Model

Figure 4. Simulating basic worm movement patterns: peristaltic progression and radial deformation

Figure 3. Parameterized generative worm model
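To make the parameterization concrete, below is a minimal sketch assuming a sinusoidal body wave sampled along the centerline and a tapered half-width profile. The function name, parameters, and the specific wave form are assumptions for illustration, not the generative model defined in Figure 3.

```python
import numpy as np

def worm_model(n_points=50, length=100.0, amplitude=10.0,
               n_waves=1.5, phase=0.0, max_half_width=4.0):
    """Toy parameterized worm: centerline points plus a half-width profile.

    Advancing `phase` over time produces a travelling body wave resembling
    peristaltic progression (cf. Figure 4); scaling `max_half_width` mimics
    radial deformation.
    """
    s = np.linspace(0.0, 1.0, n_points)                # normalized arc length, head to tail
    x = s * length
    y = amplitude * np.sin(2.0 * np.pi * n_waves * s + phase)
    centerline = np.stack([x, y], axis=1)
    half_width = max_half_width * np.sin(np.pi * s)    # tapers to zero at head and tail
    return centerline, half_width

# Advancing the phase frame by frame simulates forward (peristaltic) progression.
frames = [worm_model(phase=0.3 * t)[0] for t in range(10)]
```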

Figure 2. Graphical model for simultaneous worm tracking and locomotion state recognition


Figure 1. (a) Worms in physical contact. (b) Merged blob produced by self-touching and interaction of worms.


In the literature, tracking/detecting worm structure and recognizing locomotion state are usually treated as two separate tasks [1][2], and information about motion states is not used to improve tracking performance [3][4]. In our model, the locomotion states (node S) and model parameters (node X) are inferred or estimated probabilistically based on coarse observations (nodes R and E) made on the images.

[Figure 6 annotations: centerline before adjustment, centerline after adjustment and B-spline fitting, 1D search line.]

Figure 6. Illustration of the centerline adjustment. The red centerline is shifted to the blue centerline by searching along the 1D search lines.

Figure 5. The edge observation model. It measures how well the contour of the worm model matches the foreground object, and how close the head and tail in the worm model are to the high-curvature points (which are the potential worm tips).
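The likelihood in Figure 5 combines a contour-match term with head/tail proximity to high-curvature points. Below is a minimal sketch of one such scoring function, assuming a binary edge map and a list of candidate tip points; the particular form of the terms and the way they are combined are assumptions, not the authors' exact likelihood.

```python
import numpy as np

def edge_observation_score(contour, edge_map, head, tail, curvature_pts, sigma=5.0):
    """Toy edge-based score for one hypothetical worm model (in the spirit of Figure 5).

    contour       : (M, 2) sampled (x, y) boundary points of the worm model
    edge_map      : binary image of detected foreground edges
    head, tail    : (2,) endpoints of the model centerline
    curvature_pts : (K, 2) high-curvature points on the foreground contour (candidate tips)
    """
    cols = np.clip(np.round(contour[:, 0]).astype(int), 0, edge_map.shape[1] - 1)
    rows = np.clip(np.round(contour[:, 1]).astype(int), 0, edge_map.shape[0] - 1)
    contour_term = edge_map[rows, cols].mean()          # fraction of the contour lying on edges

    def tip_term(p):                                    # falls off with distance from the
        d = np.linalg.norm(curvature_pts - np.asarray(p), axis=1)   # nearest candidate tip
        return np.exp(-d.min() ** 2 / (2.0 * sigma ** 2))

    return contour_term * tip_term(head) * tip_term(tail)
```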

Figure 8. Interactive edge observation model. Edge points found in the interaction area are excluded; high-curvature points are searched for along circles centered at the head/tail of the worm models.

Figure 7. Joint worm tracking model

Algorithm: In the model, the tracking problem is to recursively compute the posterior density function (pdf) $P(X_t \mid Z_{1:t})$, which is the conditional probability of the mixed state $X_t = (x_t, S_t)$ given all the observations/measurements made on the images up to time $t$. It can be derived as:

$$P(X_t \mid Z_{1:t}) = k\, P(Z_t \mid X_t)\, P(X_t \mid Z_{1:t-1}),$$
$$P(X_t \mid Z_{1:t-1}) = \int P(X_t \mid X_{t-1})\, P(X_{t-1} \mid Z_{1:t-1})\, \mathrm{d}X_{t-1}.$$

The dynamics model $P(X_t \mid X_{t-1})$ can be further decomposed as [5]:

$$P(X_t \mid X_{t-1}) = P(x_t \mid S_t, X_{t-1})\, P(S_t \mid X_{t-1}),$$
$$P(S_t \mid X_{t-1}):\quad P(S_t = j \mid x_{t-1}, S_{t-1} = i) = T_{ij},$$
$$P(x_t \mid S_t, X_{t-1}):\quad P(x_t \mid x_{t-1}, S_t = j) = P_j(x_t \mid x_{t-1}),$$

where $T_{ij}$ is the locomotion state transition probability, which could be automatically trained or manually set as:

$$P(S_t = j \mid S_{t-1} = i) = T_{ij} =
\begin{array}{c|cccc}
 & \text{Forw} & \text{Reve} & \text{STurn} & \text{OBend} \\
\hline
\text{Forw} & 0.8 & 0.1 & 0.05 & 0.05 \\
\text{Reve} & 0.1 & 0.75 & 0.05 & 0.1 \\
\text{STurn} & 0.1 & 0.05 & 0.8 & 0.05 \\
\text{OBend} & 0.1 & 0.05 & 0.05 & 0.8
\end{array}$$

$P_j(x_t \mid x_{t-1})$ is the state-dependent dynamics model or sub-process density. Four sub-process densities are defined, for forward movement, reversal, omega bend, and sharp turn.

$P(Z_t \mid X_t)$ is the observation model, which outputs the likelihood of each hypothetical worm model. One of the observation models used is edge based, as shown in Figure 5.

- Interactive observation models are used in the joint tracking model (Figure 8).
- Interactive centerline adjustment is also designed to prevent the adjustment of one worm's centerline from being attracted by other worms.

When worms come into physical contact, the observations they produce merge and are no longer independent, as shown in Figure 7. We need to track the worms in their joint configuration space:

$$P(X_t^{(i,j)} \mid Z_{1:t}) = k\, P(Z_t \mid X_t^i, X_t^j)\, P(X_t^i, X_t^j \mid Z_{1:t-1}).$$
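As one way to see how these densities interact in practice, here is a minimal sketch of a single-worm mixed-state (CONDENSATION-style [5]) particle filter step. The `pose_dynamics` and `edge_likelihood` callables are hypothetical stand-ins for the sub-process densities and the observation model; this illustrates the recursion above, not the system's actual implementation.

```python
import numpy as np

STATES = ["Forw", "Reve", "STurn", "OBend"]
T = np.array([[0.80, 0.10, 0.05, 0.05],      # transition probabilities T_ij from the poster
              [0.10, 0.75, 0.05, 0.10],
              [0.10, 0.05, 0.80, 0.05],
              [0.10, 0.05, 0.05, 0.80]])

def mixed_state_filter_step(particles, weights, frame, rng,
                            pose_dynamics, edge_likelihood):
    """One recursion of a mixed-state particle filter.

    particles       : list of (pose, state_index) tuples; pose holds the worm model parameters x_t
    weights         : normalized weights approximating P(X_{t-1} | Z_{1:t-1})
    pose_dynamics   : (pose, state_name, rng) -> new pose, samples P_j(x_t | x_{t-1})  [assumed]
    edge_likelihood : (frame, pose) -> P(Z_t | X_t) for one hypothetical worm model    [assumed]
    """
    n = len(particles)
    resampled = rng.choice(n, size=n, p=weights)        # approximate the prediction integral
    new_particles, new_weights = [], np.empty(n)
    for k, i in enumerate(resampled):
        pose, state = particles[i]
        new_state = rng.choice(len(STATES), p=T[state])             # sample S_t from T_ij
        new_pose = pose_dynamics(pose, STATES[new_state], rng)      # sample x_t from P_j
        new_weights[k] = edge_likelihood(frame, new_pose)           # weight by P(Z_t | X_t)
        new_particles.append((new_pose, new_state))
    new_weights /= new_weights.sum()
    return new_particles, new_weights
```

The per-frame locomotion label can then be read off as the state carrying the largest total weight, which is what couples tracking and state recognition; the joint configuration case applies the same recursion to pairs of worms with the interactive observation model.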

[Figure 12 plots: density curves of the reversal interval (left) and the omega bend interval (right) for Worm1 and Worm2.]

Figure 12. Intervals between occurrences of reversals or omega bends (image sequences from the University of Utah)
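The interval distributions in Figure 12 can be derived from the per-frame state labels. Below is a minimal sketch of how such intervals might be extracted, assuming the single-letter label convention of Figure 10; the density-estimation step itself (histogram or kernel density) is omitted.

```python
import numpy as np

def event_intervals(state_labels, event="R"):
    """Frame counts between successive onsets of a given locomotion state.

    state_labels : per-frame labels, e.g. ['F', 'F', 'R', 'R', 'F', 'O', ...]
    event        : 'R' for reversal or 'O' for omega bend
    """
    labels = np.asarray(state_labels)
    onsets = np.flatnonzero((labels[1:] == event) & (labels[:-1] != event)) + 1
    return np.diff(onsets)     # histogram/KDE of these intervals gives curves like Figure 12
```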

Acknowledgement
This work was supported in part by Gordon-CenSSIS, the Bernard M. Gordon Center for Subsurface Sensing and Imaging Systems, under the Engineering Research Centers Program of the National Science Foundation (Award Number EEC-9986821). Thanks to Jamie White at the University of Utah for providing image sequences and for helpful discussions about worm features.

[Figure 8 legend: high-curvature point, edge point, interaction area in which no edge point was found.]

Centerline Adjustment: A centerline adjustment procedure is performed to fine-tune the centerline of the hypothetical worm model.
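A minimal sketch of such an adjustment, assuming a binary foreground mask and the 1D normal search illustrated in Figure 6. The criterion used here (move each point to the midpoint of the foreground cross-section, then smooth with a B-spline) is an assumption chosen for illustration, not necessarily the criterion used in the system.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def adjust_centerline(centerline, foreground, search_half_len=10):
    """Shift each centerline point along its normal toward the middle of the worm
    body, then smooth with a B-spline (cf. Figure 6). `foreground` is a binary mask.
    """
    pts = np.asarray(centerline, dtype=float)
    tangents = np.gradient(pts, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True) + 1e-9
    normals = np.stack([-tangents[:, 1], tangents[:, 0]], axis=1)
    offsets = np.arange(-search_half_len, search_half_len + 1)
    adjusted = []
    for p, n in zip(pts, normals):
        samples = p + offsets[:, None] * n              # 1D search line through p
        rows = np.clip(np.round(samples[:, 1]).astype(int), 0, foreground.shape[0] - 1)
        cols = np.clip(np.round(samples[:, 0]).astype(int), 0, foreground.shape[1] - 1)
        inside = offsets[foreground[rows, cols] > 0]
        # Move to the midpoint of the foreground cross-section if any was found.
        adjusted.append(p + (inside.mean() * n if inside.size else 0.0))
    adjusted = np.asarray(adjusted)
    tck, _ = splprep([adjusted[:, 0], adjusted[:, 1]], s=len(adjusted))
    return np.stack(splev(np.linspace(0.0, 1.0, len(adjusted)), tck), axis=1)
```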