MARGO (Massively Automated Real-time GUI for Object-tracking), a platform for high-throughput ethology
Authors:
Zach Werkhoven aff001; Christian Rohrsen aff001; Chuan Qin aff001; Björn Brembs aff002; Benjamin de Bivort aff001
Author affiliations:
aff001: Dept. of Organismic and Evolutionary Biology & Center for Brain Science, Harvard University, Cambridge, MA, United States of America
aff002: Institut für Zoologie - Neurogenetik, Universität Regensburg, Regensburg, Germany
Published in:
PLoS ONE 14(11)
Category:
Research Article
DOI:
https://doi.org/10.1371/journal.pone.0224243
Abstract
Fast object tracking in real time allows convenient tracking of very large numbers of animals and closed-loop experiments that control stimuli for many animals in parallel. We developed MARGO, a MATLAB-based, real-time animal tracking suite for custom behavioral experiments. We demonstrated that MARGO can rapidly and accurately track large numbers of animals in parallel over very long timescales, typically when animals are spatially separated, such as in multiwell plates. We incorporated control of peripheral hardware and implemented a flexible software architecture for defining new experimental routines. These features enable closed-loop delivery of stimuli to many individuals simultaneously. We highlight MARGO’s ability to coordinate tracking and hardware control with two custom behavioral assays (measuring phototaxis and optomotor response) and one optogenetic operant conditioning assay. There are currently several open-source animal trackers. MARGO’s strengths are 1) fast and accurate tracking, 2) high throughput, 3) an accessible interface and data output, and 4) real-time closed-loop hardware control of sensory and optogenetic stimuli, all of which are optimized for large-scale experiments.
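To illustrate the kind of closed-loop cycle the abstract describes (segment animals in each camera frame, extract per-individual positions, and update a stimulus based on where each animal is), the following is a minimal MATLAB sketch. It is not MARGO's actual code or API; the camera adaptor name ('gentl'), serial port ('COM3'), threshold values, and the midline-crossing stimulus rule are placeholder assumptions for illustration only.

% Minimal illustrative sketch (not MARGO's actual code) of a real-time
% closed-loop tracking cycle: grab a frame, segment animals against a
% background reference, extract centroids, and update a per-individual
% stimulus. Hardware identifiers ('gentl', 'COM3') and thresholds are
% placeholder assumptions.
vid  = videoinput('gentl', 1);        % camera handle (Image Acquisition Toolbox)
stim = serialport('COM3', 9600);      % e.g. an LED/stimulus controller

bg        = im2double(getsnapshot(vid));  % background reference frame
threshold = 0.1;                          % difference-image threshold (arbitrary)
minArea   = 20;                           % minimum blob size in pixels (arbitrary)

for k = 1:10000
    im     = im2double(getsnapshot(vid));
    diffIm = bg - im;                     % dark animals on a light background
    mask   = diffIm > threshold;          % binary segmentation

    props = regionprops(mask, 'Centroid', 'Area');
    props = props([props.Area] > minArea);    % discard small noise blobs
    centroids = vertcat(props.Centroid);      % one row per detected animal

    % Toy closed-loop rule: activate the stimulus for any animal whose
    % centroid lies past the arena midline in this frame.
    if ~isempty(centroids)
        active = centroids(:, 1) > size(im, 2) / 2;
        writeline(stim, sprintf('LED %s', num2str(active')));
    end
end

In MARGO itself, segmentation parameters, region-of-interest assignment, and stimulus logic are configured through the GUI and experiment routines; the sketch above only outlines the generic acquire-segment-stimulate loop that such a platform executes on every frame.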
Keywords:
Cameras – Animal behavior – Light – Neurons – Biological locomotion – Optogenetics – Graphical user interfaces – Computer hardware