TRANSCRIPT
The Gordon and Betty Moore Foundation
Data-Driven Discovery Symposium

Data-Driven Discovery
for the
Cortical Column Conjecture

Carey E. Priebe (http://www.ams.jhu.edu/~priebe)
Department of Applied Mathematics & Statistics; Center for Imaging Science;
Department of Computer Science; Department of Electrical and Computer Engineering
Johns Hopkins University, Baltimore, MD, USA

July 28-29, 2014, Palo Alto, California
Outline
Introduction
Notable Findings
Vision & Plans
Introduction: Motivation
Many contemporary theories of neural information processing suggest that the neocortex employs algorithms composed of repeated instances of a limited set of computing primitives. There is a recognized need for tools for interrogating the structure of the cortical microcircuits believed to embody these primitives.
Cortical Column Conjecture
Neurons are connected in a graph that exhibits motifs representing
repeated processing modules.
J. C. Horton and D. L. Adams. "The cortical column: a structure without a function," Philosophical Transactions of the Royal Society B, 360(1456):837-862, Apr. 2005.
V. Mountcastle. "The columnar organization of the neocortex," Brain, 120(4):701-722, Apr. 1997.
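The conjecture can be phrased operationally as a motif-counting question: do certain small subgraphs recur in the neuron graph more often than chance would predict? As a minimal illustrative sketch (not the talk's methodology; the toy graph and node names are hypothetical), the following counts feed-forward triangles (a→b, b→c, a→c) in a directed edge list:

```python
from itertools import permutations

def count_feedforward_motifs(edges):
    """Count feed-forward loop motifs (a->b, b->c, a->c) in a directed graph
    given as an edge list."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
    nodes = set(adj) | {v for vs in adj.values() for v in vs}
    count = 0
    for a, b, c in permutations(nodes, 3):
        if b in adj.get(a, ()) and c in adj.get(b, ()) and c in adj.get(a, ()):
            count += 1
    return count

# Hypothetical toy circuit containing the same module twice
edges = [("n1", "n2"), ("n2", "n3"), ("n1", "n3"),
         ("n4", "n5"), ("n5", "n6"), ("n4", "n6")]
print(count_feedforward_motifs(edges))  # 2
```

In practice one compares such counts against a null graph model; this sketch only shows the counting step.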
Introduction: Fundamental Question
What are the relevant cortical computing structures associated
with recurring circuit motifs?
Goals
1. To obtain, and to understand the error characteristics of, an observed cortical graph.
2. To develop, and to understand the theoretical & practical properties of, general methodologies for end-to-end cortical graph inference.
We aim to advance theoretical neuroscience, and impact algorithms for machine intelligence, through insights derived from high-fidelity reconstructions of cortical microcircuits; in particular, we will estimate relevant cortical computing parameters from observed cortical graphs.
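One plausible formalization of "estimating cortical computing parameters" (assumed here for illustration; the slide does not name a model) is to treat the cortical graph as a stochastic blockmodel, where repeated processing modules become blocks, and to estimate block-to-block connection probabilities from the observed graph. A minimal sketch with a hypothetical labeling:

```python
from collections import Counter

def estimate_block_probs(edges, labels):
    """MLE of block-to-block connection probabilities for a directed
    stochastic blockmodel, given an observed edge list and a
    node-to-block assignment (self-loops excluded)."""
    blocks = sorted(set(labels.values()))
    n_in = Counter(labels.values())
    # number of ordered node pairs available between each pair of blocks
    possible = {(r, s): n_in[r] * n_in[s] - (n_in[r] if r == s else 0)
                for r in blocks for s in blocks}
    observed = Counter((labels[u], labels[v]) for u, v in edges)
    return {rs: observed[rs] / possible[rs] for rs in possible if possible[rs]}

# Hypothetical 4-neuron graph with two putative modules {a, b} and {c, d}
labels = {"a": 0, "b": 0, "c": 1, "d": 1}
edges = [("a", "b"), ("b", "a"), ("a", "c")]
print(estimate_block_probs(edges, labels))
# {(0, 0): 1.0, (0, 1): 0.25, (1, 0): 0.0, (1, 1): 0.0}
```

The harder end-to-end problem, of course, is that neither the graph nor the block assignment is observed cleanly; this sketch only covers the final estimation step.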
Notable Findings I: Science, April 25, 2014
Discovery of Brainwide Neural-Behavioral Maps via Multiscale Unsupervised Structure Learning
Joshua T. Vogelstein, Youngser Park, Tomoko Ohyama, Rex A. Kerr, James W. Truman, Carey E. Priebe, Marta Zlatic
A single nervous system can generate many distinct motor patterns. Identifying which neurons and circuits control which behaviors has been a laborious piecemeal process, usually for one observer-defined behavior at a time. We present a fundamentally different approach to neuron-behavior mapping. We optogenetically activated 1054 identified neuron lines in Drosophila larvae and tracked the behavioral responses from 37,780 animals. Application of multiscale unsupervised structure learning methods to the behavioral data enabled us to identify 29 discrete, statistically distinguishable, observer-unbiased behavioral phenotypes. Mapping the neural lines to the behavior(s) they evoke provides a behavioral reference atlas for neuron subsets covering a large fraction of larval neurons. This atlas is a starting point for connectivity- and activity-mapping studies to further investigate the mechanisms by which neurons mediate diverse behaviors.
Nervous systems can generate a wide range of motor outputs, depending on their incoming sensory inputs and internal state. A comprehensive understanding of how behavioral diversity and selection is achieved requires the identification of neural circuits that mediate many distinct motor patterns in a given nervous system. Mapping a functional circuit for one be…
Timothy O'Leary and Eve Marder, "Mapping Neural Activation onto Behavior in an Entire Animal," Science, April 25, 2014: "(Authors) usher in a new era of integrated methods for deciphering how an entire nervous system generates behavior... (They) have thus achieved a technical, multidisciplinary tour de force that will provide a rich source of research questions."
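The "multiscale unsupervised structure learning" of the paper recursively partitions the behavioral data into statistically distinguishable clusters at several scales. The actual pipeline is far richer; as a toy one-dimensional stand-in (all data and thresholds below are hypothetical), recursive two-means bisection conveys the core idea of splitting until clusters are small or tight:

```python
def bisect(points, min_size=2, min_spread=1.0):
    """Recursively split 1-D data into clusters via the best two-means
    split, stopping when a cluster is too small or too tight -- a toy
    stand-in for multiscale unsupervised structure learning."""
    pts = sorted(points)
    if len(pts) < 2 * min_size or pts[-1] - pts[0] < min_spread:
        return [pts]
    # for sorted 1-D data, two-means reduces to the best single cut point
    best_i, best_cost = None, float("inf")
    for i in range(min_size, len(pts) - min_size + 1):
        left, right = pts[:i], pts[i:]
        m_l, m_r = sum(left) / len(left), sum(right) / len(right)
        cost = (sum((x - m_l) ** 2 for x in left)
                + sum((x - m_r) ** 2 for x in right))
        if cost < best_cost:
            best_i, best_cost = i, cost
    return (bisect(pts[:best_i], min_size, min_spread)
            + bisect(pts[best_i:], min_size, min_spread))

# Hypothetical 1-D behavioral features with three apparent groups
data = [0.0, 0.1, 0.2, 5.0, 5.1, 9.0, 9.2, 9.3]
print(bisect(data))  # [[0.0, 0.1, 0.2], [5.0, 5.1], [9.0, 9.2, 9.3]]
```

The paper's method additionally tests whether a proposed split is statistically supported before recursing; the fixed size and spread thresholds here are a crude substitute for that test.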
Notable Findings I
[Figure: multiscale cluster hierarchy of the 37,780 tracked behavioral responses; leaf clusters are labeled slow, fast, turners, still or back-up, escape, turn-avoid, turn-turn-turn, turn-slow-crawl, still, back-up, wiggle-escape, straight-escape, right-left-avoid, and left-right-avoid; per-node animal counts and axis ticks omitted.]
24242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242424242525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252525252526262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626262626272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272727272828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282828282829292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929292929
[Figure 2. (f) Cluster Tree: 'post-hoc' human labels and number of animals for statistically distinguishable behaviotypes, plotted against tree depth level (0-6). (e) Cluster Response: mean and standard error over time (0-40 sec) of area (a.u.), forward/backward crawling bias, head turn (deg), sideways movement speed (mm/sec), direction of motion, length (a.u.), width (a.u.), and speed (mm/sec).]

Figure 2. (f) Post-hoc human labels assigned to the automatically detected behaviotype families and subfamilies. (e) Mean and standard error of the responses of the eight behaviotype subfamilies.
Vogelstein et al, “Discovery of Brainwide Neural-Behavioral Maps via Multiscale Unsupervised Structure Learning,” Science, vol. 344, no. 6182, pp. 386-392, April 25, 2014.
Introduction Notable Findings Vision & Plans
Notable Findings I
Figure 4. Examples of significant neuron lines and candidate neurons involved in distinct behaviotypes.
Vogelstein et al, “Discovery of Brainwide Neural-Behavioral Maps via Multiscale Unsupervised Structure Learning,” Science, vol. 344, no. 6182, pp. 386-392, April 25, 2014.
Notable Findings II: Adjacency Spectral Graph Embedding & Subsequent Inference
The eigendecomposition of an adjacency matrix provides a way to embed a graph as points in finite-dimensional Euclidean space. This embedding allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed for graph inference. Our work analyzes this embedding in the context of various random graph models, with a focus on the impact for subsequent inference. In summary, this body of work demonstrates that, for a broad class of graph models and inference tasks, adjacency spectral embedding allows for accurate graph inference via standard multivariate methodology.
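Concretely, the embedding takes the top-d scaled eigenvectors of the adjacency matrix. A minimal NumPy sketch (ours, not the authors' implementation), assuming an undirected simple graph:

```python
import numpy as np

def adjacency_spectral_embedding(A, d):
    """Embed the vertices of the graph with adjacency matrix A into R^d
    using the d eigenpairs of A with largest-magnitude eigenvalues."""
    vals, vecs = np.linalg.eigh(A)            # A is symmetric
    idx = np.argsort(np.abs(vals))[::-1][:d]  # top-d by |eigenvalue|
    # Rows of the result are the estimated latent positions of the vertices.
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

# Toy example: two 4-cliques joined by a single edge separate cleanly in R^2.
A = np.zeros((8, 8))
A[:4, :4] = 1 - np.eye(4)
A[4:, 4:] = 1 - np.eye(4)
A[0, 4] = A[4, 0] = 1
X = adjacency_spectral_embedding(A, 2)
print(X.shape)  # (8, 2)
```

Standard multivariate methods (clustering, classification, testing) are then applied to the rows of X.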
Notable Findings II: skmeans

[Figure: skmeans clustering with R̂ = 8 of a 1748 × 1748 adjacency matrix, clusters labeled 1-8. Panels: (a) graph; (b) adjacency matrix (rows/columns 1-1748); (c) spectral embedding (axis 1 vs. axis 2); (d) spatial embedding (x, y, z).]
Takemura et al, “A visual motion detection circuit suggested by Drosophila connectomics,” Nature, 500, 175-181, August 8, 2013.
Notable Findings II
D.L. Sussman, M. Tang, D.E. Fishkind, and C.E. Priebe, “A consistent adjacency spectral embedding for stochastic blockmodel graphs,” Journal of the American Statistical Association, Vol. 107, No. 499, pp. 1119-1128, 2012.

D.E. Fishkind, D.L. Sussman, M. Tang, J.T. Vogelstein, and C.E. Priebe, “Consistent adjacency-spectral partitioning for the stochastic block model when the model parameters are unknown,” SIAM JMAA, Vol. 34, No. 1, pp. 23-39, 2013.

V. Lyzinski, D.L. Sussman, M. Tang, A. Athreya, and C.E. Priebe, “Perfect Clustering for Stochastic Blockmodel Graphs via Adjacency Spectral Embedding,” submitted for publication, 2013.

D.L. Sussman, M. Tang, and C.E. Priebe, “Consistent latent position estimation and vertex classification for random dot product graphs,” IEEE TPAMI, Vol. 36, No. 1, pp. 48-57, 2014.

M. Tang, D.L. Sussman, and C.E. Priebe, “Universally consistent vertex classification for latent positions graphs,” Annals of Statistics, Vol. 41, No. 3, pp. 1406-1430, 2013.

M. Tang, Y. Park, and C.E. Priebe, “Out-of-sample extension for latent position graphs,” submitted for publication, 2013.

A. Athreya, V. Lyzinski, D.J. Marchette, C.E. Priebe, D.L. Sussman, and M. Tang, “A limit theorem for scaled eigenvectors of random dot product graphs,” submitted for publication, 2013.

S. Suwan, D.S. Lee, R. Tang, D.L. Sussman, M. Tang, and C.E. Priebe, “Empirical Bayes Estimation for the Stochastic Blockmodel,” submitted for publication, 2014.

M. Tang, A. Athreya, D.L. Sussman, V. Lyzinski, and C.E. Priebe, “Two-sample Hypothesis Testing for Random Dot Product Graphs via Adjacency Spectral Embedding,” submitted for publication, 2014.
Cortical Column Conjecture
Cortical Column Conjecture
The cortical column conjecture suggests that neurons in the neocortex are connected in a graph that exhibits motifs representing repeated processing modules.

Our focus is on extracting, and then estimating the structure of, the cortical graph, for the purpose of subsequent modeling and algorithm development.
Vision & Plans
End-to-End Pipeline for estimating cortical graph structure ...

... for the purpose of subsequent modeling and algorithm development
Cortical Graph
Time Series of Attributed Graphs
Task: Estimate Structure
[Figure: cortical graph G with n vertices and R subgraphs; each subgraph Hr = Ω(Vr) has nr vertices; m motifs.]

• Cortical graph G: hierarchical block model on n vertices (neurons).

• Induced subgraphs Hr: stochastic block models on nr vertices, r = 1, ..., R.

1. Identify large-scale structures in G → Hr.

2. Identify clusters of the Hr's corresponding to repeated motifs.

3. Estimate structural parameters.
Vision & Plans
The Big Five Graph Inference Challenges
1. Clustering the vertices in a graph
2. Clustering a collection of graphs
3. Graph matching
4. Testing and estimation
5. Robustness to errorfully observed graphs
All of these challenges must be addressed in the context of available vertex- and edge-attributes – we will have neuron types (excitatory, inhibitory), edge weights (synapse strengths), spatial information, etc. The development of optimal robust methodologies for attributed graphs presents significant additional challenges.
From Images to Graphs to Inference

“... a goal of computer vision is to reconstruct a graph with error properties that the subsequent inference can tolerate ...”
(a) EM data (Davi Bock, HHMI Janelia)

(b) Segmentation (Hanspeter Pfister, Harvard)
C.E. Priebe, D.L. Sussman, M. Tang, J.T. Vogelstein, “Statistical inference on errorfully observed graphs,” Journal of Computational and Graphical Statistics, accepted for publication, 2014.
Stochastic Block Model
Let $\mathcal{B}_K$ be the collection of symmetric $K \times K$ block-probability matrices:

$$\mathcal{B}_K := \{ B \in [0,1]^{K \times K} : B^{\top} = B \}.$$

Let $\mathcal{M}_{n,K}$ be the collection of length-$K$ non-negative integer-valued vectors $\vec{n}$ with entries summing to $n$:

$$\mathcal{M}_{n,K} := \{ \vec{n} \in \mathbb{N}^{K} : \textstyle\sum_{k=1}^{K} \vec{n}_k = n \}.$$

Given positive integers $n$ and $K$, consider $\vec{n} \in \mathcal{M}_{n,K}$ and $B \in \mathcal{B}_K$. We say a random graph $G$ on vertex set $V$ is a stochastic block model graph, $G \sim \mathrm{SBM}(V, \vec{n}, B)$, if the vertices $V$ are partitioned into subsets $V_1, \ldots, V_K$ with sizes given by $\vec{n} = (\vec{n}_1, \ldots, \vec{n}_K)$ and the edges $\{u \sim v\}$ are independent Bernoulli random variables with $\mathbb{P}[u \sim v] = B_{ij}$ for $u \in V_i$ and $v \in V_j$.
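To make the definition concrete, such a graph can be sampled directly; a minimal sketch (the helper name and toy parameters are ours), with vertices ordered by block:

```python
import numpy as np

def sample_sbm(n_vec, B, seed=None):
    """Sample an undirected SBM(V, n_vec, B) adjacency matrix with
    vertices ordered by block; no self-loops."""
    rng = np.random.default_rng(seed)
    labels = np.repeat(np.arange(len(n_vec)), n_vec)  # block of each vertex
    P = B[labels][:, labels]              # n x n matrix with P[u ~ v] = B_ij
    U = np.triu((rng.random(P.shape) < P).astype(int), 1)
    return U + U.T                        # symmetrize the Bernoulli draws

B = np.array([[0.5, 0.1],
              [0.1, 0.5]])
A = sample_sbm([50, 50], B, seed=0)
print(A.shape)  # (100, 100)
```

The within-block densities of A concentrate near the entries of B as the block sizes grow.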
Hierarchical Stochastic Block Model
Given $G = (V, E)$, where $V = [n]$:

• $\{V_r\}_{r=1}^{R}$: a disjoint partition of the vertices $V$; for each $r = 1, 2, \ldots, R$, write $n_r := |V_r|$.

• $H_r = \Omega(V_r) = (V_r, E_r)$.

• $H_r \sim \mathrm{SBM}(V_r, \vec{m}_r, B_r)$, where $\vec{m}_r \in \mathcal{M}_{n_r,K}$ and $B_r \in \mathcal{B}_K$.

• For $u \in V_r$ and $v \in V_{r'}$ with $r \neq r'$, $\{u \sim v\} \sim \mathrm{Bernoulli}(p)$.

Then $G \sim \mathrm{HSBM}(V, \vec{m}, B)$, where $\vec{m} = [\vec{m}_1^{\top} \mid \vec{m}_2^{\top} \mid \cdots \mid \vec{m}_R^{\top}]^{\top}$ and $B$ is given by

$$B = \begin{bmatrix} B_1 & pJ_{K,K} & \cdots & pJ_{K,K} \\ pJ_{K,K} & B_2 & \ddots & \vdots \\ \vdots & \ddots & \ddots & pJ_{K,K} \\ pJ_{K,K} & \cdots & pJ_{K,K} & B_R \end{bmatrix}.$$
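Assembling the HSBM edge-probability matrix from the per-subgraph blocks is mechanical; a sketch under the notation above, with the $B_r$ on the diagonal and the constant between-subgraph probability $p$ elsewhere (function name is ours):

```python
import numpy as np

def hsbm_block_matrix(B_list, p):
    """Stack R per-subgraph block matrices B_r (each K x K) into the
    RK x RK HSBM matrix: B_r on the diagonal, p everywhere else."""
    K, R = B_list[0].shape[0], len(B_list)
    B = np.full((R * K, R * K), float(p))
    for r, Br in enumerate(B_list):
        B[r * K:(r + 1) * K, r * K:(r + 1) * K] = Br
    return B

B1 = np.array([[0.5, 0.1], [0.1, 0.4]])
B2 = np.array([[0.3, 0.2], [0.2, 0.6]])
B = hsbm_block_matrix([B1, B2], p=0.01)
print(B)
```

An HSBM is therefore itself an SBM with $RK$ blocks, so SBM samplers and estimators apply directly.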
Biologically Motivated Model
Our cortical graph $G$:

• $n$ vertices, with $n \in \{10^3, 10^4, 10^5, 10^6\}$.

• $V = [n]$ is partitioned into $R$ subsets $\{V_r\}_{r=1}^{R}$.

• The $H_r = \Omega(V_r)$ each have $|V_r| = n_r = m = 100$ vertices and $K = 5$ blocks.

• $R = n/m$.

• $\vec{m}_r = [2, 50, 15, 8, 25]^{\top}$ for all $r$.

• $$B_r = \begin{bmatrix} 0.1 & 0.045 & 0.015 & 0.19 & 0 \\ 0.045 & 0.05 & 0.035 & 0.14 & 0.03 \\ 0.015 & 0.035 & 0.08 & 0.105 & 0.04 \\ 0.19 & 0.14 & 0.105 & 0.29 & 0.13 \\ 0 & 0.03 & 0.04 & 0.13 & 0.09 \end{bmatrix}.$$

• $p \in \{10^{-2}, 10^{-3}, 10^{-4}, 10^{-5}\}$.
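These parameters are concrete enough to encode directly; a sketch (variable names are ours, with $n$ and $p$ fixed at single illustrative values from the ranges above):

```python
import numpy as np

# Per-subgraph parameters from the slide: each of the R = n/m subgraphs
# has m = 100 neurons split into K = 5 blocks.
m_vec = np.array([2, 50, 15, 8, 25])   # block sizes, summing to m = 100
B_r = np.array([
    [0.100, 0.045, 0.015, 0.190, 0.000],
    [0.045, 0.050, 0.035, 0.140, 0.030],
    [0.015, 0.035, 0.080, 0.105, 0.040],
    [0.190, 0.140, 0.105, 0.290, 0.130],
    [0.015, 0.035, 0.080, 0.105, 0.040][0:0] or
    [0.000, 0.030, 0.040, 0.130, 0.090],
])
p = 1e-3                               # one between-subgraph probability
n = 10**4                              # one of the graph sizes considered
R = n // int(m_vec.sum())              # number of repeated subgraphs
print(R)  # 100
```

Sampling the full graph then amounts to drawing $R$ independent SBM subgraphs from $(\vec{m}_r, B_r)$ and sprinkling Bernoulli($p$) edges between them.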
Izhikevich, E. M. and Edelman, G. M. (2008), “Large-Scale Model of Mammalian Thalamocortical Systems,” PNAS, 105:3593-3598.
Biologically Motivated Model

[Figure: title page of Izhikevich and Edelman, “Large-scale model of mammalian thalamocortical systems,” PNAS 105(9):3593-3598, March 4, 2008, including their Fig. 1: the model's global thalamocortical geometry and white-matter anatomy obtained by diffusion tensor imaging (DTI) of a normal human brain.]
Izhikevich E. M. and Edelman G. M. (2008) “Large-Scale Model of Mammalian Thalamocortical Systems,”PNAS, 105:3593-3598
Biologically Motivated Model: SI Appendix

[Figure 9 (SI Appendix): Distribution of neuronal types and synapses in the model. Rows are postsynaptic neuron types (e.g. nb1, p2/3, b2/3, ss4(L4), p5(L5/6), p6(L4), TCs, TRN), with percent of cells and number of synapses; each element in a row gives the percentage of synapses of a particular presynaptic type onto that neuron. The most significant connections are shown in their Fig. 8; shaded regions denote plastic connections.]
Izhikevich E. M. and Edelman G. M. (2008) “Large-Scale Model of Mammalian Thalamocortical Systems,”PNAS, 105:3593-3598
Theorem
Let $G_n \sim \mathrm{HSBM}(\theta)$. Our goal is to consistently estimate $\theta$.

Theorem

Under suitable eigenvalue assumptions, the three-step algorithm given by

1. cluster the vertices of $\mathrm{ASE}(G)$ to estimate the $H_r$

2. cluster the collection $\{\mathrm{ASE}(\hat{H}_r)\}_{r=1}^{\hat{R}}$

3. estimate motif parameters for each cluster of subgraphs

yields $\hat{\theta}_n \to \theta$ as $n \to \infty$.
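The three steps can be sketched end to end; a hypothetical, simplified pipeline (all names and the tiny k-means stand-in are ours; step 2's clustering of subgraph embeddings is collapsed into per-subgraph block-probability estimates, leaving the grouping of those estimates into motif families implicit):

```python
import numpy as np

def ase(A, d):
    """Adjacency spectral embedding: top-d scaled eigenvectors."""
    vals, vecs = np.linalg.eigh(A)
    idx = np.argsort(np.abs(vals))[::-1][:d]
    return vecs[:, idx] * np.sqrt(np.abs(vals[idx]))

def kmeans(X, k, iters=50, seed=0):
    """Tiny Lloyd's k-means, enough for a sketch."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C) ** 2).sum(-1), axis=1)
        C = np.array([X[lab == j].mean(0) if (lab == j).any() else C[j]
                      for j in range(k)])
    return lab

def estimate_hsbm(A, R, K, d=2):
    # Step 1: cluster the vertices of ASE(G) to estimate the subgraphs H_r.
    sub = kmeans(ase(A, d), R)
    motifs = []
    for r in range(R):
        idx = np.where(sub == r)[0]
        if idx.size < K:
            continue  # too few vertices to resolve K blocks
        Ar = A[np.ix_(idx, idx)]
        # Steps 2-3 (collapsed): embed and block-cluster each estimated
        # subgraph, then estimate its block-probability matrix.
        blocks = kmeans(ase(Ar, d), K, seed=1)
        Bhat = np.zeros((K, K))
        for i in range(K):
            for j in range(K):
                cell = Ar[np.ix_(blocks == i, blocks == j)]
                Bhat[i, j] = cell.mean() if cell.size else 0.0
        motifs.append(Bhat)
    return sub, motifs

# Demo: R = 2 subgraphs of 60 vertices, K = 2 blocks each, sparse bridge.
rng = np.random.default_rng(0)
def sbm(sizes, B):
    lab = np.repeat(np.arange(len(sizes)), sizes)
    P = B[lab][:, lab]
    U = np.triu((rng.random(P.shape) < P).astype(int), 1)
    return U + U.T
A = np.zeros((120, 120), dtype=int)
A[:60, :60] = sbm([30, 30], np.array([[0.6, 0.2], [0.2, 0.5]]))
A[60:, 60:] = sbm([30, 30], np.array([[0.5, 0.1], [0.1, 0.6]]))
bridge = (rng.random((60, 60)) < 0.01).astype(int)
A[:60, 60:], A[60:, :60] = bridge, bridge.T
sub, motifs = estimate_hsbm(A, R=2, K=2)
print(len(motifs), [m.shape for m in motifs])
```

The theorem's consistency claim is about this kind of pipeline in the large-$n$ limit; the toy demo only illustrates the mechanics.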
“In theory there is no difference between theory and practice. In practice, there is.” – Yogi Berra
Open Source
• connectome data repository

• to make state-of-the-art neuroscience open and to facilitate the analysis of connectome data

• collaboration with Harvard, HHMI’s Janelia Farm, etc.

• http://www.openconnectomeproject.org

• subcontract to Harvard as a part of the DARPA XDATA project

• distributing our graph inference methodologies via “the network analysis package”

• http://igraph.org
Vision & Plans
Impact
Our work will provide pioneering robust inference capabilities for cortical graphs, which will in turn

1. impact the future of neuroscience through fundamental insights into recurring motifs in the neocortex, and

2. impact the future of machine intelligence through estimates of the structure of cortical computing primitives.

Furthermore, we expect our work to have major impact on the burgeoning field of robust graph inference, and to unite the nascent field of data science for graphs through a compelling and high-profile application in the natural sciences.
Leopold Kronecker to Hermann von Helmholtz (1888):
“The wealth of your practical experience
with sane and interesting problems
will give to mathematics
a new direction and a new impetus.”