BACKGROUND

Since its launch in 2013, the U.S. BRAIN Initiative has helped focus attention on the urgent need to create an unprecedented suite of measurement tools to enable large-scale, massively multiplexed interrogation of brain circuitry. Most of the technological challenges to be surmounted in their creation are complex, costly, and beyond the reach of individual investigators. Meanwhile, the collective resources available worldwide to support such pursuits are finite, especially given the scale of the quest. This disparity between critical tasks and available resources makes it urgent that we foster complementarity among ongoing efforts to develop, produce, and disseminate novel, next-gen tools. We are convening a Kavli Futures Symposium: Toward Next-Gen, Open-Source Neurotechnology Dissemination on October 20-21, 2017 in Santa Monica, CA, to advance progress toward this end.

To date, it has largely been individual labs, often working in isolation, that have advanced the technology for fundamental neuroscience. However, in many instances this multiplicity of solitary pursuits has spawned duplication of arduous and expensive efforts; yielded technology that is only marginally robust; and produced a cacophony of (sometimes contradictory) “lore”. Without denigrating the many essential advances made to date, much of the progress achieved might be categorized as neurocraft. Under this existing paradigm, only a fraction of the potential of next-gen neurotechnology will be realized.

Within individual, isolated efforts, it is generally impossible to surmount the many challenges involved. These include developing sub-components with the requisite complexity, fabricating them with sufficient robustness to permit their reliable concatenation into complex instrumentation or protocols, and then mass-producing them in quantities that sustain wide dissemination. We assert that the sheer diversity of requisite technologies, and the complexity of the targeted instrumentation, necessitate team efforts. The BRAIN Initiative has engendered the coalescence of interdisciplinary teams focused on developing next-gen tools. The technologies being pursued span engineered molecules, transgenic cell and animal lines, active instrumentation, algorithms, and big data. Yet these emerging approaches can only succeed with tight coherence among the respective players. Specifically, this requires an efficacious division of labor across complementary tasks and, importantly, the development and adoption of standards. The latter ensure the interoperability and robustness of the sub-components and protocols produced for ultimate assembly into a complex, functional whole. At this juncture, we believe progress toward this collective vision of realizing next-gen neurotechnology is contingent upon ensuring thoughtful and deliberate coordination among ongoing efforts.

We offer two examples that elucidate these needs:

Chemical Tools. Recent chemical tools, such as expansion microscopy and brain-clearing strategies, can sometimes benefit from custom-synthesized chemical reagents for labeling, amplifying, and analyzing biomolecules within intact specimens. Nanoparticles for neural labeling and imaging are another emerging field in which synthetic chemistry is needed to manufacture tools for use in biology labs. Much as viral core facilities have risen to the occasion of providing genetically encoded reagents to the neuroscience and biology communities, there will be immense utility in creating core chemistry facilities that can synthesize, provide quality control for, and then disseminate non-genetically encoded chemical reagents.

Nano-enabled neurotechnology. Although nanosystem prototypes are often proven within individual academic laboratories, a different approach must be taken to transform them into bona fide technology reliable enough for broad dissemination. For example, once a first proof of concept has been demonstrated, the engineering, production, and deployment of massively multiplexed, implantable electrical or photonic neural nanoprobe systems involve many ingredients. These include large-scale semiconductor integration, highly reproducible foundry-scale production, and big-data computational resources. These tasks are achievable only by distributing and coordinating efforts among collaborative partners to attain the requisite robustness and scale of production. Essential, finely tuned production protocols are available solely within state-of-the-art industrial semiconductor foundries (chip-production factories), where dedicated engineers maintain daily the sophisticated instruments and process tolerances necessary for mass production. The reproducibility and tolerances achieved in these “closed” precision-production foundries are impossible to attain within “open” user facilities at universities or national laboratories. Further, the initial phases of technology development, the subsequent sub-component integration, and the mass production and assembly of advanced instrumentation are unlikely today to be financially sustained by either venture capital or the commercial sector. Given this backdrop, we believe that only through a well-coordinated worldwide paradigm can individual neurotechnology elements be developed, produced en masse, and then integrated into the complex instrumentation systems necessary to advance this field, with sufficient robustness to make wide dissemination practical.

Coordination of such efforts worldwide will enhance the productivity and output of individual efforts by optimizing, distributing, and coordinating the tasks necessary for systematic engineering and production. For the many types of large-scale “tools” (instruments and facilities) that have long been essential for pursuing the frontiers of physics and astronomy, one paradigm has repeatedly proven successful. It is this: at each step of the process, and for each component sub-system within the totality, the Technology Readiness Level (TRL) index must be ascended. This approach, first adopted by NASA for its missions, allows complex projects to be pursued cost-effectively and efficiently by distributing separate, yet complementary, efforts to build system sub-components. With careful coordination of constituent efforts, and by achieving high TRLs for each sub-system and protocol, complex integrated systems can be realized with sufficient reliability to permit launching cutting-edge exploratory missions (or sophisticated experiments) with high probability of success a priori. Achieving interoperable, high-TRL neurotechnologies, however, requires a disciplined approach, one generally unfamiliar within academia, that only a highly coordinated engineering network can provide.

We must also emphasize that neurotechnology development cannot be pursued in an experimental vacuum. At all stages in a project’s evolution, coordinated technological efforts must be directed toward achieving specific, high-profile goals of experimental neuroscience or neuromedicine. These efforts must be co-directed in close partnership among experimental and theoretical researchers in neuroscience and neuromedicine, physical scientists, and engineers. Assembling an efficient neurotechnology development process requires, at its core, neuroscience and neuromedicine researchers who participate not simply as adopters of the new tools, but as inextricable co-developers. These collective quests must be driven by iterative, closed-loop cycles: technology development and production, technical validation, initial characterization in vitro, challenging experiments in vivo, and, finally, feedback into subsequent cycles of optimization.

We believe it is now essential to promote overarching coordination and standardization of technologies, sub-system interconnections, and experimental protocols. A self-organized international “working group” could provide the galvanizing vision necessary to coordinate the disparate pursuits and optimization of innovative elements by individual neurotechnology developers. Coordination will ensure overall mission coherence and sustain the complete ecosystem of elemental operations that, by nature, range from the exalted to the pedestrian. It is critical to note that, in isolation, many essential operations will not be perceived as sufficiently cutting-edge to be fundable through regular channels. Further, many requisite tasks are, by their nature, inappropriate for building the careers of graduate or postdoctoral researchers; to ensure their reliable execution, professional scientists and engineers must carry out such activities. However, because it is generally impossible to sustain experienced technical personnel through short-term, single-investigator funding, new support paradigms are essential.