(This article was originally written on September 4th, 2019 and submitted in partial fulfillment of my Neuroscience coursework. Information, especially regarding publication counts, may not accurately reflect the current situation.)
Brain simulation: hope or hype?
Why should we simulate a brain?
There are four main goals of simulation in science: to verify assumptions, to test heuristic models, to predict missing data, properties and performance, and finally, to update and generate hypotheses1. In the context of brain simulation, we should additionally expect the model to tie immediately to behavior2. In other words, having a digital version of the brain, if done correctly, provides a plethora of benefits and practical advantages to neuroscience research, from better documented and organized data to the development of new medical interventions for Alzheimer’s disease. Essentially, we would be able to weed out the constraints imposed by a biological entity, such as the difficulty of observing and recording parts of the brain, while preserving the physiological processes.
Brains who tried to create a brain
Over the years, multiple groups have attempted to simulate the brain. However, they all took different stances on the basic properties they wanted their neurons to have. The first large-scale brain simulation was done at The Neuroscience Institute and published in 20083. It was a 100-million-neuron simulation comprising multi-compartmental neurons. A few years later, other groups published various simulations, including the University of Waterloo, the IBM research institute and the Swiss Federal Institute of Technology in Lausanne (EPFL). Spaun, the model created by the Canadian group, emphasized functional imitation of neuronal pathways. It used 2.5 million low-complexity (non-multi-compartmental) neurons to represent different subsystems of the brain, such as V1, the thalamus and AIT4. On the other hand, IBM’s $102.6 million5 Compass is a cortical model that uses simple neurons with properties like spontaneous spiking and spike-timing-dependent plasticity (STDP)6. Compass does not include detailed compartmentalized neurons, varied spatial morphology or ionic characterization, and was therefore able to scale up to 500 billion neurons without overloading the system. Finally, the most publicized model is the Human Brain Project (HBP)’s Blue Brain model by EPFL. The largest-scale model produced by the HBP was based on a 100,000-neuron cortical column from the somatosensory cortex of a 3-week-old rat. The Blue Brain placed heavy emphasis on the complexity of its neurons, incorporating multiple compartments, ionic properties, various morphological types, and ion channel and receptor expression levels from single-cell RNA-Seq analysis. The director of the Blue Brain, Henry Markram, even wrote an open letter criticizing Compass for its lack of neuronal detail at the cellular level7.
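To make the "low-complexity" end of this spectrum concrete, below is a minimal sketch of a generic leaky integrate-and-fire point neuron, the kind of single-compartment abstraction that Spaun- and Compass-style simulations scale up by the millions. It is an illustrative textbook model with arbitrary parameters, not the actual neuron model used by any of these projects; the multi-compartmental cells of the Blue Brain add spatial morphology and ion-channel kinetics on top of something like this.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) point neuron: the whole cell is one
# electrical compartment, with no morphology or ion-channel detail.
# Parameters are generic textbook values, not those used by Spaun or Compass.
def simulate_lif(input_current_nA, dt_ms=0.1, tau_ms=20.0, r_mohm=10.0,
                 v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
    """Return the membrane-voltage trace (mV) and spike times (ms)."""
    v = v_rest
    voltages, spike_times = [], []
    for step, i_in in enumerate(input_current_nA):
        # Leaky integration: dV/dt = (-(V - V_rest) + R*I) / tau
        v += dt_ms * (-(v - v_rest) + r_mohm * i_in) / tau_ms
        if v >= v_thresh:                  # threshold crossing -> emit a spike
            spike_times.append(step * dt_ms)
            v = v_reset                    # instantaneous reset, no biophysics
        voltages.append(v)
    return np.array(voltages), spike_times

# 500 ms of constant 2 nA input produces regular spiking.
volts, spikes = simulate_lif(np.full(5000, 2.0))
print(f"{len(spikes)} spikes in 500 ms")
```

Even at this level of abstraction, simulating billions of such units is non-trivial; the dispute described below is essentially about how much more biological detail each unit needs before the simulation tells us anything about a real brain.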
To add to the drama surrounding the HBP, from 2014 onward some 800 European neuroscientists signed an open letter to the European Commission, which awarded one billion euros (US$1.1 billion) to the project, voicing their concerns. These scientists argued that the project’s approach is “overly narrow” and carries a “significant risk” that it will not deliver what was promised8. When such a large amount of resources and personnel is involved, the project deserves serious scrutiny. Is it delivering as promised, or was it all just hype?
Merits and critiques of the Blue Brain
As the HBP started in 2005 and is expected to last until 2023, we are now approaching the end of the project, so it is sensible to examine how much value for money the HBP offers. The one-billion-euro project was centered around the nine goals put forward by Markram in a Nature Reviews paper9:
- Complete the puzzle: to study the brain in the context of a big picture
- Define functions of ion channels, receptors, neurons and synaptic pathways
- Understand the complexity of the brain
- Explore roles of dendrites
- Discover functional diversity of a single model
- Understand memory storage and retrieval
- Track the emergence of intelligence
- Simulate diseases and develop treatments
- Figure out the circuit design for other industrial applications
Since, to date, the largest-scale simulation run and published by the HBP is the somatosensory cortical column10, we shall do a reality check based on its results. The main findings from the simulation are the reproduction of synchronous and asynchronous states and of neuronal responses to thalamic activation and single-whisker deflection; a demonstration that anti-correlated inhibition can cancel out highly correlated activity, explaining uncorrelated neuronal spiking; confirmation of the temporal pattern in cortical layer 5 (L5); and replication of neuronal spiking patterns in a population. As we can see, the main outputs are replications and reproductions of previous studies. Thus, the model does not really serve the purposes of simulation in general: no testing of heuristic models has been done, no prediction of any sort has been established, and there has been no updating or generation of new hypotheses. The one contribution was the verification of an assumption, namely the explanation of uncorrelated neuronal spiking11, which is only the tip of the iceberg among the assumptions we want to verify in neuroscience. Coming back to Markram’s bucket list, one cannot really say that this model checked any of the items off the list.
However, despite the model failing to achieve what the HBP expected of it, the group has continued to publish papers, most notably the Blue Brain Atlas. From 2005 to the present, they have published 64 papers in academic journals (five of them in Nature Reviews Neuroscience), with an average impact factor of 7.7812.
Challenges in whole brain simulation
It is not just the HBP that has to address the constraints of modeling the brain; any simulation project at present faces the following difficulties.
The first and most important issue is that our current knowledge is not even a drop in the ocean. Even the authors of the most fundamental description of the ionic properties of neurons, the Hodgkin-Huxley-Katz equations, stated that “The agreement must not be taken as evidence that our equations are anything more than empirical description…”11. It is not just the unknowns that are of concern; even the known properties of neurons are constantly challenged and should not be taken as solid ground on which to build a model.
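For context, the empirical description in question is the well-known Hodgkin-Huxley membrane equation, given here in its standard textbook form rather than copied from the 1952 paper; the maximal conductances $\bar{g}$, reversal potentials $E$, and gating variables $m$, $h$, $n$ follow the conventional notation:

$$C_m \frac{dV}{dt} = -\bar{g}_{\mathrm{Na}}\, m^3 h\,(V - E_{\mathrm{Na}}) - \bar{g}_{\mathrm{K}}\, n^4\,(V - E_{\mathrm{K}}) - \bar{g}_{L}\,(V - E_{L}) + I_{\mathrm{ext}}$$

$$\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x, \qquad x \in \{m, h, n\}$$

Every term here, from the maximal conductances to the gating rate functions $\alpha_x$ and $\beta_x$, was fitted to squid-axon measurements; the quote above is the authors’ own warning that a good fit is not the same thing as a mechanism.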
The data collection approach in the Blue column is fundamentally imperfect and severely crippled by this lack of information. They simply used a “bottom-up” approach of reverse engineering, i.e. constructing the model from individual cellular functions, though they claimed that the model can be “refined and challenged with additional markers”10 from a “top-down” perspective, i.e. using functional or behavioral data to generate the model. To date, we lack most of the top-down information about the brain; the closest we have come to a top-down model is of the visual system, certainly not of the whole brain. The lack of an informative model construction also incurs an “interviewer bias”: if we are not sure what we are looking for while measuring and collecting data, will we miss information that we are not aware of?
Perhaps the most important implication of imperfect information is famously summarized by Richard Feynman: “What I cannot create, I do not understand.” We cannot create something we do not understand, and we certainly will not understand the brain better through a model whose construction we largely do not understand. If we use a bottom-up approach like the Blue Brain’s, are we not just making a digital copy of the brain? It is like copying a text in a foreign language one does not know onto paper: one would still not know what the text is about. If we do not understand how the biological brain performs its tasks, there is little contribution in reproducing a digital version of it.
Computing power is another major limiting factor in modeling. As Henry Markram put it, if we wanted to model the whole human brain just by the number of neurons, computing power would have to increase by at least 100 million times9. This would require a quantum leap in computer development; until then, we must compromise on either the number of cells or the fidelity of the neurons.
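As a rough illustration of the scale gap, the back-of-envelope sketch below compares the column-scale model with a whole brain. The figures used (about 86 billion neurons and on the order of 10,000 synapses per neuron) are commonly cited estimates and assumptions for this exercise, not the basis of Markram’s own number.

```python
# Back-of-envelope: how far is a cortical-column model from a whole human brain?
# All figures are order-of-magnitude assumptions for illustration only.
COLUMN_NEURONS = 1e5          # size of the cortical-column model discussed above
BRAIN_NEURONS = 8.6e10        # commonly cited estimate for the human brain
SYNAPSES_PER_NEURON = 1e4     # commonly cited average connectivity

neuron_scale_up = BRAIN_NEURONS / COLUMN_NEURONS      # ~8.6e5, neurons alone
total_synapses = BRAIN_NEURONS * SYNAPSES_PER_NEURON  # ~8.6e14 connections

print(f"Neuron-count scale-up over the column: ~{neuron_scale_up:,.0f}x")
print(f"Synapses a whole-brain model must track: ~{total_synapses:.1e}")
```

Neuron count alone already demands a roughly million-fold scale-up; once the cost of tracking every synapse, detailed morphology and fine time steps is added, a factor of the order Markram quotes becomes plausible, and nothing close to it exists today.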
Due to this giant gap in knowledge, a lot of information about the brain is either omitted or impossible to reproduce, so most of the models created have to be severely reduced. The question is, to what extent is such a reduction acceptable? If neurons are reduced to the Spaun level, one would probably criticize the model as Markram did and, more importantly, lose the morphological relevance to electrophysiological properties, which matters in models of Alzheimer’s disease12. On the other hand, if the neurons are too complex, they overload the computer and make it impossible to generate a model with as many neurons as the targeted 100 billion cells. There is no consensus in the scientific community on the compromises that must be made, which leads to the next point.
How do we measure the performance of a model? The Blue column failed to meet Eliasmith’s expectation2 that a model should tie immediately to behavior.
Perhaps the most difficult things to model are molecular and non-neuronal inputs. Neurons are not the majority in the brain; glial cells are, making up 33-66% of brain mass13. Their influence on neuronal behavior must therefore not be underestimated, yet it sadly is in all the large-scale brain simulation models. Though the HBP has been researching the modeling of glial cells14, it is still some distance from incorporating them into the column model, as most glial cell functions remain largely unknown13. In addition, in the context of disease simulation, the molecular interactions between the immune system and the nervous system remain largely unknown. Since neuroinflammation is a critical determinant of neurological diseases15, to model a disease and investigate treatment efficacy one must understand what to input to the system, i.e. the research output of an entirely different field of study, in order to obtain a valid model. The same can be argued for the gut-brain axis (GBA), which is as daunting as attempting to characterize glial inputs to the system, because there are still many unknowns in the field of GBA research. Therefore, summarizing and modeling the effects of these systems, which are currently under-characterized, would first require a reasonably saturated investigation in the aforementioned fields. It appears that it will be years before that can be achieved.
Now is not the time to simulate a brain
The takeaway message is that, as much as the community wants to actualize the vision of the HBP of testing a complex system while saving the time, cost and risks of real situations, our technology and knowledge have not ripened enough to enable us to model the whole human brain. If we were more patient and continued research in neuroscience, neuroinflammation, the GBA, computer engineering and so on, we would be on better ground to create a brain model. Perhaps in years to come there will be far superior investigative tools16 for neurons that increase accuracy and reduce error rates when collecting data for the models. It is not that we should slash funding for the HBP right away, but that we must ask whether it is wise to regard the HBP as superior to other neuroscience research, or to other fields whose results are essential prerequisites for building a model.
References
1. Dudai, Y. & Evers, K. To Simulate or Not to Simulate: What Are the Questions? Neuron 84, 254–261 (2014).
2. Eliasmith, C. & Trujillo, O. The use and abuse of large-scale brain models. Curr. Opin. Neurobiol. 25, 1–6 (2014).
3. Izhikevich, E. M. & Edelman, G. M. Large-scale model of mammalian thalamocortical systems. Proc. Natl. Acad. Sci. U. S. A. 105, 3593–3598 (2008).
4. Eliasmith, C. et al. A Large-Scale Model of the Functioning Brain. Science 338, 1202–1205 (2012).
5. Barrat, J. Our Final Invention: Artificial Intelligence and the End of the Human Era. (Macmillan, 2013).
6. Preissl, R. et al. Compass: A scalable simulator for an architecture for cognitive computing. in SC ’12: Proceedings of the International Conference on High Performance Computing, Networking, Storage and Analysis 1–11 (2012). doi:10.1109/SC.2012.34
7. Adee, S. Cat Fight Brews Over Cat Brain. IEEE Spectrum: Technology, Engineering, and Science News (2009). Available at: https://spectrum.ieee.org/tech-talk/semiconductors/devices/blue-brain-project-leader-angry-about-cat-brain. (Accessed: 4th April 2019)
8. Anonymous. Open message to the European Commission concerning the Human Brain Project. (2014). Available at: https://www.neurofuture.eu/. (Accessed: 4th April 2019)
9. Markram, H. The Blue Brain Project. Nat. Rev. Neurosci. 7, 153–160 (2006).
10. Markram, H. et al. Reconstruction and Simulation of Neocortical Microcircuitry. Cell 163, 456–492 (2015).
11. Hodgkin, A. L. & Huxley, A. F. A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 117, 500–544 (1952).
12. Perl, D. P. Neuropathology of Alzheimer’s Disease. Mt. Sinai J. Med. N. Y. 77, 32–42 (2010).
13. Jäkel, S. & Dimou, L. Glial Cells and Their Function in the Adult Brain: A Journey through the History of Their Ablation. Front. Cell. Neurosci. 11, (2017).
14. Jolivet, R., Coggan, J. S., Allaman, I. & Magistretti, P. J. Multi-timescale Modeling of Activity-Dependent Metabolic Coupling in the Neuron-Glia-Vasculature Ensemble. PLOS Comput. Biol. 11, e1004036 (2015).
15. Ransohoff, R. M., Schafer, D., Vincent, A., Blachère, N. E. & Bar-Or, A. Neuroinflammation: Ways in Which the Immune System Affects the Brain. Neurotherapeutics 12, 896–909 (2015).
16. Fois, C., Prouvot, P.-H. & Stroh, A. A Roadmap to Applying Optogenetics in Neuroscience. in Photoswitching Proteins: Methods and Protocols (ed. Cambridge, S.) 129–147 (Springer New York, 2014). doi:10.1007/978-1-4939-0470-9_9