The Blue Brain

The human brain is the most valuable creation of God. Man is called intelligent because of the brain. The brain translates the information delivered by nerve impulses, which then enables the person to react. But we lose the knowledge held in a brain when the body is destroyed after death. Today, scientists are researching how to create an artificial brain that can think, respond, take decisions, and keep anything in memory. The main aim is to upload a human brain into a machine, so that a man can think and take decisions without any effort, and so that after the death of the body the virtual brain can act in the man's place. Even after the death of a person, we would then not lose the knowledge, intelligence, personality, feelings, and memories of that man, which can be used for the development of human society. No one has ever fully understood the complexity of the human brain; it is more complex than any circuitry in the world. As we know, technology is growing faster than ever before. IBM is now researching how to create a virtual brain, called the “Blue Brain”. This would be the first virtual brain in the world. It would include cognitive functions such as language, learning, perception, and memory, in addition to brain malfunctions such as psychiatric disorders like depression and autism. From there, the modelling will expand to other regions of the brain and, if successful, shed light on the relationships between the genetic, molecular, and cognitive functions of the brain.


1. Introduction

The Blue Brain Project aims at a quantitative simulation of the intrinsic connectivity of the cerebral cortex, with functional characterization of the neurons and microcircuits at an unprecedented scale. Different classes of cortical neurons are identified by their morphological properties, by their electrical properties in synaptically activating other neurons, and by their pharmacological and gene expression profiles. This research program leads to a comprehensive description of the wiring of a column of the cortex, which is being extended to neighbouring columns and ultimately to the whole brain. Dr. Markram and his colleagues make a persuasive case that this approach is critically important for understanding the neural basis of cortical function, as well as the neural basis of neurological and psychiatric disorders.

The findings are based on research and data from cortical slices. These findings are entered into databases which are linked to supercomputers which enable the information to be retrieved for quantitative analysis and visualization. The research requires cutting-edge supercomputing facilities to capture quantitatively the connectivity and physiological properties.

This large-scale and detailed approach to computational models of the cortex is in turn influencing concepts underlying the development of the next generation of high performance computing.

In summary, the research in the BBP, unique in its scope, is redefining the science and technology of how to simulate the neural basis of brain function. The work in Switzerland is closely linked to complementary research in other laboratories throughout Europe, and is the central node of the Human Brain Project. The opportunities for development of this work in the next five years are great. Based on this evidence, we therefore recommend with high enthusiasm the Blue Brain Project for an increased level of funding that will enable it to develop considerably over the next five years.

It is important that this funding includes long-term posts for two senior scientific project leaders as the program develops and enlarges; provision for the next generation of supercomputing facilities; and provision of support infrastructure, including staff for software engineering. It would also be desirable to recruit staff to facilitate international collaboration and developments with industry, which are likely to be important to the pharmaceutical industry in developing treatments for brain disorders, including psychiatric disorders.

2. WHAT IS BLUE BRAIN?

The general overview makes it clear what Blue Brain is. Blue Brain is a concept that allows all the contents of a human brain to be copied or transferred into a virtual brain that resides inside a supercomputer. The supercomputer used for this is Blue Gene, as of the information currently revealed. It is like uploading a mind into a computer. Mind uploading can probably be achieved by either of two methods: 1. copy and transfer, or 2. slow and steady replacement of neurons. In the first method, mind uploading would be achieved by scanning, comparing, and contrasting the salient features of a normal biological brain, and then copying, moving, and saving that information into a computer system or other computing machine. The simulated mind can then reside in a computer that is inside a humanoid robot or a biological body.

Figure 1. Cortical mesocircuit simulation

The Blue Brain Project is an attempt to reverse-engineer the human brain and recreate it at the cellular level inside a computer simulation. The project was founded in May 2005 by Henry Markram at the EPFL in Lausanne, Switzerland. The goals of the project are to gain a complete understanding of the brain and to enable better and faster development of treatments for brain diseases.

3. STEPS TO BUILDING A BLUE BRAIN

3.1. Data collection:

It involves collecting slices of brain tissue, examining them under a microscope, and gauging the shape and electrical behaviour of individual neurons. This method of studying and cataloguing neurons is well established and used worldwide. The neurons are captured by their shape, their electrical and physiological activity, their site within the cerebral cortex, and their population density. These observations are translated into precise algorithms which describe the process, function, and positioning of neurons.

Then, the algorithms are used to generate biologically realistic virtual neurons ready for simulation.
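To make the idea of “capturing” a neuron concrete, here is a minimal sketch in Python of what one catalogued neuron record might look like. The field names and values are hypothetical illustrations, not the BBP's actual database schema:

    from dataclasses import dataclass, field

    @dataclass
    class NeuronRecord:
        """One catalogued neuron (hypothetical schema, for illustration only)."""
        morphology_type: str       # e.g. "pyramidal" or "basket"
        cortical_layer: int        # site within the cerebral cortex (layers 1-6)
        electrical_type: str       # firing behaviour, e.g. "regular spiking"
        density_per_mm3: float     # population density of this cell type
        channel_mix: dict = field(default_factory=dict)  # ion channel conductances

    # Example record with placeholder numbers:
    cell = NeuronRecord("pyramidal", 5, "regular spiking", 14000.0,
                        {"Na": 120.0, "K": 36.0, "leak": 0.3})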

Figure 2. NEURON cell builder window

The primary software used by the BBP for neural simulations is a package called NEURON. This was developed starting in the 1990s by Michael Hines at Yale University and John Moore at Duke University. It is written in C, C++, and FORTRAN. The software continues to be under active development and, as of July 2012, is at version 7.2. It is free and open-source software; both the code and the binaries are freely available on its website. Michael Hines and the BBP team collaborated in 2005 to port the package to the massively parallel Blue Gene supercomputer.
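As a flavour of how NEURON is driven, here is a minimal single-compartment example using its Python interface (the Python bindings ship with modern NEURON releases; treat this as a sketch, since details may vary between versions):

    from neuron import h
    h.load_file("stdrun.hoc")            # load NEURON's standard run system

    soma = h.Section(name="soma")        # a single cylindrical compartment
    soma.L = soma.diam = 20              # length and diameter in microns
    soma.insert("hh")                    # built-in Hodgkin-Huxley channels

    stim = h.IClamp(soma(0.5))           # current injection at the midpoint
    stim.delay, stim.dur, stim.amp = 5, 50, 0.1   # ms, ms, nA

    t = h.Vector().record(h._ref_t)            # record time
    v = h.Vector().record(soma(0.5)._ref_v)    # record membrane voltage

    h.finitialize(-65)                   # set the resting potential (mV)
    h.continuerun(60)                    # run 60 ms; v now holds the spike train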

Figure. Patch-clamp setup at the Blue Brain lab


Figure 3. 3D neuron morphology reconstruction

Around 200 different types of ion channel are found in the cell membranes of cortical neurons. Different types of neuron have different mixes of channels, and this contributes to differences in their electrical behaviour. The genes for these channels are cloned at the lab, overexpressed in cultured cells, and their electrical behaviour recorded. Over 270 genes are known to be associated with voltage-gated ion channels in the rat.
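The classical description of how such channels shape a neuron's electrical behaviour is the Hodgkin-Huxley model, which NEURON implements internally. Below is a bare-bones Python sketch integrating the Hodgkin-Huxley equations for a patch of membrane with forward Euler, using standard textbook squid-axon parameters rather than the BBP's detailed cortical channel models:

    import math

    # Standard Hodgkin-Huxley constants: conductances in mS/cm^2,
    # reversal potentials in mV, capacitance in uF/cm^2.
    gNa, gK, gL = 120.0, 36.0, 0.3
    ENa, EK, EL = 50.0, -77.0, -54.4
    C = 1.0

    # Voltage-dependent opening/closing rates of the m, h, n gates (1/ms).
    def a_m(V): return 0.1 * (V + 40) / (1 - math.exp(-(V + 40) / 10))
    def b_m(V): return 4.0 * math.exp(-(V + 65) / 18)
    def a_h(V): return 0.07 * math.exp(-(V + 65) / 20)
    def b_h(V): return 1.0 / (1 + math.exp(-(V + 35) / 10))
    def a_n(V): return 0.01 * (V + 55) / (1 - math.exp(-(V + 55) / 10))
    def b_n(V): return 0.125 * math.exp(-(V + 65) / 80)

    V, m, h, n = -65.0, 0.05, 0.6, 0.32    # approximate resting state
    dt, I = 0.01, 10.0                     # time step (ms), injected uA/cm^2
    for _ in range(int(50 / dt)):          # simulate 50 ms
        I_ion = (gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK)
                 + gL * (V - EL))
        V += dt * (I - I_ion) / C
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
    # V now traces repetitive spiking under the constant injected current I.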

3.2. Data simulation:

It concerns two major aspects:

a. Simulation speed

b. Simulation workflow

3.2.1 Simulation speed

Simulations of one cortical column (more than 10,000 neurons) run hundreds of times slower than real time: it takes about five minutes to complete one second of simulated time. The simulations display approximately linear scaling. Presently the major goal is biological soundness rather than performance. Once the biologically significant factors for a given effect are understood, it may be feasible to trim the components that do not contribute to it, in order to improve performance.
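Given the roughly linear scaling just mentioned, a back-of-envelope wall-clock estimate can be sketched as follows (assuming the ~10,000-neuron column and the five-minutes-per-simulated-second figure quoted above, and that scaling remains linear):

    def wallclock_seconds(n_neurons, simulated_seconds,
                          base_neurons=10_000, slowdown=300):
        """Estimated run time in seconds, assuming linear scaling in size."""
        return simulated_seconds * slowdown * (n_neurons / base_neurons)

    # One second of activity in one column: 300 s of wall-clock time (5 min).
    print(wallclock_seconds(10_000, 1))   # 300.0
    # Doubling the network roughly doubles the run time: ~10 minutes.
    print(wallclock_seconds(20_000, 1))   # 600.0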

3.2.2 Simulation workflow

Making virtual cells, using the algorithms written to define and describe real neurons, is the main aim of this step. The algorithms and constraints are adapted according to the age, species, and disease stage of the animal being simulated. Every single protein is simulated.

Note: there are hundreds of millions of proteins in one cell.

a. First a network skeleton is built from all the different kinds of synthesized neurons.

b. After this, the cells are joined according to the experimentally found rules.

c. Finally, the neurons are functionalized and the simulation is brought to life. The patterns of emergent behaviour are watched with visualization software. (A toy code sketch of these three steps follows.)
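Here is a toy Python sketch of that three-step workflow. All of the names are hypothetical illustrations rather than the BBP codebase, and the 10% connection probability is an arbitrary placeholder:

    import random

    def synthesize_neuron(kind):
        # a. create a virtual cell of a given morphological/electrical type
        return {"kind": kind, "inputs": []}

    def connect(cells, probability):
        # b. join the cells according to (here: toy) connection rules
        for pre in cells:
            for post in cells:
                if pre is not post and random.random() < probability(pre, post):
                    post["inputs"].append(pre)

    def run(cells, steps):
        # c. functionalize the network and step the simulation forward
        for _ in range(steps):
            for cell in cells:
                pass  # integrate each cell's membrane equations here

    cells = [synthesize_neuron(k) for k in ["pyramidal"] * 8 + ["basket"] * 2]
    connect(cells, probability=lambda pre, post: 0.1)
    run(cells, steps=100)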

3.2.2.1 BBP-SDK

The Blue Brain Project Software Development Kit (BBP-SDK), a set of Application Programming Interfaces (APIs), allows researchers to use and inspect models and simulations. The BBP-SDK is a C++ library wrapped in Java and Python.


3.3 Visualization of results

The results can be visualized using RTNeuron technology.

Figure 4. RTNeuron visualization of a neuron

3.3.1 RTNeuron

RTNeuron is the primary application used by the BBP for visualisation of neural simulations. The software was developed internally by the BBP team. It is written in C++ and OpenGL. RTNeuron is ad-hoc software written specifically for neural simulations, i.e. it is not generalisable to other types of simulation. RTNeuron takes the output of Hodgkin-Huxley simulations in NEURON and renders it in 3D. This allows researchers to watch as action potentials propagate through a neuron and between neurons. The animations can be paused, stopped, started, and zoomed, letting researchers interact with the model. The visualizations are multi-scale: they can render individual neurons or a whole cortical column.
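RTNeuron itself is internal to the project, but as a stand-in, here is a minimal sketch of visualizing NEURON output with matplotlib, reusing the single-compartment example from section 3.1 (a simple 2D trace, nothing like RTNeuron's interactive 3D rendering):

    import matplotlib.pyplot as plt
    from neuron import h
    h.load_file("stdrun.hoc")

    soma = h.Section(name="soma")
    soma.L = soma.diam = 20
    soma.insert("hh")                        # Hodgkin-Huxley membrane
    stim = h.IClamp(soma(0.5))
    stim.delay, stim.dur, stim.amp = 5, 50, 0.1

    t = h.Vector().record(h._ref_t)
    v = h.Vector().record(soma(0.5)._ref_v)
    h.finitialize(-65)
    h.continuerun(60)

    plt.plot(t, v)                           # render the recorded voltage trace
    plt.xlabel("time (ms)")
    plt.ylabel("membrane potential (mV)")
    plt.show()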

4. HARDWARE AND SOFTWARE REQUIRED FOR BLUE BRAIN TECHNOLOGY

4.1 Blue Gene

The primary machine used by the Blue Brain Project is a Blue Gene supercomputer built by IBM; this is where the name “Blue Brain” originates. IBM agreed in June 2005 to supply EPFL with a Blue Gene/L as a “technology demonstrator”. The IBM press release did not disclose the terms of the deal. In June 2010 this machine was upgraded to a Blue Gene/P. The machine is installed on the EPFL campus in Lausanne and is managed by CADMOS (Center for Advanced Modelling Science).

The computer is used by a number of different research groups, not exclusively by the Blue Brain Project. In mid-2012 the BBP was consuming about 20% of the compute time. The brain simulations generally run all day, and one day per week (usually Thursday). The supercomputer's usage statistics and job history are publicly available online; look for the jobs labelled “C-BPP”.

Blue Gene/P technical specifications:

  • 4,096 quad-core nodes (16,384 cores in total)
  • Each core is a PowerPC 450, 850 MHz
  • Total: 56 teraflops, 16 terabytes of memory
  • 4 racks, one row, wired as a 16x16x16 3D torus
  • 1 PB of disk space, GPFS parallel file system
  • Operating system: Linux SuSE SLES 10
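As a quick sanity check on the figures above (assuming the commonly quoted Blue Gene/P value of 4 floating-point operations per core per cycle, from the PowerPC 450's two fused multiply-add pipes):

    nodes, cores_per_node = 4096, 4
    cores = nodes * cores_per_node               # 16,384 cores in total
    clock_hz = 850e6                             # 850 MHz per core
    flops_per_cycle = 4                          # assumed: 2 FMA units x 2 ops
    peak_flops = cores * clock_hz * flops_per_cycle
    print(f"{cores} cores, peak = {peak_flops / 1e12:.1f} teraflops")
    # -> 16384 cores, peak = 55.7 teraflops, i.e. the quoted ~56 teraflops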

4.2 DEEP – DYNAMICAL EXASCALE ENTRY PLATFORM

DEEP (Dynamical Exascale Entry Platform) is an exascale supercomputer to be built at the Jülich Research Center in Germany. The project started in December 2011 and is funded by the European Union's 7th Framework Programme. The three-year prototype phase of the project has received €8.5 million. The hope is to build a prototype supercomputer performing at 100 petaflops by the end of 2014. The Blue Brain Project simulations will be ported to the DEEP prototype to help test the system's performance. If successful, a future exascale version of this machine could provide the 1 exaflop of performance required for a complete human brain simulation by the 2020s. The DEEP prototype will be built using Intel MIC (Many Integrated Cores) processors, each of which contains over 50 cores fabricated with a 22 nm process. These processors were codenamed Knights Corner during development and were subsequently rebranded as Xeon Phi in June 2012.

4.3 A supercomputer.

4.4 Memory with a very large storage capacity.

4.5 A processor with very high processing power.

5. APPLICATIONS OF BLUE BRAIN

5.1 Gathering and Testing 100 Years of Data

The most immediate benefit is to provide a working model into which the past 100 years' knowledge about the microstructure and workings of the neocortical column can be gathered and tested. The Blue Column will therefore also produce a virtual library to explore in 3D the microarchitecture of the neocortex and access all key research relating to its structure and function.

5.2 Cracking the Neural Code

The neural code refers to how the brain builds objects using electrical patterns. In the same way that the neuron is the elementary cell for computing in the brain, the neocortical column (NCC) is the elementary network for computing in the neocortex. Creating an accurate replica of the NCC, one which faithfully reproduces the emergent electrical dynamics of the real microcircuit, is an absolute requirement for revealing how the neocortex processes, stores, and retrieves information.

5.3 Understanding Neocortical Information Processing

The power of an accurate simulation lies in the predictions that can be generated about the neocortex. Indeed, iterations between simulations and experiments are essential to build an accurate copy of the NCC. These iterations are therefore expected to reveal the function of individual elements (neurons, synapses, ion channels, receptors), pathways (monosynaptic, disynaptic, multisynaptic loops), and physiological processes (functional properties, learning, reward, goal-oriented behaviour).

5.4 A Novel Tool for Drug Discovery for Brain Disorders

Understanding the functions of different elements and pathways of the NCC will provide a concrete foundation to explore the cellular and synaptic bases of a wide spectrum of neurological and psychiatric diseases. The impact of receptor, ion channel, cellular, and synaptic deficits could be tested in simulations, and the optimal experimental tests determined.

5.5 A Global Facility

A software replica of an NCC will allow researchers to explore hypotheses of brain function and dysfunction, accelerating research. Simulation runs could determine which parameters should be used and measured in the experiments. An advanced 2D, 3D, and immersive-3D visualization system will allow “imaging” of many aspects of neural dynamics during the processing, storage, and retrieval of information. Such imaging experiments may be impossible in reality, or may be prohibitively expensive to perform.

5.6 A Foundation for Whole Brain Simulations

With current and envisageable future computer technology it seems unlikely that a mammalian brain can be simulated with full cellular and synaptic complexity (above the molecular level). An accurate replica of an NCC is therefore required in order to generate reduced models that retain critical functions and computational capabilities, which can be duplicated and interconnected to form neocortical brain regions. Knowledge of the NCC architecture can be transferred to facilitate reconstruction of subcortical brain regions.

5.7 A Foundation for Molecular Modeling of Brain Function

An accurate cellular replica of the neocortical column will provide the first and essential step toward a gradual increase in model complexity, moving to a molecular-level description of the neocortex with biochemical pathways being simulated. A molecular-level model of the NCC will provide the substrate for interfacing gene expression with the network's structure and function. The NCC lies at the interface between the genes and complex cognitive functions. Establishing this link will allow predictions of the cognitive consequences of genetic disorders and allow reverse engineering of cognitive deficits to determine their genetic and molecular causes. This level of simulation will become a reality with the most advanced phase of Blue Gene development.

6. ADVANTAGES AND LIMITATIONS

6.1 Advantages

• We can remember things without any effort.

• Decisions can be made without the presence of the person.

• Even after a man's death, his intelligence can be used.

• The activity of different animals can be understood; that is, by interpreting the electrical impulses from an animal's brain, its thinking can be understood easily.

• It would allow the deaf to hear via direct nerve stimulation, and would also be helpful for many psychological diseases. By downloading the contents of the brain that were uploaded into the computer, a man could be freed from madness.

6.2 Limitations

Further, there are many new dangers these technologies will open up. We will be susceptible to new forms of harm:

• We would become dependent upon computer systems.

• Others may use technical knowledge against us.

• Computer viruses will pose an increasingly critical threat.

• The real threat, however, is the fear that people will have of new technologies.

• That fear may culminate in large-scale resistance. Clear evidence of this type of fear is found today with respect to human cloning.

7. FUTURE SCOPE

The synthesis era in neuroscience started with the launch of the Human Brain Project and is an inevitable phase triggered by a critical amount of fundamental data. The data set does not need to be complete before such a phase can begin. Indeed, it is essential to guide reductionist research into the deeper facets of brain structure and function. As a complement to experimental research, it offers rapid assessment of the probable effect of a new finding on pre-existing knowledge, which can no longer be managed completely by any one researcher. Detailed models will probably become the final form of databases that are used to organize all knowledge of the brain, allowing hypothesis testing, rapid diagnoses of brain malfunction, and development of treatments for neurological disorders.

In short, we can hope to learn a great deal about brain function and dysfunction from accurate models of the brain. The time taken to build detailed models of the brain depends on the level of detail that is captured. Indeed, the first version of the Blue Column, which has 10,000 neurons, has already been built and simulated; it is the refinement of the detailed properties and the calibration of the circuit that take time. A model of the entire brain at the cellular level will probably take the next decade. There is no fundamental obstacle to modelling the brain, and it is therefore likely that we will have detailed models of mammalian brains, including that of man, in the near future. Even if overestimated by a decade or two, this is still just a 'blink of an eye' in relation to the evolution of human civilization. As with Deep Blue, Blue Brain will allow us to challenge the foundations of our understanding of intelligence and generate new theories of consciousness.

8. CONCLUSION

In conclusion, we will be able to transfer ourselves into computers at some point. Most arguments against this outcome are seemingly easy to circumvent: they are either simple-minded, or simply require further time for technology to advance. The only serious threats raised can also be overcome as biological and digital technologies are combined.

The Blue Brain Project comprises three steps: 1. data collection, 2. data simulation, and 3. visualization of results. It can be used to gather and test a hundred years of data, to crack the neural code, and as a novel tool for the global discovery of drugs for diseases related to the brain. In short, we can hope to learn a great deal about brain function and dysfunction from accurate models of the brain; the time taken to build detailed models depends on the level of detail that is captured.
