Specialists from Google and Harvard University have recreated the structure of a 1 mm segment of the human cerebral cortex on a computer. This seemingly small fragment of the brain required as much as 1.4 petabytes of storage.
Last year, researchers at Google created an online database and visualization of the synapses (connections between neurons) in half of a fruit fly's brain. Now, together with specialists from Harvard University, they have released a similar computer model of a fragment of the human cerebral cortex.
The cerebral cortex is the area responsible for higher cognitive functions such as thinking, memory, planning, and speech. Studying it in detail can help researchers understand how the brain works and what goes wrong in its disorders, and may open the way to new methods of treatment.
The model covers only 1 cubic mm of tissue (about one millionth of the volume of the entire brain), yet it occupies no less than 1.4 petabytes of data.
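Scaling the figures quoted in the article gives a sense of the challenge. The sketch below is a back-of-envelope estimate only, using decimal byte prefixes; the 1.4 PB sample size, the millionth-of-a-brain volume ratio, and the 500x mouse-brain figure all come from the article itself.

```python
# Back-of-envelope scaling of the dataset sizes quoted in the article.
# Assumes data volume scales linearly with tissue volume, which is a
# rough simplification, and uses decimal prefixes (1 PB = 10**15 bytes).

PB = 10**15
sample_bytes = 1.4 * PB  # the published 1 mm^3 human cortex reconstruction

mouse_brain_bytes = 500 * sample_bytes        # ~500x the sample (per the article)
human_brain_bytes = 1_000_000 * sample_bytes  # ~one million x the sample volume

print(f"whole mouse brain: ~{mouse_brain_bytes / 10**18:.1f} exabytes")
print(f"whole human brain: ~{human_brain_bytes / 10**21:.1f} zettabytes")
```

Even under this crude linear assumption, a whole human brain at the same resolution lands in zettabyte territory, which is why the researchers cite noise removal and compression as prerequisites for larger models.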
It was created from over 200 million images of brain tissue taken with an electron microscope at a resolution of up to 4 nm. The lion's share of the microscopic image analysis was done by artificial intelligence, but because it could not cope with everything, the scientists had to do some of the work by hand.
The material for the study was provided by specialists from Massachusetts General Hospital in Boston, who, during the surgical treatment of epilepsy, sometimes remove small parts of the brain to reach deeper layers.
The three-dimensional simulation includes tens of thousands of neurons and 130 million synapses and covers all layers of the cerebral cortex.
According to its creators, it is the largest reconstruction of a brain fragment at this level of detail.
Researchers are already working on methods for creating larger models. One of the challenges is the colossal demand on computer memory: a simulation of a whole mouse brain, for example, would be 500 times larger, to say nothing of a human brain. Among other things, scientists are therefore developing new AI-assisted methods for removing noise from the data and compressing it. Since the models are meant to serve brain researchers, the enormous volume of information will also require new ways of accessing and using the data. (PAP)