Neurons to inspire future computers

BBC News | 23 July 2010 | 09:46 GMT

The way neurons communicate could inspire the next generation of computers. Researchers are developing novel computers by mimicking how neurons are built and how they talk to each other.

Basing computers on neurons could lead to improvements in visual and audio processing, meaning that future computers might learn to see or hear rather than simply rely on sensors. As well as building computers, the researchers are helping to improve understanding of nerve cells and how they operate.

Smarter seeing

While artificial neural networks have been around for more than 50 years, they typically do not copy real neurons very closely. By contrast, the project being co-ordinated by computer scientist Dr Thomas Wennekers of the University of Plymouth aims to model specific physiological features of the way that neurons in one part of the brain communicate.

"We want to learn from biology to build future computers," said Dr Wennekers. "The brain is much more complex than the neural networks that have been implemented so far."

The project's early work has involved collecting data about neurons and how they are connected in one part of the brain. The researchers are focusing on the laminar microcircuitry of the neocortex, which is involved in higher brain functions such as seeing and hearing.

Subtleties

The data gathered has fed highly detailed simulations of groups of nerve cells, as well as of microcircuits of neurons spread across larger-scale structures such as the visual cortex.

"We build pretty detailed models of the visual cortex and study specific properties of the microcircuits," he said. "We're working out which aspects are crucial for certain functional properties, like object or word recognition."

There are hopes that the work will produce more than just improved sensory networks, said Dr Wennekers. "It might lead to smart components that are intelligent," he said. "They may have added cognitive components such as memory and decision making." They might even, said Dr Wennekers, start to be endowed with emotion. "We'll be computing in a completely different way," he said.

Big brain

While Dr Wennekers and his team are working largely with software simulations, Professor Steve Furber of the University of Manchester is drawing inspiration from neurons to produce novel hardware.

Called Spinnaker, Prof Furber's project aims to create a computer specifically optimised to run the way biology does. Based around ARM chips, the Spinnaker system simulates in hardware the workings of relatively large numbers of neurons.

"We've got models of biological spiking neurons," said Prof Furber. "Neurons whose only communication with the rest of the world is that they go ping. When it goes ping, it lobs a packet into a small computer network."
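The "ping" Prof Furber describes can be illustrated with a toy model. The sketch below is not Spinnaker's actual code; it is a minimal leaky integrate-and-fire neuron (a standard simplified spiking-neuron model) whose only output is a spike event posted onto a queue, standing in for the packet lobbed into the on-chip network. All names and parameter values here are illustrative.

```python
from collections import deque

class SpikingNeuron:
    """Toy leaky integrate-and-fire neuron: it integrates input,
    leaks charge over time, and 'pings' when it crosses a threshold."""

    def __init__(self, neuron_id, threshold=1.0, leak=0.9):
        self.neuron_id = neuron_id
        self.threshold = threshold  # potential at which the neuron fires
        self.leak = leak            # fraction of potential kept each step
        self.potential = 0.0

    def step(self, input_current, network):
        # Integrate the incoming current on top of the leaked potential.
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            # The "ping": the spike packet is the neuron's only output.
            network.append(("spike", self.neuron_id))
            self.potential = 0.0  # reset after firing

network = deque()  # stands in for the packet-switched fabric
neuron = SpikingNeuron(neuron_id=7)
for current in [0.3, 0.3, 0.3, 0.3]:
    neuron.step(current, network)

print(list(network))  # spike packets delivered to the network
```

A real spiking system routes each packet to thousands of downstream neurons, which treat the arriving spike as their own input current; this toy version only shows the fire-and-forget event itself.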

Spinnaker uses ARM processors, each of which runs about 1,000 neuron models. The current system uses eight processors but, said Prof Furber, the team is in the final stages of designing a chip with 18 ARM processors on board, 16 of which will model neurons. The ultimate goal, he said, is a system of a million ARM processors controlling one billion neurons.

"The primary objective is just to understand what's happening in the biology," said Prof Furber. "Our understanding of processing in the brain is extremely thin."

The hope is also that the simulation will lead to innovative computer processing systems and insights into how large numbers of computational elements can be hooked up to each other. "The computer industry is faced with no future other than parallel," said Prof Furber.
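The figures quoted above fit together as a simple back-of-envelope calculation (the per-core neuron count is the article's approximate figure, so these are ballpark numbers, not a specification):

```python
# Back-of-envelope check of the Spinnaker scale described in the article.
neurons_per_core = 1_000          # "about 1,000 neuron models" per ARM core
modelling_cores_per_chip = 16     # 16 of the chip's 18 cores model neurons
target_cores = 1_000_000          # the stated goal: a million ARM processors

neurons_per_chip = neurons_per_core * modelling_cores_per_chip
target_neurons = neurons_per_core * target_cores

print(f"neurons per chip: {neurons_per_chip:,}")    # 16,000
print(f"target neurons:   {target_neurons:,}")      # 1,000,000,000
```

So a million cores at roughly a thousand neurons each gives the one billion neurons Prof Furber cites, with each planned 18-core chip contributing about 16,000 of them.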

Despite this, he said, industry understanding of how to get the most out of all those computational elements is lacking. The big problem is how to run such a system without being swamped by the management overhead of co-ordinating its processors.

Spinnaker might show a way to overcome some of these problems, as its individual elements will be far smaller than the monolithic processors in use now and will, to an extent, self-organise. They are also likely to use a lot less power than existing machines. "We think there's a change in the game there," said Prof Furber.

