Scientists Discover Real-Life Superpower Hidden Deep in Our Brains – Harness Your HEBBIAN POWER

THE superhuman could be only a matter of generations away, as scientists have discovered evidence that the human brain is evolving far faster than natural selection alone would normally dictate.

Researchers think our brains are using so-called Hebbian networks hidden deep inside our grey matter to create super-fast, super-powerful short-cuts – which massively increase our power to think and process data.

In essence, our brains are working like artificial intelligence neural networks – and passing the improved networks on to our offspring.

Charles Darwin’s theory of the survival of the fittest and the process of natural selection have long been accepted in scientific circles: genes are passed down from parent to offspring, and those genes that help the host survive get passed on again over the years.

But on top of that basic picture, these genes also work together in what are known as “gene networks”, which can also be passed down through the generations.

Now the latest research has built on that, looking at the way natural selection acts on those networks.

Superhuman: Brain function is growing far faster than simple natural selection would dictate

Computer scientist Dr Richard Watson, Associate Professor at the University of Southampton, believes natural selection not only acts as a kind of genetic filter, weeding out undesirable traits, but also allows these gene networks to “learn” what works and what does not, and so improve their performance over time.

The process mimics the way artificial neural networks used by computer scientists can learn to solve problems.

Dr Watson said: “The fact that organisms have gene networks and they are inherited from one generation to the next, that’s not new information.

Learning curve: Donald Hebb father of Hebbian learning neural network theory

“Gene networks evolve like neural networks learn. That’s the thing that’s novel.”

Artificial neural networks can be used for a wide variety of tasks, such as recognising human faces or analysing team performance in football matches to see how particular tactics worked out.

These systems can take an input, such as the word ‘hello’, and learn to replicate it over time.

Like a child, a neural network cannot make the connection instantly, but rather must be trained over time.

That training is complicated, but in essence it involves changing the strengths of the connections between the virtual “neurons”.

Each adjustment improves the result, until the whole network can reliably output the desired answer: in our example, that the funny symbols on the page (“hello”) equal the word ‘hello’.

Now the computer ‘knows’ what you have written.
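The training described above can be sketched in a few lines of code. This is a minimal illustration of our own (not from the study): a single layer of virtual “neurons” whose connection strengths are nudged after each example until the output matches the target.

```python
# A toy single-layer network trained with a simple error-correction rule:
# each weight (connection strength) is nudged in proportion to the error.

def train(inputs, targets, weights, lr=0.1, epochs=100):
    """Repeatedly adjust connection strengths until outputs match targets."""
    for _ in range(epochs):
        for x, t in zip(inputs, targets):
            y = sum(w * xi for w, xi in zip(weights, x))  # weighted sum
            error = t - y
            # strengthen or weaken each connection based on its contribution
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
    return weights

# The network learns that the output should track the first input only
inputs = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
targets = [1.0, 0.0, 1.0]
weights = train(inputs, targets, weights=[0.0, 0.0])
print([round(w, 2) for w in weights])  # converges close to [1.0, 0.0]
```

Like the child in the analogy, the network gets nothing right at first; only repeated small corrections to the connection strengths produce reliable answers.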

Dr Watson believes this is how species evolve: by learning what works.

However, it would also appear that there are different ways for these neural networks to learn.

Dr Watson has focused his attention on so-called “Hebbian learning”.

In this system, the connections between adjacent neurons that have similar outputs are strengthened over time. In short: “neurons that fire together, wire together”. The network “learns” by creating strong links within itself.
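The “fire together, wire together” rule is simple enough to write down directly. Here is a minimal sketch (our illustration, not the study’s model): the weight between two neurons grows in proportion to their joint activity.

```python
# Hebbian learning rule: when two connected neurons are active at the
# same time, the connection between them is strengthened.

def hebbian_update(w, pre, post, lr=0.1):
    """Strengthen the connection in proportion to joint activity."""
    return w + lr * pre * post

w = 0.0
# Present correlated activity: both neurons fire on every step
for _ in range(10):
    w = hebbian_update(w, pre=1.0, post=1.0)
print(round(w, 6))  # the link has grown to 1.0 through repeated co-firing
```

Note that unlike the error-correction training above, Hebbian learning needs no “teacher”: the connections strengthen purely from correlated activity.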

“Scientists believe humans could learn to evolve over time”

A particular advantage of Hebbian learning is that the networks can develop “modular” features. For instance, one group of genes might define whether or not an animal has hind legs, or eyes, or fingers.

Dr Watson added: “If there is an individual that has a slightly stronger regulatory connection between those genes than some other individual does, then they’ll be preferred.

“They’ll be favoured by natural selection. That means over evolutionary time, the strength of the connections between those genes will be increased.”
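Dr Watson’s point can be illustrated with a toy simulation (our own assumption about the setup, not the study’s actual model): if selection favours individuals with stronger regulatory connections, the average connection strength in the population rises generation after generation.

```python
import random

# Toy model: each individual carries one regulatory "connection strength".
# Selection keeps the stronger half each generation; survivors reproduce
# with small random mutations, so the population mean drifts upward.

def evolve(pop, generations=20, mutation=0.05):
    for _ in range(generations):
        pop.sort(reverse=True)            # strongest connections first
        survivors = pop[: len(pop) // 2]  # selection keeps the top half
        # each survivor leaves two slightly mutated offspring
        pop = [max(0.0, s + random.gauss(0, mutation))
               for s in survivors for _ in (0, 1)]
    return pop

random.seed(0)
pop = [random.random() for _ in range(100)]
start = sum(pop) / len(pop)
pop = evolve(pop)
end = sum(pop) / len(pop)
print(end > start)  # selection has driven mean connection strength upward
```

This is the sense in which, over evolutionary time, “the strength of the connections between those genes will be increased”.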

Express UK
