In 1996, a young graduate student called Richard Watson sat down to read a paper on evolution. The article was provocative. It tackled a long-running problem in evolutionary biology: we do not fully understand how organisms can adapt so successfully to their environments.
Creatures do not seem to be merely at the mercy of random changes, or mutations, in their genes over time. Instead, they actually seem to "improve" their ability to adapt. It seemed this ability was not explained solely by the process of natural selection, in which the best traits are passed on by the most successful organisms.
So the paper's authors, Gunter Wagner at Yale University and Lee Altenberg at the Hawaii Institute of Geophysics and Planetology in Honolulu, decided to look for answers in a surprising place: computer science.
Watson, a computer scientist, was hooked. In the 20 years since he read that paper, he has been developing a theory based on the ideas it contained. His ideas could help explain why animals are so good at evolving: a trait called their "evolvability". What's more, they might even help to solve some long-standing puzzles in evolutionary biology.
Many people will be familiar with the idea that genes are passed from parent to offspring, and those genes that help their hosts survive and reproduce have a better chance of getting passed on. This is the essence of evolution and natural selection.
But there is more to it, because genes often work together. They form "gene networks", and those gene networks can also sometimes be passed intact down the generations.
The connections between genes can be strengthened or weakened as a species evolves
"The fact that organisms have gene networks and they are inherited from one generation to the next, that's not new information," says Watson, now at the University of Southampton in the UK. His contribution is largely to do with the way natural selection acts on those networks.
He believes it does not merely act as a filter, letting some adaptations through and blocking others. The cumulative effect of that filtering, he argues, allows gene networks in animals to actually "learn" over time what works and what does not. This way, they can improve their performance – in much the same way that the artificial neural networks used by computer scientists can "learn" to solve problems.
"Gene networks evolve like neural networks learn," he says. "That's the thing that's novel."
Watson's basis for this claim is the idea that the connections between genes can be strengthened or weakened as a species evolves and changes – and it is the strength of those connections in gene networks that allows organisms to adapt.
This process is similar to the way artificial neural networks on computers work.
Today, these systems are used for all kinds of tasks. For example, they can recognise people's faces in photographs or videos, and even analyse footage of football games to see which teams' tactics performed better and hint at why. How do computers manage to figure things like that out?
Artificial neural networks are inspired by biological networks – chiefly, the brain. Each network is a collection of simulated "neurons", which are linked up in some way, a bit like the stations and lines on the London Underground.
Networks like this are able to take an input – say, the word "hello" written on a page – and match it to an output – in this case, the word "hello" held in the computer's memory. This is something children do when they learn to read and write.
Neurons that fire together, wire together
Like a child, a neural network cannot make the connection instantly; it must be trained over time. The training is complicated, but in essence it involves changing the strengths of the connections between the virtual "neurons". Each adjustment improves the result a little, until the whole network can reliably output the desired answer: in our example, that the funny symbols on the page ("hello") equal the word "hello". Now the computer "knows" what you have written.
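To make that concrete, here is a minimal sketch of such training in Python. It is purely illustrative – the "pixel" patterns, names and numbers are invented for this example, not taken from any real system. A single simulated neuron learns to tell two patterns apart by nudging its connection weights a little after every mistake:

def step(x):
    # the neuron "fires" (outputs 1) only if its total input is positive
    return 1 if x > 0 else 0

# training data: a pixel pattern and the desired answer (1 = "hello", 0 = not)
examples = [
    ((1, 0, 1, 0), 1),
    ((0, 1, 0, 1), 0),
]

weights = [0.0, 0.0, 0.0, 0.0]  # one connection strength per pixel
bias = 0.0
rate = 0.1  # how strongly each mistake changes the connections

for _ in range(20):
    for pixels, target in examples:
        output = step(sum(w * p for w, p in zip(weights, pixels)) + bias)
        error = target - output
        # strengthen or weaken each connection in proportion to its input
        weights = [w + rate * error * p for w, p in zip(weights, pixels)]
        bias += rate * error

print(weights)  # the learned connection strengths

After a few passes the weights settle, and the network reliably gives the right answer for both patterns.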
A similar thing happens in nature, Watson believes. An evolvable species would "output" a trait just right for a given environment.
There are different ways to get neural networks to learn. One that Watson has focused on, as a good example for what appears to be happening in biological gene networks, is "Hebbian learning".
In Hebbian learning, the connection between two neurons is strengthened a little each time they are active at the same time. In short: "neurons that fire together, wire together". The network "learns" by building strong links within itself.
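In code, that rule is strikingly simple. The sketch below is illustrative only – the units, numbers and function name are arbitrary choices, not drawn from Watson's models. Every time two simulated units are active together, the link between them grows, while links to a silent unit stay at zero:

def hebbian_update(weights, activity, rate=0.05):
    # strengthen weights[i][j] whenever units i and j fire together
    n = len(activity)
    for i in range(n):
        for j in range(n):
            if i != j:
                weights[i][j] += rate * activity[i] * activity[j]
    return weights

n = 3
weights = [[0.0] * n for _ in range(n)]

# units 0 and 1 repeatedly fire together; unit 2 stays silent
for _ in range(10):
    weights = hebbian_update(weights, [1, 1, 0])

print(weights[0][1])  # link between the co-firing units: 0.5
print(weights[0][2])  # link to the silent unit: still 0.0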
If an organism has certain genes firing together in this way, and that organism proves successful enough to reproduce, then its offspring will not simply inherit its beneficial genes, argues Watson. They will also inherit the connectivity between those genes.
A particular advantage of Hebbian learning is that the networks can develop "modular" features. For instance, one group of genes might define whether or not an animal has hind legs, or eyes, or fingers. Similarly, a handful of related adaptations – like a fish's ability to cope both with higher water temperatures and also saltier water – could get bundled and inherited together in a single gene network.
"If there is an individual that has a slightly stronger regulatory connection between those genes than some other individual does, then they'll be preferred," says Watson. "They'll be favoured by natural selection. That means over evolutionary time, the strength of the connections between those genes will be increased."
The ability of natural organisms to evolve to new selective environments or challenges is awesome
For Watson, this helps to get around a sticky problem in the theory of evolution.
Imagine for a moment that an organism's genome is a piece of computer code. A novice computer programmer might gradually update their code now and again, in an effort to make improvements. They might explore whether a different string of commands makes the program run a little more efficiently.
To begin with, this process of trial-and-error updating might work reasonably well. But over time, updating the code this way would become ever more cumbersome. The code would begin to look messy, making it difficult to work out what impact a particular addition might have. This does sometimes happen in programming and there is a term for the result: "spaghetti code". If organisms actually evolved this way, says Watson, "their evolvability – their ability to adapt to new stresses or environments – would be rubbish." But in fact, "the ability of natural organisms to evolve to new selective environments or challenges is awesome." —BBC