Only a small fraction of the evolutionary selection that has taken place on Earth has gone toward enhancing intelligence by building ever more complex nervous systems. Even so, the human nervous system appears to be the most sophisticated of any on the planet.
This could change if one day we design an artificial intelligence more powerful in every respect than that construct, born of a billion years of trial and error.
To design such an AI, we could analyze the functioning of biological nervous systems of all kinds, as Nick Bostrom explains in his book Superintelligence:
Evolutionary algorithms, however, need not just variations to select from but also a fitness function to evaluate the variants, and this is often the most computationally expensive component. A fitness function for evaluating artificial intelligence would likely require simulating neural development, learning, and cognition. So rather than looking at the sheer number of organisms with complex nervous systems, we would do well to look at the number of neurons in biological organisms that we might need to simulate to mimic evolution's fitness function.
For example, ants alone are estimated to make up between 15 and 20% of terrestrial animal biomass, not counting the rest of the insects, whose brain sizes vary substantially. The honeybee brain has fewer than 10^6 (one million) neurons; the fruit fly's has fewer still, on the order of 10^5; an ant's, barely 250,000.
A rough calculation gives on the order of 10^24 insect neurons in total. Adding aquatic copepods, birds, reptiles, mammals, and so on raises this by an order of magnitude, to 10^25. In pre-agrarian times there were fewer than 10^7 humans, each with roughly 10^11 neurons; that is, fewer than 10^18 human neurons in total (although connected by a far greater number of synapses).
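As a sanity check, these tallies are simple order-of-magnitude multiplications. The sketch below just restates the article's own estimates as arithmetic; the constants are the article's figures, not independent data:

```python
# Back-of-envelope neuron tallies, using the article's order-of-magnitude estimates.

INSECT_NEURONS_TOTAL = 10**24   # article's rough total for all insect neurons
OTHER_ANIMAL_FACTOR = 10        # copepods, birds, reptiles, mammals, etc. add ~1 order of magnitude

animal_neurons_total = INSECT_NEURONS_TOTAL * OTHER_ANIMAL_FACTOR  # ~10^25

PREAGRARIAN_HUMANS = 10**7      # fewer than 10^7 humans before agriculture
NEURONS_PER_HUMAN = 10**11      # roughly 10^11 neurons per human

human_neurons_total = PREAGRARIAN_HUMANS * NEURONS_PER_HUMAN       # ~10^18

# Report each total as a power of ten (exact powers, so this is just digit-counting).
print(f"animal neurons: ~10^{len(str(animal_neurons_total)) - 1}")
print(f"human neurons:  ~10^{len(str(human_neurons_total)) - 1}")
```

The human total is twenty million times smaller than the animal total, which is why Bostrom's estimate is dominated by the simpler organisms.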
The computational cost of simulating a neuron depends on the level of detail of the simulation. Extremely simple neuron models require around 1,000 floating-point operations per second (FLOPS) to simulate one neuron in real time.
The Tianhe-2 supercomputer, for example, delivered 3.39 × 10^16 FLOPS in September 2013. That sounds like an enormous figure, but it falls far short once we account for the number of neurons to simulate:
If we wanted to simulate 10^25 neurons over more than a billion years of evolution (longer than nervous systems as we know them have existed), and we let our computers run for a year, the requirement would come to between 10^31 and 10^44 FLOPS.
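One way to see where a figure in that range comes from is to multiply the article's numbers directly: neurons, cost per neuron for the simplest model, and the speedup needed to compress a billion simulated years into one year of wall-clock time. This is a hedged sketch of that arithmetic, not Bostrom's full calculation (his wide range reflects more detailed neuron models and other assumptions):

```python
# Back-of-envelope FLOPS requirement, under the article's stated assumptions.
NEURONS = 1e25            # neurons to simulate (article's estimate)
FLOPS_PER_NEURON = 1e3    # simplest neuron model, real time (article's figure)
SIMULATED_YEARS = 1e9     # ~a billion years of evolution
WALLCLOCK_YEARS = 1       # we let the computer run for one year

# Compressing 10^9 simulated years into 1 year multiplies the real-time cost by 10^9.
required_flops = NEURONS * FLOPS_PER_NEURON * (SIMULATED_YEARS / WALLCLOCK_YEARS)
# ~10^37 FLOPS, inside the article's 10^31 to 10^44 range

TIANHE2_FLOPS = 3.39e16   # Tianhe-2, September 2013
shortfall = required_flops / TIANHE2_FLOPS

print(f"required: ~{required_flops:.0e} FLOPS")
print(f"shortfall vs Tianhe-2: ~{shortfall:.0e}x")
```

Even with the cheapest neuron model, the gap to 2013-era hardware is some twenty orders of magnitude, which is the point of the comparison.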