Is 'The Singularity' Behind Us? [Draft]

How much computational power does it take to simulate a single human cell for one second?

Assume we simulate at the atomic level. A human neuron contains roughly 175 trillion atoms.

Interesting biological phenomena play out over hours or longer, so we would need to extend the simulated timespan from thousandths of a second to tens of thousands of seconds. And this assumes an atomic model has high enough fidelity to match observation in the first place. How much of a role do quantum fluctuations play in cellular biology, and how much more computation would we need to simulate them accurately?
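To put rough numbers on the question, here is a back-of-envelope sketch in Python. The atom count comes from the text; the femtosecond timestep and the ~100 floating-point operations per atom per step are generic molecular-dynamics assumptions, not measured figures.

```python
# Back-of-envelope cost of simulating one neuron atom by atom.
# The atom count is from the text; the femtosecond timestep and the
# ~100 floating-point ops per atom per step are assumed, generic
# molecular-dynamics figures, not measurements.

ATOMS_PER_NEURON = 175e12     # ~175 trillion atoms (from the text)
TIMESTEP_S = 1e-15            # assumed MD timestep: 1 femtosecond
FLOPS_PER_ATOM_STEP = 100     # assumed cost per atom per timestep
SECONDS_PER_YEAR = 3.15e7

def flops_to_simulate(biological_seconds: float) -> float:
    """Rough FLOP count to simulate one neuron for a span of biological time."""
    steps = biological_seconds / TIMESTEP_S
    return ATOMS_PER_NEURON * FLOPS_PER_ATOM_STEP * steps

exaflop_machine = 1e18  # FLOP/s, roughly a top supercomputer today

one_second = flops_to_simulate(1.0)
few_hours = flops_to_simulate(1e4)
print(f"1 s of one neuron:   {one_second:.2e} FLOPs "
      f"(~{one_second / exaflop_machine / SECONDS_PER_YEAR:.0e} exaflop-machine-years)")
print(f"1e4 s of one neuron: {few_hours:.2e} FLOPs")
```

Under these assumptions, a single simulated second of a single neuron costs on the order of 10^31 floating-point operations, hundreds of thousands of machine-years on today's fastest hardware, before we even get to hours of biology or quantum effects.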

Maybe we don't need to simulate every atom; maybe there are compression algorithms waiting to be discovered that let us model a cell fairly accurately at a coarser level. Full atomic simulation is perhaps the worst-case scenario, but discovering those shortcuts would require investigating and cataloguing the billions of components of a cell over time: billions of times, for billions of cells. Those heuristics will be hard won. Perhaps we can offload all of that work onto the computer systems and let them do the cataloguing and develop the heuristics. In that case our understanding of the cell will be very black box from a human perspective. Humans would have no hope of understanding a single human cell; score one point for the singularity being behind us.
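To make the "compression" idea slightly more concrete, here is one hypothetical form it could take: coarse-graining, where many atoms are lumped into a single simulated bead and the timestep is stretched. Every number below is an illustrative assumption, not a claim about real cell models.

```python
# Illustrative savings from a hypothetical coarse-grained model: lump atoms
# into beads and take longer timesteps. All parameters are made up, chosen
# only to show how the orders of magnitude move.

ATOMS_PER_BEAD = 1e4      # assumption: one bead stands in for ~10,000 atoms
TIMESTEP_RATIO = 100      # assumption: coarse models tolerate ~100x larger steps

speedup = ATOMS_PER_BEAD * TIMESTEP_RATIO
print(f"Coarse-grained model is ~{speedup:.0e}x cheaper")
print("...which buys back ~6 of the many orders of magnitude needed.")
```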

For our understanding of cells, it may at some point be more economical to "simulate" cells by growing real cultures under laboratory conditions and sampling them. Maybe this is what people mean when they say "we are living in a simulation".

To summarize: assuming Moore's law continues, we are at least 100 years of hard research away from achieving a thorough understanding of the human brain.

Bharath Ramsundar

But are AI prophets like Sam Harris and Ray Kurzweil right? Won't Moore's law carry us through this? Won't just a few more decades of exponential growth devour those orders of magnitude?

Short answer: no. Single-processor clock speeds have levelled off.
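And even if the exponential did continue, the arithmetic is unforgiving. Here's a quick sketch, assuming a doubling every two years and a shortfall of roughly 30 orders of magnitude; the shortfall is a placeholder in the spirit of the estimates above, not a figure from any measurement.

```python
import math

# How long would uninterrupted Moore's-law growth take to close a given
# compute shortfall? Both inputs are rough placeholders, not measurements.
DOUBLING_PERIOD_YEARS = 2.0        # classic Moore's-law cadence
ORDERS_OF_MAGNITUDE_SHORT = 30     # placeholder size of the shortfall

doublings = ORDERS_OF_MAGNITUDE_SHORT / math.log10(2)   # ~100 doublings
years = doublings * DOUBLING_PERIOD_YEARS                # ~200 years

print(f"~{doublings:.0f} doublings, i.e. ~{years:.0f} years of "
      "uninterrupted exponential growth")
```

Fifty doublings, a full century at that pace, covers only about 15 orders of magnitude, which is where the "at least 100 years" above comes from.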

Computational power follows logistic growth, not open-ended exponential growth: the constraints are the speed of light and heat dissipation.

Table 1. For a single-processor machine.
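To see the shape of that ceiling, here's a toy comparison of exponential versus logistic growth. The cap and growth rate below are arbitrary illustrative values; physically, the cap is set by the speed of light and heat dissipation.

```python
import math

# Toy comparison of exponential vs. logistic growth of single-processor
# performance. The ceiling and rate are arbitrary illustrative values.

CEILING = 1e4     # arbitrary cap on performance (relative units)
RATE = 0.35       # growth rate per year (roughly a 2-year doubling)
P0 = 1.0          # performance at year 0

def exponential(t: float) -> float:
    return P0 * math.exp(RATE * t)

def logistic(t: float) -> float:
    return CEILING / (1 + (CEILING / P0 - 1) * math.exp(-RATE * t))

for year in (0, 10, 20, 30, 40, 50):
    print(f"year {year:2d}: exponential {exponential(year):12.1f}   "
          f"logistic {logistic(year):8.1f}")
```

The two curves are indistinguishable early on; the logistic one only shows its hand as it nears the cap, which is roughly where single-processor clock speeds are today.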

We are still seeing growth at the global and networked level.

There's still quite a bit of room to improve energy efficiency. As we (almost) learned in high school physics, energy and information are deeply linked: Landauer's principle puts a floor of kT ln 2 on the energy needed to erase a single bit. That lets us work out the theoretical minimum energy cost of computation. It looks like we can squeeze out six or seven more orders of magnitude! Only 50 more to go and then we can start simulating our brains on our home PCs!
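The "six or seven orders of magnitude" figure can be sanity-checked against Landauer's limit. The Boltzmann constant and room temperature are standard; the present-day energy per operation (tens of femtojoules per switching event) is a rough assumption, not a benchmark.

```python
import math

# Landauer limit: minimum energy to erase one bit is k_B * T * ln(2).
BOLTZMANN = 1.380649e-23   # J/K
ROOM_TEMP = 300.0          # kelvin

landauer_j_per_bit = BOLTZMANN * ROOM_TEMP * math.log(2)   # ~2.9e-21 J

# Rough assumption: today's hardware spends on the order of 1e-14 J
# (tens of femtojoules) per elementary switching operation.
today_j_per_op = 1e-14

headroom = today_j_per_op / landauer_j_per_bit
print(f"Landauer limit: {landauer_j_per_bit:.2e} J per bit erased")
print(f"Theoretical headroom: ~{math.log10(headroom):.0f} orders of magnitude")
```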

Barring a fundamental revolution in our understanding of the laws of physics, this appears to be our lot in life.

... but what if? What if we could go into space and forget about the Earth and its weak-sauce 49,837 exajoules of available solar energy? Awesome, another 14 orders of magnitude of energy we can use for computation! Now we can start to simulate that individual neuron for a couple of hours! Yeah! Looks like we're on the way to the singularity after all! Except... to understand that one neuron for a couple of hours, we now need to encircle our star with a self-replicating swarm of theoretically optimal, energy-efficient computers. We need to program them, and worry about storage, network latency, and parallelization of the computational tasks. The whole apparatus may even end up as complicated as the neuron itself!

And what of the individual human in this situation? It looks up at the starless sky and wonders... will our AI overlord ever deliver the wisdom we constructed it to provide? The AI has its own concerns, though... it's too busy trying to simulate the human cells that construct its own germ line. Just need to appropriate every star in the galaxy, and we'll be within striking distance of simulating a single human consciousness!