AI and ecosystem change

Goutham Belliappa
2019-11-25

Human beings take an average of 27 years to mature, leave home, start a separate household, and begin having children. In the US, once a household is formed, the average household has 1.9 children, below the basic sustainable replacement level of 2.1 children per woman. Despite widespread concern about population growth, the global total fertility rate (TFR) has actually halved from 4.5 to 2.4, bringing us close to that basic replenishment rate.

As humans, we have a long history of continually renegotiating an equilibrium with our environment. We balance our population with different evolutionary levers, including consumption, the carrying capacity of the ecosystem, fertility rates, decimating or intermixing with competing species, etc.

As AI moves towards becoming sentient, it is very likely that self-replication will become part of the equation for these new beings. In our human anchoring bias, we may imagine a linear, factory supply chain-type replication for machines, where materials are extracted at point A, manufactured at point B, and deployed to point C, and where the deployed unit is incapable of self-replication. No sentient being we know of exists in this manner, yet we persist in the belief that machines, robots, and AI will have fragile, lengthy supply chains similar to those that exist for hard goods today. The sentient beings we know of carry their replication capability with them; this pattern goes back at least four billion years on Earth.

Now, let’s take a minute to imagine non-biologic replication at the rate of spiders or similar creatures, where the ratio is closer to 1:100 or 1:1,000 per unit and the time frames are far shorter.
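For a rough sense of scale, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is an illustrative assumption drawn from the numbers above (roughly 1.9 children per human household per ~27-year generation versus a hypothetical 1:100 replication ratio per machine per short cycle); it is a toy growth model, not a prediction.

```python
# Toy comparison (all figures are illustrative assumptions, not predictions):
# human generational growth vs. a hypothetical self-replicating machine
# population with a 1:100 replication ratio per cycle.

def population_after(start, offspring_per_unit, cycles):
    """Population after a number of replication cycles, assuming every unit
    replicates at the given ratio and nothing limits growth."""
    return start * (offspring_per_unit ** cycles)

# Humans: ~1.9 children per household per ~27-year generation, which works out
# to roughly 0.95x per individual per generation (slightly below replacement).
human_pop = population_after(start=1_000, offspring_per_unit=0.95, cycles=4)

# Hypothetical machines: 1:100 replication per unit per cycle, with a cycle
# measured in months rather than decades.
machine_pop = population_after(start=1_000, offspring_per_unit=100, cycles=4)

print(f"Humans after 4 generations (~108 years): {human_pop:,.0f}")
print(f"Machines after 4 cycles: {machine_pop:,.0f}")
# Unconstrained growth at 1:100 per cycle dwarfs biological rates; real growth
# would of course be limited by materials, energy, and competition.
```

Starting from the same 1,000 units, the human line barely holds steady while the hypothetical machine line reaches a hundred billion in four cycles, which is the intuition behind the question that follows.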

Would this change the balance of power in the universe when the human population is a drop in a very large bucket of sentients? Will next-generation sentients compete with humans for resources, or will they reach a human-type equilibrium in which the carrying capacity and resources allocated to humans are factored into machine sustainability, whether as a hard constraint, a soft constraint, or no constraint at all, with non-biologic sentients actively competing for whatever humans need?

Humans have a history of similar competition within and beyond our own species, with widely different results: the humans who moved out of Africa into Europe competed with Neanderthals and eventually drove them to extinction; the European invasion of the Americas settled into a rough equilibrium between native and foreign populations that included population blending; and today we compete for resources with almost every other species on the planet. We have many patterns of competition with other species and even with members of our own. How we will compete for resources with machines remains an open question.

The other piece of the equation is productivity per unit. The uncountable variations of all life on Earth started with a single cell that natural selection has continuously built upon. One could argue that a human-directed process of re-engineering could be superior to the continuous forward-engineering of natural selection. We understand only about 10% of how our brain functions. Some scientists estimate that as much as 75% of our DNA is actually “junk,” evolutionary leftovers. Imagine the power of DNA if a genome of the same size were reconstructed so that the junk quotient was minimal.

When we consider AI as potential new sentients, these questions become even more intriguing – and more critical. When we design machines, we follow the human approach to learning: tear down to components, then rebuild. What will the end product look like? Will there ever be an end product in an environment we have designed to change continuously, much like evolution but fundamentally faster?

Imagine all of this working together – non-biologic AI and self-replicating sentience, assembling, reassembling, continuously evolving and transforming – each generation of AI looking nothing like its predecessor after only a few generations. Sound too much like science fiction? A phone in 1995 looked completely different from a smartphone today. We have seen this kind of rapid evolution of technology before. What will this process do to non-biologic, sentient-like smart machines and robots?

Humans have looked to the stars and felt very small in the vastness of the universe. Imagine a world where humans design an apex intelligence that surpasses us in so many ways. What then will we perceive as our place in the universe? Do we co-exist in some capacity, or will AI be the children to whom we entrust our future? As Elon Musk put it, will humans be the “biological boot loader” for AI?

I consider this a very high-level thought experiment – a counter to the countless posts competing for social real estate that illustrate only one side or the other of AI.

A contrarian view “can disclose nature’s failure to conform to a previously held set of expectations” and “can suggest particular ways in which both expectation and theory must henceforth be revised.” The 10% brain figure used above is an example of a long-held belief that has yet to be debunked in any meaningful way, in the sense that we still have little idea which parts of our brain serve objective functions that we can consciously control.

It is my belief that the complexity of AI will be far richer than any of our currently held beliefs suggest. The world ahead will be rich in ways that we cannot yet comprehend.

To learn more about AI-based business transformation initiatives, contact me.