Defining the Next Cycle of Technology

“Great! Now Make it All Multi-Cloud!”

Also, this next tech cycle is not just the “cloud computing cycle.” AWS launched in 2006. Google Cloud launched in 2008. And Microsoft Azure was formally launched in 2010. So we’re already well over a decade past the dawn of the public cloud. Yet this next tech cycle definitely builds on the ecosystems, methodologies and technologies these hyperscalers provide.

The database you use also has to align with where you need to deploy it. Does it only work in the cloud, or can it be on-premises far behind your firewall? Does it just work with one cloud vendor or is it deployable to any of them? Or all of them simultaneously? These are important questions.

Just as we do not want to be locked into old ways of thinking and doing, the industry does not want to be locked into any one technology provider.

If you’ve just been mastering the art of running stateful distributed databases on a single cloud using Kubernetes, that’s not good enough. Now you’re being asked to do it all over again, this time in a hybrid or multi-cloud environment using Anthos, OpenShift, Tanzu, EKS Anywhere or Azure Arc.

Computing Beyond Moore’s Law

This next tech cycle also isn’t just the broadband and wireless internet revolutions. We’re a full two decades into both of those. Yet the advent of gigabit broadband and the new diverse range of 5G services—also capable of scaling to a gigabit—enable incredible new opportunities in real-time data streaming services, IoT and more.

So how does your database work when you need to connect to systems far and near? How important are the limitations of the speed of light to your latencies? How well do you deal with data ingested from hundreds of millions of endpoints at gigabit-per-second scale?
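To make the speed-of-light question concrete, here is a rough back-of-the-envelope sketch. The route distances and the roughly two-thirds-of-c propagation speed in optical fiber are illustrative assumptions; real-world latencies will be higher once routing, queuing and processing are added.

```python
# Back-of-the-envelope: theoretical minimum round-trip times imposed by
# the speed of light in optical fiber (~2/3 of its speed in a vacuum).
# Distances are rough great-circle estimates used purely for illustration.

SPEED_OF_LIGHT_KM_S = 299_792    # speed of light in a vacuum, km/s
FIBER_FACTOR = 2 / 3             # light travels ~2/3 as fast in fiber

routes_km = {
    "same metro region": 100,
    "US East <-> US West": 4_000,
    "New York <-> London": 5_600,
    "San Francisco <-> Sydney": 12_000,
}

for route, distance_km in routes_km.items():
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    rtt_ms = 2 * one_way_s * 1_000
    print(f"{route:26s} ~{rtt_ms:6.1f} ms minimum round trip")
```

Even the theoretical floor for a transatlantic round trip is tens of milliseconds, which is why physical topology matters as much as software architecture for latency-sensitive workloads.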

And finally, underpinning all of this are the raw capabilities of silicon, summed up by the transistor and core counts of current-generation CPUs. We’ve already reached 64-core CPUs. The next generation(s) will double that, to the point where a single CPU will have more than 100 cores. Fill a rack-based high-performance computer with those and you can easily get into thousands of cores per server.
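As a quick sanity check on those numbers, here is a minimal sketch of the core-count arithmetic. The 128-core part, the eight-socket configuration and 2-way SMT are illustrative assumptions, not any specific vendor’s roadmap.

```python
# Rough core-count arithmetic for a large multi-socket server.
# All figures below are illustrative assumptions, not vendor specs.

cores_per_cpu = 128    # a next-generation part, roughly double today's 64 cores
sockets = 8            # a high-end multi-socket server

total_cores = cores_per_cpu * sockets
total_threads = total_cores * 2   # assuming 2-way SMT (hyper-threading)

print(f"Cores per server:   {total_cores:,}")     # 1,024
print(f"Threads per server: {total_threads:,}")   # 2,048
```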

And all of this is just traditional CPU-based computing. You also have GPU advancements powering the world of distributed ledger technologies like blockchain. Plus, it’s all happening as IBM plans to deliver a 1,000-qubit quantum computer in 2023 and Google plans to deliver a million-qubit machine by 2029.

This next tech cycle is powered by all of these fundamentally revolutionary capabilities. It’s what’s enabling real-time streaming of data from anyone to anywhere. And this is just the infrastructure.
