2020 has been quite a wild ride. As the world has shifted to a wholly remote lifestyle (at least, for now), cloud infrastructure has become an increasingly critical part of how we operate. Most of the services you use to work every day run on a server somewhere. Many of the devices you talk to in your home do the same thing.
If you wanted a little microcosm of where the cloud is right now, you could look back at the past few weeks of AWS re:Invent, where an outright alphabet soup of a dozen-plus cloud products came out. Each has its unique use case, but all of them slot into the general theme of where the cloud is headed: a tool for every service you can imagine.
That’s not just an Amazon thing, either. You might call it abstraction-as-a-service, yet another as-a-service term. And it has set us on the path toward what Jason McGee, VP for IBM Cloud and IBM Fellow, describes as an “everything-as-a-service” model. The next ten years, and probably beyond, will be all about making the process of deploying an app on the internet as seamless as possible.
There’s a lot that’s going to happen between now and the end of the decade, and we’re excited to see where things go next. There are plenty of other moving parts, too: changing philosophies around security, innovation in the actual silicon and technical architecture, and more.
But it all takes us down the same road—getting better products out, faster, with fewer chances of things going wrong.
The cloud started as a clunky collection of hardware that, while sometimes challenging to manage, offered an impressive glimpse of the future: getting rid of the need to control the backbones of your applications altogether. In retrospect, the cloud of the aughts wasn’t that consumer-friendly. Given the cost efficiencies available today, you’d probably call it pretty expensive, too.
Over the past decade, it’s gotten easier—maybe so incrementally that it’s hard to notice—to get your app out the door. When we hit 2030, we might look back at this decade and point to containers as one of the turning points in removing server management altogether.
“Containers are a half-step in that journey,” McGee said. “It’s closer to the app, and it’s a better version. I think ten years from now, we wind up just continuing down that progression to something that’s very application-centric and hides a lot of the complexity.”
Getting back to the alphabet soup of these past few weeks: we saw announcements of so many products targeted at so many specific services that it might be hard to pinpoint the end goal of cloud providers, even beyond Amazon. But that juggling act of dozens of services may, in and of itself, be the end goal. Each industry will have a service supporting it, to get an app out the door and running as cheaply and efficiently as possible. In short, it’s niche services all the way down.
“As we move away from infrastructure, it becomes easier to imagine [that future of everything-as-a-service],” McGee said. “You have an edge video device that lets you analyze streamed cameras—that’s a pretty niche service when you think about it. I want to do video analytics in my office building; here’s a special box I can install. That’s where we’re headed.”
And that extends beyond just the software and the web console you’re using to manage your services.
What’s in the (abstract) box?
Another meta-trend of the past decade is the rise of machine learning, which now powers pretty much every service you use. Along with that, companies and venture firms have poured billions of dollars into innovating on the stuff that powers it: the actual silicon in those devices, whether they sit on the edge or in a data center.
The cloud, too, offers an exciting proving ground for all this innovation in processing power. Rather than ending up in your phone or laptop right away (though Apple may have just made that move), companies can experiment with new kinds of hardware in edge devices that need to handle complex processes. And then, again, you don’t even have to manage those devices.
“I think the cloud may be the place where they get vetted and exposed first. Because of the managed nature of the infrastructure, it’s easier for us to introduce specialized hardware in the environment,” McGee said. “It’s much easier than in the broader market. Whether that’s FPGA, new processors, AI-optimized silicon—all those things are easier to get into the public cloud first and just expose them as a service. We’re even doing that with quantum. We have some of them on the cloud so people can experiment with building quantum without buying and running a quantum system, which you can’t do.”
The hardware itself may also be abstracted at a certain point. You might think of it as a serverless model, but for the hundreds of cameras running in a factory processing 4K video. And as hardware becomes easier to iterate on, especially at the core processing layer, it’s the same principle: just focus on your app.
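As a toy illustration of that “just focus on your app” principle, here’s a minimal sketch in Python. Everything in it (the `handler` and `invoke` names, the event shape) is hypothetical, not any real platform’s API: the point is only that the developer writes the handler, and the platform side worries about where and on what hardware it runs.

```python
# Toy sketch of the serverless contract: the developer writes only the
# handler; everything below the divider stands in for the platform.
# All names here (handler, invoke, the event shape) are illustrative.

def handler(event):
    """The only code the application developer owns: pick out the
    analyzed camera frames that show motion."""
    frames = event["frames_analyzed"]
    return {"alerts": [f for f in frames if f.get("motion")]}

# ---- platform side (normally hidden from the developer) ----
def invoke(fn, event):
    # A real platform would pick a region, a machine, maybe an
    # accelerator; here it simply calls the function.
    return fn(event)

result = invoke(handler, {"frames_analyzed": [{"id": 1, "motion": True}, {"id": 2}]})
print(result)  # {'alerts': [{'id': 1, 'motion': True}]}
```

The design point is the dividing line in the middle: everything below it is someone else’s problem, which is exactly the trade the everything-as-a-service model is selling.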
“You have to think about how to deploy end models and how to use those models,” McGee said. “It’s a prime driver for this ‘as-a-service’ model at the edge version of cloud. You have this perfect convergence of AI, frameworks, and models that bring the skill requirements to deploy that yourself down. There’s not many people that can deploy that at scale and operate it by themselves. But there’s a high value of somebody running that for you, as a service.”
Protecting developers from abstraction (and themselves)
Massive security breaches regularly make the news. But for every major breach by a state actor, you might also see another kind: someone accidentally left data open to the public and an enterprising developer found it. In the best case, the company gets a heads-up. In the worst case, it may never find out at all, and the bad actor continues to reap the benefits.
That often boils down to a series of simple mistakes. Someone configured the wrong IAM permissions. Someone left an S3 bucket open to the public. Someone misread the YAML, or mistyped a value while watching a basketball game on their second monitor. Or any number of other deceptively simple factors.
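To make the “simple mistake” concrete, here’s a minimal Python sketch of the kind of check a configuration-scanning tool might run: flag any statement in an S3-style bucket policy that allows access to every principal. The policy document below is a hypothetical example, not a real bucket’s configuration.

```python
# Minimal sketch of a policy lint: flag "Allow" statements whose
# Principal is "*" (i.e., open to the entire internet).

def find_public_statements(policy):
    """Return the statements in an S3-style policy that allow any principal."""
    flagged = []
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        is_public = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and is_public:
            flagged.append(stmt)
    return flagged

# A hypothetical policy containing exactly the mistake described above.
leaky_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",  # the simple mistake: open to everyone
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::example-bucket/*",
        }
    ],
}

print(len(find_public_statements(leaky_policy)))  # prints 1
```

Real compliance tooling is far more involved, but the shape is the same: machine-readable configuration in, a list of risky findings out, ideally before anything ships.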
So while managing proper Security with a capital S is typically one of the top priorities (if not the top priority) for cloud providers, there’s another challenge. If you abstract everything away so users can just focus on their app, you may introduce another point of failure: someone so focused on the app that they never think about their configurations.
“It’s an area we spend a tremendous amount of time on,” McGee said. “As-a-service delivery tries to hold the promise that it’ll be more secure because you have experts handling Security for you. You don’t have to be an expert yourself. And a lot of those mistakes come from people configuring things wrong. We’re trying to directly address the configuration and control and compliance side of things.”
That may be easier said than done, especially as we reach a point where there may be hundreds of services for just about every conceivable niche. But McGee said that proactively ensuring developers get their configurations right ranks nearly as high as protecting them from state-sponsored hackers.
The march to 2030
It’s critical that we remember some of the tremendous innovation that happened across the board in 2020. Getting a vaccine for Covid-19 out the door in less than a year is a monumental scientific achievement and one that will continue to influence our research for decades. As Sir Isaac Newton said, “If I have seen further, it is by standing upon the shoulders of giants.”
But we can take a moment to appreciate the constant stream of new tools and technologies that continue to abstract away the challenges of getting an application out the door, even one at the scale of Facebook or Google. Removing that challenge isn’t just good for your bottom line: it enables faster innovation that forces everyone else to innovate in turn. That’s the best outcome for everyone.
That, of course, assumes everyone gets on board with the cloud. But McGee is optimistic about that.
“I think the penetration ten years from now could wind up being very high—maybe an 80% range,” he said. “Today, some people might say we’re at 20%, while others would say maybe 4%. It depends on how you define ‘cloud.’ We’re in the midst of an expansion of what we think about as cloud from just being centralized public data centers to a more distributed kind of cloud. We have compute at the edge, compute in these core cloud data centers, and more options becoming available. If that’s what happens, then there’s a much broader slice of applications that could take advantage.”