Cloud computing in 2020: views of the industry – Techerati

We asked six industry experts to weigh in on what's now and what's next in cloud computing

When we published our selection of cloud predictions last year, most of our experts predicted that the container orchestrator Kubernetes would consolidate its stranglehold over the container space and, correspondingly, over modern cloud infrastructure.

Last November, one of the most extensive customer surveys bore this prediction out. In its study of thousands of companies, cloud and infrastructure monitoring company Datadog found 45 percent of its customers were using Kubernetes. And if that isn't evidence enough, just reflect on VMware's announcement in March that it plans to transition its enterprise virtualisation platform to a system that runs (and runs on) Kubernetes.

But in reality, Kubernetes' centrality to cloud was put beyond doubt weeks before we published last year's roundup. In January, IBM steamrollered into 2019 fresh off the back of its $34 billion acquisition of Red Hat. This year IBM confirmed it would integrate Red Hat's popular Kubernetes implementation, OpenShift, into a new multi-cloud business focus.

It is in this context that most of this year's experts consulted their cloud crystal balls. Rackspace's Lee James predicts 2020 will be a year of stiff competition between enterprise IT giants jostling to deliver a Kubernetes solution that unlocks multi-cloud for their customers. On the other hand, Stephan Fabel of Canonical says end-users will start to understand the limitations of Kubernetes and, accordingly, utilise it more strategically. Lastly, Pivotal's Michael Cote expects companies to use this new-found savoir-faire to establish a singular, overall Kubernetes strategy.

Read the predictions in their entirety below.

Hybrid becomes the new multi-cloud, again

While the popularity of multi-cloud is undisputed, with 81 per cent of companies using cloud technologies in some way, many firms are still investing in their private cloud solutions. This is for a number of reasons, such as security posture, ongoing data centre leases, or simply because private cloud is the best platform for the application or, in some cases, the business. Indeed, even the UK Government plans to revise its "cloud first" policy to "cloud right" (or something similar) early next year, acknowledging that public cloud isn't right for everyone or every use case.

Reflecting this trend, we've seen the cloud giants respond with private cloud solutions that link directly into their public cloud platforms, such as Azure Arc, Google Cloud Anthos and AWS Outposts.

In 2020, there's going to be significant competition between the three biggest cloud hyperscalers and VMware as they all explore and deliver on how Kubernetes will unlock their potential to be the hybrid and multi-cloud provider of choice for customers. For customers, it's ultimately going to come down to which fits and works best, as well as which gives the best bang for their buck. But this sets us up for an exciting year of new product and service announcements as each of the major cloud companies tries to establish itself as the cloud broker of choice.

Unicorn start-ups will begin repatriating workloads from the cloud

There has been a lot said about cloud repatriation of late. While this won't be a mass exodus from the cloud (in fact, quite the opposite, with public cloud growth expected to increase), 2020 will see cloud-native organisations leveraging a hybrid environment to enjoy greater cost savings.

For businesses starting out or working with limited budgets, which require an environment for playing around with the latest technology, public cloud is the perfect place to start. With the public cloud, you are your own limit and get immediate reward for innovation. But as those costs begin mounting, it's prudent to consider how to regain control of cloud economics.

Repatriating workloads to on-premise infrastructure is certainly a viable option, but that doesn't mean we will start to see the decline of cloud. As organisations pass each new milestone in the development process, repatriation becomes more and more of a challenge. What we will likely see is public cloud providers reaching into the data centre to support this hybrid demand, so that they can capitalise on the trend.

Kubernetes has become an integral part of modern cloud infrastructure and serves as a gateway to building and experimenting with new technology. It's little surprise that many of the companies we observe are doubling down on the technology and reorienting their DevOps teams around it to explore new things, such as enabling serverless applications and automating data orchestration. We think this trend will continue at strength in 2020.

On a more cautious note, we may also see some companies questioning whether Kubernetes is really the correct tool for their purposes. While the technology can provide tremendous value, in some cases it can be complex to manage and requires specialist skills.

As Kubernetes is now commonly used in production at scale, it becomes increasingly likely that users will encounter issues around security and downtime. As a result of these challenges, we can expect the community to mature and, in some cases, come to the view that Kubernetes is not right for every application, or that it increases the need to bring in outsourced vendors for specialised expertise.

Organisations should try to find one standard Kubernetes approach

Kubernetes has already emerged as the leading choice for running containerised or cloud-native applications. Organisations will now spend the time to create a Kubernetes strategy, choosing the distribution or services they'll use, and then run a few applications on it. Having multiple initiatives here would be a huge waste and would delay the overall strategy; instead, organisations should try to find one standard Kubernetes approach. In 2020, though, the bulk of the work will be finding and modernising the apps that will run on that platform.

Most large organisations are doing just that and will spend 2020 modernising how they build and run software. To start, they need to find a handful of small, high-value applications and the services those applications use. Then they should run a working Proof of Concept (POC) to validate the platform choice by launching actual applications on the platform.

If it goes well, organisations can then put more apps on the platform. If it doesn't go well, they can try to find out why and try again, maybe with a new platform. It's important to look at the full end-to-end process, from development, to running in production, to re-deploying: companies need to judge the success of the platform choice across that whole lifecycle.
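To make that POC step a little more concrete, below is a minimal sketch using the official Kubernetes Python client to launch a small test workload and read back its rollout status. The deployment name, container image and namespace are placeholders of our own, not anything prescribed here; a real POC would launch the organisation's actual applications.

```python
# Minimal POC sketch using the official Kubernetes Python client
# (pip install kubernetes). Name, image and namespace are illustrative
# placeholders, not values from the article.
from kubernetes import client, config

config.load_kube_config()  # reads ~/.kube/config for the target cluster
apps = client.AppsV1Api()

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="poc-app"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "poc-app"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "poc-app"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="poc-app",
                        image="nginx:1.25",  # stand-in for the real application image
                        ports=[client.V1ContainerPort(container_port=80)],
                    )
                ]
            ),
        ),
    ),
)

# Launch the test workload, then read back its status. In practice you would
# poll until the rollout completes before judging the result.
apps.create_namespaced_deployment(namespace="default", body=deployment)
status = apps.read_namespaced_deployment(name="poc-app", namespace="default").status
print(f"ready replicas: {status.ready_replicas} / {status.replicas}")
```

Whether that same loop of building, deploying, observing and redeploying runs smoothly end to end is the real measure of the platform choice described above.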

All applications will become mission-critical

The number of applications that businesses classify as mission-critical will rise during 2020, paving the way to a landscape in which every app is considered high priority. Previously, organisations were prepared to distinguish between mission-critical and non-mission-critical apps. As businesses become completely reliant on their digital infrastructure, making this distinction becomes very difficult.

The 2019 Veeam Cloud Data Management report revealed that, on average, IT decision-makers say their business can tolerate a maximum of two hours' downtime for mission-critical apps. But what apps can any enterprise realistically afford to have unavailable for this amount of time? Application downtime costs organisations a total of $20.1 million globally in lost revenue and productivity each year, with lost data from mission-critical apps costing an average of $102,450 per hour. The truth is that every app is critical.

Businesses will continue to pick and choose the storage technologies and hardware that work best for their organisation, but data centre management will become even more about software. Manual provisioning of IT infrastructure is fast becoming a thing of the past. Infrastructure as Code (IaC) will continue its proliferation into mainstream consciousness. By allowing businesses to create a blueprint of what infrastructure should do and then deploy it across all storage environments and locations, IaC reduces the time and cost of provisioning infrastructure across multiple sites.
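As an illustration only (the prediction does not name a tool), a blueprint of the kind described above might look like the following Pulumi sketch, in which the desired storage infrastructure is declared once in Python and the same program is deployed per site via separate stacks. The resource names, the "site" configuration key and the choice of Pulumi itself are our assumptions.

```python
# Hypothetical IaC blueprint using Pulumi's Python SDK
# (pip install pulumi pulumi-aws). Names, tags and the "site" config key are
# placeholders; the article does not prescribe this or any other tool.
import pulumi
import pulumi_aws as aws

cfg = pulumi.Config()
site = cfg.require("site")  # e.g. "eu-datacentre" or "us-cloud", set per stack

# One blueprint: a versioned bucket for application data, tagged by site.
data_bucket = aws.s3.Bucket(
    f"app-data-{site}",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
    tags={"site": site, "managed-by": "iac"},
)

pulumi.export("bucket_name", data_bucket.id)
```

Applying the same program to a different stack (one stack per site) is what gives the "define once, deploy across locations" property the paragraph describes.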

Software-defined approaches such as IaC and cloud-native (a strategy which natively utilises services and infrastructure from cloud computing providers) are not all about cost, though. Automating replication procedures and leveraging the public cloud offers precision, agility and scalability, enabling organisations to deploy applications with speed and ease. With over three-quarters of organisations using software-as-a-service (SaaS), a software-defined approach to data management is now relevant to the vast majority of businesses.

Companies will take a step down from the cloud

The pattern goes something like this: lift and shift infrastructure VMs to the cloud; see costs actually go up; move some workloads back on-prem; then watch application lifecycle drivers push new apps (or new features for old apps) to be built in the cloud using PaaS/DBaaS technologies with a more favourable cost model; then retire the old IaaS apps. The key takeaway is that this dynamic is one of the drivers for hybrid approaches.

We saw many companies this past year lift and shift to the cloud. Now, in 2020, I expect we'll see companies take a step back and reevaluate their all-in approach. After feeling the effects of the full shift to cloud, the high associated costs and lack of flexibility, many companies will likely move to a multi-cloud or hybrid cloud approach next year.

Taking a hybrid cloud approach enables organizations to leverage the cloud (using AWS, Azure, or GCP) for some applications and computing needs while still keeping mission-critical or sensitive data closer to home. With a multi-cloud strategy, organizations can reduce costs and, instead of being constrained to one cloud, departments have the flexibility to select the service that works best for their individual needs (using a mix of AWS, Azure, or GCP).

Customers and organisations will really begin to look for more management layers on top of their solutions

One of the things I believe we'll see in 2020 is the true adoption of things like hybrid and multi-cloud solutions, but the difference in the coming year will be that customers and organisations will really begin to look for more management layers on top of their solutions.

A lot of companies already have things like backup-in-the-cloud and DRaaS somewhere else, so what they're now looking for is a uniform management layer on top of that to give visibility of cost, as well as knowledge of where all their data is located. It's important to know where data lives, whether workloads are protected, and whether workloads need to move between different clouds if and when requirements change.
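As a purely hypothetical sketch of the kind of information such a uniform management layer tracks (none of this reflects a specific product, and the workload records are invented), the snippet below keeps a single inventory of workloads across clouds and reports cost by cloud alongside any workloads still lacking protection.

```python
# Toy sketch of a uniform management layer: one inventory of workloads across
# clouds, reporting cost and protection status from a single place.
# All records below are invented examples, not data from any real product.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    cloud: str        # e.g. "aws", "azure", "on-prem"
    region: str
    monthly_cost: float
    protected: bool   # covered by backup/DRaaS?

inventory = [
    Workload("billing-db", "on-prem", "london-dc1", 1800.0, True),
    Workload("web-frontend", "aws", "eu-west-1", 950.0, True),
    Workload("analytics", "azure", "uksouth", 2400.0, False),
]

# Visibility of cost per cloud, and of workloads that still need protection
cost_by_cloud = {}
for w in inventory:
    cost_by_cloud[w.cloud] = cost_by_cloud.get(w.cloud, 0.0) + w.monthly_cost

print("monthly cost by cloud:", cost_by_cloud)
print("unprotected workloads:", [w.name for w in inventory if not w.protected])
```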
