Public cloud and the evolution of everything as-a-service

The earlier blogs in this series (article 1, article 2, article 3) have introduced the various public cloud services and platforms. Today, let’s discuss how hyperscalers are introducing technological innovation into their service offerings to get ahead in the game.

Everything as-a-service: The future of public cloud

Infrastructure as-a-service (IaaS) and platform as-a-service (PaaS) are converging into everything as-a-service, with infrastructure as code (IaC) abstracted away on Kubernetes. Eventually, the three current service models (IaaS, PaaS together with CaaS, and SaaS) will be phased out in favor of everything as-a-service, coupled with workflow-driven SaaS models. The result will be a unified model in which a containerized workload can be moved to the cloud in a single click.

By 2024, cloud platforms are expected to offer lambda or function services that let developers write functionality in low-code templates without worrying about containers or CI/CD builds and orchestration. Flexible pipelines extending from the public platform to on-premises environments and other clouds will continue to make this workflow more intuitive.
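To make the function-service model above concrete, here is a minimal sketch of a handler in the AWS Lambda Python style (`lambda_handler(event, context)`). The event fields and greeting logic are invented for illustration; in a real deployment the platform supplies the event, manages the container, and scales the function on demand.

```python
import json

def lambda_handler(event, context):
    """Minimal FaaS-style handler: the platform invokes this function
    with an event payload; the developer writes only the business logic."""
    name = event.get("name", "world")  # hypothetical event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the handler is plain Python, it can be exercised locally with a dict standing in for the platform-supplied event, which is part of what makes the model so approachable for developers.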

Serverless computing: The next dimension

Serverless and function as-a-service (FaaS) systems will undoubtedly play an important role in future technology and operations. Cloud providers now offer serverless databases as an alternative to traditional databases, closing a gap that has existed since the beginning of cloud computing: the ability to quickly build and deploy smaller-scale applications.

With a fully no-ops platform or IT stack, businesses will be able to focus solely on the development, maintenance and support of digitalized business functions. From the business function down to the bottom of the stack, everything is highly automated and ready to be consumed on demand. Serverless and low-code solutions will eventually merge.

Quantum computing: A revolutionary step forward

Quantum computing is opening possibilities that will transform the way we process information. At their core, quantum computers exploit quantum-mechanical phenomena such as superposition and entanglement, and for certain classes of problems they are expected to outperform both current and future classical supercomputers, solving complex problems faster than the most powerful HPC systems available today.

Here’s why quantum computing is critical, not just for answering today’s questions but for future-fit solutions as well:

  • Data-driven solutions must process an increasing amount of data at an exponential rate, necessitating faster processing to ensure real-time delivery.
  • Concerns about data security and privacy are beginning to hinder the acceptance of technological advancements. Quantum will enable high-volume, on-demand data encryption and decryption, alleviating the strain.
  • Quantum can also contribute to energy efficiency, reducing the CO2 emitted per transaction.

Quantum computing on the cloud

The use of quantum emulators, simulators or processors on the cloud is known as cloud-based quantum computing. Cloud services are increasingly seen as the most practical way to gain access to quantum processing. By accessing quantum-powered computers over the internet, users can tap this computing power without owning or operating quantum hardware themselves.
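A flavor of what cloud quantum simulators actually compute can be sketched in a few lines of plain Python. The example below simulates a single qubit: applying a Hadamard gate to the |0⟩ state puts the qubit into an equal superposition, so a measurement yields 0 or 1 with equal probability. This is textbook quantum mechanics, not any particular provider's API.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state [a, b],
    where a and b are the amplitudes of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

# Start in |0>, apply H: the qubit enters an equal superposition.
state = hadamard([1.0, 0.0])
probs = [abs(amp) ** 2 for amp in state]  # measurement probabilities
print(probs)  # each probability is close to 0.5
```

Cloud quantum services run far larger versions of this kind of state-vector simulation (or dispatch the circuit to real quantum hardware), but the underlying model of gates acting on amplitudes is the same.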

Let’s look at how some of the hyperscalers have done this:

  • In 2016, IBM connected a small quantum computer to the cloud to test simple programs that could be designed and run there.
  • In 2019, Amazon Web Services (AWS) launched Amazon Braket, a fully managed quantum computing service that helps researchers and developers evaluate current quantum computing technology and explore future uses.
  • In 2021, Microsoft released Azure Quantum, a full-stack public cloud ecosystem for quantum computing, in public preview.
  • In 2019, Google claimed to have achieved quantum supremacy, and it is currently building a new quantum AI campus that will house cutting-edge quantum hardware, fabrication facilities and research.
  • In 2017, Atos unveiled the Atos Quantum Learning Machine, the world's most powerful quantum simulator. Atos is at the forefront of this rapidly expanding technology, which has the potential to usher in a new era of business and economic progress.

Even though quantum computing is still in its nascent stages, improvements in implementation and error correction will have a substantial impact on many fields. Quantum computing on the cloud will benefit businesses and other fields, doing away with the wait for technology to mature and spread.

Changing the game with AI and machine learning

Almost every enterprise modernization and data mining conversation with cloud providers references artificial intelligence (AI) and machine learning (ML) as a differentiator. Although AI/ML was once out of reach for most business budgets, the public cloud providers' ability to supply AI/ML as managed services has made it far more accessible.

Amazon, Microsoft and Google all offer the ability to develop and run neural networks and other types of AI on their public cloud infrastructure, with flexible resources and pricing.

Google Cloud Machine Learning

Google has the most comprehensive collection of machine learning technologies of any major cloud service provider. Its AutoML offering lets you build on the company's own built-in algorithms, and for those looking for AI acceleration, its Tensor Processing Unit (TPU) chips are a one-of-a-kind offering. Since Google created the TensorFlow machine learning framework, you're in particularly good hands if that's your preferred development platform.

AWS Sagemaker

Since Amazon has been in the cloud computing industry for quite some time, it has a wide range of products to complement SageMaker. The company's marketplace of third-party machine learning applications, which can be used in addition to Amazon's own, is superior to the competition.

Microsoft Azure Machine Learning

Microsoft is a leader in speech, vision and natural language processing, and its Redmond research labs are at the forefront of modern machine learning. With enterprise applications like SQL Server that embed analytics and machine learning capabilities, Microsoft can provide an on-premises or hybrid cloud machine learning experience. The company is distinguished by its implementation of the open ONNX standard for AI model portability, as well as its use of field-programmable gate arrays (FPGAs) to accelerate machine learning inference.

Overall, there is considerable overlap across these services, and many offer free tiers. Whichever vendor you choose, examine the programming frameworks and tools each one supplies. The easiest and most adaptable approach is to build your models in TensorFlow or another widely supported framework, so they can be trained and served on any of these engines.

Read my previous blogs in this series:

Multi-cloud trends and strategy: An executive summary

Decoding the Cloud Computing timeline

Determining the right cloud platform for your business

By Nilesh Shinde

Principal Consultant at Atos Syntel

Posted on November 12, 2021



About Nilesh Shinde
Principal Consultant at Atos Syntel
Nilesh Shinde is a cloud expert at Atos. He is a principal consultant of client solutions at Atos|Syntel, where he is responsible for solutioning large cloud deals that enable enterprises to boost productivity, accelerate their digital transformation and reduce costs. Nilesh has over 17 years of experience in business and technology, including architecture design, systems integration, operations management and consulting.
