The Look Out 2020+ Tech Trends Radar will help you understand the technological landscape that lies ahead. Take a look at the key technologies set to impact your business in the coming years. Understand the steps you should take today for success tomorrow.
The future will be driven by data and powered by AI and Cloud, with Quantum and High-Performance Computing making the once impossible possible. Companies need to understand technology today to survive and thrive tomorrow.
Executive Vice President Big Data & Security Solutions and Group CTO, Atos
The Look Out 2020+ Tech Trends Radar provides a pictorial view of our findings, allowing you to quickly understand disruptive emerging technologies and the actions you might consider taking. Each technology's position in the quadrant illustrates when it is likely to impact your business along with the potential size of that impact, while the colors represent its current maturity.
A lightweight virtualization technology, containers provide applications with an isolated environment inside a single operating system instance, giving users and the applications running inside them the illusion and experience of running on their own dedicated machine.
The new computing continuum will be a heterogeneous environment based on the decentralization and federation of diverse computing entities and resource typologies. These will include multi-Cloud (and Cloud federation) models with their diverse, decentralized and autonomic management, and hybrid Cloud models that cross boundaries between internal and external Cloud services or between public, private and community providers. Cloud Service Integration (CSI) then provides a flexible means of assembling these various Cloud-based elements in support of business processes that traverse IT domains.
The open source hardware model extends the ideas and methodologies popularized in open source software development to hardware development. Documentation — including schematics, diagrams, lists of parts and related specifications — is published under open source licenses so other teams can modify and improve it based on their specific needs. These designs are sometimes combined with more traditional open source software, such as operating systems, firmware or development tools. For instance, both Linux and Android operating systems are being used in embedded devices.
Software-defined anything/everything (SDx) is an approach that replaces legacy — and often specialized — hardware controlled by physical mechanisms with software running on commodity hardware platforms. The concept may be applied to a wide variety of aspects of an IT system, including networking, compute, storage, management, security and more.
Also known as Hyperscale, Web-Scale computing is a large, distributed, grid computing environment that can scale out efficiently as data volumes and workload demands increase in internet-size ways. Compute, memory, networking and storage resources are added quickly and cost-effectively. Often built with stripped-down commercial hardware (sometimes based on open-hardware licenses), Web-Scale computing makes optimal use of that hardware. Potentially millions of virtual servers may work together to accommodate increased computing demands without, however, requiring additional physical space, cooling or electrical power.
3D printing (3DP), or additive manufacturing, is an inexpensive approach to manufacturing 3D objects, materializing them from virtual designs created using CAD (Computer-Aided Design) programs. It can be used to create almost any shape or geometry, extending from the nanoscale to complete buildings. To create the objects, the 3D printer deposits layer upon layer of material using various additive processes. These layers may use different materials, such as powder, polymers, metals, paper, or even foodstuffs.
5G represents the next generation of communication networks and services. It is not intended to be an evolution of legacy communication technologies but a novel approach for fulfilling the requirements of future applications and scenarios. As such, legacy 4th generation LTE technologies continue to evolve in parallel.
Blockchain is a distributed database that uses cryptographic techniques to store a growing list of records – or blocks – sequentially. Blockchains can be private, public or owned by a consortium. They use distributed ledger technologies to enable a new model in which trust is established in a peer-to-peer network without the need for a trusted third party.
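The chaining mechanism can be sketched in a few lines of Python: each block embeds the hash of its predecessor, so altering any historical record invalidates every later link (a toy illustration, not a production ledger):

```python
import hashlib
import json

def block_hash(body):
    # Hash the block's contents, including the previous block's hash.
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, records):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "records": records, "prev_hash": prev}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)
    return chain

def is_valid(chain):
    # Re-derive each hash and check the links between blocks.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, ["alice pays bob 5"])
append_block(chain, ["bob pays carol 2"])
print(is_valid(chain))                        # True
chain[0]["records"] = ["alice pays bob 500"]  # tamper with history
print(is_valid(chain))                        # False
```

Real blockchains add consensus protocols and peer-to-peer replication on top of this basic tamper-evidence property.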
Cognitive computing can be seen as an integration of algorithms and methods from diverse fields such as Artificial Intelligence (AI), machine learning, Natural Language Processing (NLP) and knowledge representation to enhance human performance on cognitive tasks. It is able to learn, understand natural language and reason, and even to interact more naturally with human beings than traditional programmable systems. Cognitive computing systems can supplement human work in three capabilities: increased engagement, improved evidence-based decision-making, and discovery of insights hidden in massive amounts of data.
Deep Learning is a branch of machine learning with its roots in neural networks, where multi-layered neural network algorithms attempt to model high-level abstractions in data. Currently most applications of Deep Learning use supervised learning, where a network is trained with a large set of labeled data examples for each category. Unsupervised learning, on the other hand, is where machines identify objects, text or images without having been specifically trained on a related dataset. Deep Learning is also commonly used in Natural Language Processing, where texts are analyzed as points in multi-dimensional vector spaces.
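A toy forward pass shows the layered structure: each layer transforms its input, and stacking layers builds higher-level abstractions. The weights below are hard-coded for illustration; in practice they are learned from labeled examples by backpropagation:

```python
import math

def relu(x):
    # Non-linear activation: without it, stacked layers collapse
    # into a single linear transformation.
    return [max(0.0, v) for v in x]

def layer(inputs, weights, biases):
    # One fully connected layer: each output is a weighted sum plus bias.
    return [sum(w * v for w, v in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative weights; a real network learns these from data.
W1 = [[0.5, -0.2], [0.3, 0.8]]   # hidden layer, 2 units
b1 = [0.1, -0.1]
W2 = [[1.0, -1.0]]               # output layer, 1 unit
b2 = [0.0]

def predict(x):
    hidden = relu(layer(x, W1, b1))   # learned intermediate abstraction
    out = layer(hidden, W2, b2)
    return sigmoid(out[0])            # probability-like score in (0, 1)

score = predict([1.0, 2.0])
print(round(score, 3))
```

Production networks have millions of such weights across dozens of layers, which is why specialized hardware accelerators matter for Deep Learning.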
DevOps is a philosophy for how to build and operate software that encourages teams to focus on business value, work collaboratively, deploy software more frequently in smaller increments and build reliable solutions. Furthermore, DevOps promotes continuous improvement across all of these dimensions.
The growth of IoT and the emergence of ever-richer Cloud services together call for data to be processed at the edge of the network. Also referred to as fog computing, mesh computing, dew computing and remote Cloud, Edge computing moves applications, data and services away from the centralized model of Cloud computing to a more decentralized model that lies at the extremes of the network. Edge computing is also closely related to the concept of Swarm computing.
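The placement decision at the heart of Edge computing can be sketched as a simple policy; all thresholds here are illustrative assumptions, not part of any standard:

```python
def place_workload(latency_budget_ms, payload_mb):
    """Toy scheduler: decide whether a job runs on a nearby edge
    node or in the central Cloud. Thresholds are illustrative."""
    cloud_round_trip_ms = 120   # assumed WAN latency to the central Cloud
    edge_capacity_mb = 64       # assumed memory budget of the edge node
    if latency_budget_ms < cloud_round_trip_ms:
        return "edge"           # the Cloud cannot answer in time
    if payload_mb > edge_capacity_mb:
        return "cloud"          # dataset too large for the edge node
    return "cloud"              # default to centralized processing

print(place_workload(latency_budget_ms=20, payload_mb=1))    # edge
print(place_workload(latency_budget_ms=500, payload_mb=500)) # cloud
```

Real edge orchestrators weigh many more factors (bandwidth cost, privacy constraints, device battery), but the trade-off between proximity and capacity is the same.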
Insight platforms are the third generation of business analytics platforms, after the Business Intelligence and Big Data phases. They represent a combination of new and existing technologies that collect and analyze massive data sets from connected environments in real time, rapidly transforming that data into actionable (prescriptive) insights. Examples include the combination of streaming analytics analyzing data in motion in real time to accelerate time-to-insight; distributed analytics analyzing data in situ within a distributed architecture; and prescriptive analytics, which makes predictions based on its Big Data analysis and then suggests decision options.
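The streaming-analytics component can be sketched as a sliding window that turns data in motion into an immediate, prescriptive action; the window size and threshold below are illustrative:

```python
from collections import deque

class StreamingMonitor:
    """Sliding-window analytics over data in motion: keep only the
    last `size` readings and suggest an action the moment the window
    average crosses a threshold, rather than after a batch job."""

    def __init__(self, size, threshold):
        self.window = deque(maxlen=size)  # old readings fall off automatically
        self.threshold = threshold

    def ingest(self, value):
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        # Prescriptive step: turn the insight into a suggested action.
        return "throttle" if avg > self.threshold else "ok"

monitor = StreamingMonitor(size=3, threshold=80.0)
for reading in [70, 75, 78, 90, 95]:
    action = monitor.ingest(reading)
print(action)   # last window [78, 90, 95] averages ~87.7 -> "throttle"
```

In a full insight platform this loop would run distributed, in situ, over millions of sensor streams.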
Advanced automation is a class of new automation technologies, such as Robotic Process Automation (RPA) and knowledge-based and/or AI-supported automation solutions, that is transforming existing automation for desktops, datacenters, application support and installation. These technologies expand existing automation capabilities in depth and range, and also take the decision-making process to a new level. Special environments, such as IoT or Edge, come with special requirements that can only be fulfilled with advanced automation.
The Internet of Things represents a ubiquitous communication network that effectively captures, manages and leverages data from billions of real-life objects and physical activities. Networks of spatially distributed sensors and actuators (nodes), each with a transceiver and a controller for communicating within a networked environment, detect and monitor events (sensors) or trigger actions (actuators). Each has a unique identifier and the ability to transfer data over a network without human-to-human or human-to-computer interaction.
Geographical information systems (GIS) capture, store, analyze and display information referenced according to its geographical location. The next generation takes the third dimension into account and provides increased resolution for a much more realistic representation of the world. Spatial data can be gathered from a wide array of sources, including global positioning satellites, beacons, Wi-Fi hotspots, remote sensors and visible light communication (VLC) sources such as Li-Fi. Analytics and visualization technologies allow companies to extract insights from this spatial data.
Also referred to as ultra-narrowband, Low-Power Wide-Area Network (LPWAN) wireless communication technology has a low power requirement and a long range, but a low data rate. LPWAN was designed to enable objects that don't have a powerful source of energy to be connected, primarily to the IoT. After all, many objects connected to the IoT only need to transfer small amounts of data, such as commands and statuses, and that operation only requires a small amount of power.
Natural user interfaces (NUI) are systems designed to make human-computer interaction feel as natural as possible. This wide range of technologies allows the user to leverage everyday behaviors, intuitive actions and their natural abilities to control interactive applications. These might include touch, vision, voice, motion and higher cognitive functions such as expression, perception and recall. Some natural user interfaces rely on intermediary devices while other more advanced systems are either unobtrusive — or even invisible — to the user.
Prescriptive analytics is an advanced form of business analytics that helps decision-makers determine the best course of action among various choices, given known parameters. It goes beyond descriptive analytics, which provides insight into what happened, and predictive analytics, which aims to forecast what will happen. Prescriptive analytics leverages Big Data not only to suggest one or more possible courses of action but also to show the likely outcome of each.
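A minimal sketch of the idea, assuming a hypothetical demand forecast (the predictive step) and a toy price-elasticity model (the known parameters); prescriptive analytics then ranks the candidate actions by their likely outcome:

```python
# Hypothetical probabilities of each demand scenario, e.g. produced
# by a predictive model upstream.
scenarios = {"low": 0.2, "medium": 0.5, "high": 0.3}
units_sold = {"low": 100, "medium": 300, "high": 600}

def expected_profit(price):
    # Toy elasticity: fewer units sell at a higher price.
    factor = max(0.0, 1.5 - price / 20.0)
    return sum(p * units_sold[s] * price * factor
               for s, p in scenarios.items())

candidate_prices = [10, 15, 20, 25]
# Prescriptive step: show the likely outcome of each action and
# recommend the best one.
ranked = sorted(candidate_prices, key=expected_profit, reverse=True)
best = ranked[0]
print(best, round(expected_profit(best), 1))
```

Real prescriptive engines use optimization solvers and simulation over far richer models, but the structure — forecast, evaluate each action, recommend — is the same.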
Trusted devices are terminals and software-powered objects and machines that are made secure and trustworthy in order to protect data and process availability, integrity and confidentiality. They include human interaction devices such as smartphones and payment terminals as well as autonomous devices such as smart homes and smart machines. Trusted devices rely on high-security design, hardened software and hardware, and intensive certification processes provided by trusted third parties (notably leveraging 'common criteria' norms).
Virtual assistants are software agents that perform services or tasks on our behalf. They understand queries and can answer them in a natural language. They exploit artificial intelligence, natural-language processing, machine learning, voice processing and reasoning and knowledge representation to make human-machine interactions simpler, more natural and more appealing.
Wearables are miniature electronic devices with integrated sensing, computing and communication capabilities that are worn on the body. They leverage the wearer’s context — detected by embedded sensors — to deliver either general or specific services that enable the wearer to act in real time based on the information they provide. Although the most popular wearables today are smart watches, wearables can be found on different parts of the body: bracelets, headbands and helmets, contact lenses, earphones, gloves, digital pens, smart clothing, jewelry, and even tattoos.
A collaborative standardization effort by the Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C), WebRTC is an open standard for browser-based real-time communications, supported by all major browser vendors, including Google Chrome, Mozilla Firefox, Microsoft Edge and Apple Safari. In essence, the WebRTC standards define both a browser API and the real-time communication protocols that enable voice, video and data communications to be embedded in web applications without the need for browser plugins.
Wireless power describes the transmission of electrical power without solid wires, using electromagnetic fields instead. There are two types of wireless power: near-field charging, which uses inductive or capacitive charging, and far-field or radiative charging, which uses beams from electromagnetic devices.
Autonomous vehicles are an emerging field arising from the combination of transportation vehicles and robotic capabilities, such as environmental sensors, context awareness and autonomous decision-making using Artificial Intelligence. These self-driving vehicles rely on such technologies to drive themselves while recognizing and responding to their surroundings.
The brain-computer interface (BCI) is a direct communication pathway between the brain and an external device based on neural activity generated by the brain. While many approaches use invasive devices, the most promising initiatives are based on non-invasive approaches. Electroencephalogram (EEG) devices record brain activity. EEG's fine temporal resolution, ease of use, portability and low set-up cost have made it the most widely studied candidate for a non-invasive interface.
Context-aware computing systems collect and store diverse data, then leverage analytics to deduce context from the interactions among the data before triggering actions based on that contextual information. In doing so they effectively empower data by extracting its meaning in relation to other pieces of data, unlocking the full potential of data that might otherwise only be leveraged in isolation.
Continuous authentication exploits behavioral (passive) biometrics (a form of biometrics that exploits dynamic human characteristics) to establish that an individual is who they say they are. Well-known examples include voice, typing style, mouse use, heart rate and walking pace. Like all biometrics, continuous authentication enables multi-factor authentication when combined with other security mechanisms such as smart cards.
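The principle can be sketched with keystroke timing, one of the passive biometrics mentioned above; the baseline figures and tolerance here are illustrative, not a real biometric model:

```python
import statistics

# Enrollment: inter-keystroke intervals (ms) observed for the genuine
# user during normal work build a passive behavioral baseline.
baseline = [105, 98, 110, 102, 95, 108, 101, 99, 104, 106]
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def still_same_user(recent_intervals, tolerance=2.0):
    # Continuous check: compare the live typing rhythm against the
    # enrolled profile; a large z-score suggests a different typist.
    z = abs(statistics.mean(recent_intervals) - mean) / stdev
    return z < tolerance

print(still_same_user([100, 107, 103, 99]))   # rhythm matches the profile
print(still_same_user([180, 210, 195, 205]))  # rhythm is far off
```

Unlike a one-off login, this check can run silently in the background throughout a session, which is what makes the authentication continuous.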
A digital twin is simply a digital replica of a physical asset, process, system or service across its lifecycle. This virtual representation of the physical world is, in essence, a simulation much akin to the 3D renderings of computer-aided design (CAD) models, asset models and process simulations used by engineers for decades. It uses real-time data to promote understanding, identify problems, develop new opportunities and plan for the future.
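A minimal sketch of the pattern, assuming a hypothetical pump with a crude thermal model: the twin advances its own simulation in step with the real asset and flags divergence between expected and observed behavior:

```python
class PumpTwin:
    """Toy digital twin of a (hypothetical) pump: a simple model
    kept in sync with real-time sensor readings."""

    def __init__(self, nominal_temp=60.0, heating_rate=0.5):
        self.expected_temp = nominal_temp
        self.heating_rate = heating_rate

    def step(self, load):
        # Advance the simulation one tick under the reported load;
        # the constant 0.2 models passive cooling (illustrative).
        self.expected_temp += self.heating_rate * load - 0.2
        return self.expected_temp

    def check(self, sensor_temp, tolerance=5.0):
        # Compare the twin's prediction against the live sensor feed.
        drift = abs(sensor_temp - self.expected_temp)
        return "investigate" if drift > tolerance else "healthy"

twin = PumpTwin()
twin.step(load=1.0)                  # model now expects ~60.3 degrees
print(twin.check(sensor_temp=60.8))  # healthy
print(twin.check(sensor_temp=71.0))  # investigate
```

Industrial twins replace this toy physics with validated engineering models, but the loop — simulate, ingest real data, compare — is the core idea.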
Exascale supercomputers are High-Performance Computing (HPC) systems capable of at least one billion billion calculations per second (one exaFLOPS) — a thousand-fold increase over today's petascale supercomputers. They provide a major step forward in addressing the new challenges of the 21st century at a time when all sectors (particularly industry, academia and science) are demanding increasingly powerful computing systems for resolving problems involving ever-growing volumes of data.
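The scale difference is easy to make concrete; the workload below is hypothetical, chosen only to show what the thousand-fold factor means in wall-clock time:

```python
EXA_FLOPS = 10 ** 18    # one exaFLOPS: a billion billion operations/second
PETA_FLOPS = 10 ** 15   # the order of today's petascale machines
workload = 10 ** 21     # a hypothetical simulation needing 10^21 operations

print(workload / EXA_FLOPS)    # 1000.0 seconds (~17 minutes) at exascale
print(workload / PETA_FLOPS)   # 1000000.0 seconds (~11.6 days) at petascale
```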
With Moore’s Law in serious jeopardy, general-purpose computing devices are no longer sufficient to satisfy highly demanding applications. Hardware accelerators are highly specialized computing devices targeting a narrow field of computing. They can be classified roughly into three categories: programmable hardware (like GPUs), Field Programmable Gate Arrays (FPGAs) and fixed-function hardware (like Google's Tensor Processing Unit). The field is still evolving, with new capabilities like neural accelerators, neuromorphic processors and other AI-specialized components.
An immersive experience is one that is totally absorbing, that allows users to disconnect from the real world and lose themselves in a simulated dimension. Immersive experience technologies encompass a wide range of devices, including virtual reality (VR) – digital simulations of real-world environments; 3D displays – display devices that create the perception of depth; haptic devices – which add the sensation of touch; and holographic user interfaces – laser-based volumetric displays where users interact with holographic images.
Privacy-enhancing technologies (PETs) refer to technologies involved in protecting or masking personal data (whether of employees, customers or citizens) to achieve compliance with data protection legislation and sustain customers' trusted relationships. PETs not only protect very sensitive data (such as credit card information, financial data or health records), they also shield the very personal information (including purchasing habits, interests, social connections and interactions) that digital users are keen to allow some services to leverage, but only provided some privacy is respected.
QUIC is an experimental protocol created by Google and brought to the IETF for standardization in 2016. The name stands for ‘Quick UDP Internet Connections’, reflecting the fact that it allows the fast and easy sending of simple packets over the connectionless User Datagram Protocol (UDP). In effect, QUIC is a new multiplexed and secure transport built on top of UDP that is very similar to TCP+TLS+HTTP/2: it provides multiplexing and flow control equivalent to HTTP/2, security equivalent to TLS, and connection semantics, reliability and congestion control equivalent to TCP.
With the growth of Cloud, APIs and the IoT, cybercrime is constantly increasing in volume, sophistication and impact. Cyber-defense strategies have evolved toward new self-adaptive security principles. This approach moves the emphasis from protection to real-time detection and response, which adapts defenses immediately. Technologies and processes incorporate Security Operation Centers (SOC), which rely on new generation Security Information and Event Management (SIEM) technologies enhanced with machine learning and prescriptive analytics. Self-adaptive security also relies on new generations of context-aware security technologies that dynamically adapt to threats.
Used extensively and exclusively within blockchain, smart contracts are digital peer-to-peer contracts written into lines of programming code. They are executed automatically to enforce a contract. Smart contracts allow business rules agreed by the parties to be embedded across a distributed, decentralized blockchain network and executed as part of a ledger's transactions, ensuring transparency and mitigating any conflict.
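Real smart contracts are written in blockchain-specific languages and executed by the network's nodes; the principle can nonetheless be sketched in Python as a toy escrow whose release rule is code, with every state change appended to a shared ledger:

```python
class EscrowContract:
    """Toy escrow: the agreed rule (release funds only when the buyer
    confirms delivery) is encoded as code and executed automatically,
    with every state change recorded on a shared ledger."""

    def __init__(self, buyer, seller, amount, ledger):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        self.ledger = ledger
        self.ledger.append(("deposit", buyer, amount))

    def confirm_delivery(self, caller):
        # The contract enforces the agreed rule: only the buyer
        # can trigger the release of funds.
        if caller != self.buyer:
            raise PermissionError("only the buyer can confirm")
        self.delivered = True
        self.ledger.append(("release", self.seller, self.amount))

ledger = []
contract = EscrowContract("alice", "bob", 100, ledger)
contract.confirm_delivery("alice")
print(ledger)   # both steps recorded on the shared ledger
```

On an actual blockchain the ledger is the distributed chain itself and every node executes the same code, which is what removes the need for a trusted intermediary.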
Smart machines refer to systems embedded with cognitive computing capabilities that are able to make decisions and solve problems without human intervention. They perform activities and tasks traditionally conducted by humans, boosting efficiency and productivity.
Today’s widespread adoption of digital working solutions, particularly the rapid uptake of Cloud productivity and collaboration platforms, has enabled experiences to flow freely across devices and made the user persona portable across multiple creation and consumption devices. This will evolve further into a new ubiquity. Personal devices won’t be a requirement for a productivity or collaboration experience; instead, the facilities in which we interact, such as the conference room, will identify users and provide assistant functionality as well as immersive collaboration experiences, based on tools, content management, Artificial Intelligence (AI) and Internet of Things (IoT) capabilities built into the space.
Biocomputers are computers that use biological materials such as DNA and proteins to perform computational calculations involving the storage, retrieval and processing of data. They leverage the capabilities of living beings, relying on nanobiotechnology to engineer biomolecular systems that provide the computational functionality.
Computing memory represents a new approach to solving the limitations of classical (Von Neumann) computing architectures. In this model, certain computational tasks are performed in place in a specialized memory unit called computational memory. This co-existence of computation and storage at the nanometer scale could enable ultra-dense, low-power, and massively-parallel computing systems. Resistive memory devices, where information is represented in terms of atomic arrangements within tiny volumes of material, are poised to play a key role as elements of such computational memory units.
Due to their nature, computing devices have long been at the center of our attention. Smartphones, tablets, our laptops and desktops all take up a huge percentage of our focus and mental reasoning capacity. However, with the rise of technologies like speech recognition, chatbots, XR (AR/VR/MR) and advanced machine learning, the long-held promise of devices that integrate deeply and naturally into our everyday lives is finally within reach. In time, this will result in IT becoming invisible and ready-to-hand.
An alternative to traditional computing architectures, neuromorphic computing systems integrate electronic analog circuits with digital ones to mimic neuro-biological architectures similar to those of nervous systems of living beings. In that way, they can provide new ways to represent information, adapt to change (plasticity), provide additional robustness, and incorporate learning and self-development capabilities.
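The contrast with clocked Boolean logic can be sketched with a toy leaky integrate-and-fire neuron, one of the classic models that neuromorphic hardware implements in analog circuitry; the parameters below are illustrative:

```python
def leaky_integrate_and_fire(inputs, leak=0.9, threshold=1.0):
    """Toy spiking neuron: membrane potential integrates incoming
    current, leaks over time, and emits a spike (then resets) when
    it crosses a threshold - unlike a clocked Boolean gate."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current   # integrate with leak
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                      # reset after spiking
        else:
            spikes.append(0)
    return spikes

# Weak inputs accumulate until the neuron fires once, then it resets.
print(leaky_integrate_and_fire([0.3, 0.3, 0.3, 0.6, 0.1, 0.9]))
```

The neuron's state depends on the timing and history of its inputs, which is the plasticity and adaptivity neuromorphic systems aim to exploit.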
Quantum physics brings a new computing paradigm that is much richer than, and complementary to, the Boolean computing that is the foundation of computer science. Built on the basic element of the qubit, the so-called ‘Quantum computer’ uses quantum-mechanical phenomena to execute operations on data. A theoretical concept born in the early 1980s, Quantum computing saw its first technological implementations demonstrated 15 years later. The Quantum computer has not yet reached the mainstream, though tremendous advances have been emerging on both the hardware and application sides.
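The qubit idea can be sketched with plain arithmetic: a single-qubit state is a pair of amplitudes, a gate such as the Hadamard is a unitary transformation of them, and measurement probabilities are the squared magnitudes (a two-line state vector on a classical machine, not a real quantum device):

```python
import math

def hadamard(state):
    # The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    # Measurement probabilities are the squared amplitude magnitudes.
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1.0, 0.0)                    # the |0> basis state
superposed = hadamard(zero)          # equal superposition
p0, p1 = probabilities(superposed)
print(round(p0, 3), round(p1, 3))    # 0.5 0.5

# Applying Hadamard twice returns to |0>: amplitudes interfere,
# which Boolean bits cannot do.
back = hadamard(superposed)
print(round(probabilities(back)[0], 3))   # 1.0
```

Simulating n qubits this way needs 2^n amplitudes, which is precisely why classical machines cannot scale the simulation and quantum hardware is interesting.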
Swarm computing combines network and Cloud capabilities to create on-demand, autonomic and decentralized computing. While Edge computing is the initial step towards the decentralization of computing, Swarm computing will consolidate this trend by exploiting IoT devices' increasingly rich computing and storage capacities. Combining complex multi-Cloud architectures with Edge computing will enable Swarm computing scenarios to develop. Swarm instances will be temporal infrastructures created on demand in response to specific needs.