As an introduction to this quarter’s Atos Digital Security Magazine, we have highlighted the definitions of digital sovereignty and data sovereignty that we use when speaking with our peers, clients and our teams.
Spoiler alert: Atos defines data sovereignty as the degree of control an individual, organization, or government has over the data it produces and works with.
Accepting this definition leads to a big question: “How can an organization find the right degree of control?” Let’s tackle it in four steps.
Data has become the backbone of our digital economy. For most companies, the digital strategy rests on a few critical pillars:
- Their data
- The way data are processed (algorithms, apps, compute)
- Who can access data and run operations and reports on them
If you don’t have a clear view of these pillars, you don’t have a clear picture of your business — which means you will soon be out of business. It’s that simple.
As you build your organization, you define access rights to your data and application features according to your digital strategy. You must ensure that those access rights and identities comply with that strategy and reflect the sensitivity of your data. That is the definition of control: controlling your data means specifying and enforcing who can do what with them at any point in time. If those identities and accesses don’t align with your choices and your strategy, it reflects a lack of control not just over the data, but over the business itself.
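That "who can do what, at any point in time" idea can be made concrete with a small sketch. The Python below (with hypothetical role and sensitivity names, purely for illustration) encodes a policy as data and enforces it with a single check:

```python
# A minimal access-control sketch. The roles, sensitivity tiers and
# actions below are hypothetical, chosen only to illustrate the idea.
POLICY = {
    "non-sensitive":    {"read": {"employee", "analyst", "admin"},
                         "write": {"analyst", "admin"}},
    "sensitive":        {"read": {"analyst", "admin"},
                         "write": {"admin"}},
    "highly-sensitive": {"read": {"admin"},
                         "write": {"admin"}},
}

def can_perform(role: str, action: str, sensitivity: str) -> bool:
    """Allow an action only if the policy explicitly grants it."""
    return role in POLICY.get(sensitivity, {}).get(action, set())
```

Denial by default (anything not explicitly allowed is refused) is what turns such a policy from advisory into enforceable.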
How can you increase control?
Cybersecurity levers are data sovereignty’s best friend.
To control who can do what with your data, start with identity and access management (IAM) and extend it beyond your employees to all kinds of IT, OT and IoT objects. Since you cannot implicitly trust the underlying layers that host, transport and process your data in clear text, you need to add encryption to protect their confidentiality and integrity in all three states: at rest, in transit and in use.
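A full encryption scheme is beyond a short sketch, but the integrity half of that protection can be illustrated with a keyed hash: as long as the organization alone holds the key, no underlying layer can tamper with the data unnoticed. A minimal stdlib-only Python sketch (the key value is hypothetical):

```python
import hashlib
import hmac

# Hypothetical key, held by the data owner and never shared with the host
ORG_KEY = b"key-held-by-the-data-owner"

def seal(payload: bytes) -> bytes:
    """Prepend an HMAC-SHA256 tag so any tampering becomes detectable."""
    tag = hmac.new(ORG_KEY, payload, hashlib.sha256).digest()
    return tag + payload

def unseal(sealed: bytes) -> bytes:
    """Verify the tag before trusting the payload."""
    tag, payload = sealed[:32], sealed[32:]
    expected = hmac.new(ORG_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed")
    return payload
```

In practice this would be combined with actual encryption (for confidentiality) and key management, but the principle is the same: the verification happens under the owner's key, not the infrastructure's.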
However, IAM and encryption are often entangled in layers of applications, infrastructures and networks to run business applications and ease the user experience. This entanglement exposes them to the flaws and weaknesses of each layer, which calls for additional security controls to minimize the likelihood and impact of a breach. These additional controls include antimalware, firewalls, intrusion prevention sensors, secure coding and testing, among others.
Last but not least, there is no such thing as perfect cybersecurity controls, so you must constantly monitor them for compliance and incidents, and be able to respond to and recover from those incidents in a timely manner. In short, following the major functions of the NIST Cybersecurity Framework (identify, protect, detect, respond and recover) is mandatory.
How much should you increase it?
Sometimes, more is less.
All those levers can be seen as sliders moving between agility and control. Strengthening a security control can indeed come at the expense of agility, or even create friction for users. The best way to have complete control is to store your data in a vault, inside a Faraday cage, 100 meters underground, surrounded by armed guards. Only then would it be truly inaccessible, and utterly useless.
So, for every security control, you should find the proper balance between excess and restraint. The best way to find that sweet spot lies at the core of cybersecurity: What kind of risk am I addressing, and what form could it take? You won’t apply the same level of control to mitigate espionage, external influence, denial of usage or operations, or data loss, whether accidental or malicious.
Are all data born equal?
Protecting all assets the same way often results in protecting none of them correctly. Indeed, overall security tends to sink to the level of the weakest link. While you may assume that your least sensitive data will benefit from the same strong protection you give your most sensitive data, the opposite is often true. When someone says, “all my data are sensitive,” I often hear, “I don’t know which data are sensitive, and worse, I don’t know that I don’t know.” In risk management parlance, we call this the worst possible treatment: denial!
The double 80/20 rule.
Data classification is a huge program that should never be underestimated. But to give you a sense of scale, we have observed across many projects that, on average, 80% of a customer’s data is not sensitive, and that the remaining 20% can be further split into 80% “just sensitive” and 20% highly sensitive. The resulting split is often close to this double 80/20 rule: 80% non-sensitive, 16% sensitive and 4% highly sensitive (maybe even classified). Of course, this is a rule of thumb. Companies working solely on highly sensitive topics (a nuclear plant, a military entity, etc.) will have a vastly different split and don’t need the rule. Others should focus, as a priority, on the 4%.
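The arithmetic behind the double 80/20 rule is simply the rule applied twice:

```python
total = 1.0
non_sensitive    = 0.80 * total            # 80% of all data
sensitive_pool   = total - non_sensitive   # the remaining 20%
just_sensitive   = 0.80 * sensitive_pool   # 80% of 20% -> 16%
highly_sensitive = 0.20 * sensitive_pool   # 20% of 20% -> 4%
```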
Figure 1: Every cybersecurity control can be visualized as a lever (top right on the diagram) that you can pull. Each lever (data encryption, in this example) can slide in one direction or the other depending on the level of control your risk assessment calls for, which can differ according to each dataset’s sensitivity.
Wait! Did I say that encryption could help protect the confidentiality and integrity of data from the underlying layers that process and compute on them? Isn’t encrypting data in use still an unsolved challenge? Well, yes and no.
Privacy-enhancing technologies (PET) have emerged in the last few years to address this issue, the most popular of which are confidential computing and differential privacy.
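Differential privacy, for instance, releases aggregate answers with calibrated noise, so that no single individual’s record can be inferred from the result. A minimal sketch of the Laplace mechanism in stdlib Python (the epsilon value is illustrative):

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two independent Exp(1) draws is Laplace-distributed
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise calibrated to sensitivity/epsilon."""
    return true_count + laplace_noise(sensitivity / epsilon)
```

Smaller epsilon means more noise and stronger privacy: each released answer stays useful in aggregate while individual contributions are masked.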
However, the holy grail of PET is fully homomorphic encryption, which promises to unite control with agility, allowing companies to use third parties to add value to data without compromising confidentiality or integrity. Practical implementations already exist, but using them for sensitive data and processing still comes at a cost in efficiency. Once that problem is solved, data sovereignty will enter a whole new era.
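The principle can be glimpsed with textbook RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product, computed without ever touching the private key. The parameters below are deliberately tiny and insecure, for illustration only; real fully homomorphic schemes (e.g., BFV or CKKS) are far more elaborate:

```python
# Classic textbook RSA parameters; insecure, for illustration only.
P, Q = 61, 53
N = P * Q   # modulus, 3233
E = 17      # public exponent
D = 2753    # private exponent: E * D == 1 (mod (P-1)*(Q-1))

def encrypt(m: int) -> int:
    return pow(m, E, N)

def decrypt(c: int) -> int:
    return pow(c, D, N)

# A third party holding only ciphertexts (and N) can compute the
# encrypted product; only the key holder can decrypt the result.
```

Fully homomorphic schemes extend this idea to both addition and multiplication, which is what makes arbitrary computation on encrypted data possible.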
Read the two-part report from Lunar Ventures for an overview of PET
About the author
Global CTO for cybersecurity products, Distinguished Expert, Atos
Member of the Atos Scientific Community
Coming from an information technology engineering background, with 20 years of experience in information security, Vasco has helped many customers balance operational constraints against acceptable business risks. In recent years, he has expanded this experience to help customers explore what the information security landscape might look like over the next five years and beyond, and the best way to manage it. During innovation workshops, he shares with them some keys to anticipating the future shape of cybersecurity and maximizing sovereignty over their most critical data.
Drawing on those customer interactions and continuously monitoring major technological trends, Vasco influences Atos cybersecurity services and product roadmaps, as well as partnerships, mergers and acquisitions.