Data Loss Prevention
Data Loss Prevention (DLP) has been a key area of focus for enterprises in recent years, due to an increased recognition of the value of data, advancing technologies that make the potential for data loss greater, and recent regulatory requirements related to the protection of personally identifiable information.
Malicious leakage, which typically occurs via malicious attacks or data-harvesting malware. Enterprises defend against these using firewalls, encryption, user access controls, IDS/IPS, and anti-malware software.
Inadvertent leakage, which is typically the focus of traditional DLP solutions. Inadvertent data leakage could be a user e-mailing a confidential presentation to a personal e-mail account, storing data on a free Internet-based storage service, or even using personal devices to store company data.
- Enforcing network rules off the managed network or modifying rules for more hostile networks.
- Restricting sensitive content from portable storage, including USB drives, CD/DVD drives, home storage, and devices.
- Restricting copy and paste of sensitive content.
- Restricting applications allowed to use sensitive content — e.g., only allowing encryption with an approved enterprise solution, not tools downloaded online that don’t allow enterprise data recovery.
- Auditing use of sensitive content for compliance reporting.
- Used with complementary controls to prevent confidential data from leaking.
- Prevent unauthorized parties from accessing your data
- Prevent data loss and data breaches
- Provide better visibility and control over your data
- Improve compliance and avoid compliance violations
- Protect data against security threats caused by BYOD and IoT
- Provide a 360° view of the location and flow of data
All verticals, with a focus on industries where data management is a key priority, such as finance, healthcare, and government.
DLP platforms need to be implemented not just as an IT initiative but with the broader, more business-oriented information context taken into account; otherwise, they will deliver little value.
DLP requires active participation from the business units that own the risk of the information the DLP controls; without it, detection policies can become misaligned with the actual risks.
The digital signature provides proof of origin and addresses authenticity, integrity, timestamping, and non-repudiation requirements. It employs asymmetric cryptographic operations, implemented through Public Key Infrastructures that play the role of root of trust.
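The sign-then-verify flow can be sketched with textbook RSA and tiny, insecure parameters — illustration only, not a real implementation (production signatures use vetted libraries, padding schemes, and certificate-backed keys):

```python
import hashlib

# Toy textbook-RSA signature with tiny primes -- illustrative only,
# never secure for real use (no padding, key far too small).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def sign(message: bytes) -> int:
    # Hash the message, reduce into the modulus, then apply the private key.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"contract v1")
print(verify(b"contract v1", sig))            # True: signature checks out
print(verify(b"contract v1", (sig + 1) % n))  # False: a forged signature fails
```

Only the private exponent can produce a value that verifies under the public exponent, which is what yields proof of origin and non-repudiation.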
- Code signing and software authenticity validation
- Alternative to classic methods of authenticity validation (handwritten signature, stamping, etc.)
- Protection of intellectual property
- Validation of commercial transactions
- Validation of communication
- Validation of trust for software
- Digitalization of document control systems
- Streamlining financial operations
- Improving interactions between citizens and local administrations/public entities
All, but especially:
Public Sector and Defence
- While implementation of PKIs has become the norm in enterprise environments, they have often been designed to address information security requirements such as encrypted communication; digital signature requires more planning and configuration to achieve legally binding status and sometimes requires investment in a new PKI.
- As technology evolves and new cryptographic algorithms are deployed, older implementations become obsolete or incapable of addressing newer requirements and require a technology refresh.
- Digital signing software must not only be capable of applying digital signatures but must also attest to the secure implementation of cryptographic operations.
- All implementations must ensure long-term validation of digital signatures, regardless of whether the signing certificate is still valid at the time of checking.
Fraud detection performs real-time, near-real-time and/or batch analysis of activities, ensuring the safety and reliability of information. Activities can be submitted manually or automatically, and the assets can belong to private consumers or businesses.
- Credit card fraud detection
- Compromised authentication credentials (such as passwords)
- Blacklists of known insurance fraudsters
Strong user authentication alone is no longer enough to stop attackers, so background fraud detection is needed to detect anomalies. Everything from abnormal user behavior to suspicious transactions can be spotted by a fraud detection system.
Markets such as banking, insurance, e-commerce and gaming have the highest demand for fraud detection. Other markets like government benefit programs or collection agencies are slower at implementing fraud detection.
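A minimal anomaly-detection sketch (with made-up transaction amounts) shows the idea behind spotting suspicious activity; real systems use far richer features and models than a single z-score:

```python
from statistics import mean, stdev

# Hypothetical history of an account's recent purchase amounts.
history = [12.0, 9.5, 14.2, 11.8, 10.4, 13.1, 12.7, 9.9]

def is_suspicious(amount: float, history: list[float], threshold: float = 3.0) -> bool:
    # Flag an amount whose z-score against the account's own
    # behaviour exceeds the threshold.
    mu, sigma = mean(history), stdev(history)
    return abs(amount - mu) / sigma > threshold

print(is_suspicious(11.0, history))   # False: within normal behaviour
print(is_suspicious(950.0, history))  # True: far outside the pattern, flag for review
```

This is the "background" part of fraud detection: it runs after authentication, scoring each activity against learned behaviour rather than trusting credentials alone.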
A Public Key Infrastructure (PKI) supports the distribution and identification of public encryption keys, enabling users and computers both to securely exchange data over networks such as the Internet, IoT, and OT environments, and to verify the identity of the other party. A typical PKI consists of hardware, software, policies, and standards to manage the creation, administration, distribution, and revocation of private and public keys and digital certificates. Digital certificates are at the heart of PKI, as they affirm the identity of the certificate subject and bind that identity to the public key contained in the certificate.
- Complexity of governance and security of access
- Enforcing compliance with different markets and regulations
- Increase security and compliance of applications in the environment
- Increase certificate management process strictness
- Secure Email with signed or encrypted messages
- Secure browsing, documents and files with rights management, authentication and encryption (Encrypted File System)
- Build an internal (company) or external (public) trust chain that permits issuing certificates for digital signature, server authentication, client authentication, key encryption, email protection, data encipherment, timestamping, and non-repudiation; prove the state of issued digital identities with CRL and OCSP mechanisms; and manage the entire lifecycle of certificate authority and end-entity certificates.
- Improve security in authentication use cases by replacing login/password authentication with certificate-based authentication; it removes the risk of sharing passwords while offering the option to delegate access rights.
- Simplify the identity management process and authentication on applications. In particular, end users no longer need to define a new password every X months and potentially lose it.
- Decrease support costs and increase productivity by reducing the amount of time spent by the help desk and employees.
- Improve security in communications between users and machines, between machines, and between users.
- TLS encryption (e.g., exchanging credit card information to make a purchase on the Internet)
- Mutual authentication (e.g., ensuring that a user connects to the appropriate service and that the service correctly identifies and authenticates the user)
- Email signature and encryption (e.g., sending quotes, contracts, and billing information to a customer)
- Digital signature (e.g., digitally signing a contract or an offer)
- Simplify management of keys and certificates.
- Have a concrete view of each issued certificate.
- Be aware of certificates that are expiring soon.
PKI systems are widely deployed today, as asymmetric cryptography provides the highest level of security in authentication, signature, and encryption use cases. Most customers are already convinced of the criticality and usefulness of PKI systems.
Nevertheless, it is difficult to introduce a new PKI solution into a corporate network that already has its own PKI. Changing a PKI solution implies a long process of certificate migration. Once the new PKI is in place, each previously issued certificate must be reissued on the new PKI and replaced on the appropriate machine/cryptographic device.
In the future, PKI will need to provide answers to new challenges. For example, PKI solutions must adapt to post-quantum cryptographic algorithms to remain relevant; in a few years' time it will be very hard to deploy a PKI solution unable to manage such algorithms.
Privacy by design
Privacy risks refer to the impact of processing individual information in applications and systems. Privacy by Design is a set of mandatory privacy principles applicable in certain geographies, such as Europe and Canada. It means protecting privacy by embedding it early in applications, procedures, or processes.
- E-petitions: possible risk is disclosure of information about the individual who signed
- Electronic toll pricing: possible risk is third party access to location data
- Smart metering (e.g., energy consumption): possible risk is profiling based on consumption patterns
Privacy by design aims to prevent risks through controls built into systems and processes. Privacy and security functions should ensure that businesses act according to their stated commitments and objectives.
No vendors identified
In terms of maturity, privacy by design is emerging, with a penetration of 5% to 20% of the target audience and a moderate benefit rating.
Privacy today is not something that is tackled from the beginning, but rather added afterwards. Vendors use statements like “built with privacy by design in mind” but struggle to provide reference material to support the claim.
Privacy by design is not new; it was first discussed in the early 1990s. Even so, privacy controls are regarded as a cultural change in processing personal data, and privacy professionals are currently intensifying discussions on how privacy by design is best implemented.
An essential part of all highly regulated environments, and especially those relying on non-repudiation services, the concept of time stamping, described in RFC 3161, addresses the need to prove that data existed at a certain point in time. Time stamping services are usually provided through PKI implementations acting as Time Stamping Authorities, and are therefore part of the regular offerings of commercial public certification authorities.
- Certifying the time succession of information transactions
- An important part of any blockchain implementation
- Plays an essential role for determining the non-repudiation status of data
- Gives a clear picture of the succession of data events in time.
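The RFC 3161-style flow above can be sketched conceptually: the client submits only a hash of its data, and the authority binds that hash to the current time and signs the pair. The HMAC key below stands in for the X.509-based signature a real TSA would apply, and all values are hypothetical:

```python
import hashlib, hmac, json, time

# Stand-in for the TSA's signing key; a real TSA signs with a certificate.
TSA_KEY = b"demo-tsa-key"

def request_timestamp(data: bytes) -> dict:
    # The TSA never sees the data itself, only its hash.
    token = {"hash": hashlib.sha256(data).hexdigest(),
             "time": int(time.time())}
    payload = json.dumps(token, sort_keys=True).encode()
    token["signature"] = hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest()
    return token

def verify_timestamp(data: bytes, token: dict) -> bool:
    payload = json.dumps({"hash": token["hash"], "time": token["time"]},
                         sort_keys=True).encode()
    expected = hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, token["signature"])
            and token["hash"] == hashlib.sha256(data).hexdigest())

tok = request_timestamp(b"signed contract")
print(verify_timestamp(b"signed contract", tok))    # True
print(verify_timestamp(b"tampered contract", tok))  # False
```

Anyone holding the token can later prove the data existed at the recorded time, without the authority having stored or seen the data.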
In particular: Financial services, Legal, Retail
In general: all verticals for certifying legal and contractual agreements
The challenges are mostly implementation-related, as most enterprise environments already benefit from a PKI that can play the role of the Timestamping Authority.
All components in the timestamping implementation must both be capable of timestamping and prevent alteration of the timestamp record.
While the PKI can be designed with redundant components, a failure of the system directly impacts the ability to timestamp.
Attribute-based encryption is an innovative encryption scheme developed to address the challenges of protecting data in the Cloud and of providing greater control over who should access what data. Each user is identified by a set of attributes, and some function of those attributes determines decryption ability for each ciphertext. The scheme can also be constructed so that multiple authorities distribute attributes, removing the need for a central authority.
There are several attribute-based encryption schemes, such as KP-ABE, CP-ABE, and H-ABE, and they all come with their unique benefits and challenges.
Key area for application is encrypting assets within cloud environments, where traditional role based encryption systems fail to provide the level of granularity required to secure sensitive data – e.g. patient records within Healthcare. Lightweight attribute-based schemes have also been developed for IoT devices.
The multi-authority version of attribute-based encryption provides fine-grained access control for sensitive data stored in the cloud. Unlike existing encryption techniques, attribute-based encryption is more resilient to insider attacks and data theft.
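The access logic of CP-ABE can be illustrated with a toy model: a ciphertext carries a policy over attributes, and a user's attribute set either satisfies it or not. The attribute names are hypothetical, and the plain-Python check below stands in for the pairing-based cryptography that enforces the policy mathematically in a real scheme:

```python
# Policy tree: ("AND"/"OR", [clauses]); a leaf clause is a set of
# attributes that must all be held.
def satisfies(policy, attributes: set) -> bool:
    op, clauses = policy
    results = (c <= attributes if isinstance(c, set) else satisfies(c, attributes)
               for c in clauses)
    return any(results) if op == "OR" else all(results)

# Hypothetical policy: (cardiologist AND st-marys) OR auditor
policy = ("OR", [{"role:cardiologist", "org:st-marys"}, {"role:auditor"}])

print(satisfies(policy, {"role:cardiologist", "org:st-marys"}))  # True
print(satisfies(policy, {"role:cardiologist", "org:other"}))     # False
print(satisfies(policy, {"role:auditor"}))                       # True
```

The granularity comes from the policy being attached to the data itself, not to a role defined in the storage system — exactly what role-based schemes struggle to express for records such as patient files.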
All, particularly Healthcare
A number of attribute-based schemes have been developed, but they all come with their own unique benefits and drawbacks. Research and innovation in the area is still ongoing into how to effectively revoke attributes and how to reduce execution times and computation costs.
Cloud Testing tools
Cloud testing tools refer to using cloud technology to perform testing in a cloud computing environment. Cloud testing's purpose is to test functional and non-functional requirements. It comprises cloud-based lab management, service virtualization, on-demand testing tools, and device clouds.
Although several use cases exist, the focus in future is on lab scalability and matching production use case scenarios in a realistic way.
Performance testing: Manual and/or automatic scaling should not cause any disruption
Security testing: Data access is granted only to authorized customer
Functional testing: Proper integration with other applications
Network Testing: Data must keep integrity during transfer
Load and Stress Testing: Check how the system fails or changes over time as load increases
Cloud testing tools help companies to control costs, especially if using testing tools is seasonal. It is also beneficial for companies that rely mainly on manual testing.
Cloud testing tools also simplify the build-and-test loop, as they integrate well with CI/CD pipelines.
Using a virtualized infrastructure as test labs can reduce costs, but organizations must monitor and balance consumption based on usage profiles.
Database encryption solutions protect large volumes of data at rest; specifically, they secure the column, table, or on-premises instance of relational database management system platforms.
API Method: Encryption is handled in the application, which queries within the encrypted columns. It can be time-consuming and can create performance issues.
Plug-In Method: An encryption module is attached to the specific database. It works separately from the application and is thus more flexible.
TDE Method: Transparent data encryption (TDE) is done within the database engine itself, performing encryption and decryption without code modification to either the database or the application.
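The plug-in approach can be sketched as a small module that encrypts one sensitive column before it reaches the database, leaving the rest of the schema untouched. The toy keystream cipher below (SHA-256 in counter mode) is for illustration only; a real module would use a vetted cipher such as AES-GCM:

```python
import hashlib, sqlite3

KEY = b"column-master-key"  # hypothetical; a real key comes from a KMS

def _keystream(nonce: bytes, length: int) -> bytes:
    # Toy keystream: hash the key, a per-row nonce, and a counter.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(KEY + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_col(value: str, nonce: bytes) -> bytes:
    data = value.encode()
    return bytes(a ^ b for a, b in zip(data, _keystream(nonce, len(data))))

def decrypt_col(blob: bytes, nonce: bytes) -> str:
    return bytes(a ^ b for a, b in zip(blob, _keystream(nonce, len(blob)))).decode()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT, ssn BLOB)")
db.execute("INSERT INTO patients VALUES (1, 'Ada', ?)",
           (encrypt_col("123-45-6789", b"row-1"),))

row = db.execute("SELECT ssn FROM patients WHERE id = 1").fetchone()
print(row[0])                          # ciphertext bytes: unreadable at rest
print(decrypt_col(row[0], b"row-1"))   # 123-45-6789
```

Because the module sits outside the application logic, swapping the cipher or key source does not require changes to the queries themselves — the flexibility the plug-in method is valued for.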
When deploying database encryption, the primary target is specific regulated data (credit card numbers, personally identifiable information, protected health information, and financial data). Critical, non-regulated data is also considered.
However, database encryption is increasingly used to support access controls required by data protection and privacy laws and to address data residency situations. It is also actively used to minimize the risk of data breach, maintain data privacy, and enforce access control and segregation of duties.
In terms of maturity, database encryption is mature and mainstream, with a penetration of 20%–50% of the target audience.
A major challenge for database encryption is resource allocation and authorization schemes, since data must be rapidly decrypted and re-secured once access to it is no longer needed.
Data Discovery & Classification
Data classification is the process of organizing information assets using an agreed-upon categorization, taxonomy, or ontology. It enables effective and efficient prioritization for data and analytics governance policies that span value, security, access, usage, privacy, storage, ethics, quality, and retention. Through the discovery process, data is gathered, merged, and interpreted to provide businesses with more precise insights regarding their customers, business, and industry, as well as critical information outside IT networks. More concretely, data discovery and classification gives an organization visibility of user behavior in the cloud and lets it assess the nature of its data and how that data is accessed. Data-driven decision-making is an important factor in data discovery market growth, alongside BI tools. Data discovery applications use visual tools such as geographical maps, heat maps, and pivot tables to make the process of finding patterns faster and more intuitive. Data discovery is increasingly popular because of privacy regulations, but is also relevant to several other markets.
- Privacy compliance
- Risk mitigation
- Master data and application data management
- Data stewardship
- Data discovery for analytics and application integration
- Data catalogs for operations and analytics
- Efficiency and optimization of systems, including tools for individual DataOps
- Content and records management
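A hedged sketch of rule-based discovery and classification: scan text for PII-like patterns and assign a sensitivity label. Real products combine many more detectors (checksums, dictionaries, machine learning) than the two regexes assumed here:

```python
import re

# Two illustrative detectors; patterns are deliberately simple.
DETECTORS = {
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text: str) -> tuple[str, list[str]]:
    # Discovery step: which sensitive patterns appear?
    found = [name for name, rx in DETECTORS.items() if rx.search(text)]
    # Classification step: assign a label based on the findings.
    label = "confidential" if found else "public"
    return label, found

print(classify("Contact: jane.doe@example.com, card 4111 1111 1111 1111"))
print(classify("Quarterly roadmap review notes"))
```

The first document would be routed to stricter governance policies; the second can be handled under default rules — exactly the prioritization the classification process enables.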
Discovery and classification of data is increasingly trending in the context of digital business transformation and new privacy regulations. Classification is used by security professionals to assess and assign profile risks on data. The two combined can produce faster, more reliable, and efficient data use for discovery, risk reduction, value assessment and analytics. This enables organizations to focus security and analytics efforts primarily on their most important datasets.
Banking, Financial Services, and Insurance (BFSI), Telecommunications and IT, Retail and E-commerce, Manufacturing, Energy and Utilities
Data discovery and classification in a unified manner across cloud and on-premises environments is quite complex considering the scale, data types, and platform architectures involved. Additionally, data is frequently in transit, which requires continuous classification. Data protection regulations have increased the need to classify data around individuals and entities, with privacy-centric classification efforts, and as a foundation for data life cycle management. Attempts to identify, tag, and store all of an organization's data without first considering the utility, value, and risk of that data are difficult at best.
Born from the requirement to create an accurate, centralized, and homogeneous data set by combining various data sources with no common scheme, data mapping has become a central focus point for organizations whose activities rely on, or are related to, processing large amounts of data. Mapping data into a common scheme involves complex operations that consist of identification, transformation, mediation, and eventually consolidation. Initially a human-effort-intensive process, it has since shifted towards more automated methods such as semantic, data-driven mapping.
- Building complex population social structure statistics combining census and public authorities’ information.
- Building up information repositories.
- Media and communication technology.
The ability to build vast repositories of information by correlating various existing dispersed sources into a data warehouse without copying full sets of source data.
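The identification-transformation-consolidation steps can be sketched as follows; the source field names and records are hypothetical:

```python
# Identification: each source's fields mapped to the common scheme.
CRM_MAP    = {"full_name": "name", "mail": "email"}
LEGACY_MAP = {"person":    "name", "email_address": "email"}

def to_common(record: dict, mapping: dict) -> dict:
    # Transformation: rename source fields into the common scheme,
    # dropping anything the mapping does not cover.
    return {common: record[src] for src, common in mapping.items() if src in record}

crm_rows    = [{"full_name": "Ada Lovelace", "mail": "ada@example.com"}]
legacy_rows = [{"person": "Alan Turing", "email_address": "alan@example.com"}]

# Consolidation: both sources land in one homogeneous set.
warehouse = ([to_common(r, CRM_MAP) for r in crm_rows]
             + [to_common(r, LEGACY_MAP) for r in legacy_rows])
print(warehouse)
```

Automated, data-driven mapping essentially tries to infer the `*_MAP` dictionaries from the data itself instead of having analysts write them by hand.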
Public sector, Media, Healthcare and Life science
- The data mapping process becomes more and more complex due to the amounts of data and the complexity of systems that use the data, requiring higher computing power and automation. Cloud computing in general and the adoption of serverless, microservices based operations addresses this challenge.
- The necessity to comply with data protection regulations, especially for existing data repositories at the time regulation comes into effect.
Dynamic Data Masking
The goal of Dynamic Data Masking is to minimize the exposure of sensitive data without extensive modification of existing applications. It masks data in real time, altering the data stream as requests are received, so that the requester cannot see the sensitive data.
- Meeting compliance regulations: a call-center operator is not allowed to see all of a calling customer's sensitive personal data (credit card number, passport number, etc.). The data shown to the operator is based on the user's authorization level.
- Compliance with regulations
- Avoid security concerns
- No need to modify the existing applications
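The masking step can be sketched as a function at the data-access layer: the stored value is untouched, and what the caller sees depends on their role (role names are hypothetical):

```python
def mask_card(card_number: str, role: str) -> str:
    # Privileged roles see the full value; everyone else gets a masked view.
    if role == "fraud_analyst":
        return card_number
    digits = card_number.replace(" ", "")
    return "*" * 12 + digits[-4:]   # e.g. call-center view: last 4 digits only

record = "4111 1111 1111 1111"
print(mask_card(record, "call_center"))    # ************1111
print(mask_card(record, "fraud_analyst"))  # 4111 1111 1111 1111
```

Because the transformation happens on the response stream, the application and the stored record need no changes — which is also why the original data remains unprotected at the storage level.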
Healthcare System, Banking, Manufacturing, Automotive, Energy (Oil and Gas), Government, Legal, Telecommunications
A challenge is that Dynamic Data Masking does not protect the original data at the storage level; it protects data only when it is accessed.
Also found under the name lightweight cryptography and the subject of an international standard, ISO/IEC 29192, this is the technology that allows for “just enough encryption” to secure the communication of electronic messages between resource-constrained systems that were not initially designed for fully encryption-enabled communication and that would otherwise communicate over insecure protocols. Due to their lower computational requirements, symmetric key algorithms are better suited to lightweight encryption implementations, but some implementations employ public key encryption as well.
- Securing communication between IoT devices operating within a manufacturing plant environment
- Securing Health monitoring devices communication
- Securing sensor communication within automotive systems
- Portable handheld devices deployed in a warehouse
- Automatic self-check-out stores
Allows for confidentiality and integrity of communication and data reliability by mitigating man-in-the-middle (MITM) attacks between devices
Manufacturing, Healthcare, Transportation
- Target devices are by design built with limited computational resources.
- Cryptographic operations have a direct negative impact on usually limited power resources, especially for high-throughput systems.
- Cryptographic operations negatively impact the communication latency which is important in some applications.
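As a concrete example of a lightweight design, here is a sketch of Speck32/64, an ARX (add-rotate-xor) block cipher aimed at constrained devices: 16-bit words, a 64-bit key, 22 rounds, and only cheap word operations. It is illustrative, not a vetted implementation:

```python
MASK = 0xFFFF  # 16-bit words

def ror(x, r): return ((x >> r) | (x << (16 - r))) & MASK
def rol(x, r): return ((x << r) | (x >> (16 - r))) & MASK

def expand_key(key):
    # key: four 16-bit words -> 22 round keys.
    l, k = list(key[1:]), [key[0]]
    for i in range(21):
        l.append(((k[i] + ror(l[i], 7)) & MASK) ^ i)
        k.append(rol(k[i], 2) ^ l[-1])
    return k

def encrypt(x, y, ks):
    for k in ks:  # one ARX round per round key
        x = ((ror(x, 7) + y) & MASK) ^ k
        y = rol(y, 2) ^ x
    return x, y

def decrypt(x, y, ks):
    for k in reversed(ks):  # invert each round in reverse order
        y = ror(y ^ x, 2)
        x = rol(((x ^ k) - y) & MASK, 7)
    return x, y

ks = expand_key((0x0100, 0x0908, 0x1110, 0x1918))
ct = encrypt(0x6574, 0x694C, ks)
print(decrypt(*ct, ks) == (0x6574, 0x694C))  # round-trip holds
```

The entire cipher fits in a handful of additions, rotations, and XORs per round, which is what keeps its footprint small enough for sensors and handheld devices.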
KMaaS (Key Management as a Service)
KMaaS are KMS (Key Management Server) solutions provided as a service. They tackle some of the barriers to adoption of encryption, enhancing robustness, flexibility, and simplicity.
- Their underlying security modules can be either hardware or software.
- Most Cloud Service Providers have KMaaS natively available but, for regulatory purposes or based on a risk assessment requiring segregation of duties, many organizations choose to take key management out of the CSP for some of their data
- Central view on all encryption policies
- Simplified migration away from an obsolete or deprecated encryption scheme or algorithm.
- Greater control on encryption keys generation and storage.
- Parallel cryptographic operations
- Hybrid and/or Multi Cloud integration
- Direct integration with various other SaaS to do Bring-Your-Own-Key (GCP, AWS, Azure, Salesforce, O365, …)
- Expose a KMIP interface that allows integration with any application compatible with this standard (VMware vCenter, for example)
- Expose specific integrations with some of the major SaaS offerings on the market.
- Strong data protection regulatory compliance with low audit reporting costs
- Cost reduction on encryption policies management
- Enhanced and scalable encryption keys security and resiliency
- Encryption policy visibility and consistent enforcement across all environments (On Premise and Multi Cloud)
- Centralized traceability of cryptographic operations and access to decryption keys
- Simplicity (compared to hosting and managing KMS/HSM)
- Benefit from advanced cryptographic solutions without the need to host & manage them
- Price, as the as-a-service model is more attractive financially.
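The typical KMaaS interaction is envelope encryption: the service keeps the master key and hands out per-object data keys, one plaintext copy for immediate use and one wrapped copy for storage. In the sketch below, XOR with a hash-derived keystream stands in for AES, and the `ToyKMS` interface is hypothetical:

```python
import hashlib, secrets

def _xor(data: bytes, key: bytes) -> bytes:
    # Toy cipher: XOR against a hash-derived keystream (stand-in for AES).
    stream, i = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class ToyKMS:
    def __init__(self):
        self._master = secrets.token_bytes(32)   # never leaves the service

    def generate_data_key(self):
        dk = secrets.token_bytes(32)
        return dk, _xor(dk, self._master)        # plaintext + wrapped copy

    def unwrap(self, wrapped: bytes) -> bytes:
        return _xor(wrapped, self._master)

kms = ToyKMS()
dk, wrapped = kms.generate_data_key()
ciphertext = _xor(b"customer record", dk)   # client-side encryption
del dk                                      # only the wrapped key is stored

plaintext = _xor(ciphertext, kms.unwrap(wrapped))
print(plaintext)  # b'customer record'
```

The client stores only ciphertext plus the wrapped key; every decryption requires a call back to the service, which is what gives KMaaS its centralized traceability and segregation of duties.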
All verticals, especially if we consider BYOK use cases and particularly those where a large number of cryptographic operations are required, such as Financial Services, Government with growing importance on Healthcare as well for patient records protection.
- Complexity of the deployment options and lack of clarity from providers
- Impact on Business Applications performance
- Dependency on the KMaaS provider.
Anonymization tools can encrypt or remove personally identifiable information from datasets with the purpose of preserving a data subject’s privacy.
Anonymization Tools can be used by analytics teams.
Related anonymization techniques and technologies include: confidential computing, homomorphic encryption, differential privacy, format-preserving encryption (FPE), secure multiparty computation (SMPC), zero-knowledge proofs, multicloud key management as a service (KMaaS), EDRM, Transport Layer Security (TLS) decryption platforms, cloud data protection gateways, CASBs, enterprise key management (EKM), secure instant communications, dynamic data masking, and database encryption.
- Protection of personal health information
- Prevention of any critical data from being processed unlawfully (taking into consideration GDPR and other data compliance regulations)
- Data analytics use cases.
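Two common anonymization moves can be sketched directly: salted-hash pseudonymization of a direct identifier, and generalization of a quasi-identifier (exact age into an age band). The record and salt are hypothetical:

```python
import hashlib

SALT = b"rotate-and-protect-this-salt"   # hypothetical secret

def pseudonymize(identifier: str) -> str:
    # Replace a direct identifier with a stable, non-reversible token.
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()[:12]

def generalize_age(age: int) -> str:
    # Coarsen a quasi-identifier into a decade band.
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

record = {"name": "Jane Doe", "age": 34, "diagnosis": "J45"}
anon = {"subject": pseudonymize(record["name"]),
        "age_band": generalize_age(record["age"]),
        "diagnosis": record["diagnosis"]}
print(anon)   # name removed, age coarsened; diagnosis stays analyzable
```

The trade-off described below is visible even here: the coarser the generalization, the safer the dataset, but the less precise any analysis built on it.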
Anonymization tools play an important role in processing data, especially personal data. Applying them secures the data, and businesses no longer need specific permissions to process it. Risk minimization in data transfer, application of automated Big Data techniques, cost savings from the reduction of fines, and stronger information security are all results of using anonymization tools.
Healthcare Industry, Smart cities, Big Data and Media
Collecting anonymous data and deleting identifiers from the database limits your ability to derive value and insight from your data. For example, anonymized data cannot be used for marketing efforts, or to personalize the user experience.
Relevant data de-identification architectures and algorithms are designed to strike a balance between utility and security for a specific set of use cases and types of data fields. The type of scenario determines whether de-identification is an option, as well as how it relates to other controls and risk levels.
Autonomous Privacy Impact Assessment
Autonomous Privacy Impact Assessment (PIA) is an automated and intelligent system that discovers and classifies data, runs PIA assessments automatically, flags risks, and provides remediations and mitigation controls. It would be a system with a full discovery, assessment, control, and automation workflow.
As all organisations are affected by ever-evolving data privacy legislation and an increase in the amount of data they process, the technology is of interest to all industries, with a wide range of applications.
In today’s connected world, where individuals’ rights to privacy are increasingly protected, organisations must ensure they comply with, and are able to demonstrate compliance with, data protection regulations. Failure to do so results in significant fines and reputational damage. Current point-in-time assessments are no longer fit for our data-rich environments, and autonomous PIA will be able to support the Privacy by Design principle, continuous data privacy compliance assurance, and effective risk management.
All, particularly Public Sector and Healthcare
Currently, automated tooling provides support mostly on the discovery and process side; as the technology becomes more autonomous, the question arises of what can be left to intelligent systems. The decisions that need to be made are often context-specific, and the idea of encoding regulations and legal norms at the start of information processing systems is at odds with the dynamic and fluid nature of many legal norms, which need a breathing space that typically cannot be embedded in software.
Blockchain/DL for Data Security
Blockchain is used to synchronize data stored in a distributed manner amongst peers on all the computers or servers participating in a particular network. Trust is created because all the nodes in the network control, check and consent to any additions or changes to what is recorded. Once stored on the blockchain, the data cannot be manipulated or changed – it is immutable.
Depending on the specific application, one of three types of blockchain can be used:
- Public blockchain
- Private blockchain
- Hybrid blockchain
- Record keeping, transferring value (via cryptocurrencies or otherwise), and smart contracts that automatically execute a transaction when one or more preconditions are met
- Key areas where blockchain applications are being explored are cloud storage, art and ownership, anti-counterfeiting, governance, Internet of Things, and digital identity
From a data security perspective, blockchain can be an effective tool and solution to ensure data Confidentiality, Integrity and Availability, offering improved resilience, encryption, auditing and transparency, and has a great potential to reduce costs, especially for financial institutions.
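The immutability property rests on hash chaining, which a minimal sketch makes concrete (no consensus or networking — just the data structure):

```python
import hashlib, json

def block_hash(block: dict) -> str:
    # Deterministic hash over the whole block, including its back-link.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "data": "genesis", "prev": "0" * 64}]

def append(data: str):
    # Each new block commits to its predecessor's hash.
    chain.append({"index": len(chain), "data": data,
                  "prev": block_hash(chain[-1])})

def is_valid() -> bool:
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

append("asset transferred to B")
append("asset transferred to C")
print(is_valid())                                    # True
chain[1]["data"] = "asset transferred to Mallory"    # tamper with history
print(is_valid())                                    # False: the chain breaks
```

Editing any past block changes its hash, so every later back-link fails — in a real network, the other nodes' copies would reject the tampered history.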
Aviation and transport
Blockchain is still an evolving technology, which means there are still a number of challenges that need to be resolved so the adoption of this technology can accelerate. From an organizational point of view, the issues of acceptability and the need of new governance models are the main barriers to adoption. Moreover, the lack of legal and regulatory support is another barrier.
Growing concerns over consumer privacy have led to regulations that now govern how businesses handle customer data. Consent Management captures and enforces user choices regarding how their personal data should be handled. The ultimate goal is for consumers themselves to determine how much of their data is shared and with whom, with the possibility to change their choices or opt out and have their data removed. In the European Union (EU), GDPR has made consent management a necessity for businesses, including those based in the United States but operating in the EU.
- Comply with GDPR and ensure customer data privacy
- Request for removal of personal data e.g. credit card details from a bank’s system.
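A consent store can be sketched as a per-subject, per-purpose record with grant, withdraw, and check operations; the interface and names are hypothetical:

```python
import time

class ConsentStore:
    def __init__(self):
        self._records = {}   # (subject, purpose) -> record

    def grant(self, subject: str, purpose: str):
        # Record when consent was given, per purpose.
        self._records[(subject, purpose)] = {"granted_at": time.time(),
                                             "active": True}

    def withdraw(self, subject: str, purpose: str):
        # Opt-out: keep the record for audit, but deactivate it.
        if (subject, purpose) in self._records:
            self._records[(subject, purpose)]["active"] = False

    def may_process(self, subject: str, purpose: str) -> bool:
        rec = self._records.get((subject, purpose))
        return bool(rec and rec["active"])

store = ConsentStore()
store.grant("user-42", "marketing_email")
print(store.may_process("user-42", "marketing_email"))  # True
store.withdraw("user-42", "marketing_email")            # opt-out honoured
print(store.may_process("user-42", "marketing_email"))  # False
```

Every processing path checks `may_process` before touching personal data, which is how the user's changing choices propagate through the business.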
DataSecOps should be considered an extension of DevOps practices and principles to data. It is a practice focused on collaboration between admins, engineers, applications, and AI to deliver data fast and in a secure way. The focus is on automation and on providing reliable and secure data pipelines that reduce risks and increase productivity. DataOps uses technology to orchestrate and automate data delivery with the appropriate levels of security and quality. However, DataOps is a practice (like DevOps), not a technology or tool. It is a cultural change that is supported by tooling.
This can help in data science and AI projects to deliver data securely for analysis.
As data security becomes part of development, plugged-in modules for this purpose become obsolete. This reduces costs (e.g., licensing) and, just as DevOps reduces time to market, it may also reduce reaction times to emerging threats, since no third-party reaction time, compatibility tests, and so on are needed.
Scientific communities can benefit in healthcare areas.
Very early days. DataSecOps has not yet evolved as a practice or discipline that an organization can identify as providing value or improving delivery.
Machine Ethics is focused on adding an ethical dimension to machines and is concerned with ensuring that the behaviour of machines toward human users, and perhaps other machines as well, is ethically acceptable. As there is a potential for AI to have one day greater-than-human-level abilities, the importance of the machines being capable of moral reasoning, judgment, and decision-making becomes critical.
Ethics in technology is critical across the board where AI and ML are used to ensure non-discrimination, due process, transparency and understandability in decision-making processes.
While the benefits of intelligent machines are clear, it is essential to facilitate and support work on defining a commonly accepted ethical framework. Such a framework would build on the principles of beneficence, non-maleficence, autonomy, and justice; on principles and values based on fundamental rights, such as human dignity, equality, justice and equity, non-discrimination, informed consent, private and family life, and data protection; on other underlying principles and values, such as non-stigmatisation, transparency, autonomy, individual responsibility, and social responsibility; and on existing ethical practices and codes.
Machine Ethics is a rich, complex, multi-disciplinary and very fast-moving topic, perhaps more complex than many other ethical issues facing society today. There are therefore currently no commonly accepted, agreed-upon standards or guidance.
File Analysis is software that can analyze, search, track, index and report on file content, as well as on the file metadata of unstructured data. File Analysis software is available on-premises as well as a SaaS solution. It helps classify valuable business data so that it can be more easily discovered and leveraged. These capabilities also support e-discovery, analytics and data migration.
- File Analysis can be applied to help with compliance regulations, such as GDPR, in a manner that can rapidly identify sensitive personal data or key corporate data.
- Risk mitigation: File Analysis software helps you better understand the risk of your unstructured data footprint by revealing where the data resides and who has access to it.
- Optimization of storage utilization by finding and eliminating obsolete or duplicated data
- Finding and eliminating or quarantining sensitive data
- Identification of access permissions (who can see the data, who has rights to the data)
- Text analytics
- Available as a SaaS solution as well as on-premises.
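The core of such a tool (collecting file metadata, then pattern-matching content for sensitive data) can be sketched in a few lines of Python. The patterns and report fields below are illustrative assumptions, not any vendor's API; commercial products ship far richer classifiers:

```python
import os
import re
from datetime import datetime, timezone

# Hypothetical patterns for sensitive content; real File Analysis
# products detect many more categories (names, IBANs, health data, ...).
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def analyze_file(path):
    """Return file metadata plus any sensitive-pattern labels found."""
    stat = os.stat(path)
    try:
        with open(path, encoding="utf-8", errors="ignore") as f:
            text = f.read()
    except OSError:
        text = ""
    labels = sorted(name for name, rx in SENSITIVE_PATTERNS.items()
                    if rx.search(text))
    return {
        "path": path,
        "size_bytes": stat.st_size,
        "modified": datetime.fromtimestamp(stat.st_mtime, timezone.utc).isoformat(),
        "sensitive": labels,
    }

def scan(root):
    """Walk a directory tree and yield reports for files with sensitive content."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            report = analyze_file(os.path.join(dirpath, name))
            if report["sensitive"]:
                yield report
```

A report like this is what feeds the risk-mitigation and storage-optimization use cases above: it tells you what the data is, where it lives, and how stale it is.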
Healthcare, Banking, Manufacturing, Telecommunications, Legal, Financial
- Higher cost if an on-premises solution is already in place or is chosen
- Defining the data classification scheme that best fits your organization.
Homomorphic encryption (HE) makes it possible to perform certain operations (or, with fully homomorphic encryption, arbitrary computations) on encrypted data without revealing the data to anyone.
Technically, HE is a set of algorithms that enable computation on encrypted data. In other words, it is a cryptographic method that enables third parties to process encrypted data and return an encrypted result to the data owner, while providing no knowledge about the data or the results.
Modern HE schemes leverage lattice-based cryptography, which is built on relatively simple mathematical tools such as linear algebra and is also considered among the best candidates to resist quantum computers. HE is based on a generalisation of public-key cryptography, with an extra public key used to perform operations on ciphered data.
Ciphered data will remain secret until decrypted via the private key. Only the individual with the matching private key can access the unencrypted data after the functions and manipulation are complete. This allows the data to be and remain secure and private even when someone is using it. HE enables algorithm providers to protect proprietary algorithms and data owners to keep data private.
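The additively homomorphic Paillier scheme illustrates the idea: anyone holding only the public key can add two encrypted values by multiplying their ciphertexts. The Python sketch below uses small fixed primes for readability; a real deployment needs large random primes (2048+ bits) and a vetted library, never hand-rolled crypto:

```python
import math
import random

# Toy primes for illustration only; far too small for real security.
P, Q = 1_000_003, 1_000_033

def keygen(p=P, q=Q):
    """Paillier key generation with generator g = n + 1."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid because g = n + 1
    return (n,), (lam, mu, n)     # public key, private key

def encrypt(pub, m):
    (n,) = pub
    r = random.randrange(2, n)    # random blinding factor coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    n2 = n * n
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)        # x = 1 + m*lam*n (mod n^2)
    return ((x - 1) // n) * mu % n

def add_encrypted(pub, c1, c2):
    """Multiplying ciphertexts adds the underlying plaintexts."""
    (n,) = pub
    return (c1 * c2) % (n * n)
```

A third party running `add_encrypted` learns nothing about the operands or the result; only the holder of the private key can decrypt the sum.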
Homomorphic encryption has huge potential in areas with sensitive personal data such as in financial services or healthcare. In these cases, homomorphic encryption can protect sensitive details of the actual data, while still analyzing and processing it.
Homomorphic encryption addresses the C of the CIA triad – Confidentiality and therefore can be a useful tool in ensuring compliance with regulatory requirements.
All, particularly Public Sector and Defense, Financial Sector, Healthcare.
The key barrier to wide-scale adoption is that homomorphic encryption is still very slow, which makes it impractical for many applications today. Work on decreasing the computational overhead required for homomorphic encryption is ongoing.
Targeted delivery of digital experiences to individuals based on their deduced personality and behavioral attributes.
- Targeted advertising
- Digital environments uniquely adapted to an individual
- Personalized and adaptive learning
- Increased business revenue by maximizing the use of available pools of consumer and client data
- Making interaction with digital devices easier, as humans are fond of familiarity
- Paving the way for a not-so-distant future in which the boundary between the behavior of humans and of cyberspace entities fades away
- Privacy concerns
- GDPR restrictions
- Additional tracking technologies, besides cookies and device IDs, needing development and adoption
- Sifting through all the combinations that can interfere with applicable privacy laws
Quantum Key Distribution
Quantum Key Distribution (QKD) is a method for securing communication using cryptographic protocols built on quantum mechanics. It allows for a more secure way to generate a symmetric key, which in turn can be used to encrypt/decrypt messages or secure connections.
As this is an extension to an existing technology, the use cases can be extended from the existing cryptographic protocols:
- Enhanced key generation: by utilizing quantum-mechanical properties, symmetric keys can be generated whose secrecy rests on physical principles rather than on computational assumptions;
- Enhanced communication security: it extends the current security communication protocols by leveraging the quantum domain.
The technology relies on quantum mechanics, which makes tampering with or interception of the key exchange detectable. At the same time, it provides greater resistance against standard attacks.
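The best-known QKD protocol, BB84, can be illustrated with a classical simulation: Alice encodes random bits in randomly chosen bases, Bob measures in his own random bases, and the positions where their bases match form the shared key. The sketch below is a deliberately simplified classical model (a measurement in the wrong basis yields a random bit), not real quantum physics:

```python
import random

def bb84_sketch(n_bits=256, eavesdrop=False):
    """Simplified BB84 simulation. Returns Alice's and Bob's sifted keys;
    an eavesdropper introduces detectable errors (~25% of sifted bits)."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [random.choice("+x") for _ in range(n_bits)]

    channel = alice_bits[:]
    if eavesdrop:
        # Eve measures each qubit in a random basis and re-sends it.
        # A wrong-basis measurement randomizes the bit she forwards.
        for i in range(n_bits):
            if random.choice("+x") != alice_bases[i]:
                channel[i] = random.randint(0, 1)

    bob_bases = [random.choice("+x") for _ in range(n_bits)]
    bob_bits = [channel[i] if bob_bases[i] == alice_bases[i]
                else random.randint(0, 1)
                for i in range(n_bits)]

    # Alice and Bob publicly compare bases (not bits) and keep matches.
    keep = [i for i in range(n_bits) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]
```

Comparing a random sample of the sifted key reveals the error rate: near zero on a clean channel, around 25% if an eavesdropper measured in flight, which is exactly what makes interception detectable.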
As this is a cryptographic method that is yet to reach full commercial value, the adopting verticals are usually the ones in need of high security requirements, such as governmental organizations or banks/financial entities.
Since this is a technology relying on quantum computing, the cost of implementation is sometimes very high.
Quantum-safe encryption (also referred to as post-quantum cryptography, or quantum-proof or quantum-resistant encryption) is a component of Quantum Safe Cryptography (QSC), which aims to address the vulnerability of current public-key cryptography to the rise of quantum computing.
For Quantum Safe Cryptography, the use cases revolve mainly around replacing soon-to-be obsolete standard cryptographic protocols.
- Quantum PKI: by replacing standard cryptographic protocols with quantum-safe protocols, the vulnerability of the current methods of encryption will be mitigated.
Replacing standard cryptographic methods with quantum-safe methods will enhance communication and encryption security and mitigate the quantum vulnerability of current cryptographic algorithms.
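Hash-based signatures are one quantum-resistant family already being standardized (e.g., the stateless scheme SPHINCS+). Their ancestor, Lamport's one-time signature, relies only on a hash function and is simple enough to sketch. The code below is illustrative, and each key pair must sign at most one message:

```python
import hashlib
import secrets

def H(data):
    return hashlib.sha256(data).digest()

def lamport_keygen():
    """One-time key: 256 pairs of random secrets; the public key is their hashes."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def _digest_bits(message):
    d = H(message)
    return [(d[i // 8] >> (i % 8)) & 1 for i in range(256)]

def lamport_sign(sk, message):
    """Reveal one secret per digest bit. NEVER reuse sk for a second message."""
    return [sk[i][bit] for i, bit in enumerate(_digest_bits(message))]

def lamport_verify(pk, message, sig):
    """Check each revealed secret hashes to the published commitment."""
    return all(H(sig[i]) == pk[i][bit]
               for i, bit in enumerate(_digest_bits(message)))
```

The security rests only on the hash function's preimage resistance, which is not known to be broken by quantum algorithms; the trade-offs (large keys and signatures, one-time use) are what schemes like SPHINCS+ engineer around.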
Any industry that relies on standard cryptographic methods will be vulnerable once a commercial quantum computer is available. Adoption would likely come first in the telecom vertical and then move outwards to other sensitive industries, such as government and defense institutions, banking/finance and healthcare.
Quantum computers are a relatively new technology and a commercially available version does not yet exist. At the moment, research and development in this area is very expensive and requires a high degree of knowledge and understanding of other scientific fields, such as mathematics and physics.