Delphix & the EU General Data Protection Regulation (GDPR)

GDPR Observations

Last week I was reading a CMS Wire article by Brian Wallace, titled Who’s Ready for the GDPR? [Infographic], and found a few of the cited data points eye-catching.

On May 25, 2018, the GDPR goes into effect, and according to the embedded infographic…

Note: direct quotes from the infographic are cited in italic.
  1. The GDPR requires that all EU Citizen data [i.e. Sensitive & Personal] be protected as stipulated in the final text of the regulation, even if the data lives outside of the EU.
    • Sensitive data: Name, location, identification numbers, IP address, cookies, RFID info
    • Sensitive personal data: Health data, genetic data, biometric data, racial or ethnic data, political opinions, sexual orientation
  2. 92% of U.S. businesses list GDPR as a top data protection priority
    • 77% of U.S. businesses have started preparing for GDPR, but only 6% are GDPR-ready
      • The low readiness percentage is consistent with my experience working alongside data owners at major U.S. corporations
  3. In addition to protecting EU citizens' data, there are other services a custodian of their data must provide. Some of these include:
    • EU citizens have the right to access their data as well as information about how it is being used
    • EU citizens can take their data to a different agency upon request
    • EU citizens have the right to data erasure
    • Certain companies and governmental organizations must appoint a Data Protection Officer
    • Companies must implement reasonable data protection measures
    • Companies must assess for threats
  4. Noncompliance with the GDPR will be costly. Top-tier fines are set at €20 million or 4 percent of global annual turnover, whichever is greater (see the worked example below)
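
To make "whichever is greater" concrete, here is a small worked example in Python (the turnover figure is hypothetical):

    # Illustrative only: the top-tier GDPR fine is the greater of a flat
    # EUR 20 million or 4 percent of global annual turnover.
    def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
        return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

    # A hypothetical company with EUR 2 billion in global turnover:
    print(max_gdpr_fine(2_000_000_000))  # 80000000.0 -- the 4% tier applies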

What are the challenges specific to a Data Protection Officer?

The challenges are the same ones faced by CIOs and CDOs in major corporations today: secure sensitive and personal data while delivering copies to developers, testers, and analysts in an effort to compete at the speed the Digital Age requires. The corporate metrics used to measure success also remain the same – increase revenue, reduce costs, and stay compliant. That sounds reasonable until you evaluate the application services your organization provides and realize your data is heavy; it is the anchor on which every other task in your process workflow waits.

The challenges I consistently hear from clients include, but are not limited to:

  1. Slow, complex masking process workflows that require teams of programmers to maintain the code
    • Integration testing (i.e. maintaining referential integrity across multiple, disparate, databases) adds substantial complexity
    • No concept of a masked master data set where copies can be quickly created
  2. More than one masking toolset and/or process workflow, which requires multiple skillsets and teams
  3. Masked data does not have realistic values substituted into the fields
  4. Too few copies of data sets for developers, testers, and analysts means sharing
    • a corruption introduced by one individual stops everyone from working
  5. Too many copies of the data sets require a significant amount of storage and time to refresh and manage
    • physically impossible to accommodate due to limited capital resources
  6. Teams that subset data to deliver copies faster and reduce storage are simply pushing the problem downstream
    • Developers cannot test end-to-end processes
      • Too often issues are only exposed in production
    • Testers are limited to a small set of test cases
      • Too often defects are found later in QA
    • QA, as every CIO I speak to tells me, bears the brunt of testing, performing the lion's share of it, meaning
      • issues and defects that should be found in Dev and Test surface in QA instead, typically because Dev and Test are given stale and/or subset data
      • the goal for these CIOs is to shift their testing process workflows left so QA can focus on a finite set of product-quality testing

The Delphix Masking Engine

Addressing #1-3 (above)

Unlike most masking solutions in the industry today, which are complex, require programmers, and are difficult to manage when data sets change, Delphix Masking is a GUI-based software solution. Three powerful, easy-to-use components deliver the core capabilities of an enterprise-class masking tool.

  1. Profile – Scan the selected data sets, identify sensitive data, and return a report of the elements found along with recommended masking algorithms.
  2. Secure (Mask) – Apply the assigned masking algorithms to their respective elements while maintaining referential integrity; no programming required. Elements are masked with fictitious but realistic data substitutions. Once the algorithms are assigned, the masking is consistent and repeatable (a sketch of this idea follows the list).
  3. Audit – To ease the demands of maintaining compliance, Delphix provides a report identifying which sensitive data elements have been protected, simplifying delivery to auditors. Audit will also alert admins if newly added data fields introduce new vulnerabilities.
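
To illustrate why deterministic masking preserves referential integrity, here is a minimal Python sketch of the general technique (a conceptual illustration only, not the Delphix implementation; the key and name pool are hypothetical):

    import hashlib
    import hmac

    # Hypothetical pool of realistic replacement values; a real engine
    # draws from far larger, per-data-type value sets.
    FAKE_NAMES = ["Alice Martin", "Bjorn Larsen", "Chiara Rossi", "Daan de Vries"]

    SECRET_KEY = b"masking-key"  # hypothetical; never hard-coded in practice

    def mask_name(real_name: str) -> str:
        """Map a real value to a fictitious but realistic one, deterministically.

        The same input always yields the same output, so a customer name
        masked in one database matches the masked value in every other
        database -- referential integrity holds across disparate systems.
        """
        digest = hmac.new(SECRET_KEY, real_name.encode(), hashlib.sha256).digest()
        return FAKE_NAMES[int.from_bytes(digest[:4], "big") % len(FAKE_NAMES)]

    # Consistent and repeatable: masking the same source twice agrees.
    assert mask_name("John Smith") == mask_name("John Smith")

Because the mapping is repeatable, a masked master data set can be re-created at any time and every downstream copy stays consistent.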

The Delphix Data Virtualization Engine

Addressing #4-6 (above)

Data virtualization is the complementary capability to masking. Protecting data at rest and in use with masking satisfies regulatory requirements but does nothing to enable your business to ‘go faster’. Why? Because data is still heavy and slow. Delphix Data Virtualization addresses the demands of your business by making data lightweight. What if…

  • you could have a full-size, secure (masked), read-writable copy of any size database in minutes?
  • you could have as many copies of that database as you want or need, without additional storage costs?
  • you could provide developers, testers, and analysts with self-service access to their database (or files), including the ability to:
    • reset, rewind, or refresh their database without opening a ticket
    • bookmark copies of their database for future reference or to share with other teams
    • version control data the way teams do for source code

Well, those are not ‘what if’ scenarios but real capabilities found in the Delphix Data Virtualization Engine. The three areas that define how Delphix manages the data virtualization workflow are:

  1. Collect – Delphix attaches to data sources (databases and applications) using protocols native to the platform.
  2. Control – By maintaining a single set of unique, common blocks in Delphix, users experience a 90% savings in non-prod storage. Leveraging the TimeFlow retention log, users can provision copies from any point in time; masked master copies can be created from which all other copies are produced in minutes, with certainty about where and how the data was protected and distributed (a sketch of this idea follows the list).
  3. Consume – Developers, testers, and analysts can refresh, rewind, restore, bookmark, and share their database(s) and application(s) from any point in time in a matter of minutes versus the hours, days, and weeks required today.
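
To show why virtual copies are nearly free to create and fast to rewind, here is a minimal Python sketch of block sharing with copy-on-write (a conceptual illustration of the general technique, not Delphix internals):

    class BlockStore:
        """A single shared pool of unique data blocks, referenced by id."""
        def __init__(self):
            self.blocks: dict[int, bytes] = {}
            self.next_id = 0

        def put(self, data: bytes) -> int:
            self.blocks[self.next_id] = data
            self.next_id += 1
            return self.next_id - 1

    class VirtualCopy:
        """A 'copy' is just a list of block ids plus private overrides."""
        def __init__(self, store: BlockStore, block_ids: list[int]):
            self.store = store
            self.block_ids = list(block_ids)  # blocks shared with other copies
            self.bookmarks: dict[str, list[int]] = {}

        def clone(self) -> "VirtualCopy":
            # Cloning duplicates only the id list, never the blocks, so a
            # "full-size" copy takes minutes and almost no extra storage.
            return VirtualCopy(self.store, self.block_ids)

        def write(self, index: int, data: bytes):
            # Copy-on-write: the changed block gets a new id; every other
            # copy still references the original block untouched.
            self.block_ids[index] = self.store.put(data)

        def bookmark(self, name: str):
            self.bookmarks[name] = list(self.block_ids)

        def rewind(self, name: str):
            self.block_ids = list(self.bookmarks[name])

    # Self-service workflow: clone a masked master, break it, rewind it.
    store = BlockStore()
    master = VirtualCopy(store, [store.put(b"row-%d" % i) for i in range(3)])
    dev = master.clone()
    dev.bookmark("before-test")
    dev.write(0, b"corrupted")
    dev.rewind("before-test")  # reset without opening a ticket
    assert store.blocks[dev.block_ids[0]] == b"row-0"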

[Diagram: Delphix Virtualization & Masking Engines]

Summary

The GDPR takes effect on May 25, 2018, bringing hefty penalties for non-compliance. The level of effort, and the impact on every organization, is massive in scope and a distraction from day-to-day development and maintenance of your business services. Delphix provides an enterprise-class solution for protecting sensitive and personal data through an easy-to-use but very powerful masking solution. Combining masking with data virtualization enables businesses to continue to work securely on business services while adopting new workflow processes to address GDPR.

To learn whether Delphix is the right solution for you, please Contact Us.