Data First Strategy

If data are the jewels of the company, then companies are handcuffed to their treasure.

The industry is long overdue for a disruptive correction to the biggest problem facing companies today: being data constrained. Specifically, the long ‘wait’ times for delivering databases and datasets that are current, consistent, and secure to your application developers, DBAs, testers, and analysts. CIOs can no longer treat data management in the pre-production space as just one component of their IT strategy if they expect to remain relevant competing on business agility, customer affinity, and operational excellence. Data must be considered the first priority around which your enterprise IT strategy is built.

Today’s brittle IT infrastructures are incapable of handling the current data demands of the business, and this has a tremendous impact on cost. How much cost? Howard Rubin, in his paper titled Technology Economics: The “Cost of Data”, asks: “did you know that overall 92% of the cost of business — the financial services business — is ‘data’?” According to Rubin, the next breakthroughs in the cost structure of the banking and financial services technology economy will likely come about through a focus on the efficiencies of data. So it is not surprising that, according to a recent Gartner report, “By 2015, 25% of Large Global Organizations Will Have Appointed Chief Data Officers (CDOs).” But how will CDOs address the data constraint head on? The correction required by organizations will be a Data First Strategy.

Root of the problem?

The industry has provided solutions to virtualize and automate everything in the data center. Well, almost everything, except virtualizing the data (data = databases and files). Are you engaged in some of the latest trends like Agile development, cloud, or DevOps? Responsible for assuring data governance and compliance? If so, can you deliver datasets, regardless of size, securely and in a matter of minutes to these teams? Can you provide unlimited copies for comparison or regression testing? Can you offer inherent continuous data protection so datasets can be reset or rewound to a previous point in time? The answer is typically no, and that is why a Data First Strategy has been impossible, until now.

What’s needed?

The postal service has matured over time from the pony express to locomotives, automobiles, and airplanes to expedite package delivery. Just as postal depots were established to optimize delivery routes, organizations must plan where corporate data is needed to expedite not only the delivery of the data, but also the services that depend on it. A Data First Strategy is a paradigm shift in the way an organization’s services are created, built, and managed to deliver the right data, to the right teams, at the right time. This strategy relies heavily on the ability to deliver full datasets as fast as other virtualization technologies can deliver their services, typically in minutes. Two tenets define a Data First Strategy:

  1. Prioritizing your data first in your architectural design (business, hardware, and software)
    1. Focus on consumers, data center location, services, SLAs, security
  2. Prioritizing your data first in value to the company
    1. Focus on monetization, management, governance, compliance, collaboration, and acquisition of corporate data

The subtlety here is that if you could draw a box in your architectural design depicting immediate delivery of and access to your data, you could remove the ‘wait’ times that impede time to market, provide immediate access to audit data, and give everyone instant access to virtually unlimited copies.

Achieving a Data First Strategy

Possible with Delphix Agile Data Management

Delphix Agile Data Management unlocks your data by virtualizing your corporate datasets and expediting their delivery to the teams who need them. The datasets are stored in a highly optimized, single set of common blocks, which can be used to securely create, refresh, and rewind copies of any size in a matter of minutes. To the end user the virtualized copies look and respond like full-size datasets, and teams can manage them through an easy-to-use self-service GUI, further eliminating ‘wait’ times. Some of the immediate capabilities that can be realized include, but are not limited to:

  • Immediate access to point-in-time datasets, delivered in minutes versus days or weeks; consider:
    • audit data access
    • unlimited ‘what if’ testing scenarios
    • root cause analysis of production data issues without impact to production resources
    • unlimited copies for regression testing
    • test data management
    • training facilities
  • Automated data masking to ensure immediate protection of customer data; onshore and offshore
  • Eliminate storage vendor lock-in to manage your pre-production environment
  • Enable seamless data movement across heterogeneous storage arrays
    • a mandatory requirement for Cloud and data center migrations
  • Prepare for Cloud enablement by providing a hardware agnostic medium for transferring data efficiently and securely to and from a cloud service
    • includes public, private, and hybrid cloud environments.
  • Leverage a virtualized data copy for immediate recovery in disaster recovery scenarios
  • Continuous Data Protection (CDP) for source and virtual datasets
    • tracks all changes as they occur
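
To make the block-sharing idea concrete, here is a minimal sketch of copy-on-write virtual copies over a single set of common blocks. This is my own illustration of the general technique, not the Delphix implementation; all class and method names are hypothetical.

```python
# Hypothetical sketch of copy-on-write block sharing, the general technique
# behind virtualized datasets. NOT the Delphix implementation.

class SourceDataset:
    """Holds the single set of common blocks shared by all virtual copies."""
    def __init__(self, blocks):
        self.blocks = list(blocks)   # e.g. fixed-size blocks of a database file

class VirtualCopy:
    """A full, read-writable 'copy' that stores only its own changed blocks."""
    def __init__(self, source):
        self.source = source
        self.overlay = {}            # block index -> locally modified block

    def read(self, i):
        # Reads fall through to the shared blocks unless locally overwritten.
        return self.overlay.get(i, self.source.blocks[i])

    def write(self, i, data):
        # Writes never touch the source; they land in this copy's overlay.
        self.overlay[i] = data

    def rewind(self):
        # Dropping the overlay resets the copy to the source point in time.
        self.overlay.clear()

# Provisioning a "full" copy is near-instant: no blocks are duplicated up front.
source = SourceDataset([b"blk%d" % i for i in range(1_000_000)])
dev_copy = VirtualCopy(source)
qa_copy = VirtualCopy(source)

dev_copy.write(7, b"patched")
print(dev_copy.read(7))   # b'patched'  (only this copy sees the change)
print(qa_copy.read(7))    # b'blk7'     (shared block is untouched)
```

Because each copy stores only its delta, storage grows with the changes made, not with the number of copies, which is why many full-size pre-production copies can share one footprint.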

Customers have realized the following benefits: reduced time to market for customer-facing services by 60%, provided a new or refreshed data copy 99% faster, and increased the number of pre-production data copies while reducing the storage required by 97%.

Closing observation

Data is the constraint in every organization. CIOs and CDOs will need to adopt and adapt a new approach to data delivery within their organizations if they expect to achieve the full benefits of initiatives like Agile development, DevOps, and cloud. A Data First Strategy is the next step, but it can only be achieved through data virtualization. Without the ability to deliver the right data, to the right teams, at the right time, organizations will continue to struggle with compliance, application and service quality, and escalating project costs.


Why I left Oracle for Delphix

I recently bumped into a colleague of mine from Sun Microsystems in New York City. Although it has been 18 months since I left Oracle, following the Sun acquisition in 2010, he was shocked. We had met when I started at Sun in 1997, and since the financial meltdown in 2008 we had assumed roles and trained as Enterprise Architects (EAs); think TOGAF, ITIL, OEA, Business Value, and the Operations Management Capabilities Model, rather than the Java definition of EA. We were focused on critical business problems at the executive level. My management was very supportive in providing all the resources necessary to ensure success, I was frequently asked to facilitate industry talks (Engineered Systems, Cloud, data center consolidation, and virtualization, to name a few), and the compensation plan was excellent. I wasn’t looking for a new career.

So he asked me, “Why the [expletive] would you walk away from a secure job to chase a startup?”

So I asked him, “What’s the biggest unsolved problem in IT today?” Being data constrained, right? And isn’t data the ‘jewels’ of every company? It’s great that we have all these industry-driven, buzzword initiatives like virtualization, DevOps, Agile development, cloud (public, private, hybrid, *aaS), Big Data, and even Engineered Systems. But at the end of the day, no matter how fast you can stand up infrastructure, you still can’t provide developers, analysts, DBAs, and testers with a copy, or multiple copies, of a gigabyte- or terabyte-scale database in minutes for pre-production environments; in most cases it takes days or weeks to provision or refresh. And if you can deliver a fast copy, it’s typically a snapshot that begins to go stale the moment the end user receives it.

Delphix’s Agile Data Management solves this data constraint for databases and application data by providing an optimized copy of the source. The Delphix Engine efficiently tracks all changes from the source, and those applied to the full, read-writable virtual copies, to provide continuous data protection (CDP) from which copies can be created, refreshed, and reset to any point in time in minutes.
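
One way to picture continuous data protection is as an append-only, timestamped change log from which any point in time can be materialized. The sketch below is my own simplified illustration of that idea, not the Delphix Engine’s actual design; all names are hypothetical.

```python
# Hypothetical illustration of continuous data protection (CDP): every change
# is recorded with a timestamp, so the dataset as of any moment can be
# reconstructed. This is an illustration only, not the Delphix Engine.

class ChangeLog:
    def __init__(self, initial):
        self.initial = dict(initial)   # dataset state at time 0
        self.changes = []              # (timestamp, key, value), append-only

    def record(self, ts, key, value):
        # Changes are appended in time order as they occur.
        self.changes.append((ts, key, value))

    def as_of(self, ts):
        """Materialize the dataset as it existed at timestamp ts."""
        state = dict(self.initial)
        for when, key, value in self.changes:
            if when > ts:
                break                  # everything later is ignored
            state[key] = value
        return state

log = ChangeLog({"balance": 100})
log.record(1, "balance", 120)
log.record(2, "balance", 90)

print(log.as_of(0))   # {'balance': 100}  -> rewound to before any change
print(log.as_of(1))   # {'balance': 120}
print(log.as_of(2))   # {'balance': 90}
```

Resetting or rewinding a virtual copy then amounts to replaying the log up to the chosen timestamp, which is why the operation takes minutes rather than requiring a full restore.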

So Delphix solves the biggest problem in the industry, which I found not only exciting but also a significant change agent, helping customers realize the potential benefits of Cloud, Big Data, DevOps, and more by eliminating the ‘wait time’ for data delivery and management. But what ultimately convinced me to leave a position of 16 years was the executive management team and the rock-star list of engineers at the core of this company. A cool product is one thing; having a team that can execute makes all the difference. I was offered an opportunity to be part of this team and have no regrets.