Adopt vs Adapt – successful IT leaders know the difference


A few weeks ago I was speaking with an acquaintance of mine in the industry. He was frustrated by the pervasiveness and mediocre results of new-technology adoption within his company, and cited success stories in the IT industry where other companies deployed complex solutions in record time. Then he asked me…

You’ve worked on many complex solution sets over your career with clients, what differentiates those that successfully adopt solutions, products, and services from the ones that don’t? 

I replied…

They know the difference between adopt and adapt, and they employ both.

In my experience, there is one pattern adept IT leaders follow to deploy solutions, products, and services successfully. The pattern runs contrary to the Wikipedia entry for Adoption (software implementation): the leaders I've observed conjoin the two concepts, and I don't see why a software implementation should be treated any differently from other IT projects.

To ADOPT is to take as one's own. It's a 'go, no-go' decision which is typically supported, or driven, by one or more of the following:

  • Return On Investment (ROI)
  • Total Cost of Ownership (TCO)
  • External pressure, e.g., compliance and regulatory requirements

To ADAPT is to make suitable to requirements; it necessitates much more analysis, preparation, work, and planning. The following are critical to success:

  • Complete understanding of the current-state workflow process
    • requirements
    • constraints
    • impediments
  • Level of Effort (LOE) to integrate/replace the current state with a future state
    • labor
    • time
    • cost
    • risk
  • Success Criteria and clear metrics which support the expected ROI, TCO, or external pressure demands

In projects where the IT leader focused only on adoption, deployment times far exceeded the target completion date.

On the flip side, IT leaders who focused only on adapting to a technology had a difficult time measuring and justifying the realized business value back to the organization.

In my experience, the IT leaders that employed both were consistently more successful, on time, and on budget.


Finding a Data Virtualization Solution

proprietary eponym
a brand name or trademark of a successful product that has come into general use to refer to the generic class of objects rather than the specific brand, without the parent company losing its exclusive rights to the product. For example, Kleenex is used to describe many types of facial tissue.

DELPHIX may become the next proprietary eponym for Data Virtualization.

What is Data Virtualization?

To find a data virtualization solution, we first need to understand what the term means. In doing so we quickly realize that data virtualization is a broad, loosely defined term; the solution space is still in its infancy and keeps changing with the advent of new technologies. So before we can find a solution, we need to define the criteria by which we are searching.

Data are the quantities, characters, or symbols stored for computer processing and calculation. Virtualization makes a single physical instance of something appear as several instances in the same footprint (space) as the original. Data Virtualization, then, is taking a single physical instance of data (e.g., a 5 TB Oracle 11g database) and presenting several usable instances, say 10, in the space of the original (in this example, 50 TB of apparent data in 5 TB of actual space), each of which appears as a full physical instance to the end user.
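As a back-of-the-envelope sketch of that arithmetic (the 5 TB source and 10 copies come from the example above; the change-rate parameter is my own illustrative assumption, since thin copies only consume new space for blocks that diverge from the source):

```python
# Back-of-the-envelope comparison of full physical copies vs. thin
# virtual copies that share the original data blocks.

def full_copy_footprint(source_tb: float, copies: int) -> float:
    """Storage needed when every copy is a full physical duplicate."""
    return source_tb * copies

def virtual_copy_footprint(source_tb: float, copies: int,
                           change_rate: float = 0.0) -> float:
    """Storage when copies share source blocks; only changed blocks
    (change_rate, as a fraction of the source) consume new space."""
    return source_tb * (1 + copies * change_rate)

source = 5.0   # the 5 TB Oracle 11g database from the example
copies = 10

print(full_copy_footprint(source, copies))           # 50.0 TB if fully duplicated
print(virtual_copy_footprint(source, copies))        # 5.0 TB if copies are unchanged
print(virtual_copy_footprint(source, copies, 0.02))  # 6.0 TB with 2% changed blocks per copy
```

Even with each copy diverging a little, the footprint stays near the original 5 TB rather than the 50 TB that full duplicates would require.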

However, the explanation above is only one interpretation.

Wikipedia defines the term as "any approach to data management that allows an application to retrieve and manipulate data without requiring technical details about the data, such as how it is formatted or where it is physically located."


Search results include references to data integration, data federation, and simplified/consolidated or integrated views. A simplified view is an abstraction layer that sits in front of multiple data sources and presents them to the consumer as a single source. This is indeed a data virtualization technique, with several products on the market, including Cisco Data Virtualization.
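A minimal sketch of that abstraction-layer idea (the source names and key-lookup scheme here are hypothetical for illustration, not any vendor's API):

```python
# Minimal sketch of a federated "simplified view": one query interface
# in front of several backing sources, presented as a single source.
# The source names and records below are illustrative assumptions.

class FederatedView:
    def __init__(self, sources):
        self.sources = sources  # name -> dict acting as a backing data source

    def get(self, key):
        """Return the first match across all backing sources.
        The consumer never needs to know which source answered."""
        for name, source in self.sources.items():
            if key in source:
                return source[key]
        raise KeyError(key)

view = FederatedView({
    "crm":       {"cust_42": {"name": "Acme"}},
    "warehouse": {"order_7": {"cust": "cust_42", "total": 120.0}},
})

print(view.get("order_7")["total"])  # 120.0, resolved from "warehouse"
print(view.get("cust_42")["name"])   # Acme, resolved from "crm"
```

Real federation products add query pushdown, joins across sources, and caching, but the consumer-facing contract is the same: one logical source in front of many physical ones.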

In addition to Data Virtualization, other common Google search terms include Data on Demand, Data as a Service (DaaS), and Copy Data Management. These terms certainly describe types of data virtualization that provide value to an organization, but some fall short in addressing your own data. A bigger problem is that each of these types implies the need for a disparate solution set. Ideally, executives seek to rationalize their solutions, not expand them.

Data on Demand

A search returns many resources offering historical and statistical data from repositories across many public and private industry verticals. One example is NASDAQ, which offers for-fee data sets in different consumable formats. These services do provide a virtual-like service, but none of them (from what I observed) offer a solution that provides your data on demand.

Data as a Service (DaaS)

As the name implies, *-as-a-service offerings are associated with compute clouds. Dataversity explains that the "DaaS model is all about offloading the risks and burdens of Data Management to a third-party Cloud-based provider." But wouldn't it be ideal to have your data securely delivered as a service to your developers, testers, and analysts without also being required to employ a cloud solution?

Many of the sites returned by this keyword search appear to be external data feed services rather than offerings that deliver your intellectual capital as a service. That's not to say this isn't an implementation of data virtualization; it is. Oracle is one example of this type of service, providing not only data but advanced services as well.

Copy Data Management

Many of the CIOs I've spoken to are quick to point out that they want fewer copies of data to manage, and this phrase initially conjures up the wrong vision. That said, an article by Brien Posey on TechTarget notes that "copy data management seeks to reduce the number of copies to two — the primary data and the backup copy. When additional data copies are required, an underlying snapshot mechanism is used to create a virtual copy of the data," which is close to what executives desire.

Search results point to backup, DR, data protection, and snapshot copies for use with dev, test, sandboxing, etc. The good news is, it's your data. The shortcoming is the reliance on a snapshot capability. A snapshot is equivalent to a photograph: a single point-in-time image of the data set. In some cases this is perfectly fine. For many others, like agile development, aggressive iteration testing, comparative testing, or Test Data Management (TDM), one would need to take frequent snapshots to achieve the granularity users request. Servicing such requests requires a skilled technologist and imposes a delay before the new data arrives.
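A toy sketch of that granularity problem (the requested time and snapshot intervals are illustrative assumptions): with periodic snapshots, the best you can recover is the most recent snapshot at or before the moment you actually want.

```python
# Toy illustration: with periodic snapshots, the nearest recoverable
# point can be up to one full interval before the time you want;
# continuous change capture can recover any point exactly.

def nearest_snapshot(requested_min: float, interval_min: float) -> float:
    """Most recent snapshot time at or before the requested minute."""
    return (requested_min // interval_min) * interval_min

# Want the data as it existed at minute 57:
print(nearest_snapshot(57.0, 60))  # 0.0  -> hourly snapshots miss 57 minutes of change
print(nearest_snapshot(57.0, 15))  # 45.0 -> 15-minute snapshots still miss 12 minutes
# As the interval shrinks toward zero (continuous capture),
# minute 57 itself becomes recoverable.
```

Tightening the interval closes the gap, but only at the cost of taking and managing ever more snapshots, which is exactly the operational burden described above.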

Mapping search results to capabilities… 

Capabilities that executives value in a data virtualization solution include:

  • Dramatically reducing the storage footprint of multiple non-production copies without the complexities, limitations, or resources required to manage the data sets (data virtualization)
  • Delivering the right data, to the right teams, at the right time in minutes (data on demand)
  • Delivering your data across the enterprise to the teams that need it most whether on premise or via cloud services through a self-service access point (data as a service)
  • Simplified management of fresh, masked, full sets of data that are protected and available at specific points in time (copy data management)

Combine these capabilities with self-service delivery of masked data to the consumer and we’ve just defined the criteria for Finding a Data Virtualization Solution.

Delphix Data Virtualization

The Delphix Virtualization Engine is a software solution designed from a clean sheet of paper to address the shortcomings in delivering your data quickly, efficiently, and securely to consumers on your existing platforms.

Delphix provides Data Virtualization by delivering instant, secure, fully readable and writable copies of databases and files to the people who need them, when they need them. For applications that rely on multiple heterogeneous databases, Delphix can deliver those databases at the exact point in time necessary for integration testing. Simplified views are achieved by providing end users with their own sets of data, which they can change, bookmark, refresh, reset, and share with other teams, rather than an access-only version.

Delphix provides Data on Demand for your data through an easy-to-use self-service interface. Users can manage their copies of databases much the same way developers manage source code via version control. Imagine providing unlimited versions of data sets to your developers, testers, and analysts, with the ability to switch between copies in a matter of minutes while requiring no additional storage overhead.

Delphix provides Data as a Service by eliminating the existing process of opening a ticket to request new data, awaiting management approval, and coordinating backup admins, DBAs, and storage admins to fulfill the request. Data sets can be easily accessed by the consumer, either on premise or in the cloud, from their desktop.

Delphix provides Copy Data Management through its inherent continuous data protection capability, staying in sync with all changes to both the production source and the virtual copies. In contrast to snapshots (which are like photographs), the Delphix Virtualization Engine is analogous to streaming video: all points in time are captured and accessible, so an exact replica of the data set can be created from any point in time, securely, in a matter of minutes.

The Delphix Virtualization Engine is the single solution available today that addresses your data virtualization needs by removing the barriers to accessing your data sets quickly and securely.

Below is a 2-minute overview which highlights the power of the Delphix solution.