Data storage systems have evolved through four distinct stages during their brief history. First, monolithic storage arrays gave way to SAN and NAS distributed storage networks. Next, advanced software techniques led to storage virtualization, in which storage volumes are abstracted from the underlying physical devices. Finally, in the current era, cloud services have created a new model, one that combines virtualization with unprecedented economies of scale.

Interestingly, these transitions in the storage industry were all driven by corresponding changes in compute platforms, each of which required a new data storage architecture that could adapt to a new IT ecosystem.

Hybrid Cloud Is the New Ecosystem

Today, the economics of the cloud are driving a transition toward hybrid cloud services, and once again storage architectures must evolve to meet the needs of this new computing paradigm. The idea driving hybrid cloud adoption is simple: offload compute and storage to the public cloud in order to reduce the resources required, and therefore the costs incurred, in on-premises data centers.

[Figure: Data fabric end points]

On the compute side, moving virtual machines (VMs) and their associated applications between physical servers and cloud-hosted servers within a hybrid cloud is well understood and efficient. Importing and exporting application data, however, is not as seamless. Data movement is a significant problem for the hybrid cloud: while cloud compute services are designed to be device- and location-independent, data is almost always housed somewhere on permanent storage and is not easily moved.

Insufficient data mobility between data centers and public cloud providers can be a significant barrier to hybrid cloud adoption. In a recent CIO survey conducted by IDG Research Services, 78% of enterprise IT organizations rated the ability to manage data across multiple clouds as critical or very important, yet only 29% rated their current ability to do so as excellent or good.

[Figure: Hybrid cloud data movement]

Without a common framework to load and move data, hybrid cloud success will remain an elusive goal. What's needed is a way to manage, secure, protect, share, and move data among different clouds. Imagine a hybrid cloud in which all of the data management capabilities are consistent, connected, and form a coherent, integrated, and compatible system: in essence, a fabric that joins on-premises equipment with numerous public clouds.

A Data Fabric Eliminates Hybrid Cloud Silos

To realize the vision of a data fabric, there must be a way to seamlessly control and manage data between on-premises storage arrays and the many storage endpoints within the hybrid cloud. Fundamentally, a data fabric is a way to manage data, both on-premises and in the cloud, using a common structure and architecture. A data fabric provides efficient data transport, software-defined management, and a consistent data format, allowing data to move easily among clouds.

[Figure: Data fabric vs. silos]

With data portability enabled by a connected data fabric, application data can move together with the application servers that use it. Here are a few of the benefits:

  • Economic and data governance flexibility. When you use a data fabric to build a new application in the cloud, a failed project is easy to unwind: simply delete the server instances and data from the cloud. If the application takes off, you can just as easily move it on premises or to another, more secure cloud environment.
  • Better utilization of resources. Mature applications often take up data center space, power, and the resources of a skilled IT staff. A data fabric enables you to selectively move applications to a public cloud infrastructure and focus internal IT resources and mindshare on the applications that deserve attention.
  • Cloud-based disaster recovery. One of the most compelling capabilities of a data fabric is multi-site disaster recovery (DR). SAN-to-SAN replication between data fabric endpoints provides hot-site DR with very short recovery times, along with a cost-effective, cloud-based DR option.

Each prior transition in the storage industry was enabled by technological advances and driven by an ecosystem model that delivered new business capabilities while reducing costs. As the hybrid cloud matures, a data fabric ecosystem will be needed to provide a consistent framework for data movement throughout the hybrid cloud.

The hybrid cloud changes the role of IT in several significant ways. With the cloud, IT is no longer just about building infrastructure and running data centers; it is about using tools and applications to acquire, transform, apply, and protect the data on which the business depends.

The hybrid cloud model, combining on-premises capabilities with resources and services from various cloud providers, is poised to become the dominant model in enterprise IT, and a data fabric will enable IT organizations to take full advantage of it.

Larry Freeman