Data is king, especially when an IT shop tries to decide what, where, when, and how to store it. Should it be on-premises? In the cloud? Applications may change, but the data they create has enduring value. A critical consideration is understanding how data moves between storage locations and the role application lifecycle management plays in that decision.
NetApp believes that the cloud is compelling for certain use cases and that the hybrid cloud will become the model for enterprise IT for years to come. Historically, NetApp IT hosted applications in our internal data centers, defined as physical locations under our control. When we wanted to move an application from one data center to another, we undertook some form of data center migration. Migrations usually required months of planning and significant downtime to execute.
Managing Data Is Paramount
As our application service offerings mature, our processes have changed. We are now able to deploy cloud-enabled applications within our on-premises data centers and then extend workloads into external cloud service providers using NetApp Private Storage, our data management solution for the cloud. Underlying this is a data fabric that seamlessly connects different data management environments across disparate clouds. It allows us to transform how we manage, secure, and move our data across the hybrid cloud. We can take advantage of the changing economics of the cloud while maintaining full control of our data at all times.
What we have come to recognize over time is that the data fabric can play an integral role in managing our application lifecycles and the data that moves with them. We use it to plan how we move data among different cloud providers, then leverage our automation capability to find the appropriate compute resource. The next step in our strategy is to further mature our hybrid cloud model so that we can move applications as part of our normal operating procedure, ensuring they align with the various phases of the application lifecycle.
By leveraging NetApp Private Storage for our development and test environments, we can accelerate delivery of new application capabilities while taking advantage of readily available compute power in a hyper-scale cloud provider like Amazon Web Services. We can also quickly deploy systems in the cloud to meet business continuity and disaster recovery requirements. In addition, applications with cyclical performance requirements, such as sales or financial applications with month-end bursts of activity, can be placed into the appropriate cloud instance based on their workload demands.
Our deployment model rents or owns compute depending on whether we host the environment in an internal data center or with an external cloud service provider. We move applications among cloud providers by simply stopping the application in one cloud instance, syncing the data using NetApp® SnapMirror® data replication technology, starting the application in the new provider, and pushing domain name system (DNS) changes as needed.
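The cutover sequence above can be sketched as a simple runbook. This is a minimal illustration only: the `app_stop`, `snapmirror_sync`, `app_start`, and `dns_update` helpers are hypothetical placeholders (not real ONTAP or DNS commands) that stand in for whatever tooling an IT team would actually invoke at each step, and here they merely report the step performed.

```shell
#!/bin/sh
# Illustrative cutover runbook for moving an application between cloud
# instances. All four helpers are hypothetical stand-ins for real tooling;
# each one only echoes the step it represents.
set -e

app_stop()        { echo "1. stop application in source cloud instance"; }
snapmirror_sync() { echo "2. run final SnapMirror sync to destination volume"; }
app_start()       { echo "3. start application in destination cloud instance"; }
dns_update()      { echo "4. push DNS change pointing clients at new instance"; }

app_stop
snapmirror_sync
app_start
dns_update
```

The key design point is ordering: the application is quiesced before the final data sync so the destination copy is consistent, and DNS is updated only after the application is running in the new provider.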
Managing Application Lifecycles
Data is king and dictates our actions. A data fabric allows us to operate across all the possible cloud instances and determine which workloads each is best suited for. As this becomes our normal operating model, we expect to see a fundamental change in how we view our data center strategy: from a discussion about locations to one about a platform of required capabilities, where the underlying physical layers are no longer relevant. We will gradually move our legacy applications into this model as part of the application lifecycle. NetApp's vision of a Data Fabric will fundamentally change how enterprise IT organizations operate.
The NetApp-on-NetApp blog series features advice from subject matter experts from NetApp IT who share their real-world experiences using NetApp's industry-leading storage solutions to support business goals. Want to learn more about the program? Visit www.NetAppIT.com.