There is no doubt that IT spend on the public cloud is accelerating, and from this a wide variety of hybrid cloud models are emerging. The well-known IT analyst firm Gartner predicted that by 2016 the cloud would become the bulk of new IT spend, while 90% of enterprise spend would remain on-premise cloud. The consequence is that the hybrid cloud will become the dominant cloud model, and the keywords will be choice and flexibility. What does worry me is the word that puts most IT professionals into a state of fear, and that's lock-in.

 

[Image: lock-in.jpg]

 

Lock-in is the last thing I need. It negates any advantage of moving to a hybrid cloud model. My mandatory requirement is therefore to keep the terms choice and flexibility top-of-mind when defining the service requirements of my business. The other key variable is cost. These are elements I must be able to measure if, as Gartner predicts, hybrid cloud becomes the dominant model. As I said in a previous post, if you can't measure it, you can't manage it. Therefore the data-driven hybrid cloud is a given. Anything else and you have no visibility to make the quick, insightful decisions that enable any business advantage. (See my earlier post: 'If you can't Measure it, you can't Manage it'.)

 

My vision (and it may be flawed, but after 25 years working in and around the data centre, here goes) breaks down into five simple questions. Simple I like; complex I don't.

 

Decisions, Decisions, Decisions

  1. Where do I put my compute, apps and data?
  2. How much will it cost?
  3. How do I consume it, and what's the model?
  4. Are there any risks?
  5. How do I move it, and what are the restrictions?

 

The last point, 'How do I move it?', is key. This is where lock-in raises its head and prevents you from moving your assets in a timely fashion. After all, one thing is a given: everything will change. New services, on-premise and off-premise, will come and go. Costs will vary for the same services. As a business leader I want to make quick, insightful decisions to move the assets I have in the hybrid cloud. After all, if it's costing me more, is high risk and I can't move, why am I doing it in the first place?
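The five questions can even be treated as a simple scoring exercise when weighing placement options. Here is a minimal, hypothetical sketch; the options, weights and scores are invented for illustration and are not NetApp guidance:

```python
# Hypothetical placement scoring: rank deployment options against the five
# questions (location fit, cost, consumption model, risk, mobility).
# All weights and scores are illustrative assumptions only.

WEIGHTS = {"fit": 0.2, "cost": 0.25, "model": 0.15, "risk": 0.2, "mobility": 0.2}

options = {
    # scores from 0 (poor) to 10 (excellent) per question
    "on-premise":   {"fit": 9, "cost": 5, "model": 6, "risk": 8, "mobility": 7},
    "public-cloud": {"fit": 7, "cost": 8, "model": 9, "risk": 6, "mobility": 4},
    "hybrid":       {"fit": 8, "cost": 7, "model": 8, "risk": 7, "mobility": 8},
}

def score(option):
    """Weighted sum across the five decision criteria."""
    return sum(WEIGHTS[k] * v for k, v in option.items())

ranked = sorted(options, key=lambda name: score(options[name]), reverse=True)
for name in ranked:
    print(f"{name}: {score(options[name]):.2f}")
```

Note how heavily mobility weighs in: an option that scores well on cost but poorly on 'How do I move it?' can still lose the ranking, which is exactly the lock-in point above.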

 

Clearly there is a requirement for an integrated approach that is not about speeds and feeds, or product features. A great model to help you think about cloud is the concept of the Data Fabric, which provides a visual, data-centric way to view your cloud aspirations. There is no doubt that the hybrid cloud is evolving. The danger today is that we become stranded in isolated on-premise, off-premise and hyperscale cloud models, but the NetApp vision for the Data Fabric is enabled by the speed and scale offered by a consistent data format, software-defined data management and fast, efficient data transport.

 

[Image: Data Fabric.jpg]

 

Data transport is the key phrase for me. It is the data that defines an organisation and enables the business. It is the entity that is difficult to move (it has mass), whereas compute is relatively easy to move. If you think the idea of a fabric is new, think again. In 1905 Albert Einstein published his special theory of relativity, from which emerged the idea of space-time as a single interwoven continuum. The Data Fabric is just that: a single interwoven continuum. Mass, space, time, it all fits. Einstein influencing cloud computing 110 years on.

 

[Image: Space-Time.jpg]

 

So today we are building on the single interwoven Data Fabric vision by introducing new cloud-based appliances: SteelStore for backup and StorageGRID Webscale for extreme scale, along with updates to the Cloud Manager, Cloud ONTAP and OnCommand Insight software.

 

[Image: OCI.jpg]

 

It is the new OnCommand Insight 7.1 that I want to talk about a little more, because in my view it makes cloud work for you by interrogating the complexity but presenting the reality simply. For example, I can easily monitor multiple cloud instances from multiple cloud service providers, and I can monitor and report on both physical and virtual resources. These are key metrics that affect operational service quality and cost management. It operates on any NetApp platform as well as third-party storage, and monitors a plethora of variables: performance, configuration, capacity, cost showback and chargeback, switches and more. The list goes on, but to my earlier point, 'If you can't measure it, you can't manage it': in the cloud, the ability to report on an ever-changing, dynamic landscape is paramount.
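To make the showback/chargeback idea concrete, here is a minimal sketch of the underlying arithmetic: aggregating per-tenant capacity usage into a bill. The tenants, tiers, rates and usage figures are invented for illustration; this is not the OnCommand Insight API.

```python
# Minimal showback/chargeback sketch: roll per-tenant storage usage up
# into a monthly cost. Rates and usage records are illustrative only.

RATE_PER_GB = {"gold": 0.30, "silver": 0.20, "bronze": 0.10}  # cost/GB/month

usage = [
    # (tenant, service tier, capacity used in GB)
    ("finance",   "gold",   1200),
    ("marketing", "silver",  800),
    ("finance",   "silver",  300),
    ("devtest",   "bronze", 2500),
]

def chargeback(records):
    """Sum cost per tenant across all tiers consumed."""
    bill = {}
    for tenant, tier, gb in records:
        bill[tenant] = bill.get(tenant, 0.0) + gb * RATE_PER_GB[tier]
    return bill

for tenant, cost in sorted(chargeback(usage).items()):
    print(f"{tenant}: {cost:.2f}")
```

The value of a tool that does this across providers is that the same report works whether the capacity sits on-premise, off-premise or in a hyperscaler, so costs stay comparable as the landscape changes.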

 

To summarise, our Data Fabric vision is differentiated by a 'ready now' technology approach that includes an open, aligned ecosystem of leading technology and cloud service partners. This is not a start/stop journey: by 2017 over 50% of businesses worldwide will have invested in hybrid cloud. Expect to see a lot more as NetApp continues to build on the Data Fabric vision.

 

Read more about today’s announcement here:

New Solutions Enable Customers to Build a Secure Path for Data from On-premises to Amazon Web Services


Laurence James