Well, for any of you who have read my posts before, you know I love a tenuous music link in my blog titles. After a week in Berlin at NetApp Insight, the title was ready made and too good to pass on, so thanks, 80s power ballad merchants Berlin…
I’ve had a busy couple of weeks with events, presentations and the like, and last week in Berlin was no exception, with 5 very full days spent with the NetApp team, peers from other partners and a range of fascinating customers. It was a packed agenda which included some firsts for me: I did some press interviews with the UK trade press, appeared on the NetApp Tech ONTAP podcast and did a video interview broadcast on NetApp’s YouTube channel (links at the bottom).
The nice thing about the mix of vendor, partners and customers is that you get a lot of really interesting views, from vendor strategy to the reality of business issues and what customers need to solve them.
Well, no doubt the star of the Insight event was NetApp’s Data Fabric message… why? Many vendors of course have strategy messages, but I think NetApp delivered two very clear things during the week.
Data Fabric is an excellent strategic direction for NetApp as a vendor, but more importantly it’s a critical conversation for businesses as they plan their future technical strategy.
Strategy is great of course, and strategic conversations are very important as we all look to predict the future with our technology strategies. However, strategy only works if we can deliver it, and that was the “take my breath away” moment: not only was the Data Fabric strategy very clear, but the examples of how NetApp technology allows us to execute that strategy were pretty impressive, not just on their traditional controllers, but across virtual machines, third-party storage and of course cloud providers.
Why should you care about Data Fabric?
In my mind you certainly should. Data is at the heart of the transformation of the modern business, and the businesses that are going to thrive in the future are those that adapt to the digital world most successfully.
That adaptation only comes, though, if we have tools that allow us to quickly and easily take advantage of whatever technology is most appropriate to meet our needs.
What we can’t do is build our solutions around technology silos: flash just for VDI, cloud just for certain applications, remote sites that can’t integrate. Whatever decision we make, if we are not thinking about a larger strategy that provides us with mobility of our data and resources, then we are just building an IT museum of bad ideas. The conversations with NetApp customers showed the message was certainly resonating with pretty much everyone, as we could all see the power and sense in the proper design of a data fabric.
But does it really work?
The strategy is great and spot on, but it is just talk if we can’t actually execute. That is what NetApp showed time and again, and it was that ability to deliver that really brought the event alive.
A quick summary: the Data Fabric strategy is built around the flexibility delivered by the NetApp ONTAP operating system and the ability for that OS to be deployed anywhere capable of running it, be that traditional controllers, virtual machines, in front of third-party hardware or as a cloud instance. Once we get ONTAP in a location, we can take advantage of NetApp capabilities there and easily move data between locations.
What NetApp also added to the mix was the ability to use their SnapMirror engine to move data between pretty much anything. Anything running ONTAP is one thing, but once we add AltaVault into the mix, it gives us a gateway to get data to and from the NetApp Data Fabric with ease – very, very impressive stuff.
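To make that “ONTAP anywhere, SnapMirror between them” idea a little more concrete, here is a minimal sketch of what pairing two ONTAP endpoints and mirroring a volume looks like at the ONTAP command line. This is an illustration only, not something demonstrated at Insight, and every name in it (svm_onprem, svm_cloud, vol_data and so on) is an assumption I’ve made up for the example:

```
# Peer the on-premises cluster with a Cloud ONTAP instance
# (cluster and SVM names are purely illustrative)
cluster peer create -peer-addrs <cloud_cluster_mgmt_ip>
vserver peer create -vserver svm_onprem -peer-vserver svm_cloud -applications snapmirror

# Create and initialize a mirror of the on-premises volume
# to a destination volume in the cloud instance
snapmirror create -source-path svm_onprem:vol_data -destination-path svm_cloud:vol_data_dr -type DP
snapmirror initialize -destination-path svm_cloud:vol_data_dr
```

Once a relationship like this exists, the destination copy can be brought online at the other location, which is exactly the mobility the Data Fabric story depends on.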
This fabric is more than just smart technology though. One of the most interesting points was raised during NetApp founder Dave Hitz’s session, where he highlighted how this kind of flexibility opens up a whole new range of business paradigms.
When discussing cloud strategy, two questions are often raised:
“Is it secure?” and “Does the provider have me locked in?” These are valid questions, and what Dave showed was that a Data Fabric strategy has the ability to eliminate both of them from that strategic discussion.
This is done in two ways, but understanding why we ask those questions is important to understanding how the fabric removes those risks. The reason we ask them is all to do with data. Data is the thing with weight: it’s difficult to move and it’s oh so valuable to our business. It’s the thing that may be most at risk with a cloud provider, and as for lock-in, data (not compute) is what is difficult to move quickly and effectively, and because data is so difficult to move in and out of the cloud, it’s what ties us to vendors when we go all in with them.
NetApp Private Storage, as part of the overall fabric strategy, solves both of those issues. The data remains under your control, in your own secure data centre, in a location of your choice, and the only bit of the cloud you consume is the compute, and cloud compute is really flexible and can be moved easily. In the previous day’s session, Neto from Brazil (yep, that’s how he’s known) showed just how easy, as he failed over a clustered SQL VM from Amazon AWS to Microsoft Azure, using Failover Cluster Manager, in just 5 seconds. Yep, a 5-second failover between massive public clouds.
That’s great for resilience, but how does that stop vendor lock-in? Well, remember when I talked about a new business paradigm? That is exactly how we beat the vendor lock-in debate. If we can move our workloads between clouds in just 5 seconds, then think about the possibilities. Imagine you are a business that uses extensive cloud compute but is at the behest of your service provider’s charges. Imagine if you could, in 5 seconds, flick your workloads to a new provider. Imagine a new paradigm where maybe monthly you go to your providers, negotiate the best deal for your compute needs and move to the platform that saves your company the most, or delivers the most bang for your buck… think about the competitive edge and agility IT is delivering…
One Last Treat
Finally, there was a little Data Fabric futures as we looked at how AltaVault has grown: taking an on-premises backup, sending it to Amazon and then literally, and yes, literally, dragging and dropping that AltaVault backup into Azure, restoring our data there and presenting our database… the tool’s not quite available today, but the tools exist and work, and it’s only a matter of time before this is available to us all.
This was the stuff that really did do the Berlin on me. I’ve talked about Data Fabric for a while now and genuinely think it’s an incredibly powerful strategy and consideration for business, and this is just the NetApp portion of it; there are other elements you can build into your fabric too. But for Insight, the NetApp portions of this, not just the strategy but the tools to execute it, probably made the show for me.
Heck, that was just the Data Fabric stuff. There was other great stuff around further enhancements to all-flash, the increasing capabilities of AltaVault, SnapMirror to anywhere, containers, further developments for Cloud ONTAP, some great stuff from vendor partners and, not forgetting, copy-free transition in ONTAP 8.3.2, along with some great tools from NetApp that I didn’t know about. All of it made for a really great event.
So well done NetApp, a week well spent. Below are some resources and links to some of the presentations, including my lovely video interview with Justin Parisi, and of course, as ever, if you want to continue the discussion look me up on Twitter @techstringy or find me on LinkedIn and we can continue the chat.