Just about a year ago I wrote a piece about NetApp and the strategic shift they were making (Turning a big storage ship), changing their focus as well as how both the industry and customers perceived them. This coincided with the launch of the latest version of the company's bestselling storage operating system, ONTAP 9.
A year on, after spending a few days with NetApp's leadership as part of our annual NetApp A-Team get-together, I thought it would be good to check in on how that big storage ship was doing and whether it was still turning in the right direction.
First, some context: data is increasingly the lifeblood of our organisations. It sits among the top two or three assets any business holds, and we are constantly seeing organisations use data in ever more creative ways, while of course we continue to create more of it and keep it for longer.
Not only do we need more from our data, but the way we consume data services is changing. The big public cloud providers are giving us analytics services on demand, allowing us to solve ever more complex problems, as long as we can get our data to their cloud offerings in the first place. That means more data being housed in the cloud, which is great for analytics but isn't always a great fit for our data sets.
In that context, how does a big storage vendor remain relevant?
In my opinion, they have to embrace the changing attitude to data; just wanting to store it isn't enough. To quote a friend of mine, "storing is boring", and in reality it kind of is: if your only view of data strategy is storing it neatly, you are missing a trick.
So the question is: are NetApp embracing this new data-driven world?
Shift to a data management company
This is something I've been hearing over the last six months, and I fully expect it to be front and centre of a lot of NetApp messaging as they move from storage company to data management company. This focus is absolutely right; in my own company we have done the same thing, because it's what our customers demand. It's not about building infrastructure and storing data, it's about taking a valuable asset and getting the most out of it.
There is no point just talking to a modern organisation about how much storage you can provide and how fast it is; organisations want to know "how can you make sure my data asset remains an asset?"
For those not familiar with NetApp's Data Fabric, it is a critical part of their vision as they make the shift to a data management company. A data fabric is NetApp's view of how we build a data infrastructure that lets us get the best from our data, giving us flexibility in how and where we store it and how we move it, while maintaining security and compliance, all crucial in a modern data strategy.
But this goes beyond a strategic goal; it is baked into all of NetApp's thinking. The idea that you can move data across any NetApp platform, regardless of whether it's hardware, white box, a virtual machine or even sat in AWS or Azure, is very powerful. Nor is it limited to ONTAP, allowing us to move data between ONTAP, SolidFire, E-Series and AltaVault, and even non-NetApp platforms via FlexArray.
Ultimately, will the data fabric be stretched beyond the NetApp portfolio? Who knows? It would be great if it did, but there's a lot of work to be done.
Embracing the new world
Part of the new way of working with data includes the cloud; there is no getting away from this reality. Whether it's consuming SaaS like Office 365 and Salesforce, holding our data long term in S3 or Azure Blob storage, or needing to present our data to analytics tools, organisations are moving more data to the cloud.
What part does an on-prem storage vendor play in this? It has to be twofold:
Help me to move data to the cloud
Because they supply on-prem storage arrays, NetApp can't ignore the reality that their customers want to move data to the cloud. To NetApp's credit, they are embracing this challenge and helping to enable this movement.
The data fabric strategy and ONTAP are a key part of this. The ability to take NetApp's core storage OS and deploy it directly in either AWS or Azure means that not only can you move your data from your on-prem array straight into a public cloud, but, because it's the same operating system end to end, you crucially maintain all of the on-premises efficiencies, management and controls on your data in the public cloud. That is a real positive.
It's not only the movement of data to the cloud that NetApp have turned their focus to, however; they are also looking at ways that cloud-based services can play a part in their future, which is equally interesting. This has started with two services, Cloud Sync and Cloud Control.
Cloud Sync assists users in automating the process of moving data from on-premises NFS datastores straight into Amazon S3 storage and back again.
Cloud Control, meanwhile, allows organisations to protect their Office 365 data by backing it up and holding it in an alternate location.
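The inner workings of Cloud Sync are NetApp's own, but the heart of any such sync service is working out which files have changed since the last run before shipping them to object storage. A minimal, hypothetical sketch of that change-detection step (the function name and manifest format are my own assumptions, not NetApp's API):

```python
import hashlib
from pathlib import Path

def files_to_sync(source_dir, manifest):
    """Return (changed, updated_manifest) for source_dir.

    manifest maps relative file paths to content hashes from the last
    sync run. Any file whose hash is new or different would be queued
    for upload to object storage such as S3 (upload step omitted here).
    """
    root = Path(source_dir)
    updated = {}
    changed = []
    for path in sorted(root.rglob("*")):
        if not path.is_file():
            continue
        rel = str(path.relative_to(root))
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        updated[rel] = digest
        if manifest.get(rel) != digest:
            changed.append(rel)
    return changed, updated
```

On the first run everything is "changed" and gets copied up; subsequent runs only move the delta, which is what makes a scheduled sync of a large NFS datastore practical.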
The important thing to note about these two services is that they are exactly that: services. No traditional NetApp tools are needed as part of the solution; you subscribe to the services and begin to use them.
If anything proves NetApp’s position on embracing the new world, it is this.
Big Wheels Still Turning?
With a range of new announcements due soon, including the much-anticipated NetApp HCI platform, the storage behemoth, in my opinion, continues to evolve. Its focus is right and certainly aligns with the challenges that the organisations I deal with talk about.
It continues to do smart things within its core product set, adding tools that enable the wider data fabric strategy and working them directly into the portfolio, especially the product at the heart of it: ONTAP.
Personally, I continue to be very enthused by what NetApp are doing and the direction they are taking. For me, those big wheels are not only turning, they are turning in exactly the right direction.
Let’s see if they can keep it up.