Abdicating responsibility

Sorry I’ve been quiet on the blog post front, but I’ve had a hectic few weeks involved in all kinds of interesting conversations and events (even manning the booth at a couple of them). What’s been noticeable at these events is how many similar discussions I’ve had with businesses of all sizes, from small to large and everything in between, and there have been some interesting areas of commonality.

Over the next few weeks I’d like to share some of those with you. First up is something really interesting that has gone right to the top of my list, and it actually came to light again this week in a meeting with one of my favourite CIOs. For this post let’s call him Bill (I can’t share his or his company’s name on this occasion). Bill is a very astute CIO, very well connected, who spends time doing all the things you would expect, so it’s always interesting when I bring something to the table he hasn’t thought about before.

Today was one of those rare treats, as I was sharing with him my last few weeks and some of the fascinating chats I’ve had. So what caught Bill’s interest?

Abdicating responsibility

Let me pose a question for you.

When we design our internal IT infrastructures, our compute, our applications, our storage, how many of us look at that infrastructure, the one that holds all of our key data assets, and decide: that’s OK, I’ve built it now, it’ll be fine, I’ll just leave it there and make no provision whatsoever for protecting the data on it?

None of us, right?

None of us design an infrastructure for our business, entrust our business data to the IT gods, and assume everything will be fine with no further responsibility on our part, do we?

So here’s my question…

When we move parts of our infrastructure to a “cloud” provider, why do we then abdicate all responsibility for our data and say: that’s OK, Microsoft, Google, Amazon or <insert your provider of choice here> will just take care of it for me, and I no longer need to concern myself with such trivialities as data protection?

Whenever I’ve posed that question recently there has been a moment of realisation that, actually, in many cases that’s exactly what’s going on: we have not sufficiently considered the implications.

That isn’t to say all cloud providers, or all people, have ignored this, because of course many haven’t, and if you are taking infrastructure as a service you may well be including levels of resilience and data protection. However, the further up the cloud stack we go, the less I’m finding this is considered.

I was pointed last week at the very interesting AWS shared responsibility document (I suggest you go give it a read here), which makes very clear who is responsible for what.

The diagram sums it up nicely: AWS is “responsible for security of the cloud”, while you, the customer, are “responsible for security in the cloud”, and right at the top of the customer’s side sits Customer Data.
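
To make that customer side concrete, here’s a minimal sketch in Python of one very simple protection model: copying your objects to a second bucket you keep elsewhere, ideally in another region. The bucket names are hypothetical and this uses only boto3’s standard S3 calls; it’s an illustration of taking responsibility yourself, not a recommendation of any particular tool.

```python
# A minimal sketch of customer-side data protection in AWS.
# Bucket names are hypothetical; assumes the backup bucket already
# exists (ideally in a different region) and your credentials can
# read the source and write the destination.
import boto3

SOURCE_BUCKET = "prod-data-bucket"         # hypothetical
BACKUP_BUCKET = "prod-data-backup-bucket"  # hypothetical, another region

s3 = boto3.client("s3")

# Walk every object in the source bucket and copy it to the backup bucket.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SOURCE_BUCKET):
    for obj in page.get("Contents", []):
        s3.copy_object(
            Bucket=BACKUP_BUCKET,
            Key=obj["Key"],
            CopySource={"Bucket": SOURCE_BUCKET, "Key": obj["Key"]},
        )
        print(f"Protected copy made of {obj['Key']}")
```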

When we look at a software as a service offering such as Office 365, the problem persists. Microsoft have a number of policies and rules you can set around data retention to try to mitigate data loss, but they do not back up your Office 365 environment. If you lose data in there, you have to realise it’s your responsibility, and it strikes me that often we don’t.

How many of us are dropping our data into these platforms without seriously considering appropriate data protection models? If my conversations are anything to go by, quite a lot of us.

What’s to be done?

It’s certainly a problem, so what do we do about it?

I thought I’d share with you a couple of ideas that have come about during my discussions of the last few weeks:

  1. Understand the risk

    Our first step with anything like this is, of course, to understand the risk. If we are moving infrastructure and production data to any kind of service, then we need to understand the risks, and fully understand who is responsible for what in our service agreement.

  2. Understand our requirements

    What are our data protection requirements for the information we are moving to our service provider? Do we need to retain it long term? Do we need it in alternate locations? How critical is the data we have housed in the service? Really, these are all the same questions you would ask about the data in your own infrastructure (there’s a small sketch after this list of one way to write those answers down).

  3. How do we fulfil those requirements?

    Of course, the last part of the equation is how we go about addressing the problem, and the answer is a bit “how long is a piece of string”, much as it is when protecting our own infrastructure. Maybe we simply want to back up: there’s a bunch of solutions out there that back up data from services like Office 365 to an alternate location, from the likes of Assurestor with their “backup solutions”, to the Symantec approach of using Enterprise Vault to archive out of your cloud provider, and even vendors who may surprise you, like NetApp with their backup for Office 365 solutions.
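
To make step 2 a little more concrete, here’s a minimal sketch of recording those requirement questions per dataset, so the answers live somewhere other than in people’s heads. The field names and example values are mine, purely illustrative:

```python
# An illustrative way to record data protection requirements for
# each dataset you move to a service provider. Field names and
# example values are hypothetical.
from dataclasses import dataclass

@dataclass
class ProtectionRequirement:
    dataset: str                  # what the data is
    criticality: str              # "high" / "medium" / "low"
    retention_years: int          # long-term retention need
    copy_outside_provider: bool   # do we need it in an alternate location?

requirements = [
    ProtectionRequirement("finance mailboxes", "high", 7, True),
    ProtectionRequirement("team collaboration sites", "low", 1, False),
]

# Flag anything the provider alone can't satisfy.
for r in requirements:
    if r.copy_outside_provider:
        print(f"{r.dataset}: needs a protected copy outside the provider")
```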

These are just the simple cases, protecting a service like Office 365, but there are considerations aplenty: whether it’s software, platform or infrastructure as a service, what you need to protect, and how, can vary greatly. But the point of this post wasn’t to provide every possible answer; it was much more straightforward than that. It was simply to raise the question: are you abdicating responsibility when you move your critical data assets to the cloud?

If you have been, don’t worry, you are certainly not alone, because let’s face it, it is just not discussed. Hopefully, if nothing else, this post will encourage you to ask the question of your current and any future cloud based data infrastructures.

Go protect that cloud data.

If you want to contact me, please feel free to leave a comment here, contact me on LinkedIn or via the twitters @techstringy

Update – about two minutes after I published this post, my friend Jason Reid over at Assurestor published an article on the very same subject – go and give it a read.

 


Turning a big storage ship

Storage is a funny old part of the IT industry, and a part that is going through change much quicker than most.

The rapid move to flash, the need to integrate cloud, and issues over management, security and governance all make the storage industry a challenging place for vendors and for those architecting and using storage solutions alike.

If we add to that a change in how we view our infrastructure, we see technologies that abstract much of it away from end users and developers, tech like OpenStack, Azure Stack and even VMware VVols, all of which present a single look and feel higher up the technology stack. This move almost sees some parts of the decision making cycle take a view of “we don’t really care about storage”.

With all that in mind how do you make a difference in the storage market?

For me, the answer lies in how smart you can be as a storage vendor in addressing the challenges we all see. As architects, customers, business owners, developers or storage admins, we each have a different view of what we need from our storage solutions, so how do you go about addressing all those requirements?

This leads nicely onto a look at how one of those storage players is taking those challenges on. Yesterday (May 31st) NetApp announced the launch of ONTAP 9. Now, for me, the interesting thing about the announcement was not the cool tech, of which there is much, but the business messaging that came with it.

What is ONTAP 9?

ONTAP is NetApp’s storage operating system. Traditionally it has run on NetApp’s own controller hardware as a classic storage array, as you’d expect, but today, and actually for some time now, ONTAP has been much more than that.

Ultimately it’s an operating system, and like any OS it can be installed on any suitable platform, which opens up a whole range of options; if you are someone making strategic data decisions, that flexibility can be massively useful.

Imagine if your storage operating system could not only be installed on your traditional array, but also on hardware of your choice: pick up your whitebox hardware and install your storage OS on it. What if you could then install that OS in AWS and have a cloud version of your storage?

Now, it’s not that you can’t do all those things today, but what we are talking about here is doing all of them with the same operating system: the same functions, the same management tools, and the ability to seamlessly move data sets between all of those different repositories, all because you have the same OS and capabilities everywhere.

Well, that is ONTAP: ONTAP 9 installed on bespoke controllers, ONTAP Select on whitebox hardware of your choice, and ONTAP Cloud running as an instance in AWS, all the same technology doing all of the same things in the same way.

Why Version 9?

That’s a good question, to which I don’t have an official answer, but I do have a thought! NetApp have taken a lot of criticism over the last few years, some of it valid, some of it not, and one of those criticisms is a lack of clarity of message: different platforms, vague numbering of ONTAP versions and, of course, the drawn out move from traditional ONTAP to the scale out version known as clustered ONTAP.

I’d say they’re all valid criticisms, and personally I think NetApp are now in a position to draw a line under that; ONTAP 9 is that line. Gone are the variations of 7-Mode and clustered, gone is the confusing naming, in an attempt at clarity. Everything is now ONTAP, along with its variants ONTAP Select and ONTAP Cloud. Yes, there are still other portfolio solutions, E-Series for cheap, deep, fast storage and of course SolidFire for the webscale cloud provider market. However, where general data is concerned, ONTAP is your OS of choice.

Simplification can only be a good thing: just remember ONTAP, then look at your requirement and decide where best to deploy it. Wherever you want your data, ONTAP can be there backing it.

Getting flashy

Flash is absolutely changing the storage game for many. It has spawned new and innovative storage companies, changed the way organisations can mine data, allowed us to deliver data solutions faster and more efficiently and, heck, it has generated debate: when will we just buy flash? Do we need systems designed from the ground up for flash? When will flash be cheaper than disk? When will it have the density we need? And so on.

All those questions have probably clouded many a storage debate over the last two or three years, but in the end most of it doesn’t matter: flash is just another medium for storing data.

For me, what’s important is whether I can integrate it seamlessly into my environment, rather than have it as a special silo living in a corner on its own. The thing with flash is that it’s fast, and it is maybe here that it delivers its true benefit.

One of the most difficult things we do when architecting a solution is getting the performance right. Traditionally, many metrics, much extrapolation of data and a bit of guesswork have gone into sizing our storage platforms for performance. Flash simplifies that completely: for pretty much all workloads, flash is going to be plenty fast enough. Put 20 flash drives in a system churning out half a million IOPS and it’s going to be lightning fast, and increasingly we are seeing people put not just their big database workloads on there but the entire stack: VMware, Windows Servers, Exchange, SharePoint, VDI, you name it. People are looking at it and saying: get flash, and I don’t have to concern myself with performance.

NetApp have maybe been lucky when it comes to flash, or maybe it’s great foresight, but not many of the long standing vendors had an OS that could move to flash so easily. ONTAP was built with flash in mind before people had flash in mind! None of this “built for flash” stuff; ONTAP just delivers. It was already optimised for writing to flash, and with a couple of tweaks to the read path, suddenly ONTAP looks like it was always built for flash.

Oh, and if you’re not sure you’ll get the benefit, NetApp are commercially backing that with a guaranteed three times performance improvement.

There are other benefits too. ONTAP 9 makes NetApp the first of the enterprise vendors to support 15TB flash drives; yes, 15TB on a flash drive. Tie that to a guaranteed 4:1 storage efficiency and your 15TB flash drive is effectively a 60TB drive, all lightning fast. Why should supporting 15TB drives be a big deal? Because the fact that ONTAP just treats flash as another medium should not be underplayed: it means that while some flash vendors are struggling to support drives bigger than 3TB, NetApp are already looking ahead to 32TB next year and 128TB in the not too distant future. No major changes to ONTAP; it’s just another, bigger drive. Great investment protection.
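
The back-of-envelope maths is simple enough; a quick sketch, where the 4:1 ratio is NetApp’s stated guarantee and the 24-drive shelf is my own illustrative assumption:

```python
# Effective capacity from the paragraph above: raw drive capacity
# multiplied by the guaranteed 4:1 efficiency ratio. The 24-drive
# shelf is an illustrative assumption.
DRIVE_TB = 15
EFFICIENCY_RATIO = 4   # NetApp's guaranteed 4:1
drives = 24

raw_tb = DRIVE_TB * drives
effective_tb = raw_tb * EFFICIENCY_RATIO
print(f"{drives} x {DRIVE_TB}TB = {raw_tb}TB raw, ~{effective_tb}TB effective")
# -> 24 x 15TB = 360TB raw, ~1440TB effective
```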

Squeeze it in

This move to flash is a big deal for many, and the benefit of removing the performance concern is worth its weight in gold, but how do you make it more affordable? Flash drives are still not quite as cheap as their rotating equivalents (not a million miles off, but still), and let’s not kid anyone that a 15TB flash drive is going to be anything but pricey. One way to tackle that is efficiency.

NetApp has always delivered great efficiency, and ONTAP 9 just builds on that. Alongside the normal data efficiency techniques you’d expect (in-line compression, dedupe, efficient snapshots, clones etc.), data compaction is added (check out Adam Bergh’s excellent ONTAP 9 blog here for more detail on data compaction). As mentioned, NetApp are also financially backing that with a 4:1 efficiency guarantee; that’s not the minimum NetApp say they’ll get, but they’ll financially back it if you don’t get at least that. So as a buyer, I’m buying less physical storage and potentially squeezing all my data into flash, and if that doesn’t work, NetApp will give you some disks!

The improvements to the management interface in ONTAP 9 also mean you can see the effectiveness of each of the efficiency technologies, and if you feel one of them isn’t right for you or your specific workload, that’s fine: turn it off.

So is that ship turning?

There is some neat stuff technically in ONTAP 9, no question, and I’ll point you in the direction of some blog posts that go into a bit of detail, but what interests me is the strategic thinking. If we look at what is going on in the storage market, it’s changing greatly, from where we place our data to what we as users want from our storage in a modern environment.

ONTAP makes great strides in addressing many strategic concerns: flexibility that allows me to place ONTAP on bespoke hardware or my own hardware, or to spin it up in the cloud, plus the ability to move data between any of those platforms as and when I need it. Strategically this is significant.

Commercial simplification is important too: buying storage shouldn’t be commercially difficult, and it should take the risk out of the purchase. The newly announced guarantees around performance and efficiency are a big, helpful step toward that.

NetApp have been criticised over the last few years and, as I said earlier, some of it is justified, some not. Is ONTAP 9 the answer to all the issues NetApp may have, perceived or otherwise? No, of course not. However, my belief is that it’s a step in the right direction, and having spent some time over at their Sunnyvale HQ a couple of weeks ago, what I can tell you is that it doesn’t stop here. They have a leadership who understand where the company needs to go and are trying to tackle all those things head on, with smart tech and some neat commercial thinking.

But why not go and find out for yourself? I’ve attached some resources below that are worth a look, and of course feel free to comment here, on Twitter @techstringy, or find me on LinkedIn.

Thanks for reading. Ships Ahoy!

Some resources for you

NetApp resources

Hear from Lee Caswell on the NetApp podcast

NetApp ONTAP 9 product pages

ONTAP 9 blog

Independent resources

Dave Brown – NetApp ONTAP 9 Announcement

Chris Maki – ONTAP 9.0 is here

Dave Morera – First Look at ONTAP 9

Adam Bergh – NetApp Announces ONTAP 9

Press resources

Press Interview from CRN with John Woodall