NetApp’s Future, do they matter?

A couple of weeks ago I was at a technology event speaking with some of the attendees when the subject of NetApp was raised, accompanied by the question “Are NetApp still relevant?” I was taken aback by this, particularly as over the last few years I felt NetApp had done a great job of repositioning themselves and changing the view of them as a “traditional” storage company.

However, this message had clearly not reached everyone, and it made me consider “Does NetApp’s vision really deal with challenges that are relevant to the modern enterprise?” and “Have they done enough to shake the traditional storage vendor label?”

I’m writing this blog 33,000 ft above the United States, heading home from NetApp’s Insight conference. Reflecting on the three days in Las Vegas, I wondered: did what I heard answer those questions, and would it keep NetApp relevant for a long time to come?

#DataDriven

The modern tech conference loves a hashtag, one that attempts to capture the theme of the event, and #DataDriven was Insight 2018’s entry in the conference hashtag dictionary.

But what does Data Driven actually mean?

Data plays a significant role in driving modern business outcomes, and the way we handle, store and extract information from it is a keen focus for many of us. This is clearly the same for NetApp.

Throughout Insight, NetApp stated clearly that their vision for the future is to be a data company, not a storage one, a subtle but crucial difference. Speeds and feeds (while still important) are no longer the thing that drives their decision making; it is data that sits at the heart of NetApp’s strategy, a crucial shift that matches how the majority of NetApp’s customers think.

Data Fabric 2.0

Over the last four years NetApp’s data fabric has been at the centre of their thinking. Insight, however, presented a fundamental shift in how they see the future of the data fabric, starting with making it clear that it is not “NetApp’s Data Fabric” but “your data fabric”.

A fabric shouldn’t be “owned” by a storage vendor; it is ours to build to meet our own needs. This shift is also driving how NetApp see the future delivery of a data fabric, no longer something that needs building, but “Data Fabric as a Service”, a cloud-powered set of tools and services that enable your strategy. This is a 180° turn: no longer an on-prem infrastructure that integrates cloud services, but a cloud service that integrates and orchestrates all of your data endpoints regardless of location.

The demonstration of this vision was extremely impressive, and the future data fabric was clear in its direction: a fabric is yours, to be consumed as you need it, helping us to deliver services and data as and when we need to, quickly, efficiently and at scale.

The awkward HCI Conversation

Perhaps the most immediate beneficiary of this shift is NetApp’s Hyper Converged Infrastructure (HCI) platform. NetApp are by no means early in this market, and in some quarters there is debate as to whether NetApp HCI is a hyper converged platform at all. I’ll admit that while the industry definition of HCI doesn’t really bother me, as technology decisions should be about outcomes not arbitrary definitions, I do have reservations about the long-term future of NetApp’s HCI platform.

However, what NetApp showed as part of their future Data Fabric vision was a redefinition of how they see HCI, redefined to the extent that NetApp’s view of HCI is no longer hyper converged but Hybrid Cloud Infrastructure.

What does this mean?

It’s about bringing the cloud “experience” into your datacentre, but this is much more than building a “private cloud”; it is about HCI becoming a fully integrated part of a cloud-enabled data strategy, allowing organisations to deploy services and move them simply from public cloud to on-prem and back again, making HCI just an endpoint, a location from which your cloud services can be delivered.

Ultimately HCI shouldn’t be about hardware or software, but outcomes, and NetApp’s aim is for this technology to speed up your ability to drive those outcomes, regardless of location.

In my mind this transformed the platform from one whose long-term value I struggled to see into something with the potential to become a critical component in delivering modern services to organisations of all types.

Summary

Did what I heard address the questions raised to me? Would it convince a wider audience that NetApp remain relevant? For that we will have to wait and see.

However, in my opinion NetApp presented a forward-thinking, relevant strategy that, if executed properly, will be a fundamental shift in the way they develop as a company and will ensure they remain relevant to organisations by solving real and complex business challenges.

I’m very interested to see how this new vision for the Data Fabric evolves, and if they can execute the vision presented so impressively at Insight, they may finally shed that “traditional” NetApp label and become the data authority they are aiming to be.

You can get further details on the announcements from Insight by visiting the NetApp Insight site, where you will find a wide range of videos, including the two general session keynotes.

If you want to find out more about NetApp’s vision for yourself, it’s not too late to register to attend NetApp’s Insight EMEA conference in Barcelona; details are here.

Building a modern data platform – what have we learned?

As I reach the end of this series, it raises the question “what have we learned?”. If you’ve read through it all, you’ve learned you are patient and I’ve learned that writing a series of posts actually takes quite a bit of time. But I digress!

Let’s start at the beginning – what is a modern data platform?

I’ve used the term throughout, but what does it mean? In the introductory post I stated “In today’s modern world however, storing our data is no longer enough, we need to consider much more”, and that’s true, as organisations now want their data to provide competitive edge and insights. We also need to ensure we are “developing an appropriate data strategy and building a data platform that is fit for today’s business needs”. In essence those two areas neatly define a modern data platform: storing data is no longer enough, and our platform needs to fit today’s rapidly changing demands, integrate with new technologies and give us the scale and flexibility we need to turn our data into an asset, all while ensuring our data maintains its privacy and security and we maintain governance and control.

It’s not storage

While storage plays an important part in any data strategy (our data has to live somewhere), it’s important to realise that when we talk about a data platform, it’s not about storage. While the right storage partner plays a crucial part, the choice isn’t driven by media types, IOPS or colour of bezel; it’s about a wider strategy and ensuring our technology choice enables us to provide the scale, flexibility and security a modern platform demands.

Break down walls

We have also learned that data cannot be stored in silos, be that an on-prem storage repository or its modern equivalent, the “cloud silo”. Placing our data somewhere without considering how we move it, so we can do what we need to with it quickly and easily, is not designing a modern data platform.

Data Insight is crucial

Where our data is held, and on what, while important, pales when compared to the importance of insight into how our data is used. Our modern data platform must provide visibility into the who’s, where’s, when’s, what’s and why’s of data usage: who’s accessing it, where is it, when, if ever, are they accessing it, what are they accessing and why. Knowing this is critical for a modern data platform; it allows us to build retention, security and compliance policies, it allows us to start to build effective data leak protections, and it helps us be more efficient with our storage and control the costs and challenges that come with our ever-increasing reliance on data.

Without this insight you don’t have a modern data platform.
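
To make that who/where/when visibility a little more concrete, here is a deliberately simple sketch (my own illustration, not any specific product’s tooling) that reads an exported file-access audit log and flags data nobody has touched for a year, exactly the kind of signal that feeds retention, tiering and data leak policies. The CSV file name and its path, user, action and timestamp columns are hypothetical placeholders.

```python
"""
Illustrative only: flag stale data from an exported file-access audit log.
The file name and its columns (path, user, action, timestamp) are hypothetical;
timestamps are assumed to be ISO-8601 strings, e.g. 2018-10-01T09:30:00.
"""
import csv
from collections import defaultdict
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=365)      # untouched for a year = archive candidate
now = datetime.now()

last_access = {}                       # path -> most recent access time
users = defaultdict(set)               # path -> users who have touched it

with open("file_access_audit.csv", newline="") as f:
    for event in csv.DictReader(f):
        ts = datetime.fromisoformat(event["timestamp"])
        path = event["path"]
        users[path].add(event["user"])
        if path not in last_access or ts > last_access[path]:
            last_access[path] = ts

# report the least recently used data first
for path, ts in sorted(last_access.items(), key=lambda kv: kv[1]):
    if now - ts > STALE_AFTER:
        print(f"STALE  {path}  last touched {ts:%Y-%m-%d} by {len(users[path])} user(s)")
```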

Data is everywhere

We have also learned that our data is everywhere; it no longer resides within the protected walls of our data centres, it’s living on a range of devices sat both inside and outside those walls. That’s not just the data we already have; it’s also the increasing range of devices creating data for us, and our platform needs to be able to ingest, process and control all of it. Protecting data on the very edges of our network to the same degree that we protect, secure and govern what sits inside our data centres is crucial.

Cloud, cloud and more cloud

Just a few years ago the prognosis for the data industry was that cloud was going to swallow it all and those who looked to use “traditional” thinking around data would be swept away by the cloud driven tide.

Now, while cloud is unlikely to wipe out all data life as we know it, it should certainly play a part in your data strategy. It has many of the attributes that make it an ideal repository; its flexibility, scale and even its commercial models make it an attractive proposition.

But it has limits. Ensuring our data platform can integrate cloud where appropriate, while maintaining all of the enterprise control we need, is a core part of a modern platform; you can’t design a modern platform without considering cloud.

It’s a platform

The reason I used the word platform is because that is what it is. It’s not one component; it is built up of multiple components, as I’ve shown here: storage, data management, governance and control, be it in the datacentre, on the edges of your network or utilising the cloud.

The days of our data strategy being about just one element are gone; we need a strategy that looks at how we use data in its entirety.

Building a modern data platform

The point of this series has been to provide some practical examples of the tools and technologies I’ve used when building modern data platforms. Not every platform uses all of these technologies all of the time, and you don’t have to use these specific ones to build your own. What is more important is the concept of a data platform, and hopefully this series has introduced you to some areas you may not have considered previously and will help you design a platform to get the very best from your data assets.

If you have any questions, please leave a comment on the site, or contact me on twitter @techstringy or LinkedIn

If you’ve missed any of the series head back to the introduction where you’ll find links to all of the parts of the series.

Thanks for reading.

Building a modern data platform – exploiting the cloud

No modern data platform would be complete if we didn’t talk about the use of public cloud. Public cloud can play a very important part in building a modern data platform and provide us with capabilities we couldn’t get any other way.

In this part of our series we look at the benefits of public cloud, the challenges of adoption and how to overcome them and ensure we can embrace cloud as part of our platform.

Why is public cloud useful for our data?

If we look at the challenges normally associated with traditional approaches to data storage (scale, flexibility, data movement, commercials), it quickly becomes clear how cloud can be valuable.

While these challenges are common in traditional approaches, they are the areas where public cloud is strongest. It gives us scale that is almost infinite, a consumption model where we pay for what we need as we need it and, of course, flexibility: the ability to take our data and do interesting things with it once it’s within the public cloud. From analytics and AI to the more mundane backup and DR, flexibility is one of the most compelling reasons for considering public cloud at all.

While the benefits are clear, why are more organisations not falling over themselves to move to cloud?

What’s it lacking?

It’s not what public cloud can do, but what it can’t, that tends to stop organisations wholeheartedly embracing it when it comes to data assets.

As we’ve worked through the different areas of building a modern data platform, our approach to data has been about more than storage; it’s insight, protection, availability, security and privacy, things not normally associated with native cloud storage. We don’t want our move to cloud to mean we lose all of those capabilities, or have to implement and learn a new set of tools to deliver them.

Of course there is also the “data gravity” problem: we can’t have our cloud-based data siloed away from the rest of our platform, it has to be part of it. We need to be able to move data into the cloud, out again and between cloud providers, all while retaining enterprise control and management.

So how do we overcome these challenges?

How to make the cloud feel like the enterprise?

When it comes to modern data platforms, NetApp have developed into an ideal partner for helping to integrate public cloud storage. In part one of this series (Building a modern data platform – the storage) we discussed NetApp’s data services, which are built into their ONTAP operating system, making it the cornerstone of their data fabric strategy. What makes ONTAP that cornerstone is that, as a piece of software, it can be installed anywhere, which today also means public cloud.

Taking ONTAP and its data services into the cloud provides us with massive advantages: it allows us to deliver enterprise storage efficiencies and performance guarantees, and to use the enterprise tools we have made a key part of our platform with our cloud-based data as well.

NetApp offer two ways to deploy ONTAP into public cloud. The first is Cloud Volumes ONTAP, a full ONTAP deployment on top of native cloud storage, providing all of the same enterprise data services we have on-prem, extending them into the cloud and integrating them seamlessly with our on-prem data stores.

An alternative and even more straightforward approach is having ONTAP delivered as a native service, with no ONTAP deployment or experience necessary. You order your service, enter a size and performance characteristics, and away you go, with no concern at all for the underlying infrastructure, how it works or how it’s managed. In seconds you are provided with enterprise-class storage with data protection, storage efficiencies and performance service levels previously unheard of in native cloud storage.
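
As an illustration of how lightweight that ordering step can be, the sketch below shows what a scripted request for a cloud volume could look like: you specify little more than a size and a service level. The endpoint URL, token and field names are hypothetical placeholders for whichever provider delivers the service, not a documented NetApp API; in practice the same request is normally driven from the provider’s portal.

```python
"""
Illustrative only: ordering a cloud volume by size and service level.
The URL, token and field names below are placeholders, not a real API.
"""
import requests

API = "https://cloud-volumes.example.com/v1/volumes"   # placeholder endpoint
TOKEN = "REPLACE_WITH_API_TOKEN"

volume_request = {
    "name": "project-data",
    "size_gib": 2048,              # capacity is almost all you specify...
    "service_level": "premium",    # ...plus a performance/service level
    "protocol": "nfs3",
    "region": "eu-west-1",
}

resp = requests.post(
    API,
    json=volume_request,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
volume = resp.json()
print(f"Provisioned volume {volume.get('name')} - mount target: {volume.get('export_path')}")
```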

It’s not a Strategy without integration

While adding enterprise capabilities is great, the idea of a modern data platform relies on having our data in the location we need it, when we need it, while maintaining management and control. This is where NetApp’s technology provides real advantage: using ONTAP as a consistent endpoint provides the platform for integration, allowing us to take the same tools, policies and procedures at the core of our data platform and extend them to our data in the public cloud.

NetApp’s SnapMirror provides us with a data movement engine so we can simply move data into, out of and between clouds. Replicating data in this way means that while our on-prem version can be the authoritative copy, it doesn’t have to be the only one. Replicating a copy of our data to a location for a one-off task, then destroying it once the task completes, is a powerful capability and an important element of simplifying the extension of our platform into the cloud.
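
For a flavour of what setting up that replication looks like in practice, here is a minimal sketch that creates and initialises a SnapMirror relationship by running the standard ONTAP CLI over SSH against the destination (cloud) cluster. The cluster address, credentials, SVM and volume names are placeholders, it assumes the destination data-protection volume already exists, and it is a sketch rather than a production script.

```python
"""
Illustrative only: create and initialise a SnapMirror relationship from an
on-prem volume to one running in Cloud Volumes ONTAP, using the standard
ONTAP CLI over SSH. All names, addresses and credentials are placeholders.
"""
import paramiko

DEST_CLUSTER = "cvo-cluster.example.com"   # Cloud Volumes ONTAP mgmt address (placeholder)
USER, PASSWORD = "admin", "REPLACE_ME"

SOURCE = "onprem_svm:projects"             # source SVM:volume (placeholder)
DESTINATION = "cloud_svm:projects_mirror"  # destination SVM:volume (placeholder)

commands = [
    # create the relationship on the destination cluster
    f"snapmirror create -source-path {SOURCE} -destination-path {DESTINATION} "
    "-type XDP -policy MirrorAllSnapshots -schedule daily",
    # perform the initial baseline transfer
    f"snapmirror initialize -destination-path {DESTINATION}",
    # check progress
    f"snapmirror show -destination-path {DESTINATION}",
]

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # fine for a sketch, not for production
ssh.connect(DEST_CLUSTER, username=USER, password=PASSWORD)

for cmd in commands:
    stdin, stdout, stderr = ssh.exec_command(cmd)
    print(f"$ {cmd}\n{stdout.read().decode()}{stderr.read().decode()}")

ssh.close()
```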

Summary

Throughout this series we have asked the question “do we have to use technology X to deliver this service?” The reality is of course no, but NetApp are a key element of our modern data platforms because of this cloud integration capability. The option to provide consistent data services across multiple locations is extremely powerful, allowing us to take advantage of cloud while maintaining our enterprise controls.

While I’ve not seen any other data services provider come close to what NetApp are doing in this space, the important thing in your design strategy, if it is to include public cloud, is to ensure you have appropriate access to data services, integration, management and control. It’s crucial that you don’t put data at risk or diminish the capabilities of your data platform by using cloud.

This is part 6 in a series of posts on building a modern data platform, you can find the introduction and other parts of this series here.

Assessing the risk in public cloud – Darron Gibbard – Ep72

As the desire to integrate public cloud into our organisations’ IT continues to grow, maintaining control and security of our key assets is a challenge, but one that we need to overcome if we are going to use cloud as a fundamental part of our future IT infrastructure.

The importance of security and reducing our vulnerabilities is not, of course, unique to public cloud; it’s a key part of any organisation’s IT and data strategy. However, the move to public cloud does introduce some different challenges, with many of our services and data now sitting well outside the protective walls of our datacentre. This means that if our risks and vulnerabilities go unidentified and unmanaged, we are open to the potential of major and wide-reaching security breaches.

This week’s Tech Interviews is the second in our series looking at what organisations need to consider as they make the move to public cloud. In this episode we focus on risk: how to assess it, how to gain visibility into our systems regardless of location, and how to mitigate the risks that our modern infrastructure may come across.

To help discuss the topic of risk management in the cloud, I’m joined by Darron Gibbard. Darron is the Managing Director for EMEA North and Chief Technology Security Officer for Qualys; with 25 years’ experience in the enterprise security, risk and compliance industry, he is well placed to discuss the challenges of public cloud.

In this episode we look at the vulnerabilities that a move to cloud can create as our data and services are no longer the preserve of the data centre. We discuss whether the cloud is as high a risk as we may be led to believe, and why a lack of visibility into risks and threats is more of a problem than any inherent risk in a cloud platform.

Darron shares some insight into building a risk-based approach to using cloud, how to assess risk, and why understanding the impact of a vulnerability is just as useful, if not more so, than working out the likelihood of a cloud-based “event”.
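
As a simple illustration of that point (my own toy example, not Qualys’s methodology), a basic risk register scores each finding as likelihood × impact; give impact a heavier weighting and different findings float to the top of the remediation list:

```python
"""
Illustrative only: a toy risk register showing why impact deserves at least as
much attention as likelihood. The findings, scores and weights are made up.
"""
findings = [
    # (name, likelihood 1-5, impact 1-5)
    ("Public storage bucket containing customer exports", 2, 5),
    ("Unpatched web server in a test environment",        4, 2),
    ("Admin console exposed without MFA",                 3, 5),
    ("Verbose error messages on a marketing site",        5, 1),
]

def score(likelihood: int, impact: int, impact_weight: float = 1.0) -> float:
    """Classic likelihood x impact, with an optional extra weighting on impact."""
    return likelihood * (impact ** impact_weight)

for weight in (1.0, 1.5):
    ranked = sorted(findings, key=lambda f: score(f[1], f[2], weight), reverse=True)
    print(f"\nimpact_weight={weight}")
    for name, likelihood, impact in ranked:
        print(f"  {score(likelihood, impact, weight):6.1f}  {name}")
```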

We wrap up with a discussion around Qualys’s five principles of security and their approach to transparent orchestration, ensuring that all the additional information we can gather is used effectively.

The challenges around vulnerability and risk management when we move to public cloud shouldn’t be ignored, but it was refreshing to hear Darron present a balanced view and explain that, when managed correctly, the cloud is no riskier than any enterprise environment.

Qualys are an interesting company with a great portfolio of tools, including a number that are free to use and can help companies of all sizes reduce their risk exposure both on-prem and in the cloud. To find out more about Qualys you can visit www.qualys.com.

You can also contact Darron by email dgibbard@qualys.com or connect with him on LinkedIn.

Thanks for listening.

For the first show in this series then check out – Optimising the public cloud – Andrew Hillier – Ep71

Protecting 365 – a look at Veeam Backup for Office 365

Recently Veeam announced version 2.0 of their Backup for Office 365 product, which extended the functionality of its predecessor with much-needed support for SharePoint and OneDrive for Business. Looking into the release and what’s new prompted me to revisit the topic of protecting Office 365 data, especially the approach of building your own solution to do so.

Back in April I wrote a post for Gestalt IT (“How to protect Office 365 data”), the basics of which considered the broadly held misconception that Microsoft are taking care of your data on their SaaS platform. While Microsoft provide some protection via retention and compliance rules and a 30-day rolling backup of OneDrive, this is not a replacement for a robust enterprise level data protection solution.

The article examined this issue and compared two approaches for dealing with the challenge, either via SaaS (NetApp’s SaaS backup platform was used as an example) or doing it yourself with Veeam. The article wasn’t intended to cover either approach in detail but to discuss the premise of Office 365 data protection.

This Veeam release, though, seemed like a good opportunity to look in more detail at the DIY approach to protecting our Office 365 data.

Why flexibility is worth the work

One of the drivers for many in the shift to 365 is simplification, removing the complexity that can come with SharePoint and Exchange deployments. It then surely follows that if I wanted simplicity, I’d want the same with my data protection platform. Why would I want to worry about backup repositories, proxy and backup servers or any other element of infrastructure?

The reality, however, is that when it comes to data protection, simplification and limiting complexity may not be the answer. The simplicity of SaaS can come at the price of reducing our ability to be flexible enough to meet our requirements, for example limiting our options to:

  • Have data backed up where we want it.
  • Deal with hybrid infrastructure and protect on-prem services.
  • Have full flexibility with restore options.

These limitations can be a problem for some organisations, and when we consider mitigating against provider “lock-in” and the pressures of more stringent compliance, you can see how, for some, flexibility quickly overrides the desire for simplicity.

It is this desire for flexibility that makes building your own platform an attractive proposition. We can see from Veeam’s model the broad flexibility this approach can provide:

Backup Repository

Data location is possibly the key factor when deciding to build your own platform. Veeam provide the flexibility to store our data in our own datacentre, a co-lo facility, or even a public cloud repository, giving us the flexibility to meet the most stringent data protection needs.

Hybrid Support

The next most important driver for choosing to build your own solution is protecting hybrid workloads. While many have embraced Office 365 in its entirety, there are still organisations who, for numerous reasons, have maintained an on-premises element to their infrastructure. This hybrid deployment can be a stumbling block for SaaS providers that focus only on Office 365.

Veeam Backup for Office 365 fully supports the protection of data both on-prem and in the cloud, all through one console and one infrastructure, under a single licence. This capability is hugely valuable, simplifying the data protection process for hybrid environments and removing any need for multiple tools to protect the separate elements.

Recovery

It’s not just backup flexibility that makes building your own platform valuable; it’s also the range of options this brings to recovery. The flexibility to take data backed up in any location and restore it to multiple different locations is highly valuable, and sometimes an absolute necessity, for anything from practical to regulatory reasons.

What’s the flexibility cost?

Installation

Does this extra flexibility come with a heavy price in complexity and cost? In Veeam’s case, no; they are renowned for simplicity of deployment and Backup for Office 365 is no different. It requires just the usual components of backup server, proxy, backup repository and product explorers, with the size of the protected infrastructure dictating the scale of the protection platform.

There are of course limitations (see the Backup for Office 365 system requirements). One major consideration is bandwidth; it’s important to consider how much data you’ll be bringing into your backup repository, both initially and for subsequent incremental updates. While most SaaS providers will have substantial connectivity into Microsoft’s platform for these operations, you may not.
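
As a rough back-of-the-envelope check (my own illustration, not a Veeam sizing tool), it’s worth estimating how long the initial seed and the daily incrementals would take over the bandwidth you actually have before settling on a repository location:

```python
"""
Illustrative only: a rough transfer-time estimate for the initial Office 365
seed and daily incrementals. The figures below are placeholders, and the
calculation ignores protocol overhead, throttling and deduplication, so treat
the result as an optimistic lower bound.
"""

def transfer_hours(data_gb: float, link_mbps: float, utilisation: float = 0.7) -> float:
    """Hours to move data_gb over a link_mbps connection at a given utilisation."""
    usable_mbps = link_mbps * utilisation
    return (data_gb * 8 * 1000) / usable_mbps / 3600   # GB -> megabits, seconds -> hours

initial_seed_gb = 2000      # mailboxes + SharePoint + OneDrive to protect (placeholder)
daily_change_gb = 40        # estimated daily incremental (placeholder)
internet_link_mbps = 200    # bandwidth available for backup traffic (placeholder)

print(f"Initial seed : ~{transfer_hours(initial_seed_gb, internet_link_mbps):.1f} hours")
print(f"Daily backup : ~{transfer_hours(daily_change_gb, internet_link_mbps):.1f} hours")
```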

Licencing

A major benefit of software as a service is the commercial model; paying by subscription can be very attractive, and it is often lost when deploying our own solution. This is not the case with Backup for Office 365, which is licenced on a subscription basis.

Do It Yourself vs As A Service

The Gestalt IT article ended with a comparison of the pros and cons of the two approaches.

Do It Yourself

  • Pros: Flexibility, Control, Customisation, Protect Hybrid Deployments
  • Cons: Planning, Management Overhead, Responsibility

As A Service

  • Pros: Simplicity, Lower Management Overhead, Ease of Deployment
  • Cons: Lack of control, Inability to customise, Cloud only workloads, Data Sovereignty

I think these points remain equally relevant when deciding which approach is right for you, regardless of what we’ve discussed here about Veeam’s offering. If SaaS is the right approach for you, it remains so, but if you do take the DIY approach, then I hope this post gives you an indication of the flexibility and customisation that is possible and why this can be crucial as part of your data protection strategy.

If building your own platform is your chosen route then Veeam Backup for Office 365 v2 is certainly worthy of your consideration. But regardless of approach, remember the data sat in Office 365 is your responsibility; make sure it’s protected.

If you want to know more, you can contact me on twitter @techstringy or check out Veeam’s website.