Assessing the risk in public cloud – Darron Gibbard – Ep72

As the desire to integrate public cloud into our organisations' IT continues to grow, maintaining control and security of our key assets is a challenge, but one we need to overcome if cloud is to become a fundamental part of our future IT infrastructure.

The importance of security and reducing our vulnerabilities is not, of course, unique to public cloud; it's a key part of any organisation's IT and data strategy. However, the move to public cloud does introduce some different challenges, with many of our services and data now sitting well outside the protective walls of our datacentre. If our risks and vulnerabilities go unidentified and unmanaged, we leave ourselves open to major and wide-reaching security breaches.

This week's Tech Interviews is the second in our series looking at what organisations need to consider as they move to public cloud. In this episode we focus on risk: how to assess it, how to gain visibility into our systems regardless of location, and how to mitigate the threats our modern infrastructure may face.

To help discuss the topic of risk management in the cloud, I'm joined by Darron Gibbard. Darron is Managing Director for EMEA North and Chief Technology Security Officer at Qualys. With 25 years' experience in the enterprise security, risk and compliance industry, he is well placed to discuss the challenges of public cloud.

In this episode we look at the vulnerabilities a move to cloud can create as our data and services are no longer the preserve of the datacentre. We discuss whether the cloud is as high a risk as we may be led to believe, and why a lack of visibility of risks and threats is more of a problem than any inherent risk in a cloud platform.

Darron shares insight into building a risk-based approach to using cloud, how to assess risk, and why understanding the impact of a vulnerability is just as, if not more, useful than working out the likelihood of a cloud-based "event".
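
As a generic illustration of that idea (my own sketch, not something from the episode), a simple scoring model shows how weighting impact changes which vulnerabilities rise to the top of the list. The scores and weighting below are assumptions for the example only.

```python
# Toy risk scoring: risk = likelihood x impact, with impact weighted
# more heavily. All scores and the weighting are illustrative
# assumptions, not figures from the episode.

vulnerabilities = [
    # (name, likelihood 1-5, impact 1-5)
    ("Exposed cloud storage bucket", 2, 5),  # unlikely, but catastrophic
    ("Phishing of a test account", 5, 1),    # likely, but low impact
]

IMPACT_WEIGHT = 2  # bias the score towards impact

def risk_score(likelihood: int, impact: int) -> int:
    return likelihood * impact ** IMPACT_WEIGHT

for name, likelihood, impact in sorted(
        vulnerabilities, key=lambda v: risk_score(v[1], v[2]), reverse=True):
    print(f"{name}: {risk_score(likelihood, impact)}")

# The low-likelihood, high-impact item tops the list (50 vs 5),
# which is the point: impact can matter more than likelihood.
```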

We wrap up with a discussion around Qualys's five principles of security and their approach to transparent orchestration, ensuring all the additional information we can gather is used effectively.

The challenges around vulnerability and risk management when we move to public cloud shouldn't be ignored, but it was refreshing to hear Darron present a balanced view: when managed correctly, the cloud is no riskier than any enterprise environment.

Qualys are an interesting company with a great portfolio of tools, including a number that are free to use and can help companies of all sizes reduce their risk exposure both on-prem and in the cloud. To find out more about Qualys, visit www.qualys.com.

You can also contact Darron by email at dgibbard@qualys.com or connect with him on LinkedIn.

Thanks for listening.

For the first show in this series, check out – Optimising the public cloud – Andrew Hillier – Ep71


Protecting 365 – a look at Veeam Backup for Office 365

Recently Veeam announced version 2.0 of their Backup for Office 365 product, extending the functionality of its predecessor with much-needed support for SharePoint and OneDrive for Business. Looking into the release and what's new prompted me to revisit the topic of protecting Office 365 data, especially the approach of building your own solution to do so.

Back in April I wrote a post for Gestalt IT ("How to protect Office 365 data"), the basis of which was the broadly held misconception that Microsoft are taking care of your data on their SaaS platform. While Microsoft provide some protection via retention and compliance rules and a 30-day rolling backup of OneDrive, this is not a replacement for a robust, enterprise-level data protection solution.

The article examined this issue and compared two approaches to the challenge: via SaaS (NetApp's SaaS backup platform was used as an example) or doing it yourself with Veeam. It wasn't intended to cover either approach in detail but to discuss the premise of Office 365 data protection.

This Veeam release, though, seemed like a good opportunity to look in more detail at the DIY approach to protecting our Office 365 data.

Why flexibility is worth the work

One of the drivers for many in the shift to Office 365 is simplification, removing the complexity that can come with SharePoint and Exchange deployments. It surely follows that if I wanted simplicity, I'd want the same from my data protection platform. Why would I want to worry about backup repositories, proxy and backup servers, or any other element of infrastructure?

The reality, however, is that when it comes to data protection, simplicity and limited complexity may not be the answer. The simplicity of SaaS can come at the price of the flexibility we need to meet our requirements, for example limiting our options to:

  • Have data backed up where we want it.
  • Deal with hybrid infrastructure and protect on-prem services.
  • Have full flexibility with restore options.

These limitations can be a problem for some organisations, and when we consider mitigating provider "lock-in" and the pressures of ever more stringent compliance, you can see how, for some, flexibility quickly overrides the desire for simplicity.

It is this desire for flexibility that makes building your own platform an attractive proposition. Veeam's model shows the broad flexibility this approach can provide:

Backup Repository

Data location is possibly the key factor when deciding to build your own platform. Veeam provide the flexibility to store our data in our own datacentre, a co-lo facility, or even a public cloud repository, letting us meet the most stringent data protection requirements.

Hybrid Support

The next most important driver for building your own solution is protecting hybrid workloads. While many have embraced Office 365 in its entirety, there are still organisations who, for numerous reasons, have maintained an on-premises element to their infrastructure. This hybrid deployment can be a stumbling block for SaaS providers that focus solely on Office 365.

Veeam Backup for Office 365 fully supports the protection of data both on-prem and in the cloud, all through one console and one infrastructure, under a single licence. This capability is hugely valuable, simplifying data protection for hybrid environments and removing the need for multiple tools to protect the separate elements.

Recovery

It's not just backup flexibility that gives building your own platform its value; it's also the range of options it brings to recovery. The ability to take data backed up in any location and restore it to multiple different locations is highly valuable, and sometimes an absolute necessity, for reasons ranging from practicality to regulation.

What’s the flexibility cost?

Installation

Does this extra flexibility come with a heavy price in complexity and cost? In Veeam's case, no. They are renowned for simplicity of deployment, and Backup for Office 365 is no different. It requires just the usual components (backup server, proxy, backup repository and product explorers), with the size of the protected infrastructure dictating the scale of the protection platform.

There are of course limitations (see the Backup for Office 365 System Requirements). One major consideration is bandwidth: think about how much data you'll be bringing into your backup repository, both for the initial backup and for subsequent incremental updates. While most SaaS providers will have substantial connectivity into Microsoft's platform for these operations, you may not.
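
To make the bandwidth point concrete, here is a back-of-the-envelope estimate of the initial seed time. Every figure in it is an assumption for illustration, not Veeam sizing guidance.

```python
# Rough initial-seed estimate for a self-hosted backup repository.
# All figures are illustrative assumptions, not Veeam sizing guidance.

data_tb = 5        # Office 365 data to bring down initially
link_mbps = 200    # usable downstream bandwidth
efficiency = 0.7   # allow for protocol overhead and throttling

data_bits = data_tb * 8 * 10**12                        # decimal TB to bits
seconds = data_bits / (link_mbps * 10**6 * efficiency)
print(f"Initial seed: roughly {seconds / 86400:.1f} days")  # ~3.3 days
```

Incremental updates after the seed will be far smaller, but the same arithmetic helps sanity-check whether your daily change rate fits your backup window.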

Licensing

A major benefit of software as a service is the commercial model; paying by subscription can be very attractive and is a benefit often lost when deploying our own solution. That's not the case with Backup for Office 365, which is licensed on a subscription basis.

Do It Yourself vs As A Service

The Gestalt IT article ended with a comparison of the pros and cons of the two approaches.

Do It Yourself
Pros: Flexibility, Control, Customisation, Protect Hybrid Deployments
Cons: Planning, Management Overhead, Responsibility

As A Service
Pros: Simplicity, Lower Management Overhead, Ease of Deployment
Cons: Lack of control, Inability to customise, Cloud only workloads, Data Sovereignty

I think these points remain relevant when deciding which approach is right for you, regardless of what we've discussed here about Veeam's offering. If SaaS is the right approach, it remains so; but if you do take the DIY route, then I hope this post gives you an indication of the flexibility and customisation that are possible, and why they can be crucial to your data protection strategy.

If building your own platform is your chosen route, then Veeam Backup for Office 365 v2 is certainly worthy of your consideration. But regardless of approach, remember: the data sitting in Office 365 is your responsibility, so make sure it's protected.

If you want to know more, you can contact me on Twitter @techstringy or check out Veeam's website.

NetApp, The Cloud Company?

Last week I was fortunate enough to be invited to NetApp's HQ in Sunnyvale to spend two days with their leadership, hearing about strategy, product updates and futures (under very strict NDA, so don't ask!) as part of the annual NetApp A-Team briefing session. This happened in a week where NetApp revealed their spring product updates which, alongside a raft of added capabilities for existing products, also included a new relationship with Google Cloud Platform (GCP).

The GCP announcement means NetApp now offer services on all three of the largest hyperscale platforms. Yes, that's right: NetApp, the "traditional" on-prem storage vendor, are offering an increasing amount of cloud services, and what struck me while listening to their senior executives and technologists was that this is not just a faint nod to cloud; it is central to NetApp's evolving strategy.

But why would a storage vendor put public cloud so central to their thinking? It's a good question, and I think the answer lies in the technology landscape many of us operate in. The use of cloud is commonplace; its flexibility and scale are driving new technology into businesses more quickly and easily than ever before.

However, this comes with its own challenges. While quick and easy is fine for deploying services and compute, the same cannot be said of our data and storage repositories: not only does data continue to have significant "weight", it also brings additional challenges, especially around compliance and security. It's critical in a modern data platform that our data has as much flexibility as the services and compute that need to access it, while still allowing us to maintain full control and stringent security.

NetApp have identified this challenge as something on which they can build their business strategy, and the evidence is in their spring technology announcements: the tight integration of cloud into their "traditional" platforms, and the continued development of cloud-native services such as those in the GCP announcement, the additional capabilities in AWS and Azure, as well as Cloud Volumes and services such as SaaS Backup and Cloud Sync. It is further reflected in an intelligent acquisition and partnering strategy focussed on those who bring automation, orchestration and management to hybrid environments.

Is NetApp the on-prem traditional storage vendor no more?

In my opinion, this is an emphatic no. During our visit we heard from NetApp founder Dave Hitz, who talked about NetApp's view of cloud: how they realised early on that it was something they needed to understand, and decided to take a gamble on it and its potential. What was refreshing was that they did this without any guarantee they could make money from cloud, simply because they understood how important it could become.

Over the last four years NetApp have been reinvigorated by a solid strategy built around their data fabric and this strong cloud-centric vision, which has not only seen share prices rocket but also market share and revenue grow. That growth has not come from cloud services alone; in fact the majority comes from strong sales of their "traditional" on-prem platforms. They are convinced this growth has been driven by their embrace of cloud: a coherent strategy that ensures your data is where you need it, when you need it, while maintaining all the enterprise-class qualities you'd expect on-prem, whether the data is in your datacentre, near the cloud or in it.

Are NetApp a cloud company?

No. Are they changing? Most certainly.

Their data fabric message, honed over the last four years, is now mature not only in strategy but in execution. NetApp platforms, driven by ONTAP as a common transport engine, provide the capability to move data between platforms, be they on-prem, near the cloud or in the public hyperscalers, while crucially maintaining the high quality of data services and management we are used to in the enterprise, across all of those repositories.

This strategy is core to NetApp and their success, and it certainly resonates with the businesses I speak with as they become more data-focussed than ever, driven by compliance, cost or the need to garner greater value from their data. Businesses do not want their data locked away in silos, nor do they want it at risk when they move it to new platforms to take advantage of new tools and services.

While NetApp are not a cloud company, during the two days it seemed clear to me that their embrace of cloud puts them in a unique position when it comes to providing data services. As businesses develop their modern data strategy they would, in my opinion, be remiss not to at least understand NetApp's strategy and data fabric and the value that approach can bring, regardless of whether they ultimately use NetApp technology.

NetApp's changes over the last few years have been significant, their future vision is fascinating, and I for one look forward to seeing their continued development and success.

For more information on the recent spring announcements, you can review the following:

The NetApp official Press Release

Blog post by Chris Maki summarising the new features in ONTAP 9.4

The following NetApp blogs provide more detail on a number of individual announcements;

New FabricPool Capabilities

The new AFF A800 Platform

Google Cloud Platform Announcement

Latest NVMe announcements

Tech ONTAP Podcast – ONTAP 9.4 Overview


Building a modern data platform – Prevention (Office365)

So far in this series we have looked at getting our initial foundations right and at ensuring we have insight into, and control of, our data, along with the components I use to help achieve this. This time we are looking at something many organisations already use, which has a wide range of capabilities to help manage and control data but which is often underutilised.

For an ever-increasing number of us, Office 365 has become the primary data and communications repository. However, I often find organisations are unaware of the many powerful capabilities within their subscription that can greatly reduce the risk of data breach.

Tucked away within Office 365 is the Security and Compliance section (protection.office.com), the gateway to several powerful features that should be part of your modern data strategy.

In this article we are going to focus on two such features, "Data Loss Prevention" and "Data Governance"; both offer powerful capabilities that can be deployed quickly across your organisation and help significantly mitigate the risk of data breach.

Data Loss Prevention (DLP)

DLP is an important weapon in our data management arsenal. DLP policies are designed to ensure sensitive information does not leave our organisation in ways that it shouldn't, and Office 365 makes it straightforward to get started.

We can quickly create policies to apply across our organisation that identify the types of data we hold. Several predefined options already exist, including ones that identify financial data, personally identifiable information (PII), social security numbers, health records and passport numbers, with templates for a number of countries and regions across the world.

Once the policies that identify our data types are created, we can apply rules governing how that data can be used; we can apply several rules and, depending on requirements, make them increasingly stringent.

The importance of DLP rules should not be underestimated. While it's important we understand who has access to and uses our data, too often we feel this is enough and don't take the next crucial step of controlling the use and movement of that data.

We shouldn’t forget that those with the right access to the right data, may accidentally or maliciously do the wrong thing with it!
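
To show conceptually what a DLP policy is doing when it identifies a data type and then acts on it, here is a toy sketch of credit card detection: a regex candidate match confirmed by a Luhn checksum. This is purely illustrative; it is not how Office 365 implements its sensitive information types.

```python
import re

# Toy DLP-style detection: find candidate card numbers, confirm with
# a Luhn checksum, then act on the match. Purely conceptual; not the
# Office 365 implementation.

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    digits = [int(d) for d in reversed(number)]
    total = sum(digits[0::2])  # digits that are not doubled
    total += sum(d * 2 if d < 5 else d * 2 - 9 for d in digits[1::2])
    return total % 10 == 0

def contains_card_number(text: str) -> bool:
    return any(luhn_valid(re.sub(r"[ -]", "", m.group()))
               for m in CARD_PATTERN.finditer(text))

doc = "Customer paid with card 4111 1111 1111 1111."
if contains_card_number(doc):
    print("Policy match: apply rules (warn user, block external sharing)")
```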

Data Governance

Governance should be a cornerstone of a modern data platform. It defines the way we use, manage, secure, classify and retain our data, and it affects the cost of our data storage, its security and our ability to deliver compliance to our organisations.

Office 365 provides two key governance capabilities.

Labels

Labels allow us to apply classifications to our data so we can start to understand what is important and what isn't. We can highlight what is for public consumption and what is private, sensitive, commercial in confidence, or any other classification used within your organisation.

Classification is a crucial part of delivering a successful data compliance capability, giving us granular control over exactly how we handle data of all types.

Labels can be applied automatically based on the contents of the data we have stored, applied by users as they create content, or applied in conjunction with the DLP rules we discussed earlier.

For example, a DLP policy can identify a document containing credit card details and automatically apply a rule that labels it as sensitive information.
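
Conceptually, the automatic labelling flow looks something like the sketch below. The label names and keyword rules are invented for the example; Office 365 drives this from its sensitive information types rather than simple keyword matching.

```python
# Conceptual auto-labelling: inspect content as it is created and
# assign a classification. Labels and keyword rules here are invented
# for illustration; Office 365 uses its sensitive information types.

RULES = {
    "Sensitive": ["credit card", "passport", "national insurance"],
    "Commercial in Confidence": ["pricing schedule", "contract value"],
}

def classify(text: str) -> str:
    lowered = text.lower()
    for label, keywords in RULES.items():
        if any(keyword in lowered for keyword in keywords):
            return label
    return "Public"  # default when nothing sensitive is detected

label = classify("Attached is the customer's credit card mandate.")
print(label)  # Sensitive -> the stricter DLP rules for this label now apply
```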

Retention

Once we have classified our data into what is important and what isn't, retention policies let us define what we keep and for how long.

These policies allow us to effectively manage and govern our information, and in turn reduce the risk of litigation or security breach, by either retaining data for a period defined by a regulatory requirement or, importantly, permanently deleting old content we are no longer required to keep.

The policies can be assigned automatically based on classifications or can be applied manually by a user as they generate new data.

For example, a user creates a new document containing financial data which must be retained for seven years; the user can classify the data accordingly, ensuring both our DLP and retention rules are applied as needed.
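
The retention decision itself is simple once data carries a label. The sketch below shows the idea; the labels and periods are invented for illustration, with only the seven-year financial retention mirroring the example above.

```python
from datetime import date, timedelta

# Conceptual retention check: map a label to a retention period and
# decide whether content may be deleted. Periods are illustrative;
# the seven-year financial retention mirrors the example above.

RETENTION = {
    "Financial": timedelta(days=7 * 365),
    "Public": timedelta(days=365),
}

def retention_expired(label: str, created: date, today: date) -> bool:
    return today - created > RETENTION[label]

if retention_expired("Financial", date(2010, 6, 1), date.today()):
    print("Send to disposition for review before permanent deletion")
else:
    print("Retain: still within the required retention period")
```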

Management

Alongside these capabilities, Office 365 provides two management tools: disposition and supervision.

Disposition is our holding pen for data due to be deleted, allowing us to review any deletions before actioning them.

Supervision is a powerful capability allowing us to capture employee communications for examination by internal or external reviewers.

These tools are important in allowing us to show we have auditable processes and controls within our platform and are taking the steps necessary to protect our data assets as we should.

Summary

The ability to govern and control our data wherever we hold it is a critical part of a modern data platform. If you use Office 365 and are not using these capabilities, you are missing out.

The importance of governance will only continue to grow as ever more stringent data privacy and security regulations develop. Governance can greatly reduce many of the risks associated with data breach, and services such as Office 365 have taken things that were traditionally difficult to achieve and made them a whole lot easier.

If you are building a modern data platform then compliance and governance should be at the heart of your strategy.

This is part 4 in a series of posts on building a modern data platform; the previous parts of the series can be found below.

Introduction
The Storage
Availability
Control

Building a modern data platform – Availability

In part one we discussed the importance of getting our storage platform right; in part two we look at availability.

The idea that availability is a crucial part of a modern platform was something I first heard from a friend of mine, Michael Cade from Veeam, who introduced me to “availability as part of digital transformation” and how this was changing Veeam’s focus.

This shift is absolutely right. As we build our modern platforms, backup and recovery remain crucial requirements; however, a focus on availability is at least as crucial, if not more so. Today nobody in your business really cares how quickly you can recover a system; what our digitally driven businesses demand is that systems are always there, and downtime in ever more competitive environments is not tolerated.

With that in mind, why do I choose Veeam to deliver availability in my modern data platform?

Keep it simple

Whenever I meet a Veeam customer, their first comment on Veeam is "it just works", and the power of this rather simple statement should not be underestimated when you are protecting key assets. Too often data protection solutions have been overly complex, inefficient and unreliable, something I have always found unacceptable. For businesses big or small, you need a data protection solution you can deploy, then forget, trusting it to just do what you ask. This is perhaps Veeam's greatest strength, a crucial driver behind its popularity, and what makes it such a good component of a data platform.

I would actually say Veeam are a bit like the Apple of availability: much of what they do has been done by others (Veeam didn't invent data protection, in the same way Apple didn't invent the smartphone), but they have made it simple, usable and something that just works and can be trusted. Don't underestimate the importance of this.

Flexibility

If ever there was a byword for modern IT, flexibility could well be it. It's crucial that any solution and platform we build has the flexibility to react to ever-changing business and technological demands. Look at how business needs, and the technology itself, have changed in the last ten years and how much our platforms have had to change to keep up: flash storage, web-scale applications, mobility, cloud; the list goes on.

The following statement sums up Veeam's view on flexibility perfectly:

“Veeam Availability Platform provides businesses and enterprises of all sizes with the means to ensure availability for any application and any data, across any cloud infrastructure”

It is this focus on flexibility that makes Veeam such an attractive proposition in the modern data platform, allowing me to design a solution flexible enough to meet my different needs, providing availability across my data platform, all with the same familiar toolset regardless of location, workload type or recovery needs.

Integration

As mentioned in part one, no modern data platform will be built with just one vendor's tools, not if you want to deliver the control and insight into your data that we demand as a modern business. Veeam, like NetApp, have built a very strong partner ecosystem, allowing them to integrate tightly with many vendors. But more than just integrating, Veeam deliver additional value, allowing me to simplify and do more with my platform (take a look at this blog about how Veeam lets you get more from NetApp snapshots). Veeam are continuously delivering new integrations, not only with on-prem vendors but, as mentioned earlier, with a vast range of cloud providers.

This ability to extend the capabilities and simplify the integration of multiple components in a multi-platform, multi-cloud world is very powerful and a crucial part of my data platform architecture.

Strategy

As with NetApp, over the last 18 months it is the shift in Veeam's overall strategy that has impressed me more than anything else. Although seemingly a simple change, the shift from talking about backup and recovery to talking about availability is significant.

As I said at the opening of this article, in our modern IT platforms nobody is interested in how quickly you can recover something; it's about the availability of crucial systems. A key part of Veeam's strategy is to "deliver the next generation of availability for the Always-On Enterprise", and you can see this in everything Veeam are doing: focussing on simplicity, ensuring that you can have your workload where you need it, when you need it, and move those workloads seamlessly between on-prem, cloud and back again.

They have also been very smart, employing a strong leadership team and, as with NetApp, investing to ensure that cloud services don't leave a traditionally on-premises-focussed technology provider adrift.

The Veeam and NetApp strategies are very similar, and it is this similarity that makes them attractive components in my data platform. I need my component providers to understand technology trends and changes so they, as well as our data platforms, can move and change with them.

Does it have to be Veeam?

In the same way it doesn't have to be NetApp, of course it doesn't have to be Veeam. But in exactly the same way, if you are building a platform for your data, make sure your platform components deliver the kinds of things we have discussed in the first two parts of this series: the flexibility we need, integration with components across your platform, and a strategic vision you are comfortable with. As long as you have that, you will have rock-solid foundations to build on.

In Part Three of this series we will look at building insight, compliance and governance into our data platform.

You can find the Introduction and Part One – “The Storage” below.

The Introduction
Part One – The Storage