Assessing the risk in public cloud – Darron Gibbard – Ep72

As the desire to integrate public cloud into our organisation's IT continues to grow, the need to maintain control and security of our key assets is a challenge, but one we need to overcome if we are going to use cloud as a fundamental part of our future IT infrastructure.

The importance of security and reducing our vulnerabilities is not, of course, unique to public cloud; it's a key part of any organisation's IT and data strategy. However, the move to public cloud does introduce some different challenges, with many of our services and data now sitting well outside the protective walls of our datacentre. This means that if our risks and vulnerabilities go unidentified and unmanaged, we can be opened up to major and wide-reaching security breaches.

This week's Tech Interviews is the second in our series looking at what organisations need to consider as they make the move to public cloud. In this episode we focus on risk: how to assess it, how to gain visibility into our systems regardless of location, and how to mitigate the risks that our modern infrastructure may come across.

To help discuss the topic of risk management in the cloud, I'm joined by Darron Gibbard. Darron is the Managing Director for EMEA North and Chief Technology Security Officer for Qualys. With 25 years' experience in the enterprise security, risk and compliance industry, he is well placed to discuss the challenges of public cloud.

In this episode we look at the vulnerabilities that a move to cloud can create as our data and services are no longer the preserve of the data centre. We discuss whether the cloud is as high a risk as we may be led to believe, and why a lack of visibility into risks and threats is more of a problem than any inherent risk in a cloud platform.

Darron shares some insight into building a risk-based approach to using cloud, how to assess risk, and why understanding the impact of a vulnerability is just as useful as, if not more useful than, working out the likelihood of a cloud-based "event".

We wrap up with a discussion around Qualys's 5 principles of security and their approach to transparent orchestration, ensuring that all this additional information we can gather is used effectively.

The challenges presented around vulnerability and risk management when we move to public cloud shouldn't be ignored, but it was refreshing to hear Darron present a balanced view, arguing that the cloud is no riskier than any enterprise environment when managed correctly.

Qualys are an interesting company with a great portfolio of tools, including a number that are free to use and can help companies of all sizes reduce their risk exposure both on-prem and in the cloud. To find out more about Qualys you can visit www.qualys.com.

You can also contact Darron by email dgibbard@qualys.com or connect with him on LinkedIn.

Thanks for listening.

For the first show in this series, check out – Optimising the public cloud – Andrew Hillier – Ep71

Protecting 365 – a look at Veeam Backup for Office 365

Recently Veeam announced version 2.0 of their Backup for Office 365 product, which extended the functionality of its predecessor with much-needed support for SharePoint and OneDrive for Business. Looking into the release and what's new prompted me to revisit the topic of protecting Office 365 data, especially the approach of building your own solution to do so.

Back in April I wrote a post for Gestalt IT ("How to protect Office 365 data") which considered the broadly held misconception that Microsoft are taking care of your data on their SaaS platform. While Microsoft provide some protection via retention and compliance rules and a 30-day rolling backup of OneDrive, this is not a replacement for a robust, enterprise-level data protection solution.

The article examined this issue and compared two approaches for dealing with the challenge, either via SaaS (NetApp’s SaaS backup platform was used as an example) or doing it yourself with Veeam. The article wasn’t intended to cover either approach in detail but to discuss the premise of Office 365 data protection.

This Veeam release, though, seemed like a good opportunity to look in more detail at the DIY approach to protecting our Office 365 data.

Why flexibility is worth the work

One of the drivers for many in the shift to 365 is simplification, removing the complexity that can come with SharePoint and Exchange deployments. It then surely follows that if I wanted simplicity, I’d want the same with my data protection platform. Why would I want to worry about backup repositories, proxy and backup servers or any other element of infrastructure?

The reality, however, is that when it comes to data protection, simplification and limiting complexity may not be the answer. The simplicity of SaaS can come at the price of reducing our flexibility to meet our own requirements, for example limiting our options to:

  • Have data backed up where we want it.
  • Deal with hybrid infrastructure and protect on-prem services.
  • Have full flexibility with restore options.

These limitations can be a problem for some organisations, and when we consider mitigating against provider "lock-in" and the pressures of more stringent compliance, you can see how, for some, flexibility quickly overrides the desire for simplicity.

It is this desire for flexibility that makes building your own platform an attractive proposition. We can see with Veeam's model the broad flexibility this approach can provide:

Backup Repository

Data location is possibly the key factor when deciding to build your own platform. Veeam provide the flexibility to store our data in our own datacentre, a co-lo facility, or even a public cloud repository, giving us the flexibility to meet the most stringent data protection needs.

Hybrid Support

The next most important driver for choosing to build your own solution is protecting hybrid workloads. While many have embraced Office 365 in its entirety, there are still organisations who, for numerous reasons, have maintained an on-premises element to their infrastructure. This hybrid deployment can be a stumbling block for SaaS providers that focus only on Office 365.

Veeam Backup for Office 365 fully supports the protection of data both on-prem and in the cloud, all through one console and one infrastructure, under a single licence. This capability is hugely valuable, simplifying the data protection process for hybrid environments and removing any need to have multiple tools protecting the separate elements.

Recovery

It's not just backup flexibility that gives building your own platform value; it's also the range of options this brings to recovery. The flexibility to take data backed up in any location and restore it to multiple different locations is highly valuable, and sometimes an absolute necessity, for reasons ranging from practicality to regulation.

What’s the flexibility cost?

Installation

Does this extra flexibility come with a heavy price of complexity and cost? In Veeam's case, no; they are renowned for simplicity of deployment and Backup for Office 365 is no different. It requires just the usual components of a backup server, proxy, backup repository and product explorers, with the size of the protected infrastructure dictating the scale of the protection platform.

There are of course limitations (see the Backup for Office 365 system requirements). One major consideration is bandwidth: it's important to consider how much data you'll be bringing into your backup repository, both initially and for subsequent incremental updates. While most SaaS providers will have substantial connectivity into Microsoft's platform for these operations, you may not.
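To put that bandwidth question into perspective, here is a rough back-of-the-envelope calculation. This is a minimal sketch, not a Veeam sizing tool; the tenant size, daily change rate and link speed are purely illustrative assumptions.

```python
# A minimal sketch (not a Veeam sizing tool) estimating how long an initial
# Office 365 seed backup and a daily incremental might take over a given
# internet link. All figures below are illustrative assumptions.

def transfer_hours(data_gb, link_mbps, efficiency=0.7):
    """Rough transfer time in hours for data_gb over a link_mbps connection.

    'efficiency' allows for protocol overhead and the link not being
    dedicated to backup traffic.
    """
    usable_mbps = link_mbps * efficiency
    data_megabits = data_gb * 8 * 1000  # GB -> megabits (decimal units)
    return data_megabits / usable_mbps / 3600

if __name__ == "__main__":
    # Hypothetical tenant: 2 TB of mailbox/SharePoint/OneDrive data,
    # ~1% daily change, backed up over a 100 Mbps link.
    initial_gb = 2000
    daily_change_gb = initial_gb * 0.01
    print(f"Initial seed:      ~{transfer_hours(initial_gb, 100):.1f} hours")
    print(f"Daily incremental: ~{transfer_hours(daily_change_gb, 100):.1f} hours")
```

On those assumptions the initial seed alone works out at roughly two and a half days of sustained transfer, which is exactly the kind of number worth knowing before committing to pulling Office 365 data back into your own repository.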

Licensing

A major benefit of software as a service is the commercial model; paying by subscription can be very attractive, and that benefit can be lost when deploying our own solution. This is not the case with Backup for Office 365, which is licensed on a subscription basis.

Do It Yourself vs As a Service

The Gestalt IT article ended with a comparison of the pros and cons of the two approaches.

Do It Yourself

  • Pros: Flexibility, Control, Customisation, Protection of hybrid deployments.
  • Cons: Planning, Management overhead, Responsibility.

As a Service

  • Pros: Simplicity, Lower management overhead, Ease of deployment.
  • Cons: Lack of control, Inability to customise, Cloud-only workloads, Data sovereignty.

I think these points remain equally relevant when deciding which approach is right for you, regardless of what we've discussed here with Veeam's offering. If SaaS is the right approach for you, it remains so; but if you do take the DIY approach, then I hope this post gives you an indication of the flexibility and customisation that is possible and why this can be a crucial part of your data protection strategy.

If building your own platform is your chosen route, then Veeam Backup for Office 365 v2 is certainly worthy of your consideration. But regardless of approach, remember that the data sitting in Office 365 is your responsibility: make sure it's protected.

If you want to know more, you can contact me on Twitter @techstringy or check out Veeam's website.

Getting your VeeamON!

Recently software vendor Veeam held its 2018 VeeamON conference in Chicago. VeeamON was one of my favourite conferences of last year; unfortunately I couldn't make it out this time, but I did tune in for the keynote to listen to the new strategy messages that were shared.

The availability market is an interesting space at the minute, highlighted by the technical innovation and talent recruitment you can see from companies like Veeam, Rubrik and others. Similar to the storage industry of five years ago, the data protection industry is being forced to change its thinking, with backup, replication and recovery no longer enough to meet modern demands. Availability is now the primary challenge, and not just for the data in our datacentre but also for data sitting with service providers, on SaaS platforms or with the big public hyperscalers; we need our availability strategy to cover all of these locations.

As with the storage industry when it was challenged by performance and the emergence of flash, two things are happening. New technology companies are emerging, offering different approaches and thinking to take on modern challenges that traditional vendors are not addressing. But that challenge also inspires the established vendors, with their experience, proven technologies, teams and budgets, to react and find answers to these new challenges; well, at least it encourages the smart ones.

This is where the availability industry currently sits and why the recent VeeamON conference was of interest. Veeam's position is interesting: a few years ago they were the upstart with a new way of taking on the challenge presented by virtualisation. However, as our world continues to evolve so do the challenges; cloud, automation, security, governance and compliance are just a few of the availability headaches many of us face and that Veeam must react to.

One of the things I like about Veeam (and one of the reasons I was pleased to be asked to be a part of their Vanguard program this year) is that they are a very smart company; some of their talent acquisition is very impressive, and the shift in how they see themselves and the problem they are trying to solve is intriguing.

VeeamON 2018 saw a further development of this message as Veeam introduced their 5 stages of intelligent data management, which sees them continue to expand their focus beyond Veeam "the backup company". The 5 stages provide the outline of a maturity model, something that can be used to measure progress towards a modern way of managing data.

Of these 5 stages, many of us sit on the left-hand side of the model, with a robust policy-based backup approach as the extent of our data management. However, for many this is no longer enough, as our infrastructures become more complex and change more rapidly, with data stored in a range of repositories and locations.

This is coupled with a need to better understand our data for management, privacy and compliance reasons; we can no longer operate an IT infrastructure without understanding, at the very least, where our data is located and what that means for its availability.

In my opinion, modern solutions must provide us with a level of intelligence and the ability to understand the behaviour of our systems and act accordingly. This is reflected on the right-hand side of Veeam's strategy: meeting this modern challenge will demand increasingly intelligent systems that can understand the criticality of a workload, or what is being done to a dataset, and act to protect it accordingly.

Although Veeam aren't quite doing all of that yet, you can see steps moving them along the way; solutions such as Availability Orchestrator, which takes the complexities of continuity and automates its execution, documentation and ongoing maintenance, are good examples.

It's also important to note that Veeam understand they are not the answer to all of an organisation's data management needs; they are ultimately a company focused on availability. But what they do realise is that availability is crucial and goes far beyond just recovering lost data: this is about making sure data is available ("any data, any app, across any cloud"), and they see the opportunity in becoming the integration engine in the data management stack.

Is all this relevant? Certainly. A major challenge for most businesses I chat with is how to build an appropriate data strategy: one that usually includes holding only the data they need, knowing how it's been used and by whom, knowing where it is at any given time, and having it in the right place when needed so they can extract "value" and make data-driven decisions. This can only be achieved with a coherent strategy that ties together multiple repositories and systems, ensures that data is where it should be, and maintains the management and control of that data across any platform that is required.

With that in mind, Veeam's direction makes perfect sense, with the 5 stages of intelligent data management model providing a framework upon which you can build a data management strategy, something hugely beneficial to anyone tasked with developing their organisation's data management platform.

In my opinion, Veeam's direction is well thought out and I'll be watching with interest not only how it continues to develop, but, importantly, how they deliver tools and partnerships that allow those invested in their strategy to execute it successfully.

You can find more information on the announcements from VeeamON on Veeam's website: www.veeam.com/veeamon/announcements

NetApp, The Cloud Company?

Last week I was fortunate enough to be invited to NetApp's HQ in Sunnyvale to spend two days with their leadership hearing about strategy, product updates and futures (under very strict NDA, so don't ask!) as part of the annual NetApp A-Team briefing session. This happened in a week where NetApp revealed their spring product updates which, alongside a raft of added capabilities to existing products, also included a new relationship with Google Cloud Platform (GCP).

The GCP announcement means NetApp now offer services on the three largest hyperscale platforms. Yes, that's right: NetApp, the "traditional" on-prem storage vendor, are offering an increasing number of cloud services, and what struck me while listening to their senior executives and technologists was that this is not just a faint nod to cloud but is central to NetApp's evolving strategy.

But why would a storage vendor have public cloud so central to their thinking? It's a good question, and I think the answer lies in the technology landscape many of us operate in. The use of cloud is commonplace; its flexibility and scale are driving new technology into businesses more quickly and easily than ever before.

However, this comes with its own challenges. While quick and easy is fine for deploying services and compute, the same cannot be said of our data and storage repositories; not only does data continue to have significant "weight", but it also comes with additional challenges, especially when we consider compliance and security. It's critical in a modern data platform that our data has as much flexibility as the services and compute that need to access it, while at the same time allowing us to maintain full control and stringent security.

NetApp has identified this challenge as something upon which they can build their business strategy, and you can see evidence of this within their spring technology announcements: not only do they tightly integrate cloud into their "traditional" platforms, but they also continue to develop cloud-native services, such as those in the GCP announcement, the additional capabilities in AWS and Azure, Cloud Volumes, and services such as SaaS Backup and Cloud Sync. It is further reflected in an intelligent acquisition and partnering strategy focused on those who bring automation, orchestration and management to hybrid environments.

Is NetApp the on-prem traditional storage vendor no more?

In my opinion, this is an emphatic no. During our visit we heard from NetApp founder Dave Hitz, who talked about NetApp's view of cloud and how they initially realised it was something they needed to understand and decided to take a gamble on it and its potential. What was refreshing was that they did this without any guarantee they could make money from cloud, just an understanding of how potentially important it would be.

Over the last four years NetApp has been reinvigorated with a solid strategy built around their data fabric and this strong cloud-centric vision, which has not only seen share prices rocket but has also seen market share and revenue grow. That growth has not come from cloud services alone; in fact the majority is from strong sales of their "traditional" on-prem platforms, and they are convinced this growth has been driven by their embrace of cloud: a coherent strategy that looks to ensure your data is where you need it, when you need it, while maintaining all of the enterprise-class qualities you'd expect on-prem, whether the data is in your datacentre, near the cloud or in it.

Are NetApp a cloud company?

No. Are they changing? Most certainly.

Their data fabric message, honed over the last four years, is now mature not only in strategy but in execution, with NetApp platforms, driven by ONTAP as a common transport engine, providing the capability to move data between platforms, be they on-prem, near the cloud or in the public hyperscalers, while crucially maintaining the high quality of data services and management we are used to within our enterprise across all of those repositories.

This strategy is core to NetApp and their success, and it certainly resonates with the businesses I speak with as they become more data-focused than ever, driven by compliance, cost or the need to garner greater value from their data. Businesses do not want their data locked away in silos, nor do they want it at risk when they move it to new platforms to take advantage of new tools and services.

While NetApp are not a cloud company, during the two days it seemed clear to me that their embrace of cloud puts them in a unique position when it comes to providing data services. As businesses look to develop a modern data strategy, they would be, in my opinion, remiss not to at least understand NetApp's strategy and data fabric and the value that approach can bring, regardless of whether they ultimately use NetApp technology or not.

NetApp's changes over the last few years have been significant, their future vision is fascinating, and I for one look forward to seeing their continued development and success.

For more information on the recent spring announcements, you can review the following;

The NetApp official Press Release

Blog post by Chris Maki summarising the new features in ONTAP 9.4

The following NetApp blogs provide more detail on a number of individual announcements;

New Fabric Pool Capabilities

The new AFF A800 Platform

Google Cloud Platform Announcement

Latest NVMe announcements

Tech ONTAP Podcast – ONTAP 9.4 Overview

Building a modern data platform – Prevention (Office365)

In this series so far, we have looked at getting our initial foundations right, ensuring we have insight into and control of our data, and at the components I use to help achieve this. This time, however, we are looking at something many organisations are already using, which has a wide range of capabilities that can help manage and control data but which are often underutilised.

For ever-increasing numbers of us, Office 365 has become the primary data and communications repository. However, I often find organisations are unaware of the many powerful capabilities within their subscription that can greatly reduce the risk of a data breach.

Tucked away within Office 365 is the Security and Compliance Centre (protection.office.com), the gateway to several powerful features that should be part of your modern data strategy.

In this article we are going to focus on two such features, "Data Loss Prevention" and "Data Governance"; both offer powerful capabilities that can be deployed quickly across your organisation and can significantly mitigate the risk of a data breach.

Data Loss Prevention (DLP)

DLP is an important weapon in our data management arsenal. DLP policies are designed to ensure sensitive information does not leave our organisation in ways that it shouldn't, and Office 365 makes it straightforward to get started.

We can quickly create policies to apply across our organisation to help identify the types of data we hold. Several predefined options already exist, including ones that identify financial data, personally identifiable information (PII), social security numbers, health records and passport numbers, with templates for a number of countries and regions across the world.

Once the policies that identify our data types are created, we can apply rules governing how that data can be used; we can apply several rules and, depending on requirements, make them increasingly stringent.

The importance of DLP rules should not be underestimated. While it's important we understand who has access to and uses our data, too often we feel this is enough and don't take the next crucial step of controlling the use and movement of that data.

We shouldn’t forget that those with the right access to the right data, may accidentally or maliciously do the wrong thing with it!
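To picture the kind of content inspection that sits behind a DLP policy, the sketch below scans text for candidate credit card numbers and confirms them with the Luhn checksum before flagging the content. This is purely conceptual; it is not how Office 365 implements its sensitive information types, just a way to illustrate what "identifying the types of data we hold" means in practice.

```python
import re

# Conceptual sketch of DLP-style content inspection: find candidate card
# numbers with a regular expression, then confirm with the Luhn checksum to
# reduce false positives. Not the Office 365 implementation - just the idea.

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number):
    """Standard Luhn checksum used to validate payment card numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def contains_card_number(text):
    """True if the text contains at least one Luhn-valid card number."""
    return any(luhn_valid(m.group()) for m in CARD_PATTERN.finditer(text))

if __name__ == "__main__":
    doc = "Customer paid with card 4111 1111 1111 1111, order ref 10045."
    print("Sensitive content found:", contains_card_number(doc))  # True
```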

Data Governance

Governance should be a cornerstone of a modern data platform: it defines the way we use, manage, secure, classify and retain our data, and it impacts the cost of our data storage, its security and our ability to deliver compliance to our organisations.

Office 365 provides two key governance capabilities.

Labels

Labels allow us to apply classifications to our data so we can start to understand what is important and what isn't. We can highlight what is for public consumption, what is private, sensitive or commercial in confidence, or any other classification you use within your organisation.

Classification is a crucial part of delivering a successful data compliance capability, giving us granular control over exactly how we handle data of all types.

Labels can be applied automatically based on the contents of the data we have stored, applied by users as they create content, or applied in conjunction with the DLP rules we discussed earlier.

For example, a DLP policy can identify a document containing credit card details and then automatically apply a rule that labels it as sensitive information.
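A minimal sketch of that detect-then-label flow might look like the following. The detector functions and label names here are hypothetical examples, not Office 365 APIs; the point is simply that classification can be driven by content rather than left to chance.

```python
# Conceptual sketch of content-driven classification: run a set of detector
# rules over a document and return the first (most restrictive) label that
# matches. The detectors and label names are hypothetical examples.

RULES = [
    # (label, detector) - ordered most restrictive first
    ("Sensitive - Financial", lambda text: "credit card" in text.lower()),
    ("Internal Only",         lambda text: "commercial in confidence" in text.lower()),
    ("Public",                lambda text: True),  # fallback classification
]

def classify(text):
    """Return the label applied by the first matching detector."""
    for label, detector in RULES:
        if detector(text):
            return label
    return "Unclassified"

if __name__ == "__main__":
    print(classify("Invoice settled by credit card ending 1111"))    # Sensitive - Financial
    print(classify("Commercial in Confidence: Q3 product roadmap"))  # Internal Only
    print(classify("Press release for general distribution"))        # Public
```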

Retention

Once we have classified our data into what is important and what isn't, we can then, with retention policies, define what we keep and for how long.

These policies allow us to effectively manage and govern our information and, in turn, reduce the risk of litigation or security breach, by either retaining data for a period defined by a regulatory requirement or, importantly, permanently deleting old content that we're no longer required to keep.

The policies can be assigned automatically based on classifications or can be applied manually by a user as they generate new data.

For example, if a user creates a new document containing financial data which must be retained for 7 years, they can classify the data accordingly, ensuring that both our DLP and retention rules are applied as needed.
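To illustrate how label-driven retention can be reasoned about, here is a small sketch. The labels, retention periods and the disposition step are illustrative assumptions, not the Office 365 implementation.

```python
# Illustrative model of label-driven retention: each label carries a
# retention period, and expired items are queued for a disposition review
# rather than being deleted outright. Labels and periods are example values.

from datetime import date, timedelta

RETENTION_YEARS = {
    "Sensitive - Financial": 7,   # e.g. a regulatory requirement
    "Internal Only": 3,
    "Public": 1,
}

def needs_disposition_review(label, created, today=None):
    """Return True once an item has passed its retention period and should
    be queued for disposition review before deletion."""
    today = today or date.today()
    years = RETENTION_YEARS.get(label)
    if years is None:
        return False  # unlabelled data: keep until it has been classified
    return created + timedelta(days=365 * years) < today

if __name__ == "__main__":
    print(needs_disposition_review("Sensitive - Financial", date(2010, 5, 1)))  # True
    print(needs_disposition_review("Public", date.today()))                     # False
```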

Management

Alongside these capabilities, Office 365 provides us with two management tools: disposition and supervision.

Disposition is our holding pen for data to be deleted, so we can review any deletions before actioning them.

Supervision is a powerful capability allowing us to capture employee communications for examination by internal or external reviewers.

These tools are important in allowing us to show we have auditable processes and control within our platform and are taking the steps necessary to protect our data assets as we should.

Summary

The ability to govern and control our data wherever we hold it is a critical part of a modern data platform. If you use Office 365 and are not using these capabilities, you are missing out.

The importance of governance is only going to grow as ever more stringent data privacy and security regulations develop. Governance can greatly reduce many of the risks associated with a data breach, and services such as Office 365 have taken things that were traditionally difficult to achieve and made them a whole lot easier.

If you are building a modern data platform then compliance and governance should be at the heart of your strategy.

This is part 4 in a series of posts on building a modern data platform, the previous parts of the series can be found below.

  • Introduction
  • The Storage
  • Availability
  • Control