Managing multiple clouds – Joe Kinsella – Ep73

This show was recorded before the August 27th, 2018 announcement of CloudHealth Technologies’ acquisition by VMware.

This is the third in our series looking at the move to public cloud, the challenges involved and some of the tips and technologies that can help you overcome them. In this episode we look at perhaps the biggest challenge facing most organisations moving to public cloud: the issue of multi-cloud.

A few weeks ago I published a post about multi-cloud becoming the technology industry’s holy grail (Tech and the holy multi cloud grail), as the industry looks for ways to abstract away the complexity of multi-cloud environments and allow us to build solutions that encompass our infrastructure, be it on-prem, in a co-lo or with a public hyperscale provider. The benefits of multi-cloud deployments are many, and they will be a major part of our future use of cloud.

On this week’s show we look at the issues surrounding multi-cloud: in particular, how to manage it, maintain cost efficiency, and govern and secure our cloud-based assets. To discuss this I’m joined by Joe Kinsella, CTO and Founder of CloudHealth Tech, a company that has built a platform to pull together information from numerous environments, consolidate it in one place and allow you to make informed, proactive decisions to ensure you use your technology in the best way you can.

During the episode we explore some wide-ranging topics: why complexity is an issue, and how multi-cloud was initially “stumbled upon” but is now becoming a chosen strategy. We ask why we don’t expect cloud to be complex when much of what we do in our own datacentres is very complicated. Joe also confesses that 3-4 years ago he was predicting the death of the on-prem DC, and explains why he has re-evaluated that view, with hybrid becoming the deployment reality.

We also discuss the traits of a successful multi-cloud deployment, and why a cloud-first strategy isn’t about putting everything in the cloud, but about asking two questions: can we use cloud, and should we use cloud?

We wrap up discussing the CloudHealth Tech platform, what it does and how it helps to manage a multi-cloud environment by pulling together clouds, on-prem and automation platforms, connecting all the information to provide the business insights needed for proactive decision making. Finally, we look at the maturity of cloud management and how it needs to move beyond cost control and embrace security and governance as the evolution of multi-cloud management.

Joe gives some great insight and CloudHealth Technologies deliver a very powerful platform, so powerful that VMware saw fit to acquire them.

To find out more about CloudHealth Tech you can visit their website www.cloudhealthtech.com

Follow them on Twitter @cloudhealthtech

You can find out more from Joe on Twitter @joekinsella, on his CloudHealth Tech blog www.cloudhealthtech.com/blog and on his own blog hightechinthehub.com.

Enjoy and thanks for listening.


Building a modern data platform – exploiting the cloud

No modern data platform would be complete without public cloud. It can play a very important part in building a modern data platform and provide us with capabilities we couldn’t get any other way.

In this part of our series we look at the benefits of public cloud, the challenges of adoption, and how to overcome those challenges so we can embrace cloud as part of our platform.

Why is public cloud useful for our data?

If we look at the challenges normally associated with traditional approaches to data storage (scale, flexibility, data movement, commercials), it quickly becomes clear how cloud can be valuable.

While these challenges are common in traditional approaches, they are the areas where public cloud is strongest. It gives us almost infinite scale, a consumption model where we pay for what we need as we need it and, of course, flexibility: the ability to take our data and do interesting things with it once it’s in the public cloud. From analytics and AI to the more mundane backup and DR, flexibility is one of the most compelling reasons for considering public cloud at all.

While the benefits are clear, why are more organisations not falling over themselves to move to cloud?

What’s it lacking?

It’s not what public cloud can do, but what it doesn’t, that tends to stop organisations wholeheartedly embracing it when it comes to data assets.

As we’ve worked through the different areas of building a modern data platform, our approach to data has been about more than storage: it’s insight, protection, availability, security and privacy. These are things not normally associated with native cloud storage, and we don’t want our move to cloud to mean we lose all of those capabilities, or have to implement and learn a new set of tools to deliver them.

Of course there is also the “data gravity” problem: we can’t have our cloud-based data siloed away from the rest of our platform, it has to be part of it. We need to be able to move data into the cloud, out again and between cloud providers, all while retaining enterprise control and management.

So how do we overcome these challenges?

How do we make the cloud feel like the enterprise?

When it comes to modern data platforms, NetApp have developed into an ideal partner for helping to integrate public cloud storage. If we look back at part one of this series (Building a modern data platform – the storage), we discussed NetApp’s data services, which are built into their ONTAP operating system, making it the cornerstone of their data fabric strategy. What makes ONTAP that cornerstone is that, as a piece of software, it can be installed anywhere, which today also means public cloud.

Taking ONTAP and its data services into the cloud provides us with massive advantages: it allows us to deliver enterprise storage efficiencies and performance guarantees, and to use the enterprise tools we have made a key part of our platform with our cloud-based data as well.

NetApp have two ways to deploy ONTAP into public cloud. The first is Cloud Volumes ONTAP, a full ONTAP deployment on top of native cloud storage, providing all of the same enterprise data services we have on-prem, extending them into the cloud and integrating them seamlessly with our on-prem data stores.
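
Because Cloud Volumes ONTAP is full ONTAP, it can be driven with exactly the same interfaces we use on-prem. As a minimal sketch (assuming the ONTAP REST API available from 9.6 onwards; the address, credentials, SVM and aggregate names are placeholders), creating a volume on a Cloud Volumes ONTAP instance looks identical to creating one in the datacentre:

```python
# Minimal sketch: create a volume on a Cloud Volumes ONTAP instance via the
# ONTAP REST API (ONTAP 9.6+). Address, credentials and object names below
# are placeholders, not values from any real environment.
import requests

CVO_MGMT = "https://203.0.113.10"  # CVO cluster management address (placeholder)
AUTH = ("admin", "password")       # use real credentials / a secrets store

volume = {
    "name": "cloud_data_vol",
    "svm": {"name": "svm_cloud"},
    "aggregates": [{"name": "aggr1"}],
    "size": 100 * 1024**3,         # 100 GiB, specified in bytes
}

# The same endpoint and payload would work against an on-prem cluster,
# which is exactly the consistency being described above.
resp = requests.post(f"{CVO_MGMT}/api/storage/volumes",
                     json=volume, auth=AUTH, verify=False)
resp.raise_for_status()
```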

An alternative, and even more straightforward, approach is having ONTAP delivered as a native service, with no ONTAP deployment or experience necessary. You order your service, enter a size and performance characteristics, and away you go, with no concern at all for the underlying infrastructure, how it works or how it’s managed. In seconds you are provided with enterprise-class storage, with data protection, storage efficiencies and performance service levels previously unheard of in native cloud storage.
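
To illustrate just how little the consumer has to think about in that model, here is a purely hypothetical sketch. The endpoint, field names and service levels below are invented for illustration and are not the real Cloud Volumes Service API; the point is the shape of the request: a name, a size and a service level.

```python
# Hypothetical "storage as a native service" order. The endpoint and field
# names here are invented for illustration only; no ONTAP knowledge, no
# infrastructure detail, just what you need and how fast you need it.
import requests

order = {
    "name": "analytics_vol",
    "size_gib": 500,
    "service_level": "premium",  # hypothetical performance tier
    "protocol": "nfsv3",
}

resp = requests.post("https://cloud-volumes.example.com/v1/volumes",
                     json=order, auth=("api_key", "api_secret"))
print(resp.json())  # volume details, available in seconds rather than days
```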

It’s not a strategy without integration

While adding enterprise capabilities is great, the idea of a modern data platform relies on having our data in the location we need it, when we need it, while maintaining management and control. This is where NetApp’s technology provides real advantage. Using ONTAP as a consistent endpoint provides the platform for integration, allowing us to take the same tools, policies and procedures at the core of our data platform and extend them to our data in the public cloud.

NetApp’s SnapMirror provides the data movement engine, letting us simply move data into, out of and between clouds. Replicating data in this way means that while our on-prem version can be the authoritative copy, it doesn’t have to be the only one. Replicating a copy of our data to a location for a one-off task, then destroying that copy once the task completes, is a powerful capability and an important element of simplifying the extension of our platform into the cloud.
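
As a minimal sketch of what that looks like in practice (again assuming the ONTAP 9.6+ REST API, with the clusters and SVMs already peered; addresses, credentials and volume paths are placeholders), a SnapMirror relationship from an on-prem volume to a cloud copy is created and initialised from the destination side:

```python
# Minimal sketch: replicate an on-prem volume to a cloud ONTAP volume with
# SnapMirror via the ONTAP REST API (9.6+). Address, credentials and paths
# are placeholders.
import requests

DEST_MGMT = "https://203.0.113.10"  # destination (cloud) cluster management address
AUTH = ("admin", "password")

# Create the relationship: the on-prem volume stays the authoritative copy,
# the cloud volume is the replica we can use for one-off tasks.
relationship = {
    "source": {"path": "svm_onprem:data_vol"},
    "destination": {"path": "svm_cloud:data_vol_dr"},
}
requests.post(f"{DEST_MGMT}/api/snapmirror/relationships",
              json=relationship, auth=AUTH, verify=False).raise_for_status()

# Look up the new relationship and trigger the baseline transfer by setting
# its state. (Creation is asynchronous; a real script would poll the
# returned job before this step.)
rels = requests.get(f"{DEST_MGMT}/api/snapmirror/relationships",
                    params={"destination.path": "svm_cloud:data_vol_dr"},
                    auth=AUTH, verify=False).json()
uuid = rels["records"][0]["uuid"]
requests.patch(f"{DEST_MGMT}/api/snapmirror/relationships/{uuid}",
               json={"state": "snapmirrored"}, auth=AUTH, verify=False)
```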

Summary

Throughout this series we have asked the question “do we have to use technology X to deliver this service?”. The reality is, of course, no, but NetApp are a key element of our modern data platform because of this cloud integration capability. The option to provide consistent data services across multiple locations is extremely powerful, allowing us to take advantage of cloud while maintaining our enterprise controls.

While I’ve not seen any other data services provider come close to what NetApp are doing in this space, the important thing in your design strategy, if it is to include public cloud, is to ensure you have appropriate access to data services, integration, management and control. It’s crucial that you don’t put data at risk or diminish the capabilities of your data platform by using cloud.

This is part 6 in a series of posts on building a modern data platform, you can find the introduction and other parts of this series here.

Assessing the risk in public cloud – Darron Gibbard – Ep72

As the desire to integrate public cloud into our organisations’ IT continues to grow, the need to maintain control and security of our key assets is a challenge, but one we need to overcome if we are going to use cloud as a fundamental part of our future IT infrastructure.

The importance of security and reducing our vulnerabilities is not, of course, unique to public cloud; it’s a key part of any organisation’s IT and data strategy. However, the move to public cloud does introduce some different challenges, with many of our services and data now sitting well outside the protective walls of our datacentre. If our risks and vulnerabilities go unidentified and unmanaged, we are open to potentially major and wide-reaching security breaches.

This week’s Tech Interviews is the second in our series looking at what organisations need to consider as they make the move to public cloud. In this episode we focus on risk: how to assess it, how to gain visibility into our systems regardless of location, and how to mitigate the risks our modern infrastructure may come across.

To help discuss the topic of risk management in the cloud, I’m joined by Darron Gibbard. Darron is the Managing Director for EMEA North and Chief Technology Security Officer at Qualys; with 25 years’ experience in the enterprise security, risk and compliance industry, he is well placed to discuss the challenges of public cloud.

In this episode we look at the vulnerabilities that a move to cloud can create as our data and services are no longer the preserve of the datacentre. We discuss whether the cloud is as high a risk as we may be led to believe, and why a lack of visibility of risks and threats is more of a problem than any inherent risk in a cloud platform.

Darron shares some insight into building a risk-based approach to using cloud and how to assess risk, and explains why understanding the impact of a vulnerability is at least as useful as working out the likelihood of a cloud-based “event”, if not more so.

We wrap up with a discussion around Qualys’s 5 principles of security and their approach to transparent orchestration, ensuring that all the additional information we can gather is used effectively.

The challenges around vulnerability and risk management when we move to public cloud shouldn’t be ignored, but it was refreshing to hear Darron present a balanced view, arguing that the cloud, managed correctly, is no riskier than any enterprise environment.

Qualys are an interesting company with a great portfolio of tools, including a number that are free to use and can help companies of all sizes reduce their risk exposure both on-prem and in the cloud. To find out more about Qualys you can visit www.qualys.com.

You can also contact Darron by email dgibbard@qualys.com or connect with him on LinkedIn.

Thanks for listening.

For the first show in this series, check out – Optimising the public cloud – Andrew Hillier – Ep71

Optimising the public cloud – Andrew Hillier – Ep71


The move to public cloud is nothing new; many companies have moved, or attempted to move, key workloads into the big hyperscale providers (AWS, Azure, Google and IBM), but for some it has been a mixed success.

Some things, of course, move easily, especially if your initial forays into cloud are via software-as-a-service (SaaS) platforms such as Microsoft Office 365 and Salesforce, but if you’ve looked to move more customised or traditional workloads, this presents a whole new set of challenges.

We have probably all heard of cloud projects (or maybe even had projects) that have not gone to plan. This can be for a range of reasons: cost, technical difficulties, performance; the list is long. But at the heart of many of those projects is the presumption that cloud is both cheap and easy. It comes as quite the shock when we discover it isn’t!

However, things may be about to change, as a new wave of technology companies is emerging to address the highly complex world of public cloud platforms. These companies are looking to abstract some of the complexity away from the enterprise solutions architect and provide tools that assist in decision making and design, using a mixture of analytics, intelligence and human interaction to address the complexity of moving to the cloud.

This week is the first in a few shows where we look at the complexity of using public cloud and chat with some of the technology companies who are trying to address these challenges, taking fresh approaches to the problem and aiming to make the cloud experience better, both technically and commercially.

In this first show I’m joined by Andrew Hillier, co-founder and CTO at Densify. Densify have taken a fascinating approach to the problem, built on Andrew’s long and strong analytics background.

Densify uses a robust analytics platform to build a full understanding of the workloads that have moved to the cloud, develop a performance profile, and then automatically modify those applications to take full advantage of the cloud platform they are running on, ensuring they are optimised for the right services and the right commercial cost models.

A particularly distinctive part of their platform is the Densify advisor, which takes this analytics model and pairs it with a human being who works closely with the customer, taking them through what the analytics platform has discovered and ensuring they understand any optimisation approach and its impact.

If that sounds interesting then dive in, as we discuss a wide range of topics, including why public cloud is complicated, why it should never be about the money alone, the limitations of first-generation approaches to optimisation, and how one of the biggest reasons cloud projects fail is that people buy the wrong cloud stuff!

Andrew provides some valuable insights and shares what is a pretty smart approach to the problem.

If you want to understand more about Densify you can visit densify.com

Find them on Twitter @densify

Or on Instagram densify_cloud

Thanks for listening