Storage Ferraris in the cloud for $20 an hour – Lee Jiles – Ep80

 

A couple of months ago I wrote an article about the importance of enterprise data services inside the public cloud (Building a modern data platform – exploiting the cloud) and why they are crucial to the IT strategies of organisations as they look to transition to the public cloud.

The idea of natively accessing data services that are commonplace in our datacentres, such as the ability to apply service levels to performance, storage efficiencies and other enterprise-level capabilities, from our cloud apps is very attractive.

In this week’s episode we take a look at one such solution. In the first of a series of shows recorded at some of the recent tech conferences I’ve visited, I’m joined by Lee Jiles, a Senior Manager in NetApp’s Cloud Business Division, at their Insight conference to discuss Azure NetApp Files, an enterprise data services solution available natively inside Microsoft’s Azure datacentres.

Azure NetApp Files is a very interesting technology and another example of the fascinating work NetApp’s cloud business unit is doing in extending enterprise data services to the locations we need them: on-prem, near to, and inside the public cloud.

I discuss with Lee what Azure NetApp Files is and why it was developed. We explore some of the challenges of public cloud storage and how it often forces us to abandon, as we move into the cloud, all of those good storage management practices we are used to on-prem.

We look at why the ability to deliver a “familiar” experience has great advantages when it comes to speed and agility and Lee explains to us why stripping away the complexity of cloud storage is like getting yourself a Ferrari for $20 an hour!

I ask Lee about the technical deployment of Azure NetApp Files and why it is different to solutions that are “near the cloud”. We also look at Microsoft’s view of the technology and the benefits they see in working with NetApp to deliver this service.

Lee also shares some of the planned developments as well as some of the initial use cases for the service. Finally, he explains how you can get access to the preview service, test out Azure NetApp Files for yourself and see if it can help meet some of your public cloud storage challenges.

For more details on the service, as well as where to sign up for preview access, you can visit the Azure Storage site here: https://azure.microsoft.com/en-gb/services/storage/netapp/
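Because Azure NetApp Files volumes are presented as standard NFS shares, trying the service from a Linux VM in Azure is essentially just a mount. The commands below are an illustrative sketch only: the IP address and export path are placeholders, and the real values come from the mount instructions shown against your provisioned volume in the Azure portal.

```shell
# Illustrative example: mounting an Azure NetApp Files NFS volume on Linux.
# 10.0.0.4:/myvolume is a placeholder; use the details from your volume's
# mount instructions in the Azure portal.
sudo mkdir -p /mnt/anf
sudo mount -t nfs -o rw,hard,vers=3,tcp 10.0.0.4:/myvolume /mnt/anf
df -h /mnt/anf   # confirm the volume is mounted and check capacity
```

From there the volume behaves like any other NFS mount, which is a large part of the “familiar experience” discussed in the episode.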

If you have other questions, you can contact Lee via email at lee.jiles@netapp.com.

Azure NetApp Files is a really interesting option for public cloud storage and well worth investigating.

I hope you enjoyed the show and as always, thanks for listening.


Veeam, heading in the right direction?

As the way we use data in our ever more fragmented, multi-cloud world continues to change, the way we manage, protect and secure our data is having to change with it. This need to change is mirrored by the leading data protection vendors who are starting to take new approaches to the challenge.

Around 18 months ago Veeam started shifting their and their customers’ focus by introducing their “Intelligent Data Management” methodology, highlighting the importance of visibility, orchestration and automation in meeting the modern demands of data protection.

Recently I was invited to the Veeam Vanguard summit in Prague to learn about the latest updates to their platforms. I was very interested to see how these updates would build upon this methodology and ensure Veeam remained well placed to tackle these new problems.

There was a huge amount covered, but I just wanted to highlight a couple of key strategic areas that caught my attention.

The initial challenge facing Veeam as they evolve is their “traditional” background. The innovative approach to protecting virtual workloads, upon which they have built their success, has to change, because protecting modern workloads is a very different challenge. We have seen Veeam, via a mix of innovation and acquisition, start to redesign and fill gaps in their platform to tackle these new challenges.

However, this has introduced a new problem, one of integrating these new developments into a cohesive platform.

Tying it together

Looking across many of the updates, it is clear Veeam also recognise the important role integration plays in delivering a platform that can protect and manage the lifecycle of data in a hybrid, multi-cloud environment.

A couple of technologies really highlighted moves in this direction. The addition of an external repository to their Availability for AWS component allows backups of native EC2 instances to be housed in an object store, outside of AWS and the native EC2 snapshot mechanism. On its own this is useful; however, when we add the upcoming Update 4 for Veeam Backup and Replication (B&R), we can see a smart strategic move.

Update 4 brings the ability for B&R to read and use the information held inside this object store, giving an on-prem B&R administrator the capability to browse the repository and recover data from it to any location.

Update 4 also includes a “cloud tier” extension to a backup repository: a remote S3/Azure Blob external tier into which aged backup data can be moved, enabling an effectively unlimited backup repository. With this, an organisation can take advantage of “cheap and deep” storage to retain data for the very long term, without needing to continually grow more expensive primary backup tiers. The integration is seamless and allows cloud storage to be incorporated, where appropriate, into a data protection strategy.
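The policy behind a cloud tier is essentially age-based: anything older than the operational restore window becomes a candidate for movement to object storage. The sketch below is illustrative only, not Veeam’s implementation or API; the function and file names are hypothetical, and it simply shows the selection logic such a tier applies.

```python
from datetime import datetime, timedelta

# Illustrative sketch only -- not Veeam's actual API. Models the age-based
# policy of a "cloud tier": backup files older than the operational restore
# window are selected for movement to cheaper object storage.

def select_for_tiering(backups, window_days, now=None):
    """Return the names of backups older than the operational window."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=window_days)
    return [name for name, created in backups.items() if created < cutoff]

now = datetime(2018, 11, 1)
backups = {
    "weekly-2018-07-01.vbk": datetime(2018, 7, 1),   # hypothetical file names
    "daily-2018-10-28.vib": datetime(2018, 10, 28),
}
aged = select_for_tiering(backups, window_days=30, now=now)
print(aged)  # → ['weekly-2018-07-01.vbk']
```

The point of the design is that only data outside the operational window moves off-site, so day-to-day restores stay on the fast primary backup tier.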

This is only the start; the potential to provide similar capabilities and integration with other public clouds and storage types is clearly there, and it would seem only a matter of time before the flexibility of the platform expands further.

Smart Protection thinking

While integration is crucial to Veeam’s strategy, more intelligence about how we use our protected data is equally crucial, particularly as the demand to ensure system availability continues to grow and puts pressure on our already strained IT resources.

Secure and staged restore both add intelligence to the data recovery process allowing for modifications to be made to a workload before placing it back into production.

Secure Restore

Allows a data set to be pre-scanned before being returned into production; think of this as part of an “anti-virus” strategy. Imagine, as you recover a set of data after a virus infection, being able to pre-scan the data and address any issues before you place it back into production. That is secure restore: a powerful, time-saving and risk-reducing step.
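The flow described above can be sketched in a few lines. This is illustrative only, not Veeam’s implementation: the scanner callback and file contents are hypothetical, and it just shows the scan-before-promote pattern that makes secure restore risk-reducing.

```python
# Illustrative sketch only -- not Veeam's implementation. Models the
# secure-restore flow: scan each recovered item and only promote clean
# items back into production, quarantining the rest.

def secure_restore(files, scanner, restore):
    """Restore only files the scanner reports as clean."""
    restored, quarantined = [], []
    for name, data in files.items():
        if scanner(data):              # scanner returns True if infected
            quarantined.append(name)
        else:
            restore(name, data)        # promote clean data to production
            restored.append(name)
    return restored, quarantined

# Hypothetical scanner: flags anything containing a test marker string.
def fake_scanner(data):
    return b"EICAR" in data

production = {}
restored, quarantined = secure_restore(
    {"report.docx": b"quarterly figures", "payload.exe": b"EICAR-test"},
    fake_scanner,
    lambda name, data: production.__setitem__(name, data),
)
print(restored, quarantined)  # → ['report.docx'] ['payload.exe']
```

The design choice worth noting is that the scan happens against the backup copy, so an infected item never touches the production environment at all.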

Staged Restore

An equally powerful capability, staged restore allows alterations to be made to a system before restoring it into production. The example given during the session was based on compliance: carrying out a check on data ahead of recovery to make sure that non-compliant data is removed before recovery. However, use cases such as patching would be equally valuable, with staged restore allowing a VM to be mounted and system updates applied ahead of it being placed back in production. Again simple, but very useful.

Both additions are excellent examples of smart strategic thinking on Veeam’s part, reducing the risks of recovering data and systems into a production environment.

How are they doing?

I went to Prague wanting to see how Veeam’s latest updates would help them and their customers meet the changing needs of data management, and the signs are positive. The increased integration between the on-prem platforms and the capabilities of the public cloud is starting to make a reality of the “Intelligent Data Management” strategy, and with Update 4 of Backup and Replication, Veeam can protect a VM on-prem or in the cloud and restore that VM to any location, giving you true workload portability.

Veeam’s Intelligent Data Management platform is by no means all in place; however, the direction of travel is certainly clear and, even now, you can see how elements of that strategy are deliverable today.

There was lots covered at the summit that built on much of the intelligence and automation discussed here. Veeam, in my opinion, remain a very smart player in the data protection space and, alongside some of the new and innovative entrants, continue to make data protection a fascinating and fast-moving part of the data market. That is useful, as availability and data protection are central to pretty much all of our long-term data strategies.