The fast and the intelligent – Glenn Dekhayser & John Woodall – Ep83

It’s that most wonderful time of the year: yes, the last Tech Interviews of 2018. For this show, we focus on two of technology’s biggest trends, ultra-high performance and the world of artificial intelligence and machine learning.

At the core of these technology trends is data. Businesses of all types are now realising that there is value to be gained from using their data to drive change, be that delivering new services, increasing operational efficiency, or finding new ways to make money from the things that they do.

As organisations realise the value of their data it inevitably leads to the question, “how can we use technology to help extract that value and information we need?”

That’s the focus of this episode. These are the last two chats from my recent tech conference tour, both of which focus on technology that can start to give organisations of all types the ability to use their data more effectively to drive business change.

First, we look at ultra-high performance as I chat with Glenn Dekhayser, Field CTO at Red8, about NetApp’s new technology MAX Data and how it is changing the very nature of delivering extreme performance.

Glenn provides some insight into what MAX Data is and how it drives the storage industry close to its ultimate goal, to ensure that storage performance is never the bottleneck of any solution. We also discuss not only how this solution delivers extreme performance but how it does so while greatly simplifying your infrastructure.

Importantly, Glenn also shares how this solution offers unparalleled performance at a very low cost, not only commercially but also by removing the need to refactor or modernise existing applications.

We wrap up taking a look at some of the use cases for such a high-performance solution.

In the second part of the show, it’s the technology industry’s favourite topics, artificial intelligence (AI) and machine learning (ML). John Woodall, VP of engineering at Integrated Archive Systems (IAS), joins me to talk about the industry in general and how NetApp is the latest tech provider, in association with NVIDIA, to offer businesses the chance to buy an “off the shelf” AI solution.

We talk about how the technology industry is finally starting to deliver something that technologists and scientists have spoken about for decades, the ability to build true AI and make it available in our daily lives. Beyond this NetApp solution (known as ONTAP AI) we look at the wider use of AI and ML. John shares some thoughts on why businesses are looking at these technologies and what they can deliver, but he also cautions on the importance of understanding the questions you want this kind of technology to answer before you get started.

We discuss the importance of not only knowing those questions but also ensuring we have the skills to know how to ask them. We also discuss why you may want to build an AI solution yourself as opposed to using the plethora of cloud services available to you.

We wrap up by looking at why it’s important to have AI platforms that allow you to start small. We also explore some of the use cases, operational efficiency, increasing margins or finding new ways to monetise your business expertise, but most importantly the need to focus on business outcomes and not the technology.

There is no doubt that AI, ML and the desire for extreme performance are key parts of many technology strategies. It’s clear, from both these developments from NetApp and the trends in the wider technology industry, that ways of meeting these business needs are becoming more widely available and, importantly, affordable for an increasingly wide range of businesses.

To find out more about these technologies you can visit NetApp’s website for MAX Data and ONTAP AI.

If you want to get in touch with or find out more about Glenn and John, you can find them both on Twitter @gdekhayser and @John_Woodall.

This is the last show of 2018, so I would just like to thank you all for listening to Tech Interviews throughout the year and I hope you’ll be able to join me for more shows in 2019.

That leaves me to just wish you all a great Christmas holiday and until next year, thanks for listening.


Intelligent, secure, automated, your data platform future

I was recently part of a workshop event where we discussed building “your future data platform”. During the session I presented a roadmap of how a future data platform could look. The basis of the presentation, which looked at the relatively near future, was how developments from “data” vendors are allowing us to rethink the way we manage the data we hold in our organisations.

What’s driving the need for change?

Why do we need to change the way we manage data? The reality is that the world of technology is changing extremely quickly, and at the heart of it is our desire for data: creating it, storing it, analysing it and learning from it, while increasingly expecting data to help drive business outcomes, shape strategy and improve customer experience.

Alongside this need to use our data more are other challenges, from increasing regulation to ever more complex security risks (see the recent Marriott Hotels breach of 500 million customer records), which are making further, unprecedented demands on our technology platforms.

Why aren’t current approaches meeting the demand?

What’s wrong with what we are currently doing? Why aren’t current approaches helping us to meet the demands and challenges of modern data usage?

As the demands on our data grow, the reality for many is we have architected platforms that have never considered many of these issues.

Let’s consider what happens when we place data onto our current platform.

We take our data, which may or may not be confidential (often we don’t know), and place it into our data repository. Once it’s there, how many of us know:

  • Where it is?
  • Who owns it?
  • What does it contain?
  • Who is accessing it?
  • What’s happening to it?

In most cases, we don’t, and this presents a range of challenges from management and security to reducing our ability to compete with those who are effectively using their data to innovate and gain an advantage.

What if that platform could instead recognise the data as it was deposited? It would make sure the data landed in the right secure area, with individual file securities that keep it secure regardless of its location; if necessary, it would protect the file immediately (not when the next protection job ran); and it would track the use of that data from creation to deletion.

That would be useful, wouldn’t it?
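
To make that concrete, here’s a minimal sketch (in Python) of what “recognise the data as it is deposited” could look like in principle. The classification rule, policy names and catalogue structure are all illustrative assumptions on my part, not any particular vendor’s implementation.

```python
import hashlib
import json
import re
from datetime import datetime, timezone
from pathlib import Path

# Illustrative classification rule only: anything that looks like it holds
# personal data gets the stricter policy. A real platform would use far
# richer classifiers than a single regex.
PII_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # crude SSN-style match


def ingest(path: Path, owner: str, catalogue: dict) -> dict:
    """Classify, protect and catalogue a file the moment it arrives."""
    content = path.read_text(errors="ignore")
    confidential = bool(PII_PATTERN.search(content))

    record = {
        "file": str(path),
        "owner": owner,
        "classification": "confidential" if confidential else "general",
        # Hypothetical policy names standing in for "protect immediately"
        # versus "wait for the next scheduled protection job".
        "policy": "encrypt-and-replicate-now" if confidential else "standard-backup",
        "checksum": hashlib.sha256(content.encode()).hexdigest(),
        "created": datetime.now(timezone.utc).isoformat(),
        "access_log": [],  # appended to on every read: creation-to-deletion visibility
    }
    catalogue[record["checksum"]] = record
    return record


if __name__ == "__main__":
    sample = Path("customer-export.csv")
    sample.write_text("id,name,ssn\n1,Ann,123-45-6789\n")
    print(json.dumps(ingest(sample, "finance-team", catalogue={}), indent=2))
```

The specific checks don’t matter; the point is that classification, a protection policy and an audit trail are attached at the moment of ingest, rather than bolted on later.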

What do we need our modern platform to be?

As the amount of data and the ways we want to use it continue to evolve, our traditional approaches will not be able to meet the demands placed upon them, and we certainly cannot expect human intervention to cope: the data sets are too big, the security threats too wide-reaching and the compliance requirements ever more stringent.

However, that’s the challenge we need our future data platforms to meet: they need to be, by design, secure, intelligent and automated. The only way we are going to deliver this is with technology augmenting our efforts in education, process and policy, to ensure we use our data and get the very best from it.

That technology needs to deliver this secure, intelligent and automated environment from the second it starts to ingest data. It needs to understand what we have and how it should be used, and importantly it shouldn’t just be reactive, it has to be proactive: the minute new data is written it applies intelligence, ensuring we immediately secure the data, store and protect it accordingly, and can fully understand its use throughout its lifecycle.

Beyond this, we also need to make sure that what we architect is truly a platform, something that acts as a solid foundation for how we want to use our data. Once we have our data organised, secure and protected, our platform needs to be able to move it to the places we need it, allowing us to take advantage of new cloud services, data analytics tools, machine learning engines or whatever may be around the corner, while we continue to maintain control and retain insight into its use regardless of where it resides.

These are key elements of our future data platform and ones we are going to need to consider to ensure that our data can meet the demands of our organisations to make better decisions and provide better services, driven by better use of data.

How do we do it?

Of course, the question is, can this be done today and if it can how?

The good news is that much of what we need is already available or coming very soon, which means that, realistically, within the next 6-18 months, if you have the desire, you can develop a strategy and build a more secure, intelligent and automated way of managing your data.

I’ve shared some thoughts here on why we need to modernise our platforms and what we need from them. In the next post I’ll share a practical example of how you can build this kind of platform using tools that are available today or coming very shortly, to show that a future data platform is closer than you may think.

Veeam Virtual Goodness

I’ve written before of the value I get from attending events; there’s always something to learn, be it strategic insight, technical information, or just a chance to meet someone new via a chat about a common topic. All of these things have value, sometimes big, sometimes small.

However, there are so many events that in reality we can’t attend them all; time, cost and the demands of the day job all make it impossible to attend everything you may want to. That’s why the ability to join these events virtually has become such a valuable option, be it via live-streamed keynotes, catching up on demand or, increasingly, via a virtual conference.

I’m not sure that many people are aware of virtual conferences, but I’ve attended a couple in the past and they have worked really well, so I was pleased to see Veeam delivering one such event with VeeamON Virtual on December 5th.

The protection and management of our data is, of course, a key topic for pretty much everybody. Ensuring it’s available, secure, protected and managed is a crucial element in the strategic planning of most of the CIOs I speak with; whether driven by security concerns, regulation or future plans to extract value from data via analytics, ensuring our datasets are in a fit and healthy state is very important.

Why does this make the Veeam event interesting? The world of data protection is changing rapidly, as rapidly as the demands we are putting on our data, and keeping up with the changes that industry leaders like Veeam are making, both technically and strategically, should be a core part of our education.

Making this kind of information accessible is very helpful, and these virtual conferences are a great way of providing that access. What, then, can we expect from VeeamON Virtual that would encourage you to invest a proportion of your day in this conference?

The event has something to offer everyone, with three distinct tracks providing differing levels of information. The strategic track includes sessions looking at 2019 industry trends and updates on Veeam’s Intelligent Data Management strategy, presented by their leadership team.

The technical track offers some great content, including a session covering the key elements of Backup and Replication Update 4 as well as a look at one of my favourite Veeam tools, Availability Orchestrator, a tool designed to help fully automate the complexities of a DR strategy, including testing and documentation.

Of course, no technical discussion is complete without looking at the impact of cloud; the cloud track explores Veeam’s capabilities around protection of AWS workloads as well as an update on their backup for Office 365 product.

As with most conferences their value can often be found outside of the main presentations, and a virtual conference is no different, with an online chat community, where all attendees can chat in virtual “lounges” with Veeam staff, industry experts and of course other Veeam users, to share ideas, ask questions and maybe develop a new friendship!

The amount of interesting innovation in the data management/protection space makes it a fascinating part of the technology industry and Veeam are certainly one of its leading innovators.

The way we use, and want to use, our data puts huge demands on the data strategies we have in place to meet our business needs, so staying aware of industry changes and trends has to be a high priority for any strategic IT decision maker or IT pro. If you can hear from one of the industry’s leading innovators from the comfort of your own chair via a virtual conference, then it probably makes sense to do so.

If you’re responsible for ensuring your data management strategy continues to evolve to meet the ever-changing demands placed upon your data, then book December 5th in your diary, check out the sessions that catch your attention (pro tip: you don’t have to do them all!), settle down to hear from a wide range of industry leaders on both strategic and technical topics, and understand how the world of data management is changing and how it may affect you.

You can find out more about the event, its speakers and agenda right here


Veeam, heading in the right direction?

As the way we use data in our ever more fragmented, multi-cloud world continues to change, the way we manage, protect and secure our data is having to change with it. This need to change is mirrored by the leading data protection vendors who are starting to take new approaches to the challenge.

Around 18 months ago Veeam started shifting their own and their customers’ focus by introducing their “Intelligent Data Management” methodology, highlighting the importance of visibility, orchestration and automation in meeting the modern demands of data protection.

Recently I was invited to the Veeam Vanguard Summit in Prague to learn about the latest updates to their platforms. I was very interested to see how these updates would build upon this methodology and ensure Veeam remained well placed to tackle these new problems.

There was a huge amount covered but I just wanted to highlight a couple of key strategic areas that caught my attention.

The initial challenge facing Veeam as they evolve is their “traditional” background. The innovative approach to protecting virtual workloads, upon which they built their success, has to change, because protecting modern workloads is a very different challenge, and we have seen Veeam, via a mix of innovation and acquisition, start to redesign and fill gaps in their platform to tackle these new challenges.

However, this has introduced a new problem, one of integrating these new developments into a cohesive platform.

Tying it together

Looking across many of the updates it is clear Veeam also recognise the importance integration plays in delivering a platform that can protect and manage the lifecycle of data in a hybrid, multi-cloud environment.

A couple of technologies really highlighted moves in this direction. The addition of an external repository to their Availability for AWS component allows backups of native EC2 instances to be housed in an object store, outside of AWS and separate from native EC2 snapshots. On its own this is useful; however, when we add the upcoming Update 4 for Veeam Backup and Replication (B&R), we can see a smart strategic move.

Update 4 brings the ability for B&R to read and use the information held inside this object store, giving an on-prem B&R administrator the capability to browse the repository and recover data from it to any location.

Update 4 also includes a “cloud tier” extension to the backup repository: a remote S3/Azure Blob tier into which aged backup data can be moved, enabling an effectively unlimited backup repository. With this, an organisation can take advantage of “cheap and deep” storage to retain data for the very long term without needing to continually grow more expensive primary backup tiers. The integration is seamless and allows cloud storage, where appropriate, to become part of a data protection strategy.
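
To illustrate the general idea (and only the idea; this is not how Veeam’s cloud tier is implemented under the hood), a rough Python sketch of tiering aged backup files from a primary repository into S3 object storage might look like this. The paths, bucket name and age threshold are hypothetical.

```python
import time
from pathlib import Path

import boto3  # AWS SDK for Python; credentials assumed to be configured already

BACKUP_DIR = Path("/backups/primary")    # hypothetical local backup repository
BUCKET = "example-backup-cloud-tier"     # hypothetical S3 bucket
AGE_THRESHOLD_DAYS = 30                  # tier out anything older than this

s3 = boto3.client("s3")
cutoff = time.time() - AGE_THRESHOLD_DAYS * 86_400

for backup_file in sorted(BACKUP_DIR.glob("*.vbk")):
    if backup_file.stat().st_mtime < cutoff:
        # Copy the aged backup into cheap object storage, then free the more
        # expensive primary tier. A real product keeps metadata locally so the
        # tiered data remains browsable and restorable.
        s3.upload_file(str(backup_file), BUCKET, backup_file.name)
        backup_file.unlink()
        print(f"tiered {backup_file.name} to s3://{BUCKET}/")
```

The economics are the point: the primary repository only holds recent restore points, while long-term retention sits in “cheap and deep” object storage.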

This is only the start; the potential to provide similar capabilities and integration with other public clouds and storage types is clearly there, and it seems only a matter of time before the flexibility of the platform expands further.

Smart Protection thinking

While integration is crucial to Veeam’s strategy, more intelligence about how we can use our protected data is equally crucial, particularly as the demands to ensure system availability continue to grow and put pressure on our already strained IT resources.

Secure and staged restore both add intelligence to the data recovery process allowing for modifications to be made to a workload before placing it back into production.

Secure Restore

Secure restore allows a data set to be pre-scanned before being returned to production; think of this as part of an “anti-virus” strategy. Imagine being able, as you recover a set of data after a virus infection, to pre-scan that data and address any issues before you place it back into production. That is secure restore: a powerful, time-saving and risk-reducing step.
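
As a rough illustration of the concept (not Veeam’s actual mechanism), a pre-restore scan gate could be as simple as the sketch below, which assumes the restore point has already been mounted at a hypothetical path and that ClamAV is installed.

```python
import subprocess
import sys

# Hypothetical mount point where the restore candidate's file system is exposed
# for inspection; in a real product the mounting is handled for you.
RESTORE_MOUNT = "/mnt/restore-candidate"


def is_clean(mount_point: str) -> bool:
    """Scan the mounted restore point with ClamAV before promoting it.

    clamscan exits with 0 when nothing is found and 1 when an infection is
    detected, so the exit code alone is enough to gate the restore.
    """
    result = subprocess.run(["clamscan", "--recursive", "--infected", mount_point])
    return result.returncode == 0


if __name__ == "__main__":
    if is_clean(RESTORE_MOUNT):
        print("Scan clean: safe to promote this restore into production.")
    else:
        print("Infection found: blocking the restore.")
        sys.exit(1)
```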

Staged Restore

An equally powerful capability, allowing alterations to be made to a system before it is restored into production. The example given during the session was based on compliance: carrying out a check on data ahead of recovery to make sure that non-compliant data is removed before the restore completes. However, use cases such as patching would be equally useful, with staged restore allowing a VM to be mounted and system updates applied before it is placed back into production. Again simple, but very useful.
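
Again, purely to illustrate the shape of the workflow (the mount point, file pattern and compliance rule here are my own assumptions), a staged restore hook might sweep the mounted copy for non-compliant content before the restore proceeds:

```python
import re
from pathlib import Path

# Hypothetical mount point for the staged copy and a deliberately crude
# "non-compliant data" rule; real policies would be far more precise.
RESTORE_MOUNT = Path("/mnt/restore-candidate")
CARD_NUMBER = re.compile(r"\b(?:\d[ -]?){13,16}\b")  # rough payment-card pattern


def stage(mount_point: Path) -> int:
    """Remove files containing non-compliant data from the staged copy."""
    removed = 0
    for exported in mount_point.rglob("*.csv"):
        if CARD_NUMBER.search(exported.read_text(errors="ignore")):
            exported.unlink()  # drop the offending file before the restore completes
            removed += 1
    return removed


if __name__ == "__main__":
    print(f"Staging complete: {stage(RESTORE_MOUNT)} non-compliant file(s) removed.")
```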

Both additions are excellent examples of smart strategic thinking on Veeam’s part, reducing the risks of recovering data and systems into a production environment.

How are they doing?

I went to Prague wanting to see how Veeam’s latest updates would help them and their customers meet the changing needs of data management, and the signs are positive. The increased integration between the on-prem platforms and the capabilities of the public cloud is starting to make a reality of the “Intelligent Data Management” strategy, and with Update 4 of Backup and Replication, Veeam can protect a VM on-prem or in the cloud and restore that VM to any location, giving you true workload portability.

Veeam’s Intelligent Data Management platform is by no means all in place, however, the direction of travel is certainly clear and, even now, you can see how elements of that strategy are deliverable today.

There was lots covered at the summit that built on much of the intelligence and automation discussed here. Veeam, in my opinion, remain a very smart player in the data protection space and, alongside some of the new and innovative entrants, continue to make data protection a fascinating and fast-moving part of the data market. That matters, because availability and data protection are central to pretty much all of our long-term data strategies.

Want to know more?

Interested in finding out more about Veeam? Then there’s a great opportunity coming up on December 5th with the VeeamON Virtual event, where you can hear the very latest from Veeam, with both strategic and technical tracks for you to log in and watch. The event will give you a lot more detail on everything covered in this blog and a whole lot more.

You can find out more about the event and register here https://go.veeam.com/veeamon-virtual

If you want to find out for yourself if Veeam is on track, this is a great way to do it.

NetApp’s Future, do they matter?

A couple of weeks ago I was at a technology event speaking with some of the attendees when the subject of NetApp was raised, accompanied by the question “Are NetApp still relevant?” I was taken aback by this, particularly as over the last few years I felt NetApp had done a great job of repositioning themselves and changing the view of them as a “traditional” storage company.

However, this message had clearly not reached everyone, and it made me consider “Does NetApp’s vision really deal with challenges that are relevant to the modern enterprise?” and “Have they done enough to shake the traditional storage vendor label?”.

I’m writing this blog at 33,000 ft above the United States, heading home from NetApp’s Insight conference. Reflecting on the three days in Las Vegas, I wondered: did what I heard answer those questions, and would it keep NetApp relevant for a long time to come?

#DataDriven

The modern tech conference loves a hashtag, one that attempts to capture the theme of the event and #DataDriven was Insight 2018’s entry to the conference hashtag dictionary.

But what does Data Driven actually mean?

Data plays a significant role in driving modern business outcomes, and the way we handle, store and extract information from it is a keen focus for many of us. That is clearly also the case for NetApp.

Throughout Insight, NetApp stated clearly that their vision for the future is to be a data company, not a storage one: a subtle but crucial difference. No longer are speeds and feeds (while still important) the thing that drives their decision making; it is data that is at the heart of NetApp’s strategy, a crucial shift that matches how the majority of NetApp’s customers think.

Data Fabric 2.0

NetApp’s Data Fabric has been at the centre of their thinking for the last four years. Insight, however, presented a fundamental shift in how they see the future of the data fabric, starting with making it clear that it is not “NetApp’s Data Fabric” but “your data fabric”.

A fabric shouldn’t be “owned” by a storage vendor; it is ours to build to meet our own needs. This shift is also driving how NetApp see the future delivery of a data fabric: no longer something that needs building, but “Data Fabric as a Service”, a cloud-powered set of tools and services that enable your strategy. This is a 180° turn, making it no longer an on-prem infrastructure that integrates cloud services, but a cloud service that integrates and orchestrates all of your data endpoints regardless of location.

The demonstration of this vision was extremely impressive, and the future data fabric was clear in its direction: the fabric is yours, to be consumed as you need it, helping us deliver services and data as and when we need to, quickly, efficiently and at scale.

The awkward HCI Conversation

Perhaps the most immediate beneficiary of this shift is NetApp’s Hyper Converged Infrastructure (HCI) platform. NetApp are by no means early to this market, and in some quarters there is debate as to whether NetApp HCI is a hyper-converged platform at all. I’ll admit that, while the industry definition of HCI doesn’t really bother me, as technology decisions should be about outcomes not arbitrary definitions, I did have reservations about the long-term future of NetApp’s HCI platform.

However, what NetApp showed as part of their future Data Fabric vision was a redefinition of how they see HCI, redefined to the extent that NetApp’s HCI is no longer Hyper Converged Infrastructure but Hybrid Cloud Infrastructure.

What does this mean?

It’s about bringing the cloud “experience” into your datacentre, but this is much more than building a “private cloud”; it is about HCI becoming a fully integrated part of a cloud-enabled data strategy, allowing organisations to deploy services and move them simply from public cloud to on-prem and back again, making HCI just an endpoint, a location from which your cloud services can be delivered.

Ultimately HCI shouldn’t be about hardware or software, but outcomes and NetApp’s aim is to allow this technology to speed up your ability to drive those outcomes, regardless of location.

In my mind, this transformed the platform from one whose long-term value I struggled to see into something with the potential to become a critical component in delivering modern services to organisations of all types.

Summary

Did what I heard address the questions raised to me? Would it convince a wider audience that NetApp remain relevant? For that we will have to wait and see.

However, in my opinion NetApp presented a forward-thinking, relevant strategy which, if executed properly, marks a fundamental shift in the way they are developing as a company and will ensure they remain relevant to organisations by solving real and complex business challenges.

I’m very interested to see how this new vision for Data Fabric evolves and if they can execute the vision presented so impressively at Insight, they may finally shed that “traditional” NetApp label and become the data authority company that they are aiming to be.

You can get further details on the announcements from Insight by visiting the NetApp Insight site, where you will find a wide range of videos including the two general session keynotes.

If you want to find out more about NetApp’s vision for yourself, then it’s not too late to register to attend NetApp’s Insight EMEA conference in Barcelona; details are here.