The fast and the intelligent – Glenn Dekhayser & John Woodall – Ep83

It’s that most wonderful time of the year: yes, the last Tech Interviews of 2018. For this show, we focus on two of technology’s biggest trends, ultra-high performance and the world of artificial intelligence and machine learning.

At the core of these technology trends is data. Businesses of all types are now realising that there is value to be gained from being able to use their data to drive change, be that delivering new services, increasing operational efficiency, or finding new ways to make money from the things that they do.

As organisations realise the value of their data it inevitably leads to the question, “how can we use technology to help extract that value and information we need?”

That’s the focus of this episode. It features the last two chats from my recent tech conference tour, both of which focus on technology that can give organisations of all types the ability to use their data more effectively to drive business change.

First, we look at ultra-high performance as I chat with Glenn Dekhayser, Field CTO at Red8, about NetApp’s new technology MAX Data and how it is changing the very nature of delivering extreme performance.

Glenn provides some insight into what MAX Data is and how it moves the storage industry closer to its ultimate goal: ensuring that storage performance is never the bottleneck in any solution. We also discuss not only how this solution delivers extreme performance but how it does so while greatly simplifying your infrastructure.

Importantly, Glenn also shares how this solution not only offers unparalleled performance but does so at a very low cost, not only commercially, but also by negating the need to refactor or modernise existing applications.

We wrap up by taking a look at some of the use cases for such a high-performance solution.

In the second part of the show, it’s the technology industry’s favourite topics: artificial intelligence (AI) and machine learning (ML). John Woodall, VP of Engineering at Integrated Archive Systems (IAS), joins me to talk about the industry in general and how NetApp, in association with NVIDIA, is the latest tech provider to offer businesses the chance to buy an “off the shelf” AI solution.

We talk about how the technology industry is finally starting to deliver something that technologists and scientists have spoken about for decades: the ability to build true AI and make it available in our daily lives. Beyond this NetApp solution (known as ONTAP AI), we look at the wider use of AI and ML. John shares some thoughts on why businesses are looking at these technologies and what they can deliver, but he also cautions on the importance of understanding the questions you want this kind of technology to answer before you get started.

We discuss the importance of not only knowing those questions but also ensuring we have the skills to know how to ask them. We also discuss why you may want to build an AI solution yourself as opposed to using the plethora of cloud services available to you.

We wrap up by looking at why it’s important to have AI platforms that allow you to start small, and we also explore some of the use cases: operational efficiency, increasing margins, or finding new ways to monetise your business expertise, while most importantly focusing on business outcomes and not the technology.

There is no doubt that AI, ML and the desire for extreme performance are key parts of many technology strategies. It’s clear, from both these developments from NetApp and the trends in the wider technology industry, that ways of meeting these business desires are becoming more widely available and, importantly, affordable for an increasingly wide range of businesses.

To find out more about these technologies you can visit NetApp’s website for MAX Data and ONTAP AI.

If you want to get in touch with, or find out more about, Glenn and John, you can find them both on Twitter: @gdekhayser and @John_Woodall.

This is the last show of 2018, so I would just like to thank you all for listening to Tech Interviews throughout the year and I hope you’ll be able to join me for more shows in 2019.

That leaves me to just wish you all a great Christmas holiday and until next year, thanks for listening.


Why stay in the data industry? – Greg Knieriemen – Ep81

The data industry is a really interesting place right now. For many organisations, their data, and the challenges of how they use it, secure it and derive value from it, is right at the top of the CIO’s priority list. However, data is not the only interesting area of the technology industry: automation, AI, machine learning, IoT, new development platforms and, of course, the fascinating world of the hyperscalers all compete for attention.

So, when you are an experienced technologist, well known in the industry and you are presented with a range of new opportunities, what is it that attracts you back to work in the data industry?

That’s the question I put to this week’s guest, experienced tech industry “veteran” Greg Knieriemen. Greg has just ended a highly successful stint at Hitachi and was presented with a range of interesting opportunities; however, it was one of those established storage vendors, NetApp, that appealed the most to him, but why? What could the data industry continue to offer someone with an already wide experience of it?

When we recorded Greg was only a couple of months into his new role as Chief Technologist, so we start by exploring why he chose to stay a part of the data industry and what about NetApp, in particular, attracted him. Greg shares how he realised that NetApp is not just a storage company but one looking to solve data problems.

We explore the reality of digital transformation and why it can’t be technology-led, though we do look at why technology companies can play a part. Greg also shares his enthusiasm (more than once!) for NetApp’s new solution, MAX Data.

We discuss the world of multi-cloud as the natural evolution for companies and how it is likely to be a new reality. We also discuss why this multi-cloud world presents a range of new challenges, especially when it comes to security and privacy.

Greg also shares some thoughts on the reality of technology adoption and that there is never one way to solve a problem with tech!

We finish up by looking at what’s exciting Greg about his new role, why he thinks NetApp is better placed than most to tackle the complex data challenges of the modern business, how the best way to judge a company’s success is to look at its proof points, and why you should never believe a technology evangelist!

I really enjoyed meeting up with Greg at NetApp Insight, and I certainly enjoyed our chat and Greg’s take on the industry. I hope you enjoy listening.

If you want to find out more from Greg, you can find him on Twitter @Knieriemen as well as on LinkedIn.

Until next time, thanks for listening.

Storage Ferraris in the cloud for $20 an hour – Lee Jiles – Ep80


A couple of months ago I wrote an article about the importance of enterprise data services inside of the public cloud (Building a modern data platform – exploiting the cloud) and why they are crucial to IT strategies of organisations as they look to transition to the public cloud.

The idea of natively being able to apply data services that are commonplace in our datacentres, such as service levels for performance, storage efficiencies and other enterprise-level capabilities, to our cloud apps is very attractive.

In this week’s episode we take a look at one such solution. In the first of a series of shows recorded at some of the recent tech conferences I’ve visited, I’m joined by Lee Jiles, a Senior Manager from NetApp’s Cloud Business Division, at their Insight conference, to discuss Azure NetApp Files, an enterprise data services solution available natively inside Microsoft’s Azure datacentres.

Azure NetApp Files is a very interesting technology and another example of the fascinating work NetApp’s cloud business unit is doing in extending enterprise data services to the locations we need them: on-prem, near to, and inside the public cloud.

I discuss with Lee what Azure NetApp Files is and why it was developed. We explore some of the challenges of public cloud storage and how it often means all of those good storage management practices you are used to on-prem have to be abandoned as you move into the cloud.

We look at why the ability to deliver a “familiar” experience has great advantages when it comes to speed and agility and Lee explains to us why stripping away the complexity of cloud storage is like getting yourself a Ferrari for $20 an hour!

I ask Lee about the technical deployment of Azure NetApp Files and why it is different to solutions that are “near the cloud”. We also look at Microsoft’s view of the technology and the benefits they see in working with NetApp to deliver this service.

Lee also shares some of the planned developments, as well as some of the initial use cases for the service. Finally, he explains how you can get access to the preview service and test out Azure NetApp Files for yourself, to see if it can help meet some of your public cloud storage challenges.

For more details on the service, as well as where to sign up to access the preview, you can visit the Azure Storage site here: https://azure.microsoft.com/en-gb/services/storage/netapp/

If you have other questions then you can contact Lee, via email at lee.jiles@netapp.com.

Azure NetApp Files is a really interesting option for public cloud storage and well worth investigating.

I hope you enjoyed the show and as always, thanks for listening.

NetApp’s Future, do they matter?

A couple of weeks ago I was at a technology event speaking with some of the attendees when the subject of NetApp was raised, accompanied by the question “Are NetApp still relevant?” I was taken aback by this, particularly as, over the last few years, I felt NetApp had done a great job of repositioning themselves and changing the view of them as a “traditional” storage company.

However, this message had clearly not reached everyone, and it made me consider: “Does NetApp’s vision really deal with challenges that are relevant to the modern enterprise?” and “Have they done enough to shake the traditional storage vendor label?”

I’m writing this blog 33,000 ft above the United States, heading home from NetApp’s Insight conference. Reflecting on the three days in Las Vegas, I wondered: did what I heard answer those questions, and would it keep NetApp relevant for a long time to come?

#DataDriven

The modern tech conference loves a hashtag, one that attempts to capture the theme of the event and #DataDriven was Insight 2018’s entry to the conference hashtag dictionary.

But what does Data Driven actually mean?

Data plays a significant role in driving modern business outcomes, and the way we handle, store and extract information from it is a keen focus for many of us. This is clearly the same for NetApp.

Throughout Insight, NetApp stated clearly that their vision for the future is to be a data company, not a storage one, a subtle but crucial difference. No longer are speeds and feeds (while still important) the things that drive their decision making; it is data that is at the heart of NetApp’s strategy, a crucial shift that matches how the majority of NetApp’s customers think.

Data Fabric 2.0

Over the last four years, NetApp’s data fabric has been at the centre of their thinking. Insight, however, presented a fundamental shift in how they see the future of the data fabric, starting with making it clear that it is not “NetApp’s Data Fabric” but “your data fabric”.

A fabric shouldn’t be “owned” by a storage vendor; it is ours to build to meet our own needs. This shift is also driving how NetApp see the future delivery of a data fabric: no longer something that needs building, but “Data Fabric as a Service”, a cloud-powered set of tools and services that enable your strategy. This is a 180° turn in approach, making it no longer an on-prem infrastructure that integrates cloud services, but a cloud service that integrates and orchestrates all of your data endpoints, regardless of location.

The demonstration of this vision was extremely impressive, and the future data fabric was clear in its direction: the fabric is yours, to be consumed as you need it, helping us to deliver services and data as and when we need them, quickly, efficiently and at scale.

The awkward HCI Conversation

Perhaps the most immediate beneficiary of this shift is NetApp’s Hyper Converged Infrastructure (HCI) platform. NetApp are by no means early in this market, and in some quarters there is debate as to whether NetApp HCI is a hyper-converged platform at all. I’ll admit that, while the industry definition of HCI doesn’t really bother me, as technology decisions should be about outcomes not arbitrary definitions, I do have reservations about the long-term future of NetApp’s HCI platform.

However, what NetApp showed as part of their future Data Fabric vision was a redefinition of how they see HCI, redefined to the extent that NetApp’s view of HCI is no longer hyper converged but Hybrid Cloud Infrastructure.

What does this mean?

It’s about bringing the cloud “experience” into your datacentre, but this is much more than building a “private cloud”; it is about HCI becoming a fully integrated part of a cloud-enabled data strategy, allowing organisations to deploy services and move them simply from public cloud to on-prem and back again, making HCI just an endpoint, a location from which your cloud services can be delivered.

Ultimately, HCI shouldn’t be about hardware or software, but outcomes, and NetApp’s aim is for this technology to speed up your ability to drive those outcomes, regardless of location.

This transformed the platform, in my mind, from one whose long-term value I struggled to see into something that has the potential to become a critical component in delivering modern services to organisations of all types.

Summary

Did what I heard address the questions raised to me? Would it convince a wider audience that NetApp remain relevant? For that, we will have to wait and see.

However, in my opinion, NetApp presented a forward-thinking, relevant strategy that, if executed properly, will fundamentally shift the way they are developing as a company and will ensure they remain relevant to organisations by solving real and complex business challenges.

I’m very interested to see how this new vision for the Data Fabric evolves, and if they can execute the vision presented so impressively at Insight, they may finally shed that “traditional” storage vendor label and become the data authority company that they are aiming to be.

You can get further details on the announcements from Insight by visiting the NetApp Insight site, where you will find a wide range of videos, including the two general session keynotes.

If you want to find out more about NetApp’s vision for yourself, then it’s not too late to register to attend NetApp’s Insight EMEA conference in Barcelona; details are here.

Taking a grown-up look at cloud – Matt Watts – Ep77

Cloud is not new; I don’t think that’s news to anyone. Many of us have deployed a cloud solution, be it a SaaS platform, some public cloud infrastructure or some VMs for test and dev, and cloud continues to play a major part in IT strategy for an ever-increasing number of businesses.

However, this move to cloud has not come without us learning an awful lot along the way. We’ve probably all heard of, or maybe even been involved with, cloud deployments that have not gone as expected: the technology hasn’t given us what we want, the commercials didn’t stand up to our calculations, or it just wouldn’t work the way our on-premises platform did. Many of the issues that have led to those poor cloud experiences have been driven by an “immaturity” in our approach, often too quick to dictate a cloud-first strategy regardless of whether cloud is, in reality, the way to go.

Is our approach to cloud beginning to change? Do we need to consider our cloud strategy a little differently?

That’s the question we ask on this week’s podcast, an episode inspired by a fantastic article written by my guest Matt Watts, Director of Technology and Strategy, EMEA at NetApp. In the article, Matt posed the question “Are you Cloud First! or Cloud First?” and explored the difference a bit of punctuation can make; you can read the article here.

I thought the topic he covered in the article and the question he raised were worthy of further investigation, and that’s what we do on this week’s show.

During the show we discuss the article in depth. We start out by looking at what drove Matt to write it and at the importance of understanding the difference between a strategy and a mandate. We also look at examples of mistakes people originally made that have meant we’ve needed to start changing our approach.

We talk about the issues created by taking on-prem solutions and “dumping” them “as is” into the cloud without asking “is there any value in doing this?”, and how this drives bad practice in cloud adoption. We also coin the phrase “there is no zealot like a technology zealot!”.

We also explore the idea that cloud adoption isn’t about cost savings, so if it’s not that, why do we want to adopt cloud?

We wrap up by looking at examples of building a more mature cloud strategy and where this has worked well; Matt shares some examples of how NetApp’s own cloud maturity has driven their internal decision making. Matt’s final thought is that, without an appropriate and mature cloud strategy, you run the risk of building yourself a whole new set of silos and limitations.

Matt, as always, shares some fascinating insight into cloud strategy. To find out more from Matt, you can check out his other blogs on his watts-innovating site. You can also follow Matt on Twitter @mtjwatts.

Next week we get an update on the innovations and developments in VMware Cloud on AWS, until then, thanks for listening.