The fast and the intelligent – Glenn Dekhayser & John Woodall – Ep83

It’s that most wonderful time of the year, yes, the last Tech Interviews of 2018. For this show, we focus on two of technology’s biggest trends: ultra-high performance and the world of artificial intelligence and machine learning.

At the core of these technology trends is data. Businesses of all types are now realising that there is value to be gained from using their data to drive change, be that delivering new services, increasing operational efficiency, or finding new ways to make money from the things that they do.

As organisations realise the value of their data it inevitably leads to the question, “how can we use technology to help extract that value and information we need?”

That’s the focus of this episode, which features the last two chats from my recent tech conference tour, both focused on technology that can start to deliver, to organisations of all types, the ability to use their data more effectively to drive business change.

First, we look at ultra-high performance as I chat with Glenn Dekhayser, Field CTO at Red8, about NetApp’s new technology MAX Data and how it is changing the very nature of delivering extreme performance.

Glenn provides some insight into what MAX Data is and how it drives the storage industry closer to its ultimate goal: to ensure that storage performance is never the bottleneck of any solution. We also discuss not only how this solution delivers extreme performance but how it does so while greatly simplifying your infrastructure.

Importantly, Glenn also shares how this solution delivers unparalleled performance at a very low cost, not only commercially but also by removing the need to refactor or modernise existing applications.

We wrap up taking a look at some of the use cases for such a high-performance solution.

In the second part of the show, it’s the technology industry’s favourite topics, artificial intelligence (AI) and machine learning (ML). John Woodall, VP of engineering at Integrated Archive Systems (IAS), joins me to talk about the industry in general and how NetApp is the latest tech provider, in association with NVIDIA, to offer businesses the chance to buy an “off the shelf” AI solution.

We talk about how the technology industry is finally starting to deliver something that technologists and scientists have spoken about for decades: the ability to build true AI and make it available in our daily lives. Beyond this NetApp solution (known as ONTAP AI), we look at the wider use of AI and ML. John shares some thoughts on why businesses are looking at these technologies and what they can deliver, but he also cautions on the importance of understanding the questions you want this kind of technology to answer before you get started.

We discuss the importance of not only knowing those questions but also ensuring we have the skills to know how to ask them. We also discuss why you may want to build an AI solution yourself as opposed to using the plethora of cloud services available to you.

We wrap up by looking at why it’s important to have AI platforms that allow you to start small. We also explore some of the use cases, from improving operational efficiency and increasing margins to finding new ways to monetise your business expertise, but most importantly the need to focus on business outcomes and not the technology.

There is no doubt that AI, ML and the desire for extreme performance are key parts of many technology strategies. It’s clear from both these developments from NetApp and the trends in the wider technology industry that ways of meeting these business demands are becoming more widely available and, importantly, affordable for an increasingly wide range of businesses.

To find out more about these technologies you can visit NetApp’s website for MAX Data and ONTAP AI.

If you want to get in touch with or find out more about Glenn and John, you can find them both on Twitter @gdekhayser and @John_Woodall.

This is the last show of 2018, so I would just like to thank you all for listening to Tech Interviews throughout the year and I hope you’ll be able to join me for more shows in 2019.

It just leaves me to wish you all a great Christmas holiday, and until next year, thanks for listening.

Photo by Jameel Hassan on Pexels.com

Intelligent, secure, automated, your data platform future

I was recently part of a workshop event where we discussed building “your future data platform”. During the session I presented a roadmap of what a future data platform could look like. The basis of the presentation, which looked at the relatively near future, was how developments from “data” vendors are allowing us to rethink the way we manage the data we hold in our organisations.

What’s driving the need for change?

Why do we need to change the way we manage data? The reality is that the world of technology is changing extremely quickly, and at the heart of that change is our appetite for data: creating it, storing it, analysing it and learning from it, while increasingly demanding that it helps drive business outcomes, shape strategies and improve customer experience.

Alongside this need to use our data more are other challenges, from increasing regulation to ever more complex security risks (see the recent Marriott Hotels breach of 500 million customer records), which are placing further, unprecedented demands on our technology platforms.

Why aren’t current approaches meeting the demand?

What’s wrong with what we are currently doing? Why aren’t current approaches helping us to meet the demands and challenges of modern data usage?

As the demands on our data grow, the reality for many of us is that we have architected platforms that never considered these issues.

Let’s consider what happens when we place data onto our current platform.

We take our data, which could be confidential or not (often we don’t know), and place it into our data repository. Once it’s there, how many of us know:

  • Where it is?
  • Who owns it?
  • What does it contain?
  • Who is accessing it?
  • What’s happening to it?

In most cases, we don’t, and this presents a range of challenges, from management and security through to a reduced ability to compete with those who are using their data effectively to innovate and gain an advantage.

What if that platform could instead recognise the data as it was deposited? It would make sure the data landed in the right secure area, with individual file permissions that keep it secure regardless of its location; where necessary it would protect the file immediately (not when the next protection job ran); and it would track the use of that data from creation to deletion.

That would be useful, wouldn’t it?
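To make that idea a little more concrete, here is a minimal, hypothetical sketch of ingest-time intelligence, written in plain Python using only the standard library. The classification rules, file names and use of POSIX permissions are all illustrative assumptions, not a description of any particular product; a real platform would rely on a far richer policy and classification engine, but the shape of the workflow is the same: classify on arrival, secure immediately, and record everything so the questions above always have an answer.

```python
import hashlib
import json
import re
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical rules: anything that looks like an email address or a
# card-like number marks the file as confidential. A real platform would
# use a proper classification engine and a policy catalogue.
SENSITIVE_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),  # email addresses
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),   # payment-card-like numbers
]

AUDIT_LOG = Path("ingest_audit.jsonl")  # illustrative audit trail location


def classify(path: Path) -> str:
    """Label a file 'confidential' if its content matches a sensitive pattern."""
    text = path.read_text(errors="ignore")
    if any(pattern.search(text) for pattern in SENSITIVE_PATTERNS):
        return "confidential"
    return "general"


def ingest(path: Path, owner: str) -> dict:
    """Classify, secure and record a newly deposited file."""
    label = classify(path)

    if label == "confidential":
        # Lock the file down to its owner the moment it arrives, rather than
        # waiting for a scheduled job (POSIX permissions used as an example).
        path.chmod(0o600)

    record = {
        "file": str(path.resolve()),
        "owner": owner,
        "classification": label,
        "sha256": hashlib.sha256(path.read_bytes()).hexdigest(),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

    # Append an audit record so "where is it, who owns it, what does it
    # contain and what is happening to it" always has an answer.
    with AUDIT_LOG.open("a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")

    return record


if __name__ == "__main__":
    # Tiny self-contained demo: create a sample file and ingest it.
    sample = Path("example_upload.txt")
    sample.write_text("Invoice contact: jane.doe@example.com")
    print(ingest(sample, owner="finance-team"))
```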

What do we need our modern platform to be?

As the amount of data and the ways we want to use it continue to evolve, our traditional approaches will not be able to meet the demands placed upon them, and we certainly cannot expect human intervention to cope: the data sets are too big, the security threats too wide-reaching and the compliance requirements ever more stringent.

However, that’s the challenge we need our future data platforms to meet: they need to be secure, intelligent and automated by design. The only way we are going to deliver this is with technology augmenting our efforts in education, process and policy, ensuring we use our data well and get the very best from it.

That technology needs to deliver this secure, intelligent and automated environment from the second it starts to ingest data. It needs to understand what we have and how it should be used, and importantly it shouldn’t just be reactive, it has to be proactive: the minute new data is written it should apply intelligence, immediately securing the data, storing and protecting it accordingly, and giving us a full understanding of its use throughout its lifecycle.

Beyond this, we also need to make sure that what we architect is truly a platform, something that acts as a solid foundation for how we want to use our data. Once our data is organised, secured and protected, the platform must let us move it to the places we need it, allowing us to take advantage of new cloud services, data analytics tools, machine learning engines or whatever may be around the corner, while we continue to maintain control and retain insight into its use regardless of where it resides.

These are key elements of our future data platform, and ones we are going to need to consider if our data is to help our organisations make better decisions and provide better services, driven by better use of data.

How do we do it?

Of course, the question is, can this be done today and, if so, how?

The good news is that much of what we need is already available or coming very soon, which means that, realistically, within the next 6-18 months, if you have the desire, you can develop a strategy and build a more secure, intelligent and automated way of managing your data.

I’ve shared some thoughts here on why we need to modernise our platforms and what we need from them. In the next post I’ll share a practical example of how you can build this kind of platform using tools that are available today or coming very shortly, to show that a future data platform is closer than you may think.

Automate all of the things – Jason Benedicic – Ep82

The technology industry is changing as quickly as ever, with new ways of working, new technologies to absorb and new challenges presented almost daily to our IT teams. How are we supposed to keep up? One way is to automate, to take repetitive tasks and find ways to deliver them via code and automation tools.

If you spend any time listening to vendors in the technology market, you won’t go very far before you hear about how they are embracing automation, ensuring their APIs are there for all to use and sharing their expertise in community repositories like GitHub.

But what does it all really mean? What is automation good for and what can it help you achieve? Does it offer more than just a way to simplify repetitive tasks? These are all questions I’ve had for a while, and on this week’s podcast I catch up with Jason Benedicic, an independent consultant who, amongst other things, specialises in automation, to help answer some of these basic questions about automation and why it’s useful.

In this week’s show, Jason provides some fantastic insights into the world of automation, we start by discussing what automation is and what in your business makes a good automation candidate. We talk about why automation is more than just coding tasks and how it introduces flexibility and reduces the number of times a task is “passed” around a business.

We look at how automation is not only key to innovation but also delivers uniformity, a crucial part in managing and securing a modern infrastructure.

Jason shares his experience of how to identify a process that makes a good automation candidate, explaining that a good candidate doesn’t have to be some new cloud architecture or innovation and that many of our traditional IT tasks make equally good candidates.

We wrap up with Jason providing some great advice on where to start with automation: focus on what you know, use the community and, whatever you do, don’t reinvent the wheel.

Jason provides some great insight into the world of automation and where to get started. I hope you enjoy it and that it helps you to start your automation journey.

To find out more from Jason you can reach him on Twitter at @jabenedicic and find his blog at www.thedatacentrebrit.co.uk

Jason also mentioned some resources you may want to try:

NetApp Pub

VMware {code}

Pluralsight

Codecademy

As always, thanks for listening.

Veeam Virtual Goodness

I’ve written before about the value I get from attending events; there’s always something to learn, be it strategic insight, technical information, or just a chance to meet someone new through a chat about a common topic. All of these things have value, sometimes big, sometimes small.

However, there are so many events that in reality we can’t attend them all; time, cost and the demands of the day job make it impossible to attend everything you may want to. That’s why the ability to join these events virtually has become an increasingly valuable option, be it via live-streamed keynotes, on-demand catch-ups or full virtual conferences.

I’m not sure that many people are aware of virtual conferences, but I’ve attended a couple in the past and they have worked really well, so I was pleased to see Veeam also delivering one such event with VeeamON Virtual on December 5th.

The protection and management of our data is, of course, a key topic for pretty much everybody. Ensuring it’s available, secure, protected and managed is a crucial element in the strategic planning of most of the CIOs I speak with; whether driven by security concerns, regulation or future plans to extract value from data via analytics, ensuring our datasets are in a fit, healthy state is very important.

Why does this make this Veeam event interesting? The world of data protection is changing rapidly, as rapidly as the demands we are placing on our data, and keeping up with the changes that industry leaders like Veeam are making, both technically and strategically, should be a core part of our education.

Making this kind of information accessible is very helpful and these virtual conferences are a great way of providing that access. What, then, can we expect from VeeamON Virtual that would encourage you to invest a proportion of your day in this conference?

The event has something to offer everyone, with three distinct tracks providing different levels of information. The strategic track includes sessions looking at 2019 industry trends and updates on Veeam’s Intelligent Data Management strategy, presented by their leadership team.

The technical track offers some great content, including a session covering the key elements of Backup and Replication Update 4 as well as a look at one of my favourite Veeam tools, Availability Orchestrator, a tool designed to help fully automate the complexities of a DR strategy, including testing and documentation.

Of course, no technical discussion is complete without looking at the impact of cloud. The cloud track explores Veeam’s capabilities around protecting AWS workloads, as well as an update on their Backup for Office 365 product.

As with most conferences, much of the value can be found outside the main presentations, and a virtual conference is no different. There is an online chat community where attendees can chat in virtual “lounges” with Veeam staff, industry experts and, of course, other Veeam users, to share ideas, ask questions and maybe strike up a new friendship!

The amount of interesting innovation in the data management/protection space makes it a fascinating part of the technology industry and Veeam are certainly one of its leading innovators.

The way we use, and want to use, our data puts huge demands on the data strategies we have in place to meet our business needs, so ensuring that we are aware of industry changes and trends has to be a high priority for any strategic IT decision maker or IT pro. If you can hear from one of the industry’s leading innovators from the comfort of your own chair via a virtual conference, then it probably makes sense to do it.

If you’re responsible for ensuring your data management strategy continues to evolve to meet the ever-changing demands placed upon your data, then book December the 5th in your diary, check out the sessions that catch your attention (pro tip: you don’t have to do them all!), settle down to hear from a wide range of industry leaders on both strategic and technical topics, and learn how the world of data management is changing and how it may affect you.

You can find out more about the event, its speakers and agenda right here


Why stay in the data industry? – Greg Knieriemen – Ep81

The data industry is a really interesting place right now; for many organisations, their data and the challenges of how they use it, secure it and derive value from it are right at the top of the CIO’s priority list. However, data is not the only interesting area of the technology industry; there is also automation, AI, machine learning, IoT, new development platforms and, of course, the fascinating world of the hyperscalers.

So, when you are an experienced technologist, well known in the industry and you are presented with a range of new opportunities, what is it that attracts you back to work in the data industry?

That’s the question I put to this week’s guest, experienced tech industry “veteran” Greg Knieriemen. Greg has just ended a highly successful stint at Hitachi and was presented with a range of interesting opportunities; however, it was one of those established storage vendors, NetApp, that appealed the most to him, but why? What could the data industry continue to offer someone with an already wide experience of it?

When we recorded Greg was only a couple of months into his new role as Chief Technologist, so we start by exploring why he chose to stay a part of the data industry and what about NetApp, in particular, attracted him. Greg shares how he realised that NetApp is not just a storage company but one looking to solve data problems.

We explore the reality of digital transformation and why it can’t be technology led, although we do look at why technology companies can play a part. Greg also shares his enthusiasm (more than once!) for NetApp’s new solution, MAX Data.

We discuss the world of multi-cloud as the natural evolution for companies and how it is likely to be a new reality. We also discuss why this multi-cloud world presents a range of new challenges, especially when it comes to security and privacy.

Greg also shares some thoughts on the reality of technology adoption and that there is never one way to solve a problem with tech!

We finish up by looking at what excites Greg about his new role, why he thinks NetApp is better placed than most to tackle the complex data challenges of the modern business, how the best way to judge a company’s success is to look at its proof points, and why you should never believe a technology evangelist!

I really enjoyed meeting up with Greg at NetApp Insight and certainly enjoyed our chat here and Greg’s take on the industry. I hope you enjoy listening.

If you want to find out more from Greg you can find him on Twitter @Knieriemen as well as on LinkedIn.

Until next time, thanks for listening.