Hyper Converged Now and Next – Troy Mangum – Ep49

The IT industry is full of new trends; some you get, some you don’t. One such trend, which until recently I didn’t really get, was hyper-converged: a new market with a message of simplification, dominated initially by new technology players like Nutanix and SimpliVity (now part of HPE). They have been pretty successful, so why haven’t I got on board?

A good test for any new technology is: does it solve a problem or improve the way I currently do things? Up to now, with Hyper-Converged Infrastructure (HCI), I’m not sure it really does. Is it helping me build a more automated, flexible, agile IT infrastructure? Is it helping me build a hybrid environment? Is it automating my IT environment so that my business gets the agility it wants? I’m not sure.

What HCI does do well is simplify your hardware infrastructure: it takes something that may have filled a full rack and squeezes it down into 2 or 4U in a single chassis, with compute and storage integrated together, and a scaling model that lets you attach another HCI box to scale your compute and storage again.

But is that enough? When I’ve worked with organisations considering HCI, the cost of this model tends to be in line with (if not more expensive than) buying the individual components and installing them yourself, and unless those organisations have been looking to refresh compute and storage at the same time, the value has been hard to find.

So what’s changed my view? The starting point has nothing to do with changes to the HCI hardware model or the addition of some great new feature; it’s actually, and perhaps not surprisingly, driven by software. Look at what Microsoft and VMware are doing, for example: VMware is delivering an increasingly software-defined infrastructure with every incremental release of its virtualisation stack.

Microsoft’s Azure Stack, although currently limited, aims to bring a fully software-defined, Azure-like experience onto your local hardware. And of course solutions from both of these companies are increasingly hybrid focussed, with VMware on AWS and Azure itself integrating tightly with these on-prem stacks.

This simplification of the software stack is now starting to drive the need for a hardware stack that matches it and can take advantage of these software-defined infrastructure solutions.

It is this changing environment that is the focus of this latest podcast.

At the recent NetApp Insight conference, I met with Troy Mangum, who shared some research he’s been working on reviewing the HCI market: how it stands today and the changes HCI vendors need to make to build on the early success of first-generation solutions and deliver a platform that meets the needs of the modern data centre and takes advantage of these software-defined infrastructure stacks.

We explore a range of discussion points from the research. We look at the drivers behind the adoption of HCI, including the need for simplification and easier consumption of IT resources, and we discuss how the current technical design of HCI hardware architectures may limit their ability to grow in the way we need them to.

Troy shares how today’s HCI comes with a risk of introducing infrastructure silos into our data centres, focussed on solving individual problems rather than providing the flexibility the modern data centre needs. We also explore the phenomenon of the “HCI tax”, what it is and why it’s a problem.

Finally, we take a look at the future and how architectural changes are driving a new breed of HCI, a second generation that allows a more flexible deployment model, decoupling the component parts so HCI can scale capacity and compute separately. That raises the question: is this new breed of HCI really HCI at all, and does it really matter? And of course, we look at NetApp’s entry into this market.

To find out more on this topic and what NetApp are doing, you can find lots of HCI information on NetApp’s website here.

You can also find out more from Troy by following him on Twitter @cloudreveler

Next week we look at very large data repositories, as I’m joined by returning guest Justin Parisi to discuss the concept of FlexGroups.

To make sure you catch that show, you can subscribe to Tech Interviews on iTunes, Stitcher, SoundCloud and all good homes of podcasts.

Thanks for listening.


The heart of the data fabric – Justin Parisi – Ep48

In a number of blog posts, as well as previous episodes of Tech Interviews, I’ve discussed the importance of building a data strategy: one that will allow you to architect and deploy a platform to tackle modern data management challenges.

The term “Data Fabric” is an increasingly common way of describing such a strategy. I first heard it three years ago at NetApp’s annual technical conference, Insight, as they introduced their ideas for a strategy that would start to move them from a storage company to a data management company.

This shift is also in line with what I see in the many businesses I work with: the move from just storing data to using it to become more data-focussed, data-driven organisations.

When NetApp first discussed this three years ago, they were a very different company, accused of living in the past: a traditional storage dinosaur with no place in a modern world where new storage companies and the ever-growing influence of cloud would destroy a company focussed on selling hardware and, of course, their operating system, Data ONTAP.

But NetApp have changed. Today they are moving headlong into a data management future, focussed on allowing their customers to build infrastructures that store data in the most appropriate location at the right time, and to easily move, manage, secure and protect that data, regardless of whether it’s on-prem, in a virtual appliance or in the cloud.

Surely then, as NetApp continue to change, their beloved ONTAP operating system can’t still play a key part in building a data fabric.

Nothing could be further from the truth, and that is the focus of this episode, as I’m joined by Justin Parisi, Senior Technical Marketing Engineer at NetApp and the host of NetApp’s Tech ONTAP podcast.

In this episode, we explore why ONTAP is anything but a legacy piece of technology and how it is not only still relevant but right at the core of NetApp’s data fabric future.

We look at the fact that ONTAP is a piece of software, and although it was tied to hardware initially, that reliance has gone, allowing ONTAP to be a versatile platform that can be installed on specific hardware, on your own hardware, or on no hardware at all, running as a service within a public hyperscale cloud.

We discuss how ONTAP is not so much about storage as about data services, such as security, protection, efficiency and performance.

This ability to deploy ONTAP anywhere also means that not only can we easily transfer our data between locations, but our policies and procedures can move with it.

We wrap up by looking at some of the features in the latest version of ONTAP and how continuous improvement ensures ONTAP remains at the heart of NetApp’s data fabric strategy and can play a part in yours.

To find out more about ONTAP, visit NetApp’s website.

You can follow Justin on Twitter @NFSDudeAbides

And hear the excellent Tech ONTAP podcast right here – https://soundcloud.com/techontap_podcast

Next week we look at the development of the hyper-converged market, where it is today and how it needs to change, as I discuss some interesting HCI research with Troy Mangum.

To catch that episode, why not subscribe? You’ll find Tech Interviews in all the usual podcast places.

Thanks for listening.

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three

One of the main components of any tech conference is the keynote sessions. These are the sessions that share the vision and set the context for the show, and a good keynote is a vital part of creating the right atmosphere for those attending.

What I wanted to do with these special shows was to try to capture some of the immediate reaction from those attending the events and the keynote presentations that come with them.

Our first set of keynote reviews comes from NetApp Insight 2017 in Berlin, bringing the very latest from the data management field.

As we came toward the end of the conference, day three provided us with the final general sessions, including a fascinating insight into rocket science as Adam Steltzner, part of the Mars rover landing team, shared the part data played in their work.

Joining me in this final review from Insight are Jon Woan (@jonwoan) and Mick Kehoe (@mickehoe), providing their views on this session. As it was the final day, they also share their thoughts on what they’d heard throughout the conference, how it met their expectations and whether NetApp were covering the kind of things they felt were relevant.

Enjoy this last review from NetApp Insight and look out for upcoming reviews from other tech conferences in the future, as well as new episodes of Tech Interviews.

Don’t miss the round-ups from days one and two; you’ll find them here:

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two


Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two

One of the main components of any tech conference is the keynote sessions. These are the sessions that share the vision and set the context for the show, and a good keynote is a vital part of creating the right atmosphere for those attending.

What I wanted to do with these special shows was to try to capture some of the immediate reaction from those attending the events and the keynote presentations that come with them.

Our first set of keynote reviews comes from NetApp Insight 2017 in Berlin, bringing the very latest from the data management field.

We heard views about Monday’s keynote yesterday (you can find them here: Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One). What did day two have for us?

This time I’m joined by Scott Gelb (@scottygelb) and Adam Bergh (@ajbergh) to get their views as we discuss the announcements of new platforms such as HCI, and the fascinating move to cloud services, including a unique arrangement with Microsoft Azure.

Don’t miss the round-ups from days one and three; you can find them here:

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three


Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

One of the main components of any tech conference is the keynote sessions. These are the sessions that share the vision and set the context for the show, and a good keynote is a vital part of creating the right atmosphere for those attending.

What I wanted to do with these special shows was to try to capture some of the immediate reaction from those attending the events and the keynote presentations that come with them.

For these first shows, I’m at NetApp’s Insight conference in Berlin, where we can expect four days full of the latest in what the data management industry is doing and hear how data continues to be a focus of transformation for many of us.

With that in mind, what did Monday’s keynote session deliver?



To find out, straight from the keynote I caught up with Jason Benedicic (@jabenedicic), Atanas Prezhdarov (@prezblahblah) and Mark Carlton (@mcarlton1983) to get their views on the key messages from the keynote and what they expected from the rest of the event.

Don’t miss the round-ups from days two and three; you can find them here:

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three