Imagine if the data you shared was written in pencil! – Val Bercovici – Ep54

One of my favourite phrases is that “those who make the most of their data will be the success stories of the future”, and I think that is echoed by our obsession with data as a new, all-powerful resource for us to mine and exploit.

But it’s not quite that straightforward, is it? There is no getting away from the importance of security and privacy when it comes to our data. Whether that’s preventing unwarranted access, guarding against data leakage or maintaining compliance with ever-increasing regulatory requirements, keeping our data in a secure environment is a primary concern.

Within that battle between wanting to exploit our data while maintaining control lies a huge problem: how do you share your data so you can extract value from it, while maintaining control, security and privacy over this crucial asset?

That is the focus of this week’s show, as I’m joined by Val Bercovici, a well-respected technology leader who is today working on a new project with his start-up, Pencildata, a company whose mission is to address this very challenge and provide a solution that allows us to get value from our data while maintaining control, regardless of where it is, who is accessing it and how many services it gets passed on to.

In a fascinating chat with Val, we explore the growing tension between getting value from data and maintaining its security. We examine some of the interesting problems that come with A.I. systems: how do you address the “cold start” problem, or ensure that your A.I. systems have a wider range of data to learn from, beyond academic and publicly available data sets?

We discuss why organisations find it hard to unlock the value of their data and share it effectively, not only with third-party providers but even with others inside their own organisations.

We wrap up by looking at how to change this: how do we give an organisation the ability to maintain control of its data regardless of where it is shared and, crucially, the ability to change its mind, so that when it no longer wants to share that data it can withdraw authorisation, regardless of where that data resides? We also look at the importance of an immutable audit trail, so that you always know who is using your data, where and how.

I hope you enjoy the episode; it covers a very interesting topic, as I do believe the ability to maintain this level of control over data, as it moves between departments or is shared with external services, is going to be crucial as we look to fully exploit the value and insight held within our data sets.

If you want to know more about the work of Val and the team at Pencildata, you can find them at

You can follow them on Twitter @Pencil_DATA or email Val

You can also follow Val on Twitter @valb00

Thanks for listening.


Keep it to yourself, the data privacy challenge – Sheila Fitzpatrick – Ep53

There is no doubt that for many, 2018 will be the year of data privacy, driven, in no small part, by the impending EU General Data Protection Regulation (GDPR).

In my opinion, GDPR has many positive elements to it: it’s an opportunity for businesses to look at how they manage, secure and retain the privacy of data across their organisations, and for too many of us this is something that’s long overdue.

GDPR (and other regulations), however, is not just a good idea; it is something that we need to comply with, and May 25th 2018 is the date when GDPR becomes enforceable.

With that in mind, what are some of the changes that we should expect? What practices will no longer be acceptable? And what should we really have in place by that date?

This week we look at exactly that topic, as I’m joined by data privacy expert and attorney Sheila FitzPatrick, founder of FitzPatrick and Associates and a globally recognised data privacy expert with over 35 years’ experience in the data privacy field.

Who better, then, to ask for advice on some of the pitfalls and common misconceptions of data privacy and, when it comes to GDPR, on the basics that we really need to have in place by May 25th 2018.

During this episode we discuss a range of issues. We look at where we should start, and why that place really shouldn’t be technology; Sheila also touches on why it’s important to be wary of “GDPR experts” selling you their compliance technology.

We discuss some of the common misconceptions and mistakes that organisations are making in their business compliance work, how this often leads to companies spending a lot of money unnecessarily, and why focusing on GDPR alone can be a big problem in itself.

Sheila also explains why security is not the answer to data privacy and why it’s important to make sure you understand exactly why you have the data in the first place, before you worry about “securing” it.

We explore where to start on your compliance journey: understanding your current policies and procedures and what they are based upon, checking whether those procedures are clear and transparent, and then the importance of gap analysis, so you can understand what work is needed to meet the requirements of GDPR, or any other privacy regulations relevant to your organisation.

To wrap up, we look at the things organisations are currently doing with their data that, come May 26th 2018, will no longer be acceptable, and why it will be crucial to ensure your business compliance plans are fluid and capable of responding to the ever-changing data landscape; May 25th is most certainly not an end date for GDPR.

Lastly, I ask Sheila whether she has advice for those who think GDPR won’t affect them, and she does!

Sheila, as always, shares some great insights into the world of data privacy and compliance and does so with her usual enthusiasm for the topic.

If you want to hear more from Sheila on these subjects, Sheila has appeared on Tech Interviews a number of times before and those episodes can be found here;

Best Take Care Of Those Crown Jewels – Sheila Fitzpatrick – Ep 17

Don’t Build Your Data Privacy House Upside Down – Sheila Fitzpatrick – Ep 18

Sheila and I also recently appeared on The Cube discussing GDPR and data privacy; you can find that show here

If you want to contact Sheila online, you can find her on Twitter @sheilafitzp.

You can contact her via her LinkedIn profile or email

Looking forward and looking back – Chris Evans – Ep 52

It’s the kind of thing us podcasters like to do at this time of year: take a retrospective look at the previous year as well as a look forward to the new one. Not wanting to miss a trick, that’s exactly what we do in this episode of the Tech Interviews podcast.

To help me look forward and back at the tech industry I’m joined by analyst Chris Evans (@chrismevans). Chris has over 25 years of varied IT experience, starting his career on mainframes; he also successfully built and floated his own dotcom business, started his own consultancy practice and today is a widely read and respected industry analyst who hosts his own (excellent) podcast, Storage Unpacked.

With that background he’s the ideal person to provide some perspective and thoughts on the direction of the tech industry, especially the data and storage elements of it, so that is exactly what I ask him to do.

We take a look at a whole range of topics. We discuss how the move to a more software-defined future has not stopped the re-emergence of the importance of hardware, with technologies such as NVMe becoming more prevalent, and how the ever-increasing criticality of data and performance is driving this hardware evolution.

Chris also explores some thoughts around the development of hybrid cloud infrastructure and how this is being driven not only by the traditional on-prem vendors but also by the big cloud providers, with the likes of Microsoft and AWS investing increasingly in technology to help their customers simplify the process of moving data into their cloud platforms. (For example, Chris references Microsoft’s purchase of Avere Systems; you can read more here.)

We also ask why some of the technologies we expected to really take off didn’t; for example, why a personal favourite of both Chris and myself, object storage, hasn’t quite captured the market as we thought.

We don’t of course just look back; we also look at some of the technologies that we expect to be the big bets for CIOs and IT decision makers in 2018.

We investigate why high-performance and scale-out file systems, both on-prem and in the cloud, will continue to grow as we increasingly look at how to keep our data at the edge while taking advantage of cloud computing.

Will NVMe really take off in 2018? We discuss some potential use cases and why, if you are making those tech investment decisions in 2018, NVMe may be for you.

Of course, we round up with a look at data privacy; no doubt 2018, at least the first half, will be the year of GDPR. Chris gives some thoughts on what technology can help with and how the technology industry can be more helpful in the way it approaches this challenging topic.

To find out more from Chris, you can find him on Twitter @chrismevans and on LinkedIn.

His writing and analysis can be found at

And if you are interested in storage industry content, then I strongly recommend Chris’s excellent Storage Unpacked podcast; you can listen to that here

Next week, sticking with the theme of data privacy as the topic for 2018, I’m joined by my favourite data privacy guru Sheila FitzPatrick as we look at the upcoming impact of GDPR, the myths and the areas you should be focussing on ahead of May 25th.

To ensure you catch that show, why not subscribe? You’ll find the show in all of the usual places.

Thanks for listening and have a great 2018.

NetApp Winning Awards, Whatever Next?

In the last couple of weeks I’ve seen NetApp pick up a couple of industry awards, with the all-flash A200 earning the prestigious Storage Review Editor’s Choice as well as CRN UK’s Storage Vendor of the Year 2017. This, alongside commercial successes (How NetApp continue to defy the performance of the storage market), is part of a big turnaround in their fortunes over the last three years or so. But why? What is NetApp doing to garner such praise?

A bit of disclosure: as a director at a long-term NetApp partner, Gardner Systems, and a member of the NetApp A-Team advocacy programme, I could be biased, but having worked with NetApp for over 10 years I still see them meeting our customers’ needs better than any other vendor, which in itself suggests NetApp are doing something right.

So what is it they’re doing? In this post, I share some thoughts on what I believe are the key parts of this recent success.

Clear Strategy

If we wind the clock back four years, NetApp’s reputation was not at its best. Tech industry analysts presented a bleak picture: the storage industry was changing, with public cloud storage and innovative start-ups offering to do more than those “legacy” platforms, and in many cases they could; NetApp were a dinosaur on the verge of extinction.

Enter the Data Fabric, first announced at NetApp’s technical conference, Insight, in 2014. Data Fabric was the beginning of NetApp’s move from a company focussed on storing data to a company focussed on the data itself. This was significant, as it coincided with a shift in how organisations viewed data, moving away from just thinking about storing data to managing, securing, analysing and gaining value from it.

NetApp’s vision for the data fabric closely aligned with the aims of these more data-focussed organisations and also changed the way they thought about their own portfolio: less worried about speeds and feeds and flashing lights, and more about building a strategy focussed on data in the way their customers were.

It is this data-driven approach that, in my opinion, has been fundamental in this change in NetApp’s fortunes.

Embrace the Cloud

A huge shift, and something that has taken both customers and industry analysts by surprise, is the way NetApp have embraced the cloud; not a cursory nod, but cloud as a fundamental part of the data fabric strategy, and this goes way beyond “cloudifying” existing technology.

ONTAP Cloud seamlessly delivers the same data services and storage efficiencies into the public cloud as you get with its on-prem cousin, providing a unique ability to maintain data policies and procedures across your on-prem and cloud estates.

But NetApp have gone beyond this, delivering native cloud services that don’t require any traditional NetApp technologies. Cloud Sync allows the easy movement of data from on-prem NFS datastores into the AWS cloud, while Cloud Control provides a backup service for Office 365 (and now Salesforce), bringing crucial data protection functionality that many SaaS vendors do not provide.

If that wasn’t enough, there is the recently announced relationship with Microsoft, with NetApp now powering the Azure NFS service. Yep, that’s right: if you take the NFS service from the Azure Marketplace, it is delivered in the background entirely by NetApp.

For a storage vendor, this cloud investment is unexpected, but a clear cloud strategy is also appealing to those making business technology decisions.

Getting the basics right

With these developments, it’s clear NetApp have a strategy and are expanding their portfolio into areas other storage vendors do not consider, but there is also no escaping that their main revenue generation continues to come from ONTAP and FAS (NetApp’s hardware platform).

If I’m buying a hardware platform, what do I want from it? It should be robust with strong performance and a good investment that evolves with my business and if NetApp’s commercial success is anything to go by, they are delivering this.

The all-flash NetApp platforms (such as the award-winning A200 mentioned earlier) are meeting this need: a robust, enterprise-level platform allowing organisations to build an always-on storage infrastructure that scales seamlessly with new business demands. Six-year flash drive warranties and the ability to refresh your controllers after three years also give excellent investment protection.

It is not just the hardware, however; these platforms are driven by software. NetApp’s ONTAP operating system is like any other modern software platform, with regular code drops (every six months) delivering new features and improved performance to existing hardware via a non-disruptive software upgrade. This gives businesses the ability to “sweat” their hardware investment over an extended period, which in today’s investment-sensitive market is hugely appealing.

Have an interesting portfolio

For a long time NetApp was the FAS and ONTAP company, and while those things are still central to their plans, their portfolio is expanding quickly. We’ve discussed the cloud-focussed services; there is also SolidFire, with its unique scale and QoS capabilities, StorageGRID, a compelling object storage platform, and AltaVault, which provides a gateway to move backup and archive data into object storage on-prem or in the cloud.

Add to this the newly announced HCI platform and you can see how NetApp can play a significant part in your next-generation datacentre plans.

For me, the awards I mentioned at the beginning of this article are not down to one particular solution or innovation; it’s the data fabric. That strategy is allowing NetApp, its partners and customers to have a conversation that is focussed on data rather than technology, and having a vendor who understands that is clearly resonating with customers, analysts and industry influencers alike.

NetApp’s continued evolution is fascinating to watch, and they have more to come, with no doubt more awards to follow, whatever next!

Architecting the Future – Ruairi McBride and Jason Benedicic – Ep 51

As we become more data-driven in our organisations and ever more used to the way the big public cloud providers deliver services, there is more and more pressure on internal IT to deliver infrastructure that provides this data-focussed, cloud-like experience. But where do you start in designing this next generation of datacentre?

That’s the subject of this week’s podcast, the last of the shows recorded at NetApp Insight in Berlin, where I catch up with two members of a fascinating panel discussion I attended at the event, Ruairi McBride and Jason Benedicic.

Ruairi is focussed on partner education for the global technology distribution company Arrow ECS and has spent the last nine months working with partners to help them understand next-generation datacentres.

You can find Ruairi on Twitter @mcbride_ruairi and on his blog site

Jason is a principal consultant at ANS Group in the UK with a focus on next-generation datacentres. Jason spends his time designing and implementing next-gen technology for a wide range of customers and, with nearly 20 years of industry experience, offers great insight and experience.

Catch up with Jason on Twitter @jabenedicic and look out for his soon-to-launch blog site

Ruairi and Jason were part of a panel hosted by the NetApp A-Team, made up of people who were not theorists but who had practical experience of deploying next-generation technologies and working practices. As I know many listeners to this show are involved in developing their own next-generation strategy, I thought it would make an interesting episode.

We cover a range of topics and begin by looking to define what we mean by next-gen: the types of technology and methodologies involved.

We discuss what is driving the move to next-generation datacentres: how public cloud and the move to automated, self-healing, self-service, software-defined infrastructure is a major influence, and how businesses that wish to maintain a competitive edge and improve the service to their customers and users need to look at this next-generation approach.

We wrap up by looking at how next-gen datacentres are not about technology alone but are as much about philosophy and working practice, while Jason and Ruairi share ideas about the type of building blocks you need and the help and support the technology community can bring as you look to deliver a next-generation strategy to your organisation.

Jason and Ruairi provide some excellent insights and tips on developing a next-generation datacentre approach. If you have questions, please feel free to contact any of us on Twitter or via the comments section on the site.

This is the last show of 2017. To all who have listened this year, thanks for your support; Tech Interviews will be back in the new year with a whole host of new interviews exploring a range of technology topics. If you have anything you’d like the show to explore in 2018, why not drop me a note @techstringy.

To make sure you catch next year’s shows, why not subscribe in all of the usual places.

That just leaves me to say: enjoy the Christmas holiday season. I’d like to wish you the very best for 2018 and hope you’ll spend some of it listening to Tech Interviews.

For all of you who have enjoyed the show in 2017 – thanks for listening



Scale that NAS – Justin Parisi – Ep 50

There is no doubt that the amount of data we have, manage, control and analyse continues to grow ever more rapidly, and much of this is unstructured data: documents, images, engineering drawings, data that often needs to be stored in one place and be easily accessible.

However, this presents problems: how do you get all of this data in one place when it’s not just terabytes but hundreds of terabytes, made up of billions of files that need to be accessed quickly? How on earth do you build that kind of capacity and deliver the performance you need?

Like any compute problem, there are two ways to scale: up, by adding more capacity to your existing infrastructure, or out, by adding not only more capacity but also more compute.

The other week I heard an excellent episode of the Gestalt IT On-Premise podcast where they posed the question “should all storage be scale out?” (find the episode here) and the answer was basically yes. In a world where we have these ever-growing unstructured data repositories, scaling out our NAS services makes perfect sense, delivering not only massive capacity in a single repository but also taking advantage of scaled-out compute to give us the ability to process the billions of transactions that come with a huge repository.
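As a purely illustrative aside (a toy sketch in Python, nothing to do with any particular product’s implementation), the basic idea behind scale-out looks something like this: a single logical namespace spreads files across several nodes, so every node you add brings more capacity and more compute to serve requests.

```python
# Toy sketch of the scale-out idea (illustrative only, not any vendor's code):
# one logical namespace, many nodes; each file is placed on a node chosen by
# hashing its name, so adding nodes grows capacity *and* the compute serving it.
import hashlib

class Node:
    def __init__(self, name):
        self.name = name
        self.files = []

class ScaleOutNamespace:
    def __init__(self, nodes):
        self.nodes = nodes

    def place(self, filename):
        # Deterministically map the file to one of the nodes.
        idx = int(hashlib.md5(filename.encode()).hexdigest(), 16) % len(self.nodes)
        self.nodes[idx].files.append(filename)
        return self.nodes[idx].name

cluster = ScaleOutNamespace([Node(f"node{i}") for i in range(4)])
for i in range(8):
    print(f"file{i}.dat ->", cluster.place(f"file{i}.dat"))
```

A real scale-out system uses something smarter than simple modulo hashing (which would reshuffle placements every time a node is added), but the principle of presenting one namespace spread across many controllers is the same.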

So, for episode 50 of the Tech Interviews podcast, it seemed apt to celebrate the big five-oh by talking about big data storage.

To discuss this evolution of NAS storage I’m joined by a returning guest, fresh from episode 48 (The heart of the data fabric), Justin Parisi, as we look at NetApp’s approach to this challenge: FlexGroups.

We start the episode by discussing what a FlexGroup is and, importantly, why you may want to use one, and why it’s about more than just capacity, as we discuss the performance benefits of spreading a single storage volume across multiple controllers and look at those all-important use cases, from archives to design and automation.

We explore the importance of simplification: while our need to manage ever-increasing amounts of data continues to grow, the resources available to do it are ever more stretched, so we look at how NetApp has made sure that the complexity of scale-out NAS is hidden away from the user, presenting a simple, intuitive and quick-to-deploy technology that allows users to have the capacity without the need to rearchitect or relearn their existing solutions.

We wrap up by looking at some of NetApp’s future plans for this technology, including how it may become the standard deployment volume, simplification of migration and other uses such as VMware datastores.

FlexGroups is a really smart technology designed to simply address this ever-growing capacity and performance problem encountered by our traditional approach to file services and if you are looking at scale-out NAS for your file services, it’s a technology well worth reviewing.

For some very informative FlexGroup blogs visit NetApp Newsroom.

There is also a selection of NetApp technical reports on the subject; check out TRs 4557, 4571 and 4616.

You can also hear more from Justin and the Tech ONTAP podcast team discussing FlexGroups here in episode 46.

And finally, you can contact the FlexGroup team via email at

If you want to find out more about Justin and the work he does in this area you can check out his excellent website and follow him on Twitter @nfsdudeabides.

Next week it’s the last show of the year, as I’m joined by Jason Benedicic and Ruairi McBride to discuss the future of datacentre architecture as we talk next-gen technologies.

To catch that show why not subscribe in any of the usual podcast places.

Hope you enjoyed Episode 50 – here’s to the next 50!

Thanks for listening.

Hyper Converged Now and Next – Troy Mangum – Ep49

The IT industry is full of new trends; some you get, some you don’t. One such trend, which until recently I didn’t really get, is hyper-converged: a new market with a message of simplification, dominated initially by new technology players like Nutanix and SimpliVity (now part of HPE), and they have been pretty successful. So why have I not got on board?

A good test with any new technology is: does it solve a problem or improve the way I currently do things? Up to now, with hyper-converged infrastructure (HCI), I’m not sure it really does. Is it helping me build a more automated, flexible, agile IT infrastructure? Is it helping me build a hybrid environment? Is it automating my IT environment so that my business gets the agility it wants? I’m not sure.

What HCI does do well is simplify your hardware infrastructure: it takes something that may have been installed in a full rack and squeezes it down into 2U or 4U in a single chassis, with compute and storage integrated together and a scaling model that allows you to attach another HCI box and scale your compute and storage again.

But is that enough? When I’ve worked with organisations considering HCI, the cost of this model tends to be in line with (if not more expensive than) buying the individual components and installing them yourself, and unless those accounts have been looking to refresh compute and storage at the same time, the value has been hard to find.

What’s changed my view? The starting point is nothing to do with changes to the HCI hardware model or the addition of some great new feature; it’s actually, and perhaps not surprisingly, driven by software. Look at what Microsoft and VMware are doing, for example: VMware is delivering an increasingly software-defined infrastructure with every incremental release of their virtualisation stack.

Microsoft’s Azure Stack, although limited currently, aims to bring a fully software-defined, Azure-like experience onto your local hardware, and of course solutions from both of these companies are increasingly hybrid-focussed, with VMware on AWS and Azure both integrated tightly into these on-prem stacks.

This simplification of the software stack is now starting to drive the need for a hardware stack that matches this simplification and can take advantage of these software-defined infrastructure solutions.

It is this changing environment that is the focus of this latest podcast.

At the recent NetApp Insight conference, I met with Troy Mangum, who shared some research he has been working on reviewing the HCI market: how it stands today and the changes HCI vendors need to make to ensure they build on the early success of first-generation solutions, delivering a platform that meets the needs of the modern datacentre and takes advantage of these software-defined infrastructure stacks.

We explore a range of discussion points from the research. We look at the drivers behind the adoption of HCI, such as the need for simplification and easier consumption of IT resources. We also discuss how the current technical design of HCI hardware architectures may limit their ability to grow in the way we need them to.

Troy shares how HCI currently comes with a risk of introducing infrastructure silos into our datacentres, focussed on solving individual problems rather than providing the flexibility the modern datacentre needs. We also explore the phenomenon of the “HCI tax”, what this is and why it’s a problem.

Finally, we take a look at the future: how architectural changes are driving a new breed of HCI architecture, a second generation allowing a more flexible deployment model, decoupling the component parts so HCI can scale capacity and compute separately. That begs the question: is this new breed of HCI really HCI at all, and does it really matter? And of course, we look at NetApp’s entry into this market.
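As a side note, here is a minimal, purely illustrative Python sketch of that coupled-versus-decoupled difference (the node sizes are invented for the example, not any vendor’s specification): with first-generation HCI a capacity problem forces you to buy compute as well, which is essentially the HCI tax, while a decoupled model lets capacity and compute grow independently.

```python
# Illustrative sketch only: coupled (first-gen HCI) vs decoupled scaling.
from dataclasses import dataclass

@dataclass
class Cluster:
    cpu_cores: int = 0
    storage_tb: int = 0

    def add_hci_node(self):
        """First-gen HCI: compute and storage always arrive bundled together."""
        self.cpu_cores += 32
        self.storage_tb += 20

    def add_compute_node(self):
        """Decoupled model: grow compute only."""
        self.cpu_cores += 32

    def add_storage_shelf(self):
        """Decoupled model: grow capacity only."""
        self.storage_tb += 20

first_gen = Cluster()
for _ in range(4):            # a capacity-hungry workload...
    first_gen.add_hci_node()  # ...drags unneeded compute along with it

second_gen = Cluster()
second_gen.add_compute_node()       # just the compute the workload needs
for _ in range(4):
    second_gen.add_storage_shelf()  # plus only the capacity it needs

print("coupled  :", first_gen)    # Cluster(cpu_cores=128, storage_tb=80)
print("decoupled:", second_gen)   # Cluster(cpu_cores=32, storage_tb=80)
```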

To find out more on this topic and what NetApp are doing you can find lots of HCI information on NetApp’s website here.

You can also find out more from Troy by following him on Twitter @cloudreveler

Next week we look at very large data repositories, as I’m joined by returning guest Justin Parisi to discuss the concept of FlexGroups.

To ensure you catch that show, you can subscribe to Tech Interviews on iTunes, Stitcher and Soundcloud and all good homes of podcasts.

Thanks for listening.

The heart of the data fabric – Justin Parisi – Ep48

I’ve discussed in a number of blog posts, as well as previous episodes of Tech Interviews, the importance of building a data strategy, a strategy that will allow you to architect and deploy a platform to tackle modern data management challenges.

The term “Data Fabric” is an increasingly common way of describing such a strategy. It was something I first heard three years ago at NetApp’s annual technical conference, Insight, as they introduced their ideas for building a strategy that would start to move them from a storage company to a data management company.

This shift is also in line with what I see in the many businesses I work with, the move from just storing data to using it as something that will enable them to become more data focussed and data-driven organisations.

When NetApp first discussed this three years ago, they were a very different company: accused of living in the past, a traditional storage dinosaur with no place in this modern world, where new storage companies and the ever-growing influence of cloud would destroy a company focussed on selling hardware and, of course, their operating system Data ONTAP.

But NetApp have changed. Today they are moving headlong into a data management future, focussed on allowing their customers to build infrastructures that store data in the most appropriate location at the right time, and allowing them to easily move, manage, secure and protect that data, regardless of whether it’s on-prem, in a virtual appliance or in the cloud.

Surely then, as NetApp continue to change, their beloved ONTAP operating system can’t still play a key part in building a data fabric?

Nothing could be further from the truth, and that is the focus of this episode, as I’m joined by Justin Parisi, Senior Technical Marketing Engineer at NetApp and the host of NetApp’s Tech ONTAP podcast.

In this episode, we explore why ONTAP is anything but a legacy bit of technology and how not only is it still relevant, it is right at the core of NetApp’s data fabric future.

We look at the fact that ONTAP is a piece of software and that, although tied to hardware initially, that reliance has gone, allowing ONTAP to be a versatile platform that can be installed on specific hardware, on your own hardware, or on no hardware at all, running as a service within a public hyperscale cloud.

We discuss how ONTAP is not about storage but is much more focussed on data services, such as security, protection, efficiency and performance.

This ability to deploy ONTAP anywhere also allows us to ensure that not only can our data move easily between locations, but our policies and procedures can easily move with it.

We wrap up looking at some of the features in the latest version of ONTAP and how continuous improvements ensure ONTAP remains at the heart of NetApp’s data fabric strategy and can play a part in yours.

To find out more about ONTAP visit NetApp’s website

You can follow Justin on Twitter @NFSDudeAbides

And hear the excellent Tech ONTAP podcast right here –

Next week we look at the development of the hyper-converged market, where it is today and how it needs to change, as I discuss some interesting HCI research with Troy Mangum.

To catch that episode, why not subscribe? You’ll find Tech Interviews in all the usual podcast places.

Thanks for listening.

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three

One of the main components of any tech conference is the keynote sessions; these are the sessions that share the vision and set the context for the show, and a good keynote is a vital part of creating the right atmosphere for those attending.

What I wanted to do with these special shows was to try and grab some of the immediate reaction from those attending the events and the keynote presentations that come with them.

Our first set of keynote reviews comes from NetApp Insight 2017 in Berlin, bringing the very latest from the data management field.

As we come toward the end of the conference, day three provided us with the final general sessions including a fascinating insight into rocket science as Adam Steltzner, part of the Mars Rover landing team, shared the part data played in their work.

Joining me in this final review from Insight are Jon Woan (@jonwoan) and Mick Kehoe (@mickehoe), providing their views on this session. As it was the final day, they also share their thoughts on what they’d heard throughout the conference, how it met their expectations and whether NetApp were covering the kind of things that they felt relevant.

Enjoy this last review from NetApp Insight and look out for upcoming reviews from other tech conferences in the future, as well as new episodes of Tech Interviews.

Don’t miss the round-ups from days one and two; you’ll find them here.

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two


Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two


We heard views about Monday’s keynote yesterday (you can find that here: Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One). What did day two have for us?

This time I’m joined by Scott Gelb (@scottygelb) and Adam Bergh (@ajbergh) to get their views as we discuss the announcements of new platforms such as HCI and the fascinating move to cloud services, including a unique arrangement with Microsoft Azure.

Don’t miss the round-ups from days one and three, you can find them here;

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three