Living on the data edge – Yadin Porter De Leon – Ep36

There has no doubt been a change in the way we all look at protecting and securing our key data, with robust data protection, enhanced data security plans and increased governance to ensure compliance with ever-increasing regulatory requirements.

However, much of this effort is concentrated on our datacentre and our central data sets, which, although critical, are not all of our data. One of the biggest changes we have seen in recent years in the world of “corporate IT” is the increasingly mobile nature of our daily operations, starting out with getting email on a BlackBerry, to today being able to access all of our key business data and applications from pretty much anywhere, at any time, on a wide range of devices.

This shift of course has a huge impact on where our data lives. It’s no longer just in our datacentre, behind corporate firewalls and security; it lives out on a range of mobile devices, laptops, tablets and smartphones, which brings a set of quite unique challenges.

These challenges are not necessarily easy to fix. How do you ensure you are protecting that data across a wide range of different devices? How do you know where it is and what it contains?

That is the focus of this week’s podcast, as I’m joined by Yadin Porter De Leon of Druva to discuss the challenge of data that lives right on the edge of our network.

We look at the complexity of the problem, why it has to be much more than “backup”, and why it’s critical we have full visibility of our data, regardless of where it sits in our business, be that in our datacentre or on a mobile device thousands of miles away.

We define what we mean by edge data and look at how it needs to be just as available and accessible as the data in our datacentre.

Yadin shares with us how easy it is to suffer a data breach with edge data, and how the loss of a single device can lead to a significant breach and all the impact that brings, especially if we don’t know what data may or may not have been on that device!

This leads on to the subject of information management, which is the true goal for many of us: knowing where our data is, what it contains, and who has accessed and is accessing it, while ensuring we have it protected and can always find it, regardless of whether our end device is accessible or not.

We wrap up our edge data discussion by providing a little bit of listener homework, as we point you in the direction of the kind of questions you should be asking yourself as an organisation, so you can build a data protection policy for all of your data, not just that which is stored in your datacentre.

Yadin gives some great insight into something that, in my opinion, is a challenge that is often not given the focus it requires.

To find out more about the work Druva do to help with this difficult challenge, visit the solutions section of their website; you can also follow them @druvainc on Twitter.

You can follow Yadin @porterdeleon on Twitter.

Finally, I strongly recommend checking out the excellent In Tech We Trust podcast, where Yadin and the team explore a wide range of IT industry and business topics. You can follow them on Twitter @InTech_WeTrust or via their website.

Tech Interviews is packing its swimming trunks and heading off on its podcast summer holidays for a few weeks. But worry not, you can find all of our previous episodes here on the website, and to make sure you don’t miss out on the new shows when we’re back, why not subscribe in all the usual podcast places.

Subscribe on Android


Listen to Stitcher

Have a great summer and thanks for listening.



What is a next generation data centre? – Martin Cooper – Ep35

There is no doubt that our organisations are becoming ever more data centric, wanting to know how we can gain insight into our day to day operations and continue to be competitive and relevant to our customers, while delivering a wide range of new experiences for them.

This move to a more data driven environment is also altering the way we engage and even purchase technology in our businesses, with technology decisions now no longer the preserve of IT people.

These changes do mean we need to reconsider how we design and deliver technology, which has led to the idea of “The Next Generation Datacentre”. But what does that mean? What is a Next Generation Datacentre?

That is the subject of this week’s podcast, as I’m joined by Martin Cooper, Senior Director of the Next Generation Datacentre Group (NGDC), at NetApp.

With over 25 years in the technology industry, Martin is well placed to understand the changes that are needed to meet our increasingly digitally driven technology requirements.

In this episode, we look at a wide range of topics, starting with trying to define what we mean by Next Generation Datacentre. The good news is that NGDC is not necessarily about buying a range of new technologies, but about optimising the processes and technology that we already have.

We touch on how a modern business needs flexibility in its operations, and how decisions made in different parts of the business by people who focus on applications and data, not infrastructure, require IT teams to respond in an application- and data-focused way.

Martin also discusses the types of organisations that can benefit from this NGDC way of thinking, and how, in fact, it’s not about entire organisations, but about understanding where the opportunities for transformation exist and delivering change there, be that an entire business, a single department or even a single application.

We also provide a word of caution: it’s important to understand that not all of our current applications and infrastructure are going to migrate to this brave new world of Next Generation Datacentres.

Next Generation Datacentre is not about a technology purchase; it’s about understanding how to optimise the things we already do to meet our changing business needs. Martin provides some excellent insight into how we do that and the kinds of areas we need to consider.

To find out more from Martin and from NetApp you can follow them in all the usual ways.

Their website

On Twitter: @NetApp and @NetAppEMEA

You can also follow Martin @mr_coops

Martin also mentioned a selection of podcasts that often discuss the next generation datacentre; you can find more details on those shows by clicking the links below.


The Cloudcast

NetApp’s own TechONTAP podcast.

I hope you enjoyed the show. If you did and want to catch all future Tech Interviews episodes, please subscribe and leave us a review in all of the normal places.


Gotcha – the challenge of moving to the cloud – Jon Green – Ep33

This week’s podcast is the second of three looking at the challenges associated with moving to the cloud. Last week’s show (which can be found here) focussed on integrating our data with the cloud.

This week we look at the equally challenging area of picking the right cloud service provider partners to help deliver your services. Whether that’s running your entire infrastructure or discrete applications, the choice of the right partner is key.

I think it’s often easy for us not to pay attention to partnerships when it comes to cloud. We assume the move is easy, that once we go to cloud everything will be “just fine”. However, in my experience there are a whole range of “gotchas”, the kind of issues that can be the difference between success and failure of a cloud migration project.

This week I’m joined by Jon Green, Technical Solutions Director at Navisite Europe Ltd, to address some of these potential “gotchas”.

Jon is an experienced IT professional with many cloud migrations under his belt, and is well placed to point out some of the challenges we need to consider as we plan our cloud migration and develop those all-important partnerships that are going to help us successfully deliver cloud services into our organisation.

We look at a range of issues that can impact our attempts at taking advantage of the flexibility, scale and commercial attractiveness that cloud can bring.

We discuss the risk of assuming that cloud migration is simple, especially when your applications are not “cloud ready”, the importance of understanding the underlying cloud infrastructure, and the common misconception that our cloud provider is just going to deal with “everything” for us.

Jon also explains how resilience is a key part of any modern IT strategy, and that strategies based in the cloud are no different.

Finally we explore where an experienced cloud partner can bring real benefits to your cloud adoption.

Jon provides some excellent insight and lots of useful tips that you can hopefully use as you plan your own moves to the cloud.

You can find out more about Navisite on their website, as well as follow them on Twitter @Navisite.

You can also follow Jon on Twitter


Or email him

You can hear more about building cloud provider partnerships as Jon’s colleagues from Navisite will be joining me on July 5th in Liverpool, so why not come and join us.

Next week we wrap up our cloud migration series with Lee Clark from GivepennyUK as he shares how cloud services have allowed him to deliver real innovation into the charity fundraising market.

To make sure you catch that show, you can subscribe in all the usual ways.

Thanks for listening.


New fangled magic cloud buckets – Kirk Ryan – Ep32

We’re all heading to the cloud. Data is the new gold. All of our competition is putting its data in the cloud and gaining massive competitive advantage. If we don’t know how to use our data and take advantage of the cloud, we’ll be obsolete.

All phrases that you may well have heard, or variations on them at least, and to some degree they are all true. There’s no escaping that cloud services are here to stay, and certainly there is a huge shift in organisations of all types becoming more data centric.

But, if that is all true, why isn’t everyone charging headlong into cloud services, finding more and more valuable information in their data and making the rest of us a thing of the past?

It’s because it’s not that straightforward. Cloud services can be complex, moving our data to them is difficult, and even when we get it there, what new and ingenious things can we do?

Over the next few Tech Interviews episodes we are going to explore that subject and look at how we can take advantage of cloud, how we can get our data there and, indeed, what innovative things we can do once we have. Over these episodes we’ll look at how to choose the right cloud partners and speak with a company who have used cloud and data to come up with an ingenious way for charities to raise money.

But first, we take on the data challenge: how do we make our data accessible to the cloud? How do we move it there? Do we move it there at all? And if we do, how do we keep it under control?

To help me explore that topic, I’m joined by Kirk Ryan, a Cloud Solutions Architect for storage vendor NetApp.

First we discuss why a “traditional” on-prem storage vendor like NetApp would need a cloud architect, and how the “new” NetApp really have allowed themselves to become a “cloud first” company.

We also discuss the changing role of digital services in our organisations and how the IT purse strings are not always pulled by IT.

We look at the challenges that companies keen to embrace new technologies have to overcome: the importance of building bridges for our data between on-prem, cloud and hybrid platforms, as well as the criticality of understanding our data’s lifecycle from creation through to deletion.

Finally we look at the importance of not taking our on-premises bad habits with us to the cloud, as well as ensuring that we fully understand the economics of our move.

Because, after all, it isn’t a new fangled magic cloud bucket!

To find out more from Kirk you can contact him on LinkedIn.

To learn more about NetApp and their cloud data management tools, visit their website.

Finally, if you want to hear more from Kirk, you’ll find him, alongside me and some other industry colleagues, discussing how to move to the cloud in Liverpool on July 5th.

If you enjoyed the show and want to make sure you catch next week’s episode, as we discuss choosing the right cloud partners, then why not subscribe in all of the usual places.

Until next time, thanks for listening.



Giving jetpacks to cavemen – Pete Flecha, John Nicholson – Ep31

Over the last 15 years or so, there have been major shifts in the way we deliver technology in the enterprise: cloud, mobility and, of course, virtualisation. You can’t look at the virtualisation industry without looking at the company that has defined the industry more than any other, VMware.

However, they are not immune to change, and the world is certainly changing as we become more application and data centric; with this, our expectations of how we consume our IT services have also changed.

I don’t think we are all going to throw our virtualisation infrastructures in a skip anytime soon, but we do have different demands: we want to integrate cloud, we want our systems to be data centric, we need security, privacy and availability, and we need our data and applications to be flexible, resilient and simple to consume.

How does this then affect VMware’s view of the world?

In the last interview I recorded at Veeam’s recent VeeamON conference, I managed to catch up with the hosts of the excellent VMware Virtually Speaking podcast, Pete Flecha and John Nicholson, to ask this question and find out how VMware are changing, how they see the needs of their customers evolving and what future trends they see as the next important areas of focus.

We also investigate how customer expectations of technology continue to rise, and how technology not only needs to be resilient but also smart. We look at how organisations want their IT to easily consume and integrate “cloud” into their on-prem solutions.

John also shares some thoughts on designing resilient solutions, and how availability is not only about what you buy; it’s also about what you do and the part your environment can play.

The guys also talk about VMware’s shift toward simplifying the technology stack, and how technologies like vSAN, storage policies and VVols are making our technology faster, smarter and more straightforward, allowing us to focus on our applications and data and not the complexity of the infrastructure below them.

Pete and John provide a fantastic insight into how our organisations’ technology requirements are changing, and how VMware are changing to remain a relevant and important part of our IT stack.

Last but not least, John also introduces us to the wonderful image of Cavemen with Jetpacks!

If you want to follow up with Pete and John, you can find them both on Twitter @vPedroArrow and @Lost_Signal.

You can find the excellent Virtually Speaking podcast online, and check out episode 44 for a great chat the guys had with Michael Dell.

I hope you enjoyed the show; Pete and John were great guests and provided some fantastic insights into the industry and VMware’s place in it.

If you did, then why not subscribe on iTunes, Soundcloud and other good homes of podcasts.


Analysing the availability market – part two – Dave Stevens, Mike Beevor, Andrew Smith – Ep30

Last week I spoke with Justin Warren and Jeff Leeds at the recent VeeamON event about the wider data availability market. We discussed how system availability is more critical than ever, and how, or maybe even if, our approaches were changing to reflect that. You can find that episode here: Analysing the data availability market – part one – Justin Warren & Jeff Leeds – Ep29.

In part two I’m joined by three more guests from the event as we continue our discussion. This week we look at how our data availability strategy is not, and cannot be, just a discussion for the technical department, and must be elevated into our overall business strategy.

We also look at how technology trends are affecting our views of backup, recovery and availability.

First I’m joined by Dave Stevens of Data Gravity, as we look at how our backup data can be a source of valuable information, as well as a crucial part in helping us to be more secure and compliant with ever more stringent data governance rules.

We also look at how Data Gravity, in partnership with Veeam, have developed the ability to trigger smart backup and recovery. Dave gives a great example of how a smart restore can be used to quickly recover from a ransomware attack.

You can find Dave on Twitter @psustevens and find out more about Data Gravity on their website.

Next I chat with Mike Beevor of HCI vendor Pivot3 about how simplifying our approach to system availability can be a huge benefit. Mike also makes a great point about how, although focussing on application and data availability is right, we must consider the impact on our wider infrastructure, because if we don’t we run the risk of doing more “harm than good”.

You can find Mike on Twitter @MikeBeevor and find out more about Pivot3 on their website.

Last but by no means least, I speak with Andrew Smith, Senior Research Analyst at IDC. We chat about availability as part of the wider storage market, and how, over time, as vendors gain feature parity, their goal has to become adding additional value, particularly in areas such as security and analytics.

We also discuss how availability has to move beyond the job of the storage admin and become associated with business outcomes. Finally, we look a little into the future, at how a “multi cloud” approach is a key focus for business and how enabling it will become a major topic in our technology strategy conversations.

You can find Andrew’s details over on IDC’s website.

Over these two shows it has become clear, to me, that our views on backup and recovery are changing. The shift toward application and data availability is an important one, and as businesses we have to ensure that we elevate the value of backup, recovery and availability in our companies, making it an important part of our wider business conversations.

I hope you enjoyed this review. Next week brings the last interview from VeeamON, as we go all VMware and I catch up with the hosts of VMware’s excellent Virtually Speaking podcast, Pete Flecha and John Nicholson.

As always, if you want to make sure you catch our VMware bonanza, then subscribe to the show in the usual ways.


Analysing the data availability market – part one – Justin Warren & Jeff Leeds – Ep29

Now honestly, this episode has not gone out today sponsored by British Airways, or in any way taking advantage of the situation that affected thousands of BA customers over the weekend; the timing is purely coincidental.

However, those incidents have made this episode quite timely, as they again highlight just how crucial technology is to our day-to-day activities as individuals and businesses.

As technology continues to be integral to pretty much everything we do, the recent events at BA and the disruption caused by WannaCrypt are all examples of what happens when our technology is unavailable: huge disruption, reputational damage and financial impact, as well as the stress it brings to the lives of both those trying to deal with the outage and those on the end of it.

Last week I spoke with Veeam’s Rick Vanover (Remaining relevant in a changing world – Rick Vanover – Ep28) about how they were working to change the focus of their customers from backup and recovery to availability, ensuring that systems and applications were protected and available, not just the data they contained.

As part of my time at the recent VeeamON event, I also took the opportunity to chat with the wider IT community who attended, not just those charged with delivering availability and data protection, but also those who look at the industry through a broader lens, trying to understand not just how vendors viewed availability, but also the general data market trends and whether businesses and end users were shifting their attitudes in reaction to those trends.

So over the next couple of weeks, I’ve put together a collection of those chats to give you a wider view of the availability market, how analysts see it and how building a stack of technologies can play a big part in ensuring that your data is available, secure and compliant.

First up, I speak with Justin Warren and Jeff Leeds.

Justin is a well-known industry analyst and consultant, as well as the host of the fantastic Eigencast podcast (if you don’t already listen, you should try it out). Justin is often outspoken, but always provides a fascinating insight into the wider industry, and here he shares some thoughts on how the industry is maturing, how vendors and technology are changing, and how organisations are changing, or perhaps not changing, to meet new availability needs.

You can follow Justin on Twitter @jpwarren, and do check out the fantastic Eigencast podcast.

Jeff Leeds was part of a big NetApp presence at the event, and I was intrigued as to why a storage vendor, famed for its own robust suite of data protection and availability technologies, should be such a supporter of a potential “competitor”.

However, Jeff shared how partnerships and complementary technologies are critical in building an appropriate data strategy, helping us all ensure our businesses remain on.

You can follow Jeff on Twitter at @HMBcentral and find out more about NetApp’s own solutions on their website.

I hope you enjoyed the slightly different format and next week we’ll dig more into this subject as I speak with Andrew Smith from IDC and technology vendors Pivot3 and Data Gravity.

To catch it, please subscribe in all the normal homes of Podcasts, thanks for listening.
