Tech me for the weekend – June 23rd

No idea where this week has gone – it shot by like nobody's business! So here I am again with some tech content to keep you company over the weekend… check out the articles and podcasts that caught my attention this week.

Articles

Business Continuity Institute – Cultural issues the number one obstacle to digital transformation

First up, this piece from the Business Continuity Institute looks at organisational culture and its ability to inhibit progress. If you are looking to make change, and make that change successful, don't underestimate the importance of buy-in from top to bottom.

http://www.thebci.org/index.php/about/news-room#/news/cultural-issues-are-the-number-one-obstacle-to-digital-transformation-245939


Technative – A Beginner's Guide to Big Data

A constant conversation at the minute is how taking advantage of our data is going to be crucial to the future success of organisations and businesses of all types.

This article looks at one of the areas behind the successful adoption of data analytics: the increasing availability of big data tools, mainly powered by the big public cloud vendors, who are making it easier than ever for us to take advantage of our data.

https://www.technative.io/a-beginners-guide-to-big-data/

Matt Watts – Data Visionaries Wanted

Continuing the focus on data and cultural change, I came across this gem from Matt Watts, in which he discusses the importance of data visionaries in your organisation: those people who can see the value of data and can help you extract key information from it.

http://www.watts-innovating.com/data-visionaries-wanted/

All Aboard the Data Train

A new article from me this week, as Mrs Techstringy and I found out for ourselves the importance of data analytics in informing good business decisions.

That's caught your attention, hasn't it!

All Aboard the Data Train

Podcasts

In Tech We Trust

This show has had a real revamp over the last few episodes and now has more of a focus on tech industry topics, rather than its old weekly news roundup format.

This week's is a particularly interesting topic, as the team discuss communication in the modern world, from human to human through to human to machine. A fascinating debate with a number of interesting guests, well worth a listen.

http://intechwetrustpodcast.com/e/124-technology-and-human-communication/

The ON-Premise IT podcast

This show is becoming a real favourite. This week the round table panel take a bit of a diversion from the normal tech debate as they discuss careers, from the kind of career moves to consider through to the importance of certifications. Good tips.

http://gestaltit.com/podcast/rich/managing-career-premise-roundtable/

Influence marketing podcast

A new show to the list, in which John and Cathy Troyer host chats with IT folk involved in tech community programs.

Now on episode three, but I thought I'd mention the first one, with Veeam's Rick Vanover discussing the extremely successful Veeam Vanguard program. It's an interesting insight into the work that goes on behind the scenes.
https://medium.com/influence-marketing-council/the-rickatron-is-always-on-9f626bce7211

Tech Interviews

This week was all about moving to the cloud, as I was joined by NetApp Cloud Architect Kirk Ryan, as we discussed the important things to consider as you look to see how you can take advantage of cloud services in your business.

We cover the importance of cloud economics, the options for integrating the cloud with your data and how to ensure you don’t take your on-prem bad habits with you.

A top guest, with lots of great insight.

New fangled magic cloud buckets – Kirk Ryan – Ep32

Hope you enjoy that content… I'd love to hear your feedback.

Have a great weekend.


All Aboard the Data Train

The other night Mrs Techstringy and I were discussing a work challenge. She works for a well-known charity, and one of her roles is to book locations for fundraising activities. On this occasion the team were looking at booking places at railway stations and considering a number of locations, but all they really had to go on was a "gut feeling".

As we discussed it we did a bit of searching and came across this website http://www.orr.gov.uk/statistics/published-stats/station-usage-estimates which contains footfall information for every UK railway station over the last 20 years. This information was not only train geek heaven, it also allowed us to use the data available to make a more informed choice and to introduce possibilities that otherwise would not have been considered.
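As a toy illustration of the kind of question that station usage data can answer, here's a small Python sketch that ranks stations in a region by annual footfall. The column names and figures below are invented for the example; the real ORR spreadsheet uses different headings and far more columns.

```python
import csv
import io

# Hypothetical extract of station usage data -- illustrative only,
# the real ORR file has different column names and real figures.
sample = """\
station,region,entries_exits_2016
Liverpool Lime Street,North West,15432876
Liverpool Central,North West,13291045
St Helens Central,North West,1650322
Runcorn,North West,1103874
"""

def busiest_stations(csv_text, region, top_n=3):
    """Rank stations in a region by annual footfall (entries + exits)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    in_region = [r for r in rows if r["region"] == region]
    in_region.sort(key=lambda r: int(r["entries_exits_2016"]), reverse=True)
    return [(r["station"], int(r["entries_exits_2016"])) for r in in_region[:top_n]]

for name, footfall in busiest_stations(sample, "North West"):
    print(f"{name}: {footfall:,}")
```

A few lines like this turn "gut feeling" into a ranked shortlist, which is exactly the shift the exercise above was about.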

This little family exercise was an interesting reminder of the power of data and how with the right analysis we can make better decisions.

Using data to make better decisions is hardly news. With the ever-increasing amounts of data we are collecting and greater access to powerful analytics, machine learning and AI engines, all of us are already riding the data train to a world of revolutionary ideas, aren't we?

The reality is, that most of us are not, but why?

For many, especially those with data sets gathered over many years, it's hard: hard to package our data in such a way that we can easily present it to analytics engines and get something useful from it.

But don't let that stop you. There is potentially huge advantage to be had from using our data effectively; all we need is a little help to get there.

So what kind of steps can we take so we too can grab our ticket and board the data train?

Understand our data

The first step may seem obvious: understand our data. We need to know where it is, what it is, and whether it is still relevant.

Without knowing these basics, it is going to be almost impossible to identify and package up the “useful” data.

The reality of data analytics is that we can't just throw everything at it. Remember the old adage garbage in, garbage out? It hasn't changed; if we feed our data analytics elephant a lot of rubbish, we aren't going to like what comes out the other end!

Triage that data

Once we've identified our data, we need to make sure we don't feed our analytics engine a load of nonsense. It's important to triage: throw out the stuff that no one ever looks at, the endless replication, the stuff of no business value. We all store rubbish in our data sets, things that shouldn't be there in the first place, so weed it out. Otherwise, at best we are going to process irrelevant information; at worst we are going to skew the answers and make them worthless.
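As a rough sketch of what that triage might look like in practice, here's a small Python example that walks a folder, drops exact duplicates by content hash and skips files nobody has touched in years. The five-year threshold is illustrative, not a recommendation.

```python
import hashlib
import os
import time

def triage(root, max_age_days=5 * 365):
    """Return candidate files for analytics: skip exact duplicates
    (by content hash) and files untouched for years."""
    cutoff = time.time() - max_age_days * 86400
    seen, keep = set(), []
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                continue  # stale: nobody has looked at it in years
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                continue  # endless replication: same bytes already kept
            seen.add(digest)
            keep.append(path)
    return keep
```

Even a crude pass like this shrinks the pile the analytics engine has to chew through, and stops identical copies skewing the results.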

Make it usable

This is perhaps the biggest challenge of all: how do we make our massive on-site datasets useful to an analytics engine?

Well, we could deploy an on-prem analytics suite, but for most of us this is unfeasible, and the reality is, why bother? Amazon, Microsoft, Google and IBM, to name but a few, have fantastic analytics services ready and waiting for your data. The trick is how to get it there.

The problem with data is that it has weight, gravity; it's the thing in a cloud-led world that is still difficult to move around. It's not only its size that makes it tricky: there is also our need to maintain control, meet security requirements and maintain compliance, all things that can make moving our data into cloud analytics engines difficult.

This is where building an appropriate data strategy is important. We need a way to ensure our data is in the right place, at the right time, while maintaining control, security and compliance.

When looking to build a strategy that allows us to take advantage of cloud analytics tools, we have two basic options:

Take our data to the cloud

Taking our data to the cloud is more than just moving it there; it can't be a one-off copy. Ideally, in this kind of setup, we need to move our data in, keep it synchronised with changing on-prem data stores, and then move the analysed data back when we are finished, all with the minimum of intervention.
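The "keep it synchronised" part of that flow boils down to copying only what has changed since the last run. Here's a minimal Python sketch of the idea, using a local destination folder to stand in for a cloud data store; real sync services add deltas, retries, encryption and scheduling on top, so treat this purely as an illustration of the pattern.

```python
import os
import shutil

def sync_changed(src, dst, state):
    """One-way incremental sync: copy only files whose mtime has moved
    on since the last run. `state` maps relative path -> last-seen mtime."""
    copied = []
    for dirpath, _, files in os.walk(src):
        for name in files:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, src)
            mtime = os.path.getmtime(path)
            if state.get(rel) == mtime:
                continue  # unchanged since last sync, skip it
            target = os.path.join(dst, rel)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            shutil.copy2(path, target)  # copy2 preserves timestamps
            state[rel] = mtime
            copied.append(rel)
    return copied
```

Run on a schedule, only the changed files travel, which is the "minimum of intervention" the setup above is after.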

Bring the cloud to our data

Using cloud data services doesn't have to mean moving our data to the cloud; we can bring the cloud to our data. Services like ExpressRoute into Azure or Direct Connect into AWS mean that we can get all the bandwidth we need between our data and cloud analytics services, while our data stays exactly where we want it: in our datacentre, under our control, and without the heavy lifting required to move it into a public cloud data store.

Maybe it's even a mix of the two, depending on the requirement, size and type of dataset. What's important is that we have a strategy, one that gives us the flexibility to do either.

All aboard

Once we have our strategy in place and the technology to enable it, we are good to go. Well, almost: finding the right analytics tools, and of course deciding what to do with the results when we have them, are all part of the solution, but having our data ready is a good start.

That journey has to start somewhere, so first get to know your data, understand what's important, and find a way to ensure you can present it to the right tools for the job.

Once you have that, step aboard and take your journey on the data train.

If you want to know more on this subject and are in or around Liverpool on July 5th, why not join me and a team of industry experts as we discuss getting the very best from your data assets at our North West Data Forum.

And for more information on getting your data ready to move to the cloud, check out a recent podcast episode I did with Cloud Architect Kirk Ryan of NetApp as we discuss the whys and hows of ensuring our data is cloud ready.

New fangled magic cloud buckets – Kirk Ryan – Ep32

Tech me up – Your tech entertainment for this weekend – 16th June

It's the summer, the sun is out, it's the weekend. What's the only thing missing? Some great technical content to keep you company while you sip a cold one.

This week's list of top tech has a data security slant to it, so not only will it keep you informed, it will help ensure you can keep those data assets secure.

Settle back and enjoy.

Podcasts

For those who like to listen to their tech, here’s a list of great podcasts I caught this week.

Arrow Bandwidth

An excellent and somewhat unique chat with Marcus Hutchins, better known as MalwareTech, the information security engineer who discovered the "kill switch" for the WannaCrypt ransomware outbreak.

They discuss how he worked out the inner workings of the malware, how he came across the kill switch, and how he's dealing with the "fame" that came with it.

A great listen.

InfoSec Podcast

This is a relatively new podcast to my list, but an excellent weekly discussion on the latest news from the information security industry.

This week they focus on the latest release of the SANS Security Awareness Report, which identified communication as one of the primary reasons why awareness programs thrive or fail.

The team also look at the difficulties of legislating for cyber security in a quick moving technology world.

Infosec Podcast


Datanauts

More security, this time from the Datanauts team, who are joined by James Holland and Aaron Miller from Palo Alto Networks to discuss the evolution of security architectures and approaches, the importance of application awareness, and the impact of virtualization, which can both create new risks and provide new opportunities.

They also look at where security is going, how cloud and virtualization will continue to shape your security infrastructure, and how skill sets will have to adapt to support more automation.

Datanauts Podcast

NetApp Tech ONTAP

A bumper week for podcasts and if you want something not security related, how about some coding?

I really enjoyed this episode of the NetApp Tech ONTAP podcast. It's not really NetApp focussed at all, but a great chat with Ashley McNamara of Pivotal discussing how storage administrators (and pretty much anyone) should be learning to code. Ashley also points to resources to help aspiring developers and scripters be successful.

Great fun.. have a listen.

Tech Interviews

This week was a VMware special, as I was joined by the hosts of the excellent VMware vSPEAKING Podcast, Pete Flecha and John Nicholson. They make great guests as we discuss how people's changing demands on technology are changing how we have to design and architect our infrastructure.

We also look at how our infrastructure not only needs to be faster and simpler, but also needs to be smarter and how our application and data centric world is driving demands for availability.

John also introduces us to the concept of giving jetpacks to cavemen!

Great fun with the guys, have a listen.

Articles

If you'd rather settle back in the garden and read your tech, then try out these security-focussed articles:

Data Privacy Monitor – Deeper Dive Security is a big deal for big data

We are all keen to take advantage of data analytics so we can get more value from our data assets, but how many of us consider the range of security challenges that come with consuming those public big data services?

In this article Lavonne Hopkins looks at a range of issues to consider and provides some solid advice.

https://www.dataprivacymonitor.com/big-data-2/deeper-dive-security-is-a-big-deal-for-big-data/

Compare The Cloud – Refashioning data security with a nod to the cloud

The thing with the Internet is that once it's out there, it stays out there, but on the plus side you can find little gems of articles that you may not have read at the time.

This was one of those, posted back in Jan 2016 by the team at Compare the Cloud.

In this article they look at the challenge that CISOs have: pulled in many directions by their businesses, while having to deal with all of the evolving security threats.

It’s an interesting read looking at the kind of approaches that a CISO can look at to help take on the multiple challenges they are faced with.

https://www.comparethecloud.net/articles/refashioning-data-security-with-a-nod-to-cloud/

Forbes – Why manufacturers should be mindful of cybersecurity

This article discusses how cyber attackers target the manufacturing industry. What caught my attention was how small, imperceptible changes can eventually have a huge impact.

This approach is not just a threat to manufacturing; it is also a threat to security built around analytics and AI. One of the approaches attackers take to overcome machine learning based security is to use small changes that machine learning algorithms don't notice and, over time, start to accept as normal behaviour.
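To see how those small changes can fool a learning system, here's a toy Python example: an anomaly detector that flags big deviations but learns from every reading it accepts. A sudden jump is caught, yet a long run of small steps quietly walks the baseline somewhere new. This is a deliberately simplistic model for illustration, not any real product's algorithm.

```python
class RollingBaselineDetector:
    """Toy anomaly detector: flags a reading that deviates from the mean
    of recently accepted readings by more than `tolerance`, and (the flaw
    the article hints at) learns from every reading it accepts."""

    def __init__(self, baseline, tolerance, window=20):
        self.readings = [baseline] * window
        self.tolerance = tolerance
        self.window = window

    def observe(self, value):
        mean = sum(self.readings) / len(self.readings)
        if abs(value - mean) > self.tolerance:
            return "ALERT"          # rejected: baseline is not updated
        self.readings = (self.readings + [value])[-self.window:]
        return "ok"

det = RollingBaselineDetector(baseline=100.0, tolerance=5.0)
print(det.observe(120.0))           # a sudden jump is caught: ALERT

# ...but many imperceptible steps walk the baseline upwards unnoticed
value = 100.0
for _ in range(100):
    value += 0.4                    # each step stays within tolerance
    assert det.observe(value) == "ok"
print(det.observe(138.0))           # 138 now looks normal: ok
```

The detector never fires during the slow drift, because each step sits inside the tolerance of a baseline it moved itself; by the end, a reading that would once have screamed "attack" passes as business as usual.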

This provides an interesting read and highlights the complexity of the security threat.

https://www.forbes.com/sites/forbestechcouncil/2017/06/01/why-manufacturers-should-be-mindful-of-cybersecurity/#11b188b810d2

Hopefully something there for everyone, enjoy your weekend.

Tech me for the weekend – June 9th

Well well, it's the weekend again, and I know you all find it tough to drag yourselves away from the world of tech. Worry not, here's this week's list of top tech entertainment to keep you teched up this weekend.

This week there is a bit of a data theme… so dive in, there are podcasts and articles aplenty…

The Reading

Why Data will drive your success in the cloud – Matt Watts

Matt works closely with the office of the CTO at NetApp and wrote this interesting piece on their site about the changing way we all see data’s value and what that means for the way we build our data strategy.

Inspired by a recent article in The Economist noting that in today's economy "the world's most valuable resource is no longer oil, but data". I know we've all heard it, but Matt explores the topic and what it means.

Why data will drive your success in the cloud

Setting sail for uncharted waters – Ruairi McBride

Some big announcements from storage behemoth NetApp this week which included their entry into the world of Hyperconverged Infrastructure (HCI).

Now, there's no question that NetApp are late to this particular party, but Ruairi looks at why, the potential benefits NetApp gain from having sat back and watched the market for a little while, and what their solution brings that may make them stand out.

If you want to know more about the NetApp HCI offering, this is a great place to start.

Setting sail for uncharted waters

One year until the EU GDPR

This complex regulation will finally be enforced in just under a year, and personally I don't think you can ever read up too much on what it may mean for us all and the way we collect, store, manage, protect and dispose of our data assets.

A good read here from CITY A.M., which also includes some quotes from my favourite data privacy attorney, Sheila Fitzpatrick.

A good piece, well worth a read.

One year until the EU GDPR

Big wheels keep on turning

Back to NetApp: I wrote a piece myself this week looking at NetApp's continued evolution from storage company to data management company, how that is progressing, why it is important, and how they are embracing this increasingly data driven world.

Big Wheels Keep On Turning

The Listening

Arrow Bandwidth

This is an episode from a couple of weeks ago, but it fits in nicely with this week's data theme, as Rich and Dave are joined by Vince Payne to discuss the data industry and the impact of business intelligence and analytics.

Rich plays the role of devil's advocate perfectly as the team debate whether BI and analytics really is a "thing" and whether people are actually using these technologies.

An excellent and thoughtful debate as they look at whether data really is the new gold!


Gestalt IT – The ON-Premise IT round table

Another excellent debate, asking what the reality is for the future of data: are AI, machine learning and data analytics really going to change the world and do something interesting?

And does the future really include clever machines replacing people? Or will us poor humans always have a place?

More excellent devil's advocacy from Nigel Poulton.

What is big data?

Speaking In Tech

Rounding off the data chat, a brilliant guest joins the Speaking In Tech team, as Michel Feaster of Usermind discusses how data analytics and intelligence can have a massively positive impact on our customer experiences.

Some great practical examples of how using intelligence alongside traditional systems can revolutionise the kind of results we get, give it a listen.

Speaking in Tech: Blame millennials for customer engagement upheaval

Tech Interviews

Of course, you don't get this far without a plug for my own show. This week is the second part of a look at the data availability market.

My guests this week discuss how to gain more value from data backups, how to ensure that our focus on application availability doesn't do more harm than good, and whether availability is elevated to the right level of importance.

Three great guests: Mike Beevor of Pivot3, DataGravity's Dave Stevens and Andrew Smith of IDC.

Enjoy.

Oh, and if you missed part one, where I chat with Justin Warren and Jeff Leeds, fear not… it's here…

So enough to keep you busy..

Have a great weekend.


Big Wheels Keep On Turning

Just about a year ago I wrote a piece about NetApp and how they were making a strategic shift (Turning a big storage ship), changing their focus as well as the perception of both the industry and customers. This coincided with the launch of the latest version of the company's best-selling storage operating system, ONTAP version 9.

A year on, after spending a few days with NetApp's leadership as part of our annual NetApp A-Team get together, I thought it would be good to check in on how that big storage ship was doing and whether it was still turning in the right direction.

First, some context. Data is increasingly the lifeblood of our organisations; it's among the top two or three assets any business holds, and we are constantly seeing organisations use data in ever more creative ways, while of course we continue to create more of it and keep it for longer.

Not only do we need more from our data, the way we consume data services is changing. The big public cloud providers are giving us analytics services on demand, allowing us to solve more and more complex problems, as long as we can get our data to their cloud offerings in the first place. That means more data being housed in the cloud, which is great for analytics but isn't always a great fit for our data sets.

In that context, how does a big storage vendor remain relevant?

In my opinion they have to embrace the changing attitude to data; just wanting to store it isn't enough. To quote a friend of mine, "storing is boring", and in reality it kind of is: if your only view of data strategy is storing it neatly, you are missing a trick.

So the question is, are NetApp embracing this new data driven world?

Shift to a data management company

This is something I've been hearing over the last six months, and I fully expect it to be front and centre of a lot of NetApp messaging as they move from storage company to data management company. This focus is absolutely right. In my own company we have done the same thing, because it's what our customers demand: it's not about building infrastructure and storing data, it's about taking a valuable asset and getting the most out of it.

There is no point just talking to a modern organisation about how much storage you can provide and how fast it is; organisations want to know "how can you make sure my data asset remains an asset?"

Data Fabric

For those not familiar with NetApp's Data Fabric, it is a critical part of their vision as they make the shift to a data management company. A data fabric is NetApp's view of how we build a data infrastructure that allows us to get the best from our data, giving us flexibility in how and where we store it and how we move it, while maintaining security and compliance, all crucial in a modern data strategy.

But this goes beyond a strategic goal; it is baked into all of NetApp's thinking. The idea that you can move data across any NetApp platform, regardless of whether it's hardware, white box, a virtual machine or even sat in AWS or Azure, is very powerful. It also isn't limited to ONTAP, allowing us to move data between ONTAP, SolidFire, E-Series, AltaVault and even non-NetApp platforms via FlexArray.

Ultimately, will the data fabric be stretched beyond the NetApp portfolio? Who knows… it would be great if it did, but there's a lot of work to be done.

Embracing the new world

Part of the new way of working with data includes the cloud; there is no getting away from this reality. Whether it's consuming SaaS like Office365 and Salesforce, holding our data long term in S3 or Azure Blob stores, or needing to present our data to analytics tools, organisations are moving more data to the cloud.

What part does an on-prem storage vendor play in this? It has to be twofold:

Help me to move data to the cloud

Because they supply on-prem storage arrays, NetApp can't ignore the reality that their customers want to move data to the cloud. To NetApp's credit, they are embracing this challenge and helping to enable this movement.

The data fabric strategy and ONTAP are a key part of this. The ability to take NetApp's core storage OS and deploy it directly from either AWS or Azure means that not only can you move your data from your on-prem array straight into a public cloud, but, because it's the same operating system end to end, you can crucially maintain all of the on-premises efficiencies, management and controls on your data in the public cloud, and this is a real positive.

It's not only the moving of data to the cloud that NetApp have turned their focus to, however; they are also looking at ways that cloud based services can play a part in their future, which is also interesting. This has started with two services, Cloud Sync and Cloud Control.

Cloud Sync assists users in automating the process of moving data from on-premises NFS datastores straight into Amazon S3 storage and back again.

Cloud Control, meanwhile, allows organisations to protect their Office365 data by backing it up and holding it in an alternate location.

The important thing to note with these two services is that they are exactly that: services. No traditional NetApp tools are needed as part of the solution; you subscribe to the services and begin to use them.

If anything proves NetApp’s position on embracing the new world, it is this.

Big Wheels Still Turning?

With a range of new announcements due soon, including the much anticipated NetApp HCI platform, the storage behemoth, in my opinion, continues to evolve. Its focus is right and certainly aligns with the challenges that the organisations I deal with talk about.

It continues to do smart things within its core product set, adding tools that enable the wider data fabric strategy and working them directly into the portfolio, especially the product at the heart of it, ONTAP.

Personally, I continue to be very enthused by what NetApp are doing and the direction they are taking. For me, those big wheels are not only turning, they are turning in exactly the right direction.

Let’s see if they can keep it up.

Tech Me For the Weekend – 26th May 2017

I was busy living it up in New Orleans at VeeamON last week, so didn't get a chance to give you the weekly round-up of top tech titbits. But with a UK holiday weekend coming up, I didn't want to leave you without some top content, so here goes: enjoy the long weekend with this top tech…

Podcasts

The On-Premise IT roundtable (and yes On-Premise on purpose!)

I'm new to this show from the team behind Tech Field Day, so I'm a little behind in episodes, but this first one was a cracker: Is DevOps a Disaster? A roundtable discussion picking through the DevOps minefield and seeing if there is really anything to this DevOps thing.

Is DevOps a Disaster? The On-Premise IT Roundtable 2

Virtually Speaking Podcast and Interview with Michael Dell

Always a big fan of this show as a great way to keep up with VMware tech, but a bit of a departure this week as Pete and John are on the road at DellEMC World, where they catch up with two key leaders from the business: Chad Sakac and none other than Michael Dell himself, talking about the business and where it is heading. Fascinating stuff.

Virtually Speaking Podcast

The Geek Whisperers

Always like to give this show a plug when there is a new episode, and this week they are joined by Emily Hendershot and Renee Woods to discuss the art of keynote presentations, a chat about the dos and don'ts if you fancy adding keynote speaker to your C.V. As always, a great listen.

The Geek Whisperers

Tech Interview

Don't forget my own little gem of a show, now with added theme tune! This week we come from VeeamON as I catch up with Rick Vanover to discuss the future of Veeam and news from last week's conference, including some of our favourite announcements and what we can expect to hear from them in the future. Give it a listen and let me know if you like the theme tune!

Remaining relevant in a changing world – Rick Vanover – Ep28

Articles

More of a reader than a listener? Worry not, these articles should keep you going.

Office 365 adoption pack

I don't normally go for product announcement stuff, but I made a bit of an exception this week for this Microsoft blog post on their new Power BI dashboard for Office365. Not so much for the dashboard itself, but because data visualisation is a really interesting area and a topic I'm keen to understand better, and this is an excellent example of its power.

Microsoft Office BLOG

Multi Cloud v Stacking

I thought this post from NetApp raised an interesting debating point. As many of us look at how we can take advantage of cloud services, this article asks a good question about whether you should consider a multi cloud strategy. Obviously it has a NetApp slant, but it's a very good question, well worth a read.

NetApp Article

VeeamON Wrap-Ups

As I mentioned earlier, I was away in New Orleans last week at the VeeamON conference. Lots of great announcements from the Veeam team and an awful lot to catch up on.

If you want a comprehensive list of the announcements, then look no further than Michael Cade and his daily wrap-up posts from the event, which should give you all the Veeam goodness you could want.

VeeamON2017 Shakedown Part 1

VeeamON 2017 Shakedown Part 2

VeeamON it’s a wrap

Of course, I couldn't forget myself, could I? My own take on the VeeamON event and where Veeam are heading as a company is right here:

VeeamON It’s a Wrap

Well hopefully all of that will keep you busy this long weekend, enjoy it, whatever you are doing.


VeeamON It’s a Wrap

Last week, as you may have spotted, I attended Veeam's technical conference, VeeamON. I blogged a couple of pieces while I was there (Veeaming On and On and On and Veeam On It – Day Two at Veeam ON), but thought it was time to give an overall take on the event.

Day Three

Day three's main focus was Veeam's relationship with Microsoft, especially the Microsoft cloud platforms. That focus is important in two ways. Firstly, as Veeam look to move the conversation to one of wider availability rather than just protection, support for the big public cloud players is going to be key.

Secondly, it's refreshing to see a vendor putting this kind of focus on the Microsoft cloud. Too many vendors focus only on AWS, and although there is no problem with that, it ignores the number of organisations, especially big Microsoft shops, that have Azure as a key part of their data fabric strategy, before we even begin to look at those who have Office365 as part of their software stack.

What was announced?

Veeam Disaster Recovery in Microsoft Azure combines Direct Restore with the new Veeam PN (Veeam Powered Network), providing the ability not just to recover your VMs but, importantly, to automate one of the trickiest parts of building cloud environments: networking. When building a DR solution, the amount of automation you build into it can make a big difference to the success or otherwise of your recovery strategy.

We also heard of extensions to Veeam Backup for Office365, with support for multi-tenancy, allowing organisations that have multiple Office365 deployments to protect those workloads with a single Veeam Backup platform.

This is additionally useful for those who deliver backup as a service using Veeam: the ability to use a single installation to back up multiple Office365 customers is going to make your service much more efficient.

We also heard about the addition of native support for object storage in Availability Suite v10, including Azure Blob storage. In my opinion, the use of object storage for long-term archive and retention is going to become the norm relatively quickly, so native support as part of your availability solution, removing the need for third-party gateways, is a real plus.

What did I think?

Events like this, for me, are about trying to get a handle on business direction. It's important that technology companies have a direction that recognises the changing needs of businesses, both now and in the future.

I made the investment of attending with one question in mind: "how are Veeam going to continue to be relevant in a changing world?"

As our relationship with, and requirements from, our data, applications and technology change, the idea that our data only lives on-prem in virtual machines is unrealistic. So, as a company whose traditional strength is protection of those types of workloads, how do you react when your customers move away from that strength, and how do you meet these changing needs?

Our digital lives

Today, technology is a constant in both our work and personal lives, be that Facebook or our own internal business apps. We increasingly rely on them, and our tolerance for their unavailability is pretty low. In a world where it's easy for us and our customers to move on to the next supplier, if our systems are unavailable it's not just our lack of patience that's a problem; it also presents a real risk of significant business impact.

In that context, Veeam have recognised that data protection alone is not the answer and that availability has to be the focus. Of course they are absolutely right; our businesses are hugely reliant on data, but it's not just the existence of it, it's our ability to access it, use it and get value from it that's important, and we can only do that if our data and its supporting applications and services are available.

Veeam’s data fabric

I've written about data fabric before, normally in the context of NetApp, and found it interesting to see Veeam using the same language. But again, they are right: we can no longer design a data infrastructure that includes silos. The places where our data lives cannot be disjointed from our wider infrastructure; the data needs to be flexible and mobile, and it's key that our data and supporting services are where we need them, when we need them.

Veeam's focus on easily moving data around, from our physical servers to virtual to cloud, was clear, and supported by announcements like native public object storage, Office365 backup and protection using both Azure and AWS. The ability to make both our data and services quickly available in all of those areas, as well as to move between them, is quite compelling.

Broadening the conversation

This strategic shift from Veeam is not just technically useful. If you are Veeam, a Veeam partner, using Veeam or considering it, it encourages you to take a wider view of your data protection strategy; it stops us focussing on "backing up stuff" and pushes us to do the thing we really need to do: focus on the availability of our systems.

I think we still see many people focussing just on data protection, and although that is still important, it does sometimes mean we are blinkered, not considering the wider services needed to support our data and allow our businesses to be quickly operational again in the event of a service interruption.

Staying relevant?

Personally, I think Veeam's messaging was exactly where I'd hoped it would be: recognising the changing world, and talking about problems I recognise and see our customers experiencing and looking to deal with.

It's also good to see them not only embracing trends such as cloud and object storage, but also recognising gaps; adding agents to allow more comprehensive physical server protection, for example, is important as Veeam aim to deliver services to larger enterprises.

Of course, the trick with all of this is not the messaging; it will be in the execution.

For now, Veeam are still delivering a product that their customers love and that "just works", and if they can do the same in all of these wider areas, then Veeam will be relevant for a long time to come.

Keep on Veeaming ON!

If you want some more thoughts from Veeam, why not catch up on my Tech Interviews podcast, where I spoke with Veeam's Director of Technical Product Marketing & Evangelism, Rick Vanover, as we discussed future strategy, some of the announcements from the event, and what more we can expect from Veeam in the future. Give it a listen.