All Aboard the Data Train

The other night, Mrs Techstringy and I were discussing a work challenge. She works for a well-known charity, and one of her roles is to book locations for fundraising activities. On this occasion the team were looking at booking places at railway stations and considering a number of locations, but all they really had to go on was a “gut feeling”.

As we discussed it, we did a bit of searching and came across this website http://www.orr.gov.uk/statistics/published-stats/station-usage-estimates, which contains footfall information for every UK railway station over the last 20 years. This information was not only train geek heaven, it also allowed us to use the available data to make a more informed choice and to introduce possibilities that otherwise would not have been considered.
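If you're curious, that first look at the data can be as simple as the minimal sketch below, in Python with pandas, assuming you've exported the estimates to a CSV. The file name and column names here are hypothetical; the actual ORR download is laid out differently.

```python
# Minimal sketch: ranking candidate stations by annual footfall.
# "station_usage.csv" and its columns are hypothetical; the real ORR
# spreadsheet uses different headings.
import pandas as pd

usage = pd.read_csv("station_usage.csv")  # columns: station, entries_exits

candidates = ["Liverpool Lime Street", "Manchester Piccadilly", "Preston"]
shortlist = usage[usage["station"].isin(candidates)]

# Highest footfall first: a more informed basis than gut feeling.
print(shortlist.sort_values("entries_exits", ascending=False))
```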

This little family exercise was an interesting reminder of the power of data and how, with the right analysis, we can make better decisions.

Using data to make better decisions is hardly news. With the ever-increasing amounts of data we collect and greater access to powerful analytics, machine learning and AI engines, all of us are already riding the data train to a world of revolutionary ideas, aren’t we?

The reality is that most of us are not. But why?

For many of us, especially with data sets gathered over many years, it’s hard: hard to package our data in such a way that we can easily present it to analytics engines and get something useful back.

But don’t let that stop you; there is potentially huge advantage to be had from using our data effectively, and all we need is a little help to get there.

So what kind of steps can we take so we too can grab our ticket and board the data train?

Understand our data

The first step may seem obvious: understand our data. We need to know: where is it? What is it? Is it still relevant?

Without knowing these basics, it is going to be almost impossible to identify and package up the “useful” data.
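As a minimal sketch of what answering those questions might look like in practice, here's a hypothetical Python script that walks a file store and records where everything is, what it is and when it was last touched. The root path and output file are assumptions for illustration.

```python
# Minimal sketch: a simple inventory answering "where is it?
# what is it? is it still relevant?". The root path is hypothetical.
import csv
import time
from pathlib import Path

ROOT = Path("/mnt/shared")  # hypothetical on-prem data store

with open("inventory.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["path", "type", "size_bytes", "days_since_modified"])
    for path in ROOT.rglob("*"):
        if path.is_file():
            stat = path.stat()
            age_days = (time.time() - stat.st_mtime) / 86400
            writer.writerow([path, path.suffix or "none", stat.st_size, round(age_days)])
```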

The reality of data analytics is that we can’t just throw everything at it. Remember the old adage, garbage in, garbage out? It hasn’t changed; if we feed our data analytics elephant a lot of rubbish, we aren’t going to like what comes out the other end!

Triage that data

Once we’ve identified our data, we need to make sure we don’t feed our analytics engine a load of nonsense. It’s important to triage: throw out the stuff that no one ever looks at, the endless replication, the stuff of no business value. We all store rubbish in our data sets, things that shouldn’t be there in the first place, so weed it out. Otherwise, at best we are going to process irrelevant information; at worst we are going to skew the answers and make them worthless.
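To make that concrete, here's a small, hypothetical pandas sketch of the kind of triage rules you might run over an inventory like the one above. The thresholds and column names are assumptions, and every organisation's definition of rubbish will differ.

```python
# Minimal sketch: triaging the inventory before any analytics run.
# Thresholds and column names are hypothetical.
import pandas as pd

inventory = pd.read_csv("inventory.csv")

# Throw out the endless replication: same file name and size is a
# cheap first pass at spotting copies.
inventory["name"] = inventory["path"].str.rsplit("/", n=1).str[-1]
deduped = inventory.drop_duplicates(subset=["name", "size_bytes"])

# Throw out the stuff no one ever looks at: untouched for 5+ years.
relevant = deduped[deduped["days_since_modified"] < 5 * 365]

print(f"kept {len(relevant)} of {len(inventory)} items")
relevant.to_csv("triaged_inventory.csv", index=False)
```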

Make it usable

This is perhaps the biggest challenge of all: how do we make our massive on-site datasets useful to an analytics engine?

Well, we could deploy an on-prem analytics suite, but for most of us this is unfeasible, and the reality is, why bother? Amazon, Microsoft, Google and IBM, to name but a few, have fantastic analytics services ready and waiting for your data; the trick is how to get it there.

The problem with data is that it has weight, gravity; it’s the thing in a cloud-led world that is still difficult to move around. It’s not only its size that makes it tricky, there is also our need to maintain control, meet security requirements and maintain compliance, and these things can make moving our data into cloud analytics engines difficult.

This is where building an appropriate data strategy is important. We need a way to ensure our data is in the right place, at the right time, while maintaining control, security and compliance.

When looking to build a strategy that allows us to take advantage of cloud analytics tools, we have two basic options:

Take our data to the cloud

Taking our data to the cloud is more than just moving it there; it can’t just be a one-off copy. Ideally, in this kind of setup, we need to move our data in, keep it synchronised with our changing on-prem data stores, and then move our analysed data back when we are finished, all with the minimum of intervention.
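As a minimal sketch of that kind of workflow, here's a hypothetical Python example using boto3 that pushes new or changed files from an on-prem directory into an S3 bucket. The bucket and paths are assumptions, and a real pipeline would also handle deletions, run on a schedule, and make the return trip for results.

```python
# Minimal sketch: one-way sync of an on-prem directory into S3.
# Assumes AWS credentials are configured; bucket and paths are hypothetical.
from pathlib import Path

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "example-analytics-staging"  # hypothetical bucket
LOCAL_ROOT = Path("/data/onprem/exports")  # hypothetical on-prem store

def needs_upload(local: Path, key: str) -> bool:
    """Upload if the object is missing or older than the local file."""
    try:
        head = s3.head_object(Bucket=BUCKET, Key=key)
    except ClientError:
        return True  # not in the bucket yet
    return local.stat().st_mtime > head["LastModified"].timestamp()

for path in LOCAL_ROOT.rglob("*"):
    if path.is_file():
        key = str(path.relative_to(LOCAL_ROOT))
        if needs_upload(path, key):
            s3.upload_file(str(path), BUCKET, key)
            print(f"synced {key}")
```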

Bring the cloud to our data

Using cloud data services doesn’t have to mean moving our data to the cloud; we can bring the cloud to our data. Services like ExpressRoute into Azure or Direct Connect into AWS mean that we can get all the bandwidth we need between our data and cloud analytics services, while our data stays exactly where we want it: in our datacentre, under our control, and without the heavy lifting required to move it into a public cloud data store.

Maybe it’s even a mix of the two, depending on requirements and on the size and type of dataset. What’s important is that we have a strategy, one that gives us the flexibility to do either.

All aboard

Once we have our strategy in place and the technology to enable it, we are good to go. Well, almost: finding the right analytics tools, and of course deciding what to do with the results when we have them, are all part of the solution, but having our data ready is a good start.

That journey does have to start somewhere, so first get to know your data, understand what’s important, and find a way to ensure you can present it to the right tools for the job.

Once you have that, step aboard and take your journey on the data train.

If you want to know more on this subject and are in or around Liverpool on July 5th, why not join me and a team of industry experts as we discuss getting the very best from your data assets at our North West Data Forum.

And for more information on getting your data ready to move to the cloud, check out a recent podcast episode I recorded with Cloud Architect Kirk Ryan of NetApp, as we discuss the whys and hows of ensuring our data is cloud ready.

New fangled magic cloud buckets – Kirk Ryan – Ep32

Analysing the availability market – part two – Dave Stevens, Mike Beevor, Andrew Smith – Ep30

Last week I spoke with Justin Warren and Jeff Leeds at the recent VeeamON event about the wider data availability market. We discussed how system availability is more critical than ever, and how, or maybe even whether, our approaches are changing to reflect that. You can find that episode here: Analysing the data availability market – part one – Justin Warren & Jeff Leeds – Ep29.

In part two I’m joined by three more guests from the event as we continue our discussion. This week we look at how our data availability strategy is not, and cannot, just be a discussion for the technical department; it must be elevated into our overall business strategy.

We also look at how technology trends are affecting our views of backup, recovery and availability.

First, I’m joined by Dave Stevens of Data Gravity, as we look at how our backup data can be a source of valuable information, as well as a crucial part of helping us be more secure and compliant with ever more stringent data governance rules.

We also look at how Data Gravity, in partnership with Veeam, have developed the ability to trigger smart backup and recovery; Dave gives a great example of how a smart restore can be used to recover quickly from a ransomware attack.

You can find Dave on Twitter @psustevens and find out more about Data Gravity on their website www.datagravity.com

Next I chat with Mike Beevor of HCI vendor Pivot3 about how simplifying our approach to system availability can be a huge benefit. Mike also makes a great point about how, although focussing on application and data availability is right, we must consider the impact on our wider infrastructure, because if we don’t we run the risk of doing more “harm than good”.

You can find Mike on twitter @MikeBeevor and more about Pivot3 over at www.pivot3.com

Last but by no means least, I speak with Andrew Smith, Senior Research Analyst at IDC. We chat about availability as part of the wider storage market and how, over time, as vendors gain feature parity, their goal has to become adding additional value, particularly in areas such as security and analytics.

We also discuss how availability has to move beyond the job of the storage admin and become associated with business outcomes. Finally, we look a little into the future, at how a “multi cloud” approach is a key focus for business and how enabling it will become a major topic in our technology strategy conversations.

You can find Andrew’s details over on IDC’s website.

Over these two shows it has become clear, to me, that our views on backup and recovery are changing. The shift toward application and data availability is an important one, and as businesses we have to ensure that we elevate the value of backup, recovery and availability in our companies, making it an important part of our wider business conversations.

I hope you enjoyed this review. Next week brings the last interview from VeeamON, as we go all VMware and I catch up with the hosts of VMware’s excellent Virtually Speaking Podcast, Pete Flecha and John Nicholson.

As always, if you want to make sure you catch our VMware bonanza, then subscribe to the show in the usual ways.

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

Analysing the data availability market – part one – Justin Warren & Jeff Leeds – Ep29

Now honestly, this episode has not gone out today sponsored by British Airways, nor is it in any way taking advantage of the situation that affected thousands of BA customers over the weekend; the timing is purely coincidental.

However, those incidents have made this episode quite timely, as they again highlight just how crucial technology is to our day-to-day activities as individuals and businesses.

As technology continues to be integral to pretty much everything we do, the recent events at BA and the disruption caused by WannaCrypt are examples of what happens when our technology is unavailable: huge disruption, reputational damage and financial impact, as well as the stress it brings to the lives of both those trying to deal with the outage and those on the receiving end of it.

Last week I spoke with Veeam’s Rick Vanover (Remaining relevant in a changing world – Rick Vanover – Ep28) about how they were working to change their customers’ focus from backup and recovery to availability, ensuring that systems and applications were protected and available, not just the data they contained.

As part of my time at the recent VeeamON event, I also took the opportunity to chat with the wider IT community in attendance: not just those charged with delivering availability and data protection, but also those who look at the industry through a broader lens, trying to understand not just how vendors view availability, but also the general data market trends and whether businesses and end users are shifting their attitudes in reaction to them.

So over the next couple of weeks, I’ll share a collection of those chats to give you a wider view of the availability market, how analysts see it, and how building a stack of technologies can play a big part in ensuring that your data is available, secure and compliant.

First up, I speak with Justin Warren and Jeff Leeds.

Justin is a well-known industry analyst and consultant, as well as the host of the fantastic Eigencast podcast (if you don’t already listen, you should try it out). Justin is often outspoken, but always provides fascinating insight into the wider industry, and here he shares some thoughts on how the industry is maturing, how vendors and technology are changing, and how organisations are changing, or perhaps not changing, to meet new availability needs.

You can follow Justin on twitter @jpwarren and do check out the fantastic Eigencast podcast.

Jeff Leeds was part of a big NetApp presence at the event, and I was intrigued as to why a storage vendor famed for its own robust suite of data protection and availability technologies should be such a supporter of a potential “competitor”.

However, Jeff shared how partnerships and complementary technologies are critical in building an appropriate data strategy, helping us all ensure our businesses remain on.

You can follow Jeff on twitter at @HMBcentral and find out more about NetApp’s own solutions over at www.netapp.com

I hope you enjoyed the slightly different format; next week we’ll dig more into this subject as I speak with Andrew Smith from IDC and technology vendors Pivot3 and Data Gravity.

To catch it, please subscribe in all the normal homes of podcasts. Thanks for listening.

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

Remaining relevant in a changing world – Rick Vanover – Ep28

One of the biggest challenges we face in technology is constant change. Change is not bad of course, but it presents challenges, from upgrading operating systems and applications, to integrating the latest technology advancements, to responding to new business problems and opportunities.

But it is not only those implementing and managing technology who are affected.

Technology vendors are equally affected; the IT industry is full of stories of companies who had great technologies but were then blindsided by a shift in the needs of their customer base, or a technology trend they failed to anticipate.

It was with this in mind that I visited Veeam’s VeeamON conference.

Veeam are a technology success story: a vendor who arrived in an already established data protection market and shifted how people looked at it. They recognised the impact virtualisation was having on how organisations of all types were deploying their infrastructures, and how traditional protection technologies were failing to evolve to meet these new needs.

Veeam changed this, and that is reflected in their tremendous success over the last nine years; today they are a $600M+ company with hundreds of thousands of customers. But the world is now changing for them too: as we move more workloads to the cloud, as we want more value from our data, as security starts to impact every technology design decision, and of course as we all live ever more digitally focussed lives, our needs from our systems are changing hugely.

How are Veeam going to react to that? What are they going to do to continue the success they’ve had and to remain relevant in the new world that much of their market is shifting into?

For this week’s podcast, I look at that very question and discuss Veeam’s future with Rick Vanover, Director of Technical Product Marketing & Evangelism. Rick is a well-known industry face and voice, and we had an excellent conversation looking at Veeam’s future aims.

We discuss their repositioning as an availability company, look at how Veeam are developing a range of solutions to give them an availability platform, and consider how this platform will allow their customers to build a strategy that not only protects their critical data assets across a range of different data repositories, but also allows them to move their data seamlessly between them.

We also take a look at some of the big announcements from the show and pick out our top new features.

In my opinion, Veeam’s strategic vision is a good one; the ability to provide organisations with the data protection they need regardless of data location, and to move data between those locations, is important. But, as ever, remaining relevant will be dictated by their ability to execute that vision.

Hope you enjoy the show.

To find more about Veeam you can of course check out their website www.veeam.com and engage with them on twitter @veeam and if you want to catch up with Rick, he can also be found on twitter @RickVanover.

Over the next couple of weeks we will be looking more at availability and protection, as we talk with the wider technology community as well as industry analysts on how they see the evolving data market.

To catch those shows, subscribe in all the normal ways.

Oh and I hope you like the new theme tune!

Thanks for listening.

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

Veeam On It – Day Two at Veeam ON

Day two of VeeamON in the can, and a big day for their core product, the Veeam Availability Suite, with the announcement of version 10, delivering some key new functionality. There were also some smart additions to the wider Veeam platform family, but more on those at a later date.

Let’s start with Availability Suite V10, still very much at the core of what Veeam are delivering:

Physical Servers and NAS

While Veeam introduced the ability to back up physical servers with their free endpoint protection tool, V10 sees that capability more tightly integrated into the suite. This, together with the addition of agents for both Windows and Linux, strengthens their capabilities in the wider enterprise, allowing Veeam to truly move beyond just virtual machine workloads.

NAS support is also a very welcome addition, allowing direct interaction with data housed on those enterprise storage repositories holding TBs of unstructured data. Previously, in a Veeam world, the only way to protect that data was if it resided on a Windows file server, and for many of us that’s just not the case.

Although these are great additions, I don’t think I’m being overly harsh in suggesting they are “table stakes”: fleshing out the suite to capture as many potential data sources as possible and really bringing it in line with most of the enterprise data protection market.

But the announcements did more than just fill gaps; they recognised critical business challenges and embraced key developments in how we store our data.

Continuous Data Protection

Some workloads in a business are a real challenge to protect; their availability is so critical that they have the most stringent recovery point and recovery time objectives, tolerating close to zero outage and data loss.

Often this is dealt with by the application design itself, taking advantage of clustering and multiple copies of data across the business (think SQL Server Always On and Exchange DAGs, for example), but what if your application doesn’t allow that? How do you protect that equally critical asset?

CDP is the answer. Currently limited to virtual machines hosted in a VMware environment (because it exploits specific VMware technologies), CDP provides continuous protection of that key workload; in the event of a critical failure, not only can Veeam quickly make the workload available again, but data loss will be only a matter of seconds, allowing us to meet the most stringent of service levels for those critical applications.

Object and Archives

My personal favourite announcement is the addition of native object storage support in V10. Object storage is becoming the de facto standard for storing very large datasets that need long-term retention; it is the basis of storage for public hyperscale providers such as Microsoft and Amazon.

The addition of native support, alongside new backup archiving capabilities, really starts to introduce the possibility of a backup fabric: from on-prem production, to backup repository, off to cloud for cheap-and-deep long-term retention.

Delivering that without the need for large and expensive third-party cloud gateway appliances is a real plus.

The critical inclusion of S3 support also means that if you are already deploying any of the leading object storage platforms in your current infrastructure, then as long as they support S3, and those leaders do, you can hook your Veeam data protection strategy straight in.
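As a rough illustration of what S3 compatibility buys you, here's a hypothetical Python snippet using boto3 pointed at an on-prem object store rather than AWS itself. The endpoint, credentials, bucket and file name are made up for illustration, and this shows the general S3 API pattern rather than anything Veeam-specific.

```python
# Minimal sketch: talking to an on-prem, S3-compatible object store.
# The endpoint, credentials and bucket below are hypothetical.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.local:9000",  # on-prem endpoint
    aws_access_key_id="EXAMPLE_KEY",
    aws_secret_access_key="EXAMPLE_SECRET",
)

# An S3-compatible platform answers the same API calls that AWS S3
# does, so standard operations just work against it.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])

s3.upload_file("backup-archive.vbk", "example-backups", "backup-archive.vbk")
```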

Veeam have certainly fleshed out version 10 nicely, adding some missing functionality, but also dealing with some tricky availability challenges, while embracing some of those emerging storage technologies.

And that’s just the Availability Suite; more to come on some of the wider announcements. But now, time for day three…

 

Veeaming On and On and On

I always knew this tech community lark, the blogging, podcasting and engaging with the community, would get me into trouble, and finally it did, as I got invited out to New Orleans to attend Veeam’s technical conference, VeeamON. Admittedly, it’s hardly a punishment!

As nice as being invited to the beautiful city of New Orleans is, the actual attraction for attending is Veeam themselves.

Many of our customers run Veeam solutions in their datacentres and have only good words to say about their backup and recovery tools; that is reflected in the company’s impressive growth over the last nine years, from niche virtual machine backup and recovery tool to one of the standard tools you expect to see in a data protection strategy.

However, there are challenges. The world, as we know, is evolving; we are becoming more data centric and increasingly focussed on treating our data like the organisational asset it is, making sure it is protected, available, secure and private.

That’s not the only challenge. As cloud solutions become ever more pervasive, organisations are consuming more of their technology via service offerings: Office 365, AWS, Azure, Salesforce, the list goes on.

That’s a problem

If you are a traditional backup and recovery company, famed for your ability to protect virtual machines that normally reside on-prem, what happens when your market begins to change and move away from the things that brought you success? Your traditional customers now see data as an asset: they want to move it around between all sorts of data stores, to extract valuable information from it, and to move more and more infrastructure to service providers, storing data in other locations. In that world, what place is there for the traditional backup and recovery company?

It’s that question that encouraged me to pop over the Atlantic to VeeamON 2017 to see if and how Veeam are going to answer it.

Over the next couple of days, then, I’ll provide some immediate thoughts on what I hear about their future direction.

Day one, as with any vendor conference, was littered with announcements, but I won’t regurgitate them here; you can see the full range on Veeam’s website. Rather than focus on that, I wanted to share a more general view of what Veeam shared throughout the keynote, as well as in some of the other sessions on show.

All about availability

There has certainly been a shift in the way Veeam position themselves; the focus on availability was foremost in the conversations I had today, as well as, of course, in that press release. As subtle as this change from focussing on “backup and recovery” is, I do believe it’s a fundamental shift. A couple of weeks back I recorded a podcast episode with Veeam’s Michael Cade in which we discussed availability as part of digital transformation (you can find the podcast here for the details): as we become ever more reliant on our digital lives, whether as consumers or of course as businesses and organisations, our tolerance of system failure is very low indeed.

In a world where we need to respond quickly to challenges, customers and changing markets, an inability to have our key data available is crippling. This subtle shift allows us to change the conversation, to start focussing beyond protecting data and to look at protecting the entire system that sits around it.

There were two other phrases from the main stage today that really stuck with me and provide, I think, a good indication that Veeam are, strategically at least, focussed in the right area.

During the panel discussion on stage, HPE’s Patrick Osborne said he saw Veeam as “an enabler for data movement”. What he meant was that, in a world of many potential homes for our data, it’s vitally important we design a strategy that allows us to move it seamlessly to the locations where we need it, when we need it.

Many of you know my feelings on the importance of building this kind of fabric strategy, where regardless of location, on-prem, whitebox, as a virtual storage appliance, or sat near to or in the cloud, we are able to move our data while maintaining full control over it. It is interesting that this is something Veeam are enabling for vendors like HPE.

My favourite quote of the day came from Mario Angers of the University of British Columbia, at the end of a discussion about one of the day’s other key themes: the focus on business outcomes, understanding the business problem you are up against and solving it. Or, as Mario perfectly summed it up, “fix my problem and I’ll buy it!” Maybe something for us all to remember, whether we are selling technology to a customer or buying technology and selling the solution into our own organisation: understand the problem and solve it!

A final thought for this first day comes from the final session of the keynote, a panel session with Veeam’s leadership.

What struck me was the background of some of the more recent talent acquired by the company: Peter McKay from a senior role at VMware, Paul Mattes from a senior role at Microsoft, and Danny Allan from a senior security role. Acquiring staff can be a challenge, but acquiring staff from companies that many aspire to be a part of raises a question: what is the opportunity they see at Veeam that would make them take that leap?

Maybe over the next couple of days that will become clear, but for now it does make you think there are some exciting times ahead for Veeam. Today it sounds like the strategic view is right; of course, the proof, as always with companies moving into a new period in their history, will be in the execution.

More from VeeamON tomorrow.

At your data’s service – Dave Sobel – Ep 24

I think we all accept that, as individuals, businesses and organisations, the way we see our data is changing. More than ever we see it as an asset, and like any asset, as something to treat carefully, ensuring it is stored properly, secured, protected and, of course, something we are getting value from.

A big part of this shift is driven by the technology industry itself; tools, technologies and services are now available that allow us to use our data in ways we previously could not.

However, it is not just the existence of these technologies that is driving this change, but how much more readily available these tools and services now are, mainly thanks to a new breed of service providers.

This is the focus of this week’s podcast, as I’m joined by Dave Sobel, Sr. Director of Community and Field Marketing at SolarWinds MSP.

Dave has wide experience in the technology industry, having operated his own service provider and now working for the provider of a global platform used by service providers and end users around the world.

In this episode, we talk about how the way we use our data is changing, and how this is not only driving great new opportunities for business but also creating a new breed of service providers and platforms that support new and inventive ways for us to make the very most of our data assets.

We talk about the evolution of what we think of as a computer, from Dave’s and my shared Commodore 64 experiences to modern voice interfaces, and how this evolution is changing how we collect and use data. Even with this change, we look at how the data and the information are the only things that really matter; the devices themselves are no longer that important to us.

We also discuss how the technology conversation in organisations is changing: today, technology decisions don’t sit just with IT, but with application and service owners who are asking how to gain more insight from the data they collect and how technology can drive success in their parts of the organisation.

Finally, we look at security: how the complex security challenge is also driving a new breed of services, and the things you should consider before you take a new service into your organisation.

Dave also shared the difference between security advice from non-experts and those that truly understand the threat, summed up brilliantly in the graphic below.

I think Dave provides a great insight into the changing data market and the part that service providers play in allowing us to do the very best with our valuable data assets.

I hope you enjoyed it.

If you want to follow Dave online you can find him on twitter @djdaveet

His company SolarWinds MSP can be found here

You can of course contact me in all the usual places.

If you enjoyed the show, why not subscribe on SoundCloud, iTunes and all other homes of podcasts.

Subscribe on Android

 

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss