All Aboard the Data Train

The other night Mrs Techstringy and I were discussing a work challenge. She works for a well-known charity, and one of her roles is to book locations for fundraising activities. On this occasion the team were looking at booking places at railway stations and considering a number of locations, but all they really had to go on was a “gut feeling”.

As we discussed it we did a bit of searching and came across this website http://www.orr.gov.uk/statistics/published-stats/station-usage-estimates, which contains footfall information for every UK railway station over the last 20 years. This information was not only train-geek heaven, it also allowed us to use the available data to make a more informed choice and to introduce possibilities that otherwise would not have been considered.
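As a toy illustration of the kind of analysis that data enables, here is a sketch that ranks stations by annual footfall. The column names and figures below are invented for the example; the real ORR spreadsheets carry far more detail.

```python
import csv
import io

# Illustrative sample only: the real ORR station usage files have many more
# columns, and these footfall figures are made up for the example.
SAMPLE = """station,entries_exits
Liverpool Lime Street,15234890
Manchester Piccadilly,27349581
Chester,4102398
Crewe,2987410
"""

def busiest_stations(csv_text, top_n=3):
    """Rank stations by annual entries/exits, busiest first."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rows.sort(key=lambda r: int(r["entries_exits"]), reverse=True)
    return [r["station"] for r in rows[:top_n]]

print(busiest_stations(SAMPLE))
# ['Manchester Piccadilly', 'Liverpool Lime Street', 'Chester']
```

Even something this simple turns “gut feeling” into a ranked shortlist that can be challenged and refined.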

This little family exercise was an interesting reminder of the power of data and how with the right analysis we can make better decisions.

Using data to make better decisions is hardly news. With the ever-increasing amounts of data we are collecting and our greater access to powerful analytics, machine learning and AI engines, all of us are already riding the data train to a world of revolutionary ideas, aren’t we?

The reality is that most of us are not. But why?

For many of us, especially with data sets gathered over many years, it’s hard: hard to package our data in such a way that we can easily present it to analytics engines and get something useful back.

But don’t let that stop you. There is potentially huge advantage to be had from using our data effectively, and all we need is a little help to get there.

So what kind of steps can we take so we too can grab our ticket and board the data train?

Understand our data

The first step may seem obvious: understand our data. We need to know: where is it? What is it? Is it still relevant?

Without knowing these basics, it is going to be almost impossible to identify and package up the “useful” data.

The reality of data analytics is that we can’t just throw everything at it. Remember the old adage, garbage in, garbage out? It hasn’t changed: if we feed our data analytics elephant a lot of rubbish, we aren’t going to like what comes out the other end!

Triage that data

Once we’ve identified our data, we need to make sure we don’t feed our analytics engine a load of nonsense. It’s important to triage: throw out the stuff that no one ever looks at, the endless replication, the material of no business value. We all store rubbish in our data sets, things that shouldn’t be there in the first place, so weed it out. Otherwise, at best we are going to process irrelevant information; at worst we are going to skew the answers and make them worthless.
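A rough sketch of what that triage might look like in practice. The record fields here are hypothetical; the point is simply that duplicates and irrelevant material get weeded out before they ever reach the analytics engine.

```python
def triage(records):
    """Keep only unique, relevant records; everything else is weeded out
    before it can skew the analysis downstream."""
    seen = set()
    kept = []
    for rec in records:
        key = (rec["id"], rec["payload"])  # hypothetical identifying fields
        if key in seen or not rec.get("relevant", True):
            continue  # duplicate or no business value: don't feed the elephant
        seen.add(key)
        kept.append(rec)
    return kept

raw = [
    {"id": 1, "payload": "sales-2016.csv", "relevant": True},
    {"id": 1, "payload": "sales-2016.csv", "relevant": True},    # replica
    {"id": 2, "payload": "desktop-backup.zip", "relevant": False},
    {"id": 3, "payload": "donations-2017.csv"},
]
print(len(triage(raw)))  # 2 records survive triage
```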

Make it usable

This is perhaps the biggest challenge of all: how do we make our massive on-site datasets useful to an analytics engine?

Well, we could deploy an on-prem analytics suite, but for most of us this is unfeasible, and in reality, why bother? Amazon, Microsoft, Google and IBM, to name but a few, have fantastic analytics services ready and waiting for your data. The trick is how to get it there.

The problem with data is that it has weight, gravity; in a cloud-led world it is still the thing that is difficult to move around. It’s not only its size that makes it tricky; there is also our need to maintain control, meet security requirements and maintain compliance. These things can make moving our data into cloud analytics engines difficult.

This is where building an appropriate data strategy is important: we need a way to ensure our data is in the right place, at the right time, while maintaining control, security and compliance.

When looking to build a strategy that allows us to take advantage of cloud analytics tools, we have two basic options:

Take our data to the cloud

Taking our data to the cloud is more than just moving it there; it can’t be a one-off copy. Ideally, in this kind of setup, we need to move our data in, keep it synchronised with our changing on-prem data stores, and then move the analysed data back when we are finished, all with the minimum of intervention.
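A minimal sketch of that “keep it synchronised” step, assuming a simple checksum manifest. A real pipeline would persist the manifest between runs and push only the changed files to the cloud store; this just shows the detection part.

```python
import hashlib
from pathlib import Path

def changed_files(source_dir, manifest):
    """Return the files whose content differs from the last synchronised
    state, updating the manifest as we go. `manifest` maps relative paths
    to the checksum recorded at the previous sync."""
    changed = []
    for path in sorted(Path(source_dir).rglob("*")):
        if not path.is_file():
            continue
        rel = str(path.relative_to(source_dir))
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if manifest.get(rel) != digest:  # new or modified since last sync
            changed.append(rel)
            manifest[rel] = digest
    return changed
```

Run on a schedule against the on-prem data store, only the paths it returns need to travel to the cloud, and the analysed results can come back the same way.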

Bring the cloud to our data

Using cloud data services doesn’t have to mean moving our data to the cloud; we can bring the cloud to our data. Services like ExpressRoute into Azure or Direct Connect into AWS mean we can get all the bandwidth we need between our data and cloud analytics services, while our data stays exactly where we want it: in our datacentre, under our control, and without the heavy lifting required to move it into a public cloud data store.

Maybe it’s even a mix of the two, depending on the requirement and the size and type of dataset. What’s important is that we have a strategy, one that gives us the flexibility to do either.

All aboard

Once we have our strategy in place and the technology to enable it, we are good to go. Well, almost: finding the right analytics tools, and of course deciding what to do with the results once we have them, are all part of the solution, but having our data ready is a good start.

The journey has to start somewhere, so first get to know your data, understand what’s important and find a way to ensure you can present it to the right tools for the job.

Once you have that, step aboard and take your journey on the data train.

If you want to know more on this subject and are in or around Liverpool on July 5th, why not join me and a team of industry experts as we discuss getting the very best from your data assets at our North West Data Forum.

And for more information on getting your data ready to move to the cloud, check out a recent podcast episode I did with Cloud Architect Kirk Ryan of NetApp, as we discuss the whys and hows of ensuring our data is cloud ready.

New fangled magic cloud buckets – Kirk Ryan – Ep32


We’re all heading to the cloud; data is the new gold; all of our competition is putting its data in the cloud and gaining massive competitive advantage; if we don’t learn how to use our data and take advantage of the cloud, we’ll be obsolete.

All phrases that you may well have heard, or at least variations on them, and to some degree they are all true. There’s no escaping that cloud services are here to stay, and there is certainly a huge shift toward organisations of all types becoming more data-centric.

But if all that is true, why isn’t everyone charging headlong into cloud services, finding ever more valuable information in their data and making the rest of us a thing of the past?

It’s because it’s not that straightforward: cloud services can be complex, moving our data to them is difficult, and even when we get it there, what new and ingenious things can we do?

Over the next few Tech Interviews episodes we are going to explore that subject: how we can take advantage of the cloud, how we can get our data there, and what innovative things we can do once we have. We’ll look at how to choose the right cloud partners and speak with a company who have used cloud and data to come up with an ingenious way for charities to raise money.

But first, we take on the data challenge: how do we make our data accessible to the cloud? How do we move it there? Should we move it there? And if we do, how do we keep it under control?

To help me explore that topic, I’m joined by Kirk Ryan, a Cloud Solutions Architect for storage vendor NetApp.

First we discuss why a “traditional” on-prem storage vendor like NetApp would need a cloud architect, and how the “new” NetApp really have allowed themselves to become a “cloud first” company.

We also discuss the changing role of digital services in our organisations and how the IT purse strings are not always pulled by IT.

We look at the challenges that companies keen to embrace new technologies have to overcome. The importance of building bridges for our data between on-prem, cloud and hybrid platforms as well as the criticality of understanding our data’s lifecycle from creation through to its deletion.

Finally we look at the importance of not taking our on-premises bad habits with us to the cloud, as well as ensuring that we fully understand the economics of our move.

Because, after all, it isn’t a new fangled magic cloud bucket!

To find out more from Kirk you can contact him on LinkedIn.

To learn more about NetApp and their cloud data management tools then visit cloud.netapp.com or netapp.io.

Finally if you want to hear more from Kirk, you’ll find him, alongside me and some other industry colleagues discussing how to move to the cloud at the www.northwestdataforum.co.uk in Liverpool on July 5th.

If you enjoyed the show and want to make sure you catch next week’s episode, as we discuss choosing the right cloud partners, then why not subscribe in all of the usual places.

Until next time, thanks for listening.

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

 

Tech me up – Your tech entertainment for this weekend – 16th June

It’s the summer, the sun is out, it’s the weekend, so what’s the only thing missing? Some great technical content to keep you company while you sip a cold one.

This week’s list of top tech has a data security slant to it, so not only will it keep you informed, it will help you keep those data assets secure.

Settle back and enjoy:

Podcasts

For those who like to listen to their tech, here’s a list of great podcasts I caught this week.

Arrow Bandwidth

An excellent and somewhat unique chat with Marcus Hutchins, better known as MalwareTech, the information security engineer who discovered the “kill switch” for the WannaCrypt ransomware outbreak.

They discuss how he worked out the inner workings of the malware, how the kill switch came about, and how he’s dealing with the “fame” that came with it.

A great listen.

InfoSec Podcast

This is a relatively new podcast to my list, but an excellent weekly discussion on the latest news from the information security industry.

This week they focus on the latest release of SANS’ Security Awareness Report, which identified communication as one of the primary reasons why awareness programmes thrive or fail.

The team also look at the difficulties of legislating for cyber security in a fast-moving technology world.

Infosec Podcast

 

Datanauts

More security, this time from the Datanauts team, who are joined by James Holland and Aaron Miller from Palo Alto Networks to discuss the evolution of security architectures and approaches, the importance of application awareness, and the impact of virtualization, which can both create new risks and provide new opportunities.

They also look at where security is going, how cloud and virtualization will continue to shape your security infrastructure, and how skill sets will have to adapt to support more automation.

Datanauts Podcast

NetApp Tech ONTAP

A bumper week for podcasts and if you want something not security related, how about some coding?

I really enjoyed this episode of the NetApp Tech ONTAP podcast. It’s not really NetApp focussed at all, but a great chat with Ashley McNamara of Pivotal discussing why storage administrators (and pretty much anyone) should be learning to code. Ashley also points to resources to help aspiring developers and scripters be successful.

Great fun.. have a listen.

Tech Interviews

This week was a VMware special, as I was joined by the hosts of the excellent VMware vSpeaking Podcast, Pete Flecha and John Nicholson. They make great guests as we discuss how people’s changing demands on technology are changing how we have to design and architect our infrastructure.

We also look at how our infrastructure not only needs to be faster and simpler, but also needs to be smarter and how our application and data centric world is driving demands for availability.

John also introduces us to the concept of giving jetpacks to cavemen!

Great fun with the guys, have a listen.

Articles

Would you rather settle back in the garden and read your tech? Then try out these security-focussed articles:

Data Privacy Monitor – Deeper Dive Security is a big deal for big data

We are all keen to take advantage of data analytics so we can get more value from our data assets, but how many of us consider the range of security challenges that come with consuming those public big data services?

In this article Lavonne Hopkins looks at a range of issues to consider and provides some solid advice.

https://www.dataprivacymonitor.com/big-data-2/deeper-dive-security-is-a-big-deal-for-big-data/

Compare The Cloud – Refashioning data security with a nod to the cloud

The thing with the Internet is that once it’s out there, it stays out there, but on the plus side you can find little gems of articles that you may not have read at the time.

This was one of those, posted back in Jan 2016 by the team at Compare the Cloud.

In this article they look at the challenge that CISOs have: pulled in many directions by their businesses, while having to deal with all of the evolving security threats.

It’s an interesting read, looking at the kinds of approaches a CISO can take to help tackle the multiple challenges they face.

https://www.comparethecloud.net/articles/refashioning-data-security-with-a-nod-to-cloud/

Forbes – Why manufacturers should be mindful of cybersecurity

This article discusses how cyber attackers target the manufacturing industry. What caught my attention was how small, imperceptible changes can eventually have a huge impact.

This approach is not just a threat to manufacturing; it is also a threat to security built around analytics and AI. One approach attackers take to overcome machine-learning-based security is to make small changes that the algorithms don’t notice and, over time, start to accept as normal behaviour.
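To make that concrete, here is a toy sketch (not any real product’s algorithm) of an adaptive detector that flags values far from a slowly re-learned baseline, and how the same change that gets caught as a sudden jump slips under the radar when made in small steps:

```python
class AdaptiveDetector:
    """Toy anomaly detector: flags values far from a baseline that is
    slowly re-learned from whatever it accepts as 'normal'."""

    def __init__(self, baseline, tolerance, learn_rate=0.5):
        self.baseline = baseline
        self.tolerance = tolerance
        self.learn_rate = learn_rate

    def observe(self, value):
        anomalous = abs(value - self.baseline) > self.tolerance
        if not anomalous:
            # accepted values gradually redefine what "normal" means
            self.baseline += self.learn_rate * (value - self.baseline)
        return anomalous

det = AdaptiveDetector(baseline=100.0, tolerance=5.0)
sudden = det.observe(120.0)  # a 20-point jump is flagged immediately
# ...but the same distance covered in 1-point steps is never flagged,
# because each step drags the baseline along with it.
gradual = [det.observe(v) for v in range(101, 121)]
print(sudden, any(gradual))  # True False
```

By the end of the loop the detector’s idea of “normal” has drifted to roughly the attacker’s target value, which is exactly the weakness the article describes.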

This provides an interesting read and highlights the complexity of the security threat.

https://www.forbes.com/sites/forbestechcouncil/2017/06/01/why-manufacturers-should-be-mindful-of-cybersecurity/#11b188b810d2

Hopefully something there for everyone, enjoy your weekend.

Giving jetpacks to cavemen – Pete Flecha, John Nicholson – Ep31

Over the last 15 years or so, there have been major shifts in the way we deliver technology in the enterprise: cloud, mobility and, of course, virtualisation. You can’t look at the virtualisation industry without looking at the company that has defined it more than any other: VMware.

However, they are not immune to change, and the world is certainly changing as we become more application and data centric; with this, our expectations of how we consume our IT services have also changed.

I don’t think we are all going to throw our virtualisation infrastructures in a skip anytime soon, but we do have different demands: we want to integrate cloud, we want our systems to be data centric, we need security, privacy and availability, and we need our data and applications to be flexible, resilient and simple to consume.

How does this then affect VMware’s view of the world?

In the last interview I recorded at Veeam’s recent VeeamON conference, I managed to catch up with the hosts of the excellent VMware Virtually Speaking podcast, Pete Flecha and John Nicholson, to ask that question and find out how VMware are changing, how they see the needs of their customers evolving and what future trends they see as the next important areas of focus.

We also investigate how customer expectations continue to rise for their technology and how technology not only needs to be resilient but also smart. We look at how organisations want their IT to be able to easily consume and integrate “cloud” into their on-prem solutions.

John also shares some thoughts on designing resilient solutions, and how availability is not only about what you buy; it is also about what you do and the part your environment can play.

The guys also talk about VMware’s shift toward simplifying the technology stack, and how technologies like vSAN, storage policies and VVOLs are making our technology faster, smarter and more straightforward, allowing us to focus on our applications and data rather than the complexity of the infrastructure below them.

Pete and John provide a fantastic insight into how our organisations’ technology requirements are changing and how VMware are changing to remain a relevant and important part of our IT stack.

Last but not least, John also introduces us to the wonderful image of Cavemen with Jetpacks!

If you want to follow up with Pete and John you can find them both on twitter @vPedroArrow and @Lost_Signal.

You can find the excellent Virtually Speaking Podcast over at vspeakingpodcast.com; check out episode 44 for a great chat the guys had with Michael Dell.

Hope you enjoyed the show; Pete and John were great guests and provided some fantastic insights into the industry and VMware’s place in it.

If you did, then why not subscribe on iTunes, Soundcloud and other good homes of podcasts.


Tech me for the weekend – June 9th

Well, well, it’s the weekend again, and I know you all find it tough to drag yourselves away from the world of tech. Worry not: here’s this week’s list of top tech entertainment to keep you teched up this weekend.

This week there is a bit of a data theme, so dive in; there are podcasts and articles aplenty…

The Reading

Why Data will drive your success in the cloud – Matt Watts

Matt works closely with the office of the CTO at NetApp and wrote this interesting piece on their site about the changing way we all see data’s value and what that means for the way we build our data strategy.

Inspired by a recent article in The Economist that notes that in today’s economy “the world’s most valuable resource is no longer oil, but data.” I know we’ve all heard it, but Matt explores the topic and what it means.

Why data will drive your success in the cloud

Setting sail for uncharted waters – Ruairi McBride

Some big announcements from storage behemoth NetApp this week which included their entry into the world of Hyperconverged Infrastructure (HCI).

Now, there’s no question that NetApp are late to this particular party, but Ruairi looks at why, at the potential benefits NetApp gain from having sat back and watched the market for a little while, and at what their solution brings that may make them stand out.

If you want to know more about the NetApp HCI offering, this is a great place to start.

Setting sail for uncharted waters

One year until the EU GDPR

This complex regulation will finally come into force in just under a year, and personally I don’t think you can ever read too much about what it may mean for us all and the way we collect, store, manage, protect and dispose of our data assets.

A good read here from CITY A.M., which also includes some quotes from my favourite data privacy attorney, Sheila Fitzpatrick.

A good piece, well worth a read.

One year until the EU GDPR

Big wheels keep on turning

Back to NetApp: I wrote a piece myself this week looking at NetApp’s continued evolution from storage company to data management company, how it is progressing, why it matters, and how they are embracing this increasingly data-driven world.

Big Wheels Keep On Turning

The Listening

Arrow bandwidth

This is an episode from a couple of weeks ago, but it fits nicely with this week’s data theme, as Rich and Dave are joined by Vince Payne to discuss the data industry and the impact of business intelligence and analytics.

Rich plays the role of devil’s advocate perfectly, as the team discuss whether BI and analytics really are a “thing” and whether people are actually using these technologies.

An excellent and thoughtful debate as they look at whether data really is the new gold!

 

Gestalt IT – The ON-Premise IT round table

Another excellent debate, asking what the future really holds for data: are AI, machine learning and data analytics really going to change the world and do something interesting?

And does the future include clever machines really replacing people? Or will we poor humans always have a place?

More excellent devil’s advocacy from Nigel Poulton.

What is big data?

Speaking In Tech

Rounding off the data chat, a brilliant guest joins the Speaking in Tech team, as Michel Feaster from Usermind discusses how data analytics and intelligence can have a massively positive impact on our customer experiences.

Some great practical examples of how using intelligence alongside traditional systems can revolutionise the kind of results we get, give it a listen.

Speaking in Tech: Blame millennials for customer engagement upheaval

Tech Interviews

Of course, you don’t get this far without a plug for my own show; this week is the second part of a look at the data availability market.

My guests this week discuss how to gain more value from data backups, how to ensure that our focus on application availability doesn’t do more harm than good, and whether availability is elevated to the right level of importance.

Three great guests: Mike Beevor of Pivot3, Data Gravity’s Dave Stevens and Andrew Smith of IDC.

Enjoy.

Oh, and if you missed part one, where I chat with Justin Warren and Jeff Leeds, fear not: it’s here…

So enough to keep you busy..

Have a great weekend.

 

Analysing the availability market – part two – Dave Stevens, Mike Beevor, Andrew Smith – Ep30

Last week I spoke with Justin Warren and Jeff Leeds at the recent VeeamON event about the wider data availability market. We discussed how system availability is more critical than ever, and how, or maybe even if, our approaches are changing to reflect that. You can find that episode here: Analysing the data availability market – part one – Justin Warren & Jeff Leeds – Ep29.

In part two I’m joined by three more guests from the event as we continue our discussion. This week we look at how our data availability strategy is not, and cannot be, just a discussion for the technical department; it must be elevated into our overall business strategy.

We also look at how technology trends are affecting our views of backup, recovery and availability.

First I’m joined by Dave Stevens of Data Gravity, as we look at how our backup data can be a source of valuable information, as well as a crucial part of helping us be more secure and compliant with ever more stringent data governance rules.

We also look at how Data Gravity, in partnership with Veeam, have developed the ability to trigger smart backup and recovery; Dave gives a great example of how a smart restore can be used to recover quickly from a ransomware attack.

You can find Dave on Twitter @psustevens and find out more about Data Gravity on their website www.datagravity.com

Next I chat with Mike Beevor of HCI vendor Pivot3 about how simplifying our approach to system availability can be a huge benefit. Mike also makes a great point about how, although focussing on application and data availability is right, we must consider the impact on our wider infrastructure, because if we don’t we run the risk of doing more “harm than good”.

You can find Mike on twitter @MikeBeevor and more about Pivot 3 over at www.pivot3.com

Last but by no means least, I speak with Senior Research Analyst at IDC Andrew Smith. We chat about availability as part of the wider storage market and how, over time, as vendors reach feature parity, their goal has to become adding additional value, particularly in areas such as security and analytics.

We also discuss how availability has to move beyond the job of the storage admin and become associated with business outcomes. Finally we look a little into the future and how a “multi cloud” approach is a key focus for business and how enabling this will become a major topic in our technology strategy conversations.

You can find Andrew’s details over on IDC’s website.

Over these two shows it has become clear, to me, that our views on backup and recovery are changing. The shift toward application and data availability is an important one, and as businesses we have to ensure we elevate the value of backup, recovery and availability in our companies, making them an important part of our wider business conversations.

I hope you enjoyed this review. Next week is the last interview from VeeamON, and we go all VMware as I catch up with the hosts of VMware’s excellent Virtually Speaking Podcast, Pete Flecha and John Nicholson.

As always, if you want to make sure you catch our VMware bonanza, then subscribe to the show in the usual ways.


Big Wheels Keep On Turning

Just about a year ago I wrote a piece about NetApp and how they were making a strategic shift (Turning a big storage ship), changing their focus as well as the perception of both the industry and customers. This coincided with the launch of the latest version of the company’s bestselling storage operating system, ONTAP 9.

A year on, after spending a few days with NetApp’s leadership as part of our annual NetApp A-Team get-together, I thought it would be good to check in on how that big storage ship was doing and whether it was still turning in the right direction.

First, some context: data is increasingly the lifeblood of our organisations; it is among the top two or three assets any business holds, and we constantly see organisations using data in ever more creative ways, while of course we continue to create more of it and keep it for longer.

Not only do we need more from our data, the way we consume data services is changing. The big public cloud providers are giving us analytics services on demand, allowing us to solve more and more complex problems, as long as we can get our data to their cloud offerings in the first place. This means more data being housed in the cloud, which is great for analytics, but isn’t always a great fit for our data sets.

In that context, how does a big storage vendor remain relevant?

In my opinion, they have to embrace the changing attitude to data; just wanting to store it isn’t enough. To quote a friend of mine, “storing is boring”, and in reality it kind of is: if your only view of data strategy is storing it neatly, you are missing a trick.

So the question is, are NetApp embracing this new data driven world?

Shift to a data management company

This is something I’ve been hearing over the last six months, and I fully expect it to be front and centre of a lot of NetApp messaging as they move from storage company to data management company. This focus is absolutely right; in my own company we have done the same thing, because it’s what our customers demand. It’s not about building infrastructure and storing data; it’s about taking a valuable asset and getting the most out of it.

There is no point just talking to a modern organisation about how much storage you can provide and how fast it is; organisations want to know, “How can you make sure my data asset remains an asset?”

Data Fabric

For those not familiar with NetApp’s Data Fabric, it is a critical part of their vision as they make the shift to a data management company. A data fabric is NetApp’s view of how we build a data infrastructure that allows us to get the best from our data, giving us flexibility in how and where we store it and how we move it, while maintaining security and compliance, all crucial in a modern data strategy.

But this goes beyond just a strategic goal; it is baked into all of NetApp’s thinking. The idea that you can move data across any NetApp platform, regardless of whether it’s hardware, white box, a virtual machine or even sat in AWS or Azure, is very powerful. It also isn’t limited to ONTAP, allowing us to move data between ONTAP, SolidFire, E-Series and AltaVault, and even to non-NetApp platforms via FlexArray.

Ultimately, will the data fabric be stretched beyond the NetApp portfolio? Who knows; it would be great if it were, but there’s a lot of work to be done.

Embracing the new world

Part of the new way of working with data includes the cloud; there is no getting away from this reality. Whether it’s consuming SaaS like Office 365 and Salesforce, holding our data long term in S3 or Azure Blob storage, or needing to present our data to analytics tools, organisations are moving more data to the cloud.

What part does an on-prem storage vendor play in this? It has to be twofold:

Help me to move data to the cloud

Because they supply on-prem storage arrays, NetApp can’t ignore the reality that their customers want to move data to the cloud. To NetApp’s credit, they are embracing this challenge and helping to enable that movement.

The data fabric strategy and ONTAP are a key part of this. The ability to take NetApp’s core storage OS and deploy it directly in either AWS or Azure means not only that you can move your data from your on-prem array straight into a public cloud, but, because it’s the same operating system end to end, that you can crucially maintain all of the on-premises efficiencies, management and controls on your data in the public cloud. This is a real positive.

It’s not only the movement of data to the cloud that NetApp have turned their focus to, however; they are also looking at ways cloud-based services can play a part in their future, which is equally interesting. This has started with two services: Cloud Sync and Cloud Control.

Cloud Sync helps users automate the process of moving data from on-premises NFS datastores straight into Amazon S3 storage and back again.

Cloud Control, meanwhile, allows organisations to protect their Office 365 data by backing it up and holding it in an alternate location.

The important thing to note about these two services is that they are exactly that: services. No traditional NetApp tools are needed as part of the solution; you subscribe to the services and begin to use them.

If anything proves NetApp’s position on embracing the new world, it is this.

Big Wheels Still Turning?

With a range of new announcements due soon, including the much-anticipated NetApp HCI platform, the storage behemoth, in my opinion, continues to evolve. Its focus is right and certainly aligns with the challenges that the organisations I deal with talk about.

It continues to do smart things within its core product set, adding tools that enable the wider data fabric strategy and working them directly into the portfolio, especially the product at its heart, ONTAP.

Personally, I continue to be very enthused by what NetApp are doing and the direction they are taking. For me, those big wheels are not only turning, they are turning in exactly the right direction.

Let’s see if they can keep it up.