What you don’t know, may hurt you – John Hughes – Ep 20

We are all familiar with the saying “what you don’t know won’t hurt you”. Well, in the world of data management, security and privacy, the opposite is most definitely true.

As our organisations become more digital, most of us are increasingly realising the value of our data, how big an asset it is and how important it is to maintain.

However, although we understand how valuable our data is, we actually have very little insight into what is happening to it on a day-to-day basis.

Ask yourself: do you know exactly what data you have across your business? Do you know who has access to it, where it is stored, when it gets accessed (if it gets accessed at all) and, when it is accessed, what gets done with it?

In my time administering IT systems, or working with those that do, I’ve lost count of the number of times I’ve been asked “who changed that file?”, “who deleted that file?” or “can you tell me the files that a user has accessed and copied to a USB stick?”. The answer is normally no, and it’s normally no because our standard storage solutions can’t tell us.

Imagine a logistics company asking questions like “who’s driving that lorry?”, “who was the last person to drive it?”, “where is Fred taking that lorry?”, “can you tell me the types of lorries we have?” and being told “no, we don’t know any of that”. Ridiculous, right? Yet we do exactly that with our data asset.

We have talked in recent episodes about the threats to our data security and privacy, be it policies, procedures or our people. Just as significant a threat is the inability to fully understand what is going on with our data sets. Without insight and analysis, it’s very easy for our data to be abused, lost or stolen without us having the slightest knowledge of it happening.

That’s our focus this week. In the last of our data security & privacy episodes, I chat with John Hughes of Varonis. Varonis provide data analytics and insight into how we use our data: what our data is, who is using it, what it’s used for and whether it’s even used at all.

We discuss a little of the history of Varonis, why data insight is so critical, why it’s a cornerstone of our ability to meet compliance requirements and how it’s a crucial part of our defence against data security attacks.

Enjoy the show and thanks for listening.

To find out more about Varonis;

Check out varonis.com

Have a look at their excellent range of blogs at blog.varonis.com and of course follow them on Twitter @varonis

You can also request a free GDPR data assessment via their website

If you want to learn more about any of the topics in this series, and you are in North West England on April 5th, you can join me and a range of speakers at www.northwestdataforum.co.uk

You can find the previous 3 episodes in this series here;

Best Take Care Of Those Crown Jewels – Sheila Fitzpatrick – Ep 17

Don’t Build Your Data Privacy House Upside Down – Sheila Fitzpatrick – Ep 18

Make People Our Best Data Security Asset – Dom Saunders – Ep 19

If you’ve enjoyed this episode, then why not subscribe;
Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

Don’t Build Your Data Privacy House Upside Down – Sheila Fitzpatrick – Ep 18

There is no doubt that there are many difficulties presented to organisations when it comes to their data.

We understand it’s an asset, something that, if we make the most of it, can be a significant advantage to us, but of course we also understand maintaining the security and privacy of it is critical.

I think it’s fair to say, as organisations and IT professionals we are becoming much more mature in our attitudes to data privacy and security and we understand more than ever the risks posed to it.

This increased level of maturity is going to become even more important, especially with significant regulation changes on the horizon and none are more significant than the EU’s General Data Protection Regulation (GDPR).

In this week’s podcast, the second part of my conversation with Global Data Protection Attorney Sheila Fitzpatrick (you can find part one here), we discuss exactly what GDPR is going to mean to us as organisations, including those organisations that are outside of the EU (and the impact on the UK).

Not only do we look at the impacts of the legislation, Sheila also shares with us some of the initial steps you can take to start to build robust data privacy policies.

Sheila stresses how important it is to get the foundation right: we need to understand our data, where we get it from, how we get it and what we keep, and how this is, initially, much more important than finding technology tools to deal with the problem. Build the foundation before you build the second floor!

We also explore how data privacy and GDPR are NOT an IT problem but a business challenge. IT are certainly a key part of delivering security, privacy and compliance, but it is not an issue to throw back at IT to solve.

I hope you’ve found these two episodes with Sheila useful in providing an outline of the problem, as well as some of the steps you can take to address it.

If you want to catch up more with Sheila, you can find her on Twitter @sheilafitzp and on LinkedIn.

Next week, we look at a different part of the data security challenge, People.

I chat with Dom Saunders from NETconsent as we look at how we can make our people a key asset in dealing with the data challenge.

If you want to make sure you don’t miss that episode, then please subscribe on iTunes, Soundcloud or wherever you get your podcasts.

Thanks for listening…


Best Take Care Of Those Crown Jewels – Sheila Fitzpatrick – Ep 17

Data, it’s the new oil, new gold, your Crown Jewels. We’ve all heard these phrases, but it is hard to deny that data is a fantastic asset, companies who know how to mine true value from it have a distinct advantage over their competitors and we are continually creating more of it.

However, it’s fair to say that data also comes with its challenges: we must store it all, protect it all and of course make sure it’s secure.

The challenge of data security and privacy is right at the top of the list of priorities for most IT executives, and, if it isn’t already, it should be high on the list of priorities for business owners and boards as well.
Maintaining the security and privacy of our data is going to continue to be a complex problem, from the multi-faceted security threat, to the introduction of more stringent data privacy laws.

To try to help to address this, this week’s podcast is the first of a short series focussing on the twin challenges of data security and privacy. First is a two-part episode exploring the issue of Data Privacy, with my guest Global Data Privacy Attorney Sheila Fitzpatrick.

Sheila is NetApp’s Chief Privacy Officer and Worldwide Data Governance and Privacy Counsel, and has nearly 35 years of experience in the field of data privacy, so she is well placed to comment on the current data privacy landscape, the challenges of managing data and the issues presented by changing regulation.

In this first part, we look at what data privacy is, what defines personal data, why it’s important to understand the full lifecycle of your data management procedure, the difference between data security and privacy, as well as an introduction to the upcoming EU General Data Protection Regulation (GDPR).

Sheila couples her huge experience of data privacy with a tremendous enthusiasm for her topic, which makes her a fantastic person to learn from. Enjoy the episode.

If you want to catch up more with Sheila, you can find her on Twitter @sheilafitzp and on LinkedIn.

Next week we’ll be focussing on the biggest change to data privacy in the last 20 years, the EU General Data Protection Regulation (GDPR), its impact, what it means to us and how to start to build a data privacy strategy.

If you want to make sure you don’t miss that episode, then please subscribe on iTunes, Soundcloud or wherever you get your podcasts.


Tech Trends – Object Storage – Robert Cox – Ep13

Over the last couple of weeks I’ve chatted about some of the emerging tech trends that I expect to see continue to develop during 2017 (Have a read of my look ahead blog post for some examples). To continue that theme this episode of Tech Interviews is the first of three looking in a little more detail at some of those trends.

First up, we look at a storage technology that is growing rapidly, if not always visibly: object storage.

As the amount of data the world creates continues to grow exponentially, it is becoming clear that some traditional storage methods are no longer effective. When we are talking billions of files, spread across multiple data centres in multiple geographies, traditional file storage models no longer cope (regardless of what a vendor may say!). That’s not to say our more traditional methods are finished, far from it, but there are increasingly use cases where the traditional model doesn’t scale or perform well enough.

Many of us have probably never seen an object store, or at least think we haven’t, but if you’re using storage from AWS or Azure then you’re probably using object storage, even if you don’t realise it.
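To make the idea concrete: where file storage gives you a hierarchy of directories, an object store is essentially a flat namespace of keys, each mapping to a whole object plus its metadata. This toy sketch is purely illustrative (it is not any vendor’s API, and the key and metadata names are made up), but it captures the model:

```python
import hashlib

class ObjectStore:
    """Toy object store: a flat key namespace, no real directory hierarchy."""

    def __init__(self):
        self._objects = {}  # key -> (data, metadata)

    def put(self, key, data, **metadata):
        # Objects are written whole; a checksum (like S3's ETag) identifies the content.
        metadata["etag"] = hashlib.md5(data).hexdigest()
        self._objects[key] = (data, metadata)
        return metadata["etag"]

    def get(self, key):
        # Whole-object read: no seeking or partial in-place updates.
        return self._objects[key]

store = ObjectStore()
etag = store.put("backups/2017/invoice-0001.pdf", b"%PDF...", owner="finance")
data, meta = store.get("backups/2017/invoice-0001.pdf")
```

Note that “backups/2017/” here is just part of the key string, not a real directory tree; that flat namespace is what lets object stores scale to billions of items spread across sites.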

With all that said, what actually is object storage? Why do we need it? How does it address the challenges of more traditional storage? What are the use cases?

It’s those questions that we attempt to answer in this episode of Tech Interviews with my guest Robert Cox. Robert is part of the storage team at NetApp, working with their StorageGRID Webscale object storage solution.

During our chat we focus on giving an introduction to object storage: why it is relevant, the issues with more traditional storage and how object storage overcomes them, as well as some great use cases that Robert shares.

So, if you are wondering what object storage is all about and where it may be relevant in your business, then hopefully this is the episode for you.

Enjoy…

If you’d like to follow up with Robert with questions around NetApp’s object storage solutions you can email him at robert.cox@netapp.com

You can find information on NetApp StorageGRID Webscale here

And if you’d like a demo of StorageGRID then request one here

Next week we take a look at one of the most high-profile tech trends, the emergence of DevOps. To make sure you don’t miss out, you can subscribe to Tech Interviews below.

Hope you can join us next week, thanks for listening…


Insights from the storage industry?

Last week I was away in Berlin at NetApp’s Insight conference (see what I did with the title there!), always an enjoyable event with good information, company, food and the occasional large German beer. That aside, I do try to attend a handful of these types of events a year as part of my job.

How does it benefit my job?

A big part of my role is to identify key industry trends and challenges and to see whether our technology partners are developing solutions to take these on, helping our customers to adapt and modernise their IT and maintain a competitive edge in a fast-changing business world. Whether that’s Microsoft, one of our data management and security providers, or, as in this case, a storage provider like NetApp, we need to know our partners are still delivering relevant solutions.

So how did NetApp measure up?

The answer to this is usually found in the keynote sessions; that’s the home of strategic presentations and product announcements, and Insight was no exception.

Understanding the problems?

Did the NetApp leadership address the fundamental challenges that we are seeing?

Three messages really stood out for me at the event, each hit key concerns I see in my daily dealings with senior IT people.

Data is critical

Data was, at different times, the new gold, the new oil and the new digital currency, but ultimately it was THE most important thing, the key focus of pretty much everything covered across the four days. And that’s how it should be: it’s our business’s most critical asset, the thing that can separate us from our competition when we extract true value from it, whether that’s better reporting, better analytics or more flexibility in moving from on-prem to cloud and back. Getting the best from it is a major goal for us all.

This focus was refreshing; it also included coining the phrase:

NetApp: not the last independent storage vendor, but the first data management company

That works for me; my conversations these days are never speeds-and-feeds based, much more around outcomes and aims. Tick in the box then.

DevOps it

You just can’t have an IT discussion these days without throwing around the phrase DevOps, and I’d honestly be disappointed if it wasn’t brought up. I’m not even going to attempt to do justice to the breadth of the topic here; there’s lots of great DevOps content out there (for an excellent DevOps intro, have a listen to the Tech ONTAP Podcast episode with Gene Kim here).

I think we often assume this kind of stuff is just about software development, but in my mind it’s much more about the way we are looking to consume technology in our businesses. IT cannot be an impediment to doing business; the modern business needs to be able to respond quickly to new challenges, and we need an IT infrastructure that not only can change, but one we are not afraid to change when we need to.

There was a great “day in the life of DevOps” session that, although played for laughs, brought home the importance of automation, the ability to fail fast and how to manage modern development processes, of course with a healthy dose of how things like NetApp’s Docker integration and access to APIs in both ONTAP and SolidFire can help build a modern, agile data infrastructure.

Integrating the cloud

NetApp has talked extensively about their data fabric message for the last couple of years, and many of you know I’m a fan (for example Data Fabric – what is it good for). The driver behind the fabric is the reality that, for most of us and our IT infrastructure, the future is going to be hybrid: some things on-prem, some in the cloud. But this kind of hybrid environment comes with challenges, none bigger than how we move data between our on-prem and cloud environments. It’s not just how we move the datasets around, but how we ensure the data remains under our control, secure and protected, and does not end up living in a cloud storage silo.

Insight this year showed the maturity of what NetApp have been doing in this space: not only the additional capabilities added to the NetApp portfolio (closer integration of ONTAP and AltaVault, the announcement of SnapMirror to SolidFire, and enhancements to ONTAP Cloud with additional capabilities in AWS as well as support for Azure), but also the introduction of a couple of really interesting solutions that don’t need any “traditional” NetApp products at all.

Cloud Sync allows for the movement and conversion of data from an on-prem NFS datastore up into AWS’s analytics tools, and is designed to greatly simplify the usage of services such as EMR. Alongside this is Cloud Control, a solution to help protect the contents of your Office 365 services (email, SharePoint and OneDrive for Business), giving you the ability to back up data from these services to anything from your NetApp-based on-prem storage to storage blobs in Azure and AWS. Impressively, both of these are simply services you sign up to, point at the relevant cloud services and away you go; no requirement for any other NetApp tech if you don’t want it.

What I like about this is it shows their commitment to data, it’s no longer about selling you ONTAP or FAS hardware (even though they remain great platforms) but about helping us to enable our data to be used in this quickly changing technology and business world.

Did NetApp deliver what I was looking for?

They certainly did for me. As I said right at the start, when I get time with key technology partners I’m looking to see whether they are addressing the primary issues we and our customers are seeing, and whether they understand the key technology trends. Personally, I think NetApp nailed it at Insight and will continue to be very relevant in the modern data management world.

So good job NetApp.

I hope you enjoyed the post. If you want some further info from Insight, here are some resources you may find useful.

While I was out there I got to do a couple of interviews with key NetApp staff that were recorded for their YouTube channel.

I chatted here with Elliot Howard about the wider challenges that customers see and how NetApp and its partners can help;

In this video I spoke with Grant Caley, NetApp UK’s chief technologist, about industry trends and how they are going to affect our storage usage in the future;

Finally I also spoke with some of the attendees at the event to see what they thought of Insight and tech conferences in general. You can find that here on TechStringy Interviews – or go get the podcast from iTunes or wherever you get your podcasts.


Make my cloud so…

A bit of a Star Trek misquote, I know, but I’m pretty sure Captain Picard would have said that as the ship’s IT department looked to enable their hybrid cloud strategy. For many of us, hybrid cloud is the reality of our future IT designs; the flexibility provided by access to cloud compute and storage, both technically and commercially, makes cloud services compelling in many instances.

However, those compelling cases do come with a cost. Using hugely scalable public cloud technologies presents challenges: application architecture and system design, but more often than not data issues, be that security, governance, protection or even just moving big lumps of data around. All of these add to the challenge of enabling these flexible cloud-based services.

With that in mind, I took great interest in NetApp’s November 1st hybrid cloud announcements (you can find the press release here), especially the very strong emphasis on enablement; this was not your usual product range announcement. Often these announcements are almost “self-serving”: get a new widget from us, buy our updated solution or platform. Don’t get me wrong, there is an element of that here, with updates to a couple of major products, but what was really interesting were the cloud service solutions mentioned. These were technologies that were not your “traditional” NetApp solutions, with no need for a hardware array, ONTAP or anything else: purely service offerings, designed for no other reason than to address the significant challenge of integrating with cloud services.

I don’t plan on going into detail on all of the announcements (check out a great post like this one from Mike Andrews (@trekintech) for wider analysis); I just wanted to look at a couple of the more generic cloud enablement solutions that don’t need any “traditional” NetApp components.

Cloud Control for Office 365

In my experience, one of the first cloud integrations an enterprise makes is Office 365, taking advantage of Microsoft’s software-as-a-service offering for email, document management and file storage. These services, although critical, are often time-intensive to deliver while providing little additional value to the business. “Table stakes”, if you will: a company must have them, but they are not going to give competitive advantage.

Handing them to Microsoft to run makes perfect sense; however, one thing that is often missed when a business moves to 365 is data protection. Microsoft’s role is clear: it is to present you with a secure, scalable and resilient service, not to protect your data. 365 offers several options for data retention, but Microsoft do not protect you from data deletion, accidental or malicious. Once that data is gone, it’s gone.

So how do you protect it? There is a growing market of solutions to this challenge, and NetApp have now thrown their hat into the ring with an extremely comprehensive offering.

Cloud Control is a full SaaS offering: no need to purchase equipment or install anything on-prem. Take it as a service, point it at your 365 subscription and you have the capability to back up your Exchange, SharePoint and OneDrive for Business repositories.

What separates Cloud Control, in my opinion, is the number of possible backup targets you can use. If you have a NetApp environment, great: you can back your 365 data straight into your ONTAP environment. Don’t have on-prem ONTAP? No problem, you can spin up ONTAP Cloud and back off to that.

Don’t want ONTAP at all? Use AltaVault from the NetApp portfolio to move your data to an object store. And if you don’t want anything at all from NetApp, no problem: Cloud Control will allow you to move data straight into an AWS S3 bucket or an Azure storage blob.

Cloud Control provides granular data protection, with item level recovery for your 365 implementation, enabling you to deliver enterprise level data protection to your public cloud service.

Cloud Sync

A key benefit of cloud compute is the ability to get masses of processing power as and when you need it, without having to build a big compute cluster which spends most of its time idle.

Things like Hadoop are fantastic tools for data analytics, but they are one heck of an expensive tool to deploy, which has kept big data analytics out of reach for many enterprises.

However, cloud providers like AWS have addressed this with services available to rent as you need them. The trick with these is: how do you move data to that analytics engine as and when you need it? How do you seamlessly integrate these services into your infrastructure?

Step forward the Cloud Sync service. Cloud Sync points at your on-prem NFS datastore (no, it doesn’t have to be ONTAP-based NFS) and your analytics service, and seamlessly syncs the on-prem data to the analytics engine when needed, allowing you to take advantage of cloud compute while ensuring your datasets are always refreshed.
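The core idea behind any such sync service is a one-way, incremental copy: only files that are new or changed since the last pass get transferred, so repeated refreshes are cheap. A toy sketch of that principle (this is illustrative only and has nothing to do with Cloud Sync’s actual internals; the function name is made up):

```python
import os
import shutil

def incremental_sync(src, dst):
    """One-way sync: copy a file only if it's missing at dst or newer (by mtime)."""
    copied = []
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                # copy2 preserves the mtime, so an unchanged file is skipped next pass
                shutil.copy2(s, d)
                copied.append(os.path.normpath(os.path.join(rel, name)))
    return copied
```

Run it twice against an unchanged source and the second pass copies nothing, which is what makes keeping a remote analytics copy “always refreshed” practical; the real service adds the NFS reading, format conversion and cloud transfer on top.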

Cloud Sync is all about automating those difficult tasks, and in modern IT, that is exactly what we are looking for, orchestrating the use of cloud compute allowing us to consume services in the most effective way.

Again, delivering this without the need for any of the more “traditional” NetApp technologies.

But Why?

I suppose this begs the question: why, as a storage vendor, build solutions that actively have no need for your storage products? Well, let’s not be fooled: both of these are NetApp subscription services, and of course both can enhance existing NetApp technology, but I don’t think that’s the primary aim.

If you’ve ever looked at NetApp’s Data Fabric strategy, you’ll see that they are a very different storage company, much happier to talk about data strategy than to sell you things. Of course they have products that can enable your strategy, but a conversation about how we manage our data in this modern IT world is far more valuable than just selling something a bit faster with a few more flashing lights; getting us to think about how we move, manage and secure data is far more important.

These November 1st announcements are just another example of NetApp’s commitment to its Data Fabric and of how the right fabric can enable an organisation to fully exploit cloud flexibility. I very much look forward to seeing these solutions in action as they come to market and, of course, to seeing what NetApp add next to this increasingly impressive hybrid cloud story.

Cloud enabled captain…

For more detail on NetApp’s cloud solutions visit their cloud website where you can get information as well as access to trials of these services.

cloud.netapp.com

For some background reading on data fabric, please feel free to check one of my previous posts;

Data Fabric – What is it good for?

And if you have any questions, feel free to contact me @techstringy or on Linkedin.

For other posts with details on the announcements check out

Taylor Riggan’s View on Cloud Sync

Mike Andrews NetApp Hybrid Cloud Launch

And if you’d like a bit of audio to listen to, I also interviewed Mike Andrews for a TechStringy Interview discussing the announcements and their strategic and technical impact, feel free to have a listen here;

NetApp Hybrid Cloud Announcements with Mike Andrews

Gold medals for data

Last week was the end of a wonderful summer of sport from Rio, where the Olympics and Paralympics gave us sport at its best, people achieving life time goals, setting new records and inspiring a new generation of athletes.

I’m sure many of you enjoyed the games as much as I did, but why bring it up here? Well, for someone who writes a blog it’s almost a contractual obligation in an Olympic year to write something with a tenuous Olympic link. So here’s my entry!

One part of the Team GB squad that really stood out in Rio were the Olympic cyclists, winning more gold medals than all of the other countries combined (6 of the 10 available), a phenomenal achievement.

This led to one question being asked continually: “What’s the secret?”. In one BBC interview Sir Chris Hoy was asked exactly that, and his answer fascinated me. During his career the biggest impact on British cycling was not equipment, facilities, training or superhuman cyclists. It was data. Yes, data: not just collecting it, but more importantly the ability to extract valuable insight from it.

We hear it all the time

“those who will be the biggest successes in the future are those that get the most value from their data”

and what a brilliant example the cyclists were. We see this constantly in sport, where the smallest advantage matters, but not just sport: increasingly this is the case in business, as organisations see data as the key to giving them competitive edge.

We all love these kind of stories, how technology can provide true advantage, but it’s always great to see it in action.

A couple of weeks ago I was on a call with the technical lead of one of our customers. He and his company see the benefit of technology investment and how it delivers business advantage. I’ve been lucky enough to work with them over the last 4 years or so and have watched the company grow around 300% in that time. We were talking with one of his key technology vendors, explaining how their technology was an instrumental part of that success.

During the call I realised this was my opportunity for a tenuous Olympic-link blog post: how, as with the cyclists, getting the best from data was delivering real bottom-line success to the business.

The business is a smart energy company doing very innovative work in the commercial and private energy sectors. They’re in a very competitive industry dominated by some big companies, but these guys are bucking that trend, a great example of how a company that is agile and knows how to exploit its advantage can succeed.

In their industry data is king. They pick up tonnes of data every day, from customers, from devices, from sensors, and manipulating this data and extracting valuable information from it is key to their success.

Until about a year ago they were running their SQL-based database and reporting engines on a NetApp storage array running 7-Mode. That had worked, but a year ago we migrated their infrastructure to clustered Data ONTAP to provide increased flexibility, mobility of data and more granular separation of workloads.

However, the smartest thing they did as part of the migration was to deploy flash pools into their environment. Why was this so earth-shattering?

A big part of the value of their SQL infrastructure is reporting. This allows them to provide better services to their customers and suppliers giving them advantage over their competitors.

However, many of those reports took hours to run; in fact the process was to request the report and it would be ready the next day.

The introduction of flash pools (flash-based acceleration technology available in NetApp ONTAP arrays) had a dramatic effect, taking these overnight reports and delivering them in 30-60 minutes.
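The reason a relatively small flash tier can have such an outsized effect is that most reporting workloads read a small, hot set of blocks over and over; once those blocks are cached on flash, only the first read pays the disk penalty. A rough sketch of the caching principle (purely illustrative, with made-up latency numbers, and nothing like NetApp’s actual implementation):

```python
from collections import OrderedDict

DISK_MS, FLASH_MS = 10.0, 0.5  # illustrative per-read latencies, not real figures

class FlashCacheSketch:
    """LRU read cache: hot blocks served from 'flash', cold reads go to 'disk'."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()  # block id -> cached flag, ordered by recency
        self.time_ms = 0.0

    def read(self, block):
        if block in self.cache:
            self.cache.move_to_end(block)       # refresh recency on a hit
            self.time_ms += FLASH_MS            # hit: flash latency
        else:
            self.time_ms += DISK_MS             # miss: disk latency, then promote
            self.cache[block] = True
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False)  # evict the least recently used block

# A report that scans the same hot set of blocks on every pass
pool = FlashCacheSketch(capacity=100)
for _ in range(50):
    for block in range(20):  # a 20-block hot set, well within cache capacity
        pool.read(block)
```

In this sketch the 1,000 reads cost 690 ms instead of the 10,000 ms an all-disk run would: only the first pass over the 20 hot blocks hits disk, and the other 980 reads come from flash. Real reports obviously differ, but the shape of the win is the same.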

This significant reduction in report running times, meant more reports could be run, more reports producing different data that could be used to present new and improved services to customers.

Last year the technical lead attended NetApp Insight in Berlin. One of the big areas of discussion that caught his interest was the development of All Flash FAS (AFF), NetApp’s all-flash variants of their ONTAP-driven FAS controllers.

They immediately saw the value in this high-performance, low-latency technology, so earlier this year we arranged for an AFF proof of concept to be integrated into the environment. During this POC the team moved a number of SQL workloads to the flash-based storage, and it’s no understatement to say this transformed their data analysis capabilities: those 30-60 minute reports were now running in 2-3 minutes.

An example of the kind of performance you can get from AFF (this is an AFF8080 cluster running ONTAP 8.3.1 – new platforms and ONTAP 9 have increased this performance further)

But this was not just about speed; it truly opened up brand new capabilities and business opportunities. Now the organisation could provide their customers and suppliers with information that previously was impossible, and that quick access to data was allowing those customers to make decisions on their energy usage that gave true value.

They knew the proof of concept had gone well when, on taking it out, the business began asking questions: why is everything so slow? Why can’t we do those reports anymore? And that was the business case. The deployment of NetApp flash was not just about doing stuff quickly, or using flash because that’s what everyone says you should; it was because flash was delivering results, real business advantage.

As Chris Hoy discussed at the Olympics, it was not just about getting the data because they could; it was about getting the most out of it, and in a sport where tenths of a second often stand between you and a gold medal, any advantage is critical.

A competitive business environment is no different, so an investment in technology that gives you the slightest edge makes perfect sense.

Today, All Flash FAS is integrated into their new datacentre running the latest iterations of ONTAP, providing a low-latency, high-performance infrastructure and ensuring that they can continue to drive value from their most critical business asset: their data.

A great use of technology to drive advantage; in fact, gold medals for data usage all round.


Hope that wasn’t too tenuous an Olympic link, and if you have any questions then of course @techstringy or LinkedIn are great ways to reach me.

If you’re interested in flash, you may also find “Is Flash For Me?” from my company website useful.