Keeping on top of ONTAP

The last year has been a big one for NetApp: the turnaround in the company’s fortunes continues, with fantastic growth in the all-flash array market, the introduction of cloud-native solutions and tools and, of course, SolidFire and the newly announced HCI platform. All of this has created lots of interest in this “new” NetApp.

If you have read any of my content previously, you’ll know I’m a fan of how NetApp operate and their data fabric strategy continues to make them the very best strategic data partner to meet the needs of many of the people I work with day-to-day.

Why am I telling you all of this? Well, like with all technology companies, it’s easy to get wrapped up in exciting new tech and sometimes forget the basics of why you work with them and what their core solutions still deliver.

For all the NetApp innovations of the last couple of years, one part of their business continues to be strong. Even at 25 years old, it remains as relevant to customer needs as ever: the ONTAP operating system.

ONTAP, in its latest incarnation, version 9 (9.2 to be exact), perhaps more than anything shows how NetApp continue to meet the ever-changing needs of the modern data market. It would be easy, regardless of its strength, to write off an operating system that is 25 years old, but NetApp have not; they have developed it into something markedly different from the versions I first worked with 10 years ago.

These changes reflect what we, as users in more data-focussed businesses, demand from our storage. It’s not even really storage we demand, it’s the ability to make our data a core part of our activities. To quote a friend, “storing is boring”, and although storing is crucial, if all we are doing is worrying about storing our data then we are missing the point. If the focus for ONTAP was only that, it would very quickly become irrelevant to a modern business.

How are NetApp ensuring that ONTAP 9 remains relevant and continues to be at the heart of data strategies big and small?

Staying efficient

Although storing may be boring, in a world where IT budgets continue to be squeezed and datacentre power and space come at a costly premium, squeezing more and more into less and less remains a core requirement.

Data compaction, inline deduplication, and the newly introduced aggregate-wide deduplication all provide fantastic efficiency gains. Align this with support for ever-increasing media sizes (10TB SATA, 15TB flash, something not always easy for NetApp’s competition) and you can see how ONTAP continues to let you squeeze more and more of your data into smaller footprints (60TB on one SSD drive, anyone?), something that remains critical in any data strategy.
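The space saving from deduplication comes from a simple idea: store each unique block once, and keep only references to it everywhere else. This is not NetApp’s implementation, just a minimal Python sketch of the concept (the block size and hash choice are assumptions for illustration):

```python
import hashlib

BLOCK_SIZE = 4096  # hypothetical block size for this toy model


def dedupe_blocks(data: bytes) -> tuple[dict, list]:
    """Store each unique block once; the logical layout keeps only references."""
    store: dict[str, bytes] = {}   # fingerprint -> physical block
    refs: list[str] = []           # logical layout, as fingerprints
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)  # identical blocks are stored only once
        refs.append(fp)
    return store, refs


# Ten copies of the same 4 KiB pattern need only one stored block.
store, refs = dedupe_blocks(bytes(4096) * 10)
print(len(refs), len(store))  # 10 logical blocks, 1 physical block
```

Real arrays do this inline, at wire speed, with far more care around collisions and metadata, but the ratio of logical blocks to physical blocks is exactly where the footprint savings come from.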

Let it grow

As efficient as ONTAP can be, nothing is efficient enough to keep up with our desire to store more data and different types of data. However, ONTAP is doing a pretty good job of keeping up. Not only have NetApp added scalability to ONTAP clusters (supporting up to 24 nodes), they have also taken on a different scaling challenge with the addition of FlexGroups.

FlexGroups allow you to aggregate up to 200 volumes into one large, high-performance storage container, perfect for those who need a single point of storage for very large datasets. This is something I’ve already seen embraced in areas like analytics, where high-performance access to potentially billions of files is a must.

Keep it simple

A goal for any IT team should be the simplification of its environment.

NetApp have continued developing ONTAP’s ability to automate more tasks, and by using intelligent analysis of system data they are helping to take the guesswork out of workload placement and its impacts, allowing you to get it right first time, every time.

The continued development of quick deployment templates has also greatly simplified the provisioning of application storage environments, taking them from out of the box to serving data in minutes, not days.

In a world where the ability to respond quickly to business needs is crucial, the value of developments like this cannot be overstated.

Keep it secure

Maybe the most crucial part of any data strategy is security, and in the last 12 months NetApp have greatly enhanced ONTAP’s capability and flexibility in this area.

SnapLock functionality was added 12 months ago, allowing you to lock your data into archives that can meet the most stringent regulatory and compliance needs.

However, the biggest bonus is the implementation of onboard, volume-level encryption. Prior to ONTAP 9, the only way to encrypt data on a NetApp array was, as with most storage vendors, through the use of self-encrypting drives.

This was a bit of an all-or-nothing approach: it meant buying different, and normally more expensive, drives and encrypting all data regardless of its sensitivity.

9.1 introduced the ability to deliver encryption at a more granular level, allowing you to encrypt single volumes without the need for self-encrypting drives, meaning no additional hardware and, importantly, the ability to encrypt only what is necessary.

In modern IT, this kind of capability is critical both in terms of data security and compliance.

Integrate the future!

I started this piece by asking how you keep a 25-year-old operating system relevant. In my opinion, the only way to do that is to ensure it seamlessly integrates with modern technologies.

ONTAP has a pretty good record of that. Be it by luck or design, its move into the world of all flash was smooth, with no need for major rewrites; the ONTAP way of working was geared for flash before anyone had thought of flash!

The ability of ONTAP to see media as just another layer of storage, regardless of type, was key in supporting 15TB SSDs before any other major storage vendor, and it is this flexibility to integrate new storage media that has led to one of my favourite features of the last 12 months: FabricPools.

This technology allows you to seamlessly integrate S3 storage directly into your production data, be that an on-prem object store, or a public cloud S3 bucket from a provider like AWS.

In the v1.0 release in ONTAP 9.2, FabricPools tier cold blocks from flash to your S3-compliant storage, wherever that is, lowering your total cost of ownership by moving data not actively in use and freeing up space for other workloads. All of this is done automatically via policy, seamlessly extending your production storage capacity by integrating modern storage technology.
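The policy mechanic is simple to picture: blocks that haven’t been read within a cooling period become candidates for the capacity tier. Here is a toy sketch of that decision; the 30-day threshold and the block model are assumptions for illustration, not ONTAP’s actual policy engine:

```python
from dataclasses import dataclass

DAY = 86_400
COOLING_PERIOD = 30 * DAY  # assumed threshold, not ONTAP's real default


@dataclass
class Block:
    name: str
    last_read: float  # epoch seconds of the most recent read


def tier_cold_blocks(hot: list, now: float) -> tuple[list, list]:
    """Split blocks into those kept on flash and those tiered out to S3."""
    keep = [b for b in hot if now - b.last_read < COOLING_PERIOD]
    tier = [b for b in hot if now - b.last_read >= COOLING_PERIOD]
    return keep, tier


now = 100 * DAY
blocks = [Block("report-2016", last_read=10 * DAY),   # cold: tiered out
          Block("db-index", last_read=99 * DAY)]      # hot: stays on flash
keep, tier = tier_cold_blocks(blocks, now)
print([b.name for b in keep], [b.name for b in tier])
```

The point is that the decision is per block, driven by access recency, and needs no application awareness; the real implementation tracks this inline and moves data back when it warms up again.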

ONTAP everywhere

As ONTAP continues to develop, the ways you can consume it also continue to develop to meet our changing strategic needs.

Fundamentally, ONTAP is a piece of software, and like any piece of software it can run anywhere that meets the requirements to run it. The ONTAP variants Select and Cloud provide software-defined versions of ONTAP that can be run on white-box hardware or delivered straight from the cloud marketplaces of AWS and Azure.

The benefit of this stretches far beyond just being able to run ONTAP in more places; it means that management, security policies and data efficiencies are all equally transferable. It’s one way to manage and one set of policies to implement, meaning that where your data resides at a given moment becomes less important, as long as it is in the right place at the right time for the right people.

In my opinion, this flexibility is critical for a modern data strategy.

Keep it coming

Maybe what really keeps ONTAP relevant is the fact that these new capabilities are all delivered in software, none of the features have required new hardware or for you to purchase an add-on, they are all delivered as part of the ONTAP development cycle.

And the modern NetApp has fully embraced a more agile way of delivering ONTAP, with a six-month release cadence, meaning they can quickly absorb feature requests and deliver them to the platforms that need them, allowing them, and us, to respond to changing business needs.

So, while NetApp have had a fascinating year, delivering great enhancements to their portfolio, ONTAP still retains a very strong place at the heart of their data fabric strategy and, in my opinion, is still the most complete data management platform, continuing to meet the needs presented by modern data challenges.

Find out more

If you want to know more about ONTAP and its development then try these resources.

NetApp’s Website

Justin Parisi’s blog – providing links to more detailed information on all of the technologies discussed and much more!

TechONTAP Podcast – NetApp’s excellent Tech ONTAP podcast has detailed information on all of the topics shared here; it’s all in their back catalogue.

And of course you can leave a comment here or contact me on twitter @techstringy


Tech me for the weekend – 21st July

Those weekends just keep on rolling around, don’t they! It’s been a hectic old week at work, but I’ve still managed to catch up on some really good tech content that I thought I’d share.

If you are after some interesting reads and listens to satisfy your insatiable desire for all things tech this weekend, then give these a go.


CIO.COM – Is your data ready to help you make game changing decisions?

I presented at an event recently on this very topic and thought this was an interesting article discussing the same issues.

Many of us see the value of data and see how getting a better understanding of it can help us make better decisions in our business, but how many of us have thought about how to package up our data so we can actually take advantage of analytics tools and become more data-driven businesses? This article covers some areas to consider.

Tech Crunch – Five building blocks of a data-driven culture

While on the data theme, I also found this from Tech Crunch, further exploring the idea of making our businesses more data-centric. While the CIO article looked at how to prepare our data to be more useful to us, Tech Crunch looks at the wider picture of what a business needs to become more data-centric.

They explore the importance of an authoritative data set, but also the importance of having the right skills in your business; it’s no good doing all of this work with your data if no one has the slightest idea how to use it!

Windows IT Pro – Microsoft Inspire: Simplify, Simplify, Simplify

I’m a big supporter of this message in all areas of IT. As we become ever more reliant on our technology and it becomes ever more complex, it is crucial that we take that complexity away from our end users; they need to be able to focus on making the most of their technology so they can meet their desired outcomes, not waste time worrying about making stuff work.

This article from Windows IT Pro comes from the recent Microsoft Inspire conference and takes a look at Microsoft’s plans for simplifying technology delivery, be that cloud offerings, building hybrid solutions with Azure Stack, or using Microsoft cognitive services; the focus is on simplification.


After all that reading, you may want to kick back for some tech listening, so here’s a couple of shows to enjoy!

Tech ONTAP – Death of the specialised admin

I know the NetApp podcast team keep getting a mention, but they are knocking out some great episodes at the minute, episodes that appeal to a much wider IT listenership than just NetApp customers.

This episode is one of those, as Andy Banta and Josh Atwell join the team to talk about next-generation infrastructure. It’s not a debate about technology but one about skill sets: what types of skills do we need as IT pros, and what kind of skills do we need as a business, as we look to build our next-generation technology platforms?

Well worth a listen.



Virtually Speaking Podcast – vSAN Off-Road

Another old favourite, this one. I enjoy the VMware podcast as a great way to keep up with what VMware are doing.

This episode, although vSAN-focussed, touches on an interesting idea: that of building customised infrastructures, not necessarily ones that sit in any good practice guide or reference architecture, but ones that are supported, even if their use cases are quite unique.

The team bring up some interesting points and areas to consider, worth a listen if you are indeed taking your own IT a little “off-road”.

In Tech We Trust – Luck and Innovation

I’m enjoying the new format of this show, and it’s an interesting topic this time out, as Yadin discusses with a range of guests the part that luck plays in innovation: does it play a part, and if so, how big?

Interesting listen and some good sharing of experiences.

Give it a try, I’m pretty sure it will give you some things to think about!

Tech Interviews – Living on the data edge

Talking of Yadin (smooth transition if ever there was one), he is my guest on my Tech Interviews show this week, as I discuss Yadin’s day job at Druva and we tackle the tricky and often ignored problem of edge data.

We discuss the data that sits out on our mobile devices, laptops, tablets, phones and USB sticks, and the unique set of problems this presents to our enterprise in terms of data management.

Yadin shares some great ideas and insights on how we can begin to tackle the challenge.

Plus, it’s the last show for a few weeks as Tech Interviews takes a summer holiday – so hey, why not give it a try.

Hopefully that gives you plenty to enjoy over the weekend.

Happy teching.. watch out for some more tech content to enjoy soon…



Tech me for the weekend – 7th July

First up, an apology: it’s a podcast-only list this weekend. It’s all been a bit hectic this week, so I’ve not had a lot of reading time…

A bit of a theme this week, with a focus on security. Data security is a constant hot topic, from ransomware to governance and all that’s in between, so if you are fighting the good data security fight, you may find these shows provide some interesting insight.

All of the links below take you to the podcast show notes pages; each of those pages has the episode embedded, or you can of course look for them in your favourite podcatcher.

.future from Microsoft – Securing a digital battlefield

This is a new podcast from Microsoft, and episode 1 was an interesting start, looking at the growing impact of the modern cyber threat and how, in many quarters, it is seen as the next potential “war zone”. A good insight, and don’t worry, it’s not as scary as it sounds!

.future podcast

Inside Out Security – what does GDPR mean for countries outside of the EU

The EU General Data Protection Regulation, GDPR, is going to be a massive change in how we handle personal data. This episode takes a look at one of the common questions associated with the regulation: “how does it impact me if I’m outside of the EU?” The reality is, it probably will.

A good debate about how and why it will impact you, regardless of location.

Inside Out Security

NetApp TechONTAP – Security update

The ONTAP boys are back in the list again this week, but to be honest, that’s because I’m biased: I’m on this show!

Don’t worry, it’s not just me. I join the regular team, as well as some of NetApp’s very own data security gurus, as we talk about the latest threats, the importance of understanding your data and how none of this works if you don’t build an in-depth data security strategy.

Lots of good tips and advice.


Tech Interviews

If the Tech ONTAP podcast isn’t enough of me for you, then there is of course this week’s Tech Interviews.

A personal favourite, this show, and a little bit of a digression from the norm: rather than talking with someone from the tech industry, I catch up with an actual technology user!

I chat with Lee Clark of Givepenny UK about how he has integrated data and cloud technology to deliver innovation to the fundraising sector, allowing charities to find new ways of engaging with a whole new generation of fundraisers.

A fantastic example of how technology can really make a big difference – I hope you enjoy it.

How a 100-mile bike ride inspired a new way of fundraising – Lee Clark – Ep34

Hopefully that should quench your tech thirst this weekend, enjoy the shows and have a great weekend.



How a 100-mile bike ride inspired a new way of fundraising – Lee Clark – Ep34

Normally when I do this podcast I chat with people from the technology community, usually those involved on the supply side, to discuss how the market is changing and how technology can help us deliver innovation and change in our organisations.

This week I wanted to do something a little different as the third and final part of our brief series on using the cloud, by looking at a real use case, someone who has embraced cloud technology to bring innovation to an industry.

I always admire those people who can spot a gap in a market; they see new and inventive ways of delivering products and services, or new ways of working with their customers. But sometimes bringing those ideas to fruition can be difficult and expensive. One of the most powerful things about the modern cloud-driven era is how cloud providers have made technology much more accessible to those with great new business ideas, making it easier than ever for them to quickly, efficiently and, importantly, for relatively low investment, exploit often very advanced technologies to help bring an idea to fruition.

This week I’m joined by someone who has done just that, Lee Clark of Givepenny UK.

Lee describes how a 100-mile charity bike ride that he never got to do made him think about a whole new way of fundraising. Although he couldn’t do the actual 100-mile event, he had tracked many hundreds of miles of training with his smartphone app, and he realised how we all track and store masses of data about our everyday activities, and how this data could be utilised to raise much-needed money for charities in the UK.

We discuss a whole range of issues that come with innovation, issues that are not unique to launching something new but are equally experienced by those trying to bring change into any organisation.

We look at how organisations are often crippled by their own behaviour, restricting their ability to embrace innovation, and how to overcome it.

Lee shares some of the ideas behind the Givepenny platform and how, by looking for new ways to engage potential fundraisers, he has created a whole range of new opportunities for charities of all types.

We also discuss how cloud has made traditionally complex technology so much more accessible and how this accessibility allowed Givepenny to deliver a platform quickly, effectively and at a low cost, especially when compared to how the charity sector first engaged with technology as part of fundraising some 15 years ago.

And finally, we look at the importance of understanding the way technology is changing so many aspects of both our personal and business lives and why understanding this change is not just for those looking to deploy new innovations, but crucially those who make decisions about the future of an organisation, decisions that will have a significant impact on future success.

Lee talks with real enthusiasm about Givepenny’s journey and how they are looking to bring innovation to the charity sector, creating a whole new range of ways to engage with a very generous population who’d love to help, if only they could.

You can find out more about Givepenny on their website and you can follow them on twitter @givepennyuk.

You can also follow Lee @jannerinbrum

Subscribe on Android


Listen to Stitcher

Tech me for the weekend – 30th June 2017

Another busy week in the tech world, so it’s time to kick back, relax and enjoy a weekend. But of course us techies never stop; there’s always a chance to grab a tech article or listen to a tech podcast. So here’s some stuff I’ve seen or heard this week that I thought I’d share. Give them a go, see what you think.


Try these for your tech listening pleasure;

Arrow Bandwidth

The Arrow team have produced some great episodes recently, and here’s another one: a fascinating chat with IBM’s The Weather Company. We all talk a lot about how data is the new gold and how our use of data is changing the way we do so many things. This episode outlines a great use case, as The Weather Company is using data on a massive scale to provide not only more accurate forecasts but also a range of brand-new weather services.

Arrow Bandwidth Podcast

IT Pro Podcast

This is a new podcast from one of my favourite podcast presenters, Richard Hay, who presents the Windows Observer podcast talking about all things Microsoft, but mainly with a “consumer” focus. The IT Pro podcast has a more business slant, and this week focusses on migration to the cloud: some of the things to consider and tips on how to deliver a great migration project.

Well worth a listen and a worthy addition to your podcast catcher of choice;

IT Pro Podcast

NetApp Tech ONTAP

My friends at NetApp have a great and very informative podcast for NetApp customers and partners. Occasionally the team go off topic, and this week was an excellent example of that, with a fascinating chat looking at women in tech. NetApp have a great programme in place, but they also talk about the wider work that needs doing to ensure the tech industry is a place for us all to thrive.

Tech ONTAP Podcast

Tech Interviews

This week’s Tech Interviews was the second of a three-part series looking at moving to the cloud, as we tackled the interesting challenge of picking the right cloud service provider.

It’s often an area that doesn’t get considered to the degree that maybe it should. I’m joined by Jon Green of Navisite as we discuss the importance of picking the right partner, understanding cloud infrastructure, cloud economics and the “gotchas” of cloud.

Gotcha – the challenge of moving to the cloud – Jon Green – Ep33


Want to kick back and relax with a tech article? Never fear, try these out;


A new website from Kirk Ryan, a cloud architect (and star of Tech Interviews podcast Ep32), who has put together a list of great resources to help those building cloud solutions.

This excellent post is one such example, as Kirk provides some great examples of how to control costs in the AWS cloud (there is also an equally good Azure version of the article on the site).

A great read and very insightful;

Microsoft – How artificial intelligence is changing our world

I’m a big fan of this topic and of looking at how the way we use data is changing so many aspects of our lives; this article is actually the route into an eBook on the subject.

Give the article a read and if it sounds interesting go download the eBook that goes with it;

NetApp – Is your SaaS Covered?

An article from another previous Tech Interviews guest, Robert Cox (check him out talking object storage). If you look past the fact that this is a “NetApp sales pitch”, Robert asks a very important question: “who is responsible for protecting your SaaS data?” If you’re not sure, or don’t know, this is a good read.

Hopefully all of that will keep you technically entertained… enjoy and have a great weekend.

Tech me for the weekend – June 23rd

No idea where this week has gone – it shot by like nobody’s business! So here I am again with some tech content to keep you company over the weekend… check out the articles and podcasts that caught my attention this week.


Business Continuity Institute – Cultural issues the number one obstacle to digital transformation

First up, this from the Business Continuity Institute, looking at the impact of organisational culture and its ability to inhibit progression, and how, if you are looking to make change and make that change successful, you shouldn’t underestimate the importance of buy-in, from top to bottom.


Technative – Beginners Guide to Big Data

A constant conversation at the minute is how taking advantage of our data is going to be crucial to the future success of organisations and businesses of all types.

This article looks at one of the areas behind the successful adoption of data analytics: the increasing availability of big data tools, mainly powered by the big public cloud vendors, who are making it easier than ever for us to take advantage of our data.

Matt Watts – Data Visionaries Wanted

To continue the focus on data and cultural change, I came across this gem from Matt Watts, as he discusses the importance of data visionaries in your organisation: those people who can see the value of data and can help you extract key information from it.

All Aboard the Data Train

A new article from me this week, as myself and Mrs. Techstringy found out for ourselves the importance of data analytics in informing good business decisions.

That’s caught your attention, hasn’t it!

All Aboard the Data Train


In Tech We Trust

This show has had a real revamp over the last few episodes and now has more of a focus on tech industry topics, rather than its old weekly news roundup format.

This week is a particularly interesting topic, as the team discuss communication in the modern world, from human to human through to human to machine: a fascinating debate with a number of interesting guests, well worth a listen.

The ON-Premise IT podcast

This show is becoming a real favourite. This week the round-table panel take a bit of a diversion from the normal tech debate as they discuss careers, from the kind of career moves to consider through to the importance of certifications. Good tips.

Influence marketing podcast

A new show to the list, as John and Cathy Troyer host chats with IT folk who are involved in Tech Community programs.

Now on episode three, but I thought I’d mention the first one, with Veeam’s Rick Vanover, as he discusses the extremely successful Veeam Vanguard Program: an interesting insight into the work that goes on behind the scenes.

Tech Interviews

This week was all about moving to the cloud, as I was joined by NetApp cloud architect Kirk Ryan to discuss the important things to consider as you look to take advantage of cloud services in your business.

We cover the importance of cloud economics, the options for integrating the cloud with your data and how to ensure you don’t take your on-prem bad habits with you.

A top guest, with lots of great insight.

New fangled magic cloud buckets – Kirk Ryan – Ep32

Hope you enjoy that content… I’d love to hear your feedback.

Have a great weekend.




All Aboard the Data Train

The other night myself and Mrs Techstringy were discussing a work challenge. She works for a well-known charity, and one of her roles is to book locations for fundraising activities. On this occasion the team were looking at booking places at railway stations and considering a number of locations; however, all they really had to go on was a “gut feeling”.

As we discussed it, we did a bit of searching and came across this website, which contains information on footfall at every UK railway station over the last 20 years. This information was not only train geek heaven, it also allowed us to use the data available to make a more informed choice and to introduce possibilities that otherwise would not have been considered.

This little family exercise was an interesting reminder of the power of data and how with the right analysis we can make better decisions.

Using data to make better decisions is hardly news. With the ever-increasing amounts of data we are collecting and greater access to powerful analytics, machine learning and AI engines, all of us are already riding the data train to a world of revolutionary ideas, aren’t we?

The reality is, that most of us are not, but why?

For many, especially those with data sets gathered over many years, it’s hard: hard to package our data in such a way that we can easily present it to analytics engines and get something useful back.

But don’t let that stop you; there is potentially huge advantage to be had from using our data effectively, and all we need is a little help to get there.

So what kind of steps can we take so we too can grab our ticket and board the data train?

Understand our data

The first step may seem obvious: understand our data. We need to know: where is it? What is it? Is it still relevant?

Without knowing these basics, it is going to be almost impossible to identify and package up the “useful” data.

The reality of data analytics is that we can’t just throw everything at it. Remember the old adage, garbage in, garbage out? It hasn’t changed; if we feed our data analytics elephant a lot of rubbish, we aren’t going to like what comes out the other end!

Triage that data

Once we’ve identified our data, we need to make sure we don’t feed our analytics engine a load of nonsense, so it’s important to triage. Throw out the stuff that no one ever looks at, the endless replication, the stuff of no business value; we all store rubbish in our data sets, things that shouldn’t be there in the first place, so weed it out. Otherwise, at best we are going to process irrelevant information; at worst we are going to skew the answers and make them worthless.
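As a rough illustration, triage can be as simple as dropping anything nobody has touched in years and collapsing byte-identical duplicates. A hedged Python sketch, where the three-year cut-off is purely an assumption and real triage would also weigh business value:

```python
import hashlib

YEAR = 365 * 86_400
STALE_AFTER = 3 * YEAR  # assumed cut-off; pick whatever fits your retention rules


def triage(files: dict, last_access: dict, now: float) -> list:
    """Return paths worth feeding to analytics: recently used and not duplicated."""
    seen: set = set()
    keep: list = []
    for path in sorted(files):
        if now - last_access[path] > STALE_AFTER:
            continue  # nobody ever looks at it any more
        fp = hashlib.sha256(files[path]).hexdigest()
        if fp in seen:
            continue  # byte-identical copy already kept under another name
        seen.add(fp)
        keep.append(path)
    return keep


now = 20 * YEAR
files = {"report.csv": b"sales", "copy-of-report.csv": b"sales", "old.log": b"noise"}
last_access = {"report.csv": now, "copy-of-report.csv": now, "old.log": now - 5 * YEAR}
print(triage(files, last_access, now))  # one copy of the report survives; old.log is dropped
```

Even a crude pass like this shrinks what you ship to the analytics engine and, more importantly, stops duplicated and dead data from skewing the answers.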

Make it usable

This is perhaps the biggest challenge of all: how do we make our massive on-site datasets useful to an analytics engine?

Well, we could deploy an on-prem analytics suite, but for most of us this is unfeasible, and the reality is, why bother? Amazon, Microsoft, Google and IBM, to name but a few, have fantastic analytics services ready and waiting for your data; the trick is how to get it there.

The problem with data is that it has weight, gravity; it’s the thing in a cloud-led world that is still difficult to move around. It’s not only its size that makes it tricky, but also our need to maintain control, meet security requirements and maintain compliance; these things can make moving our data into cloud analytics engines difficult.

This is where building an appropriate data strategy is important; we need a way to ensure our data is in the right place, at the right time, while maintaining control, security and compliance.

When looking to build a strategy that allows us to take advantage of cloud analytics tools, we have two basic options:

Take our data to the cloud

Taking our data to the cloud is more than just moving it there; it can’t just be a one-off copy. Ideally, in this kind of setup, we need to move our data in, keep it synchronised with changing on-prem data stores and then move our analysed data back when we are finished, all with the minimum of intervention.
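Keeping an on-prem store synchronised with a cloud copy usually comes down to shipping only what has changed since the last sync. A minimal sketch of that delta detection using content hashes; the names and structures here are illustrative, not any particular sync tool’s API:

```python
import hashlib


def needs_upload(local: dict, remote_hashes: dict) -> list:
    """Return paths that are new or modified locally since the last sync."""
    changed = []
    for path, content in sorted(local.items()):
        fp = hashlib.sha256(content).hexdigest()
        if remote_hashes.get(path) != fp:
            changed.append(path)  # remote copy is missing or stale
    return changed


local = {"a.csv": b"v2", "b.csv": b"v1"}
remote = {"a.csv": hashlib.sha256(b"v1").hexdigest(),   # stale in the cloud
          "b.csv": hashlib.sha256(b"v1").hexdigest()}   # already in sync
print(needs_upload(local, remote))  # only a.csv needs re-uploading
```

The same comparison run in the other direction brings analysed results back on-prem, which is exactly the minimal-intervention loop described above.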

Bring the cloud to our data

Using cloud data services doesn’t have to mean moving our data to the cloud; we can bring the cloud to our data. Services like ExpressRoute into Azure or Direct Connect into AWS mean that we can get all the bandwidth we need between our data and cloud analytics services, while our data stays exactly where we want it: in our datacentre, under our control and without the heavy lifting required to move it into a public cloud data store.

Maybe it’s even a mix of the two, dependent on requirement, size and type of dataset. What’s important is that we have a strategy, one that gives us the flexibility to do either.

All aboard

Once we have our strategy in place and the technology to enable it, we are good to go. Well, almost: finding the right analytics tools, and of course deciding what to do with the results when we have them, are all part of the solution, but having our data ready is a good start.

That journey has to start somewhere, so first get to know your data, understand what’s important and find a way to ensure you can present it to the right tools for the job.

Once you have that, step aboard and take your journey on the data train.

If you want to know more on this subject and are in or around Liverpool on July 5th, why not join me and a team of industry experts as we discuss getting the very best from your data assets at our North West Data Forum.

And for more information on getting your data ready to move to the cloud, check out a recent podcast episode I recorded with Cloud Architect Kirk Ryan of NetApp, as we discuss the whys and hows of ensuring our data is cloud ready.

New fangled magic cloud buckets – Kirk Ryan – Ep32

Tech me up – Your tech entertainment for this weekend – 16th June

It’s the summer, the sun is out, it’s the weekend, so what’s the only thing missing? Some great technical content to keep you company while you sip a cold one.

This week’s list of top tech has a data security slant to it, so not only will it keep you informed, it will help you keep those data assets secure.

Settle back and enjoy:


For those who like to listen to their tech, here’s a list of great podcasts I caught this week.

Arrow Bandwidth

An excellent and somewhat unique chat with Marcus Hutchins, better known as MalwareTech, the information security engineer who discovered the “kill switch” for the WannaCrypt ransomware outbreak.

They discuss how he discovered the workings of the malware, how he came across the kill switch and how he’s dealing with the “fame” that came with it.

A great listen.

InfoSec Podcast

This is a relatively new podcast to my list, but an excellent weekly discussion on the latest news from the information security industry.

This week they focus on the latest release of SANS’ Security Awareness Report, which identified communication as one of the primary reasons why awareness programs thrive or fail.

The team also look at the difficulties of legislating for cyber security in a fast-moving technology world.

Infosec Podcast



Datanauts

More security, this time from the Datanauts team, who are joined by James Holland and Aaron Miller from Palo Alto Networks to discuss the evolution of security architectures and approaches, the importance of application awareness, and the impact of virtualization, which can both create new risks and provide new opportunities.

They also look at where security is going, how cloud and virtualization will continue to shape your security infrastructure, and how skill sets will have to adapt to support more automation.

Datanauts Podcast

NetApp Tech ONTAP

A bumper week for podcasts and if you want something not security related, how about some coding?

I really enjoyed this episode of the NetApp Tech ONTAP podcast; it’s not really NetApp focussed at all, but a great chat with Ashley McNamara of Pivotal discussing how storage administrators (and pretty much anyone) should be learning to code. Ashley also points us to resources that can help aspiring developers and scripters succeed.

Great fun. Have a listen.

Tech Interviews

This week was a VMware special, as I was joined by the hosts of the excellent VMware vSPEAKING Podcast, Pete Flecha and John Nicholson. They make great guests as we discuss how people’s changing demands on technology are changing how we have to design and architect our infrastructure.

We also look at how our infrastructure not only needs to be faster and simpler, but also needs to be smarter and how our application and data centric world is driving demands for availability.

John also introduces us to the concept of giving jetpacks to cavemen!

Great fun with the guys, have a listen.


Would you rather settle back in the garden and read your tech? Then try out these security-focussed articles:

Data Privacy Monitor – Deeper Dive Security is a big deal for big data

We are all keen to take advantage of data analytics so we can get more value from our data assets, but how many of us consider the range of security challenges that come with consuming those public big data services?

In this article Lavonne Hopkins looks at a range of issues to consider and provides some solid advice.

Compare The Cloud – Refashioning data security with a nod to the cloud

The thing with the Internet is that once it’s out there, it stays out there; on the plus side, you can find little gems of articles that you may not have read at the time.

This was one of those, posted back in Jan 2016 by the team at Compare the Cloud.

In this article they look at the challenge that CISOs have: pulled in many directions by their businesses, while having to deal with all of the evolving security threats.

It’s an interesting read looking at the kind of approaches that a CISO can look at to help take on the multiple challenges they are faced with.

Forbes – Why manufacturers should be mindful of cybersecurity

This article discusses how cyber attackers target the manufacturing industry, what caught my attention was how the idea of small imperceptible changes can eventually have a huge impact.

This approach is not just a threat to manufacturing; it is also a threat to security built around analytics and AI. One of the approaches attackers take to overcome machine-learning-based security is to make small changes that the algorithms don’t notice and, over time, start to accept as normal behaviour.
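To make that drift concrete, here’s a toy sketch of my own (it’s not from the article, and it’s far simpler than any real product): an anomaly detector built on a rolling baseline that absorbs every reading it doesn’t flag, so a patient attacker can walk it to a new “normal” one small step at a time.

```python
from collections import deque
from statistics import mean

def make_detector(window=50, tolerance=0.5):
    """Toy anomaly detector: flag a reading that deviates from the
    rolling mean by more than `tolerance` (relative), then absorb
    every unflagged reading into the baseline."""
    history = deque([1.0] * window, maxlen=window)

    def check(value):
        baseline = mean(history)
        anomalous = abs(value - baseline) > tolerance * baseline
        if not anomalous:
            history.append(value)  # unflagged readings become "normal"
        return anomalous

    return check

check = make_detector()

# A sudden jump from 1.0 to 3.0 is caught immediately...
assert check(3.0) is True

# ...but an attacker raising the value by 1% per step is never flagged,
# and the detector slowly learns to accept 3.0 as normal behaviour.
value, flagged = 1.0, False
while value < 3.0:
    value *= 1.01
    flagged = flagged or check(value)

print(flagged)     # False: the slow drift was absorbed
print(check(3.0))  # False: 3.0 now looks normal
```

The same tripled value that was flagged at the start sails straight through at the end, which is exactly the small-imperceptible-changes problem the article describes.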

This provides an interesting read and highlights the complexity of the security threat.

Hopefully something there for everyone, enjoy your weekend.

Tech me for the weekend – June 9th

Well, well, it’s the weekend again, and I know you all find it tough to drag yourselves away from the world of tech. Worry not: here’s this week’s list of top tech entertainment to keep you teched up this weekend.

This week’s list has a bit of a data theme, so dive in; there are podcasts and articles aplenty…

The Reading

Why Data will drive your success in the cloud – Matt Watts

Matt works closely with the office of the CTO at NetApp and wrote this interesting piece on their site about the changing way we all see data’s value and what that means for the way we build our data strategy.

Inspired by a recent article in The Economist that notes in today’s economy, “the world’s most valuable resource is no longer oil, but data.” – I know we’ve all heard it, but Matt explores the topic and what it means.

Why data will drive your success in the cloud

Setting sail for uncharted waters – Ruairi McBride

Some big announcements from storage behemoth NetApp this week which included their entry into the world of Hyperconverged Infrastructure (HCI).

Now, there’s no question that NetApp are late to this particular party, but Ruairi looks at why, at the potential benefits NetApp gain from having sat back and watched the market for a while, and at what their solution brings that may make it stand out.

If you want to know more about the NetApp HCI offering, this is a great place to start.

Setting sail for uncharted waters

One year until the EU GDPR

This complex regulation will finally be enforced in just under a year, and personally I don’t think you can ever read up too much on what it may mean for us all and for the way we collect, store, manage, protect and dispose of our data assets.

Good read here from CITY A.M., which also includes some quotes from my favourite data privacy attorney, Sheila Fitzpatrick.

A good piece, well worth a read.

One year until the EU GDPR

Big wheels keep on turning

Back to NetApp: I wrote a piece myself this week looking at NetApp’s continued evolution from storage company to data management company, how that is progressing, why it matters and how they are embracing this increasingly data-driven world.

Big Wheels Keep On Turning

The Listening

Arrow bandwidth

This is an episode from a couple of weeks ago, but it fits in nicely with this week’s data theme, as Rich and Dave are joined by Vince Payne to discuss the data industry and the impact of business intelligence and analytics.

Rich plays the role of devil’s advocate perfectly, as the team discuss whether BI and analytics really are a “thing” and whether people are actually using these technologies.

Excellent and thoughtful debate as they look at whether data really is the new gold!


Gestalt IT – The ON-Premise IT round table

Another excellent debate asking what is the reality for the future of data, is AI, machine learning and data analytics really going to change the world and do something interesting?

And does the future really include clever machines replacing people? Or will we poor humans always have a place?

More excellent devil’s advocacy from Nigel Poulton.

What is big data?

Speaking In Tech

Rounding off the data chat, a brilliant guest joins the Speaking In Tech team, as Michel Feaster of Usermind discusses how data analytics and intelligence can have a massively positive impact on our customer experiences.

Some great practical examples of how using intelligence alongside traditional systems can revolutionise the kind of results we get, give it a listen.

Speaking in Tech: Blame millennials for customer engagement upheaval

Tech Interviews

Of course, you don’t get this far without a plug for my own show; this week is the second part of a look at the data availability market.

My guests this week discuss how to gain more value from data backups, how to ensure that our focus on application availability doesn’t do more harm than good, and whether availability is given the right level of importance.

Three great guests: Mike Beevor of Pivot3, DataGravity’s Dave Stevens and Andrew Smith of IDC.


Oh, and if you missed part one, where I chat with Justin Warren and Jeff Leeds, fear not, it’s here…

So enough to keep you busy..

Have a great weekend.


Big Wheels Keep On Turning

Just about a year ago I wrote a piece about NetApp and how they were making a strategic shift (Turning a big storage ship), changing their focus as well as the perception of both the industry and customers. This coincided with the launch of the latest version of the company’s best-selling storage operating system, ONTAP version 9.

A year on, after spending a few days with NetApp’s leadership as part of our annual NetApp A-Team get-together, I thought it would be good to check in on how that big storage ship was doing and whether it was still turning in the right direction.

First, some context: data is increasingly the lifeblood of our organisations; it’s among the top two or three assets any business holds, and we are constantly seeing organisations use data in ever more creative ways, while of course we continue to create more of it and keep it for longer.

Not only do we need more from our data, the way we consume data services is changing. The big public cloud providers are giving us analytics services on demand, allowing us to solve more and more complex problems, as long as we can get our data to their cloud offerings in the first place. That means more data being housed in the cloud, which is great for analytics but isn’t always a great fit for our data sets.

In that context, how does a big storage vendor remain relevant?

In my opinion, they have to embrace the changing attitude to data; just wanting to store it isn’t enough. To quote a friend of mine, “storing is boring”, and in reality it kind of is: if your only view of data strategy is storing it neatly, you are missing a trick.

So the question is, are NetApp embracing this new data driven world?

Shift to a data management company

This is something I’ve been hearing over the last six months, and I fully expect it to be front and centre of a lot of NetApp messaging as they move from storage company to data management company. This focus is absolutely right; in my own company, we have done the same thing, because it’s what our customers demand. It’s not about building infrastructure and storing data, it’s about taking a valuable asset and getting the most out of it.

There is no point just talking to a modern organisation about how much storage you can provide and how fast it is; organisations want to know “how can you make sure my data asset remains an asset?”

Data Fabric

For those not familiar with NetApp’s Data Fabric, it is a critical part of their vision as they make the shift to a data management company. A data fabric is NetApp’s view of how we build a data infrastructure that allows us to get the best from our data, giving us flexibility in how and where we store it and how we move it, while maintaining security and compliance, all crucial in a modern data strategy.

But this goes beyond a strategic goal; it is baked into all of NetApp’s thinking. The idea that you can move data across any NetApp platform, regardless of whether it’s hardware, white box, a virtual machine or even sat in AWS or Azure, is very powerful. It also isn’t limited to ONTAP, allowing us to move data between ONTAP, SolidFire, E-Series, AltaVault and even non-NetApp platforms via FlexArray.

Ultimately, will the data fabric be stretched beyond the NetApp portfolio? Who knows; it would be great if it were, but there’s a lot of work to be done.

Embracing the new world

Part of the new way of working with data includes the cloud; there is no getting away from this reality. Whether it’s consuming SaaS like Office 365 and Salesforce, holding our data long term in S3 or Azure Blob storage, or needing to present our data to analytics tools, organisations are moving more data to the cloud.

What part does an on-prem storage vendor play in this? It has to be twofold:

Help me to move data to the cloud

Because they supply on-prem storage arrays, NetApp can’t ignore the reality that their customers want to move data to the cloud. To NetApp’s credit, they are embracing this challenge and helping to enable that movement.

The data fabric strategy and ONTAP are a key part of this. The ability to take NetApp’s core storage OS and deploy it directly in either AWS or Azure means that not only can you move your data from your on-prem array straight into a public cloud, but, because it’s the same operating system end to end, you can crucially maintain all of your on-premises efficiencies, management and controls on your data in the public cloud, and that is a real positive.

It’s not only the moving of data to the cloud that NetApp have turned their focus to, however; they are also looking at ways that cloud-based services can play a part in their future, which is equally interesting. This has started with two services: Cloud Sync and Cloud Control.

Cloud Sync helps users automate the process of moving data from on-premises NFS datastores straight into Amazon S3 storage and back again.
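I haven’t dug into Cloud Sync’s internals, but the core idea of incremental one-way synchronisation is easy to sketch. This toy Python example (my own illustration, nothing to do with NetApp’s actual implementation, which also handles NFS and S3 endpoints, scheduling and the return trip) copies only the files that are new or whose content has changed:

```python
import hashlib
import shutil
from pathlib import Path

def sync(source: Path, target: Path) -> list[str]:
    """Toy one-way incremental sync: copy a file from source to target
    only when it is new or its content (by checksum) has changed."""
    def digest(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    copied = []
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        dst = target / src.relative_to(source)
        if not dst.exists() or digest(dst) != digest(src):
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            copied.append(str(src.relative_to(source)))
    return copied
```

Run repeatedly, only the delta moves; a second call with nothing changed copies nothing at all, which is what makes keeping on-prem and cloud copies in step practical.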

Cloud Control, meanwhile, allows organisations to protect their Office 365 data by backing it up and holding it in an alternate location.

The important thing to note about these two offerings is that they are exactly that: services. No traditional NetApp tools are needed as part of the solution; you subscribe to the services and begin to use them.

If anything proves NetApp’s position on embracing the new world, it is this.

Big Wheels Still Turning?

With a range of new announcements due soon, including the much-anticipated NetApp HCI platform, the storage behemoth, in my opinion, continues to evolve; its focus is right and certainly aligns with the challenges that the organisations I deal with talk about.

It continues to do smart things within its core product set, adding tools that enable the wider data fabric strategy and working them directly into the portfolio, especially the product at the heart of it, ONTAP.

Personally, I continue to be very enthused by what NetApp are doing and the direction they are taking; for me, those big wheels are not only turning, they are turning in exactly the right direction.

Let’s see if they can keep it up.