Keeping on top of ONTAP

The last year has been a big one for NetApp. The turnaround in the company's fortunes continues: fantastic growth in the all-flash array market, the introduction of cloud-native solutions and tools, and of course SolidFire and the newly announced HCI platform. All of this has created lots of interest in this "new" NetApp.

If you have read any of my content previously, you'll know I'm a fan of how NetApp operate, and their data fabric strategy continues to make them an excellent strategic data partner for many of the people I work with day-to-day.

Why am I telling you all of this? Well, as with all technology companies, it's easy to get wrapped up in exciting new tech and sometimes forget the basics of why you work with them and what their core solutions still deliver.

For all the NetApp innovations of the last couple of years, one part of their business continues to be strong and, even at 25 years old, remains as relevant to customer needs as ever: the ONTAP operating system.

ONTAP, in its latest incarnation, version 9 (9.2 to be exact), perhaps more than anything shows how NetApp continue to meet the ever-changing needs of the modern data market. It would be easy, regardless of its strength, to write off an operating system that is 25 years old, but NetApp have not; they have developed it into something markedly different from the versions I first worked with 10 years ago.

These changes reflect what we, as users in more data-focussed businesses, demand from our storage. It's not even really storage we demand, it's the ability to make our data a core part of our activities. To quote a friend, "storing is boring", and although storing is crucial, if all we are doing is worrying about storing data then we are missing the point, and if that were ONTAP's only focus, it would very quickly become irrelevant to a modern business.

How are NetApp ensuring that ONTAP 9 remains relevant and continues to be at the heart of data strategies big and small?

Staying efficient

Although storing may be boring, in a world where IT budgets continue to be squeezed and datacentre power and space come at a costly premium, squeezing more and more into less and less remains a core requirement.

Data compaction, inline deduplication, and the newly introduced aggregate-wide deduplication all provide fantastic efficiency gains. Align this with support for ever-increasing media sizes (10TB SATA, 15TB flash, something not always easy for NetApp's competition) and you can see how ONTAP continues to let you squeeze more and more of your data into a smaller footprint (60TB of effective capacity on one SSD, anyone?), something that remains critical in any data strategy.
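As a back-of-the-envelope illustration of what those ratios mean (a minimal sketch; the ratios are my own illustrative assumptions, not NetApp sizing guidance):

```python
# Rough effective-capacity arithmetic: raw capacity multiplied by the
# storage-efficiency ratio achieved through deduplication, compression
# and compaction. Real savings depend entirely on the workload.
def effective_capacity_tb(raw_tb: float, efficiency_ratio: float) -> float:
    """Effective capacity = raw capacity x efficiency ratio."""
    return raw_tb * efficiency_ratio

for ratio in (2.0, 3.0, 4.0):
    print(f"15TB SSD at {ratio:.0f}:1 -> "
          f"{effective_capacity_tb(15, ratio):.0f}TB effective")
```

At a 4:1 ratio, one 15TB SSD serves roughly 60TB of data, which is where numbers like the one above come from.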

Let it grow

As efficient as ONTAP can be, nothing is efficient enough to keep up with our desire to store more data, and more types of data. However, ONTAP is doing a pretty good job of keeping up. Not only have NetApp added further scalability to ONTAP clusters (supporting up to 24 nodes), they have also taken on a different scaling challenge with the addition of FlexGroups.

FlexGroups allow you to aggregate up to 200 volumes into a single large, high-performance storage container, perfect for those who need a single point of storage for very large datasets. This is something I've already seen embraced in areas like analytics, where high-performance access to potentially billions of files is a must.
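To give a feel for the concept, here's a toy model of a FlexGroup-style namespace: one logical container fanning files out across member volumes. This is purely illustrative; ONTAP's real placement logic uses capacity and load heuristics, not a simple hash.

```python
# Toy FlexGroup-style namespace: files written to one logical container
# are spread across many member volumes behind the scenes.
import hashlib

NUM_MEMBERS = 8  # a real FlexGroup can span up to 200 volumes

def member_volume(path: str) -> int:
    """Pick a member volume for a file (illustrative hash placement)."""
    digest = hashlib.md5(path.encode()).hexdigest()
    return int(digest, 16) % NUM_MEMBERS

for f in ("/data/genome_001.bam", "/data/logs/app.log", "/data/model.ckpt"):
    print(f"{f} -> member volume {member_volume(f)}")
```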

Keep it simple

A goal for any IT team should be the simplification of its environment.

NetApp have continued developing ONTAP's ability to automate more tasks. By using intelligent analysis of system data, they are helping to take the guesswork out of workload placement and its impact, allowing you to get it right first time, every time.

The continued development of quick deployment templates has also greatly simplified the provisioning of application storage environments, taking them from out of the box to serving data in minutes, not days.

In a world where the ability to respond quickly to business needs is crucial, the value of developments like this cannot be overstated.

Keep it secure

Maybe the most crucial part of any data strategy is security, and in the last 12 months NetApp have greatly enhanced ONTAP's capability and flexibility in this area.

SnapLock functionality was added 12 months ago, allowing you to lock data into archives that can meet the most stringent regulatory and compliance needs.

However, the biggest bonus is the implementation of onboard, volume-level encryption. Prior to ONTAP 9, the only way to encrypt data on a NetApp array was, as with most storage vendors, to use self-encrypting drives.

This was a bit of an all-or-nothing approach: it meant buying different, and normally more expensive, drives and encrypting all data regardless of its sensitivity.

9.1 introduced encryption at a more granular level, allowing you to encrypt single volumes without the need for self-encrypting drives, meaning no additional hardware and, importantly, the ability to encrypt only what is necessary.
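As a concept sketch of why that granularity matters (my own illustration using per-volume keys; this is not how ONTAP's volume encryption is implemented internally):

```python
# Granular encryption, illustrated: each sensitive volume gets its own
# key, while non-sensitive volumes are left alone, avoiding the
# all-or-nothing cost of encrypting everything.
from cryptography.fernet import Fernet  # pip install cryptography

volumes = {
    "vol_hr_records": {"sensitive": True,  "data": b"payroll records..."},
    "vol_web_assets": {"sensitive": False, "data": b"site images..."},
}

keys = {}
for name, vol in volumes.items():
    if vol["sensitive"]:
        keys[name] = Fernet.generate_key()  # one key per volume
        vol["data"] = Fernet(keys[name]).encrypt(vol["data"])
        print(f"{name}: encrypted with its own key")
    else:
        print(f"{name}: left unencrypted, no extra hardware or overhead")
```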

In modern IT, this kind of capability is critical both in terms of data security and compliance.

Integrate the future!

I started this piece by asking how you keep a 25-year-old operating system relevant. In my opinion, the only way to do that is to ensure it seamlessly integrates with modern technologies.

ONTAP has a pretty good record of that. Be it by luck or design, its port into the world of all-flash was smooth, with no need for major rewrites; the ONTAP way of working was geared towards flash before anyone had even thought of flash!

ONTAP's ability to treat media as just another layer of storage, regardless of type, was key in supporting 15TB SSDs before any other major storage vendor, and it is this flexibility in integrating new storage media that has led to one of my favourite features of the last 12 months: FabricPools.

This technology allows you to seamlessly integrate S3 storage directly into your production data, be that an on-prem object store or a public cloud S3 bucket from a provider like AWS.

In the v1.0 release in ONTAP 9.2, FabricPools tiers cold blocks from flash to your S3-compliant storage, wherever that is, giving you the ability to lower your total cost of ownership by moving data not actively in use and freeing up space for other workloads. All of this is done automatically via policy, seamlessly extending your production storage capacity by integrating modern storage technology.
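The policy idea is simple enough to sketch (a toy model of my own; FabricPools works at the block level inside ONTAP, and its real cooling thresholds differ):

```python
# Toy cold-data tiering policy: anything not accessed within a cooling
# period becomes a candidate to move from flash to S3 storage.
import time

COOLING_PERIOD_DAYS = 14  # illustrative value only

def tier_candidates(blocks, now=None):
    """Return IDs of blocks whose last access predates the cooling period."""
    now = now or time.time()
    cutoff = now - COOLING_PERIOD_DAYS * 86400
    return [block_id for block_id, last_access in blocks.items()
            if last_access < cutoff]

blocks = {
    "blk-001": time.time() - 30 * 86400,  # cold for a month: tier to S3
    "blk-002": time.time() - 3600,        # hot, accessed an hour ago: keep
}
print(tier_candidates(blocks))  # -> ['blk-001']
```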

ONTAP everywhere

As ONTAP continues to develop, the ways you can consume it also continue to develop to meet our changing strategic needs.

Fundamentally, ONTAP is a piece of software, and like any piece of software it can run anywhere that meets its requirements. The ONTAP variants Select and Cloud provide software-defined versions that can run on white-box hardware or be delivered straight from the cloud marketplaces of AWS and Azure.

The benefit of this stretches far beyond just being able to run ONTAP in more places: it means that management, security policies and data efficiencies are all equally transferable. It's one way to manage and one set of policies to implement, meaning that where your data resides at a given moment becomes less important, as long as it is in the right place at the right time for the right people.

In my opinion, this flexibility is critical for a modern data strategy.

Keep it coming

Maybe what really keeps ONTAP relevant is the fact that these new capabilities are all delivered in software. None of these features has required new hardware or an add-on purchase; they all arrive as part of the ONTAP development cycle.

And the modern NetApp has fully embraced a more agile way of delivering ONTAP, with a six-month release cadence, meaning they can quickly absorb feature requests and deliver them to the platforms that need them, allowing both them and us to respond to changing business needs.

So, while NetApp have had a fascinating year delivering great enhancements to their portfolio, ONTAP still retains a very strong place at the heart of their data fabric strategy and, in my opinion, is still the most complete data management platform, continuing to meet the needs presented by modern data challenges.

Find out more

If you want to know more about ONTAP and its development, then try these resources:

NetApp’s Website

Justin Parisi's blog – providing links to more detailed information on all of the technologies discussed, and much more!

Tech ONTAP Podcast – NetApp's excellent Tech ONTAP podcast covers everything shared here in detail; it's all in their back catalogue.

And of course you can leave a comment here or contact me on Twitter @techstringy.

Tech me for the weekend – 21st July

Those weekends just keep on rolling around, don't they? It's been a hectic old week at work, but I've still managed to catch up on some really good tech content that I thought I'd share.

If you are after some interesting reads and listens to satisfy your insatiable desire for all things tech this weekend, then give these a go.

Articles

CIO.COM – Is your data ready to help you make game changing decisions?

I presented at an event recently on this very topic and thought this was an interesting article discussing the same issues.

Many of us see the value of data and how a better understanding of it can help us make better decisions in our business, but how many of us have thought about how to package up our data so we can actually take advantage of analytics tools and become a more data-driven business? Some areas to consider are covered in this article:

http://www.cio.com/article/3192349/analytics/is-your-data-ready-to-help-you-make-game-changing-decisions.html

TechCrunch – Five building blocks of a data-driven culture

While on the data theme, I also found this from TechCrunch, further exploring the idea of making our businesses more data-centric. While the CIO article looked at how to prepare our data to be more useful to us, TechCrunch looks at the wider picture of what a business needs to become more data-centric.

They explore the importance of an authoritative data set, but also the importance of having the right skills in your business; it's no good doing all of this work with your data if no one has the slightest idea how to use it!

https://techcrunch.com/2017/06/23/five-building-blocks-of-a-data-driven-culture/

Windows IT Pro – Microsoft Inspire: Simplify, Simplify, Simplify

I'm a big supporter of this message in all areas of IT. As we become ever more reliant on technology and it becomes ever more complex, it is crucial that we take that complexity away from our end users; they need to be able to focus on making the most of their technology to meet their desired outcomes, not waste time worrying about making stuff work.

This article from Windows IT Pro comes from the recent Microsoft Inspire conference, taking a look at Microsoft's plans for simplifying technology delivery; be that cloud offerings, building hybrid solutions with Azure Stack, or using Microsoft cognitive services, the focus is on simplification.

http://windowsitpro.com/cloud-data-center/microsoft-inspire-simplify-simplify-simplify-or-least-shove-complexity-out-view

Podcasts

After all that reading, you may want to kick back for some tech listening, so here's a couple of shows to enjoy!

Tech ONTAP – Death of the specialised admin

I know the NetApp podcast team keep getting a mention, but they are knocking out some great episodes at the minute, episodes aimed at a much wider IT listenership than just NetApp customers.

This episode is one of those, as Andy Banta and Josh Atwell join the team to talk about next-generation infrastructure; not a debate about technology, but one about skill sets: what skills do we need as IT pros, and what skills does a business need as it looks to build its next-generation technology platforms?

Well worth a listen.

Virtually Speaking Podcast – vSAN Off-Road

Another old favourite, this one. I enjoy the VMware podcast as a great way to keep up with what VMware are doing.

This episode, although vSAN-focussed, touches on an interesting idea: building customised infrastructures that don't necessarily sit in any good-practice guide or reference architecture but are nevertheless supported, even if their use cases are quite unique.

The team bring up some interesting points and areas to consider, worth a listen if you are indeed taking your own IT a little “off-road”.

In Tech We Trust – Luck and Innovation

I'm enjoying the new format of this show, and there's an interesting topic this time out, as Yadin and a range of guests discuss the part luck plays in innovation: does it play a part, and if so, how big?

Interesting listen and some good sharing of experiences.

Give it a try, I’m pretty sure it will give you some things to think about!

https://intechwetrustpodcast.com/e/125-luck-and-innovation/

Tech Interviews – Living on the data edge

Talking of Yadin (smooth transition if ever there was one), he is my guest on the Tech Interviews show this week, as we discuss his day job at Druva and tackle the tricky and often ignored problem of edge data.

We discuss the data that sits out on our mobile devices, laptops, tablets, phones and USB sticks, and the unique set of problems this presents to the enterprise in terms of data management.

Yadin shares some great ideas and insights on how we can begin to tackle the challenge.

Plus, it's the last show for a few weeks as Tech Interviews takes a summer holiday – so hey, why not give it a try?

Hopefully that gives you plenty to enjoy over the weekend.

Happy teching.. watch out for some more tech content to enjoy soon…

Tech me for the weekend – 7th July

First up, an apology: it's a podcast-only list this weekend, as it's all been a bit hectic and I've not had a lot of reading time…

A bit of a theme this week, with a focus on security. Data security is a constant hot topic, from ransomware to governance and everything in between, so if you are fighting the good data security fight, you may find these shows provide some interesting insight.

All of the links below take you to the podcast show-notes pages, each of which has the episode embedded, or you can of course look for them in your favourite podcatcher.

.future from Microsoft – Securing a digital battlefield

This is a new podcast from Microsoft, and episode 1 was an interesting start, looking at the growing impact of the modern cyber threat and how, in many quarters, it is seen as the next potential "war zone". A good insight, and don't worry, it's not as scary as it sounds!

.future podcast

Inside Out Security – what does GDPR mean for countries outside of the EU

The EU General Data Protection Regulation (GDPR) is going to be a massive change in how we handle personal data. This episode looks at one of the common questions associated with the regulation: "how does it impact me if I'm outside the EU?" The reality is, it probably will.

A good debate about how and why it will impact you, regardless of location.

Inside Out Security

NetApp TechONTAP – Security update

The ONTAP boys are back in the list again this week, but to be honest, that's because I'm biased: I'm on this show!

Don't worry, it's not just me; I join the regular team, as well as some of NetApp's very own data security gurus, to talk about the latest threats, the importance of understanding your data, and how none of this works if you don't build an in-depth data security strategy.

Lots of good tips and advice.

Tech ONTAP

Tech Interviews

If the Tech ONTAP podcast isn't enough of me for you, then there is of course this week's Tech Interviews.

This show is a personal favourite and a little bit of a digression from the norm: rather than talking with someone from the tech industry, I catch up with an actual technology user!

I chat with Lee Clark of Givepenny UK about how he has integrated data and cloud technology to deliver innovation to the fundraising sector, allowing charities to find new ways of engaging with a whole new generation of fundraisers.

A fantastic example of how technology can really make a big difference – I hope you enjoy it.

How a 100-mile bike ride inspired a new way of fundraising – Lee Clark – Ep34

Hopefully that should quench your tech thirst this weekend, enjoy the shows and have a great weekend.

 

 

How a 100-mile bike ride inspired a new way of fundraising – Lee Clark – Ep34

Normally when I do this podcast I chat with people from the technology community, usually those on the supply side, to discuss how the market is changing and how technology can help us deliver innovation and change in our organisations.

This week I wanted to do something a little different for the third and final part of our brief series on using the cloud: look at a real use case, someone who has embraced cloud technology to bring innovation to an industry.

I always admire people who can spot a gap in a market; they see new and inventive ways of delivering products and services, or of working with their customers, but bringing those ideas to fruition can be difficult and expensive. One of the most powerful things about the modern cloud-driven era is how cloud providers have made technology much more accessible to those with great new business ideas, making it easier than ever for them to exploit often very advanced technologies quickly, efficiently and, importantly, for relatively low investment, to help bring an idea to fruition.

This week I’m joined by someone who has done just that, Lee Clark of Givepenny UK.

Lee describes how a 100-mile charity bike ride that he never got to do made him think about a whole new way of fundraising. Although he couldn't do the actual 100-mile event, he had tracked many hundreds of miles of training with his smartphone app, and he realised how we all track and store masses of data about our everyday activities, and how that data could be utilised to raise much-needed money for charities in the UK.

We discuss a whole range of issues that come with innovation, issues that are not unique to launching something new but are equally experienced by those trying to bring change into any organisation.

We look at how organisations are often crippled by their own behaviour, restricting their ability to embrace innovation, and how to overcome it.

Lee shares some of the ideas behind the Givepenny platform and how, by looking for new ways to engage potential fundraisers, he has created a whole range of new opportunities for charities of all types.

We also discuss how cloud has made traditionally complex technology so much more accessible and how this accessibility allowed Givepenny to deliver a platform quickly, effectively and at a low cost, especially when compared to how the charity sector first engaged with technology as part of fundraising some 15 years ago.

And finally, we look at the importance of understanding the way technology is changing so many aspects of both our personal and business lives, and why understanding this change matters not just to those looking to deploy new innovations but, crucially, to those who make decisions about the future of an organisation, decisions that will have a significant impact on future success.

Lee talks with real enthusiasm about Givepenny's journey and how they are looking to bring innovation to the charity sector, creating a whole new range of ways to engage with a very generous population who'd love to help, if only they could.

You can find out more about Givepenny on their website, givepenny.com, and you can follow them on Twitter @givepennyuk.

You can also follow Lee @jannerinbrum

Tech me for the weekend – 30th June 2017

Another busy week in the tech world, so it's time to kick back, relax and enjoy a weekend. But of course us techies never stop; there's always a chance to grab a tech article or listen to a tech podcast, so here's some stuff I've seen or heard this week that I thought I'd share. Give them a go and see what you think.

Podcasts

Try these for your tech listening pleasure:

Arrow Bandwidth

The Arrow team have produced some great episodes recently, and here's another one: a fascinating chat with IBM's The Weather Company. We all talk a lot about how data is the new gold and how our use of data is changing the way we do so many things. This episode outlines a great use case, as The Weather Company are using data on a massive scale to provide not only more accurate forecasts but also a range of brand-new weather services.

Arrow Bandwidth Podcast

IT Pro Podcast

This is a new podcast from one of my favourite podcast presenters, Richard Hay, who also presents the Windows Observer podcast talking about all things Microsoft, mainly with a "consumer" focus. The IT Pro podcast has a more business slant, and this week focusses on migration to the cloud: some of the things to consider and tips on how to deliver a great migration project.

Well worth a listen and adding to your podcast catcher of choice:

IT Pro Podcast

NetApp Tech ONTAP

My friends at NetApp have a great and very informative podcast for NetApp customers and partners. Occasionally the team go off topic, and this week was an excellent example, with a fascinating chat looking at women in tech. NetApp have a great programme in place, but they also talk about the wider work that needs doing to ensure the tech industry is a place for us all to thrive.

Tech ONTAP Podcast

Tech Interviews

This week's Tech Interviews was the second of a three-part series looking at moving to the cloud, as we tackled the interesting challenge of picking the right cloud service provider.

It's an area that often doesn't get considered to the degree it should. I'm joined by Jon Green of Navisite as we discuss the importance of picking the right partner, understanding cloud infrastructure, cloud economics, and the "gotchas" of cloud.

Gotcha – the challenge of moving to the cloud – Jon Green – Ep33

Articles

Want to kick back and relax with a tech article? No fear, try these out:

Loudaboutcloud

A new website from Kirk Ryan, a cloud architect (and star of Tech Interviews Podcast Ep32) who has put together a list of great resources for helping those building cloud solutions.

This excellent post is one such example, as Kirk shows how to control costs in the AWS cloud (there is also an equally good Azure version of the article on the site).

A great read and very insightful:

http://loudaboutcloud.com/wordpress/2017/06/26/cost-analysis-storage-costs-in-azure/

Microsoft – How artificial intelligence is changing our world

I'm a big fan of this topic, looking at how the way we use data is changing so many aspects of our lives. This article is actually the route into an eBook on the subject.

Give the article a read, and if it sounds interesting, go download the eBook that goes with it:

https://enterprise.microsoft.com/en-us/articles/roles/it-leader/empowering-people-how-artificial-intelligence-is-changing-our-world/

NetApp – Is your SaaS Covered?

An article from another previous Tech Interviews guest, Robert Cox (check him out talking object storage). If you can look past the "NetApp sales pitch", Robert asks a very important question: who is responsible for protecting your SaaS data? If you're not sure, or don't know, this is a good read.

https://newsroom.netapp.com/blogs/is-your-software-as-a-service-covered/

Hopefully all of that will keep you technically entertained… enjoy and have a great weekend.

Tech me for the weekend – June 23rd

No idea where this week has gone – it shot by like nobody's business! So here I am again with some tech content to keep you company over the weekend… check out the articles and podcasts that caught my attention this week.

Articles

Business Continuity Institute – Cultural issues the number one obstacle to digital transformation

First up, this from the Business Continuity Institute, looking at how organisational culture can inhibit progress: if you are looking to make change, and to make that change successful, don't underestimate the importance of buy-in from top to bottom.

http://www.thebci.org/index.php/about/news-room#/news/cultural-issues-are-the-number-one-obstacle-to-digital-transformation-245939

Technative – Beginners Guide to Big Data

A constant conversation at the minute is how taking advantage of our data is going to be crucial to the future success of organisations and businesses of all types.

This article looks at one of the areas behind the successful adoption of data analytics: the increasing availability of big data tools, mainly powered by the big public cloud vendors, who are making it easier than ever for us to take advantage of our data.

https://www.technative.io/a-beginners-guide-to-big-data/

Matt Watts – Data Visionaries Wanted

To continue the focus on data and cultural change, I came across this gem from Matt Watts, as he discusses the importance of data visionaries in your organisation: those people who can see the value of data and can help you extract key information from it.

http://www.watts-innovating.com/data-visionaries-wanted/

All Aboard the Data Train

A new article from me this week, as Mrs Techstringy and I found out for ourselves the importance of data analytics in informing good business decisions.

That's caught your attention, hasn't it!

All Aboard the Data Train

Podcasts

In Tech We Trust

This show has had a real revamp over the last few episodes and now has more of a focus on tech industry topics, rather than its old weekly news roundup format.

This week's is a particularly interesting topic, as the team discuss communication in the modern world, from human-to-human through to human-to-machine: a fascinating debate with a number of interesting guests, well worth a listen.

http://intechwetrustpodcast.com/e/124-technology-and-human-communication/

The ON-Premise IT podcast

This show is becoming a real favourite. This week the round-table panel take a bit of a diversion from the normal tech debate to discuss careers, from the kinds of career moves to consider through to the importance of certifications. Good tips.

http://gestaltit.com/podcast/rich/managing-career-premise-roundtable/

Influence marketing podcast

A new show to the list, as John and Cathy Troyer host chats with IT folk who are involved in tech community programmes.

Now on episode three, but I thought I'd mention the first one, with Veeam's Rick Vanover, as he discusses the extremely successful Veeam Vanguard Program: an interesting insight into the work that goes on behind the scenes.
https://medium.com/influence-marketing-council/the-rickatron-is-always-on-9f626bce7211

Tech Interviews

This week was all about moving to the cloud, as I was joined by NetApp cloud architect Kirk Ryan to discuss the important things to consider as you look at how to take advantage of cloud services in your business.

We cover the importance of cloud economics, the options for integrating the cloud with your data and how to ensure you don’t take your on-prem bad habits with you.

A top guest, with lots of great insight.

New fangled magic cloud buckets – Kirk Ryan – Ep32

Hope you enjoy that content… I'd love to hear your feedback.

Have a great weekend.

All Aboard the Data Train

The other night, Mrs Techstringy and I were discussing a work challenge. She works for a well-known charity, and one of her roles is to book locations for fundraising activities. On this occasion the team were looking at booking places at railway stations and considering a number of locations; however, all they really had to go on was "gut feeling".

As we discussed it, we did a bit of searching and came across this website http://www.orr.gov.uk/statistics/published-stats/station-usage-estimates which contains footfall information for every UK railway station over the last 20 years. This was not only train-geek heaven; it also allowed us to use the available data to make a more informed choice and to introduce possibilities that otherwise would not have been considered.

This little family exercise was an interesting reminder of the power of data and how, with the right analysis, we can make better decisions.
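If you fancy trying something similar yourself, a few lines of pandas are enough to rank candidate stations by footfall. A quick sketch; the file and column names ("Station Name", "Entries & Exits") are my assumptions, so check them against the actual ORR download:

```python
# Rank candidate stations by estimated footfall from the ORR
# station usage data (column names assumed for illustration).
import pandas as pd

usage = pd.read_csv("estimates-of-station-usage.csv")
candidates = ["Liverpool Lime Street", "Manchester Piccadilly", "Leeds"]

ranked = (usage[usage["Station Name"].isin(candidates)]
          .sort_values("Entries & Exits", ascending=False))
print(ranked[["Station Name", "Entries & Exits"]])
```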

Using data to make better decisions is hardly news. With the ever-increasing amounts of data we are collecting and ever-greater access to powerful analytics, machine learning and AI engines, all of us are already riding the data train to a world of revolutionary ideas, aren't we?

The reality is that most of us are not. But why?

For many, especially those with data sets gathered over many years, it's hard: hard to package our data in such a way that we can easily present it to analytics engines and get something useful back.

But don't let that stop you; there is a potentially huge advantage to be had from using our data effectively, and all we need is a little help to get there.

So what kind of steps can we take so we too can grab our ticket and board the data train?

Understand our data

The first thing may seem obvious: understand our data. We need to know where it is, what it is, and whether it is still relevant.

Without knowing these basics, it is going to be almost impossible to identify and package up the “useful” data.

The reality of data analytics is that we can't just throw everything at it. Remember the old adage, garbage in, garbage out? It hasn't changed: if we feed our data analytics elephant a lot of rubbish, we aren't going to like what comes out the other end!

Triage that data

Once we've identified our data, we need to make sure we don't feed our analytics engine a load of nonsense. It's important to triage: throw out the stuff that no one ever looks at, the endless replication, the material of no business value. We all store rubbish in our data sets, things that shouldn't be there in the first place, so weed it out; otherwise, at best we are going to process irrelevant information, and at worst we are going to skew the answers and make them worthless.
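A simple sweep can get the triage started. This is a rough sketch: the five-year staleness threshold is arbitrary, duplicate detection by content hash is just one signal, and the path is hypothetical, so treat the output as a review list, not a delete list:

```python
# Flag triage candidates: exact duplicates (same content hash) and
# files untouched for years. Thresholds are illustrative.
import hashlib
import os
import time

STALE_AFTER_YEARS = 5

def triage(root):
    seen, duplicates, stale = {}, [], []
    cutoff = time.time() - STALE_AFTER_YEARS * 365 * 86400
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                duplicates.append(path)  # content already seen elsewhere
            else:
                seen[digest] = path
            if os.path.getmtime(path) < cutoff:
                stale.append(path)       # not modified in years
    return duplicates, stale

dupes, old = triage("/data/archive")  # hypothetical path
print(f"{len(dupes)} duplicates and {len(old)} stale files flagged for review")
```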

Make it usable

This is perhaps the biggest challenge of all: how do we make our massive on-site datasets useful to an analytics engine?

Well, we could deploy an on-prem analytics suite, but for most of us this is unfeasible, and the reality is, why bother? Amazon, Microsoft, Google and IBM, to name but a few, have fantastic analytics services ready and waiting for your data; the trick is how to get it there.

The problem with data is that it has weight, gravity; it's the thing in a cloud-led world that is still difficult to move around. It's not only its size that makes it tricky: there is also our need to maintain control, meet security requirements and maintain compliance, and these things can make moving our data into cloud analytics engines difficult.

This is where building an appropriate data strategy is important: we need a way to ensure our data is in the right place at the right time, while maintaining control, security and compliance.

When looking to build a strategy that allows us to take advantage of cloud analytics tools, we have two basic options:

Take our data to the cloud

Taking our data to the cloud is more than just moving it there; it can't just be a one-off copy. Ideally, in this kind of setup, we move our data in, keep it synchronised with our changing on-prem data stores, and then move the analysed results back when we are finished, all with the minimum of intervention.
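Here's a minimal sketch of that "keep it synchronised" idea, pushing local files to an S3 bucket and re-uploading only what has changed. The bucket name and paths are placeholders, and a real pipeline would also handle deletes, retries and large files (or simply use a managed sync tool):

```python
# One-way incremental sync to S3: upload a file only when the local
# copy is newer than the object already in the bucket.
import os
from datetime import datetime, timezone

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "my-analytics-staging"  # hypothetical bucket

def sync_up(local_root, prefix="input/"):
    for dirpath, _, files in os.walk(local_root):
        for name in files:
            path = os.path.join(dirpath, name)
            key = prefix + os.path.relpath(path, local_root).replace(os.sep, "/")
            local_mtime = datetime.fromtimestamp(os.path.getmtime(path),
                                                 timezone.utc)
            try:
                remote = s3.head_object(Bucket=BUCKET, Key=key)
                if remote["LastModified"] >= local_mtime:
                    continue  # bucket copy is already up to date
            except ClientError:
                pass  # object not there yet, so upload it
            s3.upload_file(path, BUCKET, key)
            print(f"uploaded {path} -> s3://{BUCKET}/{key}")

sync_up("/data/to-analyse")  # hypothetical local staging directory
```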

Bring the cloud to our data

Using cloud data services doesn't have to mean moving our data to the cloud; we can bring the cloud to our data. Services like ExpressRoute into Azure or Direct Connect into AWS mean we can get all the bandwidth we need between our data and cloud analytics services while our data stays exactly where we want it: in our datacentre, under our control, and without the heavy lifting required to move it into a public cloud data store.

Maybe it's even a mix of the two, depending on the requirement and on the size and type of dataset. What's important is that we have a strategy, one that gives us the flexibility to do either.

All aboard

Once we have our strategy in place and the technology to enable it, we are good to go. Well, almost: finding the right analytics tools, and of course deciding what to do with the results when we have them, are all part of the solution, but having our data ready is a good start.

The journey has to start somewhere, so first get to know your data, understand what's important, and find a way to present it to the right tools for the job.

Once you have that, step aboard and take your journey on the data train.

If you want to know more on this subject and are in or around Liverpool on July 5th, why not join me and a team of industry experts as we discuss getting the very best from your data assets at our North West Data Forum.

And for more information on getting your data ready to move to the cloud, check out a recent podcast episode I did with cloud architect Kirk Ryan of NetApp, as we discuss the whys and hows of ensuring our data is cloud-ready.

New fangled magic cloud buckets – Kirk Ryan – Ep32