Cloud is not new, and I don’t think that’s news to anyone. Many of us have deployed a cloud solution, be it a SaaS platform, some public cloud infrastructure or a few VMs for test and dev, and cloud continues to play a major part in IT strategy for an ever-increasing number of businesses.
However, this move to cloud has not come without us learning an awful lot along the way. We’ve probably all heard of, or maybe even been involved with, cloud deployments that have not gone as expected: the technology hasn’t given us what we wanted, the commercials didn’t stand up to our calculations, or it just wouldn’t work the way our on-premises platform did. Many of the issues behind those poor cloud experiences have been driven by an “immaturity” in our approach; we are often too quick to dictate a cloud-first strategy, regardless of whether cloud is, in reality, the way to go.
Is our approach to cloud beginning to change? Do we need to consider our cloud strategy a little differently?
That’s the question we ask on this week’s podcast, an episode inspired by a fantastic article written by my guest Matt Watts, Director of Technology and Strategy, EMEA at NetApp. In it, Matt posed the question “Are you Cloud First! or Cloud First?” and explored the difference a bit of punctuation can make. You can read the article here.
I thought the topic he covered and the question he raised were worthy of further investigation, and that’s what we do on this week’s show.
During the show we discuss the article in depth. We start by looking at what drove Matt to write it and the importance of understanding the difference between a strategy and a mandate. We also look at examples of early mistakes that have meant we’ve needed to change our approach.
We talk about the issues created by taking on-prem solutions and “dumping” them “as is” into the cloud without asking “is there any value in doing this?”, and how this drives bad practice in cloud adoption. We also coin the phrase “there is no zealot like a technology zealot!”.
We also explore the idea that cloud adoption isn’t about cost savings, so if it’s not that, why do we want to adopt cloud?
We wrap up by looking at examples of building a more mature cloud strategy and where this has worked well; Matt shares some examples of how NetApp’s own internal cloud maturity has driven their decision making. Matt’s final thought: without an appropriate and mature cloud strategy, you run the risk of building yourself a whole new set of silos and limitations.
Matt, as always, shares some fascinating insight into cloud strategy. To find out more from Matt, check out his other blogs on his watts-innovating site. You can also follow Matt on Twitter @mtjwatts.
Next week we get an update on the innovations and developments in VMware Cloud on AWS. Until then, thanks for listening.
A bit of disclosure: as a Director at a long-term NetApp partner, Gardner Systems, and a member of the NetApp A-Team advocacy programme, I could be biased. But having worked with NetApp for over 10 years, I still see them meeting our customers’ needs better than any other vendor, which in itself suggests NetApp are doing something right.
What is it they’re doing? In this post, I share some thoughts on what I believe are key parts of this recent success.
If we wind the clock back four years, NetApp’s reputation was not at its best. Tech industry analysts painted a bleak picture: the storage industry was changing, with public cloud storage and innovative start-ups offering to do more than those “legacy” platforms, and in many cases they could. NetApp were a dinosaur on the verge of extinction.
Enter the Data Fabric, first announced at NetApp’s technical conference, Insight, in 2014. Data Fabric was the beginning of NetApp’s move from a company focussed on storing data to one focussed on the data itself. This was significant, as it coincided with a shift in how organisations viewed data: moving away from just storing it to managing, securing, analysing and gaining value from it.
NetApp’s vision for the data fabric closely aligned with the aims of more data-focussed organisations, and it also changed the way they thought about their portfolio: less worried about speeds and feeds and flashing lights, and more about how to build a strategy focussed on data in the way their customers were.
It is this data-driven approach that, in my opinion, has been fundamental in this change in NetApp’s fortunes.
Embrace the Cloud
A huge shift, and something that has taken both customers and industry analysts by surprise, is the way NetApp have embraced the cloud: not a cursory nod, but cloud as a fundamental part of the data fabric strategy. And this goes way beyond “cloudifying” existing technology.
ONTAP Cloud seamlessly delivers the same data services and storage efficiencies into the public cloud as you get with its on-prem cousin. This provides a unique ability to maintain data policies and procedures across your on-prem and cloud estates.
But NetApp have gone beyond this, delivering native cloud services that don’t require any traditional NetApp technology. Cloud Sync allows the easy movement of data from on-prem NFS datastores into the AWS cloud, while Cloud Control provides a backup service for Office 365 (and now Salesforce), bringing crucial data protection functionality that many SaaS vendors do not provide.
If that wasn’t enough, there is the recently announced relationship with Microsoft, with NetApp now powering the Azure NFS service. Yep, that’s right: if you take the NFS service from the Azure marketplace, it is delivered behind the scenes by NetApp.
For a storage vendor, this cloud investment is unexpected, but a clear cloud strategy is also appealing to those making business technology decisions.
Getting the basics right
With these developments, it’s clear NetApp have a strategy and are expanding their portfolio into areas other storage vendors do not consider. But there is also no escaping that their main revenue continues to come from ONTAP and FAS (NetApp’s hardware platform).
If I’m buying a hardware platform, what do I want from it? It should be robust, with strong performance, and a good investment that evolves with my business; if NetApp’s commercial success is anything to go by, they are delivering this.
The all-flash NetApp platforms (such as the award-winning A200 mentioned earlier) are meeting this need: robust, enterprise-level platforms that allow organisations to build an always-on storage infrastructure that scales seamlessly with new business demands. Six-year flash drive warranties and the ability to refresh your controllers after three years also give excellent investment protection.
It is not just the hardware, however; these platforms are driven by software. NetApp’s ONTAP operating system is like any other modern software platform, with regular code drops (every six months) delivering new features and improved performance to existing hardware via a non-disruptive software upgrade. This gives businesses the ability to “sweat” their hardware investment over an extended period, which in today’s investment-sensitive market is hugely appealing.
Have an interesting portfolio
NetApp was for a long time the FAS and ONTAP company, and while those things are still central to their plans, their portfolio is expanding quickly. We’ve discussed the cloud-focussed services; there’s also SolidFire, with its unique scale and QoS capabilities, StorageGRID, a compelling object storage platform, and AltaVault, which provides a gateway to move backup and archive data into object storage on-prem or in the cloud.
Add to this the newly announced HCI platform, and you can see how NetApp can play a significant part in your next-generation datacentre plans.
For me, the awards I mentioned at the beginning of this article are not down to one particular solution or innovation; it’s the data fabric. That strategy allows NetApp, its partners and customers to have a conversation that is data- rather than technology-focussed, and having a vendor who understands that is clearly resonating with customers, analysts and industry influencers alike.
NetApp’s continued evolution is fascinating to watch, and they have more to come, with no doubt more awards to follow. Whatever next!
This week’s Tech Interviews is the first in a short series where I bring together a selection of people from the IT community to gauge the current state of business IT and gain some insight into the key day-to-day issues affecting those delivering technology to their organisations.
For this first episode I’m joined by three returning guests to the show.
Michael Cade is a Technical Evangelist at Veeam. Michael spends his time working closely with both the IT community and Veeam’s business customers to understand the day-to-day challenges that they face from availability to cloud migration.
Mike Andrews is a Technical Solutions Architect at storage vendor NetApp, specialising in NetApp’s cloud portfolio. Today Mike works closely with NetApp’s wide range of customers to explore how to solve the most challenging business issues.
Mark Carlton is Group Technical Manager at Concorde IT Group. He has extensive experience in the industry, having worked in a number of different types of technology businesses; today Mark works closely with a range of customers, helping them use technology to solve business challenges.
The panel discuss a range of issues, from availability to cloud migration, the importance of the basics, and how understanding the why, rather than the how, is a crucial part of getting your technology strategy right.
The team provide some excellent insights into a whole range of business IT challenges and I’m sure there’s some useful advice for everyone.
Next time I’m joined by four more IT avengers, as we look at some of the other key challenges facing business IT.
If you enjoyed the show and want to catch the next one, then please subscribe, links are below.
The idea that our data is critical to the future of our organisation isn’t a new one; the focus on managing, protecting and securing it underlines its importance to any modern organisation.
But protecting our data and ensuring we maintain its privacy and security is not the only important focus we should have.
You don’t need to look around the technology industry for long to hear phrases such as “data is the new gold” or “data is the new oil”, but like any good marketing phrase, they are based on a degree of fact.
As marketing-y as those phrases are, it would be wrong to dismiss them. The image I chose for this blog post suggests that “if the future is digital, the guy with the most data wins”. However, I think that phrase is only partly correct.
It is certain that the modern organisation is becoming increasingly digital, transforming into one that relies on data and digital workflows for its success. However, when it comes to data, it’s not how much you have but what you do with it, and learn from it, that will determine who really wins.
That’s the focus of this week’s podcast as I’m joined by NetApp’s Director, Technology and Strategy, Matt Watts.
Matt is in an interesting position, working for one of the world’s largest “traditional” storage vendors and charged with helping them to develop a strategy for dealing with challenges faced by organisations in a world where “traditional” storage is seen as something less valuable.
Maybe to the surprise of many, Matt agrees: while NetApp have great products, they fully accept that the future isn’t about IOPS, capacities and flashing lights. All that really matters is the data.
In this episode, Matt provides fascinating insights into the modern data world: how extracting valuable information from data is a significant advantage to an organisation, and how third-party companies working with storage vendors are critical to the future of data management. We also discuss how companies like Microsoft, Amazon and IBM (with Watson) are commoditising machine learning and artificial intelligence to a point where organisations of all sizes can take advantage of these very smart tools, gaining insights and understanding that just a few years ago were out of reach for all but the very wealthiest of companies.
We also look at how building an appropriate data management strategy is crucial in enabling organisations to access tools that can allow them to take full advantage of their data asset.
Have a listen. Matt provides some great information to help you get the maximum from your data, because it’s not the person with “the most data” but the one with “the most information from their data” that wins.
Last week saw the end of a wonderful summer of sport from Rio, where the Olympics and Paralympics gave us sport at its best: people achieving lifetime goals, setting new records and inspiring a new generation of athletes.
I’m sure many of you enjoyed the games as much as I did, but why bring it up here? Well, for someone who writes a blog, it’s almost a contractual obligation in an Olympic year to write something with a tenuous Olympic link. So here’s my entry!
One part of the Team GB squad that really stood out in Rio were the Olympic cyclists, winning more gold medals than all of the other countries combined (6 of the 10 available), a phenomenal achievement.
This led to one question being asked continually: “What’s the secret?”. In one BBC interview, Sir Chris Hoy was asked that question, and his answer fascinated me. During his career, the biggest impact on British cycling was not equipment, facilities, training, or superhuman cyclists. It was data. Yes, data: not just collecting it, but, more importantly, the ability to extract valuable insight from it.
We hear it all the time
“those who will be the biggest successes in the future are those that get the most value from their data”
and what a brilliant example the cyclists were. We see this constantly in sport, where the smallest advantage matters, but not just sport; increasingly this is the case in business, as organisations see data as the key to giving them a competitive edge.
We all love these kinds of stories about how technology can provide true advantage, but it’s always great to see it in action.
A couple of weeks ago I was on a call with the technical lead of one of our customers. He and his company see the benefit of technology investment and how it delivers business advantage. I’ve been lucky enough to work with them over the last four years or so and have watched the company grow around 300% in that time. We were talking with one of his key technology vendors, explaining how their technology had been an instrumental part of that success.
During the call I realised this was my opportunity for a tenuous-Olympic-link blog post: how, as with the cyclists, getting the best from data was delivering real bottom-line success to the business.
The business is a smart energy company, doing very innovative work in the commercial and private energy sectors. They’re in a very competitive industry dominated by some big companies, but these guys are bucking that trend: a great example of how a company that is agile and knows how to exploit its advantage can succeed.
In their industry, data is king. They pick up tonnes of data every day from customers, devices and sensors, and manipulating this data and extracting valuable information from it is key to their success.
Until about a year ago they were running their database and reporting engines (SQL based) on a NetApp storage array running 7-Mode. That had worked, but a year ago we migrated the infrastructure to clustered Data ONTAP to provide increased flexibility, mobility of data and more granular separation of workloads.
However, the smartest thing they did as part of the migration was to deploy Flash Pools into their environment. Why was this so earth-shattering?
A big part of the value of their SQL infrastructure is reporting, which allows them to provide better services to their customers and suppliers, giving them an advantage over their competitors.
However, many of those reports took hours to run; in fact, the process was to request the report and it would be ready the next day.
The introduction of Flash Pools into the environment (Flash Pool is a flash-based acceleration technology available in NetApp ONTAP arrays) had a dramatic effect, taking these overnight reports and delivering them in 30-60 minutes.
This significant reduction in report run times meant more reports could be run, producing different data that could be used to offer new and improved services to customers.
Last year the technical lead attended NetApp Insight in Berlin. One of the big areas of discussion that caught his interest was the development of All Flash FAS (AFF), NetApp’s all-flash variants of their ONTAP-driven FAS controllers.
They immediately saw the value of this high-performance, low-latency technology, so earlier this year we arranged for an AFF proof of concept to be integrated into the environment. During the POC, the team moved a number of SQL workloads to the flash-based storage, and it’s no understatement to say this transformed their data analysis capabilities: those 30-60 minute reports were now running in 2-3 minutes.
But this was not just about speed; it truly opened up brand-new capabilities and business opportunities. Now the organisation could provide their customers and suppliers with information that was previously impossible, and quick access to data was allowing those customers to make decisions on their energy usage that gave true value.
They knew the proof of concept had gone well when, on taking it out, the business began asking questions: why is everything so slow? Why can’t we do those reports anymore? And that was the business case. The deployment of NetApp flash was not just about doing stuff quickly, or using flash because that’s what everyone says you should; it was because flash was delivering results, real business advantage.
As Chris Hoy discussed after the Olympics, it was not just about collecting the data because they could; it was about getting the most out of it, and in a sport where tenths of a second often stand between you and a gold medal, any advantage is critical.
A competitive business environment is no different, so an investment in technology that gives you the slightest edge makes perfect sense.
Today, All Flash FAS is integrated into their new datacentre running the latest iterations of ONTAP, providing a low-latency, high-performance infrastructure and ensuring that they can continue to drive value from their most critical business asset: their data.
A great use of technology to drive advantage, in fact Gold medals for data usage all round.
Hope that wasn’t too tenuous an Olympic link, and if you have any questions then, of course, @techstringy or LinkedIn are great ways to reach me.
If you’re interested in Flash you may also find this useful “Is Flash For Me?” from my company website.