Thanks for memory – Alex McDonald – Ep61

At the start of 2018 the technology industry was hit by two new threats unlike anything it had seen before. Spectre and Meltdown used vulnerabilities not in operating system code or poorly written applications, but ones at a much lower level than that.

These vulnerabilities were a concern not only to today’s technology providers, but also to those architecting the way technology will work in the future.

As we try to push technology further and have it deal with more data, more quickly than ever before, the industry is having to find ways of working beyond the limits of today’s approaches. One of these developments is storage class memory, or persistent memory, where our data can be housed and accessed at speeds many times greater than they are today.

However, this move brings new vulnerabilities in the way we operate, vulnerabilities like those exposed by Spectre and Meltdown. So how did Spectre and Meltdown exploit these low-level vulnerabilities? And what does that mean for our desire to constantly push technology to use data in ever more creative and powerful ways?

That’s the topic of this week’s Tech Interviews podcast, as I’m joined by the always fascinating Alex McDonald to discuss exactly what Spectre and Meltdown are, how they impact what we do today and how they may change the way we develop our future technology.

Alex is part of the Standards Industry Association group at NetApp and represents them on boards such as SNIA (Storage Networking Industry Association).

In this episode, he brings his wide industry experience to the show, sharing detail on exactly what Spectre and Meltdown are, how they operate, the vulnerabilities they exploit, and what those vulnerabilities put at risk in our organisations.

We take a look at how these exploits take advantage of side channels and speculative execution to allow an attacker to access data you would never imagine to be at risk, and how our eagerness to push technology to its limits created those vulnerabilities.
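Speculative execution is hardware behaviour that can’t be reproduced from a high-level language, but the side-channel half of the story is easy to sketch. The toy below is not Spectre: it’s a minimal, hypothetical illustration of a timing side channel, recovering a made-up secret one character at a time from a comparison that takes longer the more characters match (an artificial delay is added purely to make the signal measurable). The inference is the same class the real exploits perform against the CPU cache: time an operation, and learn something you were never directly shown.

```python
import string
import time

SECRET = "hunter2"  # the value the "victim" holds; purely illustrative

def insecure_check(guess: str) -> bool:
    """Compares character by character and bails out early on a mismatch."""
    for g, s in zip(guess, SECRET):
        if g != s:
            return False
        time.sleep(0.001)  # artificial per-character work, to make timing visible
    return len(guess) == len(SECRET)

def time_guess(guess: str, trials: int = 5) -> float:
    """Return the fastest observed check time for a guess (min reduces noise)."""
    best = float("inf")
    for _ in range(trials):
        start = time.perf_counter()
        insecure_check(guess)
        best = min(best, time.perf_counter() - start)
    return best

# Recover the secret one position at a time: at each step, the candidate
# whose check ran longest matched one character further into the secret.
recovered = ""
for _ in range(len(SECRET)):
    candidates = string.ascii_lowercase + string.digits
    timings = {c: time_guess(recovered + c) for c in candidates}
    recovered += max(timings, key=timings.get)
    print("recovered so far:", recovered)
```

The real attacks measure cache hits and misses at nanosecond scale rather than artificial sleeps, which is one reason the mitigations have had to come from CPU microcode, operating systems and compilers rather than from application code.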

We discuss how this has changed the way the technology industry is looking at the future development of memory, as our demand for ever larger and faster data repositories shows no sign of slowing down.

Alex shares some insights into the future, as we look at the development of persistent memory, what is driving demand and how the need for this kind of technology means the industry has no option but to get it right.

To ease our fears, Alex also outlines how the technology industry is dealing with these new threats, so that the development of larger and faster technologies can continue while maintaining the security and privacy of our critical data.

We wrap up discussing risk mitigation: which systems are at risk of attack from exploits like Spectre and Meltdown, which are not, and how we ensure we protect them long term.

We finish on the positive message that the technology industry is indeed smart enough to solve these challenges, and that it is working hard to ensure it can continue to deliver technology that meets the demands we place on our data to help solve big problems.

You can find more on Wikipedia about Spectre and Meltdown.

You can learn more about the work of SNIA on their website.

And if you’d like to stalk Alex online you can find him on Twitter talking about technology and Scottish politics! @alextangent

Hope you enjoyed the show. With the Easter holidays here in the UK we’re taking a little break, but we’ll be back with new episodes in a few weeks’ time. For now, thanks for listening.


NetApp Winning Awards, Whatever Next?

In the last couple of weeks I’ve seen NetApp pick up a couple of industry awards, with the all-flash A200 earning the prestigious Storage Review Editor’s Choice as well as CRN UK’s Storage Vendor of the Year 2017. This, alongside commercial successes (How NetApp continue to defy the performance of the storage market), is part of a big turnaround in their fortunes over the last three years or so. But why? What is NetApp doing to garner such praise?

A bit of disclosure: as a Director at a long-term NetApp partner, Gardner Systems, and a member of the NetApp A-Team advocacy programme, I could be biased. But having worked with NetApp for over 10 years, I still see them meeting our customers’ needs better than any other vendor, which in itself suggests NetApp are doing something right.

What is it they’re doing? In this post, I share some thoughts on what I believe are key parts of this recent success.

Clear Strategy

If we wind the clock back four years, NetApp’s reputation was not at its best and tech industry analysts presented a bleak picture. The storage industry was changing, with public cloud storage and innovative start-ups offering to do more than those “legacy” platforms, and in many cases they could. NetApp, it seemed, were a dinosaur on the verge of extinction.

Enter the Data Fabric, first announced at NetApp’s technical conference, Insight, in 2014. Data Fabric was the beginning of NetApp’s move from a company focussed on storing data to a company focussed on the data itself. This was significant, as it coincided with a shift in how organisations viewed data, moving away from just storing it to managing, securing, analysing and gaining value from it.

NetApp’s vision for the Data Fabric closely aligned with the aims of these more data-focussed organisations, and it also changed the way they thought about their portfolio: less worried about speeds and feeds and flashing lights, and more about building a strategy focussed on data in the way their customers were.

It is this data-driven approach that, in my opinion, has been fundamental in this change in NetApp’s fortunes.

Embrace the Cloud

A huge shift, and something that is taking both customers and industry analysts by surprise, is the way NetApp have embraced the cloud: not a cursory nod, but cloud as a fundamental part of the Data Fabric strategy, going way beyond “cloudifying” existing technology.

ONTAP Cloud seamlessly delivers the same data services and storage efficiencies into the public cloud as you get with its on-prem cousin, providing a unique ability to maintain data policies and procedures across your on-prem and cloud estates.

But NetApp have gone beyond this, delivering native cloud services that don’t require any traditional NetApp technologies. Cloud Sync allows the easy movement of data from on-prem NFS datastores into the AWS cloud, while Cloud Control provides a backup service for Office 365 (and now Salesforce), bringing crucial data protection functionality that many SaaS vendors do not provide.

If that wasn’t enough, there is the recently announced relationship with Microsoft, with NetApp now powering the Azure NFS service. Yep, that’s right: if you take the NFS service from the Azure Marketplace, it is delivered fully in the background by NetApp.

For a storage vendor, this cloud investment is unexpected, but a clear cloud strategy is also appealing to those making business technology decisions.

Getting the basics right

With these developments, it’s clear NetApp have a strategy and are expanding their portfolio into areas other storage vendors do not consider, but there is also no escaping that their main revenue generation continues to come from ONTAP and FAS (NetApp’s hardware platform).

If I’m buying a hardware platform, what do I want from it? It should be robust, with strong performance, and a good investment that evolves with my business, and if NetApp’s commercial success is anything to go by, they are delivering this.

The all-flash NetApp platforms (such as the award-winning A200 mentioned earlier) are meeting this need: robust, enterprise-level platforms that allow organisations to build an always-on storage infrastructure that scales seamlessly with new business demands. Six-year flash drive warranties and the ability to refresh your controllers after three years also give excellent investment protection.

It is not just the hardware, however; these platforms are driven by software. NetApp’s ONTAP operating system is like any other modern software platform, with regular code drops (every six months) delivering new features and improved performance to existing hardware via a non-disruptive software upgrade. This gives businesses the ability to “sweat” their hardware investment over an extended period, which in today’s investment-sensitive market is hugely appealing.

Have an interesting portfolio

NetApp for a long time was the FAS and ONTAP company, and while those things are still central to their plans, their portfolio is expanding quickly. We’ve discussed the cloud-focussed services; there’s also SolidFire with its unique scale and QoS capabilities, StorageGRID, a compelling object storage platform, and AltaVault, which provides a gateway to move backup and archive data into object storage on-prem or in the cloud.

Add to this the newly announced HCI platform and you can see how NetApp can play a significant part in your next-generation datacentre plans.

For me, the awards I mentioned at the beginning of this article are not down to one particular solution or innovation; it’s the Data Fabric. That strategy is allowing NetApp, its partners and customers to have a conversation that is data- and not technology-focussed, and having a vendor who understands that is clearly resonating with customers, analysts and industry influencers alike.

NetApp’s continued evolution is fascinating to watch, and they have more to come, with no doubt more awards to follow, whatever next!

Tech Trends – Object Storage – Robert Cox – Ep13

Over the last couple of weeks I’ve chatted about some of the emerging tech trends I expect to see continue to develop during 2017 (have a read of my look-ahead blog post for some examples). To continue that theme, this episode of Tech Interviews is the first of three looking in a little more detail at some of those trends.

First up, we look at a storage technology that is growing rapidly, if not necessarily obviously: object storage.

As the amount of data the world creates continues to grow exponentially, it is becoming clear that some traditional methods of storage are no longer effective. When we are talking billions of files, spread across multiple data centres in multiple geographies, traditional file storage models are no longer as effective (regardless of what a vendor may say!). That’s not to say our more traditional methods are finished, far from it; however, there are increasingly use cases where that traditional model doesn’t scale or perform well enough.

Many of us have probably never seen an object store, or at least think we haven’t, but if you’re using storage from the likes of AWS or Azure then you’re probably using object storage, even if you don’t realise it.
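If you’ve never knowingly used an object store, it helps to see what the access pattern looks like: no filesystem, no LUNs, just simple HTTP verbs against a flat namespace of buckets and keys. Below is a minimal sketch using Python’s boto3 library against the S3 API (AWS’s native interface, which S3-compatible platforms such as NetApp’s StorageGRID Webscale also speak); the endpoint, bucket and credentials are placeholders, not real values.

```python
import boto3

# Point the client at any S3-compatible endpoint; omit endpoint_url for AWS itself.
# All values below are placeholders for illustration.
s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.example.com",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

# Objects live in a flat namespace: no directories or file handles, just a
# bucket, a key, the data itself and any metadata you attach to it.
s3.put_object(
    Bucket="example-bucket",
    Key="reports/2017/q1.csv",
    Body=b"region,kwh\nnorth,1042\n",
    Metadata={"source": "demo"},
)

# Retrieval is the same simple verb over HTTP, wherever the object is stored.
response = s3.get_object(Bucket="example-bucket", Key="reports/2017/q1.csv")
print(response["Body"].read().decode())
```

Because an object carries its data, its metadata and a globally unique key together, the platform is free to replicate or tier it across data centres and geographies without the application changing, which is exactly where traditional file storage starts to creak.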

With all that said, what actually is object storage? Why do we need it? How does it address the challenges of more traditional storage? What are the use cases?

It’s those questions that we attempt to answer in this episode of Tech Interviews with my guest Robert Cox. Robert is part of the storage team at NetApp, working with their StorageGRID Webscale object storage solution.

During our chat we focus on giving an introduction to object storage: why it is relevant, the issues with more traditional storage and how object overcomes them, as well as Robert sharing some great use cases.

So, if you are wondering what object is all about and where it may be relevant in your business, then hopefully this is the episode for you.

Enjoy…

If you’d like to follow up with Robert with questions around NetApp’s object storage solutions you can email him at robert.cox@netapp.com

You can find information on NetApp StorageGRID Webscale here

And if you’d like a demo of StorageGRID then request one here

Next week we take a look at one of the most high-profile of tech trends, the emergence of DevOps. To make sure you don’t miss out, you can subscribe to Tech Interviews below.

Hope you can join us next week, thanks for listening…

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

Viva Las VVOL’s

In this episode I’m joined by Pete Flecha, Senior Technical Marketing Architect at VMware, as we discuss VVOLs, VMware’s new approach to delivering storage to virtual infrastructures. VVOLs look to address many of the problems traditional SAN-based storage presents to virtual infrastructures. Pete provides an intro to the problems VVOLs look to address, how they go about it, and what we can expect from the recent vSphere 6.5 release that brings us VVOLs 2.0.

Although I’m not a VVOL expert, I find what VMware are looking to do here really interesting, as they look to tackle one of the key issues IT leaders constantly need to address: how to reduce the complexity of their environments so they can react quicker to new demands from their business.

VVOLs allow the complexity of any underlying storage infrastructure to be hidden from the virtualisation administrators, giving those managing and deploying applications, servers and services a uniformity of experience, so they can focus on quickly deploying their infrastructure resources.

As we all strive to ensure our IT infrastructures meet the ever-changing needs and demands of our organisations, anything that simplifies, automates and ensures consistency across our environments is, in my opinion, a good thing.

It certainly seems that VVOLs are a strong step in that direction.

In this episode Pete provides a brilliant taster of what VVOLs are designed to do and the challenges they address. I hope you enjoy it.


If you want more VVOL details, Pete is the host of VMware’s fantastic vSpeaking podcast, and last week they had an episode dedicated to VVOLs; you can pick that up here.

vSpeaking Podcast ep:32 VVOLs 2.0

You can find all the other episodes of the vSpeaking podcast here

You can keep up with Pete and the excellent work he’s doing at VMware by following him on Twitter @vpedroarrow

And of course, if you have enjoyed this episode of the podcast please subscribe for more episodes wherever you get your podcasts. You won’t want to miss next week, as I discuss data privacy with global privacy expert Sheila Fitzpatrick.

Subscribe on Android

Gold medals for data

Last week was the end of a wonderful summer of sport from Rio, where the Olympics and Paralympics gave us sport at its best: people achieving lifetime goals, setting new records and inspiring a new generation of athletes.

I’m sure many of you enjoyed the games as much as I did, but why bring them up here? Well, for someone who writes a blog it’s almost a contractual obligation in an Olympic year to write something that has a tenuous Olympic link. So here’s my entry!

One part of the Team GB squad that really stood out in Rio were the Olympic cyclists, winning more gold medals than all of the other countries combined (6 of the 10 available), a phenomenal achievement.

This led to one question being continually asked: “What’s the secret?” In one BBC interview Sir Chris Hoy was asked exactly that, and his answer fascinated me: during his career the biggest impact on British cycling was not equipment, facilities, training or superhuman cyclists. It was data. Yes, data; not just collecting data, but, more importantly, the ability to extract valuable insight from it.

We hear it all the time

“those who will be the biggest successes in the future are those that get the most value from their data”

and what a brilliant example the cyclists were. We see this constantly in sport, where the smallest advantage matters, but it’s not just sport; increasingly this is the case in business too, as organisations see data as the key to giving them a competitive edge.

We all love this kind of story of how technology can provide true advantage, but it’s always great to see it in action.

A couple of weeks ago I was on a call with the technical lead of one of our customers. He and his company see the benefit of technology investment and how it delivers business advantage. I’ve been lucky enough to work with them over the last four years or so and have watched the company grow around 300% in that time. We were talking with one of his key technology vendors, explaining to them how their technology was an instrumental part of that success.

During the call I realised this was my opportunity for that tenuous Olympic-link blog post: here, as with the cyclists, getting the best from data was delivering real bottom-line success to the business.

The business is a smart energy company, doing very innovative stuff in the commercial and private energy sectors. They’re in a very competitive industry, dominated by some big companies, but these guys are bucking that trend, a great example of how a company that is agile and knows how to exploit its advantage can succeed.

In their industry data is king. They pick up tonnes of data every day from customers, devices and sensors, and manipulating this data and extracting valuable information from it is key to their success.

Until about a year ago they were running their database and reporting engines (SQL based) on a NetApp storage array running 7-Mode. That had worked, but a year ago we migrated their infrastructure to clustered Data ONTAP to provide increased flexibility, mobility of data and more granular separation of workloads.

However, the smartest thing they did as part of the migration was to deploy flashpools into their environment. Why was this so earth-shattering?

A big part of the value of their SQL infrastructure is reporting, which allows them to provide better services to their customers and suppliers, giving them an advantage over their competitors.

However, many of those reports took hours to run; in fact, the process was to request the report and it would be ready the next day.

The introduction of flashpools into the environment (flashpools are flash-based acceleration technology available in NetApp ONTAP arrays) had a dramatic effect, taking these overnight reports and delivering them in 30-60 minutes.
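As a loose illustration of why a caching tier makes such a difference, here’s a toy sketch in Python. Everything in it is invented for illustration (block names, sizes, timings), and real flashpools of course work at the array level, transparently to the database; the point is simply that once hot data sits in a small fast tier, repeat reads stop paying the slow-medium penalty.

```python
import time

SLOW_TIER = {f"block{i}": f"data{i}" for i in range(1000)}  # pretend spinning disk
cache = {}          # pretend flash tier
CACHE_LIMIT = 100   # the flash tier is much smaller than the disk tier

def read_block(block_id: str) -> str:
    if block_id in cache:               # hot data: served from flash
        return cache[block_id]
    time.sleep(0.01)                    # cold data: pay the disk penalty
    value = SLOW_TIER[block_id]
    if len(cache) >= CACHE_LIMIT:       # evict the oldest entry when full
        cache.pop(next(iter(cache)))
    cache[block_id] = value
    return value

# A "report" that scans the same working set twice: the first pass is slow,
# the second is served almost entirely from the cache.
for run in (1, 2):
    start = time.perf_counter()
    for i in range(50):
        read_block(f"block{i}")
    print(f"run {run}: {time.perf_counter() - start:.2f}s")
```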

This significant reduction in report running times meant more reports could be run, producing different data that could be used to present new and improved services to customers.

Last year the technical lead attended NetApp Insight in Berlin. One of the big areas of discussion that caught his interest was the development of all-flash FAS (AFF), NetApp’s all-flash variants of their ONTAP-driven FAS controllers.

They immediately saw the value in this high-performance, low-latency technology, so earlier this year we arranged for an AFF proof of concept to be integrated into the environment. During this POC the team moved a number of SQL workloads to the flash-based storage, and it’s no understatement to say this transformed their data analysis capabilities: those 30-60 minute reports were now running in 2-3 minutes.

An example of the kind of performance you can get from AFF (this is an AFF8080 cluster running ONTAP 8.3.1 – new platforms and ONTAP 9 have increased this performance further)

But this was not just about speed; it truly opened up brand-new capabilities and business opportunities. Now the organisation could provide their customers and suppliers with information that was previously impossible, and quick access to that data was allowing them to make decisions on their energy usage that gave true value.

They knew the proof of concept had gone well when, on taking it out, the business began asking questions: why is everything so slow? Why can’t we do those reports anymore? And that was the business case. The deployment of NetApp flash was not about just doing stuff quickly, or using flash because that’s what everyone says you should; it was because flash was delivering results, real business advantage.

As Chris Hoy discussed, it was not just about getting the data because they could, it was about getting the most out of it, and in a sport where tenths of a second often stand between you and a gold medal, any advantage is critical.

A competitive business environment is no different, so an investment in technology that gives you the slightest edge makes perfect sense.

Today, all-flash FAS is integrated into their new datacentre running the latest iterations of ONTAP, providing a low-latency, high-performance infrastructure and ensuring they can continue to drive value from their most critical business asset: their data.

A great use of technology to drive advantage; in fact, gold medals for data usage all round.


Hope that wasn’t too tenuous an Olympic link, and if you have any questions then of course @techstringy on Twitter or LinkedIn are great ways to get me.

If you’re interested in flash, you may also find this useful: “Is Flash For Me?” from my company website.