Wrapping up VeeamON – Michael Cade – Ep 66

A couple of weeks ago in Chicago, Veeam held their annual tech conference, VeeamON. It was one of my favourite shows from last year and, unfortunately, I couldn’t make it out this time, but I did catch up remotely and shared my thoughts on some of the strategic messages that were covered in a recent blog post looking at Veeam’s evolving data management strategy (Getting your VeeamON!).

That strategic Veeam message is an interesting one, and their shift from a “backup” company to one focused on intelligent data management across multiple repositories is, in my opinion, exactly the right move to be making. With that in mind, I wanted to take a final look at some of those messages, as well as some of the other interesting announcements from the show, and that is exactly what we do on this week’s podcast, as I’m joined by recurring Tech Interviews guest Michael Cade, Global Technologist at Veeam.

Michael, who not only attended the show but also delivered some great sessions, joins me to discuss a range of topics. We start by taking a look at Veeam’s last 12 months and how they’ve started to deliver a wider range of capabilities that build on their virtual platform heritage with support for more traditional enterprise platforms.

Michael shares some of the thinking behind Veeam’s goal to deliver an availability platform that meets the demands of modern business data infrastructures, be they on-prem, in the cloud, SaaS or service provider based. We also look at how this platform needs to offer more than just the ability to “back stuff up”.

We discuss the development of Veeam’s 5 pillars of intelligent data management, a key strategic announcement from the show, and how it can be used as a maturity model against which you can compare your own progress towards a more intelligent way of managing your data.

We look at the importance of automation in our future data strategies and how this is not only important technically, but also commercially as businesses need to deploy and deliver much more quickly than before.

We finish up by investigating the value of data labs and how crucial the ability to get more value from your backup data is becoming, be it for test and dev, data analytics or a whole range of other tasks, all without impacting your production platforms or wasting the valuable resource sitting in your backup data sets.

Finally, we take a look at some of the things we can expect from Veeam in the upcoming months.

You can catch up on the event keynote on Veeam’s YouTube channel https://youtu.be/ozNndY1v-8g

You can also find more information on the announcements on Veeam’s website here www.veeam.com/veeamon/announcements

If you’d like to catch up with thoughts from the Veeam Vanguard team, you can find a list of them on twitter – https://twitter.com/k00laidIT/lists/veeam-vanguards-2018

You can follow Michael on twitter @MichaelCade1 and on his excellent blog https://vzilla.co.uk/

Thanks for listening.


NetApp Winning Awards, Whatever Next?

In the last couple of weeks I’ve seen NetApp pick up a couple of industry awards, with the all-flash A200 earning the prestigious Storage Review Editor’s Choice as well as CRN UK’s Storage Vendor of the Year 2017. This, alongside commercial successes (How NetApp continue to defy the performance of the storage market), is part of a big turnaround in their fortunes over the last 3 years or so. But why? What is NetApp doing to garner such praise?

A bit of disclosure, as a Director at a long-term NetApp Partner, Gardner Systems, and a member of the NetApp A-Team advocacy programme, I could be biased, but having worked with NetApp for over 10 years, I still see them meeting our customers’ needs better than any other vendor, which in itself, also suggests NetApp are doing something right.

What is it they’re doing? In this post, I share some thoughts on what I believe are key parts of this recent success.

Clear Strategy

If we wind the clock back 4 years, NetApp’s reputation was not at its best. Tech industry analysts presented a bleak picture: the storage industry was changing, with public cloud storage and innovative start-ups offering to do more than those “legacy” platforms, and in many cases they could. NetApp were a dinosaur on the verge of extinction.

Enter the Data Fabric, first announced at NetApp’s technical conference, Insight, in 2014. Data Fabric was the beginning of NetApp’s move from a company focussed on storing data to a company focused on the data itself. This was significant as it coincided with a shift in how organisations viewed data, moving away from just thinking about storing data to managing, securing, analysing and gaining value from it.

NetApp’s vision for data fabric closely aligned to the aims of more data-focussed organisations and also changed the way they thought about their portfolio: less worried about speeds and feeds and flashing lights, and more about how to build a strategy that was focussed on data in the way their customers were.

It is this data-driven approach that, in my opinion, has been fundamental in this change in NetApp’s fortunes.

Embrace the Cloud

A huge shift, and something that has taken both customers and industry analysts by surprise, is the way NetApp have embraced the cloud; not a cursory nod, but cloud as a fundamental part of the data fabric strategy, and this goes way beyond “cloudifying” existing technology.

ONTAP Cloud seamlessly delivers the same data services and storage efficiencies into the public cloud as you get with its on-prem cousin. This provides a unique ability to maintain data policies and procedures across your on-prem and cloud estates.

But NetApp have gone beyond this, delivering native cloud services that don’t require any traditional NetApp technologies. Cloud Sync allows the easy movement of data from on-prem NFS datastores into the AWS cloud, while Cloud Control provides a backup service for Office365 (and now Salesforce), bringing crucial data protection functionality that many SaaS vendors do not provide.

If that wasn’t enough, there is the recently announced relationship with Microsoft, with NetApp now powering the Azure NFS service. Yep, that’s right: if you take the NFS service from the Azure marketplace, it is delivered fully in the background by NetApp.

For a storage vendor, this cloud investment is unexpected, but a clear cloud strategy is also appealing to those making business technology decisions.

Getting the basics right

With these developments, it’s clear NetApp have a strategy and are expanding their portfolio into areas other storage vendors do not consider, but there is also no escaping that their main revenue generation continues to come from ONTAP and FAS (NetApp’s hardware platform).

If I’m buying a hardware platform, what do I want from it? It should be robust, with strong performance, and a good investment that evolves with my business; if NetApp’s commercial success is anything to go by, they are delivering this.

The all-flash NetApp platforms (such as the award-winning A200 mentioned earlier) are meeting this need: a robust, enterprise-level platform that allows organisations to build an always-on storage infrastructure that scales seamlessly with new business demands. Six-year flash drive warranties and the ability to refresh your controllers after 3 years also give excellent investment protection.

It is not just the hardware, however; these platforms are driven by software. NetApp’s ONTAP operating system is like any other modern software platform, with regular code drops (every 6 months) delivering new features and improved performance to existing hardware via a non-disruptive software upgrade. This gives businesses the ability to “sweat” their hardware investment over an extended period, which in today’s investment-sensitive market is hugely appealing.

Have an interesting portfolio

For a long time NetApp was the FAS and ONTAP company, and while those things are still central in their plans, their portfolio is expanding quickly. We’ve discussed the cloud-focussed services; there’s also SolidFire with its unique scale and QoS capabilities, StorageGrid, a compelling object storage platform, and AltaVault, a gateway to move backup and archive data into object storage on-prem or in the cloud.

Add to this the newly announced HCI platform and you can see how NetApp can play a significant part in your next-generation datacenter plans.

For me, the awards I mentioned at the beginning of this article are not down to one particular solution or innovation; it’s the data fabric. That strategy is allowing NetApp, its partners and customers to have a conversation that is data- rather than technology-focussed, and having a vendor who understands that is clearly resonating with customers, analysts and industry influencers alike.

NetApp’s continued evolution is fascinating to watch, and they have more to come, with no doubt more awards to follow, whatever next!

IT Avengers Assemble – Part One – Ep38

This week’s Tech Interviews is the first in a short series, where I bring together a selection of people from the IT community to try to gauge the current state of business IT and to gain some insight into the key day-to-day issues affecting those delivering technology to their organisations.

For this first episode I’m joined by three returning guests to the show.

Michael Cade is a Technical Evangelist at Veeam. Michael spends his time working closely with both the IT community and Veeam’s business customers to understand the day-to-day challenges that they face from availability to cloud migration.

You can find Michael on twitter @MichaelCade1 and his blog at vzilla.co.uk 


Mike Andrews is a Technical Solutions Architect at storage vendor NetApp, specialising in NetApp’s cloud portfolio. Today Mike works closely with NetApp’s wide range of customers to explore how to solve the most challenging of business issues.

You can find Mike on social media on twitter @TrekinTech and on his blog site trekintech.com

Mark Carlton is Group Technical Manager at Concorde IT Group. He has extensive experience in the industry, having worked in a number of different types of technology businesses. Today Mark works closely with a range of customers, helping them to use technology to solve business challenges.

Mark is on twitter @mcarlton1983 and at his fantastically titled justswitchitonandoff.com blog.

The panel discuss a range of issues, from availability to cloud migration, the importance of the basics, and how understanding the why, rather than the how, is a crucial part of getting your technology strategy right.

The team provide some excellent insights into a whole range of business IT challenges and I’m sure there’s some useful advice for everyone.

Next time I’m joined by four more IT avengers, as we look at some of the other key challenges facing business IT.

If you enjoyed the show and want to catch the next one, then please subscribe, links are below.

Thanks for listening.

Subscribe on Android

SoundCloud

Listen to Stitcher

All Aboard the Data Train

The other night Mrs Techstringy and I were discussing a work challenge. She works for a well-known charity, and one of her roles is to book locations for fundraising activities. On this occasion the team were looking at booking places at railway stations and considering a number of locations; however, all they really had to go on was a “gut feeling”.

As we discussed it, we did a bit of searching and came across this website http://www.orr.gov.uk/statistics/published-stats/station-usage-estimates which contains footfall information for every UK railway station over the last 20 years. This information was not only train geek heaven, it also allowed us to start to use the data available to make a more informed choice and to introduce possibilities that otherwise would not have been considered.
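
Purely as an illustration of that kind of quick check, here’s a minimal sketch of how the station usage figures could be ranked once downloaded. The file name and column headings below are assumptions, so check them against the actual ORR spreadsheet before relying on them.

```python
import pandas as pd

# Hypothetical example: the ORR station usage estimates saved locally as a CSV.
# Column names are illustrative, the real spreadsheet may differ.
usage = pd.read_csv("station-usage-estimates.csv")

# Shortlist of candidate stations the fundraising team is considering.
candidates = ["Liverpool Lime Street", "Manchester Piccadilly", "Chester"]

ranked = (
    usage[usage["Station Name"].isin(candidates)]
    .sort_values("Entries and Exits", ascending=False)
    .loc[:, ["Station Name", "Entries and Exits"]]
)

print(ranked)  # busiest candidate stations first, no gut feeling required
```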

This little family exercise was an interesting reminder of the power of data and how with the right analysis we can make better decisions.

Using data to make better decisions is hardly news. With the ever-increasing amounts of data we are collecting and the greater access to powerful analytics, machine learning and AI engines, all of us are already riding the data train to a world of revolutionary ideas, aren’t we?

The reality is that most of us are not. But why?

For many, especially with data sets gathered over many years, it’s hard: hard to package our data in such a way that we can easily present it to analytics engines and get something useful from it.

But don’t let that stop you; there is potentially huge advantage to be had from using our data effectively, and all we need is a little help to get there.

So what kind of steps can we take so we too can grab our ticket and board the data train?

Understand our data

The first thing may seem obvious: understand our data. We need to know, where is it? What is it? Is it still relevant?

Without knowing these basics, it is going to be almost impossible to identify and package up the “useful” data.

The reality of data analytics is that we just can’t throw everything at it. Remember the old adage “garbage in, garbage out”? It hasn’t changed. If we feed our data analytics elephant a lot of rubbish, we aren’t going to like what comes out the other end!

Triage that data

Once we’ve identified it, we need to make sure we don’t feed our analytics engine a load of nonsense. It’s important to triage: throw out the stuff that no one ever looks at, the endless replication, the stuff of no business value. We all store rubbish in our data sets, things that shouldn’t be there in the first place, so weed it out; otherwise, at best we are going to process irrelevant information, and at worst we are going to skew the answers and make them worthless.

Make it usable

This is perhaps the biggest challenge of all: how do we make our massive onsite datasets useful to an analytics engine?

Well, we could deploy an on-prem analytics suite, but for most of us this is unfeasible, and the reality is, why bother? Amazon, Microsoft, Google and IBM, to name but a few, have fantastic analytics services ready and waiting for your data; the trick, however, is how to get it there.

The problem with data is that it has weight, gravity; it’s the thing in a cloud-led world that is still difficult to move around. It’s not only its size that makes it tricky, there is also our need to maintain control, meet security requirements and maintain compliance, and these things can make moving our data into cloud analytics engines difficult.

This is where building an appropriate data strategy is important; we need a way to ensure our data is in the right place, at the right time, while maintaining control, security and compliance.

When looking to build a strategy that allows us to take advantage of cloud analytics tools, we have two basic options;

Take our data to the cloud

Taking our data to the cloud is more than just moving it there; it can’t just be a one-off copy. Ideally, in this kind of setup, we need to move our data in, keep it synchronised with changing on-prem data stores and then move our analysed data back when we are finished, all with the minimum of intervention.
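
To make that a little more concrete, here’s a rough sketch of what a minimal one-way sync into a cloud object store might look like, assuming Python, the boto3 AWS SDK and an S3 bucket. The bucket name and local path are purely illustrative, and in practice you’d lean on a purpose-built service rather than a script like this.

```python
from pathlib import Path

import boto3  # assumes AWS credentials are already configured locally
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
BUCKET = "analytics-staging"        # hypothetical bucket used by the analytics engine
LOCAL_ROOT = Path("/data/exports")  # hypothetical on-prem export area


def sync_to_s3(local_root: Path, bucket: str) -> None:
    """Upload any local file that is newer than, or missing from, the bucket."""
    for path in local_root.rglob("*"):
        if not path.is_file():
            continue
        key = str(path.relative_to(local_root))
        try:
            remote_mtime = s3.head_object(Bucket=bucket, Key=key)["LastModified"].timestamp()
        except ClientError:
            remote_mtime = 0.0  # object not in the bucket yet
        if path.stat().st_mtime > remote_mtime:
            s3.upload_file(str(path), bucket, key)


if __name__ == "__main__":
    sync_to_s3(LOCAL_ROOT, BUCKET)
```

Run on a schedule, something along these lines keeps the cloud copy refreshed as on-prem data changes; pulling analysed results back down is essentially the same loop in reverse.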

Bring the cloud to our data

Using cloud data services doesn’t have to mean moving our data to the cloud; we can bring the cloud to our data. Services like ExpressRoute into Azure or Direct Connect into AWS mean that we can get all the bandwidth we need between our data and cloud analytics services, while our data stays exactly where we want it: in our datacentre, under our control and without the heavy lifting required to move it into a public cloud data store.

Maybe it’s even a mix of the two, dependent on requirement, size and type of dataset. What’s important is that we have a strategy, one that gives us the flexibility to do either.

All aboard

Once we have our strategy in place and the technology to enable it, we are good to go. Well, almost; finding the right analytics tools, and of course deciding what to do with the results when we have them, are all part of the solution, but having our data ready is a good start.

That journey does have to start somewhere, so first get to know your data, understand what’s important and get a way to ensure you can present it to the right tools for the job.

Once you have that, step aboard and take your journey on the data train.

If you want to know more on this subject and are in or around Liverpool on July 5th, why not join me and a team of industry experts as we discuss getting the very best from your data assets at our North West Data Forum.

And for more information on getting your data ready to move to the cloud, check out a recent podcast episode I did with Cloud Architect Kirk Ryan of NetApp as we discuss the whys and hows of ensuring our data is cloud ready.

New fangled magic cloud buckets – Kirk Ryan – Ep32

Tech Trends – Object Storage – Robert Cox – Ep13

Over the last couple of weeks I’ve chatted about some of the emerging tech trends that I expect to see continue to develop during 2017 (Have a read of my look ahead blog post for some examples). To continue that theme this episode of Tech Interviews is the first of three looking in a little more detail at some of those trends.

First up, we look at a storage technology that is growing rapidly, if not necessarily obviously: object storage.

As the amount of data the world creates continues to grow exponentially, it is becoming clear that some traditional storage methods are no longer effective. When we are talking billions of files, spread across multiple data centers across multiple geographies, traditional file storage models are no longer as effective (regardless of what a vendor may say!). That’s not to say that our more traditional methods are finished, far from it, however there are increasingly use cases where that traditional model doesn’t scale or perform well enough.

Many of us have probably never seen an object store, or at least think we haven’t, but if you’re using things like storage from AWS or Azure then you’re probably using object storage, even if you don’t realise it.

With all that said, what actually is object storage? Why do we need it? How does it address the challenges of more traditional storage? What are the use cases?
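
To give the first of those questions a little context before you listen: at its simplest, object access means storing and retrieving data by key and metadata over HTTP, rather than through a file system hierarchy. The sketch below uses Python and boto3 against a generic S3-compatible endpoint; the endpoint, bucket and key names are just placeholders.

```python
import boto3

# Placeholder endpoint and names for any S3-compatible object store.
s3 = boto3.client("s3", endpoint_url="https://objectstore.example.com")

# Write an object: a flat key plus metadata, no directories, no file locking.
with open("ep13.mp3", "rb") as audio:
    s3.put_object(
        Bucket="media-archive",
        Key="2017/interviews/ep13.mp3",
        Body=audio,
        Metadata={"guest": "Robert Cox", "series": "tech-trends"},
    )

# Read it back from anywhere that can reach the endpoint.
obj = s3.get_object(Bucket="media-archive", Key="2017/interviews/ep13.mp3")
data = obj["Body"].read()
```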

It’s those questions that we attempt to answer in this episode of Tech Interviews with my guest Robert Cox. Robert is part of the storage team at NetApp working with their StorageGrid Webscale object storage solution.

During our chat we focus on giving an introduction to object storage: why it is relevant, the issues with more traditional storage and how object overcomes them, as well as Robert sharing some great use cases.

So, if you are wondering what object is all about and where it may be relevant in your business, then hopefully this is the episode for you.

Enjoy…

If you’d like to follow up with Robert with questions around NetApp’s object storage solutions you can email him at robert.cox@netapp.com

You can find information on NetApp StorageGrid Webscale here 

And if you’d like a demo of StorageGrid then request one here

Next week we take a look at one of the highest-profile tech trends, the emergence of DevOps. To make sure you don’t miss out, you can subscribe to Tech Interviews below.

Hope you can join us next week, thanks for listening…

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

Make my cloud so…

A bit of a Star Trek misquote I know, but I’m pretty sure Captain Picard would have said that as the ship’s IT department looked to enable their hybrid cloud strategy. For many of us, hybrid cloud is the reality of our future IT designs; the flexibility provided by access to cloud compute and storage, both technically and commercially, makes cloud services compelling in many instances.

However, those compelling cases do come with a cost. Using hugely scalable public cloud technologies presents challenges: some are application architecture and system design, but more often than not they are data issues. Security, governance, protection, or even just moving big lumps of data around, all add to the challenge that comes with trying to enable these flexible cloud-based services.

With that in mind, I took great interest in NetApp’s November 1st Hybrid Cloud announcements (you can find the press release here), especially the very strong emphasis on enablement; this was not your usual product range announcement. Often these announcements are almost “self-serving”: get a new widget from us, buy our updated solution or platform. Don’t get me wrong, there is an element of that in these NetApp announcements, with updates to a couple of major products, but what was really interesting was the cloud service solutions that were mentioned, technologies that were not your “traditional” NetApp solution. No need for a hardware array, no need for ONTAP or anything else; these were purely service offerings, designed for no other reason than to address the significant challenge of enabling our integration with cloud services.

I don’t plan on going into detail on all of the announcements; check out a great post like this one from Mike Andrews (@trekintech) for wider analysis. I just wanted to look at a couple of the more generic cloud enablement solutions that don’t need any “traditional” NetApp components.

Cloud Control for Office 365

In my experience, one of the early cloud integrations an enterprise will make is Office365, taking advantage of Microsoft’s software-as-a-service offering for email, document management and file storage. These services, although critical, are often time-intensive to deliver while providing little additional value to the business; “table stakes” if you will. A company must have these things, but they are not going to give competitive advantage.

Giving it to Microsoft to run makes perfect sense; however, one thing that is often missed when a business moves to 365 is data protection. Microsoft’s role is clear: it is to present you with a secure, scalable and resilient service, not to protect your data. 365 offers several options for data retention; however, Microsoft do not protect you from data deletion, accidental or malicious. Once that data is gone, it’s gone.

So how do you protect it? There is a growing market of solutions to this challenge, and NetApp have now thrown their hat into the ring with an extremely comprehensive offering.

Cloud Control is a full SaaS offering; no need to purchase equipment or install anything on-prem. Take it as a service, point it at your 365 subscription, and you have the capability to back up your Exchange, SharePoint and OneDrive for Business repositories.

What separates Cloud Control, in my opinion, is the number of possible backup targets you can use. If you have a NetApp environment, that’s great; you can take your 365 data and back it up straight into your ONTAP environment. Don’t have on-prem ONTAP? No problem, you can spin up ONTAP Cloud and back up to that.

Don’t want ONTAP at all? Use AltaVault from the NetApp portfolio to move your data to an object store. And if you don’t want anything at all from NetApp, no problem; Cloud Control will allow you to move data straight into an AWS S3 bucket or an Azure storage blob.

Cloud Control provides granular data protection, with item level recovery for your 365 implementation, enabling you to deliver enterprise level data protection to your public cloud service.

Cloud Sync

A key benefit of cloud compute is the ability to get masses of processing power as and when you need it, without having to build a big compute cluster which spends most of its time idle.

Things like Hadoop are fantastic tools for data analytics, but Hadoop is one heck of an expensive tool to deploy, which has kept big data analytics out of reach for many enterprises.

However, cloud providers like AWS have addressed this with services available to rent as you need them. The trick with these is, how do you move data to that analytics engine as and when you need it? How do we seamlessly integrate these services into our infrastructure?

Step forward the Cloud Sync service. Cloud Sync points at your on-prem NFS datastore (no, it doesn’t have to be ONTAP-based NFS) and your analytics service, and seamlessly syncs the on-prem data to the analytics engine when needed, allowing you to take advantage of cloud compute while ensuring your datasets are always refreshed.

Cloud Sync is all about automating those difficult tasks, and in modern IT that is exactly what we are looking for: orchestrating the use of cloud compute, allowing us to consume services in the most effective way.

Again, delivering this without the need for any of the more “traditional” NetApp technologies.

But Why?

I suppose this begs the question: why, as a storage vendor, build solutions that actively have no need for your storage products? Well, let’s not be fooled; both of these services are NetApp subscription services, and of course both solutions can enhance existing NetApp technology, however I don’t think that’s the primary aim.

If you’ve ever looked at NetApp’s Data Fabric strategy, you’ll see that they are a very different storage company, much happier to talk about data strategy than to sell you things. Of course they have things that can enable your strategy, but a conversation about how we manage our data in this modern IT world is, to my mind, far more valuable than just selling something a bit faster with a few more flashing lights; getting us to think about how we move, manage and secure data is far more important.

These November 1st announcements are just another example of NetApp’s commitment to its Data Fabric and how the right fabric can enable an organisation to fully exploit cloud flexibility. I very much look forward to seeing these solutions in action as they come to market and, of course, I’m keen to see what NetApp add next to this increasingly impressive hybrid cloud story.

Cloud enabled captain…

For more detail on NetApp’s cloud solutions visit their cloud website where you can get information as well as access to trials of these services.

cloud.netapp.com

For some background reading on data fabric, please feel free to check one of my previous posts;

Data Fabric – What is it good for?

And if you have any questions, feel free to contact me @techstringy or on Linkedin.

For other posts with details on the announcements check out

Taylor Riggan’s View on Cloud Sync

Mike Andrews NetApp Hybrid Cloud Launch

And if you’d like a bit of audio to listen to, I also interviewed Mike Andrews for a TechStringy Interview discussing the announcements and their strategic and technical impact, feel free to have a listen here;

NetApp Hybrid Cloud Announcements with Mike Andrews