Taking VMware to the cloud – Ben Meadowcroft – Ep43

Over the last couple of episodes, we’ve had some interesting round-ups from the recent VMworld conference, reviewing the announcements from the show, as well as how VMware is evolving to maintain relevance to its many customers in an ever more software-defined, data-centric and, of course, cloud and as-a-service based world.

Part of VMware’s response to these changes (not the only technological evolution they are making, I hasten to add) is a smart one: rather than fight the tide King Canute style, VMware is not only embracing that change but looking to empower it, making a business’s transition to a cloud-based world more straightforward.

Embracing this change comes in the form of VMware Cloud on AWS, which provides the ability to run your own VMware vSphere environment on top of a dedicated set of AWS resources, giving you the flexibility and economics of cloud while maintaining an infrastructure and management platform you already know.

This sounds like a really smart move: it helps customers make that tricky transition, keeping it seamless by providing flexibility and integration with existing on-prem environments, without IT teams needing to embark on a whole new learning path to understand the cloud platform.

However, as smart as this sounds, the response has not been universally supportive, with some people asking whether there is really a need for this type of technology. If you are making the investment in AWS anyway, why not just do that? Why add these additional VMware costs and infrastructure components?

That is the topic we explore on this week’s show as I’m joined by Ben Meadowcroft, a Product Line Manager at VMware with a focus on VMware Cloud on AWS.

I catch up with Ben to understand more about the solution: why it exists at all, the challenges a business faces when building a hybrid solution, and how VMware Cloud on AWS is helping to ease that transition, simplify the integration and let us start taking advantage of the capabilities of the AWS platform, while removing some of the challenges many of us face along the way.

Ben gives some great insight into the platform as well as some helpful use case examples to help you decide whether this kind of technology is a good fit for you.

To find out more details on the solution you can find great resources in the following places:

For an overview of the solution check out cloud.vmware.com/vmc-aws

You can get some hands-on experience with VMware’s hands-on lab environment at vmware.com/go/try-vmc-aws-hol

To keep up with the latest news you can also follow @vmwarecloudaws on Twitter.

Finally, if you want to catch up with Ben, you can also find him on Twitter @benmeadowcroft.

Personally, I think VMware Cloud on AWS is a really interesting solution and I can see it meeting needs in a number of enterprises. Check out the show and provide your feedback, either here or by messaging me @techstringy on Twitter.

Next time we start a series of shows looking at the ever-evolving data security challenge.

To make sure you catch those, why not subscribe and if you have the chance leave a review.

Thanks for listening.

As an interesting bit of extra information, friend of the show @MichaelCade1 of Veeam produced this really handy blog post on how you can protect your VMware Cloud on AWS environment using the Veeam tools you already know and love. It’s worth a read, as protecting your data in AWS is your responsibility.

You can read his post here.

In this episode Ben covered some VMware Cloud on AWS roadmap items; with this in mind, he asked me to include the following disclaimer.

Disclaimer

This presentation may contain product features that are currently under development.

This overview of new technology represents no commitment from VMware to deliver these features in any generally available product.

Features are subject to change, and must not be included in contracts, purchase orders, or sales agreements of any kind.

Technical feasibility and market demand will affect final delivery.

Pricing and packaging for any new technologies or features discussed or presented have not been determined.


IT Avengers Assemble – Part One – Ep38

This week’s Tech Interviews is the first in a short series, where I bring together a selection of people from the IT community to try to gauge the current state of business IT and to gain some insight into the key day-to-day issues affecting those delivering technology to their organisations.

For this first episode I’m joined by three returning guests to the show.

Michael Cade is a Technical Evangelist at Veeam. Michael spends his time working closely with both the IT community and Veeam’s business customers to understand the day-to-day challenges they face, from availability to cloud migration.

You can find Michael on Twitter @MichaelCade1 and his blog at vzilla.co.uk


Mike Andrews is a Technical Solutions Architect at storage vendor NetApp, specialising in NetApp’s cloud portfolio. Today Mike works closely with NetApp’s wide range of customers to explore how to solve the most challenging of business issues.

You can find Mike on Twitter @TrekinTech and on his blog site trekintech.com

Mark Carlton is Group Technical Manager at Concorde IT Group. He has extensive experience in the industry, having worked in a number of different types of technology businesses, and today works closely with a range of customers, helping them to use technology to solve business challenges.

Mark is on Twitter @mcarlton1983 and at his fantastically titled justswitchitonandoff.com blog.

The panel discuss a range of issues, from availability to cloud migration, the importance of the basics, and how understanding the why, rather than the how, is a crucial part of getting your technology strategy right.

The team provide some excellent insights into a whole range of business IT challenges and I’m sure there’s some useful advice for everyone.

Next time I’m joined by four more IT avengers, as we look at some of the other key challenges facing business IT.

If you enjoyed the show and want to catch the next one, then please subscribe, links are below.

Thanks for listening.

Subscribe on Android

SoundCloud

Listen to Stitcher


All Aboard the Data Train

The other night Mrs Techstringy and I were discussing a work challenge. She works for a well-known charity and one of her roles is to book locations for fundraising activities. On this occasion the team were looking at booking places at railway stations and considering a number of locations; however, all they really had to go on was a “gut feeling”.

As we discussed it we did a bit of searching and came across this website http://www.orr.gov.uk/statistics/published-stats/station-usage-estimates which contains footfall information for every UK railway station over the last 20 years. This information was not only train geek heaven, it also allowed us to use the available data to make a more informed choice and to introduce possibilities that otherwise would not have been considered.
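As a simple illustration of the kind of question we were answering, here is a minimal sketch that ranks candidate stations by footfall from a CSV extract. The column names and sample figures are hypothetical assumptions for the example; the real ORR download uses its own layout.

```python
import csv
from io import StringIO

# Hypothetical extract of station usage data; the real ORR
# dataset has different column names and many more fields.
sample = StringIO("""station,annual_entries_exits
Liverpool Lime Street,15000000
Runcorn,2300000
Chester,5100000
""")

def rank_stations(csv_file, top_n=3):
    """Return the top_n stations by annual footfall, busiest first."""
    reader = csv.DictReader(csv_file)
    rows = [(r["station"], int(r["annual_entries_exits"])) for r in reader]
    return sorted(rows, key=lambda r: r[1], reverse=True)[:top_n]

print(rank_stations(sample))
```

Even a few lines like this turn “gut feeling” into a ranked shortlist, which is exactly the informed choice the footfall data made possible.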

This little family exercise was an interesting reminder of the power of data and how with the right analysis we can make better decisions.

Using data to make better decisions is hardly news, with the ever-increasing amounts of data we are collecting and the greater access to powerful analytics, machine learning and AI engines, all of us are already riding the data train taking us to a world of revolutionary ideas, aren’t we?

The reality is, that most of us are not, but why?

For many, especially those with data sets gathered over many years, it’s hard: hard to package our data in such a way that we can easily present it to analytics engines and get something useful back.

But don’t let that stop you; there is potentially huge advantage to be had from using our data effectively, and all we need is a little help to get there.

So what kind of steps can we take so we too can grab our ticket and board the data train?

Understand our data

The first step may seem obvious: understand our data. We need to know, where is it? What is it? Is it still relevant?

Without knowing these basics, it is going to be almost impossible to identify and package up the “useful” data.

The reality of data analytics is that we can’t just throw everything at it. Remember the old adage, garbage in, garbage out? It hasn’t changed: if we feed our data analytics elephant a lot of rubbish, we aren’t going to like what comes out the other end!

Triage that data

Once we’ve identified our data, we need to make sure we don’t feed our analytics engine a load of nonsense. It’s important to triage: throw out the stuff that no one ever looks at, the endless replication, the stuff of no business value. We all store rubbish in our data sets, things that shouldn’t be there in the first place, so weed it out; otherwise at best we are going to process irrelevant information, at worst we are going to skew the answers and make them worthless.
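To make the triage idea concrete, here is a minimal sketch of the sort of report you might start with. The two-year “stale” cut-off is an arbitrary assumption for illustration; it flags files nobody has touched in years and spots byte-identical duplicates by hashing their contents:

```python
import hashlib
import os
import time

STALE_AFTER = 2 * 365 * 24 * 3600  # ~2 years untouched; an arbitrary example cut-off

def triage(root):
    """Walk a directory tree, returning (stale_files, duplicate_files)."""
    now = time.time()
    stale, seen, duplicates = [], {}, []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if now - os.path.getmtime(path) > STALE_AFTER:
                stale.append(path)  # untouched for years: a weeding candidate
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                duplicates.append((path, seen[digest]))  # identical to an earlier file
            else:
                seen[digest] = path
    return stale, duplicates
```

Even a crude report like this gives you a starting list of candidates to weed out before anything is fed to an analytics engine.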

Make it usable

This is perhaps the biggest challenge of all: how do we make our massive on-site datasets useful to an analytics engine?

Well, we could deploy an on-prem analytics suite, but for most of us this is unfeasible, and the reality is, why bother? Amazon, Microsoft, Google and IBM, to name but a few, have fantastic analytics services ready and waiting for your data; the trick is how to get it there.

The problem with data is that it has weight, gravity; it’s the thing in a cloud-led world that is still difficult to move around. It’s not only its size that makes it tricky, but also our need to maintain control, meet security requirements and maintain compliance; these things can make moving our data into cloud analytics engines difficult.

This is where building an appropriate data strategy is important: we need a way to ensure our data is in the right place, at the right time, while maintaining control, security and compliance.

When looking to build a strategy that allows us to take advantage of cloud analytics tools, we have two basic options:

Take our data to the cloud

Taking our data to the cloud is more than just moving it there; it can’t just be a one-off copy. Ideally, in this kind of setup, we need to move our data in, keep it synchronised with changing on-prem data stores and then move our analysed data back when we are finished, all with the minimum of intervention.

Bring the cloud to our data

Using cloud data services doesn’t have to mean moving our data to the cloud; we can bring the cloud to our data. Services like ExpressRoute into Azure or Direct Connect into AWS mean we can get all the bandwidth we need between our data and cloud analytics services, while our data stays exactly where we want it: in our datacentre, under our control and without the heavy lifting required to move it into a public cloud data store.

Maybe it’s even a mix of the two, depending on the requirement, size and type of dataset; what’s important is that we have a strategy, one that gives us the flexibility to do either.

All aboard

Once we have our strategy in place and the technology to enable it, we are good to go. Well, almost: finding the right analytics tools, and of course deciding what to do with the results when we have them, are all part of the solution, but having our data ready is a good start.

That journey does have to start somewhere, so first get to know your data, understand what’s important and get a way to ensure you can present it to the right tools for the job.

Once you have that, step aboard and take your journey on the data train.

If you want to know more on this subject and are in or around Liverpool on July 5th, why not join me and a team of industry experts as we discuss getting the very best from your data assets at our North West Data Forum.

And for more information on getting your data ready to move to the cloud, check out a recent podcast episode I did with Cloud Architect Kirk Ryan of NetApp, as we discuss the whys and hows of ensuring our data is cloud ready.

New fangled magic cloud buckets – Kirk Ryan – Ep32

When Public Cloud Isn’t The Answer – Matt McSpirit – Ep 16

The power, flexibility, scale and simplicity that comes with “cloud” services is something that many of us have embraced.

The ability to deliver quickly and easily, complicated application and platform infrastructures is very appealing, especially for those of us who are continually challenged to deliver solutions to business problems ever more efficiently.

Public cloud providers like Microsoft, Amazon, Google and IBM are a great answer to many of the modern technology challenges we are faced with, but, what happens when public cloud can’t be the answer to our challenge?

There are many reasons a public cloud solution may not be right: technical, commercial or, of course, security driven. Privacy and data sovereignty are concerns for many a business as they consider cloud.

What do we do? We can see the benefit, but we also understand why we can’t take advantage of the solution.

The answer?

Build your own; deliver your own on-premises cloud solution. But how? How do I build my own Microsoft Azure? Where on earth do I start?

Well, you’ve come to the right place. In part two of my conversation with Microsoft Technical Evangelist Matt McSpirit, we discuss Azure Stack, Microsoft’s forthcoming private cloud converged solution, currently available in Technical Preview ahead of its launch later this year. Azure Stack gives you all of the flexibility and deployment efficiency of Azure, with all the control, security and privacy of delivering it from your own data centre.

In this episode we discuss what Azure Stack is, who it is (and is not) for, as well as how you can get your hands on it.

It’s a fascinating technology solution and Matt provides great insight into why it may be for you and how you get started finding out.

Enjoy the show.

Matt mentioned a range of resources that you can get your hands on to find out more about Azure Stack;

The Main Azure Stack page for more background and detail on the solution

Click here to access the Azure Stack Tech Preview

Check out the very informative sessions from Microsoft Ignite.

You can find Matt on Twitter @mattmcspirit

And if you missed part one of our chat, don’t worry, it’s here.

If you enjoyed the show and want to make sure you don’t miss the next one, then why not subscribe on iTunes or SoundCloud or wherever else you get your podcasts.

Thanks for listening.

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss


Make my cloud so…

A bit of a Star Trek misquote, I know, but I’m pretty sure Captain Picard would have said that as the ship’s IT department looked to enable their hybrid cloud strategy. For many of us, hybrid cloud is the reality of our future IT designs; the flexibility provided by access to cloud compute and storage, both technically and commercially, makes cloud services compelling in many instances.

However, those compelling cases do come with a cost. Using hugely scalable public cloud technologies presents challenges, some around application architecture and system design, but more often than not they are data issues: security, governance, protection, or even just moving big lumps of data around. All of these add to the challenge of enabling these flexible cloud-based services.

With that in mind, I took great interest in NetApp’s November 1st hybrid cloud announcements (you can find the press release here), especially the very strong emphasis on enablement; this was not your usual product range announcement. Often these announcements are almost “self-serving”: get a new widget from us, buy our updated solution or platform. Don’t get me wrong, there is an element of that in these NetApp announcements, with updates to a couple of major products, but what was really interesting was the cloud service solutions that were mentioned, technologies that were not your “traditional” NetApp solution. No need for a hardware array, no need for ONTAP or anything else; these were purely service offerings, designed for no other reason than to address the significant challenge of enabling our integration with cloud services.

I don’t plan on going into detail on all of the announcements (check out a great post like this from Mike Andrews (@trekintech) for wider analysis); I just wanted to look at a couple of the more generic cloud enablement solutions that don’t need any “traditional” NetApp components.

Cloud Control for Office 365

In my experience, one of the early cloud integrations an enterprise will make is Office 365, taking advantage of Microsoft’s software-as-a-service offering for email, document management and file storage. These services, although critical, are often time-intensive to deliver while providing little additional value to the business; “table stakes” if you will. A company must have these things, but they are not going to give competitive advantage.

Giving it to Microsoft to run makes perfect sense; however, one thing that is often missed when a business moves to 365 is data protection. Microsoft’s role is clear: it is to present you with a secure, scalable and resilient service, not to protect your data. 365 offers several options for data retention; however, Microsoft do not protect you from data deletion, accidental or malicious. Once that data is gone, it’s gone.

So how do you protect it? There is a growing market of solutions to this challenge, and NetApp have now thrown their hat into the ring with an extremely comprehensive offering.

Cloud Control is a full SaaS offering; no need to purchase equipment or install anything on-prem. Take it as a service, point it at your 365 subscription and you have the capability to back up your Exchange, SharePoint and OneDrive for Business repositories.

What separates Cloud Control, in my opinion, is the number of possible backup targets you can use. If you have a NetApp environment, great: you can back your 365 data straight into your ONTAP environment. Don’t have on-prem ONTAP? No problem, you can spin up ONTAP Cloud and back off to that.

Don’t want ONTAP at all? Use AltaVault from the NetApp portfolio to move your data to an object store. And if you don’t want anything at all from NetApp, no problem: Cloud Control will allow you to move data straight into an AWS S3 bucket or an Azure storage blob.

Cloud Control provides granular data protection, with item level recovery for your 365 implementation, enabling you to deliver enterprise level data protection to your public cloud service.

Cloud Sync

A key benefit of cloud compute is the ability to get masses of processing power as and when you need it, without having to build a big compute cluster that spends most of its time idle.

Tools like Hadoop are fantastic for data analytics, but Hadoop is one heck of an expensive tool to deploy, which has kept big data analytics out of reach of many enterprises.

However, cloud providers like AWS have addressed this with services available to rent as you need them. The trick is, how do you move data to that analytics engine as and when you need it? How do we seamlessly integrate these services into our infrastructure?

Step forward the Cloud Sync service. Cloud Sync points at your on-prem NFS datastore (no, it doesn’t have to be ONTAP-based NFS) and your analytics service, and seamlessly syncs the on-prem data to the analytics engine when needed, allowing you to take advantage of cloud compute while ensuring your datasets are always refreshed.

Cloud Sync is all about automating those difficult tasks, and in modern IT, that is exactly what we are looking for, orchestrating the use of cloud compute allowing us to consume services in the most effective way.

Again, delivering this without the need for any of the more “traditional” NetApp technologies.
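Under the hood, any sync service of this kind boils down to comparing source and target state and copying only what has changed. As a much-simplified, hypothetical illustration of that idea (not NetApp’s actual implementation), a one-way mirror between two directories might look like this:

```python
import os
import shutil

def sync_dirs(source, target):
    """One-way mirror: copy files from source to target when the target
    copy is missing or older than the source. Returns the files copied."""
    copied = []
    for dirpath, _, filenames in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        dest_dir = os.path.join(target, rel)
        os.makedirs(dest_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dest_dir, name)
            # Only copy if the target is missing or stale.
            if not os.path.exists(dst) or os.path.getmtime(dst) < os.path.getmtime(src):
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied.append(os.path.normpath(os.path.join(rel, name)))
    return copied
```

A real service adds scheduling, change detection at scale, retries and cloud targets, but this compare-and-copy loop is the essence of keeping an analytics copy refreshed.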

But Why?

I suppose this begs the question: why, as a storage vendor, build solutions that have no need for your storage products? Well, let’s not be fooled; both of these services are NetApp subscription services, and of course both can enhance existing NetApp technology. However, I don’t think that’s the primary aim.

If you’ve ever looked at NetApp’s Data Fabric strategy, you’ll see that they are a very different storage company, much happier to talk about data strategy than to sell you things. Of course they have products that can enable your strategy, but a conversation about how we manage our data in this modern IT world is something far more valuable than just selling something a bit faster with a few more flashing lights; getting us to think about how we move, manage and secure data is far more important.

These November 1st announcements are just another example of NetApp’s commitment to its Data Fabric, and of how the right fabric can enable an organisation to fully exploit cloud flexibility. I very much look forward to seeing these solutions in action as they come to market, and of course I’m keen to see what NetApp add next to this increasingly impressive hybrid cloud story.

Cloud enabled captain…

For more detail on NetApp’s cloud solutions visit their cloud website where you can get information as well as access to trials of these services.

cloud.netapp.com

For some background reading on data fabric, please feel free to check one of my previous posts;

Data Fabric – What is it good for?

And if you have any questions, feel free to contact me @techstringy or on LinkedIn.

For other posts with details on the announcements check out

Taylor Riggan’s View on Cloud Sync

Mike Andrews NetApp Hybrid Cloud Launch

And if you’d like a bit of audio to listen to, I also interviewed Mike Andrews for a TechStringy Interview discussing the announcements and their strategic and technical impact, feel free to have a listen here;

NetApp Hybrid Cloud Announcements with Mike Andrews

Simplify My Data Leak Prevention

A little while back I wrote a post about how important it is to stop making technology so hard (feel free to have a look back here), and how successful technology delivers what people need.

How do we do that? By giving them technology that just works. I’ve written a few times about the OAP Internet Virgins show on Sky here in the UK, which gave older folk an iPad and taught them how this simple bit of well-designed technology could work, truly changing lives in a host of cases.

Well, I also said I’d give some examples of where I’ve seen simplification of technology bring real benefit. However, since that promise, times have been hectic: travelling, presenting, doing press and video interviews, a podcast debut and my actual job all got in the way of my good blogging intentions!

In the midst of all that was a presentation Microsoft asked me to give to the Institute of Financial Accountants, on the topic of data security. The idea was to give these predominantly small business owners some tips on how to secure their most critical business asset: their data. Just because these were small businesses, it doesn’t make their data any less critical than that of the very largest enterprise. However, these guys potentially have a much bigger problem: they are financial services people, not IT people, and if they needed complex technology solutions to stop them losing critical data then, in reality, they would never have that option. That’s not the way it’s supposed to work; technology should be an enabler, helping us do things better, smarter and easier, and it shouldn’t be bound by budget or in-depth IT skills.

So what have all these things got to do with making things simpler?

Take a bow, Office 365. Microsoft do lots of really good stuff on their cloud platforms, across 365 and Azure; it’s what you’d expect from a hyperscale cloud provider. One of the things cloud does is greatly simplify IT deployment. Need a new server? Go to the portal, click go and up it comes. Need storage? Select what you need and, like technology magic, it appears. The behind-the-scenes technology is very complex, but to you, the user, it looks a doddle, and that is exactly how it should be.

How does that relate back to our finance friends?

During our event we focussed on a number of areas that you should look at as part of a data leak prevention strategy.

data protection areas

Now, some of those things are practical steps anyone can take, sole trader or huge corporate, but some of these areas are more tricky.

If we wind back five years or so, how many businesses, of all sizes, found some or all of the above areas a real challenge, both technically and commercially?

Technology to address all of these things has of course been around for ages, but let’s pick on one area and show how cloud, and Office 365 specifically, has made something so much simpler, both technically and commercially.

I remember sitting in a presentation a few years ago showing the power of information rights management (IRM) in a Microsoft infrastructure. For those not familiar, this is a really powerful capability, where you can build rules into your document workflows and applications to stop important and sensitive information being shared in ways it shouldn’t.

Let’s give an example: how many of us have accidentally emailed the wrong person thanks to auto-complete addresses? I know I have. Normally you are emailing something relatively harmless, but a few months back I was accidentally sent someone’s personal financial information, because I shared a first name with their financial adviser.

How do we stop that? Well, that’s what IRM is there for. IRM would apply rules in the document, or rules in Exchange, to stop information leaving the safety of your internal systems by mistake.
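The rule idea itself is simple to picture. As a toy, hypothetical illustration (real IRM and DLP engines are far richer and policy-driven, and the domain and patterns here are invented for the example), a check that blocks outbound mail containing something that looks like UK bank details might be:

```python
import re

# Toy patterns for things that look like UK bank details.
# Real DLP engines use far more robust detection than this.
SORT_CODE = re.compile(r"\b\d{2}-\d{2}-\d{2}\b")
ACCOUNT_NO = re.compile(r"\b\d{8}\b")

def allowed_to_send(body, recipient, internal_domain="example.co.uk"):
    """Block mail carrying bank-detail-like content to external recipients."""
    external = not recipient.lower().endswith("@" + internal_domain)
    sensitive = bool(SORT_CODE.search(body)) and bool(ACCOUNT_NO.search(body))
    return not (external and sensitive)
```

A rule engine like this, sitting in the mail flow, is in miniature what stops the mis-addressed email scenario above: the sensitive message still sends internally, but never leaves the organisation by mistake.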

Brilliant! So why don’t lots of people do it? Because it’s too hard; it’s complex and expensive to set up on-prem.

“But I’d love that kind of capability,” I hear you shout. Well, step forward the bright world of cloud-based services, specifically in this case Office 365 and Azure.

As we look in our 365 management portal, what’s this handy little option?

rights management

When we click into manage, we get the opportunity to activate rights management if it’s not already running, and when you click activate, that’s kind of it: your organisation now has rights management enabled for its Office 365 estate.

What does that mean?

We can now add data security policies to a whole range of documents and emails. Yes, there is a bit of configuration (don’t be afraid to ask for some skilled advice here), but to get you started there is a range of preconfigured templates ready to roll.

IRM Templates

Once enabled, you have IRM implemented and usable in your business productivity applications.

IRM in Word

There it is, now sitting as an option in Word, where you can simply add rights management controls and apply protection templates to your sensitive company info.

Enabling this in your organisation also opens up capabilities into tools like Exchange and SharePoint Online.

For me this is a great example of how cloud technology can hugely simplify what, in reality, is a complex bit of technology to set up.

That is the power of a well-built cloud (whether private, public or hybrid): making technology deployment quick and easy, and in many businesses enabling technology that, in a more traditional model, would be too complex or expensive.

It is this kind of approach that is revolutionising the IT industry at the minute, and all of us in the industry need to understand it, whether we create applications, architect them or consult on them: we must meet the challenges of the modern business, regardless of how complex and challenging it may be behind the scenes.

There’s the challenge for us all!

Like I said at the beginning, when working with our financial services friends: their data is just as important as everyone else’s, and they shouldn’t be excluded from solutions to their business challenges by complexity and cost, now should they!

If you’re looking for Information Rights Management as part of your data leak prevention strategy, hopefully this post has given you some ideas of how this is not out of your reach either technically or commercially by utilising cloud services where appropriate.

Any questions, feel free to give me a shout on Twitter, LinkedIn or via the comments section here and we can swap some ideas.

Thanks for reading.

Want to know more – try these

What is Azure Rights Management (Technet Article)

What is Azure Rights Management Overview (Short Video)