When Public Cloud Isn’t The Answer – Matt McSpirit – Ep 16

The power, flexibility, scale and simplicity that come with “cloud” services are something many of us have embraced.

The ability to deliver complicated application and platform infrastructures quickly and easily is very appealing, especially for those of us who are continually challenged to deliver solutions to business problems ever more efficiently.

Public cloud providers like Microsoft, Amazon, Google and IBM are a great answer to many of the modern technology challenges we are faced with, but what happens when public cloud can’t be the answer to our challenge?

There are many reasons that a public cloud solution isn’t right: technical, commercial or, of course, security driven. Privacy and data sovereignty are concerns of many a business as they consider cloud.

What do we do? We can see the benefit, but we also understand why we can’t take advantage of the solution.

The answer?

Build your own; deliver your own on-premises cloud solution. But how? How do I build my own Microsoft Azure, and where on earth do I start?

Well, you’ve come to the right place. In part two of my conversation with Microsoft Technical Evangelist Matt McSpirit, we discuss Azure Stack, Microsoft’s forthcoming private cloud converged solution, currently available in Technical Preview ahead of its launch later this year. Azure Stack gives you all of the flexibility and deployment efficiency of Azure, with all the control, security and privacy of delivering it from your own data centre.

In this episode we discuss what Azure Stack is, who it is (and is not) for, as well as how you can get your hands on it.

It’s a fascinating technology solution and Matt provides great insight into why it may be for you and how you get started finding out.

Enjoy the show.

Matt mentioned a range of resources that you can get your hands on to find out more about Azure Stack;

The Main Azure Stack page for more background and detail on the solution

Click here to access the Azure Stack Tech Preview

Check out the very informative sessions from Microsoft Ignite.

You can find Matt on Twitter @mattmcspirit

And if you missed part one of our chat, don’t worry, it’s here.

If you enjoyed the show and want to make sure you don’t miss the next one, then why not subscribe on iTunes or Soundcloud or wherever else you get your podcasts.

Thanks for listening.

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss


Tech Trends – Object Storage – Robert Cox – Ep13

Over the last couple of weeks I’ve chatted about some of the emerging tech trends that I expect to see continue to develop during 2017 (Have a read of my look ahead blog post for some examples). To continue that theme this episode of Tech Interviews is the first of three looking in a little more detail at some of those trends.

First up, we look at a storage technology that is growing rapidly, if not necessarily obviously: object storage.

As the amount of data the world creates continues to grow exponentially, it is becoming clear that some methods of traditional storage are no longer effective. When we are talking billions of files, spread across multiple data centres in multiple geographies, traditional file storage models are no longer as effective (regardless of what a vendor may say!). That’s not to say that our more traditional methods are finished, in fact far from it, however there are increasingly use cases where that traditional model doesn’t scale or perform well enough.

For many of us, we’ve probably never seen an object store, or at least think we haven’t, but if you’re using storage from AWS or Azure then you’re probably using object storage, even if you don’t realise it.
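If you’ve never knowingly touched an object store, the interaction model is simpler than it sounds: there is no file system, no folders and no drive letters, just a flat pool of objects, each addressed by a key and carrying its own metadata, all accessed over an HTTP API. As a purely illustrative sketch (the bucket name and keys are invented, and it assumes the AWS SDK for Python, boto3, with credentials already configured), storing and fetching an object looks something like this:

```python
import boto3  # AWS SDK for Python - assumed installed and configured with credentials

# An object store has no directory tree or file handles, just a flat namespace of
# objects addressed by a key inside a bucket. Bucket and key names here are made up.
s3 = boto3.client("s3")

# Write ("put") an object - the slashes in the key are purely cosmetic, not folders.
s3.put_object(
    Bucket="example-archive-bucket",
    Key="podcasts/ep13/show-notes.txt",
    Body=b"Object storage with Robert Cox",
    Metadata={"guest": "Robert Cox", "topic": "object storage"},
)

# Read ("get") it back by key - no mount points or file locking involved.
response = s3.get_object(
    Bucket="example-archive-bucket",
    Key="podcasts/ep13/show-notes.txt",
)
print(response["Body"].read())
print(response["Metadata"])
```

It’s that key-addressed, metadata-rich model that lets object stores spread billions of objects across data centres and geographies without the bottlenecks of a traditional file system.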

With all that said, what actually is object storage? Why do we need it? How does it address the challenges of more traditional storage? What are the use cases?

It’s those questions that we attempt to answer in this episode of Tech Interviews with my guest Robert Cox. Robert is part of the storage team at NetApp, working with their StorageGrid Webscale object storage solution.

During our chat we focus on giving an introduction to object storage: why it is relevant, the issues with more traditional storage and how object storage overcomes them, as well as Robert sharing some great use cases.

So, if you are wondering what object storage is all about and where it may be relevant in your business, then hopefully this is the episode for you.

Enjoy…

If you’d like to follow up with Robert with questions around NetApp’s object storage solutions you can email him at robert.cox@netapp.com

You can find information on NetApp StorageGrid Webscale here 

And if you’d like a demo of StorageGrid then request one here

Next week we take a look at one of the most high-profile tech trends, the emergence of DevOps. To make sure you don’t miss out, you can subscribe to Tech Interviews below.

Hope you can join us next week, thanks for listening…

Subscribe on Android

http://feeds.soundcloud.com/users/soundcloud:users:176077351/sounds.rss

Insights from the storage industry?

Last week I was away in Berlin at NetApp’s Insight conference (see what I did with the title there!), always an enjoyable event with good information, company, food and the occasional large German beer. That aside, I do try to attend a handful of these types of events a year as part of my job.

How does it benefit my job?

A big part of my role is to identify key industry trends and challenges and to see whether our technology partners are developing solutions to take these on, helping our customers to adapt and modernise their IT and maintain a competitive edge in a fast-changing business world. Whether that’s Microsoft, one of our data management and security providers, or, as in this case, a storage provider like NetApp, we need to know our partners are still delivering relevant solutions.

So how did NetApp measure up?

Our answer to this is usually found in the keynote sessions; that’s the home of strategic presentations and product announcements, and Insight was no exception.

Understanding the problems?

Did the NetApp leadership address the fundamental challenges that we are seeing?

Three messages really stood out for me at the event, each hit key concerns I see in my daily dealings with senior IT people.

Data is critical

Data was, at different times, the new gold, the new oil and the new digital currency, but ultimately it was THE most important thing. It was the key focus of pretty much everything covered across the four days, and that’s how it should be: it’s our businesses’ most critical asset, the thing that has the opportunity to separate us from our competition by extracting true value, whether that’s better reporting, better analytics or more flexibility in movement from on-prem to cloud and back. Getting the best from it is a major goal for us all.

This focus was refreshing, and it also included coining the phrase;

NetApp not the last independent storage vendor but the first data management company

That works for me; my conversations these days are never speeds-and-feeds based, much more around outcomes and aims. Tick in the box then.

DevOps it

You just can’t have an IT discussion these days without throwing around the phrase DevOps – I’d honestly be disappointed if it wasn’t brought up – and I’m not even going to attempt to do justice to the breadth of the topic here; there’s lots of great DevOps content out there (for an excellent DevOps intro, have a listen to the Tech ONTAP Podcast episode with Gene Kim here).

I think we often assume this kind of stuff is just about software development, but in my mind it’s much more about the way we are looking to consume technology in our businesses. IT cannot be an impediment to us doing business; the modern business needs to be able to respond quickly to new challenges, and we need an IT infrastructure that can not only change, but that we are not afraid to change when we need to.

There was a great session on a day in the life of DevOps that, although played for laughs, brought home the importance of automation, the ability to fail fast and how to manage modern development processes, of course with a healthy bit of how things like NetApp’s integration with Docker and API access to both ONTAP and SolidFire can help build a modern, agile data infrastructure.

Integrating the cloud

NetApp has talked extensively about their data fabric message for the last couple of years, and many of you know I’m a fan (for example Data Fabric – what is it good for). The driver behind the fabric is the reality that, for most of us and our IT infrastructure, the future is going to be hybrid: some stuff on-prem, some stuff in the cloud. But this kind of hybrid environment comes with challenges, none bigger than how we move data between our on-prem and cloud environments, and not just how we move the datasets around, but how we ensure that the data remains under our control, secure and protected, and does not end up living in a cloud storage silo.

Insight this year showed the maturity of what NetApp have been doing in this space, not only with the additional capabilities they have added to the NetApp portfolio (closer integration of ONTAP and AltaVault, the announcement of SnapMirror to SolidFire, and enhancements to ONTAP Cloud with additional capabilities in AWS as well as support for Azure), but also with the introduction of a couple of really interesting solutions that don’t need any “traditional” NetApp solutions at all.

Cloud Sync allows for the movement and conversion of data from an on-prem NFS datastore up into AWS’s analytics tools, and is designed to greatly simplify the use of services such as EMR. Alongside this is Cloud Control, a solution to help protect the contents of your Office 365 services (email, SharePoint and OneDrive for Business), giving you the ability to back up data from these services into anything from your NetApp-based on-prem storage to storage blobs in Azure and AWS. Impressively, both of these are simply services that you sign up to, point at the relevant cloud services and away you go; no requirement for any other NetApp tech if you don’t want it.

What I like about this is that it shows their commitment to data; it’s no longer about selling you ONTAP or FAS hardware (even though they remain great platforms) but about helping us enable our data to be used in this quickly changing technology and business world.

Did NetApp deliver what I was looking for?

Certainly for me they did. As I said right at the start, when I get time with key technology partners I’m looking to see whether they are addressing the primary issues we and our customers are seeing, and whether they understand the key technology trends. Personally I think NetApp nailed it at Insight and will continue to be very relevant in the modern data management world.

So good job NetApp.

I hope you enjoyed the post. If you want some further info from Insight, here are some resources you may find useful.

While I was out there I got to do a couple of interviews with key NetApp staff, which were recorded for their YouTube channel.

I chatted here with Elliot Howard about the wider challenges that customers see and how NetApp and its partners can help;

In this video I spoke with Grant Caley, NetApp UK’s chief technologist, and asked about industry trends and how they are going to affect our storage usage in the future;

Finally I also spoke with some of the attendees at the event to see what they thought of Insight and tech conferences in general. You can find that here on TechStringy Interviews – or go get the podcast from iTunes or wherever you get your podcasts.


Make my cloud so…

A bit of a Star Trek misquote, I know, but I’m pretty sure Captain Picard would have said that as the ship’s IT department looked to enable their hybrid cloud strategy. For many of us, hybrid cloud is the reality of our future IT designs; the flexibility provided by access to cloud compute and storage, both technically and commercially, makes cloud services compelling in many instances.

However, those compelling cases do come with a cost. Using hugely scalable public cloud technologies presents challenges: some are application architecture and system design, but more often than not they are data issues, security, governance, protection or even just moving big lumps of data around, all adding to the challenge of enabling these flexible cloud-based services.

With that in mind, I took great interest in NetApp’s November 1st hybrid cloud announcements (you can find the press release here), especially the very strong emphasis on enablement; this was not your usual product range announcement. Often these announcements are almost “self-serving”: get a new widget from us, buy our updated solution or platform. Don’t get me wrong, there is an element of that in these NetApp announcements, with updates to a couple of major products, but what was really interesting was the cloud service solutions that were mentioned: technologies that were not your “traditional” NetApp solution, with no need for a hardware array, no need for ONTAP or anything else. These were purely service offerings, designed for no other reason than to address the significant challenge of enabling our integration with cloud services.

I don’t plan on going into detail on all of the announcements; check out a great post like this from Mike Andrews (@trekintech) for wider analysis. I just wanted to look at a couple of the more generic cloud enablement solutions that don’t need any “traditional” NetApp components.

Cloud Control for Office 365

In my experience, one of the early cloud integrations an enterprise will make is Office 365, taking advantage of Microsoft’s software-as-a-service offering for email, document management and file storage. These services, although critical, are often time intensive to deliver while providing little additional value to the business; “table stakes” if you will. A company must have these things, but they are not going to give competitive advantage.

Giving it to Microsoft to run makes perfect sense; however, one thing that is often missed when a business moves to 365 is data protection. Microsoft’s role is clear: it is to present you with a secure, scalable and resilient service, not to protect your data. 365 offers several options for data retention; however, Microsoft do not protect you from data deletion, accidental or malicious. Once that data is gone, it’s gone.

So how do you protect it? There is a growing market of solutions to this challenge, and NetApp have now thrown their hat into the ring with an extremely comprehensive offering.

Cloud Control is a full SaaS offering; there is no need to purchase equipment or install anything on-prem. Take it as a service, point it at your 365 subscription and you have the capability to back up your Exchange, SharePoint and OneDrive for Business repositories.

What separates Cloud Control, in my opinion, is the number of possible backup targets you can use. If you have a NetApp environment, that’s great: you can take your 365 data and back it straight into your ONTAP environment. Don’t have on-prem ONTAP? No problem, you can spin up ONTAP Cloud and back off to that.

Don’t want ONTAP at all? Use AltaVault from the NetApp portfolio to move your data to an object store. And if you don’t want anything at all from NetApp, no problem: Cloud Control will allow you to move data straight into an AWS S3 bucket or an Azure storage blob.

Cloud Control provides granular data protection, with item level recovery for your 365 implementation, enabling you to deliver enterprise level data protection to your public cloud service.

Cloud Sync

A key benefit of cloud compute is the ability to get masses of processing power as and when you need it, without having to build a big compute cluster which spends most of its time idle.

Things like Hadoop are fantastic tools for data analytics, but they are one heck of an expensive tool to deploy, and that expense has kept big data analytics out of reach of many enterprises.

However, cloud providers like AWS have addressed this with services available to rent as you need them. The trick with these is: how do you move data to that analytics engine as and when you need it? How do we seamlessly integrate these services into our infrastructure?

Step forward the Cloud Sync service. Cloud Sync points at your on-prem NFS datastore (no, it doesn’t have to be ONTAP-based NFS) and your analytics service, and seamlessly syncs the on-prem data to the analytics engine when needed, allowing you to take advantage of cloud compute while ensuring your datasets are always refreshed.
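Cloud Sync itself is a NetApp service with its own setup, but the underlying idea, keeping a cloud copy of an on-prem NFS export refreshed so that rented analytics services can work on it, can be sketched in a few lines. This is purely illustrative and not how Cloud Sync is built; the mount path, bucket name and use of the boto3 library are all assumptions for the example:

```python
import os
import boto3  # AWS SDK for Python - assumed installed and configured

NFS_MOUNT = "/mnt/nfs_export"          # assumed local mount point of the NFS datastore
BUCKET = "example-analytics-staging"   # assumed S3 bucket the analytics service reads from

s3 = boto3.client("s3")

# Walk the export and push every file up to S3, mirroring the directory layout
# in the object keys so the cloud analytics engine sees a familiar structure.
for root, _dirs, files in os.walk(NFS_MOUNT):
    for name in files:
        local_path = os.path.join(root, name)
        key = os.path.relpath(local_path, NFS_MOUNT).replace(os.sep, "/")
        s3.upload_file(local_path, BUCKET, key)
        print(f"refreshed s3://{BUCKET}/{key}")
```

A real sync service layers on the hard parts this sketch skips, change detection, scheduling, retries and data conversion, which is exactly the drudgery Cloud Sync is pitched at automating.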

Cloud Sync is all about automating those difficult tasks, and in modern IT that is exactly what we are looking for: orchestrating the use of cloud compute and allowing us to consume services in the most effective way.

Again, delivering this without the need for any of the more “traditional” NetApp technologies.

But Why?

I suppose this begs the question: why, as a storage vendor, build solutions that actively have no need for your storage products? Well, let’s not be fooled; both of these services are NetApp subscription services, and of course both solutions can enhance existing NetApp technology, but I don’t think that’s the primary aim.

If you’ve ever looked at any of NetApp’s Data Fabric strategy, you’ll see that they are a very different storage company, one much happier to talk about data strategy than to sell you things. Of course they have things that can enable your strategy, but a conversation about how we manage our data in this modern IT world is something I see as far more valuable than just selling something a bit faster with a few more flashing lights; getting us to think about how we move, manage and secure data is far more important.

These November 1st announcements are just another example of NetApp’s commitment to its Data Fabric and of how the right fabric can enable an organisation to fully exploit cloud flexibility. I very much look forward to seeing these solutions in action as they come to market, and of course I’m keen to see what NetApp add next to this increasingly impressive hybrid cloud story.

Cloud enabled captain…

For more detail on NetApp’s cloud solutions visit their cloud website where you can get information as well as access to trials of these services.

cloud.netapp.com

For some background reading on data fabric, please feel free to check one of my previous posts;

Data Fabric – What is it good for?

And if you have any questions, feel free to contact me @techstringy or on Linkedin.

For other posts with details on the announcements check out

Taylor Riggan’s View on Cloud Sync

Mike Andrews NetApp Hybrid Cloud Launch

And if you’d like a bit of audio to listen to, I also interviewed Mike Andrews for a TechStringy Interview discussing the announcements and their strategic and technical impact, feel free to have a listen here;

NetApp Hybrid Cloud Announcements with Mike Andrews

A stitch in time… and data!

A quick look back at my blogging and social media back catalogue will show that I’m a bit of a fan of the concept of “Data Fabric”. Yep, guilty. One thing I’ve noticed while I’m on my data fabric rounds, sharing the importance of why a fabric strategy matters, is that one question often comes up,

all sounds great this fabric idea, but where do I start?

It’s a great question, which has inspired me to produce a series of posts explaining the practicality of how you can build your own data fabric.

First though, some background including the answer to the critical question “Why on earth go on about this fabric thing in the first place?”

Why a Fabric?

In this most transformative of times in technology, the need for flexibility in our technical architectures has never been greater; the march toward “cloud” models of technical deployment continues at a pace, be that private, public or hybrid clouds. One part of our infrastructure presents a bigger challenge than most: our data, and that’s a problem!

Why a problem? Ultimately, the reason we build any infrastructure is so that we can present data, protect data, make data available, manipulate data and analyse data. It’s all about data; compute, cloud and mobility are all about getting value from our data and delivering it to our data consumers.

The issue is that data has weight and volume; this makes it hard to move around, as well as potentially expensive (look at how much the public cloud providers charge you to get it into and out of their platforms) and of course slow (you cannot beat the laws of physics, to throw in a Star Trek misquote!). But these problems don’t help in a world where we want complete flexibility, where we want to be able to drop our data into a development environment, where we want to have our data moved into and out of appropriate repositories for backup, recovery or DR, without the commercials or the physics defeating us.

All of these challenges are among the considerations that we have to make and why a fabric strategy is important.

What is a fabric?

Data fabric is a strategy rather than a technology, but that doesn’t alter just how critical it is. All of the reasons we want a fabric are outlined above, and a fabric strategy is the answer to those challenges: it provides us complete mobility of our data between many data repositories with the minimum amount of tools and is absolutely key to a successful data strategy, for today and certainly for the future.

it provides us complete mobility of our data between many data repositories with the minimum amount of tools and is absolutely key to a successful data strategy for today, and certainly for the future.

Think about how a data fabric could change the way we deal with public cloud storage. One of the questions I always get about cloud is “am I locked in?” (or “do they have me over a barrel?”), and the reality is yes, because getting your data in and out is hard. But what if you could break that barrier so you had complete flexibility of choice? One month you have your data in Azure; the next, AWS are commercially a better fit, so you quickly flip your resources across and save yourself significant costs. That not only allows us to exploit many of the capabilities available to us, but also opens up whole new ways to operate our business.

It is this kind of flexibility that makes a data fabric strategy a critical part of our future infrastructure plans, be they on-premises, public cloud, private cloud or a mixture of them all, our data strategy has to ensure our data is available wherever we need it to be, whenever we want it to be there.

Whose fabric?

The idea of building a fabric makes sense; of course we want and need that ability to move our data between different storage repositories.


This begs the question: whose technology is capable of building such a fabric?

There are technologies that kind of allow bits of a solution: things like migration tools that move VMs into public clouds, storage gateways, and backup and DR as a service solutions that allow us to replicate our data into clouds. These technologies are great and can indeed be part of an overall strategy, but in those cases they are solution silos, and there is the potential for an awful lot of stitching to be done to create a data fabric.

It will probably come as no surprise to those who’ve looked at my stuff on data fabric before that the main strategic partner for me in this space is NetApp. The NetApp fabric strategy is extremely compelling, built on a backbone of Data ONTAP but including so much more: cloud and virtual versions of ONTAP, AltaVault, StorageGrid, NetApp Private Storage (NPS) for public cloud and of course the upcoming addition of SolidFire.

All of this provides NetApp with a wide range of storage solutions, but importantly the fabric strategy builds in the ability to move data between each of these platforms. Many of these tools are already in place; moving between ONTAP and its physical, virtual and cloud versions is as easy as you’d expect, but the capability to move between object stores, AltaVault, 3rd party storage and E-Series, all with a simple set of tools, is either already with us or will be in the not too distant future.

NetApp data Fabric

This in my opinion delivers the most complete strategy of any of the data storage players.

So if Data Fabric sounds like something you want to deliver into your business, then read on as we look at how you can start that journey.

Starting the fabric journey


What part of our current infrastructure is a good place to start? A NetApp fabric world presents us with multiple starting points; over this occasional series we’ll look at each of these potential Data Fabric entry points:

  • Production Storage
  • Test and Dev
  • Public Cloud
  • IT Continuity

Today though we’ll start with a bit of “low hanging fruit” as the sales folk like to say, by looking at backup and archive.

Backup and archive is often a good place to start with any new technology; it’s relatively non-disruptive and relatively low risk, as we can keep existing strategies in place until we are absolutely sure our new solution is what we need.

With that in mind, how do NetApp help us move into data fabric through our backups and archives?

If we think about what we want, which is our data in the most appropriate place, then public cloud is a great fit for many of our backup and archive needs: hugely scalable and relatively cheap, and there are lots of cloud backup products out there, from the simple to the complex. However, the key to data fabric is ensuring this is flexible.

Step up NetApp AltaVault. AltaVault is a cloud-integrated backup appliance, presenting itself to your existing backup solution (so not necessarily any need to change that) as a backup target, while at the other end of the appliance it talks to an object store, be that yours or, more likely, a cloud-based service (such as Azure, Amazon, SoftLayer etc.). The AltaVault appliance then works as a gateway between your on-premises solution and your business-appropriate object store, deduplicating, compressing and encrypting data before sending it off to your storage repository. For performance it also caches a large segment of that data for local recovery of the most recent data sets, as well as of course optimising performance as the backup/archive job is written to it.

altavault ecosystem
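To make the gateway idea a little more concrete, here is a toy sketch of the pipeline such an appliance sits in: split the backup stream into chunks, deduplicate, compress and encrypt them, ship them off to an object store and keep a local cache for fast restores. This is in no way how AltaVault is actually implemented, just the concept, with a Python dictionary standing in for the cloud repository and a trivial placeholder in place of real encryption:

```python
import hashlib
import zlib

object_store = {}    # stands in for the cloud/object repository
local_cache = {}     # recent chunks kept on the "appliance" for fast local restores
seen_chunks = set()  # fingerprints of chunks already shipped (deduplication)

def toy_encrypt(data: bytes, key: bytes = b"demo-key") -> bytes:
    """Placeholder marking the encryption step - a real appliance uses proper ciphers."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def backup(stream: bytes, chunk_size: int = 4096) -> None:
    """Chunk, dedupe, compress, encrypt and 'send' a backup stream to the object store."""
    for offset in range(0, len(stream), chunk_size):
        chunk = stream[offset:offset + chunk_size]
        fingerprint = hashlib.sha256(chunk).hexdigest()
        if fingerprint in seen_chunks:
            continue                                  # deduplicate: only ship new chunks
        seen_chunks.add(fingerprint)
        blob = toy_encrypt(zlib.compress(chunk))      # compress, then encrypt
        object_store[fingerprint] = blob              # off to the object store
        local_cache[fingerprint] = chunk              # cache recent data for quick restores

backup(b"nightly backup job ... " * 2000)  # first run ships everything
backup(b"nightly backup job ... " * 2000)  # an identical second run ships nothing new
print(f"{len(object_store)} unique chunks stored, {len(local_cache)} cached locally")
```

A real appliance of course also keeps a manifest so each backup job can be rebuilt from its chunks, and handles cache eviction and key management, but that flow of dedupe, compress, encrypt and hand off to whichever object store suits the business is what makes it such a natural on-ramp to a fabric.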

That’s all great, and it is a really nice way of opening up the advantages of private and public cloud platforms to our data backup and archive. But how is this part of a fabric? How does this give me flexibility?

Where AltaVault really opens up data fabric is with the availability of public cloud based variants of the on-premises appliances.

How does this help?

Let’s say that we have decided the best place for our backup and archive data is an Amazon S3 store; we deploy our on-prem AltaVault, which takes our backup data and sends it off, securely and efficiently, to the cloud.

Roll back to the beginning: why do we need fabric?

Because we want to be able to have access to our data in the best place possible.

Let’s say we have a disaster and lose the site that houses our AltaVault appliance. Fear not: we go off to the AWS Marketplace and fire up a cloud version of AltaVault. We point this cloud appliance at our AWS-based cloud storage and, hey presto, there are all of our backups. Even better, if we want that data back and don’t have access to our original data store, we can restore it into the cloud, maybe even into a version of Cloud ONTAP, and there it is, available to us in the best and most convenient place we could need it.

Remember what we said at the start: the idea of a fabric is to ensure that our data is where we want it, when we want it. Hopefully you can see here how AltaVault takes one part of our data infrastructure and starts to weave it straight into a future data fabric, with no disruption on site and no changing of any of our fundamental infrastructure, just taking our existing backup approach, taking advantage of today’s technology paradigms and giving you a whole new and flexible way of protecting your data.

the idea of a fabric is to ensure that our data is where we want it when we want it

That’s what you can do today, right now, but it doesn’t stop there. Why not check out what NetApp have planned for data fabric? Have a look at this demo presented by NetApp founder Dave Hitz at the recent NetApp Insight conference in Berlin (running time about 13 minutes).

The Future of Data Fabric

There you go then, step one on how you can start your move to a data fabric. And yes, this is very much about NetApp and their fabric, as I believe their vision is by far and away the most complete in the market, but hey, even if you don’t want to use NetApp in its entirety, or even in part, hopefully this has opened up some of the practical considerations of a data fabric and given you some ideas to consider as you plan the next part of your data strategy.

Any questions? Feel free to contact me on Twitter, LinkedIn or the BLOG comments; I’d love to talk more with you about Data Fabric.

Below are links to a bunch of other things you may want to read, some by me, some from others.

Happy stitching!

Introducing Data Fabric

How To Save 90% On Cloud Backup Costs – By Mike Andrews of NetApp

A Hybrid Cloud Strategy needs Data Fabric – Short Video

Simplify My Data Leak Prevention

A little while back I wrote a post about how important it is to stop making technology so hard (feel free to have a look back here) and how successful technology delivers what people need.

How do we do that? By giving them technology that just simply works. I’ve written a few times about the OAP Internet Virgins show on Sky here in the UK, which gave older folk an iPad and taught them how this simple bit of well-designed technology could work, and how it truly changed lives in a host of these cases.

Well, I also said I’d give some examples of where I’ve seen simplification of technology have real benefit. However, since that promise, times have been hectic: travelling, presenting, doing press and video interviews, a podcast debut and my actual job all got in the way of my good blogging intentions!

In the midst of all that was a presentation I was asked to do by Microsoft to the Institute of Financial Accountants, the topic of which was data security. The idea being to give these predominantly small business owners some tips on how to secure their most critical business asset, their data. Just because these were small businesses, it doesn’t make their data any less critical than that of the very largest enterprise. However, these guys potentially have a much bigger problem: they are financial services people, not IT people, and if stopping them losing critical data needed complex technology solutions then, in reality, they would never have that option. That’s not the way it’s supposed to work; technology should be an enabler and help us do things better, smarter and easier, and shouldn’t be bound by budget or in-depth IT skills.

Well, what have all these things got to do with making things simpler?

Take a bow, Office365. Microsoft do lots of really good stuff on their cloud platforms, across 365 and Azure; it’s what you’d expect from a hyperscale cloud provider. One of the things that cloud does is help to greatly simplify IT deployment: need a new server? Go to the portal, click go and up it comes. Need storage? Select what you need and, like technology magic, these things appear. The behind-the-scenes technology is very complex, but to you, the user, it looks a doddle, and that is exactly how it should be.

How does that relate back to our finance friends?

During our event we focussed on a number of areas that you should look at as part of a data leak prevention strategy.

data protection areas

Now some of those things are practical things you can do, sole trader or huge corporate, but some of these areas are more tricky.

If we wind back 5 years or so, how many businesses of all sizes found some or all of the above areas a real challenge, both technically and commercially?

Technology to address all of these things has of course been around for ages, but let’s just pick on one area and show how cloud, and Office365 specifically, has made something so much simpler, both technically and commercially.

I remember sitting in a presentation a few years ago showing the power of information rights management (IRM) in a Microsoft infrastructure. For those not familiar, this is a really powerful capability, where you can start building rules into your document workflows and applications to stop important and sensitive information being shared in ways it shouldn’t be.

Let’s give an example: how many of us have accidentally emailed the wrong person thanks to auto-complete addresses? I know I have. Normally you are emailing something relatively harmless, but a few months back I was accidentally sent someone’s personal financial information, because I share a first name with their financial adviser.

How do we stop that? Well, that’s what IRM is there for; IRM would have rules in the document or rules in Exchange that would stop information leaving the safety of your internal systems by mistake.

Brilliant, so why don’t lots of people do it? Because it’s too hard; it’s complex and expensive to set up on-prem.

“But I’d love that kind of capability” I hear you shout. Well, step forward the bright world of cloud-based services, specifically in this case Office365 and Azure.

As we look in our 365 management portal, what’s this handy little option?

rights management

When we click into manage, we get the opportunity to activate rights management if it’s not already running, and when you click activate, that’s kind of it: your organisation now has rights management enabled for its Office365 estate.

What does that mean?

We can now add data security policies to a whole range of documents and emails. Yes, there is a bit of configuration (don’t be afraid to ask for some skilled advice here), but to get you started there is a range of preconfigured templates ready to roll.

ILM Templates

Once enabled, you have rights management implemented and usable in your business productivity applications.

ILM in Word

There it is, now sat as an option in Word, where you can simply add rights management controls and apply protection templates to your sensitive company info.

Enabling this in your organisation also opens up capabilities into tools like Exchange and SharePoint Online.

For me this is a great example of how cloud technology can hugely simplify what is, in reality, a complex bit of technology to set up.

That is the power of well-built cloud (whether that’s private, public or hybrid): making technology deployment quick and easy to deliver, and in many businesses allowing you to enable technology that, in a more traditional model, would be too complex or expensive.

It is this kind of approach that is revolutionising the IT industry at the minute, and all of us in the industry need to understand it, whether we create applications, architect them or even consult on them, so that we can meet the challenges of the modern business regardless of how complex and challenging things may be behind the scenes.

There’s the challenge for us all!

Like I said at the beginning of this, when working with our financial services friends, their data is just as important as everyone else’s and they shouldn’t be excluded from solutions to their business challenges by complexity and cost, now should they!

If you’re looking for Information Rights Management as part of your data leak prevention strategy, hopefully this post has given you some ideas of how this is not out of your reach either technically or commercially by utilising cloud services where appropriate.

Any questions, feel free to give me a shout on Twitter, LinkedIn or via the comments section here and we can swap some ideas.

Thanks for reading.

Want to know more – try these

What is Azure Rights Management (Technet Article)

What is Azure Rights Management Overview (Short Video)

Back to the Future…decoded 2015


Now, I’m no roving reporter, just a techie who likes a bit of blogging, but on this occasion I thought I’d do a bit of roving reporting, with my view on the Microsoft Future Decoded event, which I had the pleasure of attending at the beginning of November.

So after a few hectic weeks, I thought it was about time I dusted off my notes and gave a couple of thoughts about what Microsoft discussed and what I picked up that may be of interest to us enterprise tech folk.

Let’s start with my general view of Microsoft. Full disclosure: I’m a fan. I’ve always been a fan of Microsoft technology, cutting my very first technical teeth on Windows 3.0 (well, even a little bit of 2.11… but only a little bit), and I think they do lots of things well. There had been a “cool” problem recently for Microsoft, with lots of industry watchers getting very excited by anything Apple, Google, Amazon or any other current flavour of the month in the IT industry did, while everything Microsoft did was most definitely lacking on the cool front.

However, lots of this is changing. Under Satya Nadella, Microsoft are visibly changing; the focus now is returning Microsoft to their roots as a software company first and foremost, aiming at delivering great software and services and getting these solutions out to a wide audience, focussing on getting it to them because they want it, not because they feel there is no choice.

To Microsoft’s great credit they are delivering: the very impressive Azure and the business behemoth that is Office 365 are dominating forces in the public cloud arena. This, alongside the generally positive views of Windows 10 and some great hardware such as Surface, has started to have many industry watchers sipping away at the Microsoft Kool-Aid.

Certainly in my day job we are seeing customers embrace this new Microsoft. One of my colleagues this week was telling me how a couple of his customers had reported the positive impact of their move to the Microsoft cloud, seeing both productivity improvements and a real reduction in running costs.

So that’s the context I approached the event with. From the event itself, I was looking at what Microsoft were bringing to the IT party that would allow us to meet the business challenges our customers are facing.

If you’ve not attended Future Decoded before, it is a UK event aimed at Microsoft partners, customers and industry professionals, from systems architects to developers, and certainly something worth putting in your calendar. This year it was split into two days: business (the day I attended) and developer.

What gems did the business day reveal for us all this year?

The morning session included some interesting insights from a range of business leaders from Richard Reid to Martha Lane Fox, with a healthy dose of Satya Nadella thrown in for good measure.

All of the non-Microsoft speakers had something of interest to say, and if there was a theme it was the power of innovation in their day-to-day jobs. Be that IT at Arsenal, Virgin Atlantic, Innocent Drinks or the Ministry of Defence, they all focussed on how innovation and disruption were the key to moving forward.

What was equally important, however, was placing innovation in the context of your organisation’s overall goals, and certainly not pursuing technology innovation for the sake of it.

One of my favourite statements of the day came from Hywel Sloman, IT director at Arsenal who knew it was critical for him to

Be a business leader first and IT leader second

Alongside innovation was the simplification of IT (a favourite hobby horse of mine). Mike Stone, CIO at the MOD, has spent the first 18 months of his tenure working his way through a mixture of old tech, over-running projects and restrictive contracts which are no longer fit for purpose, and although his issues were maybe more complex than many of us find, because of the nature of his work, those themes of complexity and inflexible IT are something I certainly see in many of the customers I work with.

However, although his current infrastructure was complex, he understood that embracing the cloud and mobility were key to his future strategy. All of this, though, was in the critical context of security, in fact secure by design: not built and then secured, but securely built. Good tip!

A selection of really informative and diverse business leaders, but regardless of their diverse backgrounds and requirements, their needs were very similar:

  • Innovation
  • Simplicity
  • Security
  • Efficiency
  • Data Control

That wish list sound familiar?

What did Microsoft have to say?

Well, if that was the business leaders’ wish list, what did Microsoft business leader Satya Nadella have to say about how Microsoft were helping IT leaders to meet those challenges?

This is the first time I’ve seen Satya speak and, as you’d expect, he is a good presenter, but maybe most impressively, he knew how to do a demo!!!

What were the key takeaways from Nadella’s session?

One of my favourite phrases from him was his take on mobility. He re-emphasised the Microsoft strategy of cloud first, mobile first, but the slant on what was meant by mobility was interesting.

Nadella said that this view of mobility was not about devices; mobility was about the user experience. A user should be able to move between devices, both those we use today and those that will come in the future, and their application experience should be able to go with them.

If you look at the Office experience now across Windows, iOS and Android, you can start to see how that vision is taking shape, with the experience being pretty much the same.

I like that, I like the idea of simplicity in technology and if we can develop our software experience to be the same on whatever device we access our information on, then I think that’s a huge step in the right direction.

As an aside, I spoke with a couple of really interesting guys on the Microsoft mobility team about some of the plans for mobile device management. They covered some really interesting ideas about building the mobile management into the applications and not the devices; that gives you the freedom to do what you like with the device and pick the device you want, and you then bake all the control into the corporate app you want your users to use. If you think back to Mike Stone and secure by design, you can see where that helps.

Mobility of human experience not about devices, computing will be ubiquitous it’s about your experience using it that will matter

Next up on the list of things I couldn’t agree more with was the assertion that the future for our organisations is about our data and how we use it. It’s a drum I’ve been banging for quite a while, and anyone who’s read my Data Fabric info knows my view that how we manage and move our data is key to taking advantage of technology innovation.

However, this was also about how we understand what’s in our data. Nadella talked about the interesting work Biobeats are doing around their analytics and machine learning based solutions in healthcare, which was a great example, but to be honest I’m seeing that in all kinds of areas. We all have so much data in our lives; organising and understanding it is absolutely key to how we approach our business, and increasingly our personal lives, in the future.

Nadella also spoke about the importance of not designing silos into our technology solutions; again, I’m right on board with that discussion. The ability to integrate our technology stacks is a huge part of that, but Nadella also talked about how Microsoft are developing the ability to collaborate regardless of platform and technology choices. That shouldn’t matter: if we need to work together, if we need to access data from all kinds of sources, we have to consider that in our design and enable it.

He then gave a couple of great demos showing how Microsoft technology is helping people to understand their data with Power BI and the very interesting Delve (available in Office365). Both of these are interesting tools because they make data analytics easily available, removing it from the preserve of just those who can afford it. Analytics is a huge part of the value of 365 and Azure, delivering “big data” capability to organisations of all sizes.

Last up was a great presentation on Continuum, showing how a Lumia 950 could be docked and turned into a usable tablet device with a keyboard, monitor etc., switching seamlessly from its mobile persona to that of a tablet device. Again a neat feature, and maybe it demonstrates Microsoft’s view of its phone platform as something to show the art of the possible, rather than as something that is going to displace Android and iOS as the dominant smartphone platforms.

There were also a couple of really interesting videos shown during the session; I’ve popped the links below as something worth taking a look at.

The afternoon was a selection of breakout sessions and the expo area, where I picked up some interesting bits of information, but they can wait for another BLOG.

Future Decoded delivered an interesting day, and if you’ve not attended it’s certainly worth giving a spin. It’s always interesting to see Microsoft’s view of the world and where they are looking to head strategically, and always quite nice to hear technology leaders highlighting the same challenges I hear day to day, so at least we know we are working on the right kind of challenges!

Well, that’s your roving reporter at Microsoft Future Decoded signing off! Hope you enjoyed the little review. Give the videos a look and maybe check out Future Decoded for yourself next year.

Future Decoded Website

Hololens – Transform your world with holograms

Cities Unlocked – A Microsoft Collaboration with UK Guide Dogs (Brilliant video about technology changing lives)

There was also a really great video on MS futures – but I can’t quite remember which one – so I’ve found a couple that are pretty cool anyway, so here’s some bonus material!!

Productivity Future

Future Vision