Techstringy Interviews – Docker, Containers and Server 2016 – Oh my!

Welcome to the second of the Techstringy interviews, this one came about after visiting a recent Tech User Group event in Manchester (very good events – check them out).

One of the speakers at the event was Marcus Robinson (@techdiction) of Microsoft, his topic was containers in Windows Server 2016. For those of us in the technology industry, containers are a seriously hot topic and Microsoft adding container support as a core function of Server 2016 has the potential to take their use right into the mainstream.

In this 9 minute chat, I ask Marcus what containers are and why we should bother, who actually uses them and why, what the benefits are, how they are delivered in Server 2016, and what this means for developers and Windows admins alike.

Marcus shared some great information, I hope you find it interesting and of course if you have any questions you can contact me via the site, on Twitter @techstringy or find me on LinkedIn.

You may also find my post “Bringing containers to the masses” a useful companion piece where I look at containers in both Windows Server and VMware.

Enjoy!

 

Bringing containers to the masses

No doubt one of the hottest topics in the IT industry right now is containers; the world of Docker and its ilk is fascinating developers, IT industry watchers and technology strategists the world over.

The containers world is still, on the whole, restricted to (another IT buzzword warning) the world of DevOps; developers and coders see containers as a great way to quickly develop, deploy and refresh their applications.

However, that is not the point of this blog. Full disclosure: I’m no containers expert, and if you want to know what containers are, there is a whole bunch of resources out there that can give you all the background you need.

Why write about, “containers to the masses” then?

As I mentioned, containers right now, certainly from the infrastructure side of the house, are still a bit of a mystery, locked away in Linux or a cloud host somewhere, not something we can easily get a handle on in our Windows or vCenter worlds. The idea of these strange self-contained environments running in a way we understand and can manage seems impossible.

And there’s the crux of this post: for many of us, the idea of enterprise wide containers is a long way off. And that’s a problem. In the modern IT world, it’s critical that those of us who administer infrastructure and business technology are not seen as blockers to delivering agile IT in our increasingly DevOps world; if we are, then we are not serving our organisations or our careers well.

How do we square that circle? How do we deal with the problem of delivering agile development platforms for our developers in a world of traditional infrastructure?

A couple of weeks ago, I attended one of the excellent Tech User Group events in Manchester (if you’ve never checked out one of their IT community events then you should, have a look at the website) and among the great topics on the agenda we had speakers from both VMware and Microsoft.

Now I think it’s fair to say that if we were to do a poll of the major enterprise infrastructure providers, Microsoft and VMware would feature strongly, and it is those platforms that infrastructure guys know and love. However, they are also the things that seem a long way removed from the modern DevOps world – well, that is, until now.

At the event, I saw a couple of presentations that shifted my view on deployment of containers in the enterprise. Cormac Hogan from VMware and Marcus Robinson from Microsoft both covered how these software giants were looking at the container space.

The approach overall is pretty similar, but importantly both are taking something that maybe we don’t quite understand and seamlessly dropping it into an environment we do.

Both are focussing on delivering support for Docker, essentially by publishing Docker APIs so that devs can use all of their existing Docker skills to deploy containers into these infrastructure environments, without knowing, or to a degree caring, what the infrastructure looks like.

That works both ways: infrastructure admins see the container resources as they would any other resource, again without needing to understand or care what they are.
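To make that idea concrete, here is a minimal sketch of what “devs using their existing Docker skills” looks like in practice: the standard Docker client simply pointed at a remote container host via the Docker API. The hostname and port below are placeholders for whatever endpoint your infrastructure exposes.

```shell
# Point the standard Docker client at a remote container host
# (hostname and port are placeholders – adjust for your environment)
export DOCKER_HOST=tcp://container-host.example.com:2375

# From here, the familiar Docker commands run against the remote host,
# regardless of what the underlying infrastructure actually is
docker ps             # list running containers on the remote host
docker run -d nginx   # deploy a container without touching the infrastructure directly
```

The developer’s workflow is unchanged; only the endpoint differs.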

Let’s take a little look at the two implementations:

Microsoft

Firstly, there has been support for containers in Azure for quite a while, so this is nothing new, but what Microsoft are doing is bringing that native container support on-prem in Windows Server 2016. This is done with two slightly different container delivery methods:

Windows Server Containers – provide application isolation through process and namespace isolation technology. A Windows Server container shares a kernel with the container host and all containers running on the host.

Hyper-V Containers – expand on the isolation provided by Windows Server Containers by running each container in a highly optimized virtual machine. In this configuration, the kernel of the container host is not shared with the Hyper-V Containers.
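Helpfully, the two delivery methods are exposed through the same Docker command line on Windows Server 2016, differing only in an isolation flag. A quick sketch (the image name is illustrative; microsoft/nanoserver was a Server 2016-era base image):

```shell
# Windows Server Container – shares the host's kernel (process isolation)
docker run --isolation=process microsoft/nanoserver cmd /c echo hello

# Hyper-V Container – same image, but wrapped in a highly optimised
# utility VM so the host kernel is not shared with the container
docker run --isolation=hyperv microsoft/nanoserver cmd /c echo hello
```

Same image, same command, just a different isolation boundary – which is exactly the point about dropping containers into an environment we already understand.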

Check out this video for more details on Server 2016 Container Deployment;

https://channel9.msdn.com/Blogs/windowsserver/Containers-in-Windows-Server-2016/player

VMware

As with Microsoft, there are two distinct routes to deliver containers into the VMware driven enterprise.

vSphere Integrated Containers – provides a Docker-compatible interface for developers while allowing IT operations to continue to use existing VMware infrastructure, processes and management tools. And it offers enterprise-class networking, storage, resource management and security capabilities based on vSphere.

Photon OS™ – is a minimal Linux container host, optimized to run on VMware platforms. Compatible with container runtimes, like Docker, and container scheduling frameworks, like Kubernetes.
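For the vSphere Integrated Containers route, the developer experience is again the familiar Docker client, this time pointed at the Docker-compatible endpoint that a virtual container host exposes. A rough sketch, with a placeholder address (your VCH endpoint and TLS setup will differ):

```shell
# A vSphere Integrated Containers virtual container host exposes a
# Docker-compatible API endpoint; the address below is a placeholder
docker -H vch.example.com:2376 --tls info

# Deploy a container – it lands in the vSphere environment, where
# IT operations can see and manage it with their existing tooling
docker -H vch.example.com:2376 --tls run -d nginx
```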

Check this video from VMworld 2016 for a short intro to vSphere integrated containers;

And a brief intro to Photon OS can be watched here;

In my mind, it is the management of these that is key to their adoption. From the dev side, both will be deployable using the Docker APIs and the Docker client, a methodology developers already understand. To the enterprise admin, it’s a Windows Server or a VMware environment that they understand and can manage.

Certainly, in the enterprise the idea of deploying Docker containers has been hampered by the need for Linux container farms, and when you are in an environment that “doesn’t do Linux” that’s a problem. However, bringing the likes of Docker seamlessly into traditional enterprise infrastructure systems like Windows Server and vSphere, so that they can be managed within your traditional IT frameworks, is massive.

Like I said, I’m no containers expert, and not a developer. However, I have spent 20+ years working in infrastructure environments, and the more Marcus and Cormac spoke, the brighter my light bulb moment became: if you can take these flexible development environments out of the dark corners and place them in an environment that enterprise IT can manage and understand, you open the world of Docker and containers to a whole new audience.

Watch out masses… here come containers!

To find out more from the excellent presenters on the day, you can follow both Marcus and Cormac on Twitter:

Marcus Robinson @techdiction

Cormac Hogan @CormacJHogan

For a bit more information have a look at some of these resources.

For an introduction to containers from Microsoft, read Mark Russinovich’s blog

Read more on Windows Containers here

For info from Docker on their Microsoft relationship check here

For an introduction to the latest on VMware containers check here

vSphere Containers on Github

Read here for an introduction to Photon OS

A new venture- the Techstringy Interviews have arrived!

I’ve been thinking for a while about how to add something new to my tech social media output, and the other week I finally found my inspiration listening to Justin Parisi (@nfsdudeabides) and his round-ups from the NetApp Insight conference in Las Vegas.

Justin, as the host of the NetApp podcast, produces lots of content, but I particularly enjoyed his short soundbite interviews taken live at the show.

That was my light bulb moment. I’m lucky enough to meet lots of interesting people from the IT community, smart people at vendors, or clever customers doing interesting things with technology, and I thought wouldn’t it be great to get them doing little interviews that I could share.

So with that moment of inspiration, the Techstringy interviews were born!

The aim is to produce a series of 5-10 minute interviews with people in the IT community, getting their views on tech.

I’ve no idea how well this will go, but I’ve got the first in the can, as they say in the world of media, and a couple more lined up… and if people like them… I’ll keep churning them out… and if they don’t, I’ll stop churning!

First up is an interview with my NetApp A Team mate, Michael Cade of Veeam. I met up with Michael at Microsoft’s UK partner conference at Twickenham last week and with my phone and shiny new microphone at hand, I grabbed a few minutes with him at the end of the event to ask his views on some of the things we’d heard on the day, how cloud technology was changing the backup landscape and how Veeam was helping to take on modern data protection challenges.

We enjoyed doing the interview, I hope you like it, feel free to get in touch with any comments.

Welcome to Interview number one of Techstringy Interviews.