Straight as an Arrow – David Fearne & Richard Holmes – Ep58

If there is one certainty in the technology industry, it is the constant state of change. How technology works, how we want to use it, where we want to use it and what we expect from it are all constantly changing, and in reality becoming ever more demanding.

For those of us who work in technology, either as IT pros or IT decision makers, this presents its own challenges. When we plan our IT strategy, how do we know where to focus, what technology bets we should be taking, and what trends others are taking advantage of that we could bring into our organisation to help us improve our services?

One of the things I like to do in my role is spend time looking at technology predictions and listening to ideas from those in the industry tasked with defining the strategic direction of their businesses. The aim is not to judge whether they are right or wrong (predicting things in this industry is very difficult), but to pick out trends and areas that are of interest to the work I do, so that I am at least aware of them and can keep a watching brief on how they develop.

Keeping a watching brief gave me the idea for this week's podcast, as I catch up with two guests who produce an annual technology predictions blog and back it up with episodes of their own successful podcast, where they look at those predictions in more detail.

David Fearne and Richard Holmes work for Arrow ECS, a global technology supplier and one of the world's largest companies. David is Technical Director, charged with looking after the relationships and developing strategy for over 100 different technology partners and suppliers. Richard is Business Development Director for Arrow's Internet of Things (IoT) business. The gents also present the excellent Arrow Bandwidth podcast.

This week we look at their predictions from 2017, not to review whether they have been successful, but rather to focus on just a few areas of particular interest and look at how those areas have developed over the last 12 months and how we expect they will continue to shift.

We start by discussing data management and the concept of "data divorce", and why, in a rapidly changing landscape, how we look after our data will become increasingly important. We also look at how, in a world that is removing the barriers to collecting more and more data, we manage it all, and importantly how we collect only the things that are relevant and of use to us and our organisations.

The second area we explore is data analytics and how we build into our businesses the ability to make data-driven decisions. All businesses make decisions based on data; the question is how we remove our human inefficiencies, and more importantly our biases, when we look at that data. How many of us make decisions based on someone's "version of the truth"?

We also investigate the inhibitors to more of us embracing data analytics capabilities, capabilities that are increasingly available to us, particularly via providers like Microsoft, AWS and Google. The challenge isn't a technology one; it is about how we get those tools into the hands of the right people and empower them.

We then turn to security and David's assertion that a change in "security posture" is needed, and why it's crucial that we rethink the way we look at the security of our systems. We discuss why "assuming breach" is an important part of that change, and, as the security problem becomes ever more complex, how we continue to address it. Is the answer to employ ever more security specialists?

We wrap up by discussing how each of these areas has a common thread running through it, and how, as technology strategists, it is important that when making technology decisions we don't focus on the technology alone but fully understand the business outcomes we are trying to achieve.

It's a great chat with David and Richard and we could have discussed these trends for hours; luckily for you, it's only 40 minutes!

Enjoy the Show.

You'll find David and Richard's full list of predictions from 2017 here – https://www.arrowthehub.co.uk/blog/posts/2017/february/what-are-the-hottest-technology-trends-of-2017-part-1/

You'll also find the 2018 predictions here – https://www.arrowthehub.co.uk/blog/posts/2018/january/what-are-the-hottest-technology-trends-for-2018-part-1/

If you'd rather listen, then check out the excellent Arrow Bandwidth podcast, where you can find the episodes discussing all of last year's predictions as well as this year's in the following places: Tech Trends 2017 Part One, Tech Trends 2017 Part Two, Tech Trends 2018 Part One, Tech Trends 2018 Part Two.

If you'd like to keep up with David and Richard, you can find them both on Twitter: @davidfearne and @_Rich_Holmes.

Thanks for listening.


Building a modern data platform – Availability

In part one we discussed the importance of getting our storage platform right, in part two we look at availability.

The idea that availability is a crucial part of a modern platform was something I first heard from a friend of mine, Michael Cade from Veeam, who introduced me to “availability as part of digital transformation” and how this was changing Veeam’s focus.

This shift is absolutely right. As we build our modern platforms, backup and recovery remains a crucial requirement; however, a focus on availability is at least as crucial, if not more so. Today nobody in your business really cares how quickly you can recover a system; what our digitally driven businesses demand is that our systems are always there, and in ever more competitive environments downtime is not tolerated.

With that in mind, why do I choose Veeam to deliver availability in my modern data platform?

Keep it simple

Whenever I meet a Veeam customer, their first comment on Veeam is "it just works". The power of this rather simple statement should not be underestimated when you are protecting key assets. Too often, data protection solutions have been overly complex, inefficient and unreliable, something I have always found unacceptable. For businesses big or small, you need a data protection solution you can deploy, then forget, trusting that it simply does what you ask. This is perhaps Veeam's greatest strength, a crucial driver behind its popularity and what makes it such a good component of a data platform.

I would actually say Veeam are a bit like the Apple of availability; much of what they do has been done by others (Veeam didn't invent data protection, in the same way Apple didn't invent the smartphone), but what they have done is make it simple, usable and something that just works and can be trusted. Don't underestimate the importance of this.

Flexibility

If ever there was a byword for modern IT, flexibility could well be it. It's crucial that any solution and platform we build has the flexibility to react to ever-changing business and technological demands. Look at how business needs for technology, and the technology itself, have changed in the last 10 years, and how much our platforms have needed to change to keep up: flash storage, web-scale applications, mobility, cloud, the list goes on.

The following statement sums up Veeam's view on flexibility perfectly:

“Veeam Availability Platform provides businesses and enterprises of all sizes with the means to ensure availability for any application and any data, across any cloud infrastructure”

It is this focus on flexibility that makes Veeam such an attractive proposition in the modern data platform, allowing me to design a solution that is flexible enough to meet my different needs, providing availability across my data platform, all with the same familiar toolset regardless of location, workload type or recovery needs.

Integration

As mentioned in part one, no modern data platform will be built with just one vendor's tools, not if you want to deliver the control of and insight into your data that we demand as a modern business. Veeam, like NetApp, have built a very strong partner ecosystem allowing them to integrate tightly with many vendors, but, more than just integrating, Veeam deliver additional value that allows me to simplify and do more with my platform (take a look at this blog about how Veeam allows you to get more from NetApp snapshots). Veeam are continuously delivering new integrations, not only with on-prem vendors but also, as mentioned earlier, with a vast range of cloud providers.

This ability to extend the capabilities and simplify the integration of multiple components in a multi-platform, multi-cloud world is very powerful and a crucial part of my data platform architecture.

Strategy

As with NetApp, over the last 18 months it has been the shift in Veeam's overall strategy that has impressed me more than anything else. Although seemingly a simple change, the shift from talking about backup and recovery to talking about availability is significant.

As I said at the opening of this article, in our modern IT platforms nobody is interested in how quickly you can recover something; it's about the availability of crucial systems. A key part of Veeam's strategy is to "deliver the next generation of availability for the Always-On Enterprise", and you can see this in everything Veeam are doing: focussing on simplicity, ensuring that you can have your workload where you need it, when you need it, and that you can move those workloads seamlessly between on-prem and cloud and back again.

They have also been very smart, employing a strong leadership team and, as with NetApp, investing in ensuring that cloud services don’t leave a traditionally on-premises focussed technology provider adrift.

The Veeam and NetApp strategies are very similar, and it is this similarity that makes them attractive components in my data platform. I need my component providers to understand technology trends and changes so they, as well as our data platforms, can move and change with them.

Does it have to be Veeam?

In the same way that it doesn't have to be NetApp, of course it doesn't have to be Veeam. But, in exactly the same way, if you are building a platform for your data, make sure your platform components deliver the kinds of things we have discussed in the first two parts of this series: the flexibility we need, integration with components across your platform and a strategic vision that you are comfortable with. As long as you have that, you will have rock-solid foundations to build on.

In Part Three of this series we will look at building insight, compliance and governance into our data platform.

You can find the Introduction and Part One – “The Storage” below.

The Introduction
Part One – The Storage

 

 

IT Pro’s and the Tech Community – Yadin Porter de Leon – Ep 57

One of my favourite parts of my role over the last few years has been my involvement in the tech community, whether that's been working with advocacy groups like the NetApp A-Team, with local user groups like TechUG, presenting at a range of different community events or just answering questions in technical communities. All of these investments (and they are investments) have paid back: they've introduced me to great people, given me access to resources and expertise I would never have found otherwise, and opened up great opportunities for travel and for developing some great friendships.

We are fortunate to be part of an industry that does have a strong sense of community, full of people with shared interests and a passion for their subject, a passion they are often happy to share with anyone who’s interested.

One of the challenges with the tech community, however, is its size. Whether you are new to it or already a part of it, it can be overwhelming and hard to know where to start. How do you find the resources you need, find out which events you can attend, or find out who the leaders are that you can engage with?

Last year I was invited to get involved in a project called "Level Up", a project started by this week's guest on the podcast, Yadin Porter de Leon. Yadin has been on the show before in his capacity at data protection company Druva; however, that's not what we discuss this week, as we chat about the Level Up project, why he started it, the project's aims and how it can help you in your career.

In this week's episode we discuss why you may want to get involved in community, what benefits it can bring, and how involvement in the wider community can benefit both you and your business, providing you with opportunities to develop your skills.

Yadin shares how one of the focuses of the project is to engage those who are not already involved in community and provide them a way to get started.

We look at Level Up's first project, the vTrail Map, a fantastic guide to the world of VMware and the virtualisation community, and we also look ahead to what's next for the project and its longer-term aims.

We wrap up by asking Yadin about another project he is involved in, the excellent Tech Village Podcast, again focussed on career development and the technology business. It's a great show which I'd recommend anyone adds to their regular podcast list; you can find it on Soundcloud and follow the show on Twitter @TechVillagePod.

For more information on Level Up, you can find them on Twitter @Tech_LevelUp

You can also contact Yadin on Twitter @porterdeleon

Hope you find the show interesting, and if you're not already involved in the tech community, maybe this will give you a bit of inspiration to get more involved; it's most definitely worth it.

Thanks for listening.

Building a modern data platform – The Series – Introduction

For many of you who read my blog posts (thank you) or listen to the Tech Interviews Podcast (thanks again!), you'll know that talking about data is something I enjoy. It has played a significant part in my career over the last 20 years, but today data is more central than ever to what so many of us are trying to achieve.

In today's modern world, however, storing our data is no longer enough; we need to consider much more. Yes, storing it effectively and efficiently is important, but so are its availability, security and privacy, and of course finding ways to extract value from it. Whether that's production data, archive or backup, we are looking at how we can make our data do more (for examples of what I mean, read this article from my friend Matt Watts introducing the concept of Data Amplification Ratio) and deliver a competitive edge to our organisations.

To do this effectively means developing an appropriate data strategy and building a data platform that is fit for today's business needs. This is something I've written and spoken about on many occasions; however, one question I get asked regularly is "we understand the theory, but how do we build this in practice? What technology do you use to build a modern data platform?"

That's a good question. The theory is all great and important, however seeing practical examples of how you deliver these strategies can be very useful. With that in mind I've put together this series of blogs to go through the elements of a data strategy and share some of the practical technology components I use to help organisations build a platform that will allow them to get the best from their data assets.

Over this series we'll discuss how these components deliver flexibility, maintain security and privacy, provide governance, control and insight, and interact with hyperscale cloud providers to ensure you can exploit analytics, AI and machine learning.

So, settle back, and over the next few weeks I hope to provide some practical examples of the technology you can use to deliver a modern data strategy. Parts one and two are live now and can be accessed in the links below. The other links will become live as I post them, so do keep an eye out for them.

Part One – The Storage
Part Two – Availability
Part Three – Control
Part Four – Prevention (Office365)
Part Five – Out On The Edge
Part Six – Exploiting The Cloud
What have we learned?

 

I hope you enjoy the series and find these practical examples useful. Remember, these are just some of the technologies I've used; they are not the only technologies available, and you certainly don't have to use any of them to meet your data strategy goals. The aim of this series is to help you understand the art of the possible. If these exact solutions aren't for you, don't worry; go and find technology partners and solutions that are, and use them to help you meet your goals.

Good luck and happy building!

Coming Soon:

Part Seven – A strategic approach

Building a modern data platform – The Storage

It probably isn't a surprise to anyone who has read my blogs previously to find out that, when it comes to the storage part of our platform, NetApp are still first choice, but why?

While it is important to get the storage right, getting it right is about much more than just having somewhere to store data; it's important, even at the base level, that you can do more with it. As we move through the different elements of our platform we will look at other areas where we can apply insight and analytics; however, it should not be forgotten that there is significant value in having data services available at all levels of a data platform.

What are data services?

These services provide capabilities beyond those of a simple storage repository; they may provide security, storage efficiency, data protection or the ability to extract value from data. NetApp provide these services as standard with their ONTAP operating system, bringing considerable value regardless of whether data capacity needs are large or small. The ability to provide extra capabilities beyond just storing data is crucial to our modern data platform.
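To make that a little more tangible, here is a minimal sketch of how you might check the data services applied to a volume programmatically, using ONTAP's REST API (available from ONTAP 9.6 onwards) from Python. The hostname, credentials and field names here are illustrative assumptions rather than a definitive recipe, so do check the API reference for your ONTAP release.

```python
# Minimal sketch: query a volume's data services state via the ONTAP REST API.
# Assumes ONTAP 9.6+; endpoint and field names are illustrative and should be
# validated against the API reference for your release.
import requests

ONTAP_HOST = "cluster1.example.com"   # hypothetical cluster management address
AUTH = ("admin", "password")          # use a read-only service account in reality


def volume_data_services(volume_name: str) -> None:
    """Print storage-efficiency, space and protection details for a named volume."""
    resp = requests.get(
        f"https://{ONTAP_HOST}/api/storage/volumes",
        params={"name": volume_name, "fields": "space,efficiency,snapmirror"},
        auth=AUTH,
        verify=False,  # lab convenience only; use proper certificates in production
    )
    resp.raise_for_status()
    for vol in resp.json().get("records", []):
        print(vol.get("name"), vol.get("efficiency"), vol.get("space"))


if __name__ == "__main__":
    volume_data_services("data_vol01")  # hypothetical volume name
```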

However, many storage providers offer data services on their platforms; they are often not as comprehensive as those provided in ONTAP, but they are there. So, if that is the case, why else do I choose to use NetApp as the foundation of a data platform?

Data Fabric

"Data Fabric" is the simple answer (I won't go into detail here; I've written about the fabric before, for example Data Fabric – What is it good for?). When we think about data platforms we cannot think about them in isolation; we need considerably more flexibility than that. We may have data in our data centre on primary storage, but we may also want that data in another location, maybe with a public cloud provider; we may want it stored on a different platform, or in a different format altogether, object storage for example. However, to manage our data effectively and securely, we can't afford for it to be stored in different locations that need a plethora of separate management tools, policies and procedures to keep control.

The "Data Fabric" is why NetApp continue to be the base storage element of my data platform designs. The key to the fabric is the ONTAP operating system and its flexibility, which goes beyond an OS installed on a traditional controller: ONTAP can be consumed as a software service within a virtual machine or from AWS or Azure, providing the same data services, managed by the same tools and deployed in all kinds of different ways, allowing me to move my data between these repositories while maintaining all of the same management and controls.

Beyond that, the ability to move data between NetApp's other portfolio platforms, such as SolidFire and StorageGRID (their object storage solution), as well as to third-party storage such as Amazon S3 and Azure Blob, ensures I can build a complex fabric that allows me to place data where I need it, when I need it. The ability to do this while maintaining security, control and management with the same tools, regardless of location, is hugely powerful and beneficial.
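As a sketch of how that data movement is typically driven in practice, the example below asks a destination system to create a SnapMirror relationship via the ONTAP REST API, replicating a volume from an on-premises SVM to one running in the cloud (for example Cloud Volumes ONTAP). The endpoint exists in recent ONTAP releases, but the hostnames, volume paths and payload shape shown are illustrative assumptions only.

```python
# Minimal sketch: replicate a volume to another ONTAP system (e.g. an ONTAP
# instance in AWS/Azure) by creating a SnapMirror relationship over the REST API.
# Paths and payload are illustrative; confirm against your ONTAP documentation.
import requests

DEST_CLUSTER = "cvo-cloud.example.com"   # hypothetical destination cluster address
AUTH = ("admin", "password")


def replicate(source_path: str, destination_path: str) -> None:
    """Create an asynchronous SnapMirror relationship between two volumes."""
    body = {
        "source": {"path": source_path},            # e.g. "svm_onprem:data_vol01"
        "destination": {"path": destination_path},  # e.g. "svm_cloud:data_vol01_dr"
    }
    resp = requests.post(
        f"https://{DEST_CLUSTER}/api/snapmirror/relationships",
        json=body,
        auth=AUTH,
        verify=False,  # lab convenience only
    )
    resp.raise_for_status()
    print("Relationship request accepted:", resp.json())


if __name__ == "__main__":
    replicate("svm_onprem:data_vol01", "svm_cloud:data_vol01_dr")
```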


APIs and Integration

When we look to build a data platform it would be ridiculous to assume it will only ever contain the components of a single provider, and as we build through the layers of our platform, integration between those layers is crucial and plays a part in the selection of the components I use.

APIs are increasingly important in the modern datacentre as we look for different ways to automate and integrate our components. Again, this is an area where NetApp are strong, providing great third-party integrations with partners such as Microsoft, Veeam, VMware and Varonis (some of which we'll explore in other parts of the series), as well as options to drive many of the elements of their different storage platforms via APIs so we can automate the delivery of our infrastructure.
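To illustrate what that automation can look like, here is a minimal sketch that provisions a new volume through the ONTAP REST API, the kind of call you might wrap into a pipeline or self-service portal. The SVM, aggregate and payload details are hypothetical and should be validated against the API reference for your release.

```python
# Minimal sketch: automate volume provisioning via the ONTAP REST API, the sort
# of call a CI pipeline or self-service portal could make. Payload fields are
# illustrative assumptions; validate against the ONTAP API reference.
import requests

ONTAP_HOST = "cluster1.example.com"   # hypothetical cluster management address
AUTH = ("admin", "password")


def create_volume(svm: str, aggregate: str, name: str, size_bytes: int) -> None:
    """Request creation of a new volume on the given SVM and aggregate."""
    body = {
        "name": name,
        "svm": {"name": svm},
        "aggregates": [{"name": aggregate}],
        "size": size_bytes,
    }
    resp = requests.post(
        f"https://{ONTAP_HOST}/api/storage/volumes",
        json=body,
        auth=AUTH,
        verify=False,  # lab convenience only
    )
    resp.raise_for_status()
    print("Volume creation job accepted:", resp.json())


if __name__ == "__main__":
    create_volume("svm1", "aggr1", "app_data_01", 100 * 1024**3)  # 100 GiB
```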

Can it grow with me?

One of the key reasons that we need a more strategic view of data platforms is the continued growth of our data and the demands we put on it; therefore scalability and performance are hugely important when we choose the storage components of our platform.

NetApp deliver this across all of their portfolio. ONTAP allows me to scale a storage cluster up to 24 nodes, delivering huge capacity, performance and compute capability. The SolidFire platform, inspired by the needs of service providers, allows simple and quick scaling and has a quality-of-service engine which lets me guarantee performance levels for applications and data, and that's before we talk about the huge scale of the StorageGRID object platform or the fast and cheap capabilities of E-Series.
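To give a feel for that quality-of-service engine, below is a minimal sketch that sets minimum, maximum and burst IOPS on a SolidFire volume via the Element JSON-RPC API. The method name, parameters and API version shown are written from memory and are illustrative assumptions; the Element API reference is the definitive source.

```python
# Minimal sketch: apply QoS guarantees to a SolidFire (Element) volume using the
# JSON-RPC API. Method and parameter names are from memory and illustrative;
# check the Element API reference for your cluster version.
import requests

ELEMENT_MVIP = "solidfire.example.com"   # hypothetical management virtual IP
AUTH = ("admin", "password")


def set_volume_qos(volume_id: int, min_iops: int, max_iops: int, burst_iops: int) -> None:
    """Guarantee a performance band for a volume by setting its QoS values."""
    payload = {
        "method": "ModifyVolume",
        "params": {
            "volumeID": volume_id,
            "qos": {
                "minIOPS": min_iops,      # floor the volume is guaranteed
                "maxIOPS": max_iops,      # sustained ceiling
                "burstIOPS": burst_iops,  # short-term burst ceiling
            },
        },
        "id": 1,
    }
    resp = requests.post(
        f"https://{ELEMENT_MVIP}/json-rpc/10.0",  # API version is illustrative
        json=payload,
        auth=AUTH,
        verify=False,  # lab convenience only
    )
    resp.raise_for_status()
    print(resp.json())


if __name__ == "__main__":
    set_volume_qos(volume_id=42, min_iops=1000, max_iops=5000, burst_iops=8000)
```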

Crucially, NetApp's Data Fabric strategy means I can scale across these platforms, providing the ability to grow my data platform as I need and not be restricted by a single technology.

Does it have to be NetApp?

Do you have to use NetApp to build a data platform? Of course not, but do make sure that whatever you choose as the storage element of your platform can tick the majority of the boxes we've discussed: data services, a strategic vision, the ability to move data between repositories and locations, and great integration, all while ensuring your platform can meet the performance and scale demands you place on it.

If you can do that, then you’ll have a great start for your modern data platform.

In the next post in this series we'll look at the importance of availability – that post is coming soon.

Click below to return to “The Intro”

 

Building a modern data platform – The Series – Introduction