Intelligent, secure, automated: your data platform future

I was recently part of a workshop event where we discussed building “your future data platform”. During the session I presented a roadmap of what a future data platform could look like. The basis of the presentation, which looked at the relatively near future, was how developments from “data” vendors are allowing us to rethink the way we manage the data in our organisations.

What’s driving the need for change?

Why do we need to change the way we manage data? The reality is that the world of technology is changing extremely quickly, and at the heart of it is our desire for data: creating it, storing it, analysing it and learning from it, while increasingly demanding that data helps drive business outcomes and strategies and improve customer experience.

Alongside this need to use our data more are other challenges, from increasing regulation to ever more complex security risks (see the recent Marriott Hotels breach of 500 million customer records), which are making further, unprecedented demands on our technology platforms.

Why aren’t current approaches meeting the demand?

What’s wrong with what we are currently doing? Why aren’t current approaches helping us to meet the demands and challenges of modern data usage?

As the demands on our data grow, the reality for many of us is that we have architected platforms that were never designed with these issues in mind.

Let’s consider what happens when we place data onto our current platform.

We take our data (it could be confidential, it may not be; often we don’t know) and place it into our data repository. Once it’s there, how many of us know:

  • Where it is?
  • Who owns it?
  • What does it contain?
  • Who is accessing it?
  • What’s happening to it?

In most cases, we don’t, and this presents a range of challenges from management and security to reducing our ability to compete with those who are effectively using their data to innovate and gain an advantage.
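Purely as an illustration, the five questions above map naturally onto a per-file catalogue record. The sketch below is hypothetical (the `CatalogueEntry` class and its fields are my own invention, not any vendor’s API), but it shows how little structure is actually needed to start answering them:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical catalogue record: one entry per file, answering the five
# questions above (where it is, who owns it, what it contains,
# who is accessing it, and what is happening to it).
@dataclass
class CatalogueEntry:
    path: str                # where it is
    owner: str               # who owns it
    classification: str      # what it contains (e.g. "confidential")
    access_log: list = field(default_factory=list)  # who is accessing it

    def record_access(self, user: str, action: str) -> None:
        """Append an audit event so we know what is happening to the data."""
        self.access_log.append((datetime.now(timezone.utc), user, action))

entry = CatalogueEntry("/finance/q3-results.xlsx", "finance-team", "confidential")
entry.record_access("alice", "read")
print(entry.classification)   # confidential
print(len(entry.access_log))  # 1
```

In most organisations no such record exists at all, which is precisely the gap a future platform needs to close.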

What if that platform could instead recognise the data as it was deposited? It would make sure the data landed in the right secure area, with individual file securities that keep it secure regardless of its location; it would, if necessary, protect the file immediately (not when the next protection job ran); and it would track the use of that data from its creation to its deletion.

That would be useful, wouldn’t it?

What do we need our modern platform to be?

As the amount of data, and the ways we want to use it, continues to evolve, our traditional approaches will not be able to meet the demands placed upon them, and we certainly cannot expect human intervention to cope: the data sets are too big, the security threats too wide-reaching and the compliance requirements ever more stringent.

However, that is the challenge we need our future data platforms to meet: they need to be, by design, secure, intelligent and automated. The only way we are going to deliver this is with technology augmenting our efforts in education, process and policy to ensure we use our data and get the very best from it.

That technology needs to deliver this secure, intelligent and automated environment from the second it starts to ingest data. It needs to understand what we have and how it should be used, and importantly it shouldn’t just be reactive; it has to be proactive. The minute new data is written, it applies intelligence, immediately securing, storing and protecting the data accordingly, and giving us a full understanding of its use throughout its lifecycle.
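To make the “proactive, not reactive” idea concrete, here is a minimal sketch of an ingest hook, assuming a hypothetical setup of my own (the `on_ingest` function, the naive card-number regex and the directory layout are all illustrative, not any real product’s behaviour): the file is classified, protected and relocated the moment it is written, not when the next scheduled job runs.

```python
import re
import shutil
import tempfile
from pathlib import Path

# Naive pattern standing in for real content inspection (card-like numbers).
SENSITIVE = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")

audit_log = []  # (path, classification) events, tracked from creation onwards

def classify(path: Path) -> str:
    """Inspect the file's content as it is deposited."""
    text = path.read_text(errors="ignore")
    return "confidential" if SENSITIVE.search(text) else "general"

def on_ingest(path: Path, secure_dir: Path, backup_dir: Path) -> str:
    """React the moment data is written: classify, relocate, protect, log."""
    label = classify(path)
    shutil.copy2(path, backup_dir / path.name)   # protect immediately, not
                                                 # at the next protection job
    if label == "confidential":
        dest = secure_dir / path.name            # move to the right secure area
        shutil.move(str(path), str(dest))
        dest.chmod(0o600)                        # per-file restriction
        path = dest
    audit_log.append((str(path), label))         # track its use
    return label

# Demo: a card number in a freshly written file triggers the secure path.
root = Path(tempfile.mkdtemp())
secure, backup = root / "secure", root / "backup"
secure.mkdir(); backup.mkdir()
f = root / "export.csv"
f.write_text("name,card\nalice,4111 1111 1111 1111\n")
print(on_ingest(f, secure, backup))  # confidential
```

A real platform would of course do far richer classification than a single regex, but the shape is the point: intelligence applied at write time, with protection and audit as side effects of ingest rather than separate batch processes.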

Beyond this, we also need to make sure that what we architect is truly a platform: a solid foundation for how we want to use our data. Once we have our data organised, secure and protected, the platform must let us move it to the places we need it, allowing us to take advantage of new cloud services, data analytics tools, machine learning engines or whatever may be around the corner, while we continue to maintain control and retain insight into its use regardless of where it resides.

These are key elements of our future data platform and ones we are going to need to consider to ensure that our data can meet the demands of our organisations to make better decisions and provide better services, driven by better use of data.

How do we do it?

Of course, the question is, can this be done today, and if it can, how?

The good news is that much of what we need is already available or coming very soon. Realistically, within the next 6-18 months, if you have the desire, you can develop a strategy and build a more secure, intelligent and automated way of managing your data.

I’ve shared some thoughts here on why we need to modernise our platforms and what we need from them. In the next post I’ll share a practical example of how you can build this kind of platform using tools that are available to you today or coming very shortly, to show that a future data platform is closer than you may think.

Logging and learning your public cloud – Colin Fernandes – Ep 74

In the last of our series looking at the shift to public cloud, we discuss getting the best from your cloud and the value of understanding the behaviour of your cloud infrastructure.

Initially, the move to cloud was seen as a way of delivering lower-cost infrastructure or test and dev environments. However, this is beginning to change: today, more than ever, the move is driven by agility, flexibility and reduced time to delivery, with a focus on outcomes rather than cost and technology. This shift is a positive one; technology investments should always be about the outcome and a broader end goal, not technology adoption for technology’s sake.

When attempting to achieve these outcomes, it’s important that our platforms perform and deliver in the way we need them to. The ability to log, analyse and gain useful insight into the performance of our estate is therefore a crucial part of making sure our public cloud investment is successful.

On this show I’m joined by Sumo Logic’s Colin Fernandes as we look at public cloud, the value it delivers, and how an understanding of its performance is crucial not only to achieving desired outcomes, but to doing so while still meeting those ever-critical security and governance requirements.

Colin is a self-proclaimed IT veteran with 32 years’ experience in the industry, starting out at ICL and arriving at Sumo Logic via the likes of IBM and VMware. That longevity puts Colin in a great position to comment on what he sees in today’s market and how cloud has disrupted, and continues to disrupt, our use of technology.

We start by looking at the similarities Colin sees between today’s shift to cloud and those early days at VMware. We also discuss how organisations are starting to look at cloud as a way to drive new applications and innovation, and how this is as much a cultural shift as a technological one.

We chat about big shifts in focus, with the adoption of serverless and modern design architectures such as containers and the increasingly pervasive ability to utilise machine learning and analytics. We also explore the problems that come with cloud, particularly those “day one” problems of monitoring, security and compliance and why it’s critical that security be part of the cloud delivery cycle and not an afterthought.

We finish up talking about Sumo Logic, what they bring to the market, and how their ability to analyse and use data from their customers can provide the valuable insight needed to achieve value from a cloud investment.

This is a great time to find out more about Sumo Logic, as this week (starting 12th September 2018) they hold their annual user conference, Illuminate. You can track the event via their live keynote stream on www.sumologic.com, where you can also find more information about what they do.

If you want to follow up with Colin, you can find him on LinkedIn as well as via email at cfernandez@sumologic.com.

I really enjoyed this chat, with Colin’s experience in the market he provided valuable insight into public cloud and how to get real value from it.

Next time we are looking at the world of incident management, how to plan for it and how to ensure a technology disaster or data breach doesn’t catch you out.

Until then, thanks for listening.