Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three

One of the main components of any tech conference is the keynote sessions. These are the sessions that share the vision and set the context for the show, and a good keynote is a vital part of creating the right atmosphere for those attending.

What I wanted to do with these special shows was to try and grab some of the immediate reaction from those attending the events and the keynote presentations that come with them.

Our first set of keynote reviews comes from NetApp Insight 2017 in Berlin, bringing you the very latest in the data management field.

As we come toward the end of the conference, day three provided us with the final general sessions, including a fascinating insight into rocket science as Adam Steltzner, part of the Mars Rover landing team, shared the part data played in their work.

Joining me in this final review from Insight are Jon Woan (@jonwoan) and Mick Kehoe (@mickehoe), providing their views on this session. As it was the final day, they also share their thoughts on what they’d heard throughout the conference, how it met their expectations and whether NetApp was covering the kinds of things they felt were relevant.

Enjoy this last review from NetApp Insight and look out for upcoming reviews from other tech conferences in the future, as well as new episodes of Tech Interviews.

Don’t miss the round-ups from days one and two, you’ll find them here;

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two

 


Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two

One of the main components of any tech conference is the keynote sessions. These are the sessions that share the vision and set the context for the show, and a good keynote is a vital part of creating the right atmosphere for those attending.

What I wanted to do with these special shows was to try and grab some of the immediate reaction from those attending the events and the keynote presentations that come with them.

Our first set of keynote reviews comes from NetApp Insight 2017 in Berlin, bringing you the very latest in the data management field.

We heard views about Monday’s keynote yesterday (you can find that here: Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One). What did day two have for us?

This time I’m joined by Scott Gelb (@scottygelb) and Adam Bergh (@ajbergh) to get their views as we discuss the announcements of new platforms such as HCI and the fascinating move to cloud services, including a unique arrangement with Microsoft Azure.

Don’t miss the round-ups from days one and three, you can find them here;

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three

 

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day One

One of the main components of any tech conference is the keynote sessions. These are the sessions that share the vision and set the context for the show, and a good keynote is a vital part of creating the right atmosphere for those attending.

What I wanted to do with these special shows was to try and grab some of the immediate reaction from those attending the events and the keynote presentations that come with them.

For these first shows, I’m at NetApp’s Insight conference in Berlin, where we expect four days full of the latest in what the data management industry is doing, hearing how data continues to be a focus of transformation for many of us.

With that in mind, what did Monday’s keynote session deliver?

 


To find out, straight from the keynote I caught up with Jason Benedicic (@jabenedicic), Atanas Prezhdarov (@prezblahblah) and Mark Carlton (@mcarlton1983) to get their views on the key messages from the keynote and what they expected from the rest of the event.

Don’t miss the round-ups from days two and three, you can find them here;

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Two

Tech Interviews the Keynote Round Up – NetApp Insight 2017 Day Three

 

Going to gain some Insight – What I’m looking forward to from NetApp Insight 2017

This week I’m in Berlin at NetApp’s data management conference Insight.

It’s always a great chance to catch up with industry friends, hear from leaders in the data industry and a range of technology companies, and learn about the strategic direction that NetApp and the data management industry are taking.

With four days ahead in Berlin, what am I hoping to hear about at Insight 2017?

Extending the fabric

If you’ve read any of my blogs on data strategy in the past, you’ll be familiar with NetApp’s data fabric concept. The fabric was developed to break down the data silos we have become used to and to enable a strategy that allows us to simply and freely move data between any repository, be that on-prem, software-defined, in the cloud or near the cloud, while maintaining all of the security, management and control of our data that we have grown used to on-prem.

Today the data fabric is much more than a strategic aim; it is now practically delivered across much of the NetApp portfolio, and I’ll be paying attention to how this continues to evolve.

Gaining understanding of our data

This is the next step for “storage” companies, especially those, like NetApp, who are repositioning themselves as data management companies.

Long gone are the days when we just want somewhere to store our data. Remember, not only is “storing boring”, it also does not serve us well. Whether you are concerned about governance and security, or about how to extract value from your data, this can only come with a full understanding of where your data is, what it contains, and who accesses it and when. All of these are critical in a modern data strategy, and I’ll be interested in how NetApp is allowing us to gain more understanding.

Securing all of the data things

Nothing is higher on the priority list for CIOs and those making business decisions than the security of our business data (well, it should be high on the priority list). I’m keen to see how NetApp builds on what it currently has (encryption, data security policies, APIs for 3rd-party security vendors) to fully secure and understand the data within an environment.
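As a simple illustration of the first of those building blocks, here’s a minimal sketch of client-side encryption using Python’s cryptography library. It is purely illustrative of the general principle of encrypting data before it lands in a repository; it has nothing to do with NetApp’s own encryption features:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# In practice the key comes from a key-management system, never from code
key = Fernet.generate_key()
cipher = Fernet(key)

token = cipher.encrypt(b"customer record: account 12345")
print(token)                  # ciphertext, safe to store in any repository
print(cipher.decrypt(token))  # the original bytes, recoverable only with the key
```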

I’ll also be interested to hear more about the changes the data industry continues to make, not only to secure our data against the ever-evolving threat but also to help us meet increasing compliance and privacy demands.

Analysing the stuff

I fully expect to hear more about how data continues to be the new oil, gold, etc. As marketing-based as this messaging is, it is not without validity; I constantly speak with business decision makers who are eager to understand how they can use the data they own and collect to gain a business advantage.

NetApp has made some interesting moves in this space, with integrated solutions with Splunk and the Cloud Connect service allowing easy movement of data into AWS analytics tools.

It will be interesting to see how this evolves and how NetApp can ensure the data fabric continues to extend, so we can take advantage of the ever-growing range of analytics tools that allow us to gain value from our data sets.

Integrating all of the things

NetApp has long innovated in the converged infrastructure market with its joint Cisco solution, FlexPod.

However, this market continues to evolve with the emergence of hyper-converged infrastructure (HCI), where companies like Nutanix and SimpliVity (now owned by HPE) have led the way. However, up to now, I have the feeling HCI is only scratching the surface by taking infrastructure (servers, storage and networking) and squeezing it into a smaller box. In my opinion, what’s missing is the software and automation to allow us to use HCI to deliver the software-defined architectures many are striving for.

This is beginning to change. VMware and Microsoft, amongst others, are bringing us more tightly integrated software stacks, abstracting away hardware complexity and letting us drive infrastructure fully in software, bringing that cloud-like experience into the on-prem datacentre.

It is these software stacks that really start to make HCI an interesting platform; marrying this simplified hardware deployment method with automated, software-driven infrastructure has the potential to be the future of on-prem datacentres.

I’ll certainly be keeping an eye on NetApp’s new HCI platform and how it will allow us to continue to simplify and automate infrastructure, so we can deliver flexible, scalable, agile IT to our businesses.

What else will I be up to?

Many of you know I’m proud to be a part of the NetApp A-Team, and this association has also made Insight a very different proposition from a couple of years ago.

For the first time I’ll be part of a couple of sessions at the event, feel free to come and check them out and say hello;

You’ll find me doing session 18345-1-TT – Ask the A-Team – Cloud and Possibilities with NetApp Data Fabric and 18348-2 – From the Beginning – Becoming a service provider.

I’ll also be hosting the pop-up tech talks sessions – If you want to come and meet up and chat (on camera) about your views of NetApp or the data market in general, why not come find me.

And lastly, I’ll be appearing on The Cube as they broadcast live from Berlin giving in-depth coverage of Insight.

I’ll be discussing HCI platforms on Tuesday 14th at 2.30; you’ll find that broadcast on thecube.net.

If you’re at Insight, do come say hello or hook up on the twitters @techstringy

And let’s discuss if you too have gained any insights.

Look out for more blogs and podcasts from Insight 2017 over the coming weeks.

Webcache, webcache, what on earth’s a webcache? – Francesco Giarletta – Ep47

As part of my role as a Technical Director, one of my tasks is to attend events and hear from the community about the challenges they have and to hear from tech vendors about how they are fixing them.

However, every now and again someone presents something that introduces me to a whole new challenge that I’d never considered.

That happened at a recent user group event, run by the excellent folk over at TechUG. On this occasion, they were joined by Francesco Giarletta of Avanite to discuss the mysterious world of the webcache and the vast array of web data that lives within it!

He shared how this cache, alongside the amount of web tracking data that is dropped down onto your systems via web browsing (regardless of browser), can have significant and unexpected consequences.

So it seemed only fair to ask Francesco to come onto the show and introduce you all to this often unconsidered world.

In this episode, we look at the problem of webdata, the kind of information that websites collect about us, the unknown amount of data that they drop onto our machines, from cookies to the tracking information that Windows keeps about our web whereabouts.
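If you want a quick feel for the scale of this on your own machine, you can peek inside a browser’s cookie store. The sketch below is a rough illustration only: it assumes a Firefox profile in the default Windows location and the moz_cookies table layout, both of which vary by browser, platform and version, and it doesn’t touch the Windows WebCache database itself:

```python
import sqlite3
from pathlib import Path

# Assumed default Firefox profile location on Windows; adjust for your system
profiles = Path.home() / "AppData/Roaming/Mozilla/Firefox/Profiles"
cookie_db = next(profiles.glob("*/cookies.sqlite"))  # first profile found

# Open read-only so we don't disturb the live browser database
conn = sqlite3.connect(f"file:{cookie_db}?mode=ro", uri=True)
rows = conn.execute(
    "SELECT host, COUNT(*) FROM moz_cookies GROUP BY host ORDER BY COUNT(*) DESC"
).fetchall()
conn.close()

print(f"{len(rows)} distinct hosts have dropped cookies on this machine")
for host, count in rows[:10]:
    print(f"{count:4d}  {host}")
```

Even on a lightly used machine, the number of distinct hosts this reports tends to surprise people, which is exactly the point Francesco makes.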

We look at the impact of this tracking on both system performance and, maybe more importantly, security.

Francesco shares some of the security impacts and how they can potentially expose us to the risk of breach, from the storing of unencrypted user credentials to how this data we don’t fully understand can leave us open to regulatory infringement.

Finally, we share some ideas on how you can start to deal with the problem and how Avanite may be able to help.

To find out more about the work Avanite do and the risks of storing unmanaged and uncontrolled web data on your machine, you can visit their website www.avanite.com and follow them on twitter @Avanite_Ltd.

Next week I head off to NetApp Insight in Berlin, so no new show, but look out for a series of shows focussed on data and data management as I catch up with a host of industry leaders at the Insight conference.

To make sure you catch future episodes, why not subscribe and if you have any questions, contact me on twitter @techstringy.

If you are going to NetApp Insight, why not come and find me? I’m hosting some sessions as well as being in charge of the Pop-up Tech Talks mic, so come say hi and have a chat.

 

Chaining the blocks, a Blockchain 101 – Ian Moore – Ep46

As the world continues to “transform” and become more digitally driven, the inevitable also has to happen: systems that support our day-to-day processes start to become outdated, inefficient and ineffective for a world that needs to move more quickly and in different ways.

One such innovation gathering momentum is blockchain, and it is starting to have a major disruptive impact on the way many traditional transactions are done. Current mechanisms are often slow, inefficient and vulnerable to compromise, and in many cases, especially with financial transactions, there is a lack of trust in the existing methods.

But what exactly is blockchain? Like many people, I’m familiar with the term but don’t fully understand how it works, why it’s relevant, how it’s impacting business right now, or what its potential future applications are.

If you are like me, interested in the technology and wanting to know more, then maybe this week’s podcast episode is for you, as I’m joined by Ian Moore to provide a beginner’s guide to blockchain, a blockchain 101 no less.

Ian is not a blockchain expert, but certainly is an enthusiast and the perfect person to introduce the concept and provide a good overview of the technology. In his day job, Ian works for IBM in their data division.

During our conversation, he introduces us to the importance of ledgers and how the four key blockchain tenets of consensus, provenance, immutability and finality allow blockchain transactions to be quick, secure and trusted.
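To make the immutability tenet concrete, here’s a deliberately minimal hash-chain sketch, the structure that sits underneath any blockchain. It is illustrative only (no consensus, no network, nothing like a production implementation), but it shows why tampering with an earlier block is immediately detectable:

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash the block's contents (excluding its own hash field)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def new_block(data, previous_hash):
    block = {"timestamp": time.time(), "data": data, "previous_hash": previous_hash}
    block["hash"] = block_hash(block)
    return block

def chain_is_valid(chain):
    """Every block must hash correctly and link to its predecessor."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False  # block contents were tampered with
        if i > 0 and block["previous_hash"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True

# Build a three-block chain
chain = [new_block("genesis", "0" * 64)]
chain.append(new_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(new_block("Bob pays Carol 2", chain[-1]["hash"]))
print(chain_is_valid(chain))   # True

# Tampering with any earlier block invalidates everything after it
chain[1]["data"] = "Alice pays Bob 500"
print(chain_is_valid(chain))   # False
```

Because each block’s hash feeds into the next, changing any historical record breaks every link after it; consensus and finality then build on that property across many participants.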

We also discuss how the speed of digital transformation is demanding improvements in speed and efficiency, and how transactions that used to take weeks are no longer acceptable as blockchain takes those long, slow processes and completes them almost instantly.

Ian also shares some great use cases, as well as outlining the basic requirements for a blockchain. We wrap up by discussing possible future uses for this technology and how blockchain could do for transactions what the Internet has done for communications.

Ian provides us with an excellent introduction to blockchain, to find out more on this topic and how it may impact your business, IBM has some great resources on their blockchain page here https://www.ibm.com/blockchain/what-is-blockchain.html

You can find out more from Ian on twitter @Ian_DMoore

I also mentioned during the show another fascinating blockchain introduction podcast, where Stephen Foskett joins Yadin Porter De Leon on the Intech We Trust podcast, you can find that show here https://intechwetrustpodcast.com/e/130-on-the-blockchain/

I hope you enjoyed the show, to catch future episodes then you can subscribe on iTunes, Soundcloud and Stitcher as well as other good homes of podcasts.

Keeping your data incognito – Harry Keen – Ep 45

Sharing our data is an important part of our day-to-day activities; be it for analysis, collaboration or system development, we need to be able to share data sets.

However, this need to share has to be balanced with our needs to maintain the security of our data assets.

I saw a great example of this recently with a company who were convinced they were suffering a data breach, with data leaking to their competitors. They investigated all the areas you’d expect: data going out via email, being uploaded to sites it shouldn’t be, or being copied to external devices and leaving the company. None of this investigation identified the source of the leak.

They then discovered a team of developers who, in order to carry out their dev and test work, were given copies of the full production database. Not only were they given all of the organisation’s sensitive data, they had full and unencumbered administrative access to it.

Now, I’m not saying the developers were at the centre of the leak, but you can see the dilemma: for the business to function and develop, the software teams needed access to real data that represented actual working sets, but to provide that, the business was exposing itself to a real data security threat.

How do we address that problem and allow our data to be useful for analysis, collaboration and development, while keeping it secure and the information contained safe and private?

One answer is data anonymization, and that is the subject of this week’s show, as I’m joined by Harry Keen, CEO and founder of anon.ai, an innovative new company looking to address many of the challenges that come with data anonymization.

In our wide-ranging discussion, we explore the part anonymization plays in compliance and protection, and why the difficulty of current techniques means we often anonymize data poorly, or don’t even bother.

We explore why anonymization is so difficult and how solutions that can automate and simplify the process will make this important addition to our data security toolkit more accessible to us all.
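As a taste of one small corner of what such tooling automates, here’s a minimal pseudonymization sketch (the customers.csv file and its email and name columns are hypothetical). Note that a keyed hash like this is pseudonymization rather than true anonymization: quasi-identifiers left in the other columns can still allow re-identification, which is exactly why the problem Harry describes is so hard:

```python
import csv
import hashlib
import hmac
import secrets

# The key must be generated once and stored securely (or discarded entirely
# for irreversible pseudonyms); generating it inline is for illustration only.
SECRET_KEY = secrets.token_bytes(32)

def pseudonymize(value: str) -> str:
    """Replace an identifier with a keyed hash: consistent across rows,
    so joins and analysis still work, but not reversible without the key."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

with open("customers.csv", newline="") as src, \
     open("customers_anonymized.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Pseudonymize the direct identifiers; keep the analytical columns
        row["email"] = pseudonymize(row["email"])
        row["name"] = pseudonymize(row["name"])
        writer.writerow(row)
```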

Anonymization plays an important part in allowing us to maintain the value of our data as a usable and flexible asset while maintaining its privacy and our compliance with ever-tightening regulation.

Harry provides some great insights into the challenge and some of the ways to address it.

To find out more on this topic, check out the following resources;

The UK Anonymization Network (UKAN)

The UK Information Commissioner (ICO)

And of course you can find out more about anon.ai here

You can follow Harry on twitter @harry_keen18 and anon.ai @anon_dot_ai

You can contact anon.ai via info@anon.ai

Hopefully, that’s given you some background into the challenges of data anonymization and how you can start to address them, allowing you to continue to extract value from your data while maintaining its privacy.

Next week I’m joined by Ian Moore as we take a Blockchain 101. To ensure you catch that episode, why not subscribe to the show? You can find us in all the usual podcast homes.

Until next time, thanks for listening.

Securing all of the things – Valory Batchellor – Ep44

It’s not news to anyone listening to this show that the challenge around the security of our data and systems is a substantial one. Our technology seems to be under constant threat, from external hackers to insiders, from targeted attacks to malware finding its way randomly onto our systems and causing havoc, and all of this before we look at increased regulation and compliance demands.

The ever-increasing threat has led us to look to technology to help protect our systems. However, this has created its own problems: many of us have invested in numerous platforms and tools, creating a huge sprawl of solutions that do not interact, all with their own consoles, and all presenting alerts and notifications that we then expect our already stretched IT function to understand and act upon.

This range of individual tools, of course, also means that problems can “slip through the net”, as the disjointed use of technology does not necessarily let us see the correlation between alerts that are insignificant in themselves but, when put together, point to an attack or breach in progress.
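To make that idea concrete, here’s a toy sketch (the alerts, sources and thresholds are entirely invented) of the kind of cross-tool correlation a joined-up approach enables, where three individually insignificant events from different tools combine into one meaningful signal:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical alerts from three separate tools, each harmless on its own
alerts = [
    {"time": datetime(2017, 10, 2, 9, 1), "host": "ws-042", "source": "av",    "event": "blocked macro"},
    {"time": datetime(2017, 10, 2, 9, 4), "host": "ws-042", "source": "proxy", "event": "rare domain lookup"},
    {"time": datetime(2017, 10, 2, 9, 7), "host": "ws-042", "source": "ad",    "event": "failed privilege escalation"},
    {"time": datetime(2017, 10, 2, 11, 30), "host": "ws-017", "source": "proxy", "event": "rare domain lookup"},
]

WINDOW = timedelta(minutes=15)

def correlate(alerts, min_sources=3):
    """Flag hosts where several *different* tools fire within one window --
    the pattern a set of disjointed consoles would never surface."""
    by_host = defaultdict(list)
    for a in sorted(alerts, key=lambda a: a["time"]):
        by_host[a["host"]].append(a)
    for host, items in by_host.items():
        for i, first in enumerate(items):
            window = [a for a in items[i:] if a["time"] - first["time"] <= WINDOW]
            if len({a["source"] for a in window}) >= min_sources:
                yield host, window
                break

for host, evidence in correlate(alerts):
    print(f"possible attack in progress on {host}:")
    for a in evidence:
        print(f"  {a['time']:%H:%M} [{a['source']}] {a['event']}")
```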

It is this problem that inspired this series of Tech Interviews episodes looking at the security challenge. We have episodes looking at new approaches such as anonymization and blockchain, but we start with the bigger picture: building a modern security strategy.

I’m joined by Valory Batchellor of IBM. IBM has done some interesting work in building what it calls its Immune System, which looks to help people step back from the problem and take a wider, strategic approach to tackling the security threat.

In this chat we look at the current and evolving threat and the challenges presented by multiple, disjointed security tools. We also discuss the future, and how machine learning and artificial intelligence could give us an infinite number of security analysts working on an infinite number of security problems, with unlimited resources!

Valory provides some fantastic insight, with real enthusiasm for and obvious expertise in her subject, so enjoy the show as we look to “secure all of the things”.

You can find Valory on twitter @ValBatchellor

You can find out more from IBM Security at securityintelligence.com and www.ibm.com, as well as look at some of the research from IBM X-Force.

And do look at the work the National Cyber Security Centre here in the UK is doing via their website www.ncsc.gov.uk

Next week I’m joined by Harry Keen from anon.ai as we look at data anonymization and the part it plays in data security.

To catch that show, why not subscribe on iTunes, SoundCloud or Stitcher.

Thanks for listening

Don’t be scared – GDPR is a good thing, embrace it!

I can’t open my inbox these days without someone telling me about the European Union General Data Protection Regulation (GDPR), with the content of these emails ranging from the complex to the scaremongering.

However, what I don’t see are the ones extolling the positives of the regulation.

In my humble opinion, GDPR is a driver for some very positive change in the way that we as businesses, use the data that we have and will continue to collect in ever-growing amounts.

I’m sure we’ve all heard how data is the new gold, oil, etc., and to many of us our data is among the most valuable assets we hold. As I heard recently, “the ability to gain actionable insights from data is what will separate us from our competition.” I personally believe this to be true: the businesses that know how to manage and gain value from their data will be the success stories of the future.

If data is such an asset, then…

Why do we keep hearing stories about high profile data breaches, such as Equifax and Deloitte, where sensitive information has found itself in the public domain? If data is an asset, then why are we so lax with its security? Are we that lax with other assets?

Data is hard

The problem is that managing data is hard: we don’t know what we have, where it is, who has access, and when or even if they access it. This lack of insight makes securing and managing data a huge challenge, and it is why the idea of more stringent regulation is a frightening prospect for many.

Why is GDPR a good thing?

The GDPR is going to force organizations to address these problems head-on, something that, for many of us, is long overdue. Although the regulation focuses on the privacy of “data subjects,” the principles can and should be applied to all of our data.

To be clear, GDPR is not a data management framework. Its scope is much wider than that. It is a legal and compliance framework and should be treated as such. But, while GDPR is “not an IT problem,” it’s certainly a technology challenge, and technology will be crucial in our ability to be compliant.

Why GDPR and technology are helpful

Even if GDPR did not demand our compliance, I would still thoroughly recommend it as a set of good practices that, if you’re serious about the value of your data, you should be following.

I believe the principles of the GDPR, along with smart technology choices, can positively revolutionize how we look after and get the very best from our data.

In the last 12 months or so, I’ve done a lot of work in this area and have found four key areas where the GDPR, alongside some appropriate technology choices, has made a real difference.

1. Assessment


As with any project, we start by fully understanding our current environment. How else are you going to manage, secure and control something if you don’t know what it looks like to begin with?

Your first step should be to carry out a thorough data assessment: understand what you have, where it is, how much there is, whether it’s looked at, what’s contained within it and, of course, who accesses it, when, where and why.

This is crucial in allowing us to decide what data is important, what you need to keep and what you can dispose of. This is not only valuable for compliance but has commercial implications as well: why take on the costs of storing, protecting and securing stuff that nobody even looks at?
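As a simple illustration of what a first-pass assessment might look like, here’s a rough sketch that walks a hypothetical file share and flags data nobody has touched in a year. Real assessment tooling goes much further, into content, classification and access patterns, and note that last-access times are only a hint, since many systems don’t update them:

```python
import time
from pathlib import Path

STALE_AFTER_DAYS = 365  # assumption: a year without access marks data as "stale"

def assess(root):
    """Walk a data repository and summarise what's there and what's untouched."""
    total_bytes = stale_bytes = file_count = stale_count = 0
    cutoff = time.time() - STALE_AFTER_DAYS * 86400
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        st = path.stat()
        file_count += 1
        total_bytes += st.st_size
        if st.st_atime < cutoff:   # last-access time; many systems mount with
            stale_count += 1       # noatime, so treat this as a hint only
            stale_bytes += st.st_size
    print(f"{file_count} files, {total_bytes / 2**30:.1f} GiB in total")
    print(f"{stale_count} files ({stale_bytes / 2**30:.1f} GiB) untouched "
          f"for over {STALE_AFTER_DAYS} days -- candidates for archive or disposal")

assess("/data/shares/finance")  # hypothetical share
```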

2. Education

It’s too easy to look at our users as the weakness in our security strategy when they should be our strength. They won’t ever be, however, if we don’t encourage, educate and train them.

Technology can help provide training, develop simple-to-use document repositories or keep them on their toes with regular orchestrated phishing tests. This helps users develop skills, keeps them aware and allows us to develop metrics against which we can measure our success.

We must move away from the annual “lunch and learn” briefing and realize we need tools that allow us to continually educate.

3. Breaches

The GDPR places a major focus on our ability to identify breaches quickly and accurately and to report on exactly what data we have lost. Traditionally this is an area in which businesses have been lacking, taking weeks, months or maybe even years to become aware of a breach. In a world where we are ever more data-reliant, this cannot be acceptable.

Technology is the only way to meet these stringent reporting requirements. How else will you know the when, where and how of a breach?

But technology isn’t only about reporting. The ability to have such visibility of data usage — the who, where and when of access — will allow us to quickly detect and stop a breach, or at least reduce its impact.

4. Data protection by design

This is perhaps the most positive part of GDPR, as it will encourage us to build data protection into the very core of our infrastructure, systems and data repositories. Whether on-prem or in the cloud, under our control or a service provider’s, security has to be at the heart of our design — not an afterthought.

We need to use this as an opportunity to encourage cultural change, one where the importance of our data is not underestimated, where maintaining its integrity, security and privacy is a priority for everyone, not just IT.

Is the GDPR a lot of work? Yes.

Is it worth it? In my opinion, 100%, yes — GDPR is a real positive driver for a long overdue and crucial change and should be embraced.


Taking VMware to the cloud – Ben Meadowcroft – Ep43

Over the last couple of episodes, we’ve had some interesting round-ups from the recent VMworld conference, reviewing the announcements from the show as well as how VMware is evolving to maintain relevance to its many customers in an ever more software-defined, data-centric and, of course, cloud and as-a-service based world.

Part of the VMware response to these changes (not, I hasten to add, the only technological evolution they are making) is a smart one. Rather than fight the tide, King Canute style, VMware is not only embracing that change but looking to empower it, making a business’s transition to a cloud-based world more straightforward.

Embracing this change comes in the form of VMware Cloud on AWS, providing the ability to run your own VMware vSphere environment on top of a dedicated set of AWS resources, giving you the flexibility and economics of cloud while maintaining an infrastructure and management platform that you already know.

This sounds like a really smart move, helping customers to make that tricky transition, keeping it seamless by providing flexibility and integration with your existing on-prem environments, without your IT teams needing to embark on a whole new learning path to understand your cloud platforms.

However, as smart as this sounds, the response has not been totally supportive, with some people asking whether there is really a need for this type of technology: if you are making the investment in AWS, why not just do that? Why add these additional VMware costs and infrastructure components?

That is the topic we explore on this week’s show as I’m joined by Ben Meadowcroft, a Product Line Manager at VMware with a focus on VMware Cloud on AWS.

I catch up with Ben to understand more about the solution: why it exists at all, the challenges businesses face when building a hybrid solution, and how VMware Cloud on AWS is helping to ease that transition, simplify the integration and allow us to start taking advantage of the capabilities of the AWS platform, while removing some of the challenges many of us face when making that move.

Ben gives some great insight into the platform as well as some helpful use case examples to help you decide whether this kind of technology is a good fit for you.

To find out more details on the solution you can find great resources in the following places;

For an overview of the solution check out cloud.vmware.com/vmc-aws

You can get some hands-on experience with VMware’s hands-on lab environment at vmware.com/go/try-vmc-aws-hol

To keep up with the latest news you can also follow @vmwarecloudaws on twitter.

Finally, if you want to catch up with Ben you can also find him on twitter @benmeadowcroft

Personally, I think VMware Cloud on AWS is a really interesting solution and I can see it meeting needs in a number of enterprises. Check out the show and provide your feedback, either on here or by messaging me @techstringy on twitter.

Next time we start a series of shows looking at the ever-evolving data security challenge.

To make sure you catch those, why not subscribe, and if you have the chance, leave a review.

Thanks for listening.

As an interesting bit of information, friend of the show @MichaelCade1 of Veeam has produced a really handy blog post on how you can protect your VMware Cloud on AWS environment using the Veeam tools you already know and love. It’s worth a read, as protecting your data in AWS is your responsibility.

You can read his post here.

In this episode Ben did cover some VMware Cloud on AWS roadmap items; with this in mind, he’s asked me to include the following disclaimer.

Disclaimer

This presentation may contain product features that are currently under development.

This overview of new technology represents no commitment from VMware to deliver these features in any generally available product.

Features are subject to change, and must not be included in contracts, purchase orders, or sales agreements of any kind.

Technical feasibility and market demand will affect final delivery.

Pricing and packaging for any new technologies or features discussed or presented have not been determined.