Building a modern data platform – The Series – Introduction

For many of you who read my blog posts (thank you) or listen to the Tech Interviews Podcast (thanks again!), you'll know that talking about data is something I enjoy. It has played a significant part in my career over the last 20 years, and today data is more central than ever to what so many of us are trying to achieve.

In today's modern world, however, storing our data is no longer enough; we need to consider much more. Yes, storing it effectively and efficiently is important, but so is its availability, security and privacy, and of course finding ways to extract value from it. Whether that's production data, archive or backup, we are looking at how we can make it do more (for an example of what I mean, read this article from my friend Matt Watts introducing the concept of Data Amplification Ratio) and deliver a competitive edge to our organisations.

To do this effectively means developing an appropriate data strategy and building a data platform that is fit for today's business needs. This is something I've written and spoken about on many occasions; however, one question I get asked regularly is "we understand the theory, but how do we build this in practice? What technology do you use to build a modern data platform?"

That's a good question. The theory is all great and important, but seeing practical examples of how you deliver these strategies can be very useful. With that in mind, I've put together this series of blogs to go through the elements of a data strategy and share some of the practical technology components I use to help organisations build a platform that will allow them to get the best from their data assets.

Over this series we'll discuss how these components deliver flexibility, maintain security and privacy, and provide governance, control and insights, as well as how they interact with hyperscale cloud providers to ensure you can exploit analytics, AI and machine learning.

So, settle back: over the next few weeks I hope to provide some practical examples of the technology you can use to deliver a modern data strategy. Parts one and two are live now and can be accessed via the links below. The other links will become live as I post them, so do keep an eye out for them.

Part One – The Storage
Part Two – Availability

I hope you enjoy the series and that you find these practical examples useful. Remember, these are just some of the technologies I've used; they are not the only technologies available, and you certainly don't have to use any of them to meet your data strategy goals. The aim of this series is to help you understand the art of the possible. If these exact solutions aren't for you, don't worry: go and find the technology partners and solutions that are, and use them to help you meet your goals.

Good Luck and happy building!

Coming Soon;

Part Three – Governance and Control

Part Four – What the cloud can bring

Part Five – out on the edges

Part Six – Exploiting the Cloud

Part Seven – A strategic approach


Building a modern data platform – The Storage

It probably isn't a surprise to anyone who has read my blogs previously to find out that when it comes to the storage part of our platform, NetApp are still first choice. But why?

While it is important to get the storage right, getting it right is about much more than just having somewhere to store data; even at the base level, it's important that you can do more with it. As we move through the different elements of our platform we will look at other areas where we can apply insight and analytics, but it should not be forgotten that there is significant value in having data services available at all levels of a data platform.

What are data services?

These services provide capabilities beyond just a storage repository; they may provide security, storage efficiency, data protection or the ability to extract value from data. NetApp provide these services as standard with their ONTAP operating system, bringing considerable value whether data capacity needs are large or small. The ability to provide extra capabilities beyond just storing data is crucial to our modern data platform.

That said, many storage providers offer data services on their platforms. They are often not as comprehensive as those provided in ONTAP, but they are there. So if that is the case, why else do I choose NetApp as the foundation of a data platform?

Data Fabric

"Data Fabric" is the simple answer (I won't go into detail here; I've written about the fabric before, for example Data Fabric – What is it good for?). When we think about data platforms we cannot just think about them in isolation; we need considerably more flexibility than that. We may have data in our data centre on primary storage, but we may also want that data in another location, maybe with a public cloud provider, or stored on a different platform, or in a different format altogether, object storage for example. However, to manage our data effectively and securely, we can't afford for it to be stored in different locations that need a plethora of separate management tools, policies and procedures to keep control.

The "Data Fabric" is why NetApp continue to be the base storage element of my data platform designs. The key to the fabric is the ONTAP operating system and its flexibility, which goes beyond an OS installed on a traditional controller. ONTAP can be consumed as a software service within a virtual machine or from AWS or Azure, providing the same data services, managed by the same tools, deployed in all kinds of different ways, allowing me to move my data between these repositories while maintaining all of the same management and controls.

Beyond that, the ability to move data between NetApp's other portfolio platforms, such as Solidfire and StorageGrid (their object storage solution), as well as to third-party storage such as Amazon S3 and Azure Blob, ensures I can build a complex fabric that allows me to place data where I need it, when I need it. The ability to do this while maintaining security, control and management with the same tools regardless of location is hugely powerful and beneficial.
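To give a flavour of what "moving data while keeping the same management" can look like in practice, here's a hedged sketch using SnapMirror replication driven through ONTAP's modern REST API. That API post-dates this post (automation at the time went through ZAPI and the SDKs), and every name below, cluster address, SVMs and volumes, is a made-up assumption rather than anything from a real environment.

```python
# Hedged sketch: replicate a volume to another system in the fabric with
# SnapMirror via ONTAP's modern REST API. Endpoint shape and all names
# here are assumptions drawn from current NetApp docs.
import requests

CLUSTER = "https://cluster.example.com"  # hypothetical cluster address
AUTH = ("admin", "secret")               # use proper secret handling in practice

body = {
    "source": {"path": "svm1:projects"},             # hypothetical source volume
    "destination": {"path": "dr_svm:projects_dst"},  # hypothetical destination
}
resp = requests.post(f"{CLUSTER}/api/snapmirror/relationships",
                     json=body, auth=AUTH, verify=False)  # lab shortcut only
resp.raise_for_status()
```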


APIs and Integration

When we look to build a data platform it would be ridiculous to assume it will only ever contain the components of a single provider, and as we build through the layers of our platform, integration between those layers is crucial; it plays a real part in the selection of the components I use.

APIs are increasingly important in the modern datacentre as we look for different ways to automate and integrate our components. Again, this is an area where NetApp are strong, providing great third-party integrations with partners such as Microsoft, Veeam, VMware and Varonis (some of which we'll explore in other parts of the series), as well as options to drive many elements of their different storage platforms via APIs so we can automate the delivery of our infrastructure.
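As a small taste of what API-driven storage looks like, here's a minimal sketch querying an ONTAP cluster for its volumes over REST. The same caveats as above apply: the REST API arrived in later ONTAP releases than the one this post describes, and the address, credentials and endpoint are illustrative assumptions.

```python
# Minimal sketch: list the volumes on an ONTAP cluster over REST.
# Address, credentials and endpoint are illustrative assumptions; check
# the API reference for your ONTAP release before relying on them.
import requests

CLUSTER = "https://cluster.example.com"  # hypothetical cluster address
AUTH = ("admin", "secret")               # use proper secret handling in practice

def list_volumes():
    """Return the names of all volumes visible on the cluster."""
    resp = requests.get(
        f"{CLUSTER}/api/storage/volumes",
        auth=AUTH,
        verify=False,  # lab shortcut only; verify certificates in production
    )
    resp.raise_for_status()
    return [record["name"] for record in resp.json()["records"]]

if __name__ == "__main__":
    for name in list_volumes():
        print(name)
```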

Can it grow with me?

One of the key reasons that we need a more strategic view of data platforms is the continued growth of our data and the demands we put on it; scalability and performance are therefore hugely important when we choose the storage components of our platform.

NetApp deliver this across their whole portfolio. ONTAP allows me to scale a storage cluster up to 24 nodes, delivering huge capacity, performance and compute capability. The Solidfire platform, inspired by the needs of service providers, allows simple and quick scaling, with a quality-of-service engine that lets me guarantee performance levels for applications and data. And that's before we talk about the huge scale of the StorageGrid object platform or the fast and cheap capabilities of E-Series.

Crucially NetApp’s Data Fabric strategy means I can scale across these platforms providing the ability to grow my data platform as I need and not be restricted by a single technology.

Does it have to be NetApp?

Do you have to use NetApp to build a data platform? Of course not, but do check that whatever you choose as the storage element of your platform can tick the majority of the boxes we've discussed: data services, a strategic vision, the ability to move data between repositories and locations, and great integration, all while ensuring your platform can meet the performance and scale demands you place on it.

If you can do that, then you’ll have a great start for your modern data platform.

In the next post in this series we'll look at the importance of availability – that post is coming soon.

Click below to return to “The Intro”

Building a modern data platform – The Series – Introduction


NetApp Winning Awards, Whatever Next?

In the last couple of weeks I've seen NetApp pick up a couple of industry awards, with the all-flash A200 earning the prestigious Storage Review Editor's Choice as well as CRN UK's Storage Vendor of the Year 2017. This, alongside commercial successes (How NetApp continue to defy the performance of the storage market), is part of a big turnaround in their fortunes over the last 3 years or so. But why? What is NetApp doing to garner such praise?

A bit of disclosure: as a Director at a long-term NetApp partner, Gardner Systems, and a member of the NetApp A-Team advocacy programme, I could be biased. But having worked with NetApp for over 10 years, I still see them meeting our customers' needs better than any other vendor, which in itself suggests NetApp are doing something right.

What is it they're doing? In this post, I share some thoughts on what I believe are key parts of this recent success.

Clear Strategy

If we wind the clock back 4 years, NetApp's reputation was not at its best. Tech industry analysts presented a bleak picture: the storage industry was changing, with public cloud storage and innovative start-ups offering to do more than the "legacy" platforms, and in many cases they could. NetApp, it seemed, were a dinosaur on the verge of extinction.

Enter the Data Fabric, first announced at NetApp's technical conference, Insight, in 2014. Data Fabric was the beginning of NetApp's move from a company focussed on storing data to a company focussed on the data itself. This was significant as it coincided with a shift in how organisations viewed data, moving away from just thinking about storing data to managing, securing, analysing and gaining value from it.

NetApp's vision for the data fabric closely aligned to the aims of more data-focussed organisations, and it also changed the way they thought about their portfolio: less worried about speeds and feeds and flashing lights, and more about how to build a strategy that was focussed on data in the way their customers were.

It is this data-driven approach that, in my opinion, has been fundamental in this change in NetApp’s fortunes.

Embrace the Cloud

A huge shift, and something that has taken both customers and industry analysts by surprise, is the way NetApp have embraced the cloud. Not a cursory nod, but cloud as a fundamental part of the data fabric strategy, and this goes way beyond "cloudifying" existing technology.

ONTAP Cloud seamlessly delivers the same data services and storage efficiencies into the public cloud as you get with its on-prem cousin, providing a unique ability to maintain data policies and procedures across your on-prem and cloud estates.

But NetApp has gone beyond this, delivering native cloud services that don't require any traditional NetApp technologies. Cloud Sync allows the easy movement of data from on-prem NFS datastores into the AWS cloud, while Cloud Control provides a backup service for Office 365 (and now Salesforce), bringing crucial data protection functionality that many SaaS vendors do not provide.

If that wasn't enough, there is the recently announced relationship with Microsoft, with NetApp now powering the Azure NFS service. Yep, that's right: if you take the NFS service from the Azure marketplace, it is delivered fully in the background by NetApp.

For a storage vendor, this cloud investment is unexpected, but a clear cloud strategy is also appealing to those making business technology decisions.

Getting the basics right

With these developments, it’s clear NetApp have a strategy and are expanding their portfolio into areas other storage vendors do not consider, but there is also no escaping that their main revenue generation continues to come from ONTAP and FAS (NetApp’s hardware platform).

If I’m buying a hardware platform, what do I want from it? It should be robust with strong performance and a good investment that evolves with my business and if NetApp’s commercial success is anything to go by, they are delivering this.

The all-flash NetApp platforms (such as the award-winning A200 mentioned earlier) are meeting this need: a robust, enterprise-level platform that allows organisations to build an always-on storage infrastructure that scales seamlessly with new business demands. Six-year flash drive warranties and the ability to refresh your controllers after 3 years also give excellent investment protection.

It is not just the hardware, however; these platforms are driven by software. NetApp's ONTAP operating system is like any other modern software platform, with regular code drops (every 6 months) delivering new features and improved performance to existing hardware via a non-disruptive software upgrade. This gives businesses the ability to "sweat" their hardware investment over an extended period, which in today's investment-sensitive market is hugely appealing.

Have an interesting portfolio

NetApp was for a long time the FAS and ONTAP company, and while those things are still central to their plans, their portfolio is expanding quickly. We've discussed the cloud-focussed services; there's also Solidfire with its unique scale and QoS capabilities, StorageGrid, a compelling object storage platform, and AltaVault, which provides a gateway to move backup and archive data into object storage on-prem or in the cloud.

Add to this the newly announced HCI platform and you can see how NetApp can play a significant part in your next-generation datacentre plans.

For me, the awards I mentioned at the beginning of this article are not down to one particular solution or innovation; it's the data fabric. That strategy is allowing NetApp, its partners and customers to have a conversation that is data- and not technology-focussed, and having a vendor who understands that is clearly resonating with customers, analysts and industry influencers alike.

NetApp’s continued evolution is fascinating to watch, and they have more to come, with no doubt more awards to follow, whatever next!

Going to gain some Insight – What I’m looking forward to from NetApp Insight 2017

This week I’m in Berlin at NetApp’s data management conference Insight.

It's always a great chance to catch up with industry friends and hear from leaders in the data industry, from a range of technology companies, and about the strategic direction that NetApp and the data management industry are taking.

With 4 days ahead in Berlin, what are the things I’m hoping to hear about at Insight 2017?

Extending the fabric

If you've read any of my blogs on data strategy in the past you'll be familiar with NetApp's data fabric concept. The fabric was developed to break down the data silos we have become used to and enable us to build a strategy that lets us simply and freely move data between any repository, be that on-prem, software-defined, in the cloud or near the cloud, while maintaining all of the security, management and control of our data that we have grown used to on-prem.

Today the data fabric is much more than a strategic aim as it is now practically delivered across much of the NetApp portfolio and I’ll be paying attention to how this continues to evolve.

Gaining understanding of our data

This is the next step for “storage” companies, especially those, like NetApp, who are repositioning themselves as data management companies.

Long gone are the days when we just want somewhere to store our data. You have to remember that not only is "storing boring", it also does not serve us well. Whether you are concerned about governance and security, or about how to extract value from your data, this can only come with a full understanding of where your data is, what it contains, and who accesses it and when. All are critical in a modern data strategy, and I'll be interested in how NetApp is allowing us to gain more understanding.

Securing all of the data things

Nothing is higher on the priority list for CIOs and those making business decisions than the security of our business data (well, it should be high on the priority list). I'm keen to see how NetApp build on what they currently have (encryption, data security policies, APIs for third-party security vendors) to fully secure and understand the data within an environment.

I’ll also be interested to hear more about the changes the data industry continues to make to enable us to not only secure our data from the ever-evolving security challenge but also how we can meet increasing compliance and privacy demands.

Analysing the stuff

I fully expect to hear more about how data continues to be the new oil, gold, etc. As marketing-based as this messaging is, it is not without validity; I constantly speak with business decision makers who are eager to understand how they can use the data they own and collect to gain a business advantage.

NetApp has made some interesting moves in this space, with integrated solutions with Splunk and the Cloud Connect service allowing easy movement of data into AWS analytics tools.

It will be interesting to see how this evolves and how NetApp can ensure the data fabric continues to extend so we can take advantage of the ever-growing range of analytics tools that allow us to gain value from our data sets.

Integrating all of the things

NetApp has long innovated in the converged infrastructure market with FlexPod, their joint solution with Cisco.

However, this market continues to evolve with the emergence of hyper-converged infrastructure (HCI), where companies like Nutanix and Simplivity (now owned by HPE) have led the way. Up to now, though, I have the feeling HCI is only scratching the surface, taking infrastructure, servers, storage and networking, and squeezing it into a smaller box. In my opinion what's missing is the software and automation to allow us to use HCI to deliver the software-defined architectures many are striving for.

It is this that is beginning to change: VMware and Microsoft, amongst others, are bringing us more tightly integrated software stacks, abstracting hardware complexity and letting us drive infrastructure fully in software, bringing that cloud-like experience into the on-prem datacentre.

It is these software stacks that really start to make HCI an interesting platform; marrying this simplified hardware deployment method with automated, software-driven infrastructure has the potential to be the future of on-prem datacentres.

I'll certainly be keeping an eye on NetApp's new HCI platform and how it will allow us to continue to simplify and automate infrastructure so we can deliver flexible, scalable, agile IT into our businesses.

What else will I be up to?

Many of you know I’m proud to be a part of the NetApp A-Team, and this association has also made Insight a very different proposition from a couple of years ago.

For the first time I’ll be part of a couple of sessions at the event, feel free to come and check them out and say hello;

You’ll find me doing session 18345-1-TT – Ask the A-Team – Cloud and Possibilities with NetApp Data Fabric and 18348-2 – From the Beginning – Becoming a service provider.

I’ll also be hosting the pop-up tech talks sessions – If you want to come and meet up and chat (on camera) about your views of NetApp or the data market in general, why not come find me.

And lastly, I’ll be appearing on The Cube as they broadcast live from Berlin giving in-depth coverage of Insight.

I'll be discussing HCI platforms on Tuesday 14th at 2:30; you'll find that broadcast on thecube.net

If you’re at Insight, do come say hello or hook up on the twitters @techstringy

And let’s discuss if you too have gained any insights.

Look out for more blogs and podcasts from Insight 2017 over the coming weeks.

Chaining the blocks, a Blockchain 101 – Ian Moore – Ep46

As the world continues to "transform" and become more digitally driven, the inevitable also has to happen: systems that support our day-to-day processes start to become outdated, inefficient and ineffective for a world that needs to move more quickly and in different ways.

One such innovation that is gathering momentum is blockchain, and it is starting to have a major disruptive impact on the way many traditional transactions are done, with current mechanisms often being slow, inefficient and vulnerable to compromise, as well as, in many cases, especially with financial transactions, suffering from a lack of trust in the existing methods.

But what exactly is blockchain? Like many people, it's a technical term I'm familiar with without fully understanding how it works, why it's relevant, how it's impacting business right now, or its potential future applications.

If you are like me, interested in the technology and wanting to know more, then maybe this week's podcast episode is for you, as I'm joined by Ian Moore to provide a beginners' guide to blockchain, a blockchain 101 no less.

Ian is not a blockchain expert, but certainly is an enthusiast and the perfect person to introduce the concept and provide a good overview of the technology. In his day job, Ian works for IBM in their data division.

During our conversation, he introduces us to the importance of ledgers and how the four key blockchain tenets of consensus, provenance, immutability and finality allow blockchain transactions to be quick, secure and trusted.
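To make the immutability tenet a little more concrete, here's a toy sketch of my own (not from the show or IBM, and nothing like a production implementation): each block carries the hash of its predecessor, so tampering with any historic entry breaks every link that follows it.

```python
# Toy illustration of blockchain immutability via hash chaining.
import hashlib
import json
import time

def block_hash(block):
    """Hash the block's contents (excluding its own stored hash)."""
    payload = {k: block[k] for k in ("timestamp", "data", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def verify(chain):
    # Every block must carry the recomputed hash of its predecessor.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("alice pays bob 5", chain[-1]["hash"]))
chain.append(make_block("bob pays carol 2", chain[-1]["hash"]))
print(verify(chain))   # True

chain[1]["data"] = "alice pays bob 500"   # tamper with history
print(verify(chain))   # False - the broken link is detected
```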

We also discuss how the speed of digital transformation is demanding improvement in speed and efficiency, and how transactions that used to take weeks are no longer acceptable, as blockchain takes those long, slow processes and does them almost instantly.

Ian also shares some great use cases, as well as outlining the basic requirements for a blockchain. We wrap up by discussing possible future uses for this technology and how blockchain could do for transactions what the Internet has done for communications.

Ian provides us with an excellent introduction to blockchain. To find out more on this topic and how it may impact your business, IBM has some great resources on their blockchain page here https://www.ibm.com/blockchain/what-is-blockchain.html

You can find out more from Ian on twitter @Ian_DMoore

I also mentioned during the show another fascinating blockchain introduction, where Stephen Foskett joins Yadin Porter De Leon on the In Tech We Trust podcast; you can find that show here https://intechwetrustpodcast.com/e/130-on-the-blockchain/

I hope you enjoyed the show, to catch future episodes then you can subscribe on iTunes, Soundcloud and Stitcher as well as other good homes of podcasts.

Keeping on top of ONTAP

The last year has been a big one for NetApp: the turnaround in the company's fortunes continues, with fantastic growth in the all-flash array market, the introduction of cloud-native solutions and tools, and of course, not forgetting Solidfire and the newly announced HCI platform. All have created lots of interest in this "new" NetApp.

If you have read any of my content previously, you’ll know I’m a fan of how NetApp operate and their data fabric strategy continues to make them the very best strategic data partner to meet the needs of many of the people I work with day-to-day.

Why am I telling you all of this? Well, as with all technology companies, it's easy to get wrapped up in exciting new tech and sometimes forget the basics of why you work with them and what their core solutions still deliver.

For all the NetApp innovations of the last couple of years, one part of their business continues to be strong, and even at 25 years old it remains as relevant to customer needs as ever: the ONTAP operating system.

ONTAP, in its latest incarnation, version 9 (9.2 to be exact), shows perhaps more than anything how NetApp continue to meet the ever-changing needs of the modern data market. It would be easy, regardless of its strength, to write off an operating system that is 25 years old, but NetApp have not; they have developed it into something markedly different from the versions I first worked with 10 years ago.

These changes reflect what we, as users in more data-focussed businesses, demand from our storage. It's not even really storage we demand, it's the ability to make our data a core part of our activities. To quote a friend, "storing is boring", and although storing is crucial, if all we are doing is worrying about storing data then we are missing the point. If that were the only focus for ONTAP, it would very quickly become irrelevant to a modern business.

How are NetApp ensuring that ONTAP 9 remains relevant and continues to be at the heart of data strategies big and small?

Staying efficient

Although storing may be boring, in a world where IT budgets continue to be squeezed and datacentre power and space are at a costly premium, squeezing more and more into less and less continues to be a core requirement.

Data compaction, inline deduplication and the newly introduced aggregate-wide deduplication all provide fantastic efficiency gains. Align this with support for ever-increasing media sizes (10TB SATA, 15TB flash, something not always easy for NetApp's competition) and you can see how ONTAP continues to let you squeeze more and more of your data into smaller footprints (60TB on one SSD drive, anyone?), something that remains critical in any data strategy.
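For a quick back-of-envelope of what "60TB on one SSD" implies, here's a tiny worked sketch. The arithmetic is mine, not a NetApp guarantee: real ratios depend entirely on the data.

```python
# Back-of-envelope: a 15TB drive reaching 60TB effective implies roughly a
# 4:1 combined efficiency ratio from compaction, dedupe and compression.
# (Illustrative assumption only; actual efficiency varies with workload.)
raw_tb = 15.0            # physical SSD capacity mentioned in the post
efficiency_ratio = 4.0   # assumed combined efficiency ratio
effective_tb = raw_tb * efficiency_ratio
print(f"{raw_tb}TB raw x {efficiency_ratio}:1 = {effective_tb}TB effective")
# -> 15.0TB raw x 4.0:1 = 60.0TB effective
```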

Let it grow

As efficient as ONTAP can be, nothing is efficient enough to keep up with our desire to store more data, and more types of data. However, ONTAP is doing a pretty good job of keeping up. Not only have NetApp added additional scalability to ONTAP clusters (supporting up to 24 nodes), they have also taken on a different scaling challenge with the addition of FlexGroups.

FlexGroups allow you to aggregate up to 200 volumes into one large, high-performance storage container, perfect for those who need a single point of storage for very large datasets. This is something I've already seen embraced in areas like analytics, where high-performance access to potentially billions of files is a must.
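For a sense of how simply that can be driven programmatically, here's a hedged sketch creating a FlexGroup through ONTAP's modern REST API. As before, that API arrived after the 9.2 era described here, the endpoint and field names are assumptions from current NetApp docs, and the SVM and aggregate names are made up.

```python
# Hedged sketch: create a FlexGroup spanning two aggregates via ONTAP REST.
# All names (cluster, SVM, aggregates) are hypothetical.
import requests

CLUSTER = "https://cluster.example.com"
AUTH = ("admin", "secret")

body = {
    "name": "analytics_fg",
    "svm": {"name": "svm1"},                               # assumed SVM
    "style": "flexgroup",                                  # FlexGroup, not FlexVol
    "aggregates": [{"name": "aggr1"}, {"name": "aggr2"}],  # member aggregates
    "size": 100 * 1024**4,                                 # 100 TiB, in bytes
}
resp = requests.post(f"{CLUSTER}/api/storage/volumes",
                     json=body, auth=AUTH, verify=False)   # lab shortcut only
resp.raise_for_status()
```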

Keep it simple

A goal for any IT team should be the simplification of its environment.

NetApp have continued developing ONTAP's ability to automate more tasks, and by using intelligent analysis of system data they are helping you take the guesswork out of workload placements and their impacts, allowing you to get it right first time, every time.

The continued development of quick deployment templates has also greatly simplified the provisioning of application storage environments, from out of the box to serving data in just minutes, not days.

In a world where the ability to respond quickly to business needs is crucial, the value of developments like this cannot be overstated.

Keep it secure

Maybe the most crucial part of our data strategy is security, and in the last 12 months NetApp have greatly enhanced the capability and flexibility ONTAP offers here.

SnapLock functionality was added 12 months ago, allowing you to lock your data into archives that can meet the most stringent regulatory and compliance needs.

However, the biggest bonus is the implementation of onboard, volume-level encryption. Prior to ONTAP 9, the only way to encrypt data on a NetApp array was, as with most storage vendors, with the use of self-encrypting drives.

This was a bit of an all-or-nothing approach: it meant buying different, and normally more expensive, drives and encrypting all data regardless of its sensitivity.

9.1 introduced the ability to deliver encryption at a more granular level, allowing you to encrypt single volumes without the need for encrypting drives, meaning no additional hardware and, importantly, the ability to encrypt only what is necessary.
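Here's a hedged sketch of what that granular choice might look like, again using the modern REST API (which post-dates the 9.1 release discussed here); the endpoint, field names and all object names are assumptions for illustration.

```python
# Hedged sketch: create one encrypted volume while leaving others untouched.
# Endpoint/field names assumed from current ONTAP REST docs; all object
# names here are hypothetical.
import requests

CLUSTER = "https://cluster.example.com"
AUTH = ("admin", "secret")

body = {
    "name": "finance_vol",
    "svm": {"name": "svm1"},            # assumed SVM
    "aggregates": [{"name": "aggr1"}],  # no self-encrypting drives required
    "size": 1024**4,                    # 1 TiB, in bytes
    "encryption": {"enabled": True},    # encrypt just this sensitive volume
}
resp = requests.post(f"{CLUSTER}/api/storage/volumes",
                     json=body, auth=AUTH, verify=False)  # lab shortcut only
resp.raise_for_status()
```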

In modern IT, this kind of capability is critical both in terms of data security and compliance.

Integrate the future!

I started this piece by asking how you keep a 25-year-old operating system relevant; in my opinion the only way to do that is to ensure it seamlessly integrates with modern technologies.

ONTAP has a pretty good record of that. Be it by luck or design, its move into the world of all-flash was smooth, with no need for major rewrites; the ONTAP way of working was geared for flash before anyone had thought of flash!

The ability for ONTAP to see media as just another layer of storage, regardless of type, was key in supporting 15TB SSDs before any other major storage vendor, and it is this flexibility to integrate new storage media that has led to one of my favourite features of the last 12 months: FabricPools.

This technology allows you to seamlessly integrate S3 storage directly into your production data, be that an on-prem object store, or a public cloud S3 bucket from a provider like AWS.

In the v1.0 release in ONTAP 9.2, FabricPools tier cold blocks from flash to your S3-compliant storage, wherever that is, bringing you the ability to lower your total cost of ownership by moving data not actively in use and freeing up space for other workloads. All of this is done automatically via policy, seamlessly extending your production storage capacity by integrating modern storage technology.
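To show what "automatically via policy" might look like, here's a hedged sketch that sets a volume's tiering policy using the modern REST API; the field names are assumptions from current NetApp docs (in the 9.2 era FabricPools were configured through the tooling of the day), and the names below are hypothetical.

```python
# Hedged sketch: set a volume's FabricPool tiering policy so cold blocks
# move to the attached S3 tier. Endpoint and field names are assumptions
# from current ONTAP REST docs, not from the 9.2 release described here.
import requests

CLUSTER = "https://cluster.example.com"
AUTH = ("admin", "secret")

def set_tiering_policy(volume_uuid, policy="snapshot_only"):
    """Apply a tiering policy; 'snapshot_only' matches the v1.0 behaviour."""
    resp = requests.patch(
        f"{CLUSTER}/api/storage/volumes/{volume_uuid}",
        json={"tiering": {"policy": policy}},
        auth=AUTH,
        verify=False,  # lab shortcut only
    )
    resp.raise_for_status()
```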

ONTAP everywhere

As ONTAP continues to develop, the ways you can consume it also continue to develop to meet our changing strategic needs.

Fundamentally, ONTAP is a piece of software, and like any piece of software it can run anywhere that meets its requirements. The ONTAP variants Select and Cloud provide software-defined versions of ONTAP that can run on white-box hardware or be delivered straight from the cloud marketplaces of AWS and Azure.

The benefit of this stretches far beyond just being able to run ONTAP in more places: it means that management, security policies and data efficiencies are all equally transferable. It's one way to manage and one set of policies to implement, meaning that where your data resides at a given moment becomes less important, as long as it is in the right place at the right time for the right people.

In my opinion, this flexibility is critical for a modern data strategy.

Keep it coming

Maybe what really keeps ONTAP relevant is the fact that these new capabilities are all delivered in software. None of the features have required new hardware or an add-on purchase; they are all delivered as part of the ONTAP development cycle.

And the modern NetApp has fully embraced a more agile way of delivering ONTAP, with a 6-month release cadence, meaning they can quickly absorb feature requests and deliver them to the platforms that need them, allowing them, and us, to respond to changing business needs.

So, while NetApp have had a fascinating year, delivering great enhancements to their portfolio, ONTAP still retains a very strong place at the heart of their data fabric strategy and still, in my opinion, is the most complete data management platform, continuing to meet the needs presented by modern data challenges.

Find out more

If you want to know more about ONTAP and its development then try these resources.

NetApp’s Website

Justin Parisi's blog – providing links to more detailed information on all of the technologies discussed and much more!

TechONTAP Podcast – NetApp’s excellent TechONTAP podcast has detailed information of all of the information shared here, it’s all in their back catalogue.

And of course you can leave a comment here or contact me on twitter @techstringy

Tech me for the weekend – 21st July

Those weekends just keep rolling around, don't they! It's been a hectic old week at work, but I've still managed to catch up on some really good tech content that I thought I'd share.

If you are after some interesting reads and listens to satisfy your insatiable desire for all things tech this weekend, then give these a go..

Articles

CIO.COM – Is your data ready to help you make game changing decisions?

I presented at an event recently on this very topic and thought this was an interesting article discussing the same issues.

Many of us see the value of data and how a better understanding of it can help us make better decisions in our business, but how many of us have thought about how to package up our data so we can actually take advantage of analytics tools and become a more data-driven business? Some areas to consider in this article;

http://www.cio.com/article/3192349/analytics/is-your-data-ready-to-help-you-make-game-changing-decisions.html

TechCrunch – Five building blocks of a data-driven culture

While on the data theme, I also found this from TechCrunch further exploring the idea of making our businesses more data-centric. While the CIO article looked at how to prepare our data to be more useful to us, TechCrunch looks at the wider picture of what a business needs to become more data-centric.

They explore the importance of an authoritative data set, but also the importance of having the right skills in your business, it’s no good doing all of this work with your data, if no one has the slightest idea how to use it!

https://techcrunch.com/2017/06/23/five-building-blocks-of-a-data-driven-culture/

Windows IT Pro – Microsoft Inspire: Simplify, Simplify, Simplify

I'm a big supporter of this message in all areas of IT. As we become ever more reliant on our technology and it becomes ever more complex, it is crucial that we take that complexity away from our end users; they need to be able to focus on making the most of their technology so they can meet their desired outcomes, not waste time worrying about making stuff work.

This article from Windows IT Pro comes from the recent Microsoft Inspire conference, taking a look at Microsoft's plans for simplifying technology delivery, be that cloud offerings, building hybrid solutions with Azure Stack, or using Microsoft cognitive services; the focus is on simplification.

http://windowsitpro.com/cloud-data-center/microsoft-inspire-simplify-simplify-simplify-or-least-shove-complexity-out-view

Podcasts

After all that reading, you may want to kick back for some tech listening, so here's a couple of shows to enjoy!

Tech ONTAP – Death of the specialised admin

I know the NetApp podcast team keep getting a mention, but they are knocking out some great episodes at the minute, and episodes that suit a much wider IT listenership than just NetApp customers.

This episode is one of those, as Andy Banta and Josh Atwell join the team to talk about next-generation infrastructure. It's not a debate about technology, but one about skill sets: what types of skills do we need as IT pros, and what kind of skills do we need as a business, as you look to build your next-generation technology platforms?

Well worth a listen.


Virtually Speaking Podcast – vSAN Off-Road

Another old favourite this one. I enjoy the VMware podcast, as a great way to keep up with what VMware are doing.

This episode, although vSAN-focussed, touches on an interesting idea: building customised infrastructures, not necessarily ones that sit in any good practice guide or reference architecture, but ones that are supported, even if their use cases are quite unique.

The team bring up some interesting points and areas to consider, worth a listen if you are indeed taking your own IT a little “off-road”.

In Tech We Trust – Luck and Innovation

I'm enjoying the new format of this show, and it's an interesting topic this time out, as Yadin discusses with a range of guests the part that luck plays in innovation: does it play a part, and if so, how big?

Interesting listen and some good sharing of experiences.

Give it a try, I’m pretty sure it will give you some things to think about!

https://intechwetrustpodcast.com/e/125-luck-and-innovation/

Tech Interviews – Living on the data edge

Talking of Yadin (smooth transition if ever there was one), he is my guest on my Tech Interviews show this week. I discuss Yadin's day job at Druva as we tackle the tricky and often ignored problem of edge data.

We discuss the data that sits out on our mobile devices, laptops, tablets, phones, USB sticks and the unique set of problems that this presents to our enterprise in terms of data management.

Yadin shares some great ideas and insights on how we can begin to tackle the challenge.

Plus, it's the last show for a few weeks as Tech Interviews takes a summer holiday – so hey, why not give it a try.

Hopefully that gives you plenty to enjoy over the weekend.

Happy teching.. watch out for some more tech content to enjoy soon…


Tech me for the weekend – 7th July

First up, an apology: it's a podcast-only list this weekend. It's all been a bit hectic this week, so I've not had a lot of reading time…

A bit of a theme this week with a focus on security. Data security is a constant hot topic, from ransomware to governance and all that's in between, so if you are fighting the good data security fight, you may find these shows provide some interesting insight.

All of the links below take you to the podcast show notes pages; each of those pages has the episode embedded, or you can of course look for them in your favourite podcatcher.

.future from Microsoft – Securing a digital battlefield

This is a new podcast from Microsoft, and episode 1 was an interesting start, looking at the growing impact of the modern cyber threat and how in many quarters it is seen as the next potential "war zone". A good insight, and don't worry, not as scary as it sounds!

.future podcast

Inside Out Security – what does GDPR mean for countries outside of the EU

The EU General Data Protection Regulation, GDPR, is going to be a massive change to how we handle personal data. This episode takes a look at one of the common questions associated with the regulation: "how does it impact me if I'm outside of the EU?" The reality is, it probably will.

A good debate about how and why it will impact you, regardless of location.

Inside Out Security

NetApp TechONTAP – Security update

The ONTAP boys are back in the list again this week, but to be honest, that’s because I’m biased as I’m on this show!

Don't worry, it's not just me; I join the regular team, as well as some of NetApp's very own data security gurus, as we talk about the latest threats, the importance of understanding your data, and how none of this works if you don't build an in-depth data security strategy.

Lots of good tips and advice.

Tech ONTAP

Tech Interviews

If the Tech ONTAP podcast isn't enough of me for you, then there is of course this week's Tech Interviews.

This show is a personal favourite and a little bit of a digression from the norm: rather than talking with someone from the tech industry, I catch up with an actual technology user!

I chat with Lee Clark of Givepenny UK about how he has integrated data and cloud technology to deliver innovation to the fundraising sector, allowing charities to find new ways of engaging with a whole new generation of fundraisers.

A fantastic example of how technology can really make a big difference – I hope you enjoy it.

How a 100-mile bike ride inspired a new way of fundraising – Lee Clark – Ep34

Hopefully that should quench your tech thirst this weekend, enjoy the shows and have a great weekend.


How a 100-mile bike ride inspired a new way of fundraising – Lee Clark – Ep34

Normally when I do this podcast I chat with people from the technology community, usually those on the supply side, to discuss how the market is changing and how technology can help us deliver innovation and change into our organisations.

This week I wanted to do something a little different for the third and final part of our brief series on using the cloud, by looking at a real use case: someone who has embraced cloud technology to bring innovation to an industry.

I always admire people who can spot a gap in a market; they see new and inventive ways of delivering products and services, or new ways of working with their customers. But sometimes bringing those ideas to fruition can be difficult and expensive. One of the most powerful things about the modern cloud-driven era is how cloud providers have made technology much more accessible to those with great new business ideas, making it easier than ever for them to quickly, efficiently and, importantly, for relatively low investment, exploit often very advanced technologies to help bring an idea to fruition.

This week I’m joined by someone who has done just that, Lee Clark of Givepenny UK.

Lee describes how a 100-mile charity bike ride, one that he never got to do, made him think about a whole new way of fundraising. Although he couldn't do the actual 100-mile event, he had tracked many hundreds of miles of training with his smartphone app. This made him realise how we all track and store masses of data about our everyday activities, and how this data could be utilised to raise much-needed money for charities in the UK.

We discuss a whole range of issues which come with innovation, issues that are not unique to trying to launch something new, but are equally experienced by those trying to bring change into any organisation.

We look at how organisations are often crippled by their own behaviour, restricting their ability to embrace innovation, and how to overcome it.

Lee shares some of the ideas behind the Givepenny platform and how, by looking for new ways to engage potential fundraisers, he has created a whole range of new opportunities for charities of all types.

We also discuss how cloud has made traditionally complex technology so much more accessible and how this accessibility allowed Givepenny to deliver a platform quickly, effectively and at a low cost, especially when compared to how the charity sector first engaged with technology as part of fundraising some 15 years ago.

And finally, we look at the importance of understanding the way technology is changing so many aspects of both our personal and business lives, and why understanding this change matters not just for those looking to deploy new innovations but, crucially, for those who make decisions about the future of an organisation, decisions that will have a significant impact on future success.

Lee talks with real enthusiasm about Givepenny's journey and how they are looking to bring innovation to the charity sector, creating a whole new range of ways to engage with a very generous population who'd love to help, if only they could.

You can find out more about Givepenny on their website givepenny.com and you can follow them on twitter @givepennyuk.

You can also follow Lee @jannerinbrum


Tech me for the weekend 30th June 2017

Another busy week in the tech world, so it's time to kick back, relax and enjoy a weekend. But of course, us techies never stop; there's always a chance to grab a tech article or listen to a tech podcast. So here's some stuff I've seen or heard this week that I thought I'd share; give them a go, see what you think.

Podcasts

Try these for your tech listening pleasure;

Arrow Bandwidth

The Arrow team have produced some great episodes recently, and here's another one: a fascinating chat with IBM's The Weather Company. We all talk a lot about how data is the new gold and how our use of data is changing the way we do so many things. This episode outlines a great use case, as The Weather Company is using data on a massive scale to provide not only more accurate forecasts but also a range of brand-new weather services.

Arrow Bandwidth Podcast

IT Pro Podcast

This is a new podcast from one of my favourite podcast presenters, Richard Hay, who also presents the Windows Observer podcast covering all things Microsoft, mainly with a "consumer" focus. The IT Pro podcast has a more business slant, and this week focusses on migration to the cloud: some of the things to consider, and tips on how to deliver a great migration project.

Well worth a listen, and worth adding to your podcast catcher of choice;

IT Pro Podcast

NetApp Tech ONTAP

My friends at NetApp have a great and very informative podcast for NetApp customers and partners. Occasionally the team go off topic, and this week was an excellent example of that, with a fascinating chat looking at women in tech. NetApp have a great programme in place, but they also talk about the wider work that needs doing to ensure the tech industry is a place for us all to thrive.

Tech ONTAP Podcast

Tech Interviews

This week's Tech Interviews was the second of a three-part series looking at moving to the cloud, as we tackled the interesting challenge of picking the right cloud service provider.

This is often an area that doesn't get considered to the degree that it maybe should. I'm joined by Jon Green of Navisite as we discuss the importance of picking the right partner, understanding cloud infrastructure, cloud economics and the "gotchas" of cloud.

Gotcha – the challenge of moving to the cloud – Jon Green – Ep33

Articles

Want to kick back and relax with a tech article, no fear, try these out;

Loudaboutcloud

A new website from Kirk Ryan, a cloud architect (and star of Tech Interviews Podcast Ep32), who has put together a list of great resources for those building cloud solutions.

This excellent post is one such example, as Kirk provides some great examples of how to control costs in the AWS cloud (there is also an equally good Azure version of the article on the site).

A great read and very insightful;

http://loudaboutcloud.com/wordpress/2017/06/26/cost-analysis-storage-costs-in-azure/

Microsoft – How artificial intelligence is changing our world

I'm a big fan of this topic, looking at how the way we use data is changing so many aspects of our lives; this article is actually the route into an eBook on the subject.

Give the article a read and if it sounds interesting go download the eBook that goes with it;

https://enterprise.microsoft.com/en-us/articles/roles/it-leader/empowering-people-how-artificial-intelligence-is-changing-our-world/

NetApp – Is your SaaS Covered?

An article from another previous Tech Interviews guest, Robert Cox (check him out talking object storage). If you can see past it being a "NetApp sales pitch", Robert asks a very important question: "who is responsible for protecting your SaaS data?" If you're not sure, or don't know, this is a good read.

https://newsroom.netapp.com/blogs/is-your-software-as-a-service-covered/

Hopefully all of that will keep you technically entertained… enjoy and have a great weekend.