Written by Peter Ruffley, CEO, Zizo

The news that the UK arm of Sungard Availability Services has gone into administration is an unfortunate wake-up call for businesses that rely too heavily on data centres or the cloud: they need to ensure their data is available across multiple platforms.

Businesses are starting to realise that 100% reliance on cloud infrastructure is not the best approach to storing data, particularly when dealing with information that is critical to the business. The industry has started to move towards on-premise solutions, specifically for mission-critical systems, and with tremendous improvements in the processing power and capability of on-premise and edge devices, having a balanced portfolio of data storage is key.

Firstly, it is vital that businesses ensure they have good data practice. The fundamental principles of this are understanding the data and knowing where it is. For data success, organisations need a firm grip on both, moving away from 'shadow IT', where the location of the data is unknown.

One of the benefits of the cloud is the low-cost storage of masses of data, and the relatively secure environment it offers. But just because you can store masses of data doesn't mean you should. With the advances in edge computing, organisations can aggregate data so that they only store what they need, precisely where it needs to be. Reliance on the cloud can therefore be reduced, and control handed back to the business to deliver the data it needs.
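As a minimal sketch of that idea (the field names, timestamps and hourly aggregation policy below are purely illustrative), an edge device might reduce a stream of raw sensor readings to compact hourly summaries and forward only those to the cloud, rather than shipping every reading:

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hypothetical raw readings collected on an edge device:
# (timestamp, sensor_id, value)
raw_readings = [
    (datetime(2022, 4, 1, 9, 5), "temp-01", 21.4),
    (datetime(2022, 4, 1, 9, 35), "temp-01", 22.1),
    (datetime(2022, 4, 1, 10, 10), "temp-01", 22.8),
]

def aggregate_hourly(readings):
    """Reduce raw readings to one summary row per sensor per hour."""
    buckets = defaultdict(list)
    for ts, sensor_id, value in readings:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[(sensor_id, hour)].append(value)
    return [
        {"sensor": sensor, "hour": hour.isoformat(),
         "avg": round(mean(values), 2), "samples": len(values)}
        for (sensor, hour), values in sorted(buckets.items())
    ]

# Only these compact summaries need to leave the edge for cloud storage.
for row in aggregate_hourly(raw_readings):
    print(row)
```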

However, it doesn't have to be a one-size-fits-all solution. It's fine to keep backups and non-mission-critical systems at the edge or in a data centre, for example. But what happens if there is a comms outage? Or what happens if, as with Sungard, your data centre becomes compromised, whether through a security breach or a commercial issue? Reliance on a single outsourcing model is not the best approach to take, especially when businesses across the globe have witnessed two years of uncertainty and are still unsure of what is around the corner.

Businesses of all sizes need to be more cautious and undertake careful planning to ensure that data doesn't reside in just one place, but across multiple locations. Many may think this will complicate matters, but that doesn't have to be the case. Instead, it is crucial that businesses, regardless of where their data is located, can access and analyse information without needing to move it.

There are opportunities to manage data in a much simpler, better way. One example is data federation, a software approach that allows multiple databases to function as one. Just because data is stored in different places doesn't mean it can't be both accessible and queryable. Traditionally, data was categorised as 'primary', 'secondary' and 'tertiary': primary data was accessible straight away; secondary data was still accessible, but with limitations in place; and tertiary data involved multiple hurdles to gain access.
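As a minimal sketch of the federation idea (using SQLite purely for illustration, with hypothetical file and table names), two physically separate databases can be attached to one connection and queried together as if they were a single database, without moving the underlying data:

```python
import sqlite3

# Two physically separate databases, e.g. one on-premise and one replicated
# from a cloud store. The file and table names here are purely illustrative.
onprem = sqlite3.connect("onprem_orders.db")
onprem.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, customer_id INTEGER, total REAL)")
onprem.execute("INSERT INTO orders VALUES (1, 100, 250.0), (2, 101, 99.5)")
onprem.commit()
onprem.close()

cloud = sqlite3.connect("cloud_customers.db")
cloud.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, name TEXT)")
cloud.execute("INSERT INTO customers VALUES (100, 'Acme Ltd'), (101, 'Globex')")
cloud.commit()
cloud.close()

# Federated query: attach both databases and join across them in one statement.
conn = sqlite3.connect("onprem_orders.db")
conn.execute("ATTACH DATABASE 'cloud_customers.db' AS cloud")
rows = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM orders AS o
    JOIN cloud.customers AS c ON c.id = o.customer_id
    GROUP BY c.name
""").fetchall()
print(rows)  # e.g. [('Acme Ltd', 250.0), ('Globex', 99.5)]
conn.close()
```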

This tiering process needs to happen again, and quickly. The best approach businesses can take, particularly following the Sungard news, is to keep the data that is most important to them close to them. In turn, 'less important' data can be stored in the cloud or at the edge as part of a hybrid approach, where the data is still available and can be queried if necessary.
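To make the tiering idea concrete, here is a minimal sketch in which a catalogue of datasets is routed across on-premise, edge and cloud locations; every dataset name, criterion and storage target below is hypothetical:

```python
# Hypothetical catalogue of datasets and how critical each is to the business.
datasets = [
    {"name": "customer_transactions", "mission_critical": True,  "access": "daily"},
    {"name": "web_analytics",         "mission_critical": False, "access": "weekly"},
    {"name": "sensor_history_2019",   "mission_critical": False, "access": "rare"},
]

def assign_tier(ds):
    """Map a dataset to a storage tier; the criteria are illustrative only."""
    if ds["mission_critical"]:
        return "primary (on-premise)"      # keep close, immediately accessible
    if ds["access"] in ("daily", "weekly"):
        return "secondary (edge/hybrid)"   # still accessible, with some limitations
    return "tertiary (cloud archive)"      # cheap to hold, slower to retrieve

for ds in datasets:
    print(f"{ds['name']}: {assign_tier(ds)}")
```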