How CIOs can gain data superpowers to manage data pipelines in the face of real-time decisions

Real-time decision support is a success factor for all businesses affected by the current global uncertainty and the rapid digital transformation of the market. This shifts the responsibilities of CIOs, who may need to develop data superpowers because existing models of data processes and pipelines may not deliver the required speed to insight.

We hosted a roundtable that brought together a group of cloud architects, IT and software engineers, and information security, intellectual property and process & innovation professionals to take a deeper look at:

  • How data management has changed in the last 5-10 years and how this has impacted the role of CIOs
  • What a typical end-to-end data analytics chain looks like today, and how it should look
  • The key reasons that make ‘self-service’ analytics look risky and seemingly at odds with data governance and integrity

Rela8 Group’s Technology Leaders Club roundtables are held under the Chatham House Rule. Names, organisations and some anecdotes have been withheld to protect privacy.

About Incorta

Incorta is helping world-leading brands to gain insights they previously thought impossible, with agility, simplicity and business results that are simply incredible. Their in-memory analytics and Direct Data Platform™ is a modern approach to data management, analytics and BI that sets them apart.

Gaining visibility of data

Volumes of data are rising, and the complexity of the CIO’s role across an organisation is increasing. CIOs need to respond to the speed at which business is running and gain visibility of data across the organisation, allowing people to make decisions quickly and supporting the business in its aim of maximising profit. So how can CIOs support the business in the decision-making process amid this complexity?

Changing data

Data can be incomplete, for example, purely financial rather than transactional, so the starting point for many organisations is to change that data in order to satisfy the needs of clients or customers. Existing GDPR consent is often recorded on hard-copy forms, so it has to be digitised to ensure compliance. Companies may also have to build real-time transactional ETL processes to replace existing programmatic ones, and build data warehouses.

Now, many companies find themselves in another period of change, with the implementation of CRM solutions and multiple ecosystems and touchpoints. The challenge is getting systems to talk to each other, giving customers the possibility to act on their data in real time so that it can be kept up to date.

Businesses face many challenges, especially when they are the result of mergers and acquisitions, with ecosystems that differ in maturity and architecture. Different back-end ERP systems can be difficult to migrate to a standard SAP system; extracting and mapping the right data takes time and slows down insights. There is transactional data, which needs the right data governance, and operational data, which needs architecture and infrastructure structured to support it. Unifying this data helps to break down those functional silos and leverage operational experience to give value to the whole company.

Data pipelines and security

As companies deal with increasing amounts of data from different systems, it is important to understand where the data came from and what it means. Data can also be in different formats, so bringing it all together and analysing it can be problematic. From a security perspective, it is not about the purchasing behaviour of a customer, but whether there is something worth paying attention to within this data flow.

With traditional ETL processes, the data is staged before being transformed, so it goes through multiple steps. From a trust perspective, companies need to understand how the data has been transformed. A two-layered architecture looks at the primary process, i.e. the business process or application. The secondary process is the collection of data about what is happening within that primary process, so it can be analysed. Raw data is collected, and transformations are applied only when that data needs to be used, allowing multiple security analytics solutions to consume the same data from the same pipeline. Post-facto analysis can also be applied, so organisations can learn today that something in the past may have been suspicious.
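The transform-on-read pattern described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual implementation; the class and field names are hypothetical. The key idea is that raw events are stored untouched, and each consumer applies its own transformation at read time, so a new rule can re-scan history.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class RawEvent:
    """Untransformed record collected from the primary (business) process."""
    source: str
    payload: dict
    collected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class Pipeline:
    """Secondary process: collect raw events once; consumers transform on read."""

    def __init__(self) -> None:
        self._events: list[RawEvent] = []

    def collect(self, source: str, payload: dict) -> None:
        self._events.append(RawEvent(source, payload))

    def read(self, transform: Callable[[RawEvent], dict]) -> list[dict]:
        # Transformations run at consumption time; the raw store is untouched,
        # so post-facto analysis can replay history with new rules.
        return [transform(e) for e in self._events]

pipeline = Pipeline()
pipeline.collect("orders_app", {"user": "u1", "amount": 250})
pipeline.collect("orders_app", {"user": "u2", "amount": 9_999})

# A security consumer applies its own rule, including to historical events.
flags = pipeline.read(lambda e: {"user": e.payload["user"],
                                 "suspicious": e.payload["amount"] > 5_000})
```

Because the same `read` call can take any transformation, several analytics solutions can share one pipeline without agreeing on a single transformed schema up front.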

As always, access and authorisation can prove a headache for security: what is in the data, why it is stored and who has access to it.

End-to-end data analytics chain

The starting point for any business is knowing what data they need to capture, why they need it and how they are going to use it to improve business performance and/or processes. The next step is data collection. When data is collected from different sources, it can be put into one data lake in a common format. The cleaning process should be automated, because manual processes slow everything down and introduce the human error factor. Once a business has the right data, it can be analysed and visualised. The data can then drive automated actions at an operational level, not just visualisation for decision-making, but also acting as an escalation matrix for real actions in operations. Introducing AI to automate some of those actions can transform the way companies use data.

Data analytics is about finding the right balance between keeping everything and only recording what is relevant or deleting data that is useless. When it comes to harmonisation, companies may try to mandate standards for data producers, but the reality is that different technologies use different languages, as do different locations. Companies need to assess what they are able to achieve, find where the gaps are, and from this determine what they need to change, then review on a regular basis because the business environment is constantly changing.

Enabling timely decisions

The problem with data management in data pipelines is universal, applying to every business, no matter the industry. There may not be one perfect solution. Businesses need to evaluate how they are performing on their metrics and see how they can improve by focusing on aspects related to data collection and analytics.

Different back-end and ERP systems resulting from a merger or acquisition, all with different architectures and at different stages of maturity, can prove especially difficult to migrate to one system. It takes time to extract and map the right data, which can slow down insights and visualisation, resulting in an inability to provide data for timely business decisions.

The main focus should be taking the right decision at the right time, and the data is there to help businesses take that decision to optimise the business, ensuring survival within the current trading environment as well as the future growth of the company.

If you want to get in touch, give us a shout.