The future is happening right now. In-memory computing, ever-increasing processing power, stronger networks, real-time processing, data lakes, and more powerful, user-friendly data discovery, data visualisation and predictive tools are opening up new opportunities to get value out of data.
Some of the trends in this new world that we see at McCoy are:
For a lot of companies, analytics still consists of reporting on the past and presenting the current status of KPIs against targets. This is not very surprising, as 80% of the information users in a company need operational data (e.g. the current status of inventory), while the remaining 20% of information users have the opportunity to make predictions and to take fact-based decisions.
Predictions are not limited to projections based on trend analysis; they are also about foreseeing the behaviour of your stakeholders, such as customers and employees. The predictive analytics tools currently on the market include statistical applications and algorithms that support business processes and decision making. An example of such a business process is maintenance. By collecting data from sensors that register the noise of machines, patterns can be identified. By monitoring the outliers, maintenance can be performed preventively, avoiding downtime due to machine failure.
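The outlier monitoring described above can be sketched very simply. The following is an illustrative example only, not a production technique: readings that deviate more than a few standard deviations from a calibration baseline are flagged. All names, noise values and the threshold are invented for the example.

```python
from statistics import mean, stdev

def find_outliers(readings, baseline, threshold=3.0):
    """Return indices of readings far outside the baseline noise profile."""
    mu = mean(baseline)       # typical noise level of a healthy machine
    sigma = stdev(baseline)   # normal variation around that level
    return [i for i, r in enumerate(readings)
            if abs(r - mu) > threshold * sigma]

# Hypothetical data: a healthy machine hums around 70 dB; a bearing
# starting to fail produces a spike that the simple z-score test flags.
baseline = [69.8, 70.1, 70.3, 69.9, 70.0, 70.2, 69.7, 70.1]
readings = [70.0, 70.2, 84.5, 70.1]
print(find_outliers(readings, baseline))  # flags the spike at index 2
```

In practice a maintenance team would feed such flags into a work-order process rather than act on a single spike, but the principle is the same: maintain on evidence, not on a fixed calendar.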
Predicting behaviour and taking fact-based decisions requires more data than what resides in a company’s transactional systems. According to Merrill Lynch, 85% of the data within a company is unstructured. With the latest technologies it is possible to retrieve relevant information from such unstructured or semi-structured data: e-mails, memos, reports, white papers and social media. In most companies this data is currently not available for analysis. Applying the proper algorithms to this untapped data about your employees or customers could, for example, help detect employee dissatisfaction before it is too late, or help increase turnover and gain market share.
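As a deliberately naive stand-in for the text-mining algorithms referred to above, the sketch below scores free-text notes against a list of dissatisfaction signal words. Real tooling would use proper natural language processing; the signal words and sample texts here are invented purely for illustration.

```python
# Hypothetical signal words; a real model would be trained, not hand-picked.
SIGNALS = {"frustrated", "overworked", "quit", "unhappy", "ignored"}

def dissatisfaction_score(text):
    """Fraction of words in the text that match a dissatisfaction signal."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in SIGNALS)
    return hits / len(words)

notes = [
    "Great sprint, team is energised and on track.",
    "I feel ignored and overworked, thinking about whether to quit.",
]
scores = [dissatisfaction_score(n) for n in notes]
print(scores)  # the second note scores markedly higher than the first
```

Even this toy version shows the point of the paragraph: the signal is already sitting in e-mails and memos; it only becomes actionable once an algorithm reads it.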
Performance management tools can become more efficient and powerful with the technologies now available. Whereas data used to be aggregated in data warehouses for performance reasons, in-memory computing allows data to be kept at a far greater level of detail. This level of detail can make Activity Based Costing (ABC) more efficient to use. ABC gives insight into customer and product profitability and enables better decision making, but ABC models were complex and required a lot of calculations, which made them inflexible in the face of changing situations. In-memory computing provides the opportunity to build more flexible ABC models. Another performance management tool is planning. Although the planning process has improved significantly in recent years, with in-memory computing, statistical analysis and algorithms it can change from yearly cycles to continuous rolling forecasts, whereby a company focuses on reporting on exceptions or outliers to adjust the planning.
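The core of an ABC model is a straightforward calculation: each activity's cost pool is divided by its driver volume to get a rate, and products are charged by how much of each driver they consume. The sketch below shows that mechanic with invented activities, cost pools and volumes; a real model has many more activities, which is exactly why detail-level in-memory data makes it more tractable.

```python
def abc_allocate(cost_pools, driver_totals, product_usage):
    """Allocate each activity's cost pool to products by driver consumption."""
    # Cost per unit of driver, e.g. cost per machine setup.
    rates = {a: cost_pools[a] / driver_totals[a] for a in cost_pools}
    return {
        product: sum(rates[a] * usage.get(a, 0) for a in rates)
        for product, usage in product_usage.items()
    }

# Hypothetical figures for two activities and two products.
cost_pools = {"machine_setups": 40_000, "inspections": 10_000}  # overhead per activity
driver_totals = {"machine_setups": 200, "inspections": 500}     # total driver volume
product_usage = {
    "product_a": {"machine_setups": 150, "inspections": 100},
    "product_b": {"machine_setups": 50, "inspections": 400},
}
costs = abc_allocate(cost_pools, driver_totals, product_usage)
print(costs)  # product_a absorbs most overhead via its many setups
```

Because the allocation is recomputed from raw driver data, changing an activity, rate or product mix means re-running the calculation rather than rebuilding a pre-aggregated model, which is the flexibility the paragraph describes.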
New offerings and increased service levels bring far more opportunities than analytics alone, but a last opportunity to mention here is reducing the number of reports that exist in a company. Twenty years ago, consultants advocated that business intelligence would reduce the number of reports, generating savings on resource time and on the cost of paper. The reports got digitised, saving paper and time, but the actual number of reports did not decrease. Companies still have a lot of reports that are variations on the same theme. Some of the causes are listed below.
Governance rules resulting in multiple copies of the same report (per level, region, product line…)
Versions with the same layout but differing levels of detail (aggregated vs detailed)
Rigid layouts, resulting for example in one report that compares months, another that compares weeks, another quarters, etc.
Lack of flexibility that impedes ad hoc analysis, resulting in n-versions of the same report.
The new data visualisation tools on the market offer a high degree of self-service: they allow users to tell a story with their data, to have the latest data available in a meeting, and to answer ad hoc questions on the spot.
At McCoy we also believe that these opportunities come with challenges. They are not necessarily new, and they are mainly organisational and people-related.
From a people perspective, getting the most value out of an organisation’s data requires that, besides statistical knowledge, some users know which algorithms to use. This requires the company to take on staff with a data scientist profile or to hire that knowledge in the form of a contractor.
On the organisational side, the challenge remains to establish a productive collaboration between business and IT. With the capabilities of the new reporting tools on the market, the role of making reports (the front-end part) can shift from IT to the business.
When everyone starts making reports, the challenge is to ensure that there is a single version of the truth. Here we see the main role for IT: providing back-end systems with verified data on which the correct level of security is applied. To arrive at that single version of the truth, it is important that an organisation understands which data is important for its business: what needs to be analysed, measured and reported on. The business knowledge, analytics and data storage skills required to do so can be combined in an analytics competence center. This competence center would then be responsible for the governance of the single version of the truth.
Olivier Hermans, McCoy Belgium
Ludo Bourguignon, McCoy Belgium