Master Data Management: The Precursor to Big Data Analytics

There are many articles describing the benefits of Big Data Analytics: how it can highlight areas of inefficiency within an organisation, reveal trends and ultimately improve market positioning. A common misconception, however, is that analytics software will make sense of all of your data on its own and produce a glossy report telling you exactly what needs to be done to better your current situation.

In reality, the old adage of ‘Garbage In, Garbage Out’ has never been more applicable, and many organisations are finding that they must first implement a Master Data Management (MDM) strategy to cleanse their data before they can reap the benefits of Big Data Analytics.

Master Data Management applies to many sourcing categories. Take the purchase of spare parts for Maintenance, Repair and Operations (MRO) in manufacturing. Every factory in a global organisation requires certain spare parts to keep its production lines running at optimal capacity all year round. Breakdowns or scheduled maintenance will stop production, and if the parts are not available on site, the cost of the downtime can be compounded by delays, rogue spending by engineers and duplicate parts in the system.

When Elemica engage with a customer on an MDM project, we focus on two key areas:

  1. Historic Master Data Quality Improvement
  2. Future Purchase and Master Data Governance

Historic Master Data Quality Improvement

Mergers and acquisitions are responsible for many of the quality issues seen in MRO data management. Factories in different countries not only record descriptions in different languages but also have their own conventions for recording information about a given part. Each factory has its own system or data repository, and may be storing the same items as another factory under different part numbers.

This complexity, combined with a general lack of attention over many years, can result in a data set of very poor quality. It is important to collate the data and understand its quality and complexity in order to cleanse, consolidate, de-duplicate and formalise it to a standard that can be read and understood by end users in every location worldwide.

Elemica recommend customers appoint a team of internal specialists to define the scope of items to be cleansed, assign rules for which classification applies to each family of parts, agree abbreviations (e.g. Bearing to BRG) and define the attributes or specification detail to be associated with each part (dimensions, weight, supplier name, colour etc.). Software solutions can assist with the classification and de-duplication work, reviewing and classifying tens of thousands of items in a few minutes where a human would need many weeks. Elemica can support an organisation in preparing the data for upload so that the MDM system delivers the best possible result.
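To make the cleansing steps above concrete, here is a minimal sketch in Python of how agreed abbreviation rules and normalised descriptions can surface duplicate part numbers. The abbreviation table, part numbers and descriptions are invented for illustration; this is not Elemica's actual tooling.

```python
import re

# Hypothetical abbreviation rules agreed by the internal specialist team
ABBREVIATIONS = {"BEARING": "BRG", "VALVE": "VLV", "GASKET": "GSK"}

def normalise(description: str) -> str:
    """Upper-case, strip punctuation, abbreviate, and sort tokens so that
    'Bearing, ball 6204' and 'BALL BRG 6204' compare equal."""
    tokens = re.sub(r"[^\w\s]", " ", description.upper()).split()
    return " ".join(sorted(ABBREVIATIONS.get(t, t) for t in tokens))

def find_duplicates(catalogue: dict[str, str]) -> dict[str, list[str]]:
    """Group part numbers whose normalised descriptions collide."""
    groups: dict[str, list[str]] = {}
    for part_no, description in catalogue.items():
        groups.setdefault(normalise(description), []).append(part_no)
    return {key: nos for key, nos in groups.items() if len(nos) > 1}

catalogue = {
    "DE-10041": "Bearing, ball 6204",
    "UK-77310": "BALL BRG 6204",   # same item listed by another factory
    "DE-20077": "Valve, gate 2in",
}
duplicates = find_duplicates(catalogue)
# duplicates → {"6204 BALL BRG": ["DE-10041", "UK-77310"]}
```

In practice the matching is far fuzzier than this (multiple languages, typos, missing attributes), which is why dedicated classification software and human review of borderline matches are still needed.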

Future Purchase and Master Data Governance

Cleansing the historic data is important, but ensuring ongoing compliance in purchasing and procedure is paramount to prevent new problems, and should be implemented immediately.

One key cause of poor data in ERP systems is end users entering data through free-text fields. Abbreviated or cryptic descriptions in multiple languages lead to confusion and to duplication of parts that already exist. Some MDM software providers supply add-ons for most ERP systems that remove the free-text capability and require the user to select from drop-down lists that categorise the item at the point of input. This does not prevent every input error, but at least the descriptions are legible to everyone. Classification rules determine which family and sub-family the part belongs to, while also translating the description into multiple languages. Elemica already has a proven ERP add-on for those looking to control data input at the source.
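The idea of replacing free text with a controlled taxonomy can be sketched as follows. The taxonomy, field names and attribute values here are hypothetical, not those of any specific vendor add-on: the point is simply that a record outside the agreed classification is rejected at input time.

```python
# Hypothetical family/sub-family taxonomy agreed during the MDM project
TAXONOMY = {
    "MECHANICAL": {"BEARINGS", "SEALS", "PUMPS"},
    "ELECTRICAL": {"MOTORS", "SENSORS", "CABLES"},
}

def create_part(family: str, sub_family: str, attributes: dict) -> dict:
    """Accept a new part record only if it fits the agreed classification,
    mimicking a drop-down form that offers no free-text escape hatch."""
    if sub_family not in TAXONOMY.get(family, set()):
        raise ValueError(f"{family}/{sub_family} is not in the taxonomy")
    return {"family": family, "sub_family": sub_family, **attributes}

record = create_part("MECHANICAL", "BEARINGS",
                     {"bore_mm": 20, "supplier": "SKF"})
```

A real ERP add-on enforces this at the user-interface layer rather than in application code, but the governance principle is the same: invalid classifications cannot enter the system in the first place.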

Categorised parts make search functions far more effective and reduce the time an end user spends locating a part needed for a machine. This improved visibility of parts across factories worldwide can, in some cases, identify a part in stock at a neighbouring factory that can be transported rather than purchased locally. This is especially valuable for very expensive specialist spares, such as industrial pumps or laser cutters, whose failure can bring an entire production line to a standstill if they are not replaced quickly.
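A simple sketch of the cross-site lookup this enables: once every factory lists an item under one shared part number, a buyer can check global stock before raising an order with a local supplier. The site names and quantities below are invented.

```python
# Hypothetical consolidated stock view across factories
STOCK = [
    {"part": "PUMP-3301", "site": "Rotterdam", "qty": 0},
    {"part": "PUMP-3301", "site": "Antwerp",   "qty": 2},
    {"part": "BRG-6204",  "site": "Rotterdam", "qty": 14},
]

def sites_with_stock(part_no: str, stock: list[dict]) -> list[str]:
    """Return the sites holding at least one unit of the given part."""
    return [row["site"] for row in stock
            if row["part"] == part_no and row["qty"] > 0]

# sites_with_stock("PUMP-3301", STOCK) → ["Antwerp"]
```

Without the shared part number, the same pump listed as "PUMP-3301" in one plant and a cryptic free-text description in another would never match, and the lookup would come back empty.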

The pressure to prevent machine downtime can lead some users to order excess stock, which is often hidden and not recorded in inventory. Known as ‘inflated’ or ‘bloated’ inventory, these items are invisible to the ERP system and cause discrepancies in procurement spend analysis. Restricting the methods of purchase and providing a list of preferred suppliers are two ways to prevent this rogue spending.

Big Data Analytics to maximise Procurement Leverage

Creating an MDM strategy takes a great deal of planning, communication and time, and implementation is only the first step towards clean data. Ongoing governance will drive a cultural change within the organisation and keep the data from descending into disarray. Increased production-line uptime, reduced inventory and storage space, improved maintenance efficiency and better visibility of spend are just some of the key benefits of clean data.

Big Data Analytics is still in its infancy, and the sheer volume of poor data in some large manufacturers makes it hard for the technology to show its true potential. Procurement professionals draw on information from multiple sources (historic spend, demand forecasts, supply chain KPI audits) to create and implement strategies and negotiate better deals with their suppliers. This analysis is often done manually, takes considerable time to deliver, and frequently rests on assumptions because the underlying data is poor or incomplete. As data volumes grow exponentially, new forms of data emerge (customer trends, climate change reports, raw material availability) and complexity increases, future procurement strategies will be based on the results of Big Data Analytics that combines multiple sources of data into one dynamic overview.

Elemica has the knowledge, software and expertise to advise customers on implementing a cleansing and classification project in a selected category as their first step towards an efficient Big Data Analytics process and, ultimately, operational excellence.

If you would like to know more, please contact:

Russell Smith, Sourcing Manager, Elemica