At a time when finance departments must meet ever-increasing demands, an ERP system that offers both technological and financial expertise is essential. Microsoft Dynamics 365 Business Central offers exactly this solution – and Finclair is your trusted partner for a successful implementation.
Data relevant to the Finance function can be found in many places around the organisation and it goes by many names – chart of accounts, product code, tax number, bank account number, billing codes and many more. Finance relies on this data and its accuracy to perform tasks such as internal and external reporting, financial planning, or billing – efficiently and with high quality.
At our Finclair clients we often observe that this is extremely challenging because the data required for these tasks is stuck in data silos created outside of Finance. Data is stored in multiple locations, independent from other systems, without integrations to exchange data, and changes made in one location can take days to be reflected in other systems – if they are reflected at all. We also observe that Finance data is often created, updated and even deleted outside of Finance and without its involvement. On top of that, employees creating or updating the master data lack awareness of how incorrect data affects Finance processes downstream. Finance, in turn, lacks the ability to read or maintain the required data, often due to limited access rights or the inability to interpret the data appropriately because data definitions are missing.
This lack of Finance data management or ownership over the required data leads to a multitude of problems. When we conduct process analyses, we see that Finance employees rely heavily on emails, phone calls, and spreadsheets to facilitate data gathering and data cleansing. This observation is shared by PwC, which has found that even in top-quartile companies, business SMEs spend 40 percent of their time gathering data instead of working with it.
Another problem resulting from this scattered data landscape is inaccurate master data. At one of our client engagements, we observed an invoice accuracy rate of 80 percent – meaning that 1 out of 5 automatically created invoices was incorrect and had to be re-worked due to incorrect customer addresses, tax IDs and other master data inaccuracies. Another typical result is the rejection of payments by banks due to incorrect bank account numbers or other receiver information. Ultimately, all master data inaccuracies have a negative impact on the customer and supplier experience and will be reflected in the company’s Net Promoter Score or CSAT ratings.
To sum it up – disconnected, inaccurate and unreliable data makes it extremely challenging for organisations to operate efficiently and drive informed business decisions. This is particularly true for clients that have experienced rapid growth, expanded to multiple markets globally, or have gone through mergers or divestments.
Finance organisations that face these challenges need to take an active role in managing the relevant data.
At Finclair, we have defined three data principles that enable a successful Finance Organisation:
Finance Data needs to be:
- Accurate
- One Source of Truth
- Available
Our Finclair approach to making these Finance Data Principles a reality is twofold: establish measures that KEEP THE DATA CLEAN and implement continuous processes that GET THE DATA CLEAN. In order to KEEP THE DATA CLEAN, Finance organisations need to establish a robust foundation that clearly defines Standards, Organisational Model, Processes, Quality Metrics and Technology. Establishing this foundation is a cross-functional exercise that needs to be repeated as the organisation matures and requires C-level sponsorship. The other phase, GETTING THE DATA CLEAN, is the operational side of the solution that is executed continuously to ensure high data quality.
The Finance function needs to clearly define which data objects (Customer, Partner, Supplier, etc.) and their related attributes (email, tax ID, bank account number, etc.) are required to perform its tasks. This definition needs to include business requirements (e.g. the number of tax ID digits for a specific country) as well as technical requirements (system-specific requirements, such as limited field length). Once there is clarity on these data objects and attribute requirements, it is important to define the minimum required data attributes per data object from the Finance perspective. At Finclair we refer to this as the Golden Record definition, which serves as a starting point to define all other elements of the Finance data organisation and supports Finance in clearly articulating its needs to other LOBs.
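To illustrate, the sketch below shows one possible way to encode such a Golden Record definition in code. It is a minimal, hypothetical example – the object, attribute and rule names are our own illustration, not a prescribed Finclair or Business Central artefact – in which each attribute carries both a business rule (e.g. the expected tax ID format for Germany) and a technical constraint (e.g. a maximum field length).

```python
import re
from dataclasses import dataclass


@dataclass
class AttributeRule:
    """One Golden Record attribute with its business and technical requirements."""
    name: str
    required: bool = True
    max_length: int | None = None   # technical requirement, e.g. field length in the target system
    pattern: str | None = None      # business requirement, e.g. country-specific format

    def validate(self, value: str | None) -> list[str]:
        errors = []
        if value is None or value == "":
            if self.required:
                errors.append(f"{self.name}: missing mandatory value")
            return errors
        if self.max_length is not None and len(value) > self.max_length:
            errors.append(f"{self.name}: exceeds max length of {self.max_length}")
        if self.pattern is not None and not re.fullmatch(self.pattern, value):
            errors.append(f"{self.name}: does not match expected format")
        return errors


# Hypothetical Golden Record for the 'Customer' data object
CUSTOMER_GOLDEN_RECORD = [
    AttributeRule("name", max_length=100),
    AttributeRule("email", pattern=r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    AttributeRule("tax_id_de", pattern=r"DE\d{9}"),   # German VAT ID: 'DE' followed by 9 digits
    AttributeRule("iban", max_length=34, pattern=r"[A-Z]{2}\d{2}[A-Z0-9]{11,30}"),
]


def validate_record(record: dict[str, str]) -> list[str]:
    """Check one customer record against the Golden Record definition."""
    errors = []
    for rule in CUSTOMER_GOLDEN_RECORD:
        errors.extend(rule.validate(record.get(rule.name)))
    return errors


if __name__ == "__main__":
    print(validate_record({"name": "Acme GmbH", "email": "billing@acme.example",
                           "tax_id_de": "DE123456789", "iban": "DE89370400440532013000"}))
```

A definition in this form can be shared with other LOBs as a precise statement of what Finance needs, and reused later for quality metrics and cleansing.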
We know Finance data often originates in and is owned by other functions, which requires strong cross-functional alignment on the Organisational Model. It is important to choose and implement the right one, depending on the organisational structure and culture. The structure can be purely functional, federated, or hybrid, and this might change over time as the organisation becomes more mature. The focus should be on the respective roles and responsibilities, the associated CRUD (create, read, update, delete) matrix and the definition of local vs. regional vs. global ownership for international organisations. This organisational model needs to be defined in collaboration with all affected functions, and a Data Board overseeing the governance should be established to regularly address data-related questions.
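As an illustration, a CRUD matrix does not need sophisticated tooling to start with; the sketch below (with purely hypothetical roles and data objects) shows how create, read, update and delete rights per function and data object could be recorded and queried.

```python
# Hypothetical CRUD matrix: which function may Create, Read, Update or Delete
# which Golden Record data object. Roles and objects are illustrative only.
CRUD_MATRIX = {
    ("Sales",       "Customer"):    "CRU",   # Sales creates and maintains customers
    ("Finance",     "Customer"):    "RU",    # Finance may read and correct Finance-relevant attributes
    ("Finance",     "BankAccount"): "CRUD",
    ("Procurement", "Supplier"):    "CRU",
    ("Finance",     "Supplier"):    "R",
}


def is_allowed(function: str, data_object: str, action: str) -> bool:
    """Check whether a function may perform an action (C, R, U or D) on a data object."""
    return action.upper() in CRUD_MATRIX.get((function, data_object), "")


print(is_allowed("Finance", "Customer", "D"))   # False: deletion stays with the owning function
print(is_allowed("Sales", "Customer", "C"))     # True
```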
To ensure that Finance data is in accordance with the Data Principles (Accurate, One Source of Truth, Available), we advise reviewing existing data workflows and Standard Operating Procedures (SOPs) across the organisation and redesigning them in accordance with the newly defined Golden Record data standards. For areas without SOPs in place, these need to be created and standardised across functions. In addition, the reporting structure on data quality needs to be defined and communicated. Once all processes are redesigned and standardised, it is crucial to facilitate process training for affected users and continuously communicate data standards, processes and best practices.
To measure Finance data quality and take corrective measures in case of issues, Data Quality Metrics and the respective quality thresholds need to be defined. These metrics should be on a data attribute level (bank account number, tax ID, etc.) and adhere to the specific requirements (business or technical) that have been defined as part of the Finance Golden Record. Based on the quality thresholds, cross-functional SLAs with the data-owning function need to be established, and monitoring systems should be specified as part of a comprehensive data controlling concept.
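The sketch below shows what such an attribute-level metric could look like in its simplest form, assuming the tax ID rule from the Golden Record sketch above and a hypothetical 98 percent SLA threshold; both the field name and the threshold are illustrative.

```python
import re

# Business rule from the (hypothetical) Golden Record and an assumed SLA threshold
TAX_ID_PATTERN = re.compile(r"DE\d{9}")
QUALITY_THRESHOLD = 0.98   # hypothetical SLA: 98% of tax IDs must be valid


def tax_id_quality(records: list[dict]) -> float:
    """Share of records whose tax ID matches the Golden Record format."""
    if not records:
        return 1.0
    valid = sum(1 for r in records if TAX_ID_PATTERN.fullmatch(r.get("tax_id_de", "") or ""))
    return valid / len(records)


records = [{"tax_id_de": "DE123456789"}, {"tax_id_de": "12345"}, {"tax_id_de": "DE987654321"}]
score = tax_id_quality(records)
print(f"Tax ID quality: {score:.0%}, SLA met: {score >= QUALITY_THRESHOLD}")
```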
Understanding which systems are used to create and maintain the Finance data records that are part of the Golden Record is crucial. A data system and distribution concept helps our clients capture where data attributes are created, updated and stored, and how and where this data is distributed. Such a concept serves as a basis for identifying additional Finance data automation and for improving system integrations. As part of the concept, source system data validations can also be defined to increase the overall quality of data attributes.
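A distribution concept can start as a simple mapping from each Golden Record attribute to its source system and its consuming systems. The sketch below uses hypothetical system names purely to show the shape of such a mapping.

```python
# Hypothetical data distribution concept: for each Golden Record attribute,
# where it is created (source of truth) and which systems consume it.
DISTRIBUTION_CONCEPT = {
    "customer.email":     {"source": "CRM", "consumers": ["ERP", "Billing"]},
    "customer.tax_id_de": {"source": "ERP", "consumers": ["Billing", "Tax engine"]},
    "customer.iban":      {"source": "ERP", "consumers": ["Payment platform"]},
}


def systems_affected_by(attribute: str) -> list[str]:
    """List every system that must be updated when an attribute changes at its source."""
    entry = DISTRIBUTION_CONCEPT.get(attribute)
    return [] if entry is None else [entry["source"], *entry["consumers"]]


print(systems_affected_by("customer.tax_id_de"))   # ['ERP', 'Billing', 'Tax engine']
```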
Getting the data clean consists of activities that need to be executed continuously.
When data is inaccessible in systems and can only be retrieved via email requests, or when new data silos are added to an organisation due to acquisitions or mergers, data migration might be required. Of course, this activity should not be repeated continuously but rather replaced by a data integration. As long as that is not available, data migration from the source system to the system of truth might be required.
Implementing Quality Reporting is crucial to identify data issues at the source systems and to be able to take action when required. As many data source systems are owned by functions outside of Finance, this reporting is mostly implemented by those functions, based on the Finance Golden Record. Scheduled reports inform Finance and non-Finance users about the status quo of the Finance data and, in some organisations, are the foundation for cross-functional data quality SLAs.
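In practice such reports are produced by the reporting tooling of the data-owning function; the sketch below is only a minimal illustration of how metric scores could be compared against thresholds and summarised on a schedule, with assumed attribute names and threshold values.

```python
from datetime import date


def build_quality_report(metrics: dict[str, float], thresholds: dict[str, float]) -> str:
    """Assemble a plain-text quality summary that could be emailed or posted on a schedule."""
    lines = [f"Finance data quality report – {date.today().isoformat()}"]
    for attribute, score in sorted(metrics.items()):
        target = thresholds.get(attribute, 1.0)
        status = "OK" if score >= target else "BREACH"
        lines.append(f"  {attribute:<22} {score:6.1%}   threshold {target:.0%}   {status}")
    return "\n".join(lines)


print(build_quality_report(
    metrics={"customer.tax_id_de": 0.95, "customer.iban": 0.99},
    thresholds={"customer.tax_id_de": 0.98, "customer.iban": 0.98},
))
```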
Data cleansing requires the ability to identify incorrect data based on the Finance Golden Record. Once Data Quality Reporting is available, inaccurate data can be identified much more easily and fixed much more quickly. Typical cleansing activities include correcting spelling mistakes, completing incomplete data and removing duplicates.
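As a simple illustration of the duplicate and completeness part of such a cleansing pass, the sketch below flags records missing mandatory attributes and removes duplicates based on a normalised key; the field names and the choice of key (tax ID plus lower-cased name) are assumptions for this example.

```python
def cleanse(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Return (deduplicated records, records flagged as incomplete)."""
    seen: set[tuple[str, str]] = set()
    cleaned, incomplete = [], []
    for r in records:
        if not r.get("tax_id_de") or not r.get("name"):
            incomplete.append(r)            # mandatory Golden Record attribute missing
            continue
        key = (r["tax_id_de"].strip().upper(), r["name"].strip().lower())
        if key in seen:
            continue                        # duplicate of an already accepted record
        seen.add(key)
        cleaned.append(r)
    return cleaned, incomplete


records = [
    {"name": "Acme GmbH", "tax_id_de": "DE123456789"},
    {"name": "acme gmbh ", "tax_id_de": "de123456789"},   # duplicate after normalisation
    {"name": "Beta AG", "tax_id_de": ""},                  # incomplete
]
print(cleanse(records))
```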
Once data is cleansed, it can be distributed to the consuming systems. Ideally, this is also workflow-automated and supported by system integrations.