Coming from the CAD industry, where failing to use the latest design technology guarantees you will fall far behind, I find the HR industry comparatively reluctant to adopt new technologies for managing its workforce.
In my experience meeting and talking with HR representatives, a recurring topic of serious discussion is how to manage dirty data – especially in older HR systems.
When you are about to replace an HR payroll system or move to a new talent management system, understanding what is clean and what is dirty in your old system has a real impact on the speed of the upgrade.
Usually a lot of effort goes into cleaning up data, such as old records, before migration.
In some cases up to 40% of a company's data can turn out to be inaccurate, meaning data cleansing is often a massive undertaking.
Moreover, dirty data can be costly: if your new system has a licensing model based on the number of records you import, bringing over old records could mean higher licensing costs.
Most HR leaders and line managers across your organisation will have a few key metrics or KPIs they treat as crucial measurements for managing their workforce. The quality of the underlying data is important to ensure the metrics HR use are accurate.
Furthermore, an HR manager interested in moving into more advanced strategic HR and workforce analytics needs to ensure metrics are accurate and up to date.
Recently, tools have become available that can surface compelling insights into your business using predictive analytics. Predictive analytics is already widely used in the consumer market in the form of predictive online advertising. However, these tools require accurate data to produce insightful results.
Data Accuracy and Broken Data
Accurate data will also reduce time spent on re-work.
For example, with payroll, incorrect payroll tax codes can produce a whole set of issues. A bad setup of codes can cause real problems when you generate payment summaries and need to reconcile them back to the right buckets of pay codes.
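As a minimal sketch of that reconciliation step, the check below sums payroll transactions into their expected payment-summary buckets and flags both unmapped pay codes and buckets that do not reconcile. The pay codes, bucket names and amounts here are purely illustrative, not taken from any particular payroll system.

```python
# Hypothetical mapping from pay code to the payment-summary bucket
# each code should roll into. All names and figures are illustrative.
PAY_CODE_BUCKETS = {
    "ORD": "gross",          # ordinary hours
    "OT1": "gross",          # overtime
    "ALW-CAR": "allowances",
    "TAX": "withheld",
}

transactions = [
    {"code": "ORD", "amount": 4200.00},
    {"code": "OT1", "amount": 350.00},
    {"code": "ALW-CAR", "amount": 120.00},
    {"code": "TAX", "amount": 980.00},
    {"code": "BONUS", "amount": 500.00},  # no bucket mapping: a dirty code
]

def reconcile(transactions, summary_totals):
    """Sum transactions per bucket; report unmapped codes and variances."""
    totals, unmapped = {}, []
    for txn in transactions:
        bucket = PAY_CODE_BUCKETS.get(txn["code"])
        if bucket is None:
            unmapped.append(txn["code"])
            continue
        totals[bucket] = totals.get(bucket, 0.0) + txn["amount"]
    # Any bucket whose computed total differs from the payment summary.
    variances = {
        b: totals.get(b, 0.0) - expected
        for b, expected in summary_totals.items()
        if abs(totals.get(b, 0.0) - expected) > 0.005
    }
    return totals, unmapped, variances

summary = {"gross": 4550.00, "allowances": 120.00, "withheld": 980.00}
totals, unmapped, variances = reconcile(transactions, summary)
print(unmapped)    # pay codes with no bucket mapping
print(variances)   # buckets that fail to reconcile
```

Even a simple pass like this, run before payment summaries are generated, surfaces the bad code setup while it is still cheap to fix.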
Data can also become broken when it is spread across multiple systems. This issue is especially problematic when payroll sits outside of HR, which is why an all-in-one system for payroll and HR makes source-of-truth management easier. Splitting information over multiple systems, or even spreadsheets, makes it harder to manage and to keep clean.
Missing information is also a common issue, especially if HR has not been collecting particular pieces of data. In these situations the data needs to be "retrospectively" obtained.
An example is occupancy information: employees hold temporary positions that have not been updated in the system, or have been updated but the employee was never put back in their old role. Keeping this information up to date can be a major challenge for some organisations.
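A pre-migration audit for gaps like these can be as simple as scanning each record for required fields that are empty. The field names and records below are illustrative only, a sketch of the idea rather than any specific HRIS schema.

```python
# Hypothetical required fields an organisation might insist on before
# migrating employee records. Names and data are illustrative.
REQUIRED_FIELDS = ["employee_id", "position", "start_date", "location"]

employees = [
    {"employee_id": "E001", "position": "Analyst",
     "start_date": "2019-03-01", "location": "Sydney"},
    {"employee_id": "E002", "position": None,
     "start_date": "2020-07-15", "location": "Melbourne"},  # acting role never recorded
    {"employee_id": "E003", "position": "Manager",
     "start_date": None, "location": ""},
]

def audit_missing(records, required):
    """Return {employee_id: [missing fields]} for records with gaps."""
    gaps = {}
    for rec in records:
        missing = [f for f in required if not rec.get(f)]
        if missing:
            gaps[rec.get("employee_id", "<unknown>")] = missing
    return gaps

print(audit_missing(employees, REQUIRED_FIELDS))
```

The resulting list of gaps becomes the work queue for retrospectively collecting the missing information before go-live.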
A lot of the common HR systems are code heavy, from lookup codes such as gender and location, through to payroll codes, leave accrual codes and GL codes – the list goes on!
Furthermore, there are the relationships between codes, or code rules, which add to the pile of potential errors.
Naming conventions used in certain codes can become broken or antiquated over time. As new codes are created, old codes are decommissioned but are not end-dated, which creates real issues in the management of codes. And as new HRIS administrators join your organisation, they are often unaware of the historic decisions or legacy configurations put in place to manage the system, further adding to the list of issues plaguing code management.
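One easy win is a periodic check for exactly the problem described above: codes that have been decommissioned but never end-dated. This sketch assumes a simple code table with `active` and `end_date` fields; the code names are invented for illustration.

```python
# Hypothetical lookup-code table: inactive codes should always carry an
# end date. Code names, flags and dates are illustrative only.
from datetime import date

codes = [
    {"code": "LOC-SYD", "active": True,  "end_date": None},
    {"code": "LOC-SYD2", "active": False, "end_date": date(2018, 6, 30)},
    {"code": "LVE-OLD", "active": False, "end_date": None},  # never end-dated
]

def codes_missing_end_date(codes):
    """Inactive codes should be end-dated; return those that are not."""
    return [c["code"] for c in codes
            if not c["active"] and c["end_date"] is None]

print(codes_missing_end_date(codes))
```

Running a report like this regularly, and keeping a short written log of why each code was retired, gives new HRIS administrators the context the original decision-makers took with them.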
Put the data in clear view and make it everyone’s responsibility to incrementally improve it.
There are many elements of your workforce information that your entire organisation can assist with validating and correcting. This is where a self-service application or an Org Chart is immensely beneficial.
Finally, allowing employees to send proposed changes back to HR, who can then validate them and update the SOT (source of truth) if required, ensures a constant feedback loop that improves the integrity of your data. Many eyes will always be better than a few!
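The feedback loop above can be sketched as a small routine: proposed changes queue up, an HR validation step accepts or rejects each one, and only approved changes touch the source of truth. Everything here – field names, the sample records, the stand-in `hr_review` check – is hypothetical, meant only to show the shape of the loop.

```python
# A minimal sketch of the employee-proposal feedback loop. All names,
# records and validation rules are illustrative assumptions.
source_of_truth = {
    "E001": {"phone": "0400 000 000", "location": "Sydney"},
}

proposed_changes = [
    {"employee_id": "E001", "field": "phone", "value": "0400 111 222"},
    {"employee_id": "E001", "field": "location", "value": "Mars"},  # implausible
]

def process_proposals(sot, proposals, approve):
    """Apply each proposed change only if the approve() check accepts it."""
    applied, rejected = [], []
    for p in proposals:
        if p["employee_id"] in sot and approve(p):
            sot[p["employee_id"]][p["field"]] = p["value"]
            applied.append(p)
        else:
            rejected.append(p)
    return applied, rejected

def hr_review(change):
    """Stand-in for an HR reviewer: reject locations outside a known list."""
    if change["field"] == "location":
        return change["value"] in {"Sydney", "Melbourne", "Brisbane"}
    return True

applied, rejected = process_proposals(
    source_of_truth, proposed_changes, hr_review)
print(source_of_truth["E001"]["phone"])  # the validated change was applied
print(len(rejected))                     # the implausible one was held back
```

In a real system the `approve` step would be a human review queue rather than a function, but the principle is the same: every change is validated before it reaches the source of truth.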