At the Diligent Modern Governance Summit 2020, held in September, governance experts and practitioners came together to share experiences and insights on a range of topics.
One of the presentations explored entity data integrity and management. Here, we summarise the key takeaways for professionals involved in corporate governance.
The Data Integrity Challenge
Opening the session, Diligent’s Director of Data Integrity, Cathy Cartieri, shared her ‘top-of-mind’ issue:
“What keeps me awake at night is the flow of bad data, and not knowing where it goes across the firm.”
Particularly where your remit spans more than one legal entity, this is likely to be a familiar worry. In the session, Jan van der Ham of Aegon and Robert Moorhead of AIG joined Cartieri to discuss the challenges of data integrity across multiple entities.
The Importance of Data Integrity
So, why is data integrity so essential? Firstly, because data itself is a moving target. Cartieri started by sharing some concerning statistics:
- Data decays at a rate of 25% per year
- 23% of spreadsheets contain errors
Data, in other words, is not inherently reliable – and a lack of trust in data is both a major concern and a major cost for organisations. It is not surprising, then, that data integrity – a core component of robust corporate governance – is a constant priority.
Moorhead then walked delegates through the Six Core Dimensions of Data Integrity – the pillars that should underpin any best practice entity data-gathering system:
- Completeness
- Conformity
- Consistency
- Accuracy
- Uniqueness
- Integrity
Completeness: Data should be complete but include nothing unnecessary – for example, the zip code or postcode field should ONLY contain the zip or postcode.
Conformity: Does the data within a field conform to the correct format? A phone number would be a good example here.
Consistency: Using pre-defined business logic to ensure data is consistent. Moorhead gave the example of a cost center description, which should be consistent across your organisation.
Accuracy: Does, for example, a legal entity name exactly match the way it’s written in incorporation minute book records, share certificates and so on?
Uniqueness: However many places data is held, there should be a single, authoritative representation of each entity across the system.
Integrity: Avoiding the duplication that results from missing linkages between data in the system. Moorhead gave the all-too-common example of multiple entities sitting in the system under slight variations of the same name.
Considering these core dimensions of data quality when designing an entity management approach will increase your confidence that data in the system can be relied upon.
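To make these dimensions more tangible, here is a minimal sketch of what rule-based checks along these lines might look like in practice. The field names, formats and reference values are illustrative assumptions, not Diligent's actual schema or logic:

```python
import re

# Illustrative rule-based checks mapped to the six dimensions.
# All field names, formats and reference data below are assumptions.

PHONE_RE = re.compile(r"^\+?\d[\d\s\-()]{6,14}\d$")  # conformity: expected phone format
ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")             # completeness: field holds only a zip

# Consistency: pre-defined business logic, e.g. standard cost centre descriptions.
COST_CENTER_DESCRIPTIONS = {"CC100": "Group Legal", "CC200": "Treasury"}

def check_record(record: dict, registry_names: set) -> list:
    """Return a list of dimension violations for one entity record."""
    issues = []
    if not ZIP_RE.match(record.get("zip", "")):
        issues.append("completeness: zip field is empty or contains extra data")
    if not PHONE_RE.match(record.get("phone", "")):
        issues.append("conformity: phone number not in the expected format")
    expected = COST_CENTER_DESCRIPTIONS.get(record.get("cost_center", ""))
    if expected and record.get("cost_center_desc") != expected:
        issues.append("consistency: cost centre description differs from the standard")
    if record.get("legal_name") not in registry_names:
        issues.append("accuracy: legal name does not match the minute book record")
    return issues

def check_uniqueness(records: list) -> list:
    """Uniqueness/integrity: flag near-duplicate names that suggest missing linkages."""
    seen, issues = {}, []
    for r in records:
        key = re.sub(r"[^a-z0-9]", "", r.get("legal_name", "").lower())  # normalise case/punctuation
        if key in seen:
            issues.append(f"uniqueness: '{r['legal_name']}' may duplicate '{seen[key]}'")
        else:
            seen[key] = r.get("legal_name", "")
    return issues

record = {"legal_name": "Example Holdings B.V.", "zip": "90210",
          "phone": "+1 212 555 0100", "cost_center": "CC100",
          "cost_center_desc": "Group Legal"}
print(check_record(record, {"Example Holdings B.V."}))  # -> [] (no violations)
```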
The Data Quality Journey
Achieving data integrity isn’t something that happens overnight. Next, Moorhead shared the steps in the data quality journey: the key phases of a data quality initiative.
- Define the data. Identify the key data elements for your organisation, and then write, approve and communicate standardised definitions for each field. This is an essential first step; as Moorhead says: “When all stakeholders are working from the same definition, then that greatly helps data integrity in your system.”
- Analyse. Carry out a baseline assessment that shows how good (or bad) your data is, so you know what you’re dealing with and what is in scope for remediation (see the sketch after this list). You may want to focus on specific categories of data; addresses, for instance, are usually of pretty poor quality. This assessment will enable you to prioritise by data field and by active/inactive entities, allowing you to focus your efforts.
- Develop. This step focuses on developing the processes you need to ensure the ongoing integrity of your data; how will you maintain the cleaned data? Often, organisations – particularly larger ones – will have data management professionals or teams who can help guide best practices here.
- Report. Compare data quality results from your baseline at the start of the clean-up process with the latest results; you should be able to evidence improvement as inaccurate data is removed.
- Execute – “the big one”, as Moorhead puts it. Data quality has the advantage of being measurable; “it’s either correct or it’s not correct… there’s no grey area”. In the Execute stage, you will deploy a regular data quality report and document its findings.
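As a rough illustration of the Analyse and Report steps above, the sketch below scores a set of records against simple per-field rules, producing a baseline that can be compared with later runs. The rules and sample records are hypothetical:

```python
# A hedged sketch of a baseline data quality assessment: score each field
# against a simple rule, re-run on a regular cadence, and compare the results.
# Field rules and sample records are invented for illustration only.

def field_pass_rates(records, rules):
    """Return, per field, the share of records that pass that field's rule."""
    total = len(records) or 1
    return {
        field: sum(1 for r in records if rule(r.get(field))) / total
        for field, rule in rules.items()
    }

rules = {
    "legal_name": lambda v: bool(v and v.strip()),         # must be populated
    "registered_address": lambda v: bool(v and "," in v),  # crude structure check
}

baseline_records = [
    {"legal_name": "Example Holdings B.V.", "registered_address": ""},
    {"legal_name": "", "registered_address": "1 Main St, Anytown"},
]

baseline = field_pass_rates(baseline_records, rules)  # run before clean-up begins
for field, rate in baseline.items():
    print(f"{field}: {rate:.0%} of records pass")      # evidence for the Report step
```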
Accountability Is Key
Next, the session looked at accountability. Cartieri opened by noting that it’s vital to make those closest to legal entity management responsible for data:
“What’s critical to ensuring the data in Diligent Entities is true, complete and accurate is to make those closest to the actual entity minute book accountable for the data in Diligent Entities.”
Moorhead, with extensive experience of this, then looked in more detail at the importance of clear and transparent accountability for legal entity data.
It’s essential, first, to establish who is accountable. The good news is that identifying the people who should be the source of accurate data is “the easy part”.
You should put in place data stewards or data owners – people who are responsible for the accuracy of the data. They should be held accountable for data being true, complete and accurate – and should hopefully already have this data in their minute books. Their responsibility for this should be made non-negotiable, and should form part of their annual performance review.
Best practice corporate governance demands that legal entity data should be trusted – but also verified. Although the inclination is to assume entity data is correct, it should be checked for data entry errors, mirroring what an auditor or regulator would do when reviewing an entity’s data.
Then, you need to ensure your identified people are held accountable. Training and clear instructions as the process is rolled out are key. Policy compliance needs to be monitored to ensure entity management is held to the same standards as other areas. Any incidents of non-compliance should be promptly investigated and remedial action carried out.
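One way to picture that kind of monitoring: record a steward and a last-attested date against each entity, and surface anything overdue. This is a hypothetical sketch, not a description of any particular system:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Hypothetical steward accountability tracking: one owner per entity, with the
# date they last attested the data as true, complete and accurate.

@dataclass
class StewardAssignment:
    entity_id: str
    steward: str                          # person closest to the minute book
    last_attested: Optional[date] = None  # None means never attested
    verified: bool = False                # independent spot-check ("trust but verify")

def overdue(assignments, max_age_days=365, today=None):
    """Return assignments with no attestation inside the allowed window."""
    today = today or date.today()
    cutoff = today - timedelta(days=max_age_days)
    return [a for a in assignments
            if a.last_attested is None or a.last_attested < cutoff]

registry = [
    StewardAssignment("ENT-001", "J. Smith", date(2020, 3, 1), verified=True),
    StewardAssignment("ENT-002", "A. Jones"),  # never attested: should be flagged
]
for a in overdue(registry, today=date(2020, 9, 1)):
    print(f"{a.entity_id}: attestation overdue, owner {a.steward}")
```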
Doing this will drive a firmwide culture of compliance – something that’s an essential foundation to any organisation driven by good corporate governance. Or as Moorhead puts it, “The entities are the building block of any large corporation… having a firm-wide culture of compliance and really moving the needle on compliance on true, complete and accurate data in an entity management system is pretty critical.”
How Can Technology Support Your Process?
Cartieri then handed over to van der Ham to discuss how technology can underpin data integrity.
Van der Ham looks after legal entity data for Aegon. He outlined the three-step process Aegon takes to ensure data integrity, all supported by leading-edge entity management software.
First, they prepare. The company’s legal entities are divided into domains, with each entity assigned to a single domain. Each domain goes through the review cycle twice a year, in two key phases: the execution of the review, followed by a random check to ensure it has been carried out correctly.
Two key people are made accountable for the process: the end user, who is responsible for the execution of the project, and the focal point, who has responsibility for the results.
The second stage is the review process itself. Diligent Entities is central to this and, as van der Ham notes, makes the process easier with every review cycle because:
“There are standard templates… it’s really quite a straightforward process and once you’ve set it up correctly, it’s easy to repeat.”
Entities clearly lists out all the organisation’s entities, enabling each end user to check if all are present. Then, they carry out the review, checking the accuracy of each entity’s data.
Flagging missing or inaccurate data is simple, via a straightforward form, and the system makes clear a one-month deadline for remedying any deficiencies. Following the review process, a formal Letter of Representation from the focal point confirms that the review has been carried out correctly. This provides a reference document in the event of any missing information or queries at a later date.
Two annexes to this letter are sent to the legal department, outlining the process and accountable individuals as well as any deficiencies. This gives the central team a good overview of any legal entities that need to be prioritised for remedial action, and what needs to be done.
The third stage of the process is a random check, brought in to double-check the accuracy of the review. Between two and four entities are chosen at random, and the end user is asked to provide evidence of the data currently within Diligent Entities.
The knowledge that this random check will happen gives end users an added imperative to provide full, accurate data in the initial review, as well as providing a backstop that ensures the robustness of the data.
The entire process takes around a month; two weeks for the review and two for the random check.
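As a rough sketch of how such a random check might be drawn – assuming entities grouped by domain as described above, with invented names – the selection logic could look something like this:

```python
import random

# Illustrative sketch of the random-check stage: sample between two and four
# entities per domain, whose end users are then asked for supporting evidence.
# Domain structure and names below are assumptions, not Aegon's actual data.

def pick_for_random_check(entities_by_domain, low=2, high=4):
    """Select between `low` and `high` entities at random from each domain."""
    selection = {}
    for domain, entities in entities_by_domain.items():
        k = min(len(entities), random.randint(low, high))
        selection[domain] = random.sample(entities, k)
    return selection

domains = {"Netherlands": ["Entity A", "Entity B", "Entity C", "Entity D", "Entity E"]}
print(pick_for_random_check(domains))
```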
Cartieri flagged the importance of transparency in all of this. At Aegon, all entity owners were given the opportunity to report any deficiencies; the clear message was that it was ok to report and then fix shortcomings. Transparency and openness are key in encouraging accountability and honest reporting.
Wrapping up, Cartieri thanked the two speakers for their contributions and invited viewers to get in touch if they wanted to find out more about Diligent Entities and the ways it can support entity data integrity.