
Information Management:
A HOT POTATO IN THE HANDS OF FINANCIAL INSTITUTIONS: DATA QUALITY
Poor data quality is still causing headaches for financial institutions. In an environment of ever-growing regulation, increasingly complex and difficult markets, a growing demand for transparent products, and the urgent need to establish sustainable profitability, data quality is an issue that requires a major shift in the mindset of banks and insurers. Data can no longer be seen as merely an IT topic. Data requires everyone's attention and needs to be treated as a valuable strategic asset, with great potential to create a competitive advantage. Data issues can no longer be treated as a hot potato that no one wants to hold for long. It will only get hotter until everyone burns their hands.
Peter Berger: Whether we like it or not, data has become our daily companion. The core business of financial institutions ('FIs') is in most cases based on data. While only a very small fraction of financial products are tangible, most products exist in the form of a mere electronic record on someone's computer. Data is continuously created and collected, flowing from clients into the company, within the company from one department to another, and then back to the clients. This data is also the key basis for creating management information and is therefore the input for top management's decision-making. So what happens if this data is incorrect, incomplete, leaked, outdated, or misunderstood? Indeed – there is a great risk that someone makes a wrong decision. Unfortunately, too many bad decisions were recently taken by many financial institutions, primarily banks. The question is how many of these decisions were caused by the poor quality of the information that the executives relied on...
Obviously bad data can lead to bad
decisions – but why is it suddenly a big
problem?
The key issue is that the management of banks and insurers lacks good insight into how trustworthy the information they receive is. FIs try hard to tackle this issue, often as a reaction to the strict laws and regulations on this matter, such as Basel II/III and Solvency II, in which data quality has been given high priority. Financial statements – traditionally the most scrutinised information – are no longer the only thing on the radar of stakeholders. Today's laws and regulations go way beyond that.
The fact is that banks and insurers operate entirely on data – each product, client or transaction has its data element. And although one simple data error may seem unimportant, many small errors may have a huge impact in aggregate, especially when we look at the company's overall portfolios. Additionally, one significant mistake may cause severe misinformation with a very large impact. For example, not long ago the German public debt had to be adjusted by 55.5 billion euro due to a calculation error at a nationalised mortgage bank.
Data pollution often starts the moment data is created or collected: about customers, collateral, insured objects, products, terms and conditions, and the like. Furthermore, even if the data is initially correct, it becomes outdated very quickly. At some banks and insurers, the risk of wrong data is higher because of complex products, a complicated internal organisation, opaque processes, hundreds (or thousands) of legacy systems, unreliable external data sources, or simply human error, because processes are often performed manually.
Data is eventually converted into management information, upon which management makes assumptions and takes decisions. Unfortunately, due to the aforementioned reasons, even basic questions are sometimes difficult to answer: Should we allocate more capital to our initiatives? How profitable are our individual products? How risky are our portfolios? Insufficient management information, especially when it is assumed to be built on reliable data, can create a false sense of control over a business. Questions that management should ask themselves before taking any decisions based on data are: What is the quality of the information I am looking at? What are the top data issues and what do they mean for my decision? Do I understand the information? Am I in control here?
Answering these questions may not be easy in the typical environment of a financial institution. There are thousands of systems in use, many of which were inherited from the past with lots of issues to start with; a spaghetti of reporting processes that hardly anyone can fully oversee; complex financial products; risks that few people really understand; interconnected markets with hardly any clear borders; and increasing demands for information from all sides. Something needs to be done to turn this around with a proper long-term solution.
Key questions analysed
There are a number of existing studies and frameworks that offer various approaches to implementing the data quality concept; however, they are mostly generic (not specific to financial institutions) or do not necessarily reflect the aforementioned regulatory requirements. It therefore became an interesting subject for my thesis to look primarily at the following key questions:
~ Question 1. Why should FIs’ executives care
about data quality?
~ Question 2. How should FIs start tackling this
issue, and set the direction and plans?
~ Question 3. Who should be responsible?
~ Question 4. What should FIs do in practice to enhance the quality of their data and, in parallel, satisfy the regulators?
Through a careful analysis of the regulatory requirements, applying existing theoretical methodologies, concepts and frameworks, and collecting examples from the practice of financial institutions, the thesis provides the following suggestions.
1. Data – source of competitive advantage, or
really just a hot potato? (Why should the executives care?)
A complete shift in mindset may be necessary to understand that quality data is a core ingredient of successful management. This link can be made clear
by the following relation: data is the basis for creating information; information is the basis for creating knowledge; knowledge is the basis for decision-making. The challenge becomes more visible if we ask the questions in reverse: Do we make our decisions by applying the right knowledge? Is the knowledge we rely on based on relevant and reliable information? Is that information built on clear, understandable, complete, accurate, and timely data?
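To make the last question concrete, consider a minimal sketch (in Python, purely illustrative; the record fields, valid-value sets and thresholds are assumptions, not part of any FI's actual data model) that scores a single client record against three of the measurable dimensions mentioned above – completeness, accuracy and timeliness:

    from datetime import date, timedelta

    # Hypothetical client record; field names are illustrative assumptions.
    record = {
        "client_id": "C-10482",
        "name": "Jansen Holding BV",
        "country": "NL",
        "rating": "BBB",
        "last_reviewed": date(2012, 6, 30),
    }

    REQUIRED_FIELDS = ["client_id", "name", "country", "rating"]   # completeness
    VALID_RATINGS = {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"}    # accuracy (domain check)
    MAX_AGE = timedelta(days=365)                                  # assumed timeliness threshold

    def completeness(rec):
        """Share of required fields that are present and non-empty."""
        filled = sum(1 for f in REQUIRED_FIELDS if rec.get(f))
        return filled / len(REQUIRED_FIELDS)

    def accuracy(rec):
        """Very crude domain check: is the rating a known value?"""
        return 1.0 if rec.get("rating") in VALID_RATINGS else 0.0

    def timeliness(rec, today=date(2013, 4, 1)):
        """Is the record younger than the assumed maximum age?"""
        return 1.0 if today - rec["last_reviewed"] <= MAX_AGE else 0.0

    scores = {
        "completeness": completeness(record),
        "accuracy": accuracy(record),
        "timeliness": timeliness(record),
    }
    print(scores)  # {'completeness': 1.0, 'accuracy': 1.0, 'timeliness': 1.0}

Clarity and understandability cannot be checked by a script at all, which underlines the point: data quality is only partly a technical matter.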
High quality data has become a key ingredient of good decision-making. Data should therefore be treated as an asset – not a liability. It is a strategically important asset, which can become a great competitive advantage if exploited in the right way. It should be treated as such because it has huge potential for making money: exploring customers' demands, understanding customers' behaviour, noticing new market developments, discovering internal issues, and of course responding to emerging risks and avoiding unexpected losses.
The other dimension of how high quality data can 'make money' is by saving money. Without much deliberation we can safely say that the availability of reliable and timely data clearly contributes to higher effectiveness and efficiency of daily operations. When management and staff at various levels of a company can rely on the information they receive, and they receive it on time, they can potentially eliminate most of the data checking activities, data cleansing and fixing, manual workarounds, and other time-consuming efforts. Of course, this is only possible if aspects of data quality are carefully embedded in the entire business process, where data issues are the exception rather than the rule.
2. Data vs. business strategy and compliance
(How to start tackling this issue?)
There is a reason why regulators are increasingly worried about data in financial institutions. Too many poor decisions were recently made at many FIs, leading organisations and their stakeholders into serious trouble. And because poor quality of information is an important contributor to this decision-making, data became a focal point of key regulations such as Basel II/III, Solvency II, and others.
Recent examples include the Principles for Effective Risk Data Aggregation and Risk Reporting, issued by the Basel Committee. Although the Principles are not expected to be implemented until 2016, central banks will already send the first queries to banks this year, starting with the global systemically important financial institutions (G-SIFIs). Almost in parallel to these Principles, the Financial Stability Board issued a set of principles and recommendations for enhancing the risk disclosures of banks. Before banks open new projects to ensure timely compliance with the new principles, banks' senior management should first look at several fundamental questions: How are these principles relevant to our business? Do we (or would we) comply with these principles (if it were not required)? If not, why were such principles not a natural part of our business activities before? How can they be integrated and at the same time create additional value?
Before jumping into generic solutions and projects to quickly patch the existing gaps, the companies' Boards and Senior Management should deliberate about the role data plays in their core business. Only then should they determine a robust approach to manage it and, at the same time, satisfy any applicable regulation. It should not be the other way around: first creating a new bureaucracy around the existing data to satisfy the regulatory requirements, and then somehow making it fit the core business. It is a pitfall to set up an additional administration to patch the existing gaps, or to create additional layers of management that do not really add much value or do not make sense to the business. This cannot work in the long term.
The initial approach should be 'how do we create value by adhering to these principles', rather than 'how do we demonstrate compliance with these principles'. After all, these principles have been designed to enhance the risk management and decision-making of banks. Although the whole change process from the 'as-is' situation towards the 'to-be' may initially require additional staff, reports, processes, etc., these should eventually become a natural part of the 'new business', a business based on high quality data.
So again – given the great potential that high quality data carries with it – data quality should not be seen as just a regulatory requirement, and definitely not as a burden!
3. It is your problem, not mine (Who should be
responsible?)
One of the first questions that an FI should think about is: who is responsible for data? At first the answer may seem easy: 'It sounds like a task for the IT department.' A big mistake. Appointing IT is one of the main pitfalls that some companies fall into. Although IT plays an important role in enabling and facilitating proper data management (e.g. by acting as a good 'data steward'), it is a fundamental change in perception to realise that all C-level directors carry their own responsibility towards data. It is no surprise that all directors can already be called 'data users' today (they do use information for decision-making), and most likely also 'data owners' (their businesses/functions create data on a daily basis). As both a data user and a data owner, they have expectations towards the data: they want to receive high quality information from their domains. So this debate most likely ends with the conclusion that 'data quality is everyone's business'. Most of the existing processes, people, systems, and external parties have a relation with data. Realising this is already a big step. Accepting the accountability is the next one. Only then can data get the attention it really needs.
4. Make data quality everyone’s business! (What
should FIs do in practice to enhance the quality
of their data, and in parallel satisfy the regulators?)
The approach to making data quality everyone's business should include the following three steps:
~ Step 1. Determine the role data plays in the core
business, and design a robust data quality framework around it.
~ Step 2. Integrate this framework step by step on
all levels of the organisation, starting from the
very top.
~ Step 3. Embed the data quality efforts as part of daily business – and reward them!
It is not an easy process to make this happen. Like any other change that requires the attention of the whole organisation – people, processes, systems, external parties – it requires careful planning and change management. The use of a suitable capability maturity model to carefully plan these improvements should be considered.
Step 1 – Data quality framework
Define a single integrated framework, incorporating a number of key building blocks of data quality management. Do not consider it a checklist of new documents created merely to demonstrate how data is managed, but rather a set of important company concepts, which should be integrated into the existing structures of the company and actively and consciously adhered to. Refer to figure 1.
The first set of building blocks describes the 'data quality infrastructure', which prepares the environment for facilitating the required quality of data. The key building blocks are:
~ Data strategy, policies & standards – describe the importance of data in realising the company's strategy. Policies & standards capture the overall objective and framework, and how the building blocks fit with the rest of the organisation.
~ Data governance – determine a clear set of responsibilities towards the company's data. The key roles include data owners, data stewards and data users – three very distinct roles that determine everyone's relation to data. These roles should be assigned to the respective senior managers, together with a robust governance structure to ensure proper oversight, decision-making, and reporting on data quality.
~ Data requirements and quality criteria –
translate the wishes of the data users and the
commitments of the data owners and the data
stewards. A crucial tool that establishes a common understanding between these groups.
~ Data dictionary – a common place where the data taxonomy is determined, making sure that everyone follows the same core definition for each data element, and knows at a more granular level who the data owners, users and stewards are, and which systems are the leading ones for the data (see the sketch after this list).
~ Data flow diagrams and descriptions, process controls and IT controls – a set of high-level visual descriptions of where the data originates from, and how it flows through the organisation until it reaches the key management information users. This should be accompanied by a description of the key controls in place to ensure that data requirements are fulfilled.

Figure 1. Data quality framework: integrating the business, IT, and support functions. (The figure shows the static building blocks – the data quality infrastructure, with corporate-level and business/process-level components – as the basis for the dynamic building blocks – the data quality processes: DQ controls monitoring, DQ assessment, DQ issue management, DQ reporting and DQ change management – linked to related processes such as risk processes, model validation and financial reporting.)
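As an illustration of how the data governance and data dictionary building blocks might be captured in practice, here is a minimal Python sketch. The entry fields, role names and the example data element are assumptions for illustration only, not a prescribed schema:

    from dataclasses import dataclass, field

    @dataclass
    class DataDictionaryEntry:
        """One entry in the data dictionary: a single, shared definition
        of a data element, plus the governance roles attached to it."""
        name: str            # canonical name of the data element
        definition: str      # the core definition everyone follows
        data_owner: str      # senior manager whose business creates the data
        data_steward: str    # function that facilitates and maintains the data
        data_users: list = field(default_factory=list)    # who consumes it
        leading_system: str = ""                           # authoritative system
        quality_criteria: list = field(default_factory=list)  # agreed requirements

    # Hypothetical example entry; all names are invented for illustration.
    ltv = DataDictionaryEntry(
        name="loan_to_value",
        definition="Outstanding loan amount divided by current collateral value.",
        data_owner="Head of Lending",
        data_steward="IT Data Management",
        data_users=["Risk Management", "Finance"],
        leading_system="LoanSys",
        quality_criteria=["value between 0 and 2", "collateral value < 12 months old"],
    )
    print(f"{ltv.name}: owned by {ltv.data_owner}, leading system {ltv.leading_system}")

The point of such a structure is that the wishes of the data users and the commitments of the owners and stewards are written down in one agreed place, rather than scattered across departments.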
A second set of 'dynamic' building blocks should be in place to enable a continuous effort to maintain data quality. To the extent possible, these blocks should be incorporated into the existing processes, instead of establishing new ones.
~ Data quality (DQ) controls monitoring – reporting on the issues noted (e.g. exceptions) when executing the key data quality controls. For example: the outcomes of the key reconciliations performed between risk data and accounting data (see the sketch after this list).
~ DQ assessments – assessment of the effectiveness and efficiency of the data controls, and of the impact of the key issues noted. This may include activities to find out the details of specific data issues, such as data mining.
~ DQ issue management – description of the actions focussed on remediating the key issues, including owners and due dates. Examples of such actions: data cleansing, alignment of data flows between departments, and resolving known master data errors.
~ DQ reporting – all key reports should include a data quality indicator, to provide management with context, i.e. to what extent the presented information can be relied on, and what limitations should be taken into account.
~ DQ change management procedures – an organised way of adjusting any element of the data infrastructure. Examples include new data and changes in systems, processes, etc., or changes in other related processes in the company, e.g. risk management, model validation, financial reporting, and others.
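To illustrate how two of these dynamic blocks could work together, here is a minimal sketch, assuming hypothetical exposure totals per portfolio: a reconciliation control that collects exceptions (DQ controls monitoring), whose outcome is then summarised as a simple data quality indicator on a report (DQ reporting). The figures, tolerance and traffic-light thresholds are invented for illustration:

    # Hypothetical exposure totals per portfolio from two sources; in practice
    # these would come from the risk and the accounting systems.
    risk_data = {"mortgages": 1_250.0, "corporate": 830.0, "consumer": 410.0}
    accounting_data = {"mortgages": 1_250.0, "corporate": 829.2, "consumer": 410.0}

    TOLERANCE = 0.5  # assumed materiality threshold (in millions)

    def reconcile(risk, accounting, tolerance):
        """DQ controls monitoring: compare the two sources and collect exceptions."""
        exceptions = []
        for portfolio in risk:
            diff = abs(risk[portfolio] - accounting.get(portfolio, 0.0))
            if diff > tolerance:
                exceptions.append((portfolio, diff))
        return exceptions

    def dq_indicator(exceptions, total_checks):
        """DQ reporting: a simple indicator giving management context on reliability."""
        failed = len(exceptions)
        if failed == 0:
            return "GREEN: all reconciliations within tolerance"
        ratio = failed / total_checks
        colour = "RED" if ratio > 0.25 else "AMBER"
        return f"{colour}: {failed}/{total_checks} reconciliations breached tolerance"

    exceptions = reconcile(risk_data, accounting_data, TOLERANCE)
    print("Exceptions to follow up (DQ issue management):", exceptions)
    print("Indicator on the report:", dq_indicator(exceptions, len(risk_data)))

Each exception feeds the DQ issue management process (an owner and a due date are assigned), and the indicator gives the reader of the report immediate context on how far the figures can be relied upon.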
Step 2 – Embedding
A conscious change programme should be established, with the objective that all relevant employees adopt the aforementioned framework as part of their daily activities. Starting with the 'tone at the top', there needs to be a clear commitment towards quality data on all levels of the company. Eventually, everyone should see the relevance and the link to the daily business. Making this happen requires careful deliberation on a suitable approach, timing and methods, because it is all about people – their attitudes, willingness to cooperate, and willingness and capability to make changes in their activities.
Step 3 – Business as usual
This step is not the end, but the beginning of the new
company based on high quality data. During this
step, it should be ensured that data quality continues
to be an ongoing effort of the company. Management
should therefore ensure that there is sufficient continuous awareness about data quality amongst all staff.
There should be sufficient controls in their processes,
so that data issues can be prevented rather than corrected later. But given the fact that no controls are
100% effective, issues will always occur – so the organisation should be able to pick them up at the source,
and take corrective actions at the required time and
speed. These efforts should be transparent to foster
awareness about the key data issues. And these
efforts should be recognised and awarded.
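To illustrate what 'preventing rather than correcting' could look like at the point of data creation, consider this minimal sketch of an input validation control; the field rules are invented for illustration, and any real control set would follow the data requirements agreed between owners and users:

    import re

    # Invented validation rules for a new client record.
    RULES = {
        "client_id": lambda v: bool(re.fullmatch(r"C-\d{5}", v or "")),
        "country":   lambda v: v in {"NL", "BE", "DE", "FR"},
        "rating":    lambda v: v in {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"},
    }

    def validate_at_source(record):
        """Preventive control: reject bad data before it enters the systems,
        instead of cleansing it downstream."""
        return [f for f, rule in RULES.items() if not rule(record.get(f))]

    new_client = {"client_id": "C-1048", "country": "NL", "rating": "BBB"}  # id too short
    errors = validate_at_source(new_client)
    if errors:
        print("Rejected at source, fields to correct:", errors)  # ['client_id']

The design choice is deliberate: a rejection at the source costs one correction by the person who knows the client, whereas the same error discovered downstream triggers reconciliations, workarounds and cleansing across many departments.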
Long journey ahead
It is not an easy journey for a financial institution to change the mindsets of its people, to make them care about data quality and act upon it, while doing their regular business. But there is no real alternative: the importance of data will only grow, and the sooner it is embraced as a strategic asset, the better. The aforementioned suggestions should help management understand what such a journey entails and where to start. Hopefully, high quality data will soon bring tangible benefits to financial institutions – better management. That is, after all, what the regulators are also aiming for. And by exploring the aforementioned suggestions, demonstrating compliance with the strict regulations does not have to be a difficult challenge.
Peter Berger, Manager Financial Services Industry at Protiviti (risk and business consulting firm), is a graduate of the International Executive Master of Finance and Control Programme at Maastricht University. He is the winner of the VRC Thesis Award 2012.