2017: The Data Challenge



Hi and welcome to our Pensions Age video interview. I’m Laura Blows, editor of Pensions Age, and joining me today is Duncan Watson, managing director of products and services for EQ Paymaster, to talk to me about why pension schemes need to focus on data quality.



So first of all Duncan, please could you talk me through why exactly pension schemes should be looking at data quality, and why now in particular?



I think 2017 could be a perfect storm in terms of pension scheme data, for a number of reasons. There is a convergence of issues from several sources that scheme trustees, sponsors and managers will need to take account of. It is a combination of legislative changes, government policy and regulator interest, but also market trends around member access to data that will continue to grow through 2017.


If I may, I will just point out five particular examples of where I think the key issues will come from.


So first of all, we’ve got IORP II, which is European legislation due to come into force in 2018. Clearly, given the state of play with Europe and Brexit, some commentators are suggesting we may not adopt the full breadth of IORP II. But equally, many commentators are saying we should plan on the assumption that we will adopt some of that legislation.


Now if we do adopt that legislation in the UK, there will be a stipulation for pension scheme trustees to communicate with all of their members, including their deferred members, every year, and give them an annual pension benefits statement. Now clearly that has an implication for data, and we’ll talk about that a little bit later, I think. Deferred pension scheme data is a particular issue in terms of quality and robustness, and if this legislation comes into force then scheme trustees will have a couple of years to sort out that element of the data.



The second point I would like to cover is the pensions dashboard. Clearly that got a lot of press through 2016, and this is the government’s initiative to give pension scheme members, and the public, access to all of their pension information in one place. There’s a lot going on at the moment, driven by government-sponsored agencies, but industry bodies are also coming together on it. The real dashboard, I think, is due to launch in 2019. Again, if that comes to fruition, pension schemes will need to have their data in order if they are going to take part in the dashboard and give their members access.


The third area I want to talk about is The Pensions Regulator. The Pensions Regulator has, to be frank, dipped in and out of data quality over the past 10 years or so. But they issued particular guidance on record keeping in the past and I think they expect pension scheme trustees to take that guidance seriously and get their data and record keeping in order.


Now the regulator commissioned a survey last year, the results of which were published in the autumn. On 30 November they issued a press statement effectively saying they weren’t happy with the way trustees had been progressing data quality over recent times. And they are challenging trustees to include a statement about data quality when they complete their report in 2017. So trustees are going to be expected to provide that assurance to the regulator, and that will involve both common data reporting and conditional data reporting as well.

The fourth particular area I was going to highlight is that there’s an exercise under way right now to reconcile and rectify the contracting-out records that schemes hold and that HMRC holds through NICO.


Most schemes are engaging in that exercise now and are looking to sort out their contracting-out data, their GMP records. The government has also published a consultation around GMP equalisation, an issue that has been rumbling on for many, many years, if not decades, without a resolution. Now if that does eventually reach a resolution, trustees will need to act and address the GMP equalisation issue, straight off the back of contracted-out reconciliation, which is another reason why data is becoming increasingly important this year.


The final reason I would like to highlight is the continuing trend to give members self-service access to their pension information, regardless of whether this is through the dashboard or through individual pension scheme websites. If the data isn’t robust and up to date, there is no point giving members access, because effectively they’ll just see rubbish.



OK, so there are many reasons why schemes need to focus on data quality. One aspect of what you said that I would like to find out a bit more about is The Pensions Regulator’s interest. You mentioned that they have been quite interested in data quality over the years, but it seems that they have ramped up their attention to it over the past year or so. Why do you think the regulator is so concerned?


Ultimately they are concerned at the progress, or lack of progress, that has been made. I think you’re right: they’ve issued guidance, they’ve issued record-keeping guides for both the private and public sector over the past 10 years, and I think they’ve expected trustees to take them seriously. And whilst some progress has been made around common data reporting and addressing common data, clearly the results of the survey produced last year indicate that not as much progress has been made in other areas – in particular the conditional data aspects.


While the regulator may have been distracted by other issues over 2016, BHS being one of them, I think they’re now turning their attention, on the back of the survey, to say ‘come on, you really need to make some effort here’.


Clearly data underpins member outcomes, but it also underpins value for money for trustees in some of the services they procure, and the regulator isn’t seeing progress being made. So they are reacting in the way they do, and they’re insisting on having visibility of that progress through the statutory reporting from 2017 onwards.


So you said the regulator expects trustees to take the issue of data quality seriously, given the many benefits it can provide for schemes and members. But what about the trustees and sponsors themselves – what risks are they exposed to individually if data isn’t up to scratch?


I think ‘risk’ broadly falls into two main categories. There’s the cost risk of having poor data but I also think there’s the reputational risk of having poor data.


The cost risk manifests itself in many ways. There’s a cost to getting data wrong – getting individual benefits wrong, the member discovering the error, and rectification having to take place. Rectification costs money, and it also leads to reputational risk if there is a systematic failure of the data, and therefore of the benefits being provided. There’s also the risk of lawsuit or class action from the members against the trustees and the scheme sponsors.


I think from an overall funding perspective, clearly if the data isn’t robust then the value the actuaries place on the scheme’s liabilities may be out, and the risk is that sponsors are potentially funding too much, paying too much in contributions into the scheme.


Another area of risk is if the regulator really does lose patience or the government starts levying sanctions on the schemes for poor quality data, which may manifest itself down the line.


The other area is where trustees or scheme sponsors want to transact and carry out a liability management exercise. They go to the market to get a price for a transaction, a buyout or a buy-in. But if their data is of poor quality, the likelihood is the insurer will charge the scheme – the trustees, the scheme sponsors – a risk premium for that transaction. It could increase their absolute-pound costs for that exercise, and it could also delay it. If the insurer insists on the data being sorted out first, they could lose the opportunity to complete that transaction at the right price. So there is a real cost should scheme sponsors and trustees want to transact in a liability management exercise.


I think the reputational risk is clear. Certainly, if you think about the pensions dashboard, there will be a population of trustees and sponsors that are proactive in this space, get their data cleaned up and participate in the pensions dashboard. If some schemes are doing it, questions may be raised by the members of other schemes as to why their trustees or sponsors aren’t participating. So there could be a swell of reputational pressure against particular schemes for not taking part, purely because they have not got their data up to scratch.




So there are clearly a number of risks if schemes have poor quality data. But the catalyst for change can’t only be negative, I imagine. On the flip side of that, what benefits does having good quality data provide for schemes?


The benefits are the reverse of some of the risks and issues we discussed in the previous question. I think one of the benefits, which we didn’t cover earlier, is lower-cost administration, the lower cost of services going forward.


Simply put, cleaner data allows automation and self-service from the pensions administrator to be put in place; without clean data, neither is possible. The more automated the scheme, and the greater the access to self-service – where members correct information themselves, or access quotations and information – the lower the cost of administration, and that saving is passed on to the scheme trustees and the sponsors.


There is a benefit of not only lower cost, but a better experience for the scheme members who want to access information and engage with their pension. With better data, better tools can be employed, so there is a better experience for the members.


The flipside of the risks and costs we outlined is that better data means faster transactions when going to the market for a buyout or buy-in, or any kind of hedge. The better the quality of the data, the better chance they have of getting the right price, and of the insurer matching the right liabilities with the right assets. Clearly, better data and swifter engagement with the pensions dashboard equals a better experience for their members.


So, again, mainly the flipside of the risks and costs, but also the ongoing management of the scheme should be simpler and more cost-effective with better data.


So despite all those benefits, and the increasing pressure to ensure your data is of sufficient quality, there is still inadequacy in pension scheme data. These issues and benefits you mention aren’t particularly new ones, so why has the industry arguably been so slow to tackle this issue?


I think some of the trends we talked about, and some of the trends manifesting themselves in 2017, are not new, but they are gathering a new sense of urgency. I think there is a lack of patience from the regulator with some of the issues that have been around, as you say, for a long time.


The other way of answering that is that scheme trustees and sponsors have been able to get away with ‘just in time’ data fixing. By that I mean they wait for a member to die, retire or transfer, and at that point in time, for that single transaction, they go in and cleanse the data for that particular member, on a member-by-member basis, by digging out the paper files or microfiche or whatever happens to be in place for that member. So there wasn’t a burning platform to clean and address data quality across the whole membership.


The other thing is that, from a pensions valuation perspective, assumptions can be made about the quality and completeness of the data when valuing the scheme. It’s not ideal, but trustees and sponsors can probably get away with not really addressing their data, satisfying the actuary with caveats around the data quality.


So there was never really a burning platform unless the scheme was going to transact; that tends to be what forces wholesale data issues to be sorted out ahead of a transaction in the marketplace.


As we said earlier, there seems to be a convergence, or acceleration, of those issues that may prompt trustees to act. Coupled with the ‘just in time’ solving of data issues, data is a thorny and complicated subject. It is not a very easy thing to fix; it’s very manual in some cases, so it introduces time and cost. And frankly, to date, trustees have had higher priorities for their time and money, but I think 2017 will be the year when data comes to the top of trustees’ set of priorities.


So, data quality may be rising to the top of trustees’ agendas now, but as you mentioned, it can be quite a thorny and complicated issue to address. Perhaps you could provide some practical advice and tips for schemes and trustees on how to manage this?

The first thing is to recognise they have a problem – it sounds a bit like an addict, doesn’t it; the first step is to say ‘I’ve got a problem’ – and I think the first step for trustees, sponsors and scheme managers, in both the public and private sector, is to take a good, honest look at their data.

So do an assessment. How complete is the data? Are there any issues with some of the conditional or calculated data, in terms of its robustness, given how the scheme has been administered over a number of years? So really, the first step is to take a look. And when I say take a look, I mean take a look through several lenses.

Clearly the regulator has a particular lens on the data, and they’ve set that out very clearly in their record-keeping guidance. But that isn’t the only lens to apply. There’s also the lens of what the dashboard might require, and what insurers might require if a transaction is to take place. And what might IORP II require, if schemes have to communicate with deferred members on an annual basis?

So there are several lenses that need to be applied. I don’t think trustees should be scared of that. There are some very simple tools out there that can be applied to take a look at that data, but that has to be the first step: carrying out that assessment and then, with that assessment in hand, really looking at what the key priorities are.
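To give a flavour of what such a simple tool does, here is a minimal sketch in Python of a first-pass completeness check over member records. The field names, the 'UNKNOWN' placeholder and the sample records are hypothetical, purely for illustration; the actual list of common-data items and the scoring rules should be taken from the regulator's record-keeping guidance.

# A minimal, hypothetical sketch of a first-pass completeness check.
# COMMON_DATA_FIELDS is an illustrative subset, not the regulator's full list.
COMMON_DATA_FIELDS = [
    "ni_number", "surname", "forename", "date_of_birth",
    "address", "membership_status", "date_joined_scheme",
]

def missing_fields(record):
    """Return the common-data fields that are absent or blank in a record."""
    return [f for f in COMMON_DATA_FIELDS
            if record.get(f) in (None, "", "UNKNOWN")]

def common_data_score(records):
    """Percentage of records in which every common-data item is present."""
    if not records:
        return 0.0
    complete = sum(1 for r in records if not missing_fields(r))
    return 100.0 * complete / len(records)

# Hypothetical sample: one complete record and one with the gaps that are
# typical of long-standing deferred members (no address, unknown birth date).
members = [
    {"ni_number": "QQ123456C", "surname": "Smith", "forename": "Anna",
     "date_of_birth": "1961-04-02", "address": "1 High St",
     "membership_status": "deferred", "date_joined_scheme": "1989-01-01"},
    {"ni_number": "QQ654321A", "surname": "Jones", "forename": "Bill",
     "date_of_birth": "UNKNOWN", "address": "",
     "membership_status": "deferred", "date_joined_scheme": "1992-06-15"},
]

print(f"Common data score: {common_data_score(members):.1f}%")
for m in members:
    gaps = missing_fields(m)
    if gaps:
        print(f"{m['surname']}: missing {', '.join(gaps)}")

In practice the same sort of check would run across the full membership extract, with a separate pass for the scheme-specific conditional data, and the results feed directly into the prioritisation Duncan describes next.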

This isn’t going to be resolved in months, and it may not even be resolved in years, but the first step is to take that honest assessment and then really try to prioritise where the focus should be in 2017, 2018 and beyond.
