Shaping the rules for a data-driven future

Speaker: Nikhil Rathi, Chief Executive
Event: AFM 20th anniversary seminar
Delivered: 16 June 2022
Note: this is the speech as drafted and may differ from the delivered version

Highlights

  • The value of consumer data has soared and we often trade privacy for expediency.
  • We have had constructive engagement with Google, and more recently with Twitter, on promotional adverts by non-regulated firms, and we are hopeful other technology giants will follow.
  • Data is the lifeblood of a modern regulator and in the next five years, we expect to become as much a regulator of data as a financial one.
  • While data can help institutions identify risk, it can also entrench bias and make it more difficult for consumers to access loans or products such as insurance.
  • International coordination will be increasingly vital with the rise in risk associated with digitalization and gamification.
  • Investing in tech and skills is the key to staying ahead.

The last time I attended a twentieth, I was a graduate. Luckily, there will be few, if any, traces of what happened at those post-university events. Back then we did not have social media to document our every adventure.

For that is what happens today.

To get here, some of you booked your hotel using Google (after reading Tripadvisor reviews of course), you traveled to the airport using a cab-hailing app and were ushered onto the plane after filling in advance passenger information online and downloading the airline app to board.

At every point, you were tracked. Willingly.

And we give that permission, trading some of our privacy for expediency.

The growing value of data

As we have become more relaxed about the use of our data, the value of that data has soared.

Take Google ad revenue. Back when the AFM was set up, each Google user generated around €1.70 of ad revenue for the search engine.

By 2021, this had soared to over €30 per user – an increase of 1,800 per cent.
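That percentage follows from the two figures above. As a rough check (taking the 2021 figure to be about €32 per user, an assumed value since the text only says over €30):

\[
\frac{32 - 1.70}{1.70} \approx 18 \quad\Rightarrow\quad \text{an increase of roughly } 1{,}800\%.
\]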

Progress with tech firms but more work ahead

Google’s unparalleled reach has led us to hold constructive conversations with the tech firm about not unwittingly allowing the promotion of scams.

Since August, there has been barely a single example of harmful content relating to financial promotions.

We were pleased to reach an agreement with Google in the UK that it would ban ads for financial products that have not been approved by an authorized person. TikTok and Microsoft have also implemented similar policies.

Twitter last month introduced a ban on ads for certain products and services and implemented further restrictions on promotions.

We hope that other platforms such as Meta and YouTube will follow their lead and work with us to stop consumers being duped.

There is more work to be done to ensure online platforms and social media firms are accountable for harmful ad content which appears on their platforms.

We hope the Online Safety Bill, due to come into force in the UK, will ensure consistent protections. Platforms which host user-generated content will be held to account over that content. Tech firms will also have a duty to protect users from harm and search engines will have to minimize the presentation of harmful search results.

We are also working with other regulators in the UK – in the Digital Regulation Cooperation Forum – to share ideas on digital regulation as online platforms become ever more powerful – fueled largely by their holding of data.

This forum was created with our partners at the Information Commissioner’s Office, the Competition and Markets Authority and the media watchdog Ofcom to meet the complex challenges of regulating online platforms.

Data is the lifeblood of a modern regulator and in the next five years, we expect to become as much a regulator of data as a financial one.

AI and algorithms

We have seen retail banks and insurers relying more on algorithms – and we have explored the challenges and opportunities of algorithms in a study on artificial intelligence (AI) with the Bank of England. For example, when giving customers credit, the wrong model could deny them a loan or give a high-risk customer more than they would normally be able to repay.

AI can help mitigate risk – for example we have embarked on a web scraping program to detect internet scams rather than relying on humans reporting them to us.

However, AI is not foolproof.

The belief that allowing algorithms to control the supply and demand of crypto could avoid the volatility of markets controlled by fickle humans was challenged recently by the Terra Luna episode.

The stablecoin ended up being anything but, and was subject to dramatic volatility, as seen with other cryptocurrencies.

The future of regulation

As the complexity and breadth of data broaden, so too will our remit. Financial services products are moving away from legacy institutions to tech firms and other challengers who hold data.

And while data can help institutions identify risk, it can also entrench bias and make it more difficult for consumers to access loans or products such as insurance.

And where will that data be held and regulated? In China, public attitudes towards, and government handling of, data and AI are very different from those in the Netherlands or the UK.

A study produced last month showed the scale of the data leaks caused by Real Time Bidding. A person in the UK will have their location tracked 476 times a day through their online activity. In the Netherlands, the figure is slightly lower, at a little under 380 times a day.

Nobody can control how that data is used or which organization, individual or regime looks at it.

Regulators in the coming decades may have to step in to decide on the boundaries of firms’ interventions. And they will also have to decide whether it is appropriate for financial institutions to differentiate on access and pricing of services on the basis of tracked data.

Data and AI to help with risk

But digital developments can also help mitigate risk.

Our regtech systems monitor transactions and spot outliers that could suggest fraudulent behaviour. We are already working on automated systems for our threshold permissions.

And our organization has moved to the Cloud and revolutionized the ease with which we can access vital data to prevent financial crime and protect consumers.

We have migrated more than 52,000 firms and 120,000 users to our RegData platform, which grants access to flexible and scalable data collections.

The disclosures we are requiring on environmental, social and governance (ESG) products are already breaking new ground and the FCA is partnering with other regulators to share our experience. Last year, we welcomed the creation of the International Sustainability Standards Board. The board will set a global baseline of complete, consistent and comparable reporting standards on sustainability for the first time.

More data will give customers more power to choose products with better ESG ratings.

The future for regulators

As frictionless trading gathers pace, we will need more cooperation between regulators across borders and industries.

International coordination in this area is becoming increasingly important – both in terms of effectively protecting consumers from the increased risks of digitalization and from gamification across the provision of financial services.

Casting our minds forwards, it seems that coders could become the designers of society. They may have to become not just engineers but philosophers, deciding moral questions.

And artificial intelligence, if implemented through machine learning, could have the same implicit biases that humans have.

For regulators, the policy challenge for the future will be about complexity and breadth: in the AI and data space, rules will cover multiple areas, including financial services, data protection, labor law and competition policy.

Regulators will have to ensure that firms can show not just how this data was gathered, stored and used, but why it was deemed important in the first place and how they avoid it being used to discriminate against minorities and people with other protected characteristics.

That’s why we have to create diversity of thought in our culture and in our organizations so we can have a fighting chance of pre-empting multiple and competing interests.

Undoubtedly regulators like the AFM and FCA will face their next 20 years regulating much more data. That data will be broader, possibly deeper and delivered at a faster pace.

Regulators have to invest in the digital skills – and the right people – to keep pace with these developments. We are doing that at the FCA, with a further £34 million being invested in the next year and the recruitment of 100 more data experts.

And whilst we will rightly continue to hold firms to account, in the UK we will be committed to ensuring our regulation does not stifle innovation or dramatically add to the burden – otherwise it will be customers, the very people we exist to protect, who will bear the costs.
