
Bank of England had a heavy week. Now for analytics in the cloud

The Bank of England had a busy end to September. On Wednesday last week, it said it would buy £65 billion (c. $72 billion) of government bonds after the pound tumbled to historic lows and pension funds went into meltdown, all seemingly the result of the government's mini-budget days earlier.

The move sent shockwaves through global financial markets. Whatever the politics, the Bank of England's remit is to ensure monetary and financial stability – and to do so, it leans heavily on data.

To improve the data side of its operations, the bank has kicked off a far-reaching programme to improve data quality across the financial sector, while at the same time it is in the early stages of migrating its on-premises data analytics platform to the cloud.

Speaking to The Register, Peter Eckley, head of data and analytics strategy at the Bank of England, said that in times of crisis or instability, as the UK saw last week, the Bank relies on a vast array of data and forecasts.

Collecting unconventional data sources… means economists don’t then have to wait for the end of the month or end of the quarter statistics. They are fighting a crisis and can see day-to-day what is happening

“When events like the pandemic, or the war in Ukraine, trigger changes in the macro-economy, which trigger changes in the forecasts… it feeds through in that way,” he said.

At the time of the pandemic, for example, the Bank’s strategy led to it collecting unconventional data sources to understand economic activity in lieu of waiting for quarterly updates.

“That might be how many trucks are sitting on the motorway to Dover or hotel occupancies, flight data, and so on to try and build a higher frequency picture of what’s happening to macroeconomic activity more in real time. That means economists don’t then have to wait for the end of the month or end of the quarter statistics. They are fighting a crisis and can see day-to-day what is happening,” he said.
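The indicators Eckley mentions – truck counts, hotel occupancy, flight data – can be blended into a single daily activity index. A minimal sketch of that idea in Python, using synthetic data and illustrative series names (these are not the Bank's actual feeds or methodology):

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins for the kinds of daily series Eckley describes.
# Series names and values are illustrative only.
dates = pd.date_range("2020-03-01", periods=14, freq="D")
rng = np.random.default_rng(0)
indicators = pd.DataFrame({
    "trucks_to_dover": rng.normal(1000, 50, len(dates)),
    "hotel_occupancy_pct": rng.normal(60, 5, len(dates)),
    "flights_departed": rng.normal(800, 40, len(dates)),
}, index=dates)

# Standardise each series (z-score) so different units are comparable,
# then average across series into one daily activity index.
zscores = (indicators - indicators.mean()) / indicators.std()
activity_index = zscores.mean(axis=1)

print(activity_index.head(3))
```

The point of the z-scoring step is that trucks, occupancy percentages, and flight counts live on wildly different scales; normalising first means no single series dominates the composite. Real nowcasting models weight and filter these inputs far more carefully, but the daily cadence – rather than the method – is what frees economists from waiting for quarterly statistics.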

The set-up

The Bank of England currently relies on an on-premises data stack, based on a Hadoop data lake and a SQL data warehouse. It uses a gamut of analytics tools including Microsoft Excel, R, Python, Git for version control, and Tableau for data visualization. It also uses more specialist econometrics tools such as Stata and EViews.

“It is sort of a bit of a heterogeneous picture at the moment and the medium-term vision is to be migrating the data analytics workloads to cloud,” he said.

It is perhaps no coincidence, then, that Eckley was speaking at a Teradata event, meeting with executives from the vendor, which has been keen to demonstrate it has successfully re-engineered its data warehouse and analytics platform for the cloud. Teradata counts global banks among its customers, including NatWest and HSBC.

The Bank of England is the UK's central bank. It was given responsibility for setting interest rates in 1997 by the then Labour government. In 2013, the Financial Policy Committee (FPC) was established as part of the new system of regulation brought in to improve financial stability after the financial crisis.

Speaking at the Teradata event in London this week, Eckley said: “As regulators we are responsible for monetary and financial stability: we need to stabilise an incredibly complex global system. We need to be able to see where the risks are so we can mitigate them. We need lots of data and firms spend hundreds of millions if not billions of pounds per year in the UK alone assembling that data. And yet the data is of variable quality and not always fit for purpose.”

To meet this challenge, the Bank of England is one year into a joint programme with the Financial Conduct Authority to improve data quality. The project is expected to take a decade.

But new data technologies are not simply helping the Bank of England fulfil its remit in terms of monetary and financial stability. They are also creating bigger challenges on the regulatory side of that remit, given that many of the institutions it regulates are adopting AI and machine learning to make decisions. Ensuring they do so without introducing unsound risk into the system is a challenge for the Bank.

“A lot of the firms we regulate are adopting [AI and ML], and so it’s this sort of frontier of policy looking at operational resilience. We need to look at the use of machine learning and artificial intelligence and make sure that they are applied in a way which harnesses innovation but also avoids threats to the safety and soundness of the firm, or to financial stability,” Eckley said.

In its report published in February, the Bank said AI could benefit consumers, businesses, and the wider economy, but can also amplify risks and create new challenges. “The AI models used in the financial system are becoming increasingly sophisticated. Their speed, scale, and complexity, as well as their capacity for autonomous decision-making, have already sparked considerable debate,” it said.

With the global economy and financial markets set for a stormy ride in the near future, despite the U-turn on the top rate of tax earlier this morning providing some immediate relief, the Old Lady of Threadneedle Street is at least striving to get to grips with the data it needs to improve its view of the future. ®

