Eryk Walczak

Measuring Complexity of Banking Regulations Using Natural Language Processing & Network Analysis

The banking reforms that followed the financial crisis of 2007–08 led to an increase in UK banking regulation from almost 400,000 to over 720,000 words, and to concerns about its complexity. We define complexity in terms of the difficulty of processing linguistic units, both in isolation and within a broader context, and use natural language processing and network analysis to calculate complexity measures on a novel dataset that covers the near universe of prudential regulation for banks in the United Kingdom before (2007) and after (2017) the reforms. Linguistic complexity, i.e. textual and network complexity, in banking regulation is concentrated in a relatively small number of provisions, and the post-crisis reforms have accentuated this feature. In particular, comprehending provisions within a tightly connected ‘core’ requires following long chains of cross-references.
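The two kinds of measures can be illustrated with a minimal sketch. The code below is not the paper's actual pipeline; the provision texts and cross-references are hypothetical, and it simply contrasts a local textual measure (length in words) with contextual network measures (how many other provisions a reader must follow, and how long the reference chains are) using the networkx library.

```python
# Minimal sketch, assuming provisions and their cross-references are already
# extracted. Data here is hypothetical, for illustration only.
import networkx as nx

# Hypothetical provisions: id -> text
provisions = {
    "A1": "A firm must maintain adequate capital resources as set out in B2.",
    "B2": "Capital resources are calculated according to the method in C3.",
    "C3": "The calculation method is the standardised approach.",
}

# Hypothetical cross-references: (citing provision, cited provision)
cross_refs = [("A1", "B2"), ("B2", "C3")]

# --- Textual (local) complexity: e.g. provision length in words ---
word_counts = {pid: len(text.split()) for pid, text in provisions.items()}

# --- Network (contextual) complexity: cross-reference graph ---
G = nx.DiGraph()
G.add_nodes_from(provisions)
G.add_edges_from(cross_refs)

# Number of other provisions reachable via cross-references from each provision
reachable = {pid: len(nx.descendants(G, pid)) for pid in G}

# Length of the longest chain of references starting from each provision
# (valid here because this toy graph contains no cycles)
chain_length = {
    pid: max(
        (len(path) - 1
         for target in nx.descendants(G, pid)
         for path in nx.all_simple_paths(G, pid, target)),
        default=0,
    )
    for pid in G
}

print(word_counts)   # {'A1': 12, 'B2': 10, 'C3': 7}
print(reachable)     # {'A1': 2, 'B2': 1, 'C3': 0}
print(chain_length)  # {'A1': 2, 'B2': 1, 'C3': 0}
```

In this toy example, provision A1 is not especially long, but interpreting it in full requires following a chain of two cross-references; measures of this kind underlie the finding that complexity is concentrated in a tightly connected ‘core’ of provisions.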

Key Takeaways:
• AI/ML techniques can be used to study the complexity of banking regulations
• We describe how UK banking regulations changed before and after the Great Financial Crisis (2007 vs. 2017)
• We develop a new dataset that can be used for other purposes; this research can be seen as an early step towards automating banking regulation (RegTech)

Eryk Walczak is a senior research data scientist in the Advanced Analytics Division at the Bank of England. Prior to joining the Bank, Eryk worked in analytic roles for a fintech and a social media company. His current research interests involve applying data science and experimental methods to study macroeconomics.
