ECB reserve tiering: time for recalibration?

The European Central Bank introduced “tiered” remuneration for bank reserves in October 2019. Since then, bank reserves at the ECB have doubled, but the tiering methodology has remained unchanged. There may be reason to revisit it.

Tiered remuneration on reserves: how does it work?

Bank reserves at the ECB have been subject to negative rates since June 2014. In late 2019, the ECB decided to exempt some bank reserves from this negative rate. The “exemption allowance” was set at six times required reserves (plus required reserves themselves, making seven times required reserves in total). Required reserves, in turn, are a percentage of deposits and other short-term liabilities that banks have issued to the general public. So loosely speaking, if households or businesses deposit an amount of money at a bank, then the exemption allowance for that particular bank increases by 7% of that amount (seven times the required reserve ratio of 1%), allowing it to have more reserves at the ECB without facing the negative deposit rate. If banks extend credit to the real economy, this will also create more deposit liabilities for banks, which in turn increases the amount of reserves that banks can park at the ECB without paying negative rates.
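The arithmetic above can be made concrete with a short sketch. The 1% reserve ratio and the six-times multiplier come from the text; the example bank's deposit base is hypothetical.

```python
# Tiering arithmetic as described above; the example bank's figures are made up.
RESERVE_RATIO = 0.01   # required reserves: 1% of eligible short-term liabilities
MULTIPLIER = 6         # exemption allowance: 6x required reserves, on top of
                       # the required reserves themselves (7x in total)

def exemption_allowance(eligible_liabilities: float) -> float:
    """Reserves exempt from the negative deposit rate, same unit as input."""
    required = RESERVE_RATIO * eligible_liabilities
    return required + MULTIPLIER * required  # = 7% of eligible liabilities

# A hypothetical bank with EUR 100bn in eligible deposits:
print(exemption_allowance(100e9) / 1e9)  # -> 7.0 (EUR bn exempt)
```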

The exemption allowance share of reserves has almost halved

The share of bank reserves exempt from negative rates has dropped from 50% in late 2019 to 28% today

When the exemption allowance was first enacted, about half of all bank reserves at the ECB qualified. Yet as the pandemic struck Europe, the ECB conducted new TLTRO operations, restarted its Asset Purchase Programme (APP) and implemented an additional Pandemic Emergency Purchase Programme (PEPP). Together, these programmes caused bank reserves at the ECB to double in a year’s time to over €3700bn today. The exemption allowance increased as well, in tandem with bank liabilities. But the allowance increase came nowhere near the increase in total reserves, and as a result the exemption allowance share has dropped from 50% in late 2019 to 28% today (see chart below). The negative rate costs that banks incur on their reserves have correspondingly increased. Has the time come for the ECB to review its exemption parameters?
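A back-of-the-envelope check of that falling share, using the article's round numbers; the modest allowance-growth figure is an assumption chosen to match the quoted 28%, not a published statistic.

```python
# Exempt share = allowance / total reserves. Reserves roughly doubled while
# the allowance (tied to bank liabilities) grew only modestly.
reserves_2019 = 1.85e12                  # assumed: ~half of today's EUR 3.7tr
allowance_2019 = 0.50 * reserves_2019    # ~50% of reserves exempt in late 2019

reserves_now = 3.7e12
allowance_now = 1.12 * allowance_2019    # assumed ~12% liability growth

share_now = allowance_now / reserves_now
print(round(share_now, 2))  # -> 0.28, in line with the ~28% quoted
```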

Bank reserves at Eurosystem and exemption allowance share


For “involuntarily” held reserves only!

One could ask: why would the ECB give banks a free gift in the form of negative rate exemptions? A fair answer to this justified question would be that the ECB should indeed only provide relief for reserves that banks cannot avoid holding. It should not be so generous with reserves that banks voluntarily choose to hold. The latter applies e.g. to reserves that are created as a consequence of TLTRO borrowing by banks. The costs and benefits of those reserves should be assessed in conjunction with the TLTRO borrowing rates the ECB applies.

It seems neither necessary nor warranted to exempt reserves that are a consequence of TLTRO loans

If banks don’t like paying the negative rate costs on these reserves, they should repay their corresponding TLTRO loan. The fact that many banks choose to keep their TLTRO loans shows that the benefits exceed the negative rate cost of holding the corresponding reserves. It therefore seems neither necessary nor warranted to exempt reserves that are a consequence of TLTRO loans.

But how about reserves that banks cannot avoid? Asset purchases (APP and PEPP) are an ECB initiative. While it could be argued that banks indirectly benefit as asset purchases contribute to a healthier economy and compress credit premiums, in a direct sense they lead to higher bank reserves and associated negative rate costs. Isn’t this like the government obliging people to hoard toilet paper, and then proceeding to tax toilet paper holdings? The box below explains in more detail why asset purchases necessarily lead to higher reserves that banks collectively cannot avoid, and how this differs from the reserves over which banks do have influence.

 Monetary operations since 1999: the ECB’s Money Trumpet

To explain today’s glut of excess reserves, it helps to briefly review the history of the ECB’s monetary policy operations. In an effort to present geeky monetary stuff as something cool, we’ll refer to it as the ECB’s Money Trumpet. The chart below shows supply and uses of liquidity in the base money market – the market where central and commercial banks interact – from the perspective of the Eurosystem’s balance sheet. We won’t attempt to summarise the ECB’s monetary operations in just a few paragraphs, but the Money Trumpet shows five phases of ECB liquidity policies in action. Key to understanding this chart is the law of double-entry bookkeeping, which dictates that any asset acquired by the Eurosystem (liquidity supplied to the market, recorded above the x-axis) is matched one-for-one by a Eurosystem liability increase (liquidity parked at the Eurosystem, recorded below the x-axis).

ECB Money Trumpet: Eurosystem liquidity supply and absorption (€ tr)


1. 1999-2008: the good old days

The first phase runs from 1999 until 2008. During this time, Eurosystem liquidity was steady and predictable. The Eurosystem made sure liquidity was always tight, meaning there was net demand for liquidity from banks. By offering liquidity sources in scarce quantities, the ECB could steer money market rates. The auctions of ECB liquidity supply (the open market operations) were capped, meaning that banks were oftentimes only allotted part of what they had asked for. The flipside of this was that bank reserves at the ECB were low as well. Aside from the required minimum reserves, “excess” reserves tended to be minimal.

2. 2008: switching to full allotment

As the interbank market dried up in 2008, maintaining tight liquidity in the system became too risky. The ECB instead switched to full allotment auctions, allocating banks whatever credit they asked for. Indeed Eurosystem claims on banks under open market operations increased. Bank reserves at the ECB increased in tandem. The abundance of excess reserves meant that at least the financial system would not be brought down by liquidity shortages, even as interbank lending had all but stopped.

3. 2015: fighting the deflation demon

The policy goal of achieving flush liquidity with full allotment auctions remains in place today. But in 2015, the ECB added another element. It embarked on large-scale asset purchases (in fact asset purchases had started in 2009 already, but on a limited basis only). Indeed, bank reserves at the ECB increased in step with asset purchases. Excess reserves in the system ballooned.

It is here that our distinction becomes clear between reserves that banks hold by choice and reserves they cannot avoid. The ECB’s auctions (open market operations) had always resulted in liquidity supplied at the request of banks. The resulting bank reserves were thus driven by bank demand. The newly introduced asset purchases, however, are “supply-driven”: they are initiated and controlled by the Eurosystem. They too result in bank reserve increases, but not at the request of the banks, nor do the banks have a choice here. They can merely try to push the hot potato around between them, but they cannot influence the total amount of reserves created by ECB asset purchases.

4. 2019: introduction of tiered remuneration on deposits

The ECB decelerated and then stopped its large-scale asset purchases in 2018. As the ECB continued to reinvest maturing assets, excess liquidity in the form of bank reserves at the ECB stabilised, but did not drop. In September 2019, the ECB introduced a two-tier remuneration system for excess liquidity, to “support bank-based transmission of monetary policy”. The first tier, the allowance of currently six-plus-one times required reserves, is exempt from the negative deposit rate. The light orange part in the chart below shows the estimated exemption allowance. Note that in practice, exemptions are calculated per individual bank, and may therefore be lower for some banks (if their excess liquidity remains below their allowance). Therefore, the shown aggregate allowance should be interpreted as an upper bound estimate.

Bank reserves at Eurosystem, attributed to liquidity sources (€ trn)


5. 2020: Renewed asset purchases and TLTROs

In April 2020, the pandemic prompted renewed asset purchases plus an expansion of long-term refinancing operations. Bank reserves increased from €2046bn on 31 March to €3740bn today, and with further asset purchases and TLTRO operations in the pipeline, the end is not in sight.

 Thanks, but how do ECB asset purchases drive higher reserves?

When the ECB (strictly speaking, the Eurosystem) buys a government bond or any other asset directly from a bank, it credits the bank’s reserve account. As such, the ECB’s balance sheet lengthens: it acquires an asset (the bond) and a liability (increased reserves). When the ECB buys a bond from e.g. a pension fund, more steps are involved, but the effect on reserves is the same. As the ECB acquires the asset, it credits the reserves of the bank where the pension fund has an account. The ECB instructs the bank, in turn, to credit the pension fund’s bank account. The pension fund may move the deposit to another bank or reinvest in another asset, but this does not reduce the reserves in the system. The reserves created by the asset purchase are an ECB liability, and can only be reduced at the ECB’s discretion, if it decides to sell the bond again (or to not reinvest the principal when the bond matures). Of course, individual banks can try and reduce their reserves, but total reserves in the system (insofar as they are created by ECB asset purchases) are determined by the ECB.
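The box's bookkeeping steps can be sketched as a toy balance-sheet update; the account names and the €100 amount are purely illustrative.

```python
# Toy balance sheets: the ECB buys a EUR 100 bond from a pension fund.
ecb = {"bonds": 0, "bank_reserves_owed": 0}          # asset / liability
bank = {"reserves_at_ecb": 0, "client_deposits": 0}  # asset / liability
fund = {"bonds": 100, "bank_deposit": 0}             # assets

def ecb_buys_bond(amount: int) -> None:
    # The ECB's balance sheet lengthens: a new asset and a new liability.
    ecb["bonds"] += amount
    ecb["bank_reserves_owed"] += amount
    # The fund's bank is credited with reserves...
    bank["reserves_at_ecb"] += amount
    # ...and credits the fund's deposit account in turn.
    bank["client_deposits"] += amount
    fund["bonds"] -= amount
    fund["bank_deposit"] += amount

ecb_buys_bond(100)
# Reserves in the system rose by exactly the purchase amount; only the ECB
# (by selling, or not reinvesting at maturity) can reverse this.
assert ecb["bonds"] == bank["reserves_at_ecb"] == 100
```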

Without further measures, the negative rate costs on involuntarily held reserves could almost double this year

Armed with this knowledge about the role of monetary operations and asset purchases, we can now split bank reserves into a voluntary and an involuntary part. We do this by allocating reserves to these two sources of liquidity. This allocation cannot be calculated to the last euro, so we provide a range estimate based on different liquidity attribution assumptions. The exemption allowance amounted to roughly 70% of “involuntary” bank reserves in late 2019. As the ECB restarted its asset purchase programmes in 2020, the share of the exemption allowance in “involuntary” reserves started to fall, reaching about 50% currently and set to fall further in the year ahead. Without further measures, the negative rate costs on involuntarily held reserves could almost double this year, from about €2.5bn in 2020 to some €5bn in 2021.
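A rough replication of that cost estimate; the average involuntary-reserve balances below are stylised assumptions consistent with the text, not published figures.

```python
DEPOSIT_RATE = -0.005  # -50bp charged on unexempt reserves

def annual_negative_rate_cost(avg_involuntary_reserves: float,
                              exempt_share: float) -> float:
    """Yearly cost banks pay on the unexempted part, in euros."""
    unexempt = avg_involuntary_reserves * (1 - exempt_share)
    return unexempt * -DEPOSIT_RATE

# Assumed stylised averages, with the exempt share falling from ~70% to ~50%:
print(annual_negative_rate_cost(1.65e12, 0.70) / 1e9)  # ~2.5 (EUR bn, 2020)
print(annual_negative_rate_cost(2.0e12, 0.50) / 1e9)   # -> 5.0 (EUR bn, 2021)
```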

Bank reserves at Eurosystem, attributed to liquidity sources (€ bn)


But isn’t the TLTRO negative rate borrowing compensating for all of this?

So, you may say, this complicated reasoning about where reserves come from is all very well, but if banks meet their TLTRO benchmark lending, then the negative rate they pay on their reserves is in fact more than compensated for by the negative rate they get paid on their TLTRO borrowing – at least in 2020 (see table). So why worry about negative rates on reserves?

Indeed, over the past year and successive TLTRO iterations, the ECB has progressively lowered the TLTRO borrowing rate further into negative territory. The most recent TLTRO-III offers a base borrowing rate equal to the deposit rate (-50bp). This means that the negative rate banks receive on their TLTRO borrowing and the rate they pay on the corresponding “voluntary” reserves cancel out. In other words, the negative rate costs for banks associated with the “voluntary” reserves are matched by TLTRO borrowing rate revenues. This leaves the involuntary reserves.
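In numbers, for a hypothetical bank borrowing €100bn at the TLTRO-III base rate (the loan size is made up; the rates are from the text):

```python
DEPOSIT_RATE = -0.005  # rate charged on (unexempt) reserves
TLTRO_BASE = -0.005    # TLTRO-III base borrowing rate = deposit rate

loan = 100e9  # EUR 100bn borrowed under TLTRO, creating 100bn of reserves
cost_on_reserves = loan * -DEPOSIT_RATE  # 0.5bn/yr paid on the reserves
revenue_on_loan = loan * -TLTRO_BASE     # 0.5bn/yr received on the borrowing
print((revenue_on_loan - cost_on_reserves) / 1e9)  # -> 0.0: they cancel out
```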

The TLTRO “bonus” rate is not meant to address negative rate costs of reserves. It serves the separate goal of incentivising bank lending to the real economy

Yet with TLTRO-III, the borrowing rate drops to -100bp if banks succeed in attaining a certain benchmark lending to the real economy. As can be seen from the table above, this “bonus” rate revenue (€4.3bn) exceeded APP+PEPP rate costs in 2020 (€2.2-3.1bn), provided benchmarks have been reached, which is likely the case for most banks, though not all.

Yet this TLTRO “bonus” rate is not meant to address negative rate costs of reserves. Instead, it serves a separate policy goal, namely “incentivising bank lending to the real economy”. In our view, one instrument (the TLTRO rate in this case) can serve only one policy purpose at a time. This goal should not be conflated with the issue of involuntary reserve costs.

Besides, there are composition effects to take into account: the banks that take out TLTRO loans and those that hold the highest reserves are not necessarily the same. Moreover, whether banks will reach their benchmark lending in 2021 too remains to be seen. Business credit demand has been weak in recent months, and the outlook isn’t great either. Given all these considerations, a review of tiering might increasingly be warranted as asset purchases continue to add to reserves. Whether the ECB sees it this way too is a different matter.


Initially, tiering was calibrated in such a way that some 71% of the reserves that banks hold involuntarily as a result of ECB asset purchases was exempt from negative rates. At the time of writing, this share has dropped to just above 50%. With the ECB having committed to further asset purchases, involuntary reserves may increase by a few hundred billion more this year. As the negative rate-exempted allowance is linked to bank liabilities, it rises much more slowly, and thus shrinks as a share of involuntary reserves. Negative rate costs on involuntary reserves may double in 2021 compared to 2020. The question of whether the ECB will revisit the tiering methodology thus becomes increasingly pressing. The answer will depend on whether the ECB considers the TLTRO arrangement exclusively as an incentive for lending to the real economy, as it officially states, or whether it considers the TLTRO to simultaneously serve the secondary goal of providing negative rate relief to banks.

This article originally appeared on

Can data-based lending improve inclusion and reduce economic volatility?

Tech platforms make their credit assessments based on different data than banks traditionally tend to do. Recent research finds that this could have profound implications for access to credit, credit risk, monetary policy and the economic cycle

The basics of credit are the same, always, everywhere

The essence of getting credit, from an economic perspective, is as old as humanity. A borrower obtains funds, and promises to pay them back some day. But the deal suffers from “asymmetric information”: the lender knows less than the borrower about e.g. the borrower’s intentions, behaviour and the risks they face. So the lender wants to see proof of income, looks at the prospects of the economy the borrower operates in, and draws from experience with similar borrowers in the past.

The essence of getting credit, from an economic perspective, is as old as humanity.

Based on all that, the lender demands a risk premium: a markup on the interest charged, as an insurance premium against losses incurred should the borrower be unable to repay. The premium can be lowered if the borrower is able to pledge collateral: assets, such as a house, the lender can repossess if things go wrong. This basic process is true for banks, but also for markets, where the rates on collateralised loans and bonds tend to be lower than those on unsecured ones.
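In its simplest textbook form, the premium covers the expected loss on the loan. The sketch below illustrates that logic only; it is not how any particular lender actually prices, and the rates, default probability and recovery figures are invented.

```python
def loan_rate(risk_free: float, pd: float, lgd: float) -> float:
    """Break-even lending rate: base rate plus an expected-loss premium.
    pd  = probability of default per year (assumed)
    lgd = loss given default, i.e. the share not recovered (assumed)"""
    return risk_free + pd * lgd

# Unsecured loan: 2% default probability, 60% lost if default occurs:
print(round(loan_rate(0.01, 0.02, 0.60), 4))  # -> 0.022 (2.2%)
# Pledging collateral cuts the loss given default, and with it the premium:
print(round(loan_rate(0.01, 0.02, 0.20), 4))  # -> 0.014 (1.4%)
```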

Current bank and market lending systems do have their downsides

This basic mechanism tends to work reasonably well, but it does have a number of side effects.

1. A minimum amount of financial data is needed to assess credit risk

At the individual level, borrowers who cannot show a multi-year history of stable income, or a valuable asset to pledge as collateral, may find it more difficult and expensive to obtain credit. This applies, for example, to self-employed people with volatile incomes and start-up entrepreneurs.

2. At the macro level, the system amplifies the economic cycle, for better and for worse

When the economy grows, lenders will make a more favourable assessment of borrowers’ chances to repay their loan. Retail clients are more likely to retain their job and get wage increases, while SMEs are more likely to prosper and grow. If real estate prices increase, this increases the value of the collateral borrowers can pledge. More borrowers will be able to obtain more credit. This, in turn, increases spending power in the economy, reinforcing the upward cycle of GDP growth and increasing real estate prices. However, once the cycle turns, this self-reinforcing mechanism goes into reverse. Borrowers’ prospects deteriorate, and with it, lenders’ willingness to lend to them. Moreover, if real estate prices fall as well, lower collateral valuations reduce the amount of credit a borrower can obtain. This process has been analysed as the “financial cycle” and, earlier on, as the “financial accelerator mechanism”.

If only there were a way to reduce the asymmetry of information. If the lender were able to get a real-time and thorough understanding of the borrower’s business, the need to assess broader economic conditions might be lowered, and collateral requirements might be relaxed or even removed completely. This would enhance access to credit and reduce the procyclical tendencies embedded in both bank and market lending today. But how, you say?

Credit based on alternative, non-financial data to the rescue?

Scoring based on “non-traditional data” is already as good as, or even better than, traditional ratings.

This is where tech platforms come in. They gather data on their users’ interactions and transactions on the platform, and often beyond. Their users tend to spend a lot of time on those platforms (site and app, but also e.g. partner websites). Based on the profile they build of their users, platforms can make real-time credit assessments, and may feel comfortable e.g. extending consumer credit to buyers, but also lending to merchants active on the platform. Recent BIS analysis shows that such credit scoring based on “non-traditional data” is already as good as, or even better than, traditional ratings.

On the one hand, such profiling may sound creepy, and indeed there are trade-offs to consider, especially in terms of privacy. Policymakers in Europe and elsewhere continue to review data use developments and applicable regulations. On the other hand, real-time credit assessments based on data gathered about interactions and transactions may extend credit and other financial services to groups that were previously excluded because they lacked a stable income history or collateral to pledge.

There are trade-offs to consider, especially in terms of privacy.

That is a welcome development in a world where temporary and self-employment become more prevalent, and where digital markets invite entrepreneurship. Moreover, analysis of “non-traditional” data allows lenders to provide borrowers with more timely, personalised and accurate tools to monitor their financial situation and loan obligations. This should benefit both borrower and lender.

For the economy at large, loosening the relationship between economic conditions, real estate valuation and credit may reduce procyclicality. A recent BIS paper studying bigtech credit in China shows that the correlation with real estate prices and GDP is indeed lower for bigtech credit than for credit assessed on a traditional basis, which suggests this form of credit should be less procyclical.

Tapping new sources of data is not the exclusive domain of fintech and bigtech

While bigtech credit thus far is not the dominant form of borrowing in most economies, and plays only a minor role in Europe, this will change, and possibly faster than many currently anticipate. See e.g. this recent FSB report about how bigtech is expanding its finance offering in emerging markets. Of course, it’s not just bigtech firms that can lend using “non-traditional data”. Other lenders may also rely on data-crunching credit assessment by platforms in an originate-to-distribute setup – this is the way China’s Ant Group works together with banks.

Any lender can develop the analytical capability and expertise to use alternative data, provided they have access to them.

Apart from relying on other companies’ expertise, any lender can try to develop the analytical capability and expertise to use non-traditional data, provided they have access to them. Not every lender will be in a position to develop a platform of their own, where buyers and merchants generate enough data to make meaningful credit assessments. But lenders could also access data from other platforms, for example by entering into a bilateral partnership with a platform. Alternatively, regulation could be designed to enforce data portability, allowing borrowers to give lenders access to the platform data they have generated. Indeed, policymakers such as the European Commission recognise the value of data and data sharing for financial services and beyond. Policies enforcing wider data sharing – all subject to user permission – are being actively explored.

To put things into perspective: platform data-based borrowing will not solve all financial exclusion and procyclicality problems overnight. It may help to reduce information asymmetry between borrower and lender, but will not fully eliminate it. Moreover, the future always holds risk and uncertainty that even the most advanced algorithm fed with the most up-to-date data cannot foresee – a lesson we have again learnt in 2020. But the addition of a platform-based credit channel could certainly help enhance credit access and financial and economic stability. Moreover, a less cosy relationship between real estate prices and credit would be a welcome development both for credit markets and associated risks, and for housing markets and affordability.

This article originally appeared on

EBF-panel Tech meets Finance

With the year 2020 dominated by Covid-19-related restrictions of physical interaction and an increasing relevance of technology and digital interaction for businesses and citizens, legislative initiatives by the European Commission aim at addressing digital transformations in Europe’s economy. Technology actors, i.e. BigTech, are expanding their reach from their core businesses into adjacent industry sectors. In turn, the European market is increasingly facing questions of competition, market conditions and available digital infrastructure. The financial sector is a frontrunner in the application of digital innovation to the benefit of customers and business operations. Consequently, the financial ecosystem meets newcomers who – thanks to possible gatekeeper roles – carry significant changes into the known environment of business relations and regulatory framework.

PANEL DEBATE: Innovation in finance – digital platform solutions

How are European banks, regulators and tech companies perceiving these changes? With the European Commission’s Digital Services Act coming up, the conversation will touch upon the role for new ex ante regulation and the protection of fair competition and innovation required for a prosperous Europe.

  • Pablo Urbiola Ortún, Head of Digital Regulation and Trends, BBVA
  • Elisabeth Noble, Senior Policy Adviser, EBA
  • Jan Boehm, European FinTech Association
  • Teunis Brosens, Head Economist for Digital Finance and Regulation, ING

Moderated by: Sébastien de Brouwer, Chief Policy Officer, EBF

CBDCs and commercial banks: Evolution or Revolution?

As countries around the world accelerate the development of retail central bank digital currency, the impact on commercial banks and their role in this endeavour remain uncertain. OMFIF’s Digital Monetary Institute convened a panel discussion to explore what a retail CBDC public-private partnership would look like and how this would shape banks’ business models.

The panel included Hanna Armelius (Riksbank), Henny Arslenian (PWC), David Birch, and me. We discussed the potential division of labour between central banks, commercial banks and technology companies, and assessed how non-bank providers’ participation in a CBDC roll-out could impact traditional banks’ strategy and operations, as well as wider implications for global banks if digital currency is adopted for cross-border payments.

Tellingly, at the start of the seminar 78% of the audience thought the benefits of CBDC would outweigh the risks posed to the commercial banking sector, but this had dropped to 61% by the end of the webinar.

The consensus view among panelists was that central banks could distribute digital currencies through banks, in a public-private partnership not unlike the way the distribution of physical cash is organised. This would be a less disruptive scenario for banks and for the financial system more broadly, although availability of deposits would remain an issue for the supply and pricing of credit to businesses and households. While preserving banks should never be a policy goal, preserving financial stability while introducing CBDC should be. This does not mean a no-go for CBDC but does imply a reality check, and a warning to proceed carefully.

 Sources: ING THINK, OMFIF DMI, video registration. 

How Covid-19 is drastically changing digital finance

Digital finance was already a fast-changing place before Covid-19. After the pandemic, expect more government intervention. A changing appreciation of data may have strong implications for finance as well.

The Covid-19 pandemic will eventually pass, but not before disrupting vast swathes of the global economy and making its mark on digital finance.

1. Bigger government role

The increased role of government in the economy and financial sector is likely to persist to some degree. Before Covid-19, the changing geopolitical landscape had already made policymakers aware of the strategic importance of vital domestic infrastructure, including communications and payments. Governments are now also looking into data, an area already identified as a strategic priority by the European Commission earlier. In finance, governments have quickly established huge guarantee schemes to see businesses through the crisis.

It will take years to wind down the increased public role in finance 

Even in a best case scenario, it will take years to wind down this increased public role in finance and the broader economy. A renewed debate on the division of labour between public and private sectors in finance will likely flare up, once the dust settles.

2. Reduced foreign dependence

Calls for autarky (e.g. in producing face masks) may fade quickly, but both businesses and authorities will look for less complex cross-border supply lines and smaller foreign dependencies, both because Covid-19 has demonstrated their fragility and on national security grounds. At the same time, governments have been made acutely aware of the need for high quality communications infrastructure, e.g. to facilitate working from home. Countries with leading positions in the required technologies (think China and 5G) will be aware of their strong negotiating position.

Authorities realise digital platforms are playing useful roles in locked down societies.

Finance has grown into a business with complex cross-border linkages, ranging from financial ties to IT outsourcing and supervision. These international, sometimes global, linkages were already critically scrutinised by authorities, a development that may intensify post-Covid-19. How bigtech’s endeavours in finance fit the revised picture remains to be seen. National authorities, e.g. in Europe, were increasingly critical of bigtech before the pandemic struck, but also realise that major digital platforms have become an important part of daily life and are playing useful roles in locked-down societies.

3. Cybersecurity

The switch to working from home has strained corporate and national network infrastructures. With resources reallocated to keeping the show on the road, this exposes businesses (and an already heavily burdened health care sector) to increased cybersecurity risks, such as ransomware attacks and data leaks, but also to things like fake news. We have to reckon with the possibility that bad actors exploit security gaps today to establish a presence, only to abuse that presence later on, when it suits them best. To counter this threat, cybersecurity may be expected to move up the policymakers’ agenda. Given that cybercrime knows no borders, it is best fought at the international level. In the EU, while there is a dedicated cybersecurity agency (ENISA), cooperation is not yet as deep as it is in e.g. markets and finance. That might change in the future.

4. Faster adoption of digital interactions in retail

On the retail end, we expect the adoption of digital finance technologies to accelerate post-Covid-19. Many people have had no other choice than to acquaint themselves with ways to do business digitally – ranging from video conferencing to exchanging documents securely by digital means, or paying contactless to minimise physical contact.

Identity verification via video call may become accepted practice rapidly

We expect identity verification via a video call to become accepted rapidly. In the medium term, this may provide a boost to digital-only financial intermediaries, and may accelerate the demise of brick-and-mortar financial shops.

5. Focus on inequality and financial inclusion

The pandemic has put new focus on inequality in society. Digital financial intermediaries may be asked to intensify their efforts towards financial inclusion, e.g. by improving access to financial products for groups such as the self-employed, temporary workers and small and medium-sized enterprises (SMEs). Access for these groups is hindered by the limited availability of financial data and the high costs of processing them. One way to go about this is to augment financial data with a diversity of non-financial data sources.

6. Changing attitudes vis-à-vis data

Until recently, data debates centred around things like privacy and the question of whether people are comfortable paying for platform services with their data. While people may not like the idea when they think about it, in practice they continue to use said services. Several European governments are currently considering tracking location, health and other personal data for Covid-19 control monitoring and personalised health advice (e.g. to self-quarantine). There are other examples where data sharing could be useful in a lockdown society. Acute liquidity needs arising from the lockdown have demonstrated the need for quick credit checks. In the absence of readily available financial data, alternative (platform) data may prove very valuable.

People may reconsider the potential gains of sharing data and the need to protect users and society at large from data abuse

While data protection and usage monitoring remain paramount, far-reaching data sharing may be temporarily acceptable to most given the challenge at hand. But after the pandemic too, people may reconsider the value of data, the potential gains of sharing this data and the need to protect users and society at large from data abuse. Such shifting views may in turn enable new business models such as data guardians (a function that prima facie banks might be in a good position to fulfil), and may accelerate the establishment of legal frameworks to govern data sharing and protection.

Conclusion: Don’t stop thinking a step ahead

The coronavirus pandemic is such a fundamental and monumental shock that it will have a lasting influence on digital finance. In particular, and in no particular order, we see shifts in the relationship between the public and private sector in finance, changes to global interconnectedness, the need for increased cybersecurity cooperation, an acceleration of digitisation, an increased focus on financial inclusion and shifting attitudes towards data. For both financial intermediaries and policymakers, it is wise to start thinking about these tectonic shifts too.

This article first appeared on ING THINK

Will Covid-19 accelerate the arrival of digital currencies?

The Covid-19 pandemic took the world by surprise with an unprecedented political and economic shock. As a result, we’ve updated our outlook on digital currency attitudes and trajectories

CBDC – as we knew it

Two months ago, the factors that drove research and development into central bank digital currencies (CBDC) included:

  • Technology considerations: the possibilities unlocked by today’s tech to create e.g. programmable money and decentralised, even offline exchange infrastructures;
  • Efficiency and financial inclusion: the desire to develop payment systems and use them as a development tool for the rest of the economy (less of a driver in developed economies);
  • Geostrategic considerations: the dominance of the dollar in finance and trade, the emergence of China on the world stage and the role of US and Chinese big tech firms;
  • Monetary autonomy: the dystopian idea of a private sector global currency operator sharply reducing national central banks’ monetary degrees of freedom and efficacy;
  • The declining use of physical cash and the urge to develop public sector alternatives to private digital infrastructures;
  • The realisation that any CBDC is potentially highly disruptive and therefore has financial stability implications that need to be managed carefully.

Covid-19 will accelerate, not slow, CBDC developments

We don’t think that Covid-19 alone will be a good enough reason for central banks to suddenly adopt digital currencies. However, the pandemic is likely to accelerate the process.

Here are a few reasons:

  • The declining use of physical cash is likely to accelerate, as contactless payments are encouraged to reduce contagion risk. While this may not win over the staunchest physical cash fans, the forced introduction to contactless payments may convince a silent majority;
  • The role of government is likely to increase, as we discuss here. This may make it easier for central banks to obtain the necessary political mandate to introduce a digital currency;

The pandemic may clear the political way towards introduction of CBDC

  • The financial system will come under increased pressure from Covid-19. The financial stability concerns related to CBDC (mainly substitution from bank deposits into CBDC) will therefore be even more pressing. At the same time, so will calls to insulate payment systems from pressures in the lending parts of the financial system;
  • The pandemic will reshuffle the cards on the geopolitical stage. Some countries may emerge with less economic damage, giving them a clear opportunity to flex their muscles;
  • This and de-globalisation may intensify attempts to establish “national champions” in digital payments, either private or public.

So what are central banks up to? Will Libra 2.0 host any CBDC?

The global Financial Stability Board launched a consultation on global stablecoins (such as Libra); however, that was already planned. The Dutch central bank stated this week that it is ready to test CBDC in the Netherlands, once CBDC has been properly debated at the eurozone level. Yet this statement too, like other communications about intensifying research and pilots starting, was already in the pipeline before Covid-19. In other words, it is too soon to see a corona effect.

De-globalisation may intensify attempts to establish “national champions” in digital payments

Last week the Libra Association updated its white paper and introduced “single-currency stablecoins” alongside the original multi-currency Libra coin. In this new version, it argues that if central banks were to create a digital dollar, euro or British pound, the Libra Association could host these on the Libra infrastructure.

This offer has put the ball firmly back in the central bankers’ court. It may sound like an offer central banks can’t refuse, but we doubt whether they will take it up. In principle, a central bank may like the idea of having its CBDC hosted on multiple private platforms, in addition to its own public infrastructure. Availability on widely used platforms is in fact necessary for broad CBDC acceptance. So from that perspective, hosting CBDC on the Libra platform may be fine. There are a few problems though.

The dreaded multi-currency original Libra is not off the table yet. Authorities will continue to regard this global stablecoin with suspicion

The biggest one is that, even though Libra added single-currency Libras and potentially CBDC to the mix, the dreaded multi-currency original is not off the table yet. Authorities will continue to look at this global stablecoin with suspicion, and will probably demand guarantees in some form that it does not threaten monetary autonomy. We doubt whether Libra is able and willing to give such guarantees. The best guarantee, from the authorities’ perspective, would be to have no multi-currency Libra at all. But even in this long-awaited version 2.0, Libra refused to bite that bullet.

Moreover, Libra 2.0, which still has the potential to quickly become a dominant payment platform, will not make the previously mentioned questions on financial stability any easier. The Libra Association may feel it has addressed all the authorities’ objections to its v1.0 proposal, yet it may face stiffer conversations with central bankers.

A new dynamic?

In the end, whether CBDC arrives or not was never, and never will be, a purely technological question. It was and will primarily be about political acceptance and alignment with other political and strategic goals, both domestic and international.

De-globalisation, bigger role of governments and close cooperation with the financial sector will guide CBDC discussions

In that respect, our initial assessment is that CBDC will be a more likely option post-Covid-19. The bigger role of governments, and their close cooperation with the financial sector in combating the economic fallout, will guide discussions about CBDC in the context of the financial sector’s role in serving society.

That said, CBDC will not be introduced overnight. Right now, authorities and the rest of society are in crisis-fighting mode. However, previous crisis episodes have shown us that the foundations for the post-crisis institutional framework are laid in crisis times. We expect the debate to start soon.

This article first appeared on ING THINK

Facebook’s Libra updates its plans, now back in business?

It’s been a few months since we heard from Libra. As it turns out, the consortium has been working hard on version 2.0 of its white paper. This time, its proposed digital currency has a serious chance of being acceptable to authorities

It turns out that Libra had a lot of homework to do. When the initial white paper was published in June, a storm of criticism followed. Authorities all over the world were afraid that a global stablecoin with the userbase of Facebook would create a de facto global private central bank, reducing the monetary autonomy of existing central banks. Concerns were also raised about Libra’s governance and its compliance framework. It quickly became clear that Libra would not fly in its initially proposed form, simply because many authorities would outlaw it.

But the Libra Association paid attention, and its 2.0 plan contains a number of fundamental changes that should largely address the concerns raised. To name the most important ones:

  • Libra has introduced “single-currency stablecoins” alongside the “global stablecoin”. In other words: the original Libra will get company from EUR-Libra, USD-Libra etc. These local currency versions blend in much more easily with domestic monetary, financial and regulatory frameworks, and do not pose direct threats to monetary autonomy. That said, a successful Libra network could still influence financial stability. While an important concern, this should not be a complete showstopper. We do see potential issues around the fact that the “global” stablecoin, the original Libra, continues to exist. Authorities will continue to regard this global stablecoin with suspicion, and demand guarantees in some form that it does not threaten monetary autonomy. In what was probably a well-timed coincidence, the Financial Stability Board issued a consultation about global stablecoins earlier this week.
  • Libra is attempting to bring its business participants clearly within existing regulatory parameters. For example, exchanges and wallet providers are to register as Virtual Asset Service Providers (VASPs), meaning they have to comply with global standards to counter money laundering and terrorist financing. This too is important, as it will provide regulators with the tools needed to monitor and enforce compliance.
  • The third important change is that Libra is giving up on a fully decentralised future. Doubts about its feasibility arose immediately upon publication of the initial white paper. There was great uncertainty about how a decentralised Libra network would look, and how authorities and supervisors would interact with it. Libra has apparently not been able to provide authorities with a satisfactory sketch of a decentralised network that can nonetheless be supervised and controlled effectively, and has instead opted to let go of the decentralised idea altogether. This is a very important signal with wider implications. Various crypto projects are still working on fully decentralised approaches. But they now face a hard question: will a decentralised setup ever be acceptable to authorities, or will it cause the coin/asset to languish on the fringes of the financial system forever?

Libra has shown that those who gave up on the project did so too quickly. Libra 2.0 is very different from the initial version and now has a serious chance of being acceptable to authorities, and of actually coming into existence. The biggest issue in our view remains that the global currency-basket version of Libra is not off the table entirely. Moreover, even in its watered-down local-currency form, Libra, in combination with Facebook’s vast userbase, would remain a strong disruptive power to existing payment systems and the financial system at large. Let’s see how Libra’s proposals are received this time round.

This article first appeared on ING THINK