
Global Regulatory Brief: Digital finance, January edition

The Global Regulatory Brief provides monthly insights on the latest risk and regulatory developments. This brief was written by Bloomberg’s Regulatory Affairs Specialists.

Digital finance regulatory developments

The fast pace of technological innovation in financial services is set to continue in 2024, and regulators and policymakers are embarking on a range of initiatives to manage risks and set appropriate standards. From the EU to Korea, the following digital finance developments from the past month stand out:

  • EU: Lawmakers agree Artificial Intelligence Act 
  • UK: Regulators set out detailed rules for critical service providers 
  • Korea: FSC proposes virtual asset user protection rules
  • US: CFTC proposes new rules on cybersecurity
  • US: SEC denies request for new rules on digital asset trading
  • EU: ESAs consult on detailed policy measures under DORA
  • South Africa: FSCA publishes report on crypto market trends
  • Switzerland: FINMA issues staking guidance 
  • Abu Dhabi: ADDED and MBZUAI sign MoU on AI

From digital finance to the green agenda and financial stability, we look at vital regulatory matters for 2024 and beyond.

EU agrees Artificial Intelligence Act

European Union (EU) lawmakers reached a political agreement on the legal framework for regulating Artificial Intelligence (AI) in the bloc, the first-ever comprehensive set of rules on AI worldwide. 

Scope: The AI Act will apply to AI systems placed on the market, put into service, or used in the EU, with the exception of AI systems used solely for military purposes or for research and innovation. 

Risk-based approach: The AI Act will introduce a regulatory regime that allocates obligations along the value chain in proportion to the risk an AI application presents, depending on its use case.

Unacceptable risk: AI systems considered a clear threat to fundamental rights will be banned, including systems or applications that manipulate human behavior to circumvent users’ free will, such as ‘social scoring’, and certain applications of predictive policing. Some uses of biometric systems will be prohibited, for example emotion recognition systems used at the workplace and some systems for categorizing people or real time remote biometric identification for law enforcement purposes in publicly accessible spaces (with narrow exceptions). AI used to exploit the vulnerabilities of people due to their age, disability, social or economic situation is banned.

High-risk: Strict requirements will apply to these systems, including risk-mitigation measures, high-quality training data sets, logging of activity, detailed documentation, clear user information, human oversight, and a high level of robustness, accuracy and cybersecurity. High-risk AI systems include certain critical infrastructures; medical devices; systems that determine access to educational institutions or are used for recruiting people; and certain systems used in the fields of law enforcement, border control, administration of justice and democratic processes. Biometric identification, categorisation and emotion recognition systems are also considered high-risk.

Specific transparency risk (‘limited risk’): These systems are subject to transparency requirements so that users know content is AI-generated and can make informed decisions. This applies to chatbots, deep fakes and other AI-generated content, which will have to be labeled as such. Users must be informed when biometric categorisation or emotion recognition systems are being used. Providers will have to design systems so that synthetic audio, video, text and image content is marked in a machine-readable format and detectable as artificially generated or manipulated.

Minimal risk: Voluntary rules will apply to the vast majority of AI apps and systems. Examples of these applications are AI-enabled recommender systems or spam filters. No mandatory rules apply as they present only minimal or no risk for citizens’ rights or safety. 

Regime for General Purpose AI (GPAI): There is a dedicated set of rules for GPAI models to ensure transparency along the value chain. 

  • Models posing systemic risks must comply with additional obligations related to managing risks and monitoring serious incidents, performing model evaluation and adversarial testing
  • The obligations will be operationalised through codes of practice developed by industry, the scientific community, civil society and other stakeholders together with the European Commission
  • GPAIs will be subject to transparency requirements covering technical documentation and compliance with EU copyright law

Supervision: National competent market surveillance authorities will supervise the implementation of the new rules at national level. 

  • Coordination at EU level will be carried out by the new AI Office within the European Commission, which will also supervise the implementation and enforcement of the new rules on general purpose AI models 
  • A scientific panel will advise the AI Office, including on classifying and testing the models

Fines: Firms found to be non-compliant could face fines of up to:

  • €35 million or 7% of global annual turnover (whichever is higher) for violations of banned AI applications
  • €15 million or 3% for violations of other obligations and €7.5 million or 1.5% for supplying incorrect information
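As an illustration of the “whichever is higher” structure of these tiers, the applicable fine cap reduces to a simple maximum over a fixed amount and a turnover percentage. The sketch below is not from the AI Act text; the tier labels and function name are hypothetical, chosen only to mirror the three tiers listed above.

```python
# Hypothetical sketch of the AI Act's tiered fine caps: each tier is
# the higher of a fixed euro amount and a share of global annual turnover.
FINE_TIERS = {
    "banned_practices": (35_000_000, 0.07),       # €35M or 7%
    "other_obligations": (15_000_000, 0.03),      # €15M or 3%
    "incorrect_information": (7_500_000, 0.015),  # €7.5M or 1.5%
}

def max_fine(tier: str, global_annual_turnover: float) -> float:
    """Return the fine cap in euros for a tier: the higher of the
    fixed amount and the turnover-based amount."""
    fixed, pct = FINE_TIERS[tier]
    return max(fixed, pct * global_annual_turnover)
```

For a firm with €1bn of global turnover, a banned-practice violation is capped at 7% (€70m) because that exceeds the €35m floor; for a firm with €100m of turnover, the €35m fixed amount governs.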

Next steps: Once the final rules are published in the Official Journal (expected in mid-2024), they will be phased in progressively over two years. 

  • Prohibitions on unacceptable-risk systems will apply six months after publication, and the rules for GPAI after 12 months
  • For the period before rules apply, the commission will be launching an AI Pact, which will convene AI developers from around the world to commit on a voluntary basis to implement key obligations of the AI Act ahead of the legal deadlines 
  • Some provisions may require additional guidance from EU regulators before the rules begin to apply gradually from late 2024

UK authorities set out detailed rules for critical service providers

UK supervisors have published a first set of draft rules under the new Critical Third Parties regime put in place by the Financial Services and Markets Act (2023).

Context: The FSMA 2023 gives the UK Treasury the power to designate ‘critical’ service providers, and follows concerns about cyber and operational risk spilling over into the UK regulated financial sector. Designated companies would be subject to oversight by the Bank of England, Prudential Regulation Authority, and Financial Conduct Authority.

The rules in detail: The three regulators have each published draft ‘Handbook’ rules that set out legal obligations on critical third parties, alongside draft ‘Supervisory Statements’ that set out in more detail how the regulators would expect critical third parties to comply with their obligations. 

  • The rules cover areas including governance, risk management including supply chain risks, technology and cyber resilience, change management, and scenario testing 
  • The rules also establish an incident notification framework and specify when and how service providers can share information with both users and regulators

Pre-empting abuse: Responding to a key concern raised by industry in an earlier consultation, the regulators have also proposed a rule prohibiting designated critical third parties from marketing their status by making claims that their services are in any way endorsed by the regulators or superior in quality or soundness.

Implementation: The proposals are open for public consultation until March 15, 2024. The regulators are expected to publish further guidance documents next year before assessing which companies should be designated as ‘critical’ by the Treasury. Once designated, critical third parties will have three months to conduct a first self-assessment against the rules, with most other obligations applying within 12 months of designation.

Korea proposes virtual asset user protection rules

The Financial Services Commission (FSC) proposed detailed rules under the Act on the Protection of Virtual Asset Users, which is scheduled to take effect on July 19, 2024. 

Important context: The act is designed to protect virtual asset users and establish a sound order in virtual asset transactions by defining the scope of virtual assets subject to the law and requiring virtual asset service providers (VASPs) to safely manage and store their customers’ deposits and virtual assets. 

  • It also provides statutory grounds for sanctions, including criminal penalty, and fines to punish unfair trading activities using virtual assets

In more detail: The proposals are intended to specify important details, such as:

  • Specification of additional types of tokens excluded from the Act’s scope, such as electronic bonds, mobile gift certificates, deposit tokens linked to CBDC, and non-fungible tokens (NFTs). Under the Act, virtual assets are defined as electronic tokens with economic value that can be traded or transferred electronically; this excludes game money, electronic money, electronic stocks, electronic bills, electronic bills of lading (B/L) and central bank digital currency (CBDC) 
  • Prescription of which financial institutions may act as custodians for VASP customers’ money and how customers’ funds should be managed 
  • Requirement for VASPs to store 80% or more of their customers’ virtual assets in cold wallets and to calculate the economic value of those assets on a monthly basis
  • Establishment of criteria for insurance deductibles or reserves for VASPs to fulfill liability in the event of incidents such as hacking or computer failures 
  • Specification of the timing for material nonpublic information to be deemed public to prevent unfair trading activities 
  • Prohibition, in principle, of VASPs arbitrarily blocking users’ deposits and withdrawals without justifiable grounds, as required under the Act
  • Imposition of a duty on VASPs to monitor abnormal transactions, and establishment of a procedure for imposing fines for unfair trading activities
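The cold-wallet requirement above amounts to a ratio test over the monthly valuation of customer assets. The sketch below is purely illustrative and not part of the FSC proposal; the function and argument names are hypothetical, and “economic value” is assumed to be a monthly valuation in a common unit such as KRW.

```python
def meets_cold_wallet_ratio(cold_value: float, total_value: float,
                            threshold: float = 0.80) -> bool:
    """Hypothetical check that the economic value of customer virtual
    assets held in cold wallets is at least 80% of the total, as the
    proposed rules would require. Illustrative only, not a compliance tool."""
    if total_value <= 0:
        return True  # no customer assets to safeguard
    return cold_value / total_value >= threshold
```

For example, a VASP holding 80 units of value in cold storage out of 100 total would satisfy the threshold, while 79 out of 100 would not.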

Looking ahead: The proposed rules are open for public comments until January 22, 2024 and are expected to be implemented from July 19, 2024 after going through legislative proceedings.

CFTC proposes new rules on cybersecurity

The US CFTC proposed a cybersecurity rule for futures commission merchants and swap dealers in response to the highly disruptive 2023 ransomware attack on software company Ion Trading UK.

In summary: The proposal outlines how firms the Commission oversees should handle security risks posed by outside vendors, following a cyber incident in February 2023 at Ion Trading which forced traders to process transactions manually for days. 

In more detail: Under the plan, CFTC-regulated companies would have to identify, monitor and manage their critical third-party service providers for potential risks. 

  • Futures brokers and swap dealers would also have to assess and monitor risks related to possible disruptions from natural disasters or other events

Deadline for comments: The comment period will be open for 75 days after publication, with a closing date of March 2, 2024.

SEC denies request for new rules on digital asset trading

The US Securities and Exchange Commission denied Coinbase Global Inc.’s request for new rules around the trading of digital assets.

In more detail: Maintaining the stance Chair Gary Gensler has held, the Commission argued that existing federal securities laws and regulations work for digital assets, contrary to Coinbase’s contentions. 

  • In his statement supporting the decision, Chair Gensler wrote: “The existing securities regime appropriately governs crypto asset securities”

Why it matters: This decision is consistent with the approach Chair Gensler has advocated for throughout his tenure, underpinned by his view that most digital assets are securities, and that crypto exchanges should register with the agency.

Republican dissent: Republican SEC Commissioners Hester Peirce and Mark Uyeda dissented from the decision to deny Coinbase’s petition, arguing that it raised important questions that would benefit from engagement with market participants. 

EU consults on detailed policy measures under DORA

The European Supervisory Authorities (ESAs) issued a second batch of draft policy measures under the Digital Operational Resilience Act (DORA). 

In more detail: These draft policy measures cover the draft rules (regulatory technical standards and implementing technical standards, as well as guidelines), which according to the DORA Regulation need to be finalized by July 17, 2024. Specifically, these cover:

  • Draft rules on the harmonization of conditions enabling the conduct of the oversight activities of critical third-party providers (CTPPs) by the ESAs, in addition to draft guidance on cooperation and information exchange between the ESAs and national authorities
  • Draft rules on major ICT incidents reporting by financial entities, and draft guidance on costs and losses caused by major ICT incidents
  • Draft rules on subcontracting ICT services supporting critical or important functions 
  • Draft rules on threat-led penetration testing (TLPT)

Timeline: The draft rules are subject to a three-month stakeholder consultation, closing March 4, 2024. Once the consultation period concludes, the ESAs will submit the final drafts to the EU Commission for endorsement by July 17, 2024. 

Closely related: The ESAs have also released and consulted on their first batch of policy measures under DORA focusing on the areas of ICT risk management, major ICT-related incident classification and ICT third-party risk management. These are due to be finalized and sent to the EU Commission by January 17, 2024. 

Go-live: DORA is set to apply in the EU from January 17, 2025.

South African FSCA releases findings of crypto-assets market study

The South African Financial Sector Conduct Authority (FSCA) published the findings of its Crypto Market Study to help the FSCA better understand crypto-asset related activities in South Africa. 

Important context: In October 2022, the FSCA officially declared crypto assets as a financial product in terms of the Financial Advisory and Intermediary Services Act (FAIS).

  • South African consumers are increasingly engaging in financial activities involving crypto assets, including investing in derivative instruments with crypto as the underlying asset, especially given the proliferation of online trading platforms

Key findings: The study finds that most crypto-asset Financial Service Providers (FSPs) in South Africa provide financial services using unbacked crypto assets (60%), followed by stablecoins (26%) such as USD Coin and Binance USD. This is followed by security tokens (7%) and non-fungible tokens (NFTs) (4%). 

  • Another key finding is that some financial services in respect of crypto assets, and some technical activities inherent to providing those services, are outsourced. 

Looking ahead: Financial services related to crypto-assets will likely be included in the licensing activities under the Conduct of Financial Institutions (COFI) Bill, potentially expanding the scope of crypto asset activities that are currently regulated under the FAIS Act. 

Abu Dhabi and MBZUAI sign MoU on AI

The Abu Dhabi Department of Economic Development (ADDED) and Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) signed a Memorandum of Understanding (MoU) to jointly pioneer artificial intelligence (AI)-driven solutions for small and medium enterprises (SMEs). 

In more detail: ADDED and MBZUAI, the world’s first graduate-level AI university dedicated to research, together with Wio Bank PJSC, the region’s first platform bank, will partner to create a virtual assistant (chatbot) to help streamline SMEs’ access to Abu Dhabi’s resources and programmes. 

  • The chatbot will connect SMEs with stakeholders across the business journey, from ideation to growth, expansion, and long-term success
  • The AI-powered chatbot will be curated to assist SMEs in navigating business complexities by providing valuable insights, guidance, and references to resources to help SMEs streamline operations, identify opportunities, and make informed business decisions


How we can help

Bloomberg’s Public Policy and Regulatory team brings you insight and analysis on policy developments to help navigate the complex and fast changing global regulatory landscape. To discuss regulatory solutions, please get in touch with our specialists or read more insights from our Regulatory team.
