The World Bank
Inclusive Digital Financial Services
Financial Consumer Protection

Cross-cutting consumer risks from DFS


Contents:

  • Gaps in the regulatory perimeter
  • Data protection and privacy risks
  • Consumers not being provided with adequate information
  • Product unsuitability
  • Algorithmic/AI decision-making
  • Conflicts of interest and conflicted business models
  • Fraud and other misconduct
  • Platform / technology unreliability and vulnerability
  • Business failure or insolvency

Gaps in the regulatory perimeter

Risks

Consumers of DFS products may receive less protection than consumers of traditional financial products if there are gaps in the coverage of their country’s existing Financial Consumer Protection regulation and financial sector oversight. This risk applies across DFS products and providers. For example:

  • Licensing or registration rules do not apply to non-traditional e-money providers (for example, unregulated fintech firms / mobile network operators).
  • New peer-to-peer lenders may not be regulated as they do not fit easily within existing categories of provider types.
  • App-based lenders based overseas may not fall under the oversight of domestic regulators.

Existing Financial Consumer Protection regulation and supervision may also fail to extend to new DFS providers or products, particularly where an institution-based approach is used.

Possible regulatory approaches

  • Licensing and regulating by activity and ensuring Financial Consumer Protection rules apply to all providers of the same activity. See Country Examples

    Australia and Indonesia are examples of countries that apply Financial Consumer Protection rules on an activities basis, and various other countries are considering new Financial Consumer Protection regulatory frameworks that apply on this basis.

    Australia regulates ‘credit activities’ (lending side) and ‘financial services’ such as dealing or providing advice (investment side).

    The Malaysia Financial Services Act provides another example of this approach. Under it, no person can carry on a business providing for the issuance of a “designated payment instrument” unless it is approved by BNM. Malaysia’s Financial Services (Designated Payment Instruments) Order prescribes “electronic money” as a “designated payment instrument” for the purposes of these requirements. Financial Services Act 2013 (Malaysia), s. 8 (1) and Division 1 of Part 1 of Schedule 1.

  • Provide flexibility in Financial Consumer Protection rules to cover innovative new products. See Country Example

    Mexico’s Financial Technology Institutions Law both extended existing Financial Consumer Protection requirements to peer-to-peer lending and allowed for new rules.

  • Where financial sector oversight is limited, leveraging powers of other regulators (such as competition or telco authorities). See Country Examples

    Examples include the competition authority in Kenya and the data privacy authority in the Philippines.

  • To address cross-border issues, explore options such as:
    • Coordinating with domestic and international regulatory authorities.
    • Regulating domestic agents of foreign fintech companies.
    • Applying Financial Consumer Protection framework to foreign fintech providers offering products/services to domestic consumers.
  • Pursue complementary, non-regulatory measures, such as industry codes of conduct and working with platform operators to establish rules for app developers.

Data protection and privacy risks

Risks

Business models often revolve around the innovative use of big data and alternative data to target consumers for product offerings, assess product applications, or design products.

  • Consumers may lack awareness or understanding of how and what data about them is collected or used, a problem compounded by common approaches to notifications and consent.
  • Delivery of information through digital channels, such as through feature phones, and the speed with which fintech products are acquired can make it difficult for consumers to process information adequately, including data privacy-related notifications.
  • The complexity of data-sharing relationships underlying business arrangements, and the uses to which such data may be applied (such as algorithmic decision-making), can make it inherently more difficult for consumers to understand privacy related disclosures and their implications.
  • Personal information may be subject to data mining, purchasing, or analytics regardless of any existing or prospective consumer relationship, such as for product development or marketing research.
  • Data privacy risks are not confined to the financial sector, given how data travels through and is exchanged and handled across different sectors.

Possible regulatory approaches

Data privacy risks typically involve considerations beyond a financial consumer lens and are ideally addressed through regulatory approaches that go beyond sector-specific regulation. Some regulatory approaches include:

  • Definitions of personal data (or equivalents) should be broad and flexible to cover alternative data and reflect the increasing ability to identify individuals from data.
  • Move away from bundled, overarching consent and require more active, granular, and targeted consent.
  • A customer should be permitted to withdraw consent at any time.
  • Personal data should be processed for legitimate purposes only, irrespective of customer consent. See Country Examples

    Most jurisdictions with comprehensive data protection regimes offer individuals certain protections with respect to decisions based solely upon automated processing of personal data. Some jurisdictions prohibit purely automated decision-making for decisions with “legal effects” or “other significant effects” (e.g., African Union, ECOWAS), while others permit automated decision-making but give individuals the right to ensure that decisions that significantly affect them are not based solely upon automated processing of personal data (e.g., EU, Ghana).

  • Data minimization: Personal data collected should be relevant and limited to what is necessary for the purpose for which data is being processed.
  • Personal data should be stored for no longer than necessary for the purposes for which it is processed.
  • Providers should be made responsible for data practices of the third parties that they contract. This includes requiring providers to exercise reasonable due diligence in selecting service providers and conduct reasonable oversight of them to ensure compliance with data-protection rules.
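The consent, purpose-limitation, and retention principles above can be sketched in code. The following is a minimal illustrative sketch, assuming a hypothetical `ConsentRecord` type and an assumed 365-day retention default; none of the names or thresholds come from any specific regulation or system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical sketch: ConsentRecord and may_process are illustrative names,
# not drawn from any regulation or library.

@dataclass
class ConsentRecord:
    purpose: str                             # granular, targeted purpose (e.g., "credit_scoring")
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None  # consent may be withdrawn at any time
    retention_days: int = 365                # assumed cap: store no longer than necessary

    def may_process(self, purpose: str, now: datetime) -> bool:
        """Check purpose limitation, withdrawal, and retention period."""
        if purpose != self.purpose:          # purpose limitation
            return False
        if self.withdrawn_at is not None and now >= self.withdrawn_at:
            return False                     # consent has been withdrawn
        if now > self.granted_at + timedelta(days=self.retention_days):
            return False                     # retention period exceeded
        return True
```

A provider-side check like this would gate each processing operation on a specific, still-valid consent rather than a single bundled, open-ended one.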

Consumers not being provided with adequate information

Risks
The standard risks arising from consumers not being provided with adequate product information can be heightened when digital channels for communication pose challenges to consumer comprehension due to limited space, poor formats, poor user interface, etc.

Possible regulatory approaches

  • Adapt disclosure for digital channels, for example by presenting consistently formatted, bite-sized chunks of information, with secondary layers and offline channels for further information.
  • Require disclosure of key terms and conditions in the channel being used for the transaction, with access to the full terms and conditions, including after the transaction is completed.
  • Require the order and flow of information to enhance transparency, for example by disclosing pricing and key terms and conditions earlier in the transaction process.
  • Leverage behavioural insights to encourage consumers to engage with information, and require user-friendly interfaces.

Product unsuitability

Risks
Fintech and DFS can give consumers access to riskier or more complex financial products that they may lack the knowledge or experience to assess or use properly, leading to greater risk of harm from product unsuitability.

Possible regulatory approaches

  • Limits on individual investments or borrowing on marketplace lending and investment-based crowdfunding platforms.
  • Prominent warnings to consumers regarding the risks of the product.
  • Requirements to assess affordability or suitability of a product for a particular consumer.
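An affordability requirement of the kind listed above typically turns on a debt-service-to-income test. The sketch below is purely illustrative: the `is_affordable` function and the 40% cap are assumptions for demonstration, not thresholds drawn from any regulation.

```python
# Hypothetical affordability check; the 40% debt-service-to-income cap is an
# illustrative assumption, not a threshold from any regulatory framework.

def is_affordable(monthly_income: float,
                  existing_debt_payments: float,
                  new_loan_payment: float,
                  max_dsr: float = 0.40) -> bool:
    """Return True if total monthly debt service stays within the assumed cap."""
    debt_service_ratio = (existing_debt_payments + new_loan_payment) / monthly_income
    return debt_service_ratio <= max_dsr

print(is_affordable(1000.0, 250.0, 100.0))  # 0.35 <= 0.40 -> True
print(is_affordable(1000.0, 250.0, 200.0))  # 0.45 > 0.40 -> False
```

Regulators applying such requirements generally leave the precise ratio and its inputs to be calibrated per market and product.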

Algorithmic/AI decision-making

Risks

The use of algorithms/artificial intelligence (AI) for consumer-related decisions is becoming particularly prevalent in highly automated business models. Some potential risks of the use of AI in financial services are as follows:

  • Poor algorithm design and incomplete, unrepresentative, or biased input data may lead to unfair, discriminatory, or biased outcomes. For example, to assess creditworthiness in the absence of formal credit histories, algorithms analyze a wide variety of other criteria, such as social reputation, use of airtime and mobile money services, and other considerations. In the absence of clear regulatory limitations and proper internal oversight, algorithms could consider factors that are either de jure discriminatory (e.g., age, race, gender) or de facto discriminatory (e.g., shopping preferences, social circle, education/literacy).
  • Certain demographic groups, particularly those with limited access to technology, may face exclusion from the benefits of AI-driven financial services.
  • Lack of transparency in AI decision-making processes can leave consumers in the dark about how and why certain financial decisions are made.
    • Inadequate mechanisms for consumers to challenge or appeal AI-driven decisions may further hinder their ability to rectify errors or unfair outcomes.
  • The use of large datasets increases the potential for mishandling or unauthorized access to sensitive customer information, leading to data breaches.
  • Dependency on AI introduces operational risks, including system failures, technical glitches, or insufficient training of staff managing AI systems.
  • Regulators may lack the technical expertise to evaluate algorithmic systems, and the proprietary nature of algorithms can impede oversight.

Possible regulatory approaches

  • Apply fair treatment and anti-discrimination rules to algorithms/AI. See Country Examples

    New guiding principles from Hong Kong Monetary Authority state that financial service providers should ensure big data analytics and AI models produce fair outcomes that comply with applicable laws, including related to discrimination.

    The Monetary Authority of Singapore (MAS) has issued an open-source toolkit to enable the responsible use of artificial intelligence in the financial industry. The toolkit, developed in collaboration with industry players, helps financial institutions carry out assessment methodologies for the Fairness, Ethics, Accountability and Transparency (FEAT) principles.

    In the US, the New York State Department of Financial Services issued guidance on the use of AI in the financial sector, emphasizing the importance of transparency, fairness, and accountability.

  • Require appropriate procedures, controls, and safeguards during the development, testing, and deployment of algorithms/AI to assess and manage risks related to bias and discrimination. See EBA guidelines

    EBA guidelines require that financial service providers have adequate documentation of automated credit-scoring models and internal policies and procedures to detect and prevent bias and ensure the quality of input data.

  • Require regular auditing of algorithmic/AI systems by external experts.
  • Ensure transparency to consumers regarding use of algorithms/AI. See Country Example

    In Portugal, financial service providers are required to inform bank customers when creditworthiness assessments rely exclusively on automated decision-making processes, particularly AI models.

  • Provide consumers with the right not to be subject to decisions based solely on automated processing and the right to request human intervention. See Country Examples

    Some jurisdictions prohibit purely automated decision-making for decisions with “legal effects” or “other significant effects” (e.g., African Union, ECOWAS), while others permit automated decision-making but give individuals the right to ensure that decisions that significantly affect them are not based solely upon automated processing of personal data (e.g., EU, Ghana).

Major developments have recently occurred in the development of regulatory frameworks for artificial intelligence that, among other things, seek to mitigate these risks. The measures above are some examples of the kinds of approaches being proposed or implemented.
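One widely used bias-auditing technique of the kind the approaches above contemplate is the disparate impact ratio. The sketch below is illustrative only: the function names, sample data, and the common "four-fifths" 0.8 threshold are assumptions for demonstration, not requirements drawn from any of the frameworks cited.

```python
# Illustrative sketch of one bias-auditing technique (the disparate impact
# ratio, often checked against a "four-fifths" 0.8 rule of thumb); the
# threshold and sample data are assumptions, not regulatory requirements.

def approval_rate(decisions):
    """Share of approved (True) decisions in a list of booleans."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of approval rates between two demographic groups.

    Values well below 1.0 (commonly < 0.8) can flag potentially
    discriminatory outcomes warranting human review.
    """
    return approval_rate(group_a) / approval_rate(group_b)

# Example: automated credit decisions (True = approved) for two groups.
ratio = disparate_impact_ratio(
    group_a=[True, False, False, True],   # 50% approval
    group_b=[True, True, True, False],    # 75% approval
)
print(f"{ratio:.2f}")  # 0.67 -- below the assumed 0.8 threshold
```

External auditors or supervisors could run checks like this against a provider's decision logs, alongside richer metrics, as one input to the fairness reviews described above.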


Conflicts of interest and conflicted business models

Risks
DFS and fintech-enabled business models can give rise to conflicts of interest in new circumstances not foreseen by regulators or expected by consumers.

Possible regulatory approaches

  • Impose general conflict mitigation obligations on operators.
  • Require operators to comply with duties to act in consumers’ interests – for example, ‘best interests’ duties.
  • Impose targeted requirements and restrictions to address key conflicts (for example, fair loan pricing and fee-setting obligations for marketplace lending, and, for both marketplace lending and other crowdfunding, restrictions on operators or their associates investing in loans/offers facilitated by their platforms).

Fraud and other misconduct

Risks

Fraud can be perpetrated not only by service providers or platform operators but also by unrelated third-party fraudsters. For example:

  • Mobile app fraud: occurs when a fraudster uses a malicious mobile application to deceive a customer.
  • Biometric identity fraud: occurs when fraudsters obtain copies of fingerprints or high-resolution pictures to access customer accounts. Biometric data storage can be breached, leading to data misuse.
  • Authorized push payment (APP) scams: occur when a fraudster tricks a consumer into sending money to a criminally controlled account.
  • Synthetic identity fraud: occurs when new identities are created by blending elements from multiple individuals, making fraudulent transactions harder to uncover.
Source: CGAP (2017)

Risks of loss from fraud or misconduct can be increased by factors such as opaqueness or complexity of platform arrangements and lack of consumer awareness about exposures.

Possible regulatory approaches

  • Licensing/registration and vetting and competence requirements on operators and related parties.
  • Require operators to have in place adequate risk management and governance arrangements.
  • Require operators to segregate consumers’ funds and deal with them only in prescribed ways.

Platform / technology unreliability and vulnerability

Risks
Platform/technology unreliability or vulnerability to external threats can expose consumers to heightened risks of loss and other harm.

Possible regulatory approaches

  • Targeted risk management and operational reliability requirements, including for technology-related risks and outsourcing.
  • Specific competence requirements on operators in relation to matters such as information technology–related risk.

Note: These are in addition to general risk-management and competence requirements.


Business failure or insolvency

Risks
Inexperience of new fintech and DFS entrants and riskier or novel fintech-enabled business models can increase the risk of loss of funds from insolvency or business failure.

Possible regulatory approaches

  • Vetting and competence requirements for digital financial service providers.
  • Requiring operators to segregate consumers’ funds, hold them with an appropriately regulated entity, and deal with them only in prescribed ways.
  • Requiring operators to have in place business continuity and handover/resolution arrangements.
  • Requiring operators to comply with record-keeping requirements to support business continuity arrangements.