Data privacy

Digital lending models require intensive data gathering and sharing. Borrowers often do not know what data is collected or how it is used and shared, nor can they easily discover or control how lenders and their partners use it. The data handling procedures of digital lenders and their partners have often proved inadequate, with numerous failures to adequately protect and secure customer data. Weak protections contribute to online fraud and misuse of client data, including unauthorized charges, social engineering scams, and social shaming of debtors.

Improper data practices

These can take several forms in the digital credit sector:

  • The lender does not disclose to customers its practices for collecting and using data.
  • Even where these practices are disclosed, consumers cannot meaningfully consent to them.
  • Data collection goes far beyond the purpose for which it is required or agreed.
  • Personal information is collected from handsets, browser history, etc., without customer consent.
  • Customers cannot view, correct, or control their data trail.
  • Sensitive data is improperly handled, stored, or retained.
  • Lenders post about loan defaults on a borrower’s social media page to shame them into repaying.
  • Lenders contact a delinquent borrower’s mobile phone contacts to shame them into repaying.
Source: Izaguirre et al. 2025 forthcoming [link]

A 2020 analysis found that a majority of leading credit apps collected sensitive data such as GPS location and contact information. A review of data security practices among 27 prominent digital lenders found that a majority (17) used unsafe security algorithms, and a test of 110 popular free apps found that a large share sent personal information, such as email addresses and location data, to third parties.1

Ownership of data, and the rights of consumers concerning its use, are frequently unclear. A survey of 11 markets found that only one regulator required lenders to disclose to customers which data was used for their credit scoring.2

Institutional frameworks for data governance vary significantly across countries. Data protection regulations specific to financial services may be issued by a financial regulator under the privacy provisions of financial laws, or general data protection provisions may be enforced by ICT agencies, consumer agencies, or prosecutors. Alternatively, a data protection law may be enacted. Often such a law sets up a specialized agency to enforce it, for example Ghana’s recently established Data Protection Commission. This authority can coordinate with financial and telecommunications authorities to monitor data privacy in digital credit products.3

Recommendation: Data protection regulations, to be comprehensive, must cover disclosure, what data may be shared, consent by the consumer to specific data uses, limitation of data uses to specifically agreed purposes, consumers’ access to their data, accuracy and security of the data, obligations of the data controller (e.g., the lender), and the consumer’s right to withhold or withdraw consent. Collaboration among data, consumer protection, and financial authorities is often critical to effective regulation. Also important are regulations that set strict limits on what can be shared and require proactive customer consent.

Data regulations lean heavily on customer consent. Yet there is growing recognition that consent processes are ineffective in protecting consumers or in limiting firms’ uses of their data. Customers must routinely consent to data uses and flows that they are in no position to understand or control. New approaches attempt to balance customer data protection with the need for data to flow throughout the financial system. Researchers have proposed ways of making protection less reliant on express consent.4 These include:

  • Shifting responsibility to the providers to ensure that data is used only for a limited set of legitimate purposes and only in the customers’ interests;
  • Adopting and enforcing customer rights, e.g., to review and correct their data, to object to data uses or revoke their consent, and to transfer their data to other providers;
  • Affording consumers expert assistance in assessing how data is used and in particular how algorithmic decisions are made – to protect against biases; and
  • Improving data privacy and security by actively engaging consumers in authorizing data sharing with third parties, and by sending periodic reminders to (re)consent to sharing or to opt out when continued sharing with providers is no longer necessary.5
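The consent model sketched in the bullets above (purpose limitation, revocation rights, and periodic re-consent) can be illustrated in code. The following is a minimal, hypothetical sketch, not an implementation from the source: the `ConsentRecord` structure, the one-year re-consent window, and the purpose names are all illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ConsentRecord:
    """One customer consent, tied to a single, specific purpose (illustrative)."""
    purpose: str                               # the agreed use of the data
    granted_on: date
    revoked: bool = False                      # customer may revoke at any time
    validity: timedelta = timedelta(days=365)  # assumed re-consent window

    def is_active(self, today: date) -> bool:
        """Consent counts only if it has not been revoked and has not expired."""
        return not self.revoked and today <= self.granted_on + self.validity

def may_use(consents: list[ConsentRecord], purpose: str, today: date) -> bool:
    """Purpose limitation: data may be used only under an active consent
    granted for that exact purpose."""
    return any(c.purpose == purpose and c.is_active(today) for c in consents)

consents = [ConsentRecord("credit-scoring", date(2024, 1, 1))]
print(may_use(consents, "credit-scoring", date(2024, 6, 1)))  # True: agreed purpose, active
print(may_use(consents, "marketing", date(2024, 6, 1)))       # False: not an agreed purpose
print(may_use(consents, "credit-scoring", date(2025, 6, 1)))  # False: expired, needs re-consent
```

The point of the sketch is that every data use must name a purpose and find a matching, unexpired, unrevoked consent; there is no blanket permission, and silence after the validity window defaults to "no".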

Country Examples

India [link to India case studies]
Philippines [link to Philippines case studies]