
SAMAANTA

A Corporate Law Firm

THE DIGITAL PERSONAL DATA PROTECTION ACT, 2023 WITH RESPECT TO FINTECH COMPANIES

ABSTRACT
The Digital Personal Data Protection (DPDP) Act, 2023 introduces a comprehensive legal framework for the processing of digital personal data in India. For FINTECH companies, the Act has significant implications, particularly regarding data collection, consent management, data storage, cross-border data transfers, and accountability. It mandates that companies obtain clear and informed consent from users, ensure robust data security measures, and appoint Data Protection Officers if classified as Significant Data Fiduciaries. The Act also emphasizes transparency, user rights such as data access and correction, and imposes penalties for non-compliance. Fintech firms must now align their data handling practices with the DPDP Act to ensure regulatory compliance and build user trust.
INTRODUCTION:
In an era where data is often regarded as the new oil, the protection of personal digital information has become a pressing concern for governments, businesses, and individuals alike. Recognizing the need for a robust data protection framework, the Government of India enacted the Digital Personal Data Protection Act, 2023. This landmark legislation seeks to safeguard the personal data of individuals while balancing the needs of innovation, governance, and economic growth. The DPDP Act marks a significant step in aligning India’s data privacy norms with global standards, and it lays down clear obligations for data fiduciaries and processors, along with rights for data principals (individuals). As digital ecosystems continue to expand, the DPDP Act serves as a foundational piece of legislation to ensure trust, accountability, and transparency in the handling of digital personal data.

WHY DATA PROTECTION IS IMPORTANT FOR FINTECH :
In the digital age, where financial transactions increasingly occur through mobile apps and online platforms, FINTECH companies are transforming the way people manage, invest, and move money. However, this innovation comes with heightened responsibility—protecting customer data is not optional; it's essential. Here's why data protection is so crucial in the FINTECH sector.

1. Handling Highly Sensitive Information
Fintech platforms collect and process vast amounts of sensitive data, including:
Personal Identification Information (PII): names, addresses, contact details
Financial data: bank account numbers, credit card information, transaction history
Biometric data: fingerprints, facial recognition (used in KYC and login features)
A breach of this data could lead to identity theft, financial fraud, and massive legal consequences.

2. Trust is the Currency of Fintech
In an industry where users entrust apps and digital platforms with their money, data privacy is directly tied to trust. A single data breach can irreparably damage a company’s reputation, leading to:
Customer churn
Negative media coverage
Loss of investor confidence
Maintaining robust data protection practices helps FINTECH firms build long-term relationships with users based on security and reliability.

3. Regulatory Compliance
Fintech companies operate in a heavily regulated space, often needing to comply with:
GDPR (EU)
CCPA (California)
PCI-DSS (for payment data)
GLBA (Gramm-Leach-Bliley Act in the U.S.)
Data Protection Bills in emerging FINTECH markets like India, Nigeria, Brazil, etc.
Non-compliance can result in hefty fines, license revocation, and criminal liability.

4. Cybersecurity Threats are Growing
As FINTECH usage grows, so does the interest of cybercriminals. Fintech firms face threats like:
Phishing and social engineering attacks
Ransomware
API attacks
Insider threats
Mobile app vulnerabilities
Strong data protection practices—including encryption, multi-factor authentication, and intrusion detection—are essential to safeguarding against these evolving threats.

5. Third-party Risks
Fintech companies often rely on third-party providers for services like cloud storage, payment processing, KYC/AML checks, and more. Each vendor introduces a potential attack vector.
Data protection policies must extend across the entire supply chain, ensuring third-party vendors also follow stringent privacy and security protocols.

6. Ethical Responsibility
Beyond legal requirements, there is an ethical obligation to protect users' data. Many FINTECH users come from underserved or vulnerable populations—startups that collect their data must do so responsibly, ensuring:
Data minimization (only collecting what is necessary)
Transparent privacy policies
User control over their data (opt-ins, deletion, data portability)

7. Enabling Innovation Through Security
Ironically, strong data protection enables innovation. When users feel secure, they are more likely to adopt new services like:
Open banking
Robo-advisors
Crypto wallets
P2P lending
Security becomes a competitive advantage, not just a compliance requirement.

Conclusion
Data protection is not just a technical requirement—it’s a strategic imperative for fintech companies. As financial services continue to digitize, companies that prioritize data privacy and cybersecurity will be the ones that win customer trust, meet regulatory demands, and scale responsibly.
Fintech without data protection is fintech without a future.

OBLIGATIONS OF FINTECH COMPANIES UNDER THE DPDP ACT, 2023:
Fintech companies, as Data Fiduciaries, are expected to:

1. Obtain Consent – Process personal data only after obtaining clear, informed consent from the Data Principal (user).
2. Data Minimization – Collect only necessary data for the specified purpose.
3. Purpose Limitation – Use data strictly for the purpose consented to.
4. Security Safeguards – Implement robust security measures to prevent data breaches.
5. Data Principal Rights – Facilitate rights such as access, correction, erasure, and grievance redressal.
6. Data Breach Notification – Notify the Data Protection Board and affected users in case of breaches.
7. Retention Limits – Not retain data longer than necessary for the purpose.

Any fintech company operating in India that collects or processes digital personal data is covered by the DPDP Act, 2023, primarily as a Data Fiduciary, and must ensure compliance with all applicable obligations.

THE DOMAIN (i.e. SCOPE / APPLICABILITY) OF THE DIGITAL PERSONAL DATA PROTECTION (DPDP) ACT, 2023 IN INDIA
It includes the following key aspects:

What it applies to

1. Digital personal data:
Data about an individual (data principal) that is collected in digital form, or collected in non-digital form and then digitized.
“Personal data” is broadly defined: any data about an individual who can be identified by or in relation to such data.

2. Territorial scope:
It applies to processing inside India.
It also applies to processing outside India if the processing is connected to offering goods or services to data principals within India.

What it does not apply to

1. Non‑personal data — data that doesn’t identify a person.

2. Personal/domestic use — processing by an individual for purely personal or domestic purposes is excluded.

3. Publicly available personal data — if the individual (data principal) themselves has made the data public, or it is made public under legal obligation.

KEY SECTIONS OF THE DPDP ACT, 2023 RELEVANT TO FINTECH

Section 5 — Notice
Requires that any request for consent to process personal data be preceded or accompanied by a notice to the data principal. This notice has to include what personal data will be processed, for what purpose, how the data principal can exercise rights (including under Section 6(4) and Section 13), and how to lodge a complaint with the Data Protection Board.

Fintech companies must ensure that they provide proper notices before collecting user, borrower, or customer data, for example when collecting KYC or transaction data, or when using data for analytics or cross-selling. The notice must be in plain language and available in Indian languages, as prescribed.
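To make the notice requirement concrete, here is a minimal sketch in Python of the four elements Section 5 says must precede or accompany a consent request. All names (`ConsentNotice`, the field names, the sample contact details) are hypothetical illustrations, not anything prescribed by the Act:

```python
from dataclasses import dataclass

# A sketch of a Section 5 notice payload a fintech app might present
# before seeking consent. Field names here are hypothetical.
@dataclass
class ConsentNotice:
    data_items: list          # e.g. ["PAN", "bank account number"]
    purpose: str              # the specific purpose of processing
    rights_contact: str       # how to exercise Section 6(4) / Section 13 rights
    complaint_channel: str    # how to complain to the Data Protection Board

    def is_complete(self) -> bool:
        """All four statutory elements must be present before consent is sought."""
        return bool(self.data_items and self.purpose
                    and self.rights_contact and self.complaint_channel)

notice = ConsentNotice(
    data_items=["PAN", "bank account number"],
    purpose="KYC verification",
    rights_contact="privacy@example-fintech.in",   # hypothetical address
    complaint_channel="Data Protection Board of India portal",
)
assert notice.is_complete()
```

A real implementation would also render this notice in the prescribed Indian languages before the consent screen is shown.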

Section 6 — Consent
Sets out requirements for valid consent: must be free, specific, informed, unconditional, unambiguous, with a clear affirmative action; limited to data necessary for the specified purpose. The principal must be able to withdraw consent, and the withdrawal must be as easy as the giving. Consent requests must be in plain and clear language; contact details of DPO/authorised person must be provided; consent managers may be used and registered.

Fintech companies often collect sensitive data in large volumes. Consent flows (in apps and on websites) will need to be designed carefully. For example, collecting more data than required, or bundling marketing consent with KYC consent, may be non-compliant. Consent-withdrawal features must also be implemented.

Section 7 — Certain legitimate uses
Permits processing of personal data even without consent in some specified “legitimate uses” (grounds) which may include lawful obligations, performance of contract, etc. Also, exceptions under other laws. 
Fintech companies often rely on legal or regulatory obligations (e.g. KYC, anti-money laundering laws). This section clarifies when processing without explicit consent may be allowed, but companies must carefully map their use-cases to these legitimate grounds.

Section 8 — General obligations of Data Fiduciary
Among obligations: ensure accuracy, consistency, completeness of data especially if decisions are made using the data or if disclosed to other fiduciaries; implement reasonable security measures; notify about personal data breaches; erase personal data when purpose is fulfilled or consent withdrawn (unless other laws require retention); set up grievance redressal mechanism; ensure processors / vendors also comply. 
This is very relevant because fintech companies often make decisions (credit scoring, risk assessment, etc.) based on data; wrong or stale data can lead to customer harm or regulatory risk. Many fintech companies also outsource parts of their operations (e.g. analytics, cloud, payment processing), so vendor contracts must reflect these obligations. Data breach management is also critical in financial services.
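The erasure rule in Section 8 (erase when the purpose is fulfilled or consent is withdrawn, unless another law requires retention) can be captured as a simple decision, sketched below. The function name and the idea of a "legal hold" flag are hypothetical shorthand; actual retention mandates (for example, record-keeping under anti-money laundering law) are a matter for legal mapping, not code:

```python
# A sketch of the Section 8 erasure decision, with hypothetical names.
def must_erase(purpose_fulfilled: bool, consent_withdrawn: bool,
               legal_hold: bool) -> bool:
    """Erase when the purpose is done or consent is withdrawn,
    unless another law mandates retention (legal_hold)."""
    if legal_hold:
        return False  # statutory retention overrides erasure
    return purpose_fulfilled or consent_withdrawn

assert must_erase(True, False, False)        # purpose fulfilled -> erase
assert must_erase(False, True, False)        # consent withdrawn -> erase
assert not must_erase(True, True, True)      # legal hold wins
assert not must_erase(False, False, False)   # still in use -> retain
```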

Section 9 — Processing of personal data of children
Requires verifiable consent from parents/guardians before processing personal data of children (under 18 or such age notified). Also prohibits behavioural tracking, profiling or targeted advertising of children, or processing that may have detrimental effect on their well‑being. 
Many fintech companies may not target or handle the data of minors, but some do (student loans, youth banking, etc.). They must check whether any feature involves minors; if so, compliance here is mandatory.
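The Section 9 gate, i.e. verifiable parental consent for children plus an outright bar on behavioural tracking and targeted advertising of children, can be sketched as below. The names and the purpose strings are hypothetical, and the 18-year threshold may be lowered by government notification:

```python
# A sketch of a Section 9 gate for processing a minor's data.
# Names and purpose strings are hypothetical illustrations.
CHILD_AGE_THRESHOLD = 18  # or such lower age as the government notifies

def may_process_child_data(age: int, parental_consent_verified: bool,
                           purpose: str) -> bool:
    if age >= CHILD_AGE_THRESHOLD:
        return True  # not a child; the Section 9 gate does not apply
    if purpose in {"behavioural_tracking", "targeted_advertising"}:
        return False  # prohibited for children even with parental consent
    return parental_consent_verified

assert may_process_child_data(16, True, "student_loan_kyc")
assert not may_process_child_data(16, True, "targeted_advertising")
assert not may_process_child_data(16, False, "student_loan_kyc")
```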

Section 10 — Additional obligations of Significant Data Fiduciary (SDF)
The government can notify any Data Fiduciary (or class of fiduciaries) as an SDF based on factors like the volume and sensitivity of data, risk to the rights of Data Principals, impact on sovereignty, etc. Once notified, SDFs must: appoint a Data Protection Officer (based in India, a senior role reporting to top management), appoint an independent data auditor, and undertake periodic Data Protection Impact Assessments (DPIAs), periodic audits, and other prescribed measures.

Many fintech companies will likely be designated as SDFs, given their processing of financial data, large user bases, etc. That imposes an extra governance burden (audits, DPIAs, etc.), and fintech companies must prepare accordingly.

PENALTIES FOR VIOLATING DPDP ACT, 2023
Non‑compliance penalties vary depending on the nature of the breach: e.g. failure to take reasonable security safeguards, failure to notify breaches, children’s data violations, and failure to meet SDF obligations. The amounts can be large (up to hundreds of crores of rupees).

Highest penalty for breaching the DPDP Act, 2023
Under the Digital Personal Data Protection Act, 2023 (India), the highest penalty depends on the kind of violation. The maximum is ₹250 crore for the most serious breaches.

Here are the maximum penalties by type of violation:

• Failure to take reasonable security safeguards to prevent a personal data breach: ₹250 crore
• Failure to notify the Board and affected Data Principals about a breach: ₹200 crore
• Non‑fulfilment of additional obligations relating to processing children’s data: ₹200 crore
• Failure to comply with the obligations of a Significant Data Fiduciary: ₹150 crore

WITH RESPECT TO ARTICLE 21 OF THE INDIAN CONSTITUTION, IS IT LEGAL FOR A COMPANY TO SHARE DATA WITH THIRD PARTIES?
Yes, a company can in some cases lawfully include terms and conditions allowing sharing of data with third parties, but only under specific conditions under the DPDP Act, 2023 — and any such sharing must pass constitutional scrutiny under the right to privacy (which is part of Article 21 of the Indian Constitution).

Relevant legal context:
• Right to Privacy under Article 21
The Supreme Court in K. S. Puttaswamy v. Union of India (2017) held that privacy is a fundamental right under Article 21 (right to life and liberty). This includes informational privacy: the ability of individuals to control their personal data (who gets it, how it is used etc.). Any interference with privacy must satisfy tests of legality, necessity, proportionality. 

• Digital Personal Data Protection Act, 2023 (DPDP Act)
This Act is India’s statutory framework for handling digital personal data — its collection, usage, sharing etc. 

Relevant Key provisions :

• Consent: Data Fiduciaries (in effect, companies) need to obtain the free, specific, informed, unconditional, and unambiguous consent of the Data Principal (i.e. the individual) before processing their data, unless an exception applies.

The Act defines “processing” broadly to include “sharing, disclosure by transmission, dissemination or otherwise making available” etc. 

The Act allows legitimate grounds other than consent under which data may be processed or shared without consent, for example in cases of legal obligation, public interest, or security, subject to prescribed safeguards.

The rules must ensure transparency, purpose limitation, and similar safeguards; the individual should reasonably know with whom their data may be shared.

Exemptions and limitations
There are exemptions under the DPDP Act — for example for state instrumentalities, or for specific public interest or national security grounds. These are sensitive areas where sharing or other processing may be permitted even without explicit consent. 

Also, there is concern among analysts that some of the wording in the Act or rules might allow more latitude than ideal, especially via “legitimate use” exceptions.

When “sharing with third parties” via terms & conditions is legal vs when it is not

1. The company gives clear, explicit notice in the terms (or privacy policy) and obtains valid consent from the user to share their data with certain third parties, for a specified purpose and with limitations.
- ✅ Lawful under the DPDP Act.
The consent must be free, specific, and informed; it must match the purposes stated; and the sharing must stay within that scope.

2. The terms say that data may be shared widely (e.g. “third parties including partners, affiliates, etc.”) but are vague as to who receives it and for what purpose, and/or there is no real opt‑in or way for the user to withdraw.
- ⚠️ Likely invalid:
lack of specificity and insufficient consent; this may violate the proportionality test under Article 21 and could be struck down.

3. The company tries to rely on terms-and-conditions fine print to share sensitive personal data, e.g. health data or biometrics, without specific consent.
- 🚫 Probably illegal under the DPDP Act,
and possibly also unconstitutional under Article 21. The Act does not classify sensitive data separately (unlike the GDPR), but the principles of necessity, purpose limitation, etc. still apply.

4. The company includes a clause that allows sharing with third parties for “legitimate uses” under the Act, e.g. analytics or security, with the required safeguards in place.
- ✅ Could be valid,
provided all conditions in the Act are met (notice, transparency, the Data Principal is aware, etc.).

5. The government or a public authority shares data under an exemption (public interest, national security, etc.) as allowed under the Act.
- ✅ Possibly valid,
but the same constitutional tests (legality, necessity, proportionality) apply under Article 21.

Relationship between DPDP Act’s rules and Article 21
The DPDP Act is meant to give legislative backing to the right to informational privacy recognized under Article 21. So any terms or practices allowed under the Act must align with the constitutional requirement: interference with privacy must be by law, for a legitimate purpose, necessary and proportionate.
If a company’s terms violate what DPDP mandates (e.g. sharing without valid consent, or using vague/unlimited clauses), they could be challenged under DPDP (e.g. by the Data Protection Board) and possibly under Article 21 (i.e. in court).
Courts might examine whether the user had real choice; whether the consent was coerced or buried in fine print; whether the data sharing was more than necessary for the purpose stated.

CASE STUDY: Facebook-Cambridge Analytica Scandal
Facts:
1. Data Collection: The political consulting firm Cambridge Analytica gathered information from millions of Facebook users through a third-party app called "thisisyourdigitallife."
2. Data Sharing: Without the users' knowledge or approval, the app gathered information from both the people who installed it and their Facebook friends.
3. Data Use: Cambridge Analytica allegedly used the data it had gathered to produce Facebook political ads targeted at specific demographics, in an attempt to influence the 2016 US presidential election.
4. Facebook's Role: Facebook said it was unaware that the data was being shared with Cambridge Analytica, but it did permit the app to gather data.
Legal Repercussions:
1. Data Protection Laws: The controversy raised questions about Facebook's compliance with the EU General Data Protection Regulation (GDPR) and other data protection laws.
2. Federal Trade Commission (FTC) Investigation: After investigating how the company handled user data, the FTC fined Facebook $5 billion for breaching a 2012 consent order.
3. Class-Action Lawsuits: Users affected by the data leak filed multiple class-action lawsuits against Facebook.
Ethical Issues:
1. Informed Consent: Users did not consent to the collection and sharing of their data with outside parties.
2. Data Privacy: The incident made clear the need for more robust data protection measures to safeguard user privacy.
3. Manipulation and Disinformation: The use of disinformation and targeted ads raised questions about the integrity of democratic processes and the manipulation of public opinion.
4. Corporate Accountability: The controversy called into question Facebook's corporate responsibility and its duty to safeguard user information.

Conclusion
So, it is not per se illegal for a company to have terms that permit sharing of data with third parties; but to be lawful under the DPDP Act and constitutional privacy rights:
The data principal must be given clear notice about such sharing.
Consent must be valid under the DPDP Act (free, specific, informed, unconditional, and unambiguous).
The purpose must be legitimate, and sharing must be within the scope of that purpose.
The sharing must comply with safeguards, limits, and any other obligations DPDP imposes.
The clause cannot override constitutional rights; if it goes beyond what is permitted (e.g. too broad, no limits, non‑proportionate privacy intrusion), courts might strike it down under Article 21.

AMBIGUITIES UNDER THE DPDP ACT, 2023
Here are some of the key ambiguities, challenges, or gaps that people have identified in the Digital Personal Data Protection (DPDP) Act, 2023 in India.

Main Ambiguities / Areas of Concern

1. “Publicly available personal data”
It’s unclear how the Act treats personal data that is “publicly available.” IAMAI has flagged that restrictions on accessing and processing such data could impose heavy burdens, especially for AI firms.

Questions arise: what qualifies as “publicly available”? If someone shares their data for one purpose (say, RTI or regulatory disclosure), is that automatically “public,” and is processing for other purposes allowed? How much redistribution or re-dissemination is allowed?


2. “Consent” vs “Legitimate Uses” / Implied consent issues
The law requires “explicit consent” in many cases, but there are broad “legitimate uses” where processing might be allowed without explicit consent. How these interact is not fully clear. For example, Section 7(a) allows processing of data voluntarily provided unless the person has indicated non-consent. That may amount to implied consent by silence, which raises questions about how meaningful “consent” remains.

Also, once consent is withdrawn, it’s ambiguous how quickly processing must cease, and how past processing is treated.

3. Wide “exemptions” given to government / state entities
The Act gives broad power to the government to exempt state instrumentalities under certain grounds (public order, national security, etc.). But the criteria, limitations, oversight and checks for such exemptions are vague. This risks overuse or misuse.

Also, the act’s relation to RTI (Right to Information) has been affected — for example, personal information is more broadly excluded from RTI disclosures, which could reduce transparency.

4. Timelines, formats, thresholds, etc.
The rules/guidelines for things like breach notification (how soon, in what form), or how to carry out verifiable consent especially for children, or what constitutes a “significant data fiduciary” (thresholds) are not uniformly specified yet. Businesses have noted this leads to risk of non‑compliance due to uncertainty.

5. Rights of the Data Principal: withdrawal, deletion, retention
The Act provides for withdrawal of consent, but it's not fully clear in every case what obligations the fiduciary has after withdrawal (especially for data already used, or if there are conflicting legitimate uses).

Also, how long fiduciaries need to retain data once purpose is fulfilled is vague. There is risk of indefinite retention.