Part 1: Introduction and Complete Overview of the DPDP Act in Simple Language
Introduction: India enters a new era of digital trust
India has more than one billion digital citizens. They use mobile apps, ecommerce platforms, banks, hospitals, government portals, social networks, games, learning apps and every form of digital service. Until now, most companies followed privacy expectations based on global laws or informal practices. There was no single, clear and enforceable privacy law that applied across every industry and every business model.
The Digital Personal Data Protection Act changes this reality. Together with the 2024 and 2025 Rules, India now has a complete privacy framework that tells companies what they can collect, how they must process it, how long they can store it, when they must delete it, how they must secure it, and what rights every person has regarding their personal data.
This handbook exists for a simple reason. There is a significant information gap in the ecosystem. Most explanations are too legal, too vague or too short. Many teams still do not know what they need to change inside their technology, security, product, design, data and compliance workflows. The DPDP Act is clear, but the impact on real systems is often misunderstood.
This guide bridges that gap. It translates the entire law into plain language and connects it to real world business and engineering work.
It is written for everyone who uses personal data in any form: founders, CISOs, CTOs, PMs, security teams, compliance teams, data teams, AI teams and product builders.
What the DPDP Act covers in the simplest possible explanation
The DPDP Act focuses on one central idea.
- Personal data belongs to the person who provided it.
- The company that collects it is only a custodian.
Everything in the Act flows from this idea. Here is the entire Act explained section by section in clear language.
What personal data means in DPDP
Personal data is any information that can identify a person. Examples include name, email, phone number, location, financial details, health information, device identifiers and anything connected to a person through direct or indirect signals.
If your product stores, processes or exchanges any such data, you are covered under DPDP.
Who the Act applies to
The Act applies to:
- Every business that collects or processes personal data in India
- Every global company that serves Indian users
- Every startup and every mature enterprise
- Every cloud or SaaS provider that touches Indian personal data
- Every entity that uses third party APIs or AI systems where personal data flows
There are no blanket exceptions based on company size or maturity, although the government may notify limited exemptions for specific classes of fiduciaries, such as certain startups.
Rights of data principals
A data principal is the person to whom the data belongs. The Act gives them several rights.
- Right to know - They must be told why the data is collected and how it will be used.
- Right to access - They can ask a company for a copy of their data.
- Right to correction - If the data is wrong, they can request a correction.
- Right to deletion - When the purpose is over, the person can ask the company to delete the data.
- Right to grievance - Every company must provide a proper grievance channel.
The Rules define clear response timelines for all of these.
Duties of data fiduciaries
A data fiduciary is the company that decides why data is collected and how it will be used. The fiduciary is responsible for:
- Transparent notice and consent
- Collecting only the required data
- Securing the data
- Deleting the data when the purpose ends
- Responding to grievances within the allowed period
- Ensuring processors follow the rules
- Reporting breaches
- Proving compliance when required
Responsibility cannot be transferred. Even if another vendor processes the data, the fiduciary remains accountable.
What data processors do under the Act
A data processor only processes data on behalf of the fiduciary. Examples include cloud providers, analytics tools, CRM platforms and AI services.
Processors must follow the instructions of the fiduciary and must not misuse the data. The fiduciary must ensure the processor meets all security requirements.
Consent and notice requirements
Consent must be:
- Free
- Specific
- Informed
- Clear
- Reversible
A person must always know why their data is being collected and what will happen to it. Dark patterns or confusing wording are not allowed. Consent withdrawal must be as easy as consent provision.
Purpose limitation and data minimisation
- The company must collect only what is needed.
- If three fields are enough, it cannot collect fifteen.
- If the purpose ends, the data cannot be kept forever.
This is one of the biggest operational changes for Indian companies.
Data retention and data deletion
DPDP requires companies to delete data once the purpose is complete. This includes:
- User accounts that are inactive
- Logs containing personal identifiers
- AI training data with user details
- RAG and vector memories that store personal content
- Old CRM entries that have no valid purpose
Companies need automated or well documented deletion processes.
Children and minor data rules
Handling data of anyone under eighteen comes with stronger rules. The company must:
- Obtain verifiable consent from a parent or guardian
- Avoid tracking
- Avoid targeted advertising
- Avoid profiling
The Rules give detailed specifications for age verification and processing safeguards.
Cross border data transfers
The Act does not ban international transfer. Instead, the Central Government can notify countries or territories to which transfers of personal data are restricted.
Companies that rely on global cloud services must closely track these notifications.
Penalties and consequences
The DPDP Act includes significant monetary penalties. Examples include penalties for:
- Failure to protect personal data
- Failure to notify breaches
- Failure to delete data
- Failure to provide user rights
- Violation of children related rules
- Repeated non compliance with Board orders
Penalties can reach two hundred and fifty crore rupees (Rs 250 crore) for major violations.
Why the DPDP Act matters for modern digital businesses
The Act brings clarity to four long standing gaps.
Gap one: unclear responsibility
- Companies were never fully sure who was accountable for personal data.
- DPDP makes this very clear.
- The fiduciary is responsible for everything.
Gap two: excessive data collection
- Most apps collected far more information than needed.
- DPDP forces real discipline.
Gap three: no deletion systems
- Indian companies rarely built deletion workflows.
- DPDP requires deletion once the purpose ends.
Gap four: weak observability across APIs and AI systems
- Modern systems involve thousands of data flows across APIs, AI models, agents and processors.
- DPDP requires companies to know where personal data flows and how it is protected.
- This aligns naturally with runtime discovery and visibility platforms such as Levo.
Part 2: The DPDP Rules of 2024 and 2025 Explained in Clear Language
The DPDP Act provides the principles and the legal foundation. The Rules of 2024 and 2025 provide the operational instructions that every company must follow. This is where the law becomes real.
This is also where many teams get confused because the Rules introduce specific processes, timelines, safeguards and documentation requirements that did not exist earlier.
This section breaks down the Rules into simple and practical explanations so that every team can understand what needs to change in their workflow.
The purpose of the DPDP Rules
The Rules aim to do four things.
- Make the Act operational - They convert the broad ideas of the Act into real world tasks that companies must perform.
- Define exact procedures - For example breach reporting, grievance response times, consent formats and verification of parental consent for children.
- Establish compliance and audit structures - They explain how companies must maintain records and prove compliance.
- Guide enforcement and penalties - They help the Data Protection Board interpret actions and determine penalties.
The biggest changes introduced by the Rules
Here is a clear view of what the Rules add that the Act alone did not specify.
Clear consent and notice instructions
Consent must contain a proper notice that explains:
- the purpose of data collection
- the type of data collected
- the processing activities
- the retention logic
- the withdrawal process
- the rights available to the person
The notice must use simple language.
- No misleading elements.
- No pressure based design.
- No hidden consequences.
Consent withdrawal must be easy and immediate. If consent is withdrawn, the company must stop processing that data unless another valid legal basis exists.
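One way to make these requirements concrete is to model each consent as an append-style log entry that is never deleted, only marked withdrawn. The sketch below is illustrative; the field names and structure are assumptions, not a format prescribed by the Rules.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    principal_id: str
    purpose: str                    # the specific purpose stated in the notice
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Withdrawal is recorded rather than the record deleted,
        # so the consent log keeps its audit trail.
        self.withdrawn_at = datetime.now(timezone.utc)

def may_process(record: ConsentRecord, purpose: str) -> bool:
    # Processing is allowed only for the exact purpose consented to,
    # and only while consent has not been withdrawn.
    return record.withdrawn_at is None and record.purpose == purpose

rec = ConsentRecord("user-42", "order_delivery", datetime.now(timezone.utc))
assert may_process(rec, "order_delivery")
assert not may_process(rec, "marketing")       # purpose limitation
rec.withdraw()
assert not may_process(rec, "order_delivery")  # processing must stop
```

The key design point is that the purpose check and the withdrawal check sit in one gate that every processing path calls, rather than being enforced by convention.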
Verification rules for children and minors
The Rules provide precise guidance on the following.
- Age confirmation - Companies must implement a verification mechanism to confirm whether a person is a child.
- Guardian consent - If the person is under eighteen, consent must come from a parent or guardian.
- Processing restrictions - Companies cannot track, profile or target children.
- Record keeping - The company must maintain logs of verification and consent.
This part is very important for EdTech, online gaming, ecommerce, social platforms and health startups.
Grievance redressal timelines
The Rules specify how fast a company must respond.
- Acknowledgment must be prompt
- Resolution must be within a clearly defined duration
- Escalation must be allowed if the person is not satisfied
If the user remains unhappy, they can approach the Data Protection Board. These timelines require companies to create a proper support or compliance desk with defined internal service levels.
Breach reporting procedures
One of the most significant additions relates to how companies must act in the event of a breach. The Rules clarify:
- what counts as a data breach
- when the fiduciary must notify the Data Protection Board
- what the notification must contain
- when individuals must be informed
- what intermediate steps must be taken
- what logs must be presented during investigations
Examples of what must be included in the report:
- nature of the breach
- categories of data affected
- estimated number of people affected
- immediate containment actions
- recovery measures taken
- long term mitigation plan
The speed and completeness of this report influence the penalty.
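A simple way to enforce completeness is to give the report a fixed structure and refuse to send it while any section is empty. This is a sketch under assumed field names; the actual notification format comes from the Rules.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class BreachReport:
    # Sections mirror the report contents listed above; names are illustrative.
    nature_of_breach: str
    data_categories: list
    estimated_people_affected: int
    containment_actions: list
    recovery_measures: list
    mitigation_plan: str

def to_board_notification(report: BreachReport) -> str:
    # Reject a report with empty sections before it goes out,
    # since completeness influences how the incident is judged.
    missing = [k for k, v in asdict(report).items() if v in ("", [], None)]
    if missing:
        raise ValueError(f"incomplete breach report, missing: {missing}")
    return json.dumps(asdict(report), indent=2)

report = BreachReport(
    nature_of_breach="credential stuffing on login API",
    data_categories=["email", "phone_number"],
    estimated_people_affected=1200,
    containment_actions=["blocked source IPs", "forced password resets"],
    recovery_measures=["restored rate limits"],
    mitigation_plan="add anomaly detection on authentication traffic",
)
notification = to_board_notification(report)
```

A template like this also doubles as an internal checklist during the incident itself.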
Classification of Significant Data Fiduciaries
The DPDP Act introduced the concept but the Rules refine how this classification works. A company may be classified as a Significant Data Fiduciary based on:
- volume of personal data processed
- sensitivity of data
- risk to national interests
- risk to public order and health
- use of advanced technologies including AI with personal data
- potential impact of breaches
Once classified, the entity must meet enhanced obligations. These include:
- conducting regular Data Protection Impact Assessments
- completing independent audits
- appointing a Data Protection Officer
- improving security controls
- maintaining detailed processing records
This classification is extremely relevant to financial institutions, hospitals, ecommerce marketplaces, telecom providers and AI based service platforms.
Data retention and deletion instructions
The Act required deletion once the purpose is complete. The Rules explain how to operationalize this. Companies must:
- define retention periods for each category of data
- document the logic clearly
- delete data once the purpose is complete
- ensure processors delete data as well
- maintain deletion records
- automate deletion when possible
This becomes one of the most complex implementation tasks for large enterprises that have many systems and old data.
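In practice, a retention policy document can be expressed as configuration that a scheduled job reads, so the documented logic and the executed logic never drift apart. The categories and durations below are examples only; each company must derive its own from documented purposes and any sector rules that apply.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Retention periods per data category (illustrative values).
RETENTION_POLICY = {
    "support_tickets": timedelta(days=365),
    "marketing_profiles": timedelta(days=90),
    "access_logs": timedelta(days=180),
}

def is_expired(category: str, collected_at: datetime,
               now: Optional[datetime] = None) -> bool:
    now = now or datetime.now(timezone.utc)
    return now - collected_at > RETENTION_POLICY[category]

def sweep(records):
    """Split records into (kept, deleted); the deleted list feeds the deletion log."""
    kept, deleted = [], []
    for r in records:
        target = deleted if is_expired(r["category"], r["collected_at"]) else kept
        target.append(r)
    return kept, deleted

now = datetime.now(timezone.utc)
records = [
    {"id": 1, "category": "marketing_profiles", "collected_at": now - timedelta(days=200)},
    {"id": 2, "category": "support_tickets", "collected_at": now - timedelta(days=10)},
]
kept, deleted = sweep(records)
assert [r["id"] for r in deleted] == [1]
assert [r["id"] for r in kept] == [2]
```

Recording what `sweep` deleted, and when, produces exactly the deletion records the Rules expect companies to maintain.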
Requirements for audit and compliance records
The Rules expect companies to maintain detailed logs that prove compliance. Examples include:
- consent logs
- access logs
- processing records
- purpose records
- data deletion logs
- breach investigation logs
- DPIA reports where applicable
- records of third party processors and their controls
The company must be able to produce these records on request.
Specific instructions for data processing by third parties
The Rules clarify that:
- any processor must follow the instructions of the fiduciary
- the fiduciary must ensure the processor uses appropriate security controls
- the fiduciary must remain accountable for any misuse by the processor
- contracts must include required safeguards
- processors cannot combine data for their own business advantage
Companies that rely on CRM tools, marketing automation platforms, analytics tools, cloud AI models, API gateways and integration partners must update their agreements accordingly.
Rules for cross border data transfer
The Rules explain how the government will notify the jurisdictions and conditions that restrict where personal data may be transferred. Companies must:
- avoid transferring personal data to restricted countries
- ensure contracts include protections
- ensure data transferred still meets DPDP safeguards
- track future updates to the allowed list
This affects SaaS, cloud, support centers, global AI services and multinational companies.
Strengthening of security practices
The Rules place strong emphasis on measurable security. Companies must adopt reasonable security safeguards which include:
- encryption
- access control
- multi factor authentication
- monitoring
- continuous detection
- prompt breach containment
- log integrity
- vulnerability reduction measures
- controlled data access
- role based access
- third party risk monitoring
The Rules make clear that negligence or weak systems can result in heavy penalties.
Why the Rules change how companies operate
Until the Rules were published, many companies thought compliance meant policy documents, privacy pages and an occasional audit. The Rules now require companies to redesign how they collect, store, process, transfer and delete personal data.
The Rules make DPDP a living operational discipline, not a soft guideline. For example:
- engineering teams must redesign data flows
- product teams must redesign consent screens
- security teams must improve observability across APIs and AI systems
- data teams must create deletion processes
- compliance teams must maintain detailed logs
- support teams must handle grievances under strict timelines
The Rules also make runtime visibility a key part of compliance because companies must know how personal data moves through internal and external systems. This includes APIs, agents, AI models, MCP servers and third party services.
Platforms like Levo naturally assist in this area because they uncover unknown data flows, unknown API exposure, unmonitored AI agent activity and sensitive data paths that may otherwise go untracked.
Part 3: What Companies Must Do to Comply With the DPDP Act (Practical and Technical Guidance)
The DPDP Act and the Rules are clear about what companies must achieve. However, they do not tell you how to redesign your systems, processes and workflows to reach compliance.
This part of the handbook explains what companies must actually do in real world terms. It covers the required changes inside engineering, security, product, design, data and compliance teams.
This is the action oriented, operational part of DPDP.
The Six Pillars of DPDP Compliance
Every organisation must adopt six core pillars of privacy and security practice.
- Pillar 1 - Know what personal data you collect and where it lives.
- Pillar 2 - Collect only what you need and use it for the stated purpose.
- Pillar 3 - Protect the data with strong security controls.
- Pillar 4 - Provide rights and grievance pathways to users.
- Pillar 5 - Delete the data when the purpose is complete.
- Pillar 6 - Maintain records that prove you did all of the above.
These pillars form the foundation for compliance work across teams.
Step by step view of what companies must implement
Below is a practical blueprint that any company can follow. This works for small startups, mid sized businesses, large enterprises and global companies.
Data discovery and classification
This is the first and most critical step because everything else depends on it. You cannot secure or delete what you cannot see or identify.
Companies must:
- create an inventory of all personal data they collect
- identify which systems store personal data
- identify which APIs transmit personal data
- identify which AI models, agents or processors see personal data
- classify the risk level of each data set
- record the purpose for which each data set exists
This step requires cooperation between product teams, engineering teams, data teams and security teams.
Runtime platforms that reveal unknown APIs, unmonitored agent actions, vector storage patterns, and third party data flows become essential in this phase because many companies have personal data flowing in ways nobody documented.
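A first-pass inventory can start as a simple registry that records where each personal data field lives, its risk level and its documented purpose. The systems, fields and risk scale below are illustrative assumptions, not a required schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataAsset:
    system: str      # where the data lives
    field_name: str  # the personal data field
    category: str    # e.g. contact, financial, health
    risk: str        # low / medium / high -- an illustrative scale
    purpose: str     # the documented reason the field exists

INVENTORY = [
    DataAsset("orders-db", "phone_number", "contact", "medium", "delivery updates"),
    DataAsset("payments-api", "card_last4", "financial", "high", "payment receipts"),
    DataAsset("support-crm", "email", "contact", "medium", "ticket responses"),
    DataAsset("legacy-exports", "address", "contact", "high", ""),
]

def high_risk(inventory):
    return [a for a in inventory if a.risk == "high"]

def missing_purpose(inventory):
    # Assets with no documented purpose are the first minimisation candidates.
    return [a for a in inventory if not a.purpose.strip()]
```

Even a registry this small makes two compliance questions answerable on demand: which data is high risk, and which data has no reason to exist.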
Purpose definition and minimisation
For each field of personal data, the company must define a purpose. This purpose must be clear and specific. If the data is not required for that purpose, it cannot be collected.
Examples:
- Collecting location only when delivery tracking is required
- Collecting health data only when it is needed for medical service delivery
- Collecting phone number only for account verification
The purpose must be documented inside internal records. This record is required during investigations or audits.
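One way to keep collection and documentation in lockstep is a purpose registry that collection code must pass through: if a field has no recorded purpose, it simply cannot be collected. The registry entries below echo the examples above and are illustrative.

```python
# Maps each collectable field to its single documented purpose.
PURPOSE_REGISTRY = {
    "location": "delivery tracking",
    "health_record": "medical service delivery",
    "phone_number": "account verification",
}

def validate_collection(fields):
    """Refuse to collect any field that has no documented purpose."""
    undocumented = [f for f in fields if f not in PURPOSE_REGISTRY]
    if undocumented:
        raise ValueError(f"no documented purpose for: {undocumented}")
    return {f: PURPOSE_REGISTRY[f] for f in fields}

assert validate_collection(["phone_number"]) == {"phone_number": "account verification"}
```

Calling `validate_collection(["phone_number", "browsing_history"])` would raise, which turns a policy rule into a hard technical gate.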
Consent and notice design
Engineering teams and product teams must redesign the collection screens and flows. Consent must be tied to the exact purpose. A user must understand what they are agreeing to.
Key requirements
- The notice should be short and clear
- The purpose must be stated in simple words
- The user must easily see the retention logic
- The withdrawal option must be simple
- Consent logs must be recorded with time stamps
- The user should not be forced or tricked
Common mistakes
- Bundled consent where a single click covers many unrelated purposes
- Dark pattern interfaces that confuse users
- Lack of transparency about how long the data is stored
Correcting these mistakes requires close work between design, product and legal teams.
Data flows and engineering changes
Once the company knows what data exists and the purpose for which it exists, engineering teams must update backend systems.
Required changes
- Map every flow of personal data between internal services
- Map every flow of personal data into third party APIs
- Remove unnecessary data collection
- Remove unnecessary data forwarding
- Mask or tokenize sensitive data where possible
- Add access control checks
- Add proper retention logic
- Add deletion triggers
- Ensure processors do not store extra data
This often becomes one of the largest workstreams because modern software stacks are complex, distributed and deeply interconnected.
Many companies discover that a significant share of their personal data moves through unknown or forgotten APIs or through AI agents that perform tasks behind the scenes. This is a major compliance risk that must be corrected.
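Masking and tokenization are two of the simplest engineering changes from the list above. The sketch below uses a keyed HMAC so tokens are stable (usable as join keys) but not reversible; the key name and masking style are assumptions for illustration.

```python
import hashlib
import hmac

# In a real system this key would come from a secrets manager and be rotated.
TOKEN_KEY = b"example-key-do-not-use"

def tokenize(value: str) -> str:
    """Replace a direct identifier with a stable, keyed, non-reversible token."""
    return hmac.new(TOKEN_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

def mask_phone(phone: str) -> str:
    """Keep only the last four digits for display or logs."""
    return "*" * (len(phone) - 4) + phone[-4:]

assert tokenize("9876543210") == tokenize("9876543210")  # stable join key
assert tokenize("9876543210") != tokenize("9876543211")
assert mask_phone("9876543210") == "******3210"
```

Downstream services that only need to correlate users can receive the token; services that only need display values can receive the mask; neither needs the raw identifier.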
Data retention and deletion systems
DPDP requires deletion once the purpose ends. This is a major engineering and data management responsibility.
Companies must
- define retention periods for every data category
- store the retention logic in a clear document
- build automatic deletion jobs wherever possible
- build deletion request workflows for user initiated erasure
- verify that processors delete the data as well
- record each deletion action in a log
Challenges
- legacy systems that do not support deletion
- AI training corpora that contain personal data
- vector store memories that keep user details for long periods
- log storage practices that contain identifiers
Addressing these challenges requires strong involvement from engineering, data and security teams.
Access control and internal security
The Rules require reasonable security safeguards. At minimum, this means:
- strong authentication
- proper role based access
- limited access to sensitive data
- regular monitoring of access logs
- immediate revocation for inactive accounts
- secure development practices
- audit of internal privilege levels
Access control is central to breach prevention. In many organisations, too many systems have access to personal data. This must be reduced.
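Role based access with audited denials can be expressed as a single read gate that every data access passes through. The roles and fields below are illustrative assumptions.

```python
# Role-to-field permissions; the roles and fields are illustrative.
ROLE_PERMISSIONS = {
    "support_agent": {"email", "order_history"},
    "fraud_analyst": {"email", "order_history", "payment_method"},
}

ACCESS_LOG = []

def read_field(role: str, user_record: dict, field_name: str):
    allowed = field_name in ROLE_PERMISSIONS.get(role, set())
    # Every attempt is logged, including denials, to support audits.
    ACCESS_LOG.append({"role": role, "field": field_name, "allowed": allowed})
    if not allowed:
        raise PermissionError(f"{role} may not read {field_name}")
    return user_record[field_name]

assert read_field("support_agent", {"email": "a@b.com"}, "email") == "a@b.com"
```

Because denials are logged too, the access log shows not only who saw personal data but who tried to, which is exactly what a privilege audit needs.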
Monitoring and continuous observability
Companies must know how personal data flows inside their environment. They must detect leaks and anomalies. They must track misuse or accidental exposure.
This requires
- runtime visibility across services
- API traffic inspection
- AI agent action tracing
- third party API call monitoring
- detection of sensitive data transfers
- alerting for unusual patterns
Many companies do not have this today. This is where platforms like Levo become critical because they reveal hidden or unexpected data movement that can violate DPDP.
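At its core, sensitive data detection in traffic is pattern matching over request and response bodies. The patterns below are rough heuristics, real detection needs broader rules plus context, but they show the mechanism.

```python
import re

# Rough, illustrative heuristics for spotting personal data in API payloads.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "indian_phone": re.compile(r"(?:\+91[\s-]?)?[6-9]\d{9}"),
    "pan": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),  # Indian PAN number format
}

def scan_payload(text: str):
    """Return the PII categories detected in a request or response body."""
    return sorted(name for name, pattern in PII_PATTERNS.items()
                  if pattern.search(text))

assert scan_payload('{"user": "a@b.com", "phone": "9876543210"}') == ["email", "indian_phone"]
```

Running a check like this on sampled traffic, then alerting when a category appears on an endpoint where it was never documented, is the essence of the monitoring the Rules expect.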
User rights and internal response systems
The Act grants several rights to users. Companies must build systems that can handle these requests. These include
- request for access
- request for correction
- request for erasure
- request for grievance support
The company must respond within the timelines defined by the Rules. Internal teams must
- track every request
- record every response
- maintain proof of resolution
- ensure deletion is complete
This requires a coordinated flow between customer support, compliance and engineering teams.
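Tracking these requests against deadlines can be as simple as stamping each one with its response window. The Rules fix the actual timelines; the durations below are placeholders, not the statutory periods.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Placeholder response windows -- substitute the periods the Rules define.
RESPONSE_WINDOW = {
    "access": timedelta(days=30),
    "correction": timedelta(days=30),
    "erasure": timedelta(days=30),
    "grievance": timedelta(days=7),
}

@dataclass
class RightsRequest:
    kind: str
    received_at: datetime
    resolved_at: Optional[datetime] = None

    def deadline(self) -> datetime:
        return self.received_at + RESPONSE_WINDOW[self.kind]

    def overdue(self, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return self.resolved_at is None and now > self.deadline()

now = datetime.now(timezone.utc)
req = RightsRequest("erasure", now - timedelta(days=40))
assert req.overdue(now)
req.resolved_at = now
assert not req.overdue(now)
```

A nightly report of overdue requests gives compliance teams the early warning they need before a user escalates to the Data Protection Board.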
Grievance handling systems
The Rules expect companies to create structured grievance redressal systems. These include
- a clear contact address or portal
- timely acknowledgement
- timely resolution
- escalation path
- proper documentation
If companies fail to resolve grievances on time, users can approach the Data Protection Board. Persistent failures increase penalty risks.
Third party management and contractual updates
Most companies rely on many third party services that process personal data. Examples include
- CRM tools
- analytics tools
- cloud services
- AI services
- payment services
- communication providers
- customer support tools
DPDP requires companies to update their contracts with these services to include
- security safeguards
- retention logic
- deletion rules
- breach reporting duty
- restrictions on reuse
- verification of controls
The fiduciary remains accountable even if the processor fails. This is a major change in responsibility.
Data Protection Impact Assessments
Companies classified as Significant Data Fiduciaries must perform Data Protection Impact Assessments for high risk processing activities. The assessment must cover
- nature of processing
- type of data
- potential harm
- likelihood of misuse
- risk to rights of individuals
- safeguards implemented
- mitigation steps
DPIAs must be available for inspection. They must be updated when systems change.
Audit readiness and documentation
Companies must maintain detailed records. These include
- consent logs
- processing records
- purpose records
- access logs
- deletion logs
- breach logs
- DPIA documents
- third party reports
Many companies fail DPDP compliance not because they lack controls but because they lack documentation.
The real world difficulty of DPDP compliance
Compliance looks simple in theory. In practice it is complex because most modern systems have many
- microservices
- APIs
- background jobs
- data queues
- third party services
- AI models
- agent actions
Many flows of personal data are invisible to the team. This is why DPDP requires continuous observability and complete visibility into
- internal API calls
- external API calls
- AI actions
- data storage patterns
- data movement across networks
- identity behaviour across the stack
The stronger the visibility, the lower the compliance risk.
Part 4: DPDP Implications for API Systems, AI Systems and Comparison With Indian Sector Laws
Modern companies depend on complex networks of APIs, AI models, agents, workflows and third party platforms. These systems often process personal data in ways teams do not fully see or understand. This creates a major compliance challenge because DPDP expects companies to know exactly where personal data flows and how it is protected.
Part 4 explains how DPDP applies to API environments, AI architectures and the broader Indian regulatory landscape. This is where technology complexity meets legal responsibility.
DPDP and API ecosystems
Every company uses APIs. They may be internal service calls, partner integrations or third party platforms. These APIs often contain or transmit personal data. DPDP requires companies to observe and manage this entire ecosystem.
Personal data often flows through internal APIs without teams knowing
In most companies, internal APIs pass personal data between services for functions such as
- authentication
- user profile retrieval
- payments
- order management
- address handling
- support workflows
- marketing activities
DPDP requires companies to know which API carries what data and why. This must be documented and secured.
Shadow APIs create compliance risk
A shadow API is any API that exists in production but is not tracked or documented. This happens frequently when teams move fast or when older code is not removed. Shadow APIs are dangerous because
- they may expose personal data
- they often lack proper authentication
- no one monitors them
- they violate purpose limitation
- they violate deletion requirements
DPDP does not accept ignorance as a defence. If an unknown API leaks personal data, the fiduciary is accountable.
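Detecting shadow APIs reduces to comparing two sets: endpoints observed in live traffic and endpoints in the documented inventory. The endpoint paths below are invented for illustration.

```python
def find_shadow_apis(observed: set, documented: set) -> set:
    """Endpoints seen in live traffic but missing from the API inventory."""
    return observed - documented

def find_unused_docs(observed: set, documented: set) -> set:
    # Documented endpoints with no observed traffic: candidates for removal.
    return documented - observed

observed = {"/v1/users", "/v1/orders", "/v1/debug/export"}
documented = {"/v1/users", "/v1/orders", "/v2/orders"}
assert find_shadow_apis(observed, documented) == {"/v1/debug/export"}
assert find_unused_docs(observed, documented) == {"/v2/orders"}
```

The hard part in practice is building the `observed` set, which is why runtime traffic discovery matters: the set difference itself is trivial once both inventories exist.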
Third party API calls must follow DPDP rules
Companies often send personal data to third party tools such as
- CRM platforms
- email providers
- analytics tools
- support platforms
- payment gateways
- AI model providers
- marketing tools
DPDP requires companies to
- verify that the third party uses proper safeguards
- document the exact purpose of the transfer
- ensure deletion happens at the third party
- maintain a log of every transfer
- update contracts with required protections
If the third party misuses the data, the fiduciary still remains responsible.
API security aligns directly with DPDP
DPDP expects personal data to be protected with reasonable safeguards. APIs become one of the most important control points for privacy. Required protections include:
- authentication
- access control
- rate control
- encryption
- sensitive field masking
- secure error messages
- monitoring and detection
- anomaly alerts
A breach through an insecure API triggers penalty exposure under the Act.
API observability is essential for compliance
DPDP requires companies to know and control personal data flows. This means
- tracking all API calls
- detecting unknown endpoints
- seeing personal data in requests or responses
- mapping data paths across services
- linking them to the purpose of processing
Platforms like Levo provide this type of runtime API visibility. This becomes foundational for DPDP compliance because it reveals data movement that teams never documented.
DPDP and AI systems
AI systems introduce new privacy risks because they process personal data in ways that are often invisible. AI systems include
- large language models
- RAG pipelines
- enterprise agents
- consumer facing chatbots
- model context servers
- vector memory stores
- automated decision engines
DPDP applies fully to all these systems when personal data is involved.
AI inputs often contain personal data
Examples
- customer questions with names
- support tickets with sensitive details
- health queries
- financial details
- addresses
- phone numbers
- chat history
If personal data enters an AI model, DPDP applies.
AI outputs may also contain personal data
AI models can leak:
- content from previous sessions
- memorised patterns
- sensitive tokens
- internal hints of private information
This is counted as a privacy breach under DPDP.
RAG and vector memory create hidden retention
Many AI systems use retrieval based workflows. This creates
- long term storage of user messages
- storage of business records
- storage of sensitive personal content
DPDP requires deletion once the purpose ends. Vector memories need proper retention logic and deletion workflows.
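Erasure from a vector memory is only possible if each stored embedding carries metadata linking it back to the data principal. The in-memory sketch below assumes such a `principal_id` tag; most vector databases support equivalent metadata filters at scale.

```python
def erase_principal(vector_store, principal_id):
    """Drop every embedding whose metadata ties it to the data principal,
    and return the removal count for the deletion log."""
    kept = [v for v in vector_store
            if v["metadata"].get("principal_id") != principal_id]
    return kept, len(vector_store) - len(kept)

store = [
    {"id": "v1", "metadata": {"principal_id": "user-42"}},
    {"id": "v2", "metadata": {"principal_id": "user-7"}},
    {"id": "v3", "metadata": {"principal_id": "user-42"}},
]
store, removed = erase_principal(store, "user-42")
assert removed == 2
assert [v["id"] for v in store] == ["v2"]
```

The design lesson is that the tagging must happen at ingestion time: embeddings stored without a principal identifier cannot be selectively erased later.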
AI agents create new privacy and security concerns
Enterprise agents can
- fetch data from internal APIs
- call third party services
- update records
- run tasks with privileged access
If an agent retrieves personal data without purpose alignment or without authority, this violates DPDP. Companies must track
- which agent accessed what data
- when the access was performed
- whether the purpose was valid
- whether sensitive information was transmitted externally
This requires strong runtime observability.
Model logs and telemetry may contain personal data
Personal data may be stored unintentionally in
- prompt logs
- output logs
- error logs
- developer debug logs
- sandbox sessions
Log retention is often long, which violates DPDP if deletion rules are not applied.
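The safest fix for logs is to redact identifiers before they are ever written, rather than trying to delete them later. A minimal sketch using Python's standard `logging.Filter`, with the same rough patterns caveat as any heuristic detector:

```python
import logging
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"(?:\+91[\s-]?)?[6-9]\d{9}")

class RedactPII(logging.Filter):
    """Rewrite log messages so obvious identifiers never reach storage."""
    def filter(self, record: logging.LogRecord) -> bool:
        msg = str(record.msg)
        msg = EMAIL.sub("[email]", msg)
        msg = PHONE.sub("[phone]", msg)
        record.msg = msg
        return True  # keep the (now redacted) record

logger = logging.getLogger("app")
logger.addFilter(RedactPII())
```

Because redaction happens inside the logging pipeline, it covers prompt logs, error logs and debug logs alike, anywhere that routes through the same logger.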
When a company must perform a Data Protection Impact Assessment for AI
A Data Protection Impact Assessment becomes necessary when:
- AI processes sensitive personal data
- AI makes decisions that impact individuals
- AI combines data from many sources
- AI processes large volumes of personal data
- AI uses inferred attributes
- AI interacts with minors
The DPIA must evaluate risks and safeguards before deployment.
Observability is essential for AI compliance
Companies must be able to track and explain:
- which model was called
- what type of data went into the model
- which agent triggered that interaction
- what external services received the model output
- what personal data flowed through the system
- what retention rules apply
Visibility platforms that capture model usage, agent actions and data flows become essential for DPDP readiness.
Comparison of DPDP with Indian sector laws
- India already has strong sector specific privacy obligations.
- DPDP sits on top of these laws and creates a unified foundation.
- Sector laws still apply.
- DPDP adds another layer of responsibility.
Below is a comparison with the most important sector laws.
RBI instructions for financial institutions
RBI guidelines already require:
- strong security controls
- data minimisation
- breach reporting
- customer protection
- control over third parties
- strict handling of account data
DPDP adds:
- deletion rules
- explicit consent standards
- cross border transfer controls
- rights for individuals
- record keeping for audits
BFSI companies must merge both sets of expectations.
IRDAI guidelines for insurance companies
IRDAI requires:
- strict protection of health and policy data
- privacy safeguards in digital channels
- oversight of third party processors
- breach communication
DPDP adds:
- purpose definition
- consent clarity
- retention limits
- deletion obligations
- user rights
- detailed grievance timelines
Insurance platforms must update workflows accordingly.
NDHM requirements for healthcare and health tech
NDHM already defines strict rules for:
- health record confidentiality
- consent for data sharing
- retention norms for medical data
- secure exchange across providers
DPDP adds:
- uniform notice standards
- deletion rules once purpose ends
- rights to erasure
- duties for AI models handling health data
Health tech companies must align both frameworks.
Industry codes for telecom, ecommerce and others
Telecom and ecommerce sectors already follow:
- data security rules
- user privacy expectations
- lawful access procedures
DPDP strengthens:
- consent frameworks
- storage limits
- grievance handling
- deletion obligations
- accountability of processors
These changes apply to every large consumer platform.
Why DPDP requires stronger technical observability than older sector laws
Older laws focused mostly on policy and technical safeguards. DPDP is more operational. It expects companies to prove what happened in their systems. This includes:
- data flow proof
- access proof
- deletion proof
- purpose proof
- breach investigation proof
Without observability across APIs, services and AI agents, companies cannot meet these expectations. This is why DPDP readiness often requires more than policy documents. It requires runtime visibility across the digital environment.
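One way to make these proofs concrete is to emit a structured, append-only audit record for every sensitive event. The sketch below is illustrative, not a prescribed format; the event names and fields are assumptions chosen to mirror the proof categories listed above:

```python
import json
from datetime import datetime, timezone

def audit_event(event_type: str, actor: str, data_category: str, detail: str) -> str:
    """Build one append-only audit record as a JSON line.

    Event types mirror the proofs DPDP expects a company to produce:
    "access", "deletion", "transfer", "purpose_check", "breach_investigation".
    """
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": event_type,
        "actor": actor,              # service, staff member or AI agent that acted
        "data_category": data_category,
        "detail": detail,
    }
    return json.dumps(record, sort_keys=True)

# Example: recording deletion proof from a hypothetical retention job
line = audit_event("deletion", "retention-job", "phone_number",
                   "purged expired records from orders database")
```

Writing such lines to tamper-evident storage gives auditors and investigators a timeline without relying on memory or reconstruction.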
Part 5: Industry Deep Dives and Visual Flow Diagrams for Consent and Data Life Cycle
- DPDP does not affect every industry equally.
- Some sectors handle large volumes of personal data.
- Some process sensitive categories of data.
- Some operate complex digital systems with many APIs, agents and external partners.
These differences create unique compliance challenges.
Industry Deep Dives
Each industry faces specific risks, obligations and operational changes under DPDP. This section provides guidance that connects legal expectations to real system behavior.
DPDP for Banking and Fintech
Banking and fintech platforms handle some of the most sensitive personal and financial information in India. They also rely on complex API ecosystems and use advanced analytics and AI models.
Core risks
- exposure of account details
- exposure of KYC data
- misuse of device information
- transfer of personal data to third party payment partners
- retention of transaction identifiers for long periods
- use of AI models trained on personal financial patterns
- cross border operations with cloud partners
DPDP impact
- Purpose control - Banks must document why they collect each piece of information such as addresses, identifiers, KYC documents and transaction metadata.
- Deletion - KYC and financial data have retention rules defined by RBI. DPDP adds a clear need to delete any data that falls outside those rules.
- AI models - AI models used for fraud detection, credit scoring or risk profiling must undergo Data Protection Impact Assessments if personal or sensitive data is involved.
- Third party oversight - Banks must update contracts with payment partners, fintech aggregators, data verification services and cloud AI providers.
- Observability - Banks must map every data path across internal systems and API gateways to remove blind spots.
DPDP for Insurance
Insurance platforms process medical information, identity documents, financial data, photographs, case records and health claims. This includes some of the most sensitive information under DPDP.
Core risks
- uncontrolled storage of medical records
- transfer of claim files to external assessors
- data reuse for product upselling
- long term retention of photographs and reports
- AI models used for claim prediction leaking sensitive information
DPDP impact
- Verification of purpose - Insurance companies must define exactly why each data element is collected. For example a photograph collected for claim assessment cannot be reused for marketing.
- Deletion - After regulatory retention ends, data cannot be kept forever. This requires structured deletion workflows.
- AI governance - AI models that assess claims or detect fraud must have proper safeguards and audit records.
- Grievance handling - Insurance customers must be able to correct or request deletion of personal information that is no longer required.
DPDP for Healthcare and Health Tech
Healthcare data is extremely sensitive.
Health tech platforms store electronic health records, lab reports, diagnostic images, chat conversations with doctors and even biometric data.
Core risks
- storage of patient history across many internal systems
- unmonitored access by staff
- data sharing across hospitals
- AI models that process medical conversations
- long term retention of highly sensitive records
- unstructured data being stored in logs
DPDP impact
- Strict purpose limitation - Every collection of patient data must be tied to a specific care purpose.
- Consent clarity - Patients must understand how their data will be used across providers and digital systems.
- Deletion - Once the purpose ends and regulatory retention is complete, the data must be deleted.
- AI activity - AI based triage or diagnostic systems must undergo detailed risk assessments.
- Third party risk - Hospitals often use many external platforms such as lab systems, imaging systems, support systems and telehealth platforms. Contracts must reflect DPDP safeguards.
DPDP for Ecommerce and Retail
Ecommerce systems collect large volumes of personal data such as name, phone number, address, device identifiers, purchase history, preferences and support conversations.
Core risks
- unnecessary data retention
- exposure through logistics APIs
- weak access control for order history
- targeted marketing based on sensitive patterns
- chat systems that store personal information for long periods
DPDP impact
- Tracking control - Users must know what data is collected during browsing and purchase activity.
- Purpose clarity - Address and phone number must be used for delivery and support, not for unrelated profiling.
- Deletion - Users should be able to request deletion when the account is inactive.
- Breach reporting - Any exposure of personal order history must be promptly reported.
- Third party oversight - Logistics and payment partners must meet DPDP standards.
DPDP for EdTech
EdTech platforms work with minors, which introduces strict restrictions under DPDP.
Core risks
- storing student profiles
- recording sessions with minors
- training AI systems with student data
- collecting behaviour data
- retention of conversations or classroom recordings
DPDP impact
- Guardian consent - All processing must be approved by a parent or guardian.
- Strict limitations - No tracking, profiling or targeted advertising is allowed for minors.
- Deletion - Old student records must be deleted when they are no longer required.
- AI governance - Any AI used for tutoring must be checked for sensitive output and retention.
DPDP for SaaS and Cloud Platforms
SaaS and cloud companies process the data of many customers. They act as processors but carry significant operational responsibility.
Core risks
- unknown personal data inside customer logs
- data reuse for analytics or model training
- cross border processing
- weak deletion workflows
- large volumes of third party requests
DPDP impact
- Processor limits - SaaS providers cannot use the data for their own purposes unless permitted by the customer.
- Deletion - SaaS providers must delete customer data when the customer stops using the service.
- Cross border - Cloud regions must fall within transfer locations permitted under Indian rules.
- Security safeguards - SaaS companies must implement strong controls and produce audit records during investigations.
Visual Flow Diagrams for DPDP Compliance
Many companies struggle to visualise how data should move under DPDP. The following diagrams use simple text blocks to illustrate correct flows.
Consent and Notice Architecture
User
provides data
after reading clear notice
and choosing consent
Clear Notice
shows purpose
shows retention logic
shows processing activity
shows withdrawal link
Consent Storage
records time
records purpose
records user identifier
records withdrawal history
Application System
checks consent
before using data
for any purpose
If user withdraws
stop processing
except for legal cases
update logs
This flow shows how the system must treat consent as a gate that controls all downstream activities.
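The consent gate can be sketched as a small check that refuses any processing without an active, purpose-specific consent record. The `ConsentStore` class and method names below are illustrative assumptions, not a prescribed implementation:

```python
from datetime import datetime, timezone

class ConsentStore:
    """In-memory consent ledger: records grants, purposes and withdrawal history."""

    def __init__(self):
        self._records = {}   # (user_id, purpose) -> list of (event, timestamp)

    def grant(self, user_id, purpose):
        self._records.setdefault((user_id, purpose), []).append(
            ("granted", datetime.now(timezone.utc)))

    def withdraw(self, user_id, purpose):
        self._records.setdefault((user_id, purpose), []).append(
            ("withdrawn", datetime.now(timezone.utc)))

    def is_active(self, user_id, purpose):
        """Consent is active only if the most recent event is a grant."""
        events = self._records.get((user_id, purpose), [])
        return bool(events) and events[-1][0] == "granted"

def process(store, user_id, purpose, action):
    """Consent gate: run `action` only if consent for this exact purpose is active."""
    if not store.is_active(user_id, purpose):
        raise PermissionError(f"no active consent for purpose: {purpose}")
    return action()

store = ConsentStore()
store.grant("u1", "delivery")
result = process(store, "u1", "delivery", lambda: "shipped")
store.withdraw("u1", "delivery")
# a further process(store, "u1", "delivery", ...) call would now raise PermissionError
```

The point of the gate is that withdrawal takes effect immediately for all downstream activity, while the ledger itself preserves the history that auditors may ask for.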
Data Life Cycle Under DPDP
Collection
only collect required data
only after notice
only after consent
Storage
encrypt data
restrict access
maintain logs
classify purpose
Use
use only for permitted purpose
track usage in logs
prevent cross purpose reuse
Transfer
send only to allowed countries
send only to permitted processors
record transfer logs
Retention
define how long data stays
review retention logic
avoid long term storage
Deletion
delete after purpose ends
delete after account closure
delete from processors
record deletion logs
This diagram unifies the DPDP requirements into a single structured life cycle.
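The retention and deletion stages of this life cycle can be driven by a simple schedule. The categories and day counts below are hypothetical examples, not values taken from any regulation:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: days each category may be kept
# after the purpose for which it was collected has ended.
RETENTION_DAYS = {
    "order_history": 365,
    "support_chat": 90,
    "marketing_preferences": 30,
}

def is_expired(category: str, purpose_ended_on: date, today: date) -> bool:
    """A record is due for deletion once its retention window has passed.

    Categories missing from the schedule fail safe: they are treated
    as immediately deletable so that unclassified data cannot linger.
    """
    limit = RETENTION_DAYS.get(category)
    if limit is None:
        return True
    return today > purpose_ended_on + timedelta(days=limit)
```

A scheduled job can walk each store, apply `is_expired` per record, delete what qualifies and write a deletion log entry for each purge.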
How companies can use these diagrams
These diagrams should be used during:
- engineering design sessions
- privacy reviews
- Data Protection Impact Assessments
- audits
- vendor evaluations
- AI system design
- API architecture reviews
They help companies understand how the law expects data to move and how their systems must be adjusted to match this structure.
Part 6: The DPDP Compliance Roadmap, Breach Playbook and Readiness Checklist
DPDP is not something a company can handle through policy documents alone. It requires a structured and phased approach that coordinates work across engineering, security, product, legal, data and support teams.
DPDP Compliance Roadmap
A phased approach that reduces confusion, prevents overload and ensures that teams progress in a predictable and controlled way.
The first 30 days: Discover and understand
The goal of the first month is clarity. Companies must figure out what data they have, where it flows and what risks exist.
Key actions:
Create a personal data inventory
- List all categories of personal data collected.
- Identify collection points such as apps, forms, web portals and support channels.
Map data flows
- Document which internal systems store or process personal data.
- Identify every API that transmits or receives personal data.
- Identify every third party processor that handles personal data.
Classify personal data
- Identify sensitive categories such as health information or financial information.
- Classify every category by risk and purpose.
Review all consent and notice points
- Check if current notices explain purpose, use, storage and withdrawal.
- Check if consent is clear and does not mislead the user.
Identify blind spots
- Look for unknown APIs, unmonitored data flows, AI model inputs and logs that contain personal data.
Prepare an internal DPDP readiness report
- Summarise gaps, risks and required changes.
This stage is foundation work. Without it, every later step becomes unreliable.
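The inventory, flow map and classification from this phase can live in one structured record per data category. The `DataAsset` shape below is a minimal sketch under assumed field names, not a mandated schema:

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    """One entry in the personal data inventory built during discovery."""
    category: str            # e.g. "phone_number", "health_report"
    collection_points: list  # apps, forms, portals where the data enters
    systems: list            # internal systems that store or process it
    processors: list         # third parties that receive it
    sensitive: bool = False  # health, financial and similar categories

def high_risk_assets(inventory):
    """Flag sensitive categories that leave the company via processors."""
    return [a.category for a in inventory if a.sensitive and a.processors]

inventory = [
    DataAsset("email", ["signup form"], ["crm"], []),
    DataAsset("health_report", ["patient portal"], ["ehr"],
              ["lab-partner"], sensitive=True),
]
```

Even a small helper like `high_risk_assets` turns the inventory from a static list into a tool that surfaces the gaps the readiness report must cover.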
The next 60 days: Fix and formalise
Once the discovery phase is complete, the company must begin fixing gaps and redesigning systems to comply with DPDP.
Key actions:
Redesign consent and notice architecture
- Ensure clear notice language.
- Ensure simple consent withdrawal.
- Ensure consent covers specific purposes rather than broad categories.
Implement purpose limitation
- Remove unnecessary collection fields.
- Remove unnecessary forwarding of personal data across services.
- Restrict personal data access to the teams that genuinely need it.
Establish retention rules
- Set clear retention periods for each data category.
- Document them inside a formal retention schedule.
Build deletion workflows
- Create automatic deletion jobs for expired data.
- Create manual deletion workflows for user initiated requests.
- Ensure deletion also happens at processors.
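A user initiated deletion must fan out across internal stores and processors, and every step should leave proof. This sketch uses placeholder store and processor names; the callables stand in for whatever delete APIs each system actually exposes:

```python
def delete_everywhere(user_id, internal_stores, processor_clients, log):
    """Delete a user's data internally, fan out to processors, record proof.

    internal_stores:   mapping of store name -> delete function
    processor_clients: mapping of processor name -> delete function
    log:               list that accumulates deletion-proof entries
    """
    for name, delete_fn in internal_stores.items():
        delete_fn(user_id)
        log.append(f"deleted {user_id} from {name}")
    for name, delete_fn in processor_clients.items():
        delete_fn(user_id)
        log.append(f"requested deletion of {user_id} at processor {name}")
    return log

# Illustrative wiring: the store and processor names are hypothetical
log = delete_everywhere(
    "user-42",
    {"orders_db": lambda uid: None, "analytics_store": lambda uid: None},
    {"email-provider": lambda uid: None},
    [],
)
```

The accumulated log doubles as the deletion record that audits and grievance responses will later depend on.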
Strengthen access controls
- Audit who can see personal data.
- Reduce unnecessary access.
- Introduce stronger authentication and role based restrictions.
Improve security posture
- Encrypt sensitive data.
- Monitor API traffic.
- Inspect AI system behaviour for unintentional exposure.
- Add anomaly detection for unusual data movement.
Update contracts with third party partners
- Include DPDP obligations.
- Include retention rules.
- Include deletion responsibilities.
- Include breach notification duties.
The goal of the 60 day phase is to transform DPDP principles into real system behaviour.
The final 90 days: Prove and operationalise
This phase focuses on preparing for audits, investigations and real world compliance events.
Key actions:
Create complete documentation
- Maintain processing records.
- Maintain consent logs.
- Maintain access logs.
- Maintain deletion logs.
- Maintain breach logs.
- Maintain DPIA reports if applicable.
Set up grievance handling
- Create a dedicated grievance channel.
- Train internal teams to respond within required timelines.
- Maintain proof of every response.
Test deletion workflows
- Run test deletions to confirm that data is removed from all internal systems and from processors.
Test breach reporting systems
- Conduct mock breach drills.
- Check if teams can collect required information within hours.
Implement continuous observability
- Monitor API calls in real time.
- Monitor AI agent actions.
- Monitor sensitive data movement.
- Monitor external transfers.
Prepare a DPDP compliance declaration
- Summarise how the company meets all obligations.
- Keep this ready for board review or government review.
At the end of 90 days, the company should be able to prove that its systems and workflows match DPDP expectations.
The DPDP Breach Playbook for the First 72 Hours
DPDP penalties increase sharply when companies do not respond correctly to breaches. Most damage happens because companies are slow to detect, slow to respond and slow to inform stakeholders. This playbook gives clear steps to follow within the first 72 hours after a breach is discovered.
The first hour: Confirm and contain
Identify the breach source
- Find the affected system or API.
- Check if personal data was exposed.
Stop the breach path
- Disable affected tokens.
- Isolate the system.
- Block access if required.
Secure logs
- Preserve evidence immediately.
- Do not alter logs.
- Do not restart systems without snapshotting.
Containment is the first priority.
The next twelve hours: Assess the scale
- Identify categories of personal data affected - Name, address, phone number, health data, financial data or other sensitive information.
- Estimate the number of affected individuals - Document the approximate impact.
- Analyse how the breach occurred - Was it an API failure, agent misuse, a server exposure, an access control error or a third party leak?
- Prepare internal reports - Summarise findings for leadership and legal teams.
This information will be required for the formal DPDP report.
The next twenty four hours: Notify the Data Protection Board
The DPDP Rules require prompt notification. Your report must include:
- nature of the breach
- category of personal data breached
- approximate number of affected persons
- containment actions taken
- planned remediation steps
- any mitigation offered to affected people
This report must be accurate and complete. Delays or incomplete information increase penalty risk.
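Keeping the report fields in a structured template makes it easy to check completeness before filing. The field names below are this guide's own shorthand for the contents listed above, not the Board's official form:

```python
from dataclasses import dataclass, asdict

@dataclass
class BreachReport:
    """Fields the notification to the Data Protection Board should capture."""
    nature: str                # what kind of breach occurred
    data_categories: list      # categories of personal data breached
    approx_affected: int       # approximate number of affected persons
    containment_actions: list  # steps already taken to contain the breach
    remediation_plan: list     # planned remediation steps
    mitigation_offered: str    # mitigation offered to affected people

def missing_fields(report: BreachReport) -> list:
    """Return the names of fields still empty, so gaps surface before filing."""
    return [k for k, v in asdict(report).items() if not v and v != 0]

report = BreachReport(
    nature="exposed API endpoint",
    data_categories=["email", "phone_number"],
    approx_affected=1200,
    containment_actions=["token revoked", "endpoint disabled"],
    remediation_plan=["patch authorisation check"],
    mitigation_offered="",
)
```

Running `missing_fields` in the reporting workflow catches an incomplete submission, which is exactly the failure mode that increases penalty risk.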
The next thirty six hours: Notify affected individuals if required
If the breach poses a significant risk to individuals, the company must inform them. The communication should include:
- what happened
- what personal data may be affected
- what immediate steps the user should take
- what actions the company is taking
- how the user can raise grievances
This communication must be simple, clear and free of technical jargon.
The next forty eight hours: Begin long term remediation
Fix the root cause
- Repair broken access control.
- Patch insecure APIs.
- Delete leaked data.
- Strengthen authentication.
- Update AI agent permissions.
Review logs for additional compromise
- Look for unusual patterns or actions that indicate deeper exposure.
Prepare long term safeguards
- Update internal policies.
- Update processor contracts.
- Update data flow diagrams.
- Update deletion logic.
- Update monitoring systems.
The final seventy two hours: Document and close
Prepare a complete report that includes:
- timeline of events
- root cause
- actions taken
- user communication
- impact assessment
- safeguards adopted
This report becomes part of the DPDP audit trail.
The DPDP Readiness Checklist
This checklist helps teams verify compliance across all essential areas. It can be used weekly or monthly as a quick audit tool.
Data discovery
- All personal data categories recorded
- All collection points identified
- All API flows mapped
- All third party processors listed
Consent and notice
- Notice written in simple language
- Consent tied to specific purpose
- Consent withdrawal available
- Consent logs maintained
Purpose and minimisation
- Purpose defined for each data category
- No unnecessary collection
- No unnecessary forwarding
- No cross purpose reuse
Retention and deletion
- Retention schedule documented
- Automatic deletion implemented
- Manual deletion workflow present
- Processor deletion ensured
- Deletion logs maintained
Access and security
- Role based access implemented
- Sensitive data encrypted
- Authentication strengthened
- Logs collected and protected
- Anomaly detection enabled
AI and model behaviour
- Inputs and outputs reviewed
- Vector memory managed
- Agent actions monitored
- Model logs inspected
- DPIA performed if required
Third party management
- Processor contracts updated
- Cross border transfers reviewed
- Processor safeguards verified
- Breach duties included
User rights and grievance
- Access, correction and deletion requests supported
- Clear grievance channel present
- SLA for responses defined
- All resolutions documented
Breach readiness
- Incident response plan available
- Breach contact team identified
- Reporting template prepared
- Mock drill performed
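To use this checklist as a weekly or monthly audit tool, teams can track item status in code and compute a readiness score. The two groups below are a small excerpt of the full checklist, with item names abbreviated by this guide:

```python
# Excerpt of the readiness checklist, keyed by area (names abbreviated)
CHECKLIST = {
    "data_discovery": ["categories_recorded", "collection_points_identified",
                       "api_flows_mapped", "processors_listed"],
    "consent_and_notice": ["plain_language_notice", "purpose_bound_consent",
                           "withdrawal_available", "consent_logs_maintained"],
}

def readiness(status):
    """status maps item name -> bool. Returns (done, total, open items)."""
    items = [item for group in CHECKLIST.values() for item in group]
    done = [item for item in items if status.get(item, False)]
    return len(done), len(items), [item for item in items if item not in done]

done, total, open_items = readiness({
    "categories_recorded": True,
    "api_flows_mapped": True,
})
```

The open-items list gives each review cycle a concrete work queue rather than a pass or fail verdict.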
Final Perspective: Building trust through clarity and responsibility
The DPDP Act and the 2024 and 2025 Rules represent a major shift in how India treats personal data. The law brings clarity, structure and accountability to a digital ecosystem that is growing fast and becoming more complex each year.
The most important insight from DPDP is simple. Personal data is a responsibility, not a resource. Companies must treat it with the same seriousness that they treat financial security, operational safety or customer trust.
When implemented correctly, DPDP brings several benefits:
- stronger customer trust
- safer digital products
- lower breach risk
- clearer internal processes
- better data quality
- more disciplined engineering practices
DPDP also encourages better design of AI systems and API infrastructures. When companies have visibility into their data flows, they can innovate without fear of hidden exposure.
Platforms that provide real time observability across APIs, data flows, AI agents and model interactions naturally support DPDP readiness because they give companies the clarity they need to prevent violations and maintain continuous compliance.
DPDP is not only a law. It is a framework for responsible innovation. Companies that embrace it build a stronger foundation for the future digital economy.