The Indian government does not want your data. It wants something more consequential. It wants to look under the hood of the device that never leaves your hand: your phone. The outcome? A standoff between the government and global smartphone makers. It is being described as a technical disagreement over source code. That description is soothing—and misleading.
This is not about a few lines of software. It is about whether the state intends to supervise the behaviour of technology, or insert itself into its design. According to Reuters, the government is considering mobile security standards that would require phone makers to store device logs for up to a year, notify authorities before major software updates, submit those updates for testing, and potentially share elements of source code. For its part, the government says consultations are ongoing and has denied seeking source code access. But documents reviewed by Reuters suggest otherwise. The gap between those positions is not procedural. It is philosophical.
Source code needs a plain explanation. It is not your messages, photos, or contacts. It is the instruction manual that engineers write to tell a phone how to function. Access to source code is access to the internal logic that governs how a device behaves.
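To make that concrete, here is a toy, entirely hypothetical snippet, invented for illustration and not drawn from any real phone's software. It shows what source code looks like: plain, human-written rules that the device follows.

```python
# Hypothetical example: a rule an engineer might write into a phone's
# power-management software. This is illustration only, not real code
# from any handset.
def on_low_battery(level: int) -> str:
    """Decide what the phone does when the battery runs low."""
    if level < 20:
        return "dim the screen"
    return "carry on"

print(on_low_battery(15))  # prints "dim the screen"
```

Whoever can read, and change, rules like this controls how the device behaves.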
From first principles, the state’s concern is not irrational. India has close to 750 million smartphones in use. These are wallets, identity tools, work terminals, and political megaphones rolled into one. Fraud is rising. Cyber vulnerabilities are real. A government that does nothing would rightly be accused of negligence.
The danger lies not in the objective, but in the method.
Modern digital security is built on speed. Vulnerabilities emerge constantly. Fixes work only if they are deployed fast. Any system that slows updates—even with good intentions—widens the window of exposure. This is not a hypothetical risk. When exploits are active, delays of days, even hours, can matter.
Sujit Janardanan’s concern begins earlier, and elsewhere. Janardanan, the CMO at Neysa Networks, points out that even before questions of surveillance or control arise, India still struggles with internet accessibility and affordability. That remains true even when viewed narrowly through telecommunications. Technologies like 5G were sold as platforms that would unlock last-mile innovation in areas such as agriculture and education. In practice, access, cost, and weak ecosystems have limited those outcomes.
Against that backdrop, Janardanan questions initiatives introduced without clearly stating what specific risk they are meant to solve, and why these tools are the right ones. Broad invocations of “user data privacy,” he argues, are not explanations. The absence of any global precedent only sharpens the unease.
Where Janardanan frames the issue as one of trust and adoption, Apar Gupta, founder director of the Internet Freedom Foundation (IFF), locates the problem in structure and law.
Gupta’s concern is not source code access in isolation; it is the system forming around it. Even if the state already has powers to surveil in specific cases, combining source code access with requirements such as pre-approval of operating system updates, advance notice of patches, long-term device logging, and restrictions on operating system modification changes the nature of that power. Surveillance shifts from targeted use to built-in capability. Scale becomes the point.
There is also a constitutional problem. Supreme Court jurisprudence requires intrusions into privacy to meet tests of legality, necessity, and proportionality. Measures embedded across the entire population, built into every handset, with unclear statutory footing and weak independent oversight, struggle to meet that bar. In practice, Gupta argues, limits become difficult to enforce.
Other democracies have drawn this line differently. Handset security is typically improved through standards, audits, and vulnerability disclosure—not by giving the state the ability to slow, gatekeep, or condition software updates, or mandate behavioural logging by design.
The worst risk, Gupta says, is not that the government “reads the code.” It is that access becomes a pathway to mandate weakening changes, delay critical patches, or impose compliance requirements that reduce real security while increasing state control. India’s own experience with spyware allegations and zero-click attacks has already shown how powerful vulnerabilities can be when abused—even by state actors.
The second-order effects are predictable. Devices that generate long-term behavioural logs invite self-censorship. Update delays weaken cybersecurity. Anti-rollback and anti-modification rules reduce user control, locking people into vendor- and state-approved software choices.
Taken together, the warning is stark. When trust erodes, users do not revolt. They retreat—quietly, rationally, and at scale. The real question, then, is not whether India should secure smartphones. It must.
The question is whether security will be built on speed, resilience, and accountability—or on control, pre-clearance, and architectural oversight. That decision will not remain confined to phones. It will shape how software is written, how trust is sustained, and how power is exercised in India’s digital economy.
And once a phone is designed for control, it does not easily relearn freedom.