The draft Digital Personal Data Protection Rules, 2025 have led to confusion about whether all users will have to verify their age and identity to access online services.
2025/01/07 19:03
The draft Digital Personal Data Protection Rules, 2025 have led to confusion about whether all users will have to verify their age and identity to access online services. Under the Digital Personal Data Protection Act, 2023 (which the rules seek to operationalise), online platforms have to obtain verifiable parental consent before processing the data of anyone under 18 years of age.
The rules then elaborate that platforms have to verify the age and identity of anyone claiming to be a parent giving consent on behalf of a child. While the rules specify what platforms must do when an under-18 user declares themself to be a child and a parent comes forward, they do not address situations where a child enters false information and claims to be an adult.
In such cases, some, like MediaNama’s editor Nikhil Pahwa, argue that platforms will have to verify everyone’s age. Others, like Aparajita Bharti, co-founder of Quantum Hub Consulting, believe that as the rules currently read, companies could use self-declaration to determine whether the person signing up is a child. “The illustrations 1 & 2 seem to suggest if the user indicates they are a child then the platform has to take steps to gather verifiable parental consent,” she explained in a post on X (formerly Twitter).
Similarly, the founder of social media impact consulting firm Space2Grow also told MediaNama that the DPDP Rules “do not explicitly require mandatory age verification unless the user’s data triggers any sign of them being a child”.
How consent provisions could cause user drop-offs:
When a parent comes forward to give consent on behalf of a child, the platform has to verify their age and identity as well. The rules provide two ways in which platforms can approach this verification:
Bharti expressed concern that getting synchronous consent (consent right before a child uses the platform) will be operationally difficult. She explained that it would lead to “huge drop-offs (especially among low-income/rural households) and increased costs of compliance.” Talking about the synchronicity of consent, she said that there could be circumstances where the parent isn’t available to give consent when the child needs to access a specific online service.
During a Spaces discussion on X, Bharti explained that her organisation Young Leaders for Active Citizenship (YLAC) works with rural communities where there is a lot of shared device usage. “Children [in these communities] are way more sophisticated users of technology than their parents. Parents on the other hand ask children for help to navigate the tech world,” she explained. She said that while children do need to be safe online, cutting off their access to the internet is a bigger harm to them.
The grey area for establishing parent-child relationships:
One of the age verification scenarios under the rules is where a person comes forward identifying themself as a child’s parent. The platform then verifies the parent’s age and identity. However, the rules do not specify how platforms are to verify the parent-child relationship itself.
“The ‘due diligence’ methods expected of data fiduciaries to establish relationship with the minor is a grey area – in effect indicating that people have to surrender more data about themselves, their relationships, and online behaviour to either platforms or the government,” Nidhi Sudhan, co-founder of Citizen Digital Foundation, told MediaNama. According to her, the rules appear to favour the interests of businesses and the Government more than the people whose data they were meant to protect.
Other key comments about the rules:
Missing the opportunity for positive behavioral monitoring of children:
Besides requiring parental consent, the act also restricts platforms from tracking or behaviorally monitoring children. It says that the government can exempt certain platforms from these restrictions, as well as from the verifiable consent requirement, provided that they process a child’s data in a verifiably safe manner.
While the rules list a range of services that the government allows to carry out behavioral monitoring or exempts from verifiable consent, Sidharth Deb from Quantum Hub Consulting said that they “seem to miss out on an opportunity to incentivise positive/beneficial processing activities that can preserve meaningful internet experiences for under 18 users.” He added that the rules could have initiated a discussion around what standards companies must meet to qualify as verifiably safe, so that the Government allows them to curate digital products for under-18 users.
Lack of provision for vicarious consent:
The Data Protection Act says that companies can only process the personal data of an Indian citizen for purposes to which the citizen has specifically consented, or for legitimate uses specified under the act such as court orders, medical emergencies, epidemics, and employment. Bharti says that certain situations, such as sending gifts to friends or family, or preventing fraud, require vicarious consent.