
Meta Fined €91 Million by EU Privacy Regulator for Improper Password Storage


On Friday, Meta was fined €91 million ($101.5 million) by the European Union's primary privacy regulator for accidentally storing some user passwords without proper encryption or protection.

The investigation began five years ago when Meta informed Ireland's Data Protection Commission (DPC) that it had mistakenly saved certain passwords in plaintext format. At the time, Meta publicly admitted to the issue, and the DPC confirmed that no external parties had access to the passwords.

"It is a widely accepted practice that passwords should not be stored in plaintext due to the potential risk of misuse by unauthorized individuals," stated Graham Doyle, Deputy Commissioner of the Irish DPC.
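The practice Doyle refers to can be illustrated with a short sketch (this is a generic example, not Meta's actual system): instead of keeping the password itself, a service stores only a random salt and a slow key-derivation hash, and later verifies a login attempt by recomputing the hash and comparing in constant time.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; real systems tune this

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted hash; only the salt and hash are stored, never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash for a login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```

Because only the salt and digest are persisted, a leaked database does not directly expose passwords, which is precisely the protection missing when passwords are logged in plaintext.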

A Meta spokesperson said the company took swift action to resolve the error after it was detected during a 2019 security review, and that there is no evidence the passwords were misused or accessed improperly.

Throughout the investigation, Meta cooperated fully with the DPC, the spokesperson added in a statement on Friday.

Given that many major U.S. tech firms base their European operations in Ireland, the DPC serves as the leading privacy regulator in the EU. To date, Meta has been fined a total of €2.5 billion for violations under the General Data Protection Regulation (GDPR), which was introduced in 2018. This includes a record €1.2 billion penalty issued in 2023, which Meta is currently appealing.

Irish Data Protection Commission Halts AI Data Practices at X


The Irish Data Protection Commission (DPC) recently took a decisive step against the tech giant X, resulting in the immediate suspension of its use of personal data from European Union (EU) and European Economic Area (EEA) users to train its AI model, “Grok.” This marks a significant victory for data privacy, as it is the first time the DPC has taken such substantial action under its powers granted by the Data Protection Act of 2018. 

The DPC initially raised concerns that X’s data practices posed a considerable risk to individuals’ fundamental rights and freedoms. The use of publicly available posts to train the AI model was viewed as an unauthorized collection of sensitive personal data without explicit consent. This intervention highlights the tension between technological innovation and the necessity of safeguarding individual privacy. 

Following the DPC’s intervention, X agreed to cease its current data processing activities and committed to adhering to stricter privacy guidelines. Although the company did not acknowledge any wrongdoing, the outcome sends a strong message to other tech firms about prioritizing data privacy when developing AI technologies. The immediate halt of Grok's training on data from 60 million European users came in response to mounting regulatory pressure across Europe, with at least nine GDPR complaints filed over the collection period from May 7 to August 1.

After the suspension, Dr. Des Hogan, Chairperson of the Irish DPC, emphasized that the regulator would continue working with its EU/EEA peers to ensure compliance with GDPR standards, affirming the DPC’s commitment to safeguarding citizens’ rights. The DPC’s decision has broader implications beyond its immediate impact on X. As AI technology rapidly evolves, questions about data ethics and transparency are increasingly urgent. This decision serves as a prompt for a necessary dialogue on the responsible use of personal data in AI development.  

To further address these issues, the DPC has requested an opinion from the European Data Protection Board (EDPB) regarding the legal basis for processing personal data in AI models, the extent of data collection permitted, and the safeguards needed to protect individual rights. This guidance is anticipated to set clearer standards for the responsible use of data in AI technologies. The DPC’s actions represent a significant step in regulating AI development, aiming to ensure that these powerful technologies are deployed ethically and responsibly. By setting a precedent for data privacy in AI, the DPC is helping shape a future where innovation and individual rights coexist harmoniously.

TikTok Faces Massive €345 Million Penalty for Mishandling Kids' Data Privacy

TikTok has been fined €345 million (£296 million) for mishandling children's accounts, after failing to shield underage users' content from public view in violation of EU data protection law.

Ireland's data watchdog, which oversees the Chinese-owned video app across the EU, found that TikTok had violated multiple GDPR rules: it set child users' accounts to public by default; failed to give child users transparent information; allowed the "family pairing" option, which enables direct messaging for over-16s by linking a child's account to an adult's; and failed to properly assess the risks to children placed on the platform with public settings.

According to the decision published by the Irish Data Protection Commission (DPC), the app did not sufficiently protect children's personal information because it made accounts public by default and did not adequately address the risks of under-13 users gaining access to the platform.

In a statement released on Tuesday, the DPC said the company violated eight articles of the GDPR, the EU's primary data protection law. These provisions cover data processing broadly, from the lawful use of personal data to protecting it from unlawful processing.

On most children's accounts, profile settings were public by default, so anyone could see content posted there. The Family Pairing feature, intended to let a parent link to an older child's account and manage Direct Messages, in practice allowed any adult to pair with a child's account.

Nothing indicated to the child that this pairing could put them at risk. During registration and when posting videos, TikTok also failed to give child users the information it should have, instead relying on so-called "dark patterns" to nudge them toward more privacy-invasive options.

Separately, in April the UK's data regulator fined TikTok £12.7 million after finding it had unlawfully processed the data of 1.4 million children under 13 who used the platform without parental consent.

Despite its popularity, TikTok was said to have done "very little or nothing, if anything" to keep the platform's users safe. TikTok noted that the investigation examined its privacy set-up between 31 July and 31 December 2020, and said it has since addressed all of the issues raised.

Since 2021, all new and existing accounts belonging to 13- to 15-year-olds have been set to private by default, meaning only people the user has authorized can view their content. The DPC also noted that the European Data Protection Board (EDPB), a body made up of data protection regulators from EU member states, had overruled it on certain aspects of its draft decision.

At the German regulator's insistence, the DPC was required to include a proposed finding that the use of "dark patterns" – the term for deceptive website and app design that steers users toward particular behaviours or choices – violated the GDPR's provisions on the fair processing of personal data.

The Irish privacy regulator found that TikTok unlawfully made the accounts of users aged 13 to 17 public by default between July and December 2020, effectively allowing anyone to watch and comment on the videos those users posted.

Moreover, the company failed to adequately assess the risk of users under 13 gaining access to its platform through marketing channels. The decision also found that TikTok still steers teenagers who join the platform toward sharing their videos and accounts publicly through manipulative pop-ups.

The regulator ordered the company to change these misleading designs, also known as dark patterns, within three months to prevent further harm. In the second half of 2020, minors' accounts could also be linked to unverified adult accounts.

The video platform also failed to explain to teenagers, before their content and accounts were released to the general public, what the consequences of that exposure would be. The board of European regulators additionally voiced serious doubts about the effectiveness of TikTok's measures to keep under-13 users off its platform in the latter half of 2020.

The EDPB found that TikTok's age-verification mechanisms could be easily circumvented and that the company failed to check the ages of existing users "in a sufficiently systematic manner"; however, owing to a lack of information available during the cooperation process, it was unable to establish an infringement on this point.

The UK fine of £12.7 million (€14.8 million), issued in April, concerned TikTok allowing children under 13 to use the platform and processing their data. In 2021, the Dutch privacy authority also fined the company €750,000 for failing to provide a privacy policy in Dutch, a safeguard meant to protect Dutch children.