Search This Blog

Powered by Blogger.

Blog Archive

Labels

Showing posts with label Privacy Threats. Show all posts

Privacy Breach Rocks Australian Nightlife as Facial Recognition System Compromised

 

A significant privacy breach has shaken up the club scene in Australia, as a facial recognition system deployed across multiple nightlife venues became the target of a cyberattack. Outabox, the Australian firm responsible for the technology, is facing intense scrutiny in the aftermath of the breach, sparking widespread concerns regarding personal data security in the era of advanced surveillance. 

Reports indicate that sensitive personal information, including facial images and biometric data, has been exposed, raising alarms among patrons and authorities. As regulators rush to assess the situation and ensure accountability, doubts arise about the effectiveness of existing safeguards against such breaches. 

Outabox has promised full cooperation with investigations but is under increasing pressure to address the breach's repercussions promptly and decisively. Initially introduced as a safety measure to monitor visitors' temperatures during the COVID-19 pandemic, Outabox's facial recognition kiosks evolved to include identifying individuals in self-exclusion programs for gambling, showcasing the company's innovative use of technology. 

However, recent developments have revealed a troubling scenario with the emergence of a website called "Have I Been Outaboxed." Claiming to be created by former Outabox employees based in the Philippines, the site alleges mishandling of over a million records, including facial biometrics, driver's licenses, and various personal identifiers. 

This revelation highlights serious concerns regarding Outabox's security and privacy practices, emphasizing the need for robust data protection measures and transparent communication with both employees and the public. Allegations on the "Have I Been Outaboxed" website suggest that the leaked data includes a trove of personal information such as facial recognition biometrics, driver's licenses, club memberships, addresses, and more. 

The severity of this breach is underscored by claims that extensive membership data from IGT, a major supplier of gaming machines, was also compromised, although IGT representatives have denied this assertion. This breach has triggered a robust reaction from privacy advocates and regulators, who are deeply concerned about the significant implications of exposing such extensive personal data. 

Beyond the immediate impact on affected individuals, the incident serves as a stark reminder of the ethical considerations surrounding the deployment of surveillance technologies. It underscores the delicate balance between security imperatives and the protection of individual privacy rights.

Google DeepMind Researchers Uncover ChatGPT Vulnerabilities

 

Scientists at Google DeepMind, leading a research team, have adeptly utilized a cunning approach to uncover phone numbers and email addresses via OpenAI's ChatGPT, according to a report from 404 Media. This discovery prompts apprehensions regarding the substantial inclusion of private data in ChatGPT's training dataset, hinting at the risk of inadvertent information exposure. 

The researchers expressed astonishment at the success of their attack and emphasized that the vulnerabilities they exploited could have been identified earlier. They detailed their findings in a study, which is currently available as a not-yet-peer-reviewed paper. The researchers also mentioned that, to their knowledge, the notable frequency with which ChatGPT emits training data had not been observed before the release of this paper. 

Certainly, the revelation of potentially sensitive information represents merely a fraction of the issue at hand. As highlighted by the researchers, the broader concern lies in ChatGPT mindlessly reproducing extensive portions of its training data verbatim at an alarming rate. This susceptibility opens the door to widespread data extraction, possibly supporting the claims of incensed authors who contend that their work is falling victim to plagiarism. 

How Researchers Executed Their Attack? 

The researchers acknowledge that the attack is rather simple and somewhat amusing. To execute it, one just needs to instruct the chatbot to endlessly repeat a specific word, like "poem," and then let it do its thing. After a while, instead of repetitive behaviour, ChatGPT begins generating varied and mixed pieces of text, often containing substantial chunks copied from online sources. 

OpenAI introduced ChatGPT (Chat Generative Pre-trained Transformer) to the public on November 30, 2022. This chatbot, built on a robust language model, empowers users to shape and guide conversations according to their preferences in terms of length, format, style, level of detail, and language. 

According to the Nemertes enterprise AI research study for 2023-24, over 60% of the organizations surveyed were actively employing AI in production, and nearly 80% had integrated AI into their business operations. Surprisingly, less than 36% of these organizations had established a comprehensive policy framework to govern the use of generative AI.

Behind the Wheel, Under Surveillance: The Privacy Risks of Modern Cars

 


The auto industry is failing to give drivers control over their data privacy, according to researchers warning that modern cars are "wiretaps on wheels." An analysis published on Wednesday revealed that in an era when driving is becoming increasingly digital, some of the most popular car brands in the world are a privacy nightmare, collecting and selling personal information about their customers. 

According to the Mozilla Foundation's 'Privacy Not Included' survey, most major manufacturers admit to selling drivers' personal information, with half of those manufacturers saying they'd make it available without a court order to governments, law enforcement agencies, or the insurance company. 

Automobiles have become prodigious data-collection hubs since the proliferation of sensors - from telematics to fully digitalised control consoles - has enabled us to collect huge amounts of data about vehicles. 

The findings of a new study indicate that car brands intentionally collect "too much personal data" from drivers, which gives them little or no choice regarding what they want to share. In addition to automobiles, the new study also examined products from a wide variety of categories, including mental health apps, electronic entertainment devices, smart home devices, wearables, fitness products, and health and exercise products, among other categories. 

There is, however, one concern that the authors addressed when reviewing cars, namely that they found them to be the worst products in terms of privacy, calling them a "privacy nightmare". Mozilla Foundation Spokesperson Kevin Zawacki stated that cars were the first category to be reviewed in which all of the products were given the warning label "Privacy Not Included" in the privacy information. 

As reported by several different sources, all car brands are also said to be collecting a significant amount of personal information about their customers, with 84% sharing or selling their collected data. According to the study, car manufacturers are becoming tech manufacturers in order to collect data from their customers that can easily be shared or sold without their knowledge or permission, which is why privacy concerns are rising. 

Among other things, the data from the car includes super in-depth information about the car user, such as biometric information, medical information, genetic information, driving speeds, travel locations, and music preferences; among many other things. 

Taking care of your privacy is one of the most frustrating aspects of owning a car for several reasons. In addition to the fact that they collect too much personal information, as stated in the report, many automakers do the same. 

The report goes on to explain that every manufacturer does the same thing. From the way users interact with their cars to data from third parties such as Google Maps, this type of data can include many different kinds of information. 

Some cars can even collect data from the phones associated with them if they have an accompanying app. There is perhaps nothing worse about these kinds of privacy violations than the fact that there is no way for the user, unlike with devices like TVs, to opt out of them. 

As far as the user's data is concerned, 92% of car manufacturers do not allow them to have control over it - while only two car manufacturers allow the user to delete the data they have collected. Mozilla has identified no car company that has met its Minimum Security Standards, which include the very basics as well as such things as encrypted data. 

Caltrider mentioned that car buyers are limited to several options if they do not opt for a used, pre-digital model. Since 2017, Mozilla has studied a wide range of products - including fitness trackers, reproductive-health apps, smart speakers, and other connected home appliances - and since 2017, cars ranked lowest for privacy out of more than a dozen product categories. 

Is it Possible for Cars to Spy on Drivers? 

There has been a trend of automakers openly bragging about their cars being 'computers on wheels' for years to promote their advanced features, but these features have been especially augmented with the advent of the internet, which has transformed new cars into "powerful data-hungry machines," according to Mozilla. 

Nowadays, there are cameras mounted on both sides of the vehicle, microphones, and many other sensors that assist in monitoring driver activity. The companies that provide apps, maps, and connected services that combine with your phone collect or access your data when you pair the phone to the computer.

A lot of car buyers don't have many choices on the market today, other than opting for a used, pre-digital model, Caltrider told the Associated Press. She points out that automobile manufacturers seem to behave better in Europe, where the laws are tougher, and she believes the United States could pass similar laws if they wished. 

The Mozilla Foundation is hoping that raising awareness among consumers will raise awareness and fuel a backlash against companies that are guilty of the same kind of surveillance practices in their "smart" devices, as was the case with TV manufacturers during the 2010s. "Cars seem to have slipped under the radar in terms of privacy."