A recent academic study has revealed alarming security gaps in global satellite communications, exposing sensitive personal, corporate, and even military information to potential interception. Researchers from the University of California, San Diego, and the University of Maryland discovered that a large portion of geostationary satellites transmit unencrypted data, leaving them open to eavesdropping by anyone with inexpensive receiving equipment.
Over a three-year investigation, the research team assembled an $800 receiver setup from readily available components and placed it on the roof of a university building in La Jolla, California. By aiming their dish at various satellites visible from their location, the team intercepted streams of data routinely transmitted from orbit to ground-based receivers. To their surprise, much of this information was sent without any encryption or other protective measures.
The intercepted traffic included mobile phone calls and text messages linked to thousands of users, in-flight Wi-Fi data from airlines, internal communications from energy and transportation systems, and certain military and law enforcement transmissions revealing positional details of personnel and assets. These findings demonstrate that many critical operations rely on satellite systems that fail to protect private or classified data from unauthorized access.
According to the researchers, nearly half of all geostationary satellite signals they analyzed carried unencrypted content. However, their setup could only access about 15 percent of the satellites in orbit, suggesting that the scale of exposure could be significantly higher. They presented their findings in a paper titled “Don’t Look Up,” which highlights how the satellite industry has long relied on the assumption that no one would actively monitor satellite traffic from Earth.
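As a concrete illustration of what "unencrypted" means in practice: properly encrypted payloads look statistically close to uniform random bytes, whereas plaintext protocols do not. The Python sketch below is a hypothetical, simplified triage check along those lines, not the researchers' actual tooling; it flags a payload as likely unencrypted when its byte entropy falls well below the roughly 8 bits per byte expected of ciphertext.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 for empty input)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def looks_unencrypted(payload: bytes, threshold: float = 7.2) -> bool:
    """Heuristic: well-encrypted (or compressed) data approaches 8 bits/byte,
    while plaintext protocols such as HTTP usually sit far lower.
    Only meaningful for payloads of at least a few kilobytes."""
    return shannon_entropy(payload) < threshold

# Illustrative payloads only -- not captured traffic.
plaintext = b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 100
random_like_ciphertext = os.urandom(4096)

print(looks_unencrypted(plaintext))               # True: low-entropy, readable ASCII
print(looks_unencrypted(random_like_ciphertext))  # False: near-uniform random bytes
```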
After identifying the vulnerabilities, the researchers spent months notifying affected organizations. Several companies, including major telecom providers, responded quickly by introducing encryption and tightening their satellite communications. Others, particularly operators of older or specialized systems, have yet to implement necessary protections.
Experts in cybersecurity have called the study a wake-up call for both industry and government agencies. They stress that satellite networks often act as the communication backbone for remote locations, from offshore platforms to rural cell towers, and unprotected data transmitted through these systems poses a serious privacy and security risk.
The findings underline the pressing need for standardized encryption protocols across satellite networks. As the reliance on space-based communication continues to grow, ensuring the confidentiality and integrity of transmitted data will be vital for national security, business operations, and personal privacy alike.
Prospect, one of the UK's leading trade unions, has revealed that it was seriously affected by a sophisticated cyberattack discovered in June 2025. The incident underscores how attacks against professional bodies are becoming ever more sophisticated and persistent. A significant portion of the compromised data is sensitive financial and personal information belonging to members of Prospect and of its affiliated union Bectu, a major body representing film and television professionals in the UK.
Prospect, a national organisation of close to 160,000 engineers, scientists, managers, and specialists at employers including BT Group, Siemens, and BAE Systems, disclosed that the breach involved a considerable amount of confidential member information. Preliminary findings indicate that the attackers had access to names, dates of birth, contact details, and bank account information, including sort codes, for over a year.
Moreover, it has been suggested that data relating to protected personal characteristics, including gender, race, religion, disability status, and employment status, may also have been compromised. A breach of this nature is not surprising given that unions and membership-based organisations increasingly rely on digital platforms to manage member records, communicate with members, and process subscriptions – all of which makes them attractive targets for cybercriminals seeking personal information in bulk.
In August, a 30-year-old developer from Aalborg, identified only as Joachim, built a platform called Fight Chat Control to oppose a proposed European Union regulation aimed at tackling the spread of child sexual abuse material (CSAM) online. The EU bill seeks to give law enforcement agencies new tools to identify and remove illegal content, but critics argue it would compromise encrypted communication and pave the way for mass surveillance.
Joachim’s website allows visitors to automatically generate and send emails to European officials expressing concerns about the proposal. What began as a weekend project has now evolved into a continent-wide campaign, with members of the European Parliament and national representatives receiving hundreds of emails daily. Some offices in Brussels have even reported difficulties managing the flood of messages, which has disrupted regular communication with advocacy groups and policymakers.
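Pre-filled email links of the kind such a site generates can be produced with nothing more than URL encoding. The Python sketch below is a minimal, hypothetical reconstruction of that mechanism; the recipient address, subject, and body are placeholders rather than anything taken from Fight Chat Control itself.

```python
from urllib.parse import quote

def build_mailto(recipients: list[str], subject: str, body: str) -> str:
    """Return a mailto: URL that opens the visitor's mail client with the
    recipients, subject, and body already filled in."""
    to_field = ",".join(recipients)
    return f"mailto:{to_field}?subject={quote(subject)}&body={quote(body)}"

# Placeholder recipient and wording -- illustrative only.
link = build_mailto(
    ["example.mep@europarl.europa.eu"],
    "Concerns about the proposed CSAM regulation",
    "Dear Member of the European Parliament,\n\n"
    "I am writing to express my concerns about the proposed regulation...\n",
)
print(link)
```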
The campaign’s influence has extended beyond Brussels. In Denmark, a petition supported by Fight Chat Control gained more than 50,000 signatures, qualifying it for parliamentary discussion. Similar debates have surfaced across Europe, with lawmakers in countries such as Ireland and Poland referencing the controversy in national assemblies. Joachim said his website has drawn over 2.5 million visitors, though he declined to disclose his full name or employer to avoid associating his workplace with the initiative.
While privacy advocates applaud the campaign for sparking public awareness, others believe the mass email tactic undermines productive dialogue. Some lawmakers described the influx of identical messages as “one-sided communication,” limiting space for constructive debate. Child rights organisations, including Eurochild, have also voiced frustration, saying their outreach to officials has been drowned out by the surge of citizen emails.
Meanwhile, the European Union continues to deliberate the CSAM regulation. The European Commission first proposed the law in 2022, arguing that stronger detection measures are vital as online privacy technologies expand and artificial intelligence generates increasingly realistic harmful content. Denmark, which currently holds the rotating presidency of the EU Council, has introduced a revised version of the bill and hopes to secure support at an upcoming ministerial meeting in Luxembourg.
Danish Justice Minister Peter Hummelgaard maintains that the new draft is more balanced than the initial proposal, stating that content scanning would only be used as a last resort. However, several EU member states remain cautious, citing privacy concerns and the potential misuse of surveillance powers.
As European nations prepare to vote, the controversy continues to reflect a broader struggle: finding a balance between protecting children from online exploitation and safeguarding citizens’ right to digital privacy.
Oura, the Finnish company known for its smart health-tracking rings, has recently drawn public attention after announcing a new manufacturing facility in Texas aimed at meeting the needs of the U.S. Department of Defense (DoD). The partnership, which has existed since 2019, became more widely discussed following the August 27 announcement, leading to growing privacy concerns among users.
The company stated that the expansion will allow it to strengthen its U.S. operations and support ongoing defense-related projects. However, the revelation that the DoD is Oura’s largest enterprise customer surprised many users. Online discussions on Reddit and TikTok quickly spread doubts about how user data might be handled under this partnership.
Concerns escalated further when users noticed that Palantir Technologies, a software company known for its government data contracts, was listed as a technology partner in Oura’s enterprise infrastructure. Some users interpreted this connection as a potential risk to personal privacy, particularly those using Oura rings to track reproductive health and menstrual cycles through its integration with the FDA-approved Natural Cycles app.
In response, Oura’s CEO Tom Hale issued a clarification, stating that the partnership does not involve sharing individual user data with the DoD or Palantir. According to the company, the defense platform uses a separate system, and only data from consenting service members can be accessed. Oura emphasized that consumer data and enterprise data are stored and processed independently.
Despite these assurances, some users remain uneasy. Privacy advocates and academics note that health wearables often operate outside strict medical data regulations, leaving gaps in accountability. Andrea Matwyshyn, a professor of law and engineering at Penn State, explained that wearable data can sometimes be repurposed in ways users do not anticipate, such as in insurance or legal contexts.
For many consumers, especially women tracking reproductive health, the issue goes beyond technical safeguards. It reflects growing mistrust of how private companies and governments may collaborate over sensitive biometric data. The discussion also highlights the shifting public attitude toward data privacy, as more users begin to question who can access their most personal information.
Oura maintains that it is committed to protecting user privacy and supporting health monitoring “for all people, including service members.” Still, the controversy serves as a reminder that transparency and accountability remain central to consumer trust in an age where personal data has become one of the most valuable commodities.
Meta’s Instagram, WhatsApp, and Facebook have once again been flagged as the most privacy-violating social media apps. According to Incogni’s Social Media Privacy Ranking report 2025, Meta and TikTok sit at the bottom of the list. Elon Musk’s X (formerly Twitter) also received poor marks in several categories, though it fared better than Meta in a few of them.
The report analyzed 15 of the most widely used social media platforms globally, measuring them against 14 privacy criteria organized into six different categories: AI data use, user control, ease of access, regulatory transgressions, transparency, and data collection. The research methodology focused on how an average user could understand and control privacy policies.
Discord, Pinterest, and Quora performed best in the 2025 ranking. Discord placed first, thanks to its policy of not handing over user data for AI model training. Pinterest ranked second on the strength of its user controls and relatively few regulatory penalties. Quora came third due to its limited collection of user data.
The Meta platforms, by contrast, were marked down heavily in several categories. Facebook was penalized for its record of regulatory fines, including GDPR penalties in Europe and sanctions in the US and other regions. Instagram and WhatsApp scored poorly because of policies allowing the collection of sensitive personal data, such as sexual orientation and health information.
X was penalized for its extensive data collection and past privacy fines, but it still ranked above Meta and TikTok in some categories. It was among the easiest platforms to delete an account from, and it provided information to government organizations at a lower rate than other platforms. Yet X allows user data to be used for training AI models, which lowered its overall privacy score.
“One of the core principles motivating Incogni’s research here is the idea that consent to have personal information gathered and processed has to be properly informed to be valid and meaningful. It’s research like this that arms users with not only the facts but also the tools to inform their choices,” Incogni said in its blog.