Officers seized computers and other records; the pair is in police custody. On Sunday, the hospital confirmed the alleged leak, though exact details were not disclosed at the time. The hospital's chief executive, Dr Kenny Yuen Ka-ye, said that the data of a few patients had been given to a third party. An internal complaint a month ago prompted the investigation.
According to Dr Yuen, the hospital found that at least one doctor accessed a patient's personal data without permission, and it believes documents containing information about other patients may also have been exposed to the third party. Police said experts are working to determine how many patients were affected by the incident.
While the investigation is ongoing, the consultant has resigned and the associate consultant has been suspended. At the time of writing, the motivation behind the leak is not known. According to Yuen, every doctor has access to the clinical management system that holds patient information, but its use is permitted only on a strict “need-to-know” basis, either for research purposes or as part of the medical team caring for a patient.
The investigation revealed that the two doctors fit neither category, making their access a violation of hospital policy. According to a source cited by SCMP, the two doctors (both members of the surgery department) sent details of a female pancreatic cancer patient who died after a surgical operation.
The pair illegally accessed the information and sent it to the patient's family, urging them to file a complaint against the doctor who performed the operation, allegedly to demonstrate that doctor's incompetence.
The hospital has sent the case to the Office of the Privacy Commissioner for Personal Data, and has also reported the incident to the police and the Medical Council.
On 12 September, the EU Council will share its final assessment of the Danish version of what is known as “Chat Control.” The proposal has faced strong backlash, as it aims to introduce new mandates for all messaging apps based in Europe to scan users’ chats, including encrypted ones.
Belgium and the Czech Republic are now opposing the proposed law, with the former calling it “a monster that invades your privacy and cannot be tamed.” The other countries that have opposed the bill so far include Poland, Austria, and the Netherlands.
But the list of supporters is longer and includes important member states: Ireland, Cyprus, Spain, Sweden, France, Lithuania, and Italy.
Germany may abstain from voting, which would weaken the Danish push for the mandate.
Initially proposed in 2022, the Chat Control proposal is now close to becoming law. The vote is scheduled for 14 October 2025, and a majority of member states currently support it. If passed, it would allow the EU to mandate the scanning of users' chats, including encrypted ones.
The debate centers on the encryption provisions: apps such as Signal, WhatsApp, and ProtonMail use encryption to maintain user privacy and protect chats from unauthorized access.
If the bill passes, the files and messages shared through these apps could be scanned for child sexual abuse material (CSAM), although military and government accounts would be exempt from scanning. Critics argue this would damage user privacy and data security.
Although the proposal promises that encryption will be “fully protected,” tech experts and digital rights activists have warned that such scanning cannot be done without compromising encryption, which could also expose users to cyberattacks by threat actors.
Google announced a new step to make Android apps safer: starting next year, developers who distribute apps to certified Android phones and tablets, even outside Google Play, will need to verify their legal identity. The change ties every app on certified devices to a named developer account, while keeping Android’s ability to run apps from other stores or direct downloads intact.
What this means for everyday users and small developers is straightforward. If you download an app from a website or a third-party store, the app will now be linked to a developer who has provided a legal name, address, email and phone number. Google says hobbyists and students will have a lighter account option, but many independent creators may choose to register as a business to protect personal privacy. Certified devices are the ones that ship with Google services and pass Google’s compatibility tests; devices that do not include Google Play services may follow different rules.
Google’s stated reason is security. The company reported that apps installed from the open internet are far more likely to contain malware than apps on the Play Store, and it says those risks come mainly from people hiding behind anonymous developer identities. By requiring identity verification, Google intends to make it harder for repeat offenders to publish harmful apps and to make malicious actors easier to track.
The rollout is phased so developers and device makers can prepare. Early access invitations begin in October 2025, verification opens to all developers in March 2026, and the rules take effect for certified devices in Brazil, Indonesia, Singapore and Thailand in September 2026. Google plans a wider global rollout in 2027. If you are a developer, review Google’s new developer pages and plan to verify your account well before your target markets enforce the rule.
A similar compliance pattern already exists in some places. For example, Apple requires developers who distribute apps in the European Union to provide a “trader status” and contact details to meet the EU Digital Services Act. These kinds of rules aim to increase accountability, but they also raise questions about privacy, the costs for small creators, and how “open” mobile platforms should remain. Both companies are moving toward tighter oversight of app distribution, with the goal of making digital marketplaces safer and more accountable.
This change marks one of the most significant shifts in Android’s open ecosystem. While users will still have the freedom to install apps from multiple sources, developers will now be held accountable for the software they release. For users, it could mean greater protection against scams and malicious apps. For developers, especially smaller ones, it signals a new balance between maintaining privacy and ensuring trust in the Android platform.