
Security Risks Discovered in Popular End-to-End Encrypted Cloud Storage Platforms

 

Recent cryptographic analysis by researchers at ETH Zurich has uncovered significant security vulnerabilities in five major end-to-end encrypted (E2EE) cloud storage platforms: Sync, pCloud, Icedrive, Seafile, and Tresorit. These platforms are collectively used by over 22 million people and are marketed as providing secure data storage. However, the study revealed that each of these platforms has exploitable flaws that could allow malicious actors to gain access to sensitive user data, manipulate files, or inject harmful data. The research was conducted under the assumption that a malicious attacker could control a server with full ability to read, modify, and inject data. 

This is a plausible scenario in the case of sophisticated hackers or nation-state actors. The researchers found that while these platforms promise airtight security and privacy through their E2EE models, their real-world implementation may fall short of these claims. Sync, for instance, exhibited critical vulnerabilities due to unauthenticated key material, which allows attackers to introduce their own encryption keys and compromise data. It was found that shared files could be decrypted, and passwords were inadvertently exposed to the server, compromising confidentiality. Attackers could also rename files, move them undetected, and inject folders into user storage. pCloud’s flaws were similar, with attackers able to overwrite private keys, effectively forcing encryption using attacker-controlled keys. 

This, coupled with public keys that were unauthenticated, granted attackers access to encrypted files. Attackers could also alter metadata, such as file size, reorder file chunks, or even inject files. Icedrive was shown to be vulnerable to file tampering due to its use of unauthenticated CBC encryption. Attackers could modify the contents of files, truncate file names, and manipulate file chunks, all without detection. Seafile also presented several serious vulnerabilities, including susceptibility to protocol downgrade attacks, which made brute-forcing passwords easier. The encryption used by Seafile was not authenticated, enabling file tampering and manipulation of file chunks. As with other platforms, attackers could inject files or folders into a user’s storage space. 
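The common thread in the Icedrive and Seafile findings is unauthenticated encryption: ciphertext that can be modified without the owner's key, and with no way to detect the change. A minimal sketch of the idea, using a toy XOR cipher rather than any of the platforms' actual schemes, shows how an attacker who knows the message layout can flip bits in ciphertext, and how an encrypt-then-MAC tag catches it:

```python
import hashlib, hmac, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream built from SHA-256 in counter mode -- for illustration
    # only, not a vetted cipher.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so this both encrypts and decrypts.
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key, nonce = os.urandom(32), os.urandom(16)
message = b"transfer $100 to Alice"
ciphertext = xor_encrypt(key, nonce, message)

# Without authentication, an attacker can flip ciphertext bits to change
# the decrypted plaintext -- no key required.
pos = message.index(b"Alice")
tampered = bytearray(ciphertext)
for i, (old, new) in enumerate(zip(b"Alice", b"Mallo")):
    tampered[pos + i] ^= old ^ new
assert xor_encrypt(key, nonce, bytes(tampered)) == b"transfer $100 to Mallo"

# Encrypt-then-MAC catches the tampering: the tag no longer verifies.
tag = hmac.new(key, ciphertext, hashlib.sha256).digest()
bad = hmac.new(key, bytes(tampered), hashlib.sha256).digest()
assert not hmac.compare_digest(tag, bad)
```

The fix the researchers implicitly point to is exactly this: authenticate the ciphertext (with a MAC or an AEAD mode such as AES-GCM) so that any server-side modification is detected on decryption.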

Tresorit fared slightly better than its peers, but still had issues with public key authentication, where attackers could potentially replace server-controlled certificates to gain access to shared files. While Tresorit’s flaws didn’t allow direct data manipulation, some metadata was still vulnerable to tampering. The vulnerabilities discovered by the ETH Zurich researchers call into question the marketing promises made by these platforms, which often advertise their services as providing the highest level of security and privacy through end-to-end encryption. In light of these findings, users are advised to exercise caution when trusting these platforms with sensitive data, particularly in cases where the server is compromised.  

The researchers notified Sync, pCloud, Seafile, and Icedrive of their findings in April 2024, while Tresorit was informed in late September 2024. Responses from the vendors varied. Icedrive declined to address the issues, Sync is fast-tracking fixes, and Tresorit is working on future improvements to further safeguard user data. Seafile has promised to patch specific vulnerabilities, while pCloud had not responded as of October 2024. While no evidence suggests that these vulnerabilities have been exploited, the flaws are nonetheless concerning for users who rely on these platforms for storing sensitive data. 

The findings also emphasize the need for ongoing scrutiny and improvement of encryption protocols and security features in cloud storage solutions, as even end-to-end encryption does not guarantee absolute protection without proper implementation. As more people rely on cloud storage for personal and professional use, these discoveries are a reminder of the importance of choosing platforms that prioritize transparent, verifiable security measures.

Why You Should Clear Your Android Browser’s Cache and Cookies



The web browser on your Android device, whether it's Google Chrome, Mozilla Firefox, or Samsung Internet, stores a variety of files, images, and data from the websites you visit. While this data can help sites load faster and keep you logged in, it also accumulates a lot of unnecessary information. This buildup can pose privacy risks.

Over time, your browser’s cookies and cache collect a lot of junk files. Some of this data comes from sites you’ve visited only once, while others track your browsing habits to serve targeted ads. For example, you might see frequent ads for items you viewed recently. Clearing your cache regularly helps eliminate this unnecessary data, reducing the risk of unknown data trackers lurking in your browser.

Though clearing your cache means you’ll have to log back into your favourite websites, it’s a small inconvenience compared to the benefit of protecting your privacy and freeing up storage space on your phone.

How to Clear Cookies and Cache in Google Chrome

To clear cookies and cache in Google Chrome on your Android device, tap the More button (three vertical dots) in the top right corner. Go to History and then Delete browsing data. Alternatively, you can navigate through Chrome’s Settings menu to Privacy and Security, and then Delete browsing data. You’ll have options under Basic and Advanced settings to clear browsing history, cookies and site data, and cached images and files. You can choose a time range to delete this data, ranging from the past 24 hours to all time. After selecting what you want to delete, tap Clear data.

How to Get Rid Of Unnecessary Web Files in Samsung Internet

For Samsung Internet, there are two ways to clear your cookies and cache. In the browser app, tap the Options button (three horizontal lines) in the bottom right corner, then go to Settings, and select Personal browsing data. Tap Delete browsing data to choose what you want to delete, such as browsing history, cookies, and cached images. Confirm your choices and delete.

Alternatively, you can clear data from the Settings app on your phone. Go to Settings, then Apps, and select Samsung Internet. Tap Storage, where you’ll find options to Clear cache and Clear storage. Clear cache will delete cached files immediately, while Clear storage will remove all app data, including cookies, settings, and accounts.

How to Declutter in Mozilla Firefox

In Mozilla Firefox, clearing cookies and cache is also straightforward. Tap the More button (three vertical dots) on the right of the address bar, then go to Settings and scroll down to Delete browsing data. Firefox offers options to delete open tabs, browsing history, site permissions, downloads, cookies, and cached images. Unlike Chrome, Firefox does not allow you to select a time range, but you can be specific about the types of data you want to remove.

Firefox also has a feature to automatically delete browsing data every time you quit the app. Enable this by going to Settings and selecting Delete browsing data on quit. This helps keep your browser tidy and ensures your browsing history isn’t accessible if your phone is lost or stolen.

Regularly clearing cookies and cache from your Android browser is crucial for maintaining privacy and keeping your device free from unnecessary data. Each browser—Google Chrome, Samsung Internet, and Mozilla Firefox—offers simple steps to manage and delete this data, boosting both security and performance. By following these steps, you can ensure a safer and more efficient browsing experience on your Android device.


The Role of Immutable Data Storage in Strengthening Cybersecurity


 

In today’s rapidly advancing digital world, how organisations store their data is crucial to their cybersecurity strategies. Whether protecting sensitive customer information, securing intellectual property, or ensuring smooth business operations, effective data storage methods can prominently impact an organisation's defence against cyber threats.

Modern businesses are experiencing a massive increase in data generation. This surge is driven by technological innovation, growing customer interactions, and expanding business operations. As data continues to grow at an exponential rate, organisations must find ways to fully utilise this data while also ensuring its security and availability.

Cyberattacks are becoming more frequent and sophisticated, making data protection a top priority for businesses. Ransomware attacks, in particular, are a major concern. These attacks involve cybercriminals encrypting an organisation’s data and demanding a ransom for its release. According to the Verizon 2023 Data Breach Investigations report, ransomware is involved in over 62% of incidents linked to organised crime and 59% of financially motivated incidents. The consequences of such attacks are severe, with businesses taking an average of 9.9 days to return to normal operations after a ransomware incident. Additionally, 1 in 31 companies worldwide faces weekly ransomware attacks, underscoring the urgent need for robust data protection measures.

Immutable data storage has become a key strategy in bolstering cybersecurity defences. Unlike traditional storage methods, which allow data to be modified or deleted, immutable storage ensures that once data is written, it cannot be altered or erased. This feature is crucial for maintaining data integrity and protecting critical information from tampering and unauthorised changes.

By adopting immutable storage solutions, organisations can significantly reduce the risks associated with cyberattacks, particularly ransomware. Even if attackers manage to penetrate the network, the immutable data remains unchanged and intact, rendering ransom demands ineffective. This approach not only protects sensitive information but also helps maintain business continuity during and after an attack.
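The contract immutable storage enforces can be sketched in a few lines. The toy in-memory class below only illustrates WORM (write-once-read-many) semantics; real products enforce the guarantee at the storage layer, typically with retention policies:

```python
import hashlib, time

class WormStore:
    """Minimal sketch of write-once-read-many (WORM) semantics, the core
    idea behind immutable storage. Illustration only, not a product API."""

    def __init__(self):
        self._objects = {}  # name -> (data, sha256 digest, written_at)

    def put(self, name: str, data: bytes) -> str:
        if name in self._objects:
            # The immutability guarantee: existing objects cannot be rewritten,
            # so ransomware cannot encrypt a backup in place.
            raise PermissionError(f"{name!r} is immutable and cannot be overwritten")
        digest = hashlib.sha256(data).hexdigest()
        self._objects[name] = (data, digest, time.time())
        return digest

    def get(self, name: str) -> bytes:
        data, digest, _ = self._objects[name]
        # Verify integrity on every read: silent tampering is detectable.
        assert hashlib.sha256(data).hexdigest() == digest
        return data

    def delete(self, name: str) -> None:
        raise PermissionError("deletion is blocked during the retention period")

store = WormStore()
store.put("backup-2024-10.tar", b"nightly backup contents")
try:
    store.put("backup-2024-10.tar", b"attacker-encrypted garbage")
except PermissionError as exc:
    print("blocked:", exc)
```

Because the overwrite and delete paths simply do not exist for committed objects, an attacker who compromises the network still cannot destroy the backed-up data, which is what renders the ransom demand ineffective.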

As businesses continue to face the growing threat of cybercrime, adopting advanced data storage solutions like immutable storage is essential. By ensuring that data cannot be altered or deleted, organisations can better protect themselves from the devastating impacts of cyberattacks, safeguard critical information, and maintain operations without interruption. In an age where data is both a valuable asset and a prime target, robust storage strategies are indispensable to a comprehensive cybersecurity strategy.



Here's How to Solve Top Challenges in Data Storage

 

Data volumes are not only expanding, but also accelerating and diversifying. According to recent IDG research, data professionals state that data volumes are rising by 63 percent every month on average in their organisations. The majority of these organisations also collect data from 400 or more sources; 20% of respondents report having over 1,000 data sources. 

The result is an increasing demand for dependable, scalable storage. Companies want systems that can do more than just store data in an IT ecosystem informed by evolving compliance, agility, and sustainability requirements. Here are three of the most common data storage challenges, along with how suitable remedies can help. 

Top three challenges in data storage 

While more data opens up greater options for analytics and insight, the sheer volume of data collected and stored by companies creates issues. The three biggest problems are security, complexity, and efficiency. 

Companies require storage security frameworks that prioritise cyber resilience, as cyberattacks are inevitable. According to Ben Jastrab, director of storage product marketing at Dell Technologies, “this is such a big topic, and such an important one. Every company in every industry is worried.” A zero-trust framework built on least-privilege principles and advanced detection technologies can help businesses identify storage attacks and minimise the damage done. 

Storage faces additional challenges as complexity increases. IT teams can easily become overwhelmed when it comes to purchasing, maintaining, and replacing physical hardware, as well as adopting, monitoring, and upgrading storage software. "Companies have more things to manage than ever," explains Jastrab. "To make the most of storage, they need to automate operations.” 

More data, less time. Rising expenses and pressure to cut costs. Higher demands and a smaller pool of skilled staff. These common challenges share a unifying thread: efficiency. Companies that can increase the efficiency of their storage solutions will be better prepared to manage the ever-changing storage landscape. 

Consider recent data from the United States Energy Information Administration, which estimates that wholesale power rates will be 20% to 60% higher this winter than in 2022. As storage volumes grow, companies need solutions that cut physical footprints and energy costs.

Global Businesses Navigate Cloud Shift and Resurgence in In-House Data Centers

In recent times, businesses around the world have been enthusiastically adopting cloud services, with a global expenditure of almost $230 billion on public cloud services last year, a significant jump from the less than $100 billion spent in 2019. The leading players in this cloud revolution—Amazon Web Services (AWS), Google Cloud Platform, and Microsoft Azure—are witnessing remarkable annual revenue growth of over 30%. 

What is interesting is that these tech giants are now rolling out advanced artificial intelligence tools, leveraging their substantial resources. This shift hints at the possible decline of traditional on-site company data centers. 

Let’s Understand First What is In-House Data Center 

An in-house data center refers to a setup where a company stores its servers, networking hardware, and essential IT equipment in a facility owned and operated by the company, often located within its corporate office. This approach was widely adopted for a long time. 

The primary advantage of an in-house data center lies in the complete control it provides to companies. They maintain constant access to their data and have the freedom to modify or expand on their terms as needed. With all hardware nearby and directly managed by the business, troubleshooting and operational tasks can be efficiently carried out on-site. 

Are Companies Rolling Back? 

Despite the shift towards cloud spending surpassing in-house investments in data centers a couple of years ago, companies are still actively putting money into their own hardware and tools. According to Synergy Research Group, a team of analysts, these expenditures crossed the $100 billion mark for the first time last year. 

In particular, many businesses are discovering the advantages of on-premises computing. Notably, a significant portion of the data generated by their increasingly connected factories and products, expected to soon surpass data from broadcast media or internet services, will remain on their own premises. 

While the public cloud offers convenience and cost savings due to its scale, there are drawbacks. The data centers of major cloud providers are frequently located far from their customers' data sources. Moving this data to where it's processed, sometimes halfway around the world, and then sending it back takes time. While this is not always crucial, as not all business data requires millisecond precision, there are instances where timing is critical. 

What Technology Are Global Companies Adopting? 

Manufacturers are creating "digital twins" of their factories for better efficiency and problem detection. They analyze critical data in real-time, often facing challenges like data transfer inconsistencies in the public cloud. To address this, some companies maintain their own data centers for essential tasks while utilizing hyperscalers for less time-sensitive information. Industrial giants like Volkswagen, Caterpillar, and Fanuc follow this approach. 

Businesses can either build their own data centers or rent server space from specialists. Factors like rising costs, construction delays, and the increasing demand for AI-capable servers impact these decisions. Hyperscalers are expanding to new locations to reduce latency, and they're also providing prefabricated data centers. Despite the cloud's appeal, many large firms prefer a dual approach, maintaining control over critical data.

Datafication: What is it and Why is it Important?


While the word ‘datafication’ may sound like jargon destined to be forgotten as the years go by, it actually holds much importance and affects individuals significantly. And it is not the same as ‘digitization.’

But before reaching that conclusion, let us understand the difference between the two:

What is Digitization? 

Digitization is the process of converting physical records, such as those found in books, videotapes, and CDs, into digital form. Digital records are much easier to store, back up, share, edit, and analyze.

When computers were still a fresh invention, this was a big transition, since a lot of data was created and kept in “hard copy.”

Digitization has held its importance since its introduction, as it has helped many people declutter their lives by digitizing the paper notes and records they had hoarded. 

However, digitization is no longer as crucial a trend as it used to be, given that most data is now created in a digital format from the start.

And here is when Datafication kicks in.

What Is Datafication? 

"Datafication" is the process of turning previously qualitative aspects of life into quantitative data that can be collected and analyzed. It is distinct from digitization precisely because so much data now originates and exists in digital form.

Why Is Datafication Important? 

Datafication is significant because it reflects a trade-off between internet users and the businesses that sell us products and services. The better those businesses understand us, the more they can improve the services they offer.

This can, in turn, be a win-win situation: better data makes for better online experiences, and the better those experiences are, the more likely users are to support the businesses that make the websites they use available to them.

Future of Datafication

The main reason datafication is so crucial in today's world is that people spend more of their time online than ever before. Not only that, the devices used to access online services are becoming increasingly ubiquitous.

On a home desktop, information can be gathered from an online experience. With access to its user's camera, location, and other features, a mobile device may capture considerably more data. Additionally, many people use mobile devices more frequently than desktop PCs.

These trends are highly likely to continue. How personal data is stored and managed, however, may determine whether a loss of privacy is a natural consequence of an increase in data.

Currently, cookie settings allow users to customize the extent to which they exchange data for online convenience, but many aspects remain beyond the average user's control. This leaves developments in datafication a significant theme of conversation to look forward to.

What Must You Do Before Uploading Your Sensitive Data to the Cloud?


Cloud storage has emerged as a prominent tool for managing and storing users' data. Before cloud storage was established more than a decade ago, the two most popular methods for backing up documents or transferring them between devices were emailing individual files to yourself, or saving them to an external drive and physically moving it from one computer to another. 

But data storage has since witnessed a massive breakthrough, thanks to cloud storage solutions. Prominent services like Google Drive, Microsoft OneDrive, Dropbox, and Apple iCloud Drive have made it dead simple to back up, store, and keep our documents synced across devices. 

However, this convenience comes at a cost to privacy. When we use any of the Big 4's cloud services, we theoretically give them, or anybody who can hack them, access to whatever we keep on their cloud, including our financial and health information, as well as our photos, notes, and diaries. 

One of the major reasons user privacy is at stake is that all four prominent cloud service providers only minimally encrypt documents during upload and storage. Because these documents are not end-to-end encrypted, the user is not the only one with the ability to decrypt them. 

Minimal encryption means that the service provider also holds a key to decrypt users' documents, and is capable of doing so at all times. Moreover, in some severe instances, a hacker might get hold of the decryption key as well. 

Out of the four major cloud services, Apple is the only provider with Advanced Data Protection for iCloud, launched recently, which lets users choose to have their documents end-to-end encrypted when stored in iCloud Drive. This leaves Apple with no access to the files, ensuring the user's privacy. However, this setting is still optional; the merely encrypted iCloud Drive remains the default. 

Since the remaining three major cloud storage providers have yet to offer users the choice of end-to-end encryption, and considering the explosive growth of personal cloud services in recent years, billions of users are currently at risk of having their sensitive documents exposed to third parties. 

Encrypt First, Then Upload to the Cloud 

It is possible to use the popular cloud storage services while preventing anyone who gains access to your account from seeing the files stored there: encrypt those files before uploading them. The best part? You do not need to be a computer scientist or security developer to do so. With the numerous applications available for free, anyone can encrypt a file on their own. 

What is Encrypto?

One such well-known encryption program is Encrypto, developed by a company called MacPaw. You can drag a file into the program, give it a password, and encrypt it using industry-standard AES-256 encryption. The software then lets you save an encrypted version of the file (with a .crypto file type). 

After encrypting the files, the user can upload the encrypted version to their preferred cloud storage provider rather than the original file containing sensitive data. If the cloud storage is then compromised, the attacker should be unable to open the .crypto file without knowing the password the user has set for it. 

Encrypto is a cross-platform tool that works on both Macs and Windows PCs, despite the fact that MacPaw is known for producing Mac-specific utility apps. To open sensitive documents that have been encrypted with Encrypto and sent over email, the recipient merely needs to download the free Encrypto app (and you need to let them know the password, of course). 

Another nice feature is that the app lets users set a different password for each file they create. One can even include a password hint in the encrypted file as a reminder of which password was used. Users are advised to choose a password that would be difficult to guess or to crack through brute force. 
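Per-file passwords resist brute force only if turning a password into an encryption key is made deliberately slow. Here is a sketch of that step using PBKDF2 from Python's standard library; the exact key-derivation scheme Encrypto uses is not stated here, so treat this purely as an illustration:

```python
import hashlib, os

password = b"correct-horse-battery-staple"  # hypothetical per-file password
salt = os.urandom(16)       # random salt, stored alongside the encrypted file
iterations = 600_000        # a high count makes each brute-force guess costly

# Derive a 32-byte key -- the right size for AES-256 -- from the password.
key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)
print(len(key))  # 32

# The same password and salt always yield the same key...
assert key == hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)
# ...while even a one-character change yields an unrelated key.
assert key != hashlib.pbkdf2_hmac("sha256", b"correct-horse-battery-stapl3",
                                  salt, iterations, dklen=32)
```

The iteration count is the brute-force defence: each guess an attacker makes costs 600,000 hash computations, so a long, hard-to-guess password combined with a slow derivation makes cracking the .crypto file impractical.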

This being said, no matter the choice of app, encrypting files yourself before uploading them to Google Drive, Microsoft OneDrive, Dropbox, or iCloud Drive adds an additional layer of encryption and security to your sensitive data while still letting you reap the numerous benefits of cloud storage.  

A Catastrophic Mutating Event Will Strike the World in 2 Years, Claims WEF


The World Economic Forum (WEF) in Davos, Switzerland has come up with its set of predictions for 2023. The latest report warns of a global catastrophic cyber event in the near future. 

The WEF Annual Meeting includes government leaders, businesses, and civil society addressing the state of the world, while also discussing the priorities of the year ahead. 

“The most striking finding that we’ve found is that 93 percent of cyber leaders, and 86 percent of cyber business leaders, believe that the geopolitical instability makes a catastrophic cyber event likely in the next two years. This far exceeds anything that we’ve seen in previous surveys,” says WEF managing director Jeremy Jurgens during a presentation, highlighting the WEF Global Security Outlook Report 2023. 

Adding to the unpredictability of the turn of events, Jurgens cited a recent cyberattack that was intended to disable Ukrainian military capabilities but inadvertently also shut down a portion of the production of energy across Europe. 

In this regard, Jürgen Stock, Secretary-General of Interpol, says that “This is a global threat[…]It calls for a global response and enhanced and coordinated action.” 

According to him, the increasing profits that various bad actors reap from cybercrime should push world leaders to work collaboratively, making cybercrime a top priority as they face "new sophisticated tools." 

Albania Set to Combat Cybercrime 

Albania, which recently experienced a significant cyberattack, is now collaborating with larger allies to thwart the criminals, acting as a sort of laboratory for people to understand what is to come. 

During the presentation, Edi Rama, the Prime Minister of Albania, illustrated the industry's growth: from $3 trillion in 2015 to an anticipated $10.5 trillion in 2025. This, according to Rama, means that if cybercrime were a state, it would have the third-largest economy in the world after the U.S. and China. 

Expected Cybercrime Trends in the Next Two Years 

Cyber threats are evolving at a faster rate, with the cybercrime underground turning into an organized ecosystem. To combat these threats effectively, it is essential to stay up to date on the trends in cybercrime, which will eventually shape its future status in the cyber world. 

Here, we are listing some of the trends that are likely to be prevalent in cybercrime tactics in the coming years: 

  • Artificial Intelligence/ Machine Learning 

AI and machine learning have the ability to boost attack automation, speed, frequency, and efficiency while also enabling the possibility of targeted attacks that are specifically aimed at particular groups. They might also speed up cyber detection, protection, and recovery systems from a cybersecurity perspective. 

  • Computing and Data Storage Technology 

The innovation and immense usage of computing and data storage technologies across all sectors and services will give threat actors more chances to exploit systems, gain unauthorized access, and disseminate illicit data. 

  • Blockchain and Distributed Ledger Technologies (DLTs) 

Because transactions are digitalized and processed by DLTs, they could be manipulated for nefarious purposes, such as being blocked from processing. DLTs may also be used to store inappropriate or disruptive content that is difficult to get rid of. 

  • Botnets and Automated Malware Deployment Tools 

The rapid expansion of the Internet of Things (IoT), which is connecting more and more devices to the internet, is also giving threat actors a massive opportunity to conduct malicious activities. The increasing availability of bots and automated malware deployment tools has aided attackers as well. These inexpensive and easy-to-use tools lower the skill barrier for launching attacks. 

3-2-1 Backup Strategy: How Does It Work?

 


It is well known that the majority of businesses are aware of the importance of backing up their data regularly. A sound data storage policy is arguably one of the most significant safeguards a business has: it not only protects against ransomware, but can also prevent data loss in the case of a single hardware failure. 

Simply having multiple copies of data, however, is not always enough to guarantee data security. If both copies are stored on the same server, a single incident could delete them both. 

The 3-2-1 backup strategy is one of the most effective ways to protect critical files, and anyone can follow it. Why should you care about it, and what benefits can it provide for your business? 

What Is the 3-2-1 Backup Strategy?


The 3-2-1 backup strategy is a method of storing data: keep three copies of your data, on two different types of storage, with one copy held off-site. It is designed to protect data in the event of a security breach or a natural disaster that could cause data loss. Storing two copies on different types of storage guards against a single device failure, and keeping one copy off-site ensures you retain your data if anything goes wrong at the primary location.

Consequently, it is much more difficult for a single event, such as a large outbreak of malicious software, to cause data loss. Below you will find a step-by-step guide on how each of these steps should be carried out.

Make sure you have three copies of your data


Three copies are considered necessary to ensure that your data can always be recovered in case something goes wrong. Generally, you should keep one primary copy, which is easy to access, and two copies to serve as backups. This makes three copies in total.
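Taken together, the steps of the rule can be sketched as a short script. The path names below are hypothetical placeholders for a second storage type and an off-site location:

```python
import pathlib, shutil, tempfile

def backup_321(source: pathlib.Path, second_medium: pathlib.Path,
               offsite: pathlib.Path) -> None:
    """Sketch of the 3-2-1 rule. The working original on the primary disk is
    copy 1; `second_medium` stands in for a different storage type (e.g. a
    NAS mount) and `offsite` for a remote location (e.g. a mounted cloud
    bucket). Both destination paths are illustrative placeholders."""
    for target in (second_medium, offsite):
        target.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, target / source.name)  # copy file with metadata

# Demo with temporary directories standing in for the real mounts.
root = pathlib.Path(tempfile.mkdtemp())
original = root / "ledger.db"
original.write_bytes(b"critical business records")
backup_321(original, root / "nas", root / "offsite")
print((root / "offsite" / "ledger.db").read_bytes())  # b'critical business records'
```

A real deployment would run a script like this on a schedule and point the off-site target at genuinely remote storage; the point of the sketch is simply that all three copies exist on independent media after one pass.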

Make use of two different storage devices


When all of your data is stored on the same type of storage device, it is more likely that every copy will fail at the same time. One way to reduce this likelihood is to store data on at least two different kinds of storage. Various storage types are available, including hard drives, network-attached storage, tape drives, and cloud storage.

Make sure you keep a copy off-site as well


No matter how many copies of your data you have, if they are all stored in the same location, a natural disaster could cause you to lose all of them. A company with only one location is also more likely to be a victim of a security breach. 


Regardless of the size of the business, all businesses should have a backup plan


Keeping your information safe depends largely on putting a proper backup in place. Unfortunately, some backup strategies do not provide adequate protection against the loss of critical data. A backup alone is not enough; you also need to consider how and where you save copies, and how you protect them from threats.

The 3-2-1 backup approach ensures that there are three copies of everything, across different storage types and locations. As a result, it is very unlikely that a single incident will damage all of your data.