NIST Issues Lightweight Cryptography Standard to Secure Small Devices



A new lightweight cryptography standard has been finalized by the National Institute of Standards and Technology (NIST), aiming to enhance the security of billions of connected devices worldwide. It is intended to protect small, resource-constrained technologies such as Internet of Things (IoT) sensors, RFID tags, and medical implants, whose limited memory, power, and processing capacity leaves them vulnerable to modern cyberattacks.

To that end, NIST has issued Special Publication 800-232, Ascon-Based Lightweight Cryptography Standards for Constrained Devices. The framework provides tools for authenticated encryption and hashing that minimize energy consumption, memory usage, and computational demands without compromising robust security.

The Ascon algorithm family, which forms the basis for the standard, was originally developed in 2014 by researchers at Graz University of Technology, Infineon Technologies, and Radboud University. Ascon proved its resilience in the international CAESAR competition, which named it the primary choice for lightweight authenticated encryption in 2019, and NIST's selection of it in 2023, following a rigorous global review process, has now elevated it to an official benchmark for securing the next generation of connected technologies.

NIST developed the new standard to deliver robust protection in situations where conventional cryptographic techniques are simply too heavy to implement, recognizing that even the smallest digital components play an important role in today's interconnected world.

Special Publication 800-232, Ascon-Based Lightweight Cryptography Standards for Constrained Devices, introduces specialized tools for authenticated encryption and hashing suited to safeguarding the information generated and transmitted by billions of Internet of Things (IoT) devices, RFID tags, toll transponders, and medical implants. These tiny devices can be attacked in numerous ways and are just as vulnerable to cyberattacks as smartphones or computers.

Lightweight cryptography is the key to that balance: it enables even resource-constrained electronics to resist modern security threats without exceeding their performance limits. By formalizing this standard, NIST aims to address a long-standing weakness in digital security.

By establishing the new standard, NIST offers a practical, scalable defense for the rapidly expanding ecosystem of connected devices. The standard is built on the Ascon algorithm family, selected after a rigorous, multi-round public review process that concluded in 2023, and developed since 2014 by researchers at Graz University of Technology, Infineon Technologies, and Radboud University.

The family has been extensively tested for security and has gained international recognition for its performance. Its selection in 2019 as the prestigious CAESAR competition's top choice for lightweight encryption solidified its credibility as a robust design resistant to multiple types of attack. Four Ascon variants have been incorporated into the NIST framework, each meeting a distinct requirement of constrained devices.

Ascon-AEAD128 is an authenticated encryption with associated data (AEAD) algorithm that lets devices both secure and verify information, while offering increased protection against side-channel attacks, an increasingly common threat in which adversaries exploit subtle physical hints such as power consumption or processing time.
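
Conceptually, an AEAD algorithm takes a secret key, a unique nonce, optional associated data (authenticated but not encrypted), and a plaintext, and returns a ciphertext bound to an authentication tag. The sketch below illustrates that interface using the pyascon reference implementation; the module name, variant label, and function signatures are assumptions drawn from that project, not from SP 800-232 itself.

```python
# Minimal AEAD sketch assuming the pyascon reference implementation
# (github.com/meichlseder/pyascon); names are assumptions, verify locally.
import os
import ascon  # pyascon ships a single ascon.py module

key   = os.urandom(16)   # Ascon-AEAD128 uses a 128-bit key
nonce = os.urandom(16)   # 128-bit nonce; must never repeat under one key
ad    = b"sensor-42"     # associated data: authenticated, not encrypted
msg   = b"temperature=21.7C"

ct = ascon.ascon_encrypt(key, nonce, ad, msg, variant="Ascon-AEAD128")

# Decryption verifies the tag first; the reference code returns None
# if the ciphertext or associated data has been tampered with
pt = ascon.ascon_decrypt(key, nonce, ad, ct, variant="Ascon-AEAD128")
assert pt == msg
```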

Ascon-Hash256 complements this by delivering a lightweight mechanism for ensuring data integrity: it generates unique fingerprints of information that can detect tampering, assist with software updates, and strengthen passwords and digital signatures. For greater hashing capacity and flexibility, Ascon-XOF128 and Ascon-CXOF128 provide variable-length hash outputs on low-power devices, saving energy and time, while the CXOF variant also adds customized labeling to prevent collisions that an attacker might exploit.
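
The hashing side follows the same pattern: a fixed-length digest for integrity checks, and an extendable-output function (XOF) when the caller wants to choose the output length. Another hedged sketch against the assumed pyascon API:

```python
# Hash and XOF sketch, again assuming the pyascon reference API
# ascon_hash(message, variant, hashlength).
import ascon

payload = b"firmware-update-v1.3"

# Fixed 256-bit digest: flipping a single bit of the payload produces a
# completely different fingerprint, exposing tampering
digest = ascon.ascon_hash(payload, variant="Ascon-Hash256", hashlength=32)

# XOF: the caller picks the output length, e.g. 64 bytes of key material
okm = ascon.ascon_hash(payload, variant="Ascon-XOF128", hashlength=64)

# The CXOF variant additionally takes a customization string for domain
# separation; how pyascon exposes it is an assumption, e.g.:
# tag = ascon.ascon_hash(payload, variant="Ascon-CXOF128", hashlength=32,
#                        customization=b"device-attestation-v1")
```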

According to NIST cryptography expert Kerry McKay, the standard is designed not just for immediate adoption but to scale and evolve with the future needs of an expanding digital ecosystem. At the heart of the new standard is a suite of four interrelated algorithms derived from the Ascon family of cryptographic primitives.

Introduced in 2014, Ascon was designed specifically for high performance in constrained environments. The suite includes an authenticated encryption algorithm, a hash function, and two extendable-output functions, giving developers a range of choices suited to the specific needs of their applications. NIST chose Ascon for its emphasis on simplicity, efficiency, and resilience, qualities that are crucial for devices with limited processing power, memory, and energy supply.

IoT devices, RFID tags, and embedded systems are often exposed to cyber threats because conventional algorithms, including the Advanced Encryption Standard (AES) and Secure Hash Algorithm 2 (SHA-2), carry computational requirements such devices cannot bear, leaving them vulnerable to everything from data breaches to denial-of-service attacks.

Lightweight cryptography bridges this gap by delivering comparable levels of security at a fraction of the computational overhead that traditional cryptography requires. The standard grew out of a standardization effort NIST began in 2016: a public call for algorithms followed by years of intensive analysis and rigorous testing, including evaluations across microcontrollers and embedded platforms and extensive study of both the theoretical and practical properties of the candidates.

Through this thorough vetting, Ascon distinguished itself by offering robust security, ease of implementation, and adaptability across a variety of hardware environments. Its reach extends beyond the Internet of Things into domains such as wireless sensor networks, industrial control systems, and smart cards, all of which increasingly demand interoperability and secure communication protocols.

With the release of Special Publication 800-232, NIST not only provides developers with well-vetted cryptographic tools but also lowers the barriers to designing secure systems in environments previously considered too constrained for modern encryption. This milestone shows NIST's commitment to addressing the unique security challenges posed by the rapid proliferation of small, networked devices, and positions Ascon as an integral part of NIST's next-generation cryptography efforts.

The finalization of the lightweight cryptography standard is not just a technical milestone but a strategic investment in the resilience of the digital infrastructure that underpins modern life. Security challenges will only grow more complex as billions more devices connect to healthcare, transportation, energy, and consumer technologies. By introducing a standardized, rigorously vetted framework that combines strength with efficiency, NIST has laid the foundation for a new era of secure design practices in environments that were once unprotected.

Industry experts note the potential benefits of widespread adoption of such standards: greater trust in emerging technologies, more secure hardware and software development practices, and fewer of the vulnerabilities that create systemic risk. Although cryptography will continue to evolve, the Ascon-based framework is already a significant step toward ensuring that even the smallest devices - often overlooked but crucial - no longer become the weakest link in the digital environment.

Moreover, NIST aims to strengthen its role as a global leader in cryptographic standardization and research, guiding both government and industry toward a more secure, interoperable, and resilient technological future.

Microsoft and Amazon’s Quantum Progress Poses New Risks for Encryption



Microsoft, Amazon, and Google have all announced recent advances in quantum computing that are likely to accelerate the timeline for the obsolescence of current encryption standards. These developments make it increasingly urgent to address the vulnerabilities quantum computing poses to existing cryptographic protocols, because a sufficiently powerful quantum computer could easily decrypt the encryption mechanisms that safeguard the internet's security and data privacy.

On the other side of the race, researchers and cybersecurity experts are developing post-quantum cryptography (PQC) - a new generation of encryption designed to withstand the computational power of quantum systems. Organisations and governments must prioritize quantum-resistant encryption to ensure the long-term security of their data and digital communications, especially as the quantum era draws closer than anticipated.

As the contest between quantum decryption and quantum-resistant encryption intensifies, protecting global cybersecurity infrastructure demands strategic investment and proactive measures. One important advance came from Amazon Web Services (AWS), which announced its first quantum computing chip, Ocelot, a significant step in the pursuit of practical quantum computing.

One of the most critical challenges in the field is error correction. AWS claims that Ocelot could reduce the cost of quantum error correction by as much as 90 percent, speeding progress toward fault-tolerant quantum systems. Error correction remains a major barrier because quantum systems are inherently fragile and highly susceptible to environmental disturbances such as temperature fluctuations, electromagnetic interference, and vibration.

These external factors expose quantum operations to substantial computational error, making stability and reliability extremely difficult to maintain. With research progressing rapidly, innovations like Ocelot could play a crucial role in mitigating these challenges and paving the way for more robust, scalable quantum computing.

A sufficiently advanced quantum computer running Shor's algorithm, or any enhancement of it, could decrypt existing public-key encryption protocols such as RSA-2048 within 24 hours. The advent of such machines would fundamentally disrupt modern cybersecurity frameworks, rendering current cryptographic mechanisms ineffective.
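
The mathematical core of that threat is easy to state: Shor's algorithm factors an RSA modulus N by finding the period r of f(x) = a^x mod N, a step that is exponentially hard classically but fast on a quantum computer. The toy sketch below brute-forces the period for a tiny modulus just to show why knowing r reveals the factors; real RSA-2048 moduli are far beyond any classical search.

```python
# Toy illustration of the number theory behind Shor's algorithm: factoring
# N reduces to finding the period r of f(x) = a^x mod N. A quantum computer
# finds r exponentially faster; here we brute-force it for a tiny modulus.
from math import gcd

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a**r == 1 (mod N), found by brute force."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

N, a = 3233, 3            # 3233 = 61 * 53: a miniature 'RSA-like' modulus
assert gcd(a, N) == 1     # a must be coprime to N
r = find_period(a, N)     # the step a quantum computer makes tractable

# Classical post-processing: an even period r with a^(r/2) != -1 (mod N)
# yields the factors via two gcd computations
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    half = pow(a, r // 2, N)
    print(gcd(half - 1, N), gcd(half + 1, N))  # -> 61 53
```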

Any encrypted data that has been exfiltrated and stored under the "harvest now, decrypt later" strategy would become fully readable to whoever holds such quantum computing capabilities. The resulting breaches of internet communications, digital signatures, and financial transactions would cause serious losses of trust in the digital ecosystem. The threat is inevitable not because of any specific way public-key encryption (PKE) might be broken, but because a quantum system of sufficient power will, with near certainty, eventually achieve it.

Consequently, the National Institute of Standards and Technology (NIST) has been the frontrunner in developing encryption protocols designed to withstand quantum-based attacks. Its post-quantum cryptography (PQC) effort builds on mathematical structures believed to be immune to quantum computational attacks. To ensure the long-term security of digital infrastructure, PKE must be replaced with PQC. Awareness of the urgency remains limited, however, and many stakeholders still underestimate quantum computing's potential impact on cybersecurity.

Through 2025, the development of quantum-resistant encryption technologies will play an increasingly important role in improving understanding of these methodologies, accelerating their adoption, and keeping global cybersecurity standards safe. An effective cryptographic method rests on algorithms that are computationally infeasible to break within a reasonable period, allowing secure encryption and decryption and keeping data confidential for authorized parties. No encryption, however, is completely impervious indefinitely.

A sufficiently powerful computing machine will eventually compromise any encryption protocol. Because of this, cryptographic standards have continuously evolved over the past three decades, as advances in computing rendered many earlier encryption methods obsolete. The 1024-bit keys at the center of the 1990s "crypto wars," for example, have long been retired and are no longer deemed adequate against modern computational power; breaking that level of encryption is hardly difficult for today's computers.
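
For context, here is a sketch of what current practice looks like using the widely deployed Python cryptography package; the 2048-bit floor reflects common guidance (for example NIST SP 800-57), while 1024-bit keys of the kind debated in the 1990s are considered inadequate today.

```python
# Generating an RSA key at a modern size with the "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa

private_key = rsa.generate_private_key(
    public_exponent=65537,
    key_size=2048,  # keys below 2048 bits are widely treated as weak
)
print(private_key.public_key().key_size)  # -> 2048
```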

In recent years, major technology companies have announced advances suggesting that the power needed to break encryption is poised for an unprecedented leap. Amazon Web Services, Google, and Microsoft have all unveiled dramatic increases in computational capability driven by quantum computing: Google introduced "Willow" in December, Microsoft announced "Majorana 1" in February, and a few days later Amazon announced its "Ocelot" quantum computing chip. Each of these breakthroughs represents an important, distinct step in the evolution of quantum computing, a technology that fundamentally redefines how processors are designed.

Unlike traditional computing systems, quantum systems are built on entirely different principles, allowing them to solve certain problems exponentially faster. Advances in quantum computing are clearly accelerating an era that will profoundly affect encryption security, and cybersecurity practices must adjust urgently to cope. As with any technological breakthrough of this magnitude, the full implications remain uncertain.

One aspect, however, is becoming increasingly clear: judging by statements from Google and Microsoft, the computational barriers that define what is currently infeasible will shrink to problems solvable in seconds. For data security, the implications are profound. Once sufficiently powerful quantum computers become widely accessible, information that is securely encrypted today could be unlocked with ease. The capability to break modern encryption protocols within seconds poses a serious threat to digital privacy and security across industries.

The development of quantum-resistant cryptographic solutions has been under way in anticipation of this eventuality. NIST, with its long history of establishing encryption standards, has led the Post-Quantum Cryptography (PQC) initiative since 2016 and reached a key milestone for global cybersecurity in August when it released its first three finalized post-quantum encryption standards.
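
Those standards are ML-KEM (FIPS 203) for key establishment, and ML-DSA (FIPS 204) and SLH-DSA (FIPS 205) for digital signatures. As a rough illustration of how ML-KEM is used in practice, here is a hedged sketch based on the open-source liboqs-python wrapper; the class and method names are assumptions from that library, not from the standard itself.

```python
# Post-quantum key exchange sketch with ML-KEM-768 via liboqs-python
# (pip install liboqs-python); API names are assumptions to verify.
import oqs

with oqs.KeyEncapsulation("ML-KEM-768") as receiver:
    public_key = receiver.generate_keypair()

    # Sender encapsulates a fresh shared secret against the public key
    with oqs.KeyEncapsulation("ML-KEM-768") as sender:
        ciphertext, secret_sender = sender.encap_secret(public_key)

    # Receiver recovers the same secret from the ciphertext
    secret_receiver = receiver.decap_secret(ciphertext)
    assert secret_sender == secret_receiver
```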

Major technology companies, including Microsoft, Amazon Web Services (AWS), and Google, are not only advancing quantum computing but are also actively participating in the development of PQC solutions, working with NIST on encryption methods that can withstand quantum-based attacks. Microsoft provided an update on its PQC efforts in August, and AWS and Google have published similar updates on theirs.

These initiatives were in place long before the latest quantum hardware advances, a strong reminder that addressing the challenges posed by quantum computing requires a comprehensive and sustained commitment. Establishing encryption standards, however, does not equate to widespread deployment. The transition will take considerable time and effort, particularly in ensuring that the new algorithms integrate smoothly into everyday applications such as online banking and secure communications.

Because implementing and deploying new encryption technologies at scale is difficult, adoption has historically spanned many years, which makes the urgency of preparing for the quantum era hard to overstate. Organizations must build PQC considerations into strategic planning and system design proactively; the issue must be addressed rather than delayed. The fundamental asymmetry remains: it is far easier to break encryption than to construct secure systems.

Moreover, cryptographic implementation is itself a complex and error-prone process. Defending against quantum-based threats will demand a concerted, sustained effort across the entire cybersecurity landscape. Change is coming to encryption both rapidly and with great difficulty: quantum computing poses an existential threat to current protocols, and organizations that fail to react quickly and decisively will face severe security vulnerabilities. Securing the future of digital security is imperative.

Rethinking Password Security: Why Length Matters More Than Complexity




The growing number of online accounts has made managing passwords increasingly difficult. With users juggling dozens of accounts, creating secure yet memorable passwords has become a major challenge.

Traditional password guidelines emphasize complexity, requiring combinations of uppercase and lowercase letters, numbers, and special characters. While intended to enhance security, these rules often lead to predictable, unsafe practices:

  • Reusing passwords across multiple platforms.
  • Writing down passwords in insecure locations.
  • Choosing overly simple yet easy-to-guess passwords.

Recent research indicates that the emphasis on complexity may be counterproductive. The US National Institute of Standards and Technology (NIST) has revised its password management guidelines, prioritizing password length over complexity. Key changes include:

  • Eliminating the need for frequent password changes.
  • Removing restrictions on special characters.
  • Discouraging security questions for account recovery.

Longer passwords, even without special characters, are significantly harder to crack and easier to remember. This shift marks a departure from the belief that complexity alone ensures safety.

The Risks of Complexity

Overly complex passwords often lead users to adopt risky behaviours, such as:

  • Writing passwords on paper or digital notes.
  • Using the same password for multiple accounts.
  • Neglecting password updates due to frustration.

These habits compromise security, leaving accounts vulnerable to brute-force attacks or credential theft. Reports such as the 2021 Verizon Data Breach Investigations Report indicate that 80% of hacking-related breaches stem from stolen or brute-forced credentials.

Managing an average of 85 passwords presents a significant burden for individuals and organizations. Enterprises, for instance, spend substantial resources—around $495,000 annually for every 1,000 employees—resolving access-related issues. Despite the availability of password managers, gaps in security remain.

The Rise of Passwordless Authentication

As "security fatigue" grows, passwordless authentication methods are gaining traction. Technologies such as biometrics and adaptive single sign-on (SSO) offer enhanced security and convenience. By leveraging machine learning, these solutions adjust access controls dynamically, reducing login friction and improving the user experience.

Length plays a decisive role in password security. Advanced computing power has diminished the effectiveness of short, complex passwords, while longer ones remain resilient against brute-force attacks. For example, Eric Adams, Mayor of New York City, increased his smartphone passcode from four to six digits, dramatically raising the number of possible combinations.

NIST now recommends that systems accept passwords of up to 64 characters. Even a password composed solely of lowercase letters becomes exponentially harder to crack as its length increases; adding uppercase letters and symbols makes it harder still.
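
The arithmetic behind this guidance is simple: a password drawn from an alphabet of N symbols with length L allows N^L combinations, so strength grows exponentially with length but only polynomially with a richer alphabet. A quick illustration:

```python
# Why length beats complexity: the search space is alphabet_size ** length,
# so each extra character multiplies the work, while adding symbol classes
# only enlarges the base.
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    """Bits of entropy for a random password over the given alphabet."""
    return length * math.log2(alphabet_size)

print(entropy_bits(95, 8))   # ~52.6 bits: 8 chars from all printable ASCII
print(entropy_bits(26, 20))  # ~94.0 bits: 20 lowercase letters, no symbols

# Eric Adams' passcode change, four digits to six:
print(10**4, "->", 10**6)    # 10,000 -> 1,000,000 possible combinations
```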

Practical Solutions for Stronger Security

In today’s cybersecurity landscape, balancing usability and security is essential. Experts recommend:

  • Creating long, memorable passwords instead of complex ones.
  • Avoiding password reuse across platforms.
  • Utilizing tools such as password managers and two-factor authentication.

By adopting practical measures, users can minimize risky behaviours and enhance digital security. As cyber threats evolve, prioritizing password length and implementing user-friendly solutions are key to safeguarding online accounts.

NIST Introduces ARIA Program to Enhance AI Safety and Reliability


The National Institute of Standards and Technology (NIST) has announced a new program called Assessing Risks and Impacts of AI (ARIA), aimed at better understanding the capabilities and impacts of artificial intelligence. ARIA is designed to help organizations and individuals assess whether AI technologies are valid, reliable, safe, secure, private, and fair in real-world applications. 

This initiative follows several recent announcements from NIST, including developments related to the Executive Order on trustworthy AI and the U.S. AI Safety Institute's strategic vision and international safety network. The ARIA program, along with other efforts supporting Commerce’s responsibilities under President Biden’s Executive Order on AI, demonstrates NIST and the U.S. AI Safety Institute’s commitment to minimizing AI risks while maximizing its benefits. 

The ARIA program addresses real-world needs as the use of AI technology grows. This initiative will support the U.S. AI Safety Institute, expand NIST's collaboration with the research community, and establish reliable methods for testing and evaluating AI in practical settings. The program will consider AI systems beyond theoretical models, assessing their functionality in realistic scenarios where people interact with the technology under regular use conditions, providing a broader, more comprehensive view of these technologies' effects. It also helps operationalize the NIST AI Risk Management Framework's recommendation to use both quantitative and qualitative techniques for analyzing and monitoring AI risks and impacts.

ARIA will further develop methodologies and metrics to measure how well AI systems function safely within societal contexts. By focusing on real-world applications, ARIA aims to ensure that AI technologies can be trusted to perform reliably and ethically outside of controlled environments. The findings from the ARIA program will support and inform NIST’s collective efforts, including those through the U.S. AI Safety Institute, to establish a foundation for safe, secure, and trustworthy AI systems. This initiative is expected to play a crucial role in ensuring AI technologies are thoroughly evaluated, considering not only their technical performance but also their broader societal impacts. 

The ARIA program represents a significant step forward in AI oversight, reflecting a proactive approach to addressing the challenges and opportunities presented by advanced AI systems. As AI continues to integrate into various aspects of daily life, the insights gained from ARIA will be instrumental in shaping policies and practices that safeguard public interests while promoting innovation.

NIST to establish consortium that can collaborate on research to improve the NVD


The US National Institute of Standards and Technology (NIST) is to establish a consortium to partner with it in responding to challenges presented by the current and expected growth in CVEs, including the development of ways to automate some analysis activities.

The official announcement came during VulnCon, a cybersecurity conference hosted by the Forum of Incident Response and Security Teams (FIRST), held from March 25 to 27, 2024. Tanya Brewer, the NVD program manager, disclosed the news, addressing the longstanding speculation surrounding the fate of the NVD. 

In February 2024, NIST halted the enrichment of Common Vulnerabilities and Exposures (CVEs) data on the NVD website, leading to a backlog of unanalyzed vulnerabilities. This development raised alarms among security researchers and industry professionals, as the NVD plays a critical role in identifying and addressing software vulnerabilities. 

The implications of the NVD backlog are profound, potentially impacting the security posture of organisations worldwide. Without timely analysis and remediation of vulnerabilities, companies face increased risks of cyberattacks and data breaches. The situation prompted some security companies to explore alternative solutions to supplement the NVD's functions temporarily. Amidst the challenges, speculation swirled regarding the underlying causes of the NVD's issues. 

Budget constraints, contractual changes, and discussions around updating vulnerability standards were among the factors cited. The uncertainty underscored the need for transparency and clarity from NIST regarding the future of the NVD. In response to the concerns, Brewer acknowledged the challenges faced by the NVD program, attributing them to a "perfect storm" of circumstances. Despite the setbacks, NIST remains committed to addressing the issues and revitalizing the NVD. 

Plans for the establishment of an NVD Consortium, aimed at fostering collaboration and innovation, signal a proactive approach to future management. Looking ahead, NIST aims to enhance the NVD's capabilities and processes within the next one to five years. Proposed initiatives include expanding partnerships, improving software identification methods, and leveraging automation to streamline CVE analysis. 
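
To make the automation angle concrete, the sketch below pulls recently modified CVEs from the NVD's public REST API (version 2.0) and prints their analysis status; the endpoint and parameters follow NVD's published API documentation, but treat the exact field layout as an assumption to verify.

```python
# Sketch: fetch recent CVEs from the NVD REST API v2.0 for local triage.
import requests

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

params = {
    "lastModStartDate": "2024-03-01T00:00:00.000",
    "lastModEndDate": "2024-03-27T00:00:00.000",
    "resultsPerPage": 50,
}
resp = requests.get(NVD_API, params=params, timeout=30)
resp.raise_for_status()

for vuln in resp.json().get("vulnerabilities", []):
    cve = vuln["cve"]
    # CVEs still awaiting NVD enrichment carry a status such as
    # "Awaiting Analysis" rather than "Analyzed"
    print(cve["id"], cve.get("vulnStatus"))
```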

These efforts reflect a concerted push to modernize the NVD and ensure its relevance in an ever-evolving cybersecurity landscape. The announcement at VulnCon provided much-needed clarity and reassurance to the cybersecurity community. While challenges persist, the collaborative efforts of industry stakeholders and government agencies offer hope for a resilient and robust NVD ecosystem.

Cyber Trust Mark: U.S. Administration Introduces Program to Boost Home Security


On Tuesday, the Biden administration announced a 'U.S. Cyber Trust Mark' program focused on cybersecurity certification and product labeling for smart home technology, a step intended to help consumers choose products that provide better protection against cyberattacks.

The new program was proposed by Federal Communications Commission (FCC) Chairwoman Jessica Rosenworcel. It aims to help consumers make well-informed purchasing decisions by identifying products in the marketplace that meet advanced cybersecurity standards.

"The goal of the program is to provide tools for consumers to make informed decisions about the relative security of products they choose to bring into their homes," the administration said.

U.S. Cyber Trust Mark

Under the proposed program, consumers would see a new "U.S. Cyber Trust Mark" label in the form of a shield logo, distinguishing products that satisfy established cybersecurity criteria. These criteria will be set by the National Institute of Standards and Technology (NIST) and will include requirements such as strong, unique default passwords, data protection, software updates, and incident detection capabilities.

According to the administration, a number of major retailers, trade groups, and manufacturers of consumer electronics and appliances have made voluntary commitments to improve cybersecurity for the products they sell. Amazon, Best Buy, Google, LG Electronics USA, Logitech, and Samsung Electronics are among the participants.

Plans for the program were first discussed by the Biden administration in late 2022, when it proposed a voluntary initiative with Internet of Things makers to help ensure products meet minimum security standards.

Reportedly, the FCC, which is responsible for regulating wireless communication devices, is set to seek public comment on the labeling program by 2024.

According to the administration, the FCC is applying to the U.S. Patent and Trademark Office to register a national trademark that would be used on products that satisfy the predetermined standards.

"The proposal seeks input on issues including the scope of devices for sale in the U.S. that should be eligible for inclusion in the labeling program, who should oversee and manage the program, how to develop the security standards that could apply to different types of devices, how to demonstrate compliance with those security standards, how to safeguard the cybersecurity label against unauthorized use, and how to educate consumers about the program," the FCC notice says.

The proposal also calls for adding a QR code to products that would link consumers to further security information, pending certification mark approval by the U.S. Patent and Trademark Office.
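
As a small illustration of the labeling mechanics, the sketch below generates such a QR code with the Python qrcode package; the registry URL is purely hypothetical, since the actual consumer-information endpoint has yet to be defined.

```python
# Generate a product-label QR code with the "qrcode" package
# (pip install qrcode). The URL below is a hypothetical placeholder.
import qrcode

registry_url = "https://example.gov/cyber-trust-mark/device/12345"
img = qrcode.make(registry_url)          # returns a PIL image
img.save("cyber-trust-mark-label.png")   # printed onto the product label
```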

NIST Seeking Feedback for a New Cybersecurity Framework and Supply Chain Guidance


Addressing the SolarWinds disaster and other major third-party attacks targeting vital infrastructure, the National Institute of Standards and Technology (NIST) is due to publish guidance for securing organizations against supply chain breaches. "[Special Publication 800-161] is the most important cybersecurity supply chain risk management guidance," NIST's Angela Smith stated.

Smith spoke at an Atlantic Council session on Tuesday about initiatives to protect information and communications technology supply chains. "The first big revised version will be released by the end of next week, so stay tuned if you haven't already reviewed some of the public drafts," she said.

The NIST upgrade comes as the Biden administration tries to use the government's procurement power to prod contractors such as IT management firm SolarWinds and other software vendors to improve the security of their environments. 

Vendors of the underlying information and communications technology are pitching in as the Cybersecurity and Infrastructure Security Agency considers expanding private-sector partnerships and taking a more comprehensive approach to tackling dangers to critical infrastructure.

According to Smith, future guidance on managing cybersecurity risks that emerge through the supply chain will, beyond the upcoming revision, focus more on actions for providers along the chain to take. The literature to date has centered on acquiring organizations' responsibilities for integrating supply chain considerations into their existing environments.

The previous draft, Revision 2, released in October 2021, added a new Appendix F giving government agencies implementation guidance for Executive Order 14028. Following NIST's February 4, 2022, Secure Software Development Framework (SSDF) recommendations, the SP 800-161 release scheduled for next week is likely to deliver further EO 14028 guidance.

NIST last updated the CSF in 2018. "There is no single reason driving this transition. This is a scheduled update to keep the CSF current and consistent with other commonly used tools," said Kevin Stine, Chief Cybersecurity Advisor at NIST. NIST is seeking public input on three primary topics to guide the revision: changes to the CSF itself, relationships and alignment between the CSF and other resources, and approaches to improving supply chain cybersecurity. President Barack Obama directed NIST to develop the CSF, and federal agencies were subsequently directed to use it, with the private sector advised to adopt it as well.

NIST should define what it means for an agency to "use" the framework, and agencies should furnish NIST with the cybersecurity risk documents they develop to comply with that requirement. For enterprises that are using or considering the NIST Cybersecurity Framework, seeing how US government entities apply it would be extremely beneficial.

NIST NVD Report Shows Increase in Low-Complexity CVEs


Common vulnerabilities and exposures (CVEs) that are easy to exploit are increasing as a proportion of the overall number of bugs reported, according to a new analysis, posing a growing problem for cybersecurity teams.

Redscan, a managed detection and response and pen-testing specialist, recently evaluated more than 18,000 CVEs filed in 2020 in the National Vulnerability Database (NVD) of the U.S. National Institute of Standards and Technology (NIST) and published its findings in a report, NIST Security Vulnerability Trends in 2020: An Analysis.

It shows that more than half (57%) of the CVEs were graded "high" or "critical" severity - the highest figure recorded in any year to date. The report also highlights the growth of low-complexity vulnerabilities and of flaws that require no user interaction to exploit, meaning even attackers with limited technical skills can take advantage of them. According to the research, their number has climbed since 2017, after declining dramatically between 2001 and 2014. These trends demonstrate the need for companies to improve their awareness of vulnerabilities being exploited in the wild and to follow a multi-layered approach to vulnerability management. In 2020, almost 4,000 vulnerabilities met the "worst of the worst" criteria across all NVD filters.

The research report says, “The prevalence of low complexity vulnerabilities in recent years means that sophisticated adversaries do not need to ‘burn’ their high complexity zero-days on their targets and have the luxury of saving them for future attacks instead.” 

“Low complexity vulnerabilities lend themselves to mass exploitation as the attacker does not need to consider any extenuating factors or issues with an attack path. This situation is worsened once exploit code reaches the public and lower-skilled attackers can simply run scripts to compromise devices.” 

Another trend needs tackling: low-complexity CVEs, which made up 63 percent of the vulnerabilities disclosed in 2020, are on the rise. The sheer volume of low-complexity vulnerabilities has become a growing challenge for security teams, since complexity is one of the most critical factors when evaluating vulnerability risk and likely in-the-wild exploitation timeframes. Low-complexity CVEs lend themselves to rapid mass exploitation because attackers do not have to contend with extenuating circumstances or attack-path problems.
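
The filtering behind figures like these can be reproduced from CVSS v3 vector strings, where AC:L marks low attack complexity and UI:N marks no user interaction required. A minimal sketch with illustrative sample vectors:

```python
# Count CVEs whose CVSS v3 vectors show low attack complexity (AC:L) and
# no user interaction (UI:N). The vector strings are illustrative samples.
vectors = {
    "CVE-2020-0001": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
    "CVE-2020-0002": "CVSS:3.1/AV:N/AC:H/PR:N/UI:R/S:U/C:H/I:N/A:N",
    "CVE-2020-0003": "CVSS:3.1/AV:L/AC:L/PR:L/UI:N/S:C/C:H/I:H/A:H",
}

def metrics(vector: str) -> dict:
    """Split 'CVSS:3.1/AV:N/AC:L/...' into a {metric: value} dict."""
    return dict(part.split(":") for part in vector.split("/")[1:])

easy = [
    cve for cve, vec in vectors.items()
    if metrics(vec)["AC"] == "L" and metrics(vec)["UI"] == "N"
]
print(easy)  # -> ['CVE-2020-0001', 'CVE-2020-0003']
```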

Alongside this, companies need better oversight of their technology vendors' activities: they must determine how their vendors test custom code and how third-party libraries are used in their products.

“Vulnerabilities which require no interaction to exploit present a complex challenge for security teams, underscoring the need for defense-in-depth. This includes enhancing the visibility of attack behaviors once a compromise has occurred,” added George Glass, Head of Threat Intelligence at Redscan.