In the field of information security, a security vulnerability is a flaw in software or hardware that allows an attacker, or a malicious program (an exploit), to penetrate a computer system. A vulnerability poses a threat to the security of a computer system: there is a risk that it will be exploited and the affected system compromised. Security vulnerabilities arise, among other things, from inadequate protection of a computer against attacks from the network (for example, a missing firewall or other security software) and from programming errors in the operating system, the web browser, or other software applications running on the system.
Security vulnerabilities can arise during the development process if security aspects are neglected in planning, design, and implementation and security requirements are not sufficiently taken into account, e.g. as quality goals. Vulnerabilities also result from errors that creep in during development because of the complexity of software systems. Rough estimates suggest that a programmer introduces about one error per 1,000 lines of code, an error rate of one per thousand; in a program of 1,000,000 lines, about 1,000 errors are therefore to be expected. If alpha and beta testing does not find all of them, the result is a defective product.
Many bugs are never discovered because their impact is minor or would only cause damage after the program has run for a long time. In highly complex programs, such simple errors are often merely documented when discovered and corrected only later. This is done not only for cost reasons, but also because any change to the program code required to fix them can itself introduce new errors. Some bugs, however, create serious security vulnerabilities without immediately causing a complete crash.
---
Such vulnerabilities are typical of programs written in languages optimized for performance (e.g. C or assembly language), whose programming model makes them prone to error. Given the widespread use of such languages, the high time pressure in software production combined with the pronounced cost pressure on software manufacturers, and the often careless treatment of secure software as a topic, security vulnerabilities are the rule rather than the exception.
A frequently cited problem is software bundled by hardware manufacturers with their products, often included with certain products purely for marketing reasons (compare video editing software shipped with camcorders). Cheap development and correspondingly poor programming produce a large number of bugs and security vulnerabilities, which mainly affect home users. Matters are made worse because hardware companies often do not specialize in developing application software, so they cannot easily verify development contracts with external companies, and thus the product quality, themselves. The external companies, in turn, may not specialize in that particular kind of software either. These factors lead to new, error-prone software repeatedly coming onto the market instead of existing software being maintained and improved.
Some of these errors could easily be avoided today if programming languages such as Rust were used instead of low-level languages that allow direct addressing of memory areas. Some developers of the Linux operating system, which is widely used in the server sector, are considering using Rust for individual kernel modules. Microsoft also has projects to rewrite low-level Windows components originally written in C and C++.

Exploitation of Security Vulnerabilities
Such bugs may allow an attacker to penetrate a computer system with an exploit and execute programs there that cause harm. One of the most common flaws used to penetrate computer systems is the buffer overflow: insufficient or missing verification of the amount of data being copied leads to other parts of the program being overwritten, which attackers exploit deliberately to modify the program or inject foreign code.
In the case of hardware flaws, exploitation can be prevented or made more difficult by adapting the software running on the hardware. For hardware design errors, for example, either the microcode of the processors themselves is patched, or workarounds are implemented in the software running on the affected systems, or both in combination.
Handling of Security Vulnerabilities
In closed-source applications, it is the responsibility of the program's manufacturer to fix the vulnerability by providing a patch or a new, corrected version. The fix is not mandatory and may be omitted if, for example, the support cycle for the product has expired or the manufacturer does not recognize the vulnerability as such and sees no need for action.
In the case of open-source and free software, it is often several developers scattered around the world (usually those who have long been involved with the software) who write a patch as soon as the bug is discovered and published. Especially in large open-source projects such as Linux, patches to fix a vulnerability are usually released shortly after it is discovered.