Wednesday, July 6, 2011
A significant problem currently faced in medicine and microbiology is the proliferation of bacteria which have gained resistance or tolerance to antibiotics. Bacteria can gain resistance or tolerance by a number of different methods: 1) by evolving genes which allow them to survive, 2) by acquiring genes from other bacteria via horizontal gene transfer (transduction by a bacteriophage or conjugation), or 3) by uptake of genetic material from the environment (transformation). Bacteria such as these are responsible for a large number of infections that are difficult to treat and are becoming more common in environments such as hospitals. Methicillin-resistant Staphylococcus aureus (MRSA) is one such example. Although MRSA is resistant to most antibiotics, it has a lower fitness than non-resistant Staphylococcus aureus (Staph) in an environment without antibiotics. This trait means that if antibiotic treatments are stopped, the common forms of Staph will outcompete and replace MRSA as the dominant form of bacteria in a colony.
Such observations could lead some within the information security community to believe that reducing the barriers to malware might cause malware to become less sophisticated, easier to observe, and subsequently easier to remediate. Although this is a possibility, it is unlikely, since the costs involved in maintaining genes are different from those involved in maintaining attack strategies. Evolutionary trade-offs or costs manifest themselves in different ways. They are paid through a reduction in the fitness of an organism. An organism is said to have a higher fitness the more offspring it produces that survive into subsequent generations. An organism which must reallocate resources away from the production of offspring runs the risk of reducing its fitness. As an example, moving resources from reproduction to defense reduces the theoretical number of offspring an organism can produce, though defensive strategies can also allow the organism to survive long enough to reproduce. Mutations in a genome cause an organism to reallocate resources and, depending on the phenotypic effects, they can increase or decrease the fitness of an organism. Evolutionary costs can be divided into three categories: 1) the cost of evolving a strategy (e.g. the costs associated with the creation of a new strategy), 2) the developmental costs of a strategy (e.g. its specific implementation within an organism), and 3) the cost of maintaining a strategy (e.g. the day-to-day costs associated with maintaining a strategy or the ability to utilize it). These three costs, combined with the benefits of maintaining a set of strategies, work in conjunction to raise or lower the overall fitness of an organism.
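As a rough illustration of how these categories interact, consider a minimal sketch that treats them as additive terms; the numbers, and the additive model itself, are assumptions for illustration rather than measurements:

```python
# Toy model of strategy costs (illustrative only). Assumes the three
# cost categories combine additively with the benefit of a strategy,
# a simplification of real evolutionary trade-offs.

def net_fitness(benefit, evolve_cost, develop_cost, maintain_cost,
                generations):
    """Net fitness contribution of a strategy kept for `generations`.

    evolve_cost is paid once by the lineage, develop_cost once per
    organism, and maintain_cost accrues every generation it is kept.
    """
    return (benefit * generations
            - evolve_cost - develop_cost - maintain_cost * generations)

# A resistance gene pays for itself while antibiotics are present...
print(net_fitness(benefit=5.0, evolve_cost=10.0, develop_cost=2.0,
                  maintain_cost=1.0, generations=20))   # 68.0
# ...but is pure overhead once the benefit disappears.
print(net_fitness(benefit=0.0, evolve_cost=0.0, develop_cost=0.0,
                  maintain_cost=1.0, generations=20))   # -20.0
```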
With bacteria, the reduction of any non-essential genes results in increased fitness, as the costs associated with replication are reduced. The replication of a smaller genome utilizes fewer resources than the replication of a larger genome. This means that any time a gene can be removed from the bacterial genome without reducing fitness, it benefits the bacteria to do so, as doing so reduces the costs of replication. This process is referred to as genome economization and has been observed in a controlled laboratory setting with the Mimivirus, which underwent genome reduction in an environment from which its competitors had been removed. In the case of tolerance or resistance genes, the costs to the bacteria are greater than just occupying a portion of the genome and increasing its size. There are production costs associated with tolerance or resistance genes. These genes encode proteins, and the production of these proteins consumes resources that the bacteria could have utilized elsewhere. Beyond this simple consumption of resources, the proteins being produced can interfere with common intracellular functions. All of these factors combined mean that bacteria can make substantial gains in fitness if they are able to remove these genes when they are no longer required. In the case of malware or the tools of determined attackers, the replication and storage of the software used is not a significant issue. Exploits with stagers, or malware with droppers able to remotely load software, mean that a smaller code base offers little advantage, as resources can be remotely accessed as needed. In fact, having a smaller code base to utilize during an attack can limit the options of an adversary, as they may not be able to try all of the possible avenues of attack. Blind application of the strategies and methods organisms use for survival may not function as expected within information security without understanding the costs and trade-offs associated with those strategies. The adaptations that bacteria and other micro-organisms utilize for dealing with evolutionary costs are different from those encountered within information security.
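A toy competition model makes the economization argument concrete; the growth rates, and the assumption that replication cost scales with genome size, are illustrative only:

```python
# Toy competition between two bacterial strains (illustrative only).
# Assumes the strain that sheds a non-essential gene replicates
# slightly faster because its smaller genome costs less to copy.

def economized_share(generations=60, growth=1.5, edge=1.02):
    """Population share of the economized strain after `generations`."""
    full, economized = 1000.0, 1000.0
    for _ in range(generations):
        full *= growth                # strain keeping the unused gene
        economized *= growth * edge   # strain that removed it
    return economized / (full + economized)

# A 2% per-generation replication edge yields roughly three quarters
# of the population within 60 generations.
print(f"economized strain share: {economized_share():.2%}")
```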
Another consideration is that even if antibiotics are not applied in the environment to reduce the population of tolerant or resistant bacteria, the human immune system is still going to react to an infection. A substantial portion of the human genome is dedicated to the immune system: of an estimated 27,478 genes, approximately 1,562 (roughly 5.7%) are dedicated to the immune system. This quantity of genes represents a significant amount of resources dedicated to fighting pathogens. Furthermore, when the immune system is actively fighting a pathogen, the average metabolism of a human host increases by 14%. Simply reducing the application of security controls to fight malware, then, may not be the best solution.
Looking at the issue of bacteria gaining tolerance and resistance from a different perspective may provide another insight. The problem is not that MRSA exists in the environment, but that it exists within an environment in which the potential hosts are already suffering from weakened or compromised immune systems. The resistance of MRSA means that the application of traditional antibiotics is ineffective. The main issue is that MRSA already has the tools to defend itself against the common defenses in that environment; to rephrase this, MRSA has the tools to persist in the prevailing environmental conditions, otherwise it would not have survived. From the perspective of information security, attackers have already acquired the necessary tools and techniques to persist in common computing environments, otherwise they would not be successful. Furthermore, the tools and techniques they have used previously to compromise similar security controls mean that if those security controls are encountered elsewhere they can also be compromised, as the attackers have been primed with the necessary experience.
Instead of reducing the security controls in an enterprise in the hope of making malware easier to detect and remediate, as observations of various bacterial adaptations to antibiotics might suggest, security teams should instead attempt to understand how the environment is being prepared for attackers and focus on making it more difficult for attackers to persist in the enterprise.
Thursday, June 2, 2011
Shared Strategies and Shared Costs
Within evolutionary biology there are essentially two pathways by which genes are transferred: vertically or horizontally. The vertical passing of characters is inheritance, sometimes referred to as Vertical Gene Transfer (VGT). In VGT, genes are passed from one generation to the next, inherited from parents. Horizontal sharing, commonly called Horizontal Gene Transfer (HGT), is something only a few types of organisms participate in. These organisms are typically unicellular, such as bacteria. Bacteria use HGT as a way of obtaining genes which are immediately beneficial to their survival. Some bacteria utilize HGT to "steal" genes from their hosts or other organisms in the environment and acquire immunity to antibiotics or host defenses. A number of examples in which bacteria acquire tolerance or even resistance to antibiotics or heavy metals via HGT are presented within Microbial Ecology: An Evolutionary Approach by McArthur.
If the methods of VGT and HGT are to be considered within the framework of information security, they have to be applied in a more general sense, such that instead of applying to the transfer of genes they apply to the transfer of strategies. One of the basic principles of evolutionary biology is that even the most perfectly adapted organism has a fitness of zero if its characters are not passed along to subsequent generations (typically via VGT). Within information security, an enterprise must be able to retain and pass along desirable characters to new developers and engineers, otherwise it may continue to suffer from persistent and/or recurring issues. This transfer can occur via training staff or the acquisition of outside expertise.
The methods of transfer can work in different ways within information security, as strategies are ideas that can be easily and quickly replicated between different enterprises and organizations. A strategy only needs to be replicated, and strategies can be of different sizes. A smaller strategy can be the sharing or reuse of code within application frameworks, or even the reuse of code within malware. As a possible example of replicating code, with the recent release of the Zeus bot's source code, variants of Zeus may become more common, or other malware families may incorporate that code into their code bases. On a larger scale, strategies such as implementing demilitarized zones (DMZs) or virtualized application hosting systems can be shared. In evolutionary biology there is also the unwanted replication or transfer of strategies. This is prevalent in microbiology, in which bacteria utilize HGT to gain immunity to antibiotics, or parasites cover themselves in proteins from a host to prevent an immune response (Roitt's Essential Immunology, or Foundations of Parasitology by Roberts and Janovy). In information security the unwanted transfer of strategies most closely relates to the exfiltration or release of information outside of the organization. Examples of this include the commonly reported data breaches, or the theft of intellectual property, in which trade secrets, research and/or ideas are transferred to other parties which did not spend the time or resources developing them.
Beyond sharing genes between generations or organisms, some animals are capable of forming social units such as flocks or herds. With social animals, there are a number of benefits to sharing information between individuals:
- Social animals benefit from safety in numbers against predation. During a single encounter, an individual is less likely to be foraged upon by a predator when in a group than when alone.
- A single individual can alert the entire herd to danger, and multiple eyes are more likely to identify a threat. Even if not everyone is actively searching for threats, multiple sentries offer an opportunity for better detection. If the sentry role is shared or rotated, everyone within the community can benefit from increased vigilance while spending more time foraging.
- For social animals, experiences can be passed between individuals by social learning. Social learning allows successful learned strategies to be passed on to other individuals within the community.
Standing out in a herd or in a community can work against an individual. Unique identification works against individuals in a herd as it allows predators to identify and focus on specific individuals when the herd scatters. Herds are organized collections, typically sub-divided into three segments: (a) the females and young, (b) the alpha males/females, and (c) the sick, elderly, and/or injured. When a predator encounters the herd, it scatters. Because of the structure of the herd, group (b) guards (a)'s retreat while (c) is sacrificed as a distraction. Encounters may not always function this way, as predators can actively target other individuals within the herd. Within the natural environment and within information security, no single entity wants to be sacrificed for the well-being of the herd, but sometimes this happens, as no one is immune from predation. It would be possible to design an infrastructure which is sacrificed during an attack, but enterprises typically attempt to enforce homogeneity over the community, so there does not exist a community of sick and injured systems which can be sacrificed so that the enterprise can respond. Sometimes legacy systems exist within the enterprise because they are performing a function essential to the enterprise, and as such these elderly systems must be protected.
Another way in which larger social structures enable survival is by allowing a division of tasks. Some individuals assume vigilant/sentry roles and actively attempt to identify and alert the community to threats while others are allowed to perform their normal tasks. If the sentry role is rotated, individuals are allowed to spend more resources and energy on tasks other than scanning for threats. In this way, an individual spends some time foraging and some time acting as a sentry. By sharing the role of sentry, each individual is allowed to increase its time foraging, yet overall a higher level of vigilance is obtained by the community. By sharing information and alerts in the information security community, the danger signs of predation can be shared and the entire community can be alerted. This works well in environments in which predation is not constantly occurring; where predation is constant, alerts become meaningless, so the threshold for alerting needs to be adjusted and reserved for appropriate events. Depending upon what is going to happen, simply making the community aware of an event can prevent it from occurring or minimize its impact. Although it is possible that the level of vigilance has increased for the herd, as discussed in Natural Enemies by Crawley, this may not be the primary reason for organizing into herd structures. Furthermore, the effective level of vigilance of the community may decrease to a level below that which a single individual would normally have maintained.
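The vigilance argument can be made concrete with a simple independence model; the probabilities below are hypothetical, and the assumption that watchers detect independently is a deliberate simplification:

```python
# Shared sentry duty (illustrative only). Assumes each watcher
# independently detects an approaching predator with probability p,
# ignoring correlated blind spots.

def group_detection(p_individual, n_watchers):
    """Probability that at least one of n watchers detects the threat."""
    return 1 - (1 - p_individual) ** n_watchers

# One animal watching full time, vs. five animals each watching only
# part of the time and therefore individually less likely to detect:
print(f"lone, full-time sentry:  {group_detection(0.60, 1):.2f}")  # 0.60
print(f"five part-time sentries: {group_detection(0.25, 5):.2f}")  # 0.76
```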
Another process that can occur when tasks are divided among individuals is cheating. Microbial Ecology: An Evolutionary Approach by McArthur defines cheating as "obtaining benefits from a collectively produced public good that [are] disproportionately large relative to the cheater's own contribution to that good". A cheating entity either neglects a task it should be performing, such as watching for predators, and instead collects resources for itself; or it acts as a sentry but does not alert when predators are detected, instead fleeing without warning the community. The second method of cheating is less likely, as others in the community would notice when the sentry behaves as though it is attempting to evade or flee a predator. Similar behaviors can occur within the information security community, as companies are not always willing to admit that they have been attacked or successfully compromised. This withholds an opportunity to share information about the methods of compromise, or even, at the most fundamental level, to let the rest of the community understand the rate at which attacks are occurring. In these instances the companies benefit by receiving alerts, but they are not performing the sentry role; they are not sharing their information, so no one else can benefit. As with cheating in nature, the second method is less likely to occur, as conceptually it is more difficult for an organization to simply flee from an attack.
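The incentive to cheat follows the standard public-goods structure; a minimal sketch with hypothetical benefit and cost values shows why free-riding can dominate individually even when sharing is best for the community:

```python
# Toy public-goods model of alert sharing (illustrative only).
# Each shared alert costs its sharer `cost` (disclosure, reputation,
# effort) and yields `benefit` to every one of the community's members.

def individual_payoff(i_share, others_sharing, benefit=1.0, cost=3.0):
    alerts = others_sharing + (1 if i_share else 0)
    return benefit * alerts - (cost if i_share else 0.0)

# Free-riding dominates individually, because my own alert returns me
# only `benefit` while costing me `cost`...
print(individual_payoff(i_share=False, others_sharing=9))  # 9.0
print(individual_payoff(i_share=True, others_sharing=9))   # 7.0
# ...yet each alert is worth benefit * 10 = 10.0 to a 10-member
# community, more than its cost of 3.0, so the community as a whole is
# better off when everyone shares than when everyone cheats.
```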
Developing a new strategy to exploit or compromise a system takes time and resources. It is not a trivial task, though in some cases it can occur fairly quickly. By sharing (or alerting on) a newly discovered exploitation strategy, the community can act upon the alert; the adversary's time and effort can thus be effectively wasted, or the amount of gain minimized, as the entire community is now aware of the strategy and it can be countered. If the community acts upon the alert and effectively counters the strategy, then in order for the attackers to remain successful they must continue to expend time and resources developing new strategies. The problem occurs when the strategy is not known, or is known but fails to be countered. This allows the attackers to leverage existing attacks and effectively forage on defender systems and resources with very small handling times.
Predators are also able to leverage the benefits of existing in social units, though their benefits are different, since they are the ones searching for and foraging on prey. By being social, predators can reduce their individual risk of injury: prey that attempts to retaliate when attacked will have a more difficult time if it is attacked by multiple predators.
Similarly, black hats or hacktivists can organize into online communities. Anonymous is an example of an online community which has performed large scale Distributed Denial of Service (DDoS) attacks and even successfully compromised the HBGary website with relatively little organization. In Anonymous, the wide range of experience levels helps contribute to successful attacks, as those who lack the skills to perform a specific attack can easily locate someone in the community with the required skill set. Cybercrime is an industry with many different individuals, each of whom specializes in a different task. There are those who develop exploits, those who develop the bots, those who develop the packing and obfuscation software that prevents reverse engineering, and others who collect and manage the deployed bots. The black hat and/or cybercrime community can also watch the open security community and:
- gain new inspiration and methodologies for attacks,
- discover which attacks and techniques have been disclosed, allowing them to determine when it is time to research new attacks and/or utilize previously undisclosed attacks,
- gain a better understanding of the techniques used to detect/identify/mitigate their attacks, and develop new strategies for defending their acquired systems.
Sunday, June 6, 2010
Adapting a Security Program to an Evolving Threat Landscape
Security programs (e.g. processes and procedures, not applications) must evolve and must be able to adapt in order to survive an evolutionary arms race. Programs as they are implemented today are largely unable to survive in the diverse and evolving threat environment. One prediction of the Red Queen hypothesis is that, over time, the fitness of an entity is reduced as selection pressures eliminate those attacks which are unsuccessful (e.g. selection works against the unsuccessful and promotes the successful).
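A toy simulation illustrates the prediction: a static defense loses ground because only the attacks it fails to stop survive to be copied and refined. All parameters here are invented for illustration:

```python
# Toy Red Queen dynamic (illustrative only): a defense that never
# changes erodes because only the attacks it fails to stop survive
# to be copied and refined.
import random

def simulate(rounds=6, seed=1):
    random.seed(seed)
    defense = 0.70                                    # static defense bar
    attacks = [random.random() for _ in range(200)]   # attack strengths
    for r in range(rounds):
        stopped = sum(a <= defense for a in attacks) / len(attacks)
        print(f"round {r}: defense stops {stopped:.0%} of attacks")
        survivors = [a for a in attacks if a > defense]
        if not survivors:
            break
        # each surviving technique is copied with small tweaks; a
        # notional ceiling of 1.0 caps attack strength
        attacks = [min(1.0, a + random.uniform(-0.10, 0.05))
                   for a in survivors for _ in range(4)][:200]

simulate()
```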
Security programs need to evolve in order to account for how an evolutionary arms race will affect the enterprise. These changes will need to be more than simply employing a technology which is biologically inspired or employs an evolutionary learning algorithm. There are a number of different ways in which a security program can be adapted to increase the likelihood of surviving a diverse threat environment. Aspects of a security program that can be adapted to an environment in which the threat is evolving include:
- Risk Management - Moving from a "risk management" based methodology to a methodology that understands the threat environment of the enterprise. The methodology should be updated to include processes that expect that the threat will adapt to the countermeasures that are implemented, and eventually bypass them.
- Security Testing - Moving from a "compliance checking" based testing methodology to one that mirrors and mimics how adversaries are attacking the enterprise.
- System Development Life Cycle - Moving from implementing security at the end of the Software Development Life-cycle (SDLC) to throughout the SDLC. Beyond just implementing security throughout the SDLC, it should be understood by program management and the developers that security requirements may change during development as the threat environment changes.
- Attack Research - Monitoring and researching attacks, along with providing information about potential attacks as they are discovered.
Depending upon the industry in which the enterprise participates, it may be subject to a wide variety of requirements. Federal entities are required to be compliant with the Federal Information Security Management Act (FISMA). Part of this compliance process includes implementing the National Institute of Standards and Technology (NIST) Special Publication (SP) 800-53 security controls according to the system's security impact level. It is expected that most systems will be part of the 'Low' baseline. Security controls like Information Input Validation (SI-10) and Error Handling (SI-11) are not required at this level. The lack of input validation can lead to successful attacks and compromises resulting from Buffer Overflows, Cross Site Scripting (XSS), SQL Injection (SQLi), O/S Command Injection, LDAP Injection (LDAPi), etc. Buffer Overflows are common in desktop applications and operating systems, while XSS and SQLi are some of the most common causes of web application compromises. Information Input Validation was only finalized as a requirement in Revision 3 of SP 800-53. Requirements should be updated and applied in order to mitigate threats that are present and active in the environment. Implementation of a security control that prevents threats from actively exploiting systems in the environment should not be held off until a system meets a higher impact level.
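As a concrete example of what SI-10 style input validation buys, consider the difference between concatenated and parameterized SQL; this sketch uses Python's sqlite3 module, and the table and function names are hypothetical:

```python
# Input validation via parameterized queries (illustrative only);
# the `users` table and lookup functions are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def lookup_unsafe(name):
    # Vulnerable: attacker-controlled input is concatenated into SQL.
    query = "SELECT role FROM users WHERE name = '" + name + "'"
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Parameterized: the driver treats `name` strictly as data.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

print(lookup_unsafe("' OR '1'='1"))  # [('admin',)] -- injection succeeds
print(lookup_safe("' OR '1'='1"))    # [] -- treated as a literal name
```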
Another reason that the "classical" risk management methodology must be adapted is that it does not account for the operational strategies that threats are employing in the environment. In the recommended standard for assessing and mitigating risk, as described in the Official (ISC)2 Guide to the Certified Information Systems Security Professional (CISSP) Common Body of Knowledge (CBK), controls are implemented by weighing the value of the information being protected against the cost of the control to be implemented. This is the value of the information to the enterprise if it is lost, stolen or otherwise exposed. This process does not take into account the value of the system in relation to other internal or external entities. The methodology falls apart if the attacker does not care about the value of the information on the system, but is more interested in the business relationships of the enterprise. An attacker may be willing to spend significant resources to compromise the enterprise in order to attack an external entity.
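For reference, the classical calculation weighs an annualized loss expectancy (ALE) against control cost; the figures below are hypothetical:

```python
# The classical quantitative risk calculation (the ALE model described
# in the CISSP CBK); all figures are hypothetical.

asset_value = 500_000             # value of the information if lost
exposure_factor = 0.30            # fraction of value lost per incident
annual_rate_of_occurrence = 0.5   # expected incidents per year

single_loss_expectancy = asset_value * exposure_factor            # 150,000
annualized_loss_expectancy = (single_loss_expectancy
                              * annual_rate_of_occurrence)        # 75,000

annual_control_cost = 40_000
# The control is justified under this model because it costs less than
# the expected loss it prevents -- but nothing here captures the value
# of the system to an attacker targeting the enterprise's partners.
print(annualized_loss_expectancy > annual_control_cost)  # True
```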
In addition to the "classical" risk management methodology, the methods used for testing should be adaptable. Simply stated, there is more to security testing than performing a "compliance" check of the system or clicking a button to generate a security report. Although compliance checks are required, they do very little in terms of aiding an enterprise in understanding its risk posture. One could assume that simply subjecting the enterprise to "internal predation" by using testers would be of benefit. Security Test and Evaluation (ST&E), Compliance Scanning, Independent Verification and Validation (IV&V), and Penetration Testing are different types of testing that most enterprises use to evaluate a system. Of those, Penetration Testing is potentially the one that offers the most similarities to how an attacker works to compromise a system. It can be implemented blindly, such that penetration testers are permitted to attack the enterprise as an attacker would. There are, however, a number of issues with simply assuming that a penetration tester will mimic an attacker:
- Skill/Resource Levels - The penetration tester may or may not have the resources or skill to mimic an attacker.
- Test Methodology - The penetration tester may not be permitted to attack the enterprise using the methods that an attacker is using (or even be aware of the methods that attackers have used to successfully exploit the enterprise).
- Test Duration - An enterprise is not going to be willing to wait months or years for a penetration test to conclude. Most enterprises do not have the resources to waste on an extended penetration test.
- Test Depth and Breadth - Most importantly, a penetration tester should be reporting multiple attack vectors while an attacker only needs one.
Some enterprises do not think there is any value in allowing penetration testers to use attacks which are commonly seen against the enterprise. If they are not allowing some types of attacks, they should at minimum be providing some type of mitigation to protect the enterprise. For example, most enterprises are subjected to phishing and spear phishing, but they do not allow penetration testers to use this methodology during a test. If the enterprise does not already have a program which actively phishes the enterprise, they are leaving an actively exploited vulnerability unmitigated. Similarly, enterprises do not commonly allow clients and end points to be targeted during a penetration test, despite client-side attacks being cited as a source of enterprise compromises, in which a user visits a website and the browser is exploited and/or hijacked.
Enterprises also do not have the luxury of allowing penetration tests to occur over months and years, despite attackers being able to attack the enterprise for extended periods. An enterprise could allow a penetration tester extended periods to attack the enterprise, but it would be more beneficial to actively engage the penetration tester during the exercise. As an example, instead of requiring a penetration tester to acquire credentials via a brute force attack against the system, they should be provided credentials for the test. Or, during the test, they should have limited access to the source code of an application to help them refine their attacks. Requiring a tester to brute force an attack vector wastes both parties' time, and unnecessarily increases the time before exploitable vulnerabilities can be identified, reported, and mitigated.
Lastly, as far as testing is concerned, an attacker only needs to find one exploitable vector, while a penetration tester should find and report as many exploitable vulnerabilities as possible. A test should not conclude when a single exploitable vector is found; the penetration tester should continue to find as many exploitable vectors as possible. Although testing can provide valuable information about the exploitability of vulnerabilities within a system, it typically only occurs at the end of the SDLC. Correcting vulnerabilities at the end of development, just prior to an application being placed into production, can be expensive in time and resources.
The integration of security into the SDLC needs to be adapted to ensure that systems placed in production are capable of defending themselves against the threats inherent in the environment. Integrating security from an evolutionary point of view into the SDLC is more than just adding static source code analysis or application/protocol fuzzing to the unit tests. It is also more than just selecting an appropriate set of security requirements at the time requirements are collected. Between the time an application is designed and the time it is deployed into production, the threat environment can drastically change. It should be understood that security requirements levied against a program can change throughout the SDLC as new threats adapt and evolve in the environment. Beyond ensuring that program management and the developers understand that requirements will change, the application should be designed to withstand attacks that the enterprise has seen or will likely see during its operational phase. This means investigating and tracking the attacks that are actually occurring in the environment, and understanding what other attacks will occur in the near future.
In addition to monitoring attacks that the enterprise is actually encountering, the enterprise should also monitor what attacks are being used against other enterprises in the same industry, and what researchers are presenting. Only by observing what is actually occurring in the environment can the enterprise understand how to adapt and modify its security requirements as the threat environment changes. When monitoring the research community, it should be understood that it can take between 5 and 10 years from the publication of an attack before the attack becomes commonplace, although it can occur more quickly. When looking for new attacks, attackers are not going to publish them before they use them; by contrast, researchers will publish their attacks. In some cases attackers will adopt these new attacks, but in other cases a researcher will publish details about a previously undisclosed attack that only attackers have been employing. By watching what researchers are publishing, strategies can be planned and implemented to mitigate attacks before they become commonplace.
Implementing a security program that is able to adapt and respond to the evolving threat landscape requires more than selecting vendor solutions using "leap ahead" or evolutionary algorithms. It requires that the way in which a security program operates be adaptable. In summary, some of the necessary adaptations will require changes ranging from modifying the risk management methodology that is employed, to adopting a more predatory approach to security testing, to modifying the SDLC (and the expectations of developers), to watching (and participating in) active security research.
Tuesday, May 18, 2010
Why Apply Evolutionary Biology to Information Security?
Very few security professionals will agree that the situation within information security is globally improving. There may be local pockets in which an organization is able to hold/maintain a strong security posture. Problems discovered over a decade ago (e.g. Buffer Overflows, Cross Site Scripting, etc.) still persist, and are consistently rated as being some of the most dangerous programming flaws (see the OWASP Top 10 and the CWE/SANS Top 25). The state of cybersecurity is severe enough that some professionals are seeking solutions for financial institutions which assume that the clients that they are conducting business transactions with are compromised. Given that some estimates find that well over 90% of the systems on the Internet are not fully patched, and a significant percentage of the systems on the Internet are compromised with at least one form of malware, this is a reasonable approach.
Events like these can be considered signs that efforts in the area of information security are failing. There can be many reasons that an entity fails in a game; one possible reason is that the rules of the game are not understood. If the rules of the game are not understood, it can be difficult at best to consistently play a game well, especially if the rules are stacked against you. Lately there have been a number of organizations looking to implement "game changing" strategies. Again, changing the rules requires that the rules are understood.
Most fields of science have one or more major theories which are used to explain observable phenomena and provide a basis for testing and interacting with the world. Physics has the Theory of General Relativity and the Standard Model, Chemistry has the Periodic Table and Quantum Mechanics, and Biology has Genetics and Evolution. Despite drawing from several scientific fields of study such as Mathematics, Linguistics, and Solid State Physics, with the more recent introduction of Psychology and Economics, information security lacks a framework to provide predictive and testable hypotheses.
Some institutions have recognized that simply teaching computer science provides too narrow a focus for their curriculum, and have reorganized their departments to apply a more broad-based and interdisciplinary approach to their studies, moving into the field of Informatics; Bioinformatics and Security Informatics are specific examples of the resulting reorganization. There are already attempts to apply the concepts of biology to information security: attempts to build automated immune systems, to predict computer virus outbreaks with models similar to those used for their biological analogues, and to implement programs with evolutionary algorithms to facilitate machine learning.
The hypothesis being presented is that information security is an evolutionary system, similar to what occurs naturally, and can be modeled and explained by the field of evolutionary biology. Specifically, some of the situations occurring in the field can be understood as an evolutionary arms race (e.g. malware). Evolutionary Biology has existed for 150 years and has been able to provide an understanding of one of the most complicated natural systems in existence: life. Some of the frameworks within evolutionary biology can be directly applied, while others may need to be modified or even replaced, and others may prove not to apply. Applying evolutionary biology could provide a richer understanding of the rules by which the game is being played. Once the rules are understood, it should be possible to understand where and how the rules can be modified to change the game in a meaningful and substantive way.
Thursday, December 3, 2009
Evolutionary Processes and Natural Selection Reloaded
There are four basic evolutionary processes: Natural Selection, Genetic Drift, Undirected Mutation, and Gene Flow, all of which operate on populations of entities. The interplay between these processes can enhance or suppress the fitness of the individuals within a given population.
The process most commonly discussed when addressing evolutionary biology is Natural Selection. In its basic formulation, natural selection requires only four conditions to operate on a population (based upon those found within Evolution, 3rd Edition by Mark Ridley):
- reproduction: entities must reproduce to form a new generation,
- heredity: offspring must tend to resemble their parents,
- variation in the individual characters among the members of the population,
- variation in the fitness of organisms according to the state they have for a heritable character.
Selection is readily evident in information security, as cryptographic algorithms which are broken are slowly removed from general use and newer algorithms are designed. Selection usually does not cause changes to occur instantaneously unless the selection pressure is strong. MD5 is still in widespread use throughout the computing base despite having been known to be a weak algorithm for some time.
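A small sketch of what "selecting against" a weak algorithm looks like in practice; the record and function names are hypothetical, and Python's hashlib is used for illustration:

```python
# Selecting against a weak algorithm in practice (illustrative only):
# migrating stored digests from MD5 to SHA-256. The record and function
# names are hypothetical.
import hashlib

def digest(data: bytes, algorithm: str = "sha256") -> str:
    """Hex digest of `data`; defaults to SHA-256 rather than MD5."""
    return hashlib.new(algorithm, data).hexdigest()

legacy = {"report.pdf": digest(b"example-bytes", "md5")}

# Re-hash with the stronger algorithm whenever the original data is
# next seen, so weak digests gradually disappear from the population.
migrated = {name: digest(b"example-bytes") for name in legacy}
print(len(legacy["report.pdf"]), len(migrated["report.pdf"]))  # 32 64
```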
When considering how selection applies to information security, it is important to understand that in evolutionary biology, when an entity is selected against, it has been removed from the reproductive (or effective) population. In most cases, this means that the entity has died. When an entity has been selected against in information security, it does not necessarily mean that the system has died. A more complete way of stating it is that a system which has been selected against is no longer present or is no longer in its intended operating state. In the case of malware, this would mean that the malware has been removed from a system, or its command and control infrastructure has been eradicated. In the case of an IT system, it has been selected against if it has been compromised or if it has been wiped (as in the case of a complete rebuild). Unlike organisms in the environment, which once eliminated cannot be brought back to life, IT systems can be wiped and rebuilt.
Another significant process within evolutionary biology is the Undirected Mutation. Although life is able to have a high fidelity when it is replicating, errors are introduced when replication occurs. The errors cause the cellular processes to vary in ways that can enhance the organisms fitness, reduce it's fitness or have little effect on its fitness. Natural Selection works with Undirected Mutations to select for entities which have a higher fitness, and prune out the entities with a lower fitness. Although there are evolutionary algorithms within the fields of artificial intelligence, most of the processes that modify the behavior or enhance the functionality of programs are guided by Directed Mutation. Directed Mutations are mutations which are deliberately made with the goal of producing an desired effect, and in the case of malware it can range from increasing its ability to infect remote hosts, or hinder the ability of a malware analyst in determining the true nature of the application or simply getting past the latest anti-virus scanner definitions.
If Directed Mutations as a process are reinterpreted to include modifications at a larger scale, then it is tempting to think of directed mutations as being applied by an intelligent entity, most commonly referred to as an intelligent designer. Although the intelligence designer has no scientific basis in evolutionary biology, it can apply to information security in a more limited way. Evolutionary biology works without having an "intelligent designer" guiding the evolution and development of an entity. Information systems typically work with a designer and/or engineer who designs a system which is then implemented. The fitness of a system is then determined and it can be revised during the next design and subsequent implementation. As within evolutionary biology, the application of a designer to information security does not require that a single overall designer exist. Indeed the opposite is true. There are large numbers of individual designers operating and competing by proxy through their fielded applications and programs for systems and resources.
Genetic Drift is one of two evolutionary processes which can directly work against natural selection. In Genetic Drift, random inheritance of weakly or neutrally selected characters during reproduction can cause characters to either eventually dominate or be removed from a population. Weakly selected characters are those characters which only have minor selection pressures working against them, while neutral characters effectively have no selection pressures operating on them. Genetic Drift is one of the processes which is able to counter the act of Natural Selection. It is able to work against Natural Selection in that during reproduction, despite being a character that provides for a strongly enhanced fitness, it may not be passed along to subsequent generations. If there are two characters {A,B} and only one will be passed along, it will be either A or B. The other character can be lost unless there are sufficient numbers to ensure that statistically it is passed along. Unlike most of the other evolutionary processes, Genetic Drift does not have an easily identifiable analogy to information security other than personal preferences in the choice of browsers, office automation applications, operating systems, etc.
The last of the four major evolutionary processes is Gene Flow. Gene Flow occurs when two populations having different allele frequencies interbreed (usually due to a period of isolation and then reintroduction). Typically in an isolated population, Natural Selection and Genetic Drift will alter the characters of the population from their original frequencies. When the population encounters another population with which it can interbreed, the resulting interactions cause gene frequencies to change in the resulting population. It is not required that the two populations be environmentally isolated. Gene Flow can also occur if there is a strong selection pressure operating locally within the population (normally on a fringe population in which the environment is different from that of the main body).
When selection favors specific adaptations within a population, the adjusted gene frequency of initial population's genes may flow into another population with a different set allele frequency altering the resulting gene frequencies for both populations. This situation can occur because a population has becomes isolated due to environmental conditions or because an adaptation favors a specific frequency in a sub population. Since the genes favored in the different populations can be different, and the intermixing of the genes results in an intermediate allele frequencies, this process can actually work against Natural Selection preventing optimal solutions from being established. As an example of this in evolutionary biology, Stephen Stearns and Richard Sage (Mal-Adaptation in a Marginal Population of the Mosquito Fish, Evolution, 1980) found that specific adaptations which could have increased the overall the fitness of a border population of mosquito fish attempting to survive in fresh water was being hindered by gene flow resulting from interbreeding with the main population.
A close parallel with Gene Flow is found within the formal education of programmers for producing secure code. In order to create and distribute programs the developer does not need to be trained in how to create a secure program, only in that they need to be able to create a functional program. Some organizations have deliberately allocated resources to train their developers in methods for developing and implementing secure programs. But if the organization is only able to attract new developers which have not received any training in a secure development lifecycle, they must expend resources to educate the developer. Depending on the turnover rate of the organization and project schedules, this reoccurring cost could be significant enough to cause the organization to loose their focus on developing a secure product.
Genetic Drift, Gene Flow, Natural Selection and Undirected Mutation form the four basic processes of evolutionary biology. With little modification or reinterpretation these processes can be applied to information security. Natural Selection becomes Artificial Selection, Undirected Mutation becomes Directed Mutation, Gene Flow is still represented but Genetic Drift becomes less important of an evolutionary process.
The process most commonly discussed when addressing evolutionary biology is Natural Selection. In its basic formulation, natural selection requires only four conditions to operate on a population (based upon those found in Evolution, 3rd Edition, by Mark Ridley):
- Reproduction - Entities must reproduce to form a new generation.
- Heredity - Entities produced via reproduction must tend to possess the characteristics (e.g. traits) from the previous generation.
- Individual Variation - The population of entities is not identical.
- Characteristic Fitness - Individual characteristics confer varying degrees of fitness, which allows the entities bearing them to propagate those traits to subsequent generations.
Considering how each of these conditions maps onto information security:
- Reproduction - Most entities within information systems are installed or copied onto other systems rather than truly reproducing. This form of reproduction is more akin to replication, which is essentially cloning rather than asexual reproduction: in asexual reproduction each subsequent generation consists of identical or nearly identical copies produced as offspring, while cloning produces exact copies.
- Heredity - The condition for heredity is easily satisfied. Computers are quite effective at producing exact copies of programs and data, and there are numerous methods for performing integrity checks to ensure that a replication event did indeed produce an identical copy.
- Individual Variation - Natural Selection requires that there is variability within a population. Within information security, programs are installed or replicate in an environment without variability: an entity creates exact copies of itself, and any errors within the replication routines often cause fatal errors when the copy attempts to execute. Simply stated, programs are produced by installation or infection. There may be some variation within the population if the entity is polymorphic or metamorphic, but typically a program is created and then processed through a polymorphic encoder to produce the variations.
- Character Fitness - The fitness of an entity within a population varies based on the character traits it has inherited. Some characters confer a higher fitness, others a lower fitness; those with a higher fitness will tend to have their characters dominate a population as they succeed in reproducing. Because the population consists of cloned entities, individual variability has been eliminated or minimized, which means there is minimal variability in character fitness for selection to act on.
These conditions can, however, be reinterpreted so that they apply to information systems:
- Reproduction can be reinterpreted as replication, or simply as a process or an algorithm that replicates an entity.
- Heredity can be reinterpreted simply as a process for passing characters from a parent entity to its offspring. Heredity allows individual characters to be linked between offspring and the previous generation.
- Individual Variation can be reinterpreted as a process in which different characters are generated based on the previous generation's characters. This could be as simple as an algorithm that incrementally modifies the parameters passed into an entity's control loop, altering the interactions between the entity and its environment. Like biological systems, these parameters are modified during reproduction and are assumed to be relatively static during the lifespan of an entity.
- Character Fitness can be reinterpreted simply as a filtering function, in which individual variation causes the fitness of an entity to vary such that selection can act on the individuals within the population, allowing the higher-fitness entities to survive while the lower-fitness entities are pruned from the population.
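Taken together, these four reinterpretations describe a conventional evolutionary algorithm. A minimal sketch in Python (the fitness function, mutation scale, and population size here are arbitrary illustrative assumptions, not anything prescribed by the post):

```python
import random

# Toy fitness function: how well an entity's control parameter matches an
# environmental optimum. In a real system this would measure success
# against actual selection pressures (detection rate, uptime, etc.).
TARGET = 42.0
def fitness(param):
    return -abs(param - TARGET)

def reproduce(parent, mutation_scale=1.0):
    # Heredity: the offspring starts from the parent's character.
    # Individual Variation: a small random modification at replication.
    return parent + random.gauss(0.0, mutation_scale)

population = [random.uniform(0.0, 100.0) for _ in range(50)]

for generation in range(100):
    # Reproduction: each entity replicates, with variation.
    offspring = [reproduce(p) for p in population]
    combined = population + offspring
    # Character Fitness as a filtering function: keep the fittest half.
    combined.sort(key=fitness, reverse=True)
    population = combined[:50]

print(f"best parameter after selection: {population[0]:.2f}")
```

Each loop iteration exercises all four conditions: replication, inheritance of the parent's parameter, random variation at replication time, and a fitness filter that prunes the lower-fitness half of the population.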
Selection is readily evident in information security: cryptographic algorithms that are broken are slowly removed from general use and newer algorithms are designed. Selection usually does not cause changes to occur instantaneously unless the selection pressure is strong. MD5 is still in widespread use throughout the computing base despite having been known to be a weak algorithm for some time.
When considering how selection applies to information security, it is important to understand that in evolutionary biology, when an entity is selected against, it has been removed from the reproductive (or effective) population. In most cases, this means the entity has died. When an entity is selected against in information security, it does not necessarily mean the system has died. A more complete way of stating it is that a system which has been selected against is no longer present or is no longer in its intended operating state. In the case of malware, this would mean the malware has been removed from a system or its command and control infrastructure has been eradicated. In the case of an IT system, it has been selected against if it has been compromised or if it has been wiped (as in a complete rebuild). Unlike organisms in the environment, which cannot be brought back to life once eliminated, IT systems can be wiped and rebuilt.
Another significant process within evolutionary biology is Undirected Mutation. Although life replicates with high fidelity, errors are introduced when replication occurs. These errors cause cellular processes to vary in ways that can enhance an organism's fitness, reduce its fitness, or have little effect on it. Natural Selection works with Undirected Mutation to select for entities with a higher fitness and prune out entities with a lower fitness. Although there are evolutionary algorithms within the field of artificial intelligence, most of the processes that modify the behavior or enhance the functionality of programs are guided by Directed Mutation. Directed Mutations are mutations deliberately made with the goal of producing a desired effect; in the case of malware, that can range from increasing its ability to infect remote hosts, to hindering a malware analyst's ability to determine the true nature of the application, to simply getting past the latest anti-virus scanner definitions.
If Directed Mutation as a process is reinterpreted to include modifications at a larger scale, then it is tempting to think of directed mutations as being applied by an intelligent entity, most commonly referred to as an intelligent designer. Although the intelligent designer has no scientific basis in evolutionary biology, the concept can apply to information security in a more limited way. Evolutionary biology works without an "intelligent designer" guiding the evolution and development of an entity. Information systems typically involve a designer and/or engineer who designs a system which is then implemented. The fitness of the system is then determined, and it can be revised during the next design and implementation cycle. As within evolutionary biology, applying a designer to information security does not require that a single overall designer exist. Indeed, the opposite is true: there are large numbers of individual designers operating and competing by proxy, through their fielded applications and programs, for systems and resources.
Genetic Drift is one of two evolutionary processes which can directly work against natural selection. In Genetic Drift, random inheritance of weakly or neutrally selected characters during reproduction can cause characters either to eventually dominate a population or to be removed from it. Weakly selected characters are those with only minor selection pressures working against them, while neutral characters effectively have no selection pressures operating on them. Genetic Drift can counter Natural Selection because, during reproduction, even a character that provides a strongly enhanced fitness may by chance not be passed along to subsequent generations. If there are two characters {A, B} and only one will be passed along, it will be either A or B; the other character can be lost unless there are sufficient numbers to ensure that statistically it is passed along. Unlike most of the other evolutionary processes, Genetic Drift does not have an easily identifiable analogue in information security, other than personal preferences in the choice of browsers, office automation applications, operating systems, etc.
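The sampling effect behind drift is easy to demonstrate numerically. In the sketch below (a simple resampling scheme over a perfectly neutral character; the population size and run count are arbitrary assumptions), roughly half the runs end with the character taking over the population and half with it lost, despite no selection operating at all:

```python
import random

def drift_to_fixation(pop_size=100, start_freq=0.5):
    """Track a neutral character until it fixes (freq 1) or is lost (freq 0)."""
    count = int(pop_size * start_freq)
    generations = 0
    while 0 < count < pop_size:
        # Each offspring randomly inherits the character with probability
        # equal to its current frequency -- pure sampling, no selection.
        freq = count / pop_size
        count = sum(1 for _ in range(pop_size) if random.random() < freq)
        generations += 1
    return count == pop_size, generations

fixed = sum(1 for _ in range(200) if drift_to_fixation()[0])
print(f"neutral character fixed in {fixed}/200 runs")  # expect roughly half
```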
The last of the four major evolutionary processes is Gene Flow. Gene Flow occurs when two populations with different allele frequencies interbreed (usually after a period of isolation followed by reintroduction). Typically, in an isolated population, Natural Selection and Genetic Drift will alter the characters of the population from their original frequencies. When the population encounters another population with which it can interbreed, the resulting interactions cause gene frequencies to change in both populations. It is not required that the two populations be environmentally isolated; Gene Flow can also occur when a strong selection pressure operates locally within a population (normally on a fringe population whose environment differs from that of the main body).
When selection favors specific adaptations within a population, the adjusted gene frequencies of the initial population may flow into another population with a different set of allele frequencies, altering the resulting gene frequencies of both populations. This situation can occur because a population has become isolated due to environmental conditions, or because an adaptation favors a specific frequency in a subpopulation. Since the genes favored in the different populations can differ, and the intermixing of genes results in intermediate allele frequencies, this process can actually work against Natural Selection, preventing optimal solutions from being established. As an example from evolutionary biology, Stephen Stearns and Richard Sage (Mal-Adaptation in a Marginal Population of the Mosquito Fish, Evolution, 1980) found that specific adaptations which could have increased the overall fitness of a border population of mosquito fish attempting to survive in fresh water were being hindered by gene flow resulting from interbreeding with the main population.
A close parallel with Gene Flow is found in the formal education of programmers in producing secure code. In order to create and distribute programs, a developer does not need to be trained in how to create a secure program, only in how to create a functional one. Some organizations have deliberately allocated resources to train their developers in methods for developing and implementing secure programs. But if an organization can only attract new developers who have not received any training in a secure development lifecycle, it must expend resources to educate them. Depending on the organization's turnover rate and project schedules, this recurring cost could be significant enough to cause the organization to lose its focus on developing a secure product.
Genetic Drift, Gene Flow, Natural Selection, and Undirected Mutation form the four basic processes of evolutionary biology. With little modification or reinterpretation, these processes can be applied to information security: Natural Selection becomes Artificial Selection, Undirected Mutation becomes Directed Mutation, Gene Flow is still represented, and Genetic Drift becomes a less important process.
Wednesday, July 29, 2009
Extinction and End Games
Recently Jeff Moss gave an introduction at the opening of Black Hat DC 2009, in which he essentially asked, "Is there any problem in security that has been definitively crushed or completely eradicated? Is there a problem from 10 years ago that is no longer a concern?" Specific instances of problems have been eradicated, but the families of problems that persist include computer viruses, buffer overflows, cross-site scripting (XSS), SQL injection (SQLi), etc. Computer viruses have existed since 1971, buffer overflows were popularized in 1996, XSS has been around since about 1997, and SQLi has been present since 1998.
Managers and security professionals are often looking for a silver bullet to solve all of an organization's information security issues. Vendors of security products are often willing to demonstrate that their single or integrated security solution will provide all of the protection an enterprise needs against emerging threats, the next generation of attacks, etc.
As information security is engaged in a Red Queen race, or an evolutionary arms race, there should be no expectation that any single strategy, or even multiple strategies, can always ensure the survival of an organization. The security controls that are put in place act as selection pressures on adversaries, ensuring that only the successful exploitation strategies are passed on to the next generation of attacks. The security controls ensure that attackers and malware authors continue escalating their exploitation strategies against the implemented security solutions in order to survive. This escalatory relationship is akin to the evolutionary arms race between predator and prey.
There are multiple outcomes for predator and prey resulting from an evolutionary arms race (Evolutionary Biology, 3rd Edition, Futuyma):
- The first outcome is that neither side gains the advantage. In this situation the evolutionary arms race continues, with each side escalating its strategies (Richard Dawkins and J. R. Krebs, Arms Races between and within Species, 1979). Within an escalatory arms race, both the predator's weapons and the prey's defenses become more effective than those of previous generations, but neither side gains an advantage (G. J. Vermeij, Evolution and Escalation, 1999). More simply stated: as time passes a predator's weapons become more refined, and in response the prey species evolves better defenses. The end result is that neither side makes any net progress, but a modern predator would be better able to exploit an ancestral prey than a predator from that period.
- The second outcome is that, as the evolutionary costs of continuing the escalation increase, a set of strategies employed by both sides causes an equilibrium to be established. This equilibrium can form what is referred to as an Evolutionarily Stable Strategy (ESS). In an ESS, a point is reached where the system is stable and resistant to invasion by outside strategies, based on the costs associated with each strategy. ESSs are detailed in Evolution and the Theory of Games, by John Maynard Smith, 1982, and in The Selfish Gene by Richard Dawkins (a minimal worked example of an ESS appears after this list).
- The third outcome is that the system undergoes continual or periodic change as a new strategy is employed and a counter-strategy is evolved and then deployed. This is similar to disease/parasite and host relationships, in which a disease or parasite invades a host population. The population takes time to develop resistance or immunity to the invasive disease/parasite. For a period of time the population may be quite successful at repelling the disease/parasite, but eventually the disease/parasite can develop a strategy to overcome the factor that was keeping it out of the host. This is commonly seen as the overuse of antibiotics has caused various antibiotic-resistant strains to develop, such as Methicillin-resistant Staphylococcus aureus (MRSA) or Extensively Drug-Resistant Tuberculosis (XDR-TB).
- Lastly, an evolutionary arms race can result in one or both of the species going extinct. One side of the arms race evolves an adaptation which allows it to fully exploit, or to fully evade exploitation by, the other species, in a way the other species cannot adapt to before becoming extinct. Conversely, if a predator was entirely focused on exploiting a single prey species, then with the extinction of the prey the predator species may also collapse.
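As a concrete illustration of the second outcome: in the Hawk-Dove game from Maynard Smith's Evolution and the Theory of Games, when the cost of fighting C exceeds the value of the contested resource V, the mixed strategy "play Hawk with probability V/C" is the ESS. A minimal sketch (the payoff values chosen here are arbitrary):

```python
# Hawk-Dove payoffs: V = value of contested resource, C = cost of a fight.
V, C = 2.0, 10.0

def payoff(p, q):
    """Expected payoff to a strategy playing Hawk with probability p
    against a resident population playing Hawk with probability q."""
    hawk = q * (V - C) / 2 + (1 - q) * V   # payoff of playing Hawk
    dove = q * 0 + (1 - q) * V / 2         # payoff of playing Dove
    return p * hawk + (1 - p) * dove

ess = V / C  # mixed ESS when C > V: play Hawk with probability V/C
for invader in [0.0, 0.1, ess, 0.5, 1.0]:
    print(f"invader p={invader:.2f}: {payoff(invader, ess):+.3f} "
          f"vs resident {payoff(ess, ess):+.3f}")
```

Running this shows every invading strategy earning exactly the same expected payoff against the resident population as the resident earns against itself; no outside strategy can do strictly better, which is why the equilibrium resists invasion.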
In order to cause the extinction of predator strategies (or, in the case of information security, an attacker's or malware author's strategies), it is not necessary to wipe out an entire population in a single event. Within evolutionary biology, an estimate of effective population size over time is given by the equation P(t) = P0 * exp((b - d) * t), where P(t) is the population size at future time t, P0 is the initial effective population size, b is the birth rate, and d is the death rate. As long as the birth rate is higher than the death rate, the population grows exponentially; if the death rate is higher than the birth rate, the population shrinks. The birth and death rates are typically driven by environmental factors such as competition for available resources and the types of selection pressures present. Essentially, the environment only has to change faster than the opponent's strategies can adapt.
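Reading the equation numerically makes the point about gradual extinction: when d exceeds b even slightly, the population decays exponentially toward zero without any single catastrophic event. A minimal sketch (the rates and starting population are illustrative assumptions):

```python
import math

def population(p0, b, d, t):
    # P(t) = P0 * exp((b - d) * t): exponential growth when b > d,
    # exponential decline when d > b.
    return p0 * math.exp((b - d) * t)

p0 = 1_000_000  # initial effective population size (illustrative)
for b, d, label in [(0.30, 0.25, "b > d"), (0.25, 0.30, "d > b")]:
    print(label, [round(population(p0, b, d, t)) for t in (0, 10, 20, 40)])
```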
By inspecting the malware growth rate, it appears that the "birth" rate is higher than the "death" rate: the effective malware population (based on the number of unique samples) is growing exponentially. Malware populations have not yet reached the carrying capacity of their environment. Within evolutionary biology and ecology, the carrying capacity is the population size that a given environment can support based on the available resources. If a population is increasing in size, then the carrying capacity has not been reached, as more resources are still available to support the growth. As the population approaches the carrying capacity, population growth slows because available resources become more difficult to access. If the population exceeds the carrying capacity, it will shrink as selection works against the entities that are unable to extract enough resources to survive.
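The conventional way to add a carrying capacity to the exponential model is the logistic equation, dP/dt = r * P * (1 - P/K), where r = b - d and K is the carrying capacity: growth stalls as P approaches K and reverses if P overshoots it. A minimal numerical sketch (all parameter values here are illustrative assumptions):

```python
def logistic_step(p, r, k, dt=1.0):
    # dP/dt = r * P * (1 - P / K): growth slows as P approaches K
    # and turns negative if P overshoots K.
    return p + r * p * (1 - p / k) * dt

p, r, k = 1_000.0, 0.5, 100_000.0
trajectory = []
for t in range(40):
    trajectory.append(round(p))
    p = logistic_step(p, r, k)
print(trajectory[::8])  # early exponential growth, then a plateau near K
```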
Ideally, security professionals would like to see the current situation change from a continually escalating arms race, or a cyclic exchange of strategy and counter-strategy, to the extinction of attacker/malware strategies. By changing the selection pressures applied against these invasive strategies, it could be argued that extinction can be triggered. A set of selection pressures could be implemented such that nothing could survive, or the selection pressures of the environment could change so quickly that the invasive strategy does not have time to evolve successful adaptations. Another solution could involve changing local environmental selection pressures independently of the global selection pressures, such that only specific strategies can thrive in specific "regions." This is similar to having an organization switch to a different operating system and/or browser so that the commonly employed exploit strategies fail against that organization.
One of the main problems with solving the issue by drastically changing the environment is that the environment has to change quickly, more quickly than the invasive strategy can evolve adaptations. The current computing environment is not conducive to drastic changes implemented throughout the entire infrastructure. Virtualization is often proposed as a security solution, but implementing it globally would take years to decades. Most users are not going to move to a virtualized operating system unless they are acquiring a new computer, and computers are typically not replaced or even upgraded annually. This represents a significant period of time in which attackers and malware authors can update their strategies and adapt to the new environment. As previously discussed, attackers and malware have the advantage when the environment changes because of their smaller size.
Another method for improving the situation within the Red Queen race occurring in information security would be to attempt to convert the situation into an ESS. In an ESS, an equilibrium is reached that is resistant to invasion by outside strategies. If this occurred, attackers and malware would reach a balance with security professionals in which new infections are cleaned at approximately the same rate as they occur.
Instead of focusing on the extinction of malware in the near term, another strategy would be to focus on the infectious nature of malware and on reducing its associated virulence. In the interactions between diseases/parasites and their hosts, the virulence of the disease/parasite tends to be associated with how it is transmitted between hosts. A disease or parasite that is transmitted from parent to offspring is said to be vertically transmitted through a population. Diseases and parasites that are vertically transmitted tend to have a lower virulence, or to exhibit avirulent behavior: if the disease or parasite reduces the host's fitness too much, it will not be able to propagate to the host's offspring during or after reproduction, since no offspring will be produced. Horizontally transmitted diseases/parasites jump from host to host in a population through a variety of mechanisms: direct contact, the environment, or a pathogen vector (such as the mosquito in the case of malaria). As the virulence of the disease/parasite does not depend on the survival of the host to reproduce, only on contact with other vulnerable hosts, it is capable of reaching a much higher virulence and significantly reducing the fitness of the host.
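This transmission-mode argument can be caricatured numerically. In the toy model below, every functional form and constant is an assumption made purely for illustration: a vertically transmitted parasite's spread is proportional to the host births it leaves intact, so its payoff falls monotonically with virulence, while a horizontally transmitted parasite trades transmission gains against a shortened infectious period, favoring an intermediate but higher virulence.

```python
# Toy virulence model. v = virulence (host fitness cost), MU = background
# host death rate. All shapes and constants below are illustrative assumptions.
MU = 0.1

def vertical_fitness(v):
    # Spread tracks host reproduction, which declines with virulence.
    return max(0.0, 1.0 - v)

def horizontal_fitness(v):
    # Transmission rises with virulence (saturating) but the infectious
    # period shrinks as virulence kills hosts faster: beta(v) / (MU + v).
    beta = v / (v + 0.5)
    return beta / (MU + v)

candidates = [i / 100 for i in range(1, 101)]
best_vert = max(candidates, key=vertical_fitness)
best_horiz = max(candidates, key=horizontal_fitness)
print(f"vertical transmission favors virulence ~{best_vert:.2f}")
print(f"horizontal transmission favors virulence ~{best_horiz:.2f}")
```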
There are a number of different ways an evolutionary arms race can play out: it can continue to escalate; it can escalate until the costs associated with the escalation cause the system to stabilize into an ESS; it can develop in cyclic phases, as in the interactions between diseases/parasites and the immune responses of their hosts; or one of the interacting entities can go extinct because it is no longer able to adapt to the environment. Given the rate at which the malware population is increasing, it does not appear that the evolutionary arms race has stabilized into an ESS or that malware will go extinct in the near future, so either the escalatory nature of the race or the cyclic interplay between strategy and counter-strategy will continue for the foreseeable future. The strategies employed by attackers and malware authors rely on small, easily adaptable applications, which in terms of evolutionary biology means they can more readily adapt to environmental selection pressures. Instead of trying to cause malware to go extinct, perhaps a way can be found to tie it to the host and force it to adopt a more avirulent or even beneficial behavior by being vertically transmitted through a computer population instead of horizontally transmitted.
Tuesday, July 7, 2009
Reducing the Time for Adaptation
Periodically, security professionals and security vendors tout the idea that reducing the reaction time between an event and the employment of a counter-strategy can potentially resolve the evolutionary arms races within information security. This idea is similar to an Observe, Orient, Decide and Act (OODA) loop.
In strategy, Boyd's OODA loop emphasizes the idea that reducing the time required for planning, and reacting faster than an opponent, provides an advantage and consequently increases the likelihood of the opponent making a mistake. By decreasing the time required to react appropriately to a situation, the initiative is maintained and the opponent is always responding to the situation. The more time an opponent spends reacting, the less time they have to observe and plan, increasing the likelihood that a mistake will be made. This concept was raised recently in the panel discussions at the CATCH 2009 conference. References to this type of strategy arise periodically from anti-malware vendors: if the time between the release of malware and the release of generally available anti-malware signatures can be reduced, it could help to solve or alleviate the malware threat.
Applying the OODA loop, or simply reducing the reaction time, could go a long way toward alleviating the malware threat. But it should be remembered that malware will always be able to evolve more quickly than an operating system, a web application, a database, or even an anti-malware tool, since malware has the initiative and is typically smaller and less complex. Viewed from an evolutionary biology perspective, this strategy resembles the Red Queen race that occurs between diseases/parasites and their hosts, and the evolutionary arms race between malware and the rest of the information security community (anti-virus, browsers, office automation applications, operating systems, application services, etc.). Viruses have genomes on the order of 10^4 base pairs, bacteria on the order of 2x10^6 base pairs, and humans on the order of 6.6x10^9 base pairs (Evolution, 3rd Edition, Ridley). Modern operating systems have about 40-55 million lines of code (equating to 2.5-4 GB installed), while most malware is several orders of magnitude smaller, approximately 119-134 KB in the case of Conficker.
As is the case with viruses and other small organisms in the natural world, smaller organisms are capable of evolving at a much faster rate than large complex organisms. Consider RNA viruses, which have a mutation rate of about 1 mutation per generation, while bacteria have about 10^-3 mutations per generation and humans about 200 mutations per generation (Evolution, 3rd Edition, Ridley, and Evolutionary Biology, 3rd Edition, Futuyma). Some diseases mutate frequently enough that every replication event is likely to change the disease. Although humans have a much higher mutation rate per generation than diseases such as viruses and bacteria, the generation span of a human is much longer: on the order of 15-30 years, while diseases typically have generation spans on the order of seconds to minutes. Per unit time, diseases (e.g. viruses and bacteria) can evolve much more rapidly, and yet large complex organisms survive because they have strategies which allow them to combat these adaptations. Despite the rate at which diseases are capable of evolving, they do not always win; influenza virus has the potential to be fatal, but in most cases it is not considered life threatening.
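To put the per-unit-time difference in rough numbers (the per-generation rates are those cited above; the specific generation times are assumptions chosen from the ranges quoted in the text):

```python
# Mutations per generation (from Ridley / Futuyma, as cited above) and
# assumed generation times chosen from the ranges quoted in the text.
SECONDS_PER_YEAR = 365 * 24 * 3600
entities = {
    # name: (mutations per generation, generation time in seconds)
    "RNA virus": (1.0, 600),                    # assume ~10-minute generations
    "bacterium": (1e-3, 3600),                  # assume ~1-hour generations
    "human": (200.0, 20 * SECONDS_PER_YEAR),    # assume ~20-year generations
}
for name, (mu, gen_time) in entities.items():
    per_year = mu * (SECONDS_PER_YEAR / gen_time)
    print(f"{name:>10}: ~{per_year:,.1f} mutations per lineage per year")
```

Under these assumptions an RNA virus lineage accumulates on the order of tens of thousands of mutations per year, while a human lineage accumulates on the order of ten, which is the asymmetry the OODA-style arguments are up against.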
Large complex organisms have multiple methods that allow them to survive in an environment where diseases can rapidly evolve. Entities with smaller genomes have effectively less space in which to maintain the set of strategies they use to exploit their environment, while larger, more complex organisms have more space in which to record their survival strategies. Some bacteria use enzymes to protect against viral infections. Eukaryotes employ even more defenses against infection, while vertebrates have evolved immune systems capable of responding to infection by disease. One segment of the human genome, the Major Histocompatibility Complex (MHC), contains approximately 3.6 million base pairs, or 140 genes, which control a portion of the human immune system. As of October 2004, the Immunogenetic Related Information Source (IRIS) database estimated that the portion of the human genome that controls the human immune system is approximately 7%, or 1562 genes. Although the percentage of the human genome related to the immune system seems small, it is important to consider that a significant portion of the human genome is inactive: an estimated 25% of the genome is attributed to diseases which have inserted their genetic code into our genome and are now inactive, while other sections contain pseudogenes, which are inactive versions of ancient genes. The percentage of the active human genome that relates to the immune system could therefore be substantially higher than currently estimated. The cost of surviving in an evolutionary arms race can be high, as significant resources are required to defend an organism from infection by diseases and parasites.
Recently, researchers such as Banerjee (An Immune System Inspired Approach to Automated Program Verification) have looked at applying some of the methods the immune system uses to protect against disease, investigating Artificial Immune Systems (AIS) that can be implemented in information systems.
Implementing an immune system to handle rapidly evolving threats does not eradicate the threat. Immune systems act as a selection pressure that causes only those diseases capable of adapting to survive. Some adaptations include methods for remaining undetected by the immune system, while others include methods for exploiting the immune system and subverting it for the pathogen's own use. In essence, these systems represent another vector by which disease can exploit a host. Human Immunodeficiency Virus (HIV) actively exploits the immune system, even at the cost of its own reproductive fitness, to remain active in the host and survive when anti-HIV drugs are administered. Similarly, flaws in anti-malware products have allowed malware to exist and even spread in the form of computer worms. Malicious code routinely attempts to disable anti-virus before downloading and installing malicious components; to remain undetected, some malware will then re-enable the anti-virus product to prevent the user from noticing anything conspicuous. Anti-virus software is complex enough that it has its own vulnerabilities, which may be exploited by malware. In 2006, Symantec Anti-Virus had a vulnerability (CVE-2006-2630) which allowed for a privilege escalation that was exploited by the W32.Rinbot.L worm.
Simply reducing the response time will not eradicate the threat. It will provide an advantage, but it will not solve the problem. In order to respond to diseases which are able to quickly adapt to host evolutionary responses, large complex organisms have had to evolve layered responses that do not rely on a single strategy to ensure their survival. The cost of ensuring survival in an evolutionary arms race can be high, as numerous strategies need to be available to counteract the threat of disease and parasites.