Friday, February 27, 2009

Selection Pressure, Yesterday's Strategies, Resource Exploitation and Evolutionary Costs

When thinking about information security from an evolutionary perspective, there are a few concepts from evolutionary biology worth considering. Selection pressures, adaptive time lags, and resource/population exploitation all shape how an entity's strategies evolve.

Evolutionary pressures, selection agents, and selection pressures usually refer to the same concept within evolutionary biology: a thing or force that causes an organism to respond and/or adapt. Selection pressures originate from the natural environment of organisms and include things like resource availability, changing or adverse environmental conditions, interspecies predation and intra-species competition. The more pronounced the selection pressure, the quicker an entity must respond in order to survive. Entities that are unable to adapt (i.e., that do not possess the characteristics required to survive) are eventually eliminated. Stronger selection pressures eliminate maladapted entities more quickly, while entities that are already surviving, or that are able to acquire the necessary characteristics, persist.

Selection pressures can select for a characteristic to evolve in a directional, stabilizing or disruptive way. Directional selection pushes a characteristic in a "direction"; for example, it selects for a larger or a smaller value of the characteristic. Stabilizing selection forces a characteristic to remain the same, selecting against both larger and smaller values. Disruptive selection selects against the average value of a characteristic in the population: it favors the larger and smaller extremes over the mean, which causes the characteristic to diverge into two groups.
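As a rough, toy illustration of these three modes, the sketch below (all names and parameters are invented for the example, not drawn from any real model) repeatedly resamples a population of numeric trait values in proportion to a fitness function. Swapping the fitness function shows the trait mean drift upward (directional), collapse toward a single value (stabilizing), or split into two clusters (disruptive).

    import random

    def select(population, fitness):
        """Resample the next generation in proportion to fitness, plus mutation noise."""
        weights = [fitness(x) for x in population]
        return [random.choices(population, weights=weights)[0] + random.gauss(0, 0.1)
                for _ in population]

    # Three hypothetical pressures acting on a single numeric trait:
    directional = lambda x: max(x, 0.01)              # larger values are favored
    stabilizing = lambda x: 1.0 / (1 + (x - 5) ** 2)  # values near 5 are favored
    disruptive  = lambda x: (x - 5) ** 2 + 0.01       # extremes favored, the mean penalized

    population = [random.uniform(0, 10) for _ in range(200)]
    for _ in range(50):
        population = select(population, disruptive)
    # After enough generations under the disruptive pressure, the trait values
    # cluster at the two extremes rather than around the original mean.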

In information security, the security controls that are implemented can be considered the survival strategies of an information system, and even selection pressures acting upon an adversary. If an adversary does not have the ability to compromise the implemented security controls, they will not be able to access the resources the system is protecting. So in order to survive, an adversary must develop counter-strategies capable of exploiting the strategies the defender has implemented.

Organizations should understand that their strategies will succeed or fail based on the counter-strategies employed against them. While the root cause of a compromise is being analyzed, one should also investigate what selection pressures should be applied to move the organization toward secure business processes. If an organization consistently receives poor-quality applications from its vendors, it should determine what selection pressures could be applied early in the process to effect the desired changes. Should the organization hold the developer responsible for a compromise of the application? The developer is contracted to provide an application that supports the business processes, and a compromised application/system does not support the business goals of the organization.

Some selection pressures are strong enough that all of the surviving entities within a population end up identical in response to them. Because of the strength of the selection pressure operating on the population, all of the variance in a characteristic must be eliminated for members of the population to survive. Stronger selection pressures remove variance from the population faster than weaker ones. Stated another way, some strategies exert enough of a filtering pressure that every entity must develop a specific counter-strategy in order to survive.

Most malware and BotNets share a characteristic which is hard to escape: their reliance on a method for making initial contact with their controllers and joining the BotNet. Initially these control channels were handled over IRC (a method that has not been abandoned and is still in use), and they have since migrated to Peer-to-Peer communications. More recently the bots have been attempting to contact their controllers via HTTP requests, as Peer-to-Peer communications are filtered by boundary protection devices. These HTTP requests carry a name that must be looked up in DNS (e.g. as part of a fast-flux DNS network), and this method is becoming more common as organizational selection pressures push the bots in this direction. The reliance on making initial contact with the BotNet is essentially a bottleneck which can be exploited: if the DNS lookup is always a consistent or predictable string, it can be blocked by filtering either the name itself or the appropriate Internet addresses, even if they are part of a fast-flux network.
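A minimal sketch of that kind of chokepoint filtering, assuming the boundary device can see each DNS query name and that the predictable command-and-control naming patterns are already known (the domains and patterns below are invented for illustration; real deployments would draw them from threat intelligence feeds):

    import re

    # Hypothetical naming patterns for a fast-flux C2 network.
    C2_PATTERNS = [
        re.compile(r"^cc\d{1,3}\.example-badness\.com\.?$", re.IGNORECASE),
        re.compile(r"^[a-z0-9]{12}\.example-flux\.net\.?$", re.IGNORECASE),
    ]

    def is_suspicious_lookup(qname: str) -> bool:
        """Return True if a DNS query name matches a known C2 naming pattern.
        Because fast-flux networks rotate their A records rapidly, filtering on
        the comparatively stable query name can be more effective than filtering
        on the resolved addresses."""
        return any(p.match(qname) for p in C2_PATTERNS)

    assert is_suspicious_lookup("cc17.example-badness.com")
    assert not is_suspicious_lookup("www.example.org")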

In evolutionary biology, it takes time for organisms to adapt to their environment. There is a time lag between when a counter-strategy is employed against a population and when an entity has evolved a strategy to deal with that counter-strategy. The current generation is better adapted to the previous environment than to the one in which it now exists, because it was selected for under conditions that existed previously, and the environment may have changed in the meantime.

This lag commonly expresses itself in information security when systems are designed. A properly designed system counters all of the exploitation strategies that are known at design time. Depending upon how long it takes to go from design to implementation to delivery, the threat environment may have changed drastically; the delivered system may not be able to address all of the current threats and may need to be modified. Additionally, a few years after deployment, if a system is not updated there may be emerging threats that the existing design is incapable of addressing.

This time lag also appears in the conflict between malware and anti-malware tools, or between attacks and intrusion detection/prevention systems, which rely on signatures to detect, identify and remove malware or attacks. A signature must be created to detect the malware and pushed out to all of the clients before they can respond to an infection, and in order to create a signature, a new malware variant must first have been identified. Occasionally a rule is created that is general enough to detect or prevent attacks that have not yet been implemented, due to similarities in the way an attack strategy attempts to exploit a target.
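A toy sketch of the signature model described above (the signature bytes and names are invented placeholders, and real engines use far more efficient multi-pattern matching and wildcarded signatures):

    # Hypothetical signature database: name -> byte pattern.
    SIGNATURES = {
        "Example.Dropper.A": bytes.fromhex("deadbeef4141"),
        "Example.Worm.B":    b"\x4d\x5a\x90\x00demo-marker",
    }

    def scan(data: bytes) -> list[str]:
        """Return the names of all signatures found in a byte stream."""
        return [name for name, pattern in SIGNATURES.items() if pattern in data]

    # The time lag in action: until a new variant has been captured, a signature
    # written for it, and the update pushed to clients, scan() simply cannot see it.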

In evolutionary biology there is competition for resources, and the easiest or most abundant resources are typically exploited first. Sometimes the most abundant resources are not exploited right away, as a population must first evolve the ability to consume them. The abundance of a resource does not mean that it must be abundant everywhere, only that it is encountered frequently enough in the environment, and that there is sufficient competition for other resources, that evolution will move an entity in that direction. Developing the ability to exploit a resource takes time and resources.

In information security this is similar to having a sufficient base of similar operating systems, browsers, databases, or frameworks to make the effort of developing an attack strategy worthwhile. This does not mean that a resource which is not very common in the environment will not be exploited. Some operating systems claim to be more secure than the dominant competition, but as the number of deployed systems increases, so do the attacks against that platform. Although Microsoft Windows composes an overwhelming majority of the market, Apple has been gaining market share, and the gains have been large enough that malware authors have begun to target the platform more frequently. Within the last year there has been an increase in the number of Trojans that target the Apple platform.

Although resource availability is important in determining whether a platform will be exploited, another factor is apparent in the case of information security: the value of the resource being exploited. Financial or government systems, which may be hosted on less common platforms, offer additional incentive for targeting and exploitation because of the perceived value of the system. There is some value in being able to compromise a home user's system (less so if it has a slower connection to the Internet), and more value in being able to compromise a web application server, but in reality some of the highest-value resources are the databases which contain important information.

In short, selection pressures cause the survival strategies of entities to evolve; if they are unable to evolve, they will not survive. Responding to selection pressures, either directly (i.e. developing counter-strategies) or indirectly (i.e. exploiting an unused resource), does not occur instantaneously: there is almost always a delay between when a selection pressure first begins to act and a population's adaptive response.

Saturday, February 14, 2009

Red Queen Races

The Red Queen race has been used as a model for understanding some of the various evolutionary arms races in evolution, such as parasite/host relationships and the response of pathogens to resistance (see Ridley's Red Queen or Ridley's Evolution, 3rd Edition). In information security, as part of the malware red queen race, malware has co-evolved with malware detection and analysis to not only maintain its fitness but become the dominant threat to systems.

Within information security there are a number of red queen races in effect:
  • Attack strategies against applications/networks and the associated defensive strategies.
  • Malware attack strategies and the resulting malware detection strategies.
  • Malware defensive strategies to prevent its analysis.
It could also be argued that cryptography is a red queen race between keeping things secret and attempting to reveal them. Cryptographic algorithms are constantly being developed to counter flaws discovered in existing algorithms and advances in technology which allow existing flaws to be exploited faster.

The strategies of attackers have changed dramatically since the early period of networking, when attackers used manual methods (or custom scripts) that required a high degree of skill and/or knowledge to attack a single target. Now that tools have developed, attacks can be conducted by individuals with a low skill level and/or little knowledge about how the underlying attack works, using automated tools against any system that is accessible on the network (an overview of this development can be found in Computer Security by Bishop). In response to the development of attack tools, intrusion detection systems and log analysis tools were developed. In order to evade detection, attackers found ways to obfuscate their attacks so that they reached the end systems without alerting those monitoring the networks (see libwhisker's IDS evasion techniques). More modern IDSs have responded by performing packet reassembly and inspection at the IDS, but this consumes resources which could be allocated elsewhere. Tools such as the Metasploit Framework are making their attacks more difficult to detect by including various payload encoding techniques, and some tools are beginning to include encryption as their encoding methods prove insufficient.
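The detection side of that evasion arms race often comes down to normalizing traffic before matching it. A minimal sketch, assuming a plain substring signature engine and HTTP request paths as the input (the signatures and the encoded request are illustrative, loosely in the spirit of the libwhisker URL-encoding tricks mentioned above):

    from urllib.parse import unquote

    # Hypothetical content signatures an IDS might look for in request paths.
    SIGNATURES = ["/etc/passwd", "cmd.exe"]

    def normalize(path: str) -> str:
        """Collapse simple obfuscations before matching: repeated percent-decoding
        (handles double encoding) and removal of self-referencing './' segments."""
        previous = None
        while previous != path:
            previous, path = path, unquote(path)
        return path.replace("/./", "/")

    def alert(path: str) -> bool:
        return any(sig in normalize(path) for sig in SIGNATURES)

    # An encoded request slips past a naive substring match but is caught once
    # the path has been normalized.
    assert not any(sig in "/cgi-bin/.%2e/%63md.exe" for sig in SIGNATURES)
    assert alert("/cgi-bin/.%2e/%63md.exe")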

An overview of the history of malware can be found on VirusList.com's History of Malicious Programs. Malware finds its evolutionary beginnings in basic viruses which simply replicated to other hosts and deleted files or attempted to consume system resources. Early programs to detect malware focused simply on signature matching techniques; malware detection now draws on a variety of different techniques:
  • Signature detection in which streams of bytes are analyzed for virus signatures.
  • Program emulation in which the functions of programs are emulated and executed. A program is determined to be malicious based on the events that occur.
  • Virtualized execution in which the byte streams are executed in a sandbox. The execution and results are watched to see what effect they could potentially have on the system.
  • System monitoring in which all programs are executed normally and the system's reaction is monitored for signs of malicious behavior.
  • Anomaly detection in which a system's baseline behavior is determined and, as the system operates, deviations from that baseline are monitored to determine if a malicious program is operating (a minimal sketch of this approach follows this list).
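A minimal sketch of the anomaly-detection idea from the last item, using running process names as a deliberately simplistic behavioral baseline (real products baseline far richer behavior such as system calls, network flows and file activity; psutil is a third-party library and is assumed to be installed):

    import psutil

    def process_names() -> set[str]:
        return {p.info["name"] for p in psutil.process_iter(attrs=["name"])}

    baseline = process_names()   # captured while the system is believed to be clean

    def deviations() -> set[str]:
        """Process names running now that were not present in the baseline."""
        return process_names() - baseline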
Modern malware has evolved a variety of techniques to evade detection and prevent analysis. The following are some of the strategies employed by malware to evade detection and to remain in operation even after being detected.
  • Metamorphic and polymorphic viruses - counter strategy to signature based detection.
  • Multiple layers of encoding - counter strategy to prevent detection and analysis. Some malware makes the decoding dynamic so that it can only jump to the correct instructions in a non-virtualized environment.
  • Memory only operations - counter strategy to performing offline analysis of malware. If it only exists in memory then taking the system offline will destroy the malware.
  • Modification of Service ACLs - uses the system's access controls against it to prevent removal by removing all access to the service's registry keys (except for the SYSTEM account).
  • DLL injection - Counter strategy against detection and removal. By being resident in another process it is more difficult to detect and makes it harder to remove the malware. This strategy has even been adopted by anti-malware vendors to ensure that malware cannot disable their detection engines.
  • In Memory Patching - counter strategy against other infections. If the vulnerability that was exploited to gain access to the system is left open, other malware can infect the system and compete for its resources, so the malware patches the vulnerability in memory. Some malware installs permanent patches instead, but these are easily detected because they modify the system's baseline, and a permanent patch would also prevent a memory-only infection from reestablishing itself after a reboot.
  • Virtualization detection - counter strategy to analysis in virtualized environments. The Storm BotNet had VMware and VirtualPC detection methods; if it detected that it was operating in a virtual environment, it rebooted the system to clean it and prevent further analysis. Conficker/Downadup used SLDT/LDT results to determine whether it was operating in a virtualized environment (a hedged sketch of this style of check follows the list).
  • Disable running anti-malware services during install - a simple counter strategy against malware prevention and removal is to disable the malware detection services at installation time. In addition to disabling these services, some malware will make it difficult to access the websites of anti-malware vendors.
  • Remove system restore points - counter strategy to prevent system users from simply rolling their system state back to a previous clean point.
  • Use of anti-malware products as pre-screening - some malware is tested against existing anti-malware products during development to ensure a poor detection rate. There is no advantage in releasing malware that is already commonly detected.
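As a hedged illustration of what the virtualization-detection item above can amount to at its simplest, the fragment below checks the host's MAC address against prefixes commonly assigned to VMware and VirtualBox guests. It is intentionally superficial: it shows the category of check, not Storm's or Conficker's actual implementation, which relied on lower-level tricks such as the SLDT results mentioned above.

    import uuid

    # MAC prefixes commonly associated with VMware and VirtualBox guests.
    # The list is illustrative, not exhaustive, and uuid.getnode() can fall back
    # to a random value when no hardware address is available.
    VM_MAC_PREFIXES = ("00:0c:29", "00:50:56", "00:05:69", "08:00:27")

    def looks_virtualized() -> bool:
        mac = "%012x" % uuid.getnode()
        mac = ":".join(mac[i:i + 2] for i in range(0, 12, 2))
        return mac.startswith(VM_MAC_PREFIXES)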
The methods of locating remote hosts have also evolved. They began as small routines that simply scanned remote addresses randomly and as fast as possible in order to spread as widely as possible; some of these algorithms were flawed and prevented a maximal infection. Now there are pre-scan activities, in which a scan for available targets that are vulnerable to an attack is performed before the malware is released. Malware will also often scan for hosts on adjacent network space before seeking out other networks randomly, since hosts on the same network are often configured and managed the same way and will likely have the same exploitable vulnerabilities.
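The locality bias described above can be pictured with a short target-selection sketch. The 70/30 split and the /24 neighborhood are invented for illustration; real malware families weight local scanning very differently.

    import ipaddress
    import random

    def pick_target(local_ip: str, local_bias: float = 0.7) -> str:
        """With probability local_bias, pick a host adjacent to local_ip (same /24);
        otherwise pick a random 32-bit address (which may well be unroutable)."""
        if random.random() < local_bias:
            net = ipaddress.ip_network(f"{local_ip}/24", strict=False)
            return str(random.choice(list(net.hosts())))
        return str(ipaddress.IPv4Address(random.getrandbits(32)))

    print(pick_target("192.168.1.10"))   # most of the time: another 192.168.1.x host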

Not only are the specific survival strategies of malware evolving in response to threats, but the general concept of malware has changed. Malware has shifted from seeing remote systems merely as targets to viewing them as valuable resources. Since systems are seen as resources, it is no longer advantageous to spread as fast as possible and take down as many systems as possible; these resources need to stay active in order to be of any use. Malware is also moving in the direction of targeting specific individuals and/or organizations: the more customized the malware is to a specific organization or individual, the more likely it is to succeed in infecting a target host. Lastly, the general software quality of malware has changed; it is no longer just written and released. It has acquired the properties of professionally written software: it is more flexible and modular, and it includes error handling and proper resource deallocation/cleanup.