Saturday, February 14, 2009

Red Queen Races

The Red Queen race has been used as a model for understanding various evolutionary arms races, such as parasite/host relationships and the response of pathogens to resistance (see Ridley's Red Queen or Ridley's Evolution 3rd Edition). In information security, malware has co-evolved with malware detection and analysis as part of its own red queen race, not only maintaining its fitness but becoming the dominant threat to systems.

Within information security there are a number of red queen races in effect:
  • Attack strategies against applications/networks and the associated defensive strategies
  • Malware attack strategies and the resulting malware detection strategies.
  • Malware defensive strategies to prevent its analysis.
It could also be argued that cryptography is a red queen race between keeping things secret and attempting to reveal them. Cryptographic algorithms are constantly being developed to counter flaws discovered in existing algorithms and advances in technology that allow existing flaws to be exploited faster.

The strategies of attackers have changed dramatically since the early period of networking, when attackers used manual methods (or custom scripts) that required a high degree of skill and/or knowledge to attack a single target. As tools have developed, attacks can now be conducted by individuals with a low skill level and/or little knowledge of how the underlying attack works, using automated tools against any system that is accessible on the network (an overview of this development can be found in Computer Security by Bishop). In response to the development of attack tools, intrusion detection systems and log analysis tools were developed. To evade detection, attackers found ways to obfuscate their attacks so that they reached the end systems without alerting those monitoring the networks (see libwhisker's IDS evasion techniques). More modern IDSs have responded by performing packet reassembly before inspection, but this consumes resources that could be allocated elsewhere. Tools such as the Metasploit Framework are making their attacks more difficult to detect by including various payload encoding techniques, and some tools are beginning to include encryption as their encoding methods prove insufficient.
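To illustrate the idea behind payload encoding, the sketch below uses a single-byte XOR transform. This is only a minimal illustration, not the encoding used by any particular tool; Metasploit's encoders are considerably more sophisticated and often polymorphic, and the payload bytes here are harmless placeholders.

    #include <stdio.h>
    #include <stddef.h>

    /* Minimal single-byte XOR encoder. Applying the same key twice
       restores the original bytes, which is why a small decoder stub
       prepended to an encoded payload can recover it at run time. */
    static void xor_with_key(unsigned char *buf, size_t len, unsigned char key)
    {
        for (size_t i = 0; i < len; i++)
            buf[i] ^= key;
    }

    int main(void)
    {
        unsigned char payload[] = { 0x90, 0x90, 0xcc, 0xc3 }; /* placeholder bytes */
        xor_with_key(payload, sizeof(payload), 0xAA);         /* encode */
        xor_with_key(payload, sizeof(payload), 0xAA);         /* decode */
        for (size_t i = 0; i < sizeof(payload); i++)
            printf("%02x ", payload[i]);
        printf("\n");
        return 0;
    }

The encoded bytes no longer match a signature written for the original payload, which is what pushes detection toward reassembly, emulation, and behavioral techniques.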

An overview of the history of malware can be found on VirusList.com's History of Malicious Programs. Malware finds its evolutionary beginnings in basic viruses which simply replicated to other hosts and deleted files or attempted to consume system resources. Early programs to detect malware focused simply on signature matching techniques; malware detection now has a variety of different techniques at its disposal:
  • Signature detection in which streams of bytes are analyzed for virus signatures (a minimal sketch appears after this list).
  • Program emulation in which the functions of programs are emulated and executed. A program is determined to be malicious based on the events that occur.
  • Virtualized execution in which the byte streams are executed in a sandbox. The execution and its results are watched to see what effect on the system the program could potentially have.
  • System monitoring in which all programs are executed normally and the system's reaction is monitored for signs of malicious behavior.
  • Anomaly detection in which a system's baseline behavior is determined and as the system operates deviations are monitored to determine if a malicious program is operating.
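As a simple illustration of the first technique, the sketch below scans a buffer for known byte patterns. The signature table and the sample bytes are hypothetical placeholders; real engines use wildcard signatures, efficient multi-pattern matching, and far larger databases.

    #include <stdio.h>
    #include <string.h>
    #include <stddef.h>

    /* A named byte pattern to search for. */
    struct signature {
        const char *name;
        const unsigned char *bytes;
        size_t len;
    };

    /* Hypothetical signature table with a single made-up pattern. */
    static const unsigned char demo_sig[] = { 0xde, 0xad, 0xbe, 0xef };
    static const struct signature sigs[] = {
        { "Demo.Pattern", demo_sig, sizeof(demo_sig) },
    };

    /* Return the name of the first signature found in buf, or NULL. */
    static const char *scan_buffer(const unsigned char *buf, size_t len)
    {
        for (size_t s = 0; s < sizeof(sigs) / sizeof(sigs[0]); s++) {
            if (sigs[s].len > len)
                continue;
            for (size_t i = 0; i + sigs[s].len <= len; i++)
                if (memcmp(buf + i, sigs[s].bytes, sigs[s].len) == 0)
                    return sigs[s].name;
        }
        return NULL;
    }

    int main(void)
    {
        unsigned char sample[] = { 0x00, 0x11, 0xde, 0xad, 0xbe, 0xef, 0x22 };
        const char *hit = scan_buffer(sample, sizeof(sample));
        printf("%s\n", hit ? hit : "no signature matched");
        return 0;
    }

It is exactly this kind of byte-level matching that metamorphic and polymorphic code, encoding layers, and packers are designed to defeat, as the next list describes.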
Modern malware has evolved a variety of different techniques to evade detection and prevent analysis. The following items represent some of the strategies employed by malware to evade detection and remain in operation even after it has been detected.
  • Metamorphic and polymorphic viruses - counter strategy to signature-based detection.
  • Multiple layers of encoding - counter strategy to prevent detection and analysis. Some malware makes the decoding dynamic so that it can only jump to the correct instructions in a non-virtualized environment.
  • Memory only operations - counter strategy to performing offline analysis of malware. If it only exists in memory then taking the system offline will destroy the malware.
  • Modification of Service ACLs - uses the system's access controls against it to prevent removal by removing all access to the service's registry keys (except for the SYSTEM account).
  • DLL injection - Counter strategy against detection and removal. By being resident in another process it is more difficult to detect and makes it harder to remove the malware. This strategy has even been adopted by anti-malware vendors to ensure that malware cannot disable their detection engines.
  • In-memory patching - counter strategy against other infections. If the vulnerability that was exploited to gain access to the system is still open, other malware can infect the system and compete for system resources. Some malware installs permanent patches, but these are easily detected because they modify the system's baseline, and if the malware was only memory resident, a permanent patch would also prevent it from reinfecting the system after a reboot.
  • Virtualization detection - counter strategy to analysis in virtualized environments. The Storm BotNet had VMware and VirtualPC detection methods, and if it detected that it was operating in a virtual environment it rebooted the system to clean the system and prevent further analysis. Conficker/Downadup used SLDT/LDT results to determine whether it is operating in a virtualized environment (a sketch of this kind of check appears after this list).
  • Disable running anti-malware services during install - a simple counter strategy against malware prevention and removal is to disable the malware detection services as part of installation. In addition to disabling these services, some malware will make it difficult to access the websites of anti-malware vendors.
  • Remove system restore points - counter strategy to prevent system users from simply rolling their system state back to a previous clean point.
  • Use of anti-malware products as pre-screening - some malware is tested against existing anti-malware products during development to ensure that it has a poor detection rate. There is no advantage in releasing malware that is already commonly detected.
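The SLDT-based check mentioned above can be sketched roughly as follows. This is an illustrative reconstruction of the general technique, not Conficker's actual code; it assumes an x86/x86-64 compiler with GCC-style inline assembly, and the heuristic (a non-zero LDT selector under older software-virtualized VMs) does not hold for modern hardware-assisted hypervisors.

    #include <stdio.h>
    #include <stdint.h>

    /* Read the Local Descriptor Table selector with the unprivileged
       SLDT instruction. On a typical bare-metal Windows host the
       selector is zero; older software virtualization (VMware,
       VirtualPC) was reported to return a non-zero value, which is
       what this heuristic keys on. */
    static uint16_t read_ldt_selector(void)
    {
        uint16_t ldt = 0;
        __asm__ volatile ("sldt %0" : "=m" (ldt));
        return ldt;
    }

    int main(void)
    {
        uint16_t ldt = read_ldt_selector();
        printf("LDT selector: 0x%04x\n", ldt);
        if (ldt != 0)
            printf("possibly running inside a virtual machine\n");
        else
            printf("no virtualization detected by this heuristic\n");
        return 0;
    }

Because the check relies on an unprivileged instruction, it can run early in the malware's execution without triggering any system calls that a monitor might flag.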
The methods of locating remote hosts have also evolved. They began with small routines that simply scanned remote addresses randomly and as quickly as possible in order to spread rapidly; some of these algorithms were flawed and prevented a maximal infection. Now there are prescan activities, in which a scan for targets vulnerable to an attack is performed prior to releasing the malware. Malware will also often scan for hosts on adjacent network space before seeking out other networks randomly (hosts on the same network are often configured and managed the same way, so they are likely to have the same exploitable vulnerabilities).
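A rough sketch of this locality bias in target selection might look like the following. The 50% bias, the placeholder 192.0.2.0/24 "local" network, and the address arithmetic are all illustrative assumptions rather than the behavior of any specific malware family.

    #include <stdio.h>
    #include <stdint.h>
    #include <stdlib.h>
    #include <time.h>

    /* Pick a target IPv4 address (as a 32-bit value), preferring hosts
       on the same /24 network half of the time and choosing a random
       address otherwise. */
    static uint32_t pick_target(uint32_t local_net)
    {
        if (rand() % 2 == 0)
            return local_net | (uint32_t)(rand() % 254 + 1); /* adjacent host */
        return ((uint32_t)rand() << 16) ^ (uint32_t)rand();  /* anywhere else */
    }

    int main(void)
    {
        srand((unsigned)time(NULL));
        uint32_t local_net = (192u << 24) | (0u << 16) | (2u << 8); /* 192.0.2.0/24 */
        for (int i = 0; i < 5; i++) {
            uint32_t ip = pick_target(local_net);
            printf("%u.%u.%u.%u\n", ip >> 24, (ip >> 16) & 0xff,
                   (ip >> 8) & 0xff, ip & 0xff);
        }
        return 0;
    }

The bias toward the local network trades raw coverage for a higher hit rate, since neighboring hosts tend to share the same exploitable configuration.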

Not only are the specific survival strategies of malware evolving in response to threats, but the general concept of malware has changed. Malware has moved from seeing remote systems merely as targets to viewing them as valuable resources. Since systems are seen as resources, it is no longer advantageous to spread as fast as possible and take down as many systems as possible; these resources need to stay active in order to be of any use. Malware is also moving in the direction of targeting specific individuals and/or organizations: the more customized the malware is to a specific organization or individual, the more likely it is to succeed in infecting a target host. The general software quality of malware has changed as well; it is no longer just written and released. It is now professionally written to be flexible and modular, and includes error handling and proper resource deallocation/cleanup.
