by Kapil Raina
Increased use of web applications has been accompanied by a corresponding increase in security incidents against them. Today, web application security is finally receiving more prominent attention. That attention brings the benefit of higher priority, but also the drawback of addressing what is still an emerging area of technology. This article highlights both technical and business trends in web application security.
Network and Application Levels Merging
Traditionally, vulnerability analysis (and its management) has focused at the network or operating-system level. Such analysis has included traditional manual penetration testing as well as automated security testing tools (both proprietary and open source). The current trend is to merge the capabilities of network scanners with the toolkits of the web application security space. Symantec's recent purchase of @Stake, for example, was in all likelihood driven not only by the services business but also by @Stake's strong competence in both network and application security. The goal of merging network and application vulnerability analysis is to use data found at one level to drive a more focused approach at the other.
Another key area where we will see more integration is network management consoles. Currently, most consoles are geared towards soliciting network device information (e.g., from firewalls). The trend is to incorporate input from various application-level tools (e.g., application security firewalls) in addition to network tools. One area where we will probably not see more integration is patch management systems. On the network side, consoles can be set up to trigger patch management solutions from problem-detection notifications. However, many web applications are proprietary and thus unique to a particular customer or department within a large corporation; as a result, no patch is available other than a manual fix by the original developer.
QA Testing and Developer Awareness
Traditionally, Quality Assurance (QA) teams have not been partners with information security personnel, but trends are showing a shift in thinking. Mercury Interactive, a major player in automated testing tools, recently announced partnerships with some leading application security testing companies that provide an integrated solution between Mercury's testing products and the vendors' application vulnerability detection tools.
Does this mean QA teams will become security experts? Quite the contrary. We can expect to see more integrated solutions to allow QA testers to continue automated testing, without necessarily needing to understand the underlying security technology. In fact, we will most likely see a shift towards some type of workflow in which the owners of security policies create the appropriate tests and the QA professionals execute and measure against those tests.
Developers will also benefit from increasingly sophisticated web application vulnerability detection tools. Ideally, detection systems should be able to track the defective or insecure lines of code where vulnerabilities might be found, and whenever possible do so as part of a development-tool operation such as code compilation. Some vendors have created development tools for enhancing code security, but to date sales of these tools have been relatively poor. In addition, most of these code scanning tools cannot provide complete application awareness and can only focus on a specific module of code. Thus, for more complex problems that might extend, for example, between a UI module and a database module, code scanners have traditionally not worked well as stand-alone solutions. It is also foreseeable that we will see integration with bug tracking systems, so that developers can simply follow their current defect tracking methodology and fix security vulnerabilities as readily as functional defects in their code.
Increased Industry Awareness
Several technical initiatives have been established to form a consolidated response to web application vulnerabilities and to increase awareness of them. It remains to be seen which will prove the most influential in steering commercial product development. A handful of key industry organizations are involved in defining the technical direction of application security.
Awareness has also been raised by Microsoft's very public security updates, which have begun to prompt attention in the developer community. However, it is still too early for application tools to incorporate sophisticated integration, as web application security analysis still lies primarily in the hands of security professionals such as penetration testers, QA engineers, and auditors.
While no formal direction has yet been established, industry trade groups such as the Information Technology Association of America (ITAA) are anticipated to start providing web application security guidelines for offshore code. Many offshore companies are currently struggling to establish the level of security and confidence needed to continue serving their clients.
Attack Detection Sophistication Increases
Web application vulnerability detection technology has become increasingly sophisticated. Most tools have progressed beyond simple buffer overflow attacks with detection limited to specific strings. Even with the rise of cross-site scripting (XSS) attacks, however, most tools remain focused on inline detection (attacking and detecting success within the same process). XSS detection methods are now moving from that simple inline string injection/detection approach to multi-stage attack and detection methods that require persistence of state. Complexities yet to be tackled include performance (large amounts of web application data and user input must be stored and referenced with each new interaction) and accuracy (reducing false positives).
For example, some large financial organizations have recently had issues with cross-frame scripting (XFS), a type of phishing attack that poisons a single frame within a page. One vendor recently released updates that can mimic an XFS attack and then detect that the attack succeeded.
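To make the shift from inline to multi-stage detection concrete, the following is a minimal Python sketch of a stored-XSS check: a unique marker is injected in one request, and every later response is scanned for the marker surviving un-encoded. The class and payload format here are illustrative assumptions, not any vendor's actual implementation.

```python
import uuid

class StoredXSSDetector:
    """Minimal sketch of multi-stage XSS detection. Stage one injects
    a uniquely marked payload; stage two scans later responses for the
    marker surviving un-encoded (persistence of state across requests).
    All names here are hypothetical examples."""

    def __init__(self):
        # marker -> description of where the payload was injected
        self.markers = {}

    def make_payload(self, injection_point):
        # A unique token lets a later sighting be attributed to its source.
        marker = uuid.uuid4().hex[:8]
        self.markers[marker] = injection_point
        return f"<script>/*{marker}*/</script>"

    def check_response(self, body):
        # Stage two: scan each subsequent page for any stored marker.
        hits = []
        for marker, point in self.markers.items():
            if f"<script>/*{marker}*/</script>" in body:
                hits.append(point)  # payload survived without encoding
        return hits
```

Keeping the marker table between requests is exactly the "persistence of state" cost mentioned above: the detector must retain and re-check every injected token on every new interaction.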
Another area of increasing focus has been web services. While mass adoption of web services has been slow, some users have sites and online applications that depend on them, and therefore have an urgent need to test for web services vulnerabilities. For the most part, vendors in this space have focused on simple detection techniques, such as malformed-XML (schema-based) attacks and applying known web application vulnerabilities from non-XML applications to XML applications. It is doubtful much more work will be done commercially in this space until we see broader dependence on web services across external business applications.
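As an illustration of the malformed-XML probes mentioned above, a scanner needs a corpus of payloads that are genuinely not well-formed; the sketch below uses Python's standard XML parser to confirm each probe is malformed before it would be sent to a target service. The probe payloads are invented examples, not any vendor's actual test suite.

```python
import xml.etree.ElementTree as ET

# Hypothetical malformed-XML probes of the kind such scanners send.
MALFORMED_PROBES = [
    "<order><item>widget</order>",    # mismatched closing tag
    "<order item='a' item='b'/>",     # duplicate attribute
    "<order>&undefined;</order>",     # undeclared entity reference
]

def is_well_formed(payload):
    """Return True if the payload parses as well-formed XML."""
    try:
        ET.fromstring(payload)
        return True
    except ET.ParseError:
        return False
```

A scanner would submit each probe to the web service endpoint and flag responses that indicate the back end choked on the malformed input rather than rejecting it cleanly.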
Detection Tools on the Rise
Some new features in detection tools include the ability for the user to create custom attacks/tests. This generally involves the ability to write scripts to address new and cutting-edge vulnerabilities. The traditional model would have required an update to the vendor's code base, which generally happens every six to nine months -- much too slow to keep up with the constant changes in the world of information security.
One of the most crucial elements of vulnerability analysis tools is not the ability to attack, but rather how quickly they can keep up with new attacks and detect the success of those attacks. The two most relevant metrics for determining the effectiveness of a web application testing tool are the number of vulnerabilities discovered and the number of false positives generated. False positives in many cases require heavy manual labor to pore through mounds of data and filter out false readings.
For the leading tools, attack detection, also referred to as fault detection, has evolved from simple pattern matching (e.g., 404 error page detection) to slightly more flexible detection (e.g., user-configurable regular expressions). Future trends point towards heuristic detection, which will auto-generate detection through zero-day defense technology. In this context, zero-day defense means the ability to learn from patterns of known vulnerable behavior and then rule out non-matching, unknown behavior as false positives (much the way some intrusion detection systems work today).
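A minimal sketch of the user-configurable, regex-based fault detection described above might look like the following. The signature names and patterns are illustrative assumptions, not drawn from any particular product.

```python
import re

# Hypothetical fault signatures a tester might configure: an error-page
# match, a database error leak, and a leaked stack trace.
DEFAULT_SIGNATURES = {
    "generic_404": re.compile(r"404\s+Not\s+Found", re.IGNORECASE),
    "sql_error":   re.compile(r"(SQL syntax|ODBC|ORA-\d{5})", re.IGNORECASE),
    "stack_trace": re.compile(r"at \w+(\.\w+)+\(.*:\d+\)"),
}

def detect_faults(response_body, signatures=DEFAULT_SIGNATURES):
    """Return the names of all configured signatures matching a response."""
    return [name for name, pattern in signatures.items()
            if pattern.search(response_body)]
```

Because the signature table is just data, a tester can add a new pattern the day a vulnerability class appears, rather than waiting months for a vendor code-base update.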
Currently, most advanced security testers use multiple tools, both commercial and open source. The main reason for this range of methods is that most tools find only a small percentage of existing vulnerabilities while, at the same time, generating a high number of false positives. While some tools excel in certain types of web application scenarios, in this author's experience most tools find no more than perhaps 25-50% of the known vulnerabilities in a typical application. Some vendors let users extend the product by adding their own scripts or exploits, which can help increase the number of vulnerabilities found as well as reduce false positives. Clearly, as the technology progresses, the sophistication of these products will continue to improve. In the meantime, users should focus on their most important requirements for a commercial tool and, in particular, look for flexibility in extending the product as they make their evaluations.
Closing the Loop
Eventually, web application security detection tools will be able to provide border appliances, such as intrusion detection systems (IDSes) and firewalls, with information on how to stop an attack until the underlying vulnerability can be resolved. Various standards have emerged, each aligned with a particular set of vendors.
Some of the more prominent standards include the Application Vulnerability Description Language (AVDL) and Web Application Security (WAS), both XML-based standards. The shifting marketplace factors heavily into which standard will dominate. For example, Sanctum was recently acquired by Watchfire, and it remains to be seen what strategic direction the new parent company will establish, or whether it will shift Sanctum's original strategy of supporting WAS (which was formed as a competitive response to SPI Dynamics' involvement in AVDL). While the industry appears to favor WAS, it is still unclear which standard will dominate and influence commercial product development. It is also not clear how these standards will help customers; right now, companies are focused on finding critical vulnerabilities that they can remediate, thus protecting themselves from cyber attacks.
The current use of most web application security testing tools is still focused on the penetration tester or information security professional, with use extending to QA and audit professionals. We are still a fair distance from holding developers (i.e., software vendors) accountable for writing insecure code, but the trend is clearly moving in that direction. Security has always been a holistic solution, requiring all players and systems to work in concert to form a good defense.
About the Author
Kapil Raina, CISSP, manages complex security product development as a Sr. Product Manager at Cenzic, Inc. He is the author of several books, including "PKI Security Solutions for the Enterprise" published by Wiley & Sons and "mCommerce Security" published by McGraw-Hill, and is a contributing author on "Biometrics" by Woodward, et al. Mr. Raina has spoken at industry events held by the Computer Security Institute, the MIS Training Institute, the Open Group, and ISACA.
This article originally appeared on SecurityFocus.com -- reproduction in whole or in part is not allowed without expressed written consent.