Software Auditing

Susceptibility Matrix: A New Aid to Software Auditing

Testing for security is lengthy, complex, and costly, so focusing test efforts on the areas that have the greatest number of security vulnerabilities is essential. This article describes a taxonomy-based approach that gives insight into the distribution of vulnerabilities in a system.

Kanta Jiwnani and Marvin Zelkowitz, University of Maryland

Systems end up with security vulnerabilities for many reasons: poor development practices, security policies ignored during system design, incorrect configurations, improper initialization, inadequate testing because of financial- or marketing-imposed deadlines, and so on. System designers usually don't consider security during all phases of development; rather, they usually add it later, in an ad hoc manner. Operational testing helps us find repeated instances of the same vulnerability in successive software releases. If we don't find these vulnerabilities, they'll continue to be exploited in successive releases, leading to unauthorized access to files or even a compromised system or network. We wanted to evaluate the impact security vulnerabilities have on an evolving product, and how and when such vulnerabilities occur. Taxonomies can provide these insights. After reviewing many taxonomies (see the "Security taxonomies" sidebar), we centered on Carl Landwehr's model [1] as the basis for our work. This article describes how we used this taxonomic information to build a matrix that helps software developers, testers, and software auditors understand the distribution of security vulnerabilities and prioritize their effort to achieve a higher level of security for subsequent software releases.

Three-dimensional taxonomy

Table 1 shows how we developed a three-dimensional taxonomy. The first dimension, cause (a modification of Landwehr's "genesis" dimension), describes how the vulnerability enters the system; essentially, it's the type of security flaw. The second dimension, location (modified from Landwehr's "location" dimension), represents the functionality that contains the flaw. We introduced a third dimension, impact, to describe how the vulnerability affects the system. We identified the various categories in the third dimension from an empirical study of known vulnerabilities; a detailed description of all the categories appears elsewhere [2].

To verify whether our taxonomy detects high-risk areas in software, we obtained from the Harris Corporation a database of 853 unique Microsoft Windows flaws across all versions of Windows (pulled from the SANS/FBI list of top 20 vulnerabilities; see www.sans.org/top20/). We then compiled a list of 160 Linux flaws from the Red Hat Linux errata Web site (see https://www.redhat.com/apps/support/errata/). All the flaws in our database have a unique common vulnerabilities and exposures (CVE) identifier [3]. CVE is a list of standardized names for vulnerabilities and other information security exposures (see www.cve.mitre.org). We classified this database of Windows and Linux vulnerabilities according to each criterion in Table 1.

We used STAT Scanner, the Harris Corporation's vulnerability assessment tool for Windows, Unix, and Linux environments, to analyze vulnerability trends in Windows NT 4.0 and Linux systems (see http://statonline.com/solutions/vuln_assess/scanner_index.asp). STAT Scanner assigns each vulnerability a risk level (high, medium, low, or warning), depending on the exploit's impact. Figure 1 shows the number of security flaws according to risk levels in Windows NT 4.0 Service Pack 1 (SP1) through Service Pack 6a (SP6a) and the post-SP6a Security Rollup Package

PUBLISHED BY THE IEEE COMPUTER SOCIETY

1540-7993/04/$20.00 © 2004 IEEE



IEEE SECURITY & PRIVACY


Security taxonomies

The following sources represent various classification taxonomies for security:

R.P. Abbott et al., Security Analysis and Enhancements of Computer Operating Systems, tech. report NBSIR 76-1041, Inst. for Computer Science and Technology, Nat'l Bureau of Standards, 1976.

T. Aslam, A Taxonomy of Security Faults in the Unix Operating System, master's thesis, Purdue Univ., Dept. of Computer Science, 1995.

T. Aslam, Use of a Taxonomy of Security Faults, tech. report 96-05, COAST Laboratory, Dept. of Computer Science, Purdue Univ., Mar. 1996.

R. Bisbey and D. Hollingsworth, Protection Analysis Project Final Report, Information Sciences Inst./RR-78-13, DTIC AD A056816, Univ. of Southern California, May 1978.

M. Bishop, A Taxonomy of Unix System and Network Vulnerabilities, tech. report CSE-95-10, Univ. of California at Davis, Dept. of Computer Science, May 1995.

R.A. Demillo and A.P. Mathur, A Grammar-Based Fault Classification Scheme and Its Application to the Classification of the Errors of TEX, tech. report SERC-TR-165-P, Purdue Univ., Dept. of Computer Science, 1995.

W. Du and A.P. Mathur, "Categorization of Software Errors that Led to Security Breaches," Proc. 21st Nat'l Information Systems Security Conf. (NISSC'98), 1998.

B. Marick, A Survey of Software Fault Surveys, tech. report UIUCDCS-R-90-1651, Univ. of Illinois at Urbana-Champaign, Dept. of Computer Science, Dec. 1990.

E.H. Spafford, "Common System Vulnerabilities," Proc. Workshop Future Directions in Computer Misuse and Anomaly Detection, 1992, pp. 34–37.

SecurityFocus Vulnerability Classification, www.securityfocus.com/bid/8293/help/.


(SRP, the interim bug-fix release to Windows NT 4.0). Most of the flaws fixed in each service-pack release are low-risk flaws: in the figure, the 177 low-risk flaws drop to 77, the 34 medium-risk flaws drop to 20, and the 12 high-risk flaws drop to zero over seven releases.

The successive changes in each release's security flaws provide more relevant information than just the total number of flaws present. From Figure 1, we derived the number and type of security flaws found and fixed in each successive service pack compared to the previous release. Figure 2 shows the number of flaws in each service pack relative to the number in the previous service pack. Starting with a baseline of 234 vulnerabilities in SP1 (see Figure 1), the bars above the x-axis in Figure 2 indicate the number of flaws not present in prior releases; those below indicate the number of flaws fixed in this service pack but present in the prior one. So, from the 234 baseline in Figure 1, we can calculate SP2's flaws as 234 + 2 - 19 = 217 flaws. Again, most of the flaws fixed in each service-pack release are low risk; new medium-risk flaws appear in SP2, SP3, SP6a, and SRP. Only one new high-risk flaw appeared in the releases (in SP3). Figure 2's red and pink lines show how high- and medium-risk flaws change over time. Medium-risk flaws continue to exist in future releases, but high-risk flaws get fixed, a fact that encouraged us to pursue a more detailed analysis.
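The bookkeeping behind Figure 2 can be sketched in a few lines: roll the found/fixed deltas forward from the SP1 baseline. The SP2 delta (+2 found, -19 fixed) and the 234 baseline come from the text above; any further deltas would be read off Figure 2's bars.

```python
# Sketch: deriving cumulative flaw counts per service pack from
# (found, fixed) deltas, as described for Figure 2.

def cumulative_flaws(baseline, deltas):
    """Roll the (found, fixed) deltas forward from a baseline count."""
    totals = [baseline]
    for found, fixed in deltas:
        totals.append(totals[-1] + found - fixed)
    return totals

# 234 flaws in SP1; SP2 introduces 2 new flaws and fixes 19 old ones.
print(cumulative_flaws(234, [(2, 19)]))  # [234, 217]
```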


Susceptibility matrix

Getting back to our original motivations, we wanted to look at how and where security flaws occur, not just their frequency. To do that, we classified vulnerabilities to see if our classification scheme could identify any error-prone

Figure 1. Security flaws. The number of security flaws in Windows NT 4.0 Service Packs 1 through 6a and the post-SP6a Security Rollup Package (SRP) according to risk levels.

www.computer.org/security/





Table 1. Security flaw taxonomy from a security testing perspective.

Cause:
Validation errors
Domain errors
Serialization or aliasing errors
Errors due to inadequate identification or authentication
Boundary and condition errors
Covert channel
Exploitable logic errors

Location:
System initialization
Memory management
Process management or scheduling
Device management
File management
Identification or authentication

Impact:
Unauthorized access
Root or system access
Denial of service
Crash, hang, or exit
Failure
Invalid state
File manipulation
Errors due to clock changes
Integrity failure
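Table 1's three dimensions lend themselves to a direct encoding, so that each vulnerability can be recorded as a (cause, location, impact) triple. The sketch below is ours, not the authors'; the enum member names are shorthand for the table's categories.

```python
# Sketch: Table 1's taxonomy as Python enums, one per dimension.
from enum import Enum

class Cause(Enum):
    VALIDATION = "Validation errors"
    DOMAIN = "Domain errors"
    SERIALIZATION_ALIASING = "Serialization or aliasing errors"
    IDENT_AUTH = "Errors due to inadequate identification or authentication"
    BOUNDARY_CONDITION = "Boundary and condition errors"
    COVERT_CHANNEL = "Covert channel"
    EXPLOITABLE_LOGIC = "Exploitable logic errors"

class Location(Enum):
    SYSTEM_INIT = "System initialization"
    MEMORY_MGMT = "Memory management"
    PROCESS_MGMT = "Process management or scheduling"
    DEVICE_MGMT = "Device management"
    FILE_MGMT = "File management"
    IDENT_AUTH = "Identification or authentication"

class Impact(Enum):
    UNAUTHORIZED_ACCESS = "Unauthorized access"
    ROOT_ACCESS = "Root or system access"
    DOS = "Denial of service"
    CRASH = "Crash, hang, or exit"
    FAILURE = "Failure"
    INVALID_STATE = "Invalid state"
    FILE_MANIPULATION = "File manipulation"
    CLOCK = "Errors due to clock changes"
    INTEGRITY_FAILURE = "Integrity failure"

# A classified vulnerability is then just a triple:
flaw = (Cause.VALIDATION, Location.MEMORY_MGMT, Impact.DOS)
```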


Figure 2. Security flaws. The number of security flaws found versus the number fixed in successive service packs.

system software components. We associated each vulnerability with a triple of cause, location, and impact: each (cause, location) pair got a vector of impacts. We then constructed a susceptibility matrix with causes as columns and locations as rows, which helps us understand the relationship between the previously described cause and location dimensions. Figure 3 shows our susceptibility matrix of the 853 flaws in Windows. We constructed similar susceptibility matrices for the 160 flaws in Linux and for all the Windows NT 4.0 service packs [4]. By constructing susceptibility matrices of vulnerabilities in Windows NT 4.0 SP1 through SRP, we learned several things:
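The matrix construction itself is a simple counting pass over the classified triples. A minimal sketch, with hypothetical example triples (the record format is our illustration, not the paper's data):

```python
# Sketch: building a susceptibility matrix (locations as rows, causes
# as columns) by counting classified vulnerability triples.
from collections import Counter

def susceptibility_matrix(flaws):
    """flaws: iterable of (cause, location, impact) triples.
    Returns a Counter keyed by the (location, cause) cell."""
    return Counter((location, cause) for cause, location, impact in flaws)

flaws = [
    ("validation", "file management", "unauthorized access"),
    ("validation", "file management", "root access"),
    ("boundary violation", "memory management", "denial of service"),
]
matrix = susceptibility_matrix(flaws)
print(matrix[("file management", "validation")])  # 2
```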




MARCH/APRIL 2004

• Vulnerabilities are concentrated in certain combinations of the three dimensions (in a small number of triples).
• High- and medium-risk flaws are concentrated in the same set of triples.
• New security flaws found in successive service packs in Windows NT 4.0 appeared in the same set of triples.

By using this taxonomy to identify high-risk flaws in one release of a system (defined as the rank of security flaws for that dimension), we can eliminate or prevent most security flaws by tuning the testing to look more intensely for them. This also increases the next release's security level, because eliminating most flaws also eliminates the high- and medium-risk ones.

Comparing the susceptibility matrix for Windows vulnerabilities (Figure 3) with that for Linux vulnerabilities [4], we can see that Windows and Linux have common high-risk areas. Figure 4 displays combined susceptibility matrices for Windows (left semicircle) and Linux (right semicircle). Black indicates many flaws (36.5 or more for Windows, 6.85 or more for Linux, reflecting the 5.33:1 ratio of Windows to Linux flaws, 853 to 160), whereas white indicates fewer flaws; a blank area indicates no flaws. Looking at all security flaws in Windows and Linux, we observe that most flaws have similar characteristics in both systems, which implies that vulnerabilities have a similar distribution in different systems. Table 2 summarizes the common areas between Linux and Windows. Emphasizing testing on only five high-density areas identifies roughly two-thirds (78.6 percent of Windows and 62.5 percent of Linux) of the vulnerabilities in both systems. Looking at each system independently, the individual high-density areas (the black semicircles in Figure 4) represent 87.2 percent of Windows and 72.5 percent of Linux flaws.
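The coverage claim above (a handful of dense cells accounting for most flaws) is easy to compute once the matrix exists. A sketch with made-up cell counts, purely for illustration:

```python
# Sketch: what fraction of all flaws do the k densest matrix cells cover?

def coverage_of_top_cells(cell_counts, k):
    """Fraction of all flaws covered by the k densest cells."""
    total = sum(cell_counts.values())
    top = sorted(cell_counts.values(), reverse=True)[:k]
    return sum(top) / total

# Hypothetical (location, cause) -> count cells:
cells = {
    ("file mgmt", "validation"): 120,
    ("system init", "validation"): 119,
    ("memory mgmt", "validation"): 34,
    ("device mgmt", "domain"): 2,
}
print(round(coverage_of_top_cells(cells, 2), 2))  # 0.87
```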

Aid to software auditing

Software auditing seeks to prevent vulnerabilities by searching for them before they occur. Obviously, auditing existing applications and discovering vulnerabilities before they can be exploited is difficult and time-consuming [5]. Moreover, identifying all possible flaws during a software audit is quite impractical. However, previous security incidents can give insights into an application's historical high-risk areas. Using this historical data and the taxonomy in Table 1, we can construct a susceptibility matrix similar to Figure 3 to identify high-risk areas in


Contrasting with threat modeling

Threat modeling is an iterative approach for assessing all the possible vulnerabilities in a system or application and then prioritizing them via risk analysis to guide efforts in addressing the most critical threats first. Here are a few key points for comparing the effectiveness of our taxonomy-based approach, populated with actual vulnerabilities, to threat modeling:

• The susceptibility matrix-based approach is simple and efficient. It does not require further risk analysis.
• Threat modeling can be subjective, depending on the technique chosen to identify risks.
• Threat models become obsolete as new attacks are devised.
• It is impossible to find all possible threats using threat modeling.
• The set of potential threats might be very different from the actual set of threats to the system or application. The susceptibility matrix accurately captures the latter.
• The number of all possible threats might be too large to allow any further analysis.

The susceptibility matrix-based approach adds very little overhead to the overall development process. Threat modeling is usually more time-consuming because all project members need to enumerate the potential threats with use cases, architecture models, and attack trees and then apply appropriate risk models for prioritizing threats.

applications. We can then use this to guide an organization's efforts to increase its level of security in a systematic and repeatable manner.

Sardonix.org encourages members of the open-source community to audit and patch existing and popular open-source applications, to improve overall application security and, consequently, security for the systems on which such applications are installed. It also provides Security Code Review Guidelines (http://packetstorm.widexs.nl/programming-tutorials/code.review.html) and links to auditing tools (http://sardonix.org/Auditing_Resources.html). Using these guidelines along with static analyzers such as BOON, CQual, MOPS, RATS, Flawfinder, Bunch, and PScan, and dynamic debuggers such as Sharefuzz, ElectricFence, and MemWatch, we can find vulnerabilities during audits.

Clearly, auditing millions of lines of code is impractical, even with the help of guidelines and tools, but a good set of guidelines can help jog the memory of experts and serve as an effective training tool for novices [6]. Checkbox solutions for auditing, similarly, are not highly effective on their own, but they can help in the overall process [6]. A better approach might be to focus auditing effort only on the parts of the source tree in which a vulnerability implies a greater organizational risk. Tools identify high-risk areas based on the flaws they are designed to seek, but they might not provide complete coverage of all high-risk areas. A better tactic is to identify high-risk areas with a susceptibility matrix and then use a set of tools to find vulnerabilities in these high-risk areas. Such an approach can guide software audits in multiple ways:

• The susceptibility matrix with historical data identifies high-risk areas in a system for further investigation. For example, if memory management has a large concentration of flaws, auditors can run memory debuggers such as MemWatch or ElectricFence for a more thorough memory check.
• Even in the absence of historical data, high-density areas in the susceptibility matrix can serve as a good starting point for software audits.
• Because the susceptibility matrix provides a security snapshot of the application, software auditors can compare snapshots taken at different times to analyze and evaluate maturity in software security.

Table 2. Percentages of total flaws.

Density                                    Windows (% of flaws)   Linux (% of flaws)
Common high (black circle in Figure 4)     78.6                   62.5
High (black)                               87.2                   72.5
Common low (white circle in Figure 4)      8.7                    19.4
Low (white)                                12.7                   26.3

We tested this last concept with the 25 flaws found in the six Windows NT 4.0 service packs (SP2 through SRP) in Figure 2. Figure 5 presents our analysis. The circles in the figure represent those cells from Figure 4 that contain a high density of Windows errors; the numbers in Figure 5's circles represent the flaws found in these later service packs. We found only three flaws (or 12 percent) in cells not previously identified with a circle, indicating that the initial classification was indicative of where we would find later flaws. This initial study shows that the susceptibility matrix's biggest value may be its use as a vulnerability indicator.
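The first bullet above suggests an obvious automation: map high-density locations in the matrix to candidate audit tools. The sketch below is our illustration; the location-to-tool mapping is hypothetical, not a recommendation from the article.

```python
# Sketch: suggesting audit tools for locations whose total flaw count
# across all causes exceeds a threshold. Tool mapping is illustrative.

AUDIT_TOOLS = {
    "memory management": ["ElectricFence", "MemWatch"],
    "file management": ["RATS", "Flawfinder"],
    "identification/authentication": ["PScan"],
}

def plan_audit(matrix, threshold):
    """matrix: dict of (location, cause) -> flaw count.
    Returns {location: [tools]} for every hot location."""
    totals = {}
    for (location, _cause), count in matrix.items():
        totals[location] = totals.get(location, 0) + count
    return {loc: AUDIT_TOOLS.get(loc, ["manual review"])
            for loc, n in totals.items() if n >= threshold}

matrix = {("memory management", "validation"): 34,
          ("device management", "domain"): 1}
print(plan_audit(matrix, 10))  # {'memory management': ['ElectricFence', 'MemWatch']}
```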





[Figure 3 appears here as a full-page matrix: rows are the location categories of Table 1, columns are the cause categories, and each cell lists the flaw counts by impact (for example, the validation/system-initialization cell lists 49 denial-of-service flaws, 21 unauthorized-access flaws, and several others), along with per-row and per-column sums and high/medium-risk ranks.]

Figure 3. Susceptibility matrix. Our susceptibility matrix of 853 flaws in Windows provides us with a view of the system's vulnerable areas, showing the impact an exploit would cause.

Most implementation-level high-risk areas in software are common (at least between Windows and Linux) in spite of different security policies and development histories. Thus, a taxonomy can help identify high-risk areas; we can then focus our audit efforts on these high-risk areas. Software audits guided by a susceptibility matrix can sometimes be more effective than threat modeling (see the "Contrasting with threat modeling" sidebar). In future work, we hope to evaluate and tailor various tools and testing techniques to discover the vulnerabilities they can find and then map this information to our taxonomy.

Figure 4. Combined susceptibility matrices. Black indicates many flaws, whereas white indicates fewer (the left semicircle represents Windows and the right Linux); a blank area indicates no flaws.

References

1. C.E. Landwehr et al., "A Taxonomy of Computer Program Security Flaws," ACM Computing Surveys, vol. 26, no. 3, 1994, pp. 211–254.
2. K. Jiwnani and M. Zelkowitz, "Maintaining Software with a Security Perspective," Proc. IEEE Int'l Conf. Software Maintenance, IEEE CS Press, 2002, pp. 194–203.
3. R.A. Martin, "Managing Vulnerabilities in Networked Systems," Computer, vol. 34, no. 11, 2001, pp. 32–38.




4. K. Jiwnani, Integrating Software Testing with Security Concerns, master's thesis, Univ. of Maryland, Computer Science Dept., Dec. 2002.
5. C. Cowan, "Software Security for Open-Source Systems," IEEE Security & Privacy, vol. 1, no. 1, 2003, pp. 38–45.
6. G. McGraw and J. Viega, "Auditing Software," chap. 6, Building Secure Software: How to Avoid Security Problems the Right Way, Addison-Wesley, 2001, pp. 115–133.


Figure 5. New flaws found in later releases. Working with the data in Figure 4, the numbers in the circles represent the number of flaws found in later releases of Windows NT 4.0 Service Packs.

Kanta Jiwnani is a PhD student in computer science at the University of Maryland, College Park, where she also received her MS. Her research interests include application and operating systems security, software engineering techniques for security, Web services security, and software auditing. She is a member of the IEEE Computer Society. Contact her at [email protected].

Marvin Zelkowitz is a professor of computer science at the University of Maryland, College Park, and a chief scientist of the Fraunhofer Center. His research interests include software engineering, software measurement, and system security. He has an MS and a PhD from Cornell University. He is a fellow of the IEEE. Contact him at [email protected].



