Abstract
Information security assurance and evaluation of software-intensive systems typically rely heavily on the experience of security professionals, so automated approaches are needed in this field. Unfortunately, there is no practical approach to carrying out security evaluation in a systematic way. We introduce an iterative process for security evaluation based on security requirements, metrics, and evidence collection, and discuss its applicability to the design of security evaluation experimentation set-ups in real-world systems. In this approach, security requirements define the basis for security measurements. Furthermore, other kinds of security metrics and other security evidence can be used to support security decision-making.
Original language | English |
---|---|
Title of host publication | Proceedings of the 6th Annual Security Conference 2007 |
Place of Publication | Washington, DC |
Number of pages | 13 |
Publication status | Published - 2007 |
MoE publication type | A4 Article in a conference publication |
Event | 6th Annual Security Conference 2007, Las Vegas, NV, United States. Duration: 11 Apr 2007 → 12 Apr 2007 |
Conference
Conference | 6th Annual Security Conference 2007 |
---|---|
Country/Territory | United States |
City | Las Vegas, NV |
Period | 11/04/07 → 12/04/07 |