Introduction
This article provides additional community guidance for the OWASP SAMM Measure and Improve metrics-related activity described here: Measure and Improve (owaspsamm.org).
Related OpenCRE reference: Steer the secure software development program.
These metrics focus primarily on activities related to software development in R&D teams. If you are looking for metrics related to operations and production environment security, check out this external article: 14 Cybersecurity Metrics + KPIs You Must Track in 2023 | UpGuard
Measure and Improve implementation tips
The goal of these metrics, according to the activity definition on the OWASP SAMM page, is to measure the effectiveness and efficiency of the application security program. It is critical to keep this in mind while creating the list: it must help the application security team by providing insight that is aligned with their own goals and objectives.
Another critical point to remember is that every metric must be consistently measurable and affordable to collect. Each metric should also have a unit of measurement, expressed as a number or a percentage.
To achieve level one maturity, security officers should capture metrics for all three categories: effort, result, and environment. It is important to understand how Effort and Environment metrics affect the Result metrics. Executive teams need to prioritize investments that directly influence the Effort metrics in order to achieve good outcomes in the Result metrics.
Below you will find examples for each metric category and how they influence each other; edit and extend them to suit the goals of your own application security program.
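To make the idea of a consistently measurable metric with an explicit category and unit more concrete, here is a minimal sketch of how such a metric catalogue could be recorded. The field names and example values are illustrative assumptions, not part of the SAMM model.

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    EFFORT = "effort"
    RESULT = "result"
    ENVIRONMENT = "environment"

@dataclass
class Metric:
    name: str
    category: Category
    unit: str      # e.g. "count", "percent", "man-days"
    value: float
    period: str    # reporting period, e.g. "2023-Q4"

# Hypothetical entries, one per category, for illustration only
catalogue = [
    Metric("Threat modeling sessions", Category.EFFORT, "count", 12, "2023-Q4"),
    Metric("Security champion coverage", Category.RESULT, "percent", 68.0, "2023-Q4"),
    Metric("Total number of applications", Category.ENVIRONMENT, "count", 47, "2023-Q4"),
]
```

Keeping every metric in one simple structure like this makes it easier to review whether each entry is measurable, has a clear unit, and belongs to exactly one category.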
Effort metrics
These are metrics that assess the amount of effort spent on security-related activities. They are mostly transactional activities that do not necessarily represent concrete results, but help to achieve them.
Here are some examples:
- Number of trainings or total hours of trainings
- Number of external security events like trainings, workshops, conferences, meetings, communications, etc.
- Number of internal communications
- Time spent performing code reviews
- Number of architectural review sessions focusing on security
- Number of threat modeling sessions
- Number of penetration tests
- Number of review sessions for security requirements (functional requirements related to security)
- Number of closed defects in issue tracking systems like Jira (see the sketch after this list)
- Number of closed defects found by tools like SCA, SAST, and DAST in a given period of time
- Budget spent on Cybersecurity tools/licenses
- Number of OWASP SAMM assessments
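Some of these effort metrics can be collected automatically. As an example for the "closed defects in Jira" item above, here is a minimal sketch that counts recently resolved security defects through the Jira REST search endpoint; the base URL, project key, label, and JQL filter are hypothetical placeholders and will differ per instance.

```python
import requests

JIRA_BASE = "https://your-company.atlassian.net"   # placeholder instance
JQL = 'project = APPSEC AND labels = security AND status = Done AND resolved >= -30d'

def closed_security_defects_last_30_days(session: requests.Session) -> int:
    # The Jira search API returns a "total" field with the number of matching issues
    resp = session.get(
        f"{JIRA_BASE}/rest/api/2/search",
        params={"jql": JQL, "maxResults": 0},  # maxResults=0: we only need the total count
    )
    resp.raise_for_status()
    return resp.json()["total"]

# Usage (assumes a session pre-configured with your Jira credentials):
# session = requests.Session()
# session.auth = ("bot@example.com", "<api-token>")
# print(closed_security_defects_last_30_days(session))
```

Pulling counts like this directly from the tracker keeps the metric consistently measurable and cheap to collect, rather than relying on manual tallies.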
Result metrics
These metrics assess the effectiveness of security efforts. In the second maturity level of this activity, these measurements will be used to define the core KPIs of the application security program.
Examples:
- Cybersecurity training coverage rate
- Cybersecurity knowledge/awareness assessment results – surveys and quizzes can be run after trainings to measure this
- Total number of security champions
- Security champion coverage (number of teams with at least one champion divided by total number of teams or products)
- Security incidents inflow – number of new incidents reported in a given period of time
- Cost per incident in money or man-days
- Average days to patch or close incidents
- Coverage of applications and repositories by tools like SCA, SAST, and DAST, as a percentage (number of applications covered divided by the number of applications that should be covered; see the sketch after this list)
- SCA statistics: current total number of defects or total risk score in 3rd party dependencies
- Number of Critical vulnerabilities older than 90 days in 3rd party components
- SAST statistics: total number of SAST findings related to security
- Security defect statistics: total number of open issues in the security category in issue tracking systems like DSxAuthoring, TFS, or Jira
- SLA statistics: average time it takes to fix security defects and incidents, and total cost in man-days
- OWASP SAMM assessments coverage: number of scopes that had a recent assessment divided by the total number of scopes
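Several of the result metrics above are simple ratios whose denominators come from the environment metrics listed in the next section. A minimal sketch with assumed example figures, to make the coverage formulas concrete:

```python
def coverage_percent(covered: int, total: int) -> float:
    """Generic coverage ratio used by several result metrics."""
    return 100.0 * covered / total if total else 0.0

# Assumed example figures, not real data
teams_with_champion, total_teams = 17, 25
apps_with_sast, apps_in_scope = 40, 47
scopes_assessed, total_scopes = 3, 5

print(f"Security champion coverage: {coverage_percent(teams_with_champion, total_teams):.1f}%")
print(f"SAST coverage:              {coverage_percent(apps_with_sast, apps_in_scope):.1f}%")
print(f"SAMM assessment coverage:   {coverage_percent(scopes_assessed, total_scopes):.1f}%")
```

Expressing these results as percentages of the relevant environment totals makes them comparable across quarters even as the number of teams, applications, or scopes changes.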
Environment metrics
- Number of scopes in the organization that will be used for OWASP SAMM assessments
- Total number of products/applications in the organization
- Total number of products/applications that should be covered by DAST scanning
- Number of 3rd party components
- Total number of DEV, QA, and Ops teams
- Budget available for external cybersecurity events/trainings
- Budget available for Cybersecurity tools/licenses
- Number of cybersecurity officers
Final words
Defining and documenting the list of OWASP SAMM Measure and Improve metrics is an important step before moving on to the second maturity level, where KPIs will be defined based on these metrics.