Let’s Talk About Risk, Baby
https://www.societyinforisk.org/reading-list
https://qconsf.com/sf2019/presentation/quantifying-risk
https://github.com/veeral-patel/awesome-risk-quantification
Misc PDFs / Understanding Cyber Risk Quantification.pdf
https://www.cyentia.com/iris/
Factor Analysis of Information Risk (FAIR) Institute
Starting Up Security by Ryan McGeehan
- Articles on Risk Management & Forecasting
How to Measure Anything in Cybersecurity Risk (book)
Open-Sourcing riskquant, a library for quantifying risk
Netflix has a program in its Information Security department for quantifying the risk of deliberate (attacker-driven) and accidental losses.
The Factor Analysis of Information Risk (FAIR) framework describes best practices for quantifying risk, which at the highest level of abstraction involves forecasting two quantities:
- Frequency of loss: How many times per year do you expect this loss to occur?
- Magnitude of loss: If this loss occurs, how bad will it be? (Low-loss scenario to high-loss scenario, representing a 90% confidence interval)
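For illustration, a single loss scenario under this framing boils down to three numbers. A minimal sketch in Python (all names and dollar figures below are invented, not taken from any real assessment):

```python
# Hypothetical loss scenarios: an annual frequency plus a 90% confidence
# interval (low, high) on the loss magnitude in USD. All values are invented.
scenarios = [
    {"name": "Credential stuffing of customer accounts",
     "frequency": 0.5, "low_loss": 100_000, "high_loss": 5_000_000},
    {"name": "Accidental exposure of an internal data store",
     "frequency": 0.1, "low_loss": 1_000_000, "high_loss": 50_000_000},
]
```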
riskquant takes a list of loss scenarios, each with estimates of frequency, low loss magnitude, and high loss magnitude, and calculates and ranks the annualized loss for all scenarios. The annualized loss is the mean loss magnitude spread over the expected interval between events; that interval is roughly the inverse of the frequency (e.g. a frequency of 0.1 implies an event about every 10 years), so the annualized loss works out to frequency × mean magnitude.
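A minimal sketch of that ranking metric, not riskquant's actual API, just the definition above: fit a lognormal distribution whose 5th and 95th percentiles match the low/high loss estimates, take its mean, and scale by the annual frequency.

```python
import math

Z_95 = 1.645  # a 90% confidence interval spans ±1.645σ of the underlying normal

def annualized_loss(frequency, low_loss, high_loss):
    """Mean annual loss for one scenario, treating the magnitude as lognormal
    with low_loss / high_loss as its 5th and 95th percentiles."""
    mu = (math.log(low_loss) + math.log(high_loss)) / 2
    sigma = (math.log(high_loss) - math.log(low_loss)) / (2 * Z_95)
    mean_magnitude = math.exp(mu + sigma ** 2 / 2)  # mean of the lognormal
    return frequency * mean_magnitude               # spread over one year

# e.g. an event expected roughly once a decade, costing $100K-$10M (90% CI):
print(f"${annualized_loss(0.1, 100_000, 10_000_000):,.0f} per year")
```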
riskquant can also calculate a “loss exceedance curve” (LEC) showing how all the scenarios together map to the probability of exceeding different total loss amounts. We do this with a Monte Carlo simulation of (by default) 100,000 possible ways that the next year could play out.
In each simulated year, riskquant uses each scenario's frequency to randomly determine whether (and how many times) that loss occurs. Then, for each loss that occurred, it draws a random value from that scenario's lognormal magnitude distribution. All the loss magnitudes that occurred in a simulated year are summed together. At the end, we calculate the percentiles: what was the minimum loss experienced in the top 1% of simulated years? 2%? 3%? And so on.
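Putting that simulation together as a sketch: the scenario numbers are invented, and the per-year occurrence counts are modeled here as Poisson draws, which is an assumption on my part and may differ from riskquant's internals.

```python
import numpy as np

rng = np.random.default_rng(7)
N_YEARS = 100_000   # simulated years (matches the default mentioned above)
Z_95 = 1.645        # a 90% CI spans ±1.645σ of the underlying normal

# Hypothetical scenarios: (annual frequency, low loss, high loss)
scenarios = [(0.5, 100_000, 5_000_000), (0.1, 1_000_000, 50_000_000)]

total_loss = np.zeros(N_YEARS)
for frequency, low, high in scenarios:
    mu = (np.log(low) + np.log(high)) / 2
    sigma = (np.log(high) - np.log(low)) / (2 * Z_95)
    # How many times the scenario occurs in each simulated year (Poisson here).
    occurrences = rng.poisson(frequency, N_YEARS)
    # One lognormal magnitude per occurrence, summed into the year it belongs to.
    draws = rng.lognormal(mu, sigma, occurrences.sum())
    np.add.at(total_loss, np.repeat(np.arange(N_YEARS), occurrences), draws)

# Loss exceedance: the loss amount exceeded in the worst pct% of simulated years.
for pct in (1, 2, 5, 10, 25, 50):
    print(f"{pct:>2}% chance annual loss exceeds "
          f"${np.percentile(total_loss, 100 - pct):,.0f}")
```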
The resulting aggregated loss exceedance curve for a particular set of loss scenarios could look like Figure 2, where the y-axis of the plot is a percentage, and the x-axis is a loss amount. You should read a point (X, Y) on the curve as: “There’s a Y percent chance that the loss will exceed X”.
You should compare the loss exceedance curve (LEC) to your organization’s risk tolerance — for example, by asking your executives: “If there was a 10% chance of losing more than $1B across all these risks, would that be OK? What amount of loss would you be OK with at that probability?” By finding the answer to that question at different levels of probability, you can draw a tolerance curve. If the LEC falls to the right of the tolerance curve at any point, then you may be carrying too much risk.
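One way to make that comparison concrete (the tolerance numbers are invented, and `total_loss` below is a stand-in for the simulated annual losses from the sketch above):

```python
import numpy as np

rng = np.random.default_rng(7)
# Stand-in for total_loss from the Monte Carlo sketch above.
total_loss = rng.lognormal(mean=18, sigma=1.5, size=100_000)

# Hypothetical tolerance curve: at each exceedance probability, the largest
# annual loss the organization says it is willing to risk.
tolerance = {0.10: 1_000_000_000, 0.05: 2_500_000_000, 0.01: 5_000_000_000}

for prob, acceptable in tolerance.items():
    simulated = np.percentile(total_loss, 100 * (1 - prob))
    verdict = "over tolerance" if simulated > acceptable else "within tolerance"
    print(f"{prob:.0%} exceedance: simulated ${simulated:,.0f} "
          f"vs. tolerated ${acceptable:,.0f} -> {verdict}")
```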
Sunday Data Analytics Digest #116 https://www.evernote.com/shard/s318/client/snv?noteGuid=6a22dfd8-e69c-462e-bbe7-0da3ded9bf89&noteKey=fc3f3f0bc94b76f51c9b775b67e0ffcd&sn=https%3A%2F%2Fwww.evernote.com%2Fshard%2Fs318%2Fsh%2F6a22dfd8-e69c-462e-bbe7-0da3ded9bf89%2Ffc3f3f0bc94b76f51c9b775b67e0ffcd&title=%2523116%253A%2BAnalytics%2Bproduct%2Bmgmt%2Bqs%2B%252F%252F%2BTeam%2Bmanagement
Measuring Cyber Threat Intelligence
https://twitter.com/gertjanbruggink/status/1126434409383714816
https://twitter.com/gertjanbruggink/status/1224607894521499653
https://github.com/gertjanbruggink/Metrics
How To Measure Anything in Cybersecurity Risk - YouTube
- See also the OWASP Bay Area panel on security metrics, with notes by Tad Whitacker
- Richard Seiersen
#143: Data mesh > Data monolith // FAIR focus, pt. 1/3 - email from Jon Hawes