Metricon X — Agenda

Posted in metricon

Metricon X will be held on March 21st and 22nd at the Stevens Institute of Technology in Hoboken, NJ.

The theme of the conference is: “Metrics that Matter: Help Management Improve Decision-Making and Improve the Organization’s Security Posture”

The agenda follows. The Chatham House Rule applies.

Agenda

The location of Metricon X is the Babbio Center at the Stevens Institute of Technology, Castle Point on the Hudson, Hoboken, NJ.

Day 1: March 21, 2019

Time             Session
8:30–9:00 Continental breakfast
9:00–9:45 Welcome
Andrew Jaquith
9:45–10:30 A Calibrated Severity Score for Breach Impacts
Suzanne Widup, Verizon Business
Russell Thomas, Zions Bancorporation
10:30–11:15 Defensible Metrics for Improved Network Resilience Scoring to Include Lateral Movement Detection and Susceptibility
Jason Crabtree, Fractal Industries
11:15–11:30 Break
11:30–12:15 Metrics and Standards: Report From the Trenches
Walt Williams, Monotype
12:15–13:00 Lunch
13:00–13:30 Birds of a Feather Discussions
13:30–14:15 Gamifying Vulnerability Risk Data to Encourage Coordinated Disclosure: The Making of the MSRC Top 100
Christa Anderson, Microsoft
14:15–15:00 Integrating Cyber Insurance Into Your Cyber Security Arsenal
Serguei Mokhov, Concordia University
15:00–15:15 Break
15:15–16:00 Metrics that Matter: Help Management Improve Decision-Making and Improve the Organization’s Security Posture
Sanaz Sadoughi, International Monetary Fund
16:00–16:45 Communicating Cyber Risk to the Board of Directors
Wade Baker, Cyentia Institute
16:45–17:30 Break
17:30– Conviviality, conversation, chow and not a hint of covfefe

Breakfast and lunch will be provided. Dinner Thursday night will be self-funded at a local restaurant.

Day 2: March 22, 2019

Time             Session
8:30–9:00 Continental breakfast
9:00–9:45 Why Does Application Security Take So Long?
Chris Eng, Veracode
Jay Jacobs, Cyentia Institute
9:45–10:30 Assigning Probability to Cybersecurity Risk
Jennifer Bayuk, Decision Framework Systems
10:30–10:45 Break
10:45–11:30 Metrics and Standards: Can Data Science Help Understand Privileged Access?
Mike MacIntire, Panaseer
11:30–13:00 Lunch and Open Mic (Metrics Freestyle Rapping)
13:00–13:45 If KPIs are KRIs, Then We’re Measuring It All Wrong
TBD
13:45–14:00 Break
14:00–14:45 Metrics for Organizational Cybersecurity Practices
Benjamin Charles Dean, Columbia University
14:45–15:30 Tactical Metrics Don’t Lead to Strategic Investments
Brian Gay, Northramp LLC
Sean Owen, Abt Associates
15:30–15:45 Closing
Andrew Jaquith

Breakfast and lunch will be provided.

Logistics

Venue

Stevens Institute of Technology, Babbio Center, Castle Point on the Hudson, Hoboken, NJ

Directions

To drive, see the Stevens Institute official driving directions. The parking deck entrance, on Sinatra Drive, is behind the Babbio building. Take the garage elevator to the lobby.

For public transit, see the Stevens Institute official public transportation directions. From Hoboken Station, walk the four blocks to campus along the river on Sinatra Drive. Turn left on 4th Street, right into Stevens Park, and continue onto River Street. The Babbio Center is on the right.

Accommodations

Nearby hotels include:

Session Descriptions

A Calibrated Severity Score for Breach Impacts

Presenters: Suzanne Widup, Verizon Business and Russell Thomas, Zions Bancorporation

Abstract: We present a method for scoring the severity of information security breaches based on observable evidence (“Indicators of Impact”) associated with post-breach activity and consequences. Our data is 3,620 US breach episodes recorded in the VERIS Community Database Project. Each breach episode has been hand-coded with one or more publicly reported Indicators of Impact (36 categories), e.g. “Consent decree”, “Executive churn”, “Language in 8K or 10K [report to the SEC]”, and “Business relationship ended”. Ideally, we want to use these Indicators of Impact to help us estimate a probabilistic cost model for each breach episode. As a stepping stone toward this goal, we have 1) developed an interval-scale severity scoring system; and 2) calibrated the scoring system by estimating the relative contribution of each Indicator of Impact as well as their functional interactions. The resulting severity scores should be useful to practitioners and policy makers for those decisions that can be made based on categorical distinctions – i.e. “bigger than a bread box”. We will also share lessons we have learned regarding how to make quantitative inferences from sparse, incomplete, and perhaps erroneous open source data.
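To make the idea of an indicator-based severity score concrete, here is a minimal sketch of additive scoring over binary Indicators of Impact. The indicator names echo the abstract, but the weights are hypothetical placeholders, not the calibrated values the presenters derived:

```python
# Hypothetical additive severity score over binary "Indicators of Impact".
# The weights below are illustrative placeholders, not the talk's calibrated values.
WEIGHTS = {
    "consent_decree": 3.0,
    "executive_churn": 2.0,
    "sec_filing_language": 1.5,
    "business_relationship_ended": 2.5,
}

def severity_score(observed_indicators):
    """Sum the weights of the indicators observed for one breach episode."""
    return sum(WEIGHTS.get(ind, 0.0) for ind in observed_indicators)

episode = ["consent_decree", "executive_churn"]
print(severity_score(episode))  # 5.0
```

A calibrated version would learn the weights (and interaction terms) from the hand-coded VCDB episodes rather than assigning them by hand.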

Defensible Metrics for Improved Network Resilience Scoring to Include Lateral Movement Detection and Susceptibility

Presenter: Jason Crabtree, Fractal Industries

Abstract: Traditional efforts to define metrics to describe the resilience of networks against different attacks have been plagued by a lack of generality, challenges with data availability or quality, and narrow effectiveness around specific types of attacks or vulnerabilities. We review an extensible approach to compiling both real and synthetic data from multiple sources, propose a generalized scoring methodology using graph-based methods, show a complementary and common approach to attack path determination and planning, apply the technique to several representative test networks, demonstrate the scaling of the methodology to larger paradigmatic networks, and explore how specific detection/response capabilities can be used to reduce the overall state space which must be considered during event set generation. The talk includes a demonstration of detecting complex credential compromise attacks (e.g. Golden Ticket, Silver Ticket, DC Sync and DC Shadow) and uses the presence of such detections on the same reference networks to demonstrate the impact on network resilience scores due to the increased confidence in authentication which constrains post-exploitation attack paths considered in the overall scoring methodology.
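As a toy illustration of the graph-based attack-path determination the abstract mentions, the sketch below finds the shortest lateral-movement path between two hosts with breadth-first search. The host graph is invented for illustration and is not one of the talk's reference networks:

```python
from collections import deque

# Toy host-reachability graph; an edge means lateral movement is possible.
# Hosts and edges are invented examples, not data from the talk.
GRAPH = {
    "workstation": ["fileserver", "printserver"],
    "fileserver": ["dc"],
    "printserver": [],
    "dc": [],
}

def shortest_attack_path(graph, start, target):
    """Breadth-first search for the shortest lateral-movement path."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # target unreachable

print(shortest_attack_path(GRAPH, "workstation", "dc"))
# ['workstation', 'fileserver', 'dc']
```

In the scoring methodology described, detections that constrain post-exploitation movement would effectively delete edges from such a graph, shrinking the set of feasible paths and improving the resilience score.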

Metrics and Standards: Report From the Trenches

Presenter: Walt Williams, Monotype

Abstract: This presentation will provide a critical review of the state of compliance frameworks and information security metrics, as well as a discussion on what success within each looks like and if it is worth the journey to get to that destination.

Gamifying Vulnerability Risk Data to Encourage Coordinated Disclosure: The Making of the MSRC Top 100

Presenter: Christa Anderson, Microsoft

Abstract: One of the ways the Microsoft Security Response Center (MSRC) encourages people to report security vulnerabilities to Microsoft is through public recognition. As part of this effort, for some years we have published the MSRC Top 100 at Black Hat USA to highlight the researchers who have done the most to contribute to the security of our customers and the broader ecosystem.

That’s been our intention, anyway.

In this session we’ll talk about how we’ve measured that contribution, potential pitfalls in designing gamification based on data collected for another purpose, how the algorithm for the top 100 has evolved over the past few years, and how we’re continuing to iterate on this algorithm (and on how we publish the data) to encourage the most valuable research.

Integrating Cyber Insurance Into Your Cyber Security Arsenal

Presenter: Serguei Mokhov, Concordia University

Abstract: Regardless of how the cyber-interloper gets into your network, the next step taken by your IT staff can determine the severity and consequences of the intrusion. It is generally acknowledged that better security and training are needed. Because cyber attackers continue to outpace cyber defenders, it is becoming more and more difficult to improve the situation: attackers need to find only one flaw in a system’s defenses, while defenders need to find and fix them all. As IT practitioners we can take all the precautions necessary for a safe and secure environment and still fail to keep unwanted intruders out. For these instances, a new trend of insurance has slowly developed. This paper looks at the role of cyber insurance and its place in security environments.

Metrics that Matter: Help Management Improve Decision-Making and Improve the Organization’s Security Posture

Presenter: Sanaz Sadoughi, International Monetary Fund

Abstract: Information security metrics present a holistic view of the information security posture of the organization. It is critical to analyze and aggregate “metrics that matter” to provide an overall security risk scorecard to management to help them with decision-making. This presentation explains how metrics were implemented at the International Monetary Fund to drive action and demonstrate return on investment.

Communicating Cyber Risk to the Board of Directors

Presenter: Wade Baker, Cyentia Institute

Abstract: For the last two years, I’ve been doing research into communicating cyber risk to the Board of Directors. Metrics are a major part of this. While this research has been published (https://go.focal-point.com/cyber-balance-sheet-report), I think summarizing findings for and hearing feedback from a room of experts would make for a strong session. I’ve also had the opportunity to implement this research in at least one major organization and can share some lessons learned from that experience.

Why Does Application Security Take So Long?

Presenters: Chris Eng, Veracode and Jay Jacobs, Cyentia Institute

Abstract: Why does it take so long to fix insecure code? We pair new data about the lifecycle of a vulnerability with learnings from application security programs to answer this perennial question. Our data comprises 700,000 individual assessments and a population of over 22 million unique security findings over a 12-month period, easily the largest application security data set of its kind. Chris will discuss outcomes of this study with a particular focus on identifying the factors that correlate most strongly (or not at all!) with fix rates. He’ll also provide data-backed insights into the contentious question of whether DevOps is a boon or a burden for security. Jay will do a deep dive into the analysis process and some of the techniques, such as survival analysis, he applied to the data set in order to measure and visualize the outcomes we were interested in. We’ll also describe how we identified and handled anomalous customer data that would have otherwise produced skewed representations of developer behaviors.
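Survival analysis, mentioned in the abstract, treats “time to fix” like time-to-event data, where findings still open at the end of observation are right-censored rather than discarded. The sketch below is a minimal Kaplan-Meier estimator on invented data, not the presenters' method or data set:

```python
# Toy Kaplan-Meier estimator for vulnerability "time to fix".
# Data are invented: (days_observed, fixed) pairs; fixed=False means the
# finding was still open when observation ended (right-censored).
findings = [(5, True), (12, True), (12, False), (30, True), (45, False)]

def kaplan_meier(data):
    """Return (time, survival probability) pairs at each fix event."""
    data = sorted(data)
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        fixes = sum(1 for d, f in data if d == t and f)   # events at time t
        at_t = sum(1 for d, _ in data if d == t)          # leaving risk set at t
        if fixes:
            surv *= 1 - fixes / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t
        i += at_t
    return curve

print(kaplan_meier(findings))
```

The point of the censoring bookkeeping is that a finding open for 45 days still tells us it "survived" at least that long; dropping it would bias fix-time estimates downward.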

Assigning Probability to Cybersecurity Risk

Presenter: Jennifer Bayuk, Decision Framework Systems

Abstract: The session describes a cybersecurity decision support framework using risk management methodology developed in the professional practice of operational risk management. Operational risk (“ops risk”) is inherently low on quantitative measures in comparison with its more mature risk industry counterparts: credit risk and market risk. However, in the past few decades, professionals in the field have developed systematic data collection methods, control evaluation criteria, and risk analysis techniques that are directly applicable to cybersecurity decision support. Cybersecurity risk managers have gained immediate value from adopting these techniques. An ops risk framework allows cybersecurity risk to be analyzed in the context of both industry standards and organizational attributes. It provides precise definitions for information relevant to decisions and a methodology for using that information in the context of cybersecurity risk management. This session will provide an overview of how an ops risk framework helps organizations with cyber risk identification, classification, quantification, and monitoring.

Metrics and Standards: Can Data Science Help Understand Privileged Access?

Presenter: Mike MacIntire, Panaseer

Abstract: Privileged access is increasingly understood as a challenging problem for organizations to solve. Even Board members understand what privileges “superusers” possess and the potential impact they can have on critical business systems — for good or ill. As one CISO put it, privileged access is “at the intersection of human behavior and technical controls, and often brings IT and security into conflict”. Tools for privileged access management (PAM) exist to manage privileged access, but installing a tool is just the beginning. Once you’ve identified how people should be accessing assets, how do you clean up the tangled web of permissions that exists in your organization without hindering business as usual? In this talk, we’ll reframe PAM as a data science problem and explore what insight you can glean from your data about where the problem lies and how to fix it.

Lunch and Open Mic (Metrics Freestyle Rapping)

Abstract: From 11:30 through lunch, we will provide an open mic for on-the-spot or improv presentations, questions for the community, rants (within reason), and other discussion topics.

If KPIs are KRIs, Then We’re Measuring It All Wrong

Presenter: TBD

Abstract: What are we measuring, and what are we auditing? If the performance of our security teams is of paramount importance, then our KPIs become our KRIs. This talk discusses how we can measure the human performance elements of risk reduction.

Metrics for Organizational Cybersecurity Practices

Presenter: Benjamin Charles Dean, Columbia University

Abstract: To be supplied, based on OECD paper.

Tactical Metrics Don’t Lead to Strategic Investments

Presenters: Brian Gay, Northramp LLC and Sean Owen, Abt Associates

Abstract: Traditional cybersecurity metrics programs are overloaded with streams of data that focus on tactical decisions that don’t allow senior leadership to understand how to make smart risk-focused decisions. In addition, industry has developed tools to reflect this same desire and cater to a highly technical audience primarily focused on self-measurement. In this session, we are proposing a different approach which has been successful at Abt Associates that focuses on a metrics program for non-technical decision makers and risk owners using uncomplicated metrics that are focused on communicating risk and guiding investments.
