Cybersecurity:
How did we get here and where are we going?
(how do we get out of here?)
Carl Landwehr
The talk on one slide
1. We have built a tower, but we built it out of swiss cheese.
2. The mice are eating it and the rats are eyeing it.
3. We are setting mousetraps and trying to build faster than they eat.
4. We need a better building code and some inspectors.
1. Where are we?
Coordinates for where we are
• Threat: how likely are attacks to occur?
• Costs: how much are attacks costing us?
• Vulnerability: how weak are our systems?
• Where are we headed: are things getting better or
worse?
Where are we? – threats
Major Events in the past 12 months
• NASDAQ break-in, October 2010 disclosed Feb 2011, probably used to steal
confidential business information from thousands of senior executives
• RSA break-in, disclosed March 17, 2011, SecurID system information
compromised and used to attack others including defense contractors
• Sony PlayStation Network (PSN) break-in, April 17-19, 2011; off the air for a
month, over 70M customers affected, follow-up attacks October 2011
• Booz Allen Hamilton break-in, July 11, 2011, 90,000 military email addresses
and encrypted passwords released.
• 48 of 50 victims were unaware of the breach until notified by law enforcement, according to Mandiant CEO testimony, October 4, 2011
• NSA/Cyber Command Director Gen. Keith Alexander reports at DARPA
Cyber Colloquium: "These organizations are supposed to be the best in the
market, and in my opinion, they are," he said. "But they're the ones that
recognized they were attacked. Most don't.” November 7, 2011.
• See also annual Verizon Data Breach reports
Where are we? – losses
How much are attacks costing us?
• Costs are hard to know – companies may not even know they have been attacked; if they know, it may be hard to determine what information has been lost; and if they can, it may be difficult to place a value on the information.
• Even so, the President’s Cyberspace Policy Review pointed to industry estimates as high as $1 trillion per year in digital theft (2008)
• A newer source: Ponemon Institute, August 2011, Second Annual Cost of Cyber Crime Study
• Cost information based on confidential interviews within 50 benchmarked organizations from a total of 401 organizations contacted, all with at least 700 enterprise “seats” and the largest with almost 140,000 seats. Costs are estimated by respondents based on a structure provided by the survey; a four-week response period is extrapolated to 12 months
• Report available at: http://www.arcsight.com/collateral/whitepapers/2011_Cost_of_Cyber_Crime_Study_August.pdf
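The four-week-to-annual extrapolation described in the methodology above amounts to simple arithmetic; the sketch below illustrates it with a hypothetical dollar figure, not a number from the report:

```python
# Sketch of the survey's four-week-to-annual extrapolation.
# The example cost is hypothetical, not taken from the Ponemon report.
WEEKS_PER_YEAR = 52
SAMPLE_WEEKS = 4

def annualize(four_week_cost: float) -> float:
    """Extrapolate a cost observed over a 4-week window to a 12-month estimate."""
    return four_week_cost * (WEEKS_PER_YEAR / SAMPLE_WEEKS)

# Hypothetical organization reporting $450,000 of cybercrime cost
# over the 4-week response period: 13 four-week periods per year.
print(annualize(450_000))  # 5850000.0
```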
Ponemon Cyber Crime Cost Study Cost Categories
[chart: internal costs vs. external costs]
Ponemon Second Annual Cost of Cyber Crime Study Results
50 U.S. companies reported costs for a 4-week period, extrapolated to annual
One-year increase: 30% in mean, 50% in median
Where are we? – vulnerabilities
How vulnerable are our systems?
• NIST/MITRE Common Vulnerability Scoring System (CVSS) to rate severity of vulnerabilities
• Common Vulnerabilities and Exposures (CVE) to catalog vulnerabilities and provide some basis for counting them
• IBM X-Force publishes reports; the following figures are taken from its 2011 Mid-Year Report.
“The annual vulnerability disclosure rate now appears to be fluctuating between 6,000
and 8,000 new disclosures each year.”
Vulnerability is defined as a set of conditions that leads or may lead to an implicit or
explicit failure of the confidentiality, integrity, or availability of an information system.
“Over half (55 percent) of all vulnerabilities disclosed in the first half of 2010 have
no vendor-supplied patch at the end of the period. This is slightly higher than the
52 percent that applied to all of 2009.”
Source: IBM X-Force mid-year report, August, 2011
“Critical” vulnerabilities increasing; “Critical” + “High” roughly stable
“Medium” includes XSS and SQL injection
(CVSS = Common Vulnerability Scoring System)
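The severity bands behind these counts derive from the CVSS v2 base-score equations published by FIRST/NIST; a sketch:

```python
# Sketch of the CVSS v2 base-score equations (per the published v2 specification).

def cvss2_base(av: float, ac: float, au: float, c: float, i: float, a: float) -> float:
    """CVSS v2 base score from the six base-metric weights."""
    impact = 10.41 * (1 - (1 - c) * (1 - i) * (1 - a))
    exploitability = 20 * av * ac * au
    f_impact = 0.0 if impact == 0 else 1.176
    return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact, 1)

# Worst case: network access (1.0), low complexity (0.71), no authentication
# (0.704), complete loss of confidentiality/integrity/availability (0.660 each).
print(cvss2_base(1.0, 0.71, 0.704, 0.660, 0.660, 0.660))  # 10.0
```

A vulnerability with zero impact scores 0.0 regardless of how exploitable it is, which is why the `f_impact` factor exists.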
Where are we headed? – direction
Are things getting better or worse?
• Dan Geer and Mukul Pareek developed and implemented a sentiment-based index (cf. the Consumer Confidence Index) based on 100 selected respondents; a higher number means more risk
• Reported monthly since March 2011 with base 1000; currently 1153
• Plans to develop a “Cyber Security Prediction Market”
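A base-1000 sentiment index of this general kind can be illustrated with a toy diffusion-index update; this sketch is purely illustrative and is NOT the published Index of Cyber Security methodology:

```python
# Toy base-1000 sentiment index in the style of a consumer-confidence survey.
# Illustrative only -- not the Geer/Pareek Index of Cyber Security methodology.

def update_index(previous: float, responses: list[str]) -> float:
    """Move the index by the net fraction of respondents who see more risk."""
    more = sum(r == "more_risk" for r in responses) / len(responses)
    less = sum(r == "less_risk" for r in responses) / len(responses)
    return round(previous * (1 + (more - less)), 1)

# Hypothetical month: 40 of 100 respondents see more risk, 25 see less.
responses = ["more_risk"] * 40 + ["less_risk"] * 25 + ["same"] * 35
print(update_index(1000.0, responses))  # 1150.0 -- higher means more perceived risk
```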
Another Indicator
WSJ 9/27/2011: “Users are the biggest risk”
Would we expect every employee to lock the front door on the way out?
Privacy – mobile: Wall Street Journal “What They Know” series, http://blogs.wsj.com/wtk-mobile
[chart of trackers receiving Location / Age, Gender / Phone ID: Yahoo, Weeklyplus, Google/Analytics, Google/Adsense, Facebook, Medialets, Google/Doubleclick, Apple/Quattro]
2. How did we get here?
• Computing Trends 1960’s – mid 1970’s
• Business Computing
• Automation of business processes in many industries
• Business analysis
• Some outsourcing to batch providers
• Academic Computing Centers
• Campus-wide research and educational computing
• Development of timesharing systems: CTSS, DTSS, Multics, MTS, …
• Considerable operating system and programming language development
• Commercial timesharing
• CompuServe, Tymshare, National CSS, Comshare etc.
• Commodity computing
• Defense (Military/Intelligence)
• Early real-time command and control systems, e.g., WWMCCS
• Extensive computing for other purposes; cost-driven resource-sharing
Computer Security: 1960’s to mid-’70s (1 of 2)
• Commercial (business data processing):
• Threats: theft of assets, information
• Threat agents: thieves, fraudsters (insider/outsider)
• Mitigation approach:
• Assure accountability via audit and control mechanisms
• Risk assessment to focus resources (RACF, ACF2)
• Academic and commercial online computing services:
• Threats: service theft, programs/data theft, user interference, vandalism
• Threat agents: customers, faculty/students, insiders
• Mitigation approach:
• assure availability of resources: backup arrangements
• accounting for use of resources
Computer Security: 1960’s to mid-’70s (2 of 2)
• Defense computing:
• Threats: espionage, sabotage (incl. of classified info.), Trojan horse programs (Anderson, 1972)
• Threat agents: nation-state actors
• Need to satisfy regulations for protection of classified information (primarily
confidentiality)
• Mitigation approaches:
• “color change”, physical separation, “system high” operation
• “Multi-level secure” computing as a goal: information at different
security levels, users with different clearances, sharing a common
computer system
• Research approaches:
• Reference monitors, security kernels, secure operating systems,
encryption
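The multi-level security goal above (information at different levels, users with different clearances, one shared system) is classically captured by the Bell-LaPadula rules, "no read up, no write down"; a toy sketch using the standard U.S. classification levels:

```python
# Toy sketch of the Bell-LaPadula multi-level security rules:
# "no read up" (simple-security property) and "no write down" (*-property).

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def can_read(clearance: str, classification: str) -> bool:
    """Simple-security property: read only at or below your clearance."""
    return LEVELS[clearance] >= LEVELS[classification]

def can_write(clearance: str, classification: str) -> bool:
    """*-property: write only at or above your level, so data cannot leak downward."""
    return LEVELS[clearance] <= LEVELS[classification]

print(can_read("SECRET", "TOP SECRET"))    # False: no read up
print(can_write("SECRET", "CONFIDENTIAL")) # False: no write down
```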
Engineering Principles for Secure Systems
• Saltzer and Schroeder, “The Protection of Information in Computer Systems,” Proceedings of the IEEE, Sept. 1975 (Vol. 63, No. 9)
• Design principles:
• Economy of mechanism (simplicity over complexity)
• Fail-safe defaults (default exclusion, explicit permission)
• Complete mediation (check each access)
• Open design
• Separation of privilege
• Least privilege
• Least common mechanism (minimize the shared mechanisms)
• Psychological acceptability (usability)
• Work factor (compare cost of breaking mechanism with attacker
resources)
• Compromise recording
• Note that these principles need to be re-interpreted as technology
advances and sometimes different principles conflict
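Two of these principles, fail-safe defaults and complete mediation, can be made concrete in a few lines; a minimal sketch with hypothetical subjects and objects:

```python
# Minimal sketch (hypothetical API) of two Saltzer-Schroeder principles:
# fail-safe defaults (deny unless explicitly permitted) and
# complete mediation (every access goes through one check).

# Explicit permissions: (subject, object) -> set of granted rights.
PERMISSIONS = {
    ("alice", "payroll.db"): {"read"},
    ("bob", "payroll.db"): {"read", "write"},
}

def check_access(subject: str, obj: str, right: str) -> bool:
    """Reference-monitor-style check: the default answer is denial."""
    return right in PERMISSIONS.get((subject, obj), set())

def read_object(subject: str, obj: str) -> str:
    """Every read is mediated by check_access -- no bypass path exists."""
    if not check_access(subject, obj, "read"):
        raise PermissionError(f"{subject} may not read {obj}")
    return f"<contents of {obj}>"

print(check_access("alice", "payroll.db", "write"))  # False: not explicitly granted
print(check_access("carol", "payroll.db", "read"))   # False: unknown subject, denied by default
```

Note how an unlisted subject or right falls through to denial rather than permission, which is exactly the fail-safe-defaults principle.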
OS security R&D and criteria development, 1968–2000
[timeline chart, 1970–2010; milestones include: timesharing demonstrated; Ware Report; Anderson Report (reference monitor concept); ADEPT-50; the “penetrate and patch” period (RISOS, PAP projects); security kernel experimentation (MULTICS AFDSC, MULTICS (AIM), SCOMP, KSOS, DEC VMM security kernel (SKVAX), Military Message Experiment); NCSC founded; Orange Book published (TCB concept); first evaluations completed; TCSEC product development and security profiling; TNI published; TDI published; Federal Criteria first draft; Common Criteria first draft, v. 1.0, and international standard]
A few observations…
•Defense dominated 20th c. cybersecurity R&D
•MULTICS failed because…
•The TCSEC failed because…
•The DEC SKVAX didn’t get to market because…
•SSL/TLS succeeded because…
•The SecurID succeeded because …
•MULTICS succeeded because…
The Multics System: An Examination of Its Structure, by Elliott I. Organick
What is the problem?
•Not that there are attackers
•But that our systems are full of holes
•For privacy, we often rely on the
kindness of strangers
•Underlying problems
• we don’t use sound building blocks
• we can’t measure (in)security
• we don’t know how to build manageable, usable, extensible large scale
systems with sound assurance arguments
• our workforce lacks the proper tools to succeed at these tasks
We’ve been working on this
problem for nearly 40 years
• Security industry focused on band-aids: virus scanners, intrusion detection, recovery, forensics
•Research community too often looking
at the science of band-aids
3. Where are we going?
Two Strategies . . .
Learn to swim with the sharks ….
• This is the world we’ve built, so for now we had better learn to live in it
... or build a seaworthy vessel?
• Boats needn’t be
leak-proof but must
have working
pumps!
Federal Cybersecurity R&D Coordination via National Information Technology Research & Development (NITRD) Subcommittee
[org chart: OMB and OSTP; National Science and Technology Council; National Coordination Office for NITRD; NITRD Subcommittee (senior representatives from agencies conducting NIT R&D); Cyber Security and Information Assurance Interagency Working Group (CSIA IWG; program managers with cybersecurity R&D portfolios); Special Cyber Operations Research and Engineering (SCORE) Interagency Working Group (national security systems R&D); Cybersecurity R&D Senior Steering Group (senior representatives from agencies with national cybersecurity missions)]
Federal Cybersecurity R&D Strategic Thrusts
• Research Themes
• Science of Cyber Security
• Transition to Practice
• Support for National Priorities
Includes DoD, DHS, DoE, NSF, NASA, …
Research Themes
Initial Themes (2010)
• Tailored Trustworthy Spaces – supporting context-specific trust decisions
• Moving Target – providing resilience through agility
• Cyber Economic Incentives – providing incentives to good security
New Theme (2011)
• Designed-in Security – developing and evolving secure software systems
Annually re-examine themes, enrich with new concepts, provide further definition or decomposition
National Science Foundation 2012 Planned Activities
• New Secure and Trustworthy Cyberspace solicitation: NSF 12-503
• Broadens the former Trustworthy Computing program to include the Social, Behavioral and Economic Sciences (SBE), Office of CyberInfrastructure (OCI), and Division of Mathematical Sciences (DMS)
• Proposals to specify perspectives: Trustworthy Computing, SBE, Transition to Practice (or multiple)
• Award sizes: Small, Medium, Frontier (up to $10M over 5 years)
• Planning webinar to inform community
• Likely PI meeting – summer 2012
Relevant Current Research Programs
• NSF Secure and Trustworthy Cyberspace: http://www.nsf.gov/pubs/2012/nsf12503/nsf12503.htm
• CISE-ENG Cyber-Physical Systems: http://www.nsf.gov/pubs/2011/nsf11516/nsf11516.htm (under revision)
• CISE/CNS Future Internet Architecture Program: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=503476&org=CNS
• DARPA programs – see http://www.darpa.mil/Cyber_Colloquium_Presentations.aspx
• DHS programs – see http://www.cyber.st.dhs.gov/
NSF WATCH Lecture Series
• Monthly lecture series at NSF, open to the public; video to be publicly available: http://www.nsf.gov/cise/cns/watch/
• 12/1/11 speaker: Fabian Monrose, UNC: “Hooked On Phonics: Learning to Read Encrypted VoIP Conversations”
• Past speakers:
  11/3/11 – Stefan Savage, UCSD: “Why the hard problem of computer security needs the soft sciences”
  10/6/11 – Douglas Maughan, DHS: “So what if I take over a botnet to do my research?”
  9/1/11 – Marshall Van Alstyne, BU/MIT: “Fighting Fraud from an Economic Perspective”
  8/4/11 – Ken Klingenstein, Internet2: “Trust and Turtles All the Way Down...”
  7/7/11 – Paul L. Harris, Harvard GSE: “Selective Credulity”
  6/1/11 – Fred B. Schneider, Cornell: “Doctrine for Cybersecurity”
• Mailing list for notifications: [email protected]
NSF 2011 Research Award Outcomes (by theme)
Which of the NITRD-CSIA cybersecurity research themes does your project address? (check all that apply)
[bar chart: Tailored Trustworthy Spaces, Moving Target, Cyber Economic Incentives, Designed-In Security, Science of Security, None of These; y-axis 0–70%]
Based on 71 of 84 FY11 projects; scaled by size of award. Overall award distribution: 3 Large, 18 Medium, 50 Small, 9 CAREER, 2 CRI, 2 CPS
National Science Foundation Funding
Please characterize the primary focus of the research in relation to the following set of categories, based roughly on the ISO protocol stack (choose one):
a. hardware – 14%
b. network – 10%
c. operating system – 10%
d. programming language – 18%
e. application, database – 23%
g. user interface, usability – 4%
h. none of these makes sense for my project – 21%
Based on 62 of 84 FY11 projects; not scaled by size of award
National Science Foundation Funding
Please characterize the primary focus of the research in relation to the following set of categories, based on the system development lifecycle (choose one):
a. system requirements, including policy – 10%
b. system design – 40%
c. system implementation – 4%
d. system test / verification – 15%
e. system configuration and management (setup, prior to operation) – 1%
f. system operation and recovery – 3%
g. system usability – 2%
h. none of these makes sense for my project – 26%
Based on 71 of 84 FY11 projects; scaled by size of award
Please characterize the primary focus of the research in relation to the following set of categories, based roughly on attack detection/prevention/response (choose one):
a. forensics – figuring out what damage was done and who might have done it – 2%
b. recovery/restoration – picking up the pieces after a successful attack – 3%
c. surviving attacks – intrusion/attack tolerance, “fighting through” attacks – 1%
d. detecting/understanding attacks – intrusion detection, situational awareness – 18%
e. preventing/disrupting attacks – mechanisms that resist attack or raise the effort required of the attacker – 28%
f. building systems “right” – assuring implementations match specifications and are free of exploitable flaws – 27%
g. knowing what to build – assuring the system requirements include the desired security properties; security modeling – 11%
h. none of these makes sense for my project – 10%
Based on 71 of 84 FY11 projects; scaled by size of award
Please characterize the results you expect at the end of your project (check all that apply):
a. improved understanding of basic concepts and principles
b. prototype implementation of some sort
c. deployable software or hardware of some sort
d. this question doesn’t make sense for my project
[bar chart; y-axis 0–100%]
Based on 71 of 84 FY11 projects; scaled by size of award
The talk on one slide
1. We have built a tower of swiss cheese.
2. The mice are eating it and the rats are eyeing it.
3. We are setting mousetraps and trying to build faster than they eat.
4. We need a better building code and some inspectors.
Also, the tower may have a bomb in it!
Can we get out of here?
What would it take to change the game?
A modest proposal
• Create a center that has as its mission:
• the measurement of security-relevant parameters of current software/hardware/networks, and
• fostering improvement in the measured values over time
• through a program of innovative research, development, and transition activities
On the importance of measurement
In the 1880s, Wisconsin had a problem: watered milk.
Stephen Babcock’s 1890 invention of a simple, inexpensive, and accurate method to measure the butterfat content of milk enabled the development of a high-quality dairy industry in Wisconsin.
Don’t we have enough centers already?
• We have centers for research
• We have centers for response
• No single organization inside or outside the government has both
the responsibility and the resources to improve the situation
• Measurements of the current state of cybersecurity are made
mostly by interested parties
Where should it be located?
• In the government?
• If so, where in the bureaucracy?
• Outside the government?
• If so, can it obtain necessary authority, resources?
What would such an organization need?
• The ability to develop/define a meaningful system of measurements
• Implies a research program
• In-house expertise essential; research might be internal or
external or both
• The authority to make measurements as required and release the
results
• (Possibly) authority to regulate practices in particular contexts
• Consider FDA, EPA as examples
NSF Current Activities – New TwC Large Awards
• Foster, Walker (Cornell University; Princeton University): High-Level Language Support for Trustworthy Networks
• Evans, Katz, Myers (U. of Virginia; U. MD at College Park; Indiana University): Practical Secure Two-Party Computation: Techniques, Tools, and Applications
• Feamster, Freedman, Dingledine (GA Tech; Princeton; Tor Project): Facilitating Free and Open Access to Information on the Internet
• Sandhu, Bertino, Kantarcioglu (U. Tx at San Antonio; Purdue University; U. Tx at Dallas): Privacy-Enhanced Secure Data Provenance