14.10.2015 © Thomas Bleier 1
Thomas Bleier
Security by Design
OWASP EEE
14.10.2015
Definition of "Security"
• Webster: "The quality or state of being secure:
o freedom from danger
o freedom from fear or anxiety
o freedom from the prospect of being laid off"
• In IT, typically defined by
o Confidentiality
o Integrity
o Availability
• "Security" means different things to different people,
especially in IT
Confidentiality
• Only authorized users are able to
access information and/or systems
• Confidentiality vs. privacy
o Privacy: protects the person
o Confidentiality: protects the organisation/information
• Confidentiality of the content of information
vs.
Confidentiality of the source or destination of information
(Metadata)
Integrity
• Prevention of malicious manipulation of
systems and/or data
• Integrity of the content
o Protection against modification
• Integrity of the source of information (authenticity)
o Protection against forged information
• Trust is based on the integrity of information and/or systems
Availability
• Ensure that information and/or systems
can be used by authorized users when
needed
• An important aspect of security, especially
from a business perspective…
• In cyber-physical systems (ICS, etc.),
availability often has a higher priority than
confidentiality or integrity
Other aspects of "Security"
• Non-repudiation of information or actions
• Resilience – recovery from security problems
• Trustworthiness – trust in a system
• Anonymity of information or actions
• Protection against unwanted information or actions
Security is not absolute
• Security level of a system
• Security level compared to peers
• The breadth and the lowest point of the
defenses are crucial, not the highest
point…
Risk
• ISO Guide 73:2002 – Risk: "combination of the probability […] of an event
[…] and its consequence"
Risk = Threat × Vulnerability × Impact
(Threat × Vulnerability = Likelihood)
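As a back-of-the-envelope illustration of the formula (all numbers are invented for the example): a threat occurring 0.5 times per year, a vulnerability that is exploited successfully with probability 0.2, and an impact of €100,000 give an expected loss of €10,000 per year.

```latex
\text{Risk}
= \underbrace{\text{Threat} \times \text{Vulnerability}}_{\text{Likelihood}}
  \times \text{Impact}
= 0.5 \times 0.2 \times 100\,000\,\text{€}
= 10\,000\,\text{€ per year}
```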
Security vs. Safety
• No "100%" security/validation is possible
• Example:
o Invalid input may crash a system with a probability of 1 in 10^15
o Safety: probably acceptable
o Security: an attacker looks for exactly this case
Security by Design
Principles
Best Practice – "Avoid known errors!"
Defense in Depth
• Don't put all your eggs in one basket!
• Multiple layers of defense
• Diverse strategies
• An attacker has to overcome multiple
barriers
• Attacks are more likely to be detected…
• Examples:
o Access Control and Encryption to protect data
o Web Application Firewalls
o Protocol switches/translations
Secure the weakest link
• Attackers usually choose
the simplest way
o Making already-secure parts even more secure does not help
• Find the "weak links"
o e.g. via threat analysis
• Risk management is essential
o Think like an attacker…
• Examples:
o Why try to break the SSL encryption when using a trojan on the client is much
easier?
o Why try to attack the firewall when you can access the database directly via
SQL injection?
Least Privilege
• For each activity, use only the minimal required privileges
• Rights based on the task, not on role/identity
• Granularity of assignment
o e.g. POSIX permissions vs. modern ACLs
• Temporary execution of activities with higher privileges
• Examples:
o User accounts – Unix vs. Windows vs. UAC
o Sandboxing – Adobe Reader, Chrome plugins, etc.
o Privileged ports in Unix (<1024) – daemons should drop root privileges
Open Design
• No "security by obscurity"
o The security of a system must not depend on
keeping its implementation secret
• Kerckhoffs's principle for encryption
o Always assume that an attacker has
complete knowledge of the system
• But: concealing the internal structure of a system can be an additional layer of protection
o e.g. networks – do not publish internal network information (DNS, NAT)
• Examples:
o Encryption algorithms – AES; hash algorithms – SHA-3
o Mifare RFID chip: proprietary algorithm, broken by reverse engineering
Economy of Mechanism
• Security mechanisms should be as simple as possible
• KISS – "Keep it simple, stupid"
• Less functionality means
less that can go wrong…
• Also: no unnecessary security functionality
• Reduces errors in implementation, but also
in configuration and usage
• Makes validation easier
• Examples:
o Microkernel architectures
o Security appliances – beware of "function bloat"
Compartmentalization
• Separation of system
into sealed compartments
• Security breaches in one
area do not necessarily
lead to a whole system
compromise
• Containment of successful
attacks
• Examples:
o Network segmentation
o Virtualization (hypervisors, zones, jails, etc.)
o DigiNotar: public CA and government CA in the same trust zone
Detect – Deter – Prevent
• No security system is perfect
• If you can't prevent successful
attacks, you should at least
detect them…
• Traceability of activities in a
system and correlation to actors
• Deterrence
• Different levels:
o Detect – e.g. forensics
o Deter – the prospect of detection and prosecution deters attackers
o Detect and recover – the attack was successful, but the impact is minimized
o Prevent – the attack is blocked
Detect – Deter – Prevent
• Examples:
o Antivirus, IDS/IPS
o Credit Cards – analysis of transactions
o Bookkeeping – double-entry accounting
o Logging and analysis of transactions in the finance industry
Secure defaults
• "Secure" settings should be
the default
• Less secure settings have to be
activated deliberately
• Blacklisting vs. whitelisting
• Examples:
o Access control: "default deny"
o Network/firewall: all ports blocked, selectively opened
o Operating system: no services active by default
Separation of Duties/Privileges
• Decisions should not be based on a single condition
• More checks mean
more chances that
a security breach
can be detected
• Security vs. availability
• Examples:
o Four-eyes principle
o Two-factor authentication
Least common mechanism
• Different systems/system parts should
not depend on the same security system
• Problem of information transfer via
„covert channels“
• Assumptions that are valid in one case
may be invalid in another
• Examples:
o Single sign-on – central authentication mechanisms
o Password recovery on websites
o Authentication via other services (Facebook, etc.)
Example: Apple iCloud / Amazon Hack
• August 2012: "How Apple and Amazon Security Flaws
Led to My Epic Hacking"
o http://www.wired.com/gadgetlab/2012/08/apple-amazon-mat-honan-hacking/all/
o iCloud – Apple cloud service for the iPhone (backup, sync, etc.)
o Protected by a password
o The password can be reset via Apple Support
o For this you need the billing address and the last 4 digits of the credit card
• How do you get this information?
o Call Amazon Support: "I'd like to add a new credit card"
o Needed: account name, e-mail address, billing address
o Call Amazon Support again and tell them you lost access to your e-mail account
o Needed: account name, billing address and a credit card number
o Log in to the Amazon account via password reset
o Access the last orders – last 4 digits of the credit card used to pay
Completely Mediated Access
• Every access to a system has to be checked
o Not only the first access, the front end, or the user-facing interface
• No bypass of access control
o Developer access
o Performance optimizations
• Examples
o Web Application Firewall
o Maintenance passwords in
various devices/appliances
Fail secure
• In the event of an error a security mechanism should be
in the „secure“ state
• Examples:
o Railway vs. airplane – a stopped train is in a safe state, a stopped airplane is not
o Typical software code (below), often flagged by comments such as
"// this should never happen..." or "// fixme later"
DWORD dwRet = IsAccessAllowed(...);
if (dwRet == ERROR_ACCESS_DENIED) {
// Security check failed.
// Inform user that access is denied.
} else {
// BUG: any other error code (out of memory, API failure, ...)
// also lands here and grants access -- the code fails OPEN.
}
// Fail-secure version: grant access only on explicit success.
if (dwRet == NO_ERROR) {
// Security check OK -- allow access.
} else {
// Denied or errored -- deny access.
}
Psychological acceptability
• Security mechanisms should not be a (big) obstacle
• The UI for security has to be simple
o otherwise it will not be used
o or it will be circumvented
• Security mechanisms should not penalize users who obey the rules
• Design goal: "secure" usage should feel natural, "insecure" usage unnatural
• Examples:
o Passwords: length, complexity, lifecycle vs. the Post-it note
o Browsers – certificate warnings
"Good enough" – Security Economics
• A "perfect" security system is typically not necessary
o and usually not feasible/affordable either
o Too strong a focus on one area leads to negligence in other areas – the weakest link
• "There are no secure systems, only degrees of
insecurity" (Adi Shamir)
• "It's all about risk" – a good risk analysis should be at the
beginning of every security concept
• An absolutely secure system that cannot be used has the
same value as a system without any security
Resilience – what happens after an attack?
• Preventing an attack is not enough
• The system has to stay operational,
even after a successful attack
• Example:
o Content Scrambling System (the DRM of the DVD):
the whole system was broken after reverse engineering of a single player
o Better: Advanced Access Content System (Blu-ray):
a single broken player (key) can be blocked – the system survives
Social Engineering
• Effort to break a system vs.
effort to reach a goal…
• If the technical hurdles get too high,
attackers switch to social engineering
o see Kevin Mitnick
• Microsoft Security Intelligence
Report 2011:
o Nearly half of all malware infections
involve some kind of "user interaction"
Security has a price
The right balance is important!
Security vs. Convenience vs. Functionality vs. Performance
Security has a price…
http://support.microsoft.com/kb/276304/en-us
Security by Design - Literature
• Ross Anderson: Security Engineering, 2008
• Bruce Schneier: Secrets & Lies, 2000
• NIST SP 800-27 – Engineering Principles for Information
Technology Security
• Bruce Schneier: Beyond Fear, 2006
• David Rice: Geekonomics, 2008
• Viega, McGraw: Building Secure Software, 2001
• Saltzer, Schroeder: The Protection of Information in
Computer Systems, 1975
Questions?
Thomas Bleier
Dipl.-Ing. MSc zPM CISSP CISA CISM CEH
Senior Security Architect, Teamlead Security Professional Services
T-Systems Austria GmbH
[email protected] | +43 676 8642 8587
[email protected] | +43 664 3400559