Cybersecurity: The undiscovered country… Jim Stikeleather
Transcript
"I offer a toast – the undiscovered country – the future."

"The undiscovered country, from whose bourn no traveller returns, puzzles the will."
A perfect storm is on the horizon…
Enterprise 2.0
Management 2.0
Capitalism 2.0
Economics 2.0
IT 2.0
New Game, New Playing Field, New Rules, New Players… Means Innovation, Flexibility, Agility, and more…
IT consumerization
Smart Everything
A new workforce
An old workforce
The new normal
Pervasive simplification
Utility computing
Work mobility & the Hollywood model
Change wants to happen, Tech enables, facilitates and accelerates it…
Risk & security management
Regulations & cyber jurisprudence
Innovate to zero
Confidential Services
An Architecture for Systemic Innovation

(Diagram elements): Signals & trends; foresight; scenarios; themes; insight; strategic plans; frameworks; reference architectures; standards & laws; R&D; IP & patents; innovation for market challenges; innovation for customer & delivery challenges; relative innovation; value challenges; incubator portfolio; plan of intent; plan of record; operational & annual plans; continual improvement.
Then you think about it:

(Process flow diagram, reconstructed as a stage-gate sequence)
1. Ideation – internal resources; external resources
2. Draft new product concepts
3. Narrow concepts under consideration; outline features/benefits
4. Expose to target customers (qualitative) → kill / continue / refine
5. Establish success criteria
6. Form team; define the problem/opportunity
7. High-level requirements
8. Expose to target customers for volume assessment (quantitative)
9. Develop business case / obtain approval to proceed (management review) → kill / continue / refine
10. Plan development and initial timeline (project request/sizing)
11. Executive management review (Investment Committee) → kill or proceed
12. Design, develop & pilot (project launched)
13. Test marketing preparation
14. Test market (new product incubator)
15. Post-test-market evaluation
16. Final rollout plan; final timeline; war games; training
17. Executive group review → kill or proceed
18. Roll-out
19. Evaluation and measurement
20. Turn over to product management

Notes: How far back a step returns depends on the feedback received, and may require refinement of the business case and financials. Some or all of these stages may not apply to smaller efforts, or those with low execution risk and low capital investment; for large projects, or those requiring post-pilot review and approval, these stages will apply (e.g., Investment Committee-level projects). The project release management flow then begins (see next page).
Collaboration and Co-creation: Sur/Petition
Customers, staff, partners, suppliers, competitors
Sur/Petition – Moving Beyond Competition

Competition, with its focus on what others are doing, is only the baseline for survival. "Sur/petition" focuses on value creation, going beyond traditional strategic competition to exploit the vast potential of "integrated values" that surround the purchase and use of products and services. (de Bono, Edward, Sur/petition, HarperCollins, London)
Management – Humanity's greatest invention (the MIX - http://www.managementexchange.com/)
MANAGEMENT SCHOOLS – Beginning Dates and Emphasis

CLASSICAL SCHOOL – Managing workers and organizations more efficiently.
  Scientific Management: 1880s
  Bureaucratic Management: 1920s
  Administrative Management: 1940s

BEHAVIORAL SCHOOL – Understanding human behavior in the organization.
  Human Relations: 1930s
  Behavioral Science: 1950s

QUANTITATIVE SCHOOL – Increasing the quality of managerial decision-making through the application of mathematical and statistical methods.
  Management Science: 1940s
  Operations Management: 1940s
  Management Information Systems: 1950s–1970s

SYSTEMS SCHOOL (1950s) – Understanding the organization as a system that transforms inputs into outputs while in constant interaction with its environment.

CONTINGENCY SCHOOL (1960s) – Applying management principles and processes as dictated by the unique characteristics of each situation.
• Secrecy – "Unsafe at Any Speed" (Nader), "The Jungle" (Sinclair)
• Individuals – the reasonable, rational, prudent person
• Failure to see the common good – Law of the Commons
CyberSec in Real Space – Technology Hardware Threats

1960s – Spy cameras inserted into Xerox copiers.
1980s – DEC computer bound for the USSR modified; cryptographic equipment developed with a backdoor for a security agency.
2006 – Apple shipped iPod systems infected with a Windows-based virus; blamed a contract manufacturer.
2007 – Seagate shipped virus-laden HDDs that searched for passwords, then sent them to a server; blamed an unnamed contractor located in China.
2008 – US Central Command breached using thumb drives with malicious code; FBI Operation Network Raider finds significant penetration of counterfeit network gear in use globally.
Present – Adverse events: kill switches and backdoors built into everyday products; counterfeit electronic components; highly sophisticated malware like Stuxnet; data security breaches; hacktivists.
Industry-wide Risk Assessment (2010)
1. Supplier policies do not adequately address the detection / prevention of counterfeit components or product tampering (unwanted functionality or malware)
2. Required training for suppliers does not address some threats and risks to supply chain security adequately
3. OEMs need to enhance governance / auditing of supplier contractual requirements to ensure compliance
Supply Chain Security Risk Categories: Physical Security, Personnel Security, IT Security, Policy, Processes

Supply Chain Segments:
- Piece part manufacturing (from raw material)
- Component manufacturing & sub-assembly
- Assembly & imaging
- 2nd-touch customization & personalization
- Merge, distribution & fulfillment
- Customer care services
Safe Bets: O-TTPF™
Presenter
Presentation Notes
The safe bet is that data becomes self-protecting by 2030, and hopefully much before that. In the shorter term it is clear that the emphasis has to shift from being an IT issue to being an enterprise-wide risk management issue. There are also lots of standards processes in play. The traditional ones – ISO, IETF (Internet Engineering Task Force), ISA (International Society of Automation), and many others – are, for the time being, slow compared to what is happening.

A few things have to be in place for data to be able to protect itself. First, the hardware has to re-engage with the security issue (showing my age – remember when you had to have a key for a computer?). Something like Intel's Trusted Execution Technology (TXT, trademarked) will need to be in force. The hardware, acquired through a trusted supply chain, can validate any software that is loaded on it – via security/hash code, validated pedigree/configuration, or other forms of checks and balances – in particular, the validation and authentication of the hypervisor that is loading. In turn, the trusted hypervisor (see what Red Hat has done to incorporate the NSA trusted Linux features into KVM) can load the software trust environment. It is this trust environment/engine that will allow functionality and data to identify themselves, authenticate themselves, authorize each other, and adhere to the policies that govern their interactions.

While the actual executing trust environment might not be visible today, the way to specify its permissions is: XACML. This year (2011) XACML 3.0 received a European Identity Award as the most influential standardization effort of the last year. While it will likely continue to evolve, it and rights expression languages (like MPEG-21) will form a foundation not just for security specification, but for governance policy, risk management policy, and the foundations of compliance.
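The validation chain described above – trusted hardware measuring each layer of software before it runs – can be sketched in a few lines. This is a minimal illustration, not Intel TXT itself; the component blobs and "golden" hashes are hypothetical stand-ins for supply-chain-provisioned measurements.

```python
import hashlib

def measure(blob: bytes) -> str:
    """Measure a software component, as trusted hardware would before loading it."""
    return hashlib.sha256(blob).hexdigest()

# Illustrative components of the boot/trust chain
firmware = b"firmware v1.0"
hypervisor = b"trusted hypervisor v2.3"
trust_engine = b"software trust environment v0.9"

# Known-good ("golden") measurements, assumed provisioned via a trusted supply chain
GOLDEN = {
    "firmware": measure(firmware),
    "hypervisor": measure(hypervisor),
    "trust_engine": measure(trust_engine),
}

def validate_boot_chain(stages: dict) -> bool:
    """Each stage is measured before it runs; any mismatch halts the chain."""
    for name, blob in stages.items():
        if measure(blob) != GOLDEN[name]:
            return False
    return True

# An untampered chain validates; a modified hypervisor breaks it
assert validate_boot_chain({"firmware": firmware, "hypervisor": hypervisor,
                            "trust_engine": trust_engine})
assert not validate_boot_chain({"firmware": firmware, "hypervisor": b"evil",
                                "trust_engine": trust_engine})
```

The design point is that trust is transitive: the hardware vouches for the hypervisor, which vouches for the trust engine, which can then vouch for the data policies.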
While XACML continues to evolve, so does the "identity" portion of the equation – SAML. Something that looks like both of them will be in place establishing policy, describing permissions, and ensuring compliance in 2030 and before. What XACML and SAML will be used for is DRM (digital rights management) – its death prematurely announced. DRM got it wrong initially – it focused on copies of the content rather than on identifying and permitting the owner/user. Blu-ray and the current generation of flash devices may very well be the last removable media ever used, alleviating some of the more onerous issues around owners' rights in DRM. The DRM of the future moves from piracy prevention to privacy enforcement – a goal most would accept.

Rights expression languages are the basis for describing who a user is and what they may do with a specific instance of the data, given the context (location, other data in play, time, role, etc.) and intent (create, update, delete, link, share, etc.) of both the data and the user. Lastly, while I hate to bring it up as it presents shades of the MPAA and RIAA wars, DRM will likely be the basis for new business models in cyberspace to ensure the integrity of contracts and transactions.

To reinforce the need for the data to protect itself, one only has to look at the growth of systems truly out of the reach of internal IT – mobile devices – to realize the futility of protecting pipes and devices. Last year people stored enough data to fill 60,000 Libraries of Congress. The world's 4 billion mobile-phone users (12% of whom own smartphones) have turned themselves into data streams. YouTube claims to receive 24 hours of video every minute. Manufacturers have embedded 30m sensors into their products, converting mute bits of metal into data-generating nodes in the internet of things. The number of smartphones is increasing by 20% a year and the number of sensors by 30%.
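The identity-plus-permissions idea above – who the user is, what they intend, and in what context – can be sketched as an attribute-based policy check. Real XACML policies are XML documents and real identity comes from SAML assertions; the roles, rules, and attributes below are illustrative inventions in that spirit.

```python
from dataclasses import dataclass

@dataclass
class Request:
    user_role: str       # who the user is (as an identity assertion would state)
    action: str          # intent: create, update, delete, link, share, read
    location: str        # context
    classification: str  # attribute of the data instance itself

# Toy rule set; "*" matches any value. First matching rule wins.
POLICIES = [
    (("clinician", "read", "hospital", "medical"), "Permit"),
    (("clinician", "share", "*", "medical"), "Deny"),
    (("*", "*", "*", "public"), "Permit"),
]

def evaluate(req: Request) -> str:
    attrs = (req.user_role, req.action, req.location, req.classification)
    for rule, effect in POLICIES:
        if all(r in ("*", a) for r, a in zip(rule, attrs)):
            return effect
    return "Deny"  # default-deny, as self-protecting data would require

assert evaluate(Request("clinician", "read", "hospital", "medical")) == "Permit"
assert evaluate(Request("clinician", "share", "home", "medical")) == "Deny"
assert evaluate(Request("visitor", "read", "anywhere", "public")) == "Permit"
```

Note the default-deny fallthrough: in the self-protecting-data model, the datum itself refuses any interaction its policy does not explicitly permit.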
Which brings us to the relational database: the sheer volume of data being anticipated precludes relational technology, and it is the security systems of relational databases that have formed much of the privacy and compliance infrastructure in IT today. We will have to move from having one application, the RDBMS, control what other applications (payroll, CRM, ERP) have access to data – with those applications hopefully enforcing what can be done with that data – to data which protects itself, especially in a world where even the concept of an application becomes nebulous and virtual, the product of potentially real-time, on-demand, data-conditional mashups of SaaS functionality.

That is not to say that separate security software will go away. Instead, we will see the evolution of security to a model more resembling a biological immune system. I am not talking about the use of biometrics to identify the human part of the information equation, though that will happen much more universally than today. I am talking about true biological models of protection. First, data has to protect itself, much in the same way cells protect themselves in complex organisms, most of the time. Beyond that, immune systems have layers of protection that are systemic above the cells – they mount both innate and adaptive (learning) responses, they self-regulate, they tolerate (risk-manage) low threats (unless there is autoimmune disease), and they protect the total system, not just the components. They also can be passive (pattern matching – "this doesn't belong") or active (what the industry euphemistically calls affirmative defense). We will also see the beginnings of a growth of software diversity for equivalent functionality – one of the strategies real life uses to protect itself from infection – as opposed to the effectively mono-culture software we live in today.
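The two immune layers just described – innate (signature matching: "this doesn't belong") and adaptive (learned tolerance of the normal) – can be sketched as a toy monitor. The signatures, events, and tolerance threshold are all hypothetical illustrations, not a real detection product.

```python
from collections import Counter

# Innate layer: signatures of known-bad patterns (illustrative)
SIGNATURES = {"DROP TABLE", "../../etc/passwd"}

class ImmuneMonitor:
    """Toy two-layer defense: innate signature matching plus an adaptive
    baseline of event frequencies learned during a training window."""

    def __init__(self, tolerance: float = 3.0):
        self.baseline = Counter()
        self.total = 0
        self.tolerance = tolerance  # low threats are tolerated (risk-managed)

    def learn(self, event: str):
        self.baseline[event] += 1
        self.total += 1

    def check(self, event: str) -> str:
        if any(sig in event for sig in SIGNATURES):
            return "block"               # innate response
        seen = self.baseline[event]
        expected = self.total / max(len(self.baseline), 1)
        if seen == 0 or seen < expected / self.tolerance:
            return "flag"                # adaptive response: rare -> suspicious
        return "allow"

mon = ImmuneMonitor()
for _ in range(50):
    mon.learn("GET /index.html")
mon.learn("GET /admin")  # seen only once during training

assert mon.check("SELECT 1; DROP TABLE users") == "block"
assert mon.check("GET /index.html") == "allow"
assert mon.check("GET /never-seen-before") == "flag"
```

The tolerance parameter is the "autoimmune" dial: set it too tight and the system attacks its own legitimate-but-rare traffic.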
We are beginning to see the start of these types of systems with things like vendors' global threat intelligence, digital-ants models (swarm theory), etc. The DOD's advanced research arm, DARPA, is currently working on a program, CRASH, that could radically change cyber security; it is based on the human immune system and will make it less likely that computers will spread cyber infections to other networks. Relevant efforts include: Open Group – O-TTPF™ (Open Trusted Technology Provider Framework), TOGAF, Jericho; OMG – Cloud Standards Customer Council, SysML, CISQ (Consortium for IT Software Quality), industry DTFs.
And then you flip a coin:
Quantum Wars: Computing Versus Security
Presenter
Presentation Notes
Given all that we can foresee and forecast about what the world (individuals, businesses, governments, etc.) will want from information technology, and the cybersecurity implications of the previously described themes, there are a few "races" that are too close to call, with wildly different impacts on cybersecurity depending on the outcomes.

The first is the Quantum Wars. If we get quantum computing capabilities before we get quantum security capabilities, then all is lost (euphemistically speaking). All encryption- or hash-based security models are vulnerable, regardless of key length and the processing power required. Or we get a "quantum factoring war" between the good guys and the bad guys to see who can use the biggest encryption key (I can encode faster than you can decode). On the other hand, if quantum security (specifically communication with one-time pad distribution with value superposition) gets here first, then at least the "integrity" (but not authentication) element of security is significantly simplified. Either scenario results in a very different world in 2030.

The advantage of quantum cryptography lies in the fact that it allows the completion of various cryptographic tasks that are proven or conjectured to be impossible using only classical (i.e., non-quantum) communication. In particular, quantum mechanics guarantees that measuring quantum data disturbs that data; this can be used to detect an adversary's interference with a message. Also, while pure quantum commitment protocols have been shown not to be unconditionally secure, they allow much easier assumptions than workable commitment protocols that are not quantum based. http://www.hitxp.com/articles/software/quantum-computing-internet-security-rsa-https-public-private-cryptography/ Need to make some comment about research already underway on the "post-quantum computing" security era.
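The one-time pad mentioned above is worth seeing concretely, because it is the one scheme whose security does not depend on computational hardness at all: with a truly random pad as long as the message, used once, the ciphertext is information-theoretically secure. What quantum key distribution would contribute is the secure *delivery* of such pads; the pad operation itself is classical XOR. A minimal sketch (the message and locally generated pad are illustrative):

```python
import secrets

def otp(data: bytes, pad: bytes) -> bytes:
    """One-time pad: XOR each message byte with a pad byte. XOR is its own
    inverse, so the same function both encrypts and decrypts."""
    assert len(pad) >= len(data), "pad must be at least message length"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))  # stand-in for a QKD-delivered pad

ciphertext = otp(message, pad)
assert otp(ciphertext, pad) == message   # round-trip recovers the plaintext
```

The catch, of course, is the pad logistics: it must be as long as all traffic, perfectly random, never reused, and secretly shared – which is exactly the distribution problem quantum channels (with their measurement-disturbs-data guarantee) aim to solve.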
Progress towards the building of quantum computers may soon convince decision makers in the security industry that the time has come to abandon insecure public-key cryptosystems and to replace them with post-quantum cryptographic schemes. In a post-quantum world, an eavesdropper is able to use quantum resources, possibly amounting to a large quantum computer, to process information. Thus, post-quantum attackers must be taken into account whenever security models and proofs are concerned, by both quantum and classical cryptosystems, existing and future. And while we still do not know when, how, or even whether we will enter a post-quantum era, there is already ground for common work between the quantum information and cryptography communities.

Second is the cashless society. While it has been discussed forever, it is becoming increasingly driven by demand-pull rather than technology-push economics, and by a new generation with different ideas about privacy (and less concern about Orwellian futures). The issue is really not a cashless society as much as it is completely attributable, traceable economic transactions. As such, reaping the rewards of crime becomes increasingly difficult, changing the ROI models of cybercrime and likely reducing its more common forms, due to the extended, complicated network of transactions (requiring hacks on top of hacks on top of hacks) necessary to receive the proceeds of a crime and in turn make use of them without arousing suspicion.

Lastly is peak everything. We have all heard about peak oil, but there are many more natural resources for which we are approaching peak production, and for which we will consequently soon begin seeing shortages. And there is also peak water, peak population, peak money, peak weather, peak arable land, peak politics, peak taxes – the list is long.

The economic impact of those will affect all technology, including cybersecurity – both in terms of the economic conditions we operate in and the what, how, and where of security. In general the reflective (as opposed to the alarmist) literature suggests 2025 to 2035 as the time all these peaks arrive, putting our 2030 date right in the middle. There is no real clue of the impact; it is here simply to say everything in this presentation could be wrong because of this ultimate black swan – not that we don't know it is coming, but we have no clue of the impact.
A better way to be secured…

(Diagram: three generations of network architecture, with associated technologies)
1. Interconnecting wires – telephony
2. Interconnecting hosts – TCP/IP, IPv6, NAT, DPI, id-loc, patching, middleboxes, CDN, P2P, overlays
3. Interconnecting information – information-centrism, content-centric networking, new ID spaces, clean-slate designs (ROFL, DONA, TRIAD, PSIRP)

Content Centric Networking
Content Centric Computing
Content Centric Storage
Presenter
Presentation Notes
In the end, you cannot secure the pipes, secure the servers, or even just secure the storage – you have to be able to trust the data, based on the data, and nothing else! Current models of security are pretty much cyberspace equivalents of fixed fortifications in "real space" warfare. By the way, in the long run "cloud" does not work without this capability.

In most cases, current Internet architecture treats content and services simply as bits of data transported between end systems. While this relatively simple model of operation had clear benefits when users interacted with well-known servers, the recent evolution of the way the Internet is used makes it necessary to create a new model of interaction between entities representing content. Keep in mind that much if not most data on the internet is being converted to content (defined as data contained in a presentation wrapper of some sort), be it as simple as wrapping it in XML. Three paths of technology development, all of which should start bearing fruit in the next three to five years, are foundations for being able to trust the data solely based on the data itself: content-centric networking (PARC and NSF), content-centric computing (workload based), and content-centric storage (already started with de-duplication technology).

The benefits of content-centric networking go beyond eliminating many security problems by securing information so that integrity and trust are properties of the content, not of the channel:
• Simplifies network use – reduces set-up time and doesn't require manual configuration through firewalls, VPNs, and ad hoc synchronization protocols.
• Provides a seamless, ubiquitous experience – allows people to easily send and receive digital content from multiple locations, mobile devices, and diverse networks.
• Reduces congestion and latency – doesn't send irrelevant or redundant information through network pipelines.
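The core idea – integrity and trust as properties of the content rather than the channel – can be sketched as self-certifying named content: the name is derived from the bytes, and a publisher signature travels with them, so any node on any channel can verify both. This is an illustration only; real content-centric networking uses public-key signatures, and the HMAC key here is a dependency-free stand-in.

```python
import hashlib
import hmac

PUBLISHER_KEY = b"publisher-secret"  # illustrative; CCN would use asymmetric keys

def publish(data: bytes) -> dict:
    """Wrap content with a self-certifying name and a publisher signature."""
    name = hashlib.sha256(data).hexdigest()              # name derived from bytes
    sig = hmac.new(PUBLISHER_KEY, data, hashlib.sha256).hexdigest()
    return {"name": name, "data": data, "sig": sig}

def verify(packet: dict) -> bool:
    """Any node, over any channel, can check integrity and provenance."""
    if hashlib.sha256(packet["data"]).hexdigest() != packet["name"]:
        return False                                      # integrity: name must match bytes
    expected = hmac.new(PUBLISHER_KEY, packet["data"], hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, packet["sig"])   # provenance: publisher signed it

pkt = publish(b"weather report: clear")
assert verify(pkt)
pkt["data"] = b"weather report: storm"  # tampered in transit
assert not verify(pkt)
```

Because verification needs nothing from the delivery path, the content can be cached, relayed peer-to-peer, or served from anywhere without weakening trust – which is precisely why the channel stops mattering.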
• Improves network performance while reducing operating costs – increases efficiency by at least three orders of magnitude.
• Increases network reliability – robustly delivers information using any available medium.
• Supports new and emerging applications – facilitates mobile and wireless access (which are currently relegated to the fringe of the network), and enables broadcast, voice over IP (VoIP), autonomous sensor networks, ubiquitous applications, and context-aware computing.
• Empowers the user – allows people to express intent to their networks (e.g., to prioritize specific content over others) and prioritizes their needs from inside the network.

CCN can solve the IP vulnerability problem through content/publisher authentication and embedded signatures and access control. Many of the other benefits come from the separation of routing and forwarding (which speeds up network evolution), the ability to have private content sharing in a closed user-group configuration without dedicating servers, and a potentially service-provider-independent services concept (enabling aggregation and operation models), which can also be adapted as a managed-service approach.

Content-centric storage is basically the next step beyond current object storage and deduplication technology: facilitating access to data by content and its relationships rather than details of the underlying storage containers, further supported by mechanisms to define domain-specific storage optimizations, and altogether leading to highly simplified and efficient storage access. Users will have the ability to use several services to store their content. This implies that content management applications should be able to federate content from multiple sources, to export content with security and access-rights policies, and to offer rich content management capabilities.
Additionally, most evolving rules for governance, compliance, privacy (and security) actually center around context – data in relationship to other data (including temporal, physical, and geo-location parameters) – effectively a form of content. Content-centric storage mechanisms should simplify both compliance and auditability against these rules and regulations. It also relieves the user of backup and synchronization problems.

Content-centric computing is basically the next step beyond current BPMN and BPEL approaches, as being developed by organizations like the Open Group with the evolution of TOGAF and ArchiMate, or the OMG with its Business Ecology Initiative. A key benefit of content-centric computing is that it helps deliver improved control over business processes, fostering standardization across the company and compliance with regulations, policies, and best practices by removing the idiosyncrasies of data (elements). The most common barrier to agility in the past has been the dreaded "IT backlog" – finding available resources capable of integrating diverse business systems, building custom Web applications, or writing code. Content-centric computing models avoid the IT backlog because they do not rely on programming; their process logic is like a flowchart, enhanced here and there with scripting. Executable designs can be created quickly and changed easily. Integration adapters make connecting to diverse business systems simple, again without code – and unnecessary with content-centric networks and storage. Reusable artifacts in the process component catalogue further enhance agility. With content-centric computing, implementation cycles are likely measured in weeks, not months or years. A final key benefit of content-centric computing is that it makes end-to-end performance visible to process owners, and it provides a built-in platform for problem escalation and remediation as well as validation of security, compliance, and governance processes.
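The storage side of this is easy to make concrete: a content-addressed store, the mechanism underneath deduplication, where the address *is* the hash of the bytes. Identical content is stored once, and every read is self-verifying. A minimal sketch with illustrative payloads:

```python
import hashlib

class ContentStore:
    """Minimal content-addressed store: the key is the SHA-256 of the bytes,
    so duplicates collapse automatically and reads verify themselves."""

    def __init__(self):
        self._chunks = {}

    def put(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self._chunks[key] = data      # a duplicate put overwrites with identical bytes
        return key

    def get(self, key: str) -> bytes:
        data = self._chunks[key]
        # Integrity check is free: the address itself is the checksum
        assert hashlib.sha256(data).hexdigest() == key
        return data

    def unique_chunks(self) -> int:
        return len(self._chunks)

store = ContentStore()
k1 = store.put(b"quarterly report")
k2 = store.put(b"quarterly report")   # deduplicated: same bytes, same key
k3 = store.put(b"annual report")
assert k1 == k2 and store.unique_chunks() == 2
assert store.get(k3) == b"annual report"
```

Addressing by content rather than by container location is also what makes the compliance story simpler: the audit trail can reference immutable, verifiable content identifiers instead of mutable file paths.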
Wildcards:
Presenter
Presentation Notes
Cyberspace meets real space – issues (where, and which identity is the "right" one). No cyberspace security system can ever be better than the real-space system that initially grants identity to any of the actors in cyberspace, be they human or a piece of equipment. Garbage in, security out.

We have seen this movie before – it was called "Westworld": computers designing, building, repairing, maintaining, and updating themselves, to the point humans no longer really know what is going on. While not consciousness or intelligence in our usual sense of the words, we do have computers making decisions and taking actions without human intervention. Just think about earlier this year and the Wall Street flash crash. It does not take a lot of imagination to foresee computers beginning to take over their own protection; after all, even the lowliest of complex biological entities evolved immune systems. With a totally different Darwinian game board, and totally different forms of perception, reasoning, and even "intelligence," it would be impossible, beyond even out-of-the-box thinking, to envision the types of self-protection that could arise without our involvement. It does not take much beyond the capabilities of Watson, CYC, CALO, Eurisko, or SEAS to do a better job than we currently are doing.

Only 10-20 companies have a good grasp of big-data issues and are innovating in this area. Though many companies in the Fortune 1000 are starting to experiment with Hadoop / BigTable / MapReduce, today only 10-20% of enterprises need big-data solutions. This number could grow as high as 40-50% in five years. NoSQL databases are emerging as the preferred systems for storing and managing big data sets. The data in these sets is at the terabyte or petabyte scale; it is semi-structured, highly distributed, and much of it is of unknown value, so it must be processed quickly to identify the interesting parts to keep.
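For readers unfamiliar with the MapReduce model mentioned above, the pattern is simple to sketch in-process: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase folds each group. The log lines below are hypothetical; real Hadoop-style deployments distribute exactly this shape of work across many machines.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record: str):
    """Map: emit (key, value) pairs for each word in a record."""
    for word in record.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: fold each key's values into a single result."""
    return {key: sum(values) for key, values in groups.items()}

records = ["error disk full", "ok", "error timeout", "error disk full"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(r) for r in records)))
assert counts["error"] == 3
assert counts["disk"] == 2
assert counts["ok"] == 1
```

The security-relevant observation from the notes applies directly here: an attacker who can quietly perturb either the input stream or the reduce logic corrupts every downstream decision built on these aggregates.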
The McKinsey Global Institute (MGI) has no qualms about the value of all these data. In a suitably fact-packed new report, "Big data: the next frontier for innovation, competition and productivity," MGI argues that data are becoming a factor of production, like physical or human capital. Companies that can harness big data will trample data-incompetents. Data equity, to coin a phrase, will become as important as brand equity. MGI insists that this is not just idle futurology: businesses are already adapting to big data. The problem is that most of this data, individually, is not valuable, and embedded real-time analytics will be necessary to sort the wheat from the chaff. This introduces a new cyber security threat: manipulation of the data streams or the analytics to cause erroneous decision making or a failed reaction to changing conditions.

As a rapidly maturing communications technology, the Internet has brought people together even while it has reinforced privatism. The desktop computer, the laptop, the cellular and mobile phone, the Global Positioning System, the pilotless drone aircraft, video games, and government documents courtesy of WikiLeaks – all are connected on the network of networks. Together these converged elements of a global socio-technical system offer wonderful possibilities for human emancipation, even while those ideas collide with established ideas of civility and decency. While at this point an academic discussion, a whole new view / practice / reasoning of ethics will of necessity take place, the outcome of which may totally change the nature of what we protect, or not, and more importantly how.

Lastly, maybe the world will evolve to where everyone realizes that cyberspace is the ultimate commons – even more so than shared pasture, lakes, the oceans, and outer space – and that in exchange for giving up privacy, where everything is attributed and accounted for, we gain security.