Date posted: 20-Dec-2015
Sources of Risks
CIT304, University of Sunderland
References
• P. Neumann, 1995, Computer-Related Risks, Addison-Wesley, ISBN: 0-201-55805-X
Risks in Development
• System conceptualization – Misassessment of the technology.
• Requirements definition – Erroneous, incomplete, or inconsistent requirements.
• System design – Fundamental misconceptions or flaws.
• Implementation – Various errors.
• Support systems – Faulty or poor tools.
• System analysis – False assumptions or erroneous models.
• Testing – Incomplete or erroneous testing.
• Evolution – Sloppy maintenance and upgrades.
• Decommission – Premature removal; removal of components used elsewhere.
Risks in Use
• Environment – Earthquakes, floods, fires, etc.
• Animals – E.g., squirrelcide.
• Infrastructure – Loss of power or air conditioning.
• Hardware – Malfunction due to ageing or transients.
• Software – Bugs.
• Communications – Outages, interference, and jamming.
• Human limitations – Installation or misuse.
Note Well…
• Risks are not just security.
• By the way, security risks tend to:
– Mostly involve insiders.
– Mostly involve human behavior.
– Sometimes result from unwarranted assumptions.
– Often be due to design errors or incomplete understanding of a system or technology.
System Conceptualization
• Misunderstanding of the technology
– Too far
– Not far enough
• Cost overruns
• Schedule overruns
• Lack of feasibility
Example: MIFASS (the Marine Integrated Fire and Air Support System). The agency directed the use of a CPU somewhat slower than a first-generation Apple II. There was no recovery.
Requirements Definition
• Erroneous requirements
• Incomplete requirements
• Inconsistent requirements
Extremely common and expensive. Missing requirements are the worst problem.
System Design
• Fundamentally false assumptions
– E.g., infinite speed of light
• Erroneous models
Example: the FAA's Advanced Automation System. The contractor assumed that the average Ada statement generated 5 machine instructions (it was actually 10) and that a 10 MHz machine would, with parallelism, run at an effective 20 MHz (it was actually 12 MHz). There was no recovery.
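The size of the combined misestimate is easy to work out from the figures in the example above; a quick sketch:

```python
# FAA AAS capacity misestimate, using the figures quoted in the example.

assumed_instr_per_stmt = 5       # contractor's assumption
actual_instr_per_stmt = 10       # reality

assumed_mhz = 20.0               # 10 MHz machine, assumed doubled by parallelism
actual_mhz = 12.0                # effective speed actually achieved

# Ada statements executed per second under each set of figures.
assumed_rate = assumed_mhz * 1e6 / assumed_instr_per_stmt   # 4.0 million/s
actual_rate = actual_mhz * 1e6 / actual_instr_per_stmt      # 1.2 million/s

overestimate = assumed_rate / actual_rate
print(f"assumed: {assumed_rate:.1e} stmts/s, actual: {actual_rate:.1e} stmts/s")
print(f"capacity overestimated by a factor of {overestimate:.2f}")  # 3.33
```

Two individually plausible-looking assumptions compound: each was off by a modest factor, but together the system was expected to do more than three times the work it could actually deliver.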
Implementation
• Various and varied.
– Chip fabrication (e.g., the FDIV flaw in Intel's early Pentium chips)
– Wiring
– Programming bugs
– Trojan horses
– Viruses
We will discuss this later.
Support Systems
• Faulty or poor tools
– Language choice
– Compiler/debugger
– Bad tools
– Editing
CASE tools never met expectations. Poor tools sometimes reflect a failure to meet standards; sometimes they are deliberate on the part of a vendor.
System Analysis
• False assumptions about:
– The world
– The operating environment
– Human behavior
• Erroneous models and simulations
Prototypes help here.
Testing
• Incomplete testing
• Erroneous testing
• Faulty code verification
What is a testable requirement? One way of addressing this is Test-Driven Development (TDD), where you write the unit tests first. We teach this in CSE301.
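A minimal sketch of the TDD idea mentioned above: each test pins down one clause of a requirement before the code exists, which forces the requirement to be testable. The function name and report format here are purely illustrative, not from any real system.

```python
# TDD sketch: tests are written first and initially fail; just enough
# code is then added to make them pass. All names are hypothetical.
import unittest

def parse_altitude(report: str) -> int:
    """Extract the altitude in feet from a report like 'ALT:35000'."""
    prefix = "ALT:"
    if not report.startswith(prefix):
        raise ValueError(f"malformed report: {report!r}")
    return int(report[len(prefix):])

class TestParseAltitude(unittest.TestCase):
    # One test per requirement clause: valid input parses, bad input is rejected.
    def test_parses_valid_report(self):
        self.assertEqual(parse_altitude("ALT:35000"), 35000)

    def test_rejects_malformed_report(self):
        with self.assertRaises(ValueError):
            parse_altitude("35000")

if __name__ == "__main__":
    unittest.main(exit=False, verbosity=2)
```

Writing the rejection test first makes the error-handling requirement explicit, which addresses the "incomplete testing" risk directly: an untested requirement clause is visible as a missing test.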
Evolution
• Sloppy maintenance and upgrades
• Misconceptions
• New flaws
• Loss of design coherency
Maintenance organizations do not attract the best engineers. Design the system so that it can be maintained by entry-level staff.
Decommission
• Premature removal
• Removal of components needed elsewhere
• Hidden dependencies
• Replacement not done in time
• Hardware and software end of life
• Vendor profiteering
Environment
• Earthquake
• Flood
• Fire
• Temperature extremes
• Electromagnetic interference (EMI)
• Etc.
Animals
• Sharks (underwater cables)
• Squirrels (enjoy fibre and cabling)
• Monkeys (inquisitive)
• Birds (watch your neighborhood telephone poles)
• Horses (enjoy practical jokes)
• Cattle
• Pigs
• Etc.
Infrastructure
• Power
• Air conditioning
• Physical security
Hardware
• Ageing
• Transients
• Environmental problems
• Errors in design
Software
• Bugs of many sorts
– System development
– Change implementation
– Maintenance
Communications
• Outages
• Natural interference
• Jamming
– Intentional– Accidental
• Tapping• Other
Human Error
• Installation
• Misuse
– Intentional– Unintentional
Adverse Effects
• A myriad. Discuss…