R2 LCO Review Outbrief
2011-09-01
Summary
• Thanks
• Very impressed with:
– Amount of work
– Comprehensive architecture and artifacts
– Knowledge of staff
– Quality of presentations
– Comprehensive demo of R1 – well worth it!
– Frankness and completeness of discussions
– Logistics were excellent as usual – thanks, Paul!
– Ability to stay on schedule
– Cell phone discipline
Report out rules of engagement
• We will add / delete / modify comments in our final report
• Would like to present the entire outbrief and then take questions
Good Stuff
• CI architecture appears to be very well planned, documented, and logical.
• The work that CI has done to test the platform agent on the target hardware was outstanding, and dramatically lowered risk.
• Prototypes were well planned, well executed, and of great value.
• Good that CI reached out to MBARI to benefit from their experience and lessons learned.
Are Release 2 LCO Review Entry Criteria Met?
1. Are Release 2 LCO artifacts provided and adequate? (Artifact Set – Content – Provided – Adequate)

Management
• System Life Cycle Plan – Provided: Yes – Adequate: Yes
• Risk Register – Provided: Yes – Adequate: No (out of date)
• Elaboration Execution Plan – Provided: Yes – Adequate: No (lacks tasks, schedule, resources)

Requirements
• Use cases (mature) – Provided: Yes – Adequate: Unknown
• User workflows (candidate) – Provided: No – Adequate: n/a
• System and subsystem requirements (candidate) – Provided: Yes – Adequate: None for UX; not yet mapped to use cases

Design
• Architecture Specification (candidate) – Provided: Yes – Adequate: Yes
• Technology List (baselined) – Provided: Yes – Adequate: Yes
• Prototype Reports – Provided: Yes – Adequate: Yes

Implementation – None
Deployment – None
Are Release 2 LCO Review Exit Criteria Met?
Are the use cases understood by the stakeholders, and do they provide a complete description of the release scope?
• To our knowledge, the Marine IOs, EPE, and the project scientists were not asked to review the applicable use cases
Are the core user workflows understood and agreed to by the stakeholders?
• We do not know what the core workflows are
• We do not know who the stakeholders are
Are the candidate requirements understood by the stakeholders, and do they cover the critical use cases?
• Requirements have not yet been mapped to use cases for Rel 2
• We do not know which use cases are ‘critical’
Are the critical risks identified, and have they been mitigated through exploratory prototypes?
• We saw evidence that prototypes were put in place to mitigate technical risks – good job
• The risk register was not updated to reflect this mitigation
• We did not see evidence of mitigation strategies for programmatic risks
Are the candidate architectures viable as demonstrated through analysis and prototyping?
• Yes
Is the Elaboration phase execution plan credible?
• The Elaboration Plan appears difficult to use for managing the Elaboration Phase
• The Elaboration Plan is not complete – it lacks schedule and tasking details
Actions from R1 Reviews – Findings, Recommendations, and Suggestions
• Were items satisfactorily addressed from Release 1 reviews?
– Response to findings/recommendations was discussed in the homework presentation.
– Due to time constraints, supporting evidence was not presented for all item resolutions.
Are Release 2 LCO Review Exit Criteria Satisfactorily Met?
• Is this a viable User Interface strategy, team, and approach? No
– Concerned that a single interface cannot apply to multiple audiences
– Not clear that the correct people (e.g., at the Marine IOs) have been interviewed
• Have all of the applicable user interfaces been identified? No
• Have these user interfaces been appropriately characterized for this stage? No
• Have R1 user interfaces been adequately proven to support R2 start? No data to make the assessment
Findings – Use Cases
• There was no identified engagement of appropriate program groups (e.g., Marine IOs, data working groups) in the use case validation process prior to the LCO. CI expected that consensus on the use cases would take place at the LCO. This consensus process should be a detailed and concerted working group effort spanning weeks to months, as opposed to a day or two.
• Holding this LCO review does not remove the need to engage the stakeholders to achieve consensus on use cases.
• For example, there is no use case for Navy embargoed data streams.
Findings – User Interface
• UX does not appear to be sufficiently planned or mature to meet the goal of pixel-perfect GUIs by LCA for Release 2.
Findings - Schedule
• The IMS is not well understood by the team, which affects their ability to plan effectively (e.g., some team members did not truly understand the scope of the release).
Findings - Management
• Spiral development methodology is being used to defer more difficult tasks to later releases.
• Inadequate staffing continues to be a critical risk to CI schedule.
• OL needs to provide leadership and support for system integration between the IOs.
Findings - Risks
• There appear to be significant R2 risks that need active mitigation efforts.
• The risk register is not current
– It has not been updated as a result of recent prototypes
– It does not include the risk of a 3-4x increase in the number of use cases in R2 vs. R1
• Risks do not appear to be proactively managed (too many realized risks)
Findings - Risks, Cont.
• Schedule and Budget impacts as a result of pushing 40% of Release 1 into Release 2 have not been fully developed. This deferment activity should be entered into the risk register as it may have drastic impacts on the ability to deploy Release 2 in time to support the Marine IO deployment.
Findings - Technical
• Apparent lack of a formal trade study process
• There continues to be no decision on how assets will be managed for OOI (SAF vs. CI)
– OL should take the lead in resolving this issue
Findings – Review Process
• Overall, the presentations appeared to lack quantitative backup data (e.g., staff loading and schedule detail; risk metrics)
• There is a misunderstanding of the board’s role in representing the stakeholders at this and previous reviews; the board cannot function as a working group
• It appears that CI has treated each of their milestone reviews so far (R1 LCO, LCA, IOC; R2 LCO) as a progress reporting event rather than as a gate review.
Recommendations – User Community
• Continue to include Rel 1 early adopters as part of Rel 2 user community (and make sure we don’t break what is already there)
• There needs to be a formal mechanism for soliciting feedback from users
• Continue to reach out to outside organizations like MBARI, and expand the list of organizations consulted
Recommendations – Use Cases
• There appears to be a need for end-to-end threads that exercise the entire system as it will be used operationally.
Recommendations – Use Cases and Requirements
• Mapping of use cases to requirements should be accomplished during Inception rather than during Elaboration.
• A requirements verification matrix should be included in the artifacts at all reviews.
• Use case format should include a field for the associated requirements.
Recommendations - Schedule
• CI must develop a critical path schedule and present it to the entire team.
• Overall schedule risk should be re-assessed.
Recommendations - Management
• For issues that cross IO boundaries, OL needs to be proactive and take ownership of managing them.
• CI needs to quickly hire more systems engineers and developers.
• OL and CI need to jointly prioritize requirements and deliverables to pro-actively prepare for budget fluctuations.
• CI should assign a person, as a local representative, to both Marine IOs to support integration.
• Project Management support must be given to UX.
Recommendations - Risks
• Risk 2329 should be promoted to the system level.
• Reinstitute the formal risk management process.
• Add a new risk for Release 1 maintenance consuming Release 2 resources.
• A new risk should be added for the ATBD schedule, and promoted to the system level.
• Mitigation approaches need to be identified and implemented for risks to R2 LCA.
Recommendations – Review Process
• Review charge (LCO, LCA, etc.) should be well understood by all parties (board, development team, management team); there was confusion as to what should be accomplished at the review.
– Provide a complete written charge four weeks prior to the review
• PRR for Release n should precede LCO for Release n+1
Recommendations - Review Process, Cont.
• Allocate more time for questions from the board
• Two weeks before review, provide evidence to review chair that all entry criteria are met; if not, review should not proceed
Stoplight Chart (Technology Area – Technical Risk Assessment – Management Assessment)
• UX – Red (very few screens wireframed) – Red (no plan)
• COI – Green/Yellow (complexity) – Green
• CEI – Green/Yellow (scalability risk) – Green
• DM – Green – Yellow (status tracking, staffing)
• S&A – Green – Yellow (understaffed)
• Marine Integration, Sensor Sets, and Dataset Agents – Green – Green
• Architecture – Green/Yellow (new technologies; will it scale?) – Green
• CyberPoP – Green – Green
Summary Conclusion
• Exit criteria for LCO were not met
• Conduct a delta LCO – specifics will be provided in the final board report. Includes:
– Working review and finalization of use cases
– Mapping of R2 requirements to use cases
– UX plan and screens
– Staffing and resource-loaded schedule with critical path
– Updated risk register with mitigation plans
• OOI board members are available to assist