Cyberinfrastructure R3 Life Cycle Objectives Review
January 8-9, 2013

Ocean Observatories Initiative
CI Release 3 Life Cycle Objectives Review

Preliminary Review Board Report Out


Regarding the R3 LCO Board Report

• The Report Out represents initial impressions.

• The Board will add, edit, or delete comments in the final report.


Review Board Charge

• Assess the progress of development efforts and plans during the R3 Inception Phase.

• Is R3 Inception progress sufficient to proceed to the Elaboration Phase?

• Develop findings, recommendations, and suggestions for CI and the OOI PMO.


Summary

• Thanks to the CI Team for the opportunity to participate in the development.

• Impressed with and appreciate the:
  – Well-prepared and coordinated presentations
  – Impressive amount of work; dedicated team
  – Well-planned agenda
  – Excellent logistics – thank you


Evaluation Criteria – Entry

• Was the R3 LCO documentation presented complete and sufficient to evaluate the progress of the R3 Inception phase?

Findings:
• Plenty of good information; a challenge to navigate it.
• Appears sufficient (could have used more time to review).

Recommendations:
• Request a list of requirements (especially the critical ones) not addressed in current use cases (an illustrative sketch follows below).
• It is important to map the critical requirements to use cases by LCO.
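As a purely illustrative aid to this recommendation, the sketch below shows one minimal way to flag critical requirements that have no linked use case, working from a flat requirements export. The file name, column names, and priority values are hypothetical; they are not the project's actual DOORS export schema.

```python
# Hypothetical traceability check: list critical requirements with no linked use case.
# Assumes a CSV export with columns "req_id", "priority", and "use_cases"
# (semicolon-separated use-case IDs); these names are illustrative only.
import csv

def uncovered_critical_requirements(path: str) -> list[str]:
    """Return IDs of critical requirements that no use case covers."""
    uncovered = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            is_critical = row["priority"].strip().lower() == "critical"
            linked = [uc for uc in row["use_cases"].split(";") if uc.strip()]
            if is_critical and not linked:
                uncovered.append(row["req_id"])
    return uncovered

if __name__ == "__main__":
    # "r3_requirements_export.csv" is a placeholder file name.
    for req_id in uncovered_critical_requirements("r3_requirements_export.csv"):
        print(req_id)
```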


LCO Artifact Set

Management Set
• System Life Cycle Plan
• Risk Register
• Elaboration Execution Plan

Requirements Set
• Use cases (mature) – not mature at this point
• Candidate system and subsystem requirements


LCO Artifact Set

Design Set
• Architecture specification (candidate)
• Technology list (baselined)
  – What requirements drive the technology choices?
  – Concern with resource investments in technologies not linked to a critical requirement.
• Prototype reports – documentation was minimal and was not clear on what was learned from the exercise. Was a risk mitigated?


Evaluation Criteria – Exit #1

1. Are the use cases understood and do they provide a complete description of the release scope? Does the prioritization of use cases look appropriate?

Findings:
• Challenge to link the presentation material.
• Little/no input from the Marine IOs (MIOs) into use case formulation or ranking (an action). Any requirement not met or dropped could impact the MIOs.
• Action – PMO to facilitate MIO involvement in use case development/refinement/prioritization.

Recommendation: List critical requirements not addressed by existing use cases.


Evaluation Criteria – Exit #2

2. Are the candidate requirements understood and do they cover the critical use cases?

Findings:
• The Review Board has not gone through all the requirements in detail to be able to answer this. We are confident the CI team understands the requirements in DOORS.

Recommendations:
• CI should move forward with Elaboration.
• The Marine IOs agree to look at the requirements and use cases to support prioritization/ranking of the use cases.


General Recommendation

• Preface presentations with the evaluation criteria being addressed.


Evaluation Criteria – Exit #3

3. Are the critical risks identified and have they been mitigated through exploratory prototypes?

Findings:
• It was difficult to see the link between the prototypes and the risks.

Recommendations:
• The Board would like to see a clearer representation of the links between specific risk(s) and the choices of prototype efforts (e.g., a brief description of risk X, a description of the mitigating prototype, and the resulting impact to risk X and to R3); see the illustrative sketch below.
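As an illustration only, the kind of risk-to-prototype linkage the Board is asking for could be captured in a simple record structure like the hypothetical sketch below; every field name and example value is invented for illustration and is not drawn from the R3 risk register.

```python
# Hypothetical record linking a risk to the prototype chosen to mitigate it.
from dataclasses import dataclass

@dataclass
class RiskMitigationRecord:
    risk_id: str            # e.g., "RISK-X" (placeholder identifier)
    risk_description: str   # brief description of the risk
    prototype: str          # prototype effort chosen to mitigate it
    outcome: str            # what was learned from the exercise
    residual_impact: str    # resulting impact to the risk and to R3

example = RiskMitigationRecord(
    risk_id="RISK-X",
    risk_description="Brief description of risk X",
    prototype="Prototype effort undertaken as mitigation",
    outcome="What the exercise demonstrated",
    residual_impact="Resulting impact to risk X and to R3",
)
print(example)
```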


Evaluation Criteria – Exit #4

4. Are the candidate architectures viable as demonstrated through analysis and prototyping?

Findings:
• The Board feels the candidate architectures are viable. There remains concern that not all critical prototype efforts are complete.

Recommendation: Focus effort on critical prototypes with an eye to cost-benefit analysis.


Evaluation Criteria – Exit #5

5. Is the Elaboration Execution Plan credible?

Based on artifacts/presentations/discussions, are board members confident that the elaboration plan will be successful?

Finding: The plan for completing R2-related tasks while entering R3 Elaboration is not clear. Will there be an impact on the progress of R3?

Recommendations:
• Develop a plan and schedule for getting Marine IO involvement in use case maturation.
• Use cases need to be stabilized in the near term.


Observations about this review

• Recommend better representation from the IOs, with expertise in software architecture.
• All artifacts must be posted 1 full week in advance; background material and available artifacts (if mostly done) could be posted 2 weeks in advance to allow reviewers to go through all artifacts.
• Update the document map to reflect the artifacts actually posted (i.e., clearer mapping of document names).
• The broader OOI team would benefit from attending these reviews (in person or via WebEx).

• A clarification.


Questions for the Review Board?


Concerns


Board Questions at the end of Day 1

1. In general, what level of maturity and review do you expect the use cases to have achieved by LCO?  Based on R1 and R2 experiences, is this sufficient?   Have we achieved that level of maturity?  (for John G.)

2. What level of maturity and review do you need the use cases to have achieved to support architecture development in Elaboration?  Based on R1 and R2 experiences, is this sufficient? Have we achieved that level of maturity?  (for Michael M.)

3. Who are the groups of users canvassed by CIUX (just ballpark numbers and types: scientists, operators, data managers, etc.) for input on UI screens/functions to see if the design reflects their input? Have you done beta testing with those people?

4. Are the existing use cases sufficient to address the requirements, and how was this determined? How were use cases ranked (i.e., what criteria were used)?


Breakout Session Assignments

#1 Tuesday, 14:45-16:00
Group 1 – Data Management (Andrea, Steve, Sue)
Group 2 – Analysis and Synthesis (Art, Doug, Ed, Jon)

#2 Wednesday, 08:30-10:00
Group 1 – Common Operating Infrastructure / Common Execution Infrastructure ()
Group 2 – Sensing and Acquisition ()

#3 Wednesday, 10:15-11:45
Group 1 – Marine Integration, Sensor Sets, Dataset Agents (Sue)
Group 2 – Planning and Prosecution ()


R3 Review Board Members

• Sue Banahan* – OOI Program Office (PMO)
• Ed Chapman – PMO, OOI Chief Systems Engineer
• Mark Fornwall – USGS/Ocean Biogeogr. Info. Syst.
• Jonathan Fram – OSU, Endurance Array
• Steve Gaul – CG, Systems Architect & CI Interface
• Doug Luther – U Hawaii, RSN Project Scientist
• Andrea McCurdy – PMO, Assoc. Project Manager EPE
• Art Miller – SIO Climate Sciences, Oceanogr./Sr. Lecturer

*chair


Release 3

• R3 will deliver On-Demand Measurement Processing.
• Incorporates:
  – R1 Data Distribution Network
  – R2 Managed Instrument Network
• Adds end-to-end control of how data are processed, supports more advanced workflows for instrument providers and data product consumers, and provides on-demand measurements supporting event-driven opportunistic observations (see the illustrative sketch below).
• First release of the Integrated Observatory Network (ION) intended for exposure to the general public.
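For illustration only, the sketch below shows a generic event-driven, on-demand measurement pattern of the kind described above. The class and function names are hypothetical stand-ins and do not correspond to the actual ION interfaces.

```python
# Toy publish/subscribe sketch of event-driven, on-demand measurement requests.
from typing import Callable, Dict, List

class EventBus:
    """Minimal stand-in for the observatory's eventing layer (illustrative only)."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event_type: str, event: dict) -> None:
        for handler in self._subscribers.get(event_type, []):
            handler(event)

def request_measurement(event: dict) -> None:
    # A real system would schedule an instrument command; here we only log the request.
    print(f"On-demand measurement requested at {event['site']} (trigger: {event['reason']})")

bus = EventBus()
bus.subscribe("anomaly_detected", request_measurement)
bus.publish("anomaly_detected", {"site": "hypothetical-node", "reason": "threshold exceeded"})
```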


Level 1 Science Themes

• Ocean-Atmosphere Exchange
• Climate Variability, Ocean Circulation, and Ecosystems
• Turbulent Mixing and Biophysical Interactions
• Coastal Ocean Dynamics and Ecosystems
• Fluid-Rock Interactions and the Subseafloor Biosphere
• Plate-scale, Ocean Geodynamics
