
Benchmark Tutorial -- IV - Participation

Page 1: Benchmark Tutorial -- IV - Participation

www.inl.gov

Benchmarking Experiments for Criticality Safety and Reactor Physics Applications – II – Tutorial

John D. Bess and J. Blair Briggs – INL

Ian Hill (IDAT) – OECD/NEA

This paper was prepared at Idaho National Laboratory for the U.S. Department of Energy under Contract Number (DE-AC07-05ID14517)

2012 ANS Annual Meeting

Chicago, Illinois

June 24-28, 2012

Page 2: Benchmark Tutorial -- IV - Participation

Outline

I. Introduction to Benchmarking
   a. Overview
   b. ICSBEP/IRPhEP

II. Benchmark Experiment Availability
   a. DICE Demonstration
   b. IDAT Demonstration

III. Dissection of a Benchmark Report
   a. Experimental Data
   b. Experiment Evaluation
   c. Benchmark Model
   d. Sample Calculations
   e. Benchmark Measurements

IV. Benchmark Participation


Page 3: Benchmark Tutorial -- IV - Participation

BENCHMARK PARTICIPATION


Page 4: Benchmark Tutorial -- IV - Participation

Getting Started

• Criticality Safety Analyses
  – Most of the work has already been performed
  – Typically only uncertainty and bias calculations remain
• Reactor Operations
  – Occur on a daily basis worldwide
  – Require reorganization and evaluation of activities performed
• Student and Young Generation Involvement
  – Provide education opportunities prior to incorporation into the workforce
  – Expand upon training at facilities/workplace
    • Especially when handbook data are already widely utilized
    • Computational analysis is rapidly being incorporated in the nuclear community
• Just for Fun


Page 5: Benchmark Tutorial -- IV - Participation

Getting Started (Continued)

• ICSBEP Handbook
  – Uncertainty Guide
  – Critical/Subcritical Guide
  – Alarm/Shielding Guide
  – Fundamental Physics Guide
  – DICE User’s Manual
  – Reviewer Checklist
• IRPhEP Handbook
  – Evaluation Guide
  – Uncertainty Guide
• Look at available benchmark data
• Talk with current and past participants in the benchmark programs
• What are you interested in benchmarking?
• What if you can’t do a complete benchmark?


Page 6: Benchmark Tutorial -- IV - Participation


Past and Present Student Involvement

• Since 1995, ~30 students have participated in the ICSBEP and/or IRPhEP
  – 14 directly at INL
• Students have authored or coauthored over 60 benchmark evaluations
  – 28 completed and 2+ in progress at INL

• They have also submitted technical papers to various conferences and journals

Page 7: Benchmark Tutorial -- IV - Participation


Opportunities for Involvement

• Each year the ICSBEP hosts one or two Summer Interns at the Idaho National Laboratory

• INL has also funded benchmark development via the CSNR Next Degree Program

• Students have participated in the projects as subcontractors through various universities and laboratories

• Benchmark development represents excellent work for a collaborative senior design project, a Master of Engineering project, or a Master of Science thesis topic

• Further information can be obtained by contacting the Chairman of the ICSBEP
  – J. Blair Briggs, [email protected]

Page 8: Benchmark Tutorial -- IV - Participation

Opportunities for Student Involvement – I

• Internships
  – Traditional approach
  – 10 weeks
  – Benchmark review process completed on “free time” during university studies
  – Encouraged to publish
• Augmented Education
  – Rigorous structure for Master’s thesis
  – Mentor and peer-review network
  – Undergraduate thesis
  – Undergraduate team projects
  – Encouraged to publish


Page 9: Benchmark Tutorial -- IV - Participation

Opportunities for Student Involvement – II

• Center for Space Nuclear Research (CSNR)
  – Next Degree Program
    • Work as a part- or full-time subcontractor while completing a graduate degree
  – Via local or remote university participation
  – Opportunity for interaction with students participating in space nuclear activities
    • Other Next Degree students
    • Summer fellow students (interns)
  – Collaboration opportunities on other projects


Page 10: Benchmark Tutorial -- IV - Participation

Opportunities for Student Involvement – III

• Nuclear and Criticality Safety Engineers
  – Pilot program collaboration
    • Battelle Energy Alliance (BEA)
    • U.S. Department of Energy – Idaho (DOE-ID)
  – Idaho State University and University of Idaho
  – Graduate Nuclear Engineering curriculum
  – Part-time employment (full-time summer)
  – Hands-on training, benchmarking, ANS meetings, thesis/dissertation work, shadow mentors, DOE and ANS standards development, etc.


Page 11: Benchmark Tutorial -- IV - Participation


Investigation Breeds Comprehension

• Benchmark procedures require investigation into
  – History and background
    • Purpose of the experiment?
  – Experimental design and methods
  – Analytical capabilities and procedures
  – Experimental results
• Often experiments were performed with the intent to provide data for criticality safety assessments
  – Many are utilized to develop criticality safety standards

Page 12: Benchmark Tutorial -- IV - Participation


Culturing Good Engineering Judgment

• Often experimental information is incomplete or misleading
  – Contact original experimenters (if available)
  – Interact with professionals from the ICSBEP/IRPhEP community
  – Establish a personal network for the young professional engineer

“Do, or do not. There is no ‘try.’”
– Jedi Master Yoda

Page 13: Benchmark Tutorial -- IV - Participation


Developing an Analytical Skill and Tool Set

• Evaluators develop analytical and computational capabilities throughout the evaluation process
  – Utility of conventional computational codes and neutron cross-section data libraries
    • Monte Carlo or diffusion methods
    • MCNP and KENO are the most common in the US
  – Application of perturbation theory and statistical analyses
    • Uncertainty evaluation
    • Bias assessment (see the sketch after this list)
  – Technical report writing
  – Understanding acceptability of results
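A minimal sketch of the uncertainty and bias arithmetic named above (an illustration, not part of the tutorial): it combines hypothetical one-at-a-time perturbation effects on k-eff in quadrature, in the spirit of the Uncertainty Guide mentioned earlier, and forms a calculated-minus-benchmark (C - E) bias. All parameter names and values are invented for illustration.

import math

# Hypothetical 1-sigma effects on k-eff, each obtained by perturbing one
# experimental parameter in a separate transport calculation
# (one-at-a-time perturbation; values below are assumed, not real data).
delta_k = {
    "fuel_mass": 0.0012,      # assumed value, illustration only
    "enrichment": 0.0008,     # assumed value
    "reflector_gap": 0.0005,  # assumed value
}

# Independent components combine in quadrature into the benchmark uncertainty.
benchmark_uncertainty = math.sqrt(sum(dk ** 2 for dk in delta_k.values()))

# Bias assessment: calculated k-eff (C) minus benchmark k-eff (E).
k_calculated = 0.9987  # hypothetical Monte Carlo (e.g., MCNP or KENO) result
k_benchmark = 1.0000   # hypothetical benchmark-model value
bias = k_calculated - k_benchmark

print(f"benchmark uncertainty (1 sigma): {benchmark_uncertainty:.5f}")
print(f"bias (C - E): {bias:.5f}")

A calculated result would then be judged against the benchmark value within these combined uncertainties.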

Page 14: Benchmark Tutorial -- IV - Participation

We Want You…

• Numerous reactors and test facilities worldwide can be benchmarked
  – Availability of data and expertise for various systems dwindles with time

• Visit the websites
  – http://icsbep.inl.gov/
  – http://irphep.inl.gov/

• Contact us
  – [email protected]
  – [email protected]
  – [email protected]

http://www.oecd-nea.org/science/wpncs/icsbep/

http://www.oecd-nea.org/science/wprs/irphe/

Page 15: Benchmark Tutorial -- IV - Participation

Reactors Can Be Used for More Than Making Heat


Page 16: Benchmark Tutorial -- IV - Participation

WHAT BENCHMARK DATA ARE YOU MISSING?


Page 17: Benchmark Tutorial -- IV - Participation

Example Experiments (Not Currently Benchmarked)

• Fast Test Reactors
  – ZPR-3, ZPR-6, ZPR-9, and ZPPR Assemblies
  – EBR-II
• Gas Cooled Reactors
  – GA HTGR
  – Fort St. Vrain
  – Peach Bottom
  – PNWL HTLTR
  – ROVER/NERVA Program
  – Project Pluto
  – DRAGON
• Small Modular Reactors
  – SCCA-004
  – SORA Critical Experiment Assembly
  – CLEMENTINE
  – GE-710 Tungsten Nuclear Rocket
  – HNPF
  – PBR-CX
  – SNAP & NASA Activities
  – N.S. Savannah
  – B&W Spectral Shift Experiments


Page 18: Benchmark Tutorial -- IV - Participation

Example Experiments (Continued)

• Research Reactors
  – More TRIGAs
  – AGN
  – PULSTAR
  – Argonaut
  – Pool-type such as MSTR
  – MNSR
  – HFIR
  – ACRR
  – Fast Burst Reactors
  – Kodak CFX
• Experiments at ORCEF
  – Mihalczo HEU Sphere
  – USS Sandwich
  – Moderated Jemima
  – Numerous HEU Cases
  – SNAP Water Immersion
  – “Libby” Johnson Cases
• And many, many more
  – Similar experiments validate the benchmark process and further improve nuclear data
  – See the OECD/NEA IRPhEP website for reports with data for benchmarking


Page 19: Benchmark Tutorial -- IV - Participation

General Benchmark Evaluation/Review Process

• ICSBEP and IRPhEP benchmarks are subject to extensive review

– Evaluator(s) – primary assessment of the benchmark

– Internal Reviewer(s) – in-house verification of the analysis and adherence to procedure

– Independent Reviewer(s) – external (often foreign) verification of the analysis via international experts

– Technical Review Workgroup Meeting – annual international effort to review all benchmarks prior to inclusion in the handbook

• Sometimes a subgroup is assigned to assess any final workgroup comments and revisions prior to publication

– Benchmarks are determined to be Acceptable or Unacceptable for use depending on uncertainty in results

• All Approved benchmarks are retained in the handbook

• Unacceptable data are published for documentation purposes, but benchmark specifications are not provided


Page 20: Benchmark Tutorial -- IV - Participation


Quality Assurance

Each experiment evaluation included in the Handbook undergoes a thorough internal review by the evaluator's organization. Internal reviewers are expected to verify:

1. The accuracy of the descriptive information given in the evaluation, by comparison with original documentation (published and unpublished).

2. That the benchmark specification can be derived from the descriptive information given in the evaluation.

3. The completeness of the benchmark specification.

4. The results and conclusions.

5. Adherence to format.

Page 21: Benchmark Tutorial -- IV - Participation


Quality Assurance (continued)

In addition, each evaluation undergoes an independent peer review by another Technical Review Group member at a different facility. Starting with the evaluator's submittal in the appropriate format, independent peer reviewers are expected to verify:

1. That the benchmark specification can be derived from the descriptive information given in the evaluation.

2. The completeness of the benchmark specification.

3. The results and conclusions.

4. Adherence to format.

Page 22: Benchmark Tutorial -- IV - Participation


Quality Assurance (continued)

A third review by the Technical Review Group verifies that the benchmark specifications and the conclusions were adequately supported.

Page 23: Benchmark Tutorial -- IV - Participation

Annual Technical Review Meeting

• Typically 30 international participants

• Usually held at OECD/NEA Headquarters in Paris, France

• The technical review group convenes to discuss each benchmark, page by page, prior to approval for entry into the handbook

• A list of Action Items is recorded; these must be resolved prior to publication
  – Final approval given by the Internal and Independent Reviewers and, if necessary, a Subgroup of additional reviewers

• Final benchmark report sent to INL for publication


Page 24: Benchmark Tutorial -- IV - Participation

What to Expect at Tech Review Group Meetings


Page 25: Benchmark Tutorial -- IV - Participation

Developing International Camaraderie


Page 26: Benchmark Tutorial -- IV - Participation

Annual Important Dates (Guidelines)

Activity                            | ICSBEP                   | IRPhEP
Benchmark Evaluation                | Year-round               | Year-round
Internal Review                     | Year-round               | Year-round
Independent Review                  | February – March         | August – September
Technical Review Group Distribution | Late March – Early April | Late September – Early October
Technical Review Meeting            | Early May                | Late October
Resolution of Action Items          | May – July               | November – January
Submission of Final Report          | Before Late July         | Before Late January
Handbook Preparation                | June – September         | November – March
Handbook Publication                | End September            | End March


Reviews can be coordinated throughout the year.

These are the guidelines for annual publications.

Page 27: Benchmark Tutorial -- IV - Participation

Questions?


SCCA-002

