1. Benchmarking Experiments for Criticality Safety and Reactor Physics Applications – II – Tutorial
John D. Bess and J. Blair Briggs – INL
Ian Hill (IDAT) – OECD/NEA
www.inl.gov
2012 ANS Annual Meeting
Chicago, Illinois
June 24-28, 2012
This paper was prepared at Idaho National Laboratory for the U.S. Department of
Energy under Contract Number DE-AC07-05ID14517.
2. Outline
I. Introduction to Benchmarking
a. Overview
b. ICSBEP/IRPhEP
II. Benchmark Experiment Availability
a. DICE Demonstration
b. IDAT Demonstration
III. Dissection of a Benchmark Report
a. Experimental Data
b. Experiment Evaluation
c. Benchmark Model
d. Sample Calculations
e. Benchmark Measurements
IV. Benchmark Participation
4. Getting Started
• Criticality Safety Analyses
  – Most of the work has already been performed
  – Typically only uncertainty and bias calculations remain (see the sketch below)
• Reactor Operations
  – Occur on a daily basis worldwide
  – Requires reorganization and evaluation of activities performed
• Student and Young Generation Involvement
  – Provide education opportunities prior to incorporation into the workforce
  – Expand upon training at facilities/workplace
    • Especially when handbook data is already widely utilized
    • Computational analysis is rapidly being incorporated in the nuclear community
• Just for Fun
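To make the remaining "uncertainty and bias" step concrete, the Python sketch below reduces calculated-to-benchmark keff comparisons to a bias and a bias uncertainty. It is a minimal illustration with hypothetical keff values and a deliberately simplified statistical treatment, not a method taken from any handbook evaluation.

```python
import math

# (calculated keff, benchmark keff, benchmark 1-sigma uncertainty) per case;
# all values are hypothetical placeholders, not handbook data
benchmarks = [
    (0.9985, 1.0000, 0.0012),
    (1.0021, 1.0000, 0.0018),
    (0.9968, 0.9992, 0.0021),
]

# Normalize each calculation to its benchmark value (the C/E ratio)
ratios = [calc / bench for calc, bench, _ in benchmarks]
mean = sum(ratios) / len(ratios)

# Bias: mean deviation of C/E from unity (negative = underprediction)
bias = mean - 1.0

# Bias uncertainty: sample spread of C/E combined in quadrature with
# the average benchmark uncertainty (a common simple approximation)
spread = math.sqrt(sum((r - mean) ** 2 for r in ratios) / (len(ratios) - 1))
sigma_b = math.sqrt(sum(u ** 2 for _, _, u in benchmarks) / len(benchmarks))
bias_unc = math.sqrt(spread ** 2 + sigma_b ** 2)

print(f"bias = {bias:+.4f} +/- {bias_unc:.4f} (n = {len(benchmarks)})")
```

Production validation work trends C/E against spectral and material parameters and applies formal tolerance limits, but the basic shape is the same: compare, average, and quantify the spread.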
5. Getting Started (Continued)
• ICSBEP Handbook
  – Uncertainty Guide
  – Critical/Subcritical Guide
  – Alarm/Shielding Guide
  – Fundamental Physics Guide
  – DICE User’s Manual
  – Reviewer Checklist
• IRPhEP Handbook
  – Evaluation Guide
  – Uncertainty Guide
• Look at available benchmark data
• Talk with current and past participants in the benchmark programs
• What are you interested in benchmarking?
• What if you can’t do a complete benchmark?
6. Past and Present Student Involvement
• Since 1995, ~30 students have participated in the ICSBEP and/or IRPhEP
  – 14 directly at INL
• Students have authored or coauthored over 60 benchmark evaluations
  – 28 at INL, with 2+ more in progress
• They have also submitted technical papers to various conferences and journals
7. Opportunities for Involvement
• Each year the ICSBEP hosts one or two Summer Interns at the Idaho National Laboratory
• INL has also funded benchmark development via the CSNR Next Degree Program
• Students have participated in the projects as subcontractors through various universities and laboratories
• Benchmark development represents excellent work for collaborative Senior Design Projects, Master of Engineering projects, or Master of Science thesis topics
• Further information can be obtained by contacting the Chairman of the ICSBEP
  – J. Blair Briggs, J.Briggs@inl.gov
8. Opportunities for Student Involvement – I
• Internships
  – Traditional approach
  – 10 weeks
  – Benchmark review process completed on “free-time” during university studies
  – Encouraged to publish
• Augmented Education
  – Rigorous structure for Master’s Thesis
  – Mentor and peer-review network
  – Undergraduate thesis
  – Undergraduate team projects
  – Encouraged to publish
9. Opportunities for Student Involvement – II
• Center for Space Nuclear Research (CSNR)
– Next Degree Program
• Work as a part- or full-time subcontractor while completing a
graduate degree
– Via local or remote university participation
– Opportunity for interaction with students participating in
space nuclear activities
• Other Next Degree students
• Summer fellow students (interns)
– Collaboration opportunities on other projects
10. Opportunities for Student Involvement – III
• Nuclear and Criticality Safety Engineers
– Pilot program collaboration
• Battelle Energy Alliance (BEA)
• U.S. Department of Energy – Idaho (DOE-ID)
– Idaho State University and University of Idaho
– Graduate Nuclear Engineering Curriculum
– Part-time employment (full-time summer)
– Hands-on training, benchmarking, ANS meetings,
thesis/dissertation work, shadow mentors, DOE and
ANS standard development, etc.
11. Investigation Breeds Comprehension
• Benchmark procedures require investigation into
– History and background
• Purpose of experiment?
– Experimental design and methods
– Analytical capabilities and procedures
– Experimental results
• Often experiments were performed with the intent to provide data for criticality safety assessments
  – Many are utilized to develop criticality safety standards
12. Culturing Good Engineering Judgment
• Often experimental information is incomplete or
misleading
– Contact original experimenters (if available)
– Interact with professionals from the ICSBEP/IRPhEP
community
– Establish a personal network for the young professional
engineer
“Do, or do not. There is no ‘try.’”
– Jedi Master Yoda
13. Developing an Analytical Skill and Tool Set
• Evaluators develop analytical and computational capabilities throughout the evaluation process
  – Use of conventional computational codes and neutron cross section data libraries
    • Monte Carlo or diffusion methods
    • MCNP and KENO are the most common in the US
  – Application of perturbation theory and statistical analyses (see the sketch below)
    • Uncertainty evaluation
    • Bias assessment
  – Technical report writing
  – Understanding acceptability of results
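As a concrete illustration of the perturbation-plus-statistics workflow listed above, here is a minimal Python sketch: perturb each uncertain experimental parameter, compute the keff sensitivity, and combine the independent effects in quadrature. The keff values, parameters, and uncertainties are hypothetical stand-ins for output from a code such as MCNP or KENO; real ICSBEP evaluations use many more components and treat correlated uncertainties explicitly.

```python
import math

K_NOMINAL = 1.00120  # keff of the nominal benchmark model (hypothetical)

# parameter: (keff of perturbed model, size of applied perturbation,
#             1-sigma experimental uncertainty, parameter units)
perturbations = {
    "fuel enrichment": (1.00310, 0.10, 0.05, "wt.% 235U"),
    "clad thickness":  (1.00095, 0.05, 0.02, "mm"),
    "water density":   (1.00170, 0.01, 0.002, "g/cm3"),
}

components = {}
for name, (k_pert, delta, sigma, units) in perturbations.items():
    # Linear sensitivity: change in keff per unit change in the parameter
    sensitivity = (k_pert - K_NOMINAL) / delta
    # 1-sigma effect of this parameter's uncertainty on keff
    components[name] = abs(sensitivity) * sigma
    print(f"{name:16s} sensitivity = {sensitivity:+.5f} per {units}, "
          f"dk = {components[name]:.5f}")

# Independent components combine in quadrature
total = math.sqrt(sum(dk ** 2 for dk in components.values()))
print(f"total benchmark uncertainty: +/- {total:.5f} in keff")
```

In practice each sensitivity comes from a pair of transport calculations (nominal and perturbed model), and simple quadrature is valid only when the uncertainty components are independent.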
14. We Want You…
• Numerous reactors and test facilities worldwide can be benchmarked
  – Availability of data and expertise for various systems dwindles with time
• Visit the website
– http://icsbep.inl.gov/
– http://irphep.inl.gov/
• Contact us
– icsbep@inl.gov
– irphep@inl.gov
– jim.gulliford@oecd.org
http://www.oecd-nea.org/science/wpncs/icsbep/
http://www.oecd-nea.org/science/wprs/irphe/
17. Example Experiments (Not Currently Benchmarked)
• Fast Test Reactors
  – ZPR-3, ZPR-6, ZPR-9, and ZPPR Assemblies
  – EBR-II
• Gas Cooled Reactors
  – GA HTGR
  – Fort St. Vrain
  – Peach Bottom
  – PNWL HTLTR
  – ROVER/NERVA Program
  – Project Pluto
  – DRAGON
• Small Modular Reactors
  – SCCA-004
  – SORA Critical Experiment Assembly
  – CLEMENTINE
  – GE-710 Tungsten Nuclear Rocket
  – HNPF
  – PBR-CX
  – SNAP & NASA Activities
  – N.S. Savannah
  – B&W Spectral Shift Experiments
18. Example Experiments (Continued)
• Research Reactors
  – More TRIGAs
  – AGN
  – PULSTAR
  – Argonaut
  – Pool-type such as MSTR
  – MNSR
  – HFIR
  – ACRR
  – Fast Burst Reactors
  – Kodak CFX
• Experiments at ORCEF
  – Mihalczo HEU Sphere
  – USS Sandwich
  – Moderated Jemima
  – Numerous HEU Cases
  – SNAP Water Immersion
  – “Libby” Johnson Cases
• And many, many more
  – Similar experiments validate the benchmark process and further improve nuclear data
  – See the OECD/NEA IRPhEP website for reports with data for benchmarking
19. General Benchmark Evaluation/Review Process
• ICSBEP and IRPhEP benchmarks are subject to extensive review
– Evaluator(s) – primary assessment of the benchmark
– Internal Reviewer(s) – in-house verification of the analysis and
adherence to procedure
– Independent Reviewer(s) – external (often foreign) verification of the
analysis via international experts
– Technical Review Workgroup Meeting – annual international effort to
review all benchmarks prior to inclusion in the handbook
• Sometimes a subgroup is assigned to assess any final workgroup
comments and revisions prior to publication
– Benchmarks are determined to be Acceptable or Unacceptable for
use depending on uncertainty in results
• All Approved benchmarks are retained in the handbook
• Unacceptable data are published for documentation purposes, but
benchmark specifications are not provided
20. Quality Assurance
Each experiment evaluation included in the Handbook
undergoes a thorough internal review by the evaluator's
organization. Internal reviewers are expected to verify:
1. The accuracy of the descriptive information given in the evaluation by comparison with original documentation (published and unpublished).
2. That the benchmark specification can be derived from the descriptive information given in the evaluation.
3. The completeness of the benchmark specification.
4. The results and conclusions.
5. Adherence to format.
21. Quality Assurance (continued)
In addition, each evaluation undergoes an independent
peer review by another Technical Review Group member
at a different facility. Starting with the evaluator's submittal
in the appropriate format, independent peer reviewers are
expected to verify:
1. That the benchmark specification can be derived from the descriptive information given in the evaluation.
2. The completeness of the benchmark specification.
3. The results and conclusions.
4. Adherence to format.
22. Quality Assurance (continued)
A third review by the Technical Review Group verifies that
the benchmark specifications and the conclusions were
adequately supported.
23. Annual Technical Review Meeting
• Typically 30 international participants
• Usually held at OECD/NEA Headquarters in Paris, France
• The technical review group convenes to discuss each benchmark, page by page, prior to approval for entry into the handbook
• A list of action items is recorded; these must be resolved prior to publication
  – Final approval given by the Internal and Independent Reviewers and, if necessary, a Subgroup of additional reviewers
• Final benchmark report sent to INL for publication
26. Annual Important Dates (Guidelines)
Activity                              ICSBEP                     IRPhEP
Benchmark Evaluation                  Year-round                 Year-round
Internal Review                       Year-round                 Year-round
Independent Review                    February – March           August – September
Technical Review Group Distribution   Late March – Early April   Late September – Early October
Technical Review Meeting              Early May                  Late October
Resolution of Action Items            May – July                 November – January
Submission of Final Report            Before Late July           Before Late January
Handbook Preparation                  June – September           November – March
Handbook Publication                  End September              End March
Reviews can be coordinated throughout the year.
These are the guidelines for annual publications.