Benchmarking Experiments for
               Criticality Safety and Reactor
                Physics Applications – II –
                           Tutorial

                     John D. Bess and J. Blair Briggs – INL
                          Ian Hill (IDAT) – OECD/NEA
www.inl.gov




                                  2012 ANS Annual Meeting
                                       Chicago, Illinois
                                      June 24-28, 2012


              This paper was prepared at Idaho National Laboratory for the U.S. Department of
                          Energy under Contract Number (DE-AC07-05ID14517)
Outline
I.   Introduction to Benchmarking
     a.   Overview
     b.   ICSBEP/IRPhEP
II. Benchmark Experiment Availability
     a.   DICE Demonstration
     b.   IDAT Demonstration
III. Dissection of a Benchmark Report
     a.   Experimental Data
     b.   Experiment Evaluation
     c.   Benchmark Model
     d.   Sample Calculations
     e.   Benchmark Measurements
IV. Benchmark Participation


                                        2
BENCHMARK PARTICIPATION


                          3
Getting Started
• Criticality Safety Analyses
   – Most of the work has already been performed
   – Typically only uncertainty and bias calculations remain
     (an illustrative sketch follows this slide)
• Reactor Operations
   – Occur on a daily basis worldwide
   – Require reorganization and evaluation of activities performed

• Student and Young Generation Involvement
   – Provide education opportunities prior to incorporation into the workforce
   – Expand upon training at facilities/workplace
      • Especially when handbook data is already widely utilized
      • Computational analysis is rapidly being incorporated in the nuclear community
• Just for Fun


                                                                         4
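A minimal sketch, under stated assumptions, of the kind of bias calculation mentioned above for criticality safety validation: calculated k-eff values (C) are compared against benchmark k-eff values (E) and the average difference is summarized. The three benchmark entries below are hypothetical placeholders, not Handbook data, and the simple unweighted average is only one of several accepted approaches.

```python
# Illustrative sketch only (not the ICSBEP/ANSI validation procedure):
# estimate a calculational bias from a small suite of benchmarks by
# comparing calculated k-eff (C) against benchmark k-eff (E).
# All benchmark values below are hypothetical placeholders.

import math

# (calculated k-eff, benchmark k-eff, benchmark 1-sigma uncertainty)
benchmarks = [
    (0.9985, 1.0000, 0.0012),
    (1.0021, 1.0000, 0.0030),
    (0.9967, 0.9998, 0.0017),
]

# Simple unweighted bias: average of (C - E) over the suite.
diffs = [c - e for c, e, _ in benchmarks]
bias = sum(diffs) / len(diffs)

# Spread of the differences (sample standard deviation) as one crude
# measure of the scatter about that bias estimate.
var = sum((d - bias) ** 2 for d in diffs) / (len(diffs) - 1)
print(f"bias = {bias:+.4f} +/- {math.sqrt(var):.4f} (1 sigma, sample std. dev.)")
```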
Getting Started (Continued)
• ICSBEP Handbook
   – Uncertainty Guide
   – Critical/Subcritical Guide
   – Alarm/Shielding Guide
   – Fundamental Physics Guide
   – DICE User’s Manual
   – Reviewer Checklist
• IRPhEP Handbook
   – Evaluation Guide
   – Uncertainty Guide

• Look at available benchmark data
• Talk with current and past participants in the benchmark programs
• What are you interested in benchmarking?
• What if you can’t do a complete benchmark?
                                                            5
Past and Present Student Involvement
• Since 1995, ~30 students
  have participated in the
  ICSBEP and/or IRPhEP
   – 14 directly at INL
• Students have authored
  or coauthored over 60
  benchmark evaluations
   – 28 & 2+ in progress at INL
• They have also
  submitted technical
  papers to various
  conferences and journals


                                       6
Opportunities for Involvement
• Each year the ICSBEP hosts up to two Summer Interns
  at the Idaho National Laboratory
• INL has also funded benchmark development via the
  CSNR Next Degree Program
• Students have participated in the projects as
  subcontractors through various universities and
  laboratories
• Benchmark development represents excellent work for
  collaborative Senior Design projects, Master of
  Engineering projects, or Master of Science thesis topics
• Further information can be obtained by contacting the
  Chairman of the ICSBEP
   – J. Blair Briggs, J.Briggs@inl.gov

                                                          7
Opportunities for Student Involvement – I
• Internships
   – Traditional approach
   – 10 weeks
   – Benchmark review process completed on “free time” during university studies
   – Encouraged to publish

• Augmented Education
   – Rigorous structure for a Master’s thesis
   – Mentor and peer-review network
   – Undergraduate thesis
   – Undergraduate team projects
   – Encouraged to publish




                                                          8
Opportunities for Student Involvement – II
• Center for Space Nuclear Research (CSNR)
  – Next Degree Program
     • Work as a part- or full-time subcontractor while completing a
       graduate degree
  – Via local or remote university participation
  – Opportunity for interaction with students participating in
    space nuclear activities
     • Other Next Degree students
     • Summer fellow students (interns)
  – Collaboration opportunities on other projects




                                                                       9
Opportunities for Student Involvement – III
• Nuclear and Criticality Safety Engineers
  – Pilot program collaboration
     • Battelle Energy Alliance (BEA)
     • U.S. Department of Energy – Idaho (DOE-ID)
  – Idaho State University and University of Idaho
  – Graduate Nuclear Engineering Curriculum
  – Part-time employment (full-time summer)
  – Hands-on training, benchmarking, ANS meetings,
    thesis/dissertation work, shadow mentors, DOE and
    ANS standard development, etc.




                                                        10
Investigation Breeds Comprehension
• Benchmark procedures require investigation into
  – History and background
     • Purpose of experiment?
  – Experimental design and methods
  – Analytical capabilities and procedures
  – Experimental results
• Often experiments were performed with the intent
  to provide data for criticality safety assessments
  – Many are utilized to
    develop criticality
    safety standards


                                                       11
Culturing Good Engineering Judgment
• Often experimental information is incomplete or
  misleading
  – Contact original experimenters (if available)
  – Interact with professionals from the ICSBEP/IRPhEP
    community
  – Establish a personal network for the young professional
    engineer

   “Do, or do not.
       There is no ‘try.’”
            - Jedi Master Yoda


                                                              12
Developing an Analytical Skill and Tool Set
• Evaluators develop analytical and computational
  capabilities throughout the evaluation process
  – Utility of conventional computational codes and
    neutron cross section data libraries
     • Monte Carlo or Diffusion methods
     • MCNP and KENO are the most common in the US
  – Application of perturbation theory and statistical
    analyses
      • Uncertainty evaluation (see the sketch after this slide)
     • Bias assessment
  – Technical report writing
  – Understanding acceptability of results


                                                         13
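A minimal sketch of the quadrature (root-sum-square) combination of independent uncertainty components that underlies the perturbation-based uncertainty evaluation named on the previous slide. The parameter names and Delta-k values are hypothetical placeholders; real evaluations follow the ICSBEP/IRPhEP uncertainty guides rather than this simplified illustration.

```python
# Illustrative sketch only: combine independent 1-sigma uncertainty
# components in quadrature. Each entry represents the |Delta k-eff|
# obtained by perturbing one experimental parameter by its 1-sigma
# tolerance; names and magnitudes are hypothetical placeholders.

import math

delta_k = {
    "fuel enrichment": 0.0008,
    "fuel density": 0.0005,
    "clad thickness": 0.0002,
    "reflector position": 0.0011,
}

# Root-sum-square of independent components gives the combined
# 1-sigma benchmark uncertainty.
combined = math.sqrt(sum(dk ** 2 for dk in delta_k.values()))
print(f"combined 1-sigma uncertainty in k-eff: {combined:.4f}")
```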
We Want You…
                        • Numerous reactors and
                          test facilities worldwide that
                          can be benchmarked
                             – Availability of data and
                               expertise for various
                                systems dwindles with time
                        • Visit the website
                             – http://icsbep.inl.gov/
                             – http://irphep.inl.gov/
                        • Contact us
                             – icsbep@inl.gov
                             – irphep@inl.gov
                             – jim.gulliford@oecd.org
           http://www.oecd-nea.org/science/wpncs/icsbep/
           http://www.oecd-nea.org/science/wprs/irphe/
Reactors Can Be Used for More Than Making Heat




                                                 15
WHAT BENCHMARK DATA ARE
YOU MISSING?

                          16
Example Experiments (Not Currently Benchmarked)
• Fast Test Reactors
   – ZPR-3, ZPR-6, ZPR-9, and ZPPR Assemblies
   – EBR-II
• Gas Cooled Reactors
   – GA HTGR
   – Fort St. Vrain
   – Peach Bottom
   – PNWL HTLTR
   – ROVER/NERVA Program
   – Project Pluto
   – DRAGON

• Small Modular Reactors
   – SCCA-004
   – SORA Critical Experiment Assembly
   – CLEMENTINE
   – GE-710 Tungsten Nuclear Rocket
   – HNPF
   – PBR-CX
   – SNAP & NASA Activities
   – N.S. Savannah
   – B&W Spectral Shift Experiments

                                                        17
Example Experiments (Continued)
• Research Reactors
   – More TRIGAs
   – AGN
   – PULSTAR
   – Argonaut
   – Pool-type such as MSTR
   – MNSR
   – HFIR
   – ACRR
   – Fast Burst Reactors
   – Kodak CFX

• Experiments at ORCEF
   – Mihalczo HEU Sphere
   – USS Sandwich
   – Moderated Jemima
   – Numerous HEU Cases
   – SNAP Water Immersion
   – “Libby” Johnson Cases
• And many, many more
   – Similar experiments validate the benchmark process and further improve nuclear data
   – See the OECD/NEA IRPhEP website for reports with data for benchmarking
                                                              18
General Benchmark Evaluation/Review Process
• ICSBEP and IRPhEP benchmarks are subject to extensive review
    – Evaluator(s) – primary assessment of the benchmark
    – Internal Reviewer(s) – in-house verification of the analysis and
      adherence to procedure
    – Independent Reviewer(s) – external (often foreign) verification of the
      analysis via international experts
    – Technical Review Workgroup Meeting – annual international effort to
      review all benchmarks prior to inclusion in the handbook
        • Sometimes a subgroup is assigned to assess any final workgroup
           comments and revisions prior to publication
    – Benchmarks are determined to be Acceptable or Unacceptable for
      use depending on uncertainty in results
        • All Approved benchmarks are retained in the handbook
        • Unacceptable data are published for documentation purposes, but
           benchmark specifications are not provided

                                                                               19
Quality Assurance
Each experiment evaluation included in the Handbook
undergoes a thorough internal review by the evaluator's
organization. Internal reviewers are expected to verify:
1. The accuracy of the descriptive information given in the
   evaluation by comparison with original documentation
   (published and unpublished).
2. That the benchmark specification can be derived from
   the descriptive information given in the evaluation
3. The completeness of the benchmark specification
4. The results and conclusions
5. Adherence to format.


                                                              20
Quality Assurance (continued)
In addition, each evaluation undergoes an independent
peer review by another Technical Review Group member
at a different facility. Starting with the evaluator's submittal
in the appropriate format, independent peer reviewers are
expected to verify:
 1. That the benchmark specification can be derived from
    the descriptive information given in the evaluation
 2. The completeness of the benchmark specification
 3. The results and conclusions
 4. Adherence to format.



                                                                   21
Quality Assurance (continued)

A third review by the Technical Review Group verifies that
the benchmark specifications and the conclusions were
adequately supported.




                                                             22
Annual Technical Review Meeting
• Typically 30 international participants
• Usually OECD/NEA Headquarters in Paris, France
• Technical review group convenes to discuss each
  benchmark, page-by-page, prior to approval for
  entry into the handbook
• A list of Action Items is recorded; these must be
  resolved prior to publication
  – Final approval given by Internal and Independent
    Reviewers and, if necessary, a Subgroup of additional
    reviewers
• Final benchmark report sent to INL for publication

                                                            23
What to Expect at Tech Review Group Meetings




                                               24
Developing International Camaraderie




                                       25
Annual Important Dates (Guidelines)
                                       ICSBEP                      IRPhEP
Benchmark Evaluation                   Year-round                  Year-round
Internal Review                        Year-round                  Year-round
Independent Review                     February – March            August – September
Technical Review Group Distribution    Late March – Early April    Late September – Early October
Technical Review Meeting               Early May                   Late October
Resolution of Action Items             May – July                  November – January
Submission of Final Report             Before Late July            Before Late January
Handbook Preparation                   June – September            November – March
Handbook Publication                   End September               End March

                  Reviews can be coordinated throughout the year.
                  These are the guidelines for annual publications.
                                                                                 26
Questions?




             SCCA-002




                        27
