
Benchmark Tutorial -- III - Report

jdbess
Jul 9, 2012

  1. Benchmarking Experiments for Criticality Safety and Reactor Physics Applications – II – Tutorial
     John D. Bess and J. Blair Briggs – INL
     Ian Hill (IDAT) – OECD/NEA
     www.inl.gov
     2012 ANS Annual Meeting, Chicago, Illinois, June 24-28, 2012
     This paper was prepared at Idaho National Laboratory for the U.S. Department of Energy under Contract Number DE-AC07-05ID14517
  2. Outline
     I. Introduction to Benchmarking
        a. Overview
        b. ICSBEP/IRPhEP
     II. Benchmark Experiment Availability
        a. DICE Demonstration
        b. IDAT Demonstration
     III. Dissection of a Benchmark Report
        a. Experimental Data
        b. Experiment Evaluation
        c. Benchmark Model
        d. Sample Calculations
        e. Benchmark Measurements
     IV. Benchmark Participation
  3. DISSECTION OF A BENCHMARK REPORT (image © Microcosmos)
  4. Benchmark Evaluation
     Format guides provided on handbook DVDs
  5. Why Such a Rigorous Format?
     • These are Handbooks or Reference Books
       – For the benefit of the user
       – Orderly layout to assist the user
       – Information is always in the same location
       – Information has been rigorously verified
     • Separation of Geometry, Materials, and Temperature
       – Neutronics computer code input
       – Allows for systematic and detailed review/verification
     • Not a Compilation of Technical Reports
  6. ICSBEP Content and Format
     • Experiment title
     • Identification number:
       (Fissile Material)-(Physical Form)-(Spectrum)-(Three-Digit Numerical Identifier)
       Fissile Material: Plutonium (PU), Highly Enriched Uranium (HEU), Intermediate Enriched Uranium (IEU), Low Enriched Uranium (LEU), Uranium-233 (U233), Mixed Plutonium–Uranium (MIX), Special Isotope (SPEC)
       Physical Form: Metal (MET), Compound (COMP), Solution (SOL), Miscellaneous (MISC)
       Spectrum: Fast (FAST), Intermediate-Energy (INTER), Thermal (THERM), Mixed (MIXED)
       Subcritical measurements are denoted by including the letters “SUB” at the beginning of the identifier.
     • Key words: a list of words that describe key features of the experiment is provided.
  7. ICSBEP Content and Format (Continued)
     1.0 DETAILED DESCRIPTION
     • Detailed description of the experiments and all relevant experimental data
       – Who, what, when, why, where? Then how?
       – Data preservation
     • Measurement methods used and the results obtained for the parameters of interest
     • Uncertainties in the data that were assigned by the experimenters, how the uncertainties were determined, and what they represent
  8. ICSBEP Content and Format (Continued)
     1.1 Overview of Experiment
     • Summary of the experiment, its original purpose, the parameters that vary in a series of configurations, and mention of significant relationships to other ICSBEP-evaluated experiments
     • Name of the facility, when the experiments were performed, the organization that performed the experiments, and perhaps the names of the experimenters if available
     • The conclusions of the Evaluation of Experimental Data section, Section 2, should be briefly stated
  9. ICSBEP Content and Format (Continued)
     1.2 Description of Experimental Configuration
     • Detailed description of the physical arrangement and dimensions of the experiment
       – Discrepancies and inconsistencies should be noted
       – Use original units (SI units then added parenthetically)
     • Method of determining the critical condition and, if applicable, the measured reactivity are stated
     • Subcritical measurements may require more detailed information about the source and detectors than is typically required for critical assemblies
  10. ICSBEP Content and Format (Continued)
     1.3 Description of Material Data
     • Detailed description of all materials used in the experiment as well as significant materials in the surroundings
       – Identify what information, if any, is missing
       – Note when density is calculated rather than reported
     • Specify the source of composition data (physical or chemical analyses, or material handbooks when only the type of material was specified)
     • Details of the methods of analysis and uncertainties
     • Dates of the experiment, of the chemical analysis, and of isotopic analysis or purification
       – Usually when isotopic buildup and decay are important
  11. ICSBEP Content and Format (Continued)
     1.4 Temperature Data
     • The temperature at which the experiments were performed should be given and discussed
     • If available, describe how the temperature was measured
  12. ICSBEP Content and Format (Continued)
     1.5 Supplemental Experimental Measurements
     • Additional experimental data (e.g., flux distributions, spectral indices, βeff, reactivity data, etc.) not necessarily relevant to the derivation of the benchmark model
     • Subcritical measurements include a description of the measurement technology and a discussion of the interpretation of the measurements, as well as the measured data
  13. ICSBEP Content and Format (Continued)
     2.0 EVALUATION OF EXPERIMENTAL DATA
     • Evaluation of the experimental data and conclusions
     • Missing data or weaknesses and inconsistencies in published data
     • Effects of uncertainties
     • Summary table
     • Unacceptable data are not included in Sections 3 & 4
     • Unacceptable data may still be used in validation efforts if the uncertainty is properly taken into account
  14. Section 2.0: Addressing Uncertainties
     • Experimental Measurements
       – Temperature
         • How does it impact the medium?
       – Worths
       – keff
       – Control rod positions
     • Repeatability and reproducibility
     • Geometrical Properties
       – Dimensions
       – Quantity
       – Position/location
     • Compositional Variations
       – Mass (density)
       – Isotopic content
       – Composition
     • Impurities
     This section of the benchmark report has evolved significantly since the initiation of the benchmark projects, due to input from many international experts over the past two decades.
  15. Section 2.0: Addressing Uncertainties (Cont.)
     • Computational Analysis
       – What is your statistical uncertainty?
         • Your calculated Δk needs to be larger than your statistical uncertainty
       – Perturbation Analysis
         • 1σ
         • Manufacturing tolerances
         • Repeated measurements
         • Measurement limits
       – What is negligible?
         • Depends
         • Typically ≤0.00010 Δk
     • What if the statistical uncertainty is greater than the calculated Δk?
       – Perform longer calculations
       – Still doesn’t work?
         • Assume linearity in Δk
         • Apply a larger perturbation
         • Use a scaling factor
     • Uncertainty Guide provided in Handbooks
       – Examples in appendix
       – Also look at similar evaluations
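The "assume linearity, apply a larger perturbation, use a scaling factor" workaround described above can be sketched as follows. This is a minimal illustration; the function name and the keff values are hypothetical, not taken from the tutorial.

```python
def scaled_delta_k(k_perturbed, k_reference, scaling_factor):
    """Infer the 1-sigma Delta-k from an n-sigma perturbation,
    assuming Delta-k varies linearly with the perturbation size."""
    return (k_perturbed - k_reference) / scaling_factor

# A 10-sigma perturbation that shifts keff from 1.0000 to 1.0030
# implies a 1-sigma effect near 0.00030, which now sits well above
# typical Monte Carlo statistical noise in the perturbed calculation.
delta_k_1sigma = scaled_delta_k(1.0030, 1.0000, 10.0)
```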
  16. Benchmark Uncertainty vs. Criticality Safety Analysis
     1σ = 0.1°C = negligible
  17. Typical Probability Distributions
     1. Normal: 3σ
     2. Uniform, Equiprobable, Bounding: 1σ = Δx/√3
     3. One-Sided Bounding, Triangular: 1σ = Δx/(2√3)
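The three conversions on this slide reduce to a one-line rule each; a small sketch (function name and distribution labels are illustrative, not from the handbook):

```python
import math

def one_sigma(delta_x, distribution):
    """Convert a tolerance or bound delta_x to a 1-sigma standard
    uncertainty under the stated probability distribution."""
    if distribution == "normal3":      # delta_x interpreted as a 3-sigma bound
        return delta_x / 3.0
    if distribution == "uniform":      # uniform / equiprobable / bounding
        return delta_x / math.sqrt(3.0)
    if distribution == "triangular":   # one-sided bounding / triangular
        return delta_x / (2.0 * math.sqrt(3.0))
    raise ValueError("unknown distribution: " + distribution)
```

Note that the uniform and triangular rules both come from the variance of the assumed distribution over the tolerance interval, which is why the 1σ value is smaller than the bound itself.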
  18. Probability Selection – Example I (HTTR)
     Normal Distribution, 3σ
  19. Probability Selection – Example II (NRAD)
     Manufacturing Tolerances
     1. Uniform, Bounding
     2. Bounding, 3σ or Uniform?
     3. ??? Fabrication Process?
  20. What if the Uncertainty is Unknown?
     • Example: pipe diameter
     • What is the manufacturing tolerance?
       – National/international standards
       – Current fabrication capabilities
     • Example: isotopic abundance of a radioisotope
     • What is the uncertainty in the reported values?
       – Typical uncertainties for similar experiments and measurement capabilities
       – NIST-traceable standards (systematic uncertainty)
     • Does your uncertainty make sense?
       – Does it seem to adequately reflect reality?
  21. Example – ORCEF HEU Metal (Y-12)
     Not always available; detection limit
  22. What if You Have Too Many “Right” Answers?
     • Example: composition of a material
       – Samples of a material distributed to different laboratories to measure
       – Inter-laboratory results did not match within the reported measurements and statistical uncertainties
       – Which is correct?
     • Engineering judgment
       – Evaluate measurement capabilities and limitations
       – Can we redo the measurements?
       – Establish a nominal composition and uncertainty
         • Does the uncertainty appear reasonable?
         • Does the uncertainty adequately encompass the measurement uncertainty?
     • International review facilitates discussion of the evaluation
  23. What about Detection Limits?
     • Bounding
     • Do you expect a uniform distribution or a triangular one?
       – Was the material purified?
       – Is the impurity typically not present?
  24. Dealing with Unknown Systematic Uncertainties – i.e., Impurities
     • What is a typical impurity content in a given material?
       – Similar benchmarks, expert opinion, reference/manufacturing data, fabrication process details, comparison with similar materials
     • Calculate the effect in Δk of including “typical” impurities in the material
     • Take Δk/2 and apply it as a bias for removal of the unknown impurity content
       – Include a bias uncertainty of ±Δk/2
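The Δk/2 prescription above is simple enough to state as code. A minimal sketch, assuming only the two keff values from a calculation with and without the "typical" impurities; the function name and example values are illustrative:

```python
def impurity_bias(k_with_impurities, k_clean):
    """Half the computed Delta-k is applied as a bias for removing the
    unknown impurity content; an equal amount, +/- Delta-k/2, is
    carried as the bias uncertainty."""
    delta_k = k_with_impurities - k_clean
    return delta_k / 2.0, abs(delta_k) / 2.0

# e.g. if adding typical impurities lowers keff from 1.00000 to 0.99880,
# the benchmark model without impurities carries a -0.00060 bias with a
# +/-0.00060 bias uncertainty.
bias, bias_unc = impurity_bias(0.99880, 1.00000)
```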
  25. Random vs. Systematic Uncertainties
     • Random
       – Variation among multiple measurements
         • Dimensions, masses, etc.
       – Negligible for large sample populations
         • TRISO and pebbles in a pebble-bed reactor
     • Systematic
       – Biases in measurement technique
       – Not impacted by multiple measurements
         • Impurities, enrichment, etc.
       – Can be significant
     • Errors (outliers)
  26. Example – NRAD
     • Bounding uncertainties
     • Assumed grid hole positions independent
     • Assumed hole diameter systematic due to single drill bit and small quantity of holes
     • Use of URAN card provided negligible results (Δk ≤ 0.00010)
  27. What is the Bottom Line?
     • What is the “quality” of the experiment?
       – Total uncertainty
     • How large is the uncertainty?
       – Large computational bias?
         • Due to missing or erroneous data?
     • Acceptable Benchmark Experiments
       – For well-known materials
         • 1σ ≤ 1%
       – For unique experiments, the uncertainty can be larger
       – For large computational biases
         • Are they within the uncertainty?
         • Are they well known and quantified?
         • Otherwise, unacceptable
  28. Example – NRAD
     Δk_tot = √(Σ Δk_i²) = 0.00271
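The quadrature combination used here applies to any set of independent 1σ components. A generic sketch; the component values in the usage line are illustrative placeholders, not the actual NRAD components:

```python
import math

def combine_in_quadrature(delta_ks):
    """Total 1-sigma benchmark uncertainty from independent components:
    Delta-k_tot = sqrt(sum(Delta-k_i ** 2))."""
    return math.sqrt(sum(dk * dk for dk in delta_ks))

# Illustrative components only; the total is dominated by the largest one.
dk_total = combine_in_quadrature([0.0021, 0.0012, 0.0010, 0.0006])
```

Because the components add in quadrature, a component several times smaller than the largest contributes little to the total, which is why individually negligible effects (≤0.00010 Δk) can safely be dropped.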
  29. ICSBEP Content and Format (Continued)
     3.0 BENCHMARK SPECIFICATIONS
     • Benchmark specifications provide the data necessary to construct calculational models
     • Retain as much detail as necessary to preserve all important aspects of the actual experiment
     • Simplifications include a description of the transformation from the measured values to the benchmark-model values and the uncertainties associated with that transformation
  30. ICSBEP Content and Format (Continued)
     3.1 Description of Model
     • General description of main physical features of the benchmark model(s)
     • Simplifications and approximations made to geometric configurations and/or material compositions are described and justified
     • Resulting biases and additional uncertainties in keff are quantified
     • Justification for omitting any constituents of the materials
  31. Detailed vs. Simple Model(s)…What is the purpose?
     ZPPR-20 = SP-100 Mockup
  32. Assessing Biases and Correction Factors
     • The effect of simplifying the model is assessed by comparing a detailed model to a simple model
     • Some simplifications are anti-correlated
       – Their effects must be modeled individually and as a whole to understand the complete result
     • Sometimes the bias is smaller than the statistical uncertainty
       – The bias is assumed negligible
       – The bias uncertainty is included in the overall uncertainty of the benchmark model
  33. Assessing Biases and Correction Factors (Cont.)
     • Typical Examples
       – Measured worth of “critical” experiment
       – Measured worth of components
       – Room return effects
       – Model Simplifications
         • Homogenization
         • Adjustments for benchmark model temperature
         • Geometry simplifications
         • Removal of impurities
       – What if the true bias is unknown?
         • Bias is estimated
         • Bias uncertainty is increased
  34. Example – GROTESQUE
     MCNP statistical uncertainty = 0.00003
  35. ICSBEP Content and Format (Continued)
     3.2 Dimensions
     • Include all dimensions and information needed to completely describe the geometry of the benchmark model(s)
       – Derived from Section 1; no rounding of units
     • Sketches, including dimensions and labels, of the benchmark model(s) should always be included
  36. ICSBEP Content and Format (Continued)
     3.3 Material Data
     • Atom densities for all materials specified for the model(s) are derived from the reported values given in the previous sections and are concisely listed
       – Five significant digits
     • Provide unique or complicated formulas for deriving atom densities
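The standard atom-density derivation behind these listings can be sketched in a few lines; the function name is illustrative, and the uranium density in the usage comment is an assumed example value, not data from an evaluation:

```python
# Avogadro's number in units of 1e24 atoms/mol, so that a density in
# g/cm^3 and an atomic mass in g/mol yield atoms/(barn-cm) directly.
AVOGADRO = 0.6022141

def atom_density(density_g_cc, atomic_mass_g_mol):
    """Atom density N = rho * N_A / M, in atoms/(barn-cm)."""
    return density_g_cc * AVOGADRO / atomic_mass_g_mol

# e.g. uranium metal at an assumed 19.0 g/cm^3 with M = 238.03 g/mol
# gives roughly 0.048070 atoms/(barn-cm), quoted to the five
# significant digits the format calls for.
n_u = atom_density(19.0, 238.03)
```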
  37. ICSBEP Content and Format (Continued)
     3.4 Temperature Data
     • Temperature data for the model(s)
  38. ICSBEP Content and Format (Continued)
     3.5 Experimental & Benchmark-Model keff and/or Subcritical Parameters
     • Experimental values
     • Benchmark values (adjusted to account for bias)
     • Include total benchmark uncertainty
       – Combination of experimental & benchmark uncertainties
  39. ICSBEP Content and Format (Continued)
     4.0 RESULTS OF SAMPLE CALCULATIONS
     • Calculated results obtained with the benchmark-model specification data given in Section 3
     • Sample calculations only
     • Discuss discrepancies between benchmark values (Section 3.5) and calculated values (Section 4.0)
       – C/E, (C-E)/E %, within 1σ or 3σ, biases, trends
  40. ICSBEP Content and Format (Continued)
     5.0 REFERENCES
     • All formally published documents referenced in the evaluation that contain relevant information about the experiments
     • References to handbooks, logbooks, code manuals, textbooks, personal communications with experts, etc. are given in footnotes
  41. ICSBEP Content and Format (Continued)
     APPENDICES
     • Supplemental information that is useful, but not essential, to the derivation of the benchmark specification
       – Calculations, photos, scanned documentation, model variants, auxiliary measurements
     APPENDIX A
     • Typical input listings
     • Brief comments about options chosen for calculations
     • Version of the code (e.g., MCNP, KENO, MONK, etc.)
       – SN codes: quadrature order, scattering order for cross sections, convergence criteria, representative mesh size
       – Monte Carlo codes: number of active generations, number of skipped generations, total number of histories, etc.
  42. IRPhEP Evaluation Format
     • Experiment title
     • Identification number:
       (Reactor Name)-(Reactor Type)-(Facility Type)-(Three-Digit Numerical Identifier) (Measurement Type(s))
       Reactor Type: Pressurized Water Reactor (PWR), VVER Reactors (VVER), Boiling Water Reactor (BWR), Liquid Metal Fast Reactor (LMFR), Gas Cooled (Thermal) Reactor (GCR), Gas Cooled (Fast) Reactor (GCFR), Light Water Moderated Reactor (LWR), Heavy Water Moderated Reactor (HWR), Molten Salt Reactor (MSR), RBMK Reactor (RBMK), Fundamental (FUND)
       Facility Type: Experimental Facility (EXP), Power Reactor (POWER), Research Reactor (RESR)
       Measurement Type: Critical Configuration (CRIT), Subcritical Configuration (SUB), Buckling & Extrapolation Length (BUCK), Spectral Characteristics (SPEC), Reactivity Effects (REAC), Reactivity Coefficients (COEF), Kinetics Measurements (KIN), Reaction-Rate Distributions (RRATE), Power Distributions (POWDIS), Nuclide Composition (ISO), Other Miscellaneous Types of Measurements (MISC)
  43. IRPhEP Evaluation Format (continued)
     1.1 Description of Critical and Subcritical Measurements
     1.2 Description of Buckling and Extrapolation Length Measurements
     1.3 Description of Spectral Characteristics Measurements
     1.4 Description of Reactivity Effects Measurements
     1.5 Description of Reactivity Coefficient Measurements
     1.6 Description of Kinetics Measurements
     1.7 Description of Reaction Rate Distribution Measurements
     1.8 Description of Power Distribution Measurements
     1.9 Description of Isotopic Measurements
     1.10 Description of Other Miscellaneous Types of Measurements
  44. Summary of IRPhEP Measurements
  45. Additional Benchmark Measurements
     • Measurement uncertainty typically dominates uncertainties in geometry and materials
     • What about βeff?
       – Difficult to measure
       – Variation between different cross-section data libraries is typically large anyway
     • How well do measurement techniques/methods compare to computational simulations?
  46. Questions?
     HTR-PROTEUS