Lionel Mazzella, Plant Modelling Team Leader at E.ON New Build and Technology, presents a supercomputing collaboration case study around E.ON's use of HPC Midlands to accelerate their innovation. For more information, please see http://hpc-midlands.ac.uk
HPC Midlands - E.ON Supercomputing Case Study
1. Supercomputing collaboration case study
HPC Midlands launch event
Lionel Mazzella
Plant Modelling Team Leader
E.ON New Build & Technology
20th March 2013
2. Agenda
1. E.ON New Build & Technology
2. Plant Modelling Team
3. CFD projects
4. HPC Trial: Historic
5. HPC Trial: Test Project
6. HPC Trial: Outcome
7. HPC for what?
3. E.ON New Build & Technology
We realise E.ON’s cleaner and better strategy.
Our aim is to engineer a cleaner
and better future for E.ON by
delivering world-class solutions,
from ideas to reality.
Actively involved at nearly 300
locations in Europe and Russia,
with a total output of more than 60 GW.
More than 1,200 employees.
Key sites: Ratcliffe (Nottingham), Hannover, Gelsenkirchen.
4. Plant Modelling team
Part of E.ON New Build & Technology’s Software & Modelling Department.
Team of engineers and scientists delivering:
Thermodynamic Modelling Consultancy (PROATES®)
On-line Performance Monitoring (PROATES PMS)
Computational Fluid Dynamics (CFD)
Various R&D modelling projects (E.g. CSP and Energy storage)
[Diagram: Primary / Secondary / Tertiary service layers]
5. CFD projects: Gas turbine blade heat transfer
Aerothermal Analysis of Heat Transfer to Blades
Necessary Starting Point for Lifetime Prediction
Model Film Cooling
Compute Heat Transfer Coefficients & Gas Temperatures
Predict GT Blade Lifetimes
Potential to Provide Considerable Savings
[Chart: HP vane heat transfer coefficient (htc) against surface distance (mm), with and without film cooling]
6. CFD projects: Steam flow behaviour in a steam dome
Looking at impact of power upgrade
CFD shows vortex formation in dome
High levels of swirl induced in steam lines
Leads to steam line vibration problems
Steady-state, 21 million cells (HP Z800 workstations)
7. HPC Trial: Historic
June 2011 First contact with Loughborough University
July 2011 First meeting with HPC Midlands people and look at Hydra
October 2011 Working meeting at Loughborough University
November 2011 Defining HPC trial scope and project to be used for the test
December 2011 Meeting with ANSYS, licensing support for the HPC trial
January 2012 Meeting with E.ON IT and looking at connectivity options
July 2012 HPC Trial completion
August 2012 HPC Trial report completion
October 2012 Presentation of the report to HPC Midlands
February 2013 Meeting with HPC Midlands, Hera visit and commercial discussions
8. HPC Trial: Test Project
Original work was to simulate the dispersion of natural gas from leaks
occurring within a ventilated gas turbine enclosure.
Work part of an assessment to ensure compliance with Health and Safety
Executive regulations.
HP xw8600 Workstation
Xeon E5405 CPU, run on 2 cores
4.5M element mesh
Steady state, complex geometry, simple physics
340 iterations, around 24 hours to complete
More than 10 simulations required for the work
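Taking the figures on this slide at face value, the scale of the workstation campaign can be sketched in a few lines of Python (a rough estimate using only the quoted 24-hour runtime and 10-simulation count; real campaigns also include meshing and post-processing time):

```python
# Rough campaign-time estimate for the workstation baseline (figures from the slide).
hours_per_simulation = 24   # ~24 h for 340 iterations on 2 cores
simulations_needed = 10     # "more than 10 simulations" required

total_hours = hours_per_simulation * simulations_needed
total_weeks = total_hours / (24 * 7)

print(f"Serial campaign: {total_hours} h (~{total_weeks:.1f} weeks of wall time)")
```

Run back to back this is already well over a working week of compute, which is why the full enclosure assessment stretched to several weeks.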
9. HPC Trial: Outcome
The HPC Trial reviewed a range of configurations:
Simulations with 24 to 128 CPU cores.
6 different meshes used from 4.5 to 80 million cells.
Results normalised to the 2010 workstation figure of 5,240 sec (the lower the number, the better).
HPC speed-ups from 30 to 145 times the 2010 values.
Grid 1: 4.5M elements, 24 cores
Grid 2: 8M elements, 48 cores
Grid 3: 15M elements, 48 cores
Grid 4: 25M elements, 60 cores
Grid 5: 45M elements, 128 cores
Grid 6: 80M elements, 128 cores
10. HPC for what?
Biggest advantage of the HPC is likely to be for jobs that:
Have large parameter spaces.
Are time dependent.
Have complex geometry.
Have very complex physics.
A combination of the above.
[Diagram: solution complexity (linear to non-linear models) against computing power (desktop, small cluster, HPC); labelled Power, Partnership, Solution]
Previously this work was performed on an HP xw8600 Workstation with Xeon E5405 processors. In that case the mesh consisted of 4,536,737 elements and was run in parallel on 2 cores. The time to complete 340 iterations was 23 hours, 13 minutes and 28 seconds (CPU-core and wall-clock timings were almost identical). Given that 10 or more such simulations are required for different leak locations, the assessment of the GT enclosure can take several weeks to complete.
We ran the gas turbine simulation with a number of different parameter sets, ranging from a 4.5M-cell mesh on 24 cores through to an 80M-cell mesh on 128 cores, and observed speed-ups of between 30 and 145 times the performance of the in-house system based on workstation-class PCs. The HPC cores are more than twice as fast as the 2010 cores: for the HPC Grid 1 case the speed-up should be 12 (24 cores compared to 2), but it was actually 30. More cores give reduced turn-around time, for example from 10 days to 1 day, with much improved resolution.
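The better-than-ideal Grid 1 result can be decomposed into a core-count factor and a per-core factor, using only the numbers given in the text above:

```python
# Decompose the observed Grid 1 speed-up (figures quoted in the text).
workstation_cores = 2
hpc_cores = 24
observed_speedup = 30

# Speed-up expected from core count alone, assuming perfect parallel scaling.
ideal_speedup = hpc_cores / workstation_cores        # 12x
# Residual factor: how much faster each HPC core is than a 2010 core.
per_core_factor = observed_speedup / ideal_speedup   # 2.5x

print(f"Ideal speed-up from cores: {ideal_speedup:.0f}x")
print(f"Per-core speed ratio:      {per_core_factor:.1f}x")
```

The 2.5x per-core ratio is consistent with the statement that the HPC cores are "more than twice as fast" as the 2010 cores; newer CPU generations, faster memory, and interconnect all contribute to that factor.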