Engine and vehicle OEMs have complex product development requirements that call for capable and standardized post-test analysis while simultaneously dealing with high levels of product technology and test diversity. Many organizations rely on shared spreadsheets or distributed desktop tools, which can produce inconsistent and non-traceable results.
This presentation addresses enterprise-scale post-processing requirements, features, and implementation considerations, and offers lessons learned that can help move an organization to an efficient, standardized, and maintainable professional process.
The presentation will cover how to address diversity of product topology, physical components, fluid properties, measurement uncertainty, naming conventions, traceability and IT architecture.
BRUCE THOMASON
Director of Technology for SGS, Transportation
A career built on systems engineering, test technology,
and development and testing of complex systems in
aerospace, automotive, and power generation.
R&D TEST ANALYTICS COMPLICATIONS
Product range: architecture, size, and fuels
Experiment types: transient, steady state, and extreme condition
Measurement types: time-based, spatial, and batch instruments
Analysis methods: standard and proprietary
ADDED FACTORS INTERFERE
Naming conventions: departmental, business unit, industry, regulatory
Units of measure: system, conversions, naming
Change in analysis methods: development, validation, distribution
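A minimal sketch of how these naming-convention and unit-of-measure differences are often neutralized in practice: channel aliases map to one canonical name, and units convert to SI on ingest. The alias table, channel names, and function below are invented for illustration, not drawn from any particular tool.

```python
import math

# Hypothetical alias table: departmental / business-unit / legacy
# channel names mapped to one canonical name.
ALIASES = {
    "ENG_SPD": "engine_speed",
    "N_ENG": "engine_speed",
    "SPEED": "engine_speed",
}

# Illustrative subset of unit conversions to SI, applied on ingest.
TO_SI = {
    "rpm": lambda v: v * 2.0 * math.pi / 60.0,  # -> rad/s
    "degC": lambda v: v + 273.15,               # -> K
    "bar": lambda v: v * 1.0e5,                 # -> Pa
}

def normalize(name: str, value: float, unit: str):
    """Return (canonical_name, value_in_SI) for one measured channel."""
    convert = TO_SI.get(unit, lambda v: v)  # pass unknown units through
    return ALIASES.get(name, name), convert(value)
```

For example, normalize("ENG_SPD", 1800.0, "rpm") yields ("engine_speed", ~188.5 rad/s); doing this once, centrally, removes the per-spreadsheet conversion step.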
INTERNAL OPERATING CHALLENGES
Not using: vetted engineering practices, in-house expertise
Not knowing: measurement uncertainty, best practices
Not tracking or storing: test details, test setups, test conditions
Everyone is fighting fires!
SPREADSHEET SWAMP
Spreadsheets ‘spread’ like a healthy mold
Maintenance not performed
Analytics ill-suited to spreadsheets are crammed in anyway
THE UNFORTUNATE RESULTS
A less rich set of information
A less correct set of information
A short half-life for experimental results
An inconsistent basis for understanding results: test to test, product to product, person to person, and over time
WHY THIS FOCUS?
Enterprise process elements and example enterprise tools:
Test automation: SGS CyFlex®, National Instruments LabVIEW®
Calibration and equipment management: Fluke MET/TEAM®
Raw data storage and retrieval: ASAM Open Data Services
Test results analysis: SGS Mach Analytics™ (this space was under-served)
Scientific visualization and reporting: too many to mention…
EXPERIMENTATION RESULTS ANALYTICS
For this presentation: first-principles methods to transform raw measurements into useful engineering information
But with a strong enterprise focus: high test volume; large product and test diversity; long-term, big-picture features
Anyone can put F=ma in a spreadsheet, but it's hard to put an engine in one!
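To make "first principles" concrete, here is a minimal sketch of the kind of transform meant here: raw dyno measurements (torque, speed, fuel flow) turned into engineering results (brake power, brake-specific fuel consumption). The function names and unit choices are this sketch's, not any product's.

```python
import math

def brake_power_kw(torque_nm: float, speed_rpm: float) -> float:
    """Brake power P = 2*pi*n*T from measured torque and shaft speed, in kW."""
    return 2.0 * math.pi * (speed_rpm / 60.0) * torque_nm / 1000.0

def bsfc_g_per_kwh(fuel_flow_kg_per_h: float, power_kw: float) -> float:
    """Brake-specific fuel consumption from measured fuel flow, in g/kWh."""
    return fuel_flow_kg_per_h * 1000.0 / power_kw
```

For example, 500 N·m at 1800 rpm gives about 94.2 kW of brake power, and 19 kg/h of fuel at that power is roughly 202 g/kWh. The enterprise problem is not this arithmetic; it is running it consistently and traceably across thousands of tests and dozens of product configurations.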
REQUIREMENT AREAS
Enterprise-focused areas to address:
Research teams
Engineering discipline functions
Product development teams and processes
Enterprise goals
IT infrastructure and process
Lifecycle considerations
Cross-organization factors
It's not just about the math!
RESEARCH TEAM
Driven by innovation, speed, fail fast and forward, new components and arrangements. No pain, no gain!
Areas of need in analytics, with examples:
Adaptability in the product-to-be: product configuration change, component change
Adaptability in methods: new experiment types, new instrumentation, new analysis
Rapid pace and changing directions: try and use, or throw away
What-if-ing: extrapolation, overriding values, comparing to models
Downstream knowledge flow: the ability to capture and convey successful results and methods
ENGINEERING DISCIPLINE LEADERS
Driven by stewardship of our knowledge area, we sweat the details.
Areas of need in analytics, with examples:
Development: creation of new or improved methodologies
Validation: vetting methods during and after development
Quality assurance: ongoing correctness, repeatability, release-for-use management
Knowledge transfer: efficiently supporting teams using developed methods
PRODUCT DEVELOPMENT PRIORITIES
“Get the product developed and validated, yesterday.” “Our deadline is when?”
Areas of need in analytics, with examples:
Efficiency: high data volumes, rapid cycles
Ease of access and use: don’t slow me down, don’t make me learn too much
Quality assurance: leveraging vetted and standard methods
Accountability: having traceable results
INFORMATION TECHNOLOGY GOALS
“We need to keep this thing running for decades without slowing down the engineers.”
Areas of need in analytics, with examples:
Supportability: installed platforms (servers, web applications, test systems, SaaS/PaaS)
Transparency: available and knowledgeable help
Standards adherence: industry and internal
Control: centralization, access control, knowledge
Predictability: performance, reliability, resource use
Security: need to know, hacker threats
ENTERPRISE GOALS
“Let’s keep this money pump humming along, but make it better as we go.”
Areas of need in analytics, with examples:
Definition and standardization: of process, methods, and actual behavior… it is easier to count on and improve a known foundation
Clarity of ownership: process and content owners… know where to turn for help and improvement
Innovation: create and understand product differentiation and market advantage
Commercial sensitivity: value creation, correctness, avoidance of rework/warranty
Security: need-to-know basis for staff, suppliers, customers, regulatory agents
INTER- AND INTRA-ORGANIZATION ISSUES
Areas of need in analytics, with examples:
Naming convention differences: departmental; OEM1 vs. OEM2 vs. regulatory
Product and component type differences and change: engines (themselves with highly variable architectures) vs. generator sets vs. vehicles
Segregation & security: industry or regulatory standard vs. OEM proprietary
Data store variability: relational database, ODS, flat file, live test automation system
Information sink variability: end user/tool, batch process, live test automation system
“Our stuff has to work for everybody, every day.”
ANALYSIS METHODS LIFECYCLE CONSIDERATIONS
Areas of need in analytics, with examples:
Support for life stages of methods: from what-if concept “sand-boxing” to production use to “retired-but-kept-around-as-reference”
Driven by: product change, regulatory change, evolving instrumentation and experimentation methods
Involving libraries, systems, and interfaces: that are added, evolved, replaced, or made obsolete
“The world changes… got to keep up!”
THE PUNCHLINE
The preceding is possible. It’s been done!
SGS Mach Engine Analytics Software:
Approach
Outcomes
Lessons learned
MACH APPROACH
A base set of “components” is available and extensible
Components are backed by component-specific analysis methods
A “unit under test topology” defines the test article as component connections, typically fluid or energy flows
Test measurements are associated with specific states of components or connections
Mach combines the available information to calculate other derivable results
Component examples:
Environmental conditions: altitude, temperature, humidity, grade
Air handling systems: single-stage turbo, sequential turbo, intercooler, intake throttle, exhaust gas recirculation (low-pressure EGR, high-pressure EGR)
Diesel fuel injection systems: fuel supply/return, unit injector, pump-line-nozzle, common rail
Camshaft and valvetrain: lifters, variable valve actuation, synchronization
Catalysts & filters: DOC, DPF, SCR, LNT, ASC, TWC
Exhaust sensors: wide-band lambda, narrow-band lambda, NOx sensor, NH3 sensor, soot sensor
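One way to picture the component/topology approach above is a small sketch in which components are nodes, connections carry measured fluid states, and derived results are computed by component-specific methods. All class, field, and channel names here are invented for illustration; they are not the Mach implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Connection:
    """A fluid/energy link between components; measurements attach to its state."""
    name: str
    state: dict = field(default_factory=dict)  # e.g. {"p_kPa": 100.0}

@dataclass
class Compressor:
    """A component backed by its own component-specific analysis methods."""
    inlet: Connection
    outlet: Connection

    def pressure_ratio(self) -> float:
        # A derivable result combined from measurements on two connections.
        return self.outlet.state["p_kPa"] / self.inlet.state["p_kPa"]

# Topology: ambient air -> compressor -> intake manifold
ambient = Connection("ambient_air", {"p_kPa": 100.0})
boost = Connection("intake_manifold", {"p_kPa": 250.0})
turbo = Compressor(inlet=ambient, outlet=boost)
```

Here turbo.pressure_ratio() returns 2.5 from the two attached measurements; swapping a single-stage turbo for a sequential arrangement changes the topology, not the analysis code for each component.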
LESSONS LEARNED
A broad and long-term view tends to benefit from:
Domain experts
Modular architecture
Well-defined interfaces
Pluggable software modules
Built-in quality assurance
Domain-specific languages
BIG DATA APPLICATIONS IN THE DYNO LAB:
MACHINE LEARNING
Dynamometer lab data are used to create pattern recognition models for good and poor engine operation:
• Extreme environmental conditions
• Sensitivity outside of OEM installation limits
• Imposed faults/malfunction/abuse
• Training using dyno lab measurements and known observed conditions
Analytics are then applied to large volumes of test data using cluster computing to discover similar poor operating conditions
Benefits include uncovering the scope of the problem and gaining insights for product improvement
[Diagram: training data for good and poor operation feed feature extraction, model training, and model validation; dyno testing (DoE, classical statistics) produces the model; analytics then discover problems in populations, spanning a wide environmental test space and extreme regional climate data. Example: engine sensitivity to extreme environmental conditions.]
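The workflow in the diagram can be sketched end to end. In this illustration, a nearest-centroid classifier stands in for whatever model the production system actually uses, the "dyno" traces are synthetic, and feature extraction is reduced to a few summary statistics; every name below is this sketch's own.

```python
import numpy as np

def extract_features(trace: np.ndarray) -> np.ndarray:
    """Reduce one measured time series to summary features: mean, spread, peak."""
    return np.array([trace.mean(), trace.std(), trace.max()])

def train(good_traces, poor_traces):
    """Compute one feature centroid per class from labeled training runs."""
    g = np.mean([extract_features(t) for t in good_traces], axis=0)
    p = np.mean([extract_features(t) for t in poor_traces], axis=0)
    return g, p

def classify(trace, centroids) -> str:
    """Label a new run by its nearest class centroid in feature space."""
    g, p = centroids
    f = extract_features(trace)
    return "good" if np.linalg.norm(f - g) <= np.linalg.norm(f - p) else "poor"

# Synthetic stand-in for dyno measurements: stable vs. erratic operation.
rng = np.random.default_rng(0)
good = [rng.normal(100.0, 2.0, 500) for _ in range(20)]
poor = [rng.normal(100.0, 15.0, 500) for _ in range(20)]
centroids = train(good, poor)
```

The same classify step, distributed across a cluster, is what scans large populations of archived test data for previously unnoticed poor operating conditions.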
A CLOSING CHALLENGE
If your organization is stuck in a spreadsheet swamp, re-think the possible
We’ve done it before, we can do it again: faster, cheaper, better
WHY SGS?
Leveraging the right technology
Thinking globally, acting locally
Services and solutions across industries
Complete supply chain support
Strong management team and longevity
High priority on customer service and satisfaction
Corporate culture of quality and safety
Synergistic & strategic client partnerships
Trusted for impartiality: inspection, testing, verification and certification
Industry commitment through continued investment and acquisition
Independently strong, together strengthened for growth