1. The Orbiting Carbon Observatory (OCO) Mission
Watching The Earth Breathe…Mapping CO2 From Space.
OCO: A Unique Lessons Learned Opportunity
Patrick J. Guske
Jet Propulsion Laboratory
California Institute of Technology
Used with Permission
2. Agenda
• Introduction to the Orbiting Carbon Observatory
• History
• Plan for the Future
• Nominal Phase or Mission Close-Out Process
• OCO’s “Close-Out” Process
• Sample Lessons
• Evaluation Plans
• Questions and Answers
3. Project and Mission Overview
The Orbiting Carbon Observatory (OCO)
Watching The Earth Breathe…Mapping CO2 From Space
Salient Features:
• High-resolution, three-channel grating spectrometer
• Industrial partners for Instrument and Spacecraft
• High heritage spacecraft, flies in formation with the A-Train
• Launch date: 24 February 2009 on Taurus XL from VAFB
• Operational life: 2 years
• Principal Investigator: Dr. David Crisp, Deputy: Dr. Charles Miller
• Project Manager: Thomas Livermore, Deputy: Dr. Ralph Basilio
• Earth Science Flight Projects Office Manager: Dr. Steven Bard, JPL
• ESSP Program Manager: Edward Grigsby, LaRC
• Program Scientist: Dr. William Emanuel, NASA HQ
• ESSP Program Executive: Eric Ianson, NASA HQ
Science:
• Collect the first space-based measurements of atmospheric CO2 with the precision, resolution, and
coverage needed to characterize its sources and sinks on regional scales and quantify their variability
over the seasonal cycle.
• Use independent data validation approaches to ensure high accuracy (1-2 ppm, 0.3% - 0.5%)
• Reliable climate predictions require an improved understanding of CO2 sinks
• What human and natural processes are controlling atmospheric CO2?
• What are the relative roles of the oceans and land ecosystems in absorbing CO2?
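The percentage figures in the accuracy target follow directly from the ppm numbers: 1-2 ppm measured against an ambient CO2 concentration of roughly 385 ppm (an assumed value for that era, not stated on this slide) works out to about 0.3%-0.5%. A quick arithmetic check:

```python
# Sanity check: express the 1-2 ppm accuracy target as a fraction of
# the ambient CO2 concentration (~385 ppm is an assumed late-2000s global mean).
ambient_ppm = 385.0

for target_ppm in (1.0, 2.0):
    percent = target_ppm / ambient_ppm * 100.0
    print(f"{target_ppm} ppm -> {percent:.2f}% of ambient CO2")
# prints 0.26% and 0.52%, consistent with the quoted 0.3% - 0.5% band
```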
4. Mission System Description
[Mission system diagram, showing:]
• 3-Channel Spectrometer
• Dedicated Spacecraft, formation-flying as part of the A-Train Constellation
• Ground Validation Sites
• Data Processing Center (JPL)
• Taurus XL 3110 launch vehicle (KSC)
• Mission Ops: NASA GN (GSFC) and SN (TDRSS)
• Data Products
Please visit http://oco.jpl.nasa.gov for more information
5. History
• OCO selected as ESSP mission in 2002
• Launch attempt on February 24, 2009
– Fairing on the launch vehicle failed to separate
– Additional weight of the fairing was too much for the final stage
– Observatory failed to achieve orbit insertion
• In a matter of moments, the Mission was over
• The Project immediately started the effort to convince NASA a re-flight was in
order
• Close-Out activities began
6. Plan for the Future
• The measurement OCO was to take is critical
• The Project Team has proposed a re-flight
– Utilize existing designs where possible
▪ Instrument build brought in-house
▪ Some components are obsolete
▪ Cryocooler was already a flight spare
– Quick development schedule, starting just prior to CDR
– Focus on minimizing changes (Better is the Enemy of Good Enough)
• Waiting for Approval to Proceed (ATP)
• Conducting Risk Reduction
– Early part procurement
– Clarification of requirements
– Staff retention
• Positioning for a quick start following ATP
7. Nominal Phase or Mission Close-Out Process
• Projects collect Lessons Learned at the end of development or the end of the
mission
• Nominally,
– Lessons are collected from all members of the team
– Lessons identify what worked and what didn’t
– Lessons are “cleaned up” and published
• The Perspective is to close out the phase or the project itself
– “Everyone needs to capture lessons learned”
• After publication, the identity of the Customer may be nebulous
– Members of the Project carry lessons forward
– Documents may be read by similar projects in development
– “Real Good” lessons may be captured into Institutional or NASA Lessons Learned
8. OCO’s “Close-Out” Process
• For OCO, the Project ended quickly, but still needed to be closed out
• In a similar way,
– Lessons were collected from all members of the team
– Lessons identified what worked and what didn’t
– Lessons were “cleaned up” and published
• The Perspective for OCO is different:
– “We ARE going to do this again. Let’s get it right.”
• As to the Customer for the lessons,
– “We have met the Customer and He is Us”
9. OCO’s Lessons Learned Process
• Collect ALL inputs, including what worked and what didn’t
• Remove duplicates
• Remove lessons that complain or “whine” but offer no solution
• Evaluate each lesson to make sure it is “make it work” and not “make it better”
– Schedule impact or extra work for employees (e.g., Requirement Verification)
– Cost impact encountered by the Project (e.g., Residual Image Correction)
– Process problem that caused confusion (e.g., Expected Test Behavior Review)
• Clean up wording of lesson and proposed solution
– Focus on making it positive
– Attempt to make it achievable
– Remove “finger pointing” (The Team, both JPL and the Contractor, need to still
work together!)
– Keep only cross-cutting lessons (lessons for individual subsystems tracked there)
• Assign individuals responsible for following and “championing” implementation
– Make sure whatever worked is implemented in re-flight
– Develop a plan for implementation
– Track improvements and report status to Project management
• Signed version of the document had 78 Lessons
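The triage steps above can be sketched as a small filtering pipeline. This is an illustrative model only; the field names, category flags, and sample lessons are invented, not taken from the actual OCO Lessons Learned document.

```python
# Illustrative sketch of the lesson triage described above; field names,
# flags, and the sample lessons are hypothetical, not from the OCO document.
from dataclasses import dataclass

@dataclass(frozen=True)
class Lesson:
    text: str                 # what happened and why it matters
    proposed_solution: str    # empty string = a complaint with no fix offered
    cross_cutting: bool       # spans subsystems (subsystem-local lessons tracked there)
    make_it_work: bool        # True = needed for success; False = "make it better"
    champion: str = ""        # individual responsible for implementation

def triage(inputs):
    """Apply the deck's filters: dedupe, drop solution-free complaints,
    keep only cross-cutting 'make it work' lessons."""
    seen, kept = set(), []
    for lesson in inputs:
        if lesson.text in seen:           # remove duplicates
            continue
        seen.add(lesson.text)
        if not lesson.proposed_solution:  # remove "whining" with no solution
            continue
        if not (lesson.cross_cutting and lesson.make_it_work):
            continue
        kept.append(lesson)
    return kept

lessons = [
    Lesson("Single DOORS database needed", "Sync databases", True, True),
    Lesson("Single DOORS database needed", "Sync databases", True, True),  # duplicate
    Lesson("Meetings ran long", "", True, True),                           # no solution
    Lesson("Nicer badge printers", "Buy new printers", False, False),      # not make-it-work
]
print(len(triage(lessons)))  # prints 1 -- only one lesson survives triage
```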
10. Sample Lesson - Requirements
• Problem/Background
– Multiple databases were maintained for the requirements tracked in DOORS
(Dynamic Object-Oriented Requirements System). This resulted in linkages
between requirements in the two databases not being maintained. In addition, it
was not always possible for people from various organizations to view and review
requirements maintained within partner databases.
• Lesson Learned
– A single official version of the DOORS database needs to be maintained and
must be accessible to all applicable parties.
• Cognizant Individual: Project System Engineer
• Solution
– A proven solution for synchronizing requirement databases has already been put in
place for other JPL projects with Spacecraft Contractors. The process will be
implemented when OCO-2 requirements are placed into DOORS.
• Status/Evaluation
– Pending implementation
11. Sample Lesson – Flight Screening of Detectors
• Problem/Background
– The project decided early in the lifecycle not to build an engineering model of the
instrument and to accept the initial screening of detectors by the Vendor. This
screening included a typical set of tests but did not exactly mimic some of the
OCO unique flight conditions. During instrument thermal vacuum testing, the
integrated instrument experienced solar spectra for the first time. It was
discovered that two of the three detectors experienced residual image issues.
Rather than replace the affected detectors at the risk of significant schedule slip
and damage to the instrument, the Project instituted an effort to develop residual
image correction algorithms to process instrument imaging data.
• Lesson Learned
– The flight screening process for detectors (focal planes) needs to duplicate the
flight operating conditions including clocking, bias voltages, read-out scheme and
illumination conditions.
• Cognizant Individual: Instrument Manager
• Solution
– The OCO-2 schedule and budget have been modified to test flight candidate
detectors on the engineering testbed in flight-like conditions prior to integration
into the instrument.
• Status/Evaluation
– Pending
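The fix this lesson calls for amounts to a parameter-for-parameter comparison between the vendor's screening conditions and the flight operating conditions. A minimal sketch, in which every parameter name and value is hypothetical:

```python
# Sketch of the screening-vs-flight comparison implied by the lesson:
# flag any screening condition that differs from flight. All parameter
# names and values are illustrative, not OCO's actual test conditions.
flight_conditions = {
    "clocking": "flight_timing_pattern",
    "bias_voltages": "flight_bias_set",
    "readout_scheme": "flight_readout",
    "illumination": "solar_spectrum",
}

vendor_screening = {
    "clocking": "flight_timing_pattern",
    "bias_voltages": "flight_bias_set",
    "readout_scheme": "flight_readout",
    "illumination": "flat_field_lamp",   # differs: screening never used solar spectra
}

mismatches = {k: (vendor_screening[k], flight_conditions[k])
              for k in flight_conditions
              if vendor_screening.get(k) != flight_conditions[k]}
print(mismatches)  # {'illumination': ('flat_field_lamp', 'solar_spectrum')}
```

Had a check like this been part of the screening acceptance criteria, the illumination mismatch that produced the residual-image surprise would have been flagged before instrument thermal vacuum testing.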
12. Sample Lesson – Transfer of Instrument Data
• Problem/Background
– Transferring large volumes of instrument data was problematic. Getting the data
to the Instrument Team in a timely manner and as complete sets was critical for
analysis and evaluation of instrument performance. The use of customer-
supplied USB hard drives was the right answer technically, but it may have
violated portable storage device policies.
• Lesson Learned
– Investigate early in the Project a method for transferring and logging large
volumes of instrument data to a centralized server/repository that is accessible
by ATLO and the Instrument Team. The possibility of a dedicated high-volume
data line between JPL and the Contractor should be considered.
• Cognizant Individual: Mission Operations Manager
• Solution
– TBD
• Status/Evaluation
– Pending
13. Sample Lesson – Early Provision of S/C Simulator
• Problem/Background
– The spacecraft simulator developed and provided early in the program supported
an early design opportunity to work the Instrument data handling functions. It was
also a critical element for the Instrument development and test efforts. This
provided significant opportunities to gain experience with the operations system,
including the operational scripting language and command and telemetry
databases for the Instrument and Operations teams.
• Lesson Learned
– Providing the spacecraft simulator early in the project lifecycle supported the
resolution of significant instrument issues.
• Cognizant Individual: Instrument I&T Manager
• Solution
– Repeat the experience from OCO – Use the spacecraft simulator for Instrument
integration and test
• Status/Evaluation
– Pending
15. Evaluation Plans
• Matrix of Lessons Learned has been developed
• Each Lesson has been evaluated for:
– When it should be implemented
▪ Some are being implemented during the period of risk reduction prior to ATP
▪ Prior to CDR and following ATP
▪ Normal course of Project Development
– If there is a potential cost involved in implementing the Lesson
• Cognizant Individual will develop an implementation plan, including cost (if any)
and evaluation criteria of successful implementation
• Project will evaluate the implementation plans that require financial resources
– Is the change a “Make It Work” or a “Make It Better”?
– Up-Front Cost vs. Potential Liens Against Reserves
• Project will determine which Lessons are to be implemented
• Project System Engineer will track implementation status monthly
• Evaluation of total implementation program will be reported following successful
launch of OCO-2
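The matrix described above can be modeled as a table of records with the evaluation fields as columns. The phase labels and sample rows below are illustrative, not the Project's actual entries:

```python
# Illustrative model of the Lessons Learned evaluation matrix; the phase
# names, cost flags, and sample rows are invented for this sketch.
from dataclasses import dataclass

@dataclass
class MatrixEntry:
    lesson_id: int
    when: str            # "pre-ATP risk reduction", "ATP-to-CDR", or "normal development"
    has_cost: bool       # implementing the lesson requires financial resources?
    make_it_work: bool   # vs. "make it better"
    status: str = "pending"

matrix = [
    MatrixEntry(1, "pre-ATP risk reduction", has_cost=False, make_it_work=True),
    MatrixEntry(2, "ATP-to-CDR", has_cost=True, make_it_work=True),
    MatrixEntry(3, "normal development", has_cost=True, make_it_work=False),
]

# The Project reviews only the entries whose implementation needs funding:
needs_review = [e.lesson_id for e in matrix if e.has_cost]
print(needs_review)  # [2, 3]
```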