Are Big Product Lifecycle Management (PLM) Projects Really Cost Effective?
Michele Brun is an expert in a number of business areas including PLM Strategic Planning, Portfolio Management, ECAD/MCAD, and Systems and Software Engineering. Having worked in the industry since 1982, Michele has held a number of high-profile positions with the likes of Siemens and Continental.
THE TRADITIONAL PLM APPROACH
Over the past 20 to 30 years, many companies have sought to implement a promising PLM solution, adopting a holistic approach to the design cycle so that it would be fully integrated with ERP.
As the CAD-CAE approach was maturing and proving
itself to be an ideal candidate for 3D modelling and for
the design of electronic circuit boards, it soon became
necessary to monitor and catalogue the design
documents and to keep track of the relationship
between the parts and the complete physical product
definition.
So electronic document vaulting and BOM production
were born, marking a big step forward! Most of the
industry endorsed this Product Data Management
approach but, with the emergence of workflow
support, this soon transformed into the concept of
PLM.
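The core idea behind that early Product Data Management step, parts carrying vaulted design documents, with a bill of materials derived from the part structure, can be sketched in a few lines. This is a minimal illustration; all part numbers, descriptions and file names are invented for the example, not taken from any real system:

```python
from dataclasses import dataclass, field

@dataclass
class Part:
    """A part in the product structure, with its vaulted design documents."""
    number: str
    description: str
    documents: list = field(default_factory=list)  # vaulted CAD files, specs, ...
    children: list = field(default_factory=list)   # (sub-part, quantity) pairs

    def add_child(self, part, qty=1):
        self.children.append((part, qty))

def bom(part, qty=1, level=0):
    """Walk the part structure and emit an indented bill of materials."""
    lines = [f"{'  ' * level}{qty} x {part.number} {part.description}"]
    for child, q in part.children:
        lines.extend(bom(child, q, level + 1))
    return lines

# Illustrative product structure
board = Part("PCB-100", "Main circuit board", documents=["pcb-100_layout.brd"])
case = Part("CASE-01", "Enclosure", documents=["case-01.step"])
phone = Part("PHONE-1", "Smartphone assembly")
phone.add_child(board, 1)
phone.add_child(case, 1)

print("\n".join(bom(phone)))
```

The point of the sketch is only the coupling the article describes: every part keeps track of its documents, and the complete physical product definition falls out of the part relationships.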
This solution involved everybody being connected
to a central system, eventually distributed with
appropriate replication capabilities, as well as ensuring
data security and 24/7 availability.
Observing how the industry has progressed, and looking forward, one can easily see that it will have to confront not only its existing challenges but new ones as well: speed, the quantity of data, time to market, the scale of variants, globalization and the growing importance of software, to name but a few. Maybe it is the right time to endorse a change of paradigm.
CHALLENGES OF TODAY, TRENDS FOR TOMORROW
Systems Engineering, a discipline long set aside from
PLM, is now becoming a fundamental part of the cost
effectiveness of a multi-disciplinary PLM project.
Almost 30% of the development cost of today’s products is related to software. One can easily recognize that what makes the difference between products is neither the mechanical shape nor the electronic parts, but rather the functionality that the product performs.
For example, take a smartphone, a tablet and a removable laptop. All have the same ‘look and feel’, the same type of connectors, similar electronic printed circuit boards and similar components. So, what’s the difference? Simply use them and you’ll see that the difference lies in the functionality that has been specified by the Systems Engineers and designed by the Software Architects and Developers.

It is also likely that, because of globalization, all of these software components are the result of a collaborative approach of distributed teams using distributed – and communicating – requirements and configuration management systems.

It is a formidable challenge to support the lifecycle of a product taking all of these non-physical product components into consideration; the challenge lies not only in the ability to interconnect a standard PLM system with a Systems and Software development system, but also in how to enable real-time baselining.
As software development continuously moves according to dynamic specifications driven by change requests and refinements of function definitions, it is crucial to be able to produce, at any given point in time, a ‘snapshot’ of all of its content, i.e. the implemented sub-systems of the main system. This snapshot is what is commonly known as a Baseline.
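The baselining idea can be illustrated with a toy configuration store: items (requirements, models, source files) evolve as change requests are applied, and a baseline freezes every item’s version at one point in time. This is a minimal sketch; the item identifiers are invented for illustration:

```python
import datetime

class Repository:
    """Toy configuration store: each item carries a version number
    that advances every time a change request is applied to it."""
    def __init__(self):
        self.versions = {}  # item id -> current version

    def commit(self, item):
        self.versions[item] = self.versions.get(item, 0) + 1

    def baseline(self, label):
        """A baseline is an immutable snapshot of every item's version
        at a given point in time."""
        return {
            "label": label,
            "taken": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "versions": dict(self.versions),  # copy: later commits leave it intact
        }

repo = Repository()
repo.commit("REQ-001")          # a requirement is refined
repo.commit("arch/main.model")  # the architecture model is updated
b1 = repo.baseline("B1")
repo.commit("REQ-001")          # a further change request lands

assert b1["versions"]["REQ-001"] == 1  # the snapshot is untouched
assert repo.versions["REQ-001"] == 2   # while development moves on
```

The design choice the sketch highlights is the copy in `baseline()`: because the snapshot is detached from the live store, development can keep moving while any earlier baseline remains reproducible.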
Nowadays, product development runs around the clock; companies are leveraging the skills of talented systems and software designers across the globe, moving large, complex segments of software components from one place to another seamlessly and with minimum interruption to the development cycle. This process has had to become more collaborative in order to comply with stricter timelines, and it is putting a lot of pressure on companies to get their paradigm right. Let us consider this the next-generation Product Life Cycle Support (PLCS) challenge.
HOW TO MAKE PLCS PROJECTS MORE COST EFFECTIVE
The important thing to note in the diagram below is that the Model, rather than the Method, is the predominant feature of the V-Cycle.
The V-Cycle Model aims to describe all the different steps: the acquisition and interpretation of the requirements for a given product/system, based on the function specification, and its implementation in terms of its components, whether they are mechanical, electronic or the basic software units of a given programme. The steps in between, i.e. Architecture and Detailed Design, are essentially the means of delving deeper into the main functional blocks and of realizing them in a “physical” manner. This is the left side of the V.
The right side of the V is where each and every aspect of the concept has to be validated once the implementation of the elementary parts is completed. This is then supplemented with the testing of the sub-assemblies, and then of the entire product/system, and forms the bridge between theory and practice.
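One way to picture the two sides of the V is as a mapping from each left-side design step to the right-side test level that validates it. A minimal sketch; the step names below are a common rendering of the V-Cycle, not taken verbatim from the article:

```python
# Left side of the V: decomposition steps.
# Right side of the V: the test level that validates each step.
V_CYCLE = [
    ("Requirements specification", "Acceptance testing"),
    ("Architecture design",        "System / sub-assembly testing"),
    ("Detailed design",            "Integration testing"),
    ("Component implementation",   "Unit testing"),
]

def validation_for(step):
    """Return the right-side validation activity for a left-side design step."""
    for design, test in V_CYCLE:
        if design == step:
            return test
    raise KeyError(step)

assert validation_for("Architecture design") == "System / sub-assembly testing"
```

The pairing is the bridge between theory and practice that the article describes: every artefact produced on the way down has a matching validation activity on the way up.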
One of the biggest issues facing industry is whether
it is easier to bring Systems Engineering into the
CAD world or vice versa. The latter would mean that the Mechanical and Electronic Designers are willing to endorse the V-Cycle as the one model that everybody works with, which would require them to understand the tasks of an Architect and what Requirements Engineering means.

In reality, the discrete world performs the V-Cycle naturally, but doesn’t often call it that.
BRINGING TOGETHER THE DISCRETE AND THE FUNCTIONAL WORLDS
One may argue that the people from the so-called
discrete world, traditionally the pioneers of PLM,
and those from the functional world (Systems and
Software), traditionally the pioneers of Requirements,
Change and Configuration management, don’t live on
the same planet. But in reality of course they do; they
simply have been living apart and must now learn to
live together and cooperate in the same biosphere.
Concurrent Engineering versus Systems Engineering: these are two names for the same practice, namely the concept of different disciplines contributing in parallel, using the same initial set of requirements, in order to realise one single product performing a given functionality.
One of the most important factors in establishing a
common ground and a complementary approach is,
on one hand, to give Systems Engineering Gurus a
glimpse into the PLM world and, on the other hand,
to persuade the PLM/PLCS community to endorse a
similar model and understanding of the principles
and infrastructure of Systems Engineering. Both are
necessary and both need to learn from each other.
Again, a single common model for development is
essential in enabling parallel groups to understand
each other and to ensure they are working on the
same wavelength. The V-Cycle is not the only model
that exists, and there are debates as to whether it is
even suitable for the contemporary ways of Systems
and Product Design especially when considering
globalization.
However, the V-Cycle is one of the most popular and most tested models around, and is thus a good basis to start from. When it comes to Product Configuration Management, an efficient PLM system is a key enabler, and the V-Cycle helps to explain why this concept is so important. Looking at the main steps of the V-Cycle, it is clear that if the Version Control of the requirements specification, the Architecture definition, the Detailed Design and the related Simulation models is not managed in a synchronous manner, the risk of non-compliance of the product with its original specifications increases. Therefore PLM
solutions must provide a consistent configuration
management capability, not only for each and every
part or sub-assembly of the product – whether
discrete or digital – but also for the entire product
definition as well as for the entire systems-functional definition.

In other words, any PLM solution which is not able to create product baselines, as discussed earlier, is likely to fail when it comes to managing the product life cycle in real time.

CONCURRENT ENGINEERING AND GLOBALLY DISTRIBUTED DESIGN

With the emergence of globally-distributed design, resolving the aforementioned functionalities in any PLM/PLCS system is critical.

This is nothing new, but the traditional approach of a unique, centralized system that provides a central repository where all data and information is stored, with periodic, time-bound synchronization, is becoming obsolete. The reality is that business globalization has created a phenomenal explosion in the amount of data collected and in how that data is required to interact across virtual networks.

Having robust, fast, secure and practically seamless data-tunnels between semi-autonomous divisions of a company, operating in different geographical and cultural parts of the world, is becoming a vital part of a company’s design success. Old paradigms and methodologies are collapsing under the pressure.

So in short, though it is far from simple, the solutions of tomorrow will have to support:

• Requirements Management, Requirements Detailing and Architecture Design
• Change and Configuration Management with baselining capabilities
• A collaborative distributed mechanism that enables real-time synchronization without replication
• The ability to interconnect heterogeneous systems while maintaining permanent data consistency and quality
• Adaptability - no paradigm is written in stone, and companies therefore have to have the capacity and flexibility to respond to, and absorb, lateral shifts and scale-ups of a designated solution

If anything is certain, it is that the technology behind such a complex solution will have to include cloud computing.

CONCLUSION

The new PLM or PLCS paradigm will need to ensure a seamless, real-time environment for both the physical and functional design of today’s product development. In a fiercely competitive modern marketplace, product design and development is taking a more central role in the long-term success of a company and in how it outperforms its competitors.

It is those companies that wisely invest significant amounts of money in a PLM system, and are willing to exploit the emerging, cutting-edge technologies found in product development, that will generate a high return on this investment.

All companies have to acknowledge that there are challenges and risks, a massive one being the required change of mindset. Nevertheless, the payoff of investing in bleeding-edge technologies and paradigms compensates for the abandonment of existing, well-understood historical PLM solutions.

In summary, do not expect your big investments to be cost-effective if you follow the saying ‘wash me but don’t get me wet!’