Cognizant 20-20 Insights

Extending Function Point Estimation for Testing MDM Applications
Executive Summary

Effort estimation of testing has been a much debated topic. A variety of techniques are used — ranging from percentage of the development effort to more refined approaches based on use case and test case points — depending on functional and technological complexity. Underlying testing normally focuses on end-user functionality.

Testing of master data management (MDM) applications is different. As such, it requires a different approach when estimating effort. In an MDM testing project, there are specific factors that impact estimation. They include:

• The effort needed to prepare scenario-specific test data and loading scripts.
• Script execution and data loading time.
• Availability of a separate MDM hub.

This white paper analyzes the impact of such factors, as well as the approach that should be adopted to estimate the effort needed for testing MDM solutions.

Estimation Approach

System and integration testing in MDM focus on verifying the system functions, data quality, exception handling and integration of business functions across the enterprise. The approach comprises the following steps:

• Collect input specifications.
• Compute MDM application size (this includes the ETL and MDM parts of testing) in function points.
• Determine the number of test cases for MDM testing (including ETL test cases).

The MDM test estimation approach highlighted in this document is aligned with the International Function Point User Group’s (IFPUG) guidelines for function point analysis (FPA).

Steps of the Estimation Process Flow

Size Estimation

The input and output interfaces of the MDM application are counted, and the following general considerations are applied while calculating the function points:

• Step 1: Identify the Application Boundary for the MDM Project.

The application boundary determines the function points that need to be counted as part of the MDM application (including the ETL part). The application boundary indicates the border between the software being measured (in terms of testing) and the user and other
cognizant 20-20 insights | august 2011
applications that integrate with the MDM application.

Figure 1 depicts the application boundary and counting scope of an MDM project. It contains the following:

> ETL layer functionalities.
> MDM and publish layer functionalities.
> End-to-end application functionalities, including the ETL, MDM and publish layers.

• Step 2: Determine the Unadjusted Function Point Count.

The unadjusted function point count (UFPC) reflects the specific countable MDM and ETL functionality provided to the user by the project or application. The user functionality is evaluated in terms of what is to be delivered by the application, not how it is to be delivered. Only user-requested and user-defined components are counted.

The UFPC can be counted by identifying and mapping different user-requested MDM functionalities using function point elementary processes. For example, an MDM testing requirement can be stated as, “Verification of customer master PKEY_SRC_ID formation as per business rule.”

This requirement can be identified and mapped with the function point elementary process “External Output” (EO), as it involves querying and deriving data using business logic and, hence, fulfills the necessary conditions for EO.

Applying the Size Estimation Technique in MDM Testing Projects

When it comes to testing types, the following options are considered for an MDM testing project:

1. Option A: Database-intensive testing deliverables with data flow requirements for:

> Source to landing data loading (i.e., land process).
> Landing to staging data loading (i.e., stage process).
> Staging to base object data loading (i.e., load process).

Database-intensive testing must be performed in each layer of data staging, as mentioned above. For example:

> Data standardization and cleansing to be verified for the stage process.
Figure 1 (Identifying Application Boundary and Testing Scope): Source Systems 1 and 2 feed a landing area and staging tables through the ETL layer, where data validation, cleansing and transformation are applied and rejected records are routed to an ETL reject store; this sits inside the ETL application boundary. The MDM hub performs match, merge, data harmonization, standardization and cleansing, maintains X-reference tables, and publishes data to the target MDM system services; this sits inside the MDM application boundary.
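One of the Option A checks above, stage-process data standardization and cleansing, can be expressed as a small verification routine. The sketch below is illustrative Python, not part of the described approach; the sample records, the junk-character set (#, &, ^, %, !) and the "Street" to "STRT" rule follow the cleansing requirement discussed in this paper.

```python
# Hypothetical sketch of an Option A stage-process check: verify that staged
# customer address records are cleansed (no junk characters) and standardized
# ("Street" rendered as "STRT"). Record values are illustrative.

JUNK_CHARS = set("#&^%!")

def is_cleansed(address: str) -> bool:
    """A record passes if it contains no junk characters."""
    return not (set(address) & JUNK_CHARS)

def is_standardized(address: str) -> bool:
    """A record passes if the unabbreviated form 'Street' no longer appears."""
    return "Street" not in address

def verify_stage_records(records):
    """Return the records that fail either cleansing or standardization."""
    return [r for r in records
            if not (is_cleansed(r) and is_standardized(r))]

# Example run against sample staged rows (illustrative data):
staged = ["12 MAIN STRT", "45 OAK Street", "7 ELM STRT #2"]
failures = verify_stage_records(staged)
# "45 OAK Street" fails standardization; "7 ELM STRT #2" fails cleansing.
```

In practice the records would come from a SQL query against the staging tables rather than an in-memory list, as the paper notes for such External Query style checks.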
> Auto match and merge of data as per business rule to be verified for the load process.

2. Option B: UI console-based testing deliverables with data steward-specific requirements, such as:

> Manual match and merge of records as per business rule.
> Trust rule verification for data from different sources.
> Ability to create/edit new and existing records, etc.

Activities related to each of the above sections can be mapped directly with the elementary processes of function point analysis. For example, consider the following data standardization and cleansing requirement: “Customer address records should be free from junk characters (#, &, ^, %, !), and ‘Street’ should be displayed as ‘STRT.’”

A simple SQL query will be implemented in the test steps in order to verify the above requirement. The query doesn’t need any logical data derivation (e.g., concatenation or selecting a sub-string from the record) or mathematical calculation in order to verify the cleansing requirement. It only needs to fetch the record as it is stored in the database, per the conditions stated in the requirement. Hence, this functionality can be mapped against the FP elementary process “External Query” (EQ).

Figure 2 provides a pictorial view to identify the elementary processes of function point analysis in such data migration activities.

Figure 2 (Identifying Elementary Processes for MDM Data Flow): Source Systems 1 and 2 feed the landing area through ETL (the land process); data then moves from landing to staging (the stage process) and from staging into the MDM hub with its match and merge services (the load process), all within the integrated MDM application boundary. The elementary processes External Query (EQ) and External Output (EO) are identified along this flow toward the target MDM system.

• Step 3: Determine the Value Adjustment Factor.

The value adjustment factor (VAF) indicates the general functionality provided to the user of the application. The VAF comprises general system characteristics (GSC) that assess the general functionality of the application. Examples of such characteristics are:

> Distributed data processing.
> Performance objective of the MDM hub.
> Online data entry on the downstream applications.

The VAF can vary between 0.65 and 1.35.

• Step 4: Calculate the Adjusted Function Point Count (AFPC).

The adjusted function point count is calculated using a specific formula:

AFPC = VAF * UFPC

• Step 5: Normalize Using Test Cases.

On obtaining the size of the application in terms of FP, the number of normalized test
cases (manual) befitting the application is calculated using a formula proposed by, and based on historical data from, Capers Jones:

Number of Normalized Test Cases = (AFPC) ^ a

Note: ‘a’ is a factor whose value varies with the AFPC.

Effort Estimation

The effort estimation for an MDM testing project is computed on the basis of the Organizational Baseline Productivity (OBP) figures for MDM testing projects. The total effort required by the project based on productivity figures is as follows:

Total Effort in Person Hours (PH) = Number of Normalized Test Cases / Productivity (in Normalized Test Cases per PH)

A productivity baselining exercise must be conducted within the organization, using essential data from closed testing projects — namely, actual project size and effort data from the key members of closed projects. The final size is established in terms of normalized test cases and the effort in PH. The effort for test design and test execution needs to be captured separately in order to derive the productivity figure for each case. This yields a productivity data point for each case and project. The median value of these data points gives us the OBP for test design and execution.

Common Factors for MDM Testing Projects

These factors always increase the effort required:

• Project management (strategy, planning, monitoring & reporting).
• Quality assurance.
• Retesting, reworking & defect tracking.
• Training effort.
• Environment setup and integration with the test management tool.
• Test data preparation.

Project-Specific Factors for MDM Testing Projects

The impact of these factors varies from project to project. Based on the situation, these factors may increase or decrease effort.

Beyond total effort, a percentage of common factors and project-specific factors must be added in order to arrive at the final adjusted effort:

Final Adjusted Effort = Total Effort + Total Effort * (% of Common Factors + % of Project-Specific Factors)

Factors such as initiation and planning, closure, number of iterations, etc. need to be considered separately and added to the above figure.

Challenges

Having outlined the approach, it is still important to highlight that — unlike UI-intensive application testing — effort estimation for testing MDM applications is still a new concept. Estimation has many challenges, a few of which include:

1. Non-availability of industry-standard productivity values for MDM technologies.
2. Non-availability of detailed requirement specifications at the estimation stage.
3. The need for skilled function point counters for consistent size estimation, especially people with sufficient training and practice with counting rules.
4. The availability of subject matter experts for the application in order to get a logical view of the application.

Final Notes

Based on the estimation approach highlighted in this paper, we have built a tool for MDM testing estimation. This tool not only provides simple interfaces to capture user inputs, but also implements the calculations for effort estimation. Additionally, it addresses the majority of the challenges mentioned above by making realistic assumptions based on our rich experience with MDM application testing.
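Pulling the steps together, the calculation chain described in this paper (UFPC, AFPC, normalized test cases, and final adjusted effort) can be sketched as below. This is a hypothetical illustration: the average-complexity weights follow common IFPUG practice, but the component counts, VAF, exponent 'a', productivity figure and factor percentages are invented inputs, not organizational baselines.

```python
# Illustrative sketch of the estimation flow: Step 2 (UFPC), Step 4 (AFPC),
# Step 5 (normalized test cases), then total and final adjusted effort.
# All numeric inputs below are assumptions for demonstration only.

def unadjusted_fp(components):
    """Step 2: UFPC as a weighted sum of elementary processes,
    using IFPUG average-complexity weights."""
    weights = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}
    return sum(weights[k] * n for k, n in components.items())

def adjusted_fp(ufpc, vaf):
    """Step 4: AFPC = VAF * UFPC, with the VAF constrained to 0.65-1.35."""
    assert 0.65 <= vaf <= 1.35
    return vaf * ufpc

def normalized_test_cases(afpc, a):
    """Step 5: Number of Normalized Test Cases = (AFPC) ^ a,
    where 'a' varies with the AFPC."""
    return afpc ** a

def final_adjusted_effort(test_cases, productivity, common_pct, specific_pct):
    """Total Effort (PH) = test cases / productivity (test cases per PH),
    then add common and project-specific factor percentages."""
    total = test_cases / productivity
    return total * (1 + common_pct + specific_pct)

# Example: a hypothetical MDM project profile.
ufpc = unadjusted_fp({"EI": 10, "EO": 8, "EQ": 12, "ILF": 5, "EIF": 4})  # 206
afpc = adjusted_fp(ufpc, vaf=1.0)
cases = normalized_test_cases(afpc, a=1.2)
effort_ph = final_adjusted_effort(cases, productivity=2.0,
                                  common_pct=0.15, specific_pct=0.05)
```

Initiation, planning, closure and iteration effort would still be added on top of this figure, as noted above.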