Requirements for quality evaluation of software architecture

Joao C. Albuquerque
Hochschule Furtwangen University
Fakultät Informatik
Robert-Gerwig-Platz 1
78120 Furtwangen, Germany
E-mail: joao.albuquerque@hs-furtwangen.de
Abstract— In terms of software architecture, the requirements of a new system are seldom sufficient on their own to ensure that it will meet the needs and expectations of customers. Without a formal architectural evaluation process, the organization cannot ensure that the architectural decisions made, particularly those which affect the achievement of quality attributes such as performance, functionality, usability, efficiency, maintainability, reliability, availability, security and modifiability, are advisable ones that appropriately mitigate risks. The question to which the answer is often vague is: is the architecture good enough? Many authors and researchers have identified a gap between the requirements definition and the architecture defined for new systems. This failure in eliciting the features that ensure the quality desired by the client leads to risks. Some authors have researched this field and proposed models for quality assessment of system architectures. However, there is a market need for tools that facilitate the evaluation of architecture quality and connect it to the components involved. This paper therefore relies on the Architecture Tradeoff Analysis Method (ATAM) as a guide and proposes the requirements for a tool to be used for architecture evaluation and for the traceability between components and artifacts.
Index Terms—Architecture Tradeoff Analysis Method (ATAM), qualitative evaluation of software architecture.
I. INTRODUCTION
The structure of a software system reflects its
architecture. Each system has an architecture, whether
achieved in a planned or an unplanned manner. A
high-quality architecture is the key aspect of the
success of any development project. Software
architecture is an “intellectually graspable” abstraction
of a complex system [1]. Architectural decisions have a
great impact on the consequent quality of software
systems. As a result, it is important to evaluate how
the designed software architecture meets its quality
demands. It gives a basis for analysis of software
systems' behavior before the system has been built [5].
The ability to verify that a future software system
fulfills its stakeholders' needs without actually having
to build it represents substantial cost-saving and risk-
mitigation [6]. Architectural views are models of
architectural structures or other elements of a software
architecture. Architects use views throughout the
architecture lifecycle for the specific purpose of
understanding and analyzing the different aspects of
system development and performance. The major
characteristic of a view is its purpose and utility.
Views are not complete descriptions of the software
architecture. Often views omit otherwise important
architectural information in order to be useful for their
specific purpose. The major issue with the views is
consistency with the system rather than completeness.
Architectural views are a necessary part of any
architecture and serve as a way to design architectural
structures and to understand different aspects of
system design. But it is difficult to ensure the
relationship between these views and the strategic
business objectives that a system should cover.
Aiming to improve the effectiveness of system architectures in achieving the quality desired by customers, some techniques have been developed to perform architectural analysis and qualitative evaluation, such as ATAM. The motivation to define and describe requirements for a tool that supports qualitative evaluation of software architecture is that conflicts often occur in the delivery stage of new systems. Very often, a newly developed system meets the described requirements but does not meet the expectations of customers in terms of functionality, reliability, usability, efficiency, maintainability, and portability. Too often, systems are released with performance issues, security risks, and availability problems as a result of inappropriate decisions. Architecture evaluation is an early risk reduction method [2]. Architectures are defined early in the project life cycle, but the resulting flaws are often discovered much later. The most significant benefit of qualitative evaluation is to reassure stakeholders that the candidate architecture is capable of supporting the current and future business objectives; specifically, that it can meet its functional and nonfunctional requirements. The quality attributes of a system, such as performance, availability, extensibility, and security, are a direct result of its architecture; therefore, quality cannot easily be introduced into a system late in the game. A qualitative evaluation of the architecture while it is still a candidate specification
can reduce project risk greatly. What is the
importance of evaluating the architecture in the early
stage of a software project? The architecture
evaluation will not produce answers such as yes or no.
It will point out risks and also help to map out which
components support defined customer needs,
limitations or features. The sooner the risks or
weaknesses are identified, the lower the cost to re-
evaluate the solution and change the project in part or
in whole. There are currently a few well-defined and mature methods to qualitatively evaluate the architecture of a software design:
SAAM – Software Architecture Analysis Method; target: changeability and extensibility.
ALMA – Architecture-Level Modifiability Analysis; target: changeability.
PASA – Performance Assessment of Software Architectures; target: efficiency and performance.
ATAM – Architecture Tradeoff Analysis Method; target: system-relevant quality attributes.
One reason for choosing the Architecture Tradeoff Analysis Method (ATAM) [4] as the methodology for evaluating software quality is that it is the one that best connects the architecture design with its relationships to quality attributes. It is a method for evaluating software architectures relative to quality attribute goals. ATAM evaluations expose architectural risks that potentially inhibit the achievement of an organization's business goals. The ATAM gets its name because it not only reveals how well an architecture satisfies particular quality goals, but also provides insight into how those quality goals interact with each other and how they trade off against each other. The ATAM is the leading method in the area of software architecture evaluation. This also means that the system proposed in this paper serves to support quality assessment using ATAM. In other words, it supports and documents the nine steps:
1. Present the ATAM.
2. Present business drivers.
3. Present the architecture, focusing on how it addresses the business drivers.
4. Identify architectural approaches. Architectural approaches are identified by the architect, but are not analyzed.
5. Generate the quality attribute utility tree. The quality factors are elicited, specified down to the level of scenarios and prioritized.
6. Analyze architectural approaches. Based on the high-priority factors identified in step 5, the architectural approaches that address those factors are elicited and analyzed. During this step, architectural risks, sensitivity points, and tradeoff points are identified.
7. Perform a brainstorming meeting to prioritize scenarios.
8. Analyze architectural approaches.
9. Present results based on the information collected in approaches, scenarios, attribute-specific questions, the utility tree, and risks.
Software reliability evaluation mainly comprises quantitative evaluation and qualitative evaluation. Both can be summarized as follows: quantitative evaluation is more accurate because it is based on metrics and mathematical formulas that can numerically rank the degree of complexity, interdependence, coupling, among others. But there is a condition for it to be applied: the software code must already be implemented in order to be evaluated. In other words, it is an accurate method of assessment, but it can only be used after implementation. On the other hand, qualitative assessment does not provide accurate answers, but it can be performed early in the whole process (after requirements and architecture definition and before implementation, tests and delivery), before many mistakes and omissions have unnecessarily consumed resources, time and customer patience. Risks can be identified and avoided or mitigated beforehand.
Moreover, the process involves a high degree of communication between the system's stakeholders, sponsors and development teams. This undoubtedly improves the degree of refinement of the quality requirements, which can then be established in the architecture of the software project up front, increasing the degree of satisfaction with the product delivered. These features, advantages and challenges create interest in researching and developing standards and requirements that facilitate its implementation. Many problems in software systems are directly related to the specification and design of quality attributes such as modifiability or performance, to name just a few. Quality attributes have a major influence on customer and end-user acceptance. Addressing them in a systematic and appropriate way is an important but challenging endeavor. Quality attributes are difficult to handle because they need to be treated differently and typically affect not only parts of a system but the system as a whole. For example, engineers cannot constrain security or performance attributes to one single place in the system. Such attempts turn out to be impossible in most contexts, because the concerns are cross-cutting and often even invasive, that is, they require software engineers to inject design and code into existing components. In other words, most quality attributes are systemic and need global and strategic treatment. The benefits of performing a quality evaluation are: improved software architecture documentation, a documented foothold for architectural decisions, clarified quality attribute requirements, risks pinpointed at an early stage, when the budget for implementation has not yet been exhausted, and improved communication between stakeholders, avoiding misunderstandings and omissions.
II. PREVIOUS RESEARCH ON THIS TOPIC
Some studies have already been carried out toward a qualitative evaluation tool for software architecture, such as [2] and [3], papers that also refer to the ATAM method [1]. The software market offers several products for mapping the requirements and components of a project. There are good solutions that document and address traceability between customer requirements, architecture views, functional modules and implementation units. In some documentation tools, architectural attributes are referred to as functionality limitations or obligations, such as maximum response time, maximum number of concurrent users, or growth rate of the stored data, but these are just limitations. Another tool examined was IBM Rational Quality Manager. This is a web application that enables teams to track aspects of quality assurance. The central artifact in this tool is a test plan
that contains information, such as goals, schedules,
milestones as well as links to associated test cases,
requirements and development items. Rational Quality
Manager includes manual test authoring and
execution, test lab management, test execution,
reporting and defect management. It was designed as a
replacement for IBM Rational Manual Tester, IBM
Rational ClearQuest Test Manager, and Rational
TestManager. An overview of Rational Quality Manager functions: the test plan defines the objectives and scope for the test effort and helps teams answer the question: are we ready to release a new software version? The advantages and features
are: it can be used to define business and test objectives, establish a review and approval process, manage project requirements and establish the interdependencies between the two, define quality goals, and create and manage test cases; a test case can include links to development items and requirements. The relationships between test artifacts, requirements, and development artifacts can be traced in the traceability view. The most frequently reported disadvantages are: there is no reference manual that describes the functionality; the standard reports in the reporting section are not customizable and have a plain appearance; and there is no standard report showing requirements, test cases and execution results on the same sheet. In summary, despite the name suggesting a quality manager, it is in fact a test manager. That is an important activity, but it does not address the issue of evaluating architectural software quality. Considerable research
has been devoted to relating requirements to source code. Less attention has been paid to relating requirements to architecture quality. Traces between requirements and architecture may be manually assigned and may be incomplete. The impact of requirements changes on other requirements, design elements and source code can be traced to determine the parts of the software to be changed. In this research, features were identified in tools traditionally used for requirements traceability and architecture documentation whose newer versions already reference all stakeholder requests and their attributes explicitly. This already goes a long way towards avoiding the loss of customer focus. These characteristics have been described in some recent studies [8] referencing solutions that are market leaders, such as IBM RequisitePro.
The benefits of traceability include prioritizing requirements, estimating change impact, proving system understanding, supporting design decisions, validating, and more [9]. But the focus of these solutions is not to evaluate whether the architecture is good enough to meet business goals. There are other market leaders offering consultancy that proposes such a qualitative evaluation, but, in some cases, the solution behind this consulting is migration to complete proprietary solutions. In other words, they evaluate the solutions made in house, point out flaws and weaknesses, and recommend throwing away years of experience and replacing them with an in-a-box ERP solution that ensures architectural quality features. But the price of these solutions may not be within reach of most small and medium-sized companies, or may create technological dependence through monthly subscription agreements. Migrating to these new solutions involves training for all employees, an initial loss of productivity, data migration that cannot always be done perfectly, and other problems. In our literature research, other
tools [3], [4] showed an efficient architectural qualitative evaluation. However, the focus of the tool proposed in this paper is that the qualitative evaluation of the architecture should be traceable to the respective requirements and implementation units. In this study we aim to assess how far traditional tools for mapping requirements and designing architecture can be exploited to adopt features for evaluating the quality of software architecture. Thus the proposed solution can be compared with established products on the market such as IBM Rational, described in figure 3, giving us a feasibility study of our proposal. After all, a new software proposal only makes sense if this gap in the market really still exists (the loose coupling between architecture design approaches and system quality attributes is seen as a major weak point) and if there is still a lack of solutions able to identify the essential requirements for creating a sustainable architecture that permits achieving the business goals, as pointed out by previous studies [3].
III. REQUIREMENTS PROPOSAL
There is a considerable risk of projects being designed that meet all documented requirements but contain risks and do not meet the expectations of their sponsors. How can we prove to the sponsors that the solution will meet characteristics like functionality, suitability, security, interoperability, reliability, fault tolerance, recoverability, usability, understandability, learnability, operability, attractiveness, efficiency, maintainability, changeability, stability, portability and co-existence? Given this gap, this paper attempts to describe a tool that can assist in this evaluation of software architecture and map the traceability between these architectural attributes and the components of the software concerned. The identified risks should be associated with the components that address the problem. Thus, quality assessment and improvement points may lead to corrective measures applied with less effort and greater assertiveness. The features: initially, the tool should be designed to be used during a presentation to stakeholders. After presenting the concept of ATAM to the stakeholders, some business drivers are raised and may already be entered into the system to continue the survey of the major business drivers. Then it should open a screen to fill in a summarized description of the major business drivers. Third step: the tool should then open a screen to fill in a summarized description of the architecture solutions. This description should be focused on how it addresses the business drivers. This step will be better understood by everyone if it is possible to include not only the textual description but also images, charts, or diagrams regarding the architecture associated with the text description. Each entry of this step should be linked with one or more entries from the step above.
The architecture is usually described using the logical view, which is concerned with the functionality that the
system provides to end-users. UML Diagrams used to
represent the logical view include Class diagram,
Communication diagram, and Sequence diagram. The
development view illustrates a system from a
programmer's perspective and is concerned with how
the solution is modularized and the software
management. This view is also known as the
implementation view. It uses the UML Component
diagram to describe system components. UML
Diagrams used to represent the development view
include the Package diagram. The process view deals
with the dynamic aspects of the system, explains the
system processes and how they communicate, and
focuses on the runtime behavior of the system. The
process view addresses concurrency, distribution,
integrators, performance, and scalability, etc. UML
Diagrams to represent process view include the
Activity diagram. The physical view depicts the
system from a system engineer's point-of-view. It is
concerned with the topology of software components
on the physical layer, as well as the physical
connections between these components. This view is
also known as the deployment view. UML Diagrams
used to represent physical view include the
Deployment diagram. Scenarios: The description of
architecture may be illustrated using a small set of use
cases, or scenarios which become a fifth view. The
scenarios describe sequences of interactions between
objects, and between processes. They are used to
identify architectural elements and to illustrate and
validate the architecture design. They also serve as a starting point for tests of an architecture prototype, and this view is also called the "use case view". The proposed logical data model for this tool is described in figure 1.
Figure 1 – Logical Data Model
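As an illustration of the kind of relationships such a data model captures, the following is a minimal sketch using hypothetical class and field names; it is an assumption for illustration only and not the actual model of figure 1.

```python
# Hypothetical sketch of the logical data model; all names are illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class BusinessDriver:
    id: int
    description: str
    priority: str                                            # e.g. "High", "Medium", "Low"


@dataclass
class ArchitectureSolution:
    id: int
    description: str                                         # text plus references to diagrams
    driver_ids: List[int] = field(default_factory=list)      # business drivers it addresses


@dataclass
class Component:
    id: int
    name: str
    solution_id: int                                          # architecture solution it belongs to


@dataclass
class Risk:
    id: int
    description: str
    component_ids: List[int] = field(default_factory=list)   # components that treat the risk
```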
Requirements typically define what a system should do, but rarely define how it should be done: how fast, how secure, how easy, how modifiable, how portable, and so on. To help participants elicit the most important features for their business, a database of pre-defined questions referring to issues of quality, features and limitations should be used, as described in Table 1 - Software Architecture Quality Survey. For each question there are answer options according to the degree of importance of this feature for the business. The choice of these answers will help identify project risks and, consequently, will facilitate the stakeholders' understanding of the relationship of each risk to the software components dealing with that functionality. To accelerate the meetings, the data tables containing the main software components designed in the architecture, or the implementation units of the application, can be pre-registered by the development team. In the same way, the tables of questions and answers should previously contain the most common situations regarding business needs, such as: how long it should take for a new feature to be ready for market, restrictions on response time, the volume of data stored in the system, the time required for system changes, the ease of users' learning, the number of concurrent users, availability (24h x 7d) or (8h x 5d), and so on.
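As an illustration of how such pre-registered question and answer tables might be structured, consider the sketch below; the field names and value sets are assumptions, not the paper's actual schema.

```python
# Illustrative sketch of the pre-registered survey tables; all names are assumed.
from dataclasses import dataclass
from typing import Optional

IMPORTANCE_LEVELS = ("Low", "Average", "High", "Very important")


@dataclass
class SurveyQuestion:
    id: int
    quality_topic: str              # e.g. "Reliability / Availability"
    text: str                       # e.g. "Required availability: 24h x 7d or 8h x 5d?"


@dataclass
class SurveyAnswer:
    question_id: int
    importance: str                 # one of IMPORTANCE_LEVELS
    risk_id: Optional[int] = None   # link to a new or existing risk, if one is identified
```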
From this point, step 5 of ATAM [4] can be performed by conducting a brainstorming meeting to identify and prioritize the system's most important quality attribute goals. There must be a way to focus the attention of stakeholders on the architectural solutions that are most critical to the system's success. The quality attribute utility tree in figure 2 can then be successfully described.
Figure 2 – Quality attribute utility tree
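As a rough illustration, such a utility tree could be captured as a nested structure of quality attributes, refinements and rated scenarios; the entries below reuse scenario examples given later in this section, and the importance/difficulty ratings are assumptions.

```python
# Illustrative utility tree: attribute -> refinement -> [(scenario, importance, difficulty)].
# Ratings (H/M/L) are assumed values for demonstration only.
utility_tree = {
    "Performance": {
        "Transaction latency": [
            ("Perform a database transaction in under 100 ms under normal operation", "H", "M"),
        ],
    },
    "Availability": {
        "Hardware failure": [
            ("Half of the servers go down without affecting overall system availability", "H", "H"),
        ],
    },
    "Modifiability": {
        "New releases": [
            ("Integrate a new component implementation in three weeks", "M", "L"),
        ],
    },
}
```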
The scenarios are the most important part of the utility tree. The main reason is that the scenarios help us understand the quality attributes needed and, more importantly, by tying the attributes to real instances in the system, the scenarios help make these goals both concrete and measurable. Scenario specification: first, scenarios should be as specific as possible. Scenarios should cover a range of:
Anticipated uses of the system - what happens under normal use
Anticipated changes - where the system is expected to go and develop
Unanticipated stresses to the system.
Scenarios are basically statements that have a context, a stimulus and a response and describe a situation in the system where the quality attribute manifests itself.
Context - under what circumstances
Stimulus - the trigger in a use case
Response - what the system does.
For example: under normal operation, perform a
database transaction in under 100 milliseconds.
Remote user requests a database report during peak
period and receives it within 5 seconds. Add a new
data server to reduce latency in scenario 1 to 2
seconds within 1 person-week. An intrusion is
detected, and the system cannot lock the doors. The
system activates the electromagnetic fence so that the
intruder cannot escape. For a new release, integrate a
new component implementation in three weeks. Half
of the servers go down during normal operation
without affecting overall system availability. Under
normal operations, queuing orders to a site which is
down, system suspends within 10 minutes of first
failed request and all resources are available while
requests are suspended. Distribution to others is not
impacted. By adding hardware alone, increase the
number of orders processed hourly by a factor of ten
while keeping the worst-case response time below 2
seconds. If we take one of these: An intrusion is
detected, and the system cannot lock the doors. The
system activates the electromagnetic fences so that the
intruder cannot escape:
The stimulus - An intrusion is detected
Context - the system cannot lock the doors
Response - the system activates fences.
Or another one (Half of the servers go down during
normal operation without affecting overall system
availability)
Stimulus - Half the servers go down
Context - during normal operation
Response - overall system availability is not affected.
The next step is prioritizing scenarios by evaluating:
Importance to business goals (High, Medium, Low)
Difficulty or risk in achieving (High, Medium, Low)
The interesting scenarios are the ones with high priority. To help clarify those scenarios and identify risks, the common survey/answers table should be used. The usefulness of the questions table is the enrichment of brainstorming, preventing crucial issues from being forgotten or passing unnoticed, hence the importance of linking each response to a risk.
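A scenario with its context, stimulus and response, together with the importance and difficulty ratings used for prioritization, could be represented as sketched below; the class layout and the sample data are assumptions based on the examples given above.

```python
# Illustrative sketch: scenarios as context/stimulus/response, rated and then filtered.
from dataclasses import dataclass


@dataclass
class EvaluationScenario:
    context: str      # under what circumstances
    stimulus: str     # the trigger
    response: str     # what the system does
    importance: str   # importance to business goals: "High", "Medium", "Low"
    difficulty: str   # difficulty or risk in achieving: "High", "Medium", "Low"


scenarios = [
    EvaluationScenario("peak period", "remote user requests a database report",
                       "report delivered within 5 seconds", "High", "Medium"),
    EvaluationScenario("normal operation", "half of the servers go down",
                       "overall system availability is not affected", "High", "High"),
]

# The interesting scenarios are the high-priority ones.
high_priority = [s for s in scenarios if s.importance == "High"]
```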
The purpose of connecting all the steps and prioritizing needs is that, at the end of filling in all the data in the respective tables, this tool will be able to generate a report with the rankings of the architectural solutions and their components that support the main business goals. If any identified risks have not been addressed by any solution, they will be highlighted as untreated risks. Each question should be as generic as possible, but it should address one software quality topic or sub-characteristic detailed in Figure 3 [10]. This means that, in order to help stakeholders elicit detailed information about the business drivers that should be handled by the system, they will try to answer whether each of those questions affects their expectations about the characteristics of the software project being evaluated, with reference to each of the quality features or attributes described above. If some questions receive responses with a high degree of importance to the strategic business goals and there are still no structures to handle their needs, then new risks classified as untreated should be detailed in the project documentation. This will highlight points of the architecture where there is a failure and improvement is needed.
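Under the hypothetical data model sketched earlier, flagging untreated risks and ranking solutions for the report could be as simple as the following; this is an assumption about how the tool might work, not a prescribed implementation.

```python
# Illustrative sketch: a risk that is not linked to any component addressing it
# is reported as untreated and highlighted in the evaluation report.
def untreated_risks(risks):
    return [r for r in risks if not r.component_ids]


def solution_ranking(solutions, drivers):
    """Rank architecture solutions by how many high-priority business drivers they address."""
    high = {d.id for d in drivers if d.priority == "High"}
    return sorted(solutions,
                  key=lambda s: len(high.intersection(s.driver_ids)),
                  reverse=True)
```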
Figure 3-ISO 25010 - Software product quality model
In the new model for software quality in the draft
ISO/IEC CD 25010 standard, the concept of quality in
use has been broadened to embrace a wider range of
issues than was common in usability. The term
flexibility has been added to refer to the need for
usability in both planned and unplanned contexts of
use and the need for usability for people with special
needs. Flexibility can also include learnability, or how
quickly and effectively a user interface can be learned.
The standard also makes a distinction between
usability from different stakeholder perspectives that
result in different types of measures, including from
the perspective of the end user achieving personal
goals, the perspective of the organization achieving
organizational goals, and the perspective of technical
support achieving maintenance goals.
Functional suitability
The degree to which the software product provides
functions that meet stated and implied needs when the
software is used under specified conditions.
Appropriateness - The degree to which the software
product provides an appropriate set of functions for
specified tasks and user objectives.
Accuracy - The degree to which the software product
provides the right or specified results with the needed
degree of precision.
Functional suitability compliance - The degree to
which the software product adheres to standards,
conventions or regulations in laws and similar
prescriptions relating to functional suitability.
Reliability
The degree to which the software product can
maintain a specified level of performance when used
under specified conditions.
Availability - The degree to which a software
component is available when required for use.
Fault tolerance - The degree to which the software maintains a level of performance in cases of software faults or of infringement of its specified interface.
Recoverability - The degree to which the software
product can re-establish a specified level of
performance and recover the data directly affected in
the case of a failure.
Reliability compliance - The degree to which the
software product adheres to standards, conventions or
regulations relating to reliability.
Performance
The degree to which the software product provides
appropriate performance, relative to the amount of
resources used, under stated conditions.
Time behavior
The degree to which the software product provides
appropriate response and processing times and
throughput rates when performing its function, under
stated conditions.
Resource utilization
The degree to which the software product uses
appropriate amounts and types of resources when the
software performs its function under stated conditions.
Performance efficiency compliance
The degree to which the software product adheres to
standards or conventions relating to performance
efficiency.
Operability
The degree to which the software product can be
understood, learned, used and attractive to the user,
when used under specified conditions.
Appropriateness recognisability - The degree to which
the software product enables users to recognize
whether the software is appropriate for their needs.
Learnability - The degree to which the software
product enables users to learn its application.
Ease of use - The degree to which the software makes
it easy for users to operate and use it.
Helpfulness - The degree to which the software
product provides help when users need assistance.
Attractiveness - Degree to which the software is
attractive to the user.
Technical accessibility - The degree of operability of
the software for users with specified needs.
Operability compliance - The degree to which the
software product adheres to standards, conventions,
style guides or regulations relating to operability.
Security
The protection of items from accidental or malicious access, use, modification, destruction, or disclosure.
Confidentiality - The degree to which the software
provides protection from unauthorized disclosure of
data or information, whether accidental or deliberate.
Integrity - The degree to which the accuracy and
completeness of assets are safeguarded.
Non-repudiation - The degree to which actions or
events can be proven to have taken place, so that the
events or actions cannot be repudiated later.
Accountability - The degree to which the actions of an
entity can be traced uniquely to the entity.
Authenticity - The degree to which the identity of a
subject can be proved to be the one claimed.
Security compliance - The degree to which the
software product adheres to standards, conventions or
regulations relating to security.
Compatibility
The ability of some software components to exchange
information and/or to perform their required functions
while sharing the same hardware or environment.
Replaceability - The degree to which the software
product can be used in place of another specified
software product for the same purpose in the same
environment.
Co-existence - The degree to which the software
product can co-exist with other independent software
in a common environment sharing common resources
without any detrimental impacts.
Interoperability - The degree to which the software
product can be cooperatively operable with one or
more other software products.
Compatibility compliance - The degree to which the
software product adheres to standards, conventions or
regulations relating to compatibility.
Maintainability
The degree to which the software product can be
modified. Modifications may include corrections,
improvements or adaptation of the software to
changes in environment, and in requirements and
functional specifications.
Modularity - The degree to which a system or
computer program is composed of discrete
components such that a change to one component has
minimal impact on other components.
Reusability - The degree to which an asset can be used
in more than one software system, or in building other
assets.
Analyzability - The degree to which the software
product can be diagnosed for deficiencies or causes of
failures in the software, or for the parts to be modified
to be identified.
Changeability - The degree to which the software
product enables a specified modification to be
implemented. The ease with which a software product
can be modified.
Modification stability - The degree to which the
software product can avoid unexpected effects from
modifications of the software.
Testability - The degree to which the software product
enables modified software to be validated.
Maintainability compliance - The degree to which the
software product adheres to standards or conventions
relating to maintainability.
Transferability
The degree to which the software product can be
transferred from one environment to another.
Portability - The ease with which a system or
component can be transferred from one hardware or
software environment to another.
Adaptability - The degree to which the software
product can be adapted for different specified
environments without applying actions or means other
than those provided for this purpose for the software
considered.
Installability - The degree to which the software
product can be successfully installed and uninstalled
in a specified environment.
Transferability compliance - The degree to which the
software product adheres to standards or conventions
relating to portability.
Figure 4-ISO 25010 - Quality model for quality in use
Quality in use [10] is the degree to which a product
used by specific users meets their needs to achieve
specific goals with effectiveness in use, efficiency in
use, flexibility in use, safety and satisfaction in use in
specific contexts of use. Quality in use is a measure of
the quality of the system in a real or simulated
operational environment. It is determined by the
quality of the software, hardware, operating
environment, and the characteristics of the users, tasks
and social environment. All these factors contribute to
quality in use. Quality in use can be used to assess the
quality of software in a specific context of use. The
attributes of quality in use are categorized into three
characteristics:
Usability in use evaluates the degree to which specified users can achieve specified goals with effectiveness in use, efficiency in use and satisfaction in use;
Flexibility in use evaluates the degree to which the product is usable in all potential contexts of use;
Safety deals with acceptable levels of risk of harm to people, business, data, software, property or the environment in the intended contexts.
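For the purpose of seeding the survey of Table 1, the characteristics and sub-characteristics described above can be encoded in a simple lookup structure; the partial sketch below only restates names already listed in this section and is an illustrative assumption, not part of the standard.

```python
# Partial, illustrative encoding of the quality model described above (draft ISO/IEC CD 25010);
# each characteristic maps to sub-characteristics that the survey questions should cover.
QUALITY_MODEL = {
    "Functional suitability": ["Appropriateness", "Accuracy",
                               "Functional suitability compliance"],
    "Reliability": ["Availability", "Fault tolerance", "Recoverability",
                    "Reliability compliance"],
    "Performance": ["Time behavior", "Resource utilization",
                    "Performance efficiency compliance"],
    "Operability": ["Appropriateness recognisability", "Learnability", "Ease of use",
                    "Helpfulness", "Attractiveness", "Technical accessibility",
                    "Operability compliance"],
    "Security": ["Confidentiality", "Integrity", "Non-repudiation", "Accountability",
                 "Authenticity", "Security compliance"],
    "Compatibility": ["Replaceability", "Co-existence", "Interoperability",
                      "Compatibility compliance"],
    "Maintainability": ["Modularity", "Reusability", "Analyzability", "Changeability",
                        "Modification stability", "Testability",
                        "Maintainability compliance"],
    "Transferability": ["Portability", "Adaptability", "Installability",
                        "Transferability compliance"],
    "Quality in use": ["Usability in use", "Flexibility in use", "Safety"],
}
```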
SUMMARY AND OUTLOOK
Comparing the data model and the list of requirements proposed in this article, it was noted that the RequisitePro tool can be efficiently adapted to support an architecture quality review following the steps proposed by the ATAM methodology. In other words, a template for this tool can be developed containing document standards and types of attributes with their respective links of influence and a traceability matrix, with the facility to generate quality attribute scenarios. Since the tool already has the ability to rank each document attribute, the prioritization of business drivers and the prioritization of scenarios can be carried out using the attributes tab:
Priority (High, Medium, Low)
Status (Proposed, Approved, Implemented,
Tested, Discarded)
Difficulty (High, Medium, Low)
Stability (High, Medium, Low)
Origin
Cost (High, Medium, Low)
Defect
Figure 5 – Features from IBM RequisitePro
Other attributes related to the ATAM evaluation, such as risk, quality attribute, relevance, security challenge, and so on, may be created in the project properties (figure 5). In this way, it is possible not only to document the requirements and architecture of the software, but also to make a qualitative assessment both of the architecture initially proposed and of future developments of the software in question, and to support and connect all evaluation steps. One advantage is that this is done in the same tool, already known and used worldwide.
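The additional ATAM-related attributes suggested above can be pictured as an extension of the attribute set listed in figure 5; the sketch below is only an assumed, tool-independent representation of such a template and does not reflect RequisitePro's actual configuration format.

```python
# Assumed, tool-independent sketch of an ATAM-oriented attribute template.
# Attribute names follow the text; value sets marked "assumed" are illustrative.
ATAM_TEMPLATE_ATTRIBUTES = {
    "Priority":          ["High", "Medium", "Low"],
    "Status":            ["Proposed", "Approved", "Implemented", "Tested", "Discarded"],
    "Difficulty":        ["High", "Medium", "Low"],
    "Stability":         ["High", "Medium", "Low"],
    "Cost":              ["High", "Medium", "Low"],
    "Risk":              ["Untreated", "Mitigated", "Accepted"],           # assumed values
    "Quality attribute": ["Reliability", "Performance", "Security",
                          "Maintainability", "Transferability"],           # subset shown
    "Relevance":         ["High", "Medium", "Low"],
}
```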
The RequisitePro tool integrates with the Microsoft Word text editor in any version. The matrix of attributes and traceability is saved in an MDB (Microsoft Access) database file that can be versioned along with the project, providing a form of version control. SQL Server can also be used to save the project and its attributes and traceability. There is also the option to link the requirements artifacts and architectural design activities with a project on the Microsoft Project timeline. The architectural views are also easily integrated with Rational Rose for designing UML diagrams and use cases. Of course, there are disadvantages too. Companies that do not use this tool to document their projects would need to invest in a license with a manufacturer's suggested price of US$ 2,740 per seat, and would also face the disadvantage of having to re-type the requirements already documented in another management tool. Another disadvantage is that the tool is not intended to be a web application, so the evaluation and meetings are only effective if they are held face to face. Files cannot be renamed after they are created, so the naming of artifacts must be standardized beforehand. There is no easy way to support worldwide collaborative work using clouds like "GoogleDocs". When a new document is added to the project, all project users must manually update the project to include the new document in their views. Each project document can be edited by one user at a time, while other users can open it in read-only mode. Merging concurrent edits in the same document does not always work if both users type in the same sections of the document.
The requirements proposed in this article, the steps of the methodology, and the prioritization of attributes and their subsequent ranking could be implemented in the Rational RequisitePro tool by elaborating a template such as the one shown in figure 6, so that users, stakeholders and developers can use a relatively simple and common platform to conduct the evaluation of the architecture of a software project after the requirements and proposed software architecture design step. The traceability matrix is customizable in RequisitePro. This means that it is possible to create many different views with different hierarchies, so the relationships described in the proposed data model can be implemented easily. In other words, it is possible to start a new view from the risks and detail down to the components that address each risk. In this manner, risks that remain to be addressed are made evident. Another view can start from the business drivers and go down to the scenarios and the architecture view. For each change, its impact on the system is automatically pointed out. And so, before coding and implementation start, it is possible to evaluate that the final product will not only meet the requirements but also increase stakeholder satisfaction, because it was possible to evaluate the expected quality and ensure traceability between the defined points and the delivered solution.
REFERENCES
[1] Clements, Paul, Rick Kazman, and Mark Klein.
“Evaluating software architectures”. Reading:
Addison-Wesley, 2012.
[2] Thiel, Steffen, Andreas Hein, and Heiner
Engelhardt. “Tool Support for Scenario-Based
Architecture Evaluation”. STRAW. p. 41-45., 2003.
[Online]. Available:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.
1.1.145.6491&rep=rep1&type=pdf#page=47
[3] Thiel, Steffen. “A framework to improve the
architecture quality of software intensive systems.”
Diss. Universität Duisburg-Essen, Fakultät für
Wirtschaftswissenschaften, 2005. [Online]. Available:
duepublico.uni-duisburg-
essen.de/servlets/DerivateServlet/Derivate-
13645/Dissertation_Thiel.pdf
[4] Kazman, Rick, Mark Klein, and Paul Clements,
"ATAM: Method for Architecture Evaluation,"
Software Engineering Institute, Carnegie Mellon
University, Pittsburgh, Pennsylvania, Technical Report
CMU/SEI-2000-TR-004, 2000. [Online].
Available: resources.sei.cmu.edu/library/asset-
view.cfm?AssetID=5177
[5] Perry, D. E.; Wolf, A. L. "Foundations for the
study of software architecture". ACM SIGSOFT
Software Engineering Notes, 1992
[6] SARA Work Group "Software Architecture
Review and Assessment (SARA) Report ". 2012
Available
kruchten.com/philippe/architecture/SARAv1.pdf
[7] Alan M. Davis, “Software Requirements –
Analysis and Specification,” Prentice Hall, 1990.
[8] Arda Goknil, Ivan Kurtev, Klaas Van Den Berg,
Generation and validation of traces between
requirements and architecture based on formal
trace semantics, Journal of Systems and Software,
Volume 88, February 2014, Pages 112-137,
Available http://dx.doi.org/10.1016/j.jss.2013.10.006
[9] Stefan Winkler and Jens Pilgrim. A survey of
traceability in requirements engineering and model-
driven development. Softw. Syst. Model. 9, 4 2010.,
Pages 529-565.
Available http://dx.doi.org/10.1007/s10270-009-0145-0
[10] ISO/IEC CD 25010. Systems and software engineering—Software product Quality Requirements and Evaluation (SQuaRE)—Software product quality and system quality in use models. 2009.
Available:
sa.inceptum.eu/sites/sa.inceptum.eu/files/Content/ISO_25010.pdf
Figure 6 – New Template - View for IBM RequisitePro
Table 1 - Software Architecture - Quality Attributes Survey
Functional suitability - Does the presented software architecture provide functions that meet stated and implied needs?
□ Yes – □ No (default answer form)
Level of Importance
□ Low □ Average □ High □ Very important
Risk □ No □ Yes □ New Risk
New Risk Description [ ]
o Appropriateness - Does the presented software architecture provide an appropriate set of functions for all
specified user objectives?
o Accuracy - Does the presented software architecture provide the expected results with the needed precision?
o Functional suitability compliance - Does the presented software architecture adhere to standards, conventions
or regulations in laws relating to Functional Suitability?
Reliability - Does the presented software architecture convince us that it can maintain a specified level of performance when used under specified conditions? Or - where are the weaknesses?
o Availability - Is the presented software architecture available when required for use?
o Fault tolerance - Does the software maintain a level of performance in cases of software faults or of infringement of its specified interface?
o Recoverability - Can the software re-establish a specified level of performance and recover the data directly affected in the case of a failure?
o Reliability compliance - Does the software adhere to standards, conventions or regulations relating to reliability?
Performance - Does the presented software architecture provide appropriate performance, under stated conditions?
o Time behavior - Is the proposed system designed to provide appropriate response and processing times and throughput rates when performing its function, under stated conditions?
o Resource utilization - How well does the software use appropriate amounts and types of resources when performing its function under stated conditions?
o Performance efficiency compliance - Is the proposed system designed to adhere to standards or conventions relating to performance efficiency?
Operability - How well can the proposed software design be understood, learned, used?
o Appropriateness recognisability - How well does the software enable users to recognize whether it is appropriate for their needs?
o Learnability - How well does the proposed system design enable users to learn its application?
o Ease of use - How well does the proposed system design make it easy for users to operate and use it?
o Helpfulness - How well does the proposed system design provide help when users need assistance?
o Attractiveness - How attractive is or will the software product be to the user?
o Technical accessibility - Is the degree of operability appropriate for users with specified needs?
o Operability compliance - How well does the software adhere to standards, conventions, style guides or regulations relating to operability?
Security - How well protected are items from accidental or malicious access, modification, destruction or disclosure?
o Confidentiality - How well does the software provide protection from unauthorized disclosure of data or information, whether accidental or deliberate?
o Integrity - How well are the accuracy and completeness of assets safeguarded?
o Non-repudiation - How well can actions or events be proven to have taken place, so that they cannot be repudiated later?
o Accountability - How well can the actions of an entity be traced uniquely to the entity?
o Authenticity - How well can the identity of a subject be proved to be the one claimed?
o Security compliance - How well does the software adhere to standards or regulations relating to security?
Compatibility - Are some software components able to exchange information or to perform their required functions while
sharing the same hardware or environment?
o Replaceability - Is the proposed system designed so that it can be used in place of another specified software product for the same purpose in the same environment?
o Co-existence - How well can the software co-exist with other independent software in a common environment, sharing common resources without any detrimental impacts?
o Interoperability - How well can the software be cooperatively operable with one or more other software products?
o Compatibility compliance - Is the proposed system designed to adhere to standards, conventions or regulations relating to compatibility?
Maintainability - How complicated, time-consuming, expensive and difficult is it to perform maintenance, alterations and development of new features in this software product? Modifications may include corrections, improvements or adaptation of the software to changes in the environment, and in the requirements and functional specifications.
o Modularity – Is the proposed system designed to be built of discrete components such that a change to one component has minimal impact on other components?
o Reusability - Is the proposed system designed so that features can be used in more than one software system, or in building other assets?
o Analyzability - How well can the proposed system design be diagnosed for deficiencies or causes of failures, or for the parts to be modified to be identified?
o Changeability - How well does the proposed system design enable a specified modification to be implemented? Will it be easy to modify parts of the implemented software?
o Modification stability - How well can the proposed system design avoid unexpected effects from modifications of the software?
o Testability - How well does the proposed system design enable modified software to be validated?
o Maintainability compliance - How well does the proposed system design adhere to standards or conventions relating to maintainability?
Transferability - How complicated, time-consuming, expensive and difficult is it to transfer the software from one environment to another and set it up to be fully operational?
o Portability – Can the whole system easily be transferred from one hardware or software environment to another?
o Adaptability – Can the proposed system design be adapted for different specified environments without applying actions or means other than those provided for this purpose for the software considered?
o Installability - Can the proposed system be successfully installed and uninstalled in a specified environment?
o Transferability compliance - Does the software adhere to standards or conventions relating to portability, like older operating system versions, or different languages configured on the host operating system?
Quality in use - Can the proposed system design be used by users and meet their needs to achieve specific goals with effectiveness, efficiency, flexibility, safety and satisfaction in use?
Usability in use - How well does the proposed system design enable specified users to achieve specified goals with effectiveness in use, efficiency in use and satisfaction in use?
Flexibility in use - How well is the proposed system design usable in all potential contexts of use?
Safety - How well does the proposed system design deal with acceptable levels of risk of harm to people, business, data, software, property or the environment in the intended contexts?
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Strongerpanagenda
 
Kuma Meshes Part I - The basics - A tutorial
Kuma Meshes Part I - The basics - A tutorialKuma Meshes Part I - The basics - A tutorial
Kuma Meshes Part I - The basics - A tutorialJoão Esperancinha
 
Assure Ecommerce and Retail Operations Uptime with ThousandEyes
Assure Ecommerce and Retail Operations Uptime with ThousandEyesAssure Ecommerce and Retail Operations Uptime with ThousandEyes
Assure Ecommerce and Retail Operations Uptime with ThousandEyesThousandEyes
 
Accelerating Enterprise Software Engineering with Platformless
Accelerating Enterprise Software Engineering with PlatformlessAccelerating Enterprise Software Engineering with Platformless
Accelerating Enterprise Software Engineering with PlatformlessWSO2
 
Potential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsPotential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsRavi Sanghani
 
Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)
Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)
Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)Mark Simos
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesThousandEyes
 
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...panagenda
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...itnewsafrica
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality AssuranceInflectra
 
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24Mark Goldstein
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsNathaniel Shimoni
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPathCommunity
 
Microservices, Docker deploy and Microservices source code in C#
Microservices, Docker deploy and Microservices source code in C#Microservices, Docker deploy and Microservices source code in C#
Microservices, Docker deploy and Microservices source code in C#Karmanjay Verma
 
Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...
Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...
Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...Nikki Chapple
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentPim van der Noll
 
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical InfrastructureVarsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructureitnewsafrica
 

Kürzlich hochgeladen (20)

A Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software DevelopersA Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software Developers
 
A Glance At The Java Performance Toolbox
A Glance At The Java Performance ToolboxA Glance At The Java Performance Toolbox
A Glance At The Java Performance Toolbox
 
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
 
Kuma Meshes Part I - The basics - A tutorial
Kuma Meshes Part I - The basics - A tutorialKuma Meshes Part I - The basics - A tutorial
Kuma Meshes Part I - The basics - A tutorial
 
Assure Ecommerce and Retail Operations Uptime with ThousandEyes
Assure Ecommerce and Retail Operations Uptime with ThousandEyesAssure Ecommerce and Retail Operations Uptime with ThousandEyes
Assure Ecommerce and Retail Operations Uptime with ThousandEyes
 
Accelerating Enterprise Software Engineering with Platformless
Accelerating Enterprise Software Engineering with PlatformlessAccelerating Enterprise Software Engineering with Platformless
Accelerating Enterprise Software Engineering with Platformless
 
Potential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and InsightsPotential of AI (Generative AI) in Business: Learnings and Insights
Potential of AI (Generative AI) in Business: Learnings and Insights
 
Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)
Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)
Tampa BSides - The No BS SOC (slides from April 6, 2024 talk)
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
 
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
 
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directions
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to Hero
 
Microservices, Docker deploy and Microservices source code in C#
Microservices, Docker deploy and Microservices source code in C#Microservices, Docker deploy and Microservices source code in C#
Microservices, Docker deploy and Microservices source code in C#
 
Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...
Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...
Microsoft 365 Copilot: How to boost your productivity with AI – Part two: Dat...
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
 
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical InfrastructureVarsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
 

Requirements for quality evaluation of software architecture

Joao C. Albuquerque, Hochschule Furtwangen University, Fakultät Informatik, Robert-Gerwig-Platz 1, 78120 Furtwangen, Germany. E-mail: joao.albuquerque@hs-furtwangen.de

The motivation to define and describe requirements for a tool that supports Qualitative Evaluation of Software Architecture is that conflicts often occur in the delivery stage of new systems. Very often a newly developed system meets the documented requirements at delivery but does not meet the expectations of customers in terms of functionality, reliability, usability, efficiency, maintainability, and portability. Too often, systems are released with performance issues, security risks, and availability problems as a result of inappropriate architectural decisions. Architecture evaluation is an early risk reduction method [2]: architectures are defined early in the project life cycle, but the resulting flaws are often discovered much later. The most significant benefit of qualitative evaluation is to reassure stakeholders that the candidate architecture is capable of supporting the current and future business objectives; specifically, that it can meet its functional and non-functional requirements. The quality attributes of a system, such as performance, availability, extensibility, and security, are a direct result of its architecture; therefore, quality cannot easily be added to a system late in the project.
A qualitative evaluation of the architecture, while it is still a candidate specification, can reduce project risk greatly. What is the importance of evaluating the architecture in the early stages of a software project? The architecture evaluation will not produce answers such as yes or no. It will point out risks and also help to map out which components support the defined customer needs, limitations or features. The sooner the risks or weaknesses are identified, the lower the cost to re-evaluate the solution and change the project in part or in whole.

There are currently few well-defined and mature methods to qualitatively evaluate the architecture of a software design:
SAAM – Software Architecture Analysis Method; target: changeability and extensibility.
ALMA – Architecture-Level Modifiability Analysis; target: changeability.
PASA – Performance Assessment of Software Architectures; target: efficiency and performance.
ATAM – Architecture Tradeoff Analysis Method; target: system-relevant quality attributes.

One reason for choosing the Architecture Tradeoff Analysis Method (ATAM) [4] as the methodology for evaluating software quality is that it is the one that best relates the architecture design to its quality attributes. It is a method for evaluating software architectures relative to quality attribute goals. ATAM evaluations expose architectural risks that potentially inhibit the achievement of an organization's business goals. The ATAM gets its name because it not only reveals how well an architecture satisfies particular quality goals, but also provides insight into how those quality goals interact with each other and how they trade off against each other. The ATAM is the leading method in the area of software architecture evaluation. This also means that the system proposed in this paper serves to support quality assessment using ATAM. In other words, it supports and documents the nine steps:
1. Present the ATAM.
2. Present business drivers.
3. Present the architecture, focusing on how it addresses the business drivers.
4. Identify architectural approaches. Architectural approaches are identified by the architect, but are not analyzed.
5. Generate the quality attribute utility tree. The quality factors are elicited, specified down to the level of scenarios, and prioritized.
6. Analyze architectural approaches. Based on the high-priority factors identified in step 5, the architectural approaches that address those factors are elicited and analyzed. During this step, architectural risks, sensitivity points, and tradeoff points are identified.
7. Conduct a brainstorming meeting to prioritize scenarios.
8. Analyze architectural approaches.
9. Present results based on the information collected in approaches, scenarios, attribute-specific questions, the utility tree, and risks.

Software reliability evaluation mainly includes quantitative evaluation and qualitative evaluation. Both can be summarized as follows: quantitative evaluation is more accurate because it is based on metrics and mathematical formulas that can numerically rank the degree of complexity, interdependence, coupling, among others. But there is a condition for it to be applied: the software code must already be implemented in order to be evaluated. In other words, it is an accurate method of assessment, but it can be used only after implementation.
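To make the connection between these steps and the proposed tool more concrete, the following minimal sketch (in Python, with illustrative names that are not part of ATAM or of any existing tool) shows one way an evaluation session could record the nine steps together with the risks, sensitivity points and tradeoff points produced in steps 6 to 8.

```python
# Hypothetical sketch: how a supporting tool might record the nine ATAM steps
# and the analysis outputs (risks, sensitivity and tradeoff points) named above.
# Class and field names are illustrative, not part of ATAM itself.
from dataclasses import dataclass, field
from typing import List

ATAM_STEPS = [
    "Present the ATAM",
    "Present business drivers",
    "Present architecture",
    "Identify architectural approaches",
    "Generate quality attribute utility tree",
    "Analyze architectural approaches",
    "Brainstorm and prioritize scenarios",
    "Analyze architectural approaches (revisited)",
    "Present results",
]

@dataclass
class EvaluationSession:
    system_name: str
    completed_steps: List[str] = field(default_factory=list)
    risks: List[str] = field(default_factory=list)
    sensitivity_points: List[str] = field(default_factory=list)
    tradeoff_points: List[str] = field(default_factory=list)

    def complete_step(self, number: int, notes: str = "") -> None:
        """Mark an ATAM step (1-9) as done and keep the facilitator's notes."""
        self.completed_steps.append(
            f"{number}. {ATAM_STEPS[number - 1]} {notes}".strip())

session = EvaluationSession("Example system")
session.complete_step(1)
session.complete_step(2, notes="three business drivers captured")
session.risks.append("Single database server limits the availability goal")
```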
On the other hand, the qualitative assessment does not provide precise answers, but it can be performed early in the whole process (after requirements and architecture definition and before implementation, tests and delivery), before many mistakes and omissions have consumed resources, time and customer patience unnecessarily. Risks can be identified beforehand and avoided or mitigated. Moreover, the process involves a high degree of communication between the system's stakeholders, sponsors and development teams. This undoubtedly refines the quality requirements, which can then be set in the architecture of the software project, increasing the degree of satisfaction with the delivered product. These features, advantages and challenges lead to the interest in researching and developing standards and requirements to facilitate its implementation.

Many problems in software systems are directly related to the specification and design of quality attributes such as modifiability or performance, to name just a few. Quality attributes have a major influence on customer and end-user acceptance. Addressing them in a systematic and appropriate way is an important but challenging endeavor. Quality attributes are difficult to handle because they need to be treated differently and typically affect not only parts of a system but the system as a whole. For example, engineers cannot constrain security or performance attributes to one single place in the system. Such attempts turn out to be impossible in most contexts, because the concerns are cross-cutting and often even invasive, that is, they require software engineers to inject design and code into existing components. In other words, most quality attributes are systemic and need global and strategic treatment.

The benefits of performing a quality evaluation are: improved software architecture documentation, a documented basis for architectural decisions, clarified quality attribute requirements, risks pinpointed at an early stage, when the budget for implementation has not yet been exhausted, and improved communication between stakeholders, avoiding misunderstandings and omissions.
II. PREVIOUS RESEARCH ON THIS TOPIC

Some studies have already been made towards a tool for the qualitative evaluation of software architecture, such as [2] and [3], papers that also refer to the ATAM method [1]. On the software market there are several products for mapping the requirements and components of a project. There are good solutions that document and address traceability between customer requirements, architecture views, functional modules and implementation units. In some documentation tools, architectural attributes are referred to as functionality limitations or obligations, such as maximum response time, maximum number of concurrent users, or growth rate of the stored data, but these are just limitations.

Another tool researched was the IBM Rational Quality Manager. This is a web application that enables tracking aspects of quality assurance. The central artifact in this tool is a test plan that contains information such as goals, schedules and milestones, as well as links to associated test cases, requirements and development items. Rational Quality Manager includes manual test authoring and execution, test lab management, test execution, reporting and defect management. It was designed as a replacement for IBM Rational Manual Tester, IBM Rational ClearQuest Test Manager, and Rational TestManager. An overview of Rational Quality Manager functions: the test plan defines the objectives and scope for the test effort and helps teams answer the question: are we ready to release a new software version? The advantages and features are: it can be used to define business and test objectives, establish a review and approval process, manage project requirements and establish the interdependencies between the two, define quality goals, and create and manage test cases; a test case can include links to development items and requirements. The relationships between test artifacts, requirements, and development artifacts can be traced in the traceability view. The most frequently reported disadvantages are: there is no reference manual that describes the functionality; the standard reports in the reporting section are not customizable and have a plain appearance; and there is no standard report showing requirements, test cases and execution results on the same sheet. In summary, despite the name suggesting a quality manager, it is in fact a test manager, which is an important activity but does not address the issue of evaluating architectural software quality.

Considerable research has been devoted to relating requirements with source code. Less attention has been paid to relating requirements with architecture quality. Traces between requirements and architecture may be manually assigned and may be incomplete. The impact of requirements changes on other requirements, design elements and source code can be traced to determine the parts of the software to be changed. In this research, features were identified in tools traditionally used for requirements traceability and architecture documentation whose newer versions already reference explicitly all stakeholder requests and their attributes, which already goes a long way towards not losing customer focus. These characteristics have been described in some recent studies [8] referencing market-leading solutions such as IBM RequisitePro. Traceability benefits are: prioritizing requirements, estimating change impact, proving system understanding, supporting design decisions, validation, and more [9].
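Since traces between requirements and architecture are often assigned manually and may be incomplete, a tool in this space essentially maintains a directed graph of trace links and answers change-impact queries over it. The sketch below illustrates that idea under the assumption of plain string identifiers; it is not the data model of RequisitePro, Rational Quality Manager or any other product mentioned here.

```python
# Minimal sketch of requirement-to-architecture-to-implementation trace links,
# assuming simple string identifiers; not tied to any specific tool.
from collections import defaultdict

class TraceMatrix:
    def __init__(self):
        # maps a source artifact id to the set of artifact ids it traces to
        self._links = defaultdict(set)

    def link(self, source: str, target: str) -> None:
        self._links[source].add(target)

    def impacted_by(self, artifact: str) -> set:
        """Transitively collect everything reachable from a changed artifact."""
        seen, stack = set(), [artifact]
        while stack:
            current = stack.pop()
            for nxt in self._links[current]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

matrix = TraceMatrix()
matrix.link("REQ-12: 5 s report response", "ARCH-3: reporting service")
matrix.link("ARCH-3: reporting service", "COMP-7: report generator module")
print(matrix.impacted_by("REQ-12: 5 s report response"))
```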
But the focus of these solutions is not to evaluate whether the architecture is good enough to meet the business goals. Yes, there are other market leaders with consultancy offerings that propose this kind of qualitative evaluation, but in some cases the solution behind the consulting is migration to complete proprietary solutions. In other words, they evaluate the solutions made in house, point out flaws and weaknesses, and recommend throwing away years of experience, replacing it with an in-a-box ERP solution that ensures architectural quality features. But the price of these solutions may not be within reach of most small and medium-sized companies, or they may create technological dependence through monthly subscription agreements. Migrating to these new solutions involves training all employees, an initial loss of productivity, and data migration that cannot always be done perfectly, among other problems. In our literature research, other tools [3], [4] showed an efficient qualitative evaluation of architecture. However, the focus of the tool proposed in this paper is that the qualitative evaluation of the architecture should be traceable to the respective requirements and implementation units. In this study we aim to assess how far traditional tools for mapping requirements and architecture design can be exploited to allow the adoption of features for evaluating the quality of software architecture. Thus the proposed solution can be compared with established products on the market, such as the IBM Rational tooling described in Figure 3, and so we will have a feasibility study of our proposal. After all, a new software proposal only makes sense if this gap in the market really still exists (the loose coupling between architecture design approaches and system quality attributes is seen as a major weak point) and if solutions are able to identify the essential requirements for creating a sustainable architecture that permits achieving the business goals, as pointed out by previous studies [3].
III. REQUIREMENTS PROPOSAL

There is a big risk of projects that are designed and meet all documented requirements but still contain risks and do not meet the expectations of their sponsors. How can we prove to the sponsors that the solution will meet characteristics such as functionality, suitability, security, interoperability, reliability, fault tolerance, recoverability, usability, understandability, learnability, operability, attractiveness, efficiency, maintainability, changeability, stability, portability and co-existence? Given this gap, this paper attempts to describe a tool that can assist in this evaluation of the software architecture and map the traceability between these architectural attributes and the components of the software concerned. The identified risks should be associated with the components that address the problem. Thus corrective measures can be applied to quality assessment and improvement points with less effort and greater assertiveness.

The features: initially, the tool should be designed to be used during a presentation to stakeholders. After presenting the concept of ATAM to the stakeholders, some business drivers are raised and may already be entered in the system to continue the survey of the major business drivers. Then the tool should open a screen to fill in the summarized description of the major business drivers. Third step: the tool should then open a screen to fill in the summarized description of the architecture solutions. This description should be focused on how the architecture addresses the business drivers. This step will be best understood by everyone if it is possible to include not only the textual description but also images, charts, or diagrams regarding the architecture, associated with the text description. Each entry in this step should be linked with one or more entries from the step above.

The architecture is usually described using the logical view, which is concerned with the functionality that the system provides to end-users. UML diagrams used to represent the logical view include the class diagram, communication diagram, and sequence diagram. The development view illustrates a system from a programmer's perspective and is concerned with how the solution is modularized and with software management. This view is also known as the implementation view. It uses the UML component diagram to describe system components; UML diagrams used to represent the development view also include the package diagram. The process view deals with the dynamic aspects of the system, explains the system processes and how they communicate, and focuses on the runtime behavior of the system. The process view addresses concurrency, distribution, integrators, performance, scalability, etc. UML diagrams used to represent the process view include the activity diagram. The physical view depicts the system from a system engineer's point of view. It is concerned with the topology of software components on the physical layer, as well as the physical connections between these components. This view is also known as the deployment view. UML diagrams used to represent the physical view include the deployment diagram. Scenarios: the description of the architecture may be illustrated using a small set of use cases, or scenarios, which become a fifth view. The scenarios describe sequences of interactions between objects and between processes. They are used to identify architectural elements and to illustrate and validate the architecture design. They also serve as a starting point for tests of an architecture prototype, and this view is also called the "use case view".
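As a summary of the view descriptions above, the following sketch captures the view-to-diagram mapping as plain data; a tool could, for instance, use such a table to pre-populate its architecture-description screen (a hypothetical use, not a feature of any existing product).

```python
# Summary of the view/diagram mapping described above, e.g. to pre-populate
# the tool's architecture-description screen (a hypothetical use).
ARCHITECTURE_VIEWS = {
    "Logical view":     {"concern": "functionality provided to end-users",
                         "uml": ["Class", "Communication", "Sequence"]},
    "Development view": {"concern": "modularization and software management",
                         "uml": ["Component", "Package"]},
    "Process view":     {"concern": "runtime behavior, concurrency, performance",
                         "uml": ["Activity"]},
    "Physical view":    {"concern": "topology and physical connections",
                         "uml": ["Deployment"]},
    "Scenarios":        {"concern": "interactions that validate the design",
                         "uml": ["Use case"]},
}
```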
The proposed logical data model for this tool is described in Figure 1.

Requirements typically define what a system should do, but rarely define how it should be done: how fast, how secure, how easy to use, how modifiable, how portable, and so on. To help participants elicit the most important features for their business, a database of pre-defined questions that refer to issues of quality, features and limitations should be used, as described in Table 1 (Software Architecture - Quality Attributes Survey). For each question there are answer options according to the degree of importance of this feature for the business. The choice of these answers will help identify project risks and, consequently, will facilitate the stakeholders' understanding of the relationship of each risk with the software components dealing with that functionality. To accelerate the meetings, the data tables containing the main software components designed in the architecture, or the implementation units of the application, can be pre-registered by the development team. In the same way, the tables of questions and answers should already contain the most common business needs, such as: how long it should take for a new feature to be ready for market, restrictions on response time, volume of data stored in the system, time required for system changes, ease of users' learning, number of concurrent users, availability (24h x 7d or 8h x 5d), and so on. From this point, step 5 of ATAM [4] can be performed by conducting the brainstorming meeting to identify and prioritize the system's most important quality attribute goals. There must be a way to focus the attention of stakeholders on the architectural solutions that are most critical to the system's success. The quality attribute utility tree can then be successfully described, as shown in Figure 2.

Figure 1 – Logical Data Model

Figure 2 – Quality attribute utility tree
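Because Figure 1 is only referenced here by its caption, the following is an assumed reading of the logical data model, reconstructed from the surrounding description (business drivers, architecture descriptions, pre-registered components, survey questions and risks); entity and field names are illustrative only.

```python
# Assumed reading of the logical data model in Figure 1, reconstructed from the
# surrounding description; entity and field names are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Component:                        # implementation unit pre-registered by the dev team
    name: str

@dataclass
class BusinessDriver:
    description: str
    priority: str = "Medium"            # High / Medium / Low

@dataclass
class ArchitectureDescription:
    summary: str
    drivers: List[BusinessDriver] = field(default_factory=list)
    components: List[Component] = field(default_factory=list)

@dataclass
class Risk:
    description: str
    addressed_by: Optional[Component] = None   # None means "untreated"

@dataclass
class SurveyQuestion:                   # one row of Table 1
    text: str
    quality_attribute: str
    importance: Optional[str] = None    # Low / Average / High / Very important
    raised_risk: Optional[Risk] = None
```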
The scenarios are the most important part of the utility tree. The main reason is that the scenarios help us understand the quality attributes needed and, more importantly, by tying the attributes to real instances in the system, the scenarios help make these goals both concrete and measurable.

Scenario specification: scenarios should be as specific as possible and should cover a range of: anticipated uses of the system (what happens under normal use), anticipated changes (where the system is expected to go and develop), and unanticipated stresses to the system. Scenarios are basically statements that have a context, a stimulus and a response, and describe a situation in the system where the quality attribute manifests itself: context – under what circumstances; stimulus – the trigger in a use case; response – what the system does. For example: under normal operation, perform a database transaction in under 100 milliseconds. A remote user requests a database report during a peak period and receives it within 5 seconds. Add a new data server to reduce the latency in scenario 1 to 2 seconds within 1 person-week. An intrusion is detected and the system cannot lock the doors; the system activates the electromagnetic fence so that the intruder cannot escape. For a new release, integrate a new component implementation in three weeks. Half of the servers go down during normal operation without affecting overall system availability. Under normal operations, when queuing orders to a site which is down, the system suspends within 10 minutes of the first failed request and all resources remain available while requests are suspended; distribution to others is not impacted. By adding hardware alone, increase the number of orders processed hourly by a factor of ten while keeping the worst-case response time below 2 seconds.

If we take one of these (an intrusion is detected and the system cannot lock the doors; the system activates the electromagnetic fences so that the intruder cannot escape): the stimulus – an intrusion is detected; the context – the system cannot lock the doors; the response – the system activates the fences. Or another one (half of the servers go down during normal operation without affecting overall system availability): stimulus – half the servers go down; context – during normal operation; response – overall system availability is not affected.

The next step is prioritizing scenarios by evaluating importance to business goals (High, Medium, Low) and difficulty or risk in achieving them (High, Medium, Low). The interesting scenarios are the ones with high priority. To help clarify those scenarios and identify risks, the common survey/answers table should be used. The usefulness of the questions table is the enrichment of brainstorming, preventing crucial issues from being forgotten or passing unnoticed, hence the importance of linking each response to a risk. The purpose of connecting all the steps and prioritizing the needs is that, at the end of filling in all the data in the respective tables, the tool will be able to generate a report with the rankings of the architectural solutions and their components that support the main business goals. And if any identified risk has not been addressed by any solution, it will be highlighted as an untreated risk. Each question should be as generic as possible, but it should address one software quality topic or sub-characteristic, as detailed in Figure 3 [10].
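The scenario form and the High/Medium/Low prioritization described above can be expressed compactly as follows; the numeric weights used for ranking are an assumption made for illustration, not part of ATAM.

```python
# Sketch of the scenario form and prioritization described above. The numeric
# weights for High/Medium/Low are an assumption used only for sorting.
from dataclasses import dataclass

WEIGHT = {"High": 3, "Medium": 2, "Low": 1}

@dataclass
class Scenario:
    context: str
    stimulus: str
    response: str
    importance: str = "Medium"   # importance to business goals
    difficulty: str = "Medium"   # difficulty or risk in achieving

    def rank(self) -> int:
        # scenarios that are both important and hard to achieve come first
        return WEIGHT[self.importance] * WEIGHT[self.difficulty]

scenarios = [
    Scenario("normal operation", "perform a database transaction",
             "completes in under 100 ms", importance="High", difficulty="Medium"),
    Scenario("peak period", "remote user requests a database report",
             "report delivered within 5 seconds", importance="High", difficulty="High"),
]
for s in sorted(scenarios, key=Scenario.rank, reverse=True):
    print(s.rank(), s.stimulus)
```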
This means that, in order to help stakeholders elicit detailed information about the business drivers that should be addressed by the system, they try to answer whether each of those questions impacts their expectations about the characteristics of the software project being evaluated, referring to each of the quality features or attributes described above. If some questions receive responses with a high degree of importance to the strategic business goals, and there is still no structure to handle those needs, then new risks classified as untreated should be detailed in the project documentation. This will highlight the points of the architecture where there is a failure that needs to be improved.

Figure 3 – ISO 25010 – Software product quality model

In the new model for software quality in the draft ISO/IEC CD 25010 standard, the concept of quality in use has been broadened to embrace a wider range of issues than was common in usability. The term flexibility has been added to refer to the need for usability in both planned and unplanned contexts of use, and the need for usability for people with special needs.
Flexibility can also include learnability, or how quickly and effectively a user interface can be learned. The standard also makes a distinction between usability from different stakeholder perspectives, which result in different types of measures: the perspective of the end user achieving personal goals, the perspective of the organization achieving organizational goals, and the perspective of technical support achieving maintenance goals.

Functional suitability - The degree to which the software product provides functions that meet stated and implied needs when the software is used under specified conditions.
Appropriateness - The degree to which the software product provides an appropriate set of functions for specified tasks and user objectives.
Accuracy - The degree to which the software product provides the right or specified results with the needed degree of precision.
Functional suitability compliance - The degree to which the software product adheres to standards, conventions or regulations in laws and similar prescriptions relating to functional suitability.

Reliability - The degree to which the software product can maintain a specified level of performance when used under specified conditions.
Availability - The degree to which a software component is available when required for use.
Fault tolerance - The degree to which the software maintains a level of performance in cases of software faults or of infringement of its specified interface.
Recoverability - The degree to which the software product can re-establish a specified level of performance and recover the data directly affected in the case of a failure.
Reliability compliance - The degree to which the software product adheres to standards, conventions or regulations relating to reliability.

Performance - The degree to which the software product provides appropriate performance, relative to the amount of resources used, under stated conditions.
Time behavior - The degree to which the software product provides appropriate response and processing times and throughput rates when performing its function, under stated conditions.
Resource utilization - The degree to which the software product uses appropriate amounts and types of resources when the software performs its function under stated conditions.
Performance efficiency compliance - The degree to which the software product adheres to standards or conventions relating to performance efficiency.

Operability - The degree to which the software product can be understood, learned, used and be attractive to the user, when used under specified conditions.
Appropriateness recognisability - The degree to which the software product enables users to recognize whether the software is appropriate for their needs.
Learnability - The degree to which the software product enables users to learn its application.
Ease of use - The degree to which the software makes it easy for users to operate and use it.
Helpfulness - The degree to which the software product provides help when users need assistance.
Attractiveness - The degree to which the software is attractive to the user.
Technical accessibility - The degree of operability of the software for users with specified needs.
Operability compliance - The degree to which the software product adheres to standards, conventions, style guides or regulations relating to operability.
Security - The protection of items from accidental or malicious access, use, modification, destruction, or disclosure.
Confidentiality - The degree to which the software provides protection from unauthorized disclosure of data or information, whether accidental or deliberate.
Integrity - The degree to which the accuracy and completeness of assets are safeguarded.
Non-repudiation - The degree to which actions or events can be proven to have taken place, so that the events or actions cannot be repudiated later.
Accountability - The degree to which the actions of an entity can be traced uniquely to the entity.
Authenticity - The degree to which the identity of a subject can be proved to be the one claimed.
Security compliance - The degree to which the software product adheres to standards, conventions or regulations relating to security.

Compatibility - The ability of software components to exchange information and/or to perform their required functions while sharing the same hardware or environment.
Replaceability - The degree to which the software product can be used in place of another specified software product for the same purpose in the same environment.
Co-existence - The degree to which the software product can co-exist with other independent software in a common environment sharing common resources without any detrimental impact.
Interoperability - The degree to which the software product can be cooperatively operable with one or more other software products.
Compatibility compliance - The degree to which the software product adheres to standards, conventions or regulations relating to compatibility.

Maintainability - The degree to which the software product can be modified. Modifications may include corrections, improvements or adaptation of the software to changes in the environment, and in requirements and functional specifications.
Modularity - The degree to which a system or computer program is composed of discrete components such that a change to one component has minimal impact on other components.
Reusability - The degree to which an asset can be used in more than one software system, or in building other assets.
Analyzability - The degree to which the software product can be diagnosed for deficiencies or causes of failures, or to which the parts to be modified can be identified.
Changeability - The degree to which the software product enables a specified modification to be implemented; the ease with which a software product can be modified.
Modification stability - The degree to which the software product can avoid unexpected effects from modifications of the software.
Testability - The degree to which the software product enables modified software to be validated.
Maintainability compliance - The degree to which the software product adheres to standards or conventions relating to maintainability.

Transferability - The degree to which the software product can be transferred from one environment to another.
Portability - The ease with which a system or component can be transferred from one hardware or software environment to another.
Adaptability - The degree to which the software product can be adapted for different specified environments without applying actions or means other than those provided for this purpose for the software considered.
Installability - The degree to which the software product can be successfully installed and uninstalled in a specified environment.
Transferability compliance - The degree to which the software product adheres to standards or conventions relating to portability.

Figure 4 – ISO 25010 – Quality model for quality in use

Quality in use [10] is the degree to which a product used by specific users meets their needs to achieve specific goals with effectiveness in use, efficiency in use, flexibility in use, safety and satisfaction in use in specific contexts of use. Quality in use is a measure of the quality of the system in a real or simulated operational environment. It is determined by the quality of the software, the hardware, the operating environment, and the characteristics of the users, tasks and social environment; all these factors contribute to quality in use. Quality in use can be used to assess the quality of software in a specific context of use. The attributes of quality in use are categorized into three characteristics: usability in use evaluates the degree to which specified users can achieve specified goals with effectiveness in use, efficiency in use and satisfaction in use; flexibility in use evaluates the degree to which the product is usable in all potential contexts of use; safety deals with acceptable levels of risk of harm to people, business, data, software, property or the environment in the intended contexts of use.

SUMMARY AND OUTLOOK

Comparing the data model and the list of requirements proposed in this article, it was noted that the RequisitePro tool can be efficiently adapted to support an architecture quality review following the steps proposed by the ATAM methodology. In other words, a template for this tool can be developed containing document standards and types of attributes, with their respective influence links and traceability matrix, and with the facility to generate quality attribute scenarios.
Since the tool already has the ability to rank each document attribute, the prioritization of business drivers and the prioritization of scenarios can be carried out using the Attributes tab:
Priority (High, Medium, Low)
Status (Proposed, Approved, Implemented, Tested, Discarded)
Difficulty (High, Medium, Low)
Stability (High, Medium, Low)
Origin
Cost (High, Medium, Low)
Defect

Figure 5 – Features from IBM RequisitePro

Other attributes related to the ATAM evaluation may be created in the project properties (Figure 5), such as risk, quality attribute, relevance, security challenge, and so on.
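As an illustration only, the set of standard attributes listed above, extended with the ATAM-related attributes suggested in the text, could be captured as the following plain data structure when designing the template. The allowed values for the ATAM-related attributes are assumptions, and this is a data sketch, not a RequisitePro API.

```python
# Illustrative attribute definitions for the proposed template. Names mirror the
# standard RequisitePro attributes listed above plus the ATAM-related ones
# suggested in the text; the value lists for the ATAM-related attributes are
# assumptions made for illustration.
TEMPLATE_ATTRIBUTES = {
    "Priority":   ["High", "Medium", "Low"],
    "Status":     ["Proposed", "Approved", "Implemented", "Tested", "Discarded"],
    "Difficulty": ["High", "Medium", "Low"],
    "Stability":  ["High", "Medium", "Low"],
    "Cost":       ["High", "Medium", "Low"],
    "Origin":     None,          # free text
    "Defect":     None,          # free text / link
    # additional attributes proposed for the ATAM evaluation (values assumed)
    "Risk":               ["Untreated", "Mitigated", "Accepted"],
    "Quality attribute":  ["Performance", "Security", "Maintainability", "Operability"],
    "Relevance":          ["High", "Medium", "Low"],
    "Security challenge": None,  # free text
}
```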
In this way, the tool can not only document the requirements and architecture of the software and make a qualitative assessment both of the initially proposed architecture and of future developments of the software in question, but also support and connect all evaluation steps. One advantage is that this is done in a tool that is already known and used worldwide. RequisitePro integrates with the Microsoft Word text editor in any version. The matrix of attributes and traceability is saved in a Microsoft Access MDB database file that can be versioned along with the project, providing a form of version control; SQL Server can also be used to store the project and its attributes and traceability. There is also the option to link the requirements artifacts and architectural design activities with a project on the timeline of Microsoft Project. The architectural views are also easily integrated with Rational Rose for designing UML diagrams and use cases.

Of course there are disadvantages too. Companies that do not use this tool to document their projects would need to invest in a license, whose manufacturer's suggested price is US$ 2,740 per seat, and would also have the disadvantage of having to re-type the requirements already documented in another management tool. Another disadvantage is that the tool is not intended to be a web application, so the evaluation and the meetings are only effective if they are face-to-face. Files cannot be renamed after they are created, so the naming of artifacts must be standardized in advance. There is no easy way to implement worldwide collaborative work using cloud services such as Google Docs. When a new document is added to the project, all project users must manually update the project so that the new document appears in their views. Each project document can be edited by one user at a time, while other users can open it in read-only mode; merging concurrent edits in the same document does not always work if both users type in the same sections of the document.

The requirements proposed in this article, the steps of the methodology, and the prioritization of attributes and their subsequent ranking could be implemented in Rational RequisitePro by elaborating a template, shown in Figure 6, so that users, stakeholders and developers can use a relatively simple and common platform to conduct the evaluation of the architecture of a software project after the requirements and software architecture design steps. The traceability matrix is customizable in RequisitePro, which means that it is possible to create many different views with different hierarchies, so the relationships described in the proposed data model can be implemented easily. In other words, it is possible to start a new view from the risks and drill down to the components that address each risk; this makes evident the risks that remain to be addressed. Another view can start from the business drivers and go down to the scenarios and the architecture view. Each change is automatically flagged with its impact on the system. Thus, before coding and implementation start, it is possible to verify that the final product will not only meet the requirements but also increase stakeholder satisfaction, because the expected quality was evaluated and traceability was ensured between the defined points and the delivered solution.

REFERENCES

[1] Clements, Paul, Rick Kazman, and Mark Klein. "Evaluating Software Architectures". Reading: Addison-Wesley, 2012.
[2] Thiel, Steffen, Andreas Hein, and Heiner Engelhardt. "Tool Support for Scenario-Based Architecture Evaluation". STRAW, pp. 41-45, 2003. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.145.6491&rep=rep1&type=pdf#page=47
[3] Thiel, Steffen. "A Framework to Improve the Architecture Quality of Software-Intensive Systems". Diss. Universität Duisburg-Essen, Fakultät für Wirtschaftswissenschaften, 2005. [Online]. Available: duepublico.uni-duisburg-essen.de/servlets/DerivateServlet/Derivate-13645/Dissertation_Thiel.pdf
[4] Kazman, Rick, Klein, Mark, and Clements, Paul. "ATAM: Method for Architecture Evaluation". Software Engineering Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania, Technical Report CMU/SEI-2000-TR-004, 2000. [Online]. Available: resources.sei.cmu.edu/library/asset-view.cfm?AssetID=5177
[5] Perry, D. E., and Wolf, A. L. "Foundations for the Study of Software Architecture". ACM SIGSOFT Software Engineering Notes, 1992.
[6] SARA Work Group. "Software Architecture Review and Assessment (SARA) Report". 2012. Available: kruchten.com/philippe/architecture/SARAv1.pdf
[7] Alan M. Davis. "Software Requirements - Analysis and Specification". Prentice Hall, 1990.
[8] Arda Goknil, Ivan Kurtev, and Klaas van den Berg. "Generation and Validation of Traces between Requirements and Architecture Based on Formal Trace Semantics". Journal of Systems and Software, Volume 88, February 2014, pp. 112-137. Available: http://dx.doi.org/10.1016/j.jss.2013.10.006
[9] Stefan Winkler and Jens von Pilgrim. "A Survey of Traceability in Requirements Engineering and Model-Driven Development". Software and Systems Modeling, 9(4), 2010, pp. 529-565. Available: http://dx.doi.org/10.1007/s10270-009-0145-0
[10] ISO/IEC CD 25010. Systems and software engineering - Software product Quality Requirements and Evaluation (SQuaRE) - Software product quality and system quality in use models. 2009. Available: sa.inceptum.eu/sites/sa.inceptum.eu/files/Content/ISO_25010.pdf
Figure 6 – New Template - View for IBM RequisitePro
Table 1 - Software Architecture - Quality Attributes Survey

Default answer form for each question: □ Yes – □ No; Level of importance: □ Low □ Average □ High □ Very important; Risk: □ No □ Yes □ New risk; New risk description: [ ]

Functional suitability - Does the presented software architecture provide functions that meet stated and implied needs?
o Appropriateness - Does the presented software architecture provide an appropriate set of functions for all specified user objectives?
o Accuracy - Does the presented software architecture provide the expected results with the needed precision?
o Functional suitability compliance - Does the presented software architecture adhere to standards, conventions or regulations in laws relating to functional suitability?

Reliability - Does the presented software architecture convince that it can maintain a specified level of performance when used under specified conditions? Or: where are the weaknesses?
o Availability - Is the presented software architecture available when required for use?
o Fault tolerance - To what degree does the software maintain a level of performance in cases of software faults or of infringement of its specified interface?
o Recoverability - To what degree can the software re-establish a specified level of performance and recover the data directly affected in the case of a failure?
o Reliability compliance - To what degree does the software adhere to standards, conventions or regulations relating to reliability?

Performance - Does the presented software architecture provide appropriate performance under stated conditions?
o Time behavior - Is the proposed system designed to provide appropriate response and processing times and throughput rates when performing its function, under stated conditions?
o Resource utilization - How well does the software use appropriate amounts and types of resources when performing its function under stated conditions?
o Performance efficiency compliance - Is the proposed system designed to adhere to standards or conventions relating to performance efficiency?

Operability - How well can the proposed software design be understood, learned and used?
o Appropriateness recognisability - How well does the software enable users to recognize whether it is appropriate for their needs?
o Learnability - How well does the proposed system design enable users to learn its application?
o Ease of use - How well does the proposed system design make it easy for users to operate and use it?
o Helpfulness - How well does the proposed system design provide help when users need assistance?
o Attractiveness - How well is or will the software product be attractive to the user?
o Technical accessibility - Is the degree of operability for users with specified needs appropriate?
o Operability compliance - How well does the software adhere to standards, conventions, style guides or regulations relating to operability?

Security - How well protected are items from accidental or malicious access, modification, destruction or disclosure?
o Confidentiality - How well does the software provide protection from unauthorized disclosure of data or information, whether accidental or deliberate?
o Integrity - How well are the accuracy and completeness of assets safeguarded?
o Non-repudiation - How well can actions or events be proven to have taken place, so that the events or actions cannot be repudiated later?
o Accountability - How well can the actions of an entity be traced uniquely to the entity?
o Authenticity - How well the identity of a subject can be proved to be the one claimed? o Security compliance - How well does the software adheres to standards or regulations relating to security? Compatibility - Are some software components able to exchange information or to perform their required functions while sharing the same hardware or environment? o Replaceability - The proposed system is designed to enable be used in place of another specified software for the same purpose in the same environment? o Co-existence - How well the software can co-exist with other independent software in a common environment sharing common resources without any detrimental impacts? o Interoperability - How well the software can be cooperatively operable with one or more other software products? o Compatibility compliance - The proposed system is designed to adheres to standards, conventions or regulations relating to compatibility? Maintainability - How complicated, time consuming, expensive and difficult it is to be performed maintenance, alterations and development of new features in this software product? Modifications may include corrections, improvements or adaptation of the software to changes in environment, and in requirements and functional specifications?
  • 11. - 11 - o Modularity – The proposed system is designed to be build of discrete components such that a change to one component has minimal impact on other components? o Reusability - The proposed system is designed to enable features being used in more than one software system, or in building other assets? o Analyzability - How well the proposed system design can be diagnosed for deficiencies or causes of failures in the software, or for the parts to be modified to be identified? o Changeability - How well the proposed system design enables a specified modification to be implemented? Will it be easy to modify parts of the implemented software? o Modification stability - How well the proposed system design can avoid unexpected effects from modifications of the software? o Testability - How well the proposed system design enables modified software to be validated? o Maintainability compliance - How well the proposed system design adheres to standards or conventions relating to maintainability? Transferability - How complicated, time consuming, expensive and difficult it is to transfer the software from one environment to another and set it to be fully operational? o Portability – Is it easy the whole system be transferred from one hardware or software environment to another? o Adaptability – Can the proposed system design be adapted for different specified environments without applying actions or means other than those provided for this purpose for the software considered? o Installability - Can the proposed system design can be successfully installed and uninstalled in a specified environment? o Transferability compliance - Does the software adheres to standards or conventions relating to portability like older versions of Operating System, or different languages configured at the host operating system ? Quality in use - Can the proposed system design be used by users and meets their needs to achieve specific goals with effectiveness, efficiency, flexibility, safety and satisfaction in use? Usability in use - How well the proposed system design enables specified users to achieve specified goals with effectiveness in use, efficiency in use and satisfaction in use? Flexibility in use How well the proposed system design enables to be usable in all potential contexts of use; Safety How well the proposed system design deals with acceptable levels of risk of harm to people, business, data, software, property or the environment in the intended contexts?
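Each item of the survey above is answered with the same default form: a Yes/No judgment, a level of importance, a risk flag and, when a new risk is identified, a free-text description. A tool supporting the evaluation would also need to link each answer back to the architectural components involved, since traceability between quality attributes and components is one of the requirements proposed in this paper. The sketch below is only a minimal illustration, under assumed names, of how such a survey response could be represented; it is not part of the proposed tool's actual design.

# Illustrative sketch (assumed class and field names): one answered item of the
# quality-attributes survey, including traceability to architecture components.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Importance(Enum):
    LOW = "Low"
    AVERAGE = "Average"
    HIGH = "High"
    VERY_IMPORTANT = "Very important"


@dataclass
class SurveyItem:
    attribute: str                               # e.g. "Reliability / Availability"
    question: str                                # the survey question as asked
    satisfied: bool                              # the Yes/No answer
    importance: Importance                       # level of importance assigned by the evaluators
    is_risk: bool = False                        # does this answer expose a risk?
    new_risk_description: Optional[str] = None   # filled only when a new risk is identified
    related_components: List[str] = field(default_factory=list)  # traceability to components


# Hypothetical usage: recording an availability concern raised during the evaluation.
item = SurveyItem(
    attribute="Reliability / Availability",
    question="Is the presented software architecture available when required for use?",
    satisfied=False,
    importance=Importance.HIGH,
    is_risk=True,
    new_risk_description="Single database node with no failover defined.",
    related_components=["PersistenceLayer"],
)

Keeping the risk description and the list of related components on the same record is what would later allow the tool to trace an identified risk back to the artifacts and components that cause it, in line with the traceability requirement discussed earlier in this paper.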