The document provides an overview of different types of database testing including front end database testing, structural backend testing, functional backend testing, database migration testing, data warehouse testing, and batch job execution testing. It describes the key aspects to test for each type, such as verifying database schemas, stored procedures, triggers, data integrity, security, performance, and more. Screenshots are also included to exemplify some of the testing processes.
2. What is a Database?
• A database is a structured collection of records or data that is stored in a
computer system.
• A database not only stores large amounts of records but also allows them to be accessed easily.
• New information and changes should also be fairly easy to input.
• In order to have a highly efficient database system, a program that manages
the queries and information stored on the system must be incorporated. This
is usually referred to as DBMS or a Database Management System.
• Besides these features, all databases should be built with high data integrity and the ability to recover data if hardware fails.
3. What is a database?
• Quite simply, it’s an organized collection of data.
• A database management system (DBMS) such as Access, FileMaker,
Lotus Notes, Oracle or SQL Server provides you with the software
tools you need to organize that data in a flexible manner.
• It includes tools to add, modify or delete data from the database, ask
questions (or queries) about the data stored in the database and
produce reports summarizing selected contents.
4. Why do we need a database?
• Keep records of our:
• Clients
• Staff
• Volunteers
• To keep a record of activities and interventions
• Keep sales records
• Develop reports
• Perform research
• Longitudinal tracking
5. Why is the Database Independent?
The database system represents an independent component of an application
because it:
• is accessed via both application and database utilities
• has its own security system
• requires independent maintenance procedures
• may be used by more than one application
• may have dependencies outside the application (data import and export)
7. Connecting to a Database
• Every database needs user credentials for access.
• Access privileges are granted to the user depending on the user's role.
• To connect to a database, the user needs the following credentials:
• DB Server name
• Port number
• Connection string
• User Id and password
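As a minimal sketch of the connection step, the snippet below opens a connection and verifies it with a trivial query. It uses Python's built-in sqlite3 module as a stand-in: SQLite is file-based, so the server-style credentials listed above appear only as a hypothetical connection string in the comments.

```python
import sqlite3

# For a server database (e.g. PostgreSQL or SQL Server), the driver takes the
# credentials listed above -- DB server name, port number, user id, password --
# often bundled into a connection string such as (hypothetical example):
#   "host=dbserver port=5432 dbname=testdb user=tester password=secret"
# SQLite needs only a path, so an in-memory DB replaces them here.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("SELECT 1")          # trivial query to confirm the connection works
connected = cursor.fetchone()[0] == 1
conn.close()
```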
10. Why back end testing is important
• A database is the engine for any client server application
• Any back end malfunction may lead to
• Deadlock
• Data corruption
• Bad performance
• Data loss
• A bug in the back end may have a serious impact on the whole system.
• Too many bugs in the back end cost tremendous resources to find and fix.
• Ultimately, they lead to delays in system development.
11. Different Types of Database testing
• Front end Database testing
• Structural Back end Testing of the Database
• Functional Back end Testing of the Database
• Database Migration testing
• Data warehouse testing
• Batch Job execution
Based on the type of project, DB testing is categorized into the six
types above.
12. Front end to Database
• Insert data through the front end and verify the same in the back end.
• Validate that data accessed in the front end is available in the back end.
• Modifications made to a record in the front end, on save or submit, should
be reflected in the related table in the back end.
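The steps above can be sketched as a small test: a hypothetical `front_end_save` function stands in for the application's save/submit action, and the tester then queries the back end table directly to confirm the record arrived.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

def front_end_save(name):
    # Hypothetical stand-in for the application's save/submit action.
    conn.execute("INSERT INTO customers (name) VALUES (?)", (name,))
    conn.commit()

# Step 1: enter the record through the "front end".
front_end_save("Alice")

# Step 2: verify the same record directly in the back end table.
row = conn.execute("SELECT name FROM customers WHERE name = ?",
                   ("Alice",)).fetchone()
```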
13. Front end to Database
Screenshots illustrating front end to database testing.
14. Database to front end
• This is the reverse direction: database to front end.
• Update or modify data in the back end and verify the same in the
front end.
• Access data in the front end after the database has been loaded
by executing batch jobs.
• Example:
• Credit card expiry dates, which cannot be entered through the front end.
15. Structural back end testing
• It is also called metadata or model DB testing.
• The base document for this testing is the DDL document.
• Based on its structure, the SQL database can be divided into three categories:
• Database Schema testing
• Tables
• Table Columns
• Keys
• Column Types
• Size
• Indexes
• Defaults
• Rules
• Stored procedures testing
• Triggers
16. Database schema testing
• Tables, Table Columns, Column Types, Defaults, Rules
• Verify and validate all the tables and the table names.
• Verify and validate all the columns of each table.
• Verify and validate column type of each column of the table along with
the size of column.
• Verify and validate the constraints on the columns.
• Verify and validate any default values or rules that apply to a
particular column of the table.
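The schema checks above can be automated by querying the database's metadata and comparing it against the DDL document. A sketch using SQLite's `PRAGMA table_info` (other engines expose the same information via `INFORMATION_SCHEMA`); the `employees` table and expected schema are assumed examples.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE employees (
    emp_id  INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    salary  REAL DEFAULT 0.0)""")

# Expected columns and types, as they would appear in the DDL document.
expected = {"emp_id": "INTEGER", "name": "TEXT", "salary": "REAL"}

# PRAGMA table_info returns (cid, name, type, notnull, dflt_value, pk) per column.
cols = {row[1]: row for row in conn.execute("PRAGMA table_info(employees)")}

schema_matches = {name: row[2] for name, row in cols.items()} == expected
name_not_null  = cols["name"][3] == 1      # NOT NULL constraint present
salary_default = cols["salary"][4] == "0.0"  # default value as declared
```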
20. Database schema testing
• Keys and Indexes
• Verify and validate the key constraints
• Primary key
• Foreign key
• Verify that the column type of the foreign key matches the referenced
column in the other table.
• Verify the list of indexes for the given table.
• Also verify the list of columns associated with each index of the table.
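Key and index metadata can be checked the same way. The sketch below uses SQLite's `PRAGMA foreign_key_list`, `PRAGMA index_list` and `PRAGMA index_info`; the `dept`/`emp` tables and the index name are assumed examples.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dept (dept_id INTEGER PRIMARY KEY, dept_name TEXT);
    CREATE TABLE emp  (emp_id  INTEGER PRIMARY KEY,
                       dept_id INTEGER REFERENCES dept(dept_id));
    CREATE INDEX idx_emp_dept ON emp(dept_id);
""")

# Foreign keys: PRAGMA foreign_key_list yields (id, seq, table, from, to, ...).
fk = conn.execute("PRAGMA foreign_key_list(emp)").fetchone()
fk_ok = fk[2] == "dept" and fk[3] == "dept_id" and fk[4] == "dept_id"

# Indexes: list the indexes on the table, then the columns each one covers.
indexes = [row[1] for row in conn.execute("PRAGMA index_list(emp)")]
index_cols = [row[2] for row in conn.execute("PRAGMA index_info(idx_emp_dept)")]
```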
24. Stored Procedures testing
What is a stored procedure?
A stored procedure is a group of Transact-SQL statements compiled
into a single execution plan.
Preconditions for executing the procedure:
- The stored procedure should be installed in the database.
- It should have a unique name.
- The tester should know the parameter names, types, and number of
parameters.
25. Stored Procedures testing
Verify the following while testing the stored procedure
Output of the stored procedure:
- Before execution, the tester should know the purpose of the procedure:
• What the stored procedure is supposed to do
• What the stored procedure is not supposed to do
- Verify the number of rows affected when the procedure is executed.
- Write simple queries to make sure the stored procedure populates
the right data.
Parameters:
- Check which parameters are required.
- Call the procedure with valid data.
- Call the procedure with boundary data.
- Make a parameter invalid and run the procedure.
26. Stored Procedures testing
Verify the following while testing the stored procedure
Return values:
- Check whether the stored procedure returns a value.
- When a failure occurs, a non-zero value must be returned.
Error messages:
- Make the stored procedure fail and cause every error message to
occur at least once.
- Verify exceptions that don't have predefined error messages.
- Verify whether a location is specified for the error logs.
Others:
- Integration tests of procedures:
- Group related stored procedures together, calling them in a
particular order.
- Verify any sequence of jobs that gets executed.
- Make invalid calling sequences and run the group of stored
procedures.
- Access permissions
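A test harness for the checks above might look like the sketch below. SQLite has no stored procedures, so `sp_debit` is a hypothetical Python stand-in that follows the slide's contract: it returns 0 on success and a non-zero code on failure, and the tester verifies rows affected, return values, and the result data with a simple query.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts "
             "(id INTEGER PRIMARY KEY, balance REAL CHECK (balance >= 0))")
conn.execute("INSERT INTO accounts VALUES (1, 100.0)")

def sp_debit(account_id, amount):
    # Hypothetical stand-in for a stored procedure: 0 = success,
    # non-zero = failure, as the slide requires for return values.
    try:
        cur = conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE id = ?",
            (amount, account_id))
        conn.commit()
        return 0 if cur.rowcount == 1 else 1   # rows affected must match
    except sqlite3.IntegrityError:
        conn.rollback()                        # error path: roll back
        return 2

ok      = sp_debit(1, 50.0)    # valid data
missing = sp_debit(99, 10.0)   # no such row: zero rows affected
invalid = sp_debit(1, 999.0)   # violates CHECK constraint: must fail

# Simple query to confirm the procedure populated the right data.
balance = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
```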
28. Trigger testing
-A trigger is a special kind of stored procedure that automatically
executes when an event occurs in the database server.
-The specified event is associated with either a table, a view, a schema,
or the database, and it is one of the following:
- A data manipulation language (DML) statement (DELETE, INSERT,
or UPDATE)
- A data definition language (DDL) statement (CREATE, ALTER, or
DROP)
-A database operation (SERVERERROR, LOGON, LOGOFF,
STARTUP, or SHUTDOWN)
-Triggers are a special PL/SQL construct similar to procedures.
-Triggers are useful for tasks such as enforcing business rules,
validating input data, and keeping an audit trail.
- Commonly used triggers
- Update trigger
- Delete trigger
- Insert trigger
29. Trigger testing
Update, insert, and delete triggers:
- Make sure the trigger name and spelling are correct.
- Verify the trigger fires for the specific column.
- Validate and verify the updates made by the trigger.
- Verify the rollback function when a failure occurs.
- Run the insert trigger with invalid data and verify the errors
logged.
- Verify the delete trigger deletes the record.
- Verify referential integrity for the deleted record.
- Try to delete a record that does not exist.
- Verify the rollback function when deletion fails.
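A basic trigger test creates the trigger, performs the triggering event, and verifies the side effect. The sketch below uses an assumed audit-trail trigger (a common use named on the previous slide) on a hypothetical `orders` table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE audit  (order_id INTEGER, action TEXT);
    -- Audit-trail trigger: fires automatically after every UPDATE on orders.
    CREATE TRIGGER trg_order_update AFTER UPDATE ON orders
    BEGIN
        INSERT INTO audit VALUES (NEW.order_id, 'UPDATE');
    END;
""")

conn.execute("INSERT INTO orders VALUES (1, 'new')")
conn.execute("UPDATE orders SET status = 'shipped' WHERE order_id = 1")

# Verify the trigger fired exactly once and logged the right record.
audit_rows = conn.execute("SELECT order_id, action FROM audit").fetchall()
```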
30. Functional Back end Testing of the Database
• A back end can be broken down into a finite number of testable pieces
based on application functionality
• The test focus is on the functionality of input and output but not on the
implementation and structure.
• It is not a good idea to test the server database as a single entity at the
initial stage; we have to divide it into functional modules.
• How the server database is divided depends on the features of the
particular project.
31. Functional Back end Testing of the Database
• Test the functions and features
• For update and insert functions, make sure the data entered follows the
application rules.
• For the delete function, make sure the data is deleted correctly.
• Check for malfunctions.
• Check the interoperation between modules.
• Error detection and handling
32. Functional Back end Testing of the Database
Checking data integrity and consistency: if a project does not guarantee
data integrity and consistency, we are obliged to ask for a redesign. At a
minimum, check the items below.
- Data validation before insert, delete and update functions
- Triggers must be in place to validate the reference table records
- Check the major columns in each table for any invalid data
- Try to generate inconsistent data, insert it into the relevant tables, and
see whether any failure occurs.
- Try to insert child data before inserting its parent data.
- Try to delete a record that is still referenced by data in another table.
- Make sure any replicated servers or databases are in sync and contain
consistent information.
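The two referential-integrity checks in the list (orphan child insert, delete of a referenced parent) can be tested directly; both must be rejected by the database. A sketch with assumed `parent`/`child` tables (note SQLite enforces foreign keys only when the pragma is enabled):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite enforces FKs only if enabled
conn.executescript("""
    CREATE TABLE parent (id INTEGER PRIMARY KEY);
    CREATE TABLE child  (id INTEGER PRIMARY KEY,
                         parent_id INTEGER NOT NULL REFERENCES parent(id));
""")

# Try to insert child data before its parent exists: must fail.
try:
    conn.execute("INSERT INTO child VALUES (1, 42)")
    orphan_rejected = False
except sqlite3.IntegrityError:
    orphan_rejected = True

# Try to delete a parent still referenced by a child: must also fail.
conn.execute("INSERT INTO parent VALUES (42)")
conn.execute("INSERT INTO child VALUES (1, 42)")
try:
    conn.execute("DELETE FROM parent WHERE id = 42")
    delete_rejected = False
except sqlite3.IntegrityError:
    delete_rejected = True
```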
33. Functional Back end Testing of the Database
Login and User Security:
- SQL user login
-Database access privileges (sysusers tables)
-Table access privileges
-Table data access privileges
-NT server login
-Training account
- Check concurrent logins
- Try to log in while a time-consuming query is executing and see how
long the login takes to succeed
34. Database Migration testing
- Database migration testing is needed when you move data from the
old database(s) to a new database.
-The old database is called the legacy database or the source database
and the new database is called the target database or the destination
database.
- Database migration may be done manually but it is more common to
use an automated ETL (Extract-Transform-Load) process to move the
data.
- In addition to mapping the old data structure to the new one, the ETL
tool may incorporate certain business-rules to increase the quality of
data moved to the target database.
Database Migration Testing Types:
- Model Migration
- Data Migration
35. Database Migration testing
-Model Migration
- Not all schema objects will be migrated; there will be some
restrictions.
- Similar to structural database testing.
- Need to compare database structure of source DB with target DB.
- Based on DDL document verify the content.
- Verifying and validating the table structures and table names
- Verifying and validating the table columns of each table.
- Verifying and validating column data type and size
-Verify and validate the constraints defined for the table
- Verify and validate the triggers, procedures, synonyms, indexes,
functions, etc.
37. Database Migration testing
Model Migration
- Verify the column data types based on the type of the target
database.
- Column type changes depend on the source and target DBs.
38. Database Migration testing
-Data migration
- Validation and verification of the data
- Validate duplicate entities.
- Verify nulls, empty strings and spaces.
- Verify and validate the row counts of the source and target.
- If the data is fewer than 100 rows, use Excel to compare the data of
the source and target DBs.
- For more than 100 rows, use a comparison tool.
- In case of a row count mismatch, identify the data that is missing.
Pre-condition:
Load the source and target with the same data.
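The row-count comparison and missing-row identification can be sketched as below. Two in-memory databases stand in for the legacy (source) and target databases; one row is deliberately dropped from the target to show how a mismatch is located.

```python
import sqlite3

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
source.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Alice"), (2, "Bob"), (3, "Carol")])
target.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Alice"), (2, "Bob")])   # one row missing after migration

# Compare row counts of source and target.
src_count = source.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
tgt_count = target.execute("SELECT COUNT(*) FROM customers").fetchone()[0]

# On a mismatch, identify exactly which rows are missing from the target.
src_rows = set(source.execute("SELECT id, name FROM customers"))
tgt_rows = set(target.execute("SELECT id, name FROM customers"))
missing = src_rows - tgt_rows
```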
40. Data warehouse testing
• What is Data warehouse?
• A data warehouse is a composite, consolidated data model that captures
the entire data of the organization.
• It brings data together from heterogeneous sources into a single destination.
• Data is extracted, transformed and loaded into the data warehouse.
• Processing of the data is usually done in the staging area.
• Data is not normalized.
• Three common data warehouse architectures are available.
44. Data warehouse testing
• Data completeness. Ensures that all expected data is loaded.
• Data transformation. Ensures that all data is transformed correctly
according to business rules and/or design specifications.
• Data quality. Ensures that the ETL application correctly rejects, substitutes
default values, corrects or ignores and reports invalid data.
• Performance and scalability. Ensures that data loads and queries perform
within expected time frames and that the technical architecture is scalable.
45. Data warehouse testing
• Data completeness:
One of the most basic tests of data completeness is to verify that all expected
data loads into the data warehouse. This includes validating that all records,
all fields and the full contents of each field are loaded. Strategies to consider
include:
• Comparing record counts between source data, data loaded to the
warehouse and rejected records.
• Comparing unique values of key fields between source data and data
loaded to the warehouse.
• Populating the full contents of each field to validate that no truncation
occurs at any step in the process.
• Testing the boundaries of each field to find any database limitations. For
example, for a decimal(3) field include values of -99 and 999
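Two of the completeness strategies above, reconciling record counts and checking for truncation, can be sketched as follows. The source rows, reject rule, and 50-character field width are assumed examples.

```python
import sqlite3

# Hypothetical ETL result: each source row is either loaded or rejected.
source_rows   = [("A" * 50,), ("B" * 50,), ("bad",)]
loaded_rows   = [("A" * 50,), ("B" * 50,)]
rejected_rows = [("bad",)]

# Completeness check 1: counts must reconcile (loaded + rejected == source).
counts_reconcile = len(loaded_rows) + len(rejected_rows) == len(source_rows)

# Completeness check 2: full field contents survive the load (no truncation).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse (description TEXT)")
conn.executemany("INSERT INTO warehouse VALUES (?)", loaded_rows)
max_len = conn.execute(
    "SELECT MAX(LENGTH(description)) FROM warehouse").fetchone()[0]
no_truncation = max_len == 50
```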
46. Data warehouse testing
• Data Transformation
Validating that data is transformed correctly based on business rules can be
the most complex part of testing an ETL application with significant
transformation logic. One typical method is to pick some sample records
and "stare and compare" to validate data transformations manually.
- Create a spreadsheet of scenarios of input data and expected results and validate these with the
business customer.
- Create test data that includes all scenarios.
- Validate correct processing of ETL-generated fields such as surrogate keys.
- Validate that data types in the warehouse are as specified in the design and/or the data model.
- Set up data scenarios that test referential integrity between tables. Validate parent-to-child
relationships in the data.
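The spreadsheet of scenarios described above maps naturally onto a table-driven test: each row pairs input data with the expected result agreed with the business customer. The transformation rule here (full name to "LAST, First") is a hypothetical example.

```python
# Hypothetical transformation rule under test.
def transform(first, last):
    return f"{last.upper()}, {first.capitalize()}"

# Scenario table: (input data, expected result), as captured in the
# spreadsheet validated with the business customer.
scenarios = [
    (("john", "smith"), "SMITH, John"),
    (("MARY", "jones"), "JONES, Mary"),
    (("a", "b"),        "B, A"),        # boundary: single-character names
]

results = [transform(*inp) == expected for inp, expected in scenarios]
all_pass = all(results)
```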
47. Data warehouse testing
• Data Quality
Data quality is defined as "how the ETL system handles data rejection,
substitution, correction and notification without modifying data." To
ensure success in testing data quality, include as many data scenarios as
possible. Typically, data quality rules are defined during design phase.
Following are the examples.
• Reject the record if a certain decimal field has nonnumeric data.
• Substitute null if a certain decimal field has nonnumeric data.
• Validate and correct the state field if necessary based on the ZIP code.
• Compare product code to values in a lookup table, and if there is no match load anyway but report to
users.
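The quality rules listed above (reject, substitute, report-but-load) might be exercised with a sketch like this; the record layout, lookup table, and rule details are assumptions for illustration.

```python
# Hypothetical data-quality pass: reject records with a non-numeric amount,
# substitute None for a non-numeric discount, and report (but still load)
# records whose product code is missing from the lookup table.
PRODUCT_LOOKUP = {"P1", "P2"}

def is_numeric(s):
    return s.replace(".", "", 1).isdigit()

def apply_quality_rules(record):
    product, amount, discount = record
    if not is_numeric(amount):
        return None, "rejected: non-numeric amount"          # reject rule
    clean_discount = float(discount) if is_numeric(discount) else None
    note = None if product in PRODUCT_LOOKUP else "reported: unknown product"
    return (product, float(amount), clean_discount), note    # substitute/report

loaded, reports = [], []
for rec in [("P1", "10.5", "0.1"), ("P2", "oops", "0.2"), ("P9", "3.0", "n/a")]:
    row, note = apply_quality_rules(rec)
    if row:
        loaded.append(row)
    if note:
        reports.append(note)
```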
48. Data warehouse testing
• Performance and scalability:
As the volume of data in a data warehouse grows, ETL load times can be
expected to increase, and performance of queries can be expected to
degrade. This can be mitigated by having a solid technical architecture and
good ETL design. The aim of performance testing is to point out any
potential weaknesses in the ETL design.
• Load the database with peak expected production volumes to ensure that this volume of data can be
loaded by the ETL process within the agreed-upon window.
• Compare these ETL loading times to loads performed with a smaller amount of data to anticipate
scalability issues.
• Monitor the timing of the reject process and consider how large volumes of rejected data will be
handled.
• Perform simple and multiple join queries to validate query performance on large database volumes
49. Batch Job execution
A batch job is a computer program or set of programs processed in batch
mode. This means that a sequence of commands to be executed by the
operating system is listed in a file (often called a batch file, command file, or
shell script) and is submitted for execution as a single unit.
- Verify the source and destination files paths and file names.
- Verify parameters that are given while executing batch job
- Estimate the batch job execution time.
- Estimate the time taken for the batch volume testing.
- Verify log files after execution of the job
- Verify the row count of the table before and after the execution of the job.
- Validate the data after execution of the batch job.
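The log-file and row-count checks above can be sketched with a hypothetical batch job that loads rows into a staging table and writes a log; the tester compares row counts before and after execution and inspects the log.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER PRIMARY KEY, payload TEXT)")
log = []

def batch_job(rows):
    # Hypothetical batch job: loads one file's worth of rows and logs progress.
    log.append(f"batch start: {len(rows)} rows")
    conn.executemany("INSERT INTO staging (payload) VALUES (?)", rows)
    conn.commit()
    log.append("batch end: success")

count_before = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
batch_job([("row-1",), ("row-2",), ("row-3",)])
count_after = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]

# Verify the log and the before/after row counts match the batch volume.
rows_loaded = count_after - count_before
```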