Chapter 9

Testing

Introduction

Along with security, testing is another major topic in software development. Without rigorous testing, there is no possible assertion of the quality of a system, and all organizations involved in software development are aware of the importance of testing. Unfortunately, today, when it comes to addressing the issues that define the testing discipline, either in a process or in a project, more often than not you will observe a very craftsman-like situation. This is paradoxical when you consider that 20 or 30 years ago testing was a very rigorous discipline. The explanation is simple: 30 years ago most software-intensive systems were strategic solutions, enjoying lots of resources and very stringent requirements on quality. Today, many systems fulfill tactical business objectives, in a very competitive environment, with very tight timeframes and limited budgets. No wonder that over that period, three-quarters of software automation projects have failed, being either over budget, over time, of poor quality, or scrapped altogether (you can find more precise figures in the Standish Group CHAOS report; see http://www.standishgroup.com/chaos/index.html). Note, however, that even the well-tested strategic projects from long ago tended to go over budget and over time.


Nonetheless, the most conscientious organizations are generally achieving a good level of quality in the related activities, but often at the price of an excessive allocation of resources (people and time). The good news is that the situation is generally improving, as the software development industry matures in the six dimensions described in Chapter 6: technology, tools, methods/techniques, processes, people, and organization. This improvement is apparent in the comparison between the 1994 and 1998 figures published by the Standish Group. The objective of this chapter is to demonstrate how a seemingly peripheral issue like testing can actually be integrated within an overall engineering approach to software development, and how to implement traceability for testing artifacts as for any other type of artifact. The software engineering approach presented in this book can help you achieve a high level of organizational performance in the testing activities, as defined by two elements:

• Quality of testing, by defining techniques and methods ensuring the complete test coverage of all the requirements of the system, verifiable by the capability to trace the results to the requirements.
• Efficiency, by defining a rigorous process for testing, but also by enabling the automation of a large number of tasks.

This chapter will focus on functional testing, as a means of completing the exploration of the functional aspects of the development of a software system. Other test topics (which we will not discuss) would be:

• Performance testing. For example, the number of concurrent sessions that a single Web server can support at a 1- and 5-second response time for known page composition.
• Stress testing. For example, the number of concurrent sessions that a single Web server can maintain while “gracefully” degrading its performance. “Gracefully” in this context means without loss of data or emergence of error conditions in the system itself.
• Scalability testing and capacity planning. This involves increasing the load at the same time as augmenting the infrastructure and verifying that the system response is kept within the set targets.
• Availability and resilience:


the code for one implementation class. The rationale for this choice is that a class defines the smallest self-contained unit, encapsulating its code and data. As such, it is also good practice to define it as the smallest unit of responsibility for a developer. In unit testing, any call to another implementation class outside the unit is replaced with stubs or simulators. Calls to .NET framework classes or trusted components (external systems with .NET managed interfaces) are maintained as is. Calls by other implementation classes are replaced with drivers. Thus, the unit is tested in isolation. Unit testing is a responsibility of the developers, as it is most likely to take a white box testing approach, making extensive use of debugger tools.

• System test, which considers the whole solution as a unit. Within one specific development iteration, the system test will cover the relevant use cases, as well as all use cases from previous iterations. This entails applying regression tests for the test cases that cover the use cases of the previous iterations. System testing is the responsibility of the testers, defined as a separate team.

As you will see later in this chapter, both unit test and system test artifacts trace back to use cases through the use case scenarios: unit tests through the design of the implementation classes in sequence diagrams, and system tests through the test cases. Both sequence diagrams and test cases are based on use case scenarios. Thus, the two groups of workers involved in these activities have a common reference source of knowledge that defines their objectives, and they can easily achieve a common understanding in their communication.

As a final remark for this Introduction, it is not in the scope of this chapter, or of the book in general, to review any specific test automation tools, either for unit test or system test. The objective is to present you with the concepts involved in the testing activities, as well as practical methods and techniques to apply.

Approach

The approach presented puts the test development effort on a parallel track with the design and code efforts. The input artifacts for the test activities, as for all the design activities, are use case descriptions, as we take the perspective of the functional aspects of software development. After all, this is a use case-driven process, and use cases are the original representation of the system


knowledge, defining the system requirements and specifications, the code, the documentation, and the tests. In reality you will see in this chapter that test cases and use cases are tightly related. Indeed, the overall idea of the approach is to map the use cases onto test cases.

Test Cases

As test cases proceed from use cases, the activity of specifying them can start very early in the overall development process, as soon as the use cases of the current iteration are specified in detail. The best timing is to start developing the test cases along with the use case detail descriptions. However, because use cases are likely to be updated throughout the development of the analysis model, you must be aware that you will need to maintain the consistency of the test cases with the use cases. On the other hand, having the test cases completed and stable, along with the use cases, at the end of the analysis model and before developing the design model will be useful in order to start testing the implementation classes that are developed in parallel with the design model.

Test cases are the basic components of testing. A test case is defined as a set of data inputs, execution conditions, and expected results. You can think of each test case as representing a use case scenario, or a complete path through one use case. Each use case scenario involves some or all steps of the basic flow of events and possibly one or more alternate flows of events. The above definition of test cases is very close to the definition of sequence diagrams in Chapter 6, and a practical approach to developing test cases is to create at least one test case for each use case scenario represented by a sequence diagram. Note that this does not mean using the sequence diagrams to describe the test cases, as this approach would effectively define a white box test. It does mean basing the test case on the same description of the use case scenario that defines the sequence diagram. As a practical approach, each use case scenario is focused on a particular flow of events (basic or alternate). Consequently, defining test cases from a use case scenario effectively means creating one or more test cases for each documented flow of events of the use case.
For this reason, all test cases that correspond to one use case scenario define a class of test cases that specify the same execution conditions, differing only in their input data and the results. At the same time, the activity of creating test cases is also an opportunity to unveil important use case scenarios, beyond those defined by the simple enumeration
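The relationship described here — one use case scenario defining a class of test cases that share execution conditions and differ only in data — can be sketched as a small data model. This is an illustrative sketch only; the names `TestCase` and `UseCaseScenario` are my own, not the book's.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """A set of data inputs, execution conditions, and expected results."""
    test_id: str
    inputs: dict       # field name -> input value
    expected: str      # expected result, e.g. an error message

@dataclass
class UseCaseScenario:
    """One complete path through a use case; all of its test cases share
    the same execution conditions."""
    name: str
    condition: str
    test_cases: list = field(default_factory=list)

    def add_test_case(self, tc: TestCase):
        # Test cases of one scenario differ only in input data and results.
        self.test_cases.append(tc)

scenario = UseCaseScenario("CreateAccount_Entry validation", "Username exists")
scenario.add_test_case(TestCase("TC4", {"Username": "testbuyer01"},
                                "This user already exists."))
```

A traceability tool could walk such objects to report which scenarios still lack test cases.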


for the “CreateAccount_Entry validation” use case scenario of the Create Account use case. Note that the use case scenario should correspond to a collaboration instance (containing its sequence diagrams) in the design model. In this matrix we do not enter data values but only an indication of whether each field is valid (V), invalid (I), or indifferent (N/A). In TC5 it is sufficient to enter a valid first password and a different verify password to get the expected result; thus all other fields are indifferent. Although there will be other error messages, they are not of any interest for the test case. The above matrix is partial, as it does not cover all the possible error conditions that may occur. More test cases are needed, so that each condition has at least one invalid indication in one test case column.


Table 9-1: Test Case Matrix: Definitions of the Execution Condition and Test Results

Use Case:           Create Account
Use Case Scenario:  CreateAccount_Entry validation

Test Case ID          TC4                TC5                 TC
Condition             Username exists    Passwords do not    Invalid card
                                         match               expiration date
Username              I                  N/A                 V
Password              V                  V                   V
Verify Password       V                  I                   V
E-mail address        V                  N/A                 V
Name on Credit Card   V                  N/A                 V
Card Number           V                  N/A                 V
Card Type             V                  N/A                 V
Expiration            V                  N/A                 I
Street Address        V                  N/A                 V
City                  V                  N/A                 V
State                 V                  N/A                 V
ZIP/Postal Code       V                  N/A                 V
Country               V                  N/A                 V
Expected Results      (a)                (b)                 (c)

(a) Message displayed: This user already exists. Please enter another user.
(b) Message displayed: The 2 password fields do not match.
(c) Message displayed: Expiration date must be between MM1/yyy1 and MM2/yyy2.
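The completeness rule stated above — every condition must carry at least one invalid (I) indication in some test case column — can be checked mechanically. The following is a hedged sketch; the matrix encoding and function name are my own, not the book's.

```python
# Each column is one test case: field -> "V" (valid), "I" (invalid), or "N/A".
matrix = {
    "TC4": {"Username": "I",   "Password": "V", "Verify Password": "V"},
    "TC5": {"Username": "N/A", "Password": "V", "Verify Password": "I"},
}

def uncovered_conditions(matrix):
    """Return the fields that never carry an invalid indication
    in any test case column of the matrix."""
    fields = {f for col in matrix.values() for f in col}
    return sorted(f for f in fields
                  if not any(col.get(f) == "I" for col in matrix.values()))

print(uncovered_conditions(matrix))  # → ['Password']
```

Here the check reveals that no test case ever submits an invalid password, so at least one more test case column is needed.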

The next step is to define actual test data in place of the indications on the test conditions. The same matrix template is used for this purpose (see Table 9-2). In the test case matrix, the expected results describe the state of the system from a user perspective, thus keeping the black box view of the system. They describe in all details the state of the application and any other element of the system environment that is affected as a result of the test activities (e.g., message ABC on the screen, water pump XYZ is now on, message to abc@xyz.com sent and will be received within X hours).


Table 9-2: Test Case Matrix: Input Data

Use Case:           Create Account
Use Case Scenario:  CreateAccount_Entry validation

Test Case ID          TC4                 TC5              TC
Condition             Username exists     Passwords do     Invalid card
                                          not match        expiration date
Username              testbuyer01         N/A              testbuyer
Password              testbuyer01pwd      testbuyer10pwd   testbuyer10pwd
Verify Password       testbuyer01pwd      Test             testbuyer10pwd
E-mail address        testbuyer01@        N/A              testbuyer10@
                      softgnosis.com                       softgnosis.com
Name on Credit Card   testbuyer01         N/A              testbuyer
Card Number           1111222233330001    N/A              1111222233330010
Card Type             MC                  N/A              MC
Expiration            02/04               N/A              02/
Street Address        1, High Street      N/A              1, High Street
City                  London              N/A              London
State                                     N/A
ZIP/Postal Code       W1                  N/A              W
Country               UK                  N/A              UK
Expected Results      (a)                 (b)              (c)

(a) Message displayed: This user already exists. Please enter another user.
(b) Message displayed: The 2 password fields do not match.
(c) Message displayed: Expiration date must be between 09/ and 09/

• True defects, where some code or system configuration has to be modified.
• False alarms, arising from erroneous test case definitions.

In the case of a false alarm, after investigation, the defect along with its assessment shall be transmitted to the test team that will need to amend the test case definition and run the test case again before closing the defect report.


Table 9-3: Test procedure template

Test procedure
    Name of the test procedure (for identification).

Related test case(s)
    The identification of the test case(s) that this procedure covers (useful for traceability and assessment of the coverage of functional testing to the system specifications).

Initial state of the system before the test case starts
    Describe in all details the state of the application as well as any element of the system environment that is of importance during the test; e.g., connection to a bar-code reader, particular HTTP port enabled (useful to avoid false test failure incidents due to inadequacy of the test assumptions with the current system environment and state).

State of the data used by the system before the test case starts
    Describe in all details the state of the data in the system, as it has to be present before the test is conducted. This may entail describing the steps to prepare the data and referencing any automated script or another document that describes the procedure to prepare the data for the particular test (useful to ensure that the test does not generate false test failure incidents due to inadequacy of test assumptions with available system data).

Test procedure steps
    Describe in all details the steps that the user has to follow. Be very specific, referencing the specific buttons or links to click, or input fields to fill. The data itself is one of the data sets from the test case descriptions covered by the procedure.

Expected final state of the system after the test case finishes
    Complementary description of the test case results. This description is optional and focuses on the elements of the system environment that are not specified within the corresponding use case (useful in order to add a white box perspective to the test).

Expected state of the data used by the system after the test case finishes
    Describe in every detail the state of the data in the system as a result of the test. This may entail referencing extended documentation on how to extract the data from the system; e.g., a specific SQL query to execute (useful in order to add a white box perspective to the test; note that the related scripts need to be created by the developers).

Effectively, this procedure consists of handing over the responsibility of correcting the defect, while the overall defect management and resolution procedure stays the same as in the case of a true defect.

Test Coverage Matrix

From the above discussion it is straightforward to trace every flow of events of every use case to a test case. But considering the discussion on test case mapping to implementation classes, a practical question that springs to mind is how to ensure that every operation of every implementation class participates in a test case. The indirect and necessary answer to this question is to consider the sequence diagram of a use case scenario, representing a particular flow of events of a use case, and list the implementation classes with their operations involved in the collaboration defined in the design model. As this particular flow of events has a corresponding test case, the implication is that the test case does exercise the corresponding implementation classes and the operations involved in the sequence diagram.

A first evaluation of test coverage can be documented by using a simple matrix, named the test coverage matrix, where you represent all implementation classes with all their operations in the first two columns, along with all the test cases on the first row. Then you mark with an X every cell where an implementation class is involved in the test case(s) corresponding to the design model sequence diagram(s) where the implementation class is used. At the same time, as the matrix effectively references all units of the system, it is the best place to track the completeness of unit test definitions. The other advantage is that you can quickly identify test case data to use for the unit test, as it will be some derivation of the test case data that covers the corresponding class operation. Table 9-4 is a conceptual example of a test coverage matrix. In the example in Table 9-4 you can identify a problem with the operation OB2 of implementation class CB, as it seems that it is not exercised in any test case.

As described in the design model in Chapter 6, the process of discovering class operations is based on developing the sequence diagrams for each possible flow of events of a use case, represented in a use case scenario. Consequently, an operation that is not covered by a test case can only mean that there is a test case missing for each flow of events where the operation is used.
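The test coverage matrix check can be automated once the design-model data is available: list which operations each sequence diagram involves, map diagrams to their test cases, and flag operations that no test case exercises (such as OB2 of class CB above). The class, operation, and diagram names below are illustrative placeholders, not from the book.

```python
# Which (class, operation) pairs each sequence diagram involves, and which
# test case corresponds to each diagram's use case scenario.
diagram_ops = {
    "SD_basic": [("CA", "OA1"), ("CB", "OB1")],
    "SD_alt1":  [("CA", "OA2"), ("CB", "OB1")],
}
diagram_to_test_case = {"SD_basic": "TC1", "SD_alt1": "TC4"}
all_ops = [("CA", "OA1"), ("CA", "OA2"), ("CB", "OB1"), ("CB", "OB2")]

def coverage_matrix(diagram_ops, diagram_to_test_case, all_ops):
    """Build the matrix: each operation -> set of test cases exercising it."""
    matrix = {op: set() for op in all_ops}
    for diagram, ops in diagram_ops.items():
        tc = diagram_to_test_case[diagram]
        for op in ops:
            matrix[op].add(tc)
    return matrix

matrix = coverage_matrix(diagram_ops, diagram_to_test_case, all_ops)
uncovered = [op for op, tcs in matrix.items() if not tcs]
print(uncovered)  # → [('CB', 'OB2')] — a test case is missing for its flow of events
```

An empty cell row in the output signals exactly the situation the text describes: a flow of events that uses the operation but has no corresponding test case.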


between black box and white box testing and integrating the testing activities of testers and developers.

Test Scenarios

Test cases are used as building blocks for test scenarios, which represent plausible usage of the system by end users in terms of a series of tasks that the user will want to execute. They are useful in describing typical user sessions, involving a sequence of use cases, some of which might be repeated; for


Table 9-5: Test scenario template

Test Scenario
    Name of the test scenario (for identification).

Purpose
    Describe the rationale for why this scenario was defined.

Description
    Describe what this scenario does.

Test type
    Functional/Load/Stress/Performance/Security…

Test cases used
    The list of test cases involved in the scenario. Reference the test case names.

Initial state of the system before the test scenario starts
    Describe in all details the state of the application as well as any element of the system environment that is of importance during the test; e.g., connection to a bar-code reader, particular HTTP port enabled.

State of the data used by the system before the scenario starts
    Describe in all details the state of the data in the system, as it has to be present before the test is conducted. This may entail describing the steps to prepare the data and referencing any automated script or another document that describes the procedure to prepare the data for the particular test.

Test scenario steps
    Enumerate the test cases that have to be applied. Identify repeating sets of test cases and how many repetitions; for example:
    1. Execute test case: A
    2. Execute test case: B
    3. Execute test case: C
    4. Execute test case: D
    Execute 2 followed by 3 for 7 times.

Additional directives and information to collect
    Any extra details on the test execution and what information to collect or verify in order to assess the success of the test.

example, when the user shops for books, a test scenario can be composed of sign-in, repeat N times (browse catalogue; select books), check-out, sign-out. Test scenarios are particularly useful when developing load tests, where you should create a representative mix of use cases that are likely to happen in parallel during normal system operation. As test cases are the building blocks of test scenarios, these will be composed of one or more test cases. In test scenarios you might also mix the basic flow of events of some use cases with the alternate flows of other use cases in order to verify that the integrity of the data is maintained. Similar to the test cases, it is desirable to define a practical template for test scenarios. I will not cover test scenarios in any more detail in this book, as they do not bring significantly more insight than test cases to the discussion of functional testing.
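The book-shopping example above — sign-in, N repetitions of (browse catalogue; select books), check-out, sign-out — amounts to expanding a scenario definition into an ordered list of test cases to execute. A minimal sketch (the step encoding is my own assumption):

```python
def expand_scenario(steps):
    """Flatten a scenario into the ordered list of test cases to execute.
    A step is either a test case name or ("repeat", n, [sub-steps])."""
    out = []
    for step in steps:
        if isinstance(step, tuple) and step[0] == "repeat":
            _, n, sub = step
            for _ in range(n):
                out.extend(expand_scenario(sub))
        else:
            out.append(step)
    return out

shop_for_books = ["sign-in",
                  ("repeat", 3, ["browse catalogue", "select books"]),
                  "check-out", "sign-out"]
print(expand_scenario(shop_for_books))
```

For a load test, several such expanded scenarios would be launched in parallel to approximate a representative mix of concurrent sessions.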

Unit Test

A unit test is most likely to entail using a combination of white box and black box approaches. As seen in the Introduction, the black box approach to unit testing mandates that all implementation classes stub all the calls to other implementation classes that are external to the unit under consideration. As a small reminder and a matter of practicality, the strongly typed DataSet classes described in Chapter 6 are considered as .NET infrastructure classes and consequently trusted to be working correctly during unit tests. Stubbing the implementation classes is quite easy to achieve for three reasons:

• The class operation called has already been defined, as per applying the approach of Chapter 6. Thus, the stub structure already exists.
• By virtue of the design pattern described in Chapter 6, the only implementation classes that should ever be returned to the caller are the strongly typed DataSets, as representing system data. As these classes are trusted, you can create instances inside the stub and populate them with the appropriate data needed for the unit test.
• Other classes that need to be returned are .NET framework classes, which are also trusted and can easily be created and populated inside the stub with the appropriate data for the unit test.


happens in Extreme Programming (XP), where the process also takes a similar approach in putting even more of an emphasis on specifying test cases as the basis for coding. Unit tests are effectively realized by code components that represent the concept of test drivers discussed in the Introduction. The general pattern of a test driver is to initialize the required objects, execute a call to the operation


namespace BooksREasy.UserAcctMgr.UserAcctMgrBLL
{
    using System;
    using BooksREasy.Common.UtilityClasses;
    using BooksREasy.UserAcctMgr.UserAcctMgrDAL.Stubs;

    public class AccountManager : MarshalByRefObject
    {
        /// <summary>
        /// getUserAccountByUserId: Locates the user account with the specified userId.
        /// </summary>
        /// <param name="userId">userId of the user to retrieve user account information for.</param>
        /// <returns>Typed UserInfo dataset that contains the UserAccount record with a matching userId.</returns>
        public UserInfo getUserAccountByUserId(string userId)
        {
            //
            // TODO: implement the operation
            //
        }
    }
}

// Stubs
namespace BooksREasy.UserAcctMgr.UserAcctMgrBLL.Stubs
{
    using BooksREasy.Common.UtilityClasses;

    public class AccountManager
    {
        public UserInfo getUserAccountByUserId(string userId)
        {
            UserInfo dsUser = new UserInfo();
            if (userId == "testbuyer10")
            {
                dsUser.UserAccount.AddUserAccountRow(dsUser.UserAccount.NewUserAccountRow());
                dsUser.UserAccount[0].UserId = "testbuyer10";
                dsUser.UserAccount[0].Password = "testbuyer10pwd";
                dsUser.UserAccount[0].Email = "testbuyer10@softgnosis.com";
                dsUser.UserAccount[0].Addr1 = "1, High Street";
                dsUser.UserAccount[0].City = "London";
                dsUser.UserAccount[0].Country = "UK";
                dsUser.UserAccount[0].SecretNumber = "1111";
                dsUser.UserAccount[0].State = "";
                dsUser.UserAccount[0].Status = CommonKeyWords.ACTIVE;
                dsUser.UserAccount[0].Zip = "W1";
            }
            return dsUser;
        }
    }
}

Figure 9-2: Defining a ".Stubs" namespace for each implementation class.

under test, and check the state of the returned objects. It is also a good practice to keep the unit test code along with the definition of its corresponding class and stub, using a namespace suffixed with ".Tests" in a similar way as for stubs. Figure 9-3 presents the test drivers. Note that you also need to have a utility application to run these tests and manage results. At this stage it is useful to use a unit test tool to automate this activity, which brings two advantages:

• It defines a structure for documenting the unit test, including the test code.
• It defines an environment to execute the tests and track test results.
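The utility application mentioned above can be approximated by a minimal runner that discovers test methods by naming convention and collects pass/fail results. This is a concept sketch in Python rather than the book's C#; the class names and the `lookup` stand-in function are my own assumptions, not the book's API.

```python
def lookup(user_id):
    # Stand-in for AccountManager.getUserAccountByUserId: returns the
    # number of matching UserAccount rows for the given user id.
    return 1 if user_id == "testbuyer10" else 0

class AccountManagerTests:
    def test_success(self):
        assert lookup("testbuyer10") == 1      # exactly one account row found

    def test_fail_invalid_user_id(self):
        assert lookup("dummyUserId") == 0      # no rows for an unknown user

def run_tests(suite):
    """Instantiate the suite, execute every test_* method, record results."""
    results = {}
    instance = suite()
    for name in dir(instance):
        if name.startswith("test_"):
            try:
                getattr(instance, name)()
                results[name] = "pass"
            except AssertionError:
                results[name] = "fail"
    return results

print(run_tests(AccountManagerTests))
```

Real unit test tools do essentially this discovery-and-report loop, plus structured result storage, which is why the two advantages above come for free once a tool is adopted.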

At the time of this writing, I have evaluated two unit test tools that were specifically designed for the .NET platform: NUnit and HarnessIt. Both take advantage of the attribute-based programming of .NET. Having defined the stubbed implementation classes and the test drivers, we are in a position to test the unit in complete isolation, as required by a black box approach to testing. In reality, it is not always easy or effective to implement a unit test exclusively using a black box approach. This is due mainly to the inherent complexity of the underlying .NET technological framework. As much as it


namespace BooksREasy.UserAcctMgr.UserAcctMgrBLL.Tests
{
    using System;
    using System.Diagnostics;
    using BooksREasy.Common.UtilityClasses;

    public class AccountManager_getUserAccountByUserId_Test
    {
        public void success()
        {
            UserInfo dsUser = new AccountManager().getUserAccountByUserId("testbuyer10");
            Debug.Assert(dsUser.UserAccount.Rows.Count == 1
                && dsUser.UserAccount[0].Status.Equals(CommonKeyWords.ACTIVE), "Success");
        }

        public void fail_Invalid_UserId()
        {
            UserInfo dsUser = new AccountManager().getUserAccountByUserId("dummyUserId");
            Debug.Assert(dsUser.UserAccount.Rows.Count == 0, "Success");
        }
    }
}

Figure 9-3: Defining a ".Tests" namespace for each implementation class.

System Test

In a simple and sufficient view for the discussion of this chapter, the system test consists of executing all the test cases defined so far for all the iterations of the system. Indeed, as described earlier, test cases are finalized at the same time as use cases. Thus, when the development team has developed the code implementing the use cases of the current iteration, it is possible for the test team to run the corresponding test cases. But at the same time, they also need to rerun all the test cases corresponding to the use cases of the previous iterations, in order to ensure that the new state of system development has not introduced any defects into a previously defect-free system. This part of the testing is named regression testing. It is clear by now that test cases will possibly be run a great number of times. For this reason, it is important to define a strategy to automate the test execution as much as possible. This entails using test automation tools, but also creating scripts that will set up the environment and initialize the system with the correct data. At the same time, you need to think of automation solutions to capture the state of the system and data upon completion of each test case.
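The regression rule — each iteration runs its own test cases plus a rerun of every test case from earlier iterations — can be sketched as a simple accumulation. The iteration contents below are illustrative placeholders, not from the book.

```python
iterations = [
    {"use_cases": ["Create Account"], "test_cases": ["TC1", "TC2", "TC3"]},
    {"use_cases": ["Sign In"],        "test_cases": ["TC4", "TC5"]},
]

def system_test_plan(iterations, current):
    """Test cases to run in iteration `current` (0-based): the new ones,
    plus a regression rerun of every test case from earlier iterations."""
    plan = []
    for it in iterations[:current + 1]:
        plan.extend(it["test_cases"])
    return plan

print(system_test_plan(iterations, 1))  # → ['TC1', 'TC2', 'TC3', 'TC4', 'TC5']
```

Because the plan grows with every iteration, the cost of manual execution grows with it — which is the argument the text makes for automating test execution and environment setup.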

Case Study

Applying the above approach to the case study, we will produce the functional test artifacts relating to the Create Account use case, focusing specifically on the basic flow of events.

Test Cases

The following test case matrix defines the three test cases that are sufficient to cover the use case scenario of creating a new user account. The start of the use case scenario specifies that the user accesses the Create Account screen either from a link on the main menu or through a button on the sign-in screen. Thus, the first two test cases, TC1 and TC2, are there to ensure that this part of the use case scenario is tested. In TC3, the State field is indifferent because the system does not do any complex validation between the Country, the State, and the ZIP/Postal Code fields. This is the result of a simple system specification and not an error in the design (see Table 9-6). If more complex validation becomes a requirement, the


change will ripple throughout the use case specifications and system design to also impact the test case, which would then need to be changed to account for that new requirement (see Chapter 10 for the impact analysis of a change request involving the validation of the ZIP/Postal Code). Table 9-7 is the test case matrix with the input data. Notice that we have not defined any data for the State field, as this is not a relevant concept for the United Kingdom.


Table 9-6: Test Case Matrix: Definitions of the Execution Condition and Test Results

Use Case:           Create Account
Use Case Scenario:  Basic Flow

Test Case ID          TC1                  TC2                  TC
Condition             Create Account is    Create Account is    Basic flow
                      accessible from      accessible from
                      Sign-In              menu item
Username              N/A                  N/A                  V
Password              N/A                  N/A                  V
Verify Password       N/A                  N/A                  V
E-mail address        N/A                  N/A                  V
Name on Credit Card   N/A                  N/A                  V
Card Number           N/A                  N/A                  V
Card Type             N/A                  N/A                  V
Expiration            N/A                  N/A                  V
Street Address        N/A                  N/A                  V
City                  N/A                  N/A                  V
State                 N/A                  N/A                  N/A
ZIP/Postal Code       N/A                  N/A                  V
Country               N/A                  N/A                  V
Expected Results      (a)                  (b)                  (c)

(a) Create Account page is displayed.
(b) Create Account page is displayed.
(c) Account created and user signed in. Message displayed: signed in.