Purpose of this document
The purpose of this document is to outline the test strategy and overall test approach for the Online Backstage Management System. This includes test methodologies, traceability, resources required, and estimated schedule.
This section describes the objectives and extent of the tests. The goal is to provide a framework that can be used by managers and testers to plan and execute the necessary tests in a timely and cost-effective manner.
OBMS | Online Backstage Management System
RSSS | Royal South Street Society
SADD | Software Architecture and Design Document
JSP | Java Server Pages
XML | Extensible Markup Language
AJAX | Asynchronous JavaScript and XML
HTML | Hypertext Markup Language
CSS | Cascading Style Sheets
The OBMS is a web-based application developed for the RSSS to automate the management of competitors and results. The society wishes to migrate from its paper system to an electronic online system to save on operational costs, risks, and time (Wohlin, Runeson, Höst, Ohlsson, Regnell, & Wesslén, 2012).
1.3.1 Software Architecture overview (Imported from the SADD)
The OBMS 3-tier architecture and associated technologies
System components
Use case diagram
Login
This section describes the general approach to the testing process. It discusses the reasons for the selected integration testing strategy. Different strategies are often needed to test different parts of the system (Pressman, 2015).
Unit testing and component testing will be performed on the components as they are being developed. Tests will be executed using test code, either as custom test tools or as an automated suite of tests run against the components in their individual sandboxes (Bourque & Fairley, 2014).
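As an illustration, a component-level test of this kind might look like the following Python sketch. The scoring component and all of its names are hypothetical, since the actual OBMS components are JSP/Java; the sketch only shows the style of automated check the sandbox suite would contain.

```python
# Hypothetical component under test: averaging adjudicators' marks.
# Names (score_competitor, run_unit_tests) are illustrative, not OBMS code.

def score_competitor(raw_marks):
    """Average the adjudicators' marks for one competitor."""
    if not raw_marks:
        raise ValueError("no marks submitted")
    return sum(raw_marks) / len(raw_marks)

def run_unit_tests():
    """Tiny stand-in for the automated suite run in the component sandbox."""
    results = {}
    results["average"] = score_competitor([80, 90, 85]) == 85.0
    try:
        score_competitor([])
        results["rejects_empty"] = False
    except ValueError:
        results["rejects_empty"] = True
    return results

print(run_unit_tests())  # {'average': True, 'rejects_empty': True}
```

In the real suite each component would carry a set of such checks, re-run automatically after every change.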
Integration tests will be performed by both the component testers and the system testers. The BAT and the unit test suite will be used as a regression suite during the integration of components. However, as the integration begins to include GUI-level functionality, the tests will rely significantly more on manual testing and less on automated testing.
Because the components will be developed both bottom-up and top-down, the test strategy will follow the order in which the components are developed. This will be a mostly bottom-up integration test approach, supplemented by the sandwich integration approach.
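The bottom-up integration order implied by this strategy can be derived mechanically from the component dependency graph. The following Python sketch uses an assumed, illustrative set of OBMS components; the real component breakdown is defined in the SADD.

```python
# Derive a bottom-up integration order from component dependencies.
# The component names and edges below are illustrative assumptions.
from graphlib import TopologicalSorter

# component -> components it depends on (lower layers integrate first)
deps = {
    "Database": [],
    "DataAccess": ["Database"],
    "Registration": ["DataAccess"],
    "Results": ["DataAccess"],
    "GUI": ["Registration", "Results"],
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # lower-level components come before the GUI
```

In a sandwich approach, high-level GUI stubs would be exercised in parallel while the lower layers are integrated in this order.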
This section outlines the features of the OBMS that will be tested and those that will not be tested. It also describes the tools that will be used and the environment in which the tests will be carried out.
Components developed in house
All components developed in house by this organization for the RSSS will be tested.
Off-The-Shelf and third party components
It is assumed that off-the-shelf and third-party components were evaluated, and their pros and cons properly weighed, before being chosen for use with our software. The interfaces to those components will be tested, but not the functionality or performance of the components themselves.
The MySQL database management software is assumed to work as designed and will not be directly tested for functionality. However, performance tests involving the database will be carried out during system test, with respect to GUI response time.
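A GUI response-time check of the kind described above could be scripted along these lines. The 2-second budget and the simulated database-backed request are assumptions for illustration; the actual thresholds would come from the requirements.

```python
# Time a request path that touches the database and compare it against
# an acceptance budget. The budget and the fake request are assumptions.
import time

RESPONSE_BUDGET_S = 2.0  # assumed acceptance threshold, not a stated requirement

def measure(action):
    """Return the elapsed wall-clock time of one call to action()."""
    start = time.perf_counter()
    action()
    return time.perf_counter() - start

def fake_db_backed_screen():
    time.sleep(0.01)  # stands in for a real GUI request hitting MySQL

elapsed = measure(fake_db_backed_screen)
print(elapsed < RESPONSE_BUDGET_S)
```

A dedicated load tool (see Section on test resources) would repeat such measurements under concurrent load rather than a single call.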
No direct tests will be carried out on the internet/Wi-Fi backbone either; it will only be used while testing the system's components and functionality.
This section identifies the resources needed to support testing, including hardware, software, special test tools, and other resources (Bourque & Fairley, 2014).
The team will need a lab for the testing exercise. A lab area of about 600 sq ft, with 200 sq ft of desktop space, will suffice for the testing procedure. The lab area needs multiple power outlets on each side of the room. A table in the center of the room will also make it easy for the test team to hold technical discussions.
To enable the team to test in an optimal environment, the lab needs to contain 4 copies of the system under test. The hardware components of the system are a Database Server, a Web Server, three client PCs with a web browser that supports Java, and the embedded OBMS. The three client PCs allow the team to test several components in parallel.
The database software (MySQL) will be installed, set up, and properly configured on the database server.
The web server machine will have the Apache web server installed and its services started so that it can function properly as a web server.
The client machines need a compatible version of the JDK toolkit installed and properly configured, along with the Firefox web browser.
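Before each test session, a small smoke check can confirm that the lab servers are reachable. The hostnames and ports in this sketch are placeholder assumptions for the lab configuration, not documented values.

```python
# Pre-test environment smoke check: verify that the database and web
# server ports answer before a test session begins.
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

LAB_SERVICES = {
    "mysql": ("db-server.lab", 3306),   # assumed database server host
    "apache": ("web-server.lab", 80),   # assumed web server host
}

def environment_report(services, probe=port_open):
    """Map each service name to whether its port answered."""
    return {name: probe(host, port) for name, (host, port) in services.items()}

# Stubbed here so the sketch runs without the lab network; in the lab,
# call environment_report(LAB_SERVICES) with the default probe.
print(environment_report(LAB_SERVICES, probe=lambda host, port: True))
```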
Additional tools and software may need to be purchased, otherwise acquired, or reused. Such software is used to execute special tests that are best run, recorded, and analyzed by automated means. Load testing requires a tool such as LoadRunner; security testing at the compiled source-code level requires a tool such as FindBugs.
This section is the main concern of the test plan. It outlines the test cases that are used during testing. Each test case is described in detail in its own Test Case Specification document (Vij, McClure, & Ekaireb, 2014). All executions of these tests will be documented in a Test Incident Report document.
Test cases for functional requirements
Case 1 Import data from guide book
This function is for capturing the guide book details from the office system into the OBMS database.
Guide book document in Word format
Expected Output and Pass/Fail criteria
All the information from the Word document should be input into the OBMS database. There are no pass/fail criteria.
This function is for competitors to register when they arrive on the competition day to let the administrators know they are there and divulge any necessary information for their event.
Competitor’s age, name, section, and other information for section
Expected Output and Pass/Fail criteria
The expected output is the competitor being listed as “Registered” and “Available to compete”. The pass/fail criterion is that, once registered, the system should indicate that the competitor is registered and available for competition.
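This pass/fail criterion can be expressed as a small executable check. The Competitor model and the sample data below are hypothetical, not actual OBMS code.

```python
# Executable form of the registration pass/fail criterion: after
# register(), the competitor must read "Registered" and be available
# to compete. Competitor is an illustrative model only.

class Competitor:
    def __init__(self, name, section):
        self.name = name
        self.section = section
        self.status = "Not registered"
        self.available = False

    def register(self):
        """On-the-day registration: mark present and able to compete."""
        self.status = "Registered"
        self.available = True

competitor = Competitor("Jane Doe", "Vocal Solo")  # illustrative data
competitor.register()
print(competitor.status, competitor.available)
```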
The system’s screens should be configurable so that each user can see the information they want to see.
The various types of information that must be displayed or collected
Expected Output and Pass/Fail criteria
A built screen displaying the user's selections. There are no pass/fail criteria.
The adjudicators should be able to store records pertaining to competitors who won in the performances.
Competitor name, score, and place
Expected Output and Pass/Fail criteria
The competitors’ place order. There are no pass/fail criteria.
This should enable the RSSS staff to eliminate paper processes and dynamically distribute information to key staff members for updates.
Any updates
The updated information being displayed on the recipients’ devices. There are no pass/fail criteria.
The system must be able to print out information in hard copy for back up in case of system failure.
Section information
Expected Output and Pass/Fail criteria
The printed report. There are no pass/fail criteria.
The system should not only allow individuals to register, but also groups or schools.
Group/school name
The expected output is the group/school being registered. The pass/fail criteria are: the group/school must appear in the list, and any incorrect or incomplete information and incomplete fields should be highlighted.
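The field-highlighting criterion implies a validation step like the following sketch. The required field names are assumptions, since the document does not list them.

```python
# Validation behind the group/school registration pass/fail criteria:
# report required fields that are missing or blank so the GUI can
# highlight them. Field names are illustrative assumptions.

REQUIRED_FIELDS = ("group_name", "contact", "section")

def missing_fields(form):
    """Return the required fields that are absent or blank."""
    return [f for f in REQUIRED_FIELDS if not form.get(f, "").strip()]

complete = {"group_name": "St. Mary's Choir", "contact": "03 5555 0100", "section": "Choral"}
incomplete = {"group_name": "St. Mary's Choir", "contact": " "}

print(missing_fields(complete))    # []
print(missing_fields(incomplete))  # ['contact', 'section']
```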
The system should allow the chairman or manager to move a competitor to a different section before he/she registers.
The competitor’s name
Expected Output and Pass/Fail criteria
The competitor is moved to the new section and a confirmation message is displayed. There are no pass/fail criteria.
The system should enable a RSSS staff member to bring up a section and display the list of competitors to be registered for a particular event.
The section where competitors will be registered
Expected Output and Pass/Fail criteria
The list of competitors who will be registered. The pass/fail criteria are: the competitor must appear in the list, and any incorrect or incomplete information and incomplete fields should be highlighted.
This function enables the administrator to perform changes by logging in to the system.
Password
The administrator panel. The pass/fail criterion is that the password must be correct.
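This pass/fail criterion amounts to the following check. The stored credential and panel name are illustrative assumptions; a production system would use salted password hashing rather than this simplified comparison.

```python
# Simplified form of the administrator login check: a correct password
# yields the administrator panel, a wrong one does not. The credential
# and panel name here are placeholder assumptions.
import hashlib

STORED_HASH = hashlib.sha256(b"correct-horse").hexdigest()  # assumed credential

def admin_login(password):
    """Return the admin panel name on success, None on failure."""
    if hashlib.sha256(password.encode()).hexdigest() == STORED_HASH:
        return "administrator-panel"
    return None

print(admin_login("correct-horse"))  # administrator-panel
print(admin_login("wrong"))          # None
```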
This enables competitors to log in to the OBMS to view their customized screens according to their roles.
No inputs are required.
Expected Output and Pass/Fail criteria
The user’s customized screen. The pass/fail criterion is that if the user chooses the admin option, they will be prompted for a password.
This test fulfils the function of capturing the guide book details from the office system into the OBMS database.
Guide book document in Word format
All the information from the Word document should be input into the backup media. No pass/fail criteria exist.
The administrator should be able to perform a data backup/restore function in case data has been lost from the system.
Inputs
Input file.
The restored data. No pass/fail criteria exist.
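The backup/restore test reduces to a round-trip equality check, sketched below. JSON files stand in for the real OBMS backup format, which this document does not specify.

```python
# Round-trip backup/restore check: the restored data must equal the
# data that was backed up. JSON is an assumed stand-in format.
import json, os, tempfile

def backup(data, path):
    """Write the data set to the backup file."""
    with open(path, "w") as f:
        json.dump(data, f)

def restore(path):
    """Read the data set back from the backup file."""
    with open(path) as f:
        return json.load(f)

original = {"competitors": [{"name": "Jane Doe", "section": "Vocal Solo"}]}
path = os.path.join(tempfile.mkdtemp(), "obms-backup.json")
backup(original, path)
restored = restore(path)
print(restored == original)  # True: the implicit pass criterion
```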
The system should enable a RSSS staff member to view information concerning a particular competitor during an event.
Guide book document in Word format
All the information from the Word document should be input into the OBMS database. The pass criterion is that the database reflects the input information; otherwise the test fails.
References
Bourque, P., & Fairley, R. E. (2014). Guide to the software engineering body of knowledge
(SWEBOK (R)): Version 3.0. IEEE Computer Society Press.
De Lemos, R., Giese, H., Müller, H. A., Shaw, M., Andersson, J., Litoiu, M., … & Weyns, D.
(2013). Software engineering for self-adaptive systems: A second research roadmap. In Software Engineering for Self-Adaptive Systems II (pp. 1-32). Springer Berlin Heidelberg.
Pressman, R. S. (2015). Software engineering: A practitioner's approach. McGraw-Hill
International Edition.
Vij, R., McClure, D. K., & Ekaireb, M. (2014). U.S. Patent No. 8,676,529. Washington, DC:
U.S. Patent and Trademark Office.
Wohlin, C., Runeson, P., Höst, M., Ohlsson, M. C., Regnell, B., & Wesslén, A. (2012).
Experimentation in software engineering. Springer Science & Business Media.