Test your basic knowledge

Instructions:
  • Answer 50 questions in 15 minutes.
  • If you are not ready to take this test, you can study here.
  • Match each statement with the correct term.
  • Don't refresh. All questions and answers are randomly picked and ordered every time you load a test.

This is a study tool. The 3 wrong answers for each question are randomly chosen from the answers to other questions, so you may sometimes find the correct answer obvious. Taking the test repeatedly still reinforces your understanding.
1. Simulated or actual operational testing by potential users/customers or an independent test team at the developers' site, but outside the development organization. Alpha testing is often employed for off-the-shelf software as a form of internal acceptance testing.






2. A technique used to analyze the causes of faults (defects). The technique visually models how logical relationships between failures, human errors, and external events can combine to cause specific faults to disclose.






3. A statement of test objectives, and possibly test ideas about how to test. Test charters are used in exploratory testing. See also exploratory testing. An informal test design technique where the tester actively controls the design of the tests as those tests are performed and uses information gained while testing to design new and better tests.






4. The activity of establishing or updating a test plan.






5. Definition of user profiles in performance, load and/or stress testing. Profiles should reflect anticipated or actual usage based on an operational profile of a component or system, and hence the expected workload. See also load profile, operational profile.






6. A system whose failure or malfunction may result in death or serious injury to people, or loss or severe damage to equipment, or environmental harm.






7. The percentage of all condition outcomes and decision outcomes that have been exercised by a test suite. 100% decision condition coverage implies both 100% condition coverage and 100% decision coverage.
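For example, a minimal Python sketch (hypothetical function and test values) of a test set that reaches 100% decision condition coverage for one decision containing two conditions:

def ship_for_free(total, is_member):
    # One decision with two conditions: total > 50, is_member
    if total > 50 or is_member:
        return True
    return False

# Each condition and the overall decision take both outcomes:
assert ship_for_free(60, False) is True    # total > 50: T, is_member: F, decision: T
assert ship_for_free(10, True) is True     # total > 50: F, is_member: T, decision: T
assert ship_for_free(10, False) is False   # total > 50: F, is_member: F, decision: F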






8. The number or category assigned to an attribute of an entity by making a measurement. [ISO 14598]






9. A specification or software product that has been formally reviewed or agreed upon, that thereafter serves as the basis for further development, and that can be changed only through a formal change control process. [After IEEE 610]






10. Testing to determine how the occurrence of two or more activities within the same interval of time, achieved either by interleaving the activities or by simultaneous execution, is handled by the component or system. [After IEEE 610]






11. Acceptance testing by users/customers at their site, to determine whether or not a component or system satisfies the user/customer needs and fits within the business processes, normally including hardware as well as software.






12. A white box test design technique in which test cases are designed to execute definition and use pairs of variables.
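As an illustration, a small Python sketch (hypothetical code under test) annotated with the definition-use pairs of the variable 'rate' that such a test suite would aim to execute:

def discount(price, loyal):
    rate = 0                              # definition of rate (d1)
    if loyal:
        rate = 10                         # definition of rate (d2), in percent
    return price - price * rate // 100    # use of rate (u1)

# Definition-use pairs for 'rate': (d1, u1) and (d2, u1).
assert discount(100, False) == 100   # exercises pair (d1, u1)
assert discount(100, True) == 90     # exercises pair (d2, u1)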






13. Environmental and state conditions that must be fulfilled before the component or system can be executed with a particular test or test procedure.
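For instance, a minimal sketch using Python's unittest, where setUp establishes the preconditions (a hypothetical account with a known balance) before each test is executed:

import unittest

class TestWithdrawal(unittest.TestCase):
    def setUp(self):
        # Precondition: the state that must hold before the test can run.
        self.account = {"balance": 100}

    def test_withdraw_reduces_balance(self):
        self.account["balance"] -= 30
        self.assertEqual(self.account["balance"], 70)

if __name__ == "__main__":
    unittest.main()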






14. A tool that provides support for testing security characteristics and vulnerabilities.






15. A test case without concrete (implementation level) values for input data and expected results. Logical operators are used; instances of the actual values are not yet defined and/or available. See also low level test case. A test case with concrete (implementation level) values for input data and expected results. See also high level test case.






16. A transition between two states of a component or system.
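For instance, a tiny Python sketch (hypothetical order workflow) in which each allowed move between two states is one state transition that a test can target:

# (current state, event) -> next state
TRANSITIONS = {
    ("new", "pay"): "paid",
    ("paid", "ship"): "shipped",
    ("shipped", "deliver"): "delivered",
}

def apply_event(state, event):
    return TRANSITIONS.get((state, event), state)

assert apply_event("new", "pay") == "paid"   # exercises the new -> paid transition
assert apply_event("new", "ship") == "new"   # invalid event: no transition taken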






17. Commonly used to refer to a test procedure specification - especially an automated one.






18. A group of test activities that are organized and managed together. A test level is linked to the responsibilities in a project. Examples of test levels are component test, integration test, system test and acceptance test. [After TMap]






19. The component or system to be tested. See also test item. The individual element to be tested. There usually is one test object and many test items. See also test object. A reason or purpose for designing and executing a test.






20. The capability of the software product to achieve acceptable levels of risk of harm to people, business, software, property or the environment in a specified context of use. [ISO 9126]






21. An incremental approach to integration testing where the lowest level components are tested first, and then used to facilitate the testing of higher level components. This process is repeated until the component at the top of the hierarchy is tested.






22. A type of peer review that relies on visual examination of documents to detect defects, e.g. violations of development standards and non-conformance to higher level documentation. The most formal review technique and therefore always based on a documented procedure.






23. A type of integration testing in which software elements, hardware elements, or both are combined all at once into a component or an overall system, rather than in stages. [After IEEE 610] See also integration testing. Testing performed to expose defects in the interfaces and in the interactions between integrated components or systems.






24. Test execution carried out by following a previously documented sequence of tests.






25. A document specifying a sequence of actions for the execution of a test. Also known as test script or manual test script. [After IEEE 829]






26. An abstract representation of all possible sequences of events (paths) in the execution through a component or system.






27. The function to check on the contents of libraries of configuration items - e.g. for standards compliance. [IEEE 610]






28. A framework to describe the software development life cycle activities from requirements specification to maintenance. The V-model illustrates how testing activities can be integrated into each phase of the software development life cycle.






29. The process of running a test on the component or system under test - producing actual result(s).






30. The implementation of the test strategy for a specific project. It typically includes the decisions made based on the (test) project's goal and the risk assessment carried out, starting points regarding the test process, the test design techniques to be applied, exit criteria and test types to be performed.






31. A form of static analysis based on a representation of sequences of events (paths) in the execution through a component or system.






32. The capability of the software product to enable modified software to be tested. [ISO 9126] See also maintainability. The ease with which a software product can be modified to correct defects, modified to meet new requirements, modified to make future maintenance easier, or adapted to a changed environment.






33. A test approach in which the test suite comprises all combinations of input values and preconditions.
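As a worked example, a component with three independent boolean inputs would need 2 ** 3 = 8 test cases under this approach; a quick Python sketch enumerating them:

from itertools import product

# Exhaustive testing: every combination of input values (and preconditions).
all_cases = list(product([False, True], repeat=3))
assert len(all_cases) == 8
for beta, admin, opted_in in all_cases:
    print(beta, admin, opted_in)   # one test case per combination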






34. Formal or informal testing conducted during the implementation of a component or system - usually in the development environment by developers. [After IEEE 610]






35. The total costs incurred on quality activities and issues, often split into prevention costs, appraisal costs, internal failure costs and external failure costs.






36. A technique used to characterize the elements of risk. The result of a hazard analysis will drive the methods used for development and testing of a system. See also risk analysis. The process of assessing identified risks to estimate their impact and probability of occurrence (likelihood).






37. A capability maturity model structure wherein capability levels provide a recommended order for approaching process improvement within specified process areas. [CMMI]






38. The percentage of paths that have been exercised by a test suite. 100% path coverage implies 100% LCSAJ coverage.
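For example, a short Python sketch (hypothetical function) with two sequential decisions, giving four execution paths that a suite must exercise for 100% path coverage:

def classify(age, income):
    labels = []
    if age >= 65:          # decision 1
        labels.append("senior")
    if income < 20000:     # decision 2
        labels.append("low-income")
    return labels

# The four paths correspond to the outcome pairs (T,T), (T,F), (F,T), (F,F):
assert classify(70, 10000) == ["senior", "low-income"]
assert classify(70, 50000) == ["senior"]
assert classify(30, 10000) == ["low-income"]
assert classify(30, 50000) == []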






39. A sequence of events (paths) in the execution through a component or system.






40. A five level staged framework that describes the key elements of an effective software process. The Capability Maturity Model covers best-practices for planning, engineering and managing software development and maintenance. [CMM] See also Capability Maturity Model Integration (CMMI).






41. An instance of an input. See also input. A variable (whether stored within a component or outside) that is read by a component.






42. A tool for seeding (i.e. intentionally inserting) faults in a component or system.






43. A black box test design technique in which test cases are designed to execute business procedures and processes. [TMap] See also procedure testing. Testing aimed at ensuring that the component or system can operate in conjunction with new or existing users' business processes or operational procedures.






44. A five level staged framework for test process improvement - related to the Capability Maturity Model Integration (CMMI) - that describes the key elements of an effective test process.






45. The percentage of combinations of all single condition outcomes within one statement that have been exercised by a test suite. 100% multiple condition coverage implies 100% condition determination coverage.
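For instance, a compact Python sketch (hypothetical decision with two conditions) where 100% multiple condition coverage requires all 2 ** 2 = 4 combinations of condition outcomes:

def can_rent(age, has_license):
    return age >= 21 and has_license   # two conditions within one decision

# All combinations of the single condition outcomes:
assert can_rent(25, True) is True     # (True,  True)
assert can_rent(25, False) is False   # (True,  False)
assert can_rent(18, True) is False    # (False, True)
assert can_rent(18, False) is False   # (False, False)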






46. A variable (whether stored within a component or outside) that is written by a component.






47. Testing using input values that should be rejected by the component or system. See also error tolerance. The ability of a system or component to continue normal operation despite the presence of erroneous inputs. [After IEEE 610]
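A brief Python sketch of this kind of testing (hypothetical parser), checking that input values that should be rejected actually are rejected:

def parse_quantity(text):
    value = int(text)                  # raises ValueError for non-numeric input
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value

# Negative tests: inputs the component should reject.
for bad_input in ["abc", "-3", "0"]:
    try:
        parse_quantity(bad_input)
    except ValueError:
        pass                           # rejection is the expected outcome
    else:
        raise AssertionError(f"{bad_input!r} was not rejected")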






48. Testing the changes to an operational system or the impact of a changed environment to an operational system.






49. Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorized entity to determine whether or not to accept the system. [After IEEE 610]






50. A type of test tool that enables data to be selected from existing databases or created - generated - manipulated and edited for use in testing.