Test your basic knowledge

Instructions:
  • Answer 50 questions in 15 minutes.
  • If you are not ready to take this test, study the material first.
  • Match each statement with the correct term.
  • Don't refresh. All questions and answers are randomly picked and ordered every time you load a test.

This is a study tool. The three wrong answers for each question are randomly chosen from the answers to other questions, so you may sometimes find the correct answer obvious. Even so, retaking the test reinforces your understanding each time.

1. A framework to describe the software development life cycle activities from requirements specification to maintenance. The V-model illustrates how testing activities can be integrated into each phase of the software development life cycle.






2. Data that exists (for example, in a database) before a test is executed, and that affects or is affected by the component or system under test.
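
Study note: as an illustration only, the sketch below (Python, standard-library sqlite3; the table, column names and values are invented) shows data pre-loaded into a database before the test runs and then affected by the component under test.

    import sqlite3

    # Pre-load the database with data that exists before the test runs.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
    conn.execute("INSERT INTO accounts (id, balance) VALUES (1, 100.0)")
    conn.commit()

    # The component under test reads and modifies this pre-existing data.
    def withdraw(connection, account_id, amount):
        connection.execute(
            "UPDATE accounts SET balance = balance - ? WHERE id = ?",
            (amount, account_id),
        )

    withdraw(conn, 1, 25.0)
    balance = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
    assert balance == 75.0  # the test data was affected by the test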






3. A tool that facilitates the recording and status tracking of defects and changes. They often have workflow-oriented facilities to track and control the allocation, correction and re-testing of defects and provide reporting facilities. See also incident management tool.






4. Computer programs, procedures, and possibly associated documentation and data pertaining to the operation of a computer system. [IEEE 610]






5. A tool that supports operational security.






6. A system of (hierarchical) categories designed to be a useful aid for reproducibly classifying defects.






7. A capability maturity model structure wherein capability levels provide a recommended order for approaching process improvement within specified process areas. [CMMI]






8. Operational testing in the acceptance test phase, typically performed in a simulated real-life operational environment by operator and/or administrator focusing on operational aspects, e.g. recoverability, resource-behavior, installability and technical compliance.






9. The use of software, e.g. capture/playback tools, to control the execution of tests, the comparison of actual results to expected results, the setting up of test preconditions, and other test control and reporting functions.
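
Study note: a minimal, illustrative Python sketch of what such a tool automates, comparing actual results against expected results; the component and test cases are invented.

    # Illustrative only: a tiny driver that executes test cases and compares
    # actual results against expected results, as a test execution tool would.
    def component_under_test(x):          # stand-in for the real component
        return x * 2

    test_cases = [
        {"input": 2, "expected": 4},
        {"input": -1, "expected": -2},
    ]

    for case in test_cases:
        actual = component_under_test(case["input"])
        status = "PASS" if actual == case["expected"] else "FAIL"
        print(f"input={case['input']} expected={case['expected']} actual={actual} -> {status}")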






10. The first executable statement within a component.






11. Supplied software on any suitable media, which leads the installer through the installation process. It normally runs the installation process, provides feedback on installation results, and prompts for options.






12. A group of test activities aimed at testing a component or system focused on a specific test objective, i.e. functional test, usability test, regression test etc. A test type may take place on one or more test levels or test phases. [After TMap]






13. A scheme for the execution of test procedures. The test procedures are included in the test execution schedule in their context and in the order in which they are to be executed.






14. The behavior produced/observed when a component or system is tested.






15. A white box test design technique in which test cases are designed to execute condition outcomes.
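
Study note: an illustrative Python sketch (the function and values are invented) in which the test set makes each atomic condition evaluate to both true and false.

    # Decision with two atomic conditions; condition testing requires each
    # condition to take both outcomes (True and False) across the test set.
    def approve(age, income):
        return age >= 18 and income > 30000

    # age >= 18:      True in cases 1-2, False in case 3
    # income > 30000: True in case 1,    False in cases 2-3
    assert approve(25, 40000) is True    # case 1: T, T
    assert approve(25, 20000) is False   # case 2: T, F
    assert approve(16, 20000) is False   # case 3: F, F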






16. Software developed specifically for a set of users or customers. The opposite is off-the-shelf software.






17. An item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute, or structural element.






18. The fundamental test process comprises test planning and control, test analysis and design, test implementation and execution, evaluating exit criteria and reporting, and test closure activities.






19. The ability to identify related items in documentation and software, such as requirements with associated tests. See also horizontal traceability, vertical traceability. The tracing of requirements for a test level through the layers of test documentation (e.g. test plan, test design specification, test case specification and test procedure specification).






20. Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorized entity to determine whether or not to accept the system.






21. Analysis of software artifacts, e.g. requirements or code, carried out without execution of these software artifacts.






22. Testing practice for a project using agile methodologies, such as extreme programming (XP), treating development as the customer of testing and emphasizing the test-first design paradigm. See also test driven development. A way of developing software where the test cases are developed, and often automated, before the software is developed to run those test cases.






23. The person responsible for project management of testing activities and resources, and evaluation of a test object. The individual who directs, controls, administers, plans and regulates the evaluation of a test object.






24. The total costs incurred on quality activities and issues, often split into prevention costs, appraisal costs, internal failure costs and external failure costs.






25. An independent evaluation of software products or processes to ascertain compliance to standards, guidelines, specifications, and/or procedures based on objective criteria, including documents that specify: (1) the form or content of the products to be produced, (2) the process by which the products shall be produced, and (3) how compliance to standards or guidelines shall be measured. [IEEE 1028]






26. A form of state transition testing in which test cases are designed to execute all valid sequences of N+1 transitions. [Chow] See also state transition testing. A black box test design technique in which test cases are designed to execute valid and invalid state transitions.
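
Study note: an illustrative Python sketch of state transition testing with a toy state machine (states and events invented), exercising valid and invalid transitions.

    # A toy state machine for a document workflow (states/events invented).
    TRANSITIONS = {
        ("draft", "submit"): "review",
        ("review", "approve"): "published",
        ("review", "reject"): "draft",
    }

    def next_state(state, event):
        key = (state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"invalid transition: {event} from {state}")
        return TRANSITIONS[key]

    # Valid sequence of single transitions (0-switch coverage):
    assert next_state("draft", "submit") == "review"
    assert next_state("review", "approve") == "published"

    # Invalid transition: the event is not allowed in this state.
    try:
        next_state("draft", "approve")
    except ValueError:
        pass  # expected: the invalid transition is rejected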






27. Definition of user profiles in performance, load and/or stress testing. Profiles should reflect anticipated or actual usage based on an operational profile of a component or system, and hence the expected workload. See also load profile, operational profile.






28. The ratio of the number of failures of a given category to a given unit of measure, e.g. failures per unit of time, failures per number of transactions, failures per number of computer runs. [IEEE 610]
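
Study note: a worked example of the arithmetic (all figures invented):

    # Failure rate = failures of a given category / unit of measure.
    failures = 12            # observed failures (invented figure)
    operating_hours = 400    # total test hours (invented figure)
    transactions = 150000    # total transactions processed (invented figure)

    print(failures / operating_hours)   # 0.03 failures per hour
    print(failures / transactions)      # 8e-05 failures per transaction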






29. Tests aimed at showing that a component or system does not work. Negative testing is related to the testers' attitude rather than a specific test approach or test design technique, e.g. testing with invalid input values or exceptions. [After Beizer]
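
Study note: an illustrative Python sketch of negative testing, feeding invalid inputs and expecting rejection; the function and inputs are invented.

    # Negative tests deliberately feed invalid input and expect rejection.
    def parse_age(text):
        value = int(text)            # raises ValueError for non-numeric text
        if not 0 <= value <= 130:
            raise ValueError("age out of range")
        return value

    for bad_input in ["abc", "-5", "999"]:
        try:
            parse_age(bad_input)
        except ValueError:
            print(f"{bad_input!r} correctly rejected")
        else:
            print(f"{bad_input!r} was wrongly accepted")  # would be a defect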






30. The level of (business) importance assigned to an item, e.g. defect.






31. Testing where components or systems are integrated and tested one or some at a time, until all the components or systems are integrated and tested.






32. A technique used to analyze the causes of faults (defects). The technique visually models how logical relationships between failures, human errors, and external events can combine to cause specific faults to disclose.
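
Study note: a toy fault tree, reduced to boolean logic for illustration; the events and gate structure are invented.

    from itertools import product

    # Toy fault tree: the top-level fault occurs if the primary fails AND the
    # backup fails, OR if an operator error occurs (all events invented).
    def top_event(primary_fails, backup_fails, operator_error):
        return (primary_fails and backup_fails) or operator_error

    # Enumerate basic-event combinations to see which ones cause the fault.
    for p, b, o in product([False, True], repeat=3):
        if top_event(p, b, o):
            print(f"fault when primary={p} backup={b} operator_error={o}")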






33. The process of identifying risks using techniques such as brainstorming, checklists and failure history.






34. A black box test design technique in which test cases are designed based on boundary values. See also boundary value. An input value or output value which is on the edge of an equivalence partition or at the smallest incremental distance on either side of an edge, for example the minimum or maximum value of a range.
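
Study note: an illustrative Python sketch deriving boundary values for an invented valid range of 1..100.

    # For an integer input valid in the range 1..100, boundary value analysis
    # picks values on each edge and at the smallest increment on either side.
    LOW, HIGH = 1, 100
    boundary_values = [LOW - 1, LOW, LOW + 1, HIGH - 1, HIGH, HIGH + 1]
    print(boundary_values)  # [0, 1, 2, 99, 100, 101]

    def is_valid(x):
        return LOW <= x <= HIGH

    for x in boundary_values:
        print(x, "valid" if is_valid(x) else "invalid")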






35. Artifacts produced during the test process required to plan, design, and execute tests, such as documentation, scripts, inputs, expected results, set-up and clear-up procedures, files, databases, environment, and any additional software or utilities used in testing.






36. A static usability test technique to determine the compliance of a user interface with recognized usability principles (the so-called "heuristics").






37. The process of testing to determine the resource-utilization of a software product. See also efficiency testing. The process of testing to determine the efficiency of a software product.






38. A variable (whether stored within a component or outside) that is written by a component.






39. The process of identifying differences between the actual results produced by the component or system under test and the expected results for a test. Test comparison can be performed during test execution (dynamic comparison) or after test execution.
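
Study note: an illustrative Python sketch of both comparison modes; the function, data and file name are invented.

    # Dynamic comparison happens during execution; post-execution comparison
    # checks a recorded actual result afterwards.
    def sort_names(names):
        return sorted(names)

    expected = ["Ada", "Alan", "Grace"]

    # Dynamic comparison, inside the running test:
    assert sort_names(["Grace", "Ada", "Alan"]) == expected

    # Post-execution comparison: actual results are captured to a file first,
    # then compared against the expected results after the test has run.
    with open("actual_results.txt", "w") as f:
        f.write("\n".join(sort_names(["Grace", "Ada", "Alan"])))
    with open("actual_results.txt") as f:
        assert f.read().splitlines() == expected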






40. Acceptance testing by users/customers at their site, to determine whether or not a component or system satisfies the user/customer needs and fits within the business processes, normally including hardware as well as software.






41. Statistical testing using a model of system operations (short duration tasks) and their probability of typical use. [Musa]






42. Supplied instructions on any suitable media, which guides the installer through the installation process. This may be a manual guide, step-by-step procedure, installation wizard, or any other similar process description.






43. A black box test design technique in which test cases are designed to execute representatives from equivalence partitions. In principle test cases are designed to cover each partition at least once.
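
Study note: an illustrative Python sketch with an invented valid range of 1..100 split into three partitions, each covered by one representative value.

    # Input: an integer in 1..100 is valid. Three equivalence partitions:
    # below the range, inside the range, above the range (values invented).
    partitions = {
        "below range (invalid)": 0,
        "in range (valid)": 50,
        "above range (invalid)": 101,
    }

    def is_valid(x):
        return 1 <= x <= 100

    # One representative per partition covers each partition at least once.
    for name, representative in partitions.items():
        result = "accepted" if is_valid(representative) else "rejected"
        print(name, representative, "->", result)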






44. A version of component integration testing where the progressive integration of components follows the implementation of subsets of the requirements, as opposed to the integration of components by levels of a hierarchy.






45. Testing based on an analysis of the internal structure of the component or system.






46. A test result in which a defect is reported although no such defect actually exists in the test object.






47. A model that shows the growth in reliability over time during continuous testing of a component or system as a result of the removal of defects that result in reliability failures.
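
Study note: one common form of such a model is an exponential (Goel-Okumoto-style) curve; the sketch below uses invented parameter values purely for illustration.

    import math

    # Exponential reliability growth curve (Goel-Okumoto form): the expected
    # cumulative number of failures is mu(t) = a * (1 - exp(-b * t)).
    a, b = 100.0, 0.05   # a: total expected failures, b: detection rate (invented)

    for t in [0, 10, 40, 80]:              # test time in hours
        mu = a * (1 - math.exp(-b * t))    # failures found so far
        lam = a * b * math.exp(-b * t)     # current failure intensity
        print(f"t={t:3d}h  found={mu:5.1f}  intensity={lam:.2f}/h")

    # As defects are removed, the failure intensity falls, i.e. reliability grows.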






48. The set from which valid input and/or output values can be selected.






49. A framework that describes the key elements of an effective product development and maintenance process. The Capability Maturity Model Integration covers best-practices for planning, engineering and managing product development and maintenance. CMMI is the designated successor of the CMM. [CMMI]






50. A detailed check of the test basis to determine whether the test basis is at an adequate quality level to act as an input document for the test process. [After TMap]