Test your basic knowledge

Instructions:
  • Answer 50 questions in 15 minutes.
  • If you are not ready to take this test, you can study here.
  • Match each statement with the correct term.
  • Don't refresh the page: all questions and answers are randomly selected and reordered every time you load a test.

This is a study tool. The three wrong answers for each question are randomly drawn from the answers to other questions, so at times the correct answer may seem obvious, but retaking the test reinforces your understanding.
1. Testing based on an analysis of the internal structure of the component or system.






2. The capability of the software product to provide appropriate performance, relative to the amount of resources used, under stated conditions. [ISO 9126]






3. The testing activities that must be repeated when testing is re-started after a suspension. [After IEEE 829]






4. Testing based on an analysis of the internal structure of the component or system.






5. An item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute, or structural element.






6. An element of configuration management, consisting of selecting the configuration items for a system and recording their functional and physical characteristics in technical documentation. [IEEE 610]






7. Separation of responsibilities, which encourages the accomplishment of objective testing. [After DO-178B]






8. A development life cycle where a project is broken into a usually large number of iterations. An iteration is a complete development loop resulting in a release (internal or external) of an executable product, a subset of the final product under development, which grows from iteration to iteration to become the final product.






9. A test tool to perform automated test comparison of actual results with expected results.
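As a study aid, here is a minimal, hypothetical sketch of what such a comparator does at its core: diff actual results against expected results and report the mismatches. Real comparator tools work on screens, files, or database states; the field names below are invented for illustration.

```python
def compare(actual, expected):
    """Minimal test comparator: compare actual vs. expected values.

    Hypothetical sketch -- returns a verdict plus the fields that differ.
    """
    mismatches = {key: (actual.get(key), want)
                  for key, want in expected.items()
                  if actual.get(key) != want}
    return {"passed": not mismatches, "mismatches": mismatches}

# Example: the "total" field differs, so the comparison fails.
result = compare({"status": 200, "total": 99},
                 {"status": 200, "total": 100})
# result["passed"] is False; result["mismatches"] == {"total": (99, 100)}
```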






10. A tool that provides support for testing security characteristics and vulnerabilities.






11. A systematic approach to risk identification and analysis of identifying possible modes of failure and attempting to prevent their occurrence. See also Failure Mode, Effect and Criticality Analysis (FMECA).






12. Testing of software or specification by manual simulation of its execution. See also static analysis. Analysis of software artifacts, e.g. requirements or code, carried out without execution of these software artifacts.






13. A risk directly related to the test object. See also risk. A factor that could result in future negative consequences; usually expressed as impact and likelihood.






14. A document reporting on any event that occurred, e.g. during the testing, which requires investigation. [After IEEE 829]






15. The testing of individual software components. [After IEEE 610]






16. The ratio of the number of failures of a given category to a given unit of measure, e.g. failures per unit of time, failures per number of transactions, failures per number of computer runs. [IEEE 610]
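The definition is just a ratio; a one-line sketch (with invented numbers) makes the arithmetic concrete:

```python
def failure_rate(failures, units):
    """Failure rate: failures of a given category per unit of measure."""
    return failures / units

# Hypothetical example: 4 failures observed over 200 hours of operation.
rate = failure_rate(4, 200)  # 0.02 failures per hour
```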






17. A test design technique where the experience of the tester is used to anticipate what defects might be present in the component or system under test as a result of errors made, and to design tests specifically to expose them.






18. Testing the methods and processes used to access and manage the data(base), to ensure access methods, processes and data rules function as expected, and that during access to the database, data is not corrupted or unexpectedly deleted, updated or created.






19. A detailed check of the test basis to determine whether the test basis is at an adequate quality level to act as an input document for the test process. [After TMap]






20. A transition between two states of a component or system.






21. A type of integration testing in which software elements, hardware elements, or both are combined all at once into a component or an overall system, rather than in stages. [After IEEE 610] See also integration testing. Testing performed to expose defects in the interfaces and in the interactions between integrated components or systems.






22. A device or storage area used to store data temporarily for differences in rates of data flow, time or occurrence of events, or amounts of data that can be handled by the devices or processes involved in the transfer or use of the data. [IEEE 610]
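A minimal sketch of the idea, using a queue (all names here are invented): a fast producer deposits data into the buffer, and a slower consumer drains it later at its own rate.

```python
from collections import deque

# Hypothetical buffer smoothing a rate difference between producer
# and consumer. Real buffers usually have a fixed capacity.
buffer = deque()

def produce(items):
    for item in items:
        buffer.append(item)   # producer writes at its own rate

def consume():
    # consumer drains the buffered data when it is ready
    return buffer.popleft() if buffer else None

produce(["pkt1", "pkt2", "pkt3"])  # burst arrives all at once
first = consume()                  # "pkt1": data was held until now
```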






23. The fundamental test process comprises test planning and control, test analysis and design, test implementation and execution, evaluating exit criteria and reporting, and test closure activities.






24. A grid showing the resulting transitions for each state combined with each possible event - showing both valid and invalid transitions.
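Such a grid can be sketched as a lookup table. The door states and events below are invented for illustration: every (state, event) pair not listed is an invalid transition.

```python
# Hypothetical state table for a simple door: keys are (state, event)
# pairs; missing pairs are invalid transitions.
STATE_TABLE = {
    ("closed", "open"):   "opened",
    ("closed", "lock"):   "locked",
    ("opened", "close"):  "closed",
    ("locked", "unlock"): "closed",
}

def next_state(state, event):
    """Return the resulting state, or None for an invalid transition."""
    return STATE_TABLE.get((state, event))

next_state("closed", "open")  # "opened" -- a valid transition
next_state("locked", "open")  # None -- invalid: a locked door can't open
```

Testing invalid cells of the table (the `None` entries) is exactly what distinguishes this grid from a plain state diagram, which typically shows only valid transitions.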






25. Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorized entity to determine whether or not to accept the system.






26. A review of a software work product by colleagues of the producer of the product for the purpose of identifying defects and improvements. Examples are inspection, technical review and walkthrough.






27. A set of interrelated activities, which transform inputs into outputs. [ISO 12207]






28. A black box test design technique in which test cases are designed to execute valid and invalid state transitions. See also N-switch testing. A form of state transition testing in which test cases are designed to execute all valid sequences of N+1 transitions.






29. A form of static analysis based on a representation of sequences of events (paths) in the execution through a component or system.






30. An expert based test estimation technique that aims at making an accurate estimation using the collective wisdom of the team members.






31. A white box test design technique in which test cases are designed to execute LCSAJs.






32. A diagram that depicts the states that a component or system can assume, and shows the events or circumstances that cause and/or result from a change from one state to another. [IEEE 610]






33. A data item that specifies the location of another data item; for example, a data item that specifies the address of the next employee record to be processed. [IEEE 610]
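The IEEE 610 example can be sketched directly: each record stores the location (here, a list index standing in for an address) of the next record to process. Names and the -1 end marker are invented for illustration.

```python
# Hypothetical employee records; each "next" field is a pointer:
# a data item holding the index of the next record to be processed.
# -1 marks the end of the chain.
records = [
    {"name": "Avery", "next": 2},
    {"name": "Blake", "next": -1},
    {"name": "Casey", "next": 1},
]

def walk(records, start=0):
    """Follow the pointers from `start` and collect names in order."""
    names, i = [], start
    while i != -1:
        names.append(records[i]["name"])
        i = records[i]["next"]  # dereference the pointer
    return names

order = walk(records)  # ["Avery", "Casey", "Blake"]
```

Note the processing order differs from the storage order, which is the whole point of the pointer field.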






34. Testing that involves the execution of the software of a component or system.






35. The capability of the software to be understood, learned, used and attractive to the user when used under specified conditions. [ISO 9126]






36. A test environment comprised of stubs and drivers needed to execute a test.






37. A form of static analysis based on the definition and usage of variables.






38. The degree to which a component, system or process meets specified requirements and/or user/customer needs and expectations. [After IEEE 610]






39. A peer group discussion activity that focuses on achieving consensus on the technical approach to be taken. [Gilb and Graham, IEEE 1028] See also peer review. A review of a software work product by colleagues of the producer of the product for the purpose of identifying defects and improvements.






40. A white box test design technique in which test cases are designed to execute definition and use pairs of variables.






41. A test design technique in which a model of the statistical distribution of the input is used to construct representative test cases. See also operational profile testing. Statistical testing using a model of system operations (short duration tasks) and their probability of typical use.






42. A black box test design technique in which test cases are designed to execute user scenarios.






43. A white box test design technique in which test cases are designed to execute branches.
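A minimal sketch of the idea, using an invented function with a single decision: each outcome of the `if` is a branch, so two test cases achieve full branch coverage.

```python
def classify(amount):
    """Toy function with one decision, hence two branches."""
    if amount >= 1000:        # branch A: taken when amount >= 1000
        return "review"
    return "auto-approve"     # branch B: taken otherwise

# Two test cases suffice for 100% branch coverage of this decision:
assert classify(1500) == "review"         # exercises branch A
assert classify(10) == "auto-approve"     # exercises branch B
```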






44. Supplied instructions on any suitable media, which guide the installer through the installation process. This may be a manual guide, step-by-step procedure, installation wizard, or any other similar process description.






45. The consequence/outcome of the execution of a test. It includes outputs to screens, changes to data, reports, and communication messages sent out.






46. A model structure wherein attaining the goals of a set of process areas establishes a maturity level; each level builds a foundation for subsequent levels. [CMMI]






47. The degree of impact that a defect has on the development or operation of a component or system. [After IEEE 610]






48. Procedure to derive and/or select test cases for nonfunctional testing based on an analysis of the specification of a component or system without reference to its internal structure. See also black box test design technique. Procedure to derive and/or select test cases based on an analysis of the specification, either functional or non-functional, of a component or system without reference to its internal structure.






49. The person involved in the review that identifies and describes anomalies in the product or project under review. Reviewers can be chosen to represent different viewpoints and roles in the review process.






50. Execution of the test process against a single identifiable release of the test object.