Test your basic knowledge

Instructions:
  • Answer 50 questions in 15 minutes.
  • If you are not ready to take this test, you can study here.
  • Match each statement with the correct term.
  • Don't refresh. All questions and answers are randomly picked and ordered every time you load a test.

This is a study tool. The 3 wrong answers for each question are randomly chosen from the answers to other questions, so you may sometimes find the correct answer obvious; even so, retaking the test reinforces your understanding.
1. A tree showing equivalence partitions hierarchically ordered, which is used to design test cases in the classification tree method. See also classification tree method: a black box test design technique in which test cases, described by means of a classification tree, are designed to execute combinations of representatives of input and/or output domains.






2. A white box test design technique in which test cases are designed to execute statements.
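For example, here is a minimal Python sketch of the idea (the classify function and its test values are made up for illustration): two test cases are enough to execute every statement of this small component.

    # Illustrative component: every statement lies on one of its two branches.
    def classify(n):
        if n < 0:
            return "negative"
        return "non-negative"

    # These two inputs together execute all statements (100% statement coverage).
    assert classify(-1) == "negative"      # exercises the statement inside the 'if'
    assert classify(5) == "non-negative"   # exercises the fall-through statement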






3. Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorized entity to determine whether or not to accept the system.






4. The percentage of condition outcomes that have been exercised by a test suite. 100% condition coverage requires each single condition in every decision statement to be tested as True and False.
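As a worked illustration (the decision and input values below are hypothetical), a decision built from two conditions needs each condition evaluated as both True and False; note that this does not by itself guarantee decision coverage.

    # Hypothetical decision over two single conditions.
    def both_positive(a, b):
        cond_a = a > 0          # condition 1
        cond_b = b > 0          # condition 2
        if cond_a and cond_b:   # the decision combines the two conditions
            return True
        return False

    # Each condition is True in one test and False in the other (100% condition
    # coverage), yet the decision is False both times, so decision coverage is
    # not achieved by these two tests alone.
    assert both_positive(1, -1) is False   # cond_a True,  cond_b False
    assert both_positive(-1, 1) is False   # cond_a False, cond_b True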






5. A device or storage area used to store data temporarily to compensate for differences in rates of data flow, time of occurrence of events, or amounts of data that can be handled by the devices or processes involved in the transfer or use of the data. [IEEE 610]
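As a simplified illustration (a deliberately tiny producer/consumer, not from the standard), a fixed-size queue acting as a buffer absorbs the difference between how quickly data is produced and how quickly it is consumed.

    from collections import deque

    buffer = deque(maxlen=4)        # temporary storage between producer and consumer

    # The producer runs ahead of the consumer; the buffer holds the backlog.
    for item in range(3):
        buffer.append(item)         # produce three items

    first = buffer.popleft()        # the consumer drains one item at a time
    assert first == 0
    assert list(buffer) == [1, 2]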






6. A skeletal or special-purpose implementation of a software component, used to develop or test a component that calls or is otherwise dependent on it. It replaces a called component. [After IEEE 610]
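For instance, a minimal Python sketch (the checkout and payment gateway names are invented for illustration): the stub stands in for the called component so the calling component can be developed and tested without it.

    # Component under test: it calls a payment gateway it depends on.
    def checkout(order_total, gateway):
        status = gateway.charge(order_total)   # call into the dependency
        return "confirmed" if status == "ok" else "rejected"

    # Skeletal stub replacing the called payment gateway component.
    class PaymentGatewayStub:
        def charge(self, amount):
            return "ok"            # canned answer; no real payment is made

    # The stub lets checkout() be exercised without the real gateway.
    assert checkout(42.0, PaymentGatewayStub()) == "confirmed"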






7. A requirement that specifies a function that a component or system must perform. [IEEE 610]






8. A black box test design technique in which test cases are designed based upon the definition of the input domain and/or output domain.






9. An element of configuration management, consisting of the evaluation, co-ordination, approval or disapproval, and implementation of changes to configuration items after formal establishment of their configuration identification. [IEEE 610]






10. Testing carried out informally; no formal test preparation takes place, no recognized test design technique is used, there are no expectations for results, and arbitrariness guides the test execution activity.






11. The percentage of LCSAJs of a component that have been exercised by a test suite. 100% LCSAJ coverage implies 100% decision coverage.






12. An incremental approach to integration testing where the component at the top of the component hierarchy is tested first, with lower level components being simulated by stubs. Tested components are then used to test lower level components. The process is repeated until the lowest level components have been tested.






13. A software component or test tool that replaces a component that takes care of the control and/or the calling of a component or system. [After TMap]
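As a rough sketch (the component and its test values are invented), the driver is the piece of code that takes care of calling the component under test and checking what comes back:

    # Component under test: it does not call anything itself, so a driver must call it.
    def apply_discount(price, percent):
        return round(price * (1 - percent / 100.0), 2)

    # Minimal test driver: supplies inputs, calls the component, checks the outputs.
    def driver():
        cases = [((100.0, 10), 90.0), ((50.0, 0), 50.0), ((80.0, 25), 60.0)]
        for (price, percent), expected in cases:
            actual = apply_discount(price, percent)
            print(f"apply_discount({price}, {percent}) -> {actual}, expected {expected}")
            assert actual == expected

    if __name__ == "__main__":
        driver()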






14. A portion of an input or output domain for which the behavior of a component or system is assumed to be the same, based on the specification.
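For example (the age boundaries below are an assumed specification, not part of the definition), the input domain of an age field can be split into partitions within which the behavior is expected to be the same, and one representative value picked from each:

    # Assumed specification: ages below 0 are invalid, 0-17 minor, 18-64 adult, 65+ senior.
    def age_category(age):
        if age < 0:
            raise ValueError("invalid age")
        if age < 18:
            return "minor"
        if age < 65:
            return "adult"
        return "senior"

    # One representative value per equivalence partition of the input domain.
    assert age_category(10) == "minor"     # partition 0..17
    assert age_category(30) == "adult"     # partition 18..64
    assert age_category(70) == "senior"    # partition 65 and above
    try:
        age_category(-5)                   # invalid partition: negative ages
    except ValueError:
        pass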






15. Testing performed to expose defects in the interfaces and in the interactions between integrated components or systems. See also component integration testing, system integration testing: testing performed to expose defects in the interfaces and interaction between integrated components.






16. A tool that supports operational security.






17. Analysis of software artifacts, e.g. requirements or code, carried out without execution of these software artifacts.






18. Testing of individual components in isolation from surrounding components, with surrounding components being simulated by stubs and drivers, if needed.






19. A type of peer review that relies on visual examination of documents to detect defects, e.g. violations of development standards and non-conformance to higher level documentation. The most formal review technique and therefore always based on a documented procedure.






20. An attribute of a component or system specified or implied by requirements documentation (for example reliability, usability or design constraints). [After IEEE 1008]






21. The process through which decisions are reached and protective measures are implemented for reducing risks to, or maintaining risks within, specified levels.






22. A version of component integration testing where the progressive integration of components follows the implementation of subsets of the requirements, as opposed to the integration of components by levels of a hierarchy.






23. A reason or purpose for designing and executing a test.






24. A document describing the scope, approach, resources and schedule of intended test activities. It identifies amongst others test items, the features to be tested, the testing tasks, who will do each task, degree of tester independence, the test environment, the test design techniques and entry and exit criteria to be used, and the rationale for their choice, and any risks requiring contingency planning. It is a record of the test planning process.






25. A sequence of executable statements within a component.






26. Execution of a test on a specific version of the test object.






27. The process of assigning a number or category to an entity to describe an attribute of that entity. [ISO 14598]






28. A path for which a set of input values and preconditions exists which causes it to be executed.






29. A measurement scale and the method used for measurement. [ISO 14598]






30. Computer programs, procedures, and possibly associated documentation and data pertaining to the operation of a computer system. [IEEE 610]






31. Acronym for Computer Aided Software Testing. See also test automation: the use of software to perform or support test activities, e.g. test management, test design, test execution and results checking.






32. A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement.
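As an illustrative sketch (the field names and values are invented), a test case written out as data records exactly those parts: input values, preconditions, an expected result and postconditions.

    # A test case captured as plain data; the keys are illustrative, not standardized.
    test_case = {
        "id": "TC-001",
        "objective": "verify a percentage discount is applied to an order",
        "preconditions": ["customer account exists", "catalog contains item SKU-7"],
        "input_values": {"price": 100.0, "discount_percent": 10},
        "expected_result": 90.0,
        "postconditions": ["order total stored as 90.0"],
    }

    def run(case):
        price = case["input_values"]["price"]
        percent = case["input_values"]["discount_percent"]
        actual = round(price * (1 - percent / 100.0), 2)
        return actual == case["expected_result"]

    assert run(test_case)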






33. A set of several test cases for a component or system under test, where the postcondition of one test is often used as the precondition for the next one.






34. The person responsible for project management of testing activities and resources, and evaluation of a test object. The individual who directs, controls, administers, plans and regulates the evaluation of a test object.






35. Definition of user profiles in performance, load and/or stress testing. Profiles should reflect anticipated or actual usage based on an operational profile of a component or system, and hence the expected workload. See also load profile, operational profile.






36. A computational model consisting of a finite number of states and transitions between those states, possibly with accompanying actions. [IEEE 610]
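For instance, a small Python sketch (the states and events model a hypothetical order workflow): a finite set of states plus the transitions between them, here without accompanying actions.

    # Hypothetical state machine: (current state, event) -> next state.
    TRANSITIONS = {
        ("new", "pay"): "paid",
        ("paid", "ship"): "shipped",
        ("new", "cancel"): "cancelled",
        ("paid", "cancel"): "cancelled",
    }

    def step(state, event):
        # Return the next state, or reject transitions the model does not allow.
        try:
            return TRANSITIONS[(state, event)]
        except KeyError:
            raise ValueError(f"no transition from {state!r} on {event!r}")

    assert step("new", "pay") == "paid"
    assert step("paid", "ship") == "shipped"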






37. A project is a unique set of coordinated and controlled activities with start and finish dates undertaken to achieve an objective conforming to specific requirements, including the constraints of time, cost and resources. [ISO 9000]






38. The consequence/outcome of the execution of a test. It includes outputs to screens, changes to data, reports, and communication messages sent out.






39. The process of testing to determine the functionality of a software product.






40. A detailed check of the test basis to determine whether the test basis is at an adequate quality level to act as an input document for the test process. [After TMap]






41. An uninterrupted period of time spent in executing tests. In exploratory testing, each test session is focused on a charter, but testers can also explore new opportunities or issues during a session. The tester creates and executes test cases on the fly and records their progress. See also exploratory testing.






42. The capability of the software product to enable specified modifications to be implemented. [ISO 9126] See also maintainability: the ease with which a software product can be modified to correct defects, modified to meet new requirements, modified to make future maintenance easier, or adapted to a changed environment.






43. Testing of a component or system at specification or implementation level without execution of that software, e.g. reviews or static code analysis.






44. Testing the quality of the documentation - e.g. user guide or installation guide.






45. A document reporting on any flaw in a component or system that can cause the component or system to fail to perform its required function. [After IEEE 829]






46. Procedure used to derive and/or select test cases.






47. A type of performance testing conducted to evaluate a system or component at or beyond the limits of its anticipated or specified work loads, or with reduced availability of resources such as access to memory or servers. [After IEEE 610] See also performance testing, load testing.






48. A minimal software item that can be tested in isolation.






49. A test plan that typically addresses multiple test levels. See also test plan: a document describing the scope, approach, resources and schedule of intended test activities. It identifies amongst others test items, the features to be tested, the testing tasks, who will do each task, degree of tester independence, the test environment, the test design techniques and entry and exit criteria to be used, and the rationale for their choice, and any risks requiring contingency planning.






50. The percentage of all condition outcomes and decision outcomes that have been exercised by a test suite. 100% decision condition coverage implies both 100% condition coverage and 100% decision coverage.
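A short worked sketch (the access-control predicate is hypothetical): with two conditions, two well-chosen tests can exercise every condition outcome and every decision outcome.

    # Hypothetical decision built from two conditions.
    def grant_access(is_admin, has_token):
        decision = is_admin and has_token     # the decision under test
        return "allowed" if decision else "denied"

    # Test 1 makes both conditions, and the decision, True; test 2 makes them all
    # False. Every condition outcome and every decision outcome has been exercised,
    # i.e. 100% decision condition coverage.
    assert grant_access(True, True) == "allowed"
    assert grant_access(False, False) == "denied"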






