Test your basic knowledge | ISTQB
Subjects: certifications, istqb, it-skills
Instructions:
Answer 50 questions in 15 minutes.
If you are not ready to take this test, you can study first.
Match each statement with the correct term.
Don't refresh. All questions and answers are randomly picked and ordered every time you load a test.
This is a study tool. The three wrong answers for each question are randomly chosen from answers to other questions, so some answers may seem obvious at times, but repeating the test reinforces your understanding.
1. A model structure wherein attaining the goals of a set of process areas establishes a maturity level; each level builds a foundation for subsequent levels. [CMMI]
expected result
cause-effect graph
consistency
staged representation
2. The first executable statement within a component.
entry point
fault seeding tool
anomaly
test item
3. A tool that supports operational security.
security tool
severity
white-box test design technique
buffer
4. A white box test design technique in which test cases are designed to execute condition outcomes and decision outcomes.
incident management
decision condition testing
variable
test monitoring
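The technique in question 4 can be illustrated with a minimal sketch; the `is_discounted` function and its test values are invented for illustration, not taken from the glossary:

```python
def is_discounted(member, total):
    # One decision with two conditions: `member` and `total > 100`.
    if member and total > 100:
        return True
    return False

# Decision condition testing: exercise each condition outcome
# (True and False for `member` and for `total > 100`) AND each
# decision outcome (the whole `if` evaluating True and False).
cases = [
    (True, 150),   # member=True,  total>100=True  -> decision True
    (True, 50),    # member=True,  total>100=False -> decision False
    (False, 150),  # member=False, total>100=True  -> decision False
]
results = [is_discounted(m, t) for m, t in cases]
```

Three cases suffice here: across them, each condition takes both outcomes and the decision takes both outcomes, which is exactly what the technique requires.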
5. A sequence of executable statements within a component.
test logging
subpath
acceptance criteria
peer review
6. The composition of a component or system as defined by the number, nature, and interconnections of its constituent parts.
configuration
usability testing
product risk
condition testing
7. Testing using input values that should be rejected by the component or system. See also error tolerance. The ability of a system or component to continue normal operation despite the presence of erroneous inputs. [After IEEE 610].
configuration control board (CCB)
invalid testing
defect management
product risk
8. The data received from an external source by the test object during test execution. The external source can be hardware, software or human.
domain
pseudo-random
test input
management review
9. Acronym for Commercial Off-The-Shelf software. See off-the-shelf software. A software product that is developed for the general market, i.e. for a large number of customers, and that is delivered to many customers in identical format.
acceptance criteria
COTS
data flow testing
wild pointer
10. Testing where the system is subjected to large volumes of data. See also resource-utilization testing. The process of testing to determine the resource-utilization of a software product.
review tool
volume testing
compliance
exception handling
11. The totality of functionality and features of a software product that bear on its ability to satisfy stated or implied needs. [After ISO 9126]
error
acceptance criteria
software quality
test case suite
12. An aggregation of hardware, software or both, that is designated for configuration management and treated as a single entity in the configuration management process. [IEEE 610]
configuration item
test objective
review tool
modelling tool
13. A white box test design technique in which test cases are designed to execute combinations of single condition outcomes (within one statement).
performance testing
multiple condition testing
robustness testing
capture/replay tool
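Multiple condition testing (question 13) contrasts with decision condition testing in that it demands every combination of condition outcomes within one statement. A minimal sketch, with an invented `access_allowed` decision:

```python
from itertools import product

def access_allowed(admin, owner):
    # One decision containing two single conditions.
    return admin or owner

# Multiple condition testing: all 2^2 combinations of the single
# condition outcomes within the decision `admin or owner`.
combinations = list(product([True, False], repeat=2))
outcomes = {(a, o): access_allowed(a, o) for a, o in combinations}
```

With n conditions this needs 2^n cases, which is why the technique is usually reserved for small, critical decisions.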
14. An integration test type that is concerned with testing the interfaces between components or systems.
interface testing
code coverage
reliability growth model
anomaly
15. A questionnaire-based usability test technique to evaluate the usability, e.g. user-satisfaction, of a component or system. [Veenendaal]
Software Usability Measurement Inventory (SUMI)
pairwise testing
Capability Maturity Model Integration (CMMI)
equivalence partition coverage
16. A test case without concrete (implementation level) values for input data and expected results. Logical operators are used; instances of the actual values are not yet defined and/or available. See also low level test case.
incident
Test Point Analysis (TPA)
test phase
high level test case
17. The representation of a distinct set of tasks performed by the component or system, possibly based on user behavior when interacting with the component or system, and their probabilities of occurrence. A task is logical rather than physical and can be executed over several machines or be executed in non-contiguous time segments.
unreachable code
cause-effect graphing
operational profile
performance indicator
18. A test is deemed to fail if its actual result does not match its expected result.
fail
incremental development model
non-conformity
feature
19. An incremental approach to integration testing where the lowest level components are tested first, and then used to facilitate the testing of higher level components. This process is repeated until the component at the top of the hierarchy is tested.
expected result
exit point
bottom-up testing
data flow coverage
20. A superior method or innovative practice that contributes to the improved performance of an organization under given context, usually recognized as 'best' by other peer organizations.
operational profile
Failure Mode - Effect and Criticality Analysis (FMECA)
best practice
failure rate
21. A tool that provides support to the test management and control part of a test process. It often has several capabilities, such as testware management, scheduling of tests, the logging of results, progress tracking, incident management and test reporting.
recoverability testing
condition testing
test management tool
test data preparation tool
22. An element of storage in a computer that is accessible by a software program by referring to it by a name.
validation
definition-use pair
test case suite
variable
23. An element of configuration management, consisting of selecting the configuration items for a system and recording their functional and physical characteristics in technical documentation. [IEEE 610]
modelling tool
data flow analysis
configuration identification
actual result
24. Testing, either functional or non-functional, without reference to the internal structure of the component or system. See also black-box test design technique: a procedure to derive and/or select test cases based on an analysis of the specification, either functional or non-functional, of a component or system without reference to its internal structure.
requirements-based testing
Test Point Analysis (TPA)
black-box testing
audit trail
25. The percentage of paths that have been exercised by a test suite. 100% path coverage implies 100% LCSAJ coverage.
functional test design technique
conversion testing
serviceability testing
path coverage
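Path coverage (question 25) can be made concrete with a toy component; the `classify` function and the test suite below are invented for illustration:

```python
def classify(n):
    # Two independent decisions -> four possible paths.
    sign = "neg" if n < 0 else "non-neg"
    parity = "odd" if n % 2 else "even"
    return sign, parity

# Identify each path by its pair of decision outcomes.
all_paths = {(True, True), (True, False), (False, True), (False, False)}

def path_taken(n):
    return (n < 0, bool(n % 2))

suite = [-3, -2, 4]          # exercises three of the four paths
covered = {path_taken(n) for n in suite}
coverage = 100 * len(covered) / len(all_paths)
```

The suite misses the (non-negative, odd) path, so coverage is 75%; adding e.g. `5` to the suite would reach 100%.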
26. The period of time in the software life cycle during which the requirements for a software product are defined and documented. [IEEE 610]
attractiveness
requirements phase
level test plan
test implementation
27. A test approach in which the test suite comprises all combinations of input values and preconditions.
test procedure specification
control flow analysis
exhaustive testing
defect tracking tool
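Exhaustive testing (question 27) is only feasible when the input domain is tiny. A sketch with an invented two-input boolean component, where the whole domain is just four combinations:

```python
from itertools import product

def xor_gate(a, b):
    # True exactly when the inputs differ.
    return a != b

# Exhaustive suite: every combination of input values.
exhaustive_suite = list(product([False, True], repeat=2))
results = {inputs: xor_gate(*inputs) for inputs in exhaustive_suite}
```

For realistic components the combinations explode, which is why techniques like equivalence partitioning exist as practical substitutes.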
28. The process of assigning a number or category to an entity to describe an attribute of that entity. [ISO 14598]
measurement
software life cycle
conversion testing
compatibility testing
29. Testing of individual components in isolation from surrounding components - with surrounding components being simulated by stubs and drivers - if needed.
path coverage
smoke test
risk analysis
isolation testing
30. The capability of the software product to re-establish a specified level of performance and recover the data directly affected in case of failure. [ISO 9126] See also reliability: the ability of the software product to perform its required functions under stated conditions for a specified period of time, or for a specified number of operations.
milestone
Failure Mode and Effect Analysis (FMEA)
test comparator
recoverability
31. An attribute of a test indicating whether the same results are produced each time the test is executed.
test performance indicator
test reproducibility
project risk
input value
32. A tool for seeding (i.e. intentionally inserting) faults in a component or system.
cost of quality
desk checking
fault seeding tool
output value
33. A five-level staged framework that describes the key elements of an effective software process. The Capability Maturity Model covers best practices for planning, engineering and managing software development and maintenance. [CMM] See also Capability Maturity Model Integration (CMMI).
classification tree method
Capability Maturity Model (CMM)
risk analysis
test specification
34. The percentage of LCSAJs of a component that have been exercised by a test suite. 100% LCSAJ coverage implies 100% decision coverage.
incident management
LCSAJ coverage
operational acceptance testing
test item
35. The percentage of condition outcomes that have been exercised by a test suite. 100% condition coverage requires each single condition in every decision statement to be tested as True and False.
bespoke software
condition coverage
resource utilization
configuration management
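Condition coverage (question 35) can be checked mechanically; the `eligible` decision and its two-case suite below are invented for illustration:

```python
def eligible(age, member):
    return age >= 18 and member

# The single conditions inside the decision, listed explicitly.
conditions = {
    "age >= 18": lambda age, member: age >= 18,
    "member":    lambda age, member: member,
}

suite = [(20, True), (15, False)]

# 100% condition coverage: every single condition must evaluate
# to both True and False somewhere in the suite.
outcomes = {name: {cond(*case) for case in suite}
            for name, cond in conditions.items()}
fully_covered = all(o == {True, False} for o in outcomes.values())
```

Note that this suite achieves 100% condition coverage with only two cases, yet never makes the whole decision True and False via different condition combinations — which is why condition coverage is weaker than decision condition or multiple condition coverage.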
36. A grid showing the resulting transitions for each state combined with each possible event, showing both valid and invalid transitions.
test summary report
test schedule
test data preparation tool
state table
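A state table (question 36) maps every state/event pair, including the invalid ones. A minimal sketch using an invented turnstile example:

```python
# State table for a toy turnstile: every (state, event) pair is
# listed; None marks an invalid transition (the event is rejected).
state_table = {
    ("locked",   "coin"): "unlocked",
    ("locked",   "push"): None,   # invalid: pushing while locked
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): None,   # invalid: coin wasted
}

def step(state, event):
    nxt = state_table[(state, event)]
    return nxt if nxt is not None else state  # stay put on invalid events

state = "locked"
for event in ["push", "coin", "push"]:
    state = step(state, event)
```

Because the table enumerates invalid pairs explicitly, it is a natural basis for negative state transition tests.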
37. A logical expression that can be evaluated as True or False, e.g. A>B. See also test condition: an item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute, or structural element.
priority
condition
test log
thread testing
38. The percentage of all condition outcomes and decision outcomes that have been exercised by a test suite. 100% decision condition coverage implies both 100% condition coverage and 100% decision coverage.
test infrastructure
intake test
installation guide
decision condition coverage
39. The behavior predicted by the specification - or another source - of the component or system under specified conditions.
database integrity testing
expected result
best practice
maintainability testing
40. A system of (hierarchical) categories designed to be a useful aid for reproducibly classifying defects.
entry point
defect taxonomy
management review
maintenance
41. An abstract representation of all possible sequences of events (paths) in the execution through a component or system.
expected result
baseline
control flow graph
root cause analysis
42. A graphical representation of inputs and/or stimuli (causes) with their associated outputs (effects), which can be used to design test cases.
validation
cause-effect graph
risk level
path
43. Testing of a previously tested program following modification to ensure that defects have not been introduced or uncovered in unchanged areas of the software, as a result of the changes made. It is performed when the software or its environment is changed.
post-execution comparison
scripted testing
agile testing
regression testing
44. An approach to testing to reduce the level of product risks and inform stakeholders on their status - starting in the initial stages of a project. It involves the identification of product risks and their use in guiding the test process.
black-box test design technique
condition testing
memory leak
risk-based testing
45. The process of testing to determine the reliability of a software product.
understandability
inspection
reliability testing
walkthrough
46. The process of recognizing, investigating, taking action and disposing of defects. It involves recording defects, classifying them and identifying the impact. [After IEEE 1044]
variable
installation guide
actual result
defect management
47. The last executable statement within a component.
project
defect management
exit point
scalability
48. The percentage of equivalence partitions that have been exercised by a test suite.
exploratory testing
Capability Maturity Model (CMM)
equivalence partition coverage
fail
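Equivalence partition coverage (question 48) is the fraction of partitions hit by at least one test case. A sketch with hypothetical partitions of an "age" input (the partition boundaries are invented for illustration):

```python
# Three equivalence partitions of an integer "age" input.
partitions = {
    "invalid_low":  lambda age: age < 0,
    "valid":        lambda age: 0 <= age <= 120,
    "invalid_high": lambda age: age > 120,
}

suite = [-5, 30]   # test inputs; hits two of the three partitions

covered = {name for name, member in partitions.items()
           if any(member(age) for age in suite)}
coverage = round(100 * len(covered) / len(partitions))
```

Adding one case above 120 (e.g. `150`) would raise the coverage to 100%, which is the usual exit criterion when this measure is used.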
49. A document that specifies, ideally in a complete, precise and verifiable manner, the requirements, design, behavior, or other characteristics of a component or system, and, often, the procedures for determining whether these provisions have been satisfied. [After IEEE 610]
specification
condition
buffer
functional testing
50. The result of a decision (which therefore determines the branches to be taken).
decision outcome
reviewer
specified input
exercised