Test your basic knowledge | ISTQB
Subjects: certifications, istqb, it-skills
Instructions: Answer 50 questions in 15 minutes. If you are not ready to take this test, you can study here.
Match each statement with the correct term.
Don't refresh. All questions and answers are randomly picked and ordered every time you load a test.
This is a study tool. The 3 wrong answers for each question are randomly chosen from answers to other questions. So you might at times find the answers obvious, but you will see that it reinforces your understanding as you take the test each time.
1. The testing of individual software components. [After IEEE 610]
test implementation
unit testing
test script
definition-use pair
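Unit testing is easiest to see in code. A minimal sketch using Python's built-in unittest module, with a made-up add() function standing in for the component tested in isolation:

```python
import unittest

def add(a, b):
    """A trivial component under test (hypothetical example)."""
    return a + b

class AddTests(unittest.TestCase):
    """Unit testing: the individual component is exercised in isolation."""

    def test_adds_two_positives(self):
        self.assertEqual(add(2, 3), 5)

    def test_adds_a_negative(self):
        self.assertEqual(add(2, -3), -1)

if __name__ == "__main__":
    unittest.main()
```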
2. Testing in which two or more variants of a component or system are executed with the same inputs, the outputs compared, and analyzed in cases of discrepancies. [IEEE 610]
software
actual result
operational acceptance testing
back-to-back testing
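Back-to-back testing can be illustrated in a few lines. This sketch assumes two hypothetical variants of the same component (mean_v1 and mean_v2); both are executed with the same inputs and their outputs compared:

```python
# Two variants of the same component (hypothetical): a straightforward
# implementation and an alternative one under suspicion.
def mean_v1(xs):
    return sum(xs) / len(xs)

def mean_v2(xs):
    acc = 0.0
    for x in xs:
        acc += x
    return acc / len(xs)

# Back-to-back testing: run both variants on the same inputs and
# flag any discrepancy for analysis.
inputs = [[1, 2, 3], [10.0, 0.5], [7], [-1, 1]]
for xs in inputs:
    r1, r2 = mean_v1(xs), mean_v2(xs)
    if abs(r1 - r2) > 1e-9:
        print(f"DISCREPANCY on {xs}: {r1} vs {r2}")
    else:
        print(f"match on {xs}: {r1}")
```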
3. The data received from an external source by the test object during test execution. The external source can be hardware, software or human.
stability
test input
buffer
error
4. Testing performed to expose defects in the interfaces and in the interactions between integrated components or systems. See also component integration testing, system integration testing.
test design specification
use case testing
reliability growth model
integration testing
5. A framework to describe the software development life cycle activities from requirements specification to maintenance. The V-model illustrates how testing activities can be integrated into each phase of the software development life cycle.
V-model
path sensitizing
incident logging
risk level
6. A minimal software item that can be tested in isolation.
learnability
unit
dynamic analysis
data flow analysis
7. Testing of software or specification by manual simulation of its execution. See also static analysis. Analysis of software artifacts, e.g. requirements or code, carried out without execution of these software artifacts.
test design
requirements phase
desk checking
decision
8. Testing based on an analysis of the internal structure of the component or system.
configuration item
instrumenter
code-based testing
definition-use pair
9. A tool that provides support to the test management and control part of a test process. It often has several capabilities, such as testware management, scheduling of tests, the logging of results, progress tracking, incident management and test reporting.
test management tool
security
testable requirements
test condition
10. The process of identifying differences between the actual results produced by the component or system under test and the expected results for a test. Test comparison can be performed during test execution (dynamic comparison) or after test execution.
test comparison
code-based testing
input value
performance testing
11. An item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute, or structural element.
baseline
test condition
risk identification
condition testing
12. Testing conducted to evaluate a component or system in its operational environment. [IEEE 610]
portability
executable statement
Fault Tree Analysis (FTA)
operational testing
13. A test is deemed to fail if its actual result does not match its expected result.
safety testing
fail
ad hoc testing
security testing tool
14. The capability of the software product to be used in place of another specified software product for the same purpose in the same environment. [ISO 9126] See also portability. The ease with which the software product can be transferred from one hardware or software environment to another.
fault seeding
replaceability
entry point
independence of testing
15. Testing that involves the execution of the software of a component or system.
fail
test process
operational testing
dynamic testing
16. Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorized entity to determine whether or not to accept the system.
acceptance testing
oracle
static testing
COTS
17. A black box test design technique in which test cases are designed to execute representatives from equivalence partitions. In principle test cases are designed to cover each partition at least once. [Beizer]
quality
partition testing
hazard analysis
functional integration
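As a sketch of the partition-based technique, assume a hypothetical classify_age() component whose input domain splits into four partitions (negative, 0-17, 18-64, 65+); one representative value per partition covers each partition at least once:

```python
# Hypothetical spec: ages 0-17 are "minor", 18-64 "adult", 65+ "senior";
# negative ages are invalid.
def classify_age(age):
    if age < 0:
        raise ValueError("age must be non-negative")
    if age < 18:
        return "minor"
    if age < 65:
        return "adult"
    return "senior"

# One representative per equivalence partition, including the invalid one.
representatives = {
    -5: ValueError,   # invalid partition: negative ages
    10: "minor",      # partition 0-17
    30: "adult",      # partition 18-64
    80: "senior",     # partition 65+
}

for age, expected in representatives.items():
    try:
        actual = classify_age(age)
    except ValueError:
        actual = ValueError
    assert actual == expected, f"age {age}: got {actual!r}"
print("each partition covered at least once")
```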
18. A black box test design technique in which test cases are designed to execute valid and invalid state transitions. See also N-switch testing. A form of state transition testing in which test cases are designed to execute all valid sequences of N+1 transitions.
functional requirement
root cause
test infrastructure
state transition testing
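A small sketch of the technique, assuming a hypothetical three-state document workflow; the test cases execute each valid transition once and then one invalid transition, which the model must reject:

```python
# Hypothetical state model for a document: draft -> review -> published.
VALID = {
    ("draft", "submit"): "review",
    ("review", "approve"): "published",
    ("review", "reject"): "draft",
}

def transition(state, event):
    try:
        return VALID[(state, event)]
    except KeyError:
        raise RuntimeError(f"invalid transition: {event!r} in state {state!r}")

# Valid-transition test cases: each legal (state, event) pair once.
for (state, event), expected in VALID.items():
    assert transition(state, event) == expected

# Invalid-transition test case: "approve" is not legal from "draft".
try:
    transition("draft", "approve")
except RuntimeError as e:
    print("rejected as expected:", e)
```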
19. Software developed specifically for a set of users or customers. The opposite is off-the-shelf software.
bespoke software
test execution technique
dynamic testing
Test Maturity Model (TMM)
20. The totality of functionality and features of a software product that bear on its ability to satisfy stated or implied needs. [After ISO 9126]
defect masking
software quality
cause-effect graph
database integrity testing
21. A scripting technique that stores test input and expected results in a table or spreadsheet, so that a single control script can execute all of the tests in the table. Data driven testing is often used to support the application of test execution tools such as capture/playback tools.
test monitoring
portability
data driven testing
test automation
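A minimal sketch of data driven testing: in practice the input/expected table would live in a spreadsheet or CSV file, but an inline string keeps the example self-contained, and a single control script executes every row. The add() component is a made-up stand-in:

```python
import csv
import io

# Stand-in for an external table of test inputs and expected results.
TABLE = """input_a,input_b,expected
2,3,5
10,-4,6
0,0,0
"""

def add(a, b):  # hypothetical component under test
    return a + b

# One control script executes all of the tests in the table.
for row in csv.DictReader(io.StringIO(TABLE)):
    actual = add(int(row["input_a"]), int(row["input_b"]))
    expected = int(row["expected"])
    status = "PASS" if actual == expected else "FAIL"
    print(f"{row} -> {status}")
```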
22. A test management task that deals with developing and applying a set of corrective actions to get a test project on track when monitoring shows a deviation from what was planned. See also test management. The planning, estimating, monitoring and control of test activities, typically carried out by a test manager.
defect masking
test control
retrospective meeting
error guessing
23. A series which appears to be random but is in fact generated according to some prearranged sequence.
test data
resumption criteria
pseudo-random
security testing
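In code, a pseudo-random series is what a seeded generator produces: it looks random, yet the prearranged sequence is fully reproducible from the seed, which is why it is useful for repeatable test data. A sketch using Python's random module:

```python
import random

# Two generators seeded identically produce the exact same
# "random-looking" but prearranged sequence.
rng1 = random.Random(42)
rng2 = random.Random(42)

seq1 = [rng1.randint(0, 99) for _ in range(5)]
seq2 = [rng2.randint(0, 99) for _ in range(5)]

print(seq1)
assert seq1 == seq2  # identical: the sequence is deterministic
```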
24. An aggregation of hardware, software or both, that is designated for configuration management and treated as a single entity in the configuration management process. [IEEE 610]
master test plan
configuration item
entry criteria
output domain
25. A model structure wherein attaining the goals of a set of process areas establishes a maturity level; each level builds a foundation for subsequent levels. [CMMI]
static code analyzer
actual outcome
staged representation
test design technique
26. A path by which the original input to a process (e.g. data) can be traced back through the process, taking the process output as a starting point. This facilitates defect analysis and allows a process audit to be carried out. [After TMap]
input domain
black-box test design technique
audit trail
condition outcome
27. An approach to testing to reduce the level of product risks and inform stakeholders on their status, starting in the initial stages of a project. It involves the identification of product risks and their use in guiding the test process.
risk-based testing
operational testing
milestone
decision outcome
28. The fundamental test process comprises test planning and control, test analysis and design, test implementation and execution, evaluating exit criteria and reporting, and test closure activities.
development testing
test tool
bottom-up testing
test process
29. The number of defects identified in a component or system divided by the size of the component or system (expressed in standard measurement terms, e.g. lines-of-code, number of classes or function points).
off-the-shelf software
requirements-based testing
executable statement
defect density
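The defect density calculation is simple arithmetic; a sketch with made-up numbers, expressed per thousand lines of code (KLOC):

```python
# Defect density = defects found / size of the component,
# here expressed per thousand lines of code (KLOC).
defects_found = 42
lines_of_code = 12_500

defect_density = defects_found / (lines_of_code / 1000)
print(f"{defect_density:.2f} defects per KLOC")  # 3.36 defects per KLOC
```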
30. A superior method or innovative practice that contributes to the improved performance of an organization under given context, usually recognized as 'best' by other peer organizations.
path coverage
best practice
bottom-up testing
bespoke software
31. A software tool that translates programs expressed in a high order language into their machine language equivalents. [IEEE 610]
infeasible path
recoverability testing
state table
compiler
32. The degree to which a component, system or process meets specified requirements and/or user/customer needs and expectations. [After IEEE 610]
impact analysis
error tolerance
statement
quality
33. An abstract representation of the sequence and possible changes of the state of data objects, where the state of an object is any of: creation, usage, or destruction. [Beizer]
data flow
level test plan
interface testing
compliance
34. The total costs incurred on quality activities and issues and often split into prevention costs, appraisal costs, internal failure costs and external failure costs.
load testing
high level test case
cost of quality
test control
35. An instance of an input. See also input. A variable (whether stored within a component or outside) that is read by a component.
test monitoring
capture/playback tool
state transition
input value
36. The degree to which a component or system is operational and accessible when required for use. Often expressed as a percentage. [IEEE 610]
COTS
ad hoc testing
availability
test design
37. Directed and focused attempt to evaluate the quality, especially reliability, of a test object by attempting to force specific failures to occur.
orthogonal array testing
baseline
attack
buffer overflow
38. Procedure used to derive and/or select test cases.
multiple condition testing
requirements-based testing
test design technique
test implementation
39. The exit criteria that a component or system must satisfy in order to be accepted by a user, customer, or other authorized entity. [IEEE 610]
error
coverage item
desk checking
acceptance criteria
40. The ratio of the number of failures of a given category to a given unit of measure, e.g. failures per unit of time, failures per number of transactions, failures per number of computer runs. [IEEE 610]
agile testing
integration
recoverability
failure rate
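Failure rate is likewise a simple ratio; a sketch with illustrative numbers, using operating time as the unit of measure:

```python
# Failure rate = failures of a given category / unit of measure.
# Illustrative numbers: 6 failures observed over 2,000 hours of operation.
failures = 6
operating_hours = 2_000

rate_per_hour = failures / operating_hours
print(f"{rate_per_hour:.4f} failures per hour")           # 0.0030
print(f"{rate_per_hour * 1000:.1f} failures per 1000 h")  # 3.0
```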
41. The representation of a distinct set of tasks performed by the component or system, possibly based on user behavior when interacting with the component or system, and their probabilities of occurrence. A task is logical rather than physical and can be executed over several machines or be executed in non-contiguous time segments.
operational profile
technical review
anomaly
test procedure specification
42. A test management task that deals with the activities related to periodically checking the status of a test project. Reports are prepared that compare the actuals to that which was planned. See also test management. The planning, estimating, monitoring and control of test activities, typically carried out by a test manager.
behavior
inspection
test monitoring
version control
43. An extension of FMEA, as in addition to the basic FMEA it includes a criticality analysis, which is used to chart the probability of failure modes against the severity of their consequences. The result highlights failure modes with relatively high probability and high severity of consequences, allowing remedial effort to be directed where it will produce the greatest value.
test suite
Failure Mode, Effect and Criticality Analysis (FMECA)
pair programming
false-fail result
44. The organizational artifacts needed to perform testing, consisting of test environments, test tools, office environment and procedures.
test infrastructure
severity
data definition
suspension criteria
45. A formula-based test estimation method based on function point analysis. [TMap]
Test Point Analysis (TPA)
test harness
pointer
security tool
46. The process of testing to determine the compliance of the component or system.
compliance testing
fail
test item
oracle
47. Testing, either functional or non-functional, without reference to the internal structure of the component or system. Black-box test design technique: procedure to derive and/or select test cases based on an analysis of the specification, either functional or non-functional, of a component or system without reference to its internal structure.
specification
boundary value coverage
static code analyzer
black-box testing
48. The person involved in the review that identifies and describes anomalies in the product or project under review. Reviewers can be chosen to represent different viewpoints and roles in the review process.
reviewer
traceability
test run
domain
49. Comparison of actual and expected results - performed after the software has finished running.
test suite
post-execution comparison
code
domain
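A sketch of the comparison described in question 49 above: the software (simulated here by writing a file) finishes running first, and only afterwards is the actual output compared to the expected output. The file name is hypothetical:

```python
# Post-execution comparison: the software has already finished running
# and written its output; only then are actual and expected compared.
expected = "total=100\nstatus=ok\n"

with open("run_output.txt", "w") as f:   # stand-in for the real run
    f.write("total=100\nstatus=ok\n")

with open("run_output.txt") as f:        # comparison after execution ends
    actual = f.read()

print("PASS" if actual == expected else "FAIL")
```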
50. Operational testing in the acceptance test phase, typically performed in a simulated real-life operational environment by operator and/or administrator, focusing on operational aspects, e.g. recoverability, resource-behavior, installability and technical compliance.
production acceptance testing
exhaustive testing
incident logging
understandability