What is quality assurance?
The set of support activities (including facilitation, training, measurement and analysis) needed to provide adequate confidence that processes are established and continuously improved in order to produce products that meet specifications and are fit for use.
What is the purpose of testing?
Testing provides information about whether or not a product meets its requirements.
What is the difference between QA and testing?
Quality assurance is the set of activities carried out to set standards and to monitor and improve performance so that the product or service provided is as effective and as safe as possible. Testing provides information about whether or not a certain product meets the requirements, and about where the product fails to meet them.
What is software quality?
OR Define software quality for me, as you understand it?
Quality software is reasonably bug-free, delivered on time and within budget, meets requirements and/or expectations, and is maintainable. However, quality is obviously a subjective term. It will depend on who the 'customer' is and their overall influence in the scheme of things. Each type of 'customer' will have their own slant on 'quality' - the accounting department might define quality in terms of profits while an end-user might define quality as user-friendly and bug-free.
What's the role of documentation in QA?
Critical. (Note that documentation can be electronic, not necessarily paper.) QA practices should be documented such that they are repeatable. Specifications, designs, business rules, inspection reports, configurations, code changes, test plans, test cases, bug reports, user manuals, etc. should all be documented. There should ideally be a system for easily finding and obtaining documents and determining what documentation will have a particular piece of information. Change management for documentation should be used if possible.
Explain the software development lifecycle.
There are seven stages in the software development lifecycle:
1. Initiate the project - The users identify their business requirements.
2. Define the project - The software development team translates the business requirements into system specifications and compiles them into a System Specification Document.
3. Design the system - The system architecture team designs the system and writes the Functional Design Document. During the design phase, general solutions are hypothesized and data and process structures are organized.
4. Build the system - The system specifications and design documents are given to the development team, which codes the modules by following the requirements and design documents.
5. Test the system - The test team develops the test plan following the requirements. The software is built and installed on the test platform after the developers have completed development and unit testing. The testers test the software by following the test plan.
6. Deploy the system - After user-acceptance testing and certification of the software, it is installed on the production platform. Demos and training are given to the users.
7. Support the system - After the software is in production, the maintenance phase of the lifecycle begins. During this phase the development team works with the development documentation staff to modify and enhance the application, and the test team works with the test documentation staff to verify and validate the changes and enhancements to the application software.
At what stage of the SDLC does testing begin in your opinion?
The QA process starts in the second phase of the software development lifecycle, i.e. Define the project. Actual product testing is done in the Test the system phase (phase 5). During this phase the test team verifies actual results against expected results.
Explain the pre-testing phase, the acceptance testing phase and the product testing phase.
Pre-testing phase:
1. Review the requirements document for testability: The tester will use the requirements document to write the test cases.
2. Establishing the hard freeze date: The hard freeze date is a date after which the system test team will not accept any more software and documentation changes from the development team, unless they are fixes for severity 1 MRs. The date is scheduled so that the product test team will have time for a final regression cycle.
3. Writing the master test plan: It is written by the lead tester or test coordinator. The master test plan covers the entire testing plan, testing resources and testing strategy.
4. Setting up the MR tool: The MR tool must be set up as soon as you know the different modules in the product, the developers and testers on the product, and the hardware platform and operating system on which testing will be done. This information becomes available upon completion of the first draft of the architecture document. Both testers and developers are trained in how to use the tool.
5. Setting up the test environment: The test environment is set up on separate machines, databases and networks. This task is performed by the technical support team. It takes some time the first time; afterwards the same environment can be reused for later releases.
6. Writing the test plan and test cases: A template and a tool are chosen for writing the test plan, test cases and test procedures. Expected results are organized in the test plan according to the feature categories specified in the requirements document. For each feature, positive and negative test cases are written. Writing the test plan requires a complete understanding of the product and its interfaces with other systems. After the test plan is completed, a walkthrough is conducted with the developers and design team members to baseline the test plan document.
7. Setting up the test automation tool: Plan the test strategy for how the testing will be automated and which test cases will be executed for regression testing. Not all test cases will be executed during regression testing.
8. Identify acceptance test cases: Select the subset of test cases that are expected to pass on the first day of system test. These tests must pass for the product to be accepted into system test (a small tagging sketch follows this list).
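A minimal sketch of how such a subset might be tagged, assuming pytest is the automation framework (the function and test names are hypothetical stand-ins, not part of any real product):

import pytest

def application_entry_point():
    # Stand-in for a call that checks the deployed build is reachable;
    # in a real suite this would drive the AUT.
    return "OK"

@pytest.mark.acceptance
def test_build_is_reachable():
    # Day-one check: the build must at least come up before system test starts.
    assert application_entry_point() == "OK"

@pytest.mark.acceptance
def test_core_transaction_round_trip():
    # One representative end-to-end transaction drawn from the full test plan.
    assert application_entry_point() == "OK"

Running pytest -m acceptance then executes only the tagged subset; the acceptance marker would be registered in pytest.ini so pytest does not warn about an unknown marker.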
Acceptance testing phase:
1. When the product enters system test, check that it has completed integration testing and meets the integration test exit criteria.
2. Check the integration exit criteria and product test entrance criteria in the master test plan or test strategy documents.
3. Check the integration testing sign-off criteria sheet.
4. Coordinate the release with product development.
5. Determine how the code will be migrated from the development environment to the test environment.
6. Perform installation and acceptance testing.
Product testing phase:
1. Running the tests: Execute the test cases and verify that the actual functionality of the application matches the expected results.
2. Initial manual testing is recommended to isolate unexpected system behavior. Once the application is stable, automated regression tests can be generated.
3. Issue MRs upon detection of bugs.
What is the value of a testing group? How do you justify your work and budget?
All software products contain defects/bugs, despite the best efforts of their development teams. It is important for an outside party (someone who is not the developer) to test the product from a viewpoint that is more objective and representative of the product user.
The testing group tests the software from the requirements point of view, i.e. what is required by the user. The tester's job is to examine a program and see whether it does what it is supposed to do, and also whether it does anything it is not supposed to do.
What is a master test plan? What does it contain? Who is responsible for writing it?
OR
What is a test plan? Who is responsible for writing it? What does it contain?
OR
What's a 'test plan'? What did you include in a test plan?
A software project test plan is a document that describes the objectives, scope, approach, and focus of a software testing effort. The process of preparing a test plan is a useful way to think through the efforts needed to validate the acceptability of a software product. The completed document will help people outside the test group understand the 'why' and 'how' of product validation. It should be thorough enough to be useful but not so thorough that no one outside the test group will read it. The following are some of the items that might be included in a test plan, depending on the particular project:
- Title
- Identification of software, including version/release numbers
- Revision history of the document, including authors, dates and approvals
- Table of contents
- Purpose of the document and intended audience
- Objective of the testing effort
- Software product overview
- Relevant related document list, such as requirements, design documents, other test plans, etc.
- Relevant standards or legal requirements
- Traceability requirements
- Relevant naming conventions and identifier conventions
- Overall software project organization and personnel/contact-info/responsibilities
- Test organization and personnel/contact-info/responsibilities
- Assumptions and dependencies
- Project risk analysis
- Testing priorities and focus
- Scope and limitations of testing
- Test outline - a decomposition of the test approach by test type, feature, functionality, process, system, module, etc., as applicable
- Outline of data input equivalence classes, boundary value analysis, error classes
- Test environment - hardware, operating systems, other required software, data configurations, interfaces to other systems
- Test environment validity analysis - differences between the test and production systems and their impact on test validity
- Test environment setup and configuration issues
- Software migration processes
- Software CM processes
- Test data setup requirements
- Database setup requirements
- Outline of system-logging/error-logging/other capabilities, and tools such as screen capture software, that will be used to help describe and report bugs
- Discussion of any specialized software or hardware tools that will be used by testers to help track the cause or source of bugs
- Test automation - justification and overview
- Test tools to be used, including versions, patches, etc.
- Test script/test code maintenance processes and version control
- Problem tracking and resolution - tools and processes
- Project test metrics to be used
- Reporting requirements and testing deliverables
- Software entrance and exit criteria
- Initial sanity testing period and criteria
- Test suspension and restart criteria
- Personnel allocation
- Personnel pre-training needs
- Test site/location
- Outside test organizations to be utilized and their purpose, responsibilities, deliverables, contact persons, and coordination issues
- Relevant proprietary, classified, security, and licensing issues
- Open issues
- Appendix - glossary, acronyms, etc.
The team lead or a senior QA analyst is responsible for writing this document.
Why is the test plan a controlled document?
Because it controls the entire testing process; testers have to follow the test plan throughout the entire testing process.
What information do you need to formulate a test plan?
The business requirements document is needed to prepare the test plan.
What are the entrance and exit criteria in the system test?
The entrance and exit criteria of each testing phase are written in the master test plan.
Entrance criteria:
- Integration exit criteria have been successfully met.
- All installation documents are completed.
- All shippable software has been successfully built.
- The system test plan is baselined by completing the walkthrough of the test plan.
- The test environment should be set up.
- All severity 1 MRs from the integration test phase should be closed.
Exit Criteria:
- All the test cases in the test plan should be executed.
- All MR's/defects are either closed or deferred.
- Regression testing cycle should be executed after closing the MR's.
- All documents are reviewed, finalized and signed off.
If there are no requirements, how will you write your test plan?
If there are no requirements, we try to gather as many details as possible from:
- Business analysts
- Developers (if accessible)
- Previous version documentation (if any)
- Stakeholders (if accessible)
- Prototypes
What is White box testing/unit testing?
Unit testing - The most 'micro' scale of testing; to test particular functions or code modules. Typically done by the programmer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses.
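As a minimal sketch (assuming Python and the standard unittest module; apply_discount is a hypothetical unit under test, defined here only so the example is self-contained):

import unittest

def apply_discount(price, percent):
    # Hypothetical unit under test: reduce price by the given percentage.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 10), 90.0)

    def test_zero_percent_boundary(self):
        self.assertEqual(apply_discount(50.0, 0), 50.0)

    def test_invalid_percent_raises(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)

if __name__ == "__main__":
    unittest.main()

The tests exercise internal logic and boundary conditions directly, which is why this level of testing is usually done by the programmer who knows the code.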
Difference between Black and White box testing?
Black box testing: Functional testing based on requirements with no knowledge of the internal program structure or data. Also known as closed-box testing.
White box testing: Testing approaches that examine the program structure and derive test data from the program logic.
What are the roles of glass-box and black-box testing tools?
Glass-box testing, also called white-box testing, refers to testing with detailed knowledge of a module's internals. These tools therefore concentrate on the algorithms and data structures used in developing the modules, and they tend to test individual modules rather than the whole application. Black-box testing tools test the interface, functionality and performance of a system module or of the whole system.
What is Black box testing?
Black box testing is also called system testing and is performed by the testers. Here the features and requirements of the product, as described in the requirements document, are tested.
What is Integration testing?
Integration testing - Testing of combined parts of an application to determine if they function together correctly. The 'parts' can be code modules, individual applications, client and server applications on a network, etc. This type of testing is especially relevant to client/server and distributed systems.
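A hedged sketch of what 'combined parts' means in practice, assuming Python with the built-in sqlite3 module standing in for the real database (the UserRepository class is hypothetical, written here only to keep the example self-contained):

import sqlite3
import unittest

class UserRepository:
    # Hypothetical data-access component under test.
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")

    def add(self, name):
        cur = self.conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
        return cur.lastrowid

    def find(self, user_id):
        row = self.conn.execute(
            "SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
        return row[0] if row else None

class RepositoryIntegrationTest(unittest.TestCase):
    def test_add_then_find_round_trip(self):
        # Exercises the repository code and the database together, not in isolation.
        conn = sqlite3.connect(":memory:")
        repo = UserRepository(conn)
        user_id = repo.add("alice")
        self.assertEqual(repo.find(user_id), "alice")

if __name__ == "__main__":
    unittest.main()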
What knowledge do you require to do white box, integration and black box testing?
For white box testing you need to understand the internals of the module, such as its data structures and algorithms, and have access to the source code; for black box testing you only need to understand the functionality of the application.
What is Regression testing?
Regression testing: Re-testing after fixes or modifications of the software or its environment. It can be difficult to determine how much re-testing is needed, especially near the end of the development cycle. Automated testing tools can be especially useful for this type of testing.
Why do we do regression testing?
When new functionality is added to an application, it has to be tested to see whether the addition has affected the existing functionality. Instead of retesting all of the existing functionality manually, the baseline scripts created for it can be rerun.
How do we do regression testing?
Various automation testing tools can be used to perform regression testing, such as WinRunner, Rational Robot and SilkTest.
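As a hedged illustration of the underlying idea (not of those commercial tools), a baseline regression case in pytest is simply a tagged test that is re-run unchanged after every fix; total_price here is a hypothetical piece of existing functionality:

import pytest

def total_price(items):
    # Hypothetical existing functionality that must not regress.
    return sum(qty * price for qty, price in items)

@pytest.mark.regression
def test_total_price_baseline():
    # Expected result captured when the feature first passed testing.
    assert total_price([(2, 3.0), (1, 4.0)]) == 10.0

Running pytest -m regression after each fix re-executes the whole baseline suite.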
What are positive scenarios?
Testing to see whether the application is doing what it is supposed to do.
What are negative scenarios?
Testing to see that the application does not do what it is not supposed to do.
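A minimal sketch of one positive and one negative scenario, assuming pytest and a hypothetical validation rule that ages must be integers between 0 and 120:

import pytest

def validate_age(age):
    # Hypothetical rule under test.
    if not isinstance(age, int) or not 0 <= age <= 120:
        raise ValueError("age must be an integer between 0 and 120")
    return True

def test_positive_valid_age_is_accepted():
    # Positive scenario: the application does what it is supposed to do.
    assert validate_age(30) is True

def test_negative_invalid_age_is_rejected():
    # Negative scenario: the application must not accept invalid input.
    with pytest.raises(ValueError):
        validate_age(-5)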
What is the difference between regression automation tool and performance automation tool?
Regression testing tools capture tests and play them back at a later time. The capture and playback feature is fundamental to regression testing.
Performance testing tools determine the load a server can handle. They must be able to simulate many users from one machine, schedule and synchronize different users, and measure the network load under different numbers of simulated users.
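A rough sketch of what such a tool automates, using only the Python standard library; the transaction function is a stand-in for a real request against the server under test:

import time
from concurrent.futures import ThreadPoolExecutor

def transaction():
    # Stand-in for one user request against the system under test.
    time.sleep(0.05)
    return True

def timed_transaction(_):
    start = time.perf_counter()
    transaction()
    return time.perf_counter() - start

if __name__ == "__main__":
    simulated_users = 20
    with ThreadPoolExecutor(max_workers=simulated_users) as pool:
        durations = list(pool.map(timed_transaction, range(simulated_users)))
    print(f"average response: {sum(durations) / len(durations):.3f}s, "
          f"worst response: {max(durations):.3f}s")

A real performance tool adds scheduling, synchronization points and network-load measurement on top of this basic simulate-and-measure loop.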
What is the difference between exception and validation testing?
Validation testing aims to demonstrate that the software functions in a manner that can be reasonably expected by the customer, i.e. it tests the software for conformance to the Software Requirements Specification.
Exception testing deals with handling exceptions (unexpected events) while the AUT is running. Basically this testing looks at how the control flow of the AUT changes when an exception arises.
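A minimal sketch of an exception test, assuming Python's unittest; read_config is a hypothetical routine whose error path is being exercised:

import unittest

def read_config(path):
    # Hypothetical routine under test; a missing file raises an OSError.
    with open(path) as fh:
        return fh.read()

class ExceptionHandlingTest(unittest.TestCase):
    def test_missing_file_raises_oserror(self):
        # Deliberately trigger the unexpected event and check that control
        # flow changes in the documented way instead of failing silently.
        with self.assertRaises(OSError):
            read_config("no_such_file.cfg")

if __name__ == "__main__":
    unittest.main()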
What is user acceptance testing?
It is also called beta testing. Once system testing is done and the system seems stable to the developers and testers, system engineers usually invite the end users of the software to see if they like it. If the users like the software the way it is, then it will be delivered to them. Otherwise the necessary changes will be made and the software will pass through all phases of testing again.
What is manual testing and what is automated testing?
Manual testing involves testing of software application by manually performing the actions on the AUT based on test plans.
Automated testing involves testing of a software application by performing the actions on the AUT using an automated testing tool (such as QuickTest Professional, WinRunner, LoadRunner or Rational Robot) based on test plans.
What is smoke testing?
A smoke test is a brief, broad set of checks run against each new build to decide whether it is stable enough for further testing. The smoke test must evolve as the system evolves. At first, the smoke test will probably test something simple, such as whether the system can say, "Hello, World." As the system develops, the smoke test will become more thorough. The first test might take a matter of seconds to run; as the system grows, the smoke test can grow to 30 minutes, an hour, or more.
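A minimal sketch of an early smoke suite, assuming pytest; both checks are placeholders for whatever gating checks the real build needs:

def system_greeting():
    # Stand-in for launching the build and reading its first response.
    return "Hello, World"

def test_smoke_system_starts():
    assert system_greeting() == "Hello, World"

def test_smoke_version_string_present():
    # As the system grows, more gating checks like this one are added.
    version = "1.0.0"
    assert len(version.split(".")) == 3

If either check fails, the build is rejected before deeper testing begins.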
What is soak testing?
The software system will be run for a total of 14 hours continuously. If the system is a control system, it will be used to continuously move each of the instrument mechanisms during this time. Any other system will be expected to perform its intended function continuously during this period. The software system must not fail during this period.
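A hedged sketch of the soak-test loop; the 14-hour window is shortened to seconds here so the sketch can be run quickly, and exercise_system is a stand-in for one cycle of the system's intended function:

import time

def exercise_system():
    # Stand-in for one cycle of the system's intended function.
    return True

def soak(duration_seconds=10):
    deadline = time.monotonic() + duration_seconds
    cycles = 0
    while time.monotonic() < deadline:
        assert exercise_system(), f"system failed after {cycles} cycles"
        cycles += 1
    return cycles

if __name__ == "__main__":
    print(f"completed {soak()} cycles without failure")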
What are stress testing, performance testing, security testing, recovery testing and volume testing?
Stress testing: Testing whether the system can handle the peak usage period loads that result from a large number of simultaneous users, transactions or devices. Throughput and system stability should be monitored.
Performance testing: Testing whether the system functions are performed in an acceptable timeframe under simultaneous user load. Timings for both read and update transactions should be gathered to determine whether response times are acceptable. This should be done stand-alone and then in a multi-user environment to determine the transaction throughput.
Security Testing: Testing the system for its security from unauthorized use and unauthorized data access.
Recovery Testing: Testing a system to see how it responds to errors and abnormal conditions, such as system crash, loss of device, communications, or power.
Volume testing: Testing the system to determine whether it can correctly process large volumes of data fed to it. Systems can often respond unpredictably when large volumes cause files to overflow and need extensions.
What is MR?
An MR is a Modification Request, also known as a defect report: a request to modify the program so that the program does what it is supposed to do.
Why do you write an MR?
An MR is written to report problems/errors in, or suggestions for, the software.
What information does MR contain?
OR
Describe the basic elements you put in a defect report.
OR
What is the procedure for bug reporting?
The bug needs to be communicated and assigned to developers that can fix it. After the problem is resolved, fixes should be re-tested, and determinations made regarding requirements for regression testing to check that fixes didn't create problems elsewhere. If a problem-tracking system is in place, it should encapsulate these processes. A variety of commercial problem-tracking/management software tools are available.
The following are items to consider in the tracking process:
- Complete information such that developers can understand the bug, get an idea of its severity, and reproduce it if necessary
- Current bug status (e.g., 'Released for Retest', 'New', etc.)
- The application name or identifier and version
- The function, module, feature, object, screen, etc. where the bug occurred
- Environment specifics, system, platform, relevant hardware specifics
- Test case name/number/identifier
- One-line bug description
- Full bug description
- Description of steps needed to reproduce the bug if not covered by a test case or if the developer doesn't have easy access to the test case/test script/test tool
- Names and/or descriptions of file/data/messages/etc. used in the test
- File excerpts/error messages/log file excerpts/screen shots/test tool logs that would be helpful in finding the cause of the problem
- Severity estimate (a 5-level range such as 1-5 or 'critical' to 'low' is common)
- Was the bug reproducible?
- Tester name
- Test date
- Bug reporting date
- Name of developer/group/organization the problem is assigned to
- Description of problem cause
- Description of fix
- Code section/file/module/class/method that was fixed
- Date of fix
- Application version that contains the fix
- Tester responsible for retest
- Retest date
- Retest results
- Regression testing requirements
- Tester responsible for regression tests
- Regression testing results
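A sketch of how the core fields above might be captured as a data structure (Python, illustrative field selection only; the sample values are made up, and the severity levels anticipate the answer given further below):

from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    CRITICAL = 1
    HIGH = 2
    MEDIUM = 3
    LOW = 4

@dataclass
class DefectReport:
    summary: str                       # one-line bug description
    application: str                   # application name/identifier and version
    environment: str                   # system, platform, relevant hardware
    steps_to_reproduce: list[str]
    severity: Severity
    status: str = "New"                # e.g. New, Assigned, Released for Retest
    assigned_to: str = ""
    reproducible: bool = True
    attachments: list[str] = field(default_factory=list)  # logs, screenshots

report = DefectReport(
    summary="Login fails with valid credentials",
    application="OrderEntry 2.1",
    environment="Windows Server 2019, Oracle 11g",
    steps_to_reproduce=["Open login page", "Enter a valid user", "Click Login"],
    severity=Severity.CRITICAL,
)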
Which MR tools have you used to write MRs?
Quality Center, TestDirector, Rational ClearQuest, PVCS Tracker.
What criteria will you follow to assign a severity and a due date to an MR?
Defects (MRs) are assigned a severity as follows:
Critical: show-stoppers (the system is unusable).
High: The system is very hard to use, and some issues are prone to become critical if not taken care of.
Medium: The system functionality has a significant bug that is not critical but needs to be fixed in order for the AUT to go to the production environment.
Low: cosmetic (GUI-related).
If the functionality of an application had an inbuilt bug because of which the test script fails, would you automate the test?
No, we do the automation once the application has been tested manually and has stabilized. Automation is for regression testing.
You find a bug and the developer says "It's not possible." What do you do?
I'll discuss with the developer the conditions (working environment) under which the bug was produced, and provide more details along with a snapshot of the bug.
How do you help the developer track the faults in the software?
By providing details of the defects, including the environment, test data, steps followed, etc., and helping the developer reproduce the defect in his environment.
What are the different types of MRs?
MRs for suggestions,
MRs for defect reports,
MRs for documentation changes.
What is the role of a bug tracking system?
A bug tracking system captures, manages and communicates changes, issues and tasks, providing basic process control to ensure coordination and communication within and across development and content teams at every step.
What is a successful product?
A bug-free product that meets the expectations of the user makes the product successful.
What Process/Methodologies are you familiar with?
Waterfall methodology
Spiral methodology
V Model
Agile
[Or talk about Customized methodology of the specific client]
What are CMM and CMMI? What is the difference?
The Capability Maturity Model for Software (CMM or SW-CMM) is a model for judging the maturity of the software processes of an organization and for identifying the key practices that are required to increase the maturity of these processes.
The Capability Maturity Model Integration (CMMI) provides guidance for improving your organization's processes and your ability to manage the development, acquisition, and maintenance of products and services. CMMI places proven practices into a structure that helps your organization assess its organizational maturity and process area capability, establish priorities for improvement, and guide the implementation of these improvements.
The new integrated model (CMMI) uses process areas (known as PAs), which differ from those in the previous model, and covers systems processes as well as software processes, rather than only software processes as in the SW-CMM.
What will you do during your first day on the job?
Get acquainted with my team and the application.
What was the test team hierarchy?
Project Leader
QA lead
QA Analyst
Tester
What are the different automation tools you know?
Mercury Interactive - QuickTest Professional, WinRunner, LoadRunner; Rational - Rational Robot; Segue - SilkTest.
What is ODBC?
Open Database Connectivity (ODBC) is an open standard application-programming interface (API) for accessing a database. ODBC is based on Structured Query Language (SQL) Call-Level Interface. It allows programs to use SQL requests that will access databases without having to know the proprietary interfaces to the databases. ODBC handles the SQL request and converts it into a request the individual database system understands.
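A hedged sketch using the third-party pyodbc package (the data source name, credentials and the users table are placeholders; a DSN called MyDataSource would have to be configured in the ODBC administrator first):

import pyodbc

# The same SQL works against any ODBC-registered database; only the DSN
# identifies the underlying driver and database.
conn = pyodbc.connect("DSN=MyDataSource;UID=username;PWD=password")
cursor = conn.cursor()
cursor.execute("SELECT id, name FROM users WHERE active = ?", 1)
for row in cursor.fetchall():
    print(row.id, row.name)
conn.close()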
Did you ever have problems working with developers?
NO. I had a good rapport with the developers.
Describe your experience with code analyzers?
Code analyzers generally check for bad syntax, logic, and other language-specific programming errors at the source level. This level of testing is often referred to as unit testing and server component testing. I used code analyzers as part of white box testing.
How do you survive chaos?
I survive by maintaining my calm and focusing on the work.
Tell me about the worst boss you've ever had.
Fortunately I have always had good bosses; speaking professionally, I have no complaints about any of them.
What do you like about Windows?
The interface and user-friendliness. Windows is one of the best pieces of software I have used; it is user-friendly and very easy to learn.