QA Handbook

Everything You Need to Know

What is QA Testing?

Foundation of Quality Assurance

Quality Assurance (QA) Testing is the systematic process of ensuring software applications meet specified requirements and function correctly before reaching end users. As a QA professional, you act as the guardian of software quality.

Key Responsibilities

Finding and documenting software defects - Systematic bug detection and analysis
Verifying features work according to specifications - Requirements validation
Ensuring applications are user-friendly - UX/UI testing
Performance and security validation - Non-functional testing

Real-World Impact

Consider major companies like Tesco - their e-commerce platform handles millions of transactions daily. A single bug in their checkout process could cost millions in lost revenue. QA testing prevents such disasters.

Airlines Example - Croatia Airlines

An airline booking system must handle seat selection, payment processing, and passenger data accurately. One bug in the payment gateway could result in double charges or failed bookings, affecting thousands of travelers.

7 Testing Principles

Fundamental concepts every QA should know

1. Testing shows presence of defects

Testing can prove that defects are present, but cannot prove that there are no defects.

2. Exhaustive testing is impossible

Testing everything is not feasible except in trivial cases. Risk analysis and priorities should guide testing efforts.

3. Early testing

Testing activities should start as early as possible in the SDLC and be focused on defined objectives.

4. Defect clustering

A small number of modules contain most of the defects discovered during pre-release testing.

5. Pesticide paradox

If the same tests are repeated, they will no longer find new bugs. Test cases need to be reviewed and updated.

6. Testing is context dependent

Testing is done differently in different contexts. Safety-critical software requires different testing than e-commerce sites.

7. Absence of errors fallacy

Finding and fixing defects does not help if the system built is unusable and does not fulfill user needs.

SDLC - Software Development Life Cycle

Understanding development methodologies

Waterfall Model

1. Requirements - Gather and analyze business requirements
2. Design - System architecture and UI design
3. Implementation - Code development phase
4. Testing - QA validation and verification
5. Deployment - Release to production
6. Maintenance - Ongoing support and updates

V-Model (Verification & Validation)

The V-Model emphasizes testing at each development phase, ensuring early defect detection.

Requirements ↔ User Acceptance Test
System Design ↔ System Testing
Detailed Design ↔ Integration Testing
Coding ↔ Unit Testing

Key Benefit: Testing strategies are planned during corresponding development phases, leading to better test coverage and early defect detection.

Agile vs Waterfall Comparison

Waterfall Testing

  • Testing phase starts after development
  • Detailed documentation required
  • Sequential process
  • Less flexibility for changes
  • Good for stable requirements

Agile Testing

  • Testing throughout development cycle
  • Collaborative approach
  • Iterative process
  • High flexibility for changes
  • Continuous feedback

STLC - Software Testing Life Cycle

Systematic approach to testing

1. Requirement Analysis

Review and understand requirements, identify testable scenarios

Activities:

  • Analyze functional & non-functional requirements
  • Identify test conditions
  • Create Requirement Traceability Matrix (RTM)

Deliverables:

  • RTM document
  • Test Strategy document
  • Automation feasibility report
2. Test Planning

Define test approach, scope, resources, and timeline

Activities:

  • Define test scope and approach
  • Estimate effort and timeline
  • Identify resources and roles

Deliverables:

  • Test Plan document
  • Test Estimation
  • Resource Planning
3. Test Case Design & Development

Create detailed test cases and test data

Activities:

  • Create test cases from requirements
  • Develop automation scripts
  • Prepare test data

Deliverables:

  • Test Cases document
  • Test Scripts
  • Test Data sets
4. Test Environment Setup

Prepare testing environment and test data

Activities:

  • Setup test environment
  • Install required software
  • Configure test data

Deliverables:

  • Environment setup document
  • Test data creation
  • Smoke test results
5. Test Case Execution

Execute test cases and report defects

Activities:

  • Execute test cases
  • Log defects in bug tracking tool
  • Retest fixed defects

Deliverables:

  • Test execution results
  • Defect reports
  • Test logs
6. Test Reporting

Analyze results and create test summary report

Activities:

  • Evaluate test completion criteria
  • Analyze metrics and coverage
  • Prepare final report

Deliverables:

  • Test summary report
  • Test metrics
  • Test coverage report
7. Test Closure

Document lessons learned and archive test artifacts

Activities:

  • Document lessons learned
  • Archive test artifacts
  • Analyze process improvements

Deliverables:

  • Test closure report
  • Best practices document
  • Test artifacts archive

Testing Levels

Different levels of software testing

Testing Pyramid

The testing pyramid shows the ideal distribution of different types of tests in a software project. More tests at the bottom (unit tests) and fewer at the top (UI tests).

UI Tests (2%)
System Tests (8%)
Integration Tests (20%)
Unit Tests (70%)

1. Unit Testing

Testing individual components or modules in isolation. The smallest testable parts of an application.

Login Page Example:
Username Field → Unit Test
Password Field → Unit Test
Login Button → Unit Test
Characteristics:
  • Tests individual functions/methods
  • Fast execution (milliseconds)
  • Easy to write and maintain
  • High code coverage possible
  • Done by developers
  • Uses mocks/stubs for dependencies

2. Integration Testing

Testing the interfaces and interaction between integrated components or systems.

Integration Example:
Login Page ↔ Database

Test: Login form communicates with user database

3. System Testing

Testing the complete integrated system to verify it meets specified requirements.

System Test Areas:
  • Functionality: All features work correctly
  • Reliability: System stability over time
  • Performance: Speed and responsiveness
  • Security: Data protection and access control

4. User Acceptance Testing (UAT)

Final testing performed by end users to ensure the system meets business requirements.

Alpha Testing:
  • Performed by internal users/employees
  • Controlled environment
  • Before beta testing
Beta Testing:
  • Performed by external users
  • Real-world environment
  • Limited user group

V-Model (Verification & Validation)

Testing throughout the development lifecycle

What is the V-Model?

The V-Model is an extension of the waterfall model where testing activities are planned in parallel with corresponding development phases. Each development phase has a corresponding testing phase.

Verification (Left Side):

  • Static testing activities
  • Reviews and walkthroughs
  • Document analysis
  • "Are we building the product right?"

Validation (Right Side):

  • Dynamic testing activities
  • Actual test execution
  • Code execution with test data
  • "Are we building the right product?"

V-Model Phase Mapping

  • Requirements Analysis (gather business requirements) ↔ User Acceptance Testing (validate user requirements)
  • System Design (high-level architecture) ↔ System Testing (test the complete system)
  • Detailed Design (module-level design) ↔ Integration Testing (test module interactions)
  • Coding (implementation phase) ↔ Unit Testing (test individual modules)

V-Model Benefits

✓ Advantages:

  • Early test planning and design
  • Better defect prevention
  • Clear testing objectives
  • Higher quality deliverables

✗ Disadvantages:

  • Rigid and less flexible
  • Difficult to accommodate changes
  • No early prototypes
  • High risk for complex projects

Static vs Dynamic Testing

Two fundamental approaches to testing

Static Testing

Testing without executing the code. Reviews, walkthroughs, and analysis.

Methods:

  • Code Reviews - Peer review of source code
  • Walkthroughs - Author explains code to team
  • Inspections - Formal defect detection process
  • Static Analysis Tools - Automated code analysis

Benefits:

  • Early defect detection
  • Cost-effective bug prevention
  • Improves code quality
  • Knowledge sharing

Real Example:

Reviewing login page HTML/CSS for accessibility issues, checking if proper form labels are used for screen readers.

Dynamic Testing

Testing by executing the code with various inputs and checking outputs.

Types:

  • Functional Testing - Testing features work correctly
  • Performance Testing - Speed, load, stress testing
  • Security Testing - Vulnerability assessment
  • Usability Testing - User experience validation

Characteristics:

  • Requires test environment
  • Uses test data
  • Validates actual behavior
  • Can be automated

Real Example:

Actually filling out and submitting a login form with different username/password combinations to test authentication logic.

Static vs Dynamic Comparison

Aspect         | Static Testing              | Dynamic Testing
Code Execution | No code execution           | Code is executed
When Applied   | Early development phases    | After code completion
Cost           | Lower cost                  | Higher cost
Defect Types   | Logic errors, syntax issues | Runtime errors, performance issues

Manual vs Automation Testing

Choosing the right approach for different scenarios

Manual Testing

Human testers manually execute test cases without automation tools.

✓ Best For:

  • Exploratory testing
  • Usability testing
  • Ad-hoc testing
  • New feature testing
  • UI/UX validation

✗ Limitations:

  • Time-consuming for repetitive tasks
  • Prone to human error
  • Not suitable for load testing
  • Resource intensive

Example Scenario:

Testing a new checkout flow for Tesco's website - checking if the payment process feels intuitive and secure to users.

Automation Testing

Using tools and scripts to execute tests automatically without human intervention.

✓ Best For:

  • Regression testing
  • Performance testing
  • Repetitive test cases
  • Data-driven testing
  • Cross-browser testing

✗ Limitations:

  • High initial setup cost
  • Maintenance overhead
  • Cannot evaluate user experience
  • Requires technical skills

Example Scenario:

Running 500 login test cases overnight to verify authentication works across different browsers and user types.

Decision Matrix: When to Use What

Use Manual Testing When:

  • Testing new features for the first time
  • Exploring application behavior
  • Checking visual elements
  • Testing user workflows
  • Performing accessibility testing

Use Automation When:

  • Tests need to run repeatedly
  • Testing across multiple environments
  • Performing load/stress testing
  • Running smoke tests
  • Doing regression testing

Hybrid Approach:

  • Automate stable, repetitive tests
  • Manual testing for new features
  • Use both for comprehensive coverage
  • Start manual, automate over time
  • Focus automation on critical paths

Functional Testing

Testing what the system does

What is Functional Testing?

Functional testing verifies that each function of the software application operates according to the requirement specification. It focuses on testing the functionality of the system.

Key Characteristics:

  • Based on functional requirements
  • Black box testing technique
  • Validates business logic
  • User-centric approach
  • Input-output behavior verification

Testing Focus Areas:

  • User interface functionality
  • Database operations
  • API functionality
  • Security features
  • Business workflow validation

Types of Functional Testing

Unit Testing

Testing individual components or modules in isolation.

Example - Login Function:
function validateLogin(username, password) {
  if (!username || !password) return false;
  return checkCredentials(username, password);
}
Test Cases:
  • Empty username → false
  • Empty password → false
  • Valid credentials → true
  • Invalid credentials → false

Integration Testing

Testing interaction between integrated modules.

Big Bang Approach:

All modules integrated simultaneously and tested as a whole.

Example: Testing complete e-commerce flow: User registration → Login → Browse products → Add to cart → Checkout → Payment
Incremental Approach:

Modules integrated one by one and tested at each step.

Example: First test Login + User Database, then add Product Catalog, then Shopping Cart, etc.

System Testing

Testing the complete integrated system to verify it meets specified requirements.

Real-World Example: Airlines Booking System
Flight Search
  • Search by destination
  • Filter by price/time
  • Display available seats
Booking Process
  • Seat selection
  • Passenger details
  • Payment processing
Confirmation
  • Booking confirmation
  • Email ticket
  • SMS notification

User Acceptance Testing (UAT)

Final testing performed by end users to ensure the system meets business requirements.

Alpha Testing:

Internal testing by organization's employees.

Example: Tesco employees testing new online grocery ordering system before public release.
Beta Testing:

Testing by limited external users in real environment.

Example: Selected customers testing new mobile banking features before full rollout.

Sample Functional Test Case

Test Case: User Registration

Test ID: TC_REG_001
Objective: Verify user can register successfully
Precondition: User not registered before
Priority: High

Test Steps & Expected Results

Step 1: Navigate to registration page
Expected: Registration form displayed
Step 2: Fill valid details and submit
Expected: Success message shown
Step 3: Check email for confirmation
Expected: Confirmation email received

Testing Techniques

White-box testing methods and coverage techniques

White Box Testing Techniques

White box testing techniques focus on the internal structure of the code. These techniques help ensure thorough testing coverage by examining different aspects of code execution.

Code Coverage Formula:

Coverage = (Number of Executed Items / Total Number of Items) × 100%
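The formula can be sketched as a small helper; the numbers below are purely illustrative:

```javascript
// Coverage = (executed items / total items) × 100%
function coverage(executed, total) {
  if (total === 0) return 0; // avoid division by zero for empty modules
  return (executed / total) * 100;
}

console.log(coverage(45, 50)); // 90 — e.g. 45 of 50 statements executed
```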

Statement Coverage

Ensures that every executable statement in the code is executed at least once during testing.

Example:
function validateAge(age) {
  if (age >= 18) {
    return "Adult";
  }
  return "Minor";
}

Test cases: validateAge(20) and validateAge(15) achieve 100% statement coverage

Branch Coverage

Ensures that every branch (true/false) of every decision point is executed at least once.

Example:
function checkAccess(age, member) {
  if (age >= 18 && member) {
    return "Access granted";
  }
  return "Access denied";
}

Need tests for both true and false branches of the condition

Condition Coverage

Ensures that each boolean sub-expression has been evaluated to both true and false.

Test Requirements:
  • Each condition must be tested as true
  • Each condition must be tested as false
  • More thorough than branch coverage
  • May require multiple test cases

Loop Testing

Focuses on testing the validity of loop constructs. Different strategies for different loop types.

Simple Loops:
  • Skip the loop entirely (n=0)
  • Only one pass (n=1)
  • Two passes (n=2)
  • m passes (n=m, typical value)
  • n-1, n, n+1 passes
Nested Loops:
  • Start with innermost loop
  • Set outer loops to minimum
  • Test innermost with simple loop strategy
  • Work outward
  • Continue until all tested
Concatenated Loops:
  • Independent loops: test separately
  • Dependent loops: test as nested
  • Check loop counter dependencies
  • Verify data flow between loops
Loop Testing Example:
function sumArray(arr) {
  let sum = 0;
  for (let i = 0; i < arr.length; i++) {
    sum += arr[i];
  }
  return sum;
}
Test Cases:
  • sumArray([]) - Zero iterations
  • sumArray([5]) - One iteration
  • sumArray([1,2]) - Two iterations
  • sumArray([1,2,3,4,5]) - Multiple iterations

Path Testing

Tests all possible paths through the code using cyclomatic complexity.

Cyclomatic Complexity:
V(G) = Number of decision points + 1

This determines the minimum number of test cases needed for path coverage
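As an illustration of the formula (not a general-purpose analyzer), consider a hypothetical function with two decision points; applyDiscount and its rules are invented for this example:

```javascript
// Two decision points (two ifs), so V(G) = 2 + 1 = 3:
// at least 3 test cases are needed for basis-path coverage.
function applyDiscount(price, isMember) {
  let total = price;
  if (isMember) {     // decision point 1
    total *= 0.9;
  }
  if (total > 100) {  // decision point 2
    total -= 5;
  }
  return total;
}

const decisionPoints = 2;
const cyclomaticComplexity = decisionPoints + 1;
console.log(cyclomaticComplexity); // 3
```

A minimal basis set here might be: non-member with a small price, member with a small price, and member with a large price.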

Coverage Techniques Comparison

Technique | What it Covers                 | Strength                     | Weakness
Statement | Every executable statement     | Easy to measure              | Weakest form of coverage
Branch    | Every decision outcome         | Better than statement        | Doesn't test all conditions
Condition | Every boolean condition        | Tests individual conditions  | May miss some decision outcomes
Path      | Every possible execution path  | Most thorough                | Can be impractical for complex code

Testing Techniques Best Practices

✓ Recommendations:

  • Start with statement coverage as minimum
  • Aim for 100% branch coverage
  • Use condition coverage for complex logic
  • Apply loop testing for all loop constructs
  • Consider path testing for critical modules
  • Use tools to measure coverage automatically

Coverage Goals:

  • Critical systems: 100% branch coverage
  • Commercial software: 80-90% coverage
  • Web applications: 70-80% coverage
  • Prototypes: 60-70% coverage
  • Focus on quality over quantity
  • Combine with black-box techniques

Performance Testing

Comprehensive performance testing strategies

Load Testing

Testing system behavior under expected normal load conditions.

Typical Load Metrics:
  • Concurrent Users: 5,000
  • Response Time: < 5 sec
  • Transactions/Sec: 1,000 TPS

Stress Testing

Testing system behavior beyond normal capacity to find the breaking point.

Stress Progression:
  • Normal Load: 5,000 users
  • Increased Load: 10,000 users
  • Breaking Point: 15,000+ users

Volume Testing

Testing system performance with large amounts of data.

Volume Test Scenarios:
  • Database with 10 million records
  • File processing of 4GB+ files
  • Memory usage with large datasets
  • Network bandwidth utilization

Performance Benchmarks

Web Applications:

  • Page Load: < 3 seconds
  • API Response: < 200ms
  • Database Query: < 100ms

Mobile Apps:

  • App Launch: < 2 seconds
  • Screen Transition: < 1 second
  • Data Sync: < 5 seconds

Enterprise Systems:

  • Transaction: < 500ms
  • Report Generation: < 30 sec
  • System Availability: 99.9%

Non-Functional Testing

Testing how the system performs

What is Non-Functional Testing?

Non-functional testing evaluates the performance, usability, reliability, and other quality aspects of the software. It focuses on HOW the system performs rather than WHAT it does.

Performance Metrics:

  • Response time
  • Throughput
  • Resource utilization
  • Scalability

Quality Attributes:

  • Usability
  • Reliability
  • Security
  • Compatibility

Environmental:

  • Cross-browser testing
  • Mobile responsiveness
  • Network conditions
  • Device compatibility

Performance Testing Types

Load Testing

Testing system behavior under normal expected load conditions.

Objectives:
  • Verify response time requirements
  • Ensure system stability
  • Validate throughput expectations
  • Identify performance bottlenecks
Real Example - Tesco Online:

Normal Load: 10,000 concurrent users

Expected Response: Page load < 3 seconds

Transactions: 500 orders per minute

Stress Testing

Testing system behavior beyond normal capacity to find breaking point.

Airlines Example - Croatia Airlines During Holiday Rush:
Normal Load

2,000 users booking flights simultaneously

Stress Load

15,000 users during Christmas booking rush

Breaking Point

System fails at 20,000+ concurrent users

Goal: Ensure graceful degradation - system should slow down but not crash completely.

Volume Testing

Testing system with large amounts of data to verify performance and stability.

Database Testing Example:
Test Scenarios:
  • 10 million customer records
  • 100 million transaction history
  • 50GB product catalog
  • 1TB of user-generated content
Validation Points:
  • Search response time remains < 2s
  • Database queries don't time out
  • Memory usage stays within limits
  • Data integrity maintained

Spike Testing

Testing system behavior under sudden, extreme load increases.

Black Friday Example - E-commerce Site:
Normal Traffic: 5,000 users
Spike Traffic (12:00 AM): 50,000 users in 2 minutes
Test Goal: Verify the system can handle a sudden 10x traffic increase without complete failure

Security Testing

Common Tests:
  • SQL Injection attacks
  • Cross-site scripting (XSS)
  • Authentication bypass
  • Session management
  • Data encryption validation
Example: Testing the login form against SQL injection with the payload ' OR '1'='1
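To see why that classic payload works, consider a deliberately vulnerable query builder (an invented example for illustration only; real code must use parameterized queries, never string concatenation):

```javascript
// VULNERABLE on purpose: concatenating user input into SQL.
function buildQuery(username) {
  return "SELECT * FROM users WHERE name = '" + username + "'";
}

const payload = "' OR '1'='1";
console.log(buildQuery(payload));
// SELECT * FROM users WHERE name = '' OR '1'='1'
```

The injected quote closes the string early and the `OR '1'='1'` clause is always true, so the WHERE filter matches every row. A security test passes this payload through every input field and verifies the application rejects it.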

Usability Testing

Evaluation Criteria:
  • Ease of navigation
  • User interface clarity
  • Task completion time
  • Error prevention
  • User satisfaction
Example: Can a new user complete checkout process within 3 minutes without help?

Compatibility Testing

Testing Matrix:
Browsers:
  • Chrome 120+
  • Firefox 115+
  • Safari 16+
  • Edge 110+
Devices:
  • iPhone 12+
  • Samsung Galaxy
  • iPad Pro
  • Desktop 1920x1080

Reliability Testing

Metrics:
  • MTBF: Mean Time Between Failures
  • MTTR: Mean Time To Recovery
  • Availability: 99.9% uptime target
  • Failure Rate: < 0.1% transactions
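MTBF and MTTR combine into availability as availability = MTBF / (MTBF + MTTR). A small sketch with illustrative figures (in hours):

```javascript
// Availability = MTBF / (MTBF + MTTR)
function availability(mtbfHours, mttrHours) {
  return mtbfHours / (mtbfHours + mttrHours);
}

// e.g. a failure every 999 hours with a 1-hour recovery → the 99.9% target
console.log((availability(999, 1) * 100).toFixed(1) + "%"); // "99.9%"
```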

Black Box vs White Box Testing

Testing approaches based on code knowledge

Black Box Testing

Testing without knowledge of internal code structure. Focus on inputs and outputs.

Techniques:

  • Equivalence Partitioning - Group similar inputs
  • Boundary Value Analysis - Test edge values
  • Decision Table Testing - Test business rules
  • State Transition Testing - Test state changes

Example - Login Form:

Input           | Expected         | Result
Valid user/pass | Login success    | ✓ Pass
Invalid user    | Login failed     | ✓ Pass
Empty fields    | Validation error | ✓ Pass

Real-World Use:

Testing Croatia Airlines booking system by trying different passenger counts, dates, and destinations without knowing the backend database structure.

White Box Testing

Testing with full knowledge of internal code structure, logic, and design.

Techniques:

  • Statement Coverage - Execute every code line
  • Branch Coverage - Test all if/else paths
  • Path Coverage - Test all possible paths
  • Condition Coverage - Test all conditions

Code Example:

if (user.isValid() && user.isActive()) {
  loginUser(user);
} else {
  showError("Invalid credentials");
}

Test cases: valid+active user, valid+inactive user, invalid user

Real-World Use:

Unit testing payment processing code to ensure all branches (successful payment, insufficient funds, network timeout) are properly tested.

Gray Box Testing (Hybrid Approach)

Combination of black box and white box testing - limited knowledge of internal workings.

Characteristics:

  • Partial code knowledge
  • Access to design documents
  • Integration testing focus
  • API testing

Best For:

  • Integration testing
  • Penetration testing
  • Matrix testing
  • Regression testing

Example:

Testing API endpoints for e-commerce cart - knowing the API structure but not the internal database queries.

Bug Reporting

Effective defect identification and documentation

What is a Bug?

A bug (or defect) is a flaw in a software system that causes it to behave in an unintended or unexpected way. It represents a deviation from the expected functionality as defined in the requirements.

Types of Bugs:

  • Functional bugs
  • Performance issues
  • UI/UX problems
  • Security vulnerabilities
  • Compatibility issues
  • Data corruption

Common Causes:

  • Coding errors
  • Requirement misunderstanding
  • Design flaws
  • Integration issues
  • Environmental factors
  • Human mistakes

Impact Areas:

  • User experience
  • System performance
  • Data integrity
  • Business operations
  • Security risks
  • Financial losses

Bug Lifecycle

Bug Status Flow

1. New - Bug discovered and reported
2. Assigned - Assigned to a developer for fixing
3. Fixed - Developer has resolved the issue
4. Closed - Bug verified as fixed and closed

Priority vs Severity

Severity

Impact on system functionality - How much the bug affects the system's operation.

  • Critical: System crash, data loss
  • Major: Feature not working
  • Medium: UI issues, typos
  • Low: Cosmetic issues

Priority

Urgency of fixing - How quickly the bug needs to be resolved based on business needs.

  • High: Fix immediately
  • Medium: Fix in current release
  • Low: Fix in future release
Example Scenarios:
High Priority + Critical Severity: Payment system crashes during checkout
High Priority + Low Severity: Company logo missing on homepage before product launch
Low Priority + Critical Severity: Admin panel crashes (affects few users)
Low Priority + Low Severity: Text alignment issue in footer

Modern Bug Report Template

Bug Report #BUG-001

Priority: High | Severity: Major
Title: Login button not responding on mobile devices
Reported by: John Doe (QA Tester)
Environment: iOS 16.0, Safari, iPhone 14 Pro
Module: User Authentication
Steps to Reproduce:
1. Open application on mobile device
2. Navigate to login page
3. Enter valid credentials
4. Tap login button
Expected Result: User should be logged in and redirected to dashboard
Actual Result: Login button does not respond to touch events
Attachments: 📱 mobile_login_screenshot.png • 🎥 login_issue_recording.mp4

Bug Reporting Best Practices

✓ Do's:

  • Write clear, descriptive titles
  • Provide detailed steps to reproduce
  • Include screenshots/videos
  • Specify environment details
  • Set appropriate priority and severity
  • Test on multiple environments
  • Verify the bug before reporting

✗ Don'ts:

  • Don't use vague descriptions
  • Don't report duplicate bugs
  • Don't skip reproduction steps
  • Don't assume everyone knows the context
  • Don't report multiple issues in one bug
  • Don't forget to include evidence
  • Don't set wrong priority/severity

Test Case Writing

Creating effective test cases

Test Case Structure

Essential Components:

  • Test Case ID: Unique identifier (TC_001)
  • Test Case Title: Clear, descriptive name
  • Objective: What you're testing
  • Preconditions: Setup requirements
  • Test Steps: Detailed actions
  • Expected Results: What should happen
  • Postconditions: Cleanup steps

Test Case Attributes:

  • Priority: High/Medium/Low
  • Test Type: Functional/Non-functional
  • Test Level: Unit/Integration/System
  • Test Data: Required input data
  • Environment: Test environment details
  • Author: Test case creator
  • Creation Date: When created
  • Execution Status: Pass/Fail/Blocked

Sample Test Case: User Login

Test Case ID: TC_LOGIN_001
Title: Verify successful login with valid credentials
Objective: Test login functionality with correct username and password
Priority: High
Precondition: User has valid account, browser is open
Test Data: Username: testuser@example.com
Password: Test123!

Test Steps & Expected Results:

Step 1: Navigate to login page (www.example.com/login)
Expected: Login form is displayed with username and password fields
Step 2: Enter valid username in username field
Expected: Username is entered successfully
Step 3: Enter valid password in password field
Expected: Password is masked and entered successfully
Step 4: Click Login button
Expected: User is redirected to dashboard page
Step 5: Verify user is logged in
Expected: User profile/logout option is visible

Test Execution Results

PASSED

Definition: Test executed successfully and met all expected results

Action: Mark as passed, move to next test case

Documentation: Record execution date and tester name

FAILED

Definition: Test did not meet expected results, defect found

Action: Create bug report, assign to development team

Documentation: Record failure details and attach evidence

BLOCKED

Definition: Test cannot be executed due to external factors

Action: Identify and resolve blocking issue

Documentation: Record reason for blocking and resolution steps

Test Case Writing Best Practices

✓ Do's:

  • Write clear, concise test steps
  • Use simple language
  • Include specific test data
  • Make test cases independent
  • Cover both positive and negative scenarios
  • Review and update regularly

✗ Don'ts:

  • Don't write vague or ambiguous steps
  • Don't assume prior knowledge
  • Don't create dependent test cases
  • Don't skip expected results
  • Don't use complex technical jargon
  • Don't forget to specify test data

Regression Testing

Ensuring new changes don't break existing functionality

What is Regression Testing?

Regression testing is the process of testing existing software functionality to ensure that new code changes, bug fixes, or new features haven't negatively impacted the existing working features.

Key Objectives:

  • Verify existing functionality still works
  • Ensure new changes don't introduce bugs
  • Maintain software quality and stability
  • Validate system integration after changes
  • Confirm bug fixes don't create new issues

When to Perform:

  • After bug fixes
  • After new feature implementation
  • After code refactoring
  • Before major releases
  • After environment changes

Types of Regression Testing

Complete Regression Testing

Testing the entire application from scratch when major changes are made.

When to Use:
  • Major system updates
  • Architecture changes
  • Multiple bug fixes
  • Before major releases
Characteristics:
  • Time-consuming
  • Resource intensive
  • Comprehensive coverage
  • High confidence level

Partial Regression Testing

Testing only the affected modules and their related functionalities.

When to Use:
  • Minor bug fixes
  • Small feature additions
  • Localized changes
  • Quick releases
Characteristics:
  • Faster execution
  • Cost-effective
  • Focused testing
  • Risk-based approach

Selective Regression Testing

Testing selected test cases based on code changes and impact analysis.

Selection Criteria:
High Priority:
  • Critical business functions
  • Recently modified areas
  • Integration points
Medium Priority:
  • Related functionalities
  • Common user workflows
  • Previously failed areas
Low Priority:
  • Stable, unchanged features
  • Non-critical functions
  • Rarely used features
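Risk-based selection can be sketched as scoring test cases by priority band and filling a fixed execution budget; the weights, IDs, and budget below are invented for illustration:

```javascript
// Illustrative weights for the three priority bands above.
const weights = { high: 3, medium: 2, low: 1 };

// Pick the highest-priority tests that fit the execution budget.
function selectTests(testCases, budget) {
  return [...testCases]
    .sort((a, b) => weights[b.priority] - weights[a.priority])
    .slice(0, budget);
}

const suite = [
  { id: "TC_PAY_001", priority: "high" },   // critical business function
  { id: "TC_FOOT_007", priority: "low" },   // rarely used feature
  { id: "TC_CART_003", priority: "medium" } // common user workflow
];

console.log(selectTests(suite, 2).map(t => t.id)); // [ 'TC_PAY_001', 'TC_CART_003' ]
```

Real selection tools weigh more signals (recent code changes, past failures, coverage maps), but the budget-and-rank shape is the same.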

Regression Testing Process

1. Impact Analysis

Analyze code changes to identify affected areas and dependencies

2. Test Case Selection

Choose appropriate test cases based on impact analysis and risk assessment

3. Test Execution

Execute selected test cases and document results

4. Result Analysis

Analyze test results and report any new defects or regressions

Real-World Example: E-commerce Website

Scenario:

A bug was fixed in the payment processing module where credit card validation was failing for certain card types.

Regression Test Areas:

Direct Impact:
  • Payment processing with all card types
  • Credit card validation logic
  • Payment confirmation flow
  • Error handling for invalid cards
Indirect Impact:
  • Order completion process
  • Shopping cart functionality
  • User account updates
  • Email notifications

Test Cases to Execute:

  • Verify payment with Visa, MasterCard, American Express
  • Test payment with invalid card numbers
  • Verify order completion after successful payment
  • Test shopping cart persistence during payment
  • Verify email confirmations are sent

Regression Testing Best Practices

✓ Best Practices:

  • Automate repetitive regression tests
  • Maintain a regression test suite
  • Prioritize test cases by risk and impact
  • Update test cases regularly
  • Use version control for test cases
  • Document test results thoroughly

✗ Common Pitfalls:

  • Testing everything without prioritization
  • Ignoring impact analysis
  • Using outdated test cases
  • Not automating stable test cases
  • Insufficient test coverage
  • Poor communication with the development team

Smoke Testing

Basic functionality verification

What is Smoke Testing?

Smoke testing is a preliminary testing approach that verifies the basic functionality of an application to ensure it's stable enough for further detailed testing. It's also known as "Build Verification Testing."

Key Characteristics:

  • Quick and shallow testing
  • Tests critical functionalities only
  • Performed after new build deployment
  • Determines if build is stable for testing
  • Usually automated
  • Takes 30 minutes to 2 hours

Purpose:

  • Verify application launches successfully
  • Check critical paths work
  • Ensure basic functionality is intact
  • Save time by catching major issues early
  • Decide if detailed testing should proceed

Smoke Testing Process

Step 1: Build Deployment

New build is deployed to the test environment

Activities:
  • Deploy latest build to test environment
  • Verify deployment was successful
  • Check application starts without errors
  • Confirm environment setup is correct

Step 2: Execute Smoke Tests

Run predefined smoke test cases covering critical functionality

Test Areas:
  • Application login/authentication
  • Main navigation and menus
  • Core business functions
  • Database connectivity
  • API endpoints (if applicable)
  • File upload/download
  • Search functionality
  • Basic CRUD operations

Step 3: Analyze Results

Evaluate test results and make go/no-go decision

✓ PASS

All critical functions work. Proceed with detailed testing.

✗ FAIL

Critical issues found. Reject build and return to development.

⚠ CONDITIONAL

Minor issues found. Proceed with caution or fix first.
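The three outcomes above map onto a simple decision rule. A minimal sketch, assuming each smoke test is flagged as critical or minor (the flags and test IDs are hypothetical):

```javascript
// Go/no-go decision for a smoke run: any critical failure rejects the
// build, minor-only failures mean proceed with caution, otherwise pass.
function smokeVerdict(results) {
  const failedCritical = results.some((r) => !r.passed && r.critical);
  const failedMinor = results.some((r) => !r.passed && !r.critical);
  if (failedCritical) return 'FAIL';      // reject build, return to development
  if (failedMinor) return 'CONDITIONAL';  // proceed with caution or fix first
  return 'PASS';                          // proceed with detailed testing
}

const run = [
  { id: 'TC_001', critical: true, passed: true },
  { id: 'TC_007', critical: true, passed: true },
  { id: 'TC_008', critical: false, passed: false },
];
console.log(smokeVerdict(run)); // CONDITIONAL
```

Encoding the rule like this is useful when smoke tests run in CI, where the verdict can gate whether the detailed test stage starts at all.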

Sample Smoke Test Cases: E-commerce Website

Critical Path Test Cases:

User Authentication:
  • TC_001: Verify application loads successfully
  • TC_002: Verify user can login with valid credentials
  • TC_003: Verify user can logout successfully
  • TC_004: Verify registration page loads
Core Functionality:
  • TC_005: Verify product catalog loads
  • TC_006: Verify search functionality works
  • TC_007: Verify add to cart function
  • TC_008: Verify checkout process starts

Sample Test Case Detail:

Test Case ID: TC_SMOKE_001
Title: Verify User Login
Priority: Critical
Estimated Time: 2 minutes
Steps:
1. Open application URL
2. Click Login button
3. Enter valid credentials
4. Click Submit
Expected: User successfully logged in

Smoke vs Sanity vs Regression Testing

Aspect | Smoke Testing | Sanity Testing | Regression Testing
Purpose | Verify build stability | Verify specific functionality | Verify existing features work
Scope | Broad but shallow | Narrow but deep | Broad and deep
When Performed | After new build | After minor changes | After any changes
Time Required | 30 min - 2 hours | 1-3 hours | Several hours to days
Automation | Usually automated | Can be manual or automated | Preferably automated

Smoke Testing Best Practices

✓ Do's:

  • Keep test cases simple and focused
  • Automate smoke tests for efficiency
  • Include only critical functionalities
  • Run smoke tests before detailed testing
  • Document clear pass/fail criteria
  • Update smoke tests with new features

✗ Don'ts:

  • Don't include detailed test scenarios
  • Don't test edge cases or negative scenarios
  • Don't spend too much time on smoke testing
  • Don't ignore smoke test failures
  • Don't make smoke tests too complex
  • Don't skip smoke testing for urgent releases

Real-World Scenario: Banking Application

Situation:

A new build of the online banking application was deployed to the test environment after a mobile payment feature was added.

Smoke Test Results:

✓ Passed:
  • Application loads
  • User login works
  • Account balance displays
  • Navigation functions
✗ Failed:
  • Money transfer crashes
  • Transaction history empty
Decision:

Build rejected. Critical functionality broken. Return to development for fixes.

Cypress Automation Testing

Modern end-to-end testing framework for web applications

What is Cypress?

Cypress is a next-generation front-end testing tool built for the modern web. It enables you to write, run, and debug tests directly in the browser with real-time reloads and time-travel debugging capabilities.

Key Features:

  • Real-time browser testing
  • Automatic waiting and retries
  • Time-travel debugging
  • Network traffic control
  • Screenshots and videos
  • Easy setup and configuration
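"Automatic waiting and retries" means Cypress re-runs queries and assertions until they pass or a timeout elapses, instead of failing on the first attempt. A rough Node sketch of that retry loop (not Cypress's actual implementation; the timeout values are illustrative):

```javascript
// Poll a condition until it returns a truthy value or the timeout
// elapses, similar in spirit to Cypress's built-in retry-ability.
async function retryUntil(check, { timeout = 4000, interval = 100 } = {}) {
  const deadline = Date.now() + timeout;
  for (;;) {
    const result = check();
    if (result) return result;
    if (Date.now() >= deadline) throw new Error('Timed out waiting for condition');
    await new Promise((resolve) => setTimeout(resolve, interval));
  }
}

// Usage sketch: wait for a value that appears asynchronously,
// the way a DOM element might render after a network call.
let element = null;
setTimeout(() => { element = { text: 'Welcome' }; }, 300);
retryUntil(() => element, { timeout: 2000 })
  .then((el) => console.log(el.text)); // Welcome
```

This is why Cypress tests rarely need explicit sleeps: the retry loop absorbs rendering and network delays up to the configured timeout.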

Advantages:

  • Fast test execution
  • Developer-friendly syntax
  • Excellent debugging capabilities
  • Built-in assertions
  • No WebDriver needed
  • Great documentation

Use Cases:

  • End-to-end testing
  • Integration testing
  • Unit testing
  • API testing
  • Visual regression testing
  • Component testing

Getting Started: Installation & Setup

Step 1: Install Cypress
# Install via npm
npm install cypress --save-dev
# Or install via yarn
yarn add cypress --dev
Step 2: Open Cypress Test Runner
# Open Cypress GUI
npx cypress open
# Run tests in headless mode
npx cypress run
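Modern Cypress (version 10 and later) reads its settings from a `cypress.config.js` file at the project root. A minimal configuration sketch; the `baseUrl` is a placeholder you would replace with your application's URL:

```javascript
// cypress.config.js -- minimal configuration sketch
const { defineConfig } = require('cypress')

module.exports = defineConfig({
  e2e: {
    baseUrl: 'https://example.com', // placeholder; tests can then use relative cy.visit() paths
    supportFile: false,             // skip the default support file for a bare-bones setup
  },
})
```

Setting `baseUrl` lets test files call `cy.visit('/login')` instead of repeating the full URL in every spec.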

Your First Cypress Test

Example: Login Test
// cypress/e2e/login.cy.js
describe('Login Functionality', () => {
  beforeEach(() => {
    cy.visit('https://example.com/login')
  })
  it('should login with valid credentials', () => {
    // Type username
    cy.get('[data-cy="username"]').type('testuser@example.com')
    
    // Type password
    cy.get('[data-cy="password"]').type('password123')
    
    // Click login button
    cy.get('[data-cy="login-button"]').click()
    
    // Verify successful login
    cy.url().should('include', '/dashboard')
    cy.get('[data-cy="welcome-message"]').should('be.visible')
  })
})

Essential Cypress Commands

Navigation & Interaction

cy.visit(url)
Navigate to a specific URL
Example: cy.visit('https://example.com')
cy.get(selector)
Get DOM element(s) by selector
Example: cy.get('[data-cy="submit-btn"]')
.click()
Click on an element (chained off a query such as cy.get)
Example: cy.get('button').click()
.type(text)
Type text into an input field (chained off a query)
Example: cy.get('input').type('Hello World')

Assertions & Verification

.should('be.visible')
Assert element is visible (chained off a query)
Example: cy.get('.modal').should('be.visible')
.should('contain', text)
Assert element contains text
Example: cy.get('h1').should('contain', 'Welcome')
.should('have.value', value)
Assert input has a specific value
Example: cy.get('input').should('have.value', 'test')
cy.url().should('include', path)
Assert the URL contains a specific path
Example: cy.url().should('include', '/dashboard')

Learning Resources

📺 Video Tutorial

Comprehensive Cypress Tutorial

Complete guide covering installation, basic commands, advanced features, and best practices.

Watch Tutorial on YouTube

📚 Official Documentation

Cypress Official Docs

Comprehensive documentation with examples, API reference, and guides for all Cypress features.

Visit Cypress Documentation

🎯 Learning Path Recommendation

  1. Start with the official Cypress documentation to understand core concepts
  2. Follow the YouTube tutorial for hands-on practice
  3. Practice with simple tests on your own projects
  4. Explore advanced features like custom commands and API testing
  5. Learn about CI/CD integration and best practices
  6. Join the Cypress community for ongoing support and learning