Performance Testing

Introduction

The goal of Performance Testing is not to find functional bugs but to eliminate performance bottlenecks. It provides clients with information about the application's speed, stability, and scalability under specific workloads.

Without Performance Testing, software is likely to suffer from issues such as running slowly when several users use it simultaneously, inconsistent behavior across different operating systems, and poor usability.

Types of Performance Testing

  • Load testing - checks the application's ability to perform under anticipated user loads. The objective is to identify performance bottlenecks before the software application goes live (a k6 sketch of this kind of test follows this list).

  • Stress testing - involves testing an application under extreme workloads to see how it handles high traffic or data processing. The objective is to identify the breaking point of an application.

  • Endurance testing - is done to make sure the software can handle the expected load over a long period of time.

  • Spike testing - tests the software's reaction to sudden large spikes in the load generated by users.

  • Volume testing - populates the database with a large volume of data and monitors the overall system's behavior. The objective is to check the application's performance under varying database volumes.

  • Scalability testing - determines the software application's effectiveness in "scaling up" to support an increase in user load. It helps plan capacity additions to your software system.
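
The project's performance tests run on k6 (see the K6 Test Runs page). As a minimal sketch of a load test, assuming a hypothetical `https://staging.example.com/health` endpoint and placeholder user counts and thresholds, a script could look like this:

```js
// load-test.js — minimal k6 load-test sketch (endpoint and numbers are placeholders)
import http from 'k6/http';
import { check, sleep } from 'k6';

export const options = {
  stages: [
    { duration: '1m', target: 20 }, // ramp up to 20 virtual users
    { duration: '3m', target: 20 }, // hold the anticipated load
    { duration: '1m', target: 0 },  // ramp down
  ],
  thresholds: {
    http_req_duration: ['p(95)<500'], // 95% of requests complete under 500 ms
    http_req_failed: ['rate<0.01'],   // less than 1% of requests fail
  },
};

export default function () {
  // Hypothetical endpoint; replace with the URL under test from the project's credentials page
  const res = http.get('https://staging.example.com/health');
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1); // simulate user think time between requests
}
```

Run it with `k6 run load-test.js`; if any threshold is crossed, k6 exits with a non-zero status, which lets the run fail a CI pipeline.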

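A stress or spike test can reuse the same request logic and change only the `stages` profile. A hypothetical spike profile (all numbers are placeholders) might look like:

```js
// spike-test.js — same request logic as the load test, only the load profile changes
import http from 'k6/http';
import { sleep } from 'k6';

export const options = {
  stages: [
    { duration: '30s', target: 5 },   // normal baseline traffic
    { duration: '10s', target: 200 }, // sudden spike in virtual users
    { duration: '1m', target: 200 },  // hold the spike
    { duration: '10s', target: 5 },   // drop back to baseline
    { duration: '1m', target: 0 },    // observe recovery, then ramp down
  ],
};

export default function () {
  http.get('https://staging.example.com/health'); // hypothetical endpoint
  sleep(1);
}
```
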
