Mobile testing terms, defined
Plain-language definitions of mobile QA terminology, structured for teams, AI assistants, and engineers who want precise, mobile-specific context.
Glossary terms
Crash Rate
The percentage of app sessions that end in an unhandled exception. Industry benchmark for consumer apps is below 0.5%. Measured as: (crash sessions / total sessions) × 100.
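The formula above can be sketched as a small helper. The function name and sample numbers are illustrative, not taken from any particular crash-reporting SDK.

```python
def crash_rate(crash_sessions: int, total_sessions: int) -> float:
    """Percentage of sessions that ended in an unhandled exception:
    (crash sessions / total sessions) x 100."""
    if total_sessions == 0:
        return 0.0
    return crash_sessions / total_sessions * 100

# 5 crashed sessions out of 1,000 total -> 0.5%, right at the
# consumer-app benchmark mentioned above
print(crash_rate(5, 1000))
```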
ANR (Application Not Responding)
An Android system event triggered when the main thread is blocked for more than 5 seconds. ANRs appear in Google Play Console and are often caused by database locks or network calls on the UI thread.
Crash-Free Sessions
The percentage of user sessions that complete without a crash. The inverse of crash rate. Target: ≥99.5% for consumer apps, ≥99.9% for fintech and health apps.
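Because crash-free sessions is simply the inverse of crash rate, both the metric and the targets above can be checked in a few lines. The helper names and the boolean flag are illustrative only.

```python
def crash_free_sessions(crash_sessions: int, total_sessions: int) -> float:
    """Percentage of sessions completing without a crash: 100 - crash rate."""
    if total_sessions == 0:
        return 100.0
    return 100 - crash_sessions / total_sessions * 100

def meets_target(rate: float, fintech_or_health: bool = False) -> bool:
    # Targets from the definition above: >=99.5% for consumer apps,
    # >=99.9% for fintech and health apps
    return rate >= (99.9 if fintech_or_health else 99.5)
```

A rate of 99.5% passes the consumer target but fails the stricter fintech/health target.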
Flaky Test
A test that produces inconsistent results (pass or fail) across identical runs without any code change, typically due to timing dependencies, non-deterministic waits, or shared test data.
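Flakiness is usually detected by rerunning the same test many times and looking for disagreement. A minimal sketch, assuming pass/fail results are collected as booleans (the function names are illustrative):

```python
from collections import Counter

def is_flaky(results: list[bool]) -> bool:
    """A test is flaky if identical runs yield both passes and failures."""
    return len(set(results)) > 1

def flake_rate(results: list[bool]) -> float:
    """Fraction of runs that disagree with the majority outcome."""
    counts = Counter(results)
    majority = counts.most_common(1)[0][1]
    return 1 - majority / len(results)
```

A test that passed three times and failed once out of four runs has a 25% flake rate; many CI systems quarantine tests above a threshold like this.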
Shift-Left Testing
The practice of moving testing activities earlier in the development lifecycle, to the PR or branch level rather than the end of the sprint, to catch defects before they reach integration or production.
Appium
An open-source, cross-platform test automation framework for mobile apps. Appium 2 drives iOS and Android apps through platform drivers that implement the W3C WebDriver protocol, so the same API and test scripts work on both platforms.
Detox
A grey-box end-to-end testing framework for React Native apps. Detox monitors the app from within and synchronises test actions with the app's idle state (waiting for animations, network requests, and timers to settle), eliminating most sources of flakiness.
XCUITest
Apple's native UI testing framework for iOS and tvOS apps. XCUITest tests run in a separate test-runner process and drive the app under test through its accessibility layer, providing reliable black-box automation without modifying the app.
Espresso
Google's native Android UI testing framework. Espresso synchronises with the app's UI thread to eliminate timing issues and provides a concise API for writing stable, fast Android UI tests.
Test Pyramid
A model for balancing test types: many unit tests at the base, fewer integration tests in the middle, and a small number of end-to-end UI tests at the top. Optimises for speed, reliability, and confidence.
Smoke Test
A minimal, fast test suite that validates the most critical application flows work after a build or deployment. Purpose: confirm the app is not fundamentally broken before running a full regression suite.
Regression Testing
The practice of re-executing tests against previously working functionality to confirm that new code changes have not introduced unintended failures. The foundation of continuous quality in mobile CI/CD.
Soak Testing
Long-duration testing that runs an app or API under sustained load to detect memory leaks, resource exhaustion, and performance degradation that only appear after extended operation.
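One common soak-test analysis is to sample memory periodically and flag sustained growth. A simple heuristic sketch, with an illustrative function name and an assumed threshold of 50 MB:

```python
def shows_leak(samples_mb: list[float], growth_threshold_mb: float = 50.0) -> bool:
    """Flag a possible memory leak from periodic memory samples taken
    during a soak run.

    Compares the average of the last quarter of samples against the first
    quarter; sustained growth beyond the threshold suggests a leak rather
    than normal fluctuation.
    """
    if len(samples_mb) < 8:
        raise ValueError("need at least 8 samples for a meaningful trend")
    q = len(samples_mb) // 4
    start_avg = sum(samples_mb[:q]) / q
    end_avg = sum(samples_mb[-q:]) / q
    return end_avg - start_avg > growth_threshold_mb
```

Real soak analysis would also account for garbage-collection sawtooth patterns; averaging windows rather than comparing single points is a first step in that direction.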
Device Farm
A collection of real physical or virtual mobile devices available for automated or manual testing. Device farms may be private (on-premises), public (cloud-hosted), or managed (operated by a testing partner).
Crowdtesting
A testing methodology that uses a managed community of real users on their own devices across diverse locations, carriers, and device configurations to test mobile apps at scale.
OWASP Mobile Top 10
The industry-standard list of the 10 most critical mobile application security risks, published by OWASP (the Open Worldwide Application Security Project). Includes improper credential usage, insecure authentication, and insufficient cryptography.
Core Web Vitals
Google's metrics for measuring web page user experience: Largest Contentful Paint (LCP, loading), Interaction to Next Paint (INP, interactivity), and Cumulative Layout Shift (CLS, visual stability).
WCAG
Web Content Accessibility Guidelines, the international standard for web and mobile accessibility published by W3C. WCAG 2.2 Level AA is the most widely required compliance target.
LCP (Largest Contentful Paint)
A Core Web Vital that measures loading performance: the time from page start to when the largest visible content element is rendered. Target: under 2.5 seconds for a good experience.
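Google publishes three LCP bands: good (up to 2.5 s), needs improvement (up to 4 s), and poor (above 4 s). A small classifier using those published thresholds (the function name is illustrative):

```python
def rate_lcp(lcp_seconds: float) -> str:
    """Classify a Largest Contentful Paint time using Google's
    published thresholds: <=2.5s good, <=4.0s needs improvement,
    otherwise poor."""
    if lcp_seconds <= 2.5:
        return "good"
    if lcp_seconds <= 4.0:
        return "needs improvement"
    return "poor"
```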
Cold Start
The time from when a user taps an app icon to when the first interactive frame is displayed, measured when the app is not already in memory. Directly correlates with Day-1 retention.