
Create an End-to-End Scenario in UI Automation Testing

What is an End-to-End Scenario?

An end-to-end (E2E) scenario in UI Automation Testing represents a complete user journey through your application. It’s a sequence of interactions that accomplishes a specific goal from the user’s perspective—from start to finish, just as a real user would experience it.

Examples of E2E Scenarios:

  • User registers for an account, verifies email, and completes profile setup
  • Customer browses products, adds items to cart, and completes checkout
  • Content creator logs in, creates a blog post, and publishes it
  • Admin user navigates to settings, updates configuration, and saves changes
  • User initiates password reset, receives email, and sets new password

Prerequisites

Before creating your first scenario, ensure you have:

  1. A Test Suite Created – Scenarios must belong to a test suite
  2. Agent Installed – The recording agent must be installed and configured
  3. Application Access – Access to the application environment you want to test
  4. Test Data Ready – Any accounts, credentials, or data needed for testing
  5. Clear Goal – Understanding of what user journey you want to test

Step-by-Step Guide to Creating a Scenario

Step 1: Navigate to Your Test Suite

  1. Open E2E Test Automation
  2. Navigate to the UI Automation Testing module
  3. Locate and open the test suite where you want to create the scenario
  4. You’ll see the test suite dashboard with existing scenarios (if any)

Tip: If you haven’t created a test suite yet, create one first by clicking “Create New Test Suite” and filling in the required details (name, description, team, visibility level).


Step 2: Create a New Scenario

  1. Inside your test suite, click the “Create New Scenario” button
  2. A dialog or form will appear asking for scenario details

Scenario Name:

  • Provide a clear, descriptive name that explains what the scenario tests
  • Use action-oriented language
  • Be specific about the workflow

Good Examples:

  • “User Registration with Email Verification”
  • “Complete Checkout Process with Saved Payment Method”
  • “Create and Publish Blog Post with Images”
  • “Multi-Step Form Submission with Validation”

Poor Examples:

  • “Test 1”
  • “Login”
  • “Scenario”
  • “Check stuff”

Scenario Description:

  • Add a detailed description of what this scenario covers
  • Include the starting point and expected end state
  • Mention any prerequisites or test data requirements
  • Document the business value or user story being tested

Example Description:

This scenario tests the complete user registration workflow, including:

  • Navigating to the registration page
  • Filling in all required fields (name, email, password)
  • Accepting terms and conditions
  • Email verification
  • Profile setup completion

Prerequisites:

  • Use test email account: test+{timestamp}@example.com
  • Ensure email service is accessible for verification

Expected Outcome:

  • User successfully registers
  • Verification email is received
  • User can log in with new credentials
  • Profile is created with basic information

Additional Settings (if available):

  • Tags/Labels: Add tags for easy filtering (e.g., “critical”, “checkout”, “authentication”)
  • Priority: Set scenario priority (high, medium, low)
  • Browser/Device: Specify which browser or device to test on
  • Environment: Select testing environment (dev, staging, production)

  3. Click “Create” or “Save” to create the scenario

Step 3: Configure Scenario Settings

Once the scenario is created, you may need to configure additional settings:

Execution Settings:

  • Timeout Duration: How long to wait for elements before failing
  • Retry Logic: Whether to retry failed actions automatically
  • Screenshot Capture: When to capture screenshots (on failure, every step, etc.)
  • Video Recording: Enable/disable video recording of test execution

Environment Configuration:

  • Base URL: The starting URL for your application
  • Test Data: Any variables or data specific to this scenario
  • Authentication: Pre-configured login credentials if needed
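
These execution and environment settings are normally configured in the product’s UI. If your team also maintains code-based equivalents, the sketch below shows how the same options might map to a Playwright-style config file. This is an illustrative assumption only (the file name, runner, and values are not part of this product):

```typescript
// playwright.config.ts — illustrative equivalent of the settings above,
// assuming a Playwright-based runner. All values are example assumptions.
import { defineConfig } from '@playwright/test';

export default defineConfig({
  timeout: 30_000,   // Timeout Duration: fail after 30 s of waiting for an element
  retries: 1,        // Retry Logic: re-run a failed scenario once before reporting failure
  use: {
    baseURL: 'https://staging.example.com',  // Base URL for the selected environment
    screenshot: 'only-on-failure',           // Screenshot Capture policy
    video: 'retain-on-failure',              // Video Recording policy
  },
});
```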

Notification Settings:

  • On Success: Who to notify when the scenario passes
  • On Failure: Who to alert when the scenario fails
  • Execution Schedule: When to run this scenario automatically

Step 4: Understand the Scenario Structure

Before recording, understand how your scenario will be organized:

Scenario Components:

Test Cases (Actions):

  • Individual steps in your scenario
  • Each click, input, or navigation is a test case
  • Can be recorded automatically or added manually

Assertions:

  • Validation points that verify expected behavior
  • Check that elements exist, text appears, values are correct
  • Ensure the application responded as expected

Reusable Components:

  • Pre-built action sequences you can insert
  • Common workflows like login, navigation, logout
  • Save time and ensure consistency

Flow Connections:

  • How different screens and actions connect
  • The path a user takes through the application
  • Visual representation of the journey
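
To picture how these components fit together, the sketch below models a scenario as data in TypeScript. The interfaces are hypothetical and for illustration only; the tool builds and stores this structure for you when you record.

```typescript
// Hypothetical data model for the components described above.
// These type names are illustrative, not the product's actual schema.
interface TestCase {             // an individual action: click, input, or navigation
  description: string;
  action: 'click' | 'input' | 'navigate';
  target: string;                // element locator or URL
  value?: string;                // text to type, if any
}

interface Assertion {            // a validation point
  description: string;
  check: 'visible' | 'text' | 'url' | 'count';
  expected: string | number;
}

interface Scenario {
  name: string;
  steps: Array<TestCase | Assertion>;  // the ordered flow the user follows
  reusableComponents: string[];        // e.g. ["User Login", "Logout"]
}
```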

Step 5: Plan Your Scenario Flow

Before you start recording, plan the complete flow:

Map Out the Journey:

  1. Starting Point: Where does the user begin?
    • Example: Homepage, login page, dashboard
  2. Key Actions: What does the user do at each step?
    • Example: Click “Sign Up”, fill form, submit, verify email
  3. Validation Points: Where should you add assertions?
    • Example: After login → verify welcome message appears
  4. End State: Where does the journey complete?
    • Example: User dashboard, confirmation page, success message

Example Planning Document:

Scenario: Complete Product Purchase

1. Starting Point: Homepage (https://example.com)

2. Flow:
   a. Navigate to login page
   b. Log in with test credentials
      → Assertion: Verify logged in (check user name appears)
   c. Search for product "wireless headphones"
      → Assertion: Verify search results appear
   d. Click on first product in results
      → Assertion: Verify product details page loads
   e. Click "Add to Cart"
      → Assertion: Verify cart count increases
      → Assertion: Verify success message appears
   f. Navigate to cart
      → Assertion: Verify product appears in cart
   g. Click "Checkout"
      → Assertion: Verify checkout page loads
   h. Fill shipping information
   i. Select shipping method
      → Assertion: Verify shipping cost updates
   j. Fill payment information
   k. Click "Place Order"
      → Assertion: Verify order confirmation page
      → Assertion: Verify order number appears
      → Assertion: Verify confirmation email sent

3. End State: Order confirmation page with order number
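
To make the mapping from plan to execution concrete, here is a hedged sketch of steps b through e of that plan written as a Playwright-style test. The locators, labels, and messages are assumptions about the example store, and with this product the recording agent captures these steps for you rather than requiring hand-written code:

```typescript
import { test, expect } from '@playwright/test';

// Illustrative sketch of steps b–e of the plan above (assumed locators and text).
test('complete product purchase — first steps', async ({ page }) => {
  // b. Log in with test credentials
  await page.goto('https://example.com/login');
  await page.getByLabel('Email').fill('test.user@example.com');
  await page.getByLabel('Password').fill('TestPass123!');
  await page.getByRole('button', { name: 'Log in' }).click();
  await expect(page.getByText('Test User')).toBeVisible();       // assertion: logged in

  // c. Search for product "wireless headphones"
  await page.getByPlaceholder('Search').fill('wireless headphones');
  await page.keyboard.press('Enter');
  await expect(page.getByRole('listitem')).not.toHaveCount(0);   // assertion: results appear

  // d. Open the first product in the results
  await page.getByRole('listitem').first().click();
  await expect(page.getByRole('heading', { level: 1 })).toBeVisible(); // product page loads

  // e. Add to cart and verify feedback
  await page.getByRole('button', { name: 'Add to Cart' }).click();
  await expect(page.getByText('Added to cart')).toBeVisible();   // assertion: success message
});
```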

Identify Reusable Components:

Look for common actions that might be used in other scenarios:

  • Login sequence → Create “User Login” component
  • Search functionality → Create “Product Search” component
  • Add to cart → Create “Add to Cart” component

Step 6: Prepare Test Data

Gather all data needed for your scenario:

User Credentials:

  • Test usernames and passwords
  • Different user roles if testing permissions
  • Account with specific states (verified, unverified, etc.)

Test Inputs:

  • Form field values
  • Search terms
  • Product names or IDs
  • Addresses and contact information
  • Payment details (use test payment methods)

Expected Outputs:

  • Success messages
  • Error messages
  • Confirmation numbers
  • Expected URLs after navigation

Example Test Data Sheet:

Scenario: User Registration

Input Data:
  • First Name: John
  • Last Name: Doe
  • Email: test+{timestamp}@example.com
  • Password: TestPass123!
  • Phone: (555) 123-4567
  • Address: 123 Test Street
  • City: Testville
  • Zip: 12345

Expected Outputs:
  • Success message: "Welcome, John!"
  • Redirect URL: https://example.com/dashboard
  • Confirmation email subject: "Welcome to Example.com"
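
If you also keep test data in code or a fixture file, a small sketch like the one below keeps the data sheet and the unique timestamped email in one place. The shape of the object is an assumption made for illustration:

```typescript
// Hypothetical test-data fixture mirroring the data sheet above.
// Generating the email at runtime keeps each registration unique.
const timestamp = Date.now();

export const registrationData = {
  firstName: 'John',
  lastName: 'Doe',
  email: `test+${timestamp}@example.com`,
  password: 'TestPass123!',
  phone: '(555) 123-4567',
  address: '123 Test Street',
  city: 'Testville',
  zip: '12345',
  expected: {
    successMessage: 'Welcome, John!',
    redirectUrl: 'https://example.com/dashboard',
    emailSubject: 'Welcome to Example.com',
  },
};
```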

Step 7: Identify Assertion Points

Determine where to add validations throughout your scenario:

After Critical Actions:

  • After login → Verify user is authenticated
  • After form submission → Verify success message
  • After navigation → Verify correct page loaded
  • After data entry → Verify data was saved

At Decision Points:

  • Before conditional actions → Verify conditions are met
  • After branching logic → Verify correct path taken

For Data Validation:

  • After retrieving data → Verify data is correct
  • After calculations → Verify computed values
  • After updates → Verify changes persisted

Common Assertion Types:

Element Visibility:

  • “Verify welcome message appears”
  • “Verify error notification is displayed”
  • “Verify submit button is visible”

Text Content:

  • “Verify page title is ‘Dashboard’”
  • “Verify user name displays as ‘John Doe’”
  • “Verify success message says ‘Order placed successfully’”

Element State:

  • “Verify submit button is enabled”
  • “Verify checkbox is checked”
  • “Verify field is not editable”

URL Validation:

  • “Verify URL is ‘https://example.com/dashboard’”
  • “Verify URL contains ‘/order-confirmation’”

Count/Quantity:

  • “Verify cart shows 3 items”
  • “Verify 10 results appear in search”
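
For reference, the sketch below shows how each of these assertion types typically maps to code in a Playwright-style runner. The locators are assumptions, and in this tool you add the same checks through the recorder rather than by hand:

```typescript
import { expect, type Page } from '@playwright/test';

// Illustrative one-liners for each assertion type above (assumed locators).
async function exampleAssertions(page: Page) {
  // Element visibility
  await expect(page.getByText('Welcome')).toBeVisible();
  // Text content
  await expect(page.getByTestId('user-name')).toHaveText('John Doe');
  // Element state
  await expect(page.getByRole('button', { name: 'Submit' })).toBeEnabled();
  await expect(page.getByLabel('Subscribe')).toBeChecked();
  // URL validation
  await expect(page).toHaveURL(/\/dashboard/);
  // Count / quantity
  await expect(page.getByTestId('cart-item')).toHaveCount(3);
}
```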

Step 8: Consider Edge Cases and Variations

Think about different paths and scenarios:

Happy Path:

  • Everything works as expected
  • All inputs are valid
  • All services are available
  • Normal user behavior

Error Scenarios:

  • Invalid inputs → Test form validation
  • Missing required fields → Test error messages
  • Network failures → Test error handling
  • Timeout scenarios → Test loading states

Edge Cases:

  • Minimum and maximum input values
  • Special characters in input
  • Very long input strings
  • Concurrent actions
  • Browser-specific behaviors

Alternative Flows:

  • Different user roles (admin, regular user, guest)
  • Different data combinations
  • Various browser sizes (desktop, tablet, mobile)
  • Different entry points to the same flow
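
When a code-based runner is part of your workflow, these variations are often covered by parameterizing the same flow rather than duplicating it. The sketch below loops one journey over two assumed viewport sizes in Playwright; everything in it is illustrative and not specific to this product:

```typescript
import { test, expect } from '@playwright/test';

// Run the same journey on two assumed viewport sizes (desktop and mobile).
const viewports = [
  { name: 'desktop', width: 1280, height: 720 },
  { name: 'mobile', width: 375, height: 667 },
];

for (const vp of viewports) {
  test.describe(`checkout on ${vp.name}`, () => {
    test.use({ viewport: { width: vp.width, height: vp.height } });

    test('guest checkout happy path', async ({ page }) => {
      await page.goto('https://example.com');
      // ...same flow as the planning document above...
      await expect(page).toHaveURL(/example\.com/);
    });
  });
}
```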

Step 9: Set Up Reusable Components

Before recording, create or identify reusable components:

Common Reusable Components:

Login Component:

Actions:
  1. Navigate to login page
  2. Enter username
  3. Enter password
  4. Click login button
  5. Verify successful login

Navigation Component:

Actions:
  1. Click main menu
  2. Select category
  3. Verify page loads

Logout Component:

Actions:
  1. Click user menu
  2. Click logout
  3. Verify redirect to homepage
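
In code-based terms, a reusable component is simply a shared helper. The sketch below shows the “User Login” component as a hypothetical Playwright function that any scenario could call; the locators are assumptions, and it presumes a baseURL is configured as in the earlier config sketch:

```typescript
import { expect, type Page } from '@playwright/test';

// Hypothetical "User Login" reusable component as a shared helper.
export async function login(page: Page, email: string, password: string) {
  await page.goto('/login');                                    // 1. Navigate to login page
  await page.getByLabel('Email').fill(email);                   // 2. Enter username
  await page.getByLabel('Password').fill(password);             // 3. Enter password
  await page.getByRole('button', { name: 'Log in' }).click();   // 4. Click login button
  await expect(page.getByTestId('user-menu')).toBeVisible();    // 5. Verify successful login
}
```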

Benefits:

  • Create once, use in multiple scenarios
  • Update in one place, changes apply everywhere
  • Ensure consistency across tests
  • Speed up scenario creation

Step 10: Review and Finalize Scenario Configuration

Before starting the recording:

Double-Check:

  • Scenario name is clear and descriptive
  • Description documents the complete flow
  • Test data is ready and accessible
  • Reusable components are identified or created
  • Assertion points are planned
  • Environment settings are correct

Prepare Your Environment:

  • Clear browser cache if needed
  • Close unnecessary tabs or applications
  • Ensure stable internet connection
  • Have test credentials readily available
  • Open any reference documentation

Mental Checklist:

  • I know where to start
  • I know each action to perform
  • I know where to add assertions
  • I have all test data ready
  • I understand the expected end state
  • I’ve identified reusable components

Scenario Creation Best Practices

Naming Conventions

Do:

  • Use descriptive, action-oriented names
  • Include the main workflow or goal
  • Be specific about what’s being tested
  • Use consistent naming patterns across scenarios

Don’t:

  • Use generic names like “Test 1” or “Scenario A”
  • Use technical jargon unless necessary
  • Make names too long (keep under 100 characters)
  • Use special characters that might cause issues

Scenario Scope

Do:

  • Keep scenarios focused on a single user journey
  • Test one complete workflow per scenario
  • Break very long flows into multiple scenarios
  • Group related actions together logically

Don’t:

  • Try to test everything in one scenario
  • Mix unrelated workflows
  • Create scenarios that are too short or too long
  • Include unnecessary navigation or setup

Documentation

Do:

  • Write clear, detailed descriptions
  • Document prerequisites and assumptions
  • Note any special configurations needed
  • Include expected outcomes
  • Reference user stories or requirements

Don’t:

  • Leave descriptions empty or vague
  • Assume everyone knows the context
  • Skip documenting edge cases or variations
  • Forget to update descriptions when scenarios change

Organization

Do:

  • Group related scenarios in the same test suite
  • Use tags or labels for easy filtering
  • Set appropriate priority levels
  • Organize by user role, feature, or workflow

Don’t:

  • Create disorganized test suites
  • Mix critical and minor scenarios without distinction
  • Duplicate scenarios across multiple suites unnecessarily
  • Forget to archive or delete obsolete scenarios

Common Scenario Types

Authentication Scenarios

Login and Logout:

  • Valid credentials
  • Invalid credentials
  • Password reset flow
  • Remember me functionality
  • Session timeout

Form Submission Scenarios

Data Entry and Validation:

  • Complete form with valid data
  • Form validation testing
  • Error message display
  • Multi-step form completion
  • Draft saving and restoration

E-commerce Scenarios

Shopping and Checkout:

  • Product search and filtering
  • Add/remove from cart
  • Guest checkout
  • Registered user checkout
  • Payment processing
  • Order confirmation

Content Management Scenarios

CRUD Operations:

  • Create new content
  • Edit existing content
  • Delete content
  • Publish/unpublish workflow
  • Version control

Search and Filter Scenarios

Data Retrieval:

  • Basic search
  • Advanced search with filters
  • Sort results
  • Pagination
  • No results handling

After Creating the Scenario

Once you’ve created and configured your scenario:

Next Steps:

  1. Record the Scenario – Use the recording agent to capture the actual workflow
  2. Review Recording – Check that all actions were captured correctly
  3. Add Assertions – Insert validation points at key steps
  4. Test Execution – Run the scenario to verify it works
  5. Refine and Optimize – Adjust locators, timing, or actions as needed

Ongoing Maintenance:

  • Update scenarios when UI changes
  • Review and update test data regularly
  • Monitor scenario success rates
  • Retire obsolete scenarios
  • Keep documentation current

Summary

Creating an end-to-end scenario in UI Automation Testing involves careful planning and configuration before you even start recording. By following these steps:

  1. Navigate to your test suite
  2. Create a new scenario with a clear name and description
  3. Configure scenario settings (timeout, retries, notifications)
  4. Plan the complete flow from start to finish
  5. Prepare all necessary test data
  6. Identify where to add assertions
  7. Consider edge cases and variations
  8. Set up reusable components
  9. Review and finalize configuration

You’ll create well-structured, maintainable scenarios that effectively validate your application’s user workflows.

Remember: The time spent planning and configuring your scenario properly will save significant time during recording, execution, and maintenance. A well-planned scenario is easier to record, more reliable during execution, and simpler to maintain over time.

Ready to record? Proceed to the next guide: “Record a Complete End-to-End Scenario in UI Automation Testing”

