GPT Driver User Guide

Test Dependencies & Prompt References


What are Test Dependencies?

Test dependencies allow you to efficiently chain tests together, mimicking real user flows while ensuring a robust testing environment.

These are tests that need to be completed before running a specific test. For example, a test case requiring a logged-in user would depend on a login test being run beforehand.

Running Tests with Dependencies

GPT Driver offers two ways to execute tests with dependencies:

  1. Cloud Runs: When you run the last test in a workflow (like "Send Message"), the system automatically triggers its dependent tests (like "Login" and "Add Friend") in the correct order.

  2. Test Editor: Use the "Execute prompt incl. dependencies" option within the Test Editor. This runs the current test along with all its required dependencies.

Example (Multiple Test Dependencies/Chaining Test Dependencies)

Imagine a sequence of tests:

  1. Login

  2. Add Friend (depends on Login)

  3. Send Message (depends on Add Friend)

If you run the "Send Message" test, the system runs "Login" and then "Add Friend" first, so the test starts from the state it expects.


What are Prompt References?

Prompt Reference is a feature in GPT Driver that helps you design reusable and scalable automated tests. Unlike traditional Test Dependencies, which only allow tests to run in a strict sequence, Prompt Reference gives you the flexibility to nest prompts or entire tests within other tests, enabling modular structures that more accurately reflect real user flows in mobile apps.

Why use Prompt References?

Many users are already familiar with the Test Dependency feature, which supports basic sequential execution. While useful for linear flows, dependencies can become limiting as your test cases grow in complexity.

Prompt Reference solves this by allowing you to insert one or more existing prompts (or entire tests) directly inside another test. This lets you:

  • Build hierarchical test structures

  • Reuse logic across multiple test cases

  • Reduce duplication and maintenance overhead

  • Better organize tests around real-world app flows

Ideal Use Cases

  • Onboarding flows that reuse login or permissions setup

  • Checkout processes that include reusable steps like address entry or payment selection

  • Multi-screen navigation that reuses deep-link flows or feature walkthroughs

  • Regression suites that reuse core smoke tests in larger scenario tests

How It Works

To insert a prompt or test within another test:

  1. Open the Test Editor.

  2. Type @ where you’d like to reference another prompt or test.

  3. A dropdown list will appear with all available prompts and tests in your account.

  4. Select the one you want to insert. It will appear as a block within the current test.
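Conceptually, a referenced prompt is expanded inline before the test runs. The sketch below models that expansion with a hypothetical prompt library and a simplified `@name` syntax; the actual block format GPT Driver inserts in the editor may differ.

```python
import re

# Hypothetical library of saved prompts; names and steps are made up
# for illustration, not taken from a real GPT Driver account.
PROMPTS = {
    "login": "Type the email and password, then tap 'Sign in'",
    "grant_permissions": "Tap 'Allow' on every permission dialog",
}

def expand(test_prompt: str) -> str:
    """Replace each @name reference with the saved prompt's steps."""
    return re.sub(r"@(\w+)", lambda m: PROMPTS[m.group(1)], test_prompt)

steps = expand("@login then open the chat tab and @grant_permissions")
print(steps)
```

Because the referenced prompts live in one place, updating "login" once updates every test that references it, which is what keeps duplication and maintenance overhead down.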

Prompt Reference Demo Video
