PRODUCT DOCUMENTATION

Barcable User Guide

Use this guide to connect GitHub, onboard a repository, generate tests, run them, and interpret results with repeatable best practices.

Prerequisites

Who should use this

Backend engineers, SREs, and infrastructure leads who own service performance and reliability.

Repository access

Your GitHub account needs access to every repository you plan to exercise during onboarding.

Token scopes

Generate a Personal Access Token with repo and read:org. Barcable calls /user/repos and repository contents APIs to discover Docker assets.

Docker readiness

Each repo should ship a Dockerfile that builds with docker build and exposes the service port via EXPOSE.

Checklist

  • Confirm GitHub access and token scopes.
  • Verify the target repo builds locally with Docker.
  • Choose the first repository and branch you will validate with Barcable.

1. Overview

What Barcable does

Barcable discovers your Dockerized services, generates k6 load tests from repository context (OpenAPI specs, routes, fixtures), and runs them on managed Cloud Run jobs while streaming execution progress and metrics back to the dashboard.
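
The exact script depends on your repository context, but a k6 test of the kind Barcable generates is ordinarily along these lines (the endpoint, load profile, and checks below are illustrative placeholders, not actual Barcable output):

  // Illustrative k6 script; endpoint, load profile, and checks are placeholders.
  import http from 'k6/http';
  import { check, sleep } from 'k6';

  export const options = {
    vus: 10,          // simulated concurrent clients
    duration: '2m',   // short smoke-length run
  };

  export default function () {
    const res = http.get('https://your-service.example.com/api/customers');
    check(res, {
      'status is 200': (r) => r.status === 200,
      'responds under 500 ms': (r) => r.timings.duration < 500,
    });
    sleep(1);
  }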

First-session outcome

Connect GitHub, onboard one repository, generate an endpoint-specific test, execute it against your service, and interpret p95 latency and error rate on the reports page.

Checklist

  • Understand Barcable's discovery -> generation -> execution flow.
  • Identify which repository and endpoint you will exercise first.

2. Quick Start Checklist

1. Create your token

Generate a GitHub Personal Access Token with repo and read:org so Barcable can discover repositories and Docker assets.

2. Add the token in Settings

Paste the token inside Barcable Settings and run Test Connection to confirm access before moving on.

3. Onboard a repository

From Explore Repos choose a Docker-ready project and complete the onboarding modal with the right build context.

4. Generate a test suite

Open Explore Tests for the target branch and run Generate Tests to produce scripts and scenarios for your endpoints.

5. Run the suite

Kick off Auto-Run or Toggle Runs from the Dashboard to execute the generated tests against your environment.

6. Review results

Inspect latency, throughput, and failure rates on the Reports page to decide whether the service is healthy.

Checklist

  • PAT connected and validated.
  • Repository shows as onboarded and Docker ready.
  • One run executed and metrics reviewed.

3. Step-by-Step Setup and Usage

Follow these sub-sections sequentially to connect Barcable, generate meaningful tests, and interpret results with confidence.

3a. Sign in and connect your GitHub token

Access Barcable

Sign in with GitHub (or your configured email login), then open Settings from the sidebar.

Add the token

In the GitHub Personal Access Token card, choose Add Token.

  • Token Name: choose a label such as prod-load-testing.
  • Personal Access Token: paste the 40-character value (format ghp_xxxxx).

Click Save Token. The token is stored only in browser localStorage.

Validate access

Choose Test Connection to verify scopes.

  • Success shows Connected! Found N repositories.
  • Errors usually mean the token lacks repo/read:org, has expired, or still needs organization approval in GitHub settings.

Use Clear any time you need to remove a saved token.
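
If Test Connection keeps failing and you want to rule out Barcable itself, you can check the token directly against the GitHub API that Barcable uses. A minimal sketch, assuming Node 18+ and the token exported as GITHUB_TOKEN (save as check-token.mjs and run with node check-token.mjs):

  // Sanity-check a PAT against /user/repos outside Barcable (Node 18+).
  // GITHUB_TOKEN and the file name are assumptions for this sketch.
  const token = process.env.GITHUB_TOKEN;

  const res = await fetch('https://api.github.com/user/repos?per_page=1', {
    headers: {
      Authorization: `Bearer ${token}`,
      Accept: 'application/vnd.github+json',
    },
  });

  console.log('Status:', res.status); // 200 means the token authenticates
  console.log('Scopes:', res.headers.get('x-oauth-scopes')); // should include repo, read:org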

Checklist

  • Token saved and shows the "Configured" badge.
  • "Test Connection" succeeds with expected repo count.
  • Rotation reminder documented (recommended every 90 days).

3b. Explore repositories and onboard one

Find your repository

Head to Explore Repos, search or filter, and refresh the catalog whenever new repositories appear.

Review repository details

Each card highlights language, stars, forks, visibility, and whether it is already onboarded (look for the green Onboarded pill).

Complete the onboarding modal

  • Build Context: ./ if the Dockerfile sits at repo root, otherwise a relative folder (e.g. services/api).
  • Dockerfile Path: default Dockerfile unless named differently.
  • Build Command: keep docker build . or add flags as needed.
  • Start Command: command Barcable runs after build (e.g. npm run start:prod).

Validate and deploy

Use Validate to confirm Docker assets, then Onboard Repository. Review branch readiness under View Branches when you are ready to deploy.

Checklist

  • Target repo onboarded and listed under "Repositories".
  • Branch badges show "Docker Ready" or deployment status.
  • Troubleshooting notes captured for any validation failures.

3c. Tests: discover, generate, and manage

Open the workspace

Navigate to Explore Tests, then open the branch workspace for your onboarded repo.

Configure generation

Choose Generate Tests and set options:

  • Provide a custom base URL if your deployment differs from defaults.
  • Add custom headers for auth or tenancy needs.

Review generated suites

Barcable runs fetching, discovery, analysis, and generation passes. Once finished, inspect each suite's endpoints, load profile, and payloads, and download scripts when needed.
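
A downloaded script edited locally for a different base URL and extra headers might look like the excerpt below (a sketch; the __ENV names, endpoint, and header values are placeholders rather than Barcable's exact output):

  // Illustrative excerpt of a locally edited script.
  import http from 'k6/http';
  import { check } from 'k6';

  const BASE_URL = __ENV.BASE_URL || 'https://staging.example.com';

  const params = {
    headers: {
      Authorization: `Bearer ${__ENV.API_TOKEN}`, // auth header
      'X-Tenant-Id': 'acme',                      // tenancy header
    },
  };

  export default function () {
    const res = http.get(`${BASE_URL}/api/orders`, params);
    check(res, { 'status is 200': (r) => r.status === 200 });
  }

Running it with k6 run -e BASE_URL=https://preview.example.com -e API_TOKEN=<token> script.js lets you retarget an environment without editing the file again.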

Iterate on scenarios

  • Regenerate with updated headers or base URLs whenever inputs change.
  • Edit downloaded scripts locally to tweak payloads or thresholds.
  • Use bulk delete to retire outdated scenarios and keep the list focused.

Checklist

  • At least one generated test reviewed in detail.
  • Scripts downloaded or regenerated with correct headers/base URL.
  • Unneeded tests cleaned up with the bulk "Delete" control.

3d. Run tests from the dashboard

Auto-Run Tests

Launch all stored suites with a single click. Barcable spins up Cloud Run jobs and streams progress while tests execute.

Run Tests

Use this option once a deployment shows Deployed in the status card. Runs target the detected deployment URL.

Toggle Runs

Provide an ad-hoc base URL (for example, a preview environment) plus optional headers, then execute the selected suites against that endpoint.

Monitor execution

  • Track counts and live success rate in the Test Execution Status card.
  • Runs finish automatically once suites complete; re-run after adjusting base URLs or regenerating scripts.
  • Latest Test Run summarizes successes, failures, and timestamps for quick triage.

Checklist

  • First run completes and progress appears in "Test Execution Status".
  • Learned which run mode (Auto-Run, Run Tests, Toggle Runs) suits each scenario.
  • Noted the limitation that runs auto-complete without a manual stop.

3e. Metrics and reports

Navigate the reports view

Jump to View Reports from dashboard quick actions or open the Reports tab directly to see recent executions.

Filter what matters

  • Filter by test name or repository, or refresh to pull fresh data.
  • Export JSON when you need to take results into another tool.

Interpret key metrics

  • http_req_duration_p95 for latency.
  • http_req_failed_rate for error percentage.
  • http_reqs, vus, and vus_max for load and throughput.
  • Badge colors on http_req_success_rate indicate healthy, warning, or failing runs.
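
These report columns map onto k6's built-in metrics, so if you want the same criteria enforced during a run you can add thresholds to a downloaded script (a sketch; the 500 ms and 1% limits are examples, not recommended targets):

  // k6 thresholds corresponding to the report columns above.
  export const options = {
    thresholds: {
      http_req_duration: ['p(95)<500'], // p95 latency below 500 ms
      http_req_failed: ['rate<0.01'],   // error rate below 1%
    },
  };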

Compare runs

Sort by start time, scope the page with query parameters (for example, repository filters), and export rows to diff p95 or failure rates as you refine baselines.
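
If you export two runs as JSON, a small script can diff the headline numbers. A sketch assuming Node 18+; the field names (metrics.http_req_duration_p95, metrics.http_req_failed_rate) are assumptions about the export layout, so adjust them to whatever your export actually contains:

  // diff-runs.mjs -- compare two exported runs (field names are assumed).
  import { readFileSync } from 'node:fs';

  const [baseline, candidate] = process.argv.slice(2).map(
    (path) => JSON.parse(readFileSync(path, 'utf8'))
  );

  const p95Delta = candidate.metrics.http_req_duration_p95
                 - baseline.metrics.http_req_duration_p95;
  const failDelta = candidate.metrics.http_req_failed_rate
                  - baseline.metrics.http_req_failed_rate;

  console.log(`p95 delta: ${p95Delta.toFixed(1)} ms`);
  console.log(`failure-rate delta: ${(failDelta * 100).toFixed(2)} percentage points`);

Run it as node diff-runs.mjs baseline.json latest.json when refining baselines.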

Checklist

  • Located the latest run in the reports list.
  • Interpreted p95 latency and error rates using table columns.
  • Exported or documented metrics for baseline comparison.

4. Best Practices

1. Keep payloads realistic

Use anonymized production traffic or fixtures so generated scenarios mirror auth patterns, headers, and validation logic.
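
One way to do this in an edited script is to load anonymized fixtures once and reuse them across virtual users (a minimal sketch; users.json, its fields, and the /api/login endpoint are placeholders):

  // Load anonymized fixture data once and share it across virtual users.
  import http from 'k6/http';
  import { check } from 'k6';
  import { SharedArray } from 'k6/data';

  const users = new SharedArray('users', () =>
    JSON.parse(open('./users.json'))
  );

  export default function () {
    const user = users[Math.floor(Math.random() * users.length)];
    const res = http.post(
      'https://your-service.example.com/api/login',
      JSON.stringify({ email: user.email, password: user.password }),
      { headers: { 'Content-Type': 'application/json' } }
    );
    check(res, { 'login succeeded': (r) => r.status === 200 });
  }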

2. Seed dependable data

Provision test accounts or fixtures in your deployment pipeline to stabilize responses across repeated Barcable runs.

3. Start with short runs

Begin with focused smoke tests (2-5 minutes) before graduating to endurance or stress profiles.

4. Match production ramps

Adopt realistic ramp shapes (for example linear warm-up) to surface cold-start and autoscaling issues.
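
In k6 terms, a linear warm-up into a short plateau looks like the scenario below (durations and VU targets are examples to adapt, not recommendations):

  // Ramp profile: linear warm-up, steady plateau, ramp-down.
  export const options = {
    scenarios: {
      ramped_smoke: {
        executor: 'ramping-vus',
        startVUs: 0,
        stages: [
          { duration: '1m', target: 20 },  // linear warm-up (cold starts surface here)
          { duration: '3m', target: 20 },  // steady state to measure against
          { duration: '30s', target: 0 },  // ramp-down
        ],
      },
    },
  };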

5. Name suites clearly

Use method-resource-purpose patterns (GET-customers-smoke-v2) so lists stay scannable for your team.

6. Add lightweight tagging

Until UI tags arrive, encode metadata in test names (such as @release-2024-07) and note conventions in your runbook.

Checklist

  • Payloads, ramps, and durations mapped to real traffic.
  • Naming approach agreed with the team.
  • Data seeding strategy validated.

5. Security and Least Privilege

1. Handle PATs with care

Store tokens only in the browser, rotate every 90 days, and revoke immediately when personnel changes occur.

2. Limit scopes

Stick to repo and read:org. Add broader scopes only when a future feature explicitly requires them.

3. Remove local copies

Use Clear in Settings and delete the token from GitHub (Settings -> Developer settings -> Tokens) when access is no longer required.

4. Review access logs

Leverage GitHub audit logs and forthcoming Barcable audit entries to monitor token usage.

Checklist

  • Rotation schedule documented.
  • PAT cleared from unused browsers.
  • Revocation procedure rehearsed with the team.

6. Troubleshooting & FAQs

Issue: Invalid token when testing connection
Likely cause: PAT missing repo or read:org
Resolution: Regenerate the token with both scopes, update in Settings, and retest.

Issue: Repo not marked Docker Ready
Likely cause: Dockerfile outside the declared build context, or build failing
Resolution: Adjust Build Context/Dockerfile Path in the onboarding modal; ensure docker build succeeds locally.

Issue: Tests never start
Likely cause: Test generation did not complete, or no deployment/base URL provided
Resolution: Return to Explore Tests, regenerate until tests appear, and in Dashboard supply a base URL via Toggle Runs or deploy from Repositories.

Issue: Metrics flat or empty
Likely cause: Requests hitting a placeholder URL or failing authentication
Resolution: Set the correct base URL (section 3d) and ensure headers include valid credentials.

Issue: Need detailed logs
Likely cause: Want to inspect deployment or orchestrator errors
Resolution: Open Repositories -> View Branches. Failed deployments show the error inline and fetch extra context via /api/logs. For run-time issues, export the run from Reports and examine console logs recorded during execution.

Additional tips:

  • Check the browser console for detailed JSON responses from APIs such as /api/batch-orchestrator when debugging orchestration errors.
  • Use the Select All + Delete controls in Explore Tests to clear corrupted or outdated scenarios before regenerating.

Checklist

  • Identified the relevant diagnostic view (Settings, Explore Repos, Dashboard, Reports).
  • Logged outstanding issues and resolutions.
  • Know how to escalate (logs, Barcable support) if the UI does not provide enough insight.

7. Glossary

Virtual users (VUs)

Simulated concurrent clients generated by k6.

p95 latency

The response time under which 95% of requests complete.

Throughput (http_reqs)

Total requests executed during the run.

Error rate (http_req_failed_rate)

Fraction of requests returning non-success responses.

Warm up

Initial load period that primes caches and autoscaling before measuring steady-state metrics.

Cold start

Latency spikes caused by newly provisioned instances or functions.

Suite

The collection of generated tests for a repository/branch.

Baseline

Reference run (e.g. last healthy deployment) used for comparison.

Checklist

  • Terms mapped to Barcable UI columns.
  • Shared definitions adopted for team reporting.