BDD Best Practices for DanceLessonsCoach

Based on our implementation experience with Godog and the existing pkg/bdd codebase.

Core Principles from Our Implementation

Black Box Testing Done Right

DO:

  • Test only through public HTTP API endpoints
  • Use real HTTP requests to verify actual behavior
  • Isolate each scenario with fresh client instances
  • Verify server is actually running (real HTTP calls)

DON'T:

  • Access database or internal services directly
  • Mock HTTP responses (defeats black box testing)
  • Share state between scenarios
  • Assume server is running without verification
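
The "verify the server is actually running" rule boils down to one real HTTP call against the readiness endpoint. Below is a minimal, self-contained sketch; the helper name and the stand-in `httptest` server are illustrative, while the `/api/ready` path matches the endpoint used later in this guide:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

// serverIsRunning verifies readiness with a real HTTP call instead of
// assuming the server is up. Hypothetical helper, not the pkg/bdd API.
func serverIsRunning(baseURL string) error {
	resp, err := http.Get(baseURL + "/api/ready")
	if err != nil {
		return fmt.Errorf("server not reachable: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return fmt.Errorf("server not ready: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// Stand-in server for demonstration; BDD tests hit the real one.
	ts := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.URL.Path == "/api/ready" {
			w.WriteHeader(http.StatusOK)
			return
		}
		w.WriteHeader(http.StatusNotFound)
	}))
	defer ts.Close()

	fmt.Println("running:", serverIsRunning(ts.URL) == nil) // running: true
}
```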

Hybrid In-Process Testing Pattern

Our successful approach avoids external process management:

// ✅ Our working pattern
func (s *Server) Start() error {
    // Start real server in same process
    go func() {
        // ListenAndServe always returns a non-nil error; ErrServerClosed
        // just means the server was shut down gracefully
        if err := s.httpServer.ListenAndServe(); err != nil && err != http.ErrServerClosed {
            log.Error().Err(err).Msg("Test server failed")
        }
    }()
    return s.waitForServerReady()
}

func (s *Server) waitForServerReady() error {
    // Poll the readiness endpoint for up to ~3 seconds
    for attempt := 0; attempt < 30; attempt++ {
        resp, err := http.Get(s.baseURL + "/api/ready")
        if err == nil {
            resp.Body.Close() // always close to avoid leaking connections
            if resp.StatusCode == http.StatusOK {
                return nil
            }
        }
        time.Sleep(100 * time.Millisecond)
    }
    return fmt.Errorf("server not ready after 30 attempts")
}

Step Definition Patterns

Godog's Exact Pattern Matching

Critical Insight: Godog reports steps as "undefined" if patterns don't match exactly.

Working Pattern:

// Use Godog's EXACT regex from --format=progress output
ctx.Step(`^I request a greeting for "([^"]*)"$`, sc.iRequestAGreetingFor)

Problematic Pattern:

// Custom pattern that doesn't match Godog's suggestion
ctx.Step(`^I request greeting "(.*)"$`, sc.iRequestAGreetingFor)
// Results in: "undefined step: I request a greeting for "John""

StepContext Pattern

Our proven approach for step organization:

// pkg/bdd/steps/steps.go
type StepContext struct {
    client *testserver.Client
}

func NewStepContext(client *testserver.Client) *StepContext {
    return &StepContext{client: client}
}

func InitializeAllSteps(ctx *godog.ScenarioContext, client *testserver.Client) {
    sc := NewStepContext(client)
    
    // Register all steps with exact patterns
    ctx.Step(`^the server is running$`, sc.theServerIsRunning)
    ctx.Step(`^I request the default greeting$`, sc.iRequestTheDefaultGreeting)
    ctx.Step(`^I request a greeting for "([^"]*)"$`, sc.iRequestAGreetingFor)
    ctx.Step(`^I request the health endpoint$`, sc.iRequestTheHealthEndpoint)
    ctx.Step(`^the response should be "([^"]*)"$`, sc.theResponseShouldBe)
}
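
To show how a registered step delegates to the HTTP client, here is a self-contained sketch with minimal stand-ins for `Client` and `StepContext` (the real types live in `pkg/bdd`; the greet path matches the curl examples later in this guide, and the stand-in server's response format is illustrative):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// Minimal stand-ins for the real pkg/bdd types, just enough to show
// how a registered step delegates to the HTTP client.
type Client struct {
	baseURL  string
	lastBody []byte
}

func (c *Client) Request(method, path string) error {
	req, err := http.NewRequest(method, c.baseURL+path, nil)
	if err != nil {
		return err
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	c.lastBody, err = io.ReadAll(resp.Body)
	return err
}

type StepContext struct{ client *Client }

// The step body stays one line: build the path from the captured group
// and delegate to the client.
func (sc *StepContext) iRequestAGreetingFor(name string) error {
	return sc.client.Request("GET", "/api/v1/greet/"+name)
}

func main() {
	// Stand-in server; BDD scenarios would hit the real test server
	ts := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintf(w, "Hello %s!", r.URL.Path[len("/api/v1/greet/"):])
	}))
	defer ts.Close()

	sc := &StepContext{client: &Client{baseURL: ts.URL}}
	if err := sc.iRequestAGreetingFor("John"); err != nil {
		panic(err)
	}
	fmt.Println(string(sc.client.lastBody)) // Hello John!
}
```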

JSON Handling Gotchas

Feature File Escaping

Problem: Gherkin files require special JSON escaping

Correct:

Then the response should be "{\\"message\\":\\"Hello world!\\"}"

Incorrect:

Then the response should be "{"message":"Hello world!"}"
// Results in: expected "{\"message\":\"Hello world!\"}", got "{"message":"Hello world!"}"

Step Implementation Cleanup

// ✅ Our working solution
func (sc *StepContext) theResponseShouldBe(expected string) error {
    // Strip surrounding quotes/backslashes captured from the feature file
    cleanExpected := strings.Trim(expected, `"\`)
    
    // Delegate the comparison (and trailing-newline trimming) to the
    // client helper; lastBody is unexported in the testserver package
    return sc.client.ExpectResponseBody(cleanExpected)
}

Test Server Implementation

Fixed Port Strategy

Why Port 9191:

  • Avoids conflicts with main server (8080)
  • Consistent across all tests
  • Easy to remember and debug

Server Lifecycle:

// Shared server for normal scenarios
var sharedServer *testserver.Server

func InitializeTestSuite(ctx *godog.TestSuiteContext) {
    ctx.BeforeSuite(func() {
        sharedServer = testserver.NewServer()
        if err := sharedServer.Start(); err != nil {
            panic(err)
        }
    })
    
    ctx.AfterSuite(func() {
        if sharedServer != nil {
            sharedServer.Stop()
        }
    })
}

Real Server Integration

Key Insight: Use actual server code for realistic testing

// pkg/bdd/testserver/server.go
func NewServer() *Server {
    return &Server{port: 9191}
}

func (s *Server) Start() error {
    s.baseURL = fmt.Sprintf("http://localhost:%d", s.port)
    
    // Create REAL server instance from pkg/server
    cfg := createTestConfig(s.port)
    realServer := server.NewServer(cfg, context.Background())
    
    // Use real router and handlers
    s.httpServer = &http.Server{
        Addr:    fmt.Sprintf(":%d", s.port),
        Handler: realServer.Router(),
    }
    
    // Start in same process
    go s.httpServer.ListenAndServe()
    return s.waitForServerReady()
}

Client Implementation

HTTP Client Pattern

// pkg/bdd/testserver/client.go
type Client struct {
    server   *Server
    lastResp *http.Response
    lastBody []byte
}

func (c *Client) Request(method, path string, body []byte) error {
    url := c.server.GetBaseURL() + path
    
    // Forward the request body instead of dropping it
    var reader io.Reader
    if body != nil {
        reader = bytes.NewReader(body)
    }
    req, err := http.NewRequest(method, url, reader)
    if err != nil {
        return fmt.Errorf("failed to create request: %w", err)
    }
    if body != nil {
        req.Header.Set("Content-Type", "application/json")
    }
    
    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return fmt.Errorf("request failed: %w", err)
    }
    defer resp.Body.Close()
    
    c.lastResp = resp
    c.lastBody, err = io.ReadAll(resp.Body)
    return err
}

Response Validation

// ✅ Robust validation with helpful error messages
func (c *Client) ExpectResponseBody(expected string) error {
    if c.lastResp == nil {
        return fmt.Errorf("no response received")
    }
    
    actual := string(c.lastBody)
    actual = strings.TrimSuffix(actual, "\n")  // Trim trailing newline
    
    if actual != expected {
        return fmt.Errorf("expected response body %q, got %q", expected, actual)
    }
    
    return nil
}

Common Pitfalls and Solutions

1. "Undefined Step" Warnings

Symptom: Tests pass but show warnings about undefined steps

Root Cause: Step regex doesn't match Godog's exact pattern

Solution:

# Run with progress format to see exact patterns
godog --format=progress

# Use the EXACT pattern shown in output

2. JSON Comparison Failures

Symptom: Response validation fails despite correct JSON

Root Causes:

  • Trailing newlines in response
  • Improper escaping in feature files
  • Quote handling issues

Solution:

// Clean both expected and actual values
cleanExpected := strings.Trim(expected, `"\`)
actual := strings.TrimSuffix(string(body), "\n")
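
An alternative worth considering: compare parsed JSON instead of raw strings, which sidesteps escaping, key order, and trailing newlines entirely. This is a sketch of the idea, not the pattern used in `pkg/bdd`:

```go
package main

import (
	"encoding/json"
	"fmt"
	"reflect"
)

// jsonEqual compares two JSON documents structurally. Unmarshal
// tolerates trailing whitespace, so the newline issue disappears.
func jsonEqual(expected, actual string) (bool, error) {
	var e, a interface{}
	if err := json.Unmarshal([]byte(expected), &e); err != nil {
		return false, fmt.Errorf("expected is not valid JSON: %w", err)
	}
	if err := json.Unmarshal([]byte(actual), &a); err != nil {
		return false, fmt.Errorf("actual is not valid JSON: %w", err)
	}
	return reflect.DeepEqual(e, a), nil
}

func main() {
	// Trailing newline in the actual body no longer matters
	ok, err := jsonEqual(`{"message":"Hello world!"}`, "{\"message\":\"Hello world!\"}\n")
	fmt.Println(ok, err) // true <nil>
}
```

The trade-off is that structural comparison no longer catches formatting differences, so it only fits steps that assert on JSON content rather than exact bytes.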

3. Server Connection Issues

Symptom: "connection refused" or server not responding

Root Causes:

  • Server not started
  • Port conflict
  • Server crashed

Solution:

# Check server health
curl http://localhost:9191/api/ready

# Check server logs
go test ./features/... -v

4. Context Type Confusion

Symptom: Compilation errors about context types

Root Cause: Mixing context.Context with *godog.ScenarioContext

Solution:

// ✅ Correct: Store ScenarioContext and use for registration
func InitializeScenario(ctx *godog.ScenarioContext) {
    client := testserver.NewClient(sharedServer)
    steps.InitializeAllSteps(ctx, client)  // Pass ScenarioContext
}

// ❌ Wrong: Trying to use context.Context for steps
func InitializeScenario(ctx context.Context) {  // Wrong type!
    // This won't work
}

Debugging Techniques

Step Pattern Debugging

# Show which steps are defined
godog --format=progress --show-step-definitions

# Run specific feature
godog features/greet.feature

# Verbose output
godog --format=pretty --verbose

Server Debugging

# Check server is running
curl -v http://localhost:9191/api/ready

# Check health endpoint
curl -v http://localhost:9191/api/health

# Test greet endpoint
curl -v http://localhost:9191/api/v1/greet/John

Test Output Analysis

# Run with verbose output
go test ./features/... -v

# Look for:
# - "undefined step" warnings
# - Connection errors
# - JSON mismatch errors
# - Context type errors

Performance Optimization

Shared Server Pattern

For normal scenarios: Use shared server to avoid startup overhead

// Suite-level shared server
var sharedServer *testserver.Server

func InitializeTestSuite(ctx *godog.TestSuiteContext) {
    ctx.BeforeSuite(func() {
        sharedServer = testserver.NewServer()
        if err := sharedServer.Start(); err != nil {
            panic(err)
        }
    })
    
    ctx.AfterSuite(func() {
        sharedServer.Stop()
    })
}

Dedicated Server Pattern

For shutdown/readiness tests: Use dedicated server when needed

// Scenario-level dedicated server (godog v0.12+ hook signatures)
func InitializeShutdownScenario(ctx *godog.ScenarioContext) {
    server := testserver.NewServer()
    ctx.Before(func(ctx context.Context, sc *godog.Scenario) (context.Context, error) {
        return ctx, server.Start()
    })
    
    ctx.After(func(ctx context.Context, sc *godog.Scenario, err error) (context.Context, error) {
        server.Stop()
        return ctx, nil
    })
}

Test Organization

Feature File Structure

features/
├── greet.feature          # Greet service tests
├── health.feature         # Health endpoint tests
├── readiness.feature      # Readiness/shutdown tests
└── bdd_test.go            # Test suite entry point

Step Definition Organization

pkg/bdd/
├── steps/
│   ├── steps.go           # Main step definitions
│   └── shutdown_steps.go  # Shutdown-specific steps
├── testserver/
│   ├── server.go          # Test server implementation
│   └── client.go          # HTTP client
└── suite.go               # Test suite initialization

Validation Script

Complete Test Validation

#!/bin/bash
# scripts/run-bdd-tests.sh

set -e

echo "🧪 Running BDD tests..."
# Capture the output without aborting under `set -e`, so the
# undefined/pending/skipped checks below still run on failure
TEST_STATUS=0
TEST_OUTPUT=$(go test ./features/... -v 2>&1) || TEST_STATUS=$?
echo "$TEST_OUTPUT"

# Check for any undefined, pending, or skipped steps
echo "🔍 Validating test results..."
if echo "$TEST_OUTPUT" | grep -qE "undefined|pending|skipped"; then
    echo "❌ ERROR: Found undefined, pending, or skipped steps"
    echo "$TEST_OUTPUT" | grep -E "undefined|pending|skipped"
    exit 1
fi

if [ "$TEST_STATUS" -ne 0 ] || echo "$TEST_OUTPUT" | grep -q "FAIL"; then
    echo "❌ ERROR: Some tests failed"
    exit 1
fi

echo "✅ All BDD tests passed with no undefined steps"
echo "✅ No pending or skipped steps found"
echo "✅ All scenarios executed successfully"

Continuous Integration

CI/CD Integration

# .github/workflows/bdd-tests.yml
name: BDD Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    
    - name: Set up Go
      uses: actions/setup-go@v4
      with:
        go-version: '1.26'
    
    - name: Install dependencies
      run: go mod download
    
    - name: Run BDD tests
      run: ./scripts/run-bdd-tests.sh
    
    - name: Validate no undefined steps
      run: |
        if go test ./features/... 2>&1 | grep -q "undefined"; then
          echo "ERROR: Found undefined steps"
          exit 1
        fi

Lessons Learned

1. Godog is Particular About Patterns

  • Always use exact regex patterns from godog --format=progress
  • Small deviations cause warnings even if tests pass
  • Function names should match step descriptions

2. Black Box Testing Requires Real Verification

  • Actually verify server is running with HTTP calls
  • Don't mock responses - defeats the purpose
  • Use real server code for realistic testing

3. JSON Handling is Tricky

  • Escape properly in feature files
  • Trim newlines from responses
  • Clean captured groups in step implementations

4. Context Types Matter

  • Steps receive *godog.ScenarioContext
  • Not context.Context
  • Store context properly for step access

5. In-Process Testing is More Reliable

  • Avoid external processes
  • Use real server code in same process
  • Fixed ports work better than dynamic allocation

Success Metrics

Our BDD implementation achieved:

  • 100% API coverage - All endpoints tested
  • Zero undefined steps - All steps properly recognized
  • No process management issues - Hybrid in-process approach
  • Fast execution - Shared server pattern
  • Reliable validation - Comprehensive test script
  • Production ready - Used in CI/CD pipeline

Recommendations

  1. Start with existing patterns - Use our proven approach
  2. Follow Godog's exact patterns - Avoid undefined step warnings
  3. Use hybrid in-process testing - More reliable than external processes
  4. Validate thoroughly - Run validation script before committing
  5. Document gotchas - Add to this guide as you learn
  6. Keep tests fast - Use shared server for normal scenarios
  7. Test in CI/CD - Ensure BDD tests run in pipeline