BDD Testing Debugging Guide
Comprehensive guide to debugging BDD tests for dance-lessons-coach.
Common Issues and Solutions
1. "Undefined Step" Warnings
Symptoms:
Feature: Greet Service
Scenario: Default greeting # features/greet.feature:3
Given the server is running # ??? UNDEFINED STEP
When I request the default greeting # ??? UNDEFINED STEP
Then the response should be "..." # ??? UNDEFINED STEP
Root Cause: Step patterns don't match Godog's exact expectations. Godog is very particular about regex escaping.
Common Pattern Issues:
- `\"` vs `\\"` (single vs double escaping)
- Exact quote handling in JSON patterns
- Parameter capture group syntax
Debugging Steps:

1. Run with the progress format:

```bash
godog --format=progress features/greet.feature
```

2. Check Godog's suggested patterns:

```
You can implement step definitions for the undefined steps with these snippets:

func theResponseShouldBe(arg1, arg2 string) error {
    return godog.ErrPending
}

func InitializeScenario(ctx *godog.ScenarioContext) {
    ctx.Step(`^the response should be "{\\"([^"]*)\\":\\"([^"]*)\\"}"$`, theResponseShouldBe)
}
```

3. Compare with your implementation:

```go
// ❌ Wrong pattern (single escaping)
ctx.Step(`^the response should be "{\"([^"]*)\":\"([^"]*)\"}"$`, sc.commonSteps.theResponseShouldBe)

// ✅ Correct pattern (double escaping - matches Godog's suggestion)
ctx.Step(`^the response should be "{\\"([^"]*)\\":\\"([^"]*)\\"}"$`, sc.commonSteps.theResponseShouldBe)
```
Key Insight: Godog expects `\\"` (two backslashes + quote) for escaped quotes in JSON patterns, not `\"` (a single backslash + quote).
Solution: Use Godog's EXACT regex patterns, paying special attention to:
- JSON escaping: `\\"`, not `\"`
- Parameter names: use `arg1`, `arg2` as suggested
- Capture groups: match Godog's exact regex syntax
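When a pattern refuses to match, it can help to test it against the literal step text with Go's `regexp` package directly, since Godog step expressions are ordinary Go regular expressions. This is an illustrative helper, not part of the project:

```go
package main

import "regexp"

// stepMatches reports whether a step pattern (a Go regular expression,
// as Godog uses) matches a literal step line from a feature file.
func stepMatches(pattern, step string) bool {
	return regexp.MustCompile(pattern).MatchString(step)
}
```

Feeding the suspect pattern and the exact step line from the feature file into this function quickly tells you whether the regex itself is the problem or the registration is.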
2. JSON Comparison Failures
Symptoms:
Expected response body "{\"message\":\"Hello world!\"}",
got "{\"message\":\"Hello world!\"}\n"
Root Causes:
- Trailing newlines in JSON responses
- Improper escaping in feature files
- Quote handling issues
Debugging Steps:

1. Check the actual response:

```bash
curl -v http://localhost:9191/api/v1/greet/
```

2. Inspect in the step implementation:

```go
func (sc *StepContext) theResponseShouldBe(expected string) error {
	fmt.Printf("Expected: %q\n", expected)
	fmt.Printf("Actual:   %q\n", string(sc.client.lastBody))
	// ...
}
```

3. Verify feature file escaping:

```gherkin
# ❌ Wrong escaping
Then the response should be "{"message":"Hello world!"}"

# ✅ Correct escaping
Then the response should be "{\\"message\\":\\"Hello world!\\"}"
```
Solution: Trim newlines and properly clean the JSON:

```go
cleanExpected := strings.Trim(expected, `"\`)
actual := strings.TrimSuffix(string(body), "\n")
```
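Beyond trimming, comparing JSON structurally avoids false failures caused by key order or whitespace differences. A minimal sketch (the helper names are illustrative, not from the project):

```go
package main

import (
	"encoding/json"
	"reflect"
	"strings"
)

// normalizeBody strips the trailing newline that many HTTP handlers
// append to JSON responses.
func normalizeBody(body string) string {
	return strings.TrimSuffix(body, "\n")
}

// jsonEqual compares two JSON documents by structure rather than by
// raw bytes, so key order and whitespace no longer matter.
func jsonEqual(a, b string) bool {
	var va, vb interface{}
	if json.Unmarshal([]byte(a), &va) != nil {
		return false
	}
	if json.Unmarshal([]byte(b), &vb) != nil {
		return false
	}
	return reflect.DeepEqual(va, vb)
}
```

With this, a response of `{"message":"Hello world!"}\n` compares equal to the expected document regardless of the trailing newline.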
3. Server Connection Issues
Symptoms:
Request failed: dial tcp [::1]:9191: connect: connection refused
Root Causes:
- Server not started
- Port conflict
- Server crashed during test
Debugging Steps:

1. Check the server manually:

```bash
curl -v http://localhost:9191/api/ready
```

2. Check port usage:

```bash
lsof -i :9191
netstat -an | grep 9191
```

3. Add debug logging to server startup:

```go
func (s *Server) Start() error {
	log.Info().Int("port", s.port).Msg("Starting test server")
	// ...
	log.Info().Str("url", s.baseURL).Msg("Test server started")
	return s.waitForServerReady()
}
```

4. Verify the test suite hooks:

```go
func InitializeTestSuite(ctx *godog.TestSuiteContext) {
	ctx.BeforeSuite(func() {
		log.Info().Msg("BeforeSuite: Starting shared server")
		sharedServer = testserver.NewServer()
		if err := sharedServer.Start(); err != nil {
			log.Error().Err(err).Msg("Failed to start server")
			panic(err)
		}
		log.Info().Msg("BeforeSuite: Server started successfully")
	})
	// ...
}
```
Solution: Ensure server starts before tests and check for port conflicts.
4. Context Type Mismatches
Symptoms:
cannot use ctx (type *godog.ScenarioContext) as type context.Context in argument to InitializeScenario
Root Cause: Mixing context.Context with *godog.ScenarioContext.
Debugging Steps:

1. Check function signatures:

```go
// ❌ Wrong
func InitializeScenario(ctx context.Context) { // Wrong type!
	// ...
}

// ✅ Correct
func InitializeScenario(ctx *godog.ScenarioContext) {
	// ...
}
```

2. Verify step registration:

```go
// ✅ Correct
func InitializeAllSteps(ctx *godog.ScenarioContext, client *testserver.Client) {
	sc := NewStepContext(client)
	ctx.Step(`^the server is running$`, sc.theServerIsRunning)
	// ...
}
```
Solution: Always use *godog.ScenarioContext for step registration.
5. Step Not Executing
Symptoms: Step is defined but doesn't seem to execute.
Root Causes:
- Step pattern doesn't match
- Step not registered
- Context not passed correctly
Debugging Steps:

1. Add logging to the step:

```go
func (sc *StepContext) theServerIsRunning() error {
	log.Info().Msg("theServerIsRunning step executing")
	return sc.client.Request("GET", "/api/ready", nil)
}
```

2. Verify registration:

```go
func InitializeScenario(ctx *godog.ScenarioContext) {
	client := testserver.NewClient(sharedServer)
	steps.InitializeAllSteps(ctx, client)
	// godog's ScenarioContext does not expose the registered steps;
	// list them from the CLI instead (see the next step).
	log.Info().Msg("Steps registered for scenario")
}
```

3. Check Godog's output:

```bash
godog --format=progress --show-step-definitions
```
Solution: Ensure proper registration and pattern matching.
Advanced Debugging Techniques
1. Verbose Logging
Add detailed logging to all components:
```go
// pkg/bdd/steps/steps.go
func (sc *StepContext) theServerIsRunning() error {
	log.Info().Msg("=== theServerIsRunning step started ===")
	err := sc.client.Request("GET", "/api/ready", nil)
	if err != nil {
		log.Error().Err(err).Msg("Server verification failed")
	} else {
		log.Info().Msg("Server verification succeeded")
	}
	log.Info().Msg("=== theServerIsRunning step completed ===")
	return err
}
```
2. HTTP Request Tracing
Add request/response logging:
```go
// pkg/bdd/testserver/client.go
func (c *Client) Request(method, path string, body []byte) error {
	url := c.server.GetBaseURL() + path
	log.Debug().Str("method", method).Str("url", url).Msg("Sending request")
	// Pass the body through (requires the "bytes" import); passing nil
	// here would silently drop the caller's request body.
	req, err := http.NewRequest(method, url, bytes.NewReader(body))
	if err != nil {
		log.Error().Err(err).Msg("Request creation failed")
		return fmt.Errorf("failed to create request: %w", err)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Error().Err(err).Msg("Request failed")
		return fmt.Errorf("request failed: %w", err)
	}
	defer resp.Body.Close()
	c.lastResp = resp
	c.lastBody, err = io.ReadAll(resp.Body)
	if err != nil {
		log.Error().Err(err).Msg("Response read failed")
	} else {
		log.Debug().Int("status", resp.StatusCode).
			Str("body", string(c.lastBody)).
			Msg("Received response")
	}
	return err
}
```
3. Test Execution Tracing
Run tests with detailed output:
```bash
# Verbose Godog output
godog --format=pretty --verbose features/greet.feature

# Go test with verbose output
go test ./features/... -v

# Show step definitions
godog --format=progress --show-step-definitions
```
4. Interactive Debugging
Use dlv for interactive debugging:

```bash
# Install Delve
go install github.com/go-delve/delve/cmd/dlv@latest

# Start debugging
dlv test ./features/...
```

Inside the Delve session:

```
# Set a breakpoint
(dlv) break pkg/bdd/steps/steps.go:25

# Continue execution
(dlv) continue

# Print variables
(dlv) print sc.client.lastBody
```
5. Network Debugging
Capture HTTP traffic:

```bash
# Use mitmproxy as a reverse proxy in front of the server
mitmproxy --mode reverse:http://localhost:9191 --listen-port 9192
```

```go
// Configure the client to route through the proxy
// (url.Parse returns two values; it cannot be passed
// directly to http.ProxyURL)
proxyURL, err := url.Parse("http://localhost:9192")
if err != nil {
	return err
}
client := &http.Client{
	Transport: &http.Transport{
		Proxy: http.ProxyURL(proxyURL),
	},
}
```
Common Error Patterns
Pattern 1: JSON Escaping Issues
Error (the two strings print identically, yet the comparison fails because of hidden escaping differences):

```
Expected: "{\"message\":\"Hello world!\"}"
Got:      "{\"message\":\"Hello world!\"}"
```
Solution: Properly escape in feature files and clean in code.
Pattern 2: Trailing Newlines
Error:

```
Expected: "..."
Got:      "...\n"
```

Solution: `strings.TrimSuffix(actual, "\n")`
Pattern 3: Port Conflicts
Error:

```
listen tcp :9191: bind: address already in use
```

Solution:

```bash
# Find and kill the process holding the port
kill -9 $(lsof -ti :9191)
```
Pattern 4: Server Not Ready
Error:
server did not become ready after 30 attempts
Solution:
- Check server logs
- Increase the timeout in `waitForServerReady`
- Verify configuration
Pattern 5: Step Registration Issues
Error:
panic: step definition for "the server is running" already exists
Solution: Ensure steps are registered only once per context.
Debugging Checklist
✅ Pre-Test Checklist
- Server port (9191) is available
- No zombie test processes running
- Feature files use proper JSON escaping
- Step patterns match Godog's exact suggestions
- All steps are properly registered
- Context types are correct
✅ Runtime Checklist
- Server starts successfully (check logs)
- Readiness endpoint responds (curl localhost:9191/api/ready)
- Steps execute in correct order
- HTTP requests succeed
- Responses match expectations
- No undefined step warnings
✅ Post-Test Checklist
- Server shuts down gracefully
- All resources are cleaned up
- Port is released
- No goroutine leaks
- Test results are consistent
Debugging Tools
Essential Tools
| Tool | Purpose | Installation |
|---|---|---|
| `curl` | HTTP requests | Built-in |
| `godog` | BDD test runner | `go install github.com/cucumber/godog/cmd/godog@latest` |
| `dlv` | Go debugger | `go install github.com/go-delve/delve/cmd/dlv@latest` |
| `mitmproxy` | HTTP proxy | `brew install mitmproxy` |
| `jq` | JSON processing | `brew install jq` |
Useful Commands
```bash
# Check server health
curl -v http://localhost:9191/api/health

# Test a specific endpoint
curl -v http://localhost:9191/api/v1/greet/John

# Check port usage
lsof -i :9191

# Kill the process on the port
kill -9 $(lsof -ti :9191)

# Run a specific feature
godog features/greet.feature -v

# Show step definitions
godog --format=progress --show-step-definitions

# Debug with Delve
dlv test ./features/...
```
Performance Debugging
Slow Test Execution
Symptoms: Tests take longer than expected.
Debugging Steps:

1. Profile test execution:

```bash
go test ./features/... -cpuprofile=cpu.prof
go tool pprof cpu.prof
```

2. Identify bottlenecks:

```
(pprof) top
(pprof) web
```

3. Common bottlenecks:
- Server startup time
- HTTP request/response
- JSON parsing
- Step execution
Optimizations:
- Reuse HTTP connections
- Enable parallel execution
- Reduce logging in tests
- Cache configuration
Memory Issues
Symptoms: High memory usage during tests.
Debugging Steps:
1. Memory profiling:

```bash
go test ./features/... -memprofile=mem.prof
go tool pprof mem.prof
```

2. Check for leaks:

```
(pprof) top
(pprof) inuse_objects
```

3. Common memory issues:
- Unclosed response bodies
- Goroutine leaks
- Cached data not released
- Large JSON responses
Solutions:
- Ensure every `resp.Body.Close()` is called
- Clean up resources in AfterScenario
- Limit response sizes in tests
- Use streaming for large data
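Per-scenario cleanup can be as simple as a `Reset` method on the step context, invoked from godog's `After` hook. The `StepContext` fields here are illustrative, not the project's actual layout:

```go
package main

// StepContext holds per-scenario state captured by the HTTP steps
// (field names are illustrative).
type StepContext struct {
	lastBody   []byte
	lastStatus int
}

// Reset clears captured response data so one scenario cannot leak
// state, or hold memory, into the next. Call it from an After hook.
func (sc *StepContext) Reset() {
	sc.lastBody = nil
	sc.lastStatus = 0
}
```

Dropping the reference to the last response body also lets the garbage collector reclaim large payloads between scenarios.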
CI/CD Debugging
Failed CI Builds
Common Issues:
- Port conflicts in parallel builds
- Missing dependencies
- Environment differences
- Timeout issues
Debugging Steps:

1. Check CI logs:

```yaml
- name: Run BDD tests
  run: |
    set -x
    go test ./features/... -v 2>&1 | tee test-output.txt
    exit ${PIPESTATUS[0]}
```

2. Add debug information:

```yaml
- name: Show environment
  run: |
    echo "Go version: $(go version)"
    echo "Working directory: $(pwd)"
    echo "Port 9191 status: $(lsof -i :9191 || echo 'available')"
    echo "Feature files: $(find features -name '*.feature')"
```

3. Common CI fixes:

```yaml
# Use a unique port per run. GitHub Actions expressions do not support
# arithmetic, so compute the port in the shell instead.
- name: Pick BDD port
  run: echo "BDD_PORT=$((9191 + GITHUB_RUN_ID % 100))" >> "$GITHUB_ENV"

# Increase timeouts
- name: Run tests with timeout
  timeout-minutes: 5
  run: go test ./features/... -timeout=5m
```
Debugging Workflow
Systematic Debugging Approach
1. Reproduce the issue:

```bash
go test ./features/... -v
```

2. Isolate the problem:
   - Run a specific feature
   - Run a specific scenario
   - Disable other tests

3. Gather information:
   - Logs
   - HTTP responses
   - Step execution order
   - Timing information

4. Formulate a hypothesis:
   - What might be causing the issue?
   - Where could the problem be?

5. Test the hypothesis:
   - Add logging
   - Modify the test
   - Check assumptions

6. Implement the fix:
   - Update code
   - Add validation
   - Improve error handling

7. Verify the fix:
   - Run tests again
   - Check related scenarios
   - Test edge cases

8. Document the solution:
   - Update the debugging guide
   - Add to the gotchas section
   - Improve error messages
Common Fixes
Fix 1: JSON Escaping
Before:

```gherkin
Then the response should be "{"message":"Hello world!"}"
```

After:

```gherkin
Then the response should be "{\\"message\\":\\"Hello world!\\"}"
```
Fix 2: Step Pattern
Before:

```go
ctx.Step(`^I request greeting "(.*)"$`, sc.iRequestAGreetingFor)
```

After:

```go
ctx.Step(`^I request a greeting for "([^"]*)"$`, sc.iRequestAGreetingFor)
```
Fix 3: Response Cleaning
Before:

```go
if string(c.lastBody) != expected {
	return fmt.Errorf("mismatch")
}
```

After:

```go
actual := strings.TrimSuffix(string(c.lastBody), "\n")
if actual != expected {
	return fmt.Errorf("expected %q, got %q", expected, actual)
}
```
Fix 4: Server Verification
Before:

```go
func (sc *StepContext) theServerIsRunning() error {
	// Assume the server is running
	return nil
}
```

After:

```go
func (sc *StepContext) theServerIsRunning() error {
	// Actually verify the server is running
	return sc.client.Request("GET", "/api/ready", nil)
}
```
Success Stories
Case Study 1: Undefined Steps
Problem: Tests passed but showed undefined step warnings.
Debugging:
- Ran `godog --format=progress`
- Compared patterns with the implementation
- Found a slight regex mismatch
Solution: Updated step patterns to match Godog's exact suggestions.
Result: ✅ No more undefined step warnings.
Case Study 2: JSON Mismatch
Problem: Response validation failed despite correct JSON.
Debugging:
- Added logging to see actual vs expected
- Found trailing newline in response
- Discovered improper escaping in feature file
Solution: Added newline trimming and proper JSON cleaning.
Result: ✅ All JSON comparisons now pass.
Case Study 3: Server Connection
Problem: Intermittent connection refused errors.
Debugging:
- Added server readiness logging
- Found race condition in server startup
- Discovered port conflict in CI
Solution: Improved readiness verification and added port conflict detection.
Result: ✅ Reliable server startup in all environments.
Final Tips
- Start simple: Test one scenario at a time
- Add logging: You can never have too much debug info
- Verify assumptions: Don't assume anything works
- Test manually: Use curl to verify endpoints
- Read logs: They often contain the answer
- Check patterns: Godog is particular about regex
- Clean data: Trim newlines, escape JSON properly
- Validate early: Catch issues before they multiply
- Document fixes: Help future you (and others)
- Ask for help: Sometimes a fresh perspective helps
Conclusion
BDD testing debugging follows a systematic approach:
- Identify the specific issue
- Isolate the problematic component
- Gather relevant information
- Analyze the root cause
- Implement the fix
- Verify the solution
- Document the learning
With this guide and the patterns established in our implementation, you should be able to debug any BDD testing issue efficiently.