# BDD Testing Debugging Guide
Comprehensive guide to debugging BDD tests for dance-lessons-coach.
## Common Issues and Solutions
### 1. "Undefined Step" Warnings
**Symptoms:**
```
Feature: Greet Service
  Scenario: Default greeting            # features/greet.feature:3
    Given the server is running         # ??? UNDEFINED STEP
    When I request the default greeting # ??? UNDEFINED STEP
    Then the response should be "..."   # ??? UNDEFINED STEP
```
**Root Cause:** Step patterns don't match Godog's exact expectations. Godog is very particular about regex escaping.
**Common Pattern Issues:**
- `\"` vs `\\"` (single vs double escaping)
- Exact quote handling in JSON patterns
- Parameter capture group syntax
**Debugging Steps:**
1. **Run with progress format:**
```bash
godog --format=progress features/greet.feature
```
2. **Check suggested patterns:**
```
You can implement step definitions for the undefined steps with these snippets:
func theResponseShouldBe(arg1, arg2 string) error {
	return godog.ErrPending
}

func InitializeScenario(ctx *godog.ScenarioContext) {
	ctx.Step(`^the response should be "{\\"([^"]*)\\":\\"([^"]*)\\"}"$`, theResponseShouldBe)
}
```
3. **Compare with your implementation:**
```go
// ❌ Wrong pattern (single escaping)
ctx.Step(`^the response should be "{\"([^"]*)\":\"([^"]*)\"}"$`, sc.commonSteps.theResponseShouldBe)
// ✅ Correct pattern (double escaping - matches Godog's suggestion)
ctx.Step(`^the response should be "{\\"([^"]*)\\":\\"([^"]*)\\"}"$`, sc.commonSteps.theResponseShouldBe)
```
**Key Insight:** Godog expects `\\"` (two backslashes, then a quote) for escaped quotes in JSON patterns, not `\"` (one backslash, then a quote). In a Go raw string, `\\` is the regex escape for a single literal backslash in the step text.
**Solution:** Use Godog's EXACT regex patterns, paying special attention to:
- JSON escaping: `\\"` not `\"`
- Parameter names: Use `arg1, arg2` as suggested
- Capture groups: Match Godog's exact regex syntax
### 2. JSON Comparison Failures
**Symptoms:**
```
Expected response body "{\"message\":\"Hello world!\"}",
got "{\"message\":\"Hello world!\"}\n"
```
**Root Causes:**
- Trailing newlines in JSON responses
- Improper escaping in feature files
- Quote handling issues
**Debugging Steps:**
1. **Check actual response:**
```bash
curl -v http://localhost:9191/api/v1/greet/
```
2. **Inspect in step implementation:**
```go
func (sc *StepContext) theResponseShouldBe(expected string) error {
	fmt.Printf("Expected: %q\n", expected)
	fmt.Printf("Actual:   %q\n", string(sc.client.lastBody))
	// ...
}
```
3. **Verify feature file escaping:**
```gherkin
# ❌ Wrong escaping
Then the response should be "{"message":"Hello world!"}"
# ✅ Correct escaping
Then the response should be "{\\"message\\":\\"Hello world!\\"}"
```
**Solution:** Trim newlines and properly clean JSON:
```go
cleanExpected := strings.Trim(expected, `"\`)
actual := strings.TrimSuffix(string(body), "\n")
```
### 3. Server Connection Issues
**Symptoms:**
```
Request failed: dial tcp [::1]:9191: connect: connection refused
```
**Root Causes:**
- Server not started
- Port conflict
- Server crashed during test
**Debugging Steps:**
1. **Check server manually:**
```bash
curl -v http://localhost:9191/api/ready
```
2. **Check port usage:**
```bash
lsof -i :9191
netstat -an | grep 9191
```
3. **Add debug logging to server startup:**
```go
func (s *Server) Start() error {
	log.Info().Int("port", s.port).Msg("Starting test server")
	// ...
	log.Info().Str("url", s.baseURL).Msg("Test server started")
	return s.waitForServerReady()
}
```
4. **Verify test suite hooks:**
```go
func InitializeTestSuite(ctx *godog.TestSuiteContext) {
	ctx.BeforeSuite(func() {
		log.Info().Msg("BeforeSuite: Starting shared server")
		sharedServer = testserver.NewServer()
		if err := sharedServer.Start(); err != nil {
			log.Error().Err(err).Msg("Failed to start server")
			panic(err)
		}
		log.Info().Msg("BeforeSuite: Server started successfully")
	})
	// ...
}
```
**Solution:** Ensure server starts before tests and check for port conflicts.
### 4. Context Type Mismatches
**Symptoms:**
```
cannot use ctx (type *godog.ScenarioContext) as type context.Context in argument to InitializeScenario
```
**Root Cause:** Mixing `context.Context` with `*godog.ScenarioContext`.
**Debugging Steps:**
1. **Check function signatures:**
```go
// ❌ Wrong
func InitializeScenario(ctx context.Context) { // Wrong type!
	// ...
}

// ✅ Correct
func InitializeScenario(ctx *godog.ScenarioContext) {
	// ...
}
```
2. **Verify step registration:**
```go
// ✅ Correct
func InitializeAllSteps(ctx *godog.ScenarioContext, client *testserver.Client) {
	sc := NewStepContext(client)
	ctx.Step(`^the server is running$`, sc.theServerIsRunning)
	// ...
}
```
**Solution:** Always use `*godog.ScenarioContext` for step registration.
### 5. Step Not Executing
**Symptoms:** Step is defined but doesn't seem to execute.
**Root Causes:**
- Step pattern doesn't match
- Step not registered
- Context not passed correctly
**Debugging Steps:**
1. **Add logging to step:**
```go
func (sc *StepContext) theServerIsRunning() error {
	log.Info().Msg("theServerIsRunning step executing")
	return sc.client.Request("GET", "/api/ready", nil)
}
```
2. **Verify registration:**
```go
func InitializeScenario(ctx *godog.ScenarioContext) {
	client := testserver.NewClient(sharedServer)
	steps.InitializeAllSteps(ctx, client)
	// *godog.ScenarioContext does not expose its registered steps, so
	// log from inside the registration helper to confirm it actually ran.
	log.Info().Msg("InitializeScenario: steps registered")
}
```
3. **Check Godog output:**
```bash
godog --definitions
```
**Solution:** Ensure proper registration and pattern matching.
## Advanced Debugging Techniques
### 1. Verbose Logging
Add detailed logging to all components:
```go
// pkg/bdd/steps/steps.go
func (sc *StepContext) theServerIsRunning() error {
	log.Info().Msg("=== theServerIsRunning step started ===")
	err := sc.client.Request("GET", "/api/ready", nil)
	if err != nil {
		log.Error().Err(err).Msg("Server verification failed")
	} else {
		log.Info().Msg("Server verification succeeded")
	}
	log.Info().Msg("=== theServerIsRunning step completed ===")
	return err
}
```
### 2. HTTP Request Tracing
Add request/response logging:
```go
// pkg/bdd/testserver/client.go
func (c *Client) Request(method, path string, body []byte) error {
	url := c.server.GetBaseURL() + path
	log.Debug().Str("method", method).Str("url", url).Msg("Sending request")
	// bytes.NewReader(nil) yields an empty body, so this also covers GETs.
	req, err := http.NewRequest(method, url, bytes.NewReader(body))
	if err != nil {
		log.Error().Err(err).Msg("Request creation failed")
		return fmt.Errorf("failed to create request: %w", err)
	}
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Error().Err(err).Msg("Request failed")
		return fmt.Errorf("request failed: %w", err)
	}
	defer resp.Body.Close()
	c.lastResp = resp
	c.lastBody, err = io.ReadAll(resp.Body)
	if err != nil {
		log.Error().Err(err).Msg("Response read failed")
		return err
	}
	log.Debug().Int("status", resp.StatusCode).
		Str("body", string(c.lastBody)).
		Msg("Received response")
	return nil
}
```
### 3. Test Execution Tracing
Run tests with detailed output:
```bash
# Pretty Godog output
godog --format=pretty features/greet.feature
# Go test with verbose output
go test ./features/... -v
# Show available step definitions
godog --definitions
```
### 4. Interactive Debugging
Use `dlv` for interactive debugging:
```bash
# Install Delve
go install github.com/go-delve/delve/cmd/dlv@latest
# Start debugging
dlv test ./features/...
# Set a breakpoint
(dlv) break pkg/bdd/steps/steps.go:25
# Continue execution
(dlv) continue
# Print variables
(dlv) print sc.client.lastBody
```
### 5. Network Debugging
Capture HTTP traffic:
```bash
# Use mitmproxy as a reverse proxy in front of the test server
mitmproxy --mode reverse:http://localhost:9191 --listen-port 9192
```
Then point the test client at the proxy:
```go
proxyURL, err := url.Parse("http://localhost:9192")
if err != nil {
	return err
}
client := &http.Client{
	Transport: &http.Transport{
		Proxy: http.ProxyURL(proxyURL),
	},
}
```
## Common Error Patterns
### Pattern 1: JSON Escaping Issues
**Error:**
```
Expected: "{\"message\":\"Hello world!\"}"
Got:      "{\"message\":\"Hello world!\"}"
```
The two strings print identically; the difference hides in escaping or invisible characters, so log both with `%q` before comparing.
**Solution:** Properly escape in feature files and clean responses in code.
### Pattern 2: Trailing Newlines
**Error:**
```
Expected: "..."
Got: "...\n"
```
**Solution:** `strings.TrimSuffix(actual, "\n")`
### Pattern 3: Port Conflicts
**Error:**
```
listen tcp :9191: bind: address already in use
```
**Solution:**
```bash
# Find and kill process
kill -9 $(lsof -ti :9191)
```
### Pattern 4: Server Not Ready
**Error:**
```
server did not become ready after 30 attempts
```
**Solution:**
1. Check server logs
2. Increase timeout in `waitForServerReady`
3. Verify configuration
### Pattern 5: Step Registration Issues
**Error:**
```
panic: step definition for "the server is running" already exists
```
**Solution:** Ensure steps are registered only once per context.
## Debugging Checklist
### ✅ Pre-Test Checklist
- [ ] Server port (9191) is available
- [ ] No zombie test processes running
- [ ] Feature files use proper JSON escaping
- [ ] Step patterns match Godog's exact suggestions
- [ ] All steps are properly registered
- [ ] Context types are correct
### ✅ Runtime Checklist
- [ ] Server starts successfully (check logs)
- [ ] Readiness endpoint responds (`curl localhost:9191/api/ready`)
- [ ] Steps execute in correct order
- [ ] HTTP requests succeed
- [ ] Responses match expectations
- [ ] No undefined step warnings
### ✅ Post-Test Checklist
- [ ] Server shuts down gracefully
- [ ] All resources are cleaned up
- [ ] Port is released
- [ ] No goroutine leaks
- [ ] Test results are consistent
## Debugging Tools
### Essential Tools
| Tool | Purpose | Installation |
|------|---------|--------------|
| `curl` | HTTP requests | Built-in |
| `godog` | BDD test runner | `go install github.com/cucumber/godog/cmd/godog@latest` |
| `dlv` | Go debugger | `go install github.com/go-delve/delve/cmd/dlv@latest` |
| `mitmproxy` | HTTP proxy | `brew install mitmproxy` |
| `jq` | JSON processing | `brew install jq` |
### Useful Commands
```bash
# Check server health
curl -v http://localhost:9191/api/health
# Test specific endpoint
curl -v http://localhost:9191/api/v1/greet/John
# Check port usage
lsof -i :9191
# Kill process on port
kill -9 $(lsof -ti :9191)
# Run specific feature
godog features/greet.feature
# Show available step definitions
godog --definitions
# Debug with Delve
dlv test ./features/...
```
## Performance Debugging
### Slow Test Execution
**Symptoms:** Tests take longer than expected.
**Debugging Steps:**
1. **Profile test execution:**
```bash
go test ./features/... -cpuprofile=cpu.prof
go tool pprof cpu.prof
```
2. **Identify bottlenecks:**
```
(pprof) top
(pprof) web
```
3. **Common bottlenecks:**
- Server startup time
- HTTP request/response
- JSON parsing
- Step execution
**Optimizations:**
- Reuse HTTP connections
- Enable parallel execution
- Reduce logging in tests
- Cache configuration
### Memory Issues
**Symptoms:** High memory usage during tests.
**Debugging Steps:**
1. **Memory profiling:**
```bash
go test ./features/... -memprofile=mem.prof
go tool pprof mem.prof
```
2. **Check for leaks:**
```
(pprof) top
(pprof) inuse_objects
```
3. **Common memory issues:**
- Unclosed response bodies
- Goroutine leaks
- Cached data not released
- Large JSON responses
**Solutions:**
- Ensure all `resp.Body.Close()` calls
- Clean up resources in AfterScenario
- Limit response sizes in tests
- Use streaming for large data
## CI/CD Debugging
### Failed CI Builds
**Common Issues:**
- Port conflicts in parallel builds
- Missing dependencies
- Environment differences
- Timeout issues
**Debugging Steps:**
1. **Check CI logs:**
```yaml
- name: Run BDD tests
  run: |
    set -x
    go test ./features/... -v 2>&1 | tee test-output.txt
    exit ${PIPESTATUS[0]}
```
2. **Add debug information:**
```yaml
- name: Show environment
  run: |
    echo "Go version: $(go version)"
    echo "Working directory: $(pwd)"
    echo "Port 9191 status: $(lsof -i :9191 || echo 'available')"
    echo "Feature files: $(find features -name '*.feature')"
```
3. **Common CI fixes:**
```yaml
# Use unique ports for parallel jobs (GitHub expressions cannot do
# arithmetic, so derive the port in a shell step instead)
- name: Pick a unique port
  run: echo "BDD_PORT=$((9191 + GITHUB_RUN_ID % 100))" >> "$GITHUB_ENV"

# Increase timeouts
- name: Run tests with timeout
  timeout-minutes: 5
  run: go test ./features/... -timeout=5m
```
## Debugging Workflow
### Systematic Debugging Approach
1. **Reproduce the issue:**
```bash
go test ./features/... -v
```
2. **Isolate the problem:**
- Run specific feature
- Run specific scenario
- Disable other tests
3. **Gather information:**
- Logs
- HTTP responses
- Step execution order
- Timing information
4. **Formulate hypothesis:**
- What might be causing the issue?
- Where could the problem be?
5. **Test hypothesis:**
- Add logging
- Modify test
- Check assumptions
6. **Implement fix:**
- Update code
- Add validation
- Improve error handling
7. **Verify fix:**
- Run tests again
- Check related scenarios
- Test edge cases
8. **Document solution:**
- Update debugging guide
- Add to gotchas section
- Improve error messages
## Common Fixes
### Fix 1: JSON Escaping
**Before:**
```gherkin
Then the response should be "{"message":"Hello world!"}"
```
**After:**
```gherkin
Then the response should be "{\\"message\\":\\"Hello world!\\"}"
```
### Fix 2: Step Pattern
**Before:**
```go
ctx.Step(`^I request greeting "(.*)"$`, sc.iRequestAGreetingFor)
```
**After:**
```go
ctx.Step(`^I request a greeting for "([^"]*)"$`, sc.iRequestAGreetingFor)
```
### Fix 3: Response Cleaning
**Before:**
```go
if string(c.lastBody) != expected {
	return fmt.Errorf("mismatch")
}
```
**After:**
```go
actual := strings.TrimSuffix(string(c.lastBody), "\n")
if actual != expected {
	return fmt.Errorf("expected %q, got %q", expected, actual)
}
```
### Fix 4: Server Verification
**Before:**
```go
func (sc *StepContext) theServerIsRunning() error {
	// Assume server is running
	return nil
}
```
**After:**
```go
func (sc *StepContext) theServerIsRunning() error {
	// Actually verify the server is running
	return sc.client.Request("GET", "/api/ready", nil)
}
```
## Success Stories
### Case Study 1: Undefined Steps
**Problem:** Tests passed but showed undefined step warnings.
**Debugging:**
1. Ran `godog --format=progress`
2. Compared patterns with implementation
3. Found slight regex mismatch
**Solution:** Updated step patterns to match Godog's exact suggestions.
**Result:** ✅ No more undefined step warnings.
### Case Study 2: JSON Mismatch
**Problem:** Response validation failed despite correct JSON.
**Debugging:**
1. Added logging to see actual vs expected
2. Found trailing newline in response
3. Discovered improper escaping in feature file
**Solution:** Added newline trimming and proper JSON cleaning.
**Result:** ✅ All JSON comparisons now pass.
### Case Study 3: Server Connection
**Problem:** Intermittent connection refused errors.
**Debugging:**
1. Added server readiness logging
2. Found race condition in server startup
3. Discovered port conflict in CI
**Solution:** Improved readiness verification and added port conflict detection.
**Result:** ✅ Reliable server startup in all environments.
## Final Tips
1. **Start simple**: Test one scenario at a time
2. **Add logging**: You can never have too much debug info
3. **Verify assumptions**: Don't assume anything works
4. **Test manually**: Use curl to verify endpoints
5. **Read logs**: They often contain the answer
6. **Check patterns**: Godog is particular about regex
7. **Clean data**: Trim newlines, escape JSON properly
8. **Validate early**: Catch issues before they multiply
9. **Document fixes**: Help future you (and others)
10. **Ask for help**: Sometimes a fresh perspective helps
## Conclusion
BDD testing debugging follows a systematic approach:
1. **Identify** the specific issue
2. **Isolate** the problematic component
3. **Gather** relevant information
4. **Analyze** the root cause
5. **Implement** the fix
6. **Verify** the solution
7. **Document** the learning
With this guide and the patterns established in our implementation, you should be able to debug most BDD testing issues efficiently.