# Adopt BDD with Godog for behavioral testing
- Status: Accepted
- Deciders: Gabriel Radureau, AI Agent
- Date: 2026-04-05
## Context and Problem Statement
We needed to add behavioral testing to dance-lessons-coach that provides:
- User-centric test scenarios
- Living documentation
- Integration testing capabilities
- Clear communication between technical and non-technical stakeholders
- Complementary testing to unit tests
## Decision Drivers
- Need for higher-level testing than unit tests
- Desire for living documentation that's always up-to-date
- Requirement for testing through public interfaces
- Need for clear behavioral specifications
- Desire for good test organization and readability
## Considered Options
- Godog (Cucumber for Go) - BDD framework for Go
- Ginkgo - BDD-style testing framework
- Standard Go testing - Extended for integration tests
- Custom BDD framework - Build our own
## Decision Outcome

Chosen option: "Godog", because it provides full BDD support with Gherkin syntax, solid Go integration, and living documentation, and it follows standard Cucumber patterns.
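Under the hood, godog's step dispatch is plain regular-expression matching: the Gherkin step text is matched against each registered pattern, and the capture groups become the step function's arguments. A stdlib-only sketch of that mechanism (`matchStep` is illustrative, not godog API; the pattern mirrors the personalized-greeting step used later in this ADR):

```go
package main

import (
	"fmt"
	"regexp"
)

// matchStep mimics godog's dispatch: match the step text against a
// registered pattern and return the capture groups that would be passed
// to the step function.
func matchStep(pattern, step string) (args []string, ok bool) {
	re := regexp.MustCompile(pattern)
	m := re.FindStringSubmatch(step)
	if m == nil {
		return nil, false
	}
	return m[1:], true // m[0] is the full match; the rest are captures
}

func main() {
	args, ok := matchStep(`^I request a greeting for "([^"]*)"$`, `I request a greeting for "John"`)
	fmt.Println(ok, args) // true [John]
}
```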
## Pros and Cons of the Options
### Godog

- Good, because proper BDD with Gherkin syntax
- Good, because living documentation
- Good, because solid Go integration
- Good, because follows Cucumber standards
- Good, because clear separation of concerns
- Bad, because slightly more complex setup
- Bad, because slower execution than unit tests
### Ginkgo

- Good, because expressive BDD-style testing
- Good, because fast execution
- Good, because solid Go integration
- Bad, because no Gherkin support, so not BDD proper
- Bad, because less clear for non-technical stakeholders
### Standard Go testing

- Good, because no external dependencies
- Good, because familiar to Go developers
- Bad, because no BDD capabilities
- Bad, because no living documentation
- Bad, because less organized for behavioral tests
### Custom BDD framework

- Good, because tailored to our needs
- Good, because no external dependencies
- Bad, because time-consuming to develop
- Bad, because it must be maintained in-house
- Bad, because likely less feature-rich
## Implementation Structure

```
features/
├── greet.feature         # Gherkin feature files
├── health.feature
└── readiness.feature

pkg/bdd/
├── steps/                # Step definitions
│   ├── greet_steps.go    # Implementation of steps
│   ├── health_steps.go
│   └── readiness_steps.go
│
├── testserver/           # Test infrastructure
│   ├── server.go         # In-process test server harness
│   └── client.go         # HTTP client for testing
│
└── suite.go              # Test suite initialization
```
## Testing Approach Evolution

### Initial Approach (Process-based)

We initially planned to test against an external server process started with `go run`, but this proved unreliable for automated testing due to:
- Process management complexity
- Port conflicts in parallel execution
- CI/CD environment challenges
- Process cleanup issues
### Current Approach (Hybrid In-Process)

We adopted a hybrid approach that keeps black box testing principles while improving reliability:
```go
// pkg/bdd/testserver/server.go
func (s *Server) Start() error {
	// Create a real server instance from pkg/server
	cfg := createTestConfig(s.port)
	realServer := server.NewServer(cfg, context.Background())

	// Start the HTTP server in the same process
	s.httpServer = &http.Server{
		Addr:    fmt.Sprintf(":%d", s.port),
		Handler: realServer.Router(),
	}

	go func() {
		if err := s.httpServer.ListenAndServe(); err != nil && !errors.Is(err, http.ErrServerClosed) {
			log.Error().Err(err).Msg("Test server failed")
		}
	}()

	return s.waitForServerReady()
}
```
### Black Box Testing Principles Maintained

Despite using an in-process server, the approach preserves the core black box testing principles:
- ✅ External Interface Testing: All tests interact through the HTTP API only
- ✅ No Implementation Knowledge: Tests don't access internal server components
- ✅ Real Server Code: Uses the actual server implementation from pkg/server
- ✅ Production Configuration: Tests run with a realistic server configuration
- ✅ Isolation: Each test suite gets a fresh server instance
## What We Test vs What We Don't

### ✅ Covered by BDD Tests
- HTTP API endpoints and responses
- Request/response handling
- Business logic through public interface
- Error handling and status codes
- Readiness/liveness behavior
- JSON serialization/deserialization
### 🚫 Not Covered by BDD Tests (Covered Elsewhere)

- Actual process startup/shutdown (covered by scripts/test-server.sh)
- Main function execution (covered by integration tests)
- External process management (covered by server control scripts)
- Operating system signals (covered by manual testing)
## Example Feature File

```gherkin
# features/greet.feature
Feature: Greet Service
  The greet service should return appropriate greetings

  Scenario: Default greeting
    Given the server is running
    When I request the default greeting
    Then the response should be "Hello world!"

  Scenario: Personalized greeting
    Given the server is running
    When I request a greeting for "John"
    Then the response should be "Hello John!"
```
## Example Step Implementation

```go
// pkg/bdd/steps/steps.go
func InitializeAllSteps(ctx *godog.ScenarioContext, client *testserver.Client) {
	sc := NewStepContext(client)

	ctx.Step(`^the server is running$`, sc.theServerIsRunning)
	ctx.Step(`^I request the default greeting$`, sc.iRequestTheDefaultGreeting)
	ctx.Step(`^I request a greeting for "([^"]*)"$`, sc.iRequestAGreetingFor)
	ctx.Step(`^I request the health endpoint$`, sc.iRequestTheHealthEndpoint)
	ctx.Step(`^the response should be "{\"([^"]*)\":\"([^"]*)\"}"$`, sc.theResponseShouldBe)
}

// StepContext holds the test client shared by all step definitions.
type StepContext struct {
	client *testserver.Client
}

func (sc *StepContext) theServerIsRunning() error {
	// Verify the server is actually up by checking the readiness endpoint
	return sc.client.Request("GET", "/api/ready", nil)
}

func (sc *StepContext) iRequestTheDefaultGreeting() error {
	return sc.client.Request("GET", "/api/v1/greet/", nil)
}

func (sc *StepContext) theResponseShouldBe(arg1, arg2 string) error {
	// Strip JSON escaping carried over from the feature file
	cleanArg1 := strings.Trim(arg1, `"\`)
	cleanArg2 := strings.Trim(arg2, `"\`)
	expected := fmt.Sprintf(`{"%s":"%s"}`, cleanArg1, cleanArg2)
	return sc.client.ExpectResponseBody(expected)
}
```
## Black Box Testing Approach
The BDD implementation follows black box testing principles:
- External perspective: Tests interact only through public HTTP API
- No implementation knowledge: Tests don't know about internal components
- Behavior focus: Tests verify what the system does, not how it does it
- Interface testing: Tests verify the contract between system and users
## Testing Strategy

### Test Types
- Direct HTTP tests: Test raw API behavior
- SDK client tests: Test generated client integration (future)
### Test Execution

```sh
# Run all BDD tests
cd features
godog

# Run with a specific output format
godog -f progress

# Run a single feature file (from within features/)
godog greet.feature
```
## Integration with CI/CD

```yaml
# Example GitHub Actions step
- name: Run BDD tests
  run: |
    cd features
    godog -f progress
```
## Performance Considerations

- BDD tests are slower than unit tests (expected)
- Each scenario runs against a fresh server instance for isolation
- Tests can be run in parallel where appropriate
- Focus on critical paths rather than exhaustive coverage