ADR 0024: BDD Test Organization and Isolation Strategy
Status
Proposed 🟡
Context
As the dance-lessons-coach project grows, our BDD test suite has encountered several challenges. While we initially followed basic Godog patterns, we need to evolve our organization to handle complex scenarios like config hot reloading while maintaining test reliability.
Current Issues
- Test Interdependence: Tests affect each other through shared state (config files, database)
- Timing Issues: Config reloading and server restarts cause race conditions
- Cognitive Load: Large test files with many scenarios are hard to maintain
- Flaky Tests: Tests pass individually but fail when run together
- Edge Case Handling: Special setup/teardown requirements for certain tests
Godog Best Practices Alignment
According to Godog documentation and community best practices, our current organization partially follows recommendations but needs improvement in:
- Feature Granularity: Some files contain multiple unrelated features
- Step Organization: Steps could be better grouped by domain
- Context Management: Need better state isolation between scenarios
- Tagging Strategy: Currently missing tag-based test selection
Decision
Adopt a modular, isolated test suite architecture with the following principles:
1. Test Organization by Feature (Godog-Aligned)
Following Godog best practices, we organize tests by business domain with proper feature granularity:
features/
├── auth/                           # Business domain
│   ├── authentication.feature      # Single feature per file
│   ├── password_reset.feature      # Single feature per file
│   └── user_management.feature     # Single feature per file
├── config/                         # Business domain
│   ├── hot_reloading.feature       # Single feature per file
│   └── validation.feature          # Single feature per file
├── greet/                          # Business domain
│   ├── v1_greeting.feature         # Single feature per file
│   └── v2_greeting.feature         # Single feature per file
├── health/                         # Business domain
│   └── health_check.feature        # Single feature per file
└── jwt/                            # Business domain
    ├── secret_rotation.feature     # Single feature per file
    └── retention_policy.feature    # Single feature per file
Key Improvements over current structure:
- ✅ Single responsibility: One feature per file
- ✅ Business alignment: Grouped by domain, not technical concerns
- ✅ Scalability: Easy to add new features without bloating files
2. Isolation Strategies
A. Config File Isolation
- Each feature directory has its own config file pattern
- Config files are cleaned up after each feature test run
- Example:
features/auth/auth-test-config.yaml
B. Database Isolation
- Use separate database schemas or suffixes per feature
- Example:
dance_lessons_coach_auth_test, dance_lessons_coach_greet_test
C. Server Port Isolation
- Assign different ports to different test groups
- Prevents port conflicts during parallel testing
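One way to assign ports without a hand-maintained table is to derive them deterministically from the feature name. A sketch; the 9100–9199 window is an assumption (pick one known to be free on your CI hosts), and a hash collision between two feature names would need a fixed override:

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// featurePort maps a feature name to a stable port in [9100, 9200),
// so each test group gets its own port without manual bookkeeping.
func featurePort(feature string) int {
	h := fnv.New32a()
	h.Write([]byte(feature))
	return 9100 + int(h.Sum32()%100)
}

func main() {
	for _, f := range []string{"auth", "config", "greet", "health", "jwt"} {
		fmt.Printf("%-7s -> %d\n", f, featurePort(f))
	}
}
```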
3. Test Execution Strategy
Option 1: Sequential Feature Testing (Recommended)
# Run tests by feature group
./scripts/test-feature.sh auth
./scripts/test-feature.sh config
./scripts/test-feature.sh greet
Option 2: Parallel Feature Testing (Advanced)
# Run features in parallel with isolation
./scripts/test-all-features-parallel.sh
4. Test Synchronization (Godog Best Practices)
A. Explicit Waits with Timeouts
Replace fixed sleeps with bounded, explicit polling so scenarios wait only as long as they need:
// Instead of fixed sleep times
func waitForServerReady(maxAttempts int, delay time.Duration) error {
    for i := 0; i < maxAttempts; i++ {
        if serverIsReady() {
            return nil
        }
        time.Sleep(delay)
    }
    return fmt.Errorf("server not ready after %d attempts", maxAttempts)
}
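The serverIsReady call above is left undefined; one plausible shape, assuming the server exposes an HTTP health endpoint (the /healthz path and the base-URL parameter are assumptions, not existing project API):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// serverIsReady reports whether the health endpoint answers 200 OK.
// Substitute the project's real health-check route for /healthz.
func serverIsReady(baseURL string) bool {
	client := &http.Client{Timeout: 2 * time.Second}
	resp, err := client.Get(baseURL + "/healthz")
	if err != nil {
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode == http.StatusOK
}

func main() {
	fmt.Println(serverIsReady("http://127.0.0.1:9192"))
}
```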
B. Godog Context Management
Implement proper context structs as recommended by Godog:
// Feature-specific context for isolation
type AuthContext struct {
    client *testserver.Client
    db     *sql.DB
    users  map[string]UserData
}

func InitializeAuthContext() *AuthContext {
    return &AuthContext{
        client: testserver.NewClient(),
        db:     connectToFeatureDB("auth"),
        users:  make(map[string]UserData),
    }
}

func CleanupAuthContext(ctx *AuthContext) {
    // Cleanup resources
    ctx.db.Close()
}
C. Tag-Based Test Selection
Add Godog tag support for selective test execution:
# In feature files
@smoke @auth
Scenario: Successful user authentication
  Given the server is running
  When I authenticate with valid credentials
  Then the authentication should be successful

# Run specific tags (the go test form assumes the godog flags are bound
# in TestMain via godog.BindCommandLineFlags; a plain -tags flag would
# set a Go build tag, not a Godog tag)
go test ./features/... -godog.tags=@smoke
godog --tags=@auth features/
D. Event-Based Synchronization
// Use server lifecycle events instead of polling
func waitForConfigReload() error {
    return waitForEvent("config_reloaded", 30*time.Second)
}
E. Test Hooks with Timeouts
// In test setup
ctx.Step(`^I wait for v2 API to be enabled$`, func() error {
    return waitForCondition(30*time.Second, func() bool {
        return v2EndpointAvailable()
    })
})
5. Test Lifecycle Management
Before Suite (Feature Level)
func InitializeFeatureSuite(featureName string) {
    // Setup feature-specific resources
    initDatabaseForFeature(featureName)
    createFeatureConfigFile(featureName)
    startIsolatedServer(featureName)
}
After Suite (Feature Level)
func CleanupFeatureSuite(featureName string) {
    // Cleanup feature-specific resources
    cleanupDatabaseForFeature(featureName)
    removeFeatureConfigFile(featureName)
    stopIsolatedServer(featureName)
}
6. Shell Script Integration
Create feature-specific test scripts:
#!/bin/bash
# scripts/test-feature.sh
set -euo pipefail

FEATURE="$1"
DATABASE="dance_lessons_coach_${FEATURE}_test"
CONFIG="features/${FEATURE}/${FEATURE}-test-config.yaml"

# Setup
setup_feature_environment() {
    echo "🧪 Setting up ${FEATURE} feature tests..."
    create_database "${DATABASE}"
    generate_config "${CONFIG}"
}

# Run tests
run_feature_tests() {
    echo "🚀 Running ${FEATURE} feature tests..."
    DLC_DATABASE_NAME="${DATABASE}" \
    DLC_CONFIG_FILE="${CONFIG}" \
    go test "./features/${FEATURE}/..." -v
}

# Teardown (registered via trap so it runs even when the tests fail)
cleanup_feature_environment() {
    echo "🧹 Cleaning up ${FEATURE} feature tests..."
    drop_database "${DATABASE}"
    remove_config "${CONFIG}"
}

# Main execution
setup_feature_environment
trap cleanup_feature_environment EXIT
run_feature_tests
7. Configuration Management
Feature-Specific Config Files
# features/auth/auth-test-config.yaml
server:
  host: "127.0.0.1"
  port: 9192  # Feature-specific port
database:
  name: "dance_lessons_coach_auth_test"  # Feature-specific database
api:
  v2_enabled: true  # Feature-specific settings
auth:
  jwt:
    ttl: 1h
8. Test Data Management
A. Feature-Scoped Data
- Each feature gets its own data namespace
- Example:
auth_user_* and greet_message_* prefixes
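A trivial helper keeps that naming convention in one place (the function name is illustrative):

```go
package main

import "fmt"

// featureKey builds a namespaced fixture identifier, e.g.
// featureKey("auth", "user", "alice") yields "auth_user_alice",
// matching the auth_user_* / greet_message_* prefix convention.
func featureKey(feature, kind, id string) string {
	return fmt.Sprintf("%s_%s_%s", feature, kind, id)
}

func main() {
	fmt.Println(featureKey("auth", "user", "alice"))     // auth_user_alice
	fmt.Println(featureKey("greet", "message", "welcome")) // greet_message_welcome
}
```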
B. Automatic Cleanup
func CleanupFeatureData(db *sql.DB, featureName string) error {
    // SQL has no wildcard table names, so enumerate the feature's tables
    // (listTablesForFeature is a hypothetical helper, e.g. backed by
    // information_schema) and delete the feature's rows from each.
    for _, table := range listTablesForFeature(db, featureName) {
        if _, err := db.Exec(fmt.Sprintf("DELETE FROM %s WHERE feature = $1", table), featureName); err != nil {
            return err
        }
    }
    return nil
}
Consequences
Positive
- Improved Test Reliability: Tests don't interfere with each other
- Better Maintainability: Smaller, focused test files
- Faster Development: Run only relevant tests during feature development
- Easier Debugging: Isolate issues to specific features
- Parallel Testing: Enable safe parallel execution
- SOLID Compliance: Single responsibility for test files
Negative
- Increased Complexity: More moving parts in test infrastructure
- Resource Usage: Multiple databases/servers consume more resources
- Setup Time: Initial test runs may be slower due to setup
- Learning Curve: Team needs to understand the isolation patterns
Neutral
- Test Execution Time: May increase or decrease depending on parallelization
- CI/CD Changes: Pipeline needs adaptation for new test organization
Implementation Plan
Phase 1: Refactor Current Tests (1-2 weeks)
- Split monolithic feature files into feature directories
- Create feature-specific test scripts
- Implement basic isolation (config files, database names)
Phase 2: Enhance Test Infrastructure (2-3 weeks)
- Add synchronization helpers to test framework
- Implement server lifecycle management
- Create comprehensive cleanup routines
Phase 3: Parallel Testing (Optional)
- Add parallel test execution capability
- Implement port management for parallel runs
- Add resource monitoring
Alternatives Considered
1. Single Test Suite with Better Cleanup
Rejected because: Doesn't solve fundamental interdependence issues
2. Docker-Based Isolation
Rejected because: Too heavyweight for local development
3. Test Virtualization
Rejected because: Overkill for current project size
Success Metrics
- Test Reliability: >95% pass rate in CI/CD
- Test Isolation: Ability to run any single feature test independently
- Developer Experience: Feature tests run in <30 seconds locally
- Maintainability: New team members can understand test structure in <1 hour
References
Godog Official Resources
BDD Best Practices
Test Organization Patterns
Revision History
- 2026-04-09: Initial draft based on BDD test challenges
- 2026-04-09: Added implementation details and examples
Decision Makers
- Approved by: Gabriel Radureau
- Consulted: AI Agent (Mistral Vibe)
- Informed: Development Team
Future Considerations
- Test Impact Analysis: Track which tests are affected by code changes
- Flaky Test Detection: Automatically identify and quarantine flaky tests
- Performance Benchmarking: Monitor test execution times over time
- Test Coverage Visualization: Feature-level coverage reports
Status: 🟡 Proposed → Ready for team review and implementation
Note: This ADR complements ADR 0023 (Config Hot Reloading) by addressing the test organization aspects of hot reloading functionality.