📝 docs: add ADR 0024 for BDD test organization and isolation strategy
Updates .gitignore to ignore feature-specific test config files.

Aligns the test organization with Godog best practices and community standards.
.gitignore (vendored): 3 changes
@@ -24,8 +24,9 @@ server.pid
pkg/server/docs/

# BDD test files
features/test-config.yaml
features/*/*-config.yaml
test-config.yaml
test-v2-config.yaml

# CI/CD runner configuration
config/runner
adr/0024-bdd-test-organization-and-isolation.md (new file): 358 lines
@@ -0,0 +1,358 @@
# ADR 0024: BDD Test Organization and Isolation Strategy

## Status

**Proposed** 🟡

## Context

As the dance-lessons-coach project grows, our BDD test suite has run into several challenges. While we initially followed basic Godog patterns, our test organization needs to evolve to handle complex scenarios such as config hot reloading while keeping the suite reliable.

### Current Issues

1. **Test Interdependence**: Tests affect each other through shared state (config files, database)
2. **Timing Issues**: Config reloading and server restarts cause race conditions
3. **Cognitive Load**: Large test files with many scenarios are hard to maintain
4. **Flaky Tests**: Tests pass individually but fail when run together
5. **Edge Case Handling**: Certain tests have special setup/teardown requirements

### Godog Best Practices Alignment

According to the [Godog documentation](https://github.com/cucumber/godog) and community best practices, our current organization partially follows the recommendations but needs improvement in:

- **Feature Granularity**: Some files contain multiple unrelated features
- **Step Organization**: Steps could be better grouped by domain
- **Context Management**: State isolation between scenarios needs improvement
- **Tagging Strategy**: Tag-based test selection is currently missing

## Decision

Adopt a **modular, isolated test suite architecture** built on the following principles:

### 1. Test Organization by Feature (Godog-Aligned)

Following [Godog best practices](https://github.com/cucumber/godog), we organize tests by business domain, with a single feature per file:
```
features/
├── auth/                          # Business domain
│   ├── authentication.feature     # Single feature per file
│   ├── password_reset.feature
│   └── user_management.feature
├── config/
│   ├── hot_reloading.feature
│   └── validation.feature
├── greet/
│   ├── v1_greeting.feature
│   └── v2_greeting.feature
├── health/
│   └── health_check.feature
└── jwt/
    ├── secret_rotation.feature
    └── retention_policy.feature
```
**Key improvements over the current structure:**

- ✅ **Single responsibility**: One feature per file
- ✅ **Business alignment**: Grouped by domain, not technical concerns
- ✅ **Scalability**: New features can be added without bloating existing files

### 2. Isolation Strategies

#### A. Config File Isolation

- Each feature directory has its own config file pattern
- Config files are cleaned up after each feature test run
- Example: `features/auth/auth-test-config.yaml`

#### B. Database Isolation

- Use a separate database schema or name suffix per feature
- Example: `dance_lessons_coach_auth_test`, `dance_lessons_coach_greet_test`

#### C. Server Port Isolation

- Assign different ports to different test groups
- Prevents port conflicts during parallel test runs
### 3. Test Execution Strategy

#### Option 1: Sequential Feature Testing (Recommended)

```bash
# Run tests one feature group at a time
./scripts/test-feature.sh auth
./scripts/test-feature.sh config
./scripts/test-feature.sh greet
```

#### Option 2: Parallel Feature Testing (Advanced)

```bash
# Run feature groups in parallel, relying on the isolation above
./scripts/test-all-features-parallel.sh
```
### 4. Test Synchronization (Godog Best Practices)

#### A. Explicit Waits with Timeouts

Following the [arrange-act-assert pattern](https://alicegg.tech/2019/03/09/gobdd.html), poll with a bounded retry loop instead of fixed sleep times:

```go
// waitForServerReady polls instead of sleeping for a fixed duration.
// serverIsReady is the suite's own readiness probe (e.g. a health-check call).
func waitForServerReady(maxAttempts int, delay time.Duration) error {
	for i := 0; i < maxAttempts; i++ {
		if serverIsReady() {
			return nil
		}
		time.Sleep(delay)
	}
	return fmt.Errorf("server not ready after %d attempts", maxAttempts)
}
```

#### B. Godog Context Management

Use feature-specific context structs, as recommended by Godog, so each scenario owns its state:

```go
// AuthContext holds all state for auth scenarios, isolated per feature.
type AuthContext struct {
	client *testserver.Client
	db     *sql.DB
	users  map[string]UserData
}

func InitializeAuthContext() *AuthContext {
	return &AuthContext{
		client: testserver.NewClient(),
		db:     connectToFeatureDB("auth"),
		users:  make(map[string]UserData),
	}
}

func CleanupAuthContext(ctx *AuthContext) {
	// Release feature-scoped resources.
	ctx.db.Close()
}
```

#### C. Tag-Based Test Selection

Add Godog tags to feature files for selective test execution:

```gherkin
@smoke @auth
Scenario: Successful user authentication
  Given the server is running
  When I authenticate with valid credentials
  Then the authentication should be successful
```

```bash
# Run only scenarios tagged @auth (godog CLI). Note that Go's own
# -tags flag selects build tags, not Godog tags; Godog filtering uses
# its --tags flag or the Tags field of godog.Options.
godog --tags=@auth features/
```

#### D. Event-Based Synchronization

```go
// Block on a server lifecycle event instead of polling, with a timeout.
func waitForConfigReload() error {
	return waitForEvent("config_reloaded", 30*time.Second)
}
```

#### E. Test Hooks with Timeouts

```go
// Step definition that waits for a condition instead of sleeping.
ctx.Step(`^I wait for v2 API to be enabled$`, func() error {
	return waitForCondition(30*time.Second, func() bool {
		return v2EndpointAvailable()
	})
})
```

### 5. Test Lifecycle Management

#### Before Suite (Feature Level)

```go
func InitializeFeatureSuite(featureName string) {
	// Set up feature-specific resources.
	initDatabaseForFeature(featureName)
	createFeatureConfigFile(featureName)
	startIsolatedServer(featureName)
}
```

#### After Suite (Feature Level)

```go
func CleanupFeatureSuite(featureName string) {
	// Tear down feature-specific resources.
	cleanupDatabaseForFeature(featureName)
	removeFeatureConfigFile(featureName)
	stopIsolatedServer(featureName)
}
```
### 6. Shell Script Integration

Create feature-specific test scripts:

```bash
#!/bin/bash
# scripts/test-feature.sh: run one feature group in isolation
set -euo pipefail

FEATURE=${1:?usage: $0 <feature>}
DATABASE="dance_lessons_coach_${FEATURE}_test"
CONFIG="features/${FEATURE}/${FEATURE}-test-config.yaml"

# Setup
setup_feature_environment() {
    echo "🧪 Setting up ${FEATURE} feature tests..."
    create_database "${DATABASE}"
    generate_config "${CONFIG}"
}

# Run tests
run_feature_tests() {
    echo "🚀 Running ${FEATURE} feature tests..."
    DLC_DATABASE_NAME="${DATABASE}" \
    DLC_CONFIG_FILE="${CONFIG}" \
    go test "./features/${FEATURE}/..." -v
}

# Teardown
cleanup_feature_environment() {
    echo "🧹 Cleaning up ${FEATURE} feature tests..."
    drop_database "${DATABASE}"
    remove_config "${CONFIG}"
}

# Main execution: always clean up, even when tests fail
trap cleanup_feature_environment EXIT
setup_feature_environment
run_feature_tests
```

### 7. Configuration Management

#### Feature-Specific Config Files

```yaml
# features/auth/auth-test-config.yaml
server:
  host: "127.0.0.1"
  port: 9192  # feature-specific port

database:
  name: "dance_lessons_coach_auth_test"  # feature-specific database

api:
  v2_enabled: true  # feature-specific settings

auth:
  jwt:
    ttl: 1h
```
### 8. Test Data Management

#### A. Feature-Scoped Data

- Each feature gets its own data namespace
- Example: `auth_user_*`, `greet_message_*` prefixes
#### B. Automatic Cleanup

```go
// CleanupFeatureData deletes rows created by a feature. SQL has no
// wildcard table names, so iterate over the feature's known tables.
func CleanupFeatureData(db *sql.DB, featureName string, tables []string) error {
	for _, table := range tables {
		query := fmt.Sprintf("DELETE FROM %s WHERE feature = $1", table)
		if _, err := db.Exec(query, featureName); err != nil {
			return err
		}
	}
	return nil
}
```

## Consequences

### Positive

1. **Improved Test Reliability**: Tests no longer interfere with each other
2. **Better Maintainability**: Smaller, focused test files
3. **Faster Development**: Run only the relevant tests while developing a feature
4. **Easier Debugging**: Issues are isolated to a specific feature
5. **Parallel Testing**: Enables safe parallel execution
6. **SOLID Compliance**: Single responsibility for test files

### Negative

1. **Increased Complexity**: More moving parts in the test infrastructure
2. **Resource Usage**: Multiple databases/servers consume more resources
3. **Setup Time**: Initial test runs may be slower due to setup
4. **Learning Curve**: The team needs to learn the isolation patterns

### Neutral

1. **Test Execution Time**: May increase or decrease depending on parallelization
2. **CI/CD Changes**: The pipeline needs adapting to the new test organization
## Implementation Plan

### Phase 1: Refactor Current Tests (1-2 weeks)

1. Split monolithic feature files into feature directories
2. Create feature-specific test scripts
3. Implement basic isolation (config files, database names)

### Phase 2: Enhance Test Infrastructure (2-3 weeks)

1. Add synchronization helpers to the test framework
2. Implement server lifecycle management
3. Create comprehensive cleanup routines

### Phase 3: Parallel Testing (Optional)

1. Add parallel test execution capability
2. Implement port management for parallel runs
3. Add resource monitoring
## Alternatives Considered

### 1. Single Test Suite with Better Cleanup

**Rejected because**: it doesn't solve the fundamental interdependence issues.

### 2. Docker-Based Isolation

**Rejected because**: too heavyweight for local development.

### 3. Test Virtualization

**Rejected because**: overkill for the current project size.

## Success Metrics

1. **Test Reliability**: >95% pass rate in CI/CD
2. **Test Isolation**: Any single feature test can run independently
3. **Developer Experience**: Feature tests run in <30 seconds locally
4. **Maintainability**: New team members can understand the test structure in <1 hour
## References

### Godog Official Resources

- [Godog GitHub Repository](https://github.com/cucumber/godog)
- [Godog Documentation](https://pkg.go.dev/github.com/cucumber/godog)

### BDD Best Practices

- [BDD Best Practices](references/BDD_BEST_PRACTICES.md)
- [Alice GG: BDD in Golang](https://alicegg.tech/2019/03/09/gobdd.html)
- [Scrap Your TDD for BDD: Part II](https://medium.com/the-godev-corner/scrap-your-tdd-for-bdd-part-ii-heres-how-to-start-d2468dd46dda)

### Test Organization Patterns

- [Test Server Implementation](references/TEST_SERVER.md)
- [Optimizing Godog Test Execution](https://www.reddit.com/r/golang/comments/1llnlp2/optimizing_godog_bdd_test_execution_in_go_how_to/)
## Revision History

- **2026-04-09**: Initial draft based on BDD test challenges
- **2026-04-09**: Added implementation details and examples

## Decision Makers

- **Approved by**: Gabriel Radureau
- **Consulted**: AI Agent (Mistral Vibe)
- **Informed**: Development Team

## Future Considerations

1. **Test Impact Analysis**: Track which tests are affected by code changes
2. **Flaky Test Detection**: Automatically identify and quarantine flaky tests
3. **Performance Benchmarking**: Monitor test execution times over time
4. **Test Coverage Visualization**: Feature-level coverage reports

---

**Status**: 🟡 Proposed → Ready for team review and implementation

**Note**: This ADR complements ADR 0023 (Config Hot Reloading) by addressing the test organization aspects of hot reloading functionality.
@@ -2,6 +2,35 @@

This directory contains Architecture Decision Records (ADRs) for the dance-lessons-coach project.

## Index of ADRs

| Number | Title | Status |
|--------|-------|--------|
| 0001 | Go 1.26.1 Standard | ✅ Accepted |
| 0002 | Chi Router | ✅ Accepted |
| 0003 | Zerolog Logging | ✅ Accepted |
| 0004 | Interface-Based Design | ✅ Accepted |
| 0005 | Graceful Shutdown | ✅ Accepted |
| 0006 | Configuration Management | ✅ Accepted |
| 0007 | OpenTelemetry Integration | ✅ Accepted |
| 0008 | BDD Testing | ✅ Accepted |
| 0009 | Hybrid Testing Approach | ✅ Accepted |
| 0010 | CI/CD Pipeline Design | ✅ Accepted |
| 0011 | Trunk-Based Development | ✅ Accepted |
| 0012 | Commit Message Conventions | ✅ Accepted |
| 0013 | Version Management Lifecycle | ✅ Accepted |
| 0014 | Swagger Documentation | ✅ Accepted |
| 0015 | Rate Limiting Strategy | ✅ Accepted |
| 0016 | Cache Invalidation Strategy | ✅ Accepted |
| 0017 | JWT Secret Rotation | ✅ Accepted |
| 0018 | Configuration Hot Reloading | ✅ Accepted |
| 0019 | BDD Feature Structure | ✅ Accepted |
| 0020 | Database Migration Strategy | ✅ Accepted |
| 0021 | API Versioning Strategy | ✅ Accepted |
| 0022 | Rate Limiting and Cache Strategy | ✅ Accepted |
| 0023 | Config Hot Reloading | 🟡 Proposed |
| 0024 | BDD Test Organization and Isolation | 🟡 Proposed |

## What is an ADR?

An ADR is a document that captures an important architectural decision together with its context and consequences.