# BDD Testing Framework

This directory contains the Behavior-Driven Development (BDD) testing framework for the dance-lessons-coach project, implementing the architecture described in ADR 0024.

## 🗺️ Architecture Overview

The BDD framework follows a modular, isolated test suite architecture with these key components:

### 📁 Directory Structure

```
pkg/bdd/
├── README.md                  # This file
├── context/                   # Feature-specific test contexts
│   ├── auth_context.go        # Authentication test context
│   └── config_context.go      # Configuration test context
├── helpers/                   # Test synchronization helpers
│   └── synchronization.go     # Wait functions and utilities
├── parallel/                  # Parallel test execution
│   ├── port_manager.go        # Port allocation system
│   └── resource_monitor.go    # Resource tracking
├── steps/                     # Step definitions
│   ├── auth_steps.go          # Authentication steps
│   ├── config_steps.go        # Configuration steps
│   ├── greet_steps.go         # Greeting steps
│   ├── health_steps.go        # Health check steps
│   ├── jwt_retention_steps.go # JWT retention steps
│   └── steps.go               # Main step registration
├── suite.go                   # Test suite initialization
├── suite_feature.go           # Feature-specific suite support
└── testserver/                # Test server implementation
    ├── client.go              # HTTP test client
    └── server.go              # Test server with config
```


## 🎯 Core Components

### 1. Test Server

**Location:** `pkg/bdd/testserver/`

The test server provides a real HTTP server instance for black-box testing:

- **Hybrid Testing**: Runs in-process rather than as an external process
- **Configuration**: Loads feature-specific configs from `features/*/*-test-config.yaml`
- **Database**: Manages PostgreSQL connections with proper isolation
- **Port Management**: Uses feature-specific ports (9192-9196)

**Key Functions:**

- `NewServer()` - Creates a test server instance
- `Start()` - Starts the server with its feature-specific configuration
- `initDBConnection()` - Initializes the database connection
- `createTestConfig()` - Loads the feature-specific configuration

### 2. Step Definitions

**Location:** `pkg/bdd/steps/`

Step definitions implement the Gherkin scenarios using Godog:

- **Domain-Specific**: Organized by feature area (auth, config, greet, etc.)
- **Reusable**: Common patterns live in `common_steps.go`
- **Exact Matching**: Uses Godog's exact regex patterns

**Example:**

```go
// greet_steps.go
func (gs *GreetSteps) iRequestAGreetingFor(name string) error {
	return gs.client.Request("GET", fmt.Sprintf("/api/v1/greet/%s", name), nil)
}
```

### 3. Synchronization Helpers

**Location:** `pkg/bdd/helpers/`

The helpers provide robust waiting mechanisms for asynchronous operations:

- **Timeout Support**: All functions take timeout parameters
- **Polling**: Uses context-based polling with configurable intervals
- **Common Patterns**: Covers server readiness, config reload, and API availability

**Available Helpers:**

- `waitForServerReady()` - Waits for the server to become ready
- `waitForConfigReload()` - Detects configuration changes
- `waitForCondition()` - Generic condition waiting
- `waitForV2APIEnabled()` - Checks v2 API availability

### 4. Parallel Testing

**Location:** `pkg/bdd/parallel/`

Parallel execution infrastructure for CI/CD optimization:

- **Port Management**: `PortManager` allocates unique ports
- **Resource Monitoring**: Tracks memory, goroutines, and CPU usage
- **Controlled Parallelism**: `ParallelTestRunner` limits concurrency

**Key Features:**

- Thread-safe port allocation
- Resource limit enforcement
- Timeout detection
- Comprehensive monitoring

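Thread-safe allocation from the project's 9192-9196 range can be sketched with a mutex-guarded free set. The `PortManager` shown here is illustrative only; the real `pkg/bdd/parallel` implementation may differ:

```go
package main

import (
	"fmt"
	"sync"
)

// PortManager hands out unique ports from a fixed range (9192-9196 in this
// project) so parallel features never collide. Illustrative sketch.
type PortManager struct {
	mu    sync.Mutex
	min   int
	max   int
	inUse map[int]bool
}

func NewPortManager(min, max int) *PortManager {
	return &PortManager{min: min, max: max, inUse: make(map[int]bool)}
}

// Acquire returns the lowest free port, or an error when the range is exhausted.
func (pm *PortManager) Acquire() (int, error) {
	pm.mu.Lock()
	defer pm.mu.Unlock()
	for p := pm.min; p <= pm.max; p++ {
		if !pm.inUse[p] {
			pm.inUse[p] = true
			return p, nil
		}
	}
	return 0, fmt.Errorf("no free ports in %d-%d", pm.min, pm.max)
}

// Release marks a port as free again once a feature's server shuts down.
func (pm *PortManager) Release(port int) {
	pm.mu.Lock()
	defer pm.mu.Unlock()
	delete(pm.inUse, port)
}

func main() {
	pm := NewPortManager(9192, 9196)
	p1, _ := pm.Acquire()
	p2, _ := pm.Acquire()
	fmt.Println(p1, p2)
}
```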
### 5. Feature Contexts

**Location:** `pkg/bdd/context/`

Feature-specific test contexts for better organization:

- **AuthContext**: User management and authentication
- **ConfigContext**: Configuration file handling
- **Extensible**: Easy to add new feature contexts

## 🚀 Test Execution

### Running All Tests

```bash
# Default: run all features sequentially
go test ./features/...

# With environment variables
DLC_DATABASE_HOST=localhost DLC_DATABASE_PORT=5432 \
DLC_DATABASE_USER=postgres DLC_DATABASE_PASSWORD=postgres \
DLC_DATABASE_NAME=dance_lessons_coach_bdd_test \
DLC_DATABASE_SSL_MODE=disable \
go test ./features/...
```

### Feature-Specific Testing

```bash
# Test a specific feature
./scripts/test-feature.sh greet

# Test with specific tags
./scripts/test-by-tag.sh @smoke greet
```

### Parallel Testing

```bash
# Run all features in parallel
./scripts/test-all-features-parallel.sh

# Run specific features in parallel
# (Requires the PostgreSQL container to be running)
```

### Tag-Based Testing

```bash
# List available tags
./scripts/run-bdd-tests.sh list-tags

# Run smoke tests
./scripts/run-bdd-tests.sh run @smoke

# Run critical tests for auth
./scripts/run-bdd-tests.sh run @critical @auth
```

## 📋 Test Organization

### Feature Structure

Each feature follows this structure:

```
features/{feature}/
├── {feature}.feature           # Gherkin scenarios
├── {feature}-test-config.yaml  # Feature-specific config
└── {feature}_test.go           # Go test runner
```

### Configuration Files

Feature-specific YAML files define the test environment:

```yaml
# features/greet/greet-test-config.yaml
server:
  host: "127.0.0.1"
  port: 9194

database:
  host: "localhost"
  port: 5432
  name: "dance_lessons_coach_greet_test"

api:
  v2_enabled: true
```

### Tagging System

Comprehensive tagging enables selective test execution:

- **Feature Tags**: `@auth`, `@config`, `@greet`, `@health`, `@jwt`
- **Priority Tags**: `@smoke`, `@critical`, `@basic`, `@advanced`
- **Component Tags**: `@api`, `@v2`, `@database`, `@security`

See `features/BDD_TAGS.md` for complete documentation.

## 🔧 Database Management

### Database Creation

The framework handles database creation automatically:

1. **PostgreSQL Container**: Uses Docker (`dance-lessons-coach-postgres`)
2. **Feature Databases**: Creates `dance_lessons_coach_{feature}_test` per feature
3. **Cleanup**: Automatically drops databases after tests

**Database Creation Flow:**

1. Check whether the database exists
2. Create it if missing (`createdb` command)
3. Run the tests against the isolated database
4. Clean up (`dropdb` command)

### Configuration

Database settings come from:

- Environment variables (`DLC_DATABASE_*`)
- Feature-specific config files
- Default values for development

## 🧪 Best Practices

### Step Definition Patterns

```go
// ✅ DO: Use Godog's exact suggested regex patterns
ctx.Step(`^I request a greeting for "([^"]*)"$`, sc.iRequestAGreetingFor)

// ❌ DON'T: Use a pattern that diverges from the feature file
ctx.Step(`^I request greeting "(.*)"$`, sc.iRequestAGreetingFor)
```

### Test Isolation

- Each feature has a unique port and database
- No shared state between features
- Cleanup after each test run
- Feature-specific configuration

### Synchronization

```go
// ✅ DO: Use helpers for async operations
helpers.waitForServerReady(client, 30*time.Second)

// ❌ DON'T: Use fixed sleep times
time.Sleep(5 * time.Second)
```

### Context Management

```go
// ✅ DO: Use feature-specific contexts
switch featureName {
case "auth":
	authCtx = context.NewAuthContext(client)
	context.InitializeAuthContext(ctx, client)
}
```

## 📈 Performance Optimization

### Parallel Execution

- Use `scripts/test-all-features-parallel.sh` for CI/CD
- Limit parallelism based on system resources
- Monitor resource usage with `ResourceMonitor`

### Selective Testing

- Run only relevant tests with tag filtering
- Use `@smoke` for quick validation
- Use `@critical` for essential-path testing

### Resource Management

- Set appropriate timeouts
- Limit the maximum number of goroutines
- Monitor memory usage
- Clean up resources promptly

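A resource check of this kind can be sketched with Go's standard `runtime` package. The `withinLimits` name and the limits used are illustrative assumptions, not the actual `ResourceMonitor` API:

```go
package main

import (
	"fmt"
	"runtime"
)

// withinLimits reports whether the current goroutine count and heap usage
// are under the given caps - the kind of check a resource monitor can run
// between scenarios to catch leaks early. Illustrative sketch only.
func withinLimits(maxGoroutines int, maxHeapBytes uint64) bool {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	return runtime.NumGoroutine() <= maxGoroutines && m.HeapAlloc <= maxHeapBytes
}

func main() {
	// Generous limits for a small test process.
	fmt.Println("within limits:", withinLimits(1000, 512<<20))
}
```

A steadily growing goroutine count across scenarios usually points at a server or helper that is not being cleaned up.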
## 🔧 Troubleshooting

### Common Issues

| Issue | Cause | Solution |
|-------|-------|----------|
| Undefined steps | Step pattern mismatch | Use Godog's exact suggested patterns |
| Port conflicts | Multiple servers | Check port allocation in config files |
| Database connection failures | PostgreSQL not running | Start it with `docker compose up -d postgres` |
| Broken test isolation | Shared state | Verify unique ports/databases per feature |

### Debugging

```bash
# Verbose output
go test ./features/... -v

# Check a specific feature
cd features/greet && go test -v .

# List available tags
./scripts/run-bdd-tests.sh list-tags
```

## 📚 Documentation

- **ADR 0024**: BDD Test Organization and Isolation Strategy
- **BDD_TAGS.md**: Complete tag reference
- **Godog documentation**: https://github.com/cucumber/godog

## 🎯 Future Enhancements

- **Test Impact Analysis**: Track which tests are affected by code changes
- **Flaky Test Detection**: Automatically identify and quarantine flaky tests
- **Performance Benchmarking**: Monitor test execution times
- **AI-Assisted Testing**: Automated test generation and optimization

This BDD framework provides a robust foundation for behavior-driven development in the dance-lessons-coach project, ensuring test reliability, maintainability, and scalability.