dance-lessons-coach/adr/0009-hybrid-testing-approach.md
Gabriel Radureau a24b4fdb3b 📝 docs(adr): homogenize 23 ADRs + rewrite README (Tâche 7 migration) (#18)
## Summary

Homogenize all 23 ADRs to a single canonical header format, and rewrite `adr/README.md` to match the actual state of the corpus.

This is **Task 7** of the ARCODANGE Phase 1 migration (Claude Code → Mistral Vibe). It is independent of PR #17 (Task 6, which restructures AGENTS.md); the two can merge in either order. No code changes; documentation only.

## Changes

### 1. Homogenize 21 ADR headers (commit `db09d0a`)

The audit (Task 6 Phase A, Mistral intent-router agent, 2026-05-02) identified **3 inconsistent header formats**:

- **F1** — list bullets (`* Status:` / `* Date:` / `* Deciders:`): 11 ADRs (0001-0008, 0011, 0014, 0023)
- **F2** — bold fields (`**Status:**` / `**Date:**` / `**Authors:**`): 9 ADRs (0009, 0010, 0012, 0013, 0015, 0016, 0017, 0018, 0019)
- **F3** — dedicated section (`## Status\n**Value** `): 5 ADRs (0020, 0021, 0022, 0024, 0025)

On top of that, mixed metadata names (Authors / Deciders / Decision Date / Implementation Date / Implementation Status / Last Updated) and decorative emojis on status values made the corpus hard to scan or template against.

**Canonical format adopted** (see `adr/README.md` for the full template):

```markdown
# NN. Title

**Status:** <Proposed | Accepted | Implemented | Partially Implemented | Approved | Rejected | Deferred | Deprecated | Superseded by ADR-NNNN>
**Date:** YYYY-MM-DD
**Authors:** Name(s)

[optional **Field:** ... lines]

## Context...
```

**Transformations applied** (via the `/tmp/homogenize-adrs.py` script; 23 files scanned, 21 modified, 0010 and 0012 already conformant):

- F1 list bullets → bold fields
- F2 cleanup: `**Deciders:**` → `**Authors:**`, strip status emojis
- F3 sections: `## Status\n**Value** ` → `**Status:** Value` (single line)
- Strip decorative emojis from `**Status:**` and `**Implementation Status:**`
- Convert `* Last Updated:` / `* Implementation Status:` / `* Decision Drivers:` / `* Decision Date:` to bold
- Date typo fix: `2024-04-XX` → `2026-04-XX` for ADRs 0018, 0019 (off by 2 years in the original)
- Normalize multiple blank lines after header (max 1)
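The one-shot script itself was Python (`/tmp/homogenize-adrs.py`, not kept in the repo). As an illustration in the project's own language, the F1 → canonical rewrite amounts to a couple of regex passes; the field names and emoji subset below come from the lists above, everything else is a sketch, not the actual script:

```go
package main

import (
	"fmt"
	"regexp"
)

// fieldRe matches an F1 list-bullet metadata line such as "* Status: Accepted".
var fieldRe = regexp.MustCompile(`(?m)^\* (Status|Date|Deciders|Authors|Last Updated|Implementation Status|Decision Drivers|Decision Date): *(.*)$`)

// emojiRe strips a few decorative status emojis (illustrative subset).
var emojiRe = regexp.MustCompile(`[✅❌🚧] *`)

// homogenize rewrites an F1-style header into the canonical bold-field form.
func homogenize(header string) string {
	out := fieldRe.ReplaceAllStringFunc(header, func(line string) string {
		m := fieldRe.FindStringSubmatch(line)
		name, value := m[1], emojiRe.ReplaceAllString(m[2], "")
		if name == "Deciders" { // unify on Authors, per the canonical format
			name = "Authors"
		}
		return fmt.Sprintf("**%s:** %s", name, value)
	})
	// collapse runs of blank lines after the header to a single one
	return regexp.MustCompile(`\n{3,}`).ReplaceAllString(out, "\n\n")
}

func main() {
	f1 := "# 1. Example\n\n* Status: ✅ Accepted\n* Date: 2026-04-05\n* Deciders: Gabriel Radureau\n"
	fmt.Print(homogenize(f1))
}
```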

**ADR body content is preserved unchanged.** Only headers transformed.

### 2. Rewrite `adr/README.md` (commit `d64ab02`)

The previous README had multiple inconsistencies:

- Index table listed wrong titles for ADRs 0010-0021 (an aspirational forecast that never matched reality; e.g. "0011 = Trunk-Based Development", while the real Trunk-Based Development ADR is 0017 and 0011 is absent)
- Listed entries for ADRs 0011 (validation library) and 0014 (gRPC) but **these files do not exist** in the repo
- 0024 (BDD Test Organization) was missing from the detail list
- Template still showed the obsolete F1 format (`* Status:`)
- Decorative emojis on every status entry

The rewrite:

- Index table **regenerated from actual file contents** (title from H1, status from `**Status:**` line) — emoji-free, accurate
- Notes that 0011 / 0014 are not currently in use (reserved)
- Updated template block matches the canonical format
- Status Legend extended with `Approved`, `Partially Implemented`, `Deferred`
- Added note that 0026 is the next free number for new ADRs
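Regenerating the index from file contents rather than by hand is what keeps the table honest. A minimal sketch of the idea (the `indexRow` helper and the link format are illustrative, not the actual generation code):

```go
package main

import (
	"fmt"
	"regexp"
)

var (
	titleRe  = regexp.MustCompile(`(?m)^# (.+)$`)
	statusRe = regexp.MustCompile(`(?m)^\*\*Status:\*\* (.+)$`)
)

// indexRow builds one markdown index-table row from an ADR's filename and
// contents: the title comes from the H1, the status from the **Status:** line.
func indexRow(filename, content string) (string, error) {
	title := titleRe.FindStringSubmatch(content)
	status := statusRe.FindStringSubmatch(content)
	if title == nil || status == nil {
		return "", fmt.Errorf("%s: missing H1 or **Status:** line", filename)
	}
	return fmt.Sprintf("| [%s](%s) | %s |", title[1], filename, status[1]), nil
}

func main() {
	adr := "# 9. Combine BDD and Swagger-based testing\n\n**Status:** Partially Implemented (BDD + Documentation only)\n"
	row, err := indexRow("0009-hybrid-testing-approach.md", adr)
	if err != nil {
		panic(err)
	}
	fmt.Println(row)
}
```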

## Test plan

- [x] All 23 ADRs follow `**Status:**` / `**Date:**` / `**Authors:**` (verified via grep)
- [x] No more occurrences of `* Status:` (F1) or `## Status` (F3) in any ADR header
- [x] No more emojis on `**Status:**` lines
- [x] `adr/README.md` index links resolve to existing files (no more 0011 / 0014 dead links)
- [x] Pre-commit hooks pass (`go mod tidy`, `go fmt`, `swag fmt`)

## Migration context

Part of Phase 1 of the ARCODANGE migration from Claude Code to Mistral Vibe; Task 7 of the curriculum.

Independent from PR #17 (which restructures `AGENTS.md`). The two PRs touch disjoint files — no merge conflict expected when both are merged.

🤖 Generated with [Claude Code](https://claude.com/claude-code) (Opus 4.7, 1M context). Mistral Vibe (intent-router agent / mistral-medium-3.5) did the original audit identifying the 3 formats during Tâche 6 Phase A.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Co-Authored-By: Mistral Vibe (devstral-2 / mistral-medium-3.5)
Reviewed-on: #18
Co-authored-by: Gabriel Radureau <arcodange@gmail.com>
Co-committed-by: Gabriel Radureau <arcodange@gmail.com>
2026-05-03 11:01:13 +02:00


# 9. Combine BDD and Swagger-based testing

**Status:** Partially Implemented (BDD + Documentation only)
**Date:** 2026-04-05
**Authors:** Gabriel Radureau, AI Agent

**Last Updated:** 2026-04-05
**Implementation Status:** BDD testing and OpenAPI documentation completed, SDK generation deferred

## Context and Problem Statement

We need to establish a comprehensive testing strategy for dance-lessons-coach that provides:

- Behavioral verification through BDD
- API documentation through Swagger/OpenAPI
- Client SDK validation
- Clear separation of concerns
- A maintainable test suite

## Decision Drivers

- Need for comprehensive API testing
- Desire for living documentation
- Requirement for client SDK validation
- Need for clear test organization
- Desire for a maintainable test suite

## Considered Options

- **BDD only** - use Godog for all testing
- **Swagger only** - use OpenAPI for testing
- **Hybrid approach** - combine BDD and Swagger testing
- **Custom solution** - build our own testing framework

## Decision Outcome

Chosen option: "Hybrid approach" because it provides the best combination of behavioral verification, API documentation, client validation, and maintainable test organization.

## Implementation Status

**Status:** Partially Implemented (BDD + Documentation only)

### What We Actually Have

1. **BDD Testing with Direct HTTP Client**
   - Godog framework integration
   - Direct HTTP testing of all endpoints
   - Comprehensive feature coverage
   - Clear, readable scenarios
   - 7 scenarios, 21 steps, 100% passing
2. **OpenAPI/Swagger Documentation**
   - swaggo/swag integration
   - Interactive Swagger UI at `/swagger/`
   - OpenAPI 2.0 specification
   - Hierarchical tagging system
   - Embedded documentation for single-binary deployment
3. **Swagger-based Testing (Not implemented)**
   - No SDK generation from the OpenAPI spec
   - No SDK-based BDD tests
   - No client validation through generated SDKs
   - No `api/gen/` directory with generated clients
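The direct-HTTP pattern from item 1 can be sketched with the standard library alone. The handler and route below are illustrative stand-ins for the real server; in the repo, checks like this are registered as Godog steps in `pkg/bdd/steps/steps.go` against the test server in `pkg/bdd/testserver`:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
)

// newStubServer returns a stand-in for the real test server; the greet
// handler is illustrative, not the repo's actual server code.
func newStubServer() *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		name := r.URL.Query().Get("name")
		if name == "" {
			name = "world"
		}
		fmt.Fprintf(w, "Hello %s!", name)
	}))
}

// greet performs the kind of direct HTTP call a Godog step would make:
// hit the endpoint, read the raw body, return it for comparison.
func greet(baseURL, name string) string {
	resp, err := http.Get(baseURL + "/api/greet?name=" + name)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	return string(body)
}

func main() {
	srv := newStubServer()
	defer srv.Close()
	fmt.Println(greet(srv.URL, "John")) // Hello John!
}
```

Because the test talks to the raw endpoint, there is no dependency on generated code, which is exactly why the SDK layer could be deferred.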

### Why We Don't Need Full Hybrid Testing (Yet)

1. **Current Scale:** Small API with limited endpoints (health, ready, version, greet)
2. **Team Size:** Small team can effectively maintain direct HTTP tests
3. **Complexity:** SDK generation adds unnecessary infrastructure complexity
4. **Maintenance:** Direct HTTP tests are simpler to write and maintain
5. **Coverage:** Current BDD tests provide comprehensive coverage of all functionality
6. **No External Consumers:** No current need for official SDKs or client libraries
7. **Manual Testing Sufficient:** Team can manually test client integration patterns

### Current Testing Architecture

```text
features/
├── greet.feature          # Direct HTTP testing ✅
├── health.feature         # Direct HTTP testing ✅
└── readiness.feature      # Direct HTTP testing ✅

pkg/bdd/
├── steps/                 # Step definitions ✅
│   └── steps.go           # Direct HTTP client steps ✅
└── testserver/            # Test infrastructure ✅
    ├── client.go          # HTTP client ✅
    └── server.go          # Test server ✅

pkg/server/docs/           # OpenAPI documentation ✅
├── swagger.json           # Generated spec ✅
├── swagger.yaml           # Generated spec ✅
└── docs.go                # Embedded docs ✅
```

### Missing Components for Full Hybrid Approach

```text
api/                        # Not implemented ❌
├── openapi.yaml            # Manual spec (not generated) ❌
└── gen/                    # Generated code ❌
    └── go/                 # Go SDK client ❌

features/
└── greet_sdk.feature       # SDK-based testing ❌

pkg/bdd/
├── steps/
│   └── sdk_steps.go        # SDK client steps ❌
└── testserver/
    └── sdk_client.go       # SDK client wrapper ❌
```

## Pros and Cons of the Options

### Hybrid approach

- Good, because it combines the strengths of both approaches
- Good, because BDD provides behavioral verification
- Good, because Swagger provides API documentation
- Good, because SDK testing provides client validation
- Good, because it gives a clear separation of concerns
- Bad, because the setup is more complex
- Bad, because it requires maintaining two test suites

### BDD only

- Good, because it is a consistent testing approach
- Good, because it suits behavioral verification
- Bad, because it produces no API documentation
- Bad, because it provides no SDK validation

### Swagger only

- Good, because it yields good API documentation
- Good, because it enables SDK validation
- Bad, because it is poor for behavioral testing
- Bad, because it is less readable for non-technical stakeholders

### Custom solution

- Good, because it would be tailored to our needs
- Good, because it has no external dependencies
- Bad, because it is time-consuming to develop
- Bad, because we would need to maintain it ourselves

## Implementation Strategy

### Phase 1: BDD Implementation (Current) - COMPLETED

```text
features/
├── greet.feature          # Direct HTTP testing ✅
├── health.feature         # Direct HTTP testing ✅
└── readiness.feature      # Direct HTTP testing ✅

pkg/bdd/
├── steps/                 # Step definitions ✅
│   └── steps.go           # Direct HTTP client steps ✅
└── testserver/            # Test infrastructure ✅
    ├── client.go          # HTTP client ✅
    └── server.go          # Test server ✅
```

### Phase 2: Swagger Integration (Current) - COMPLETED

```text
pkg/server/docs/           # OpenAPI documentation ✅
├── swagger.json           # Generated spec ✅
├── swagger.yaml           # Generated spec ✅
└── docs.go                # Embedded docs ✅

pkg/server/                # Server integration ✅
├── server.go              # Swagger UI routes ✅
└── main.go                # Swagger annotations ✅
```

### Phase 3: SDK Generation (Future - Not Currently Needed) - DEFERRED

```text
api/                        # Future consideration ❌
├── openapi.yaml            # Manual spec (if needed) ❌
└── gen/                    # Generated code ❌
    └── go/                 # Go SDK client ❌

features/
└── greet_sdk.feature       # SDK-based testing ❌

pkg/bdd/
├── steps/
│   └── sdk_steps.go        # SDK client steps ❌
└── testserver/
    └── sdk_client.go       # SDK client wrapper ❌
```

## Current Testing Benefits

### 1. Direct HTTP Tests (Our Current Approach)

- Verify raw API behavior
- Test edge cases and error handling
- Black-box testing of actual endpoints
- No dependency on generated code
- Simple to write and maintain
- Fast execution
- Clear failure messages

### 2. SDK-Based Tests (Not Implemented)

- Would validate that the generated client works correctly
- Would test client integration patterns
- Would catch issues in SDK generation
- Would provide examples for SDK users
- Would add complexity to the test suite
- Would require maintenance of generated code

### Example SDK-Based Feature

```gherkin
# features/greet_sdk.feature
Feature: Greet Service SDK
  The generated SDK should work correctly with the service

  Scenario: SDK default greeting
    Given the server is running
    And I have a configured SDK client
    When I call Greet with no name
    Then the response should be "Hello world!"

  Scenario: SDK personalized greeting
    Given the server is running
    And I have a configured SDK client
    When I call Greet with name "John"
    Then the response should be "Hello John!"

  Scenario: SDK error handling
    Given the server is running
    And I have a configured SDK client
    When I call Greet with invalid parameters
    Then I should receive an appropriate error
```
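If the deferred SDK steps were ever added, the step bodies could target a small client interface so that a generated client drops in behind them. Everything below is hypothetical, since no SDK is generated today: the interface name, the stub, and the `sdk_steps.go` shape are all assumptions, not existing code:

```go
package main

import "fmt"

// GreetClient is a hypothetical shape for an oapi-codegen-generated client;
// no such client exists in the repo today.
type GreetClient interface {
	Greet(name string) (string, error)
}

// stubClient stands in for the generated SDK in this sketch.
type stubClient struct{}

func (stubClient) Greet(name string) (string, error) {
	if name == "" {
		return "Hello world!", nil
	}
	return fmt.Sprintf("Hello %s!", name), nil
}

// callGreet is roughly what an SDK-based step body could look like; in real
// use it would be registered with Godog from a file like pkg/bdd/steps/sdk_steps.go.
func callGreet(c GreetClient, name, want string) error {
	got, err := c.Greet(name)
	if err != nil {
		return err
	}
	if got != want {
		return fmt.Errorf("got %q, want %q", got, want)
	}
	return nil
}

func main() {
	if err := callGreet(stubClient{}, "John", "Hello John!"); err != nil {
		panic(err)
	}
	fmt.Println("sdk step passed")
}
```

Coding the steps against an interface rather than a concrete client keeps the feature files identical whether the client is hand-written, stubbed, or generated.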

## Implementation Order

1. Implement BDD with direct HTTP client (COMPLETED)
2. Add Swagger/OpenAPI documentation (COMPLETED)
3. Generate SDK clients from the Swagger spec (DEFERRED - not currently needed)
4. Add SDK-based BDD tests (DEFERRED - not currently needed)

## Test Organization

```text
features/
├── greet.feature          # Direct HTTP tests
├── greet_sdk.feature      # SDK client tests
├── health.feature         # Direct HTTP tests
├── health_sdk.feature     # SDK client tests
└── readiness.feature      # Direct HTTP tests
```

## Future Enhancements

### If We Need SDK Generation Later

- Add oapi-codegen for SDK generation
- Generate Go, TypeScript, and Python clients
- Add SDK-based BDD tests
- Automate SDK generation in CI/CD
- Add SDK validation to the workflow

### Current Focus (More Valuable)

- Add performance testing to the BDD suite
- Integrate contract testing
- Add API version compatibility testing
- Improve test coverage for edge cases
- Add more realistic test scenarios

## Monitoring and Maintenance

### Current Approach

- Regular review of test coverage
- Update tests when the API changes
- Keep the OpenAPI spec in sync with the implementation
- Monitor test execution in CI/CD
- Review BDD scenarios for realism

### If We Add SDK Generation Later

- Monitor SDK generation for breaking changes
- Validate that generated SDKs work correctly
- Update SDK-based tests when the API changes
- Maintain compatibility between SDK versions
- Document SDK usage patterns

## Conclusion

### What We Actually Have (Current Implementation)

- **BDD Testing:** Comprehensive behavioral testing with Godog
- **OpenAPI Documentation:** Interactive Swagger UI with swaggo/swag
- **Direct HTTP Testing:** 7 scenarios, 21 steps, 100% passing
- **Production Ready:** Fully tested and operational

### What We Don't Have (Deferred)

- **SDK Generation:** No generated clients from the OpenAPI spec
- **Hybrid Testing:** No SDK-based BDD tests
- **Client Validation:** No automated client validation
- **oapi-codegen:** Using swaggo instead

### Why This is the Right Approach

1. **Pragmatic:** Solves immediate needs without over-engineering
2. **Maintainable:** Simple infrastructure, easy to understand
3. **Effective:** Covers all functionality with direct HTTP testing
4. **Scalable:** Can add SDK generation later if needed
5. **Team-Appropriate:** Matches current team size and expertise

### Future Considerations

If we need SDK generation in the future:

- Add oapi-codegen alongside swaggo
- Generate Go, TypeScript, and Python clients
- Add SDK-based BDD tests
- Implement a true hybrid testing approach

**Current Status:** Partially Implemented (BDD + Documentation)
**BDD Tests:** http://localhost:8080/api/health (all passing)
**OpenAPI Docs:** http://localhost:8080/swagger/
**OpenAPI Spec:** http://localhost:8080/swagger/doc.json

**Proposed by:** Arcodange Team
**Implemented:** 2026-04-05
**Last Updated:** 2026-04-05
**Status:** Production Ready for Current Needs