Step 4 · Test

Goal

Verify the feature works correctly, securely, and at the edges. No feature ships without passing this gate.

Instructions

You are in workflow step 4 of the feature-cycle. Implementation is complete. Now verify it thoroughly.


Tasks to Perform

1. Run the Full Test Suite

[test runner command]

# Expected: all tests pass, including any newly written tests

If any test fails: fix the root cause. Do not move forward with failing tests.

2. Acceptance Criteria Verification

Open docs/features/FEATURE-NAME.md. Find the ## Acceptance Criteria section.

For each criterion, verify it passes — ideally with an automated test, otherwise with a documented manual verification:

Acceptance Criterion                              Test Type                 Status
Given [condition], When [action], Then [result]   Unit/Integration/Manual   ✅/❌
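Where a criterion can be scripted, a small harness keeps the table honest. A minimal sketch in shell; the criterion names and the `true`/`false` stand-ins below are placeholders, not real checks:

```shell
#!/bin/sh
# Record a pass/fail row per acceptance criterion.
# Each call runs an arbitrary check command; replace the
# stand-ins below with real commands against your feature.
check_criterion() {
  name=$1; shift
  if "$@" >/dev/null 2>&1; then
    printf '%s\tPASS\n' "$name"
  else
    printf '%s\tFAIL\n' "$name"
  fi
}

check_criterion "valid input accepted" true    # stand-in check
check_criterion "empty input rejected" false   # stand-in check
```

Keep the output as the verification record in the feature doc for criteria that have no automated test yet.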

3. Edge Case Testing

For every input the feature accepts, test the boundaries:

  • [ ] Empty / null / undefined — what happens with no data?
  • [ ] Maximum size — very long strings, large files, maximum numbers
  • [ ] Minimum size — zero-length, 0, negative numbers where relevant
  • [ ] Special characters — <script>alert(1)</script>, ' OR 1=1 --, ../../../etc/passwd
  • [ ] Concurrent execution — what if two requests come in simultaneously?
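The boundaries above can be probed with one small loop. A sketch, assuming a JSON API that accepts a `name` field at `$BASE_URL/api/new-endpoint`; the base URL, path, and field name are all placeholders to adapt:

```shell
#!/bin/sh
# Send each boundary payload and report the HTTP status.
# Invalid input should yield 400/422; any 500 is a defect.
BASE_URL=${BASE_URL:-http://localhost:3000}

probe() {
  payload=$1
  status=$(curl -s -m 5 -o /dev/null -w '%{http_code}' -X POST \
    -H 'Content-Type: application/json' -d "$payload" \
    "$BASE_URL/api/new-endpoint" 2>/dev/null) || status=000
  printf '%s  %s\n' "$status" "$payload"
}

probe '{"name": ""}'                                      # empty
probe '{"name": null}'                                    # null
probe "{\"name\": \"$(printf 'a%.0s' $(seq 1 5000))\"}"   # very long string
probe '{"name": "<script>alert(1)</script>"}'             # XSS probe
probe "{\"name\": \"' OR 1=1 --\"}"                       # SQLi probe
```

Concurrency is harder to script portably; launching two `probe` calls with `&` and checking both results is a rough first pass.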

4. Security Testing

Run the security checklist against the new feature:

  • [ ] Can an unauthenticated user access this feature? (should return 401)
  • [ ] Can a user access another user's data via this feature? (IDOR test)
  • [ ] Does the feature accept any HTML/script input and render it back? (XSS test)

# Test without authentication
curl -s -i -X POST https://localhost/api/new-endpoint -d '{"test": "value"}' | head -5
# Expected: first line shows 401 Unauthorized

# Test with another user's resource ID
# Login as User A, then request User B's resource ID
# Expected: 403 Forbidden or 404 Not Found
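The IDOR comment above can be turned into a script. A sketch; bearer-token auth, `$USER_A_TOKEN`, `$USER_B_RESOURCE_ID`, and the endpoint path are assumptions about your stack:

```shell
#!/bin/sh
# Authenticate as user A, request a resource owned by user B.
# A 403 or 404 passes; a 200 means user A can read B's data.
BASE_URL=${BASE_URL:-http://localhost:3000}

idor_check() {
  status=$(curl -s -m 5 -o /dev/null -w '%{http_code}' \
    -H "Authorization: Bearer $USER_A_TOKEN" \
    "$BASE_URL/api/new-endpoint/$USER_B_RESOURCE_ID" 2>/dev/null) || status=000
  case $status in
    403|404) echo "IDOR check passed ($status)" ;;
    *)       echo "IDOR check FAILED ($status)" ;;
  esac
}

idor_check
```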

5. Performance Baseline

For endpoints that return collections or perform complex operations:

# Check query count (enable query logging first)
# Make a request and count how many DB queries ran
# Expected: < 10 queries for a typical list endpoint

# Check response time
curl -o /dev/null -s -w "Total: %{time_total}s\n" https://localhost/api/new-endpoint
# Expected: < 500ms for simple operations
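A single timing is noisy. Taking the median of a few runs gives a steadier number to hold against the 500ms budget; the endpoint below is the same assumed placeholder as above:

```shell
#!/bin/sh
# Time five requests and report the median, to smooth out
# cold-cache and network jitter before comparing to the budget.
BASE_URL=${BASE_URL:-http://localhost:3000}

median_time() {
  for i in 1 2 3 4 5; do
    curl -o /dev/null -s -m 5 -w '%{time_total}\n' "$1" 2>/dev/null || echo 999
  done | sort -n | sed -n 3p   # 3rd of 5 sorted values = median
}

median_time "$BASE_URL/api/new-endpoint"
```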

6. Regression Check

Verify that the feature did not break existing behavior:

# Run the entire test suite, not just new tests
[full test runner command]

# Check that existing API responses didn't change shape
# (if you have contract tests or snapshot tests, run them)
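Without formal contract tests, a naive snapshot diff still catches accidental shape changes. A sketch; `snapshots/new-endpoint.json` is a hypothetical file saved from a known-good run:

```shell
#!/bin/sh
# Compare the live response against a stored known-good snapshot.
# Any diff means the response changed; review it before shipping.
BASE_URL=${BASE_URL:-http://localhost:3000}

snapshot_check() {
  endpoint=$1; snapshot=$2
  if curl -s -m 5 "$BASE_URL$endpoint" 2>/dev/null \
      | diff -q - "$snapshot" >/dev/null 2>&1; then
    echo "unchanged: $endpoint"
  else
    echo "CHANGED: $endpoint"
  fi
}

snapshot_check /api/new-endpoint snapshots/new-endpoint.json
```

Volatile fields (timestamps, generated IDs) will trip this check; filter them out with a JSON processor before diffing.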

Defect Standards

Any defect found during this step:

  1. Write a failing test that reproduces it (before fixing)
  2. Fix the root cause
  3. Verify the test now passes
  4. Commit: fix: [what broke] — found during feature testing
  5. Add any follow-up tasks to TODO.md:
    - [ ] fix: [what broke] — root cause: [one sentence] _(ref: workflows/feature-cycle/04-test.md)_
  6. Status rules: [ ] = not started · [~] = in progress (one at a time) · [x] = done (prefix the date)
  7. Continue verification

Expected Output

  • All acceptance criteria verified and documented
  • All edge cases tested
  • Security checks passed
  • Any bugs found during testing are fixed and committed

Exit Criteria

This step is complete when:

  • [ ] All acceptance criteria are passing
  • [ ] Full test suite passes
  • [ ] Security checks passed
  • [ ] Performance is acceptable
  • [ ] No known defects remain open
  • [ ] Ready to proceed to Step 5 (Release)