Compliance & Traceability

DO-178C Test Traceability with TofuPilot

Learn how to maintain DO-178C compliant test traceability for airborne software using TofuPilot's structured test records and requirement mapping.

Julien Buteau
Advanced · 11 min read · March 14, 2026


DO-178C (Software Considerations in Airborne Systems and Equipment Certification) requires complete traceability from requirements to test cases to test results. For hardware-software integrated systems, this means linking every test measurement to the requirement it verifies. TofuPilot provides the structured test data layer.

DO-178C and Testing

DO-178C defines five Design Assurance Levels (DAL), from A (catastrophic) to E (no safety effect). Higher levels require more rigorous testing and documentation.

| DAL | Failure condition | Testing rigor |
|-----|-------------------|---------------|
| A | Catastrophic | Full MC/DC coverage, independence |
| B | Hazardous | Full decision coverage |
| C | Major | Full statement coverage |
| D | Minor | Basic testing |
| E | No effect | Minimal |

Regardless of DAL, all levels require traceability between requirements, test cases, and test results.

The Traceability Chain

Requirements → Test Cases → Test Procedures → Test Results
     ↓             ↓               ↓               ↓
   DOORS       Test Plan       TofuPilot       TofuPilot
  or Jama        (doc)         Procedure      Run Results

TofuPilot handles the right half: test procedures (defined as procedure IDs with steps and measurements) and test results (actual run data with pass/fail).

Mapping Requirements to Test Procedures

Use a consistent naming convention that links TofuPilot procedures to requirements.

do178c_test.py
from tofupilot import TofuPilotClient

client = TofuPilotClient()

# Procedure ID includes the requirement reference
client.create_run(
    procedure_id="TC-SW-REQ-042-AIRSPEED-COMPUTATION",
    unit_under_test={
        "serial_number": "ADC-UNIT-007",
        "part_number": "AIR-DATA-COMPUTER-V2",
    },
    run_passed=True,
    steps=[{
        "name": "Airspeed Computation Accuracy",
        "step_type": "measurement",
        "status": True,
        "measurements": [
            {"name": "indicated_airspeed_error_kts", "value": 0.8, "unit": "kts", "limit_high": 2.0},
            {"name": "true_airspeed_error_kts", "value": 1.2, "unit": "kts", "limit_high": 3.0},
            {"name": "mach_number_error", "value": 0.002, "unit": "Mach", "limit_high": 0.005},
        ],
    }, {
        "name": "Airspeed Range Verification",
        "step_type": "measurement",
        "status": True,
        "measurements": [
            {"name": "min_airspeed_kts", "value": 30, "unit": "kts", "limit_high": 40},
            {"name": "max_airspeed_kts", "value": 450, "unit": "kts", "limit_low": 400},
        ],
    }],
)

Requirements Traceability Matrix

| Req ID | Requirement | Test Procedure | TofuPilot Measurement | Limit |
|--------|-------------|----------------|-----------------------|-------|
| SW-REQ-042 | Airspeed computation accuracy < 2 kts | TC-SW-REQ-042 | indicated_airspeed_error_kts | < 2.0 kts |
| SW-REQ-043 | True airspeed accuracy < 3 kts | TC-SW-REQ-042 | true_airspeed_error_kts | < 3.0 kts |
| SW-REQ-044 | Mach computation accuracy < 0.005 | TC-SW-REQ-042 | mach_number_error | < 0.005 |
| SW-REQ-045 | Operate from 40 to 400 kts | TC-SW-REQ-042 | min/max_airspeed_kts | 40 / 400 kts |
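A matrix like this can be sanity-checked automatically: given the requirement-to-measurement mapping, verify that every expected measurement name actually appears in the run data pulled from TofuPilot. This is a minimal sketch; `missing_requirements` and the example `rtm` mapping are hypothetical helpers, not part of the TofuPilot API.

```python
# Hypothetical coverage check: flag requirements whose mapped
# measurement never appears in the collected run data.
def missing_requirements(rtm, run_measurement_names):
    """rtm maps requirement ID -> expected measurement name."""
    return [req for req, meas in rtm.items()
            if meas not in run_measurement_names]

rtm = {
    "SW-REQ-042": "indicated_airspeed_error_kts",
    "SW-REQ-043": "true_airspeed_error_kts",
    "SW-REQ-044": "mach_number_error",
}

# Measurement names extracted from a run (here, mach_number_error is missing)
measured = {"indicated_airspeed_error_kts", "true_airspeed_error_kts"}
print(missing_requirements(rtm, measured))  # SW-REQ-044 is uncovered
```

Running this against every release candidate turns the RTM from a static document into a gate you can enforce before certification submission.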

Hardware-Software Integration Testing

DO-178C Section 6.4 covers hardware/software integration testing. These tests verify that the software works correctly on the target hardware.

hwsw_integration.py
# Hardware/Software integration test
client.create_run(
    procedure_id="HWSW-INT-ADC-ARINC429",
    unit_under_test={"serial_number": "ADC-UNIT-007"},
    run_passed=True,
    steps=[{
        "name": "ARINC 429 Output Verification",
        "step_type": "measurement",
        "status": True,
        "measurements": [
            {"name": "label_airspeed_rate_hz", "value": 25.0, "unit": "Hz", "limit_low": 24.5, "limit_high": 25.5},
            {"name": "label_altitude_rate_hz", "value": 12.5, "unit": "Hz", "limit_low": 12.0, "limit_high": 13.0},
            {"name": "ssm_status_normal", "value": 1, "unit": "bool", "limit_low": 1},
            {"name": "data_latency_ms", "value": 18.2, "unit": "ms", "limit_high": 40.0},
        ],
    }, {
        "name": "Watchdog Timer Verification",
        "step_type": "measurement",
        "status": True,
        "measurements": [
            {"name": "watchdog_timeout_ms", "value": 50, "unit": "ms", "limit_low": 45, "limit_high": 55},
            {"name": "reset_recovery_ms", "value": 120, "unit": "ms", "limit_high": 200},
        ],
    }],
)

Regression Testing

When software is modified, DO-178C requires regression testing to verify no unintended effects. TofuPilot makes regression tracking straightforward:

  1. Run the full test suite on the new software version
  2. Compare results against the baseline (previous version) in TofuPilot
  3. Flag any measurements that changed beyond expected tolerance
regression_check.py
# Compare test results across software versions
baseline_runs = client.get_runs(
    procedure_id="TC-SW-REQ-042-AIRSPEED-COMPUTATION",
    limit=10,
)

# Filter by software version through unit metadata or date range,
# then compare measurement values between versions. Assuming each
# version's measurements are reduced to name -> value pairs:
def drifted(baseline, candidate, tolerance=0.1):
    """Return measurement names that moved beyond the tolerance."""
    return [name for name, value in candidate.items()
            if abs(value - baseline.get(name, value)) > tolerance]

If a measurement that was 0.8 kts error in version 2.1 becomes 1.9 kts error in version 2.2, the regression test catches it before certification submission.

DER/Auditor Evidence Package

When presenting test evidence to a DER (Designated Engineering Representative) or certification authority:

| Document | Source |
|----------|--------|
| Requirements Traceability Matrix | Your requirements tool + TofuPilot procedure mapping |
| Test Procedures | TofuPilot procedure definitions with steps and limits |
| Test Results | TofuPilot run data with measurements and pass/fail |
| Test Coverage Analysis | Map of requirements to TofuPilot procedures |
| Regression Test Report | TofuPilot comparison between software versions |

TofuPilot provides the structured, timestamped test evidence. Your certification package references TofuPilot data as the authoritative test record.
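Assembling that evidence can be scripted: pull the run data, keep only the fields an auditor needs, and recover the requirement ID from the procedure naming convention. The sketch below uses a stand-in `run` dict; `evidence_record` is a hypothetical helper, and in practice the payload would come from `client.get_runs(...)` as shown earlier.

```python
import json

# Stand-in for a run payload fetched from TofuPilot
run = {
    "procedure_id": "TC-SW-REQ-042-AIRSPEED-COMPUTATION",
    "serial_number": "ADC-UNIT-007",
    "passed": True,
    "measurements": [
        {"name": "indicated_airspeed_error_kts", "value": 0.8, "limit_high": 2.0},
    ],
}

def evidence_record(run):
    # Recover the requirement ID from the naming convention,
    # e.g. "TC-SW-REQ-042-..." -> "SW-REQ-042"
    req_id = "-".join(run["procedure_id"].split("-")[1:4])
    return {
        "requirement": req_id,
        "procedure": run["procedure_id"],
        "unit": run["serial_number"],
        "result": "PASS" if run["passed"] else "FAIL",
        "measurements": run["measurements"],
    }

print(json.dumps(evidence_record(run), indent=2))
```

Emitting one such record per run gives the DER a flat, reviewable file while TofuPilot remains the authoritative source it references.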

DO-254 Hardware Testing

For DO-254 (hardware assurance), the same traceability principles apply. Hardware requirements map to hardware test procedures, which map to test results in TofuPilot.

Common DO-254 test types tracked in TofuPilot:

  • FPGA functional verification
  • Environmental qualification
  • EMI/EMC compliance
  • Power supply characterization
  • Timing and performance verification

Use the same procedure naming convention (requirement ID in the procedure ID) for consistent traceability across hardware and software testing.
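A hardware run then looks structurally identical to the software examples above. This sketch builds the payload as a plain dict; the requirement ID `HW-REQ-101` and the measurement names are hypothetical, chosen only to illustrate the convention.

```python
# DO-254 hardware run payload using the same naming convention:
# the hardware requirement ID is embedded in the procedure ID.
fpga_run = {
    "procedure_id": "TC-HW-REQ-101-FPGA-TIMING",
    "unit_under_test": {"serial_number": "ADC-UNIT-007"},
    "run_passed": True,
    "steps": [{
        "name": "FPGA Timing Verification",
        "step_type": "measurement",
        "status": True,
        "measurements": [
            {"name": "clock_jitter_ps", "value": 42.0, "unit": "ps",
             "limit_high": 100.0},
        ],
    }],
}

# The requirement ID stays recoverable from the procedure ID for the RTM
assert "HW-REQ-101" in fpga_run["procedure_id"]
# In practice: client.create_run(**fpga_run)
```

Because hardware and software runs share one schema and one naming rule, a single traceability script can cover the full DO-178C/DO-254 evidence set.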
