DO-178C Test Traceability with TofuPilot
DO-178C (Software Considerations in Airborne Systems and Equipment Certification) requires complete traceability from requirements to test cases to test results. For hardware-software integrated systems, this means linking every test measurement to the requirement it verifies. TofuPilot provides the structured test data layer.
DO-178C and Testing
DO-178C defines five Design Assurance Levels (DAL), from A (catastrophic) to E (no safety effect). Higher levels require more rigorous testing and documentation.
| DAL | Failure condition | Testing rigor |
|---|---|---|
| A | Catastrophic | Full modified condition/decision (MC/DC) coverage, with independence |
| B | Hazardous | Full decision coverage |
| C | Major | Full statement coverage |
| D | Minor | Basic testing |
| E | No effect | Minimal |
Regardless of DAL, all levels require traceability between requirements, test cases, and test results.
The Traceability Chain
Requirements  →  Test Cases  →  Test Procedures  →  Test Results
     ↓              ↓                ↓                   ↓
   DOORS        Test Plan        TofuPilot           TofuPilot
  or Jama         (doc)          Procedure          Run Results
TofuPilot handles the right half: test procedures (defined as procedure IDs with steps and measurements) and test results (actual run data with pass/fail).
Mapping Requirements to Test Procedures
Use a consistent naming convention that links TofuPilot procedures to requirements.
from tofupilot import TofuPilotClient
client = TofuPilotClient()
# Procedure ID includes the requirement reference
client.create_run(
procedure_id="TC-SW-REQ-042-AIRSPEED-COMPUTATION",
unit_under_test={
"serial_number": "ADC-UNIT-007",
"part_number": "AIR-DATA-COMPUTER-V2",
},
run_passed=True,
steps=[{
"name": "Airspeed Computation Accuracy",
"step_type": "measurement",
"status": True,
"measurements": [
{"name": "indicated_airspeed_error_kts", "value": 0.8, "unit": "kts", "limit_high": 2.0},
{"name": "true_airspeed_error_kts", "value": 1.2, "unit": "kts", "limit_high": 3.0},
{"name": "mach_number_error", "value": 0.002, "unit": "Mach", "limit_high": 0.005},
],
}, {
"name": "Airspeed Range Verification",
"step_type": "measurement",
"status": True,
"measurements": [
{"name": "min_airspeed_kts", "value": 30, "unit": "kts", "limit_high": 40},
{"name": "max_airspeed_kts", "value": 450, "unit": "kts", "limit_low": 400},
],
}],
)
Requirements Traceability Matrix
| Req ID | Requirement | Test Procedure | TofuPilot Measurement | Limit |
|---|---|---|---|---|
| SW-REQ-042 | Airspeed computation accuracy < 2 kts | TC-SW-REQ-042 | indicated_airspeed_error_kts | < 2.0 kts |
| SW-REQ-043 | True airspeed accuracy < 3 kts | TC-SW-REQ-042 | true_airspeed_error_kts | < 3.0 kts |
| SW-REQ-044 | Mach computation accuracy < 0.005 | TC-SW-REQ-042 | mach_number_error | < 0.005 |
| SW-REQ-045 | Operate from 40 to 400 kts | TC-SW-REQ-042 | min/max_airspeed_kts | 40/400 |
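Because the requirement ID is embedded in the procedure ID, a matrix like this can be cross-checked programmatically against run data. A minimal sketch — the `rtm` mapping and the `run` dict below are illustrative placeholders, not TofuPilot's actual API response schema:

```python
# Cross-check a requirements traceability matrix against one run's data.
# The rtm mapping and run structure are illustrative, not TofuPilot's
# actual response format.

rtm = {
    "SW-REQ-042": ("TC-SW-REQ-042-AIRSPEED-COMPUTATION", "indicated_airspeed_error_kts"),
    "SW-REQ-043": ("TC-SW-REQ-042-AIRSPEED-COMPUTATION", "true_airspeed_error_kts"),
    "SW-REQ-044": ("TC-SW-REQ-042-AIRSPEED-COMPUTATION", "mach_number_error"),
}

run = {
    "procedure_id": "TC-SW-REQ-042-AIRSPEED-COMPUTATION",
    "measurements": {
        "indicated_airspeed_error_kts": {"value": 0.8, "passed": True},
        "true_airspeed_error_kts": {"value": 1.2, "passed": True},
        "mach_number_error": {"value": 0.002, "passed": True},
    },
}

def coverage_gaps(rtm, run):
    """Return requirement IDs with no passing measurement in this run."""
    gaps = []
    for req_id, (proc_id, meas_name) in rtm.items():
        meas = run["measurements"].get(meas_name)
        if run["procedure_id"] != proc_id or meas is None or not meas["passed"]:
            gaps.append(req_id)
    return gaps

print(coverage_gaps(rtm, run))  # → []
```

An empty result means every requirement in the matrix is verified by a passing measurement; any requirement ID in the output is a coverage gap to resolve before the evidence package goes out.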
Hardware-Software Integration Testing
DO-178C Section 6.4 covers hardware/software integration testing. These tests verify that the software works correctly on the target hardware.
# Hardware/Software integration test
client.create_run(
procedure_id="HWSW-INT-ADC-ARINC429",
unit_under_test={"serial_number": "ADC-UNIT-007"},
run_passed=True,
steps=[{
"name": "ARINC 429 Output Verification",
"step_type": "measurement",
"status": True,
"measurements": [
{"name": "label_airspeed_rate_hz", "value": 25.0, "unit": "Hz", "limit_low": 24.5, "limit_high": 25.5},
{"name": "label_altitude_rate_hz", "value": 12.5, "unit": "Hz", "limit_low": 12.0, "limit_high": 13.0},
{"name": "ssm_status_normal", "value": 1, "unit": "bool", "limit_low": 1},
{"name": "data_latency_ms", "value": 18.2, "unit": "ms", "limit_high": 40.0},
],
}, {
"name": "Watchdog Timer Verification",
"step_type": "measurement",
"status": True,
"measurements": [
{"name": "watchdog_timeout_ms", "value": 50, "unit": "ms", "limit_low": 45, "limit_high": 55},
{"name": "reset_recovery_ms", "value": 120, "unit": "ms", "limit_high": 200},
],
}],
)
Regression Testing
When software is modified, DO-178C requires regression testing to verify no unintended effects. TofuPilot makes regression tracking straightforward:
- Run the full test suite on the new software version
- Compare results against the baseline (previous version) in TofuPilot
- Flag any measurements that changed beyond expected tolerance
# Compare test results across software versions
baseline_runs = client.get_runs(
procedure_id="TC-SW-REQ-042-AIRSPEED-COMPUTATION",
limit=10,
)
# Filter by software version through unit metadata or date range
# Compare measurement values between versions
If a measurement that was 0.8 kts error in version 2.1 becomes 1.9 kts error in version 2.2, the regression test catches it before certification submission.
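One way to implement that comparison, assuming each run's measurements have already been extracted into name → value dicts (building those dicts from `get_runs()` output depends on your TofuPilot client version's response shape):

```python
# Flag measurements that drifted beyond a tolerance between two software
# versions. Inputs map measurement name -> value; how you build them from
# TofuPilot run data depends on your client version.

def regression_deltas(baseline, candidate, rel_tolerance=0.10):
    """Return {name: (old, new)} for measurements that moved more than
    rel_tolerance relative to the baseline value, or disappeared."""
    flagged = {}
    for name, old in baseline.items():
        new = candidate.get(name)
        if new is None:
            flagged[name] = (old, None)  # measurement missing in new version
        elif old != 0 and abs(new - old) / abs(old) > rel_tolerance:
            flagged[name] = (old, new)
    return flagged

v21 = {"indicated_airspeed_error_kts": 0.8, "mach_number_error": 0.002}
v22 = {"indicated_airspeed_error_kts": 1.9, "mach_number_error": 0.002}

print(regression_deltas(v21, v22))
# → {'indicated_airspeed_error_kts': (0.8, 1.9)}
```

The 10% relative tolerance here is an arbitrary example; in practice the allowed drift per measurement would come from your regression test plan.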
DER/Auditor Evidence Package
When presenting test evidence to a DER (Designated Engineering Representative) or certification authority:
| Document | Source |
|---|---|
| Requirements Traceability Matrix | Your requirements tool + TofuPilot procedure mapping |
| Test Procedures | TofuPilot procedure definitions with steps and limits |
| Test Results | TofuPilot run data with measurements and pass/fail |
| Test Coverage Analysis | Map of requirements to TofuPilot procedures |
| Regression Test Report | TofuPilot comparison between software versions |
TofuPilot provides the structured, timestamped test evidence. Your certification package references TofuPilot data as the authoritative test record.
DO-254 Hardware Testing
For DO-254 (hardware assurance), the same traceability principles apply. Hardware requirements map to hardware test procedures, which map to test results in TofuPilot.
Common DO-254 test types tracked in TofuPilot:
- FPGA functional verification
- Environmental qualification
- EMI/EMC compliance
- Power supply characterization
- Timing and performance verification
Use the same procedure naming convention (requirement ID in the procedure ID) for consistent traceability across hardware and software testing.
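With requirement IDs embedded in procedure IDs, the requirement reference can be recovered mechanically when assembling a traceability matrix. A sketch, assuming IDs follow the `TC-<REQ-ID>-<DESCRIPTION>` pattern used earlier — the pattern is a convention you enforce in your own naming, not anything TofuPilot checks:

```python
import re

# Extract the requirement reference from a procedure ID that follows the
# TC-<REQ-ID>-<DESCRIPTION> convention shown earlier. The regex encodes an
# assumed naming scheme (e.g. SW-REQ-042 or HW-REQ-010 style IDs).
REQ_PATTERN = re.compile(r"^TC-([A-Z]+-REQ-\d+)")

def requirement_id(procedure_id):
    """Return the requirement ID embedded in a procedure ID, or None."""
    match = REQ_PATTERN.match(procedure_id)
    return match.group(1) if match else None

print(requirement_id("TC-SW-REQ-042-AIRSPEED-COMPUTATION"))  # → SW-REQ-042
print(requirement_id("HWSW-INT-ADC-ARINC429"))               # → None
```

A `None` result (as for the integration-test ID above) signals a procedure outside the convention — useful as a lint check that every requirement-verification procedure is named so its traceability can be reconstructed automatically.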