ISO 13485, EU MDR, FDA 21 CFR 820. Design controls through post-market surveillance, with traceability that auditors actually believe.

The auditor asked one question: "Show me how requirement 47 traces to verification."
Forty-five minutes later, the team was still searching. The requirement lived in a Word document. The test protocol was a PDF somewhere else. The results were in a spreadsheet that hadn't been updated in months. The trace matrix—an Excel file someone maintained between audits—was fiction.
The finding: "Traceability between design requirements and verification is not adequately maintained."
That finding cascaded. No traceability means no design control. No design control means you can't demonstrate the device meets the general safety and performance requirements. Major nonconformance. Certification at risk. Eight months of remediation.
MDR doesn't accept "it's in there somewhere."
Traditional design control is document management with traceability bolted on afterward. Requirements in Word. Design outputs in CAD. Test protocols somewhere else. Results in spreadsheets. An overworked QA engineer maintains the trace matrix—a separate artifact that must be manually synchronized every time anything changes. Every "someone must remember" is a failure point.
Seal inverts this. Traceability isn't a matrix you maintain—it's how work gets done. Create a verification protocol and you link it to the requirement it tests. Record results and they're linked to the protocol. The chain is structural, not documentary.
Click requirement 47. See its rationale, the design reviews that evaluated it, its risk controls. Click "traces to" and see the design output, the verification protocol, the test results. Forward and backward traceability are the same data viewed from different directions.
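In code terms, the inversion looks something like this. A minimal sketch, not Seal's actual data model; every name here is illustrative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TraceLink:
    source_id: str
    target_id: str
    relation: str

class TraceGraph:
    def __init__(self):
        self.links: list[TraceLink] = []

    def add(self, source_id: str, target_id: str, relation: str) -> None:
        # Links are created as part of the work itself, never as a separate matrix.
        self.links.append(TraceLink(source_id, target_id, relation))

    def traces_from(self, item_id: str) -> list[TraceLink]:
        # Forward traceability: edges leaving this item.
        return [l for l in self.links if l.source_id == item_id]

    def traces_to(self, item_id: str) -> list[TraceLink]:
        # Backward traceability: the same edges, filtered the other way.
        return [l for l in self.links if l.target_id == item_id]

g = TraceGraph()
g.add("DO-108", "REQ-047", "implements")   # design output, linked at creation
g.add("VP-012", "REQ-047", "verifies")     # protocol, linked to what it tests
g.add("TR-330", "VP-012", "records")       # results, linked to the protocol

print(g.traces_to("REQ-047"))  # everything that implements or verifies requirement 47
```

One edge set, two query directions. Forward and backward traceability can't drift apart, because there's nothing to synchronize.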
Design reviews work the same way. A design review isn't meeting minutes—it's a record that links to everything evaluated at that moment: verification status, open issues, risk analysis state. When an auditor asks "what was the design status at Phase 2 review?", the answer is the review record itself, with links to the exact evidence that existed then.
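A sketch of the same idea for reviews, with hypothetical field names: the record freezes references to the evidence that existed at review time.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class DesignReview:
    phase: str
    held_at: datetime
    verification_status: dict    # protocol id -> "passed" / "open" at review time
    open_issues: tuple
    risk_file_revision: str      # exact revision of the risk analysis evaluated

review = DesignReview(
    phase="Phase 2",
    held_at=datetime(2024, 3, 14, tzinfo=timezone.utc),
    verification_status={"VP-012": "passed", "VP-019": "open"},
    open_issues=("ISS-77",),
    risk_file_revision="RMF-rev-C",
)
# "What was the design status at Phase 2 review?" is answered by the record itself.
```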
ISO 14971 requires identifying hazards, analyzing hazardous situations, estimating risk, implementing controls, and evaluating residual risk. Most organizations treat this as documentation—a Risk Management File assembled during development and updated reluctantly.
The problem: risk isn't static. Field experience reveals hazards you didn't anticipate. Design changes introduce new failure modes. A Risk Management File that's accurate at approval becomes fiction over time.
Seal treats risk as live analysis. Hazards link to hazardous situations. Situations link to potential harms. Risk controls reduce probability or severity—and link to the verification that proves they work. When a field complaint suggests a new hazard, add it—the system shows what else is affected. "Show me all hazards with severity 4+ and inadequate risk control" is a query, not a research project.
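As a hedged sketch (fields, IDs, and the severity scale are illustrative), the query reads directly off the structure:

```python
from dataclasses import dataclass, field

@dataclass
class Hazard:
    id: str
    situation: str               # the hazardous situation it links to
    harm: str                    # the potential harm that situation links to
    severity: int                # 1..5
    control_ids: list = field(default_factory=list)
    controls_verified: bool = False   # linked verification proves the controls work

hazards = [
    Hazard("HAZ-01", "air-in-line", "embolism", severity=5,
           control_ids=["RC-04"], controls_verified=True),
    Hazard("HAZ-09", "alarm inaudible", "delayed intervention", severity=4),
]

# "All hazards with severity 4+ and inadequate risk control" as a query:
flagged = [h for h in hazards
           if h.severity >= 4 and not (h.control_ids and h.controls_verified)]
print([h.id for h in flagged])  # ['HAZ-09']
```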
The benefit-risk determination that MDR requires generates from this structure. Clinical benefits documented with evidence. Risks characterized with probability and severity. The comparison traceable to source data, not an assertion in a document.
Three weeks before submission, most companies panic. Pull documents from SharePoint, shared drives, email. Wonder which version is current. Assemble 2,400 pages manually. Discover the clinical evaluation references outdated risk analysis. Scramble. Fix inconsistencies. Create new ones.
This isn't preparation—it's reconstruction. And it's instantly outdated because development continued while you assembled.
Seal maintains Technical Documentation continuously. Annex II sections map to structured data: device description from your master record, design information from your DHF, GSPR compliance from requirements traceability, risk analysis from your living risk management, clinical evaluation from your CER with linked literature and clinical data.
Generate Technical Documentation and you're rendering, not assembling. Current state in seconds. When the Notified Body asks for an update six months later, generate again. Same process. Current data.
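A minimal sketch of the mapping, with hypothetical source names standing in for the live records (the section list follows the description above, not the full Annex II table of contents):

```python
# Each section maps to a function over structured data, not to a stored document.
SECTION_SOURCES = {
    "Device description":   lambda qms: qms["device_master_record"],
    "Design information":   lambda qms: qms["design_history_file"],
    "GSPR compliance":      lambda qms: qms["gspr_traceability"],
    "Risk management":      lambda qms: qms["risk_management_file"],
    "Clinical evaluation":  lambda qms: qms["clinical_evaluation_report"],
}

def render_technical_documentation(qms: dict) -> str:
    # Rendering, not assembling: each run reflects the current state.
    return "\n\n".join(f"{title}\n{source(qms)}"
                       for title, source in SECTION_SOURCES.items())
```

Regenerating for the Notified Body six months later is the same function call over newer data.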
Clinical Evaluation Reports update incrementally as literature searches run and PMCF data accumulates. One source of truth serves multiple markets—EU MDR, FDA 510(k), Health Canada—different formats pulling from the same data.
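Incremental updating is, at bottom, a date filter over linked evidence. A sketch with hypothetical record shapes:

```python
from datetime import date

def evidence_since(records: list, last_cer_revision: date) -> list:
    """New literature and PMCF records to appraise in the next CER revision."""
    return [r for r in records if r["date"] > last_cer_revision]

records = [
    {"id": "LIT-2024-118", "kind": "literature", "date": date(2024, 6, 2)},
    {"id": "PMCF-0031",    "kind": "pmcf",       "date": date(2024, 1, 10)},
]
print(evidence_since(records, last_cer_revision=date(2024, 3, 1)))
# Only LIT-2024-118 needs appraisal; the rest of the CER stands.
```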
Post-market surveillance (PMS) under MDR isn't passive collection. It's active surveillance with teeth.
Most organizations struggle because PMS data comes from everywhere: complaints through customer service, vigilance reports through regulatory affairs, literature through medical affairs, registry data through clinical, PMCF through R&D. Different systems. Different formats. Pattern detection requires manual aggregation.
Seal consolidates sources into unified surveillance. Complaints enter as structured data—product, event, outcome—enabling trend analysis without manual review. Literature monitoring flags relevant publications. Registry data imports with real-world metrics.
The system analyzes continuously. When alarm-related complaints exceed normal variation, you know immediately—not because someone noticed while reviewing spreadsheets, but because the system detected the pattern.
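A toy version of that pattern check, using a generic control limit rather than Seal's actual statistics; the counts and threshold are illustrative:

```python
from statistics import mean, stdev

# Structured complaints make event counts directly queryable.
complaints_per_week = {"alarm failure": [2, 1, 3, 2, 2, 9]}

def signal_detected(counts: list, sigma: float = 3.0) -> bool:
    """Flag the latest week if it exceeds the baseline mean + sigma * stdev."""
    baseline, latest = counts[:-1], counts[-1]
    return latest > mean(baseline) + sigma * stdev(baseline)

for event, counts in complaints_per_week.items():
    if signal_detected(counts):
        print(f"Signal: '{event}' complaints exceed normal variation")
```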
When a signal crosses a threshold, action follows. FSCA documentation generates pre-populated with affected lots, distribution records, complaint history. You know which 12,000 units shipped, which hospitals have them, and what to do next.
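Pre-population is a join over records that already exist. A sketch with hypothetical lot and distribution data:

```python
# Affected lots come from the complaint and production links above.
affected_lots = {"LOT-2209", "LOT-2213"}

distribution = [
    {"lot": "LOT-2209", "units": 7000, "consignee": "Hospital A"},
    {"lot": "LOT-2213", "units": 5000, "consignee": "Hospital B"},
    {"lot": "LOT-2301", "units": 4000, "consignee": "Hospital C"},
]

fsca_scope = [d for d in distribution if d["lot"] in affected_lots]
total_units = sum(d["units"] for d in fsca_scope)
consignees = sorted({d["consignee"] for d in fsca_scope})
print(total_units, consignees)  # which units shipped, and to whom
```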
Same auditor. Same question: "Show me how requirement 47 traces to verification."
One click.
The requirement appears with its design rationale. Click through: design output, verification protocol, test results (±2.1% flow accuracy, within spec), the design review that evaluated it, the risk controls it satisfies, the GSPR mapping.
"Post-market data?" Surveillance dashboard: complaint trends, literature monitoring, registry data. No signal detected.
"Technical Documentation?" Generate. 2,400 pages in 30 seconds. Every section current. Every source linked.
"Risk analysis?" Current state with all hazards, controls, and residual risk. Updated last week when a field complaint suggested a new use error—added, analyzed, controlled, verified.
The auditor nods. Moves on.
No finding.
