
Design Controls

Gaps found before audits. Not after.

The traceability matrix showed Input → Output → Verification for every requirement except one. Class I recall followed. Live traceability that exposes gaps before auditors do.

The Requirement Nobody Verified

510(k) cleared in 90 days. Six months later, a nurse called: the device was too hard to activate. Her patient's hand strength couldn't generate the required force. The patient was elderly, arthritic. The device was designed by engineers with healthy hands who never considered this use case.

The design input was there, buried in section 4.3.2 of the requirements document: "Activation force shall not exceed 2N." Copied from a predicate device. Nobody questioned whether 2N was the right number. Nobody built a test for it. The traceability matrix showed Input → Output → Verification for every other requirement. This one had a gap. A blank cell that nobody noticed until the FDA asked about it.

"How did you verify this requirement was met?" Silence. The verification protocol tested nineteen requirements. This one wasn't included. Someone assumed it would be obvious. Someone else assumed engineering had covered it. Nobody actually verified.

Class I recall. Public notification. Design modification. Resubmission. The gap cost eighteen months and a reputation.

Design control · every requirement traced, every test linked.

The traceability gap:
"Show me the test that covers requirement 47."
Hours rebuilding the trace matrix from scattered files.
FDA 21 CFR 820.30 citation: inadequate design verification.

The Seal approach:
Traceability built in, not bolted on.
Requirement → Design → Test → Result. One click.
Coverage gaps flagged automatically.

User needs (intended use · clinical context) → Input (requirements · specifications) → Output (drawings · software) → Verification (output vs. input: built correctly?) → Validation (device vs. needs: built the right thing?) → DHF (design history file)

Design review at each phase: formal gates with documented decisions.

Traceability matrix · auto-generated, always current:
REQ-001 → TEST-001 → Pass
REQ-002 → TEST-002 → Pass
REQ-003 → No test linked
Coverage: 97% · Gaps: 2

Fig. 1 — Design Control

The Documentation Trap

Most companies treat design controls as a documentation exercise. Requirements go in a Word document. Design outputs go in another Word document. A traceability matrix in Excel links them together. Manually updated when someone remembers. Verification protocols are written, executed, and filed. The Design History File is assembled before audits by someone who wasn't involved in the development, pulling documents from shared drives and hoping nothing is missing.

This approach creates the illusion of control. Documents exist. Signatures exist. But the connections are fragile. When requirements change, does someone update the traceability matrix? When verification protocols are written, does anyone check that every requirement has a corresponding test? When the DHF is compiled, does anyone verify it's complete?

The gaps aren't visible until an auditor walks the matrix. By then, it's too late.

Living Traceability

DHF · computed from live relationships · gaps visible immediately

Design inputs: 47 · all linked
Design outputs: 42 · 3 inputs have no output
Verification: 38 · 2 outputs have no V&V
Reviews: 8 · all signed
Risk items: 54 · all controlled
DHR records: 142 · live

Gaps · action items · inbox for design team:
REQ-031 · input without linked design output
REQ-108 · input without linked design output
REQ-214 · input without linked design output
OUT-184 · output with no verification protocol
OUT-229 · output with no verification protocol

Fig. 2 — Design History File

Seal makes design controls a living system where gaps become visible the moment they're created.

When you create a design input, it exists as a record in the system. Not a line in a document. When you create a design output, you link it to the inputs it addresses. The system knows which inputs have outputs and which don't. When you create a verification test, you link it to the outputs it verifies. The system knows which outputs have verification and which don't.

The traceability matrix isn't a spreadsheet someone maintains. It's a view computed from actual relationships. When an input has no output, the matrix shows a gap. When an output has no verification, the matrix shows a gap. You see the gap immediately, not when the auditor asks about it.
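A computed matrix of this kind is a small graph query, not a spreadsheet. The sketch below shows the idea; the record IDs and link structure are hypothetical examples, not Seal's actual data model:

```python
# A minimal sketch of a traceability matrix computed from linked records
# rather than maintained by hand. IDs and links are hypothetical.

inputs = {"REQ-001", "REQ-002", "REQ-003"}
outputs = {"OUT-001": {"REQ-001"}, "OUT-002": {"REQ-002"}}  # output -> inputs it addresses
tests = {"TEST-001": {"OUT-001"}}                           # test -> outputs it verifies

def coverage_gaps(inputs, outputs, tests):
    """Gaps fall out of the link graph the moment a record is created."""
    covered = set().union(*outputs.values()) if outputs else set()
    verified = set().union(*tests.values()) if tests else set()
    return {
        "inputs_without_output": sorted(inputs - covered),
        "outputs_without_verification": sorted(set(outputs) - verified),
    }

gaps = coverage_gaps(inputs, outputs, tests)
# gaps == {"inputs_without_output": ["REQ-003"],
#          "outputs_without_verification": ["OUT-002"]}
```

Because the matrix is derived from the links themselves, there is no separate artifact to fall out of date: delete a test and its output shows up as a gap on the next query.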

When requirements change, and they always change, the system propagates the impact. A changed input flags its linked outputs for review. Changed outputs flag their verification tests for reassessment. The traceability doesn't break silently; it highlights what needs attention.

Inputs: Where Everything Begins

Design inputs are the foundation. Everything else traces back to them. If your inputs are wrong, your outputs will be wrong. If your inputs are incomplete, your device will have gaps.

Inputs come from multiple sources. User needs define what problems the device solves and for whom. Clinical requirements specify outcomes, populations, and indications. Regulatory requirements flow from applicable standards. IEC 60601 for electrical safety, ISO 10993 for biocompatibility, specific FDA guidances for your device type. Risk inputs emerge from hazard analysis. The design must mitigate identified risks.

Each input should be specific and measurable. "Easy to use" isn't a design input. "Activation force shall not exceed 2N" is, though you'd better verify that 2N is actually appropriate for your user population. Each input should trace to a source: why is this a requirement? Each input needs a corresponding output and verification.
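As a structured record rather than a line in a document, such an input might look like the following sketch. The field names are illustrative assumptions, not Seal's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class DesignInput:
    # Illustrative fields only; not Seal's actual schema.
    id: str
    requirement: str    # specific, measurable statement
    source: str         # why this is a requirement
    acceptance: str     # objective pass/fail criterion
    linked_outputs: list = field(default_factory=list)

di = DesignInput(
    id="DI-001",
    requirement="Activation force shall not exceed 2 N",
    source="User research: target population hand strength",
    acceptance="All tested samples activate at <= 2 N",
)
# di.linked_outputs is empty: a visible gap from the moment the record exists
```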

Outputs: The Actual Design

Design outputs are what you actually designed. Specifications define what the device must do. Drawings define physical characteristics. Software architecture defines computational behavior. Manufacturing procedures define how it's built.

Each output addresses one or more inputs. The traceability should be explicit. This specification exists because of that requirement. When an output doesn't trace to an input, you have a question: why does this exist? When an input doesn't trace to an output, you have a gap: how is this requirement addressed?

Output documents are controlled. Versioned, approved, change-managed. When an output changes, its verification needs reassessment. When an output is approved, it's ready for verification.

Verification and Validation

Verification answers: do outputs meet inputs? You specified a maximum 2N activation force. Does the device actually activate at 2N or less? You measure, you document, you conclude. Verification is objective evidence that the design output meets the design input.

Validation answers: does the device meet user needs? You can verify every specification and still fail validation if your specifications were wrong. Verification tests what you designed. Validation tests whether what you designed is actually what users need.

The distinction matters. You can have a perfectly verified device that fails validation because you didn't understand your users. The nurse's patient couldn't activate the device not because it failed verification (it met the 2N spec) but because 2N was too much force for elderly, arthritic hands. The verification passed. Validation would have caught this if anyone had tested with representative users.

Design Reviews

Design reviews are cross-functional checkpoints. Engineering presents design status. Quality reviews documentation completeness. Regulatory assesses compliance. Manufacturing evaluates producibility. The review isn't a rubber stamp. It's a critical evaluation of whether the design is ready to proceed.

Each review has defined criteria. What should be complete at this stage? Are design inputs finalized? Are outputs documented? Are verifications planned? The review evaluates against criteria and documents findings.

Action items emerge from reviews. Gaps need closure. Questions need answers. The review doesn't close until action items are resolved. In Seal, action items link to the review, have owners and due dates, and track to completion. Reviews can't close with open items.

The Design History File

The DHF is the complete record of the design process. Inputs, outputs, reviews, verifications, validations, changes. For a 510(k), it's the evidence supporting your submission. For an audit, it's the story of how you designed this device.

In most companies, the DHF is assembled before audits. Someone pulls documents from various locations, checks them against a DHF index, and assembles a package. Documents are missing. Versions don't match. Signatures are illegible. The assembly takes days and the result is uncertain.

In Seal, the DHF compiles automatically. Design inputs are in the DHF because they were created in the system. Design outputs are in the DHF because they're linked to inputs. Verification results are in the DHF because they're linked to outputs. Reviews and their action items are in the DHF. Changes and their approvals are in the DHF.

The DHF is always current because it's generated from live data, not assembled from archives. When an auditor asks to see the DHF, you show them. Completely, accurately, immediately. When you prepare a 510(k), the design control evidence is ready because you've been maintaining it all along.
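One way to picture an auto-compiled DHF is as a report computed over live records. The section names and record shapes below are assumptions for illustration, not Seal's internals:

```python
# Sketch: a DHF completeness report computed from live, linked records.
# Section names and record shapes are illustrative assumptions.

dhf_records = {
    "design_inputs":  [{"id": "DI-001", "linked": True},
                       {"id": "DI-047", "linked": False}],
    "design_outputs": [{"id": "OUT-184", "linked": False}],
    "reviews":        [{"id": "DR-01", "linked": True}],
}

def dhf_report(records):
    """Counts and gap lists per section; generated, never assembled."""
    return {
        section: {"total": len(items),
                  "gaps": [r["id"] for r in items if not r["linked"]]}
        for section, items in records.items()
    }

report = dhf_report(dhf_records)
# report["design_inputs"] == {"total": 2, "gaps": ["DI-047"]}
```

The report is regenerated on demand, which is why the DHF can be shown to an auditor immediately instead of assembled over days.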

Design Changes

Designs change. User feedback reveals gaps. Testing discovers issues. Manufacturing identifies producibility problems. Regulatory requirements evolve. Changes are normal. Uncontrolled changes are dangerous.

Every design change follows design control. What inputs are affected? What outputs need revision? What verifications need re-execution? What validations need reassessment? The change doesn't just update a document. It updates the entire traceability chain.

In Seal, design changes link to the design records they affect. Impact assessment identifies downstream effects automatically. Changed outputs trigger verification reassessment. The system maintains traceability through changes, not despite them.
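That downstream impact assessment amounts to a walk over the traceability graph. This sketch uses a hypothetical link map to show the idea:

```python
# Sketch: change impact as a walk over the traceability graph.
# The link map is hypothetical; each key points to its downstream dependents.

downstream = {
    "REQ-002": ["OUT-002"],   # input -> outputs that address it
    "OUT-002": ["VER-002"],   # output -> tests that verify it
    "VER-002": [],
}

def impacted(changed, downstream):
    """Every record that needs reassessment after `changed` is revised."""
    seen, stack = set(), [changed]
    while stack:
        node = stack.pop()
        for child in downstream.get(node, []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return sorted(seen)

flagged = impacted("REQ-002", downstream)
# flagged == ["OUT-002", "VER-002"]
```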

Neil sets up your DHF structure from applicable standards

Tell Neil, Seal's AI, about your device: "Class II cardiovascular, subject to IEC 60601-1, IEC 60601-1-2 EMC, and ISO 10993 biocompatibility." Neil generates design inputs from those standards with appropriate verification approaches. Your DHF structure, requirement categories, and V&V templates are built from the regulatory context of your specific device, not a generic template.

Neil also works throughout development. Writing a requirement that says "device shall be easy to use"? Neil flags it: "This requirement isn't measurable. Consider adding acceptance criteria." Building your traceability matrix? Neil continuously scans for gaps: "Design input DI-047 has no linked output." Preparing the DHF for submission? Neil compiles from linked records and flags incomplete sections before you discover them during the filing scramble.

When your next device program starts, Neil configures it from the standards and predicate device data. Reusing what applies, flagging what's new. The DHF setup that took months for your first device takes days for the second.

The Gap That Isn't There

The auditor asks about activation force. You show them the design input, linked to user research documenting the target population's hand strength. You show them the design output, the specification that translates user needs into measurable criteria. You show them the verification protocol, the test method, the results, the conclusion. The traceability is complete because the system wouldn't let you proceed with a gap.

The requirement nobody verified? That's no longer possible. The system makes gaps visible the moment they're created. You fix them during development, not during a recall.

Capabilities

01 · Design Input Management
Capture user needs, clinical requirements, regulatory standards, and risk inputs. Each input is specific, measurable, and traceable.
02 · Traceability Matrix
Automatic linking of inputs to outputs to V&V. See gaps where inputs lack outputs or outputs lack verification.
03 · Design Reviews
Schedule reviews, track attendance, capture decisions and action items. Reviews don't close until actions are resolved.
04 · V&V Management
Manage verification and validation protocols, execution, and results. Link results to the requirements they verify.
05 · Design Transfer
Track readiness for manufacturing. Process validation, equipment qualification, training, DMR completion.
06 · Auto-Compiled DHF
Design History File compiles automatically from linked records. Always current, always complete, always audit-ready.
07 · Software Design Controls
IEC 62304 compliance built in. Software requirements, architecture, unit testing, integration testing. All traced with appropriate rigor by safety class.
08 · Risk-Driven Requirements
Hazards from risk analysis generate design inputs automatically. Risk controls trace through design outputs to verification evidence.
09 · AI Traceability Gaps
AI continuously scans your design controls for missing links. Inputs without outputs, outputs without verification, requirements without tests. Gaps surface immediately, not at audit.
10 · AI DHF Compilation
AI assembles Design History Files from linked records, identifies missing documentation, and generates completeness reports. DHF ready when you need it, not weeks before.

Entities

Entity · Kind · Description

Design Input · type · What the device must do. Everything traces back here. An input without a test is a gap waiting to be found.
User Need · template · What problems are users solving? Market research, clinical input, voice of customer.
UI-003 · instance · "Activation force ≤2N." Copied from predicate. Nobody built a test. Gap found at audit.
Clinical Requirement · template · Clinical outcomes, patient population, indications for use.
Regulatory Requirement · template · IEC 60601, ISO 10993, FDA guidance. Standards drive inputs.
Risk Mitigation · template · Hazards identified in risk analysis. Risk management generates design inputs.
Design Output · type · The actual design. Specs, drawings, software. Each output addresses inputs. Missing link? Something's wrong.
Product Specification · template · What the device must do. Performance, dimensions, materials.
Engineering Drawing · template · CAD files, schematics, assembly drawings.
Traceability Matrix · type · Input → Output → Verification. Gaps become visible the moment they're created, not when an auditor walks the matrix.
Verification · type · Do outputs meet inputs? Specs say X. Does device do X? Objective evidence.
VER-FORCE-001 · instance · Grip force testing. 50 samples, all <5N. Evidence attached. Input verified.
Validation · type · Does device meet user needs? You can verify specs and still fail validation if specs missed what users actually need.
Design History File · type · Complete record. In most companies, assembled before audits. In Seal, compiles automatically. Always current, always ready.
Design Review · type · Cross-functional checkpoint. R&D, Quality, Regulatory, Manufacturing. Reviews don't close until action items resolve.

FAQ

How do design changes fit into design control?
Design changes go through the same process as new development. Impact assessment, appropriate review, V&V updates if needed, traceability updates. The DHF captures the evolution of the design over time.

Can we import existing requirements?
Yes. Import design inputs from requirements management tools, Excel, or Word documents. Once imported, requirements become traceable records in the design control system.

How does risk management connect to design controls?
Risk management integrates directly. Hazards identified in risk analysis become design inputs (mitigation requirements). Design outputs address those inputs. Verification confirms the risk controls work. The risk file and DHF are connected.

How is software handled?
Software follows the same design control framework with additional rigor per IEC 62304. Software requirements, architecture, detailed design, unit testing, integration testing. All tracked with appropriate traceability.

How does design control support a 510(k)?
The DHF contains the evidence for your 510(k) submission. Predicate device comparison, substantial equivalence arguments, V&V summaries. All derived from the design control records you've maintained throughout development.

Can multiple teams work in parallel?
Yes. Role-based access controls determine who can create, edit, and approve design records. Teams can work in parallel on different subsystems while maintaining traceability to system-level requirements.

What about off-the-shelf components?
Commercial off-the-shelf components have their own design inputs. The specifications you require from the vendor. Verification confirms the COTS component meets those specifications. The supplier's design history isn't your responsibility; your use of their component is.

What about accessories and companion products?
Accessories and companion products have their own design control records linked to the primary device. Interface requirements flow between them. When the primary device changes, the system identifies potentially affected accessories.

How are post-market changes handled?
Post-market changes follow the same design control process. Impact assessment determines which inputs, outputs, and verifications need revision. Changes trace to complaint data or field feedback that initiated them. The DHF evolves throughout the product lifecycle.

Can Seal export submission packages?
Yes. The DHF contains the evidence for your submission. Predicate comparison, substantial equivalence arguments, V&V summaries. All derived from design control records. Export packages in FDA-expected formats.

Go live in 48 hours.