Should we replace our manual QA inspection line with machine vision? 200K units/month, 3 shifts, 0.2% defect rate, $400K budget for automation.
Decision
Do NOT do a full replacement. Deploy a Cognex In-Sight 9902 or Keyence CV-X480F machine vision system in PARALLEL with existing manual inspection for 6 months. Install one machine vision station inline BEFORE the manual inspection stations. Run both systems on 100% of units, logging the machine vision pass/fail alongside the human decision for every unit. After ~1.2M unit comparisons, make a data-driven go/no-go on full replacement.

Budget ($300K total, $100K buffer):
- $180K hardware (Cognex In-Sight 9902 + GigE camera + structured lighting dome)
- $40K integration
- $30K ViDi deep learning training on 12 defect types
- $50K reserve for lighting/model iteration

Success thresholds: ≥99.8% agreement with human inspectors AND ≥95% of human-caught defects detected, validated over ≥500K units.

Key failure mode: structured lighting fails on subtle waviness or discoloration due to alloy batch reflectance variation. Mitigated by the $50K reserve and by manual inspectors as backstop. Keep all 9 inspectors during validation; no union disruption.

Timeline:
- Weeks 1-6: procurement and install
- Months 2-4: parallel run + model training
- Months 5-6: statistical validation
- Month 7: go/no-go with real data
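The go/no-go check in months 5-6 reduces to two numbers computed over the parallel-run log. A minimal sketch, assuming each unit yields one (machine verdict, human verdict) pair; the record shape and function names here are illustrative, not from any vendor API:

```python
# Scoring sketch for the parallel-run validation. Assumes per-unit logs with
# the machine vision verdict and the human inspector verdict side by side.
from dataclasses import dataclass

@dataclass
class UnitRecord:
    unit_id: str
    mv_pass: bool      # machine vision verdict
    human_pass: bool   # manual inspector verdict

def score_parallel_run(records):
    """Return (agreement_rate, defect_catch_rate) for the go/no-go check."""
    total = len(records)
    agree = sum(r.mv_pass == r.human_pass for r in records)
    human_caught = [r for r in records if not r.human_pass]  # human flagged a defect
    mv_caught = sum(not r.mv_pass for r in human_caught)     # machine also flagged it
    agreement = agree / total
    catch_rate = mv_caught / len(human_caught) if human_caught else 1.0
    return agreement, catch_rate

def go_no_go(records, min_units=500_000):
    """Apply the decision's thresholds: >=99.8% agreement, >=95% catch, >=500K units."""
    agreement, catch = score_parallel_run(records)
    return len(records) >= min_units and agreement >= 0.998 and catch >= 0.95
```

Note that agreement alone is not enough: at a 0.2% defect rate, a machine that passes everything still agrees with humans 99.8% of the time, which is why the separate catch-rate threshold on human-caught defects matters.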
Next actions
Council notes
Evidence boundary
Observed from your filing
- Should we replace our manual QA inspection line with machine vision? 200K units/month, 3 shifts, 0.2% defect rate, $400K budget for automation.
Assumptions used for analysis
- Defects are visually detectable — all 12 defect types have surface-visible signatures amenable to camera-based inspection rather than requiring X-ray, ultrasonic, or functional testing
- Production line has physical space for an inline machine vision station with structured lighting dome before existing manual inspection stations
- The 0.2% defect rate is stable enough that 6 months of parallel data is representative of normal production variation including seasonal material batch changes
- Cognex ViDi deep learning module can be trained to adequate performance on 12 defect types within the $30K training budget allocation
- Current 9-inspector manual inspection is the primary cost center justifying automation — if inspectors perform other functions beyond visual inspection, the ROI case weakens
- Existing stack: not addressed in the filing; a greenfield deployment is assumed by default
- Data volume beyond the stated 200K units/month: not specified in the filing
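The representativeness assumption above can be sanity-checked with arithmetic. At a 0.2% defect rate, the ≥500K-unit validation window yields on the order of 1,000 defect examples, and the precision of the estimated catch rate depends on that count, not on the 500K total. A normal-approximation sketch (0.2% and 500K are from the filing; the 0.95 target is from the decision):

```python
# Rough arithmetic behind the validation-window size: how many defect
# examples does the window contain, and how tight is the estimate of the
# catch rate near the 95% threshold?
import math

def catch_rate_ci_halfwidth(units, defect_rate=0.002, target_catch=0.95, z=1.96):
    defects = units * defect_rate  # expected human-caught defects in the window
    # 95% confidence half-width for a proportion estimated on `defects` trials
    halfwidth = z * math.sqrt(target_catch * (1 - target_catch) / defects)
    return defects, halfwidth

defects, hw = catch_rate_ci_halfwidth(500_000)
# ~1000 expected defect examples; half-width is roughly +/-1.35 percentage points
```

So a measured catch rate of, say, 96% on 500K units would still be statistically consistent with a true rate just under the 95% threshold, which argues for treating the ≥500K figure as a floor rather than a target.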
Inferred candidate specifics
- Chosen path (branch b003): the 6-month parallel-run validation plan stated in full under Decision above
- Write an RFQ (Request for Quote) to Cognex and Keyence specifying: 200K units/month throughput, 0.2% defect rate across 12 defect types (enumerate them), inline pre-inspection station placement, GigE camera with structured lighting dome for specular/metallic surfaces, ViDi deep learning module, and conveyor interface requirements — with delivery within 6 weeks.
- Branch b003 had the highest confidence (0.82), survived all 3 adversarial rounds with strengthening from all models, named specific hardware (Cognex In-Sight 9902, Keyence CV-X480F), provided detailed budget breakdown, defined quantitative success thresholds, and identified specific failure modes. Branch b001 (0.70) was generic and failed the specificity gate. No surviving branch had higher confidence than b003.
- Evaluate off-the-shelf machine vision solutions generically
- Branch b001 lacked specificity — no named hardware, no deployment architecture, no success thresholds, no failure modes. It restated the question as a recommendation without answering HOW to proceed.
- Hybrid system with machine vision primary + 2 human inspectors per shift for flagged units
- Branch b002 (killed round 2) proposed immediate operational dependency on machine vision without validation data. Its 3-tier confidence classification is architecturally sound but premature — you cannot set confidence thresholds without the 6-month parallel run data that b003 generates first. Also risked throughput bottleneck if false positive rate exceeds 5%.
- Invest $400K in upstream SPC and defect prevention instead of inspection automation
Inferred specifics table
| Value | Kind | Basis | Where introduced |
|---|---|---|---|
| Cognex In-Sight 9902 or Keyence CV-X480F machine vision system | estimate | synthetic | chosen_path |
| 6-month parallel run alongside manual inspection | estimate | synthetic | chosen_path |
| Run both systems on 100% of units | threshold | synthetic | chosen_path |
| After ~1.2M unit comparisons | estimate | synthetic | chosen_path |
| Budget: $180K hardware | estimate | synthetic | chosen_path |
| $40K integration | estimate | synthetic | chosen_path |
| $30K ViDi deep learning training on 12 defect types | estimate | synthetic | chosen_path |
| $50K reserve for lighting/model iteration | estimate | synthetic | chosen_path |
| Total $300K | estimate | synthetic | chosen_path |
| $100K buffer | estimate | synthetic | chosen_path |
| Success threshold: ≥99.8% agreement with human inspectors | threshold | synthetic | chosen_path |
| Success threshold: ≥95% of human-caught defects detected | threshold | synthetic | chosen_path |
| validated over ≥500K units | estimate | synthetic | chosen_path |
| month 7 go/no-go with real data | estimate | synthetic | chosen_path |
| 0.2% defect rate across 12 defect types | threshold | synthetic | next_action |
| delivery within 6 weeks | estimate | synthetic | next_action |
| branch b003 confidence of 0.82 | estimate | synthetic | selection_rationale |
| Cognex In-Sight 9902 | estimate | synthetic | selection_rationale |
Unknowns blocking a firmer verdict
- Whether $100K buffer is sufficient for full replacement deployment if validation succeeds — full replacement across 3 shifts likely requires additional capital beyond the original $400K budget
- The 12 defect types are referenced but not enumerated — some defect types (cosmetic vs. dimensional vs. functional) have fundamentally different machine vision detection profiles, and the approach may work for 10 of 12 but fail on 2
- Union contract implications of eventual full replacement are deferred, not resolved — the parallel run buys time but doesn't address the labor transition plan
- ROI timeline is unclear: if 9 inspectors cost $450K-$600K/year and validation takes 7 months, the payback period depends on a second capital expenditure for full replacement that isn't budgeted
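The payback unknown above can be framed as a parameterized calculation rather than resolved. A back-of-envelope sketch using the report's own figures ($450K-$600K/yr inspector cost, $300K validation spend, 7-month validation); the full-replacement capex is the unbudgeted unknown, so it is left as a parameter:

```python
# Payback sketch: months until cumulative labor savings cover the capital
# spent. Savings are assumed to start only after validation ends, and only
# if the go/no-go succeeds and inspectors are actually redeployed.
def payback_months(annual_labor_savings, validation_spend=300_000,
                   replacement_capex=0, validation_months=7):
    monthly_savings = annual_labor_savings / 12
    total_capital = validation_spend + replacement_capex
    return validation_months + total_capital / monthly_savings

conservative = payback_months(450_000)  # -> 15.0 months, with zero extra capex
optimistic = payback_months(600_000)    # -> 13.0 months, with zero extra capex
```

Even in the best case the validation spend alone pushes payback past a year, and any nonzero `replacement_capex` extends it further, which is exactly why the second capital expenditure needs to be scoped before the month-7 decision.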