
AI / ERP · 2026

Putting an AI analyst in front of IQMS/Oracle ERP data

Built a Model Context Protocol server exposing read-only IQMS and dataPARC data to Claude. Plant questions answered in seconds, against live production data.

  • 1,587 ERP tables mapped
  • 2,959 FK relationships
  • Query latency: seconds
  • Write access: zero (read-only by design)

Our ERP (IQMS / DELMIAworks, backed by Oracle) has 1,500+ tables and is the system of record for everything that happens in production. Every time an ops or finance leader wanted a cross-cutting view of inventory, production, or costs, someone had to write SQL or wait on the vendor report queue.

I mapped the schema (1,587 tables, 2,959 FK relationships), built a read-only Model Context Protocol server exposing it to Claude, and shipped a simple chat interface on top. Now plant questions get answered by an LLM that has the real schema in context and executes read-only SQL against the live system.
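The mapping step reduces Oracle's data dictionary to a compact FK map the agent can keep in context. A minimal sketch of that reduction, with hand-written rows standing in for a real query against the `ALL_CONSTRAINTS` / `ALL_CONS_COLUMNS` dictionary views (all table and column names below are illustrative, not the actual IQMS schema):

```python
from collections import defaultdict

def build_fk_map(fk_rows):
    """Group foreign-key rows into {child_table: [(child_col, parent_table, parent_col)]}.

    In production the rows would come from Oracle's ALL_CONSTRAINTS /
    ALL_CONS_COLUMNS data-dictionary views; here they are hand-written.
    """
    fk_map = defaultdict(list)
    for child_table, child_col, parent_table, parent_col in fk_rows:
        fk_map[child_table].append((child_col, parent_table, parent_col))
    return dict(fk_map)

def render_for_context(fk_map):
    """Flatten the map into compact one-line edges for the agent's context."""
    lines = []
    for child, links in sorted(fk_map.items()):
        for col, parent, pcol in links:
            lines.append(f"{child}.{col} -> {parent}.{pcol}")
    return "\n".join(lines)

# Illustrative rows; the real names come from the IQMS schema.
rows = [
    ("ORD_DETAIL", "ORDER_ID", "ORDERS", "ID"),
    ("SHIPMENT", "ORD_DETAIL_ID", "ORD_DETAIL", "ID"),
]
context_text = render_for_context(build_fk_map(rows))
```

Rendered this way, 2,959 relationships compress into a few thousand short lines, small enough to sit in the model's context alongside the table dictionary.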

Stack

  • Python MCP server (SSE transport)
  • Oracle cx_Oracle / python-oracledb
  • Anthropic Claude (Opus / Sonnet)
  • Schema dictionary pre-loaded into agent context
  • Next.js + Flask web chat wrapper
  • Internal-network deployment (no cloud data egress)

The problem

IQMS is the lifeblood of a manufacturing operation, but it's famously opaque. Every ad-hoc reporting request — "what's our WIP value by work center," "what's the lot history on this customer complaint," "which POs are aging past 30 days" — required either the vendor's Crystal Reports, a SQL-literate human, or both.

The people who need the answers are production leaders, quality managers, and finance staff. They don't write SQL, so the data sat behind a queue of technical people.

Why MCP (and not a Text-to-SQL app)

Anthropic's Model Context Protocol is the right abstraction for this because it separates the data-access layer from the chat UI. Once the ERP is exposed as an MCP server, any LLM client that speaks MCP can talk to it — the Claude desktop app, a web chat I build, a script, Claude Code. I don't have to rebuild the integration for each surface.

The server is read-only by construction — no INSERT/UPDATE/DELETE tools exist. This is the non-negotiable that makes it safe to let an LLM drive queries against production.
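One way to enforce that guarantee at the tool boundary is to validate every statement before it reaches the database. A minimal sketch (the keyword list and function name are illustrative, not the production code):

```python
import re

# Anything that is not a plain SELECT (or a CTE) is rejected outright.
FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|MERGE|DROP|ALTER|CREATE|TRUNCATE|GRANT|EXECUTE)\b",
    re.IGNORECASE,
)

def is_safe_select(sql: str) -> bool:
    """Return True only for a single read-only statement."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # no statement batching
        return False
    if not re.match(r"(?is)^(SELECT|WITH)\b", stripped):
        return False
    return not FORBIDDEN.search(stripped)
```

In practice the stronger guarantee sits below this check: the database account the server connects with holds only SELECT grants, so even SQL that slips past the validator cannot write.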

What it does today

  • Answers natural-language questions about any of the 1,587 IQMS tables using schema + FK context loaded into the model.
  • Generates and executes safe SELECT queries, returns structured results, and summarizes them.
  • Supports follow-up questions — the model can navigate FK relationships (e.g., order → line → shipment → inventory) without a human crafting joins.
  • Separate dataPARC MCP server does the same for SCADA historian tag reads — trends, current values, interpolated history.
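The follow-up navigation works because the FK graph is machine-readable: a join path between any two tables is just a shortest path over that graph. A sketch with a plain BFS, using illustrative table names that mirror the order → line → shipment → inventory chain:

```python
from collections import deque

def join_path(fk_edges, start, goal):
    """BFS over an undirected FK graph; fk_edges maps a table to the
    tables it is linked to by a foreign key (in either direction)."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in fk_edges.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no FK chain connects the two tables

# Illustrative graph; the real edges come from the mapped FK relationships.
edges = {
    "ORDERS": ["ORD_DETAIL"],
    "ORD_DETAIL": ["ORDERS", "SHIPMENT"],
    "SHIPMENT": ["ORD_DETAIL", "INVENTORY"],
    "INVENTORY": ["SHIPMENT"],
}
path = join_path(edges, "ORDERS", "INVENTORY")
```

With the FK context loaded, the model does this traversal implicitly when it writes joins; the point is that the information needed to chain tables is explicit data, not tribal knowledge.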

Deployment + security

The MCP server runs on the internal network only — no cloud data egress, no SaaS middleman. Authentication is scoped to internal users with MFA via the identity layer. All queries are logged with full prompt + SQL trace so audit is trivial.
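The audit trail amounts to emitting one structured record per tool call. A minimal sketch with Python's stdlib logging (the field names are illustrative):

```python
import json
import logging

audit = logging.getLogger("mcp.audit")

def log_query(user: str, prompt: str, sql: str, row_count: int) -> str:
    """Emit one JSON line per executed query: who asked, what they asked,
    what SQL actually ran, and how many rows came back."""
    record = {"user": user, "prompt": prompt, "sql": sql, "rows": row_count}
    line = json.dumps(record, ensure_ascii=False)
    audit.info(line)
    return line
```

Because every record carries both the natural-language prompt and the generated SQL, an auditor can replay exactly what the model was asked and what it ran, without reconstructing anything from chat history.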

Why this matters beyond one company

Most mid-market manufacturers are sitting on decades of ERP data that nobody can get at without a BI consultant. MCP + a capable LLM changes that equation. The schema mapping is the hard part; once done, the capability is durable and portable across LLM providers and UI surfaces.

Links