OCERP

Practice areas · what the platform ships, what the partner adds

The five practices, with an honest platform vs partner split.

Every practice below names what OCERP ships in v2.9.x and what the partner brings as their own work. Two columns, side by side: no fictional deliverables, and the page says only what the code does.

5 practices · v2.9.x codebase referenced · Platform vs Partner column per practice · Updated 2026‑05‑07

What's actually in the platform

Modules in v2.9.x — highlighted by practice.

14 highlighted modules — practice‑facing · ▣ 7 support modules — always available · many more behind these (60+ frontend feature dirs, 72 backend modules total). See /standards for what each can read & write.

Jump to a practice

How the 5 practices chain together

One project, five practices, in flow.

Practice 01 · BIM Transformation

Turn RVT, IFC, DWG and DGN into structured, classified, quantity‑ready data through DDC cad2data — without giving up the source model. What ships: a real CAD/BIM intake pipeline and an element/property browser. What it isn't: a 3D coordination viewer.

RVT · IFC · DWG · DGN · DDC cad2data · BIM hub · element browser

Platform vs partner

▸ What OCERP ships

BIM intake & element handling
  • DDC cad2data — read RVT, IFC 4 / 4.3, DWG, DGN into canonical JSON+Parquet
  • BIM hub — element/property browser with filtering & search
  • BIM requirements rule engine (10‑op constraint contract)
  • BIM filter panel — type‑aware element filters
  • BCF read & write for issue exchange
  • Element ↔ BOQ position binding (manual + AI‑assisted via match module)
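The intake path ends in canonical JSON+Parquet that the element browser filters and sums. A minimal sketch of that filtering idea in plain Python, with invented field names (`guid`, `ifc_type`, `storey`, `net_volume_m3`) standing in for the actual cad2data schema:

```python
# Illustrative only: field names are assumptions, not the OCERP schema.

def filter_elements(elements, ifc_type=None, storey=None):
    """Type-aware element filter, as the BIM filter panel would apply it."""
    out = elements
    if ifc_type is not None:
        out = [e for e in out if e["ifc_type"] == ifc_type]
    if storey is not None:
        out = [e for e in out if e["storey"] == storey]
    return out

def sum_quantity(elements, key="net_volume_m3"):
    """Roll extracted element quantities up into a BOQ-ready number."""
    return round(sum(e.get(key, 0.0) for e in elements), 3)

elements = [
    {"guid": "a1", "ifc_type": "IfcWall", "storey": "L01", "net_volume_m3": 4.2},
    {"guid": "a2", "ifc_type": "IfcWall", "storey": "L02", "net_volume_m3": 3.8},
    {"guid": "a3", "ifc_type": "IfcSlab", "storey": "L01", "net_volume_m3": 9.5},
]

walls = filter_elements(elements, ifc_type="IfcWall")
print(len(walls), sum_quantity(walls))  # 2 walls, 8.0 m3
```

The same filter-then-aggregate step is what the element ↔ BOQ binding automates across thousands of extracted elements.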

▸ What the partner brings

BIM judgement & mapping rules
  • Customer‑specific element → cost classification mapping
  • Coordination workflow design (clash, BCF round‑trips)
  • Model‑authoring standards review & remediation
  • Customer training on BIM hub, requirements rules, filters
  • Acceptance tests on the canonical extraction (per project family)
  • Ongoing parameter‑gap monitoring on incoming model revisions

Signals it's working

+Estimators stop maintaining a parallel Excel for quantities.
+Re‑running quantities on a new model revision takes minutes.
+Coordination changes show up as flagged BOQ drift.
If quantities still arrive by email attachment after week six — mapping wasn't validated.

Practice 02 · Cost Intelligence

CWICR ships ~55,000 priced positions spanning UK and DACH catalogues. The platform handles per‑project catalogue binding, vectorised search, assembly composition, multi‑region pricing. Honest: there are no live API connectors to RSMeans/BCIS/BKI/Sirados — those import as XLSX/CSV today.

CWICR ~55K seed · Per‑project catalogue · Vectorised search

Platform vs partner

▸ What OCERP ships

Cost data + catalogue handling
  • CWICR seed library — ~55K priced items total, covering UK (NRM/BCIS) and DACH (DIN/BKI)
  • Per‑project catalogue binding (v2.8.2+) with currency code & provenance
  • Vectorised semantic search (pgvector) on imported rates
  • Assembly composition with regional factor table
  • Cost model + cost-match modules for AI‑suggested unit rates
  • External pricebook import via XLSX / CSV / GAEB X81 STLB
  • Auto region detection on fresh installs (v2.8.5)
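The vectorised search runs on pgvector inside Postgres; the ranking idea can be sketched in plain Python with toy 3‑d embeddings (real embeddings are much larger, and the rate rows here are invented):

```python
import math

# Cosine ranking over embedded rate descriptions -- the same ordering
# pgvector's cosine-distance operator would return in SQL.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

rates = [
    {"code": "NRM-2.5.1", "desc": "Blockwork wall 140mm", "vec": [0.9, 0.1, 0.0]},
    {"code": "DIN-330",   "desc": "Mauerwerk 17.5cm",     "vec": [0.8, 0.2, 0.1]},
    {"code": "NRM-3.1.2", "desc": "Screed 75mm",          "vec": [0.1, 0.9, 0.2]},
]

def search(query_vec, k=2):
    """Return the k rates nearest to the query embedding."""
    return sorted(rates, key=lambda r: cosine(query_vec, r["vec"]), reverse=True)[:k]

top = search([0.85, 0.15, 0.05])
print([r["code"] for r in top])  # nearest rates first
```

Semantically close descriptions rank together even across catalogues and languages, which is what makes the imported-rate search useful on a mixed UK/DACH library.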

▸ What the partner brings

Library calibration & governance
  • Importing customer's inherited Excel / Sirados / RSMeans extracts
  • Backtesting library against last 6–12 awarded jobs
  • Regional factor calibration per customer's project mix
  • Governance design: who can change a rate, on what evidence
  • Supplier feedback loop process (closed‑out invoice → rate update)
  • Customer training on catalogue binding & assembly building

Signals it's working

+Estimators stop opening last year's job to copy a rate.
+Awarded vs estimated variance trends toward ±5%.
+Library updates monthly — not every two years.
If the senior estimator still trusts a personal doc more than the library — calibration didn't take.

Practice 03 · AI Takeoff

PDFs and DWGs turn into measured quantities through the takeoff viewer, which offers five measurement primitives: distance, area, count, polyline and volume, each carrying a 500‑char annotation field. The audit trail records who measured what, when, and on which sheet.

PDF + DWG · 5 measure types · Audit trail per quantity

Platform vs partner

▸ What OCERP ships

Takeoff tooling & matching
  • PDF.js + canvas takeoff viewer with 5 measure types (distance, area, count, polyline, volume)
  • 500‑char annotation field per quantity, persisted with audit trail
  • DWG takeoff module (separate path for vector DWG)
  • AI element → cost item matching (match module)
  • Bind takeoff results to BOQ positions with provenance
  • Multi‑user editing on the same drawing set
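The audit‑trail idea can be sketched as a record builder; the field names and checks below are illustrative assumptions, not the actual OCERP schema:

```python
from datetime import datetime, timezone

# Hypothetical shape of a takeoff audit record: who measured what,
# when, and on which sheet, plus the capped annotation field.

MAX_ANNOTATION = 500  # the 500-char annotation limit

def record_measurement(user, sheet, measure_type, value, unit, annotation=""):
    if measure_type not in {"distance", "area", "count", "polyline", "volume"}:
        raise ValueError(f"unknown measure type: {measure_type}")
    if len(annotation) > MAX_ANNOTATION:
        raise ValueError("annotation exceeds 500 characters")
    return {
        "user": user,
        "sheet": sheet,
        "type": measure_type,
        "value": value,
        "unit": unit,
        "annotation": annotation,
        "measured_at": datetime.now(timezone.utc).isoformat(),
    }

rec = record_measurement("j.doe", "A-101", "area", 42.7, "m2", "GF slab, rev C")
print(rec["sheet"], rec["type"], rec["value"])
```

A record like this per quantity is what lets disputes be settled by reading the trail rather than re-measuring the sheet.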

▸ What the partner brings

Domain feedback & model improvement
  • Discipline‑specific calibration of element matching
  • Onboarding the takeoff lead — one or two weeks of feedback loop
  • Standardising annotation conventions across the team
  • Audit trail review process for disputes
  • Procedure for re‑doing takeoff on revised drawing sets

Signals it's working

+Per‑sheet takeoff time drops on repetitive disciplines.
+Disputes get resolved by pointing at the audit trail.
+Junior estimators handle volume that used to need a senior.
If the audit trail isn't being read during disputes — process broke, not tool.

Practice 04 · Compliance & Validation

46 validation rules registered across 5 standards — DIN 276, NRM, MasterFormat, GAEB, DPGF — plus the universal BOQ quality pack. Every BOQ release passes the pipeline; every failure carries a rule ID, severity and source element. Honest: BREEAM, LEED, DGNB, EU Taxonomy rule packs are not yet shipped.

DIN 276 · NRM · CSI · GAEB · DPGF rule packs · Requirements engine

Platform vs partner

▸ What OCERP ships

Validation engine + rule packs
  • Validation framework (rule registry, severities, results)
  • Built‑in rule packs: DIN 276, NRM 1/2, MasterFormat, GAEB, DPGF, BOQ quality (~46 rules)
  • Requirements engine — 10‑operation unified constraint contract
  • Validate‑against‑BIM endpoint (v2.8.8 — Requirements & Validation v2)
  • Compliance dashboard with traffic‑light drill‑down
  • Excel template + import + unified export for requirements
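The failure shape the page describes (rule ID, severity, source element) can be sketched like this; the rule ID BOQ-001, the severity levels and the BOQ rows are invented for illustration:

```python
# A toy rule pack run: every failure carries a rule ID, a severity,
# and the source element it came from.

SEVERITIES = ("blocks", "flags", "suggests")

def rule_unit_required(position):
    """Universal BOQ-quality rule: every position needs a unit."""
    if not position.get("unit"):
        return {"rule_id": "BOQ-001", "severity": "blocks",
                "source": position["id"], "message": "missing unit"}

def run_rules(positions, rules):
    failures = []
    for pos in positions:
        for rule in rules:
            result = rule(pos)
            if result:
                failures.append(result)
    return failures

boq = [
    {"id": "1.1", "unit": "m2", "qty": 120},
    {"id": "1.2", "unit": "",   "qty": 5},
]
fails = run_rules(boq, [rule_unit_required])
print(fails)  # one failure, on position 1.2
```

Severity tuning, in this sketch, is just deciding which rules return "blocks" versus "flags" — exactly the partner-side judgement call listed above.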

▸ What the partner brings

Standards interpretation & custom rules
  • Inventory of standards the customer must comply with
  • Customer‑specific rule additions (budget caps, preferred suppliers)
  • Severity tuning — what blocks, what flags, what suggests
  • Override governance — when warnings get ignored, by whom, on what evidence
  • Audit packaging for client / public buyer review

Signals it's working

+Tender packs stop coming back with structural objections.
+The senior estimator stops being the human firewall for compliance.
+Disputes resolve by reading the audit trail, not debating who said what.
If overrides accumulate without recorded reasons — governance broke, not the rules.

Practice 05 · Tendering

GAEB X83 round‑trip is the wired tender‑exchange path today, alongside an RFQ/bidding module for distribution and bid handling, a procurement module for PO ↔ GR ↔ invoice 3‑way matching, and VOB/B contract templates in the DACH pack. Honest: BC3 (Spain) and native DPGF (France) are roadmap, not shipped.

GAEB X83 round‑trip · RFQ / bid module · 3‑way invoice match

Platform vs partner

▸ What OCERP ships

Tender exchange + bid handling
  • GAEB X83 read & write with rule‑pack gate before export
  • GAEB X84 import (alternative bids); legacy DA XML import
  • RFQ / bidding module — distribution, return collection, comparison
  • Procurement module — 3‑way invoice match (PO ↔ GR ↔ Invoice)
  • VOB/B contract templates (Einheitspreis · Pauschal · Stundenlohn) in DACH pack
  • XLSX export with region‑shaped column profiles (NRM 2, DPGF, CSI, Pliego)
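The 3‑way match idea (PO ↔ GR ↔ invoice) can be sketched as a tolerance check; the field names and tolerance values below are assumptions, not the shipped procurement logic:

```python
# An invoice line clears only if it agrees with both the goods receipt
# quantity and the PO unit price within tolerance.

QTY_TOL = 0.01     # 1% quantity tolerance (assumed)
PRICE_TOL = 0.005  # 0.5% unit-price tolerance (assumed)

def three_way_match(po, gr, invoice):
    issues = []
    if abs(invoice["qty"] - gr["qty"]) > QTY_TOL * gr["qty"]:
        issues.append("invoice qty differs from goods receipt")
    if abs(invoice["unit_price"] - po["unit_price"]) > PRICE_TOL * po["unit_price"]:
        issues.append("invoice price differs from PO")
    return issues  # empty list == cleared for payment

po = {"qty": 100, "unit_price": 12.50}
gr = {"qty": 100}
inv = {"qty": 100, "unit_price": 12.55}
print(three_way_match(po, gr, inv))  # [] -> cleared
```

Anything the check flags goes to a human; the tolerances themselves are a governance decision the partner helps set.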

▸ What the partner brings

Process & commercial expertise
  • Subcontractor list management & pre‑qualification
  • Buyer‑specific format profile design (per public buyer)
  • Award reasoning & commercial review process
  • Customer‑specific contract wrappers (FIDIC, NEC4, JCT, AIA — partner‑provided)
  • Tender governance — escalation, anomaly review, award sign‑off

Signals it's working

+Tender cycle time drops noticeably on repeat trades.
+Subcontractor bids arrive in the right format on the first attempt.
+Award decisions audit cleanly — every cent has a reason.
If subs are still emailing PDFs of marked‑up BOQs — pack format wasn't tight.

First 12 weeks · what an engagement actually looks like

The honest onboarding rhythm.

Every practice is different, but the rhythm of a partner‑led OCERP engagement tends to look like this. Numbers below are typical, not promises — calibrate with your partner.

▸ Week 1 – 2
Discovery
  • Inventory of existing tooling (Excel, Sirados, CostX, etc.)
  • Sample real project data — last 1‑2 awarded jobs
  • List standards that actually bind your work
  • Define what "working" looks like at week 12
▸ Week 3 – 4
Pilot setup
  • OCERP install on partner / customer infra
  • Region pack, CWICR seed binding
  • Customer's first cost catalogue imported (XLSX/CSV)
  • Validation rule packs configured per project
▸ Week 5 – 8
First real run
  • Real CAD/BIM or PDF goes through the pipeline
  • BOQ is built, validated, exported (GAEB X83 etc.)
  • Senior estimator confirms or rejects AI matches
  • Mismatches drive rule pack & assembly tuning
▸ Week 9 – 12
Calibration & handover
  • Backtest against awarded jobs · variance report
  • Customer team trained on each module they'll touch
  • Governance docs (who edits rates, validation overrides)
  • Roadmap of what to fix in the next quarter
Honest expectation: a partner can drive real value inside 12 weeks on cost, validation and tendering work. BIM transformation and AI takeoff calibration typically need a longer cycle — first project for setup, second project for genuine speed gains.

Buyer's questionnaire · before you sign

Questions to actually ask a partner.

OCERP is open‑source: the platform isn't where a partner adds margin. The margin lives in methodology, calibration, training and governance. These questions sort the serious partners from the resellers.

▸ Ask about platform fit

  • Which of the 5 practices are you proposing to lead, and why this one first?
  • Which regional pack applies, and what's missing for our jurisdiction?
  • Are our standards already in the rule pack or do we need custom ones?
  • What's our tender format requirement — does it already round‑trip?
  • How will the customer's existing rate library get imported?

▸ Ask about partner depth

  • Show me two finished engagements in this practice — names & outcomes.
  • Who specifically does the BIM mapping or rate calibration?
  • How do you handle disputes between estimator and AI suggestion?
  • What's the governance model for rate changes after handover?
  • What happens to our data if we switch partner or self‑host?

▸ Green flags to look for

  • Partner can name specific limitations in v2.9.x without flinching
  • Has read the standard they claim — not just the brochure
  • Demonstrates a full GAEB X83 round‑trip on real data
  • Insists on sample real data in week 1, not slides
  • Calibrates against your awarded jobs, not external benchmarks

▸ Red flags to walk from

  • Pitches BREEAM/LEED/FIDIC as native — they're not in the codebase
  • Promises a 3D viewer on day one — it's an element/property browser today
  • Shows fictional "BCIS / RSMeans live feeds" — those are XLSX imports
  • Refuses to commit to data‑exit clause in the contract
  • No name in their team for the senior estimator who'll calibrate

Track every roadmap item on GitHub

Issues for BC3 native, DPGF native, BREEAM rule pack, RSMeans connector, COBie write, IDS validation, FIDIC profile and more — all open at datadrivenconstruction/OpenConstructionERP. Star, comment, contribute.

Open on GitHub

Next step

Not sure which practice to start with? Take the assessment.

Eleven questions, five dimensions, a band and three prioritised actions. The result tells you which practice will move the needle for your firm — before you spend a euro on consulting.