What's actually in the platform
Modules in v2.9.x — highlighted by practice.
▦ 14 highlighted modules — practice‑facing · ▣ 7 support modules — always available · many more behind these (60+ frontend feature dirs, 72 backend modules total). See /standards for what each can read & write.
Jump to a practice
How the 5 practices chain together
One project, five practices, in flow.
01
BIM Transformation
Turn RVT, IFC, DWG and DGN into structured, classified, quantity‑ready data through DDC cad2data — without giving up the source model. What ships: a real CAD/BIM intake pipeline and an element/property browser. What it isn't: a 3D coordination viewer.
Platform vs partner
▸ What OCERP ships
BIM intake & element handling
- DDC cad2data — read RVT, IFC 4 / 4.3, DWG, DGN into canonical JSON+Parquet
- BIM hub — element/property browser with filtering & search
- BIM requirements rule engine (10‑op constraint contract)
- BIM filter panel — type‑aware element filters
- BCF read & write for issue exchange
- Element ↔ BOQ position binding (manual + AI‑assisted via matchmodule)
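To make the intake bullets concrete, here is a minimal sketch of a canonical element record and a type‑aware filter of the kind the BIM filter panel exposes. The class name, field names and the `filter_by_class` helper are invented for illustration — they are not the real cad2data schema; see /standards for the actual contract.

```python
from dataclasses import dataclass, field

# Illustrative shape of a canonical element record after CAD/BIM intake.
# Field names are assumptions for this sketch, not the cad2data schema.
@dataclass
class CanonicalElement:
    element_id: str                 # stable ID carried from the source model
    source_format: str              # "RVT" | "IFC" | "DWG" | "DGN"
    ifc_class: str                  # e.g. "IfcWall"
    properties: dict = field(default_factory=dict)

def filter_by_class(elements, ifc_class):
    """Type-aware filtering, as the BIM filter panel would apply it."""
    return [e for e in elements if e.ifc_class == ifc_class]

elements = [
    CanonicalElement("w-01", "IFC", "IfcWall", {"FireRating": "F90"}),
    CanonicalElement("d-01", "IFC", "IfcDoor", {"Width": 900}),
]
walls = filter_by_class(elements, "IfcWall")
```

The point of the canonical shape is that downstream modules (requirements rules, BOQ binding) only ever see one record layout, whatever format the model arrived in.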
▸ What the partner brings
BIM judgement & mapping rules
- Customer‑specific element → cost classification mapping
- Coordination workflow design (clash, BCF round‑trips)
- Model‑authoring standards review & remediation
- Customer training on BIM hub, requirements rules, filters
- Acceptance tests on the canonical extraction (per project family)
- Ongoing parameter‑gap monitoring on incoming model revisions
Signals it's working
02
Cost Intelligence
CWICR ships ~55,000 priced positions spanning UK and DACH catalogues. The platform handles per‑project catalogue binding, vectorised search, assembly composition and multi‑region pricing. Honest: there are no live API connectors to RSMeans/BCIS/BKI/Sirados — those sources import as XLSX/CSV today.
Platform vs partner
▸ What OCERP ships
Cost data + catalogue handling
- CWICR seed library — ~55K priced items total, covering UK (NRM/BCIS) and DACH (DIN/BKI)
- Per‑project catalogue binding (v2.8.2+) with currency code & provenance
- Vectorised semantic search (pgvector) on imported rates
- Assembly composition with regional factor table
- Cost model + cost-match modules for AI‑suggested unit rates
- External pricebook import via XLSX / CSV / GAEB X81 STLB
- Auto region detection on fresh installs (v2.8.5)
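To make the vectorised‑search bullet concrete: pgvector ranks rows by embedding distance inside Postgres; the sketch below mirrors that cosine‑similarity ranking in plain Python. The rate names, embeddings and the `search_rates` helper are invented for illustration, not the platform's API.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search_rates(query_vec, rates, top_k=2):
    """Rank imported rate rows by embedding similarity to the query.
    In production this ranking runs inside Postgres via pgvector."""
    scored = [(cosine_similarity(query_vec, vec), item) for item, vec in rates]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [item for _, item in scored[:top_k]]

# Toy 3-dimensional embeddings; real ones have hundreds of dimensions.
rates = [
    ("Blockwork wall 100mm", [0.9, 0.1, 0.0]),
    ("Concrete slab 200mm",  [0.1, 0.9, 0.1]),
    ("Blockwork wall 140mm", [0.8, 0.2, 0.1]),
]
hits = search_rates([1.0, 0.0, 0.0], rates)
```

The practical payoff is that an estimator can query in their own words ("inner leaf blockwork") and still land on the right priced position, even when the catalogue wording differs.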
▸ What the partner brings
Library calibration & governance
- Importing the customer's inherited Excel / Sirados / RSMeans extracts
- Backtesting the library against the last 6–12 awarded jobs
- Regional factor calibration per customer's project mix
- Governance design: who can change a rate, on what evidence
- Supplier feedback loop process (closed‑out invoice → rate update)
- Customer training on catalogue binding & assembly building
Signals it's working
03
AI Takeoff
Turn PDFs and DWGs into measured quantities through the takeoff viewer, with five measurement primitives: distance, area, count, polyline, volume — each carrying a 500‑char annotation field. The audit trail records who measured what, when, and on which sheet.
Platform vs partner
▸ What OCERP ships
Takeoff tooling & matching
- PDF.js + canvas takeoff viewer with 5 measure types (distance, area, count, polyline, volume)
- 500‑char annotation field per quantity, persisted with audit trail
- DWG takeoff module (separate path for vector DWG)
- AI element → cost item matching (matchmodule)
- Bind takeoff results to BOQ positions with provenance
- Multi‑user editing on the same drawing set
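As a rough illustration of two of the five primitives plus the annotation cap, here is the geometry only — the real viewer runs on PDF.js + canvas, and all function names here are assumptions for the sketch.

```python
import math

MAX_ANNOTATION = 500  # per-quantity annotation cap, as described above

def polyline_length(points):
    """Distance/polyline primitive over a chain of (x, y) sheet coords."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def polygon_area(points):
    """Area primitive via the shoelace formula (points form a closed ring)."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def attach_annotation(measure, text):
    """Reject annotations over the 500-character limit."""
    if len(text) > MAX_ANNOTATION:
        raise ValueError("annotation exceeds 500 characters")
    measure["annotation"] = text
    return measure

room = [(0, 0), (4, 0), (4, 3), (0, 3)]
area = polygon_area(room)      # 4 x 3 rectangle -> 12.0 sheet units squared
run = polyline_length(room)    # open chain: 4 + 3 + 4 = 11.0
```

Scale calibration (sheet units to metres) is the step this sketch omits; in practice it is set per drawing before any primitive is trusted.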
▸ What the partner brings
Domain feedback & model improvement
- Discipline‑specific calibration of element matching
- Onboarding the takeoff lead — a one‑ to two‑week feedback loop
- Standardising annotation conventions across the team
- Audit trail review process for disputes
- Procedure for re‑doing takeoff on revised drawing sets
Signals it's working
04
Compliance & Validation
46 validation rules registered across 5 standards — DIN 276, NRM, MasterFormat, GAEB, DPGF — plus the universal BOQ quality pack. Every BOQ release passes the pipeline; every failure carries a rule ID, severity and source element. Honest: BREEAM, LEED, DGNB, EU Taxonomy rule packs are not yet shipped.
Platform vs partner
▸ What OCERP ships
Validation engine + rule packs
- Validation framework (rule registry, severities, results)
- Built‑in rule packs: DIN 276, NRM 1/2, MasterFormat, GAEB, DPGF, BOQ quality (~46 rules)
- Requirements engine — 10‑operation unified constraint contract
- Validate‑against‑BIM endpoint (v2.8.8 — Requirements & Validation v2)
- Compliance dashboard with traffic‑light drill‑down
- Excel template + import + unified export for requirements
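A minimal sketch of the pipeline shape described above, where every failure carries a rule ID, a severity and the source element. The rule IDs, BOQ row layout and helper names are invented for illustration, not the platform's actual rule registry.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    rule_id: str      # e.g. "BOQ-001" (invented ID for this sketch)
    severity: str     # "error" blocks release; "warning" flags
    element_id: str   # source element the failure traces back to
    message: str

def rule_positive_quantity(row):
    if row["qty"] <= 0:
        return Finding("BOQ-001", "error", row["element_id"],
                       "quantity must be positive")

def rule_unit_present(row):
    if not row.get("unit"):
        return Finding("BOQ-002", "warning", row["element_id"],
                       "unit missing")

RULES = [rule_positive_quantity, rule_unit_present]

def validate(boq_rows):
    """Run every registered rule over every row; collect failures."""
    findings = []
    for row in boq_rows:
        for rule in RULES:
            f = rule(row)
            if f:
                findings.append(f)
    return findings

findings = validate([
    {"element_id": "w-01", "qty": 12.5, "unit": "m2"},
    {"element_id": "w-02", "qty": 0, "unit": ""},
])
```

The severity split is what makes the traffic‑light dashboard possible: errors gate the release, warnings drive the partner's severity‑tuning conversation.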
▸ What the partner brings
Standards interpretation & custom rules
- Inventory of standards the customer must comply with
- Customer‑specific rule additions (budget caps, preferred suppliers)
- Severity tuning — what blocks, what flags, what suggests
- Override governance — when warnings get ignored, by whom, on what evidence
- Audit packaging for client / public buyer review
Signals it's working
05
Tendering
GAEB X83 round‑trip is the wired tender‑exchange path today, alongside an RFQ/bidding module for distribution and bid handling, a procurement module for PO ↔ GR ↔ invoice 3‑way matching, and VOB/B contract templates in the DACH pack. Honest: BC3 (Spain) and native DPGF (France) are roadmap, not shipped.
Platform vs partner
▸ What OCERP ships
Tender exchange + bid handling
- GAEB X83 read & write with rule‑pack gate before export
- GAEB X84 import (alternative bids); legacy DA XML import
- RFQ / bidding module — distribution, return collection, comparison
- Procurement module — 3‑way invoice match (PO ↔ GR ↔ Invoice)
- VOB/B contract templates (Einheitspreis · Pauschal · Stundenlohn) in DACH pack
- XLSX export with region‑shaped column profiles (NRM 2, DPGF, CSI, Pliego)
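The 3‑way match can be sketched as a tolerance check across the three documents: purchase order, goods receipt and invoice must agree before the invoice clears. The field names, the `three_way_match` helper and the tolerances are assumptions for illustration, not the procurement module's real API.

```python
def three_way_match(po, gr, invoice, qty_tol=0.0, amount_tol=0.01):
    """Compare PO, goods receipt and invoice; return a list of issues.
    An empty list means the match passes and the invoice can clear."""
    issues = []
    if abs(gr["qty"] - po["qty"]) > qty_tol:
        issues.append("GR quantity differs from PO")
    if abs(invoice["qty"] - gr["qty"]) > qty_tol:
        issues.append("invoice quantity differs from GR")
    expected = invoice["qty"] * po["unit_price"]
    if abs(invoice["amount"] - expected) > amount_tol:
        issues.append("invoice amount differs from PO pricing")
    return issues

po = {"qty": 100, "unit_price": 4.50}
gr = {"qty": 100}
ok = three_way_match(po, gr, {"qty": 100, "amount": 450.00})
bad = three_way_match(po, gr, {"qty": 100, "amount": 472.00})
```

Where the tolerances sit (zero on quantity, a cent on amount in this sketch) is exactly the kind of governance decision the partner column covers.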
▸ What the partner brings
Process & commercial expertise
- Subcontractor list management & pre‑qualification
- Buyer‑specific format profile design (per public buyer)
- Award reasoning & commercial review process
- Customer‑specific contract wrappers (FIDIC, NEC4, JCT, AIA — partner‑provided)
- Tender governance — escalation, anomaly review, award sign‑off
Signals it's working
First 12 weeks · what an engagement actually looks like
The honest onboarding rhythm.
Every practice is different, but the rhythm of a partner‑led OCERP engagement tends to look like this. Numbers below are typical, not promises — calibrate with your partner.
- Inventory of existing tooling (Excel, Sirados, CostX, etc.)
- Sample real project data — last 1‑2 awarded jobs
- List standards that actually bind your work
- Define what "working" looks like at week 12
- OCERP install on partner / customer infra
- Region pack, CWICR seed binding
- Customer's first cost catalogue imported (XLSX/CSV)
- Validation rule packs configured per project
- Real CAD/BIM or PDF goes through the pipeline
- BOQ is built, validated, exported (GAEB X83 etc.)
- Senior estimator confirms or rejects AI matches
- Mismatches drive rule pack & assembly tuning
- Backtest against awarded jobs · variance report
- Customer team trained on each module they'll touch
- Governance docs (who edits rates, validation overrides)
- Roadmap of what to fix in the next quarter
Buyer's questionnaire · before you sign
Questions to actually ask a partner.
OCERP is open‑source — the platform isn't where a partner adds margin. The margin is in methodology, calibration, training and governance. These questions sort the serious partners from the resellers.
▸ Ask about platform fit
- Which of the 5 practices are you proposing to lead, and why this one first?
- Which regional pack applies, and what's missing for our jurisdiction?
- Are our standards already in the rule pack or do we need custom ones?
- What's our tender format requirement — does it already round‑trip?
- How will the customer's existing rate library get imported?
▸ Ask about partner depth
- Show me two finished engagements in this practice — names & outcomes.
- Who specifically does the BIM mapping or rate calibration?
- How do you handle disputes between estimator and AI suggestion?
- What's the governance model for rate changes after handover?
- What happens to our data if we switch partner or self‑host?
▸ Green flags to look for
- Partner can name specific limitations in v2.9.x without flinching
- Has read the standard they claim — not just the brochure
- Demonstrates a full GAEB X83 round‑trip on real data
- Insists on sample real data in week 1, not slides
- Calibrates against your awarded jobs, not external benchmarks
▸ Red flags to walk from
- Pitches BREEAM/LEED/FIDIC as native — they're not in the codebase
- Promises a 3D viewer on day one — it's an element/property browser today
- Shows fictional "BCIS / RSMeans live feeds" — those are XLSX imports
- Refuses to commit to data‑exit clause in the contract
- No name in their team for the senior estimator who'll calibrate
Next step
Not sure which practice to start with? Take the assessment.
Eleven questions, five dimensions, a band and three prioritised actions. The result tells you which practice will move the needle for your firm — before you spend a euro on consulting.