Dundas BI Integration: Sync BI Metrics into Cascade
Bridging BI Insights with Strategic Execution
This custom integration creates a secure, automated bridge between Dundas BI and Cascade, so the numbers your teams trust in dashboards and reports can automatically update the right Metrics in Cascade without manual copy/paste.
The goal is a reliable, repeatable sync that keeps strategic reporting accurate and current.
Integration Overview
With Dundas BI, organizations typically centralize KPIs in dashboards/reports built on curated data models. This integration pulls those KPI outputs (usually as a tabular export) and maps them into Cascade on a schedule.
Common use cases
- Executive KPI scorecards (monthly/weekly)
- Operational dashboards (daily)
- Department-level metrics (Finance, HR, Ops)
- Program performance rollups (e.g., delivery, risk, spend)
How It Works
This integration is typically unidirectional (Dundas BI → Cascade) and runs on a schedule (e.g., daily/hourly), like other custom “system of record → Cascade” integrations.
A standard flow looks like:
- Authenticate to Dundas BI (service account recommended)
- Export a board/report/visualization to a file format such as CSV
- Parse and normalize results into a stable schema (e.g., metric_key, metric_date, metric_value)
- Upsert into Cascade Metrics (matching by stable identifiers)
Dundas BI explicitly supports REST API-driven export flows: log on → create export → download the export result. See:
https://www.dundas.com/support/developer/samples/integration/create-an-export-and-download-a-file
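Given a session ID from the log on step, the create-export and download calls can be sketched as below (Python, stdlib only). The endpoint paths and payload fields such as `viewId` and `provider` are assumptions modeled on the sample linked above — verify them against your Dundas BI version before use.

```python
import json
import urllib.request


def export_result_url(base_url: str, export_id: str) -> str:
    # Assumed path shape for downloading a finished export result.
    return f"{base_url}/api/export/result/{export_id}/"


def download_csv_export(base_url: str, session_id: str, view_id: str) -> bytes:
    """Create an export for a view, then download the result file.

    Endpoint paths and JSON fields are illustrative -- confirm them
    against the Dundas BI export sample for your server version.
    """
    headers = {
        "Authorization": f"Bearer {session_id}",  # session ID as bearer token
        "Content-Type": "application/json",
    }
    # Step 1: create the export (view ID and format fields are assumptions).
    create_req = urllib.request.Request(
        f"{base_url}/api/export/",
        data=json.dumps({"viewId": view_id, "provider": "CSV"}).encode(),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(create_req, timeout=120) as resp:
        export_id = json.load(resp)["id"]
    # Step 2: download the export result file as raw bytes (CSV text).
    download_req = urllib.request.Request(
        export_result_url(base_url, export_id), headers=headers
    )
    with urllib.request.urlopen(download_req, timeout=120) as resp:
        return resp.read()
```

The returned bytes feed directly into the parse/normalize step of the flow.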
Authentication
Dundas BI’s REST API supports creating a session via logon endpoints (for example POST /API/LogOn/) and then using that session for authorized API calls.
Option A — Service Account (Username/Password via REST LogOn)
Recommended for most customers (especially on-prem or private-hosted environments).
Customer provides
- Dundas BI base URL (your environment)
- Integration user credentials (service account recommended)
- Confirmation the account can access the dashboards/reports being exported
How it works (high-level)
- The integration calls POST /API/LogOn/ to create a session
- Subsequent calls authenticate using an Authorization: Bearer <session ID> header
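A minimal sketch of the LogOn call, assuming the response carries a `sessionId` field as in Dundas BI's REST samples (request field names should also be verified against your server version):

```python
import json
import urllib.request


def create_session(base_url: str, account: str, password: str) -> str:
    """POST /API/LogOn/ and return the session ID used for bearer auth.

    The request/response field names follow Dundas BI's public samples
    and are assumptions to verify for your environment.
    """
    req = urllib.request.Request(
        f"{base_url}/API/LogOn/",
        data=json.dumps({
            "accountName": account,
            "password": password,
            "deleteOtherSessions": False,
        }).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["sessionId"]


def session_headers(session_id: str) -> dict:
    # Subsequent API calls pass the session ID as a bearer token.
    return {"Authorization": f"Bearer {session_id}"}
```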
A dedicated integration (service) account can be created through Dundas BI administration; refer to the Dundas BI documentation for the steps.
Option B — Token-Based Approaches (When Required by Your Environment)
Some environments use token patterns (for example, one-time logon tokens or session tokens as part of SSO/embedding flows). Dundas BI documentation describes token-based approaches for creating sessions in SSO scenarios.
Customer provides
- Which token approach is enabled in your Dundas BI environment
- Any required configuration details for that approach
- A dedicated integration identity (recommended)
In most “data sync into Cascade” use cases, a non-interactive service account is the simplest and most supportable path.
Data Setup in Dundas BI (What You Need to Prepare)
1) Create a Curated Export Asset
For stability, we strongly recommend exporting from a curated dashboard/report/visualization where:
- The table columns are controlled (names and data types don’t change often)
- Filters are deterministic (same inputs → same results)
- Row counts are manageable
Dundas BI provides REST API samples for exporting views and downloading the export result.
2) Define the Output Schema
To map cleanly into Cascade Metrics, your export should include (at minimum):
- Metric Key (stable identifier or agreed name, e.g., revenue_total, oee, cases_shipped)
- Metric Value (number)
- Metric Date (recommended for time-series; otherwise we treat as “current”)
- (Optional) Dimension columns (region, department, facility, etc.) if you want segmented Metrics or multiple Metrics updated from one export
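Once the export lands as CSV, normalizing it into the stable schema above can be sketched as follows; the column names here are illustrative and should be matched to your curated export:

```python
import csv
import io


def normalize_export(csv_text,
                     key_col="Metric Key",
                     date_col="Metric Date",
                     value_col="Metric Value"):
    """Parse a CSV export into stable (metric_key, metric_date, metric_value) rows.

    Column names are assumptions -- align them with your curated
    Dundas BI export before relying on this mapping.
    """
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rows.append({
            "metric_key": row[key_col].strip(),
            "metric_date": row[date_col].strip(),  # expect ISO YYYY-MM-DD
            "metric_value": float(row[value_col]),
        })
    return rows


sample = (
    "Metric Key,Metric Date,Metric Value\n"
    "revenue_total,2024-01-31,125000.5\n"
)
# normalize_export(sample) yields one row keyed "revenue_total"
```

Dimension columns, if present, can be carried through the same way to drive segmented Metrics.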
Scheduling & Incremental Logic
- If you need incremental behavior (e.g., “last 30 days only”, “since last refresh”), implement that in the Dundas BI asset (filters/parameters) so every export is predictable and controlled.
- Keep exports small and intentional (avoid dumping entire datasets).
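The incremental window itself can be computed deterministically in the sync job and passed to the Dundas BI asset as filter parameters; `export_window` below is a hypothetical helper, not a Dundas BI API:

```python
from datetime import date, timedelta


def export_window(days_back, today=None):
    """Return an inclusive (start, end) date window, e.g. 'last 30 days'.

    Passing `today` explicitly keeps runs reproducible; the values are
    intended as filter parameters for a curated Dundas BI export.
    """
    today = today or date.today()
    return today - timedelta(days=days_back), today
```

For example, a 30-day window anchored at 2024-03-31 starts at 2024-03-01, so the same inputs always produce the same export.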
Customer Checklist
A) Environment
- Dundas BI deployment type: Cloud / Hosted / On-Prem
- Base URL
- Service account credentials or token-based method details
B) Data & Sync
- Which dashboards/reports/visualizations to export
- Export format (CSV preferred for tabular KPI sync)
- KPI definitions + which Cascade Metrics to update
- Any aggregation logic (sum/avg/latest)
- Desired sync frequency (hourly/daily/weekly)
Troubleshooting & Support
If the Dundas BI integration isn’t updating as expected:
- Auth failures: confirm the integration account can log in and still has access to the exported asset. (If sessions/tokens are used, confirm they are valid for API access.)
- Export failures: validate the dashboard/report/visualization can be exported via API (and the export format is supported).
- Schema changes: if columns were renamed/removed in Dundas BI, mappings may break—keep the export schema stable.
- Data formatting: ensure dates and numerics are consistent (e.g., ISO dates like YYYY-MM-DD) to prevent parsing issues.
- Network / access: confirm the Dundas BI base URL is reachable from the integration runtime (with allowlisting configured as required).
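Many of the data-formatting issues above can be caught before sync with a small validation pass; `validate_row` is a hypothetical helper assuming the normalized schema (metric_key, metric_date, metric_value) described earlier:

```python
from datetime import date


def validate_row(row):
    """Return a list of formatting problems for one normalized row.

    Assumes ISO dates (YYYY-MM-DD) and plain numeric values, per the
    schema recommended above.
    """
    problems = []
    try:
        date.fromisoformat(row.get("metric_date", ""))
    except ValueError:
        problems.append(f"bad date: {row.get('metric_date')!r}")
    try:
        float(row.get("metric_value", ""))
    except (TypeError, ValueError):
        problems.append(f"bad value: {row.get('metric_value')!r}")
    return problems
```

Rows that return a non-empty problem list can be logged and skipped rather than silently corrupting a Metric.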