Qlik Sense Integration
Syncing Qlik Cloud Data into Cascade
This document outlines the supported methods for integrating Qlik Sense (Cloud) with Cascade so that KPIs, measures, or curated analytics data are synced into Cascade automatically.
This guide assumes:
- You are using Qlik Cloud
- Cascade will pull data from Qlik on a schedule
- Data will be pushed into Cascade via API
Integration Method
Best practice is to expose a curated Table object (Hypercube) inside Qlik and allow Cascade to read only that object.
This ensures:
- Stable schema
- Controlled business logic
- Minimal breakage
- No direct access to raw datasets
Cascade will:
- Authenticate to Qlik Cloud
- Open the App via API
- Query the Hypercube (table object)
- Page through results
- Flatten the dataset
- Upsert into Cascade metrics
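The flatten step above can be sketched as follows. The `qText` cell shape mirrors the matrix layout Qlik typically returns for hypercube data; the function name and sample values are illustrative, not Cascade internals.

```python
def flatten_hypercube(columns, data_pages):
    """Flatten paged hypercube matrices into row dicts keyed by column name."""
    rows = []
    for page in data_pages:          # each page is a list of rows
        for matrix_row in page:      # each row is a list of cells
            rows.append({
                col: cell.get("qText")   # qText holds the display value
                for col, cell in zip(columns, matrix_row)
            })
    return rows

# Example: two pages, one row each (sample data, not a real export)
pages = [
    [[{"qText": "North", "qNum": 120}, {"qText": "120", "qNum": 120}]],
    [[{"qText": "South", "qNum": 95}, {"qText": "95", "qNum": 95}]],
]
print(flatten_hypercube(["Region", "Revenue"], pages))
```

The upsert into Cascade metrics then operates on these plain row dictionaries.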
1) Authentication Methods
Qlik Cloud supports both API Key authentication and OAuth 2.0.
Method 1 — API Key (Fastest and simplest option)
Customer Setup Steps
- Log into Qlik Cloud
- Go to Profile Settings → API Keys
- Click Generate New Key
- Copy the generated API key immediately (it is displayed only once)
What the Customer Provides to Cascade
- Qlik Tenant URL (e.g., https://yourtenant.us.qlikcloud.com)
- API Key (Bearer Token)
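With an API key, every request simply carries a bearer token. A minimal sketch using only the standard library; the tenant URL, key value, and the `/api/v1/users/me` smoke-test path are placeholders for illustration:

```python
import urllib.request

TENANT = "https://yourtenant.us.qlikcloud.com"  # customer's tenant URL
API_KEY = "eyJhbGciOi-placeholder"              # placeholder API key

def qlik_request(path):
    """Build an authenticated request to a Qlik Cloud REST endpoint."""
    return urllib.request.Request(
        f"{TENANT}{path}",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )

req = qlik_request("/api/v1/users/me")
print(req.full_url)
print(req.get_header("Authorization"))
```

Calling `urllib.request.urlopen(req)` against a live tenant should return the authenticated user's profile if the key is valid.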
Refer to Qlik's documentation on generating and managing API keys in Qlik Sense.
Method 2 — OAuth 2.0 (Best for Long-Term Integrations)
Customer Setup Steps
- Go to Admin Console → OAuth
- Create a new OAuth Web Client
- Enable Allow Machine-to-Machine (M2M); this allows system-to-system communication without user involvement
- Under Authentication Method select Client Secret
- Set the allowed scopes (read access to apps)
What the Customer Provides
- Tenant URL
- Client ID
- Client Secret
- Authorization endpoint
- Token endpoint
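For an M2M client, Cascade exchanges the client ID and secret for an access token using the OAuth 2.0 client credentials grant. The sketch below only builds the form-encoded token request body; the credentials are placeholders, and the scope value should match whatever was granted to the client:

```python
import urllib.parse

CLIENT_ID = "placeholder-client-id"          # from the OAuth client
CLIENT_SECRET = "placeholder-client-secret"  # from the OAuth client

# Standard client_credentials token request body
# (POST to the token endpoint as application/x-www-form-urlencoded;
# the JSON response contains access_token and expires_in)
body = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "scope": "user_default",  # adjust to the scopes set on the client
}).encode()
print(body.decode())
```

The returned access token is then sent as a Bearer token, exactly like an API key, until it expires and a new one is requested.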
Refer to Qlik's documentation on access via OAuth in Qlik Sense.
2) Preparing Data in Qlik (Critical Step)
Create a Dedicated Integration Table
Inside your Qlik App:
- Create a Straight Table visualization
- Include only required fields
- Ensure stable schema
- Avoid calculated dimensions that change dynamically
- Limit row count (< 5,000 rows recommended)
What the Customer Provides
- App ID (visible in the browser URL while inside the app)
- Object Name
- Object ID (if you know how to extract it; if not, Cascade can look it up)
Note: Cascade reads exactly what the table returns. It does NOT automatically apply filters, date logic, or incremental behavior. All filtering must be defined inside the Qlik object.
3) Pulling Data via API (Technical Overview)
Qlik Cloud provides REST APIs that allow you to:
- List apps
- Discover objects
- Export table data
- Retrieve structured dataset output
This approach avoids WebSockets entirely.
Step 1 – List Objects in the App
Retrieve objects (visualizations, sheets, tables).
- GET https://<tenant>/api/v1/items?resourceType=app&resourceId=<appId>
- Cascade will look for the Table / Straight Table object and capture the objectId, objectType, and name
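Selecting the right object from the listing response can be sketched as below. The response shape is trimmed and illustrative; real payloads carry more fields, and the exact field names should be confirmed against the live API:

```python
import json

# Illustrative, trimmed listing response (sample data)
sample = json.loads("""
{"data": [
  {"id": "a1b2", "name": "KPI Export",  "objectType": "table"},
  {"id": "c3d4", "name": "Sales Chart", "objectType": "barchart"}
]}
""")

def find_table_object(listing, name):
    """Return the id of the straight-table object Cascade should read."""
    for item in listing["data"]:
        if item["objectType"] == "table" and item["name"] == name:
            return item["id"]
    return None

print(find_table_object(sample, "KPI Export"))
```

The captured `objectId` is what the export request in the next step needs.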
Step 2 – Export Table Data (REST-Based Export)
Qlik Cloud allows exporting visualization data using REST export endpoints.
- POST https://<tenant>/api/v1/reports
- The POST body includes the type, format, appId, and objectId
This generates a report job.
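A sketch of the request body follows. The `sense-data-1.0` type and `senseDataTemplate` nesting reflect how Qlik's Reports API groups these fields, but the exact schema should be verified against the API reference; the IDs are placeholders:

```python
import json

APP_ID = "placeholder-app-id"        # from the customer
OBJECT_ID = "placeholder-object-id"  # the table object's ID

# Illustrative report-request body; verify field names against
# Qlik's Reports API reference before relying on them.
payload = {
    "type": "sense-data-1.0",                          # data-export report
    "senseDataTemplate": {"appId": APP_ID, "id": OBJECT_ID},
    "output": {"type": "csv"},                         # or json
}
print(json.dumps(payload, indent=2))
```

The response to this POST identifies the report job to poll in the next step.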
Step 3 — Retrieve Report Output
Once the report job is created, poll its status:
- GET https://<tenant>/api/v1/reports/<reportId>
When the status is completed, the results can be downloaded in JSON or CSV format
- GET https://<tenant>/api/v1/reports/<reportId>/download
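The polling loop can be sketched as follows. The status fetcher is injected so the loop can be exercised without a live tenant; the status strings here follow the wording above and are illustrative, so match them to what the API actually returns:

```python
import time

def wait_for_report(fetch_status, report_id, interval=0, max_polls=30):
    """Poll the report job until it finishes or the poll budget runs out.

    fetch_status(report_id) stands in for
    GET https://<tenant>/api/v1/reports/<reportId>.
    """
    for _ in range(max_polls):
        status = fetch_status(report_id)
        if status == "completed":
            return True
        if status == "failed":
            raise RuntimeError(f"report {report_id} failed")
        time.sleep(interval)
    raise TimeoutError(f"report {report_id} did not finish in time")

# Simulated status sequence: queued -> processing -> completed
states = iter(["queued", "processing", "completed"])
print(wait_for_report(lambda _id: next(states), "rep-123"))
```

Once the loop returns, a GET to the `/download` endpoint retrieves the JSON or CSV output for flattening.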
4) Scheduling & Incremental Logic
Qlik integrations typically run on a fixed schedule.
Recommended Best Practices
- Filter inside the Qlik table (e.g., last 12 months)
- Use deterministic logic
- Keep object schema stable
- Avoid dynamic column changes
Note: If incremental sync is required, implement inside Qlik. Cascade does not apply incremental filtering automatically.
5) Troubleshooting
Unauthorized (401)
- API key invalid
- Token expired
- Missing scope
API Failure
- Incorrect tenant URL
- Incorrect App ID
Missing Rows
- Paging logic incomplete
- Filters in Qlik object
- Row limit exceeded