
Custom Snowflake Integration: Advanced Data Connectivity

Exploring Standard and PrivateLink Integration Paths

Strategic Integration Overview

Seamlessly bridge your Snowflake data with Cascade's strategic platform. This custom integration automates the flow of vital business intelligence, delivering real-time insights and maintaining data consistency. Empower robust, data-driven decision-making and streamline strategic reporting for unified organizational execution.


Integration Mechanics: How it Works with Cascade

This integration establishes a secure, unidirectional data flow from your Snowflake Views or Tables to your Cascade Metrics, transferring your data through one of the configured methods described below.

The integration primarily sources predefined metrics, key performance indicators, or other aggregated business data from your specified Snowflake views. This structured data then directly populates your Cascade Metrics.

Data Flow Initiation & Connectivity

Establishing data flow for your custom Snowflake integration can be configured via several methods: through a Cascade-initiated Pull, a Client-initiated Push, or using File Transfer mechanisms. Our integration capabilities are designed for flexibility, allowing us to adapt to your specific environment and security requirements, including secure private connectivity options.

Data Pull (Cascade Initiated)

For Cascade to pull data directly from your Snowflake environment, we will establish and maintain an authenticated connection.

  1. Snowflake API Connectivity (SQL API):

    • Authentication & Access: Provide API credentials (OAuth or Key-Pair Auth) for a service user/app role with EXECUTE STATEMENT permissions.

    • Endpoint Details: Specify your Snowflake Account ID (e.g., xyz12345.us-east-1) and the database, schema, and warehouse to use for API calls.

    • Data Querying: Cascade executes defined SQL queries (e.g., SELECT statements from tables/views) via the SQL API and retrieves the results.

    • Network Connectivity:

      • Ensure necessary firewall rules permit outbound connections to the Snowflake SQL API endpoint.
      • PrivateLink Integration: PrivateLink connections can be configured to secure communication with the Snowflake SQL API endpoint, bypassing public internet routes.

  2. Direct Database Connectivity (JDBC/ODBC):

    • Authentication & Access: Provide Snowflake user/service account credentials (username/password or Key-Pair Auth for specialized connectors like MuleSoft's) with appropriate roles.

    • Connection Details: Supply your Snowflake Account ID, database, warehouse, and specific schemas/tables/views.

    • Network Connectivity:

      • Ensure necessary firewall rules permit outbound connections

      • PrivateLink Integration: For secure, non-public connectivity, configure AWS PrivateLink (or the equivalent Azure Private Link/GCP Private Service Connect) for your Snowflake account. This allows Cascade to connect privately to your Snowflake instance, bypassing the public internet. You would provide us with the relevant PrivateLink endpoint details.

    • Test Data Provisioning: Create a test database, schema, or view containing representative data, and provide its name for testing.
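
As an illustrative sketch of the SQL API path above, the request Cascade would issue can be assembled as follows. The account ID, token, warehouse, database, and view names are placeholders; substitute your own values.

```python
import json

# Hypothetical values -- substitute your own account and service-user token.
ACCOUNT = "xyz12345.us-east-1"   # Snowflake Account ID (see "Endpoint Details")
TOKEN = "<oauth-access-token>"   # OAuth token for the service user/app role

def build_sql_api_request(statement, warehouse, database, schema):
    """Build (url, headers, body) for a Snowflake SQL API statement call."""
    url = f"https://{ACCOUNT}.snowflakecomputing.com/api/v2/statements"
    headers = {
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
        "Accept": "application/json",
        # Use KEYPAIR_JWT here instead when authenticating with Key-Pair Auth.
        "X-Snowflake-Authorization-Token-Type": "OAUTH",
    }
    body = json.dumps({
        "statement": statement,
        "warehouse": warehouse,
        "database": database,
        "schema": schema,
        "timeout": 60,
    })
    return url, headers, body

# METRICS_VIEW is a hypothetical view name for illustration only.
url, headers, body = build_sql_api_request(
    "SELECT metric_name, metric_value FROM METRICS_VIEW",
    warehouse="REPORTING_WH", database="ANALYTICS", schema="PUBLIC",
)
```

The request itself would then be sent with any HTTP client; over a PrivateLink connection the same URL resolves to the private endpoint rather than the public internet route.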

Data Push (Client Initiated)

You can configure your systems to push data directly from your Snowflake environment to Cascade's API. This provides your organization with full control over data transfer initiation:

  1. Direct API Push from Your Integration Layer: Your existing applications, custom scripts, or integration platforms will read data from your Snowflake instance and then send it to Cascade's API endpoint.

    • Cascade API Endpoint: We will provide you with the specific Cascade API ingestion endpoint.

      • This endpoint can accommodate both bulk data transfers and individual record updates.

      • At your request and subject to Cascade's approval, we can configure IP whitelisting for your dedicated Cascade API ingestion endpoint.
    • Authentication: Authenticate requests using the provided Cascade API key, which can be rotated at any time.

    • Trigger Configuration: Configure your system (e.g., custom scripts, webhooks, scheduled jobs) to trigger and send data.

    • PrivateLink Integration (Optional): PrivateLink connectivity can be established from your environment to Cascade's API ingestion endpoint for secure, non-public data transmission.

  2. Push Initiated from Within Snowflake (via External Functions): Leverage Snowflake's External Functions to enable your Snowflake database to directly call an external API endpoint (provided by Cascade) based on SQL operations or data events. This allows for event-driven or programmatic pushes directly from Snowflake.

    • Network Connectivity:

      • Ensure necessary firewall rules permit outbound connections from your Snowflake VPC to Cascade's API endpoint.

    • Trigger Configuration: This push is initiated via SQL statements or scheduled tasks within Snowflake.
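
A client-initiated push from your integration layer (option 1 above) can be sketched as below. The Cascade endpoint URL and the payload shape are illustrative assumptions; Cascade supplies the actual ingestion endpoint, API key, and expected record format during scoping.

```python
import json
import urllib.request

# Illustrative values only -- Cascade provides the real endpoint and API key.
CASCADE_ENDPOINT = "https://api.cascade.example/v1/metrics/ingest"
CASCADE_API_KEY = "<cascade-api-key>"   # can be rotated at any time

def build_push_request(rows):
    """Package rows read from Snowflake into an authenticated POST request."""
    payload = json.dumps({"records": rows}).encode("utf-8")
    return urllib.request.Request(
        CASCADE_ENDPOINT,
        data=payload,
        headers={
            "Authorization": f"Bearer {CASCADE_API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# In practice, `rows` would come from a query your integration layer runs
# against Snowflake; hard-coded sample records stand in for that here.
rows = [{"metric": "arr", "value": 1250000}, {"metric": "nps", "value": 62}]
request = build_push_request(rows)
# urllib.request.urlopen(request) would send it; omitted in this sketch.
```

The same request construction applies whether the trigger is a scheduled job, a webhook, or a Snowflake External Function calling out to the endpoint.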

File Transfer

File transfer offers a flexible approach for data exchange, allowing Cascade to retrieve files from your designated storage.

  • Authentication & Access: Provide credentials (e.g., AWS S3 bucket access keys, SFTP username/password, Azure Blob Storage SAS token) to your designated cloud storage location where your data is exported.

  • File Location: Specify the exact bucket/container name and file path/prefix within your storage.

  • Network Connectivity: 

    • Ensure necessary firewall rules permit outbound connections.
    • PrivateLink Integration: For enhanced security, a PrivateLink connection can be established to your cloud storage (e.g., S3 VPC Endpoint) for direct, private data retrieval by Cascade.
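
For the file-transfer path, Cascade typically locates the latest export under the agreed prefix and parses it. The bucket name, prefix layout, and CSV column names below are illustrative assumptions; the actual file location and naming convention are agreed during scoping, and retrieval would use your storage provider's client (e.g., an S3 SDK) rather than the in-memory list shown here.

```python
import csv
import io

# Illustrative bucket/prefix layout -- see "File Location" above.
BUCKET = "acme-snowflake-exports"
PREFIX = "cascade/metrics/"

def latest_export(keys, prefix=PREFIX):
    """Pick the most recent dated export file under the agreed prefix."""
    dated = [k for k in keys if k.startswith(prefix) and k.endswith(".csv")]
    return max(dated) if dated else None  # ISO-dated names sort chronologically

def parse_export(csv_text):
    """Parse an exported CSV (metric_name,metric_value) into records."""
    return list(csv.DictReader(io.StringIO(csv_text)))

# Stand-in for an object listing returned by your storage provider.
keys = [
    "cascade/metrics/2024-05-01.csv",
    "cascade/metrics/2024-05-02.csv",
    "cascade/metrics/_manifest.json",
]
key = latest_export(keys)
rows = parse_export("metric_name,metric_value\narr,1250000\n")
```

With a PrivateLink connection (e.g., an S3 VPC Endpoint), the same retrieval flow runs entirely over the private network.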

Next Steps & Support

To ensure the most efficient and effective integration of your Snowflake data with Cascade, we recommend reviewing this article alongside your specific requirements before your initial consultation.

Prepare for Your Integration: Key Considerations

We encourage you to assess your internal environment and data flow preferences. Considering these points will facilitate a more focused and productive initial consultation:

  • Internal Stakeholder Involvement: Identify which internal teams (e.g., IT, Security, Snowflake administrators) would need to be involved from your side to provide necessary access or configure data synchronization.

  • Data Flow Initiation Preference: Based on your internal policies and system capabilities, determine whether a Push, Pull, or File Transfer method would be best.
  • Provide Relevant Details Based on Your Initiation Preference:

    • Once you've determined your preferred data transfer method, identify and prepare the corresponding authentication and access credentials for our integration team. These details will vary significantly based on the method chosen.

      Our integration platform will then securely utilize these credentials to manage the necessary authentication flows for accessing your data.

Initiate Your Integration Journey: Discovery & Solution Design

  • Collaborative Assessment: We will conduct a detailed consultation to collaboratively evaluate your specific environment, security requirements, and data integration needs. This assessment is crucial for designing the most suitable data transfer method and overall integration strategy tailored to your exact situation.

  • Scoping & Planning: Following this discovery, we will proceed with detailed scoping and planning for your custom Snowflake integration.

Ongoing Assistance & Troubleshooting

  • During Setup: For any questions that arise during the integration setup process, your Solutions Architect is your primary contact.

  • When Contacting Support: To expedite resolution, please provide comprehensive details including timestamps, specific Snowflake Account/Database/Schema names, any error messages received, and relevant screenshots.

 


FAQs

What is the typical data synchronization frequency for this integration?

The integration typically runs once per day (e.g., at 8:00 PM BST). Specific schedules can be discussed and customized during the initial scoping process to best fit your operational needs.

 

Who is responsible for the ongoing maintenance and monitoring of the custom Snowflake integration?

Cascade is responsible for the ongoing build, monitoring, and maintenance of the custom integration once it's deployed. This includes managing the connection and data flow logic.

 

How is my data secured during transfer from Snowflake to Cascade?

Data is secured in transit using industry-standard encryption protocols (TLS 1.2/1.3). Once ingested, data is encrypted at rest using industry-standard algorithms (e.g., AES-256). For enhanced security and privacy, PrivateLink connections can be configured to ensure data bypasses the public internet entirely, traveling over a private network.

 

What happens if our Snowflake schema, table, or view structure changes?

Changes to the underlying schema, table, or view structure in Snowflake may impact the integration's functionality. We recommend notifying your Solutions Architect of any planned changes in advance to allow for necessary re-scoping and adjustment of the integration's queries or transformations.

 

Can Cascade integrate data from multiple Snowflake instances or accounts?

Yes, Cascade can be configured to integrate data from multiple distinct Snowflake instances or accounts. Each instance or account would typically require a separate integration setup, allowing for tailored configurations.

 

Are there performance considerations when querying very large datasets from Snowflake?

Yes, for very large or complex datasets, optimizing query performance is key. During scoping, we will work with you to define efficient queries, leverage dedicated Snowflake warehouses for the integration, and utilize appropriate data transfer methods to ensure optimal performance and minimal impact on your Snowflake environment.