How to crawl Snowflake

Once you have configured the Snowflake user permissions, you can establish a connection between Atlan and Snowflake. (If you are also using AWS PrivateLink or Azure Private Link for Snowflake, you will need to set that up first, too.)

To crawl metadata from Snowflake, review the order of operations and then complete the following steps.

Select the source

To select Snowflake as your source:

  1. In the top right of any screen, navigate to New and then click New Workflow.
  2. From the list of packages, select Snowflake Assets and click Setup Workflow.

Provide credentials

To enter your Snowflake credentials:

  1. For Account Identifiers (Host), enter the hostname for your Snowflake instance (typically in the form <account_identifier>.snowflakecomputing.com), or the AWS PrivateLink or Azure Private Link endpoint if you use one.
  2. For Authentication, choose the method you configured when setting up the Snowflake user:
    • For Basic authentication, enter the Username and Password you configured in either Snowflake or the identity provider.
    • For Keypair authentication, enter the Username, Encrypted Private Key, and Private Key Password (if any) you configured.
    • For Okta SSO authentication, enter the Username, Password, and Authenticator you configured. The Authenticator is the URL endpoint of your Okta account, typically in the form https://<okta_account_name>.okta.com.
  3. For Role, select the Snowflake role through which the crawler should run.
  4. For Warehouse, select the Snowflake warehouse in which the crawler should run.
  5. Click Test Authentication to confirm connectivity to Snowflake using these details.
  6. Once successful, at the bottom of the screen, click Next.

Configure the connection

To complete the Snowflake connection configuration:

  1. Provide a Connection Name that represents your source environment. For example, you might use values like production, development, gold, or analytics.
  2. (Optional) To change the users able to manage this connection, change the users or groups listed under Connection Admins.
    🚨 Careful! If you do not specify any user or group, nobody will be able to manage the connection — not even admins.
  3. (Optional) To prevent users from querying any Snowflake data, change Allow SQL Query to No.
  4. (Optional) To prevent users from previewing any Snowflake data, change Allow Data Preview to No.
  5. At the bottom of the screen, click Next to proceed.

Configure the crawler

🚨 Careful! When modifying an existing Snowflake connection, switching to a different extraction method will delete and recreate all assets in the existing connection. If you'd like to change the extraction method, contact Atlan support for assistance.

Before running the Snowflake crawler, you can further configure it.

You must select the extraction method you configured when you set up Snowflake for Atlan. For the Account Usage method, specify:

  • the Database Name of the copied Snowflake database
  • the Schema Name of the copied ACCOUNT_USAGE schema

You can override the defaults for any of the remaining options:

  • For Asset selection, select a filtering option:
    • To select the assets you want to include in crawling, click Include by hierarchy and filter for assets down to the database or schema level. (If none are specified, all assets are included by default.)
    • To have the crawler include Databases, Schemas, or Tables & Views based on a naming convention, click Include by regex and specify a regular expression — for example, specifying ATLAN_EXAMPLE_DB.* for Databases will include all the matching databases and their child assets.
    • To select the assets you want to exclude from crawling, click Exclude by hierarchy and filter for assets down to the database or schema level. (If none are specified, no assets are excluded by default.)
    • To have the crawler ignore Databases, Schemas, or Tables & Views based on a naming convention, click Exclude by regex and specify a regular expression — for example, specifying ATLAN_EXAMPLE_TABLES.* for Tables & Views will exclude all the matching tables and views.
    • Click + to add more filters. If you add multiple filters, assets will be crawled based on matching all the filtering conditions you have set.
  • To exclude lineage for views in Snowflake, change View Definition Lineage to No.
  • To import tags from Snowflake to Atlan, change Import Tags to Yes.
    🚨 Careful! Object tagging in Snowflake currently requires Enterprise Edition or higher. If your organization does not have Enterprise Edition or higher and you try to import Snowflake tags to Atlan, the Snowflake connection will fail with an error — unable to retrieve tags.
  • For Control Config, keep Default for the default configuration or click Custom to further configure the crawler:
    • If you have received a custom crawler configuration from Atlan support, for Custom Config, enter the value provided. You can also:
      • Enter {"ignore-all-case": true} to enable crawling assets with case-sensitive identifiers.
    • For Enable Source Level Filtering, click True to enable schema-level filtering at source or keep False to disable it.
    • For Use JDBC Internal Methods, click True to enable JDBC internal methods for data extraction or click False to disable it.
    • For Exclude tables with empty data, change to Yes to exclude any tables and corresponding columns without any data.
    • For Exclude views, change to Yes to exclude all views from crawling.
💪 Did you know? If an asset appears in both the include and exclude filters, the exclude filter takes precedence.
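The way include and exclude regex filters combine, with exclude taking precedence, can be sketched in a few lines of Python. The function and sample database names below are illustrative, not Atlan's actual implementation.

```python
import re

# Illustrative sketch of combining include and exclude regex filters.
# The function and sample names are made up; Atlan's real matching may differ.
def is_crawled(name, include_regex=None, exclude_regex=None):
    if exclude_regex and re.fullmatch(exclude_regex, name):
        return False  # an asset matching both filters is excluded
    if include_regex:
        return re.fullmatch(include_regex, name) is not None
    return True       # no include filter: include everything by default

databases = ["ATLAN_EXAMPLE_DB_1", "ATLAN_EXAMPLE_DB_2", "SALES_DB"]
crawled = [d for d in databases
           if is_crawled(d, include_regex=r"ATLAN_EXAMPLE_DB.*",
                         exclude_regex=r".*_2")]
# ATLAN_EXAMPLE_DB_1 passes the include filter; ATLAN_EXAMPLE_DB_2 matches
# both filters and is dropped because exclude wins; SALES_DB never matched
# the include filter.
```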

Run the crawler

To run the Snowflake crawler, after completing the steps above:

  1. To check for any permissions or other configuration issues before running the crawler, click Preflight checks.
  2. You can either:
    • To run the crawler once immediately, at the bottom of the screen, click the Run button.
    • To schedule the crawler to run hourly, daily, weekly, or monthly, at the bottom of the screen, click the Schedule Run button.
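Workflow schedules like these are conventionally expressed as five-field cron expressions. The mapping below is a generic illustration of the four frequencies, not the exact expressions Atlan generates.

```python
# Generic cron expressions for the four schedule frequencies; illustrative only,
# not the values Atlan produces when you click Schedule Run.
SCHEDULES = {
    "hourly":  "0 * * * *",  # at minute 0 of every hour
    "daily":   "0 5 * * *",  # every day at 05:00
    "weekly":  "0 5 * * 1",  # every Monday at 05:00
    "monthly": "0 5 1 * *",  # on the 1st of every month at 05:00
}

def cron_fields(expr):
    """Split a five-field cron expression into its named parts."""
    minute, hour, day_of_month, month, day_of_week = expr.split()
    return {"minute": minute, "hour": hour, "day_of_month": day_of_month,
            "month": month, "day_of_week": day_of_week}
```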

Once the crawler has finished running, you will see the assets on the assets page in Atlan! 🎉

Note that the Atlan crawler will currently skip any unsupported data types to ensure a successful workflow run.
