How to crawl Looker

Once you have configured the Looker user permissions, you can establish a connection between Atlan and Looker.

To crawl metadata from Looker, review the order of operations and then complete the following steps.

Select the source

To select Looker as your source:

  1. In the top right of any screen, navigate to New and then click New Workflow.
  2. From the list of packages, select Looker Assets and click Setup Workflow.

Provide credentials

Choose your extraction method:

Direct extraction method

To enter your Looker credentials:

  1. For Host Name, enter the full URL for your Looker API host, including the https://.
  2. For Port, keep 443 for Looker instances created after July 7, 2020, or switch to 19999 for older instances.
  3. For Client ID, enter the client ID you generated when setting up user permissions.
  4. For Client Secret, enter the client secret you generated when setting up user permissions.
  5. (Optional) For Field Level Lineage:
    1. For Private SSH Key, paste the private SSH key for the key you configured in GitHub.
    2. For Passphrase for the private key, enter the passphrase that protects the key. If the key is not protected by a passphrase, leave this blank.
  6. At the bottom of the form, click the Test Authentication button to confirm connectivity to Looker using these details.
  7. When successful, at the bottom of the screen click the Next button.
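
Outside of Atlan, you can sanity-check the same credentials against Looker's API login endpoint (`POST /api/4.0/login`). The sketch below uses only the Python standard library; the host name is a hypothetical example, and you would substitute your own client ID and secret:

```python
# Sketch: verify a Looker client ID/secret pair against the API login
# endpoint, using only the standard library. Host name is hypothetical.
import urllib.error
import urllib.parse
import urllib.request


def build_login_url(host: str, port: int = 443) -> str:
    """Build the Looker API 4.0 login URL.

    Keep port 443 for instances created after July 7, 2020;
    older instances use 19999.
    """
    return f"{host.rstrip('/')}:{port}/api/4.0/login"


def check_authentication(host: str, client_id: str, client_secret: str,
                         port: int = 443) -> bool:
    """Return True if the client ID/secret pair authenticates successfully."""
    payload = urllib.parse.urlencode(
        {"client_id": client_id, "client_secret": client_secret}
    ).encode()
    request = urllib.request.Request(
        build_login_url(host, port), data=payload, method="POST"
    )
    try:
        with urllib.request.urlopen(request, timeout=30) as response:
            return response.status == 200
    except urllib.error.HTTPError:
        return False
```

If this returns True outside Atlan but Test Authentication fails inside it, the problem is likely network connectivity between Atlan and your Looker instance rather than the credentials themselves.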

S3 extraction method

Atlan also supports the S3 extraction method for fetching metadata from Looker. This method uses Atlan's looker-extractor tool to fetch metadata. You will need to first extract the metadata yourself and then make it available in S3.

To enter your S3 details:

  1. For S3 bucket name, enter the name of your S3 bucket. If you are reusing Atlan's S3 bucket, you can leave this blank.
  2. For S3 prefix, enter the S3 prefix under which all the metadata files exist. These include projects.json, dashboards.json, and so on.
  3. For S3 region, enter the name of the S3 region.
  4. When complete, at the bottom of the screen click Next.
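
As a rough sketch of what "making the metadata available in S3" looks like, the helper below uploads the extracted files under the configured prefix. The bucket name, prefix, and local directory are hypothetical, and the file list is limited to the names mentioned above (projects.json, dashboards.json, and so on):

```python
# Sketch: place pre-extracted Looker metadata files under the configured
# S3 prefix. Bucket, prefix, and local paths are hypothetical examples.
import os


def object_key(prefix: str, filename: str) -> str:
    """Join the configured S3 prefix and a metadata file name into an object key."""
    return f"{prefix.strip('/')}/{filename}"


def upload_metadata(bucket: str, prefix: str, local_dir: str, filenames) -> None:
    """Upload each extracted metadata file under the configured prefix.

    Assumes boto3 is installed and AWS credentials are configured.
    """
    import boto3  # imported lazily so object_key stays dependency-free
    s3 = boto3.client("s3")
    for name in filenames:
        s3.upload_file(os.path.join(local_dir, name), bucket,
                       object_key(prefix, name))
```

With a prefix of `looker/metadata`, for example, projects.json would land at the object key `looker/metadata/projects.json`, which is the layout the crawler expects to read from.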

Configure the connection

To complete the Looker connection configuration:

  1. Provide a Connection Name that represents your source environment. For example, you might want to use values like production, development, gold, or analytics.
  2. (Optional) To change the users able to manage this connection, change the users or groups listed under Connection Admins.
    🚨 Careful! If you do not specify any user or group, nobody will be able to manage the connection, not even admins.
  3. At the bottom of the screen, click the Next button to proceed.

Configure the crawler

Before running the Looker crawler, you can further configure it. (These options are only available when using the direct extraction method.)

You can override the defaults for any of these options:

  • Looker folders contain saved content, such as dashboards, looks, and tiles:
    • To select the Looker folders you want to include in crawling, click Include Folders. (If none are specified, all folders will be crawled.)
    • To select the Looker folders you want to exclude from crawling, click Exclude Folders. (If none are specified, no folders will be excluded.)
  • Looker projects contain LookML files, such as models, views, and explores:
    • To select the Looker projects you want to include in crawling, click Include Projects. (If none are specified, all projects will be crawled.)
    • To select the Looker projects you want to exclude from crawling, click Exclude Projects. (If none are specified, no projects will be excluded.)
  • For Use Field Level Lineage, click True to enable crawling field-level lineage for Looker or click False to disable it.
💪 Did you know? If a folder or project appears in both the include and exclude filters, the exclude filter takes precedence.
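
The filter behavior described above can be sketched as a small decision function. This is an illustration of the stated rules (empty include means "crawl everything", exclude wins over include), not Atlan's actual implementation:

```python
def should_crawl(name: str, include: list, exclude: list) -> bool:
    """Decide whether a folder or project is crawled.

    Illustrates the documented filter rules: the exclude filter takes
    precedence, an empty include list defaults to all, and an empty
    exclude list defaults to none.
    """
    if name in exclude:      # exclude always wins
        return False
    # empty include list means every remaining item is crawled
    return not include or name in include
```

For example, a folder listed in both filters is skipped, while with both filters empty every folder is crawled.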

Run the crawler

To run the Looker crawler, after completing the steps above:

  1. To check for any permissions or other configuration issues before running the crawler, click Preflight checks.
  2. You can then either:
    • Run the crawler once, immediately: at the bottom of the screen, click the Run button.
    • Schedule the crawler to run hourly, daily, weekly, or monthly: at the bottom of the screen, click the Schedule Run button.
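
For reference, the four scheduling cadences correspond to standard cron expressions like the ones below. These exact expressions are an assumption for illustration; Atlan's scheduler UI manages the timing for you:

```python
# Illustrative mapping of the Schedule Run cadences to standard cron
# expressions (assumed for illustration; the UI handles this for you).
SCHEDULES = {
    "hourly": "0 * * * *",   # at minute 0 of every hour
    "daily": "0 0 * * *",    # at midnight every day
    "weekly": "0 0 * * 0",   # at midnight every Sunday
    "monthly": "0 0 1 * *",  # at midnight on the 1st of each month
}
```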

Once the crawler has finished running, you will see the assets on Atlan's assets page! 🎉
