Once you have configured the Looker user permissions, you can establish a connection between Atlan and Looker.
To crawl metadata from Looker, review the order of operations and then complete the following steps.
Select the source
To select Looker as your source:
- In the top right of any screen, navigate to New and then click New Workflow.
- From the list of packages, select Looker Assets and click Setup Workflow.
Provide credentials
Choose your extraction method:
- In Direct extraction, Atlan connects to Looker and crawls metadata directly.
- In Offline extraction, you need to first extract metadata yourself and make it available in S3.
Direct extraction method
To enter your Looker credentials:
- For Host Name, enter the full URL for your Looker API host, including the `https://` prefix.
- For Port, keep 443 for Looker instances created after July 7, 2020, or switch to 19999 for older instances.
- For Client ID, enter the client ID you generated when setting up user permissions.
- For Client Secret, enter the client secret you generated when setting up user permissions.
- (Optional) For Field Level Lineage:
- For Private SSH Key, paste the private SSH key for the key you configured in GitHub.
- For Passphrase for the private key, enter the passphrase that protects the key, if any. (If the key is not protected by a passphrase, leave this blank.)
- For SSH Known Hosts, add any value that needs to be hardcoded in the `~/.ssh/known_hosts` file before cloning your project Git repositories using SSH. (If not required, leave this blank.)
- At the bottom of the form, click the Test Authentication button to confirm connectivity to Looker using these details.
- When successful, at the bottom of the screen click the Next button.
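If Test Authentication fails and you want to rule out a credentials problem, you can exercise the same Looker API login outside Atlan. The sketch below assumes the Looker API 4.0 `/login` endpoint; the host, port, client ID, and client secret are placeholders for the values you entered above.

```python
import requests

# Placeholder values: use the same host, port, client ID, and client secret
# you entered in the Atlan form above.
LOOKER_HOST = "https://your-instance.cloud.looker.com"
LOOKER_PORT = 443
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"


def test_looker_login() -> None:
    """Request an API token from Looker to confirm the credentials work."""
    response = requests.post(
        f"{LOOKER_HOST}:{LOOKER_PORT}/api/4.0/login",
        data={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET},
        timeout=30,
    )
    response.raise_for_status()
    token = response.json().get("access_token")
    if token:
        print("Login succeeded: Looker returned an access token.")
    else:
        print("Login responded, but no access token was returned.")


if __name__ == "__main__":
    test_looker_login()
```

A successful response includes an access token, which is roughly the connectivity check you need to pass before moving on.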
Offline extraction method
Atlan also supports offline extraction for fetching metadata from Looker. This method uses Atlan's looker-extractor tool: you first extract the metadata yourself and then make it available in S3.
To enter your S3 details:
- For Bucket name, enter the name of your S3 bucket or Atlan's bucket.
- For Bucket prefix, enter the S3 prefix under which all the metadata files exist. These include `projects.json`, `dashboards.json`, and so on.
- For Bucket region, enter the name of the S3 region.
- When complete, at the bottom of the screen click Next.
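Before running the workflow, you may want to confirm that the extracted metadata files are actually present under the bucket and prefix you entered. Below is a minimal sketch using boto3; the bucket, prefix, and region values are placeholders, and the expected file names (`projects.json`, `dashboards.json`) come from the description above.

```python
import boto3

# Placeholder values: use the bucket, prefix, and region you entered in the form above.
BUCKET = "your-metadata-bucket"
PREFIX = "looker/extracts/"
REGION = "us-east-1"


def list_extracted_metadata() -> None:
    """List objects under the prefix so you can confirm files like projects.json exist."""
    s3 = boto3.client("s3", region_name=REGION)
    response = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
    keys = [obj["Key"] for obj in response.get("Contents", [])]
    if not keys:
        print(f"No objects found under s3://{BUCKET}/{PREFIX}")
        return
    for key in keys:
        print(key)


if __name__ == "__main__":
    list_extracted_metadata()
```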
Configure the connection
To complete the Looker connection configuration:
- Provide a Connection Name that represents your source environment. For example, you might want to use values like `production`, `development`, `gold`, or `analytics`.
- (Optional) To change the users able to manage this connection, change the users or groups listed under Connection Admins.
🚨 Careful! If you do not specify any user or group, nobody will be able to manage the connection — not even admins.
- At the bottom of the screen, click the Next button to proceed.
Configure the crawler
Before running the Looker crawler, you can further configure it. (These options are only available when using the direct extraction method.)
You can override the defaults for any of these options (a sketch after this list shows one way to enumerate the folders and projects in your instance):
- Looker folders contain saved content, such as dashboards, looks, and tiles:
- To select the Looker folders you want to include in crawling, click Include Folders. (This will default to all folders, if none are specified.)
- To select the Looker folders you want to exclude from crawling, click Exclude Folders. (This will default to no folders, if none are specified.)
- Looker projects contain LookML files, such as models, views, and explores:
- To select the Looker projects you want to include in crawling, click Include Projects. (This will default to all projects, if none are specified.)
- To select the Looker projects you want to exclude from crawling, click Exclude Projects. (This will default to no projects, if none are specified.)
- For Use Field Level Lineage, click True to enable crawling field-level lineage for Looker or click False to disable it.
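If you are unsure which folders or projects to include or exclude, you can list what exists in your Looker instance before configuring these filters. The sketch below assumes the Looker API 4.0 `/folders` and `/projects` endpoints; the host, port, and credential values are placeholders, the same ones used in the login sketch above.

```python
import requests

# Placeholder values: reuse the host, port, and credentials from the login sketch above.
LOOKER_HOST = "https://your-instance.cloud.looker.com"
LOOKER_PORT = 443
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
BASE_URL = f"{LOOKER_HOST}:{LOOKER_PORT}/api/4.0"


def get_token() -> str:
    """Log in to the Looker API and return an access token."""
    response = requests.post(
        f"{BASE_URL}/login",
        data={"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["access_token"]


def list_folders_and_projects() -> None:
    """Print folder and project names to help decide include/exclude filters."""
    headers = {"Authorization": f"token {get_token()}"}

    folders = requests.get(f"{BASE_URL}/folders", headers=headers, timeout=30).json()
    print("Folders:")
    for folder in folders:
        print(f"  {folder.get('name')}")

    projects = requests.get(f"{BASE_URL}/projects", headers=headers, timeout=30).json()
    print("Projects:")
    for project in projects:
        print(f"  {project.get('name')}")


if __name__ == "__main__":
    list_folders_and_projects()
```

The names printed here are the values you would type into the Include/Exclude filters above.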
Run the crawler
After completing the steps above, to run the Looker crawler:
- To check for any permissions or other configuration issues before running the crawler, click Preflight checks.
- You can either:
- To run the crawler once immediately, at the bottom of the screen, click the Run button.
- To schedule the crawler to run hourly, daily, weekly, or monthly, at the bottom of the screen, click the Schedule Run button.
Once the crawler has finished running, you will see the assets on Atlan's assets page! 🎉