How to crawl on-premises Tableau

Once you have set up the tableau-extractor tool, you can extract metadata from your on-premises Tableau instance by completing the following steps.

Run tableau-extractor

Crawl all Tableau connections

To crawl all Tableau connections using the tableau-extractor tool:

  1. Log into the server with Docker Compose installed.
  2. Change to the directory containing the compose file.
  3. Run Docker Compose: sudo docker-compose up

Crawl a specific connection

To crawl a specific Tableau connection using the tableau-extractor tool:

  1. Log into the server with Docker Compose installed.
  2. Change to the directory containing the compose file.
  3. Run Docker Compose: sudo docker-compose up <connection-name>

(Replace <connection-name> with the name of the connection from the services section of the compose file.)
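For reference, connection names correspond to service names in the compose file. The sketch below is illustrative only — the service name tableau-example, image tag, and volume path are hypothetical, and your actual compose file is the one created during setup:

```yaml
# Illustrative sketch only — your real compose file is generated during setup.
version: "3"
services:
  tableau-example:                    # this service name is the <connection-name>
    image: tableau-extractor:latest   # hypothetical image tag
    volumes:
      - ./output:/output              # extracted JSON files land here (illustrative path)
```

With a file like this, sudo docker-compose up tableau-example would crawl only that connection.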

(Optional) Review generated files

The tableau-extractor tool generates a folder of JSON files for each extracted asset type. For example:

  • calculated_fields
  • dashboards
  • datasources
  • workbooks
  • and many others

You can inspect these files to verify that the extracted metadata is complete and acceptable before providing it to Atlan.
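As a quick sanity check, you can verify that every generated file parses as valid JSON. The sketch below fabricates a sample output file so it runs as-is; in practice, point find at your real output directory instead (the paths and file contents here are illustrative, not produced by the tool):

```shell
# Sample layout for illustration only — your real files come from tableau-extractor.
mkdir -p output/tableau-example/workbooks
echo '{"name": "Sales Workbook"}' > output/tableau-example/workbooks/result-0.json

# Report any file that fails to parse as JSON; no output means all files are valid.
find output/tableau-example -name '*.json' -print0 \
  | xargs -0 -n1 sh -c 'python3 -m json.tool "$0" > /dev/null || echo "invalid JSON: $0"'
```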

Upload generated files to S3

To provide Atlan access to the extracted metadata, upload it to an S3 bucket.

💪 Did you know? We recommend uploading to the same S3 bucket that Atlan uses, to avoid access issues. Reach out to your Data Success Manager for the details of your Atlan bucket. To create your own bucket instead, refer to the Create your own S3 bucket section of the dbt documentation. (The steps are exactly the same.)

To upload the metadata to S3:

  1. Ensure that all files for a particular connection share the same prefix. For example: output/tableau-example/dashboards/result-0.json, output/tableau-example/workbooks/result-0.json, and so on.
  2. Upload the files to the S3 bucket using your preferred method.

For example, to upload all files using the AWS CLI:

aws s3 cp output/tableau-example s3://my-bucket/metadata/tableau-example --recursive

Crawl metadata in Atlan

Once you have extracted metadata on-premises and uploaded the results to S3, you can crawl the metadata into Atlan. When configuring the Tableau crawler, be sure to select S3 as the Extraction method.

