In some cases you will not be able to expose your Microsoft Azure Synapse Analytics instance for Atlan to mine query history from the Query Store. For example, this may happen when security requirements restrict access to sensitive, mission-critical data.
In such cases you may want to decouple the mining of query history from its ingestion in Atlan. This approach gives you full control over your resources and metadata transfer to Atlan.
Once you have mined query history on-premises and uploaded the results to S3, you can then ingest the query history into Atlan.
Prerequisites
To mine query history from your on-premises Microsoft Azure Synapse Analytics instance, you will need to use Atlan's synapse-miner tool.
Install Docker Compose
Docker Compose is a tool for defining and running applications composed of many Docker containers. (Any guesses where the name came from?)
To install Docker Compose, follow the official Docker Compose installation instructions for your operating system.
Get the synapse-miner tool
To get the synapse-miner tool:
- Raise a support ticket to get a link to the latest version.
- Download the image using the link provided by support.
- Load the image to the server you'll use to mine Microsoft Azure Synapse Analytics:
```shell
sudo docker load -i /path/to/synapse-miner-master.tar
```
Get the compose file
Atlan provides you with a configuration file for the synapse-miner tool. This is a Docker Compose file.
To get the compose file:
- Download the latest compose file.
- Save the file to an empty directory on the server you'll use to access your on-premises Microsoft Azure Synapse Analytics instance.
- The file is `docker-compose.yml`.
Define database connections
The compose file includes three main sections:
- `x-templates` contains configuration fragments. Ignore this section; do not make any changes to it.
- `services` is where you will define your Microsoft Azure Synapse Analytics connections.
- `volumes` contains mount information. Ignore this section as well; do not make any changes to it.
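Putting the three sections together, the top-level skeleton of the compose file looks roughly like this (the comments are illustrative placeholders, not part of the file Atlan provides):

```yaml
x-templates:
  # configuration fragments provided by Atlan -- do not modify

services:
  # define your Microsoft Azure Synapse Analytics connections here

volumes:
  # mount information provided by Atlan -- do not modify
```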
Define services
For each on-premises Microsoft Azure Synapse Analytics instance, define an entry under `services` in the compose file.
Each entry will have the following structure:
```yaml
services:
  connection-name:
    <<: *mine
    environment:
      <<: *synapsedb
      USERNAME: <USERNAME>
      PASSWORD: <PASSWORD>
      HOST: <HOST>
      PORT: <PORT>
      DATABASE: <DATABASE>
    volumes:
      - ./output/connection-name:/output
```
- Replace `connection-name` with the name of your connection.
- `<<: *mine` tells the synapse-miner tool to run.
- `environment` contains all parameters for the tool:
  - `USERNAME`: specify the database username.
  - `PASSWORD`: specify the database password.
  - `HOST`: specify the database host.
  - `PORT`: specify the database port.
  - `DATABASE`: specify the database name.
- `volumes` specifies where to store results. In this example, the miner will store results in the `./output/connection-name` folder on the local file system.
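For instance, a filled-in entry for a hypothetical connection named `sales-warehouse` might look like the following. The host, port, credentials, and database name are illustrative placeholders; substitute the values for your own instance:

```yaml
services:
  sales-warehouse:
    <<: *mine
    environment:
      <<: *synapsedb
      USERNAME: atlan_miner
      PASSWORD: example-password
      HOST: myworkspace.sql.azuresynapse.net
      PORT: 1433
      DATABASE: salesdb
    volumes:
      - ./output/sales-warehouse:/output
```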
You can add as many Microsoft Azure Synapse Analytics connections as you want. The Docker Compose documentation describes the `services` format in more detail.
Secure credentials
Using Docker secrets
To create and use Docker secrets:
- Create a new Docker secret:
```shell
printf "This is a secret password" | docker secret create my_database_password -
```
- At the top of your compose file, add a `secrets` element to access your secret. The name must match the secret you created:

  ```yaml
  secrets:
    my_database_password:
      external: true
  ```
- Within the `services` section of the compose file, add a new `secrets` element and specify `PASSWORD_SECRET_PATH` to use it as a password.
Example
Let's put these pieces together in a complete example:
```yaml
secrets:
  my_database_password:
    external: true

x-templates:
  # ...

services:
  my-database:
    <<: *mine
    environment:
      <<: *synapsedb
      USERNAME: <USERNAME>
      PASSWORD_SECRET_PATH: "/run/secrets/my_database_password"
      # ...
    volumes:
      # ...
    secrets:
      - my_database_password

volumes:
  jars:
```
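Before the first run, you may also want to pre-create the host-side output directories referenced in your `volumes` mappings. If a bind-mount source directory does not exist, Docker creates it owned by root, which can make later cleanup awkward. A minimal sketch, assuming a single connection named `my-database`:

```shell
# Pre-create the host-side output directory for the bind mount
# ./output/my-database:/output so it is owned by your user rather
# than by root (Docker creates missing bind-mount sources as root).
mkdir -p output/my-database
ls -d output/my-database
```

You can then start all configured miners from the directory containing `docker-compose.yml`, typically with `sudo docker compose up` (or `docker-compose up` on older installations).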