To integrate Snowflake with Abacus.AI, you need to set up the connector and provide the necessary permissions.
This connector uses key-pair authentication.
Gather Required Information:
SNOWFLAKE INSTANCE URL: Click on your account ID, the eight-digit alphanumeric code located at the bottom-left corner of your Snowflake homepage. Then, hover over the resulting pop-up menu to reveal the "copy account URL" button, which resembles a paperclip. Your unique URL generally takes the form https://<domain>.<region>.snowflakecomputing.com, but it may include additional components. This full URL is required for the connection.
WAREHOUSE: Pick the Snowflake warehouse you wish to connect to. You can find the list of warehouses by running SHOW WAREHOUSES in a Snowflake SQL worksheet (see the example after this list). The warehouse is the compute resource in Snowflake.
DATABASE: Pick the Snowflake database that houses the data you wish to access. You can find the list of databases by running SHOW DATABASES in a Snowflake SQL worksheet. The database is where your data is stored.
USER: Choose a unique user name to be created specifically for Abacus.AI. "ABACUSAI" is commonly used, but any name works. This user will be used to connect to Snowflake.
LOGIN NAME (OPTIONAL): Enter the login name if it differs from the username. This field is optional and can be left blank if the login name and username are the same.
ROLE: Pick an existing role, or choose a unique name for the role you'd like Abacus.AI to assume. Abacus.AI will generate a CREATE statement for you to run if the role doesn't exist yet. The role defines the permissions for accessing data.
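As noted above, the warehouse and database lists can be pulled straight from a Snowflake SQL worksheet; SHOW ROLES is also handy if you plan to reuse an existing role:

```sql
-- Run in a Snowflake SQL worksheet to list the resources referenced above.
SHOW WAREHOUSES;
SHOW DATABASES;
SHOW ROLES;
```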
Access Abacus.AI Connected Services Dashboard:
Enter Snowflake Details:
Fill in the Snowflake details gathered above and click Save.
Run Command in Snowflake:
You will get a pop-up with instructions to run a command in Snowflake. This command will create the necessary user and role for Abacus.AI.
Note: You need sufficient permissions to create and authorize the user and role (e.g., ACCOUNTADMIN).
Replace Schema Name:
Replace <Enter schema_name here> in the commands with the schema in your chosen database that you wish to allow Abacus.AI to read from.
Execute Command in Snowflake:
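Run the commands from the pop-up exactly as generated. Purely as a hedged sketch of what they typically cover, the shape is usually along these lines (the role and user names, the placeholders, and the public key below are illustrative assumptions, not the generated output):

```sql
-- Illustrative sketch only: run the commands generated by Abacus.AI, not these.
CREATE ROLE IF NOT EXISTS ABACUSAI_ROLE;
CREATE USER IF NOT EXISTS ABACUSAI
  DEFAULT_ROLE = ABACUSAI_ROLE
  DEFAULT_WAREHOUSE = <your_warehouse>
  RSA_PUBLIC_KEY = '<public_key_from_abacus_ai>';  -- key-pair authentication
GRANT ROLE ABACUSAI_ROLE TO USER ABACUSAI;
GRANT USAGE ON WAREHOUSE <your_warehouse> TO ROLE ABACUSAI_ROLE;
GRANT USAGE ON DATABASE <your_database> TO ROLE ABACUSAI_ROLE;
GRANT USAGE ON SCHEMA <your_database>.<schema_name> TO ROLE ABACUSAI_ROLE;
GRANT SELECT ON ALL TABLES IN SCHEMA <your_database>.<schema_name> TO ROLE ABACUSAI_ROLE;
```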
Verify Connector Setup:
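Before verifying the connector in Abacus.AI, a quick sanity check on the Snowflake side can confirm the grants landed (the user and role names below are the illustrative ones from the sketch above):

```sql
-- Confirm the role holds the expected privileges and the user holds the role.
SHOW GRANTS TO ROLE ABACUSAI_ROLE;
SHOW GRANTS TO USER ABACUSAI;
```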
The Snowflake connector now supports mirroring user permissions by running queries with the user's own access token. This requires each user to connect via OAuth in order to access a DataLLM built on a connector with RBAC enabled.
Connector Creation with RBAC Enabled:
User Permissions in DataLLM:
OAuth Linking for Users:
OAuth linking is managed from <workspace>.abacus.ai/chatllm/admin/connectors-list/.
RBAC Flow in Action:
If using Snowflake Share, please follow these setup instructions:
Create a new Snowflake Share:
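The share itself is created on the Snowflake side. A minimal sketch, assuming you are sharing read access to a single table (all object and share names below are placeholders), might look like this:

```sql
-- Create the share and grant it read access to the objects you want to expose.
CREATE SHARE ABACUSAI_SHARE;
GRANT USAGE ON DATABASE <your_database> TO SHARE ABACUSAI_SHARE;
GRANT USAGE ON SCHEMA <your_database>.<schema_name> TO SHARE ABACUSAI_SHARE;
GRANT SELECT ON TABLE <your_database>.<schema_name>.<table_name> TO SHARE ABACUSAI_SHARE;
```

The consumer account is added later from the share's detail page, as described in the Add Consumer step below.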
Gather Required Information from Snowflake:
Access Abacus.AI Connected Services Dashboard:
Enter Snowflake Details:
Fill in the Snowflake details gathered above and click Save.
Run Command in Snowflake:
Add Consumer:
On the detail page of the share, next to Shared With, click on "Add Consumers."
Paste the Account ID from step 5 into the box under "Share With Snowflake Accounts."
Verify Connector Setup:
Once the Snowflake connector is set up, you can fetch data to train models in Abacus.AI.
Create a New Project:
Create New Dataset:
Name the Dataset:
Read from External Service:
Enter Dataset Details:
Configure Schema Mapping:
After the dataset is uploaded, configure the schema mapping and proceed to train models with the data.
What happens if the underlying Snowflake table changes its schema after I've already created the dataset?
If this happens, go to the relevant dataset's page in Abacus.AI and select Create New Dataset Version.
Then, on the resulting screen, you can select the columns that you would like to include. Our system will automatically detect the schema change, and if you use 'SELECT *', any changes to the schema will be reflected in the new dataset version.
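For example, if the dataset is defined with a wildcard query like the sketch below (the object names are hypothetical placeholders), newly added columns are picked up automatically in the next dataset version:

```sql
-- Hypothetical dataset query; replace the placeholders with your own objects.
-- Because it selects every column, schema additions flow into new dataset versions.
SELECT *
FROM <your_database>.<schema_name>.<table_name>;
```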