The following documentation describes the resources required to deploy Datalogz on Microsoft Azure.
Resource List
(1) Azure Virtual Network
(1 or 2) Azure Virtual Machine w/ SSH Key Pair
(1) Azure Database for PostgreSQL Server
App DB
Warehouse DB
Airflow DB
(1) Azure Key Vault
(0 or 1) Azure Storage Account (ADLS Gen2 Data Lake)
For Snowflake customers:
(1) Snowflake Warehouse
Warehouse DB (replaces the Postgres Warehouse DB)
Virtual Machines
Datalogz runs in Docker containers on virtual machines for a simple, cost-effective deployment that can be scaled vertically as demand increases. You may deploy using either Windows or Linux.
Provisioning
Create new virtual machine(s) to host your Datalogz frontend application and backend API in the region of your choice. You may choose either a Windows 10/11 distribution or a Linux distribution.
The security group inbound rules on this machine should allow HTTP/HTTPS traffic from your private network IP – so your users can access the site.
The security group outbound rules on this machine should allow HTTPS traffic from your network IP – so the Gateway API Service can make HTTPS connections to 3rd-party services such as Microsoft for running the OAuth 2.0 flow.
The security group inbound rules on this machine should allow SSH or RDP traffic from your private team’s IP – so your team can remotely login and deploy the builds.
Ensure you have enabled Managed Identity access on the Virtual Machine in the “Identity” panel of the Virtual Machine page in the Azure portal. This allows the VM’s identity to retrieve keys from the Key Vault, so no sensitive credentials need to be stored in an .env file on the VM itself.
Grant this VM identity access to the Key Vault (see the Key Vault section below).
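As an illustrative sketch only, the same role assignment can be made with the Azure CLI. The resource names below are placeholders, and the role name matches the "Key Vault Secrets Officer" assignment described in the Key Vault section:

```shell
# Hypothetical resource names -- substitute your own.
RG="datalogz-rg"; VM="datalogz-vm"; VAULT="datalogz-kv"

if command -v az >/dev/null; then
  # Object ID of the VM's system-assigned managed identity.
  PRINCIPAL_ID=$(az vm show -g "$RG" -n "$VM" --query identity.principalId -o tsv)
  # Scope the "Key Vault Secrets Officer" role to the vault itself.
  az role assignment create \
    --assignee "$PRINCIPAL_ID" \
    --role "Key Vault Secrets Officer" \
    --scope "$(az keyvault show -n "$VAULT" --query id -o tsv)"
fi
```

The Azure portal steps in the Key Vault section accomplish the same thing without the CLI.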
SSH or RDP into the VM to install Docker and clone the repositories.
VM System Requirements
Option 1: Monolith VM supporting backend and frontend services
Minimum
CPU: 2 vCPU
Memory: 8 GB RAM
OS Disk: 50 GB
Estimated Cost as of 2/1/2023: $80 / mo.
Recommended
CPU: 4 vCPU
Memory: 16 GB RAM
OS Disk: 50 GB
Estimated Cost as of 2/1/2023: $160 / mo.
Total Estimated Cost: $80 - $160 / mo.
Option 2: Split VM each supporting either frontend or backend service
Backend VM
Minimum
CPU: 2 vCPU
Memory: 8 GB RAM
OS Disk: 50 GB
Estimated Cost as of 2/1/2023: $80 / mo.
Recommended
CPU: 4 vCPU
Memory: 16 GB RAM
OS Disk: No changes required
Estimated Cost as of 2/1/2023: $160 / mo.
Frontend VM
Minimum / Recommended
CPU: 2 vCPU
Memory: 4 GB RAM
OS Disk: No change
Estimated Cost as of 2/1/2023: $40 / mo.
Total Estimated Cost: $120 - $200 / mo.
Windows 10/11 Required Images
Datalogz is a portable application that can be deployed to either Windows or Linux machines using Docker to virtualize resources.
Only certain Windows 10/11 VM Images on Azure support nested virtualization. Please choose a VM from this list that has the three asterisks denoting ***Hyper-threaded and capable of running nested virtualization.
Datalogz recommends using the following sizes based on whether you choose Option 1 or 2 above:
Size for Single (Monolith) VM: D4d_v4
Sizes for Split VM:
Backend: D2d_v4
Frontend: D2d_v4
SSH or RDP into VM
SSH
Set the correct permissions on your SSH key pair before connecting to the virtual machine, substituting your key name, user name, and VM IP address.
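A minimal sketch of the permissions fix and the connection (the key file name, user name, and IP below are placeholders):

```shell
KEY_FILE="datalogz-vm-key.pem"   # placeholder -- use your downloaded key pair
touch "$KEY_FILE"                # no-op if the key already exists
chmod 400 "$KEY_FILE"            # ssh refuses private keys readable by group/others

# Replace azureuser and x.x.x.x with your VM user name and IP address:
# ssh -i "$KEY_FILE" azureuser@x.x.x.x
```

Without the chmod, OpenSSH reports "UNPROTECTED PRIVATE KEY FILE" and refuses to use the key.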
RDP
Download the Remote Desktop Protocol (RDP) connection file for the Windows 10/11 VM from the Azure portal and ensure the machine connecting to the remote resources is allowed on port 3389 (RDP).
Installing Docker
Both the Datalogz frontend and backend applications are deployed using Docker.
Docker on Windows 10/11
Docker on Windows 10/11 performs best when using Windows Subsystem for Linux (WSL2). Here are the steps required to install this prerequisite.
Use Remote Desktop Protocol to connect to the VM
Open a Powershell Terminal as Administrator and run the following command to setup Windows Subsystem for Linux:
wsl --install
Restart the VM. When the VM restarts, Windows Subsystem for Linux may start automatically. You can create a new user named dl_windows_linux_user with your own password to access WSL2 directly, but it's unlikely you will need to.
Now you can proceed to install Docker Desktop by following the official docs in the next section.
Install Docker Desktop (Windows and Linux)
Please continue with the Docker Desktop installation referencing the official docs.
After installation has completed, open Docker Desktop and accept the Terms and Conditions so the Docker Engine can start. If the Docker Engine does not start, you may need to disconnect and reconnect to the VM.
Set up credential store by running the command in a Command Prompt:
git config --global credential.helper 'store'
The next time you run git pull on a remote origin and sign-in, your credentials will be cached for future reuse.
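You can confirm the helper is active by reading the setting back. Note that the 'store' helper saves credentials in plain text to ~/.git-credentials, so it is only appropriate on a machine your team controls:

```shell
# Enable the plain-text credential store, then read the setting back.
git config --global credential.helper 'store'
git config --global credential.helper   # prints the active helper: store
```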
SSL Certificates
A Datalogz Proof-of-Concept (POC) deployment uses self-signed certificates generated during the build process to enable encrypted communication over HTTPS, and you will access your VM using either the public IP address of the VM or an Azure-provided DNS name ending in *.cloudapp.azure.com. For example:
Monolith VM
https://x.x.x.x OR https://mono-mycomanywin11.eastus2.cloudapp.azure.com
Split VM
https://x.x.x.x OR https://app-mycomanywin11.eastus2.cloudapp.azure.com
https://x.x.x.y OR https://api-mycomanywin11.eastus2.cloudapp.azure.com
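The build scripts generate these self-signed certificates for you. Purely as an illustration of what such a certificate is, one can be produced manually with openssl (the file names are placeholders; the CN reuses the example hostname above):

```shell
# One-shot self-signed certificate and key for a POC hostname.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout selfsigned.key -out selfsigned.crt \
  -subj "/CN=mono-mycomanywin11.eastus2.cloudapp.azure.com"
```

Browsers will warn about self-signed certificates, which is why production deployments provide their own certificates as described next.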
A Production (PROD) deployment lets you provide your own certificates, so Datalogz can be served from new subdomains on an existing domain. For example:
Monolith VM
https://app.datalogz.mycompany.com
Split VM
https://app.datalogz.mycompany.com
https://api.datalogz.mycompany.com
Key Vault
Datalogz uses Azure Key Vault to store sensitive secrets required to run the application.
Throughout this guide there will be references to secrets, which should be stored in the Key Vault, and environment variables, which should be stored in an .env file in your project folder. Datalogz supports both the ManagedIdentityCredential and the EnvironmentCredential classes to authenticate to a Key Vault, the latter using an App Registration as a Principal.
To set up your Azure Key Vault follow these steps:
Login to Azure Portal
Create a new Key Vault if one has not already been created.
Proceed to either ManagedIdentity or EnvironmentCredential Identity options.
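Whichever option you choose, the AZURE_KEY_VAULT_URL referenced later in this guide follows Azure's standard pattern of vault name plus the vault.azure.net domain. A minimal sketch (the vault name is a placeholder):

```shell
AZURE_KEY_VAULT_NAME="datalogz-kv"   # placeholder -- your vault's name
AZURE_KEY_VAULT_URL="https://${AZURE_KEY_VAULT_NAME}.vault.azure.net"
echo "$AZURE_KEY_VAULT_URL"
```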
ManagedIdentityCredential (System)
Navigate to the Virtual Machine you wish to use as a Managed System Identity and select Identity on the left sidebar.
Enable System Assigned identity and add the following role assignments to the VM:
“Key Vault Secrets Officer”
“Virtual Machine Contributor”
"Virtual Machine User Login"
Navigate to Key Vault and select Access configuration on the left sidebar.
Set permission model to “Azure role-based access control” and click Apply.
IAM permissions have already been configured in the previous step; you can confirm they are present from the Key Vault’s IAM page, if desired.
Add the Key vault to the default subnet in the same virtual network as the VM.
You can find this in the Key Vault > Networking tab.
Select "Allow public access from specific virtual networks and IP addresses."
Add the existing virtual network where your VM is located.
A service endpoint will be created for this subnet.
Note: If the subnet cannot take additional service endpoints, a new subnet will be required.
SSH into VM and install the Azure CLI
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
Login to Azure using 2FA from the SSH terminal, following the prompts
Create a new App Registration to act as a Principal to access the Key Vault
Set Access configuration policy to “vault” access control
Create policy and add the Secrets Officer role to the App Principal.
Add the Key Vault to the default subnet in the same virtual network where the VM is located.
Authenticate to the Key Vault by including the following environment variables in the dot files for these two repositories (after you SSH into the VM):
Repository: datalogz-bi-diagnostics
File: .prod.env
Pull the code and add the following environment variables to a dot file in your project directory named .prod.env using the correct values based on the examples provided.
ENV=PROD
DBT_ENV=prod
WAREHOUSE_TYPE=POSTGRES

# Key Vault Authentication
# Option 1 - If using Managed Identity Access to Key Vault
AZURE_RESOURCE_GROUP_NAME=
AZURE_VM_NAME=
AZURE_KEY_VAULT_NAME=
AZURE_KEY_VAULT_URL=

# Option 2 - If using Environment Access to Key Vault
AZURE_TENANT_ID=
AZURE_CLIENT_ID=
AZURE_CLIENT_SECRET=

# Warehouse
# If using Snowflake, change WAREHOUSE_TYPE to SNOWFLAKE
Key vault
Navigate to your Key Vault in the Azure Portal and add the following secrets, populating them with the correct values based on the examples provided.
# Specifies the URL or connection string to the Celery result backend
# e.g. db+postgresql://<user>:<pass>@<host>/airflow
# If SSL is required include parameters: ?sslmode=require&sslrootcert=/opt/airflow/<name_of_cert>.crt.pem
# Some PostgreSQL servers may require <user> to be in <user@host> format.
AIRFLOW--CELERY--RESULT-BACKEND

# Specifies the URL or connection string to the Airflow metadata database.
# e.g. postgresql+psycopg2://<user>:<pass>@<host>/airflow
# If SSL is required include parameters: ?sslmode=require&sslrootcert=/opt/airflow/<name_of_cert>.crt.pem
# Some PostgreSQL servers may require <user> to be in <user@host> format.
AIRFLOW--DATABASE--SQL-ALCHEMY-CONN

# Specifies the Fernet key used for encrypting and decrypting Airflow connections and variables.
# This must be a 32-byte base64-encoded string
# e.g. jHfPb-mvRhWyofw8bzyCJym-HyKjSNNbwS8bLJjK0Vo=
AIRFLOW-FERNET-KEY

# Specifies the username and password for the Airflow web UI used for debugging.
AIRFLOW-WWW-USER
AIRFLOW-WWW-PASSWORD

# Specifies the private access token used for callbacks to gateway on task success/fail
# e.g. eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c
AIRFLOW-TO-GATEWAY-ACCESS-TOKEN

GATEWAY-SERVICE-BASE-URL=http://bi_gateway_service:5000

# dbt transformation service
DBT-ACCESS-TOKEN

# If using Postgres Warehouse
DIAGNOSTICS-POSTGRES-HOST
DIAGNOSTICS-POSTGRES-DATABASE-NAME=datalogz_wh
DIAGNOSTICS-POSTGRES-PASSWORD
DIAGNOSTICS-POSTGRES-SCHEMA-NAME=public
DIAGNOSTICS-POSTGRES-USERNAME=datalogz_diagnostics_admin

# If using Snowflake Warehouse
SNOWFLAKE-ACCOUNT-IDENTIFIER
DIAGNOSTICS-SNOWFLAKE-WAREHOUSE-NAME=DATALOGZ_BIOPS
DIAGNOSTICS-SNOWFLAKE-DATABASE-NAME=DATALOGZ_WH
DIAGNOSTICS-SNOWFLAKE-PASSWORD
DIAGNOSTICS-SNOWFLAKE-ROLE-NAME=DATALOGZ_DIAGNOSTICS_ADMIN_ROLE
DIAGNOSTICS-SNOWFLAKE-SCHEMA-NAME=PUBLIC
DIAGNOSTICS-SNOWFLAKE-USERNAME=DATALOGZ_DIAGNOSTICS_ADMIN

# (Optional) If using Azure Storage Account
# Specifies the connection string to the Azure Blob Storage account.
# e.g. DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey;EndpointSuffix=core.windows.net
AZURE-BLOB-CONNECTION-STRING
AZURE-BLOB-CONTAINER-NAME
AZURE-BLOB-STORAGE-ACCOUNT-NAME
Repository: datalogz-bi-gateway
File: .env.prod
Pull the code and add the following environment variables to a dot file in your project directory named .env.prod using the correct values based on the examples provided.
ENV=PROD

# This variable is used to construct the redirect URLs for OAuth.
# The hostname value should ideally be set to the host name of the server that serves the frontend to the client.
# The IP address may also be used if the host name is not registered.
HOST_NAME=app.your_hostname.com
HTTP_SCHEME=https
CRON_SERVICE_URL=https://airflow_webserver:8080

# Include either "MICROSOFT" or "TABLEAU_SERVER"
CONFIGURED_IDPS=["MICROSOFT"]

# Create this SQL user in advance following the SQL code in the Databases section
ENTITLEMENT_USER_NAME=datalogz_gateway_user

# Key Vault Authentication
# Option 1 - If using Managed Identity Access to Key Vault
AZURE_RESOURCE_GROUP_NAME=
AZURE_VM_NAME=
AZURE_KEY_VAULT_NAME=
AZURE_KEY_VAULT_URL=

# Option 2 - If using Environment Access to Key Vault
AZURE_TENANT_ID=
AZURE_CLIENT_ID=
AZURE_CLIENT_SECRET=
Key vault
Navigate to your Key Vault in the Azure Portal and add the following secrets, populating them with the correct values based on the examples provided.
# Specifies the connection string to the application's database
# e.g. postgresql://datalogz_gateway_admin:<password>@<host>:<port>/datalogz_bi
# If SSL is required include parameters: ?sslmode=require&sslrootcert=/opt/airflow/<cert_name>.crt.pem
# Some PostgreSQL servers may require <user> to be in <user@host> format.
APP-DB-CONNECTION-STRING

# Specifies the connection string to the application's OLAP warehouse
# e.g. postgresql://datalogz_diagnostics_admin:<password>@<host>:<port>/datalogz_wh?options=-csearch_path%3Dbiops_marts%2Dbiops_general
# If SSL is required include parameters: &sslmode=require&sslrootcert=/opt/airflow/<cert_name>.crt.pem
# Some PostgreSQL servers may require <user> to be in <user@host> format.
BI-DB-CONNECTION-STRING

# Specifies the private access token used for callbacks to gateway on task success/fail
# e.g. eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5c
AIRFLOW-TO-GATEWAY-ACCESS-TOKEN

# Specifies the algorithm used to encode and decode JWT tokens
# The secret should be at least 32 characters long, but the longer the better.
JWT-ALGORITHM=HS256
JWT-SECRET-KEY

# Specifies the API key for the mail client to send emails
MAIL-CLIENT-API-KEY

# Specifies the client ID and Secret for the Microsoft OAuth2 application
MICROSOFT-CLIENT-ID
MICROSOFT-CLIENT-SECRET

# PowerBI API Authentication
# Option 1
# Specifies the client ID and Secret for the Power BI OAuth2 application
POWERBI-CLIENT-ID
POWERBI-CLIENT-SECRET

# Option 2
# Specifies the client ID and Secret for the PowerBI Service Principal
POWERBI-SP-TENANT-ID
POWERBI-SP-CLIENT-ID
POWERBI-SP-CLIENT-SECRET
Azure Mail Client
Datalogz allows Azure customers to easily integrate email notifications into Datalogz using the Microsoft Graph REST API and an Azure App Registration.
Documentation: Setting Up Email Notification with Datalogz Using Microsoft Graph REST API
Introduction
Datalogz now allows users to set up email notifications for their team environments via the Microsoft Graph REST API (specifically, the 'user: sendMail' method). This feature enables any user in an organization to connect their Microsoft account through OAuth. Their email will then be utilized to send out email notifications to team members.
Please note that this feature is available only in private deployments.
How to Connect
Follow these steps to connect your Microsoft account:
Navigate to the 'Settings' page and select 'Email Settings'.
Click the 'Connect' button and execute the OAuth flow.
You will be redirected to the Microsoft consent screen. At this point, select a Microsoft account that has access to a mailbox (in other words, your Microsoft account should have Outlook access).
After granting consent, you will be redirected back to the Datalogz app. If the process is successful, a success message will appear in the popup.
Troubleshooting
If the process fails, the most common issue is that the selected Microsoft account does not have access to a mailbox. To resolve this, you can either ask your Microsoft account admin to provide a license to give your user access to a mailbox or you can choose a different account with mailbox access.
Furthermore, the exact reason for process failure will be detailed in the popup, as shown in the screenshot. This information should help you understand and address the issue.
Considerations
Before initiating the connection process, please ensure that the redirect URL in the Microsoft App has been properly added: https://<host>/api/v0/oauth/azure_mail/redirect. Failure to do so may result in unsuccessful integration between the Datalogz app and your Microsoft account.
Remember to replace <host> with your specific hostname.
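For example (the hostname below is a placeholder), the full redirect URL is built as:

```shell
HOST="app.datalogz.mycompany.com"   # placeholder -- use your own hostname
REDIRECT_URL="https://${HOST}/api/v0/oauth/azure_mail/redirect"
echo "$REDIRECT_URL"
```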
This integration will provide you with a seamless communication experience, ensuring that all team members stay informed and aligned. If you encounter any issues during setup, please refer to our troubleshooting section or contact our support team for further assistance.
Build
Follow these steps to build the Datalogz services on your virtual machine.
To build the Datalogz application, follow these steps:
SSH into a Linux virtual machine to build the following repositories.
datalogz-bi-diagnostics (ELT)
datalogz-bi-gateway (API)
datalogz-bi-frontend (APP)
git pull the main/master branches of each repository
Read the README.md for each repository
Build the services:
Build datalogz-bi-diagnostics
Add the following environment variables to .prod.env
ENV=PROD
DBT_ENV=prod
WAREHOUSE_TYPE=[SNOWFLAKE | POSTGRES]
If you are running Azure Managed Identity VMs, add the following:
AZURE_KEY_VAULT_URL=
AZURE_KEY_VAULT_NAME=
AZURE_VM_NAME=
AZURE_RESOURCE_GROUP_NAME=
From the project directory, run source ./init_env.sh to run through the interactive build script:
Choose env: prod
Choose warehouse: postgres, snowflake
Choose cloud: azure, aws
Choose IAM method for VM: env, identity
Choose vm setup: mono or split
The ./init_env.sh script will build the correct docker compose file based on the options that are chosen.
Build datalogz-bi-gateway
Add the following environment variables to .prod.env
ENV=PROD
PYTHONDONTWRITEBYTECODE=1
CRON_SERVICE_URL=https://airflow_webserver:8080
HTTP_SCHEME=https
HOST_NAME=localhost
Change localhost to your host DNS or private IP of the VM
If you are running Azure Managed Identity VMs, add the following:
AZURE_RESOURCE_GROUP_NAME=
AZURE_VM_NAME=
AZURE_KEY_VAULT_NAME=
AZURE_KEY_VAULT_URL=
Build the services from the project directory:
Monolith VM:
Run docker compose -f docker-compose.mono.prod.yml up --build -d
Split VM:
Run docker compose -f docker-compose.prod.yml up --build -d
Build datalogz-bi-frontend
Add the following environment variables to .env
REACT_APP_IDENTITY_PROVIDER = 'MICROSOFT'
VM_MANAGED_IDENTITY = 'TRUE'
Values for REACT_APP_IDENTITY_PROVIDER can be:
MICROSOFT
TABLEAU_SSO_FOR_CLOUD
TABLEAU_SA
WORKOS
GOOGLE
If REACT_APP_IDENTITY_PROVIDER is set, VM_MANAGED_IDENTITY can also be set to TRUE to embed Service Principal credentials into the API build for new Connectors to use by default.
Build the services from the project directory:
Monolith VM:
Run docker compose -f docker-compose.mono.prod.yml up --build -d
OR
Split VM:
Run docker compose -f docker-compose.dev.yml up --build -d
Databases
The Datalogz application uses OLTP and OLAP databases to optimize performance and scalability between the two workloads.
Database System Requirements
Option 1: Postgres Only
Azure Database for PostgreSQL (flexible server)
Minimum
Memory Optimized
Compute: 2 vCores, 16 GiB Memory, 3200 IOPS
Storage: 512 GB
Estimated Cost as of 2/1/2023: $500 / mo.
Recommended
Memory Optimized
Compute: 4 vCores, 32 GiB Memory, 6400 IOPS
Storage: 512 GB
Estimated Cost as of 2/1/2023: $850 / mo.
Total Estimated Cost: $500 - $850 / mo.
Option 2: Postgres + Snowflake
Postgres
Azure Database for PostgreSQL
Minimum/Recommended
Memory Optimized
Compute: 2 vCores, 8 GiB Memory, 3200 IOPS
Storage: 512 GB
Estimated Cost as of 2/1/2023: $200 / mo.
Snowflake
Minimum
Compute: X-Small Warehouse
Storage: Pay as you go
Estimated Cost as of 2/1/2023 assuming 10 users: $600 / mo.
Recommended
Compute: Small Warehouse
Storage: Pay as you go
Estimated Cost as of 2/1/2023 assuming 10 users: $1000 / mo.
Total Estimated Cost: $800 - $1200 / mo.
Object Creation
Option 1: Postgres Only
App DB (Azure Database for PostgreSQL)
Name: datalogz_bi
Warehouse DB (Azure Database for PostgreSQL)
Name: datalogz_wh
Option 2: Postgres + Snowflake
Postgres
App DB (Azure Database for PostgreSQL)
Name: datalogz_bi
Snowflake
Warehouse DB (Snowflake)
Name: datalogz_wh
Confirm the non-sensitive environment variables listed have been added to .prod.env and that sensitive environment variables have been added to the Key Vault.
---- Run the following commands from the master (postgres) database
---- Run one line at a time
CREATE DATABASE datalogz_bi;
CREATE ROLE datalogz_gateway_admin WITH PASSWORD '<password>' LOGIN; -- Add the password!
GRANT ALL PRIVILEGES ON DATABASE datalogz_bi TO datalogz_gateway_admin WITH GRANT OPTION;
CREATE DATABASE airflow;
CREATE ROLE airflow WITH PASSWORD '<password>' LOGIN; -- Add the password!
GRANT ALL PRIVILEGES ON DATABASE airflow TO airflow;
---- Connect to the datalogz_bi database as datalogz_gateway_admin
---- Run the following commands from the datalogz_bi database
GRANT ALL PRIVILEGES ON SCHEMA public TO datalogz_gateway_admin;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO datalogz_gateway_admin;
---- Run the following commands from the master (postgres) database
-- Setup the BI Warehouse Database (OLAP - App Workflows)
CREATE DATABASE datalogz_wh;
CREATE ROLE datalogz_diagnostics_admin WITH PASSWORD '<password>' LOGIN; -- Add the password!
CREATE ROLE datalogz_gateway_admin WITH PASSWORD '<password>' LOGIN CREATEROLE; -- Add the password!
CREATE ROLE datalogz_gateway_user WITH PASSWORD '<password>' LOGIN; -- Add the password!
CREATE ROLE write_user_role;
CREATE ROLE read_user_role;
GRANT ALL ON DATABASE datalogz_wh TO datalogz_diagnostics_admin WITH GRANT OPTION;
GRANT ALL ON DATABASE datalogz_wh TO datalogz_gateway_admin WITH GRANT OPTION;
GRANT ALL ON DATABASE datalogz_wh TO write_user_role;

---- Run the following commands from the datalogz_wh database
CREATE SCHEMA powerbi_ingest;
CREATE SCHEMA powerbi_source;
CREATE SCHEMA tableau_ingest;
CREATE SCHEMA tableau_source;
CREATE SCHEMA biops_general;
CREATE SCHEMA biops_int;
CREATE SCHEMA biops_marts;
GRANT USAGE ON SCHEMA powerbi_ingest TO datalogz_diagnostics_admin;
GRANT USAGE ON SCHEMA powerbi_source TO datalogz_diagnostics_admin;
GRANT USAGE ON SCHEMA tableau_ingest TO datalogz_diagnostics_admin;
GRANT USAGE ON SCHEMA tableau_source TO datalogz_diagnostics_admin;
GRANT USAGE ON SCHEMA biops_general TO datalogz_diagnostics_admin;
GRANT USAGE ON SCHEMA biops_int TO datalogz_diagnostics_admin;
GRANT USAGE ON SCHEMA biops_marts TO datalogz_diagnostics_admin;
GRANT USAGE ON SCHEMA biops_general TO datalogz_gateway_admin;
GRANT USAGE ON SCHEMA biops_marts TO datalogz_gateway_admin;
GRANT USAGE ON SCHEMA biops_general TO read_user_role;
GRANT USAGE ON SCHEMA biops_marts TO read_user_role;
GRANT ALL ON SCHEMA biops_general TO write_user_role;
ALTER SCHEMA powerbi_ingest OWNER TO datalogz_diagnostics_admin;
ALTER SCHEMA powerbi_source OWNER TO datalogz_diagnostics_admin;
ALTER SCHEMA tableau_ingest OWNER TO datalogz_diagnostics_admin;
ALTER SCHEMA tableau_source OWNER TO datalogz_diagnostics_admin;
ALTER SCHEMA biops_general OWNER TO datalogz_diagnostics_admin;
GRANT read_user_role TO datalogz_gateway_user;
GRANT write_user_role TO datalogz_gateway_admin;
CREATE TABLE biops_general.entitlement (
    account_id int,
    environment_id int,
    connector_id int,
    account_connector_key varchar(128),
    user_name varchar(256),
    system_name varchar(256),
    entitlement_type varchar(256),
    entitlement_status varchar(256),
    created_at timestamp default current_timestamp,
    updated_at timestamp default current_timestamp
);
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA powerbi_ingest TO datalogz_diagnostics_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA powerbi_source TO datalogz_diagnostics_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA tableau_ingest TO datalogz_diagnostics_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA tableau_source TO datalogz_diagnostics_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA biops_general TO datalogz_diagnostics_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA powerbi_ingest TO datalogz_gateway_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA powerbi_source TO datalogz_gateway_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA tableau_ingest TO datalogz_gateway_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA tableau_source TO datalogz_gateway_admin;
GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA biops_general TO datalogz_gateway_admin;
GRANT SELECT ON TABLE datalogz_wh.biops_general.entitlement TO read_user_role;
GRANT INSERT, SELECT, UPDATE ON TABLE datalogz_wh.biops_general.entitlement TO write_user_role;
CREATE EXTENSION IF NOT EXISTS fuzzystrmatch WITH SCHEMA biops_marts;
CREATE EXTENSION IF NOT EXISTS plpgsql;
-- Setup the Application Database (OLTP - App Workflows)
CREATE DATABASE datalogz_bi;
CREATE ROLE datalogz_gateway_admin WITH PASSWORD '<password>' LOGIN; -- Add the password!
GRANT ALL PRIVILEGES ON DATABASE datalogz_bi TO datalogz_gateway_admin WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON SCHEMA public TO datalogz_gateway_admin;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO datalogz_gateway_admin;

-- Setup the Airflow Database (OLTP - API Connector Workflows)
CREATE DATABASE airflow;
CREATE ROLE airflow WITH PASSWORD '<password>' LOGIN; -- Add the password!
GRANT ALL PRIVILEGES ON DATABASE airflow TO airflow;
-- Setup the BI Warehouse Database (OLAP - App Workflows)
USE ROLE ACCOUNTADMIN;
CREATE WAREHOUSE IF NOT EXISTS DATALOGZ_BIOPS WITH WAREHOUSE_SIZE = 'X-SMALL';
USE ROLE SYSADMIN; -- whichever role you want to own this database
CREATE DATABASE DATALOGZ_WH;
CREATE SCHEMA IF NOT EXISTS POWERBI_INGEST;
CREATE SCHEMA IF NOT EXISTS POWERBI_SOURCE;
CREATE SCHEMA IF NOT EXISTS TABLEAU_INGEST;
CREATE SCHEMA IF NOT EXISTS TABLEAU_SOURCE;
CREATE SCHEMA IF NOT EXISTS BIOPS_GENERAL;
CREATE ROLE IF NOT EXISTS DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
CREATE ROLE IF NOT EXISTS DATALOGZ_GATEWAY_ADMIN_ROLE;
CREATE ROLE IF NOT EXISTS WRITE_USER_ROLE;
CREATE ROLE IF NOT EXISTS READ_USER_ROLE;
GRANT CREATE USER ON ACCOUNT TO ROLE WRITE_USER_ROLE;
GRANT ROLE USERADMIN TO ROLE WRITE_USER_ROLE;
GRANT ROLE WRITE_USER_ROLE TO ROLE DATALOGZ_GATEWAY_ADMIN_ROLE;
GRANT USAGE ON WAREHOUSE DATALOGZ_BIOPS TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT USAGE ON WAREHOUSE DATALOGZ_BIOPS TO ROLE DATALOGZ_GATEWAY_ADMIN_ROLE;
GRANT USAGE ON WAREHOUSE DATALOGZ_BIOPS TO ROLE USERADMIN;
GRANT USAGE ON WAREHOUSE DATALOGZ_BIOPS TO ROLE READ_USER_ROLE;
GRANT USAGE, OPERATE ON WAREHOUSE DATALOGZ_BIOPS TO ROLE WRITE_USER_ROLE;
GRANT USAGE ON DATABASE DATALOGZ_WH TO ROLE READ_USER_ROLE;
GRANT USAGE ON DATABASE DATALOGZ_WH TO ROLE WRITE_USER_ROLE;
GRANT ALL PRIVILEGES ON DATABASE DATALOGZ_WH TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE WITH GRANT OPTION;
GRANT ALL PRIVILEGES ON DATABASE DATALOGZ_WH TO ROLE DATALOGZ_GATEWAY_ADMIN_ROLE WITH GRANT OPTION;
GRANT USAGE ON SCHEMA BIOPS_MARTS TO ROLE READ_USER_ROLE;
GRANT USAGE ON SCHEMA BIOPS_GENERAL TO ROLE READ_USER_ROLE;
GRANT USAGE ON SCHEMA BIOPS_GENERAL TO ROLE WRITE_USER_ROLE;
GRANT ALL PRIVILEGES ON SCHEMA POWERBI_INGEST TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT ALL PRIVILEGES ON SCHEMA POWERBI_SOURCE TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT ALL PRIVILEGES ON SCHEMA TABLEAU_INGEST TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT ALL PRIVILEGES ON SCHEMA TABLEAU_SOURCE TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT ALL PRIVILEGES ON SCHEMA BIOPS_GENERAL TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
CREATE USER IF NOT EXISTS DATALOGZ_DIAGNOSTICS_ADMIN WITH PASSWORD = '<password>' DEFAULT_ROLE = DATALOGZ_DIAGNOSTICS_ADMIN_ROLE; -- Add the password!
CREATE USER IF NOT EXISTS DATALOGZ_GATEWAY_ADMIN WITH PASSWORD = '<password>' DEFAULT_ROLE = DATALOGZ_GATEWAY_ADMIN_ROLE; -- Add the password!
GRANT OWNERSHIP ON ROLE READ_USER_ROLE TO ROLE DATALOGZ_GATEWAY_ADMIN_ROLE REVOKE CURRENT GRANTS;
GRANT OWNERSHIP ON ROLE WRITE_USER_ROLE TO ROLE DATALOGZ_GATEWAY_ADMIN_ROLE REVOKE CURRENT GRANTS;
GRANT SELECT, INSERT, UPDATE ON FUTURE TABLES IN SCHEMA POWERBI_INGEST TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT SELECT, INSERT, UPDATE ON FUTURE TABLES IN SCHEMA POWERBI_SOURCE TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT SELECT, INSERT, UPDATE ON FUTURE TABLES IN SCHEMA TABLEAU_INGEST TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT SELECT, INSERT, UPDATE ON FUTURE TABLES IN SCHEMA TABLEAU_SOURCE TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT SELECT, INSERT, UPDATE ON FUTURE TABLES IN SCHEMA BIOPS_GENERAL TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT SELECT, INSERT, UPDATE ON FUTURE TABLES IN SCHEMA BIOPS_GENERAL TO ROLE DATALOGZ_GATEWAY_ADMIN_ROLE;
GRANT ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE TO USER DATALOGZ_DIAGNOSTICS_ADMIN;
GRANT ROLE DATALOGZ_GATEWAY_ADMIN_ROLE TO USER DATALOGZ_GATEWAY_ADMIN;
USE ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
CREATE SCHEMA IF NOT EXISTS DATALOGZ_WH.POWERBI_INGEST;
CREATE SCHEMA IF NOT EXISTS DATALOGZ_WH.POWERBI_SOURCE;
CREATE SCHEMA IF NOT EXISTS DATALOGZ_WH.TABLEAU_INGEST;
CREATE SCHEMA IF NOT EXISTS DATALOGZ_WH.TABLEAU_SOURCE;
CREATE SCHEMA IF NOT EXISTS DATALOGZ_WH.BIOPS_GENERAL;
CREATE OR REPLACE TABLE DATALOGZ_WH.BIOPS_GENERAL.ENTITLEMENT (
    account_id int,
    environment_id int,
    connector_id int,
    user_name varchar(256),
    system_name varchar(256),
    entitlement_type varchar(256),
    entitlement_status varchar(256),
    created_at timestamp default current_timestamp(),
    updated_at timestamp default current_timestamp()
);
GRANT INSERT, SELECT, UPDATE ON TABLE DATALOGZ_WH.BIOPS_GENERAL.ENTITLEMENT TO ROLE WRITE_USER_ROLE;
CREATE FILE FORMAT IF NOT EXISTS json_file_format
    TYPE = 'json' STRIP_OUTER_ARRAY = true STRIP_NULL_VALUES = true;
USE ROLE ACCOUNTADMIN;
CREATE OR REPLACE STORAGE INTEGRATION <name_of_storage_integration> -- Add a name!
    TYPE = EXTERNAL_STAGE
    STORAGE_PROVIDER = 'AZURE'
    ENABLED = TRUE
    AZURE_TENANT_ID = '' -- Add the Tenant ID!
    STORAGE_ALLOWED_LOCATIONS = (
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_connections/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_extensions_analytics_connections/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_preview_image/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_revisions/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_usage/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_search_usage/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_views/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_metrics/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_scan_result/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/scan_results_datasources/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/scan_results_datasources_tables/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/scan_results_datasources_tables_columns/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_sites_views/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_sites_views_search_usage/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_permissions/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_permissions_groups_capabilities/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_connections/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_revisions/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_datasources_certifications/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_usage/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_search_usage/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_runs/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_permissions/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_permissions_groups_capabilities/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_connections/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_usage/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_search_usage/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/groups/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/groups_users/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/groups_users_favorites/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/extensions_analytics_connections/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/connected_apps/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/custom_views/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/data_alerts/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/data_alerts_details/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/schedules/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/schedules_extracts/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_columns/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_permissions/',
        'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_certifications/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_search_usage/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_databases/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_databases_certifications/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_databases_permissions/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/jobs/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/recommendations/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/settings_mobile_security_settings/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/settings_user_notifications/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/sites_content_recently_viewed/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/sites_embedding_settings/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/subscriptions/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/tasks_extract_refreshes/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/tasks_linked/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/tasks_run_flows/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/virtual_connections/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_virtual_connections_certifications/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/saved_issues/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_connections/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_extensions_analytics_connections/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_preview_image/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_revisions/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_usage/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_search_usage/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_views/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_metrics/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_scan_result/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/scan_results_datasources/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/scan_results_datasources_tables/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/scan_results_datasources_tables_columns/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_sites_views/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_sites_views_search_usage/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_permissions/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_permissions_groups_capabilities/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_connections/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_revisions/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_datasources_certifications/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_usage/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_search_usage/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_runs/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_permissions/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_permissions_groups_capabilities/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_connections/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_usage/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_search_usage/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/groups/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/groups_users/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/groups_users_favorites/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/extensions_analytics_connections/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/connected_apps/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/custom_views/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/data_alerts/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/data_alerts_details/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/schedules/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/schedules_extracts/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_columns/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_permissions/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_certifications/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_search_usage/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_databases/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_databases_certifications/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_databases_permissions/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/jobs/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/recommendations/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/settings_mobile_security_settings/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/settings_user_notifications/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/sites_content_recently_viewed/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/sites_embedding_settings/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/subscriptions/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/tasks_extract_refreshes/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/tasks_linked/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/tasks_run_flows/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/virtual_connections/','azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_virtual_connections_certifications/',
'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/saved_issues/', ); SHOW INTEGRATIONS;DESC STORAGE INTEGRATION <name_of_storage_integration>;SELECTSYSTEM$GET_SNOWFLAKE_PLATFORM_INFO(); -- add network rules for each of the subnets displayed using the Azure CLI -- az storage account network-rule add --account-name <name_of_sa> --resource-group <name_of_rg> --subnet "<name_of_subnet>"
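When SELECT SYSTEM$GET_SNOWFLAKE_PLATFORM_INFO(); returns many subnets, the az CLI step above becomes repetitive. A minimal sketch that prints the corresponding commands; the subnet IDs below are hypothetical placeholders to be replaced with the real values from the query output:

```python
# Print one "az storage account network-rule add" command per Snowflake subnet.
# The subnet IDs below are hypothetical placeholders; copy the real values from
# the SELECT SYSTEM$GET_SNOWFLAKE_PLATFORM_INFO(); output before running them.
ACCOUNT = "<name_of_sa>"
RESOURCE_GROUP = "<name_of_rg>"
SUBNETS = [
    "snowflake-subnet-1",
    "snowflake-subnet-2",
]

def network_rule_command(subnet: str) -> str:
    """Build the az CLI command that allow-lists one subnet on the storage account."""
    return (
        "az storage account network-rule add "
        f"--account-name {ACCOUNT} "
        f"--resource-group {RESOURCE_GROUP} "
        f'--subnet "{subnet}"'
    )

for subnet in SUBNETS:
    print(network_rule_command(subnet))
```

Review the printed commands, then paste them into a shell with the Azure CLI installed and logged in.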
-- then proceed below
GRANT CREATE STAGE ON SCHEMA POWERBI_INGEST TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT CREATE STAGE ON SCHEMA TABLEAU_INGEST TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT USAGE ON INTEGRATION <name_of_storage_integration> TO ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;

USE ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;
GRANT INSERT, SELECT, UPDATE ON TABLE DATALOGZ_WH.BIOPS_GENERAL.ENTITLEMENT TO ROLE WRITE_USER_ROLE;

CREATE FILE FORMAT IF NOT EXISTS json_file_format
  TYPE = 'json'
  STRIP_OUTER_ARRAY = true
  STRIP_NULL_VALUES = true;

USE ROLE DATALOGZ_DIAGNOSTICS_ADMIN_ROLE;

create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_extensions_analytics_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_extensions_analytics_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_preview_image storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_preview_image/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_revisions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_revisions/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_views storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_views/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_metrics storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_metrics/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_workbooks_scan_result storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_workbooks_scan_result/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_scan_results_datasources storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/scan_results_datasources/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_scan_results_datasources_tables storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/scan_results_datasources_tables/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_scan_results_datasources_tables_columns storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/scan_results_datasources_tables_columns/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_sites_views storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_sites_views/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_sites_views_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_sites_views_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_projects_permissions_groups_capabilities storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/projects_permissions_groups_capabilities/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_datasources storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_datasources_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_datasources_revisions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_revisions/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_datasources_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_datasources_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_datasources_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_datasources_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/datasources_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_flows storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_flows_runs storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_runs/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_flows_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_flows_permissions_groups_capabilities storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_permissions_groups_capabilities/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_flows_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_flows_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_flows_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/flows_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_groups storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/groups/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_groups_users storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/groups_users/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_groups_users_favorites storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/groups_users_favorites/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_extensions_analytics_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/extensions_analytics_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_connected_apps storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/connected_apps/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_custom_views storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/custom_views/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_data_alerts storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/data_alerts/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_data_alerts_details storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/data_alerts_details/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_schedules storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/schedules/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_schedules_extracts storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/schedules_extracts/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_tables storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_tables_columns storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_columns/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_tables_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_tables_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_tables_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_tables_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_databases storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_databases/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_databases_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_databases_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_databases_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_databases_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_jobs storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/jobs/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_recommendations storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/recommendations/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_settings_mobile_security_settings storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/settings_mobile_security_settings/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_settings_user_notifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/settings_user_notifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_sites_content_recently_viewed storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/sites_content_recently_viewed/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_sites_embedding_settings storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/sites_embedding_settings/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_subscriptions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/subscriptions/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_tasks_extract_refreshes storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/tasks_extract_refreshes/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_tasks_linked storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/tasks_linked/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_tasks_run_flows storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/tasks_run_flows/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_virtual_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/virtual_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_metadata_virtual_connections_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/metadata_virtual_connections_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.powerbi_ingest.powerbi_saved_issues storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/powerbi/admin/saved_issues/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_extensions_analytics_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_extensions_analytics_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_preview_image storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_preview_image/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_revisions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_revisions/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_views storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_views/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_metrics storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_metrics/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_workbooks_scan_result storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_workbooks_scan_result/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_scan_results_datasources storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/scan_results_datasources/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_scan_results_datasources_tables storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/scan_results_datasources_tables/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_scan_results_datasources_tables_columns storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/scan_results_datasources_tables_columns/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_sites_views storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_sites_views/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_sites_views_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_sites_views_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_projects_permissions_groups_capabilities storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/projects_permissions_groups_capabilities/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_datasources storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_datasources_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_datasources_revisions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_revisions/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_datasources_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_datasources_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_datasources_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_datasources_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/datasources_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_flows storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_flows_runs storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_runs/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_flows_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_flows_permissions_groups_capabilities storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_permissions_groups_capabilities/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_flows_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_flows_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_flows_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/flows_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_groups storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/groups/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_groups_users storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/groups_users/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_groups_users_favorites storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/groups_users_favorites/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_extensions_analytics_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/extensions_analytics_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_connected_apps storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/connected_apps/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_custom_views storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/custom_views/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_data_alerts storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/data_alerts/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_data_alerts_details storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/data_alerts_details/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_schedules storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/schedules/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_schedules_extracts storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/schedules_extracts/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_tables storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_tables_columns storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_columns/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_tables_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_tables_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_tables_search_usage storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_tables_search_usage/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_databases storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_databases/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_databases_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_databases_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_databases_permissions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_databases_permissions/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_jobs storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/jobs/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_recommendations storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/recommendations/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_settings_mobile_security_settings storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/settings_mobile_security_settings/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_settings_user_notifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/settings_user_notifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_sites_content_recently_viewed storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/sites_content_recently_viewed/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_sites_embedding_settings storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/sites_embedding_settings/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_subscriptions storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/subscriptions/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_tasks_extract_refreshes storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/tasks_extract_refreshes/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_tasks_linked storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/tasks_linked/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_tasks_run_flows storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/tasks_run_flows/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_virtual_connections storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/virtual_connections/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_metadata_virtual_connections_certifications storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/metadata_virtual_connections_certifications/' file_format = json_file_format;
create stage if not exists datalogz_wh.tableau_ingest.tableau_saved_issues storage_integration = datalogz_wh_saas_prod_int url = 'azure://<blob_storage_name>.blob.core.windows.net/<container_name>/data/tableau/admin/saved_issues/' file_format = json_file_format;
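After running the statements above, you can confirm that the stages were created and that the storage integration can reach your container. This is a minimal sanity check (the `list` output will be empty until ingestion has landed files under the corresponding path):

```sql
-- Confirm all stages exist in the ingest schema
show stages in schema datalogz_wh.tableau_ingest;

-- Spot-check that one stage can list files through the storage integration
list @datalogz_wh.tableau_ingest.tableau_groups;
```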
Storage Account
Datalogz uses an Azure Storage Account to stage external files for ingestion into the target warehouse. This is the default for Snowflake warehouses and optional for Postgres warehouses.
Log in to the Azure Portal and create a new storage account with the hierarchical namespace enabled (ADLS Gen2).
Add the storage account to the appropriate virtual network from the Storage Account > Networking panel.
Create a new container named datalogzbidiagnostics
Set the access level to Private
Ensure the following values have been added to your Azure Key Vault as secrets.
These are required so your VM or environment credentials can authenticate to read from and write to the storage container:
Navigate to Key Vault > Secrets and click Generate/Import
Create a new secret key and value for each of the following:
AZURE-BLOB-CONTAINER-NAME
AZURE-BLOB-STORAGE-ACCOUNT-NAME
AZURE-BLOB-CONNECTION-STRING
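The connection string is a semicolon-delimited list of `key=value` segments (e.g. `DefaultEndpointsProtocol`, `AccountName`, `AccountKey`, `EndpointSuffix`). As a sanity check before storing the secrets, a small sketch like the following can confirm that the account name embedded in `AZURE-BLOB-CONNECTION-STRING` matches the value you plan to store in `AZURE-BLOB-STORAGE-ACCOUNT-NAME` (the sample values below are placeholders, not real credentials):

```python
def parse_connection_string(conn_str):
    """Split an Azure Storage connection string into a dict of its key=value segments."""
    parts = {}
    for segment in conn_str.strip().split(";"):
        if not segment:
            continue
        # partition on the first '=' only, since AccountKey values may contain '='
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts


if __name__ == "__main__":
    # Placeholder values -- substitute the real secrets from your Storage Account.
    connection_string = (
        "DefaultEndpointsProtocol=https;"
        "AccountName=mystorageaccount;"
        "AccountKey=abc123==;"
        "EndpointSuffix=core.windows.net"
    )
    storage_account_name = "mystorageaccount"

    parts = parse_connection_string(connection_string)
    assert parts["AccountName"] == storage_account_name, (
        "AZURE-BLOB-STORAGE-ACCOUNT-NAME does not match the connection string"
    )
```

This catches the common mistake of pasting a connection string from one storage account while naming another in the adjacent secret.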
The values for these are available in your Storage Account:
The value for AZURE-BLOB-CONTAINER-NAME should be set to datalogzbidiagnostics.
The value for AZURE-BLOB-STORAGE-ACCOUNT-NAME should be set to the name of your storage account.
The value for AZURE-BLOB-CONNECTION-STRING can be found in: