To set the destination credentials, navigate to the Settings tab in the Bold Data Hub. Bold Data Hub supports seven destinations:
- PostgreSQL
- Apache Doris
- SQL Server
- MySQL
- Google BigQuery
- Snowflake
- Oracle
We can configure multiple data store destinations with the same server type and load data into them. This is common in scenarios where we might have multiple databases of the same type (for example, multiple MySQL or PostgreSQL databases) for different environments like development, testing, staging, or production, or for different segments of business operations.
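As a sketch of this multi-environment pattern, the snippet below keeps one set of PostgreSQL-style connection settings per environment and selects the right one at load time. All names and values here are illustrative examples, not Bold Data Hub configuration keys:

```python
# Illustrative only: one destination entry per environment, all of the
# same server type (PostgreSQL).  Hosts and database names are made up.
DESTINATIONS = {
    "development": {"host": "dev-db.internal",   "port": 5432, "database": "sales_dev"},
    "staging":     {"host": "stage-db.internal", "port": 5432, "database": "sales_stage"},
    "production":  {"host": "prod-db.internal",  "port": 5432, "database": "sales"},
}

def destination_for(environment: str) -> dict:
    """Return the connection settings for the requested environment."""
    try:
        return DESTINATIONS[environment]
    except KeyError:
        raise ValueError(f"No destination configured for {environment!r}")

print(destination_for("staging")["database"])  # sales_stage
```

Keeping the settings keyed by environment makes it easy to point the same pipeline at development, staging, or production without editing the load logic itself.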
Step 1: Click Settings.
Step 2: Choose the Connection Type.
New: Choose this option if you are creating a new connection to a destination for which you have not previously created credentials.
Existing: Select this option if you are updating or modifying the credentials or settings of a connection that you have already set up.
Step 3: Choose the destination where you want to move the data.
Step 4: Enter the credentials for the respective destination.
Step 5: Click Save to save the credentials for the Bold Reports data store. If all the given credentials are valid, the message “Datastore settings saved successfully” will appear near the Save button.
Enter the following credentials for PostgreSQL:
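Assuming the usual PostgreSQL fields (server, port, database, username, password), the credentials map directly onto a standard libpq keyword/value connection string. The helper below is a hypothetical sketch for checking how the fields fit together; the host, database, and password values are placeholders:

```python
def postgres_dsn(host: str, port: int, database: str,
                 username: str, password: str) -> str:
    # Assemble a libpq keyword/value connection string; the keys
    # (host, port, dbname, user, password) are standard libpq names.
    return (f"host={host} port={port} dbname={database} "
            f"user={username} password={password}")

# Placeholder values for illustration only.
dsn = postgres_dsn("db.example.com", 5432, "sales", "postgres", "secret")
print(dsn)  # host=db.example.com port=5432 dbname=sales user=postgres password=secret
```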
Enter the following credentials for Apache Doris:
Enter the following credentials for SQL Server:
Enter the following credentials for MySQL:
Enter the following credentials for Snowflake:
- Server: Enter the server name (e.g., account.snowflakecomputing.com).
- Username: Enter your Snowflake username.
- Password: Enter your Snowflake password.
- Warehouse: Enter your Snowflake warehouse name.
- Database: Enter your Snowflake database name.
Click Save to save the credentials. If all the provided credentials are valid, the message “Datastore settings saved successfully” will appear near the save button.
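For reference, the same five Snowflake fields correspond to the keyword arguments the Snowflake Python connector's `connect()` call accepts (`account`, `user`, `password`, `warehouse`, `database`), where the account identifier is the server name without the `.snowflakecomputing.com` suffix. The sketch below only assembles those arguments; the server, user, and other values are placeholders:

```python
def snowflake_connect_args(server: str, username: str, password: str,
                           warehouse: str, database: str) -> dict:
    # The account identifier is the server name with the
    # ".snowflakecomputing.com" suffix removed.
    account = server.removesuffix(".snowflakecomputing.com")
    # These keys match snowflake.connector.connect() keyword arguments.
    return {
        "account": account,
        "user": username,
        "password": password,
        "warehouse": warehouse,
        "database": database,
    }

# Placeholder values for illustration only.
args = snowflake_connect_args("myorg-myacct.snowflakecomputing.com",
                              "loader", "secret", "LOAD_WH", "ANALYTICS")
print(args["account"])  # myorg-myacct
```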
Enter the following credentials for Oracle:
Click Save to save the credentials. If all the provided credentials are valid, the message “Datastore settings saved successfully” will appear near the save button.
Enter the following credentials for Google BigQuery:
The steps below explain how to get the Google BigQuery service account JSON file.
Click Save to save the credentials. If all the provided credentials are valid, the message “Datastore settings saved successfully” will appear near the save button.
Log in to or create a Google Cloud account. Sign in to the Google Cloud Platform in your web browser, creating an account first if you do not have one.
Create a new Google Cloud project.
After arriving at the Google Cloud Console welcome page, click the project selector in the top-left, then click the New Project button. Enter your desired project name and click the Create button.
Create a service account and grant BigQuery permissions. To create a service account, open the IAM & Admin > Service Accounts page in the Google Cloud Console and click Create Service Account. Select the project you created and name the service account as desired.
Click the Continue button and grant the roles needed to enable schema creation and data loading:
You don’t need to grant users access to this service account at this stage, so click the Done button.
Download the service account JSON file. After clicking Done, you will be redirected to the service accounts table page. Locate the service account you created, click the three dots under the Actions column, and select Manage keys. On the Manage keys page, click Add key, then select Create new key, and finally click Create, ensuring the preselected JSON option is retained.
A JSON file containing your service account’s private key will be downloaded.
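Before pasting the downloaded file into the destination settings, a quick sanity check can confirm it really is a service-account key. The sketch below is illustrative and not part of Bold Data Hub; the field names checked (`type`, `project_id`, `private_key`, `client_email`) are the standard ones found in Google service-account key files, which would normally be loaded with `json.load`:

```python
# Standard top-level fields present in a Google service-account JSON key.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_key(key: dict) -> str:
    """Sanity-check a parsed service-account key and return its project_id."""
    missing = sorted(REQUIRED_FIELDS - key.keys())
    if missing:
        raise ValueError(f"missing service-account fields: {missing}")
    if key["type"] != "service_account":
        raise ValueError(f"unexpected key type: {key['type']!r}")
    return key["project_id"]
```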
Note: Bold Data Hub currently does not support creating a data source in Bold Reports when using Google BigQuery as the data store.