Update to 50.23 and implement Brigad OAuth Databricks Driver Changes #40 (Open)
AlexVialaBellander wants to merge 16 commits into relferreira:master from AlexVialaBellander:origin/updated.
Commits (16):
- d6effaf fix: bad aliasing
- e64a0ca fix: bad aliasing (#1) (shrodingers)
- fb53083 Feat/unity databricks (#2) (shrodingers)
- 514d271 Chore/releasing (#3) (shrodingers)
- c5b2560 Fix/names (#4) (shrodingers)
- 7620659 fix: pr comms (#5) (shrodingers)
- 91540de Merge branch 'master' of github.com:Brigad/metabase-sparksql-databric…
- 83a6af0 Fix/readme (#6) (shrodingers)
- a28cad5 Merge branch 'master' of github.com:Brigad/metabase-sparksql-databric…
- 9637359 chore: bump to 49.7
- 138ce9b Handle connection attempts to unsupported databases (metamben)
- 68a40e6 Upgrade to 0.50.23 (AlexVialaBellander)
- 1ecf1e0 Update Dockerfile (AlexVialaBellander)
- 5431b6a Update src/metabase/driver/sparksql_databricks_v2.clj (AlexVialaBellander)
- c8ca534 Update src/metabase/driver/sparksql_databricks_v2.clj (AlexVialaBellander)
- 6c688c0 Update src/metabase/driver/sparksql_databricks_v2.clj (AlexVialaBellander)
.gitignore (diff hunk `@@ -67,4 +67,4 @@ target/checksum.txt`):

```
repo
.cpcache
.lsp
.clj-kondo
```

The hunk touches only the final `.clj-kondo` line, whose content is unchanged, so this is most likely a trailing-newline fix.
README (diff hunk `@@ -1,97 +1,16 @@`):

# Metabase Driver: Spark Databricks

The credits are a bit complicated. This driver was originally developed by Fernando Goncalves and Rajesh Kumar Ravi, but their original repository is no longer around. GitHub user [relferreira](https://github.com/relferreira) kindly maintains an updated version [here](https://github.com/relferreira/metabase-sparksql-databricks-driver/tree/master); however, his solution does not allow for OAuth secrets, which was solved by [shrodingers](https://github.com/shrodingers) at [Brigad](https://github.com/Brigad/metabase-sparksql-databricks-driver).
**Credits**: This repository is only an updated version of the work of Fernando Goncalves and Rajesh Kumar Ravi. This work is therefore a combination of two somewhat actively maintained repositories; all I do is merge the two solutions and update the driver to work with Metabase 0.50.23.
## Installation

To build a dockerized Metabase image that includes the Databricks driver from this repository, simply run:

```
docker build -t metabase:0.50.23-databricks -f Dockerfile .
```

(The diff updates this tag from the previous `metabase:0.46.6.2-db`.)
The Metabase Databricks driver is built and included in the final Metabase Docker image.
### To be fixed for >= v0.46

To run the tests for this driver, run the following:

```
docker build -t metabase/databricks-test --target stg_test .
docker run --rm --name mb-test metabase/databricks-test
```

or, if you have Clojure on your local machine, just:

```
clojure -X:test
```
# Connecting

## Parameters

*(screenshot: databricks connection form)*

- Display Name: an identifying name for your database in Metabase
- Host: your Databricks URL (`adb-XXXXXXXXX.azuredatabricks.net`)
- Port: usually 443
- Database Name: usually `default`
- Username: usually `token`
- Password: a personal access token created in the Databricks dashboard
- Additional JDBC connection string options:
  - SQL Warehouse (Endpoint): you can find it under the `Connection details` tab at `/sql/warehouses/`. It should follow the pattern `;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/endpoints/<SQL WAREHOUSE ID>;UID=token;PWD=<ACCESS TOKEN>`
  - Cluster Endpoint: you can find it on your cluster's details page. It should follow the pattern `;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/<ORG ID>/<CLUSTER ID>;AuthMech=3;UID=token;PWD=<ACCESS TOKEN>`
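The two endpoint patterns above differ only in their `httpPath` and the position of the auth flags. A minimal sketch (the helper names are hypothetical, not part of the driver) of how the "Additional JDBC connection string options" value could be assembled:

```python
def warehouse_jdbc_options(warehouse_id: str, access_token: str) -> str:
    """Build the additional-options string for a SQL warehouse,
    following the pattern documented in this README."""
    return (
        ";transportMode=http;ssl=1;AuthMech=3"
        f";httpPath=/sql/1.0/endpoints/{warehouse_id}"
        f";UID=token;PWD={access_token}"
    )


def cluster_jdbc_options(org_id: str, cluster_id: str, access_token: str) -> str:
    """Same idea for an all-purpose cluster endpoint."""
    return (
        ";transportMode=http;ssl=1"
        f";httpPath=sql/protocolv1/o/{org_id}/{cluster_id}"
        f";AuthMech=3;UID=token;PWD={access_token}"
    )


print(warehouse_jdbc_options("abc123", "dapiTOKEN"))
print(cluster_jdbc_options("111", "222", "dapiTOKEN"))
```

Keeping the leading `;` matters: Metabase appends this string verbatim after the base JDBC URL it builds from Host, Port, and Database Name.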
## Building the driver (the fast way)

Use the `Dockerfile` in this repo:

```bash
docker build -t metabase:metabase-head-databricks-1.3.0 .
```

You can then push the image to a Docker registry of your own and use it! Example of running the image you just built:

```bash
docker run -d -p 3000:3000 --name metabase metabase:metabase-head-databricks-1.3.0
```

Then access `http://localhost:3000`.
## Building the driver (advanced way)

### Prereq: Install Metabase as a local maven dependency, compiled for building drivers

Clone the [Metabase repo](https://github.com/metabase/metabase) first if you haven't already done so.

```bash
cd /path/to/metabase/
./bin/build
```

### Build the Spark Databricks driver

```bash
# (in the sparksql-databricks driver directory)
clojure -X:build :project-dir "\"$(pwd)\""
```

### Copy it to your plugins dir and restart Metabase

```bash
mkdir -p /path/to/metabase/plugins/
cp target/sparksql-databricks.metabase-driver.jar /path/to/metabase/plugins/
java -jar /path/to/metabase/metabase.jar
```

_or:_

```bash
mkdir -p /path/to/metabase/plugins
cp target/sparksql-databricks.metabase-driver.jar /path/to/metabase/plugins/
cd /path/to/metabase_source
lein run
```
Driver manifest (`metabase-plugin.yaml`, diff hunk `@@ -1,41 +1,58 @@`; updated version, reconstructed from the rendered diff):

```yaml
info:
  name: Metabase Databricks Spark SQL Driver (v2)
  version: 1.0.0-SNAPSHOT
  description: Allows Metabase to connect to Databricks Spark SQL databases.
driver:
  - name: hive-like
    lazy-load: true
    abstract: true
    parent: sql-jdbc
  - name: sparksql-databricks-v2
    display-name: Databricks SQL (v2)
    lazy-load: true
    parent: hive-like
    connection-properties:
      - merge:
          - host
          - placeholder: "<account>.cloud.databricks.com"
            helper-text: "The hostname of your Databricks account"
      - name: app-id
        display-name: Databricks client id
        placeholder: "9af18267-60e7-4061-b2d5-e2414af88b0b"
        required: true
        helper-text: "The id of the service principal you generated an Oauth token for (see: https://docs.databricks.com/en/dev-tools/authentication-oauth.html)"
      - name: app-secret
        display-name: Databricks OAuth secret
        placeholder: "doseXXXXXXXXXXXX"
        required: true
        helper-text: "The secret of the service principal you generated an Oauth token for (see: https://docs.databricks.com/en/dev-tools/authentication-oauth.html)"
      - name: http-path
        display-name: HTTP Path
        placeholder: "/sql/1.0/warehouses/<id>"
        helper-text: "The path to the Databricks SQL endpoint (see: https://docs.databricks.com/en/integrations/compute-details.html)"
        required: true
      - name: catalog
        display-name: Catalog
        placeholder: "<catalog-name>"
        required: true
      - merge:
          - dbname
          - required: false
            display-name: Schema / Database (Optional)
      - advanced-options-start
      - merge:
          - additional-options
          - placeholder: ";transportMode=http;ssl=1;"
      - merge:
          - port
          - display-name: HTTP Port
            placeholder: "443"
            default: 443
      - default-advanced-options
    connection-properties-include-tunnel-config: false
init:
  - step: load-namespace
    namespace: metabase.driver.sparksql-databricks-v2
  - step: register-jdbc-driver
    class: metabase.driver.FixedSparkDriver
```

The removed lines drop the old token-based properties (`user` defaulting to `token`, `password` as `<user_token>`, and the long `httpPath=…;AuthMech=3;UID=token;PWD=<token>` JDBC placeholder) in favor of the OAuth `app-id`/`app-secret` pair, a required `http-path`, and a required Unity Catalog `catalog` field.
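The new `app-id`/`app-secret` properties carry service-principal OAuth credentials rather than a personal access token. As a rough sketch of what the driver exchanges them for, the following assembles a client-credentials token request; the `/oidc/v1/token` path and `all-apis` scope follow Databricks' documented OAuth machine-to-machine flow, but treat the exact request shape as an assumption, and the credential values below are just the manifest's placeholders:

```python
def oauth_token_request(host: str, app_id: str, app_secret: str) -> dict:
    """Assemble (without sending) the OAuth client-credentials request used
    to trade a service principal's id/secret for a short-lived access token."""
    return {
        "url": f"https://{host}/oidc/v1/token",
        # service principal credentials go in HTTP basic auth
        "auth": (app_id, app_secret),
        "data": {"grant_type": "client_credentials", "scope": "all-apis"},
    }


req = oauth_token_request(
    "adb-1234567890.azuredatabricks.net",      # host placeholder
    "9af18267-60e7-4061-b2d5-e2414af88b0b",    # app-id placeholder from the manifest
    "doseXXXXXXXXXXXX",                        # app-secret placeholder from the manifest
)
print(req["url"])
```

The returned bearer token would then replace the `PWD=<token>` field that the old manifest baked into the JDBC options placeholder.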
New build script (diff hunk `@@ -0,0 +1,6 @@`):

```bash
SCRIPT_DIR=$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" &>/dev/null && pwd)

docker buildx build --build-arg METABASE_VERSION=v0.49.7 --target stg_export --platform "linux/amd64" -t metabase:databricks-plugin "$SCRIPT_DIR/.."
container_id=$(docker create "metabase:databricks-plugin" /bin/bash)
docker cp "$container_id:/sparksql-databricks-v2.metabase-driver.jar" "$SCRIPT_DIR/../dist/databricks-sql.metabase-driver.jar"
docker rm "$container_id"
```