Power BI Writeback to Databricks

20-06-2023
Introduction
The Writeback Grid works best when combined with the "DirectQuery" connection mode. This allows you to write back data in real-time and see changes instantly reflected in all the other visuals across a report. For more information on using the Power BI Databricks connector, follow this link. If you prefer to use the "Import" mode, the Writeback Grid still allows users to manually refresh the Power BI dataset after committing changes, without any limitations or restrictions. Additionally, Power BI Premium offers the extra capability of refreshing single tables and partitions. Read more here.

Prerequisites to use the Writeback Grid with Databricks:

  • Power BI access
  • Databricks workspace/endpoint
  • Databricks personal access token (PAT)
  • Writeback Grid custom visual


Preparing Writeback Grid Custom Visual
The first step is to connect Power BI to Databricks. This connection is independent of the custom visual connection. Next, download the custom visual from AppSource. You can find more information on downloading and using custom visuals here. Once the visual is in your report, you can start dragging fields into it and enable the "editing" feature for each column individually, in addition to setting various other formatting options.



Configuring Writeback Grid Custom Visual
To configure the Writeback Grid for Databricks, navigate to the Format pane within your Power BI report. Populate the following fields under the "Config" section:

  • Backend type: Choose "Databricks"
  • Server hostname: Find your Server Hostname in your Databricks console by navigating to your specific cluster, going to the "Configuration" tab, and expanding the "Advanced options."
  • HTTP path: Found in the same place as the Server Hostname, under "Advanced options" in the cluster's "Configuration" tab.
  • Catalog name: Enter the catalog name, which is "hive_metastore" if you are not using Unity Catalog.
  • Database name: Enter the name of the database.
  • Database table: Enter the name of the table, ensuring it aligns with the fields used in the visual.
  • Database table UID: Enter the column names that make your rows unique (PK), required to update or delete existing rows, e.g., "column1, column2."
  • Personal access token: Provide a generated API key from the "User Settings" in your Databricks console.
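To see how these fields fit together, here is a minimal sketch of the kind of UPDATE statement the grid must issue against the configured table when a row is edited. The config keys mirror the Format-pane fields above; the catalog follows the example in the list, while "demo_db", "sales", and the column names are made-up placeholders, and real writeback is handled by the visual itself through the Databricks SQL endpoint.

```python
# Sketch only: illustrates how catalog, database, table, and UID columns
# combine into an UPDATE keyed on the primary-key (UID) columns.
config = {
    "catalog": "hive_metastore",    # Catalog name (default without Unity Catalog)
    "database": "demo_db",          # Database name (placeholder)
    "table": "sales",               # Database table (placeholder)
    "uid": ["order_id", "line_no"], # Database table UID, e.g. "column1, column2"
}

def build_update(config, changes, row):
    """Build an UPDATE statement keyed on the UID columns.

    `changes` maps an edited column to its new value; `row` holds the
    current UID values identifying the edited row.
    """
    target = f'{config["catalog"]}.{config["database"]}.{config["table"]}'
    set_clause = ", ".join(f"{c} = '{v}'" for c, v in changes.items())
    where = " AND ".join(f"{c} = '{row[c]}'" for c in config["uid"])
    return f"UPDATE {target} SET {set_clause} WHERE {where}"

print(build_update(config, {"region": "EMEA"}, {"order_id": 42, "line_no": 1}))
# → UPDATE hive_metastore.demo_db.sales SET region = 'EMEA'
#   WHERE order_id = '42' AND line_no = '1'
```

This is why the UID columns are required for updates and deletes: without them, the WHERE clause cannot pin down a single row in the target table.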




FAQ

Why do I get conversion problems when submitting changes using date or timestamp fields?

The built-in formats of Power BI and Databricks slightly differ, which can lead to incorrectly displayed dates or timestamps. In such cases, you can provide a custom format for each column in the format pane, e.g., Databricks timestamps need a custom format of "yyyy-MM-dd HH:mm:ss.fff."
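As a rough illustration of the mismatch, the "fff" part of that custom format means milliseconds (three digits), whereas many clients carry microsecond precision. A minimal Python sketch, assuming you are preparing timestamp values outside of Power BI:

```python
from datetime import datetime

def to_databricks_timestamp(dt):
    """Format a datetime as 'yyyy-MM-dd HH:mm:ss.fff' (milliseconds),
    matching the custom format the visual needs for Databricks timestamps."""
    # %f yields microseconds (6 digits); trim the last 3 to get milliseconds.
    return dt.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]

print(to_databricks_timestamp(datetime(2023, 6, 20, 14, 30, 5, 123456)))
# → 2023-06-20 14:30:05.123
```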

If you face any other issues, please let us know at info@creativedata.io.