Databricks
Databricks OpenAI LLM Model - Part 1
Create a table
CREATE TABLE asset_digital_ai (
  query_key STRING NOT NULL,
  query STRING NOT NULL,
  answer_key STRING,
  answer STRING,
  query_datetime TIMESTAMP NOT NULL,
  answer_datetime...
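Before inserting into this table, a notebook typically assembles the row in Python first. The sketch below is a minimal illustration of one way to do that; the helper name, the UUID-based surrogate keys, and the timestamp handling are assumptions for illustration, not part of the original design.

```python
import uuid
from datetime import datetime, timezone
from typing import Optional

def build_query_row(query: str, answer: Optional[str] = None) -> dict:
    """Assemble one row matching the asset_digital_ai schema (hypothetical helper)."""
    now = datetime.now(timezone.utc)
    return {
        "query_key": str(uuid.uuid4()),  # assumed: UUID surrogate key
        "query": query,
        "answer_key": str(uuid.uuid4()) if answer is not None else None,
        "answer": answer,
        "query_datetime": now,           # NOT NULL in the schema above
        "answer_datetime": now if answer is not None else None,
    }

row = build_query_row("What is our asset count?")
```

In a notebook, a dict like this could be turned into a one-row DataFrame and appended to the table.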
Databricks OpenAI LLM Model - Stored Procedure - Part 2
One‑time setup: HTTP connection to your APIM/Azure OpenAI endpoint
%sql
CREATE OR REPLACE CONNECTION aoai_apim
TYPE HTTP
OPTIONS (
  host 'https://xxxxxxxxxxxxxxxxxx.a03.azurefd.net',
  port ...
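Once the connection exists, calls against it hit an Azure OpenAI chat-completions route. As a rough sketch of what such a request looks like, the snippet below only builds the URL and JSON body; the deployment name and api-version are assumptions (they are not defined by the connection above), and no network call is made.

```python
import json

HOST = "https://xxxxxxxxxxxxxxxxxx.a03.azurefd.net"
DEPLOYMENT = "gpt-4o"        # assumed deployment name, for illustration only
API_VERSION = "2024-02-01"   # assumed api-version

def chat_request(prompt: str) -> tuple:
    """Build the URL and JSON body for an Azure OpenAI chat-completions call."""
    url = (f"{HOST}/openai/deployments/{DEPLOYMENT}"
           f"/chat/completions?api-version={API_VERSION}")
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]}).encode()
    return url, body

url, body = chat_request("Hello")
```

From SQL, the same shape of request would be sent through the `aoai_apim` connection rather than built by hand.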
Install Databricks CLI
Open your Command Prompt or PowerShell window.
Run the following commands to search for and install the CLI:
winget search databricks
winget install Databricks.DatabricksCLI
Restart your command prompt or PowerShell session to ensure the...
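After restarting the shell, one quick way to confirm the CLI landed on PATH is a small Python check; the helper name here is illustrative, and running `databricks --version` in the shell works just as well.

```python
import shutil

def databricks_cli_available() -> bool:
    """Return True if the `databricks` executable is found on PATH."""
    return shutil.which("databricks") is not None
```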
Secret Scope
Validate
databricks auth login --profile rtio
Create
databricks secrets create-scope <scope-name> --initial-manage-principal users
Add Key
databricks secrets put-secret <scope-name> <key-name>
Example:
databricks auth login --profile rtio
databricks secrets create-scope...
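Scope creation fails if the name breaks the documented naming rules. The validator below is a rough sketch based on the commonly documented constraints (alphanumerics, dashes, underscores, and periods, at most 128 characters); verify against the current Databricks docs before relying on it.

```python
import re

# Assumed rule set: alphanumerics, dash, underscore, period; 1-128 chars.
_SCOPE_RE = re.compile(r"^[A-Za-z0-9._-]{1,128}$")

def is_valid_scope_name(name: str) -> bool:
    """Check a candidate secret-scope name against the assumed naming rules."""
    return bool(_SCOPE_RE.match(name))
```

Checking names up front saves a round trip to `databricks secrets create-scope` only to see it rejected.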
Azure Databricks - Alphabetical list of built-in functions
This article provides an alphabetically ordered list of built-in functions and operators in Azure...
Load a CSV file via Scala to Databricks
The following example uses parquet for cloudFiles.format; since this section loads a CSV file, set it to csv (avro and json are also supported). All other read and write settings keep the default behaviors for each format.
spark.readStream.format("cloudFiles")...
Load a CSV file via Python to Databricks
The following example uses parquet for cloudFiles.format; since this section loads a CSV file, set it to csv (avro and json are also supported). All other read and write settings keep the default behaviors for each format.
(spark.readStream.format("cloudFiles")...
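To make the "only the format option changes" point concrete, here is a small pure-Python sketch that assembles the Auto Loader reader options for a given source format. The helper name and the schema-location path are illustrative; `cloudFiles.schemaLocation` is shown because Auto Loader requires it when inferring the schema.

```python
def autoloader_options(fmt: str, schema_location: str) -> dict:
    """Reader options for Auto Loader; only cloudFiles.format varies per source format."""
    if fmt not in {"csv", "json", "avro", "parquet"}:
        raise ValueError(f"unsupported cloudFiles.format: {fmt}")
    return {
        "cloudFiles.format": fmt,
        # Required when Auto Loader infers/evolves the schema.
        "cloudFiles.schemaLocation": schema_location,
    }

# The same dict would feed spark.readStream.format("cloudFiles").options(**opts)
opts = autoloader_options("csv", "/tmp/schemas/asset_digital_ai")
```

Swapping csv for json or avro changes only the `cloudFiles.format` entry, which is exactly the claim made in the text above.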