Handle changes from Databricks Python SDK 0.37.0 #320

Merged
merged 6 commits on Nov 15, 2024

Conversation

JCZuurmond
Contributor

@JCZuurmond commented Nov 15, 2024

Handle changes from Databricks Python SDK 0.37.0: LakeviewAPI now works with a Dashboard object

  • Requires a change in the Python SDK --> implemented a workaround for now; see this issue about removing the workaround once the SDK resolves the LakeviewAPI deployment issues (a sketch of the new call shape is below).
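
For reference, a minimal sketch of the call-site change this PR adapts to, assuming the SDK 0.37.0 shape in which the Lakeview API accepts a `Dashboard` object rather than individual fields (field values below are illustrative, not taken from this repository):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.dashboards import Dashboard

w = WorkspaceClient()

# Before SDK 0.37.0 (illustrative): fields were passed directly to the API.
# w.lakeview.create(
#     display_name="My dashboard",
#     serialized_dashboard='{"pages": []}',
#     parent_path="/Workspace/Users/me",
# )

# With SDK 0.37.0 (illustrative): the Lakeview API works with a Dashboard object.
w.lakeview.create(
    dashboard=Dashboard(
        display_name="My dashboard",
        serialized_dashboard='{"pages": []}',
        parent_path="/Workspace/Users/me",
    )
)
```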

@JCZuurmond marked this pull request as ready for review November 15, 2024 10:22

github-actions bot commented Nov 15, 2024

❌ 33/36 passed, 3 failed, 4 skipped, 4m41s total

❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("SELECT * FROM TEST_SCHEMA.__RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: assert '{"ts": "2024...]}}\n"PASSED"' == 'PASSED' (20.956s)
... (skipped 14294 bytes)
Using Databricks Metadata Service authentication
10:31 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "entitlements": [
<     {
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
10:31 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmpgcxhfdi7/working-copy in /tmp/tmpgcxhfdi7
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels/databricks_labs_lsql-0.13.1+720241115103101-py3-none-any.whl
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels) does not exist."
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels"
> }
< 200 OK
< {}
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684589
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/version.json
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684595
< }
10:31 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "[email protected]",
<     "DatabricksInstanceGroupId": "-8854613105865987054",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.17",
<     "instance_id": "925dca160e3b4a04b67c09f0c33a089e",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "880251be7d884b56b843a65cec9b0b08",
<     "private_ip": "10.179.10.17",
<     "public_dns": "",
<     "start_timestamp": 1731666196640
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8as_v4",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1731666293600,
<   "last_restarted_time": 1731666236329,
<   "last_state_loss_time": 1731666236300,
<   "node_type_id": "Standard_D8as_v4",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 5394234943045964788,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731598210709,
<   "state": "RUNNING",
<   "state_message": ""
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "8982472358871536790"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8982472358871536790
< 200 OK
< {
<   "id": "8982472358871536790",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=8982472358871536790: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8982472358871536790
< 200 OK
< {
<   "id": "8982472358871536790",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=8982472358871536790: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=8982472358871536790
< 200 OK
< {
<   "id": "8982472358871536790",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "8982472358871536790",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=65fb120b399340a98b3eea4b2147da45, context_id=8982472358871536790: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=65fb120b399340a98b3eea4b2147da45, context_id=8982472358871536790: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=65fb120b399340a98b3eea4b2147da45, context_id=8982472358871536790: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=65fb120b399340a98b3eea4b2147da45&contextId=8982472358871536790
< 200 OK
< {
<   "id": "65fb120b399340a98b3eea4b2147da45",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.Q6hW/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (204 more bytes)",
>   "contextId": "8982472358871536790",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "f6be8b10548a4b6e8594f1ec92adf849"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=f6be8b10548a4b6e8594f1ec92adf849&contextId=8982472358871536790
< 200 OK
< {
<   "id": "f6be8b10548a4b6e8594f1ec92adf849",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:20,603\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13306 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13394)
[gw0] linux -- Python 3.10.15 /home/runner/work/lsql/lsql/.venv/bin/python
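
The failure mode in the captured output above: the converted return statement comes back as an ERROR log record followed by the JSON-encoded return value, so `json.loads` over the whole string raises `Extra data`, and the raw text (log line plus "PASSED") ends up compared against `'PASSED'`. A minimal sketch of a more tolerant parse, purely illustrative and not the blueprint implementation:

```python
import json

# Shape of the captured command output (illustrative values): a JSON log record,
# then the JSON-encoded return value on its own line.
raw = '{"ts": "2024-11-15 10:31:20,603", "level": "ERROR", "msg": "..."}\n"PASSED"'

def last_json_value(text: str):
    # Decode only the final non-empty line, which carries the converted return statement.
    last_line = text.strip().splitlines()[-1]
    return json.loads(last_line)

assert last_json_value(raw) == "PASSED"
```
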
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n backend.execute("SELECT * FROM TEST_SCHEMA.__RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: assert '{"ts": "2024...]}}\n"PASSED"' == 'PASSED' (21.482s)
... (skipped 14294 bytes)
Using Databricks Metadata Service authentication
10:31 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "entitlements": [
<     {
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
10:31 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmp0ue9srb7/working-copy in /tmp/tmp0ue9srb7
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels/databricks_labs_lsql-0.13.1+720241115103101-py3-none-any.whl
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels) does not exist."
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels"
> }
< 200 OK
< {}
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684590
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/version.json
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684594
< }
10:31 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "[email protected]",
<     "DatabricksInstanceGroupId": "-8854613105865987054",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.17",
<     "instance_id": "925dca160e3b4a04b67c09f0c33a089e",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "880251be7d884b56b843a65cec9b0b08",
<     "private_ip": "10.179.10.17",
<     "public_dns": "",
<     "start_timestamp": 1731666196640
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8as_v4",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1731666293600,
<   "last_restarted_time": 1731666236329,
<   "last_state_loss_time": 1731666236300,
<   "node_type_id": "Standard_D8as_v4",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 5394234943045964788,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731598210709,
<   "state": "RUNNING",
<   "state_message": ""
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "7323052038081084640"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7323052038081084640
< 200 OK
< {
<   "id": "7323052038081084640",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7323052038081084640: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7323052038081084640
< 200 OK
< {
<   "id": "7323052038081084640",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=7323052038081084640: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=7323052038081084640
< 200 OK
< {
<   "id": "7323052038081084640",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "7323052038081084640",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=9ed8ee611424496ebc4535affb6a5db4, context_id=7323052038081084640: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=9ed8ee611424496ebc4535affb6a5db4, context_id=7323052038081084640: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=9ed8ee611424496ebc4535affb6a5db4, context_id=7323052038081084640: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=9ed8ee611424496ebc4535affb6a5db4&contextId=7323052038081084640
< 200 OK
< {
<   "id": "9ed8ee611424496ebc4535affb6a5db4",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.NAAt/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (189 more bytes)",
>   "contextId": "7323052038081084640",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "741685f193fd40aba06f633767eb504c"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=741685f193fd40aba06f633767eb504c&contextId=7323052038081084640
< 200 OK
< {
<   "id": "741685f193fd40aba06f633767eb504c",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:21,146\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13306 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13394)
[gw1] linux -- Python 3.10.15 /home/runner/work/lsql/lsql/.venv/bin/python
❌ test_runtime_backend_errors_handled[\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors import NotFound\nbackend = RuntimeBackend()\ntry:\n query_response = backend.fetch("DESCRIBE __RANDOM__")\n return "FAILED"\nexcept NotFound as e:\n return "PASSED"\n]: assert '{"ts": "2024...]}}\n"PASSED"' == 'PASSED' (19.868s)
... (skipped 14138 bytes)
Using Databricks Metadata Service authentication
10:31 DEBUG [databricks.sdk] GET /api/2.0/preview/scim/v2/Me
< 200 OK
< {
<   "active": true,
<   "displayName": "labs-runtime-identity",
<   "emails": [
<     {
<       "primary": true,
<       "type": "work",
<       "value": "**REDACTED**"
<     }
<   ],
<   "entitlements": [
<     {
<       "value": "**REDACTED**"
<     },
<     "... (1 additional elements)"
<   ],
<   "externalId": "d0f9bd2c-5651-45fd-b648-12a3fc6375c4",
<   "groups": [
<     {
<       "$ref": "Groups/300667344111082",
<       "display": "labs.scope.runtime",
<       "type": "direct",
<       "value": "**REDACTED**"
<     }
<   ],
<   "id": "4643477475987733",
<   "name": {
<     "givenName": "labs-runtime-identity"
<   },
<   "schemas": [
<     "urn:ietf:params:scim:schemas:core:2.0:User",
<     "... (1 additional elements)"
<   ],
<   "userName": "4106dc97-a963-48f0-a079-a578238959a6"
< }
10:31 DEBUG [databricks.labs.blueprint.wheels] Building wheel for /tmp/tmp6vyfvfx8/working-copy in /tmp/tmp6vyfvfx8
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels/databricks_labs_lsql-0.13.1+720241115103121-py3-none-any.whl
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 404 Not Found
< {
<   "error_code": "RESOURCE_DOES_NOT_EXIST",
<   "message": "The parent folder (/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels) does not exist."
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Creating missing folders: /Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/mkdirs
> {
>   "path": "/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels"
> }
< 200 OK
< {}
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684624
< }
10:31 DEBUG [databricks.labs.blueprint.installation] Converting Version into JSON format
10:31 DEBUG [databricks.labs.blueprint.installation] Uploading: /Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/version.json
10:31 DEBUG [databricks.sdk] POST /api/2.0/workspace/import
> [raw stream]
< 200 OK
< {
<   "object_id": 4237054998684626
< }
10:31 DEBUG [databricks.sdk] GET /api/2.1/clusters/get?cluster_id=DATABRICKS_CLUSTER_ID
< 200 OK
< {
<   "autotermination_minutes": 60,
<   "CLOUD_ENV_attributes": {
<     "availability": "SPOT_WITH_FALLBACK_AZURE",
<     "first_on_demand": 2147483647,
<     "spot_bid_max_price": -1.0
<   },
<   "cluster_cores": 8.0,
<   "cluster_id": "DATABRICKS_CLUSTER_ID",
<   "cluster_memory_mb": 32768,
<   "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<   "cluster_source": "UI",
<   "creator_user_name": "[email protected]",
<   "custom_tags": {
<     "ResourceClass": "SingleNode"
<   },
<   "data_security_mode": "SINGLE_USER",
<   "TEST_SCHEMA_tags": {
<     "Budget": "opex.sales.labs",
<     "ClusterId": "DATABRICKS_CLUSTER_ID",
<     "ClusterName": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "Creator": "[email protected]",
<     "DatabricksInstanceGroupId": "-8854613105865987054",
<     "DatabricksInstancePoolCreatorId": "4183391249163402",
<     "DatabricksInstancePoolId": "TEST_INSTANCE_POOL_ID",
<     "Owner": "[email protected]",
<     "Vendor": "Databricks"
<   },
<   "disk_spec": {},
<   "driver": {
<     "host_private_ip": "10.179.8.17",
<     "instance_id": "925dca160e3b4a04b67c09f0c33a089e",
<     "node_attributes": {
<       "is_spot": false
<     },
<     "node_id": "880251be7d884b56b843a65cec9b0b08",
<     "private_ip": "10.179.10.17",
<     "public_dns": "",
<     "start_timestamp": 1731666196640
<   },
<   "driver_healthy": true,
<   "driver_instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "driver_instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "driver_node_type_id": "Standard_D8as_v4",
<   "effective_spark_version": "16.0.x-scala2.12",
<   "enable_elastic_disk": true,
<   "enable_local_disk_encryption": false,
<   "init_scripts_safe_mode": false,
<   "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<   "instance_source": {
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID"
<   },
<   "jdbc_port": 10000,
<   "last_activity_time": 1731666293600,
<   "last_restarted_time": 1731666236329,
<   "last_state_loss_time": 1731666236300,
<   "node_type_id": "Standard_D8as_v4",
<   "num_workers": 0,
<   "pinned_by_user_name": "4183391249163402",
<   "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<   "spark_conf": {
<     "spark.databricks.cluster.profile": "singleNode",
<     "spark.master": "local[*]"
<   },
<   "spark_context_id": 5394234943045964788,
<   "spark_version": "16.0.x-scala2.12",
<   "spec": {
<     "autotermination_minutes": 60,
<     "cluster_name": "Scoped MSI Cluster: runtime (Single Node, Single User)",
<     "custom_tags": {
<       "ResourceClass": "SingleNode"
<     },
<     "data_security_mode": "SINGLE_USER",
<     "instance_pool_id": "TEST_INSTANCE_POOL_ID",
<     "num_workers": 0,
<     "single_user_name": "4106dc97-a963-48f0-a079-a578238959a6",
<     "spark_conf": {
<       "spark.databricks.cluster.profile": "singleNode",
<       "spark.master": "local[*]"
<     },
<     "spark_version": "16.0.x-scala2.12"
<   },
<   "start_time": 1731598210709,
<   "state": "RUNNING",
<   "state_message": ""
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/contexts/create
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "4908315292167573232"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=4908315292167573232: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=4908315292167573232: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (191 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=b9f2d30da7e34f8280e2603fc8bcc90a&contextId=4908315292167573232
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:41,015\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13170 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13258)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Pending"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, context_id=4908315292167573232: (ContextStatus.PENDING) current status: ContextStatus.PENDING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/contexts/status?clusterId=DATABRICKS_CLUSTER_ID&contextId=4908315292167573232
< 200 OK
< {
<   "id": "4908315292167573232",
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "get_ipython().run_line_magic('pip', 'install /Workspace/Users/4106dc97-a963-48f0-a079-a578238959... (110 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~1s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~2s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": null,
<   "status": "Running"
< }
10:31 DEBUG [databricks.sdk] cluster_id=DATABRICKS_CLUSTER_ID, command_id=657f3f2567d044deb834f4cdde1dcfc4, context_id=4908315292167573232: (CommandStatus.RUNNING) current status: CommandStatus.RUNNING (sleeping ~3s)
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=657f3f2567d044deb834f4cdde1dcfc4&contextId=4908315292167573232
< 200 OK
< {
<   "id": "657f3f2567d044deb834f4cdde1dcfc4",
<   "results": {
<     "data": "Processing /Workspace/Users/4106dc97-a963-48f0-a079-a578238959a6/.nOgJ/wheels/databricks_labs_ls... (3270 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 DEBUG [databricks.sdk] POST /api/1.2/commands/execute
> {
>   "clusterId": "DATABRICKS_CLUSTER_ID",
>   "command": "import json\nfrom databricks.labs.lsql.backends import RuntimeBackend\nfrom databricks.sdk.errors ... (191 more bytes)",
>   "contextId": "4908315292167573232",
>   "language": "python"
> }
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a"
< }
10:31 DEBUG [databricks.sdk] GET /api/1.2/commands/status?clusterId=DATABRICKS_CLUSTER_ID&commandId=b9f2d30da7e34f8280e2603fc8bcc90a&contextId=4908315292167573232
< 200 OK
< {
<   "id": "b9f2d30da7e34f8280e2603fc8bcc90a",
<   "results": {
<     "data": "{\"ts\": \"2024-11-15 10:31:41,015\", \"level\": \"ERROR\", \"logger\": \"SQLQueryContextLogger\", \"msg\": \"[... (13170 more bytes)",
<     "resultType": "text"
<   },
<   "status": "Finished"
< }
10:31 WARNING [databricks.sdk] cannot parse converted return statement. Just returning text
Traceback (most recent call last):
  File "/home/runner/work/lsql/lsql/.venv/lib/python3.10/site-packages/databricks/labs/blueprint/commands.py", line 123, in run
    return json.loads(results.data)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/__init__.py", line 346, in loads
    return _TEST_SCHEMA_decoder.decode(s)
  File "/opt/hostedtoolcache/Python/3.10.15/x64/lib/python3.10/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 13258)
[gw1] linux -- Python 3.10.15 /home/runner/work/lsql/lsql/.venv/bin/python

Running from acceptance #450
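
The traceback above shows why the runtime command result cannot be parsed: the captured command output begins with a JSON-formatted log record and the command's return value follows on the next line, so the combined text is not a single JSON document and json.loads stops after the first value. Below is a minimal, illustrative sketch of that failure mode (the payload is hypothetical, not the actual command output):

    import json

    # Hypothetical captured output: a JSON log record, then the command's
    # return value on a second line.
    captured = '{"ts": "2024-11-15 10:31:41,015", "level": "ERROR", "msg": "..."}\n"RESULT"'

    try:
        json.loads(captured)
    except json.JSONDecodeError as err:
        # Parsing stops after the first JSON object and finds more text on
        # line 2, so this prints something like: Extra data: line 2 column 1
        print(err)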

@nfx nfx (Collaborator) left a comment
lgtm

@nfx nfx merged commit 69c6e97 into main Nov 15, 2024
8 of 9 checks passed
@nfx nfx deleted the fix/databricks-sdk-0370-changes branch November 15, 2024 10:56
Labels: bug, internal