databricks.getCluster

Databricks v1.65.0 published on Wednesday, Apr 9, 2025 by Pulumi

Note: If you have a fully automated setup with workspaces created by databricks.MwsWorkspaces or azurerm_databricks_workspace, please make sure to add a depends_on attribute to prevent default auth: cannot configure default credentials errors.
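
A minimal TypeScript sketch of one way to express that ordering is to route the lookup through a provider configured from the workspace's outputs; the workspace variable and the cluster name below are hypothetical:

import * as databricks from "@pulumi/databricks";

// Hypothetical: a databricks.MwsWorkspaces resource defined elsewhere in this program.
declare const workspace: databricks.MwsWorkspaces;

// A provider configured from the workspace output; anything that uses it,
// including this lookup, implicitly waits for the workspace to exist.
const workspaceProvider = new databricks.Provider("workspace", {
    host: workspace.workspaceUrl,
});

const shared = databricks.getClusterOutput(
    { clusterName: "shared-autoscaling" },
    { provider: workspaceProvider },
);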

Retrieves information about a databricks.Cluster using its id. The id can be retrieved programmatically using the databricks.getClusters data source.

Example Usage

Retrieve attributes of every cluster in a workspace

import * as pulumi from "@pulumi/pulumi";
import * as databricks from "@pulumi/databricks";

const all = databricks.getClusters({});
const allGetCluster = all.then(all => all.ids.reduce((result, clusterId) => ({ ...result, [clusterId]: databricks.getCluster({
    clusterId: clusterId,
}) }), {}));
import pulumi
import pulumi_databricks as databricks

all = databricks.get_clusters()
all_get_cluster = {cluster_id: databricks.get_cluster(cluster_id=cluster_id) for cluster_id in all.ids}
using System.Collections.Generic;
using System.Linq;
using Pulumi;
using Databricks = Pulumi.Databricks;

return await Deployment.RunAsync(() => 
{
    var all = Databricks.GetClusters.Invoke();

    var allGetCluster = all.Apply(result => result.Ids.ToDictionary(
        clusterId => clusterId,
        clusterId => Databricks.GetCluster.Invoke(new Databricks.GetClusterInvokeArgs
        {
            ClusterId = clusterId,
        })));

});

Using getCluster

Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

function getCluster(args: GetClusterArgs, opts?: InvokeOptions): Promise<GetClusterResult>
function getClusterOutput(args: GetClusterOutputArgs, opts?: InvokeOptions): Output<GetClusterResult>
def get_cluster(cluster_id: Optional[str] = None,
                cluster_info: Optional[GetClusterClusterInfo] = None,
                cluster_name: Optional[str] = None,
                id: Optional[str] = None,
                opts: Optional[InvokeOptions] = None) -> GetClusterResult
def get_cluster_output(cluster_id: Optional[pulumi.Input[str]] = None,
                       cluster_info: Optional[pulumi.Input[GetClusterClusterInfoArgs]] = None,
                       cluster_name: Optional[pulumi.Input[str]] = None,
                       id: Optional[pulumi.Input[str]] = None,
                       opts: Optional[InvokeOptions] = None) -> Output[GetClusterResult]
func LookupCluster(ctx *Context, args *LookupClusterArgs, opts ...InvokeOption) (*LookupClusterResult, error)
func LookupClusterOutput(ctx *Context, args *LookupClusterOutputArgs, opts ...InvokeOption) LookupClusterResultOutput

> Note: This function is named LookupCluster in the Go SDK.

public static class GetCluster 
{
    public static Task<GetClusterResult> InvokeAsync(GetClusterArgs args, InvokeOptions? opts = null)
    public static Output<GetClusterResult> Invoke(GetClusterInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
public static Output<GetClusterResult> getCluster(GetClusterArgs args, InvokeOptions options)
fn::invoke:
  function: databricks:index/getCluster:getCluster
  arguments:
    # arguments dictionary
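
For illustration, a small TypeScript sketch of both forms; the cluster id and cluster name below are placeholders:

import * as databricks from "@pulumi/databricks";

// Direct form: plain arguments in, Promise-wrapped result out.
const byId = databricks.getCluster({ clusterId: "0123-456789-abcdef12" });
export const lookedUpName = byId.then(c => c.clusterName);

// Output form: Input-wrapped arguments in, Output-wrapped result out,
// so it can consume values that are only known during deployment.
const byName = databricks.getClusterOutput({ clusterName: "shared-autoscaling" });
export const lookedUpId = byName.clusterId;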

The following arguments are supported:

ClusterId string
The id of the cluster.
ClusterInfo GetClusterClusterInfo
block, consisting of the following fields:
ClusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
Id string
cluster ID
ClusterId string
The id of the cluster.
ClusterInfo GetClusterClusterInfo
block, consisting of the following fields:
ClusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
Id string
cluster ID
clusterId String
The id of the cluster.
clusterInfo GetClusterClusterInfo
block, consisting of the following fields:
clusterName String
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
id String
cluster ID
clusterId string
The id of the cluster.
clusterInfo GetClusterClusterInfo
block, consisting of the following fields:
clusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
id string
cluster ID
cluster_id str
The id of the cluster.
cluster_info GetClusterClusterInfo
block, consisting of the following fields:
cluster_name str
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
id str
cluster ID
clusterId String
The id of the cluster.
clusterInfo Property Map
block, consisting of the following fields:
clusterName String
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
id String
cluster ID

getCluster Result

The following output properties are available:

ClusterId string
ClusterInfo GetClusterClusterInfo
block, consisting of the following fields:
ClusterName string
Cluster name, which doesn’t have to be unique.
Id string
cluster ID
ClusterId string
ClusterInfo GetClusterClusterInfo
block, consisting of the following fields:
ClusterName string
Cluster name, which doesn’t have to be unique.
Id string
cluster ID
clusterId String
clusterInfo GetClusterClusterInfo
block, consisting of the following fields:
clusterName String
Cluster name, which doesn’t have to be unique.
id String
cluster ID
clusterId string
clusterInfo GetClusterClusterInfo
block, consisting of the following fields:
clusterName string
Cluster name, which doesn’t have to be unique.
id string
cluster ID
cluster_id str
cluster_info GetClusterClusterInfo
block, consisting of the following fields:
cluster_name str
Cluster name, which doesn’t have to be unique.
id str
cluster ID
clusterId String
clusterInfo Property Map
block, consisting of the following fields:
clusterName String
Cluster name, which doesn’t have to be unique.
id String
cluster ID
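
Most of the interesting attributes are nested under clusterInfo. As a short TypeScript sketch, a couple of them can be surfaced as stack outputs; the cluster id is a placeholder:

import * as databricks from "@pulumi/databricks";

const shared = databricks.getClusterOutput({ clusterId: "0123-456789-abcdef12" });

// clusterInfo carries the nested attributes documented under Supporting Types below.
export const sparkVersion = shared.clusterInfo.apply(info => info.sparkVersion);
export const autoterminationMinutes = shared.clusterInfo.apply(info => info.autoterminationMinutes);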

Supporting Types

GetClusterClusterInfo

Autoscale GetClusterClusterInfoAutoscale
AutoterminationMinutes int
Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
AwsAttributes GetClusterClusterInfoAwsAttributes
AzureAttributes GetClusterClusterInfoAzureAttributes
ClusterCores double
ClusterId string
The id of the cluster.
ClusterLogConf GetClusterClusterInfoClusterLogConf
ClusterLogStatus GetClusterClusterInfoClusterLogStatus
ClusterMemoryMb int
ClusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
ClusterSource string
CreatorUserName string
CustomTags Dictionary<string, string>
Additional tags for cluster resources.
DataSecurityMode string
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
DefaultTags Dictionary<string, string>
DockerImage GetClusterClusterInfoDockerImage
Driver GetClusterClusterInfoDriver
DriverInstancePoolId string
Similar to instance_pool_id, but for the driver node.
DriverNodeTypeId string
The node type of the Spark driver.
EnableElasticDisk bool
Use autoscaling local storage.
EnableLocalDiskEncryption bool
Enable local disk encryption.
Executors List<GetClusterClusterInfoExecutor>
GcpAttributes GetClusterClusterInfoGcpAttributes
InitScripts List<GetClusterClusterInfoInitScript>
InstancePoolId string
The pool of idle instances the cluster is attached to.
IsSingleNode bool
JdbcPort int
Kind string
LastRestartedTime int
LastStateLossTime int
NodeTypeId string
Any supported databricks.getNodeType id.
NumWorkers int
PolicyId string
Identifier of Cluster Policy to validate cluster and preset certain defaults.
RuntimeEngine string
The type of runtime of the cluster
SingleUserName string
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
SparkConf Dictionary<string, string>
Map with key-value pairs to fine-tune Spark clusters.
SparkContextId int
SparkEnvVars Dictionary<string, string>
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
SparkVersion string
Runtime version of the cluster.
Spec GetClusterClusterInfoSpec
SshPublicKeys List<string>
SSH public key contents that will be added to each Spark node in this cluster.
StartTime int
State string
StateMessage string
TerminatedTime int
TerminationReason GetClusterClusterInfoTerminationReason
UseMlRuntime bool
WorkloadType GetClusterClusterInfoWorkloadType
Autoscale GetClusterClusterInfoAutoscale
AutoterminationMinutes int
Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
AwsAttributes GetClusterClusterInfoAwsAttributes
AzureAttributes GetClusterClusterInfoAzureAttributes
ClusterCores float64
ClusterId string
The id of the cluster.
ClusterLogConf GetClusterClusterInfoClusterLogConf
ClusterLogStatus GetClusterClusterInfoClusterLogStatus
ClusterMemoryMb int
ClusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
ClusterSource string
CreatorUserName string
CustomTags map[string]string
Additional tags for cluster resources.
DataSecurityMode string
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
DefaultTags map[string]string
DockerImage GetClusterClusterInfoDockerImage
Driver GetClusterClusterInfoDriver
DriverInstancePoolId string
Similar to instance_pool_id, but for the driver node.
DriverNodeTypeId string
The node type of the Spark driver.
EnableElasticDisk bool
Use autoscaling local storage.
EnableLocalDiskEncryption bool
Enable local disk encryption.
Executors []GetClusterClusterInfoExecutor
GcpAttributes GetClusterClusterInfoGcpAttributes
InitScripts []GetClusterClusterInfoInitScript
InstancePoolId string
The pool of idle instances the cluster is attached to.
IsSingleNode bool
JdbcPort int
Kind string
LastRestartedTime int
LastStateLossTime int
NodeTypeId string
Any supported databricks.getNodeType id.
NumWorkers int
PolicyId string
Identifier of Cluster Policy to validate cluster and preset certain defaults.
RuntimeEngine string
The type of runtime of the cluster
SingleUserName string
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
SparkConf map[string]string
Map with key-value pairs to fine-tune Spark clusters.
SparkContextId int
SparkEnvVars map[string]string
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
SparkVersion string
Runtime version of the cluster.
Spec GetClusterClusterInfoSpec
SshPublicKeys []string
SSH public key contents that will be added to each Spark node in this cluster.
StartTime int
State string
StateMessage string
TerminatedTime int
TerminationReason GetClusterClusterInfoTerminationReason
UseMlRuntime bool
WorkloadType GetClusterClusterInfoWorkloadType
autoscale GetClusterClusterInfoAutoscale
autoterminationMinutes Integer
Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
awsAttributes GetClusterClusterInfoAwsAttributes
azureAttributes GetClusterClusterInfoAzureAttributes
clusterCores Double
clusterId String
The id of the cluster.
clusterLogConf GetClusterClusterInfoClusterLogConf
clusterLogStatus GetClusterClusterInfoClusterLogStatus
clusterMemoryMb Integer
clusterName String
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
clusterSource String
creatorUserName String
customTags Map<String,String>
Additional tags for cluster resources.
dataSecurityMode String
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
defaultTags Map<String,String>
dockerImage GetClusterClusterInfoDockerImage
driver GetClusterClusterInfoDriver
driverInstancePoolId String
Similar to instance_pool_id, but for the driver node.
driverNodeTypeId String
The node type of the Spark driver.
enableElasticDisk Boolean
Use autoscaling local storage.
enableLocalDiskEncryption Boolean
Enable local disk encryption.
executors List<GetClusterClusterInfoExecutor>
gcpAttributes GetClusterClusterInfoGcpAttributes
initScripts List<GetClusterClusterInfoInitScript>
instancePoolId String
The pool of idle instances the cluster is attached to.
isSingleNode Boolean
jdbcPort Integer
kind String
lastRestartedTime Integer
lastStateLossTime Integer
nodeTypeId String
Any supported databricks.getNodeType id.
numWorkers Integer
policyId String
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtimeEngine String
The type of runtime of the cluster
singleUserName String
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
sparkConf Map<String,String>
Map with key-value pairs to fine-tune Spark clusters.
sparkContextId Integer
sparkEnvVars Map<String,String>
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
sparkVersion String
Runtime version of the cluster.
spec GetClusterClusterInfoSpec
sshPublicKeys List<String>
SSH public key contents that will be added to each Spark node in this cluster.
startTime Integer
state String
stateMessage String
terminatedTime Integer
terminationReason GetClusterClusterInfoTerminationReason
useMlRuntime Boolean
workloadType GetClusterClusterInfoWorkloadType
autoscale GetClusterClusterInfoAutoscale
autoterminationMinutes number
Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
awsAttributes GetClusterClusterInfoAwsAttributes
azureAttributes GetClusterClusterInfoAzureAttributes
clusterCores number
clusterId string
The id of the cluster.
clusterLogConf GetClusterClusterInfoClusterLogConf
clusterLogStatus GetClusterClusterInfoClusterLogStatus
clusterMemoryMb number
clusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
clusterSource string
creatorUserName string
customTags {[key: string]: string}
Additional tags for cluster resources.
dataSecurityMode string
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
defaultTags {[key: string]: string}
dockerImage GetClusterClusterInfoDockerImage
driver GetClusterClusterInfoDriver
driverInstancePoolId string
Similar to instance_pool_id, but for the driver node.
driverNodeTypeId string
The node type of the Spark driver.
enableElasticDisk boolean
Use autoscaling local storage.
enableLocalDiskEncryption boolean
Enable local disk encryption.
executors GetClusterClusterInfoExecutor[]
gcpAttributes GetClusterClusterInfoGcpAttributes
initScripts GetClusterClusterInfoInitScript[]
instancePoolId string
The pool of idle instances the cluster is attached to.
isSingleNode boolean
jdbcPort number
kind string
lastRestartedTime number
lastStateLossTime number
nodeTypeId string
Any supported databricks.getNodeType id.
numWorkers number
policyId string
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtimeEngine string
The type of runtime of the cluster
singleUserName string
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
sparkConf {[key: string]: string}
Map with key-value pairs to fine-tune Spark clusters.
sparkContextId number
sparkEnvVars {[key: string]: string}
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
sparkVersion string
Runtime version of the cluster.
spec GetClusterClusterInfoSpec
sshPublicKeys string[]
SSH public key contents that will be added to each Spark node in this cluster.
startTime number
state string
stateMessage string
terminatedTime number
terminationReason GetClusterClusterInfoTerminationReason
useMlRuntime boolean
workloadType GetClusterClusterInfoWorkloadType
autoscale GetClusterClusterInfoAutoscale
autotermination_minutes int
Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
aws_attributes GetClusterClusterInfoAwsAttributes
azure_attributes GetClusterClusterInfoAzureAttributes
cluster_cores float
cluster_id str
The id of the cluster.
cluster_log_conf GetClusterClusterInfoClusterLogConf
cluster_log_status GetClusterClusterInfoClusterLogStatus
cluster_memory_mb int
cluster_name str
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
cluster_source str
creator_user_name str
custom_tags Mapping[str, str]
Additional tags for cluster resources.
data_security_mode str
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
default_tags Mapping[str, str]
docker_image GetClusterClusterInfoDockerImage
driver GetClusterClusterInfoDriver
driver_instance_pool_id str
Similar to instance_pool_id, but for the driver node.
driver_node_type_id str
The node type of the Spark driver.
enable_elastic_disk bool
Use autoscaling local storage.
enable_local_disk_encryption bool
Enable local disk encryption.
executors Sequence[GetClusterClusterInfoExecutor]
gcp_attributes GetClusterClusterInfoGcpAttributes
init_scripts Sequence[GetClusterClusterInfoInitScript]
instance_pool_id str
The pool of idle instances the cluster is attached to.
is_single_node bool
jdbc_port int
kind str
last_restarted_time int
last_state_loss_time int
node_type_id str
Any supported databricks.getNodeType id.
num_workers int
policy_id str
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtime_engine str
The type of runtime of the cluster
single_user_name str
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
spark_conf Mapping[str, str]
Map with key-value pairs to fine-tune Spark clusters.
spark_context_id int
spark_env_vars Mapping[str, str]
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
spark_version str
Runtime version of the cluster.
spec GetClusterClusterInfoSpec
ssh_public_keys Sequence[str]
SSH public key contents that will be added to each Spark node in this cluster.
start_time int
state str
state_message str
terminated_time int
termination_reason GetClusterClusterInfoTerminationReason
use_ml_runtime bool
workload_type GetClusterClusterInfoWorkloadType
autoscale Property Map
autoterminationMinutes Number
Automatically terminate the cluster after being inactive for this time in minutes. If specified, the threshold must be between 10 and 10000 minutes. You can also set this value to 0 to explicitly disable automatic termination.
awsAttributes Property Map
azureAttributes Property Map
clusterCores Number
clusterId String
The id of the cluster.
clusterLogConf Property Map
clusterLogStatus Property Map
clusterMemoryMb Number
clusterName String
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
clusterSource String
creatorUserName String
customTags Map<String>
Additional tags for cluster resources.
dataSecurityMode String
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
defaultTags Map<String>
dockerImage Property Map
driver Property Map
driverInstancePoolId String
Similar to instance_pool_id, but for the driver node.
driverNodeTypeId String
The node type of the Spark driver.
enableElasticDisk Boolean
Use autoscaling local storage.
enableLocalDiskEncryption Boolean
Enable local disk encryption.
executors List<Property Map>
gcpAttributes Property Map
initScripts List<Property Map>
instancePoolId String
The pool of idle instances the cluster is attached to.
isSingleNode Boolean
jdbcPort Number
kind String
lastRestartedTime Number
lastStateLossTime Number
nodeTypeId String
Any supported databricks.getNodeType id.
numWorkers Number
policyId String
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtimeEngine String
The type of runtime of the cluster
singleUserName String
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
sparkConf Map<String>
Map with key-value pairs to fine-tune Spark clusters.
sparkContextId Number
sparkEnvVars Map<String>
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
sparkVersion String
Runtime version of the cluster.
spec Property Map
sshPublicKeys List<String>
SSH public key contents that will be added to each Spark node in this cluster.
startTime Number
state String
stateMessage String
terminatedTime Number
terminationReason Property Map
useMlRuntime Boolean
workloadType Property Map

GetClusterClusterInfoAutoscale

maxWorkers Integer
minWorkers Integer
maxWorkers number
minWorkers number
maxWorkers Number
minWorkers Number

GetClusterClusterInfoAwsAttributes

GetClusterClusterInfoAzureAttributes

GetClusterClusterInfoAzureAttributesLogAnalyticsInfo

GetClusterClusterInfoClusterLogConf

GetClusterClusterInfoClusterLogConfDbfs

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoClusterLogConfS3

Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String
destination This property is required. string
cannedAcl string
enableEncryption boolean
encryptionType string
endpoint string
kmsKey string
region string
destination This property is required. str
canned_acl str
enable_encryption bool
encryption_type str
endpoint str
kms_key str
region str
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String

GetClusterClusterInfoClusterLogConfVolumes

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoClusterLogStatus

GetClusterClusterInfoDockerImage

GetClusterClusterInfoDockerImageBasicAuth

Password string
Username string
Password string
Username string
password String
username String
password string
username string
password String
username String

GetClusterClusterInfoDriver

GetClusterClusterInfoDriverNodeAwsAttributes

IsSpot bool
IsSpot bool
isSpot Boolean
isSpot boolean
is_spot bool
isSpot Boolean

GetClusterClusterInfoExecutor

GetClusterClusterInfoExecutorNodeAwsAttributes

IsSpot bool
IsSpot bool
isSpot Boolean
isSpot boolean
is_spot bool
isSpot Boolean

GetClusterClusterInfoGcpAttributes

GetClusterClusterInfoInitScript

GetClusterClusterInfoInitScriptAbfss

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoInitScriptDbfs

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoInitScriptFile

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoInitScriptGcs

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoInitScriptS3

Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String
destination This property is required. string
cannedAcl string
enableEncryption boolean
encryptionType string
endpoint string
kmsKey string
region string
destination This property is required. str
canned_acl str
enable_encryption bool
encryption_type str
endpoint str
kms_key str
region str
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String

GetClusterClusterInfoInitScriptVolumes

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoInitScriptWorkspace

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpec

ClusterId This property is required. string
The id of the cluster.
DriverInstancePoolId This property is required. string
Similar to instance_pool_id, but for the driver node.
DriverNodeTypeId This property is required. string
The node type of the Spark driver.
EnableElasticDisk This property is required. bool
Use autoscaling local storage.
EnableLocalDiskEncryption This property is required. bool
Enable local disk encryption.
NodeTypeId This property is required. string
Any supported databricks.getNodeType id.
SparkVersion This property is required. string
Runtime version of the cluster.
ApplyPolicyDefaultValues bool
Autoscale GetClusterClusterInfoSpecAutoscale
AwsAttributes GetClusterClusterInfoSpecAwsAttributes
AzureAttributes GetClusterClusterInfoSpecAzureAttributes
ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
ClusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
ClusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
CustomTags Dictionary<string, string>
Additional tags for cluster resources.
DataSecurityMode string
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
DockerImage GetClusterClusterInfoSpecDockerImage
GcpAttributes GetClusterClusterInfoSpecGcpAttributes
IdempotencyToken string
An optional token to guarantee the idempotency of cluster creation requests.
InitScripts List<GetClusterClusterInfoSpecInitScript>
InstancePoolId string
The pool of idle instances the cluster is attached to.
IsSingleNode bool
Kind string
Libraries List<GetClusterClusterInfoSpecLibrary>
NumWorkers int
PolicyId string
Identifier of Cluster Policy to validate cluster and preset certain defaults.
RuntimeEngine string
The type of runtime of the cluster
SingleUserName string
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
SparkConf Dictionary<string, string>
Map with key-value pairs to fine-tune Spark clusters.
SparkEnvVars Dictionary<string, string>
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
SshPublicKeys List<string>
SSH public key contents that will be added to each Spark node in this cluster.
UseMlRuntime bool
WorkloadType GetClusterClusterInfoSpecWorkloadType
ClusterId This property is required. string
The id of the cluster.
DriverInstancePoolId This property is required. string
Similar to instance_pool_id, but for the driver node.
DriverNodeTypeId This property is required. string
The node type of the Spark driver.
EnableElasticDisk This property is required. bool
Use autoscaling local storage.
EnableLocalDiskEncryption This property is required. bool
Enable local disk encryption.
NodeTypeId This property is required. string
Any supported databricks.getNodeType id.
SparkVersion This property is required. string
Runtime version of the cluster.
ApplyPolicyDefaultValues bool
Autoscale GetClusterClusterInfoSpecAutoscale
AwsAttributes GetClusterClusterInfoSpecAwsAttributes
AzureAttributes GetClusterClusterInfoSpecAzureAttributes
ClusterLogConf GetClusterClusterInfoSpecClusterLogConf
ClusterMountInfos []GetClusterClusterInfoSpecClusterMountInfo
ClusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
CustomTags map[string]string
Additional tags for cluster resources.
DataSecurityMode string
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
DockerImage GetClusterClusterInfoSpecDockerImage
GcpAttributes GetClusterClusterInfoSpecGcpAttributes
IdempotencyToken string
An optional token to guarantee the idempotency of cluster creation requests.
InitScripts []GetClusterClusterInfoSpecInitScript
InstancePoolId string
The pool of idle instances the cluster is attached to.
IsSingleNode bool
Kind string
Libraries []GetClusterClusterInfoSpecLibrary
NumWorkers int
PolicyId string
Identifier of Cluster Policy to validate cluster and preset certain defaults.
RuntimeEngine string
The type of runtime of the cluster
SingleUserName string
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
SparkConf map[string]string
Map with key-value pairs to fine-tune Spark clusters.
SparkEnvVars map[string]string
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
SshPublicKeys []string
SSH public key contents that will be added to each Spark node in this cluster.
UseMlRuntime bool
WorkloadType GetClusterClusterInfoSpecWorkloadType
clusterId This property is required. String
The id of the cluster.
driverInstancePoolId This property is required. String
Similar to instance_pool_id, but for the driver node.
driverNodeTypeId This property is required. String
The node type of the Spark driver.
enableElasticDisk This property is required. Boolean
Use autoscaling local storage.
enableLocalDiskEncryption This property is required. Boolean
Enable local disk encryption.
nodeTypeId This property is required. String
Any supported databricks.getNodeType id.
sparkVersion This property is required. String
Runtime version of the cluster.
applyPolicyDefaultValues Boolean
autoscale GetClusterClusterInfoSpecAutoscale
awsAttributes GetClusterClusterInfoSpecAwsAttributes
azureAttributes GetClusterClusterInfoSpecAzureAttributes
clusterLogConf GetClusterClusterInfoSpecClusterLogConf
clusterMountInfos List<GetClusterClusterInfoSpecClusterMountInfo>
clusterName String
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
customTags Map<String,String>
Additional tags for cluster resources.
dataSecurityMode String
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
dockerImage GetClusterClusterInfoSpecDockerImage
gcpAttributes GetClusterClusterInfoSpecGcpAttributes
idempotencyToken String
An optional token to guarantee the idempotency of cluster creation requests.
initScripts List<GetClusterClusterInfoSpecInitScript>
instancePoolId String
The pool of idle instances the cluster is attached to.
isSingleNode Boolean
kind String
libraries List<GetClusterClusterInfoSpecLibrary>
numWorkers Integer
policyId String
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtimeEngine String
The type of runtime of the cluster
singleUserName String
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
sparkConf Map<String,String>
Map with key-value pairs to fine-tune Spark clusters.
sparkEnvVars Map<String,String>
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
sshPublicKeys List<String>
SSH public key contents that will be added to each Spark node in this cluster.
useMlRuntime Boolean
workloadType GetClusterClusterInfoSpecWorkloadType
clusterId This property is required. string
The id of the cluster.
driverInstancePoolId This property is required. string
Similar to instance_pool_id, but for the driver node.
driverNodeTypeId This property is required. string
The node type of the Spark driver.
enableElasticDisk This property is required. boolean
Use autoscaling local storage.
enableLocalDiskEncryption This property is required. boolean
Enable local disk encryption.
nodeTypeId This property is required. string
Any supported databricks.getNodeType id.
sparkVersion This property is required. string
Runtime version of the cluster.
applyPolicyDefaultValues boolean
autoscale GetClusterClusterInfoSpecAutoscale
awsAttributes GetClusterClusterInfoSpecAwsAttributes
azureAttributes GetClusterClusterInfoSpecAzureAttributes
clusterLogConf GetClusterClusterInfoSpecClusterLogConf
clusterMountInfos GetClusterClusterInfoSpecClusterMountInfo[]
clusterName string
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
customTags {[key: string]: string}
Additional tags for cluster resources.
dataSecurityMode string
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
dockerImage GetClusterClusterInfoSpecDockerImage
gcpAttributes GetClusterClusterInfoSpecGcpAttributes
idempotencyToken string
An optional token to guarantee the idempotency of cluster creation requests.
initScripts GetClusterClusterInfoSpecInitScript[]
instancePoolId string
The pool of idle instances the cluster is attached to.
isSingleNode boolean
kind string
libraries GetClusterClusterInfoSpecLibrary[]
numWorkers number
policyId string
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtimeEngine string
The type of runtime of the cluster
singleUserName string
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
sparkConf {[key: string]: string}
Map with key-value pairs to fine-tune Spark clusters.
sparkEnvVars {[key: string]: string}
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
sshPublicKeys string[]
SSH public key contents that will be added to each Spark node in this cluster.
useMlRuntime boolean
workloadType GetClusterClusterInfoSpecWorkloadType
cluster_id This property is required. str
The id of the cluster.
driver_instance_pool_id This property is required. str
Similar to instance_pool_id, but for the driver node.
driver_node_type_id This property is required. str
The node type of the Spark driver.
enable_elastic_disk This property is required. bool
Use autoscaling local storage.
enable_local_disk_encryption This property is required. bool
Enable local disk encryption.
node_type_id This property is required. str
Any supported databricks.getNodeType id.
spark_version This property is required. str
Runtime version of the cluster.
apply_policy_default_values bool
autoscale GetClusterClusterInfoSpecAutoscale
aws_attributes GetClusterClusterInfoSpecAwsAttributes
azure_attributes GetClusterClusterInfoSpecAzureAttributes
cluster_log_conf GetClusterClusterInfoSpecClusterLogConf
cluster_mount_infos Sequence[GetClusterClusterInfoSpecClusterMountInfo]
cluster_name str
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
custom_tags Mapping[str, str]
Additional tags for cluster resources.
data_security_mode str
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
docker_image GetClusterClusterInfoSpecDockerImage
gcp_attributes GetClusterClusterInfoSpecGcpAttributes
idempotency_token str
An optional token to guarantee the idempotency of cluster creation requests.
init_scripts Sequence[GetClusterClusterInfoSpecInitScript]
instance_pool_id str
The pool of idle instances the cluster is attached to.
is_single_node bool
kind str
libraries Sequence[GetClusterClusterInfoSpecLibrary]
num_workers int
policy_id str
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtime_engine str
The type of runtime of the cluster
single_user_name str
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
spark_conf Mapping[str, str]
Map with key-value pairs to fine-tune Spark clusters.
spark_env_vars Mapping[str, str]
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
ssh_public_keys Sequence[str]
SSH public key contents that will be added to each Spark node in this cluster.
use_ml_runtime bool
workload_type GetClusterClusterInfoSpecWorkloadType
clusterId This property is required. String
The id of the cluster.
driverInstancePoolId This property is required. String
Similar to instance_pool_id, but for the driver node.
driverNodeTypeId This property is required. String
The node type of the Spark driver.
enableElasticDisk This property is required. Boolean
Use autoscaling local storage.
enableLocalDiskEncryption This property is required. Boolean
Enable local disk encryption.
nodeTypeId This property is required. String
Any supported databricks.getNodeType id.
sparkVersion This property is required. String
Runtime version of the cluster.
applyPolicyDefaultValues Boolean
autoscale Property Map
awsAttributes Property Map
azureAttributes Property Map
clusterLogConf Property Map
clusterMountInfos List<Property Map>
clusterName String
The exact name of the cluster to search. Can only be specified if there is exactly one cluster with the provided name.
customTags Map<String>
Additional tags for cluster resources.
dataSecurityMode String
Security features of the cluster. Unity Catalog requires SINGLE_USER or USER_ISOLATION mode. LEGACY_PASSTHROUGH is for passthrough clusters and LEGACY_TABLE_ACL is for Table ACL clusters. Defaults to NONE, i.e. no security features are enabled.
dockerImage Property Map
gcpAttributes Property Map
idempotencyToken String
An optional token to guarantee the idempotency of cluster creation requests.
initScripts List<Property Map>
instancePoolId String
The pool of idle instances the cluster is attached to.
isSingleNode Boolean
kind String
libraries List<Property Map>
numWorkers Number
policyId String
Identifier of Cluster Policy to validate cluster and preset certain defaults.
runtimeEngine String
The type of runtime of the cluster
singleUserName String
The optional user name of the user to assign to an interactive cluster. This field is required when using standard AAD Passthrough for Azure Data Lake Storage (ADLS) with a single-user cluster (i.e., not high-concurrency clusters).
sparkConf Map<String>
Map with key-value pairs to fine-tune Spark clusters.
sparkEnvVars Map<String>
Map with environment variable key-value pairs to fine-tune Spark clusters. Key-value pairs of the form (X,Y) are exported (i.e., X='Y') while launching the driver and workers.
sshPublicKeys List<String>
SSH public key contents that will be added to each Spark node in this cluster.
useMlRuntime Boolean
workloadType Property Map

GetClusterClusterInfoSpecAutoscale

maxWorkers Integer
minWorkers Integer
maxWorkers number
minWorkers number
maxWorkers Number
minWorkers Number

GetClusterClusterInfoSpecAwsAttributes

GetClusterClusterInfoSpecAzureAttributes

GetClusterClusterInfoSpecAzureAttributesLogAnalyticsInfo

GetClusterClusterInfoSpecClusterLogConf

GetClusterClusterInfoSpecClusterLogConfDbfs

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecClusterLogConfS3

Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String
destination This property is required. string
cannedAcl string
enableEncryption boolean
encryptionType string
endpoint string
kmsKey string
region string
destination This property is required. str
canned_acl str
enable_encryption bool
encryption_type str
endpoint str
kms_key str
region str
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String

GetClusterClusterInfoSpecClusterLogConfVolumes

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecClusterMountInfo

localMountDirPath This property is required. String
networkFilesystemInfo This property is required. Property Map
remoteMountDirPath String

GetClusterClusterInfoSpecClusterMountInfoNetworkFilesystemInfo

ServerAddress This property is required. string
MountOptions string
ServerAddress This property is required. string
MountOptions string
serverAddress This property is required. String
mountOptions String
serverAddress This property is required. string
mountOptions string
server_address This property is required. str
mount_options str
serverAddress This property is required. String
mountOptions String

GetClusterClusterInfoSpecDockerImage

url This property is required. String
basicAuth Property Map

GetClusterClusterInfoSpecDockerImageBasicAuth

Password This property is required. string
Username This property is required. string
Password This property is required. string
Username This property is required. string
password This property is required. String
username This property is required. String
password This property is required. string
username This property is required. string
password This property is required. str
username This property is required. str
password This property is required. String
username This property is required. String

GetClusterClusterInfoSpecGcpAttributes

GetClusterClusterInfoSpecInitScript

abfss Property Map
dbfs Property Map

Deprecated: For init scripts use 'volumes', 'workspace' or cloud storage location instead of 'dbfs'.

file Property Map
gcs Property Map
s3 Property Map
volumes Property Map
workspace Property Map

GetClusterClusterInfoSpecInitScriptAbfss

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecInitScriptDbfs

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecInitScriptFile

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecInitScriptGcs

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecInitScriptS3

Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
Destination This property is required. string
CannedAcl string
EnableEncryption bool
EncryptionType string
Endpoint string
KmsKey string
Region string
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String
destination This property is required. string
cannedAcl string
enableEncryption boolean
encryptionType string
endpoint string
kmsKey string
region string
destination This property is required. str
canned_acl str
enable_encryption bool
encryption_type str
endpoint str
kms_key str
region str
destination This property is required. String
cannedAcl String
enableEncryption Boolean
encryptionType String
endpoint String
kmsKey String
region String

GetClusterClusterInfoSpecInitScriptVolumes

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecInitScriptWorkspace

Destination This property is required. string
Destination This property is required. string
destination This property is required. String
destination This property is required. string
destination This property is required. str
destination This property is required. String

GetClusterClusterInfoSpecLibrary

GetClusterClusterInfoSpecLibraryCran

Package This property is required. string
Repo string
Package This property is required. string
Repo string
package_ This property is required. String
repo String
package This property is required. string
repo string
package This property is required. str
repo str
package This property is required. String
repo String

GetClusterClusterInfoSpecLibraryMaven

Coordinates This property is required. string
Exclusions List<string>
Repo string
Coordinates This property is required. string
Exclusions []string
Repo string
coordinates This property is required. String
exclusions List<String>
repo String
coordinates This property is required. string
exclusions string[]
repo string
coordinates This property is required. str
exclusions Sequence[str]
repo str
coordinates This property is required. String
exclusions List<String>
repo String

GetClusterClusterInfoSpecLibraryPypi

Package This property is required. string
Repo string
Package This property is required. string
Repo string
package_ This property is required. String
repo String
package This property is required. string
repo string
package This property is required. str
repo str
package This property is required. String
repo String

GetClusterClusterInfoSpecWorkloadType

clients This property is required. Property Map

GetClusterClusterInfoSpecWorkloadTypeClients

Jobs bool
Notebooks bool
Jobs bool
Notebooks bool
jobs Boolean
notebooks Boolean
jobs boolean
notebooks boolean
jobs bool
notebooks bool
jobs Boolean
notebooks Boolean

GetClusterClusterInfoTerminationReason

Code string
Parameters Dictionary<string, string>
Type string
Code string
Parameters map[string]string
Type string
code String
parameters Map<String,String>
type String
code string
parameters {[key: string]: string}
type string
code str
parameters Mapping[str, str]
type str
code String
parameters Map<String>
type String

GetClusterClusterInfoWorkloadType

clients This property is required. Property Map

GetClusterClusterInfoWorkloadTypeClients

Jobs bool
Notebooks bool
Jobs bool
Notebooks bool
jobs Boolean
notebooks Boolean
jobs boolean
notebooks boolean
jobs bool
notebooks bool
jobs Boolean
notebooks Boolean

Package Details

Repository
pulumi/pulumi-databricks
License
Apache-2.0
Notes
This Pulumi package is based on the databricks Terraform Provider.