[Feature] Handle schema attribute in databricks_pipeline #4137

Merged: 1 commit, Oct 23, 2024
docs/resources/job.md (0 additions & 1 deletion)
@@ -372,7 +372,6 @@ This block describes the queue settings of the job:
* `periodic` - (Optional) configuration block to define a trigger for Periodic Triggers consisting of the following attributes:
  * `interval` - (Required) Specifies the interval at which the job should run.
  * `unit` - (Required) Options are {"DAYS", "HOURS", "WEEKS"}.
-
* `file_arrival` - (Optional) configuration block to define a trigger for [File Arrival events](https://learn.microsoft.com/en-us/azure/databricks/workflows/jobs/file-arrival-triggers) consisting of the following attributes:
  * `url` - (Required) URL to be monitored for file arrivals. The path must point to the root or a subpath of the external location. Note that the URL must have a trailing slash character (`/`).
  * `min_time_between_triggers_seconds` - (Optional) If set, the trigger starts a run only after the specified amount of time has passed since the last time the trigger fired. The minimum allowed value is 60 seconds.
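
For orientation, here is a minimal sketch of how these trigger attributes are used in a `databricks_job` resource (the job name and schedule values are hypothetical):

```hcl
resource "databricks_job" "nightly" {
  name = "nightly-refresh" # hypothetical job name

  trigger {
    periodic {
      interval = 1      # run every 1 unit
      unit     = "DAYS" # one of "DAYS", "HOURS", "WEEKS"
    }
  }
}
```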
docs/resources/pipeline.md (2 additions & 1 deletion)
@@ -80,7 +80,8 @@ The following arguments are supported:
* `photon` - A flag indicating whether to use the Photon engine. The default value is `false`.
* `serverless` - An optional flag indicating if serverless compute should be used for this DLT pipeline. Requires `catalog` to be set, as serverless compute can be used only with Unity Catalog.
* `catalog` - The name of the catalog in Unity Catalog. *Change of this parameter forces recreation of the pipeline.* (Conflicts with `storage`.)
-* `target` - The name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring the target setting allows you to view and query the pipeline output data from the Databricks UI.
+* `target` - (Optional, String, Conflicts with `schema`) The name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring the target setting allows you to view and query the pipeline output data from the Databricks UI.
+* `schema` - (Optional, String, Conflicts with `target`) The default schema (database) where tables are read from or published to. The presence of this attribute implies that the pipeline is in direct publishing mode.
* `edition` - optional name of the [product edition](https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-concepts.html#editions). Supported values are: `CORE`, `PRO`, `ADVANCED` (default). Not required when `serverless` is set to `true`.
* `channel` - optional name of the release channel for the Spark version used by the DLT pipeline. Supported values are: `CURRENT` (default) and `PREVIEW`.
* `budget_policy_id` - optional string specifying the ID of the budget policy for this DLT pipeline.
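
As an illustrative sketch (the pipeline name, catalog, schema, and notebook path are hypothetical), the new attribute would be used like this; note that `schema` and `target` are mutually exclusive:

```hcl
resource "databricks_pipeline" "this" {
  name       = "sales-pipeline" # hypothetical pipeline name
  catalog    = "main"           # hypothetical Unity Catalog catalog
  schema     = "sales"          # direct publishing mode; conflicts with `target`
  serverless = true

  library {
    notebook {
      path = "/Pipelines/sales" # hypothetical notebook path
    }
  }
}
```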
pipelines/resource_pipeline.go (2 additions & 0 deletions)
@@ -246,6 +246,8 @@ func (Pipeline) CustomizeSchema(s *common.CustomizableSchema) *common.CustomizableSchema {
s.SchemaPath("storage").SetConflictsWith([]string{"catalog"})
s.SchemaPath("catalog").SetConflictsWith([]string{"storage"})
s.SchemaPath("ingestion_definition", "connection_name").SetConflictsWith([]string{"ingestion_definition.0.ingestion_gateway_id"})
s.SchemaPath("target").SetConflictsWith([]string{"schema"})
s.SchemaPath("schema").SetConflictsWith([]string{"target"})

// MinItems fields
s.SchemaPath("library").SetMinItems(1)