Update product names: Workflows→Jobs, Delta Live Tables→Spark Declarative Pipelines #4967
lennartkats-db wants to merge 7 commits into main from
Conversation
…tive Pipelines

Update all non-generated references to retired product names:

- "Databricks Workflows" / "Workflows" → "Databricks Jobs" / "Jobs"
- "Delta Live Tables" → "Spark Declarative Pipelines"
- "DLT" → "SDP" (in comments/internal code)
- Template parameter `include_dlt` → `include_sdp`
- Template file `dlt_pipeline.ipynb` → `sdp_pipeline.ipynb`

Generated files (schema JSON, docsgen, acceptance test outputs, Python models) are not updated here — regenerate with `make schema`, `make docs`, `make test-update`, `make test-update-templates`, and `make -C python codegen` after the upstream proto changes land.

Co-authored-by: Isaac
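The rename mapping above can be sketched as a small Python helper. This is a minimal illustration, not the PR's actual implementation: the mapping pairs come from the description, while the function itself is an assumption. Ordering matters — longer phrases must be replaced before their substrings (the full "Delta Live Tables" name before the bare "DLT" abbreviation, "Databricks Workflows" before bare "Workflows"):

```python
# Rename mapping from the PR description, ordered so full phrases are
# replaced before their substrings. The helper itself is a sketch.
RENAMES = [
    ("Databricks Workflows", "Databricks Jobs"),
    ("Workflows", "Jobs"),
    ("Delta Live Tables", "Spark Declarative Pipelines"),
    ("DLT", "SDP"),                              # comments/internal code
    ("include_dlt", "include_sdp"),              # template parameter
    ("dlt_pipeline.ipynb", "sdp_pipeline.ipynb"),  # template file
]

def apply_renames(text: str) -> str:
    """Apply the product-name renames to a single string, in order."""
    for old, new in RENAMES:
        text = text.replace(old, new)
    return text
```

Note that `"DLT" → "SDP"` is case-sensitive, so the lowercase identifiers `include_dlt` and `dlt_pipeline.ipynb` are untouched by it and handled by their own explicit entries.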
With include_pipeline properly wired (it was previously silently ignored as include_dlt), PIPELINE=no now excludes the pipeline resource. With only a job resource, dynamic_version causes 1 change and 0 unchanged, which is the correct behavior.

Co-authored-by: Isaac
The template renamed include_dlt to include_pipeline in a prior PR, but the combinations test intentionally still passes include_dlt, which gets silently ignored and defaults to yes. Renaming it to include_pipeline makes PIPELINE=no actually exclude pipelines, causing divergent output across variants, which the combinations framework doesn't support.

Co-authored-by: Isaac
The output was corrupted when running tests locally without terraform, replacing the successful deployment output with terraform init errors. This restores the correct output from main and applies the DLT→SDP string change.

Co-authored-by: Isaac
Changes
`include_dlt` → `include_sdp` and file `dlt_pipeline.ipynb` → `sdp_pipeline.ipynb` in the experimental-jobs-as-code template
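The parameter rename above changes the template's schema surface. The fragment below is a hypothetical sketch of what the renamed property might look like, shown as a Python dict; the field names mirror common template-schema conventions, but the actual schema file's contents are not reproduced in this PR text:

```python
# Hypothetical sketch of the renamed template property (not the actual
# schema file); "include_sdp" was formerly "include_dlt".
template_properties = {
    "include_sdp": {
        "type": "string",
        "enum": ["yes", "no"],
        "default": "yes",
        "description": "Include a sample Spark Declarative Pipelines pipeline",
    },
}
```

Callers that still pass the old `include_dlt` key would no longer match any declared property, which is why the combinations test mentioned above had to be updated alongside the rename.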