Databricks Workflows is the fully-managed orchestrator for data, analytics, and AI. Today, we're happy to announce several enhancements that make it easier to bring the most demanding data and ML/AI workloads to the cloud.
Workflows offers high reliability across multiple major cloud providers: GCP, AWS, and Azure. Until today, this meant limiting the number of jobs that can be managed in a Databricks workspace to 1,000 (the exact number varies by tier). Customers running more data and ML/AI workloads had to partition jobs across workspaces to avoid hitting platform limits. Today, we're happy to announce that we're significantly increasing this limit to 10,000. The new platform limit is automatically available in all customer workspaces (except single-tenant).
Thousands of customers rely on the Jobs API to create and manage jobs from their applications, including CI/CD systems. Along with the increased job limit, we have introduced a faster, paginated version of the jobs/list API and added pagination to the jobs page.
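As a rough illustration, the sketch below pages through all jobs in a workspace using the paginated jobs/list endpoint. The workspace URL and token are placeholders, and the pagination fields (`page_token`, `next_page_token`, `has_more`) follow the Jobs API 2.1 reference; check the API docs for the exact fields available in your workspace.

```python
import requests

# Placeholder workspace URL and personal access token; substitute your own.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

def list_all_jobs():
    """Collect every job in the workspace via the paginated jobs/list API.

    Assumes token-based pagination (`page_token` / `next_page_token` / `has_more`)
    as described in the Jobs API 2.1 reference.
    """
    jobs, page_token = [], None
    while True:
        params = {"limit": 25}
        if page_token:
            params["page_token"] = page_token
        resp = requests.get(
            f"{HOST}/api/2.1/jobs/list",
            headers={"Authorization": f"Bearer {TOKEN}"},
            params=params,
        )
        resp.raise_for_status()
        body = resp.json()
        jobs.extend(body.get("jobs", []))
        if not body.get("has_more"):
            return jobs
        page_token = body.get("next_page_token")

print(f"Found {len(list_all_jobs())} jobs")
```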

The higher workspace limit also comes with a streamlined search experience that allows searching by name, tags, and job ID.

Put together, these new features allow scaling workspaces to thousands of jobs. For rare cases where the changes in behavior described above are not desired, it is possible to revert to the old behavior via the Admin Console (only possible for workspaces with up to 3,000 jobs). We strongly recommend that all customers switch to the new paginated API to list jobs, especially for workspaces with thousands of saved jobs.
To get started with Databricks Workflows, see the quickstart guide. We'd also love to hear from you about your experience and any other features you'd like to see.
Learn more about: