Hi all,
Within my company, METL was rolled out to more than 20 DevOps teams. To avoid each team reinventing the wheel when working with METL, we set up a Community of Practice. This community defines best practices such as naming conventions and provides easy-to-use, predefined shared jobs. In this post, I want to focus on the Shared Jobs functionality in METL and compare it with the new Shared Pipelines feature in DPC.
Shared Jobs in METL allowed us to create and publish our own reusable components that were not available out of the box. One example is a “Create Dynamic Table” shared job, which we enhanced with its own distinctive icon, as shown below.
The Shared Jobs functionality made it possible to add custom icons, human-readable parameters, and a dedicated help page explaining how those parameters should be used. This made shared components intuitive and easy to adopt across teams.
We are now migrating to DPC within our company. Before our DevOps teams can make this transition, we first need to migrate our Shared Jobs to DPC. Unfortunately, Shared Pipelines, which I initially assumed to be the direct successor of Shared Jobs in METL, turned out to be fundamentally different.
There is no way to visually distinguish Shared Pipelines from one another, other than renaming them, since they all use the same generic “Run Shared Pipeline” component.
In addition, the way parameters are handled has changed significantly. Previously, parameters were part of the component itself, with clear context. Now, parameters must be selected as scalar variables in a separate, generic window, without any contextual information.
Documentation options are also very limited, with only a generic description field available.
These limitations make it difficult for us to migrate our existing Shared Jobs to DPC. As a result, we are forced to rethink our entire way of working rather than being able to migrate smoothly.
Best regards,
Huseyin Kara