Configs · Apr 15, 2026 · 3 min read

Argo Workflows — Kubernetes-Native Workflow Engine

Argo Workflows is a CNCF-graduated workflow engine for orchestrating parallel jobs on Kubernetes, modeling pipelines as DAGs where each step runs as a container.

TL;DR
Argo Workflows orchestrates parallel jobs on Kubernetes by modeling pipelines as DAGs with container steps.
§01

What it is

Argo Workflows is a CNCF-graduated workflow engine for orchestrating parallel jobs on Kubernetes. It models data pipelines, CI jobs, batch processing, and ML training as directed acyclic graphs (DAGs) where each step runs as a container.

Argo Workflows is built for platform engineers and data teams who need Kubernetes-native workflow orchestration with artifact passing, retry logic, and a visual UI.

§02

How it saves time or tokens

Argo Workflows eliminates custom scripting for job orchestration. Instead of writing shell scripts that manage pod lifecycles, you declare workflows as YAML manifests. The engine handles parallelism, retries, timeouts, and artifact passing between steps. The web UI provides real-time visibility into running workflows.
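For instance, retries and timeouts are declared on a template rather than scripted by hand. A minimal sketch (the template name, image, and script are illustrative placeholders):

```yaml
# Illustrative template: declarative retries with exponential backoff
# and a hard per-step deadline, handled entirely by the controller.
- name: flaky-step
  retryStrategy:
    limit: "3"              # retry up to 3 times on failure
    backoff:
      duration: "10s"       # initial delay before the first retry
      factor: "2"           # double the delay on each subsequent retry
  activeDeadlineSeconds: 300  # kill the step if it runs longer than 5 minutes
  container:
    image: python:3.11
    command: [python, task.py]
```

The equivalent shell-script approach would need its own retry loop, backoff timer, and watchdog; here the controller enforces all three.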

§03

How to use

  1. Install the controller and UI:
kubectl create namespace argo
kubectl apply -n argo -f https://github.com/argoproj/argo-workflows/releases/latest/download/install.yaml
  2. Install the CLI and submit a workflow:
brew install argo
argo submit -n argo --watch https://raw.githubusercontent.com/argoproj/argo-workflows/main/examples/hello-world.yaml
  3. Access the UI:
kubectl -n argo port-forward svc/argo-server 2746:2746
# Open http://localhost:2746
§04

Example

apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: dag-pipeline-
spec:
  entrypoint: pipeline
  templates:
    - name: pipeline
      dag:
        tasks:
          - name: fetch-data
            template: run
            arguments:
              parameters: [{name: cmd, value: 'curl -o /tmp/data.csv https://example.com/data'}]
          - name: process
            template: run
            dependencies: [fetch-data]
            arguments:
              parameters: [{name: cmd, value: 'python process.py'}]
          - name: report
            template: run
            dependencies: [process]
            arguments:
              parameters: [{name: cmd, value: 'python report.py'}]
    - name: run
      inputs:
        parameters: [{name: cmd}]
      container:
        image: python:3.11
        command: [sh, -c]
        args: ['{{inputs.parameters.cmd}}']
§05


Common pitfalls

  • Not configuring artifact storage (S3/GCS), which limits artifact passing between steps
  • Running the argo-server without authentication in production environments
  • Creating workflows with too many parallel steps without setting resource quotas
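The first pitfall is avoided by configuring an artifact repository in the workflow controller's ConfigMap. A sketch, assuming an S3 bucket; the bucket name, endpoint, and secret names are placeholders you would replace with your own:

```yaml
# Sketch: default artifact repository in the workflow-controller ConfigMap.
# Bucket, endpoint, and secret names below are illustrative.
apiVersion: v1
kind: ConfigMap
metadata:
  name: workflow-controller-configmap
  namespace: argo
data:
  artifactRepository: |
    s3:
      bucket: my-artifact-bucket
      endpoint: s3.amazonaws.com
      accessKeySecret:
        name: s3-credentials
        key: accessKey
      secretKeySecret:
        name: s3-credentials
        key: secretKey
```

With a default repository set, workflows can declare output artifacts without repeating storage details in every manifest.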

Frequently Asked Questions

How does Argo Workflows compare to Airflow?

Argo Workflows is Kubernetes-native, running each step as a container. Airflow runs tasks as Python functions in a centralized scheduler. Argo suits containerized workloads; Airflow suits Python-centric data pipelines with its extensive operator library.

Can Argo Workflows handle ML training pipelines?

Yes. Argo Workflows supports GPU scheduling, artifact passing for model weights, parameter sweeps, and integration with Kubeflow. Many ML teams use it for training, evaluation, and deployment pipelines.
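GPU scheduling works through standard Kubernetes resource requests on a step's container. A sketch (the image and training script are placeholders):

```yaml
# Illustrative training step requesting one GPU via the standard
# Kubernetes extended-resource name exposed by the NVIDIA device plugin.
- name: train
  container:
    image: pytorch/pytorch:latest
    command: [python, train.py]
    resources:
      limits:
        nvidia.com/gpu: 1
```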

What is the difference between Argo Workflows and Argo CD?

Argo Workflows orchestrates batch jobs and pipelines. Argo CD handles continuous deployment (GitOps). They are separate projects under the Argo umbrella and can be used together or independently.

Does Argo Workflows support cron scheduling?

Yes. CronWorkflow resources let you schedule workflows on a cron expression. This is useful for recurring batch jobs, nightly builds, and periodic data processing.
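A minimal CronWorkflow sketch; the name, schedule, and script are illustrative:

```yaml
# Illustrative CronWorkflow: runs the embedded workflow every day at 02:00.
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: nightly-job
spec:
  schedule: "0 2 * * *"
  workflowSpec:
    entrypoint: main
    templates:
      - name: main
        container:
          image: python:3.11
          command: [python, nightly.py]
```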

How do I pass data between workflow steps?

Use artifacts (files stored in S3, GCS, or MinIO) or parameters (string values). Artifacts are better for large datasets; parameters work for small values like IDs or status codes.
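A sketch of the artifact mechanism: one template declares an output artifact by file path, another declares a matching input (template names and paths are illustrative):

```yaml
# Sketch: 'generate' exposes a file as an output artifact;
# 'consume' mounts a passed-in artifact at its input path.
templates:
  - name: generate
    container:
      image: python:3.11
      command: [sh, -c]
      args: ["echo hello > /tmp/out.txt"]
    outputs:
      artifacts:
        - name: result
          path: /tmp/out.txt
  - name: consume
    inputs:
      artifacts:
        - name: result
          path: /tmp/in.txt
    container:
      image: python:3.11
      command: [sh, -c]
      args: ["cat /tmp/in.txt"]
```

In a DAG, the caller wires the two together by passing the producer's output (e.g. `{{tasks.generate.outputs.artifacts.result}}`) as the consumer's input artifact argument. An artifact repository (S3, GCS, or MinIO) must be configured for this to work.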
