Jenkins Region-Aware Kubernetes Deploy with Pkl and Python

This example combines:

  1. Pkl for approved region-to-resource policy.
  2. Python for runtime cluster detection and Helm argument construction.
  3. Jenkins Shared Library for consistent pipeline usage.

Result: a single Jenkins step can deploy N replicas while pinning pods to a region-approved zone and injecting the nearest artifact store.

Create resources/pkl/RegionMap.pkl in your shared library repository.

module RegionMap

class RegionalConfig {
  region: String
  artifactStore: String
  nodeZone: String
}

routes: Map<String, RegionalConfig> = Map(
  "us-east-1", new RegionalConfig {
    region = "us-east-1"
    artifactStore = "s3://org-us-east-artifacts"
    nodeZone = "us-east-1a"
  },
  "eu-central-1", new RegionalConfig {
    region = "eu-central-1"
    artifactStore = "s3://org-eu-central-artifacts"
    nodeZone = "eu-central-1b"
  }
)
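
For reference, evaluating this module with pkl eval --format json renders the module's properties as a JSON document. A minimal sketch of that structure, written as the Python dict it parses into (the orchestrator script below consumes exactly this shape; formatting may differ slightly):

# Sketch: output of `pkl eval --format json RegionMap.pkl`, after json.loads().
expected_region_map = {
    "routes": {
        "us-east-1": {
            "region": "us-east-1",
            "artifactStore": "s3://org-us-east-artifacts",
            "nodeZone": "us-east-1a",
        },
        "eu-central-1": {
            "region": "eu-central-1",
            "artifactStore": "s3://org-eu-central-artifacts",
            "nodeZone": "eu-central-1b",
        },
    }
}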

Create resources/scripts/orchestrator.py.

This version evaluates Pkl through the pkl CLI, validates the detected region, and runs Helm with node affinity.

import json
import subprocess
import sys

from kubernetes import client, config


def get_current_cluster_region() -> str:
    # Read the region label from the first node of the cluster the kubeconfig points at.
    config.load_kube_config()
    v1 = client.CoreV1Api()
    nodes = v1.list_node().items
    if not nodes:
        raise RuntimeError("No Kubernetes nodes found")
    return nodes[0].metadata.labels.get("topology.kubernetes.io/region", "")


def load_region_map(pkl_file: str) -> dict:
    # Evaluate the Pkl module to JSON via the pkl CLI and parse it into a dict.
    result = subprocess.run(
        ["pkl", "eval", "--format", "json", pkl_file],
        check=True,
        text=True,
        capture_output=True,
    )
    return json.loads(result.stdout)


def run_region_aware_deploy(app_name: str, replicas: int) -> None:
    region_map = load_region_map("RegionMap.pkl")
    routes = region_map.get("routes", {})
    current_region = get_current_cluster_region()

    route = routes.get(current_region)
    if not route:
        print(f"Error: region '{current_region}' is not approved in RegionMap.pkl")
        sys.exit(1)

    node_zone = route["nodeZone"]
    artifact_store = route["artifactStore"]

    # Pin pods to the approved zone via required node affinity and inject the
    # nearest artifact store as an environment variable.
    deploy_cmd = [
        "helm",
        "upgrade",
        "--install",
        app_name,
        "./charts",
        "--set",
        f"replicaCount={replicas}",
        "--set",
        f"env.ARTIFACT_STORE={artifact_store}",
        "--set",
        "affinity.nodeAffinity.requiredDuringSchedulingIgnoredDuringExecution."
        "nodeSelectorTerms[0].matchExpressions[0].key=topology.kubernetes.io/zone",
        "--set",
        "affinity.nodeAffinity.requiredDuringSchedulingIgnoredDuringExecution."
        "nodeSelectorTerms[0].matchExpressions[0].operator=In",
        "--set",
        "affinity.nodeAffinity.requiredDuringSchedulingIgnoredDuringExecution."
        f"nodeSelectorTerms[0].matchExpressions[0].values[0]={node_zone}",
        "--wait",
    ]

    print(
        f"Deploying app={app_name} replicas={replicas} "
        f"region={current_region} zone={node_zone}"
    )
    subprocess.run(deploy_cmd, check=True)


if __name__ == "__main__":
    if len(sys.argv) != 3:
        print("Usage: python3 orchestrator.py <app_name> <replicas>")
        sys.exit(2)
    run_region_aware_deploy(sys.argv[1], int(sys.argv[2]))
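
For troubleshooting on an agent, the two helpers can be exercised without running Helm. A minimal sketch, assuming RegionMap.pkl sits in the working directory; the debug_route.py file name and the resolve_route helper are hypothetical, not part of the script above:

# debug_route.py -- hypothetical helper that reuses the functions from orchestrator.py.
from orchestrator import get_current_cluster_region, load_region_map


def resolve_route(pkl_file: str = "RegionMap.pkl") -> None:
    # Resolve the current region and print the route a deploy would use, without deploying.
    region = get_current_cluster_region()
    route = load_region_map(pkl_file).get("routes", {}).get(region)
    if route is None:
        print(f"region '{region}' is not approved in {pkl_file}")
    else:
        print(f"region={region} zone={route['nodeZone']} store={route['artifactStore']}")


if __name__ == "__main__":
    resolve_route()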

Create vars/regionAwareDeploy.groovy.

def call(Map cfg = [:]) {
    String appName = cfg.appName ?: error('appName is required')
    int replicas = (cfg.replicas ?: 200) as int

    pipeline {
        agent { label 'python-worker' }
        stages {
            stage('Region-Aware Deploy') {
                steps {
                    script {
                        def pklMap = libraryResource 'pkl/RegionMap.pkl'
                        def orchestrator = libraryResource 'scripts/orchestrator.py'
                        writeFile file: 'RegionMap.pkl', text: pklMap
                        writeFile file: 'orchestrator.py', text: orchestrator
                        sh "python3 orchestrator.py ${appName} ${replicas}"
                    }
                }
            }
        }
    }
}

Use this structure in your Jenkins shared library repository:

.
|-- vars/
|   `-- regionAwareDeploy.groovy
`-- resources/
    |-- pkl/
    |   `-- RegionMap.pkl
    `-- scripts/
        `-- orchestrator.py

Example Jenkinsfile usage in an application repository:

@Library('apple-global-lib') _

regionAwareDeploy(
    appName: 'artifact-router',
    replicas: 200
)

Agent and Jenkins setup:
  1. Install the Pkl CLI on Jenkins agents that run the step.
curl -L https://github.com/apple/pkl/releases/download/0.25.2/pkl-linux-amd64 \
-o /usr/local/bin/pkl
chmod +x /usr/local/bin/pkl
  2. Install Python dependencies on the same agents.
pip install kubernetes
  3. Ensure Helm and a working kubeconfig are available to the Jenkins agent (a preflight sketch follows this list).
  4. Register the shared library in Jenkins (Manage Jenkins -> System -> Global Pipeline Libraries).
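
A minimal preflight sketch for item 3, assuming the agent should have pkl, helm, and kubectl on PATH plus a usable kubeconfig; the preflight.py file name and exit code 3 are illustrative choices, not part of the shared library above:

# preflight.py -- illustrative agent check, not part of the shared library.
import shutil
import sys


def preflight() -> None:
    # Fail fast if any required CLI is missing from the agent's PATH.
    missing = [tool for tool in ("pkl", "helm", "kubectl") if shutil.which(tool) is None]
    if missing:
        print(f"Missing required tools on this agent: {', '.join(missing)}")
        sys.exit(3)
    # Confirm the kubeconfig loads before the real deploy step runs.
    try:
        from kubernetes import config
        config.load_kube_config()
    except Exception as exc:
        print(f"kubeconfig is not usable: {exc}")
        sys.exit(3)


if __name__ == "__main__":
    preflight()
    print("Agent preflight passed")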

Use these failure paths to verify guardrails are working as designed:

  1. Unapproved region:
    • Condition: cluster label resolves to a region missing from routes.
    • Result: script exits with code 1 at if not route.
    • Example log: Error: region 'ap-south-1' is not approved in RegionMap.pkl.
  2. No Kubernetes access:
    • Condition: missing kubeconfig or broken RBAC.
    • Result: config.load_kube_config() or list_node() throws and the build fails fast.
  3. Helm deploy failure:
    • Condition: bad chart values, an image pull failure, or a scheduling failure.
    • Result: subprocess.run(..., check=True) raises and Jenkins marks the stage as failed.

You can test the orchestration logic in a container, without a real cluster, by mocking the Kubernetes and Helm calls.

Minimal tests/test_orchestrator.py example:

from unittest.mock import patch, MagicMock

import orchestrator


@patch("orchestrator.subprocess.run")
@patch("orchestrator.get_current_cluster_region", return_value="us-east-1")
@patch("orchestrator.load_region_map")
def test_builds_expected_helm_command(mock_load_map, _mock_region, mock_run):
    mock_load_map.return_value = {
        "routes": {
            "us-east-1": {
                "artifactStore": "s3://org-us-east-artifacts",
                "nodeZone": "us-east-1a",
            }
        }
    }
    mock_run.return_value = MagicMock()

    orchestrator.run_region_aware_deploy("artifact-router", 200)

    cmd = mock_run.call_args[0][0]
    assert "helm" in cmd
    assert "replicaCount=200" in cmd
    assert "env.ARTIFACT_STORE=s3://org-us-east-artifacts" in cmd

Simple Dockerfile for tests:

FROM python:3.12-slim
WORKDIR /workspace
COPY . /workspace
RUN pip install pytest kubernetes
CMD ["pytest", "-q", "tests/test_orchestrator.py"]

Run locally:

docker build -t region-aware-tests .
docker run --rm region-aware-tests

For integration, run against an ephemeral cluster (for example kind) and validate affinity and env injection.

Example integration flow:

  1. Create a local cluster that has at least one worker node (for example, a multi-node kind configuration) and label the worker nodes with the expected topology labels.
  2. Deploy using python3 orchestrator.py artifact-router 3.
  3. Assert that the scheduled pods land in the matching node zone and carry the expected ARTIFACT_STORE env var.

Example checks:

kubectl get pods -l app.kubernetes.io/name=artifact-router -o wide
kubectl get pod <pod-name> -o jsonpath='{.spec.affinity.nodeAffinity}'
kubectl get pod <pod-name> -o jsonpath='{.spec.containers[0].env[?(@.name=="ARTIFACT_STORE")].value}'

Jenkins integration stage example:

stage('Integration Test (kind)') {
    steps {
        sh '''
            kind create cluster --name ci-region-test
            kubectl label nodes ci-region-test-worker topology.kubernetes.io/region=us-east-1 --overwrite
            kubectl label nodes ci-region-test-worker topology.kubernetes.io/zone=us-east-1a --overwrite
            python3 orchestrator.py artifact-router 3
            kubectl get pods -A
            kind delete cluster --name ci-region-test
        '''
    }
}

Use both Jenkins and GitHub together:

  1. Application repo:
    • Keep a Jenkinsfile that calls regionAwareDeploy(...).
    • Configure a Jenkins multibranch pipeline pointing at the GitHub repo.
    • Enable GitHub webhooks for push and PR events.
  2. Shared library repo:
    • Store vars/ and resources/.
    • Tag versions (for example v1.4.0) and pin the tag in the Jenkins library config.
    • Run CI on pull requests before Jenkins consumes new tags.

Example .github/workflows/library-ci.yml for shared library repo:

name: library-ci

on:
  pull_request:
  push:
    branches: [main]

jobs:
  python-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Install deps
        run: pip install pytest kubernetes
      - name: Run unit tests
        run: pytest -q tests/test_orchestrator.py

Example GitHub-triggered Jenkinsfile:

@Library('apple-global-lib@v1.4.0') _

pipeline {
    agent any
    stages {
        stage('Deploy') {
            when { branch 'main' }
            steps {
                regionAwareDeploy(appName: 'artifact-router', replicas: 200)
            }
        }
    }
}