With git alone, you can only go so far. While you can organize your repositories however you wish, a monorepo has worthwhile advantages, such as being able to make atomic changes that affect multiple parts of the system in a single commit, which eliminates entire classes of compatibility and integration issues. And you can always split a monorepo later (see git-filter-repo).
So, let’s say you’re a small-to-medium team using a monorepo. Let’s go ahead and say that this monorepo stores all of your company’s code, meaning it spans many different programming languages – it’s a polyglot monorepo. What tools can you use to manage versioning in a consistent way?
I'd argue that changesets is a solid choice, even though it's primarily focused on the JavaScript/TypeScript ecosystem.
Background
For any versioning tool, you’re typically looking for how to:
- Define what content appears in changelog/release notes
- Affect the version numbers of packages
- Automate committing the bumped metadata and pushing tags
- Automate builds in reaction to those tags
changesets assumes per-package semantic versioning (i.e., every package has its own version). In addition, each package gets its own CHANGELOG.md.
The changesets team also maintains a GitHub Action, changesets/action, which importantly allows specifying custom scripts for the version and publish commands. That one hook is what makes adapting changesets to polyglot repositories possible.
With changesets, engineers commit “changeset” files to the repository that define what content ends up in the changelog, and which package versions are bumped (i.e., major, minor, or patch).
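A changeset is just a small markdown file with YAML frontmatter. A sketch (the CLI generates a random filename; the package name and note below are made up for illustration):

```markdown
---
"python-one": minor
---

Add support for resuming interrupted syncs.
```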
See the changesets documentation for more details.
Implementing an automated release process on GitHub
I’m a fan of just. I also really like uv scripts. The examples below use both.
I’m also going to assume that you’re in an enterprise setting where all your monorepos are private, not open-source.
Repository setup
My recommended organization (at least at the time of writing) is something like this:
.
├── .changeset
│   ├── config.json
│   └── README.md
├── contrib
│   └── utils
├── docker
│   └── Dockerfile
├── docs
│   ├── package.json
│   ├── pnpm-lock.yaml
│   ├── ...
│   └── pnpm-workspace.yaml
├── Justfile
├── package-lock.json
├── package.json
├── packages
│   ├── python-one
│   │   ├── ...
│   │   └── package.json
│   ├── rust-one
│   │   ├── ...
│   │   └── package.json
│   └── rust-two
│       ├── ...
│       └── package.json
├── pnpm-workspace.yaml
└── third-party
Keep all packages in one packages/ directory, no matter what language they’re written in. I also like keeping documentation as code, so I’ll assume you have a docs/ directory too, and that your docs are built with a JavaScript-based framework (like Starlight); this matters for a nuance highlighted later.
Changeset configuration
With this setup, you can configure changesets with a pnpm workspace at the repository root that covers all your packages.
# pnpm-workspace.yaml
packages:
  - "packages/**"
Then, declare your changesets dependencies:
// package.json
{
  "name": "example-monorepo",
  "private": true,
  "devDependencies": {
    "@changesets/changelog-git": "^0.2.0",
    "@changesets/cli": "^2.29.0"
  }
}
Now you should also update your .gitignore:
node_modules/
Because changesets is built for JavaScript, we also need “proxy” package.json files for all our packages; changesets uses these to perform the version bumps.
These can be as simple as:
// packages/python-one/package.json
{
  "name": "python-one",
  "version": "0.1.0",
  "private": true
}
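To keep these proxies from drifting, you can check that every package directory with a native manifest also has a proxy package.json. Here's a hypothetical helper (not part of changesets; `missing_proxies` is my own name), sketched in the same uv-script style as the rest of this post:

```python
#!/usr/bin/env -S uv run --script
#
# /// script
# requires-python = ">=3.12"
# dependencies = []
# ///
#
# Hypothetical helper: report packages that have a native manifest
# (Cargo.toml / pyproject.toml) but no proxy package.json for changesets.
from pathlib import Path

NATIVE_MANIFESTS = ("Cargo.toml", "pyproject.toml")


def missing_proxies(packages_dir: Path) -> list[str]:
    """Return names of package dirs lacking a proxy package.json."""
    missing = []
    for pkg_dir in sorted(p for p in packages_dir.iterdir() if p.is_dir()):
        has_native = any((pkg_dir / m).exists() for m in NATIVE_MANIFESTS)
        if has_native and not (pkg_dir / "package.json").exists():
            missing.append(pkg_dir.name)
    return missing


if __name__ == "__main__":
    packages = Path("packages")
    if packages.is_dir():
        for name in missing_proxies(packages):
            print(f"packages/{name} is missing a proxy package.json")
```

You could wire this into CI or a Justfile recipe so a forgotten proxy fails fast instead of silently skipping a version bump.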
With this setup, notice how we intentionally exclude docs/ from being a pnpm workspace member – we only want to version the packages. To do this, declare docs/ as its own pnpm workspace; otherwise its dependencies will get folded into the root package-lock.json. It can be as simple as this:
# docs/pnpm-workspace.yaml
packages: []
Next, we can add our changeset configuration:
// .changeset/config.json
{
  "$schema": "https://unpkg.com/@changesets/[email protected]/schema.json",
  "changelog": "@changesets/changelog-git",
  "commit": false,
  "fixed": [],
  "linked": [],
  "access": "restricted",
  "baseBranch": "main",
  "updateInternalDependencies": "patch",
  "ignore": [],
  "privatePackages": {
    "version": true,
    "tag": true
  },
  "___experimentalUnsafeOptions_WILL_CHANGE_IN_PATCH": {
    "onlyUpdatePeerDependentsWhenOutOfRange": true
  }
}
Automating Releases with GitHub
Glue for creating the polyglot versioning PR
Next, we want to automate our release: creating a changelog PR, bumping the package metadata, pushing tags, and triggering builds on those tags.
Let’s start with our GitHub workflow definition, and unpack the scripts it calls.
name: Release

on:
  push:
    branches:
      - main

concurrency: ${{ github.workflow }}-${{ github.ref }}

permissions:
  contents: write
  pull-requests: write

jobs:
  release:
    name: Release
    runs-on: ubuntu-latest
    outputs:
      published: ${{ steps.changesets.outputs.published }}
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-node@v4
        with:
          cache: npm
      - uses: astral-sh/setup-uv@v7
      - uses: taiki-e/install-action@just
      - run: npm install
      - name: Create Release Pull Request or Tag
        id: changesets
        uses: changesets/action@v1
        with:
          version: just version
          publish: npx @changesets/cli publish
          # I like conventional commits
          commit: "chore(release): version packages"
          title: "chore(release): version packages"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

  docker:
    needs: [release]
    if: needs.release.outputs.published == 'true'
    uses: ./.github/workflows/docker.yml
    secrets: inherit
You might be tempted to use on.push.tags as the trigger for downstream builds. It turns out that GitHub has two fatal flaws with that intuitive approach (at the time of writing). First, if you push more than three tags at once, the workflow will not trigger at all. Unfortunately, that is a relatively common scenario in monorepos. Second, GitHub’s on.push.tags trigger is quite unreliable, and this unreliability persists even if you use a PAT as per their instructions. So instead, use an explicit workflow_call for that purpose, as I have done here.
Setting version: just version is the key to the multilingual support:
# Version packages based on changesets
[doc('Consume changesets: bump versions, update changelogs, sync native version files.')]
[group('release')]
version:
    npx @changesets/cli version
    uv run --script contrib/utils/sync-versions.py
Glue for polyglot support
So, what goes into sync-versions.py?
The main idea is that we trust changesets to bump the package.json versions for us when we call npx @changesets/cli version, but then it’s on us to propagate those versions into the corresponding language-native metadata.
Here’s an example that uses very simple parsing. You can write something similar (or better!) for the languages you use.
#!/usr/bin/env -S uv run --script
#
# /// script
# requires-python = ">=3.12"
# dependencies = []
# ///
#
# Sync versions from package.json files (updated by changesets) to native
# package manifests (Cargo.toml, pyproject.toml, etc.).
import json
import re
import subprocess
from enum import Enum, auto
from pathlib import Path

PACKAGES_DIR = Path(__file__).resolve().parent.parent.parent / "packages"


class SyncResult(Enum):
    NOT_FOUND = auto()
    UP_TO_DATE = auto()
    UPDATED = auto()


def read_package_json(pkg_dir: Path) -> dict | None:
    """Read and parse a package.json file."""
    pkg_json = pkg_dir / "package.json"
    if not pkg_json.exists():
        return None
    return json.loads(pkg_json.read_text())


def update_cargo_toml(pkg_dir: Path, version: str) -> SyncResult:
    """Update version in [package] section of Cargo.toml."""
    cargo_toml = pkg_dir / "Cargo.toml"
    if not cargo_toml.exists():
        return SyncResult.NOT_FOUND
    lines = cargo_toml.read_text().splitlines(keepends=True)
    in_package_section = False
    for i, line in enumerate(lines):
        stripped = line.strip()
        # Track which TOML section we're in
        if stripped.startswith("["):
            in_package_section = stripped == "[package]"
            continue
        if in_package_section and stripped.startswith("version"):
            new_line = re.sub(
                r'^(\s*version\s*=\s*")([^"]+)(")',
                rf"\g<1>{version}\3",
                line,
            )
            if new_line != line:
                lines[i] = new_line
                cargo_toml.write_text("".join(lines))
                rel = cargo_toml.relative_to(PACKAGES_DIR.parent)
                print(f"  Updated {rel}")
                return SyncResult.UPDATED
            return SyncResult.UP_TO_DATE
    return SyncResult.UP_TO_DATE


def update_pyproject_toml(pkg_dir: Path, version: str) -> SyncResult:
    """Update version in [project] section of pyproject.toml."""
    pyproject = pkg_dir / "pyproject.toml"
    if not pyproject.exists():
        return SyncResult.NOT_FOUND
    lines = pyproject.read_text().splitlines(keepends=True)
    in_project_section = False
    for i, line in enumerate(lines):
        stripped = line.strip()
        # Track which TOML section we're in
        if stripped.startswith("["):
            in_project_section = stripped == "[project]"
            continue
        if in_project_section and stripped.startswith("version"):
            new_line = re.sub(
                r'^(\s*version\s*=\s*")([^"]+)(")',
                rf"\g<1>{version}\3",
                line,
            )
            if new_line != line:
                lines[i] = new_line
                pyproject.write_text("".join(lines))
                rel = pyproject.relative_to(PACKAGES_DIR.parent)
                print(f"  Updated {rel}")
                return SyncResult.UPDATED
            return SyncResult.UP_TO_DATE
    return SyncResult.UP_TO_DATE


def refresh_lockfiles() -> None:
    """Refresh all lockfiles under the repo to match updated versions."""
    repo_root = PACKAGES_DIR.parent
    print("Refreshing lockfiles...")
    # Cargo.lock — root workspace + any standalone crate lockfiles
    cargo_locks = sorted(
        set(repo_root.glob("Cargo.lock")) | set(PACKAGES_DIR.rglob("Cargo.lock"))
    )
    for cargo_lock in cargo_locks:
        lock_dir = cargo_lock.parent
        rel = lock_dir.relative_to(repo_root) or Path(".")
        print(f"  cargo update --workspace in {rel}")
        subprocess.run(["cargo", "update", "--workspace"], cwd=lock_dir, check=True)
    # uv.lock — Python packages
    for uv_lock in sorted(PACKAGES_DIR.rglob("uv.lock")):
        lock_dir = uv_lock.parent
        print(f"  uv lock in {lock_dir.relative_to(repo_root)}")
        subprocess.run(["uv", "lock"], cwd=lock_dir, check=True)


def main() -> None:
    print("Syncing versions from package.json to native manifests...")
    print()
    updated = 0
    for pkg_json in sorted(PACKAGES_DIR.rglob("package.json")):
        pkg_dir = pkg_json.parent
        pkg_data = read_package_json(pkg_dir)
        if pkg_data is None:
            continue
        version = pkg_data.get("version")
        if version is None:
            continue
        name = pkg_data.get("name", pkg_dir.name)
        print(f"{name} @ {version}")
        results = [
            update_cargo_toml(pkg_dir, version),
            update_pyproject_toml(pkg_dir, version),
        ]
        if any(r == SyncResult.UPDATED for r in results):
            updated += 1
        elif all(r == SyncResult.NOT_FOUND for r in results):
            print("  (no native manifest found)")
        else:
            print("  (already up to date)")
    print()
    print(f"Synced {updated} package(s).")
    print()
    refresh_lockfiles()
    print()
    print("Done.")


if __name__ == "__main__":
    main()
Reacting to package tags
In the standard changesets flow, you will now have a pull request on GitHub with the appropriate CHANGELOG.md updates, as well as metadata updates for all relevant packages.
Once that merges, the same action takes over again: the .changeset files are consumed, and tags are pushed.
changesets will only push tags, not publish packages, because we have set
"privatePackages": {
"version": true,
"tag": true
}
in our .changeset/config.json, and marked all packages as private: true.
Typically, you will then want to react to these pushed tags – for example, to build new Docker images.
For that, instead of using an on.push.tags trigger as you might reasonably assume, you probably want a workflow_call. For why, see the tip earlier in this post.
on:
  workflow_call: {}
  workflow_dispatch:
    inputs:
      dry_run:
        description: 'Build images without pushing to GHCR'
        required: false
        type: boolean
        default: false
      no_cache:
        description: 'Force a build without using the cache'
        required: false
        type: boolean
        default: false
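The tags changesets pushes take the shape name@version (e.g. python-one@0.2.0). Inside the called workflow, you'll often want to recover which packages were just released, for example to decide which Docker images to build. A minimal sketch assuming that tag shape (`parse_release_tags` is a hypothetical helper, not part of changesets; in CI you'd feed it the output of `git tag --points-at HEAD`):

```python
def parse_release_tags(tags: list[str]) -> dict[str, str]:
    """Map package name -> version for tags shaped like name@version."""
    releases = {}
    for tag in tags:
        # Split at the LAST "@" so scoped names like "@acme/tool@1.0.0"
        # still parse correctly; skip anything that doesn't match.
        name, sep, version = tag.rpartition("@")
        if sep and name and version:
            releases[name] = version
    return releases


if __name__ == "__main__":
    print(parse_release_tags(["python-one@0.2.0", "rust-one@1.1.0"]))
    # {'python-one': '0.2.0', 'rust-one': '1.1.0'}
```

From there, the resulting mapping can drive a build matrix or a simple loop over docker build invocations.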
Summary
Even without native support for multiple languages, changesets can manage per-package semantic versioning and changelogs in polyglot monorepos today. The trick is to treat the JavaScript package manifests as the canonical source of version bumps, and then sync those bumps into the language-native manifests with your own script.
Some workarounds are needed (such as adding standalone pnpm-workspace.yaml files for subdirectories you want to keep independent, or using a separate personal access token to push tags), but there’s no blocker to benefiting from the convenient changesets workflow.
I used to suggest versioning a monorepo with a single global version using a tool like semantic-release. Since trying changesets, though, I’m sold on the benefits of letting commit messages be written for future internal engineers, while keeping a separate changelog note for end users. These are often two different audiences, and relying on the same conventional commit to serve both is rarely optimal.