first commit

This commit is contained in:
2026-04-22 19:51:20 +07:00
commit 93d1b7c3d3
579 changed files with 99797 additions and 0 deletions
@@ -0,0 +1,18 @@
# CODEOWNERS for Ghost Repository
# This file defines code ownership for automatic review assignment
# https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners
# E2E Test Ownership
# The top-level e2e directory requires review from designated owners
/e2e/ @9larsons
# Tinybird Analytics
# Tinybird data pipelines and services require review from designated owners
**/tinybird/ @9larsons @cmraible @evanhahn @troyciesco
# @tryghost/parse-email-address
/ghost/parse-email-address/ @EvanHahn
# Inbox Links
ghost/core/core/server/lib/get-inbox-links.ts @EvanHahn
ghost/core/test/unit/server/lib/get-inbox-links.test.ts @EvanHahn
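When several CODEOWNERS patterns match the same path, the last matching rule wins, so broader rules belong above more specific ones. A hypothetical fragment (the fixtures path and its owner are illustrative, not part of this file):

```
# The broad rule comes first; the more specific rule below overrides it
/e2e/           @9larsons
/e2e/fixtures/  @example-fixtures-owner
```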
@@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
report@ghost.org.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.
@@ -0,0 +1,84 @@
# Contributing to Ghost
For **help**, **support**, **questions** and **ideas** please use **[our forum](https://forum.ghost.org)** 🚑.
---
## Where to Start
If you're a developer looking to contribute, but you're not sure where to begin: Check out the [good first issue](https://github.com/TryGhost/Ghost/labels/good%20first%20issue) label on GitHub, which contains small pieces of work that have been specifically flagged as being friendly to new contributors.
After that, if you're looking for something a little more challenging to sink your teeth into, there's a broader [help wanted](https://github.com/TryGhost/Ghost/labels/help%20wanted) label encompassing issues which need some love.
If you've got an idea for a new feature, please start by suggesting it in the [forum](https://forum.ghost.org), as adding new features to Ghost first requires generating consensus around a design and spec.
## Working on Ghost Core
If you're going to work on Ghost core you'll need to go through a slightly more involved install and setup process than the usual Ghost CLI version.
First, you'll need to fork [Ghost](https://github.com/tryghost/ghost) to your personal GitHub account, and then follow the detailed [install from source](https://ghost.org/docs/install/source/) setup guide.
### Branching Guide
`main` on the main repository always contains the latest changes. This means that it is WIP for the next minor version and should NOT be considered stable. Stable versions are tagged using [semantic versioning](http://semver.org/).
On your local repository, you should always work on a branch to make keeping up-to-date and submitting pull requests easier, but in most cases you should submit your pull requests to `main`. Where necessary, for example if multiple people are contributing on a large feature, or if a feature requires a database change, we make use of feature branches.
### Commit Messages
We have a handful of simple standards for commit messages which help us to generate readable changelogs. Please follow these wherever possible and mention the associated issue number.
- **1st line:** Max 80 character summary
- Written in past tense e.g. “Fixed the thing” not “Fixes the thing”
- Start with one of: Fixed, Changed, Updated, Improved, Added, Removed, Reverted, Moved, Released, Bumped, Cleaned
- **2nd line:** [Always blank]
- **3rd line:** `ref <issue link>`, `fixes <issue link>`, `closes <issue link>` or blank
- **4th line:** Why this change was made - the code includes the what, the commit message should describe the context of why - why this, why now, why not something else?
If your change is **user-facing** please prepend the first line of your commit with **an emoji key**. If the commit is for an alpha feature, no emoji is needed. We are following [gitmoji](https://gitmoji.carloscuesta.me/).
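Put together, a user-facing bug fix following this format might look like the message below (the issue link is purely illustrative):

```
🐛 Fixed broken docs link in the newsletter footer

fixes https://github.com/TryGhost/Ghost/issues/00000
The footer hardcoded an outdated URL; linking to the canonical docs page
avoids a redirect and keeps the footer stable if the docs move again.
```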
**Main emojis we are using:**
- ✨ Feature
- 🎨 Improvement / change
- 🐛 Bug Fix
- 🌐 i18n (translation) submissions [[See Translating Ghost docs for more detail](https://www.notion.so/5af2858289b44f9194f73f8a1e17af59?pvs=25#bef8c9988e294a4b9a6dd624136de36f)]
- 💡 Anything else flagged to users or whoever is writing release notes
Good commit message examples: [new feature](https://github.com/TryGhost/Ghost/commit/61db6defde3b10a4022c86efac29cf15ae60983f), [bug fix](https://github.com/TryGhost/Ghost/commit/6ef835bb5879421ae9133541ebf8c4e560a4a90e) and [translation](https://github.com/TryGhost/Ghost/commit/83904c1611ae7ab3257b3b7d55f03e50cead62d7).
**Bumping @tryghost dependencies**
When bumping `@tryghost/*` dependencies, the first line should follow the above format and say what has changed, not say what has been bumped.
There is no need to include what modules have changed in the commit message, as this is _very_ clear from the contents of the commit. The commit should focus on surfacing the underlying changes from the dependencies - what actually changed as a result of this dependency bump?
[Good example](https://github.com/TryGhost/Ghost/commit/95751a0e5fb719bb5bca74cb97fb5f29b225094f)
### Submitting Pull Requests
We aim to merge any straightforward, well-understood bug fixes or improvements immediately, as long as they pass our tests (run `pnpm test` to check locally). We generally don't merge new features and larger changes without prior discussion with the core product team to agree on a tech/design specification.
Please provide plenty of context and reasoning around your changes, to help us merge quickly. Closing an already open issue is our preferred workflow. If your PR gets out of date, we may ask you to rebase as you are more familiar with your changes than we will be.
### Sharing feedback on Documentation
While the Docs are no longer Open Source, we welcome revisions and ideas on the forum! Please create a Post with your questions or suggestions in the [Contributing to Ghost Category](https://forum.ghost.org/c/contributing/27). Thank you for helping us keep the Docs relevant and up-to-date.
---
## Contributor License Agreement
By contributing your code to Ghost you grant the Ghost Foundation a non-exclusive, irrevocable, worldwide, royalty-free, sublicenseable, transferable license under all of Your relevant intellectual property rights (including copyright, patent, and any other rights), to use, copy, prepare derivative works of, distribute and publicly perform and display the Contributions on any licensing terms, including without limitation:
(a) open source licenses like the MIT license; and (b) binary, proprietary, or commercial licenses. Except for the licenses granted herein, You reserve all right, title, and interest in and to the Contribution.
You confirm that you are able to grant us these rights. You represent that You are legally entitled to grant the above license. If Your employer has rights to intellectual property that You create, You represent that You have received permission to make the Contributions on behalf of that employer, or that Your employer has waived such rights for the Contributions.
You represent that the Contributions are Your original works of authorship, and to Your knowledge, no other person claims, or has the right to claim, any right in any invention or patent related to the Contributions. You also represent that You are not legally obligated, whether by entering into an agreement or otherwise, in any way that conflicts with the terms of this license.
The Ghost Foundation acknowledges that, except as explicitly described in this Agreement, any Contribution which you provide is on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE.
@@ -0,0 +1,3 @@
# You can add one username per supported platform and one custom link
github: tryghost
open_collective: ghost
@@ -0,0 +1,76 @@
name: 🐛 Bug report
description: Report reproducible software issues so we can improve
body:
- type: markdown
attributes:
value: |
## Welcome 👋
Thank you for taking the time to fill out a bug report 🙂
We'll respond as quickly as we can. The more information you provide, the easier and quicker it is for us to diagnose the problem.
- type: textarea
id: summary
attributes:
label: Issue Summary
description: Explain roughly what's wrong
validations:
required: true
- type: textarea
id: reproduction
attributes:
label: Steps to Reproduce
description: Also tell us, what did you expect to happen?
placeholder: |
1. This is the first step...
2. This is the second step, etc.
validations:
required: true
- type: input
id: version
attributes:
label: Ghost Version
validations:
required: true
- type: input
id: node
attributes:
label: Node.js Version
validations:
required: true
- type: input
id: install
attributes:
label: How did you install Ghost?
description: Provide details of your host & operating system
validations:
required: true
- type: dropdown
id: database
attributes:
label: Database type
options:
- MySQL 5.7
- MySQL 8
- SQLite3
- Other
validations:
required: true
- type: input
id: browsers
attributes:
label: Browser & OS version
description: Include this for frontend bugs
- type: textarea
id: logs
attributes:
label: Relevant log / error output
description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
render: shell
- type: checkboxes
id: terms
attributes:
label: Code of Conduct
description: By submitting this issue, you agree to follow our [Code of Conduct](https://ghost.org/conduct)
options:
- label: I agree to be friendly and polite to people in this repository
required: true
@@ -0,0 +1,11 @@
blank_issues_enabled: true
contact_links:
- name: 🚑 Help & Support
url: https://forum.ghost.org
about: Please use the community forum for questions
- name: 💡 Features & Ideas
url: https://forum.ghost.org/c/Ideas
about: Please vote for & post new ideas in the forum
- name: 📖 Documentation
url: https://ghost.org/docs/
about: Tutorials & reference guides for themes, the API and more
@@ -0,0 +1,14 @@
Got some code for us? Awesome 🎊!
Please take a minute to explain the change you're making:
- Why are you making it?
- What does it do?
- Why is this something Ghost users or developers need?
Please check your PR against these items:
- [ ] I've read and followed the [Contributor Guide](https://github.com/TryGhost/Ghost/blob/main/.github/CONTRIBUTING.md)
- [ ] I've explained my change
- [ ] I've written an automated test to prove my change works
We appreciate your contribution! 🙏
@@ -0,0 +1,28 @@
# How to get support for Ghost 👨‍👩‍👧‍👦
For **help**, **support**, **questions** and **ideas** please use **[our forum](https://forum.ghost.org)** 🚑.
Please **_do not_** raise an issue on GitHub.
We have a **help** category in our **[forum](https://forum.ghost.org/)** where you can get quick answers,
help with debugging weird issues, and general help with any aspect of Ghost. There's also an **ideas** category for feature requests.
Our extensive **documentation** can be found at https://ghost.org/docs/.
Please go to https://forum.ghost.org and sign up to join our community.
You can create a new account, or sign up using Google, Twitter or Facebook.
Issues which are not bug reports will be closed.
## Using Ghost(Pro)?
**Ghost(Pro)** users have access to email support via the support at ghost dot org address.
## Why not GitHub?
GitHub is our office, it's the place where our development team does its work. We use the issue list
to keep track of bugs and the features that we are working on. We do this openly for transparency.
With the forum, you can leverage the knowledge of our wider community to get help with any problems you are
having with Ghost. Please keep in mind that Ghost is FLOSS, and free support is provided by the goodwill
of our wonderful community members.
@@ -0,0 +1,155 @@
---
description: GitHub Agentic Workflows (gh-aw) - Create, debug, and upgrade AI-powered workflows with intelligent prompt routing
disable-model-invocation: true
---
# GitHub Agentic Workflows Agent
This agent helps you work with **GitHub Agentic Workflows (gh-aw)**, a CLI extension for creating AI-powered workflows in natural language using markdown files.
## What This Agent Does
This is a **dispatcher agent** that routes your request to the appropriate specialized prompt based on your task:
- **Creating new workflows**: Routes to `create` prompt
- **Updating existing workflows**: Routes to `update` prompt
- **Debugging workflows**: Routes to `debug` prompt
- **Upgrading workflows**: Routes to `upgrade-agentic-workflows` prompt
- **Creating shared components**: Routes to `create-shared-agentic-workflow` prompt
- **Fixing Dependabot PRs**: Routes to `dependabot` prompt — use this when Dependabot opens PRs that modify generated manifest files (`.github/workflows/package.json`, `.github/workflows/requirements.txt`, `.github/workflows/go.mod`). Never merge those PRs directly; instead update the source `.md` files and rerun `gh aw compile --dependabot` to bundle all fixes
Workflows may optionally include:
- **Project tracking / monitoring** (GitHub Projects updates, status reporting)
- **Orchestration / coordination** (one workflow assigning agents or dispatching and coordinating other workflows)
## Files This Applies To
- Workflow files: `.github/workflows/*.md` and `.github/workflows/**/*.md`
- Workflow lock files: `.github/workflows/*.lock.yml`
- Shared components: `.github/workflows/shared/*.md`
- Configuration: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/github-agentic-workflows.md
## Problems This Solves
- **Workflow Creation**: Design secure, validated agentic workflows with proper triggers, tools, and permissions
- **Workflow Debugging**: Analyze logs, identify missing tools, investigate failures, and fix configuration issues
- **Version Upgrades**: Migrate workflows to new gh-aw versions, apply codemods, fix breaking changes
- **Component Design**: Create reusable shared workflow components that wrap MCP servers
## How to Use
When you interact with this agent, it will:
1. **Understand your intent** - Determine what kind of task you're trying to accomplish
2. **Route to the right prompt** - Load the specialized prompt file for your task
3. **Execute the task** - Follow the detailed instructions in the loaded prompt
## Available Prompts
### Create New Workflow
**Load when**: User wants to create a new workflow from scratch, add automation, or design a workflow that doesn't exist yet
**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/create-agentic-workflow.md
**Use cases**:
- "Create a workflow that triages issues"
- "I need a workflow to label pull requests"
- "Design a weekly research automation"
### Update Existing Workflow
**Load when**: User wants to modify, improve, or refactor an existing workflow
**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/update-agentic-workflow.md
**Use cases**:
- "Add web-fetch tool to the issue-classifier workflow"
- "Update the PR reviewer to use discussions instead of issues"
- "Improve the prompt for the weekly-research workflow"
### Debug Workflow
**Load when**: User needs to investigate, audit, debug, or understand a workflow, troubleshoot issues, analyze logs, or fix errors
**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/debug-agentic-workflow.md
**Use cases**:
- "Why is this workflow failing?"
- "Analyze the logs for workflow X"
- "Investigate missing tool calls in run #12345"
### Upgrade Agentic Workflows
**Load when**: User wants to upgrade workflows to a new gh-aw version or fix deprecations
**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/upgrade-agentic-workflows.md
**Use cases**:
- "Upgrade all workflows to the latest version"
- "Fix deprecated fields in workflows"
- "Apply breaking changes from the new release"
### Create Shared Agentic Workflow
**Load when**: User wants to create a reusable workflow component or wrap an MCP server
**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/create-shared-agentic-workflow.md
**Use cases**:
- "Create a shared component for Notion integration"
- "Wrap the Slack MCP server as a reusable component"
- "Design a shared workflow for database queries"
### Fix Dependabot PRs
**Load when**: User needs to close or fix open Dependabot PRs that update dependencies in generated manifest files (`.github/workflows/package.json`, `.github/workflows/requirements.txt`, `.github/workflows/go.mod`)
**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/dependabot.md
**Use cases**:
- "Fix the open Dependabot PRs for npm dependencies"
- "Bundle and close the Dependabot PRs for workflow dependencies"
- "Update @playwright/test to fix the Dependabot PR"
## Instructions
When a user interacts with you:
1. **Identify the task type** from the user's request
2. **Load the appropriate prompt** from the GitHub repository URLs listed above
3. **Follow the loaded prompt's instructions** exactly
4. **If uncertain**, ask clarifying questions to determine the right prompt
## Quick Reference
```bash
# Initialize repository for agentic workflows
gh aw init
# Generate the lock file for a workflow
gh aw compile [workflow-name]
# Debug workflow runs
gh aw logs [workflow-name]
gh aw audit <run-id>
# Upgrade workflows
gh aw fix --write
gh aw compile --validate
```
## Key Features of gh-aw
- **Natural Language Workflows**: Write workflows in markdown with YAML frontmatter
- **AI Engine Support**: Copilot, Claude, Codex, or custom engines
- **MCP Server Integration**: Connect to Model Context Protocol servers for tools
- **Safe Outputs**: Structured communication between AI and GitHub API
- **Strict Mode**: Security-first validation and sandboxing
- **Shared Components**: Reusable workflow building blocks
- **Repo Memory**: Persistent git-backed storage for agents
- **Sandboxed Execution**: All workflows run in the Agent Workflow Firewall (AWF) sandbox, enabling full `bash` and `edit` tools by default
## Important Notes
- Always reference the instructions file at https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/github-agentic-workflows.md for complete documentation
- Use the MCP tool `agentic-workflows` when running in GitHub Copilot Cloud
- Workflows must be compiled to `.lock.yml` files before running in GitHub Actions
- **Bash tools are enabled by default** - Don't restrict bash commands unnecessarily since workflows are sandboxed by the AWF
- Follow security best practices: minimal permissions, explicit network access, no template injection
- **Single-file output**: When creating a workflow, produce exactly **one** workflow `.md` file. Do not create separate documentation files (architecture docs, runbooks, usage guides, etc.). If documentation is needed, add a brief `## Usage` section inside the workflow file itself.
@@ -0,0 +1,14 @@
{
"entries": {
"actions/github-script@v8": {
"repo": "actions/github-script",
"version": "v8",
"sha": "ed597411d8f924073f98dfc5c65a23a2325f34cd"
},
"github/gh-aw/actions/setup@v0.51.5": {
"repo": "github/gh-aw/actions/setup",
"version": "v0.51.5",
"sha": "88319be75ab1adc60640307a10e5cf04b3deff1e"
}
}
}
@@ -0,0 +1,20 @@
codecov:
require_ci_to_pass: true
coverage:
status:
patch: false
project:
default: false
admin-tests:
flags:
- admin-tests
threshold: 0.2%
e2e-tests:
flags:
- e2e-tests
threshold: 0.2%
flags:
admin-tests:
carryforward: true
e2e-tests:
carryforward: true
@@ -0,0 +1,2 @@
#!/usr/bin/env sh
# Thin POSIX shim: delegate to the bash implementation so the hook runs
# correctly regardless of which shell Git invokes hooks with
exec bash "$(dirname "$0")/commit-msg.bash" "$@"
@@ -0,0 +1,97 @@
#!/usr/bin/env bash
# Get the commit message file path from the first argument
commit_msg_file="$1"
# Read the commit message
commit_msg=$(cat "$commit_msg_file")
# Colors for output
red='\033[0;31m'
yellow='\033[1;33m'
no_color='\033[0m'
# Get the first line (subject)
subject=$(echo "$commit_msg" | head -n1)
# Get the second line
second_line=$(echo "$commit_msg" | sed -n '2p')
# Get the third line
third_line=$(echo "$commit_msg" | sed -n '3p')
# Get the rest of the message (body)
body=$(echo "$commit_msg" | tail -n +4)
# Check subject length (max 80 characters)
if [ ${#subject} -gt 80 ]; then
echo -e "${yellow}Warning: Commit message subject is too long (max 80 characters)${no_color}"
echo -e "Current length: ${#subject} characters"
fi
# Check if second line is blank
if [ ! -z "$second_line" ]; then
echo -e "${yellow}Warning: Second line should be blank${no_color}"
fi
# Check third line format
if [ ! -z "$third_line" ]; then
if [[ "$third_line" =~ ^(refs|ref:) ]]; then
echo -e "${red}Error: Third line should not start with 'refs' or 'ref:'${no_color}" >&2
echo -e "Use 'ref <issue link>', 'fixes <issue link>', or 'closes <issue link>' instead" >&2
echo -e "${yellow}Press Enter to edit the message...${no_color}" >&2
read < /dev/tty # Wait for Enter key press from the terminal
# Get the configured Git editor
editor=$(git var GIT_EDITOR)
if [ -z "$editor" ]; then
editor=${VISUAL:-${EDITOR:-vi}} # Fallback logic similar to Git
fi
# Re-open the editor on the commit message file, connected to the terminal
$editor "$commit_msg_file" < /dev/tty
# Re-read the potentially modified commit message after editing
commit_msg=$(cat "$commit_msg_file")
# Need to update related variables as well
subject=$(echo "$commit_msg" | head -n1)
second_line=$(echo "$commit_msg" | sed -n '2p')
third_line=$(echo "$commit_msg" | sed -n '3p')
body=$(echo "$commit_msg" | tail -n +4)
# Re-check the third line *again* after editing
if [[ "$third_line" =~ ^(refs|ref:) ]]; then
echo -e "${red}Error: Third line still starts with 'refs' or 'ref:'. Commit aborted.${no_color}" >&2
exit 1 # Abort commit if still invalid
fi
# If fixed, the script will continue to the next checks
fi
if ! [[ "$third_line" =~ ^(ref|fixes|closes)\ .*$ ]]; then
echo -e "${yellow}Warning: Third line should start with 'ref', 'fixes', or 'closes' followed by an issue link${no_color}" >&2
fi
fi
# Check for body content (why explanation)
if [ -z "$body" ]; then
echo -e "${yellow}Warning: Missing explanation of why this change was made${no_color}"
echo -e "The body should explain: why this, why now, why not something else?"
fi
# Check for emoji in user-facing changes
if [[ "$subject" =~ ^[^[:space:]]*[[:space:]] ]]; then
first_word="${subject%% *}"
if [[ ! "$first_word" =~ ^[[:punct:]] ]]; then
echo -e "${yellow}Warning: User-facing changes should start with an emoji${no_color}"
echo -e "Common emojis: ✨ (Feature), 🎨 (Improvement), 🐛 (Bug Fix), 🌐 (i18n), 💡 (User-facing)"
fi
fi
# Check for past tense verbs in subject
past_tense_words="Fixed|Changed|Updated|Improved|Added|Removed|Reverted|Moved|Released|Bumped|Cleaned"
if ! echo "$subject" | grep -iE "$past_tense_words" > /dev/null; then
echo -e "${yellow}Warning: Subject line should use past tense${no_color}"
echo -e "Use one of: Fixed, Changed, Updated, Improved, Added, Removed, Reverted, Moved, Released, Bumped, Cleaned"
fi
exit 0
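The subject-length check above can be exercised on its own; this is a minimal sketch against a throwaway message file (the issue link is illustrative):

```bash
#!/usr/bin/env bash
# Write a sample commit message whose subject exceeds the 80-character limit
msg_file=$(mktemp)
printf '%s\n\n%s\n' \
    "Fixed a subject line that rambles on far past the eighty character limit enforced by the commit-msg hook" \
    "ref https://github.com/TryGhost/Ghost/issues/00000" > "$msg_file"

# Same check as the hook: warn when the first line is longer than 80 chars
subject=$(head -n1 "$msg_file")
if [ ${#subject} -gt 80 ]; then
    echo "Warning: subject is ${#subject} characters (max 80)"
fi
```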
@@ -0,0 +1,2 @@
#!/usr/bin/env sh
# Thin POSIX shim: delegate to the bash implementation so the hook runs
# correctly regardless of which shell Git invokes hooks with
exec bash "$(dirname "$0")/pre-commit.bash" "$@"
@@ -0,0 +1,116 @@
#!/usr/bin/env bash
# Modified from https://github.com/chaitanyagupta/gitutils
[ -n "$CI" ] && exit 0
pnpm lint-staged --relative
lintStatus=$?
if [ $lintStatus -ne 0 ]; then
echo "❌ Linting failed"
exit 1
fi
green='\033[0;32m'
no_color='\033[0m'
grey='\033[0;90m'
red='\033[0;31m'
##
## 1) Check and remove submodules before committing
##
ROOT_DIR=$(git rev-parse --show-cdup)
SUBMODULES=$(grep path "${ROOT_DIR}.gitmodules" 2>/dev/null | sed 's/^.*path = //')
# Guard against an empty pattern list: `grep -F ""` would match every staged file
if [ -n "$SUBMODULES" ]; then
    MOD_SUBMODULES=$(git diff --cached --name-only --ignore-submodules=none | grep -F "$SUBMODULES" || true)
else
    MOD_SUBMODULES=""
fi
echo -e "Checking submodules ${grey}(pre-commit hook)${no_color} "
# If no modified submodules, exit with status code 0, else remove them and continue
if [[ -n "$MOD_SUBMODULES" ]]; then
echo -e "${grey}Removing submodules from commit...${no_color}"
for SUB in $MOD_SUBMODULES
do
git reset --quiet HEAD "$SUB"
echo -e "\t${grey}removed:\t$SUB${no_color}"
done
echo
echo -e "${grey}Submodules removed from commit, continuing...${no_color}"
# If there are no changes to commit after removing submodules, abort to avoid an empty commit
if output=$(git status --porcelain) && [ -z "$output" ]; then
echo -e "nothing to commit, working tree clean"
exit 1
fi
else
echo "No submodules in commit, continuing..."
fi
##
## 2) Suggest shipping a new version of @tryghost/activitypub when changes are detected
## The intent is to ship smaller changes more frequently to production
##
increment_version() {
local package_json_path=$1
local version_type=$2
local current_version
current_version=$(grep '"version":' "$package_json_path" | awk -F '"' '{print $4}')
IFS='.' read -r major minor patch <<< "$current_version"
case "$version_type" in
major) ((major++)); minor=0; patch=0 ;;
minor) ((minor++)); patch=0 ;;
patch) ((patch++)) ;;
*) echo "Invalid version type"; exit 1 ;;
esac
new_version="$major.$minor.$patch"
# Update package.json with new version
if [[ "$OSTYPE" == "darwin"* ]]; then
# macOS
sed -i '' -E "s/\"version\": \"[0-9]+\.[0-9]+\.[0-9]+\"/\"version\": \"$new_version\"/" "$package_json_path"
else
# Linux and others
sed -i -E "s/\"version\": \"[0-9]+\.[0-9]+\.[0-9]+\"/\"version\": \"$new_version\"/" "$package_json_path"
fi
echo "Updated version to $new_version in $package_json_path"
}
AP_BUMP_NEEDED=false
MODIFIED_FILES=$(git diff --cached --name-only)
for FILE in $MODIFIED_FILES; do
if [[ "$FILE" == apps/activitypub/* ]]; then
AP_BUMP_NEEDED=true
break
fi
done
if [[ "$AP_BUMP_NEEDED" == true ]]; then
echo -e "\nYou have made changes to @tryghost/activitypub."
echo -e "Would you like to ship a new version? (yes)"
read -r ship_new_version </dev/tty
if [[ -z "$ship_new_version" || "$ship_new_version" == "yes" || "$ship_new_version" == "y" ]]; then
echo -e "Is that a patch, minor or major? (patch)"
read -r version_type </dev/tty
# Default to patch
if [[ -z "$version_type" ]]; then
version_type="patch"
fi
if [[ "$version_type" != "patch" && "$version_type" != "minor" && "$version_type" != "major" ]]; then
echo -e "${red}Invalid input. Skipping version bump.${no_color}"
else
echo "Bumping version ($version_type)..."
increment_version "apps/activitypub/package.json" "$version_type"
git add apps/activitypub/package.json
fi
fi
fi
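The version bump in `increment_version` can be sanity-checked in isolation. This sketch inlines a simplified, patch-only variant (assuming GNU `sed -i`, i.e. the Linux branch of the hook) and runs it against a throwaway `package.json`:

```bash
#!/usr/bin/env bash
set -eu

# Simplified patch-only version bump: read "version" from a package.json,
# increment the patch component, and rewrite the file in place (GNU sed)
bump_patch() {
    pkg=$1
    current=$(grep '"version":' "$pkg" | awk -F '"' '{print $4}')
    major=${current%%.*}
    rest=${current#*.}
    minor=${rest%%.*}
    patch=${rest#*.}
    patch=$((patch + 1))
    sed -i -E "s/\"version\": \"[0-9]+\.[0-9]+\.[0-9]+\"/\"version\": \"$major.$minor.$patch\"/" "$pkg"
}

tmp=$(mktemp -d)
printf '{\n  "name": "demo",\n  "version": "1.2.3"\n}\n' > "$tmp/package.json"
bump_patch "$tmp/package.json"
grep '"version"' "$tmp/package.json"
```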
@@ -0,0 +1,247 @@
{
"extends": [
"github>tryghost/renovate-config"
],
// Limit concurrent branches to keep Renovate runs within the 30-minute
// Mend timeout and avoid overwhelming CI with dozens of queued jobs.
// The shared preset disables rate limiting, but Ghost's monorepo is
// large enough that unlimited branches cause timeouts during rebasing.
"branchConcurrentLimit": 10,
// Keep manually-closed immortal/grouped PRs closed unless explicitly
// reopened from the Dependency Dashboard.
"recreateWhen": "never",
// pnpm lockfile generation has been hitting Mend's 3GB memory ceiling.
// Renovate maintainers suggested starting with toolSettings.nodeMaxMemory
// set to 1024MB to reduce pnpm's Node heap usage and keep the overall job
// under the hosted runner limit.
"toolSettings": {
"nodeMaxMemory": 1024
},
  // We have to disable platform-based automerge (forcing Renovate to do it manually)
  // as otherwise Renovate won't follow our schedule
"platformAutomerge": false,
"timezone": "Etc/UTC",
// Restrict Renovate runs to the automerge windows so branch updates
// (rebasing, force-pushes) happen around the same times automerge
// can actually complete, not during the working day when CI is busy.
// Each block starts one hour earlier than the matching automerge
// window so Renovate has time to rebase and open/refresh PRs before
// automerge is eligible to run.
"schedule": [
// Run all weekend
"* * * * 0,6",
// Run on Monday morning (Sun 23:00 is already covered by weekend)
"* 0-12 * * 1",
// Run on weekday evenings, starting 1 hour earlier than automerge
"* 21-23 * * 1-5",
// Run on early weekday mornings (previous day 23:00 is already
// covered by the evening block above)
"* 0-4 * * 2-6"
],
"automergeSchedule": [
// Allow automerge all weekend
"* * * * 0,6",
// Allow automerge on Monday morning
"* 0-12 * * 1",
// Allow automerge overnight on weekday evenings (10pm-4:59am UTC)
"* 22-23 * * 1-5",
"* 0-4 * * 2-6"
],
"ignoreDeps": [
// https://github.com/TryGhost/Ghost/commit/2b9e494dfcb95c40f596ccf54ec3151c25d53601
// `got` 10.x has a Node 10 bug that makes it pretty much unusable for now
"got",
// https://github.com/TryGhost/Ghost/commit/2b9e494dfcb95c40f596ccf54ec3151c25d53601
// `intl-messageformat` 6.0.0 introduced a breaking change in terms of
// escaping that would be pretty difficult to fix for now
"intl-messageformat",
// https://github.com/TryGhost/Ghost/commit/b2fa84c7ff9bf8e21b0791f268f57e92759a87b1
// no reason given
"moment",
// https://github.com/TryGhost/Ghost/pull/10672
// https://github.com/TryGhost/Ghost/issues/10870
"moment-timezone",
// https://github.com/TryGhost/Admin/pull/1111/files
    // Ignored because of a mobiledoc-kit issue, but that code now lives in Koenig; can probably be cleaned up
"simple-dom",
// https://github.com/TryGhost/Admin/pull/1111/files
// https://github.com/TryGhost/Ghost/pull/10672
// These have been ignored since forever
"ember-drag-drop",
"normalize.css",
"validator",
// https://github.com/TryGhost/Ghost/commit/7ebf2891b7470a1c2ffeddefb2fe5e7a57319df3
// Changed how modules are loaded, caused a weird error during render
"@embroider/macros",
// https://github.com/TryGhost/Ghost/commit/a10ad3767f60ed2c8e56feb49e7bf83d9618b2ab
    // Caused line-spacing issues in the editor, but it's now used in different places,
    // so it's unclear whether this is still relevant; soon we will finish switching to react-codemirror
"codemirror",
// https://github.com/TryGhost/Ghost/commit/3236891b80988924fbbdb625d30cb64a7bf2afd1
// ember-cli-code-coverage@2.0.0 broke our code coverage
"ember-cli-code-coverage",
// https://github.com/TryGhost/Ghost/commit/1382e34e42a513c201cb957b7f843369a2ce1b63
// ember-cli-terser@4.0.2 has a regression that breaks our sourcemaps
"ember-cli-terser"
],
"ignorePaths": [
"test",
"ghost/admin/lib/koenig-editor/package.json"
],
"packageRules": [
// Always require dashboard approval for major updates
    // This was largely to avoid the noise of major updates that were ESM-only
    // The idea was to check and accept major updates if they were NOT ESM
    // But this hasn't been workable with our capacity
    // Plus, ESM-only packages are an edge case in the grand scheme of dependencies
{
"description": "Require dashboard approval for major updates",
"matchUpdateTypes": [
"major"
],
"dependencyDashboardApproval": true
},
// Group NQL packages separately from other TryGhost packages
{
"groupName": "NQL packages",
"matchPackageNames": [
"@tryghost/nql",
"@tryghost/nql-lang"
]
},
// Split the broad shared TryGhost group into smaller logical lanes so
// failures in one area (e.g. email rendering) don't block unrelated
// internal package updates from merging.
{
"groupName": "TryGhost runtime packages",
"matchPackageNames": [
"@tryghost/adapter-base-cache",
"@tryghost/admin-api-schema",
"@tryghost/api-framework",
"@tryghost/bookshelf-plugins",
"@tryghost/database-info",
"@tryghost/debug",
"@tryghost/domain-events",
"@tryghost/errors",
"@tryghost/http-cache-utils",
"@tryghost/job-manager",
"@tryghost/logging",
"@tryghost/metrics",
"@tryghost/mw-error-handler",
"@tryghost/mw-vhost",
"@tryghost/pretty-cli",
"@tryghost/prometheus-metrics",
"@tryghost/promise",
"@tryghost/referrer-parser",
"@tryghost/root-utils",
"@tryghost/security",
"@tryghost/social-urls",
"@tryghost/tpl",
"@tryghost/validator",
"@tryghost/version",
"@tryghost/zip"
]
},
{
"groupName": "TryGhost admin support packages",
"matchPackageNames": [
"@tryghost/color-utils",
"@tryghost/custom-fonts",
"@tryghost/limit-service",
"@tryghost/members-csv",
"@tryghost/timezone-data"
]
},
{
"groupName": "TryGhost content and email packages",
"matchPackageNames": [
"@tryghost/config-url-helpers",
"@tryghost/content-api",
"@tryghost/helpers",
"@tryghost/html-to-mobiledoc",
"@tryghost/html-to-plaintext",
"@tryghost/nodemailer",
"@tryghost/parse-email-address",
"@tryghost/request",
"@tryghost/string",
"@tryghost/url-utils"
]
},
{
"groupName": "TryGhost test support packages",
"matchPackageNames": [
"@tryghost/email-mock-receiver",
"@tryghost/express-test",
"@tryghost/webhook-mock-receiver"
]
},
// Always automerge these packages:
{
"matchPackageNames": [
// This is a pre-1.0.0 package, but it provides icons
// and is very very regularly updated and seems safe to update
"lucide-react"
],
"automerge": true
},
// Allow internal Docker digest pins to automerge once the relevant
// CI checks have gone green.
{
"description": "Automerge internal Docker digest updates after CI passes",
"matchDatasources": [
"docker"
],
"matchPackageNames": [
"ghost/traffic-analytics",
"tinybirdco/tinybird-local"
],
"matchUpdateTypes": [
"digest"
],
"automerge": true,
"automergeType": "pr"
},
// Ignore all ember-related packages in admin
    // Our Ember codebase is being replaced with React, and
    // most of the dependencies have breaking changes, making them too hard to update
    // Therefore, we'll leave these as-is for now
{
"groupName": "Disable ember updates",
"matchFileNames": [
"ghost/admin/package.json"
],
"matchPackageNames": [
// `ember-foo` style packages
"/^ember(-|$)/",
// scoped `@ember/*` packages
"/^@ember\\//",
// foo/ember-something style packages
"/\\/ember(-|$)/"
],
"enabled": false
},
// Don't allow css preprocessor updates in admin
{
"groupName": "disable css",
"matchFileNames": [
"ghost/admin/package.json"
],
"matchPackageNames": [
"autoprefixer",
"ember-cli-postcss",
"/^postcss/",
"/^css/"
],
"enabled": false
}
]
}
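The ember `matchPackageNames` entries above use Renovate's `/.../`-delimited regex syntax. A quick sanity check of those three patterns against made-up package names:

```javascript
// The same three regexes as the "Disable ember updates" rule above
const patterns = [/^ember(-|$)/, /^@ember\//, /\/ember(-|$)/];
const isEmberPackage = name => patterns.some(re => re.test(name));

console.log(isEmberPackage('ember-source'));        // true  (ember-foo style)
console.log(isEmberPackage('@ember/test-helpers')); // true  (scoped @ember/*)
console.log(isEmberPackage('remember-me'));         // false (no ember prefix or /ember segment)
```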
+44
View File
@@ -0,0 +1,44 @@
const fs = require('fs/promises');
const exec = require('util').promisify(require('child_process').exec);
const path = require('path');
const semver = require('semver');
(async () => {
const core = await import('@actions/core');
const corePackageJsonPath = path.join(__dirname, '../../ghost/core/package.json');
const corePackageJson = require(corePackageJsonPath);
const current_version = corePackageJson.version;
console.log(`Current version: ${current_version}`);
const firstArg = process.argv[2];
console.log('firstArg', firstArg);
const buildString = await exec('git rev-parse --short HEAD').then(({stdout}) => stdout.trim());
let newVersion;
if (firstArg === 'canary' || firstArg === 'six') {
const bumpedVersion = semver.inc(current_version, 'minor');
newVersion = `${bumpedVersion}-pre-g${buildString}`;
} else {
newVersion = `${current_version}-0-g${buildString}`;
}
newVersion += '+moya';
console.log('newVersion', newVersion);
corePackageJson.version = newVersion;
await fs.writeFile(corePackageJsonPath, JSON.stringify(corePackageJson, null, 2));
const adminPackageJsonPath = path.join(__dirname, '../../ghost/admin/package.json');
const adminPackageJson = require(adminPackageJsonPath);
adminPackageJson.version = newVersion;
await fs.writeFile(adminPackageJsonPath, JSON.stringify(adminPackageJson, null, 2));
console.log('Version bumped to', newVersion);
core.setOutput('BUILD_VERSION', newVersion);
core.setOutput('GIT_COMMIT_HASH', buildString);
})();
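The version strings this script produces can be sketched without the `semver` dependency; the current version and commit hash below are placeholders, not values from a real checkout.

```javascript
// Placeholders standing in for package.json and `git rev-parse --short HEAD`
const currentVersion = '6.3.1';
const shortHash = 'abc1234';

// canary/six builds: minor bump (what semver.inc does above) plus a -pre tag
const [major, minor] = currentVersion.split('.').map(Number);
const canaryVersion = `${major}.${minor + 1}.0-pre-g${shortHash}+moya`;

// other builds: current version with a -0 prerelease tag
const defaultVersion = `${currentVersion}-0-g${shortHash}+moya`;

console.log(canaryVersion);  // 6.4.0-pre-gabc1234+moya
console.log(defaultVersion); // 6.3.1-0-gabc1234+moya
```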
+256
View File
@@ -0,0 +1,256 @@
const fs = require('fs');
const path = require('path');
const execFileSync = require('child_process').execFileSync;
const MONITORED_APPS = {
portal: {
packageName: '@tryghost/portal',
path: 'apps/portal'
},
sodoSearch: {
packageName: '@tryghost/sodo-search',
path: 'apps/sodo-search'
},
comments: {
packageName: '@tryghost/comments-ui',
path: 'apps/comments-ui'
},
announcementBar: {
packageName: '@tryghost/announcement-bar',
path: 'apps/announcement-bar'
},
signupForm: {
packageName: '@tryghost/signup-form',
path: 'apps/signup-form'
}
};
const MONITORED_APP_ENTRIES = Object.entries(MONITORED_APPS);
const MONITORED_APP_PATHS = MONITORED_APP_ENTRIES.map(([, app]) => app.path);
function runGit(args) {
try {
return execFileSync('git', args, {encoding: 'utf8'}).trim();
} catch (error) {
const stderr = error.stderr ? error.stderr.toString().trim() : '';
const stdout = error.stdout ? error.stdout.toString().trim() : '';
const message = stderr || stdout || error.message;
throw new Error(`Failed to run "git ${args.join(' ')}": ${message}`);
}
}
function readVersionFromPackageJson(packageJsonContent, sourceLabel) {
let parsedPackageJson;
try {
parsedPackageJson = JSON.parse(packageJsonContent);
} catch (error) {
throw new Error(`Unable to parse ${sourceLabel}: ${error.message}`);
}
if (!parsedPackageJson.version || typeof parsedPackageJson.version !== 'string') {
throw new Error(`${sourceLabel} does not contain a valid "version" field`);
}
return parsedPackageJson.version;
}
function parseSemver(version) {
const match = version.match(/^v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-([0-9A-Za-z-]+(?:\.[0-9A-Za-z-]+)*))?(?:\+[0-9A-Za-z-]+(?:\.[0-9A-Za-z-]+)*)?$/);
if (!match) {
throw new Error(`Invalid semver version "${version}"`);
}
const prerelease = match[4] ? match[4].split('.').map((identifier) => {
if (/^\d+$/.test(identifier)) {
return Number(identifier);
}
return identifier;
}) : [];
return {
major: Number(match[1]),
minor: Number(match[2]),
patch: Number(match[3]),
prerelease
};
}
function comparePrereleaseIdentifier(a, b) {
const isANumber = typeof a === 'number';
const isBNumber = typeof b === 'number';
if (isANumber && isBNumber) {
if (a === b) {
return 0;
}
return a > b ? 1 : -1;
}
if (isANumber) {
return -1;
}
if (isBNumber) {
return 1;
}
if (a === b) {
return 0;
}
return a > b ? 1 : -1;
}
function compareSemver(a, b) {
const aVersion = parseSemver(a);
const bVersion = parseSemver(b);
if (aVersion.major !== bVersion.major) {
return aVersion.major > bVersion.major ? 1 : -1;
}
if (aVersion.minor !== bVersion.minor) {
return aVersion.minor > bVersion.minor ? 1 : -1;
}
if (aVersion.patch !== bVersion.patch) {
return aVersion.patch > bVersion.patch ? 1 : -1;
}
const aPrerelease = aVersion.prerelease;
const bPrerelease = bVersion.prerelease;
if (!aPrerelease.length && !bPrerelease.length) {
return 0;
}
if (!aPrerelease.length) {
return 1;
}
if (!bPrerelease.length) {
return -1;
}
const maxLength = Math.max(aPrerelease.length, bPrerelease.length);
for (let i = 0; i < maxLength; i += 1) {
const aIdentifier = aPrerelease[i];
const bIdentifier = bPrerelease[i];
if (aIdentifier === undefined) {
return -1;
}
if (bIdentifier === undefined) {
return 1;
}
const identifierComparison = comparePrereleaseIdentifier(aIdentifier, bIdentifier);
if (identifierComparison !== 0) {
return identifierComparison;
}
}
return 0;
}
function getChangedFiles(baseSha, compareSha) {
let mergeBaseSha;
try {
mergeBaseSha = runGit(['merge-base', baseSha, compareSha]);
} catch (error) {
throw new Error(`Unable to determine merge-base for ${baseSha} and ${compareSha}. Ensure the base branch history is available in the checkout.\n${error.message}`);
}
return runGit(['diff', '--name-only', mergeBaseSha, compareSha, '--', ...MONITORED_APP_PATHS])
.split('\n')
.map(file => file.trim())
.filter(Boolean);
}
function getChangedApps(changedFiles) {
return MONITORED_APP_ENTRIES
.filter(([, app]) => {
return changedFiles.some((file) => {
return file === app.path || file.startsWith(`${app.path}/`);
});
})
.map(([key, app]) => ({key, ...app}));
}
function getPrVersion(app) {
const packageJsonPath = path.resolve(__dirname, `../../${app.path}/package.json`);
if (!fs.existsSync(packageJsonPath)) {
throw new Error(`${app.path}/package.json does not exist in this PR`);
}
return readVersionFromPackageJson(
fs.readFileSync(packageJsonPath, 'utf8'),
`${app.path}/package.json from PR`
);
}
function getMainVersion(app) {
return readVersionFromPackageJson(
runGit(['show', `origin/main:${app.path}/package.json`]),
`${app.path}/package.json from main`
);
}
function main() {
const baseSha = process.env.PR_BASE_SHA;
const compareSha = process.env.PR_COMPARE_SHA || process.env.GITHUB_SHA;
if (!baseSha) {
throw new Error('Missing PR_BASE_SHA environment variable');
}
if (!compareSha) {
throw new Error('Missing PR_COMPARE_SHA/GITHUB_SHA environment variable');
}
const changedFiles = getChangedFiles(baseSha, compareSha);
const changedApps = getChangedApps(changedFiles);
if (changedApps.length === 0) {
console.log(`No app changes detected. Skipping version bump check.`);
return;
}
console.log(`Checking version bump for apps: ${changedApps.map(app => app.key).join(', ')}`);
const failedApps = [];
for (const app of changedApps) {
const prVersion = getPrVersion(app);
const mainVersion = getMainVersion(app);
if (compareSemver(prVersion, mainVersion) <= 0) {
failedApps.push(
`${app.key} (${app.packageName}) was changed but version was not bumped above main (${prVersion} <= ${mainVersion}). Please run "pnpm ship" in ${app.path} to bump the package version.`
);
continue;
}
console.log(`${app.key} version bump check passed (${prVersion} > ${mainVersion})`);
}
if (failedApps.length) {
throw new Error(`Version bump checks failed:\n- ${failedApps.join('\n- ')}`);
}
console.log('All monitored app version bump checks passed.');
}
try {
main();
} catch (error) {
console.error(error.message);
process.exit(1);
}
+44
View File
@@ -0,0 +1,44 @@
// NOTE: this file can't use any NPM dependencies because it needs to run even if dependencies aren't installed yet or are corrupted
const {execSync} = require('child_process');
resetNxCache();
deleteNodeModules();
deleteBuildArtifacts();
console.log('Cleanup complete!');
function deleteBuildArtifacts() {
console.log('Deleting all build artifacts...');
try {
execSync('find ./ghost -type d -name "build" -exec rm -rf \'{}\' +', {
stdio: 'inherit'
});
execSync('find ./ghost -type f -name "tsconfig.tsbuildinfo" -delete', {
stdio: 'inherit'
});
} catch (error) {
console.error('Failed to delete build artifacts:', error);
process.exit(1);
}
}
function deleteNodeModules() {
console.log('Deleting all node_modules directories...');
try {
execSync('find . -name "node_modules" -type d -prune -exec rm -rf \'{}\' +', {
stdio: 'inherit'
});
} catch (error) {
console.error('Failed to delete node_modules directories:', error);
process.exit(1);
}
}
function resetNxCache() {
console.log('Resetting NX cache...');
try {
execSync('rm -rf .nxcache .nx');
} catch (error) {
console.error('Failed to reset NX cache:', error);
process.exit(1);
}
}
+710
View File
@@ -0,0 +1,710 @@
#!/usr/bin/env node
'use strict';
const fs = require('fs');
const path = require('path');
const jsonc = require('jsonc-parser');
const { execSync } = require('child_process');
/**
* Parse pnpm outdated --json output into an array of
* [packageName, current, wanted, latest, dependencyType] tuples.
*
* pnpm's JSON output is an object keyed by package name:
* { "pkg": { "wanted": "1.0.1", "latest": "2.0.0", "dependencyType": "dependencies" } }
*
* pnpm's JSON output does not include a "current" field — "wanted"
* represents the lockfile-resolved version, so we use it as current.
*/
function parsePnpmOutdatedOutput(stdout) {
if (!stdout || !stdout.trim()) {
return [];
}
const data = JSON.parse(stdout);
return Object.entries(data).map(([name, info]) => [
name,
info.wanted,
info.wanted,
info.latest,
info.dependencyType
]);
}
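A self-contained copy of the parsing shape above, fed a made-up `pnpm outdated --json` payload to show the tuple layout it returns:

```javascript
// Mirrors parsePnpmOutdatedOutput: pnpm's JSON is keyed by package name,
// and "wanted" doubles as the current (lockfile-resolved) version
function parsePnpmOutdated(stdout) {
    if (!stdout || !stdout.trim()) {
        return [];
    }
    return Object.entries(JSON.parse(stdout)).map(([name, info]) => [
        name, info.wanted, info.wanted, info.latest, info.dependencyType
    ]);
}

// Invented sample payload, not real pnpm output
const sample = JSON.stringify({
    lodash: {wanted: '4.17.20', latest: '4.17.21', dependencyType: 'dependencies'}
});
console.log(parsePnpmOutdated(sample));
// [ [ 'lodash', '4.17.20', '4.17.20', '4.17.21', 'dependencies' ] ]
```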
/**
* Smart lockfile drift detector that focuses on actionable updates
* and avoids API rate limits by using pnpm's built-in commands where possible
*/
class LockfileDriftDetector {
constructor() {
this.workspaces = [];
this.directDeps = new Map();
this.outdatedInfo = [];
this.workspaceStats = new Map();
this.workspaceDepsCount = new Map();
this.ignoredWorkspaceDeps = new Set();
this.renovateIgnoredDeps = new Set();
// Parse command line arguments
this.args = process.argv.slice(2);
this.filterSeverity = null;
// Check for severity filters
if (this.args.includes('--patch')) {
this.filterSeverity = 'patch';
} else if (this.args.includes('--minor')) {
this.filterSeverity = 'minor';
} else if (this.args.includes('--major')) {
this.filterSeverity = 'major';
}
// Check for help flag
if (this.args.includes('--help') || this.args.includes('-h')) {
this.showHelp();
process.exit(0);
}
}
/**
* Show help message
*/
showHelp() {
console.log(`
Dependency Inspector - Smart lockfile drift detector
Usage: dependency-inspector.js [options]
Options:
--patch Show all packages with patch updates
--minor Show all packages with minor updates
--major Show all packages with major updates
--help, -h Show this help message
Without flags, shows high-priority updates sorted by impact.
With a severity flag, shows all packages with that update type.
`);
}
/**
* Load ignored dependencies from renovate configuration
*/
loadRenovateConfig() {
console.log('🔧 Loading renovate configuration...');
try {
            // Read renovate.json5 from the repo's .github directory (two levels up from .github/scripts/)
const renovateConfigPath = path.join(__dirname, '../../.github/renovate.json5');
const renovateConfig = jsonc.parse(fs.readFileSync(renovateConfigPath, 'utf8'));
if (renovateConfig.ignoreDeps) {
for (const dep of renovateConfig.ignoreDeps) {
this.renovateIgnoredDeps.add(dep);
}
                console.log(`📝 Loaded ${renovateConfig.ignoreDeps.length} ignored dependencies from renovate.json5`);
console.log(` Ignored: ${Array.from(this.renovateIgnoredDeps).join(', ')}`);
} else {
                console.log('📝 No ignoreDeps found in renovate.json5');
}
} catch (error) {
            console.warn('⚠️ Could not load renovate.json5:', error.message);
}
}
/**
* Get all workspace package.json files
*/
async findWorkspaces() {
// Read from project root (two levels up from .github/scripts/)
const rootDir = path.join(__dirname, '../..');
const rootPackage = JSON.parse(fs.readFileSync(path.join(rootDir, 'package.json'), 'utf8'));
// Read workspace patterns from pnpm-workspace.yaml (primary) or package.json (fallback)
let workspacePatterns = [];
const pnpmWorkspacePath = path.join(rootDir, 'pnpm-workspace.yaml');
if (fs.existsSync(pnpmWorkspacePath)) {
const content = fs.readFileSync(pnpmWorkspacePath, 'utf8');
let inPackages = false;
for (const line of content.split('\n')) {
if (/^packages:/.test(line)) {
inPackages = true;
continue;
}
if (inPackages) {
const match = line.match(/^\s+-\s+['"]?([^'"]+)['"]?\s*$/);
if (match) {
workspacePatterns.push(match[1]);
} else if (/^\S/.test(line)) {
break;
}
}
}
} else {
workspacePatterns = rootPackage.workspaces || [];
}
console.log('📦 Scanning workspaces...');
// Add root package
this.workspaces.push({
name: rootPackage.name || 'root',
path: '.',
packageJson: rootPackage
});
// Find workspace packages
for (const pattern of workspacePatterns) {
const globPattern = path.join(rootDir, pattern.replace(/\*$/, ''));
try {
const dirs = fs.readdirSync(globPattern, { withFileTypes: true })
.filter(dirent => dirent.isDirectory())
.map(dirent => path.join(globPattern, dirent.name));
for (const dir of dirs) {
const packageJsonPath = path.join(dir, 'package.json');
if (fs.existsSync(packageJsonPath)) {
try {
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8'));
// Skip ghost/admin directory but track its dependencies for filtering
if (path.basename(dir) === 'admin' && dir.includes('ghost')) {
console.log(`🚫 Ignoring ghost/admin workspace (tracking deps for filtering)`);
const deps = {
...packageJson.dependencies,
...packageJson.devDependencies,
...packageJson.peerDependencies,
...packageJson.optionalDependencies
};
// Add all ghost/admin dependencies to ignore list
for (const depName of Object.keys(deps || {})) {
this.ignoredWorkspaceDeps.add(depName);
}
continue;
}
this.workspaces.push({
name: packageJson.name || path.basename(dir),
path: dir,
packageJson
});
} catch (e) {
console.warn(`⚠️ Skipped ${packageJsonPath}: ${e.message}`);
}
}
}
} catch (e) {
console.warn(`⚠️ Skipped pattern ${pattern}: ${e.message}`);
}
}
console.log(`Found ${this.workspaces.length} workspaces`);
return this.workspaces;
}
/**
* Extract all direct dependencies from workspaces
*/
extractDirectDependencies() {
console.log('🔍 Extracting direct dependencies...');
for (const workspace of this.workspaces) {
const { packageJson } = workspace;
const deps = {
...packageJson.dependencies,
...packageJson.devDependencies,
...packageJson.peerDependencies,
...packageJson.optionalDependencies
};
// Count total dependencies for this workspace
const totalDepsForWorkspace = Object.keys(deps || {}).length;
this.workspaceDepsCount.set(workspace.name, totalDepsForWorkspace);
for (const [name, range] of Object.entries(deps || {})) {
if (!this.directDeps.has(name)) {
this.directDeps.set(name, new Set());
}
this.directDeps.get(name).add({
workspace: workspace.name,
range,
path: workspace.path
});
}
}
return this.directDeps;
}
/**
* Use pnpm outdated to get comprehensive outdated info
* This is much faster and more reliable than manual API calls
*/
async getOutdatedPackages() {
console.log('🔄 Running pnpm outdated (this may take a moment)...');
let stdout;
try {
stdout = execSync('pnpm outdated --json', {
encoding: 'utf8',
maxBuffer: 10 * 1024 * 1024 // 10MB buffer for large output
});
} catch (error) {
// pnpm outdated exits with code 1 when there are outdated packages
if (error.status === 1 && error.stdout) {
stdout = error.stdout;
} else {
console.error('Failed to run pnpm outdated:', error.message);
return [];
}
}
return parsePnpmOutdatedOutput(stdout);
}
/**
* Analyze the severity of version differences
*/
analyzeVersionDrift(current, wanted, latest) {
const parseVersion = (v) => {
const match = v.match(/(\d+)\.(\d+)\.(\d+)/);
if (!match) return { major: 0, minor: 0, patch: 0 };
return {
major: parseInt(match[1]),
minor: parseInt(match[2]),
patch: parseInt(match[3])
};
};
const currentVer = parseVersion(current);
const latestVer = parseVersion(latest);
const majorDiff = latestVer.major - currentVer.major;
const minorDiff = latestVer.minor - currentVer.minor;
const patchDiff = latestVer.patch - currentVer.patch;
let severity = 'patch';
let score = patchDiff;
if (majorDiff > 0) {
severity = 'major';
score = majorDiff * 1000 + minorDiff * 100 + patchDiff;
} else if (minorDiff > 0) {
severity = 'minor';
score = minorDiff * 100 + patchDiff;
}
return { severity, score, majorDiff, minorDiff, patchDiff };
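The classification step above reduces to "first differing component wins". A minimal restatement, applied to invented version pairs:

```javascript
// Condensed version of analyzeVersionDrift's severity decision
function driftSeverity(current, latest) {
    const parse = v => (v.match(/(\d+)\.(\d+)\.(\d+)/) || [0, 0, 0, 0]).slice(1, 4).map(Number);
    const [curMajor, curMinor] = parse(current);
    const [latMajor, latMinor] = parse(latest);
    if (latMajor > curMajor) return 'major';
    if (latMinor > curMinor) return 'minor';
    return 'patch';
}

console.log(driftSeverity('1.4.2', '2.0.0')); // major
console.log(driftSeverity('1.4.2', '1.5.0')); // minor
console.log(driftSeverity('1.4.2', '1.4.9')); // patch
```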
}
/**
* Process and categorize outdated packages
*/
processOutdatedPackages(outdatedData) {
console.log('📊 Processing outdated package information...');
// Initialize workspace stats
for (const workspace of this.workspaces) {
this.workspaceStats.set(workspace.name, {
total: 0,
major: 0,
minor: 0,
patch: 0,
packages: [],
outdatedPackageNames: new Set() // Track unique package names per workspace
});
}
const results = {
direct: [],
transitive: [],
stats: {
total: 0,
major: 0,
minor: 0,
patch: 0
}
};
for (const [packageName, current, wanted, latest, packageType] of outdatedData) {
const isDirect = this.directDeps.has(packageName);
// Skip packages that are only used by ignored workspaces (like ghost/admin)
if (!isDirect && this.ignoredWorkspaceDeps.has(packageName)) {
continue;
}
// Skip packages that are ignored by renovate configuration
if (this.renovateIgnoredDeps.has(packageName)) {
continue;
}
const analysis = this.analyzeVersionDrift(current, wanted, latest);
const packageInfo = {
name: packageName,
current,
wanted,
latest,
type: packageType || 'dependencies',
isDirect,
...analysis,
workspaces: isDirect ? Array.from(this.directDeps.get(packageName)) : []
};
// Update workspace statistics for direct dependencies
if (isDirect) {
for (const workspaceInfo of packageInfo.workspaces) {
const stats = this.workspaceStats.get(workspaceInfo.workspace);
if (stats && !stats.outdatedPackageNames.has(packageName)) {
// Only count each package once per workspace
stats.outdatedPackageNames.add(packageName);
stats.total++;
stats[analysis.severity]++;
stats.packages.push({
name: packageName,
current,
latest,
severity: analysis.severity
});
}
}
results.direct.push(packageInfo);
} else {
results.transitive.push(packageInfo);
}
results.stats.total++;
results.stats[analysis.severity]++;
}
// Deduplicate direct dependencies and count workspace impact
const directDepsMap = new Map();
for (const pkg of results.direct) {
if (!directDepsMap.has(pkg.name)) {
directDepsMap.set(pkg.name, {
...pkg,
workspaceCount: pkg.workspaces.length,
impact: pkg.workspaces.length // Number of workspaces affected
});
}
}
// Sort by impact: workspace count first, then severity, then score
const sortByImpact = (a, b) => {
// First by number of workspaces (more workspaces = higher priority)
if (a.impact !== b.impact) {
return b.impact - a.impact;
}
// Then by severity
if (a.severity !== b.severity) {
const severityOrder = { major: 3, minor: 2, patch: 1 };
return severityOrder[b.severity] - severityOrder[a.severity];
}
// Finally by version drift score
return b.score - a.score;
};
results.direct = Array.from(directDepsMap.values()).sort(sortByImpact);
results.transitive.sort((a, b) => {
if (a.severity !== b.severity) {
const severityOrder = { major: 3, minor: 2, patch: 1 };
return severityOrder[b.severity] - severityOrder[a.severity];
}
return b.score - a.score;
});
return results;
}
/**
* Display filtered results by severity
*/
displayFilteredResults(results) {
const severityEmoji = {
major: '🔴',
minor: '🟡',
patch: '🟢'
};
const emoji = severityEmoji[this.filterSeverity];
const filterTitle = this.filterSeverity.toUpperCase();
console.log(`${emoji} ${filterTitle} UPDATES ONLY:\n`);
// Filter direct dependencies
const filteredDirect = results.direct.filter(pkg => pkg.severity === this.filterSeverity);
const filteredTransitive = results.transitive.filter(pkg => pkg.severity === this.filterSeverity);
console.log(`Found ${filteredDirect.length} direct and ${filteredTransitive.length} transitive ${this.filterSeverity} updates.\n`);
if (filteredDirect.length > 0) {
console.log('📦 DIRECT DEPENDENCIES:');
console.log('─'.repeat(80));
// Sort by workspace impact, then by package name
filteredDirect.sort((a, b) => {
if (a.impact !== b.impact) {
return b.impact - a.impact;
}
return a.name.localeCompare(b.name);
});
for (const pkg of filteredDirect) {
const workspaceList = pkg.workspaces.map(w => w.workspace).join(', ');
const impactNote = pkg.workspaceCount > 1 ? ` (${pkg.workspaceCount} workspaces)` : '';
                console.log(`  ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest}${impactNote}`);
console.log(` Workspaces: ${workspaceList}`);
}
console.log('\n🚀 UPDATE COMMANDS:');
console.log('─'.repeat(80));
for (const pkg of filteredDirect) {
console.log(` pnpm update ${pkg.name}@latest`);
}
}
if (filteredTransitive.length > 0) {
console.log('\n\n🔄 TRANSITIVE DEPENDENCIES:');
console.log('─'.repeat(80));
console.log(' These will likely be updated automatically when you update direct deps.\n');
// Sort by package name for easier scanning
filteredTransitive.sort((a, b) => a.name.localeCompare(b.name));
for (const pkg of filteredTransitive) {
                console.log(`  ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest}`);
}
}
// Show workspace-specific breakdown
console.log('\n\n🏢 WORKSPACE BREAKDOWN:');
console.log('─'.repeat(80));
for (const [workspaceName, stats] of this.workspaceStats.entries()) {
const severityCount = stats[this.filterSeverity];
if (severityCount > 0) {
const packages = stats.packages.filter(p => p.severity === this.filterSeverity);
console.log(`\n 📦 ${workspaceName}: ${severityCount} ${this.filterSeverity} update${severityCount !== 1 ? 's' : ''}`);
// Show all packages for this workspace with the selected severity
for (const pkg of packages) {
                    console.log(`     ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest}`);
}
}
}
console.log('');
}
/**
* Display results in a helpful format
*/
displayResults(results) {
console.log('\n🎯 DEPENDENCY ANALYSIS RESULTS\n');
// If filtering by severity, show filtered results
if (this.filterSeverity) {
this.displayFilteredResults(results);
return;
}
// Workspace-specific statistics
console.log('🏢 WORKSPACE BREAKDOWN:');
console.log(' Outdated packages per workspace:\n');
// Sort workspaces by percentage of outdated packages (descending), then by total count
const sortedWorkspaces = Array.from(this.workspaceStats.entries())
.sort(([nameA, a], [nameB, b]) => {
const totalA = this.workspaceDepsCount.get(nameA) || 0;
const totalB = this.workspaceDepsCount.get(nameB) || 0;
const percentageA = totalA > 0 ? (a.total / totalA) * 100 : 0;
const percentageB = totalB > 0 ? (b.total / totalB) * 100 : 0;
// Sort by percentage first, then by total count
if (Math.abs(percentageA - percentageB) > 0.1) {
return percentageB - percentageA;
}
return b.total - a.total;
});
for (const [workspaceName, stats] of sortedWorkspaces) {
const totalDeps = this.workspaceDepsCount.get(workspaceName) || 0;
const outdatedCount = stats.total;
const percentage = totalDeps > 0 ? ((outdatedCount / totalDeps) * 100).toFixed(1) : '0.0';
if (stats.total === 0) {
                console.log(`  ✅ ${workspaceName}: All ${totalDeps} dependencies up to date! (0% outdated)`);
} else {
console.log(` 📦 ${workspaceName}: ${outdatedCount}/${totalDeps} outdated (${percentage}%)`);
console.log(` 🔴 Major: ${stats.major} | 🟡 Minor: ${stats.minor} | 🟢 Patch: ${stats.patch}`);
// Show top 3 most outdated packages for this workspace
const topPackages = stats.packages
.sort((a, b) => {
const severityOrder = { major: 3, minor: 2, patch: 1 };
return severityOrder[b.severity] - severityOrder[a.severity];
})
.slice(0, 3);
if (topPackages.length > 0) {
console.log(` Top issues: ${topPackages.map(p => {
const emoji = p.severity === 'major' ? '🔴' : p.severity === 'minor' ? '🟡' : '🟢';
                        return `${emoji} ${p.name} (${p.current} → ${p.latest})`;
}).join(', ')}`);
}
console.log('');
}
}
console.log('');
// Direct dependencies (most actionable)
if (results.direct.length > 0) {
console.log('🎯 DIRECT DEPENDENCIES (High Priority):');
console.log(' Sorted by impact: workspace count → severity → version drift\n');
const topDirect = results.direct.slice(0, 15);
for (const pkg of topDirect) {
const emoji = pkg.severity === 'major' ? '🔴' : pkg.severity === 'minor' ? '🟡' : '🟢';
const impactEmoji = pkg.workspaceCount >= 5 ? '🌟' : pkg.workspaceCount >= 3 ? '⭐' : '';
console.log(` ${emoji} ${impactEmoji} ${pkg.name}`);
                console.log(`     ${pkg.current} → ${pkg.latest} (${pkg.severity})`);
console.log(` Used in ${pkg.workspaceCount} workspace${pkg.workspaceCount !== 1 ? 's' : ''}: ${pkg.workspaces.map(w => w.workspace).join(', ')}`);
console.log('');
}
if (results.direct.length > 15) {
console.log(` ... and ${results.direct.length - 15} more direct dependencies\n`);
}
}
// Sample of most outdated transitive dependencies
if (results.transitive.length > 0) {
console.log('🔄 MOST OUTDATED TRANSITIVE DEPENDENCIES (Lower Priority):');
console.log(' These will likely be updated automatically when you update direct deps.\n');
const topTransitive = results.transitive.slice(0, 10);
for (const pkg of topTransitive) {
const emoji = pkg.severity === 'major' ? '🔴' : pkg.severity === 'minor' ? '🟡' : '🟢';
console.log(`  ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest} (${pkg.severity})`);
}
if (results.transitive.length > 10) {
console.log(` ... and ${results.transitive.length - 10} more transitive dependencies\n`);
}
}
// Generate update commands for highest impact packages
const topUpdates = results.direct.slice(0, 5);
if (topUpdates.length > 0) {
console.log('🚀 SUGGESTED COMMANDS (highest impact first):');
for (const pkg of topUpdates) {
const impactNote = pkg.workspaceCount > 1 ? ` (affects ${pkg.workspaceCount} workspaces)` : '';
console.log(` pnpm update ${pkg.name}@latest${impactNote}`);
}
console.log('');
}
const generatedAt = new Date().toISOString();
const latestCommit = this.getLatestCommitRef();
// Summary at the end
console.log('📈 SUMMARY:');
console.log(` Generated at: ${generatedAt}`);
console.log(` Latest commit: ${latestCommit}`);
console.log(` Total dependencies: ${this.directDeps.size}`);
console.log(` Total outdated: ${results.stats.total}`);
console.log(` Major updates: ${results.stats.major}`);
console.log(` Minor updates: ${results.stats.minor}`);
console.log(` Patch updates: ${results.stats.patch}`);
console.log(` Direct deps: ${results.direct.length}`);
console.log(` Transitive deps: ${results.transitive.length}\n`);
}
/**
* Get the latest commit reference for the current checkout
*/
getLatestCommitRef() {
try {
return execSync("git log -1 --format='%h %ad %s' --date=iso-strict", {
encoding: 'utf8'
}).trim();
} catch (error) {
return 'Unavailable';
}
}
/**
* Run pnpm audit and display a vulnerability summary
*/
displayAuditSummary() {
console.log('🔒 SECURITY AUDIT:\n');
try {
let stdout = '';
try {
stdout = execSync('pnpm audit --json', {
encoding: 'utf8',
maxBuffer: 10 * 1024 * 1024
});
} catch (error) {
// pnpm audit exits with non-zero when vulnerabilities are found
stdout = error.stdout || '';
}
if (!stdout || !stdout.trim()) {
console.log(' ⚠️ Could not parse audit summary\n');
return;
}
const data = JSON.parse(stdout);
if (data.metadata && data.metadata.vulnerabilities) {
const v = data.metadata.vulnerabilities;
const total = v.info + v.low + v.moderate + v.high + v.critical;
console.log(` Total vulnerabilities: ${total}`);
console.log(` 🔴 Critical: ${v.critical}`);
console.log(` 🟠 High: ${v.high}`);
console.log(` 🟡 Moderate: ${v.moderate}`);
console.log(` 🟢 Low: ${v.low}`);
if (v.info > 0) {
console.log(`  ℹ️ Info: ${v.info}`);
}
console.log(` Total dependencies scanned: ${data.metadata.totalDependencies}\n`);
} else {
console.log(' ⚠️ Could not parse audit summary\n');
}
} catch (error) {
console.log(` ⚠️ Audit failed: ${error.message}\n`);
}
}
async run() {
try {
// Change to project root directory to run commands correctly
const rootDir = path.join(__dirname, '../..');
process.chdir(rootDir);
this.loadRenovateConfig();
await this.findWorkspaces();
this.extractDirectDependencies();
const outdatedData = await this.getOutdatedPackages();
if (outdatedData.length === 0) {
console.log('🎉 All packages are up to date!');
return;
}
const results = this.processOutdatedPackages(outdatedData);
this.displayResults(results);
this.displayAuditSummary();
} catch (error) {
console.error('❌ Error:', error.message);
process.exit(1);
}
}
}
// Run the detector
const detector = new LockfileDriftDetector();
detector.run();
@@ -0,0 +1,25 @@
const userAgent = process.env.npm_config_user_agent || '';
if (/\bpnpm\//.test(userAgent)) {
process.exit(0);
}
const detectedPackageManager = userAgent.split(' ')[0] || 'unknown';
console.error(`
Ghost now uses pnpm for dependency installation.
Detected package manager: ${detectedPackageManager}
Use one of these instead:
corepack enable pnpm
pnpm install
Common command replacements:
yarn setup -> pnpm run setup
yarn dev -> pnpm dev
yarn test -> pnpm test
yarn lint -> pnpm lint
`);
process.exit(1);
@@ -0,0 +1,215 @@
const path = require('path');
const fs = require('fs/promises');
const exec = require('util').promisify(require('child_process').exec);
const readline = require('readline/promises');
const semver = require('semver');
// Maps a package name to the config key in defaults.json
const CONFIG_KEYS = {
'@tryghost/portal': 'portal',
'@tryghost/sodo-search': 'sodoSearch',
'@tryghost/comments-ui': 'comments',
'@tryghost/announcement-bar': 'announcementBar',
'@tryghost/signup-form': 'signupForm'
};
const CURRENT_DIR = process.cwd();
const packageJsonPath = path.join(CURRENT_DIR, 'package.json');
const packageJson = require(packageJsonPath);
const APP_NAME = packageJson.name;
const APP_VERSION = packageJson.version;
async function safeExec(command) {
try {
return await exec(command);
} catch (err) {
return {
stdout: err.stdout,
stderr: err.stderr
};
}
}
async function ensureEnabledApp() {
const ENABLED_APPS = Object.keys(CONFIG_KEYS);
if (!ENABLED_APPS.includes(APP_NAME)) {
console.error(`${APP_NAME} is not enabled, please modify ${__filename}`);
process.exit(1);
}
}
async function ensureNotOnMain() {
const currentGitBranch = await safeExec(`git branch --show-current`);
if (currentGitBranch.stderr) {
console.error(`There was an error checking the current git branch`);
console.error(`${currentGitBranch.stderr}`);
process.exit(1);
}
if (currentGitBranch.stdout.trim() === 'main') {
console.error(`The release cannot be done on the "main" branch`);
process.exit(1);
}
}
async function ensureCleanGit() {
const localGitChanges = await safeExec(`git status --porcelain`);
if (localGitChanges.stderr) {
console.error(`There was an error checking the local git status`);
console.error(`${localGitChanges.stderr}`);
process.exit(1);
}
if (localGitChanges.stdout) {
console.error(`You have local git changes - are you sure you're ready to release?`);
console.error(`${localGitChanges.stdout}`);
process.exit(1);
}
}
async function getNewVersion() {
const rl = readline.createInterface({input: process.stdin, output: process.stdout});
const bumpTypeInput = await rl.question('Is this a patch, minor or major (patch)? ');
rl.close();
const bumpType = bumpTypeInput.trim().toLowerCase() || 'patch';
if (!['patch', 'minor', 'major'].includes(bumpType)) {
console.error(`Unknown bump type ${bumpTypeInput} - expected one of "patch", "minor", "major"`);
process.exit(1);
}
return semver.inc(APP_VERSION, bumpType);
}
async function updateConfig(newVersion) {
const defaultConfigPath = path.resolve(__dirname, '../../ghost/core/core/shared/config/defaults.json');
const defaultConfig = require(defaultConfigPath);
const configKey = CONFIG_KEYS[APP_NAME];
defaultConfig[configKey].version = `${semver.major(newVersion)}.${semver.minor(newVersion)}`;
await fs.writeFile(defaultConfigPath, JSON.stringify(defaultConfig, null, 4) + '\n');
}
async function updatePackageJson(newVersion) {
const newPackageJson = Object.assign({}, packageJson, {
version: newVersion
});
await fs.writeFile(packageJsonPath, JSON.stringify(newPackageJson, null, 2) + '\n');
}
async function getChangelog(newVersion) {
const rl = readline.createInterface({input: process.stdin, output: process.stdout});
const i18nChangesInput = await rl.question('Does this release contain i18n updates (Y/n)? ');
rl.close();
const i18nChanges = i18nChangesInput.trim().toLowerCase() !== 'n';
let changelogItems = [];
if (i18nChanges) {
changelogItems.push('Updated i18n translations');
}
// Restrict git log to only the current directory (the specific app)
const lastFiftyCommits = await safeExec(`git log -n 50 --oneline -- .`);
if (lastFiftyCommits.stderr) {
console.error(`There was an error getting the last 50 commits`);
process.exit(1);
}
const lastFiftyCommitsList = lastFiftyCommits.stdout.split('\n');
const releaseRegex = new RegExp(`Released ${APP_NAME} v${APP_VERSION}`);
const indexOfLastRelease = lastFiftyCommitsList.findIndex((commitLine) => {
const commitMessage = commitLine.slice(11); // Take the hash off the front
return releaseRegex.test(commitMessage);
});
if (indexOfLastRelease === -1) {
console.warn(`Could not find commit for previous release. Will include recent commits affecting this app.`);
// Fallback: get recent commits for this app (last 20)
const recentCommits = await safeExec(`git log -n 20 --pretty=format:"%h%n%B__SPLIT__" -- .`);
if (recentCommits.stderr) {
console.error(`There was an error getting recent commits`);
process.exit(1);
}
const recentCommitsList = recentCommits.stdout.split('__SPLIT__');
const recentCommitsWhichMentionLinear = recentCommitsList.filter((commitBlock) => {
return commitBlock.includes('https://linear.app/ghost');
});
const commitChangelogItems = recentCommitsWhichMentionLinear.map((commitBlock) => {
const lines = commitBlock.split('\n');
if (!lines.length || !lines[0].trim()) {
return null; // Skip entries with no hash
}
const hash = lines[0].trim();
return `https://github.com/TryGhost/Ghost/commit/${hash}`;
}).filter(Boolean); // Filter out any null entries
changelogItems.push(...commitChangelogItems);
} else {
const lastReleaseCommit = lastFiftyCommitsList[indexOfLastRelease];
const lastReleaseCommitHash = lastReleaseCommit.slice(0, 10);
// Also restrict this git log to only the current directory (the specific app)
const commitsSinceLastRelease = await safeExec(`git log ${lastReleaseCommitHash}..HEAD --pretty=format:"%h%n%B__SPLIT__" -- .`);
if (commitsSinceLastRelease.stderr) {
console.error(`There was an error getting commits since the last release`);
process.exit(1);
}
const commitsSinceLastReleaseList = commitsSinceLastRelease.stdout.split('__SPLIT__');
const commitsSinceLastReleaseWhichMentionLinear = commitsSinceLastReleaseList.filter((commitBlock) => {
return commitBlock.includes('https://linear.app/ghost');
});
const commitChangelogItems = commitsSinceLastReleaseWhichMentionLinear.map((commitBlock) => {
const lines = commitBlock.split('\n');
if (!lines.length || !lines[0].trim()) {
return null; // Skip entries with no hash
}
const hash = lines[0].trim();
return `https://github.com/TryGhost/Ghost/commit/${hash}`;
}).filter(Boolean); // Filter out any null entries
changelogItems.push(...commitChangelogItems);
}
const changelogList = changelogItems.map(item => ` - ${item}`).join('\n');
return `Changelog for v${APP_VERSION} -> v${newVersion}:\n${changelogList}`;
}
async function main() {
await ensureEnabledApp();
await ensureNotOnMain();
await ensureCleanGit();
console.log(`Running release for ${APP_NAME}`);
console.log(`Current version is ${APP_VERSION}`);
const newVersion = await getNewVersion();
console.log(`Bumping to version ${newVersion}`);
const changelog = await getChangelog(newVersion);
await updatePackageJson(newVersion);
await exec(`git add package.json`);
await updateConfig(newVersion);
await exec(`git add ../../ghost/core/core/shared/config/defaults.json`);
await exec(`git commit -m 'Released ${APP_NAME} v${newVersion}\n\n${changelog}'`);
console.log(`Release commit created - please double check it and use "git commit --amend" to make any changes before opening a PR to merge into main`);
}
main();
@@ -0,0 +1,30 @@
name: CI (Release)
on:
push:
tags:
- 'v[0-9]*'
# Tags must never be cancelled — each is a public release
concurrency:
group: ci-release-${{ github.ref_name }}
cancel-in-progress: false
# Workflow-level permissions set the ceiling for the reusable ci.yml.
# id-token is never in the default token, so it must be granted explicitly
# here — otherwise the ci: job's `permissions:` block exceeds the caller
# workflow's permissions and GitHub rejects the run with startup_failure.
permissions:
actions: read
contents: write
packages: write
id-token: write
jobs:
ci:
uses: ./.github/workflows/ci.yml
secrets: inherit
permissions:
actions: read
contents: write
packages: write
id-token: write
File diff suppressed because it is too large
@@ -0,0 +1,158 @@
name: Cleanup GHCR Images
on:
schedule:
- cron: "30 4 * * *" # Daily at 04:30 UTC
workflow_dispatch:
inputs:
dry_run:
description: "Log what would be deleted without making changes"
required: false
default: true
type: boolean
retention_days:
description: "Delete versions older than this many days"
required: false
default: 14
type: number
min_keep:
description: "Always keep at least this many versions per package"
required: false
default: 10
type: number
permissions:
packages: write
env:
ORG: TryGhost
RETENTION_DAYS: ${{ inputs.retention_days || 14 }}
MIN_KEEP: ${{ inputs.min_keep || 10 }}
jobs:
cleanup:
name: Cleanup
runs-on: ubuntu-latest
strategy:
matrix:
package: [ghost, ghost-core, ghost-development]
steps:
- name: Delete old non-release versions
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DRY_RUN: ${{ github.event_name == 'schedule' && 'false' || inputs.dry_run }}
PACKAGE: ${{ matrix.package }}
run: |
set -euo pipefail
cutoff=$(date -u -d "-${RETENTION_DAYS} days" +%Y-%m-%dT%H:%M:%SZ 2>/dev/null \
|| date -u -v-${RETENTION_DAYS}d +%Y-%m-%dT%H:%M:%SZ)
echo "Package: ${ORG}/${PACKAGE}"
echo "Cutoff: ${cutoff} (${RETENTION_DAYS} days ago)"
echo "Dry run: ${DRY_RUN}"
echo ""
# Pagination — collect all versions
page=1
all_versions="[]"
while true; do
if ! batch=$(gh api \
"/orgs/${ORG}/packages/container/${PACKAGE}/versions?per_page=100&page=${page}" \
--jq '.' 2>&1); then
if [ "$page" = "1" ]; then
echo "::error::API request failed: ${batch}"
exit 1
fi
echo "::warning::API request failed (page ${page}): ${batch}"
break
fi
count=$(echo "$batch" | jq 'length')
if [ "$count" = "0" ]; then
break
fi
all_versions=$(echo "$all_versions $batch" | jq -s 'add')
page=$((page + 1))
done
total=$(echo "$all_versions" | jq 'length')
echo "Total versions: ${total}"
# Classify versions
keep=0
delete=0
delete_ids=""
for row in $(echo "$all_versions" | jq -r '.[] | @base64'); do
_jq() { echo "$row" | base64 -d | jq -r "$1"; }
id=$(_jq '.id')
updated=$(_jq '.updated_at')
tags=$(_jq '[.metadata.container.tags[]] | join(",")')
# Keep versions with semver tags (v1.2.3, 1.2.3, 1.2)
if echo "$tags" | grep -qE '(^|,)v?[0-9]+\.[0-9]+\.[0-9]+(,|$)' || \
echo "$tags" | grep -qE '(^|,)[0-9]+\.[0-9]+(,|$)'; then
keep=$((keep + 1))
continue
fi
# Keep versions with 'latest' or 'main' or cache-main tags
if echo "$tags" | grep -qE '(^|,)(latest|main|cache-main)(,|$)'; then
keep=$((keep + 1))
continue
fi
# Keep versions newer than cutoff
if [[ "$updated" > "$cutoff" ]]; then
keep=$((keep + 1))
continue
fi
# This version is eligible for deletion
delete=$((delete + 1))
delete_ids="${delete_ids} ${id}"
tag_display="${tags:-<untagged>}"
if [ "$DRY_RUN" = "true" ]; then
echo "[dry-run] Would delete version ${id} (tags: ${tag_display}, updated: ${updated})"
fi
done
echo ""
echo "Summary: ${keep} kept, ${delete} to delete (of ${total} total)"
if [ "$delete" = "0" ]; then
echo "Nothing to delete."
exit 0
fi
# Safety check — run before dry-run exit so users see the warning
if [ "$keep" -lt "$MIN_KEEP" ]; then
echo "::error::Safety check failed — only ${keep} versions would remain (minimum: ${MIN_KEEP}). Aborting."
exit 1
fi
if [ "$DRY_RUN" = "true" ]; then
echo ""
echo "Dry run — no versions deleted."
exit 0
fi
# Delete eligible versions
deleted=0
failed=0
for id in $delete_ids; do
if gh api --method DELETE \
"/orgs/${ORG}/packages/container/${PACKAGE}/versions/${id}" 2>/dev/null; then
deleted=$((deleted + 1))
else
echo "::warning::Failed to delete version ${id}"
failed=$((failed + 1))
fi
done
echo ""
echo "Deleted ${deleted} versions (${failed} failed)"
@@ -0,0 +1,26 @@
name: "Copilot Setup Steps"
# This workflow configures the environment for GitHub Copilot Agent with gh-aw MCP server
on:
workflow_dispatch:
push:
paths:
- .github/workflows/copilot-setup-steps.yml
jobs:
# The job MUST be called 'copilot-setup-steps' to be recognized by GitHub Copilot Agent
copilot-setup-steps:
runs-on: ubuntu-latest
# Set minimal permissions for setup steps
# Copilot Agent receives its own token with appropriate permissions
permissions:
contents: read
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Install gh-aw extension
uses: github/gh-aw/actions/setup-cli@ce1794953e0ec42adc41b6fca05e02ab49ee21c3 # v0.68.3
with:
version: v0.49.3
@@ -0,0 +1,66 @@
name: Create release branch
on:
workflow_dispatch:
inputs:
base-ref:
description: 'Git ref to base from (defaults to latest tag)'
type: string
default: 'latest'
required: false
bump-type:
description: 'Version bump type (patch, minor)'
type: string
required: false
default: 'patch'
env:
FORCE_COLOR: 1
permissions:
contents: write
jobs:
create-branch:
if: github.repository == 'TryGhost/Ghost'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
if: inputs.base-ref == 'latest'
with:
ref: main
fetch-depth: 0
submodules: true
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
if: inputs.base-ref != 'latest'
with:
ref: ${{ inputs.base-ref }}
fetch-depth: 0
submodules: true
- name: Checkout most recent tag
run: git checkout "$(git describe --tags --abbrev=0 --match=v*)"
if: inputs.base-ref == 'latest'
- uses: asdf-vm/actions/install@b7bcd026f18772e44fe1026d729e1611cc435d47 # v4
with:
tool_versions: |
semver 3.3.0
- run: |
CURRENT_TAG=$(git describe --tags --abbrev=0 --match=v*)
NEW_VERSION=$(semver bump "$BUMP_TYPE_INPUT" "$CURRENT_TAG")
printf 'CURRENT_SHA=%s\n' "$(git rev-parse HEAD)" >> "$GITHUB_ENV"
printf 'NEW_VERSION=%s\n' "$NEW_VERSION" >> "$GITHUB_ENV"
env:
BUMP_TYPE_INPUT: ${{ inputs.bump-type }}
- name: Create branch
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const branchName = `v${process.env.NEW_VERSION}`;
console.log(`Creating branch: ${branchName}`);
await github.request('POST /repos/{owner}/{repo}/git/refs', {
owner: context.repo.owner,
repo: context.repo.repo,
ref: `refs/heads/${branchName}`,
sha: process.env.CURRENT_SHA
});
@@ -0,0 +1,127 @@
name: Deploy to Staging
# DISABLED: The deploy-to-staging label workflow is currently broken and disabled.
# Problems:
# 1. Admin is global — deploying a PR's admin overwrites admin-forward/ for ALL staging
# sites, not just demo.ghost.is. Per-site admin versioning is needed first.
# 2. Main merges overwrite — any merge to main triggers a full staging rollout that
# overwrites both the server version on demo.ghost.is and admin-forward/ globally.
# The deployment lasts only until the next merge to main, making it unreliable.
# See: https://www.notion.so/ghost/Proposal-Per-site-admin-versioning-31951439c03081daa133eb0215642202
on:
pull_request_target:
types: [labeled]
jobs:
deploy:
name: Deploy to Staging
# Runs when the "deploy-to-staging" label is added — requires collaborator write access.
# Fork PRs are rejected because they don't have GHCR images (CI uses artifact transfer).
if: >-
false
&& github.event.label.name == 'deploy-to-staging'
&& github.event.pull_request.head.repo.full_name == github.event.pull_request.base.repo.full_name
runs-on: ubuntu-latest
permissions:
contents: read
actions: read
env:
PR_NUMBER: ${{ github.event.pull_request.number }}
HEAD_SHA: ${{ github.event.pull_request.head.sha }}
steps:
- name: Wait for CI build artifacts
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
echo "Waiting for CI to complete Docker build for $HEAD_SHA..."
TIMEOUT=1800 # 30 minutes
INTERVAL=30
START=$(date +%s)
while true; do
ELAPSED=$(( $(date +%s) - START ))
if [ "$ELAPSED" -ge "$TIMEOUT" ]; then
echo "::error::Timed out waiting for CI (${TIMEOUT}s)"
exit 1
fi
# Find the CI run for this SHA
RUN=$(gh api "repos/${{ github.repository }}/actions/workflows/ci.yml/runs?head_sha=${HEAD_SHA}&per_page=1" \
--jq '.workflow_runs[0] | {id, status, conclusion}' 2>/dev/null || echo "")
if [ -z "$RUN" ] || [ "$RUN" = "null" ]; then
echo " No CI run found yet, waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
sleep "$INTERVAL"
continue
fi
STATUS=$(echo "$RUN" | jq -r '.status')
CONCLUSION=$(echo "$RUN" | jq -r '.conclusion // empty')
RUN_ID=$(echo "$RUN" | jq -r '.id')
if [ "$STATUS" = "completed" ]; then
if [ "$CONCLUSION" = "success" ] || [ "$CONCLUSION" = "failure" ]; then
# Check if Docker build job specifically succeeded (paginate — CI has 30+ jobs)
BUILD_JOB=$(gh api --paginate "repos/${{ github.repository }}/actions/runs/${RUN_ID}/jobs?per_page=100" \
--jq '.jobs[] | select(.name == "Build & Publish Artifacts") | .conclusion')
if [ -z "$BUILD_JOB" ]; then
echo "::error::Build & Publish Artifacts job not found in CI run ${RUN_ID}"
exit 1
elif [ "$BUILD_JOB" = "success" ]; then
echo "Docker build ready (CI run $RUN_ID)"
break
else
echo "::error::Docker build job did not succeed (conclusion: $BUILD_JOB)"
exit 1
fi
else
echo "::error::CI run failed (conclusion: $CONCLUSION)"
exit 1
fi
fi
echo " CI still running ($STATUS), waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
sleep "$INTERVAL"
done
- name: Re-check PR eligibility
id: recheck
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
PR=$(gh api "repos/${{ github.repository }}/pulls/${{ env.PR_NUMBER }}" \
--jq '{state, labels: [.labels[].name], head_sha: .head.sha}')
STATE=$(echo "$PR" | jq -r '.state')
HAS_LABEL=$(echo "$PR" | jq '.labels | any(. == "deploy-to-staging")')
CURRENT_SHA=$(echo "$PR" | jq -r '.head_sha')
if [ "$STATE" != "open" ]; then
echo "::warning::PR is no longer open ($STATE), skipping dispatch"
echo "skip=true" >> "$GITHUB_OUTPUT"
elif [ "$HAS_LABEL" != "true" ]; then
echo "::warning::deploy-to-staging label was removed, skipping dispatch"
echo "skip=true" >> "$GITHUB_OUTPUT"
elif [ "$CURRENT_SHA" != "$HEAD_SHA" ]; then
echo "::warning::HEAD SHA changed ($HEAD_SHA → $CURRENT_SHA), skipping dispatch (new push will trigger CI)"
echo "skip=true" >> "$GITHUB_OUTPUT"
else
echo "PR still eligible for deploy"
echo "skip=false" >> "$GITHUB_OUTPUT"
fi
- name: Dispatch to Ghost-Moya
if: steps.recheck.outputs.skip != 'true'
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4
with:
token: ${{ secrets.CANARY_DOCKER_BUILD }}
repository: TryGhost/Ghost-Moya
event-type: ghost-artifacts-ready
client-payload: >-
{
"ref": "${{ env.PR_NUMBER }}",
"source_repo": "${{ github.repository }}",
"pr_number": "${{ env.PR_NUMBER }}",
"deploy": "true"
}
@@ -0,0 +1,21 @@
name: 'Label Issues & PRs'
on:
workflow_dispatch:
issues:
types: [opened, closed, labeled]
pull_request_target:
types: [opened, closed, labeled]
schedule:
- cron: '0 * * * *'
permissions:
issues: write
pull-requests: write
jobs:
action:
runs-on: ubuntu-latest
if: github.repository_owner == 'TryGhost'
steps:
- uses: tryghost/actions/actions/label-actions@20b5ae5f266e86f7b5f0815d92731d6388b8ce46 # main
File diff suppressed because it is too large
@@ -0,0 +1,232 @@
---
description: Triage new Linear issues for the Berlin Bureau (BER) team — classify type, assign priority, tag product area, and post reasoning comments.
on:
workflow_dispatch:
schedule: daily on weekdays
permissions:
contents: read
if: github.repository == 'TryGhost/Ghost'
tools:
cache-memory: true
mcp-servers:
linear:
command: "npx"
args: ["-y", "mcp-remote", "https://mcp.linear.app/mcp", "--header", "Authorization:Bearer ${{ secrets.LINEAR_API_KEY }}"]
env:
LINEAR_API_KEY: ${{ secrets.LINEAR_API_KEY }}
network:
allowed:
- defaults
- node
- mcp.linear.app
safe-outputs:
create-issue:
noop:
---
# Linear Issue Triage Agent
You are an AI agent that triages new Linear issues for the **Berlin Bureau (BER)** team. Your goal is to reduce the time a human needs to complete triage by pre-classifying issues, assigning priority, tagging product areas, and recommending code investigations where appropriate.
**You do not move issues out of Triage** — a human still makes the final call on status transitions.
## Your Task
1. Use the Linear MCP tools to find the BER team and list all issues currently in the **Triage** state
2. Check your cache-memory to see which issues you have already triaged — skip those
3. For each untriaged issue, apply the triage rubric below to:
- Classify the issue type
- Assign priority (both a priority label and Linear's built-in priority field)
- Tag the product area
- Post a triage comment explaining your reasoning
4. Update your cache-memory with the newly triaged issue IDs
5. After processing, call the `noop` safe output with a summary of what you did — e.g. "Triaged 1 issue: BER-3367 (Bug, P3)" or "No new BER issues in Triage state" if there was nothing to triage
## Linear MCP Tools
You have access to the official Linear MCP server. Use its tools to:
- **Find issues**: Search for BER team issues in Triage state
- **Read issue details**: Get title, description, labels, priority, and comments
- **Update issues**: Add labels and set priority
- **Create comments**: Post triage reasoning comments
Start by listing available tools to discover the exact tool names and parameters.
**Important:** When updating labels, preserve existing labels. Fetch the issue's current labels first, then include both old and new label IDs in the update.
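The label-preservation rule above can be sketched as a tiny merge helper (illustrative only; `mergeLabelIds` is a hypothetical name, not a Linear MCP tool):

```javascript
// Illustrative sketch of the label-preservation rule above.
// mergeLabelIds is a hypothetical helper, not a Linear MCP tool.
function mergeLabelIds(existingLabelIds, newLabelIds) {
    // Include every existing label ID plus the new ones, de-duplicated,
    // so an update never drops labels already on the issue
    return [...new Set([...existingLabelIds, ...newLabelIds])];
}
```

Pass the merged list, never just the new IDs, to whatever update tool the MCP server exposes.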
## Cache-Memory Format
Store and read a JSON file at the **exact path** `cache-memory/triage-cache.json`. Always use this filename — never rename it or create alternative files.
```json
{
"triaged_issue_ids": ["BER-3150", "BER-3151"],
"last_run": "2025-01-15T10:00:00Z"
}
```
On each run:
1. Read `cache-memory/triage-cache.json` to get previously triaged issue identifiers
2. Skip any issues already in the list
3. After processing, write the updated list back to `cache-memory/triage-cache.json` (append newly triaged IDs)
## Triage Rubric
### Decision 1: Type Classification
Classify each issue based on its title, description, and linked context:
| Type | Signal words / patterns | Label to apply |
|------|------------------------|----------------|
| **Bug** | "broken", "doesn't work", "regression", "error", "crash", stack traces, Sentry links, "unexpected behaviour" | `🐛 Bug` (`e51776f7-038e-474b-86ec-66981c9abb4f`) |
| **Security** | "vulnerability", "exploit", "bypass", "SSRF", "XSS", "injection", "authentication bypass", "2FA", CVE references | `🔒 Security` (`28c5afc1-8063-4e62-af11-e42d94591957`) — also apply Bug if applicable |
| **Feature** | "add support for", "it would be nice", "can we", "new feature", Featurebase links | `✨ Feature` (`db8672e2-1053-4bc7-9aab-9d38c5b01560`) |
| **Improvement** | "improve", "enhance", "optimise", "refactor", "clean up", "polish" | `🎨 Improvement` (`b36579e6-62e1-4f55-987d-ee1e5c0cde1a`) |
| **Performance** | "slow", "latency", "timeout", "memory", "CPU", "performance", load time complaints | `⚡️ Performance` (`9066d0ea-6326-4b22-b6f5-82fe7ce2c1d1`) |
| **Maintenance** | "upgrade dependency", "tech debt", "remove deprecated", "migrate" | `🛠️ Maintenance` (`0ca27922-3646-4ab7-bf03-e67230c0c39e`) |
| **Documentation** | "docs", "README", "guide", "tutorial", missing documentation | `📝 Documentation` (`25f8988a-5925-44cd-b0df-c0229463925f`) |
If an issue matches multiple types (e.g. a security bug), apply all relevant labels.
### Decision 2: Priority Assignment
Assign priority to all issue types. Set both the Linear priority field and the corresponding priority label.
**For bugs and security issues**, use these criteria:
#### P1 — Urgent (Linear priority: 1, Label: `📊 Priority → P1 - Urgent` `11de115f-3e40-46c6-bf42-2aa2b9195cbd`)
- Security vulnerability with a clear exploit path
- Data loss or corruption (MySQL, disk) — actual or imminent (exception: small lexical data issues can be P2)
- Multiple customers' businesses immediately affected (broken payment collection, broken emails, broken member login)
#### P2 — High (Linear priority: 2, Label: `📊 Priority → P2 - High` `aeda47fa-9db9-4f4d-a446-3cccf92c8d12`)
- Triggering monitoring alerts that wake on-call engineers (if recurring, bump to P1)
- Security vulnerability without a clear exploit
- Regression that breaks currently working core functionality
- Crashes the server or browser
- Significantly disrupts customers' members/end-users (e.g. incorrect pricing or access)
- Bugs with members, subscriptions, or newsletters without immediate business impact
#### P3 — Medium (Linear priority: 3, Label: `📊 Priority → P3 - Medium` `10ec8b7b-725f-453f-b5d2-ff160d3b3c1e`)
- Bugs with members, subscriptions, or newsletters affecting only a few customers
- Bugs in recently released features that significantly affect usability
- Issues with setup/upgrade flows
- Broken features (dashboards, line charts, analytics, etc.)
- Correctness issues (e.g. timezones)
#### P4 — Low (Linear priority: 4, Label: `📊 Priority → P4 - Low` `411a21ea-c8c0-4cb1-9736-7417383620ff`)
- Not quite working as expected, but little overall impact
- Not related to payments, email, or security
- Significantly more complex to fix than the value of fixing
- Purely cosmetic
- Has a clear and straightforward workaround
**For non-bug issues** (features, improvements, performance, maintenance, documentation), assign a **provisional priority** based on estimated impact and urgency. Clearly mark it as provisional in the triage comment.
#### Bump Modifiers
**Bump UP one level if:**
- It causes regular alerts for on-call engineers
- It affects lots of users or VIP customers
- It prevents users from carrying out a critical use case or workflow
- It prevents rolling back to a previous release
**Bump DOWN one level if:**
- Reported by a single, non-VIP user
- Only impacts an edge case or obscure use case
Note in your comment if a bump modifier was applied and why.
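The bump arithmetic amounts to a simple clamp (a sketch only; deciding whether a modifier applies remains your judgement call):

```javascript
// Illustrative bump-modifier arithmetic for the rubric above.
// Linear priority: 1 = Urgent ... 4 = Low, so "bump up" means subtract.
function applyBump(priority, bumpUp, bumpDown) {
    let p = priority;
    if (bumpUp) p -= 1;   // more urgent
    if (bumpDown) p += 1; // less urgent
    return Math.min(4, Math.max(1, p)); // clamp to the P1..P4 range
}
```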
### Decision 3: Product Area Tagging
Apply the most relevant `Product Area →` label:
| Label | Covers |
|-------|--------|
| `Product Area → Editor` | Post/page editor, Koenig, Lexical, content blocks |
| `Product Area → Dashboard` | Admin dashboard, stats, overview |
| `Product Area → Analytics` | Analytics, charts, reporting |
| `Product Area → Memberships` | Member management, segmentation, member data |
| `Product Area → Portal` | Member-facing portal, signup/login flows |
| `Product Area → Newsletters` | Email newsletters, sending, email design |
| `Product Area → Admin` | General admin UI, settings, navigation |
| `Product Area → Settings area` | Settings screens specifically |
| `Product Area → Billing App` | Billing, subscription management |
| `Product Area → Themes` | Theme system, Handlebars, theme marketplace |
| `Product Area → Publishing` | Post publishing, scheduling, distribution |
| `Product Area → Growth` | Growth features, recommendations |
| `Product Area → Comments` | Comment system |
| `Product Area → Imports / Exports` | Data import/export |
| `Product Area → Welcome emails / Automations` | Automated emails, welcome sequences |
| `Product Area → Social Web` | ActivityPub, federation |
| `Product Area → i18n` | Internationalisation, translations |
| `Product Area → Sodo Search` | Search functionality |
| `Product Area → Admin-X Offers` | Offers system in Admin-X |
If the issue spans multiple areas, apply all relevant labels. If no product area is clearly identifiable, don't force a label — note this in the comment.
**Important:** Use the Linear MCP tools to look up product area label IDs before applying them.
### Decision 4: Triage Comment
Post a comment on the issue with your reasoning. Use this format:
```
🤖 **Automated Triage**
**Type:** Bug (Security)
**Priority:** P2 — High
**Product Area:** Memberships
**Bump modifiers applied:** UP — affects multiple customers
**Reasoning:**
This appears to be a security vulnerability in the session handling that could allow
2FA bypass. While no clear exploit path has been reported, the potential for
authentication bypass affecting all staff accounts warrants P2. Bumped up from P3
because it affects all customers with 2FA enabled.
**Recommended action:** Code investigation recommended — this is a security bug
that needs code-level analysis.
```
For non-bug issues, mark priority as provisional:
```
🤖 **Automated Triage**
**Type:** Improvement
**Priority:** P3 — Medium *(provisional)*
**Product Area:** Admin
**Bump modifiers applied:** None
**Reasoning:**
This is a refactoring task to share logic between two related functions. No user-facing
impact, but reduces maintenance burden for the retention offers codebase. Provisional
P3 based on moderate codebase impact and alignment with active project work.
**Recommended action:** Code investigation recommended — small refactoring task with
clear scope, no design input needed.
```
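The two comment templates above share a fixed field structure. As a minimal sketch (a hypothetical helper, not part of the actual agent tooling), the body could be assembled like this:

```python
def render_triage_comment(issue_type, priority, area, modifiers,
                          reasoning, action, provisional=False):
    """Assemble the standard triage comment body from its fields."""
    prio = f"{priority} *(provisional)*" if provisional else priority
    mods = modifiers or "None"  # non-bug issues often have no bump modifiers
    return "\n".join([
        "🤖 **Automated Triage**",
        f"**Type:** {issue_type}",
        f"**Priority:** {prio}",
        f"**Product Area:** {area}",
        f"**Bump modifiers applied:** {mods}",
        "**Reasoning:**",
        reasoning,
        f"**Recommended action:** {action}",
    ])
```

Centralising the format this way keeps every triage comment machine-parseable for later audits, though the agent may equally well emit the template directly.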
### Decision 5: Code Investigation Recommendation
Flag an issue for code investigation in your comment if **all** of these are true:
1. Classified as a bug, security issue, performance issue, or small improvement/maintenance task
2. Does not require design input (no UI mockups needed, no UX decisions)
3. Has enough description to investigate (not just a title with no context)
Do **not** recommend investigation for:
- Feature requests (need product/design input)
- Issues with vague descriptions and no reproduction steps — instead note "Needs more info" in the comment
- Issues that are clearly large architectural changes
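The Decision 5 rules reduce to a simple predicate: every inclusion criterion must hold and no exclusion may apply. A sketch (the flag names are illustrative, not part of the agent's schema):

```python
def should_recommend_investigation(issue_type, needs_design,
                                   has_description, is_large_architecture):
    """Apply the Decision 5 rules for flagging code investigation."""
    investigable_types = {"bug", "security", "performance",
                          "improvement", "maintenance"}
    if issue_type not in investigable_types:
        return False  # e.g. feature requests need product/design input first
    if needs_design:
        return False  # UI mockups or UX decisions required
    if not has_description:
        return False  # note "Needs more info" in the comment instead
    if is_large_architecture:
        return False  # too large to scope from a triage pass
    return True
```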
## Guidelines
- Process issues one at a time, applying all decisions before moving to the next
- Be concise but include enough reasoning that a human can quickly validate or override
- When in doubt about classification, pick the closest match and note your uncertainty
- If an issue already has triage labels or a triage comment from a previous run, skip it
- Never move issues out of the Triage state
- After processing all issues, update cache-memory with the full list of triaged identifiers
+57
@@ -0,0 +1,57 @@
name: Migration Review
on:
pull_request_target:
types: [opened]
paths:
- 'ghost/core/core/server/data/schema/**'
- 'ghost/core/core/server/data/migrations/versions/**'
jobs:
createComment:
runs-on: ubuntu-latest
if: github.repository_owner == 'TryGhost'
name: Add migration review requirements
steps:
- uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
await github.rest.issues.addLabels({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
labels: ["migration"]
})
- uses: peter-evans/create-or-update-comment@57232238742e38b2ccc27136ce596ccae7ca28b4
with:
issue-number: ${{ github.event.pull_request.number }}
body: |
It looks like this PR contains a migration 👀
Here's the checklist for reviewing migrations:
### General requirements
- [ ] :warning: Tested performance on staging database servers, as performance on local machines is not comparable to a production environment
- [ ] Satisfies idempotency requirement (both `up()` and `down()`)
- [ ] Does not reference models
- [ ] Filename is in the correct format (and correctly ordered)
- [ ] Targets the next minor version
- [ ] All code paths have appropriate log messages
- [ ] Uses the correct utils
- [ ] Contains a minimal changeset
- [ ] Does not mix DDL/DML operations
- [ ] Tested in MySQL and SQLite
### Schema changes
- [ ] Both schema change and related migration have been implemented
- [ ] For index changes: has been performance tested for large tables
- [ ] For new tables/columns: fields use the appropriate predefined field lengths
- [ ] For new tables/columns: field names follow the appropriate conventions
- [ ] Does not drop a non-alpha table outside of a major version
### Data changes
- [ ] Mass updates/inserts are batched appropriately
- [ ] Does not loop over large tables/datasets
- [ ] Defends against missing or invalid data
- [ ] For settings updates: follows the appropriate guidelines
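The "batched appropriately" requirement above can be illustrated with a small sketch (illustrative only — Ghost migrations are JavaScript and use the project's own migration utils). The idea is to touch rows in fixed-size chunks rather than issuing one statement over the whole table:

```python
def chunk(ids, size=1000):
    """Yield successive fixed-size batches of primary keys."""
    for i in range(0, len(ids), size):
        yield ids[i:i + size]

# A mass update would then run one statement per batch, e.g.:
# for batch in chunk(all_ids):
#     db.execute("UPDATE ... WHERE id IN (...)", batch)
```

Batching bounds lock time and transaction size on large tables, which is why unbatched mass updates fail migration review.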
+137
@@ -0,0 +1,137 @@
name: PR Preview
on:
pull_request_target:
types: [labeled, unlabeled, closed]
jobs:
deploy:
name: Deploy Preview
# Runs when the "preview" label is added — requires collaborator write access
if: >-
github.event.action == 'labeled'
&& github.event.label.name == 'preview'
runs-on: ubuntu-latest
permissions:
contents: read
actions: read
env:
HEAD_SHA: ${{ github.event.pull_request.head.sha }}
steps:
- name: Wait for Docker build job
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
BUILD_JOB_NAME: Build & Publish Artifacts
run: |
echo "Waiting for '${BUILD_JOB_NAME}' job to complete for $HEAD_SHA..."
TIMEOUT=1800 # 30 minutes
INTERVAL=30
START=$(date +%s)
while true; do
ELAPSED=$(( $(date +%s) - START ))
if [ "$ELAPSED" -ge "$TIMEOUT" ]; then
echo "::error::Timed out waiting for '${BUILD_JOB_NAME}' (${TIMEOUT}s)"
exit 1
fi
# Find the CI run for this SHA
RUN=$(gh api "repos/${{ github.repository }}/actions/workflows/ci.yml/runs?head_sha=${HEAD_SHA}&per_page=1" \
--jq '.workflow_runs[0] | {id, status}' 2>/dev/null || echo "")
if [ -z "$RUN" ] || [ "$RUN" = "null" ]; then
echo " No CI run found yet, waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
sleep "$INTERVAL"
continue
fi
RUN_ID=$(echo "$RUN" | jq -r '.id')
RUN_STATUS=$(echo "$RUN" | jq -r '.status')
# Look up the build job specifically (paginate — CI has 30+ jobs)
BUILD_JOB=$(gh api --paginate "repos/${{ github.repository }}/actions/runs/${RUN_ID}/jobs?per_page=100" \
--jq ".jobs[] | select(.name == \"${BUILD_JOB_NAME}\") | {status, conclusion}")
if [ -z "$BUILD_JOB" ]; then
if [ "$RUN_STATUS" = "completed" ]; then
echo "::error::CI run ${RUN_ID} completed but '${BUILD_JOB_NAME}' job was not found"
exit 1
fi
echo " '${BUILD_JOB_NAME}' job not started yet (run ${RUN_STATUS}), waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
sleep "$INTERVAL"
continue
fi
JOB_STATUS=$(echo "$BUILD_JOB" | jq -r '.status')
JOB_CONCLUSION=$(echo "$BUILD_JOB" | jq -r '.conclusion // empty')
if [ "$JOB_STATUS" = "completed" ]; then
if [ "$JOB_CONCLUSION" = "success" ]; then
echo "Docker build ready (CI run $RUN_ID)"
break
fi
echo "::error::'${BUILD_JOB_NAME}' did not succeed (conclusion: $JOB_CONCLUSION)"
exit 1
fi
echo " '${BUILD_JOB_NAME}' still ${JOB_STATUS}, waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
sleep "$INTERVAL"
done
- name: Re-check PR eligibility
id: recheck
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
PR=$(gh api "repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}" \
--jq '{state, labels: [.labels[].name]}')
STATE=$(echo "$PR" | jq -r '.state')
HAS_LABEL=$(echo "$PR" | jq '.labels | any(. == "preview")')
if [ "$STATE" != "open" ]; then
echo "::warning::PR is no longer open ($STATE), skipping dispatch"
echo "skip=true" >> "$GITHUB_OUTPUT"
elif [ "$HAS_LABEL" != "true" ]; then
echo "::warning::preview label was removed, skipping dispatch"
echo "skip=true" >> "$GITHUB_OUTPUT"
else
echo "PR still eligible for preview deploy"
echo "skip=false" >> "$GITHUB_OUTPUT"
fi
- name: Dispatch deploy to Ghost-Moya
if: steps.recheck.outputs.skip != 'true'
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4
with:
token: ${{ secrets.CANARY_DOCKER_BUILD }}
repository: TryGhost/Ghost-Moya
event-type: preview-deploy
client-payload: >-
{
"pr_number": "${{ github.event.pull_request.number }}",
"action": "deploy",
"seed": "true"
}
destroy:
name: Destroy Preview
# Runs when "preview" label is removed, or the PR is closed/merged while labeled
if: >-
(github.event.action == 'unlabeled' && github.event.label.name == 'preview')
|| (github.event.action == 'closed' && contains(github.event.pull_request.labels.*.name, 'preview'))
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Dispatch destroy to Ghost-Moya
uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4
with:
token: ${{ secrets.CANARY_DOCKER_BUILD }}
repository: TryGhost/Ghost-Moya
event-type: preview-destroy
client-payload: >-
{
"pr_number": "${{ github.event.pull_request.number }}",
"action": "destroy"
}
+46
@@ -0,0 +1,46 @@
name: Publish tb-cli Image
on:
workflow_dispatch: # Manual trigger from GitHub UI or CLI
push:
branches: [main]
paths:
- 'docker/tb-cli/**'
permissions:
contents: read
packages: write
jobs:
publish:
name: Build and push tb-cli to GHCR
runs-on: ubuntu-latest
if: github.repository == 'TryGhost/Ghost' && github.ref == 'refs/heads/main'
concurrency:
group: publish-tb-cli
cancel-in-progress: true
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4
- name: Login to GHCR
uses: docker/login-action@4907a6ddec9925e35a0a9e82d7399ccc52663121 # v4
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push
uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7
with:
context: .
file: docker/tb-cli/Dockerfile
push: true
tags: |
ghcr.io/tryghost/tb-cli:latest
ghcr.io/tryghost/tb-cli:${{ github.sha }}
cache-from: type=gha
cache-to: type=gha,mode=max
+112
@@ -0,0 +1,112 @@
name: Release
run-name: "Release — ${{ inputs.bump-type || 'auto' }} from ${{ inputs.branch || 'main' }}${{ inputs.dry-run && ' (dry run)' || '' }}"
on:
schedule:
- cron: '0 15 * * 5' # Friday 3pm UTC
workflow_dispatch:
inputs:
branch:
description: 'Git branch to release from'
type: string
default: 'main'
required: false
bump-type:
description: 'Version bump type (auto, patch, minor)'
type: string
required: false
default: 'auto'
skip-checks:
description: 'Skip CI status check verification'
type: boolean
default: false
dry-run:
description: 'Dry run (version bump without push)'
type: boolean
default: false
env:
FORCE_COLOR: 1
NODE_VERSION: 22.18.0
concurrency:
group: ${{ github.workflow }}
cancel-in-progress: false
jobs:
release:
runs-on: ubuntu-latest
name: Prepare & Push Release
steps:
- uses: webfactory/ssh-agent@e83874834305fe9a4a2997156cb26c5de65a8555 # v0.10.0
with:
ssh-private-key: ${{ secrets.DEPLOY_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
# Deploy key (via ssh-agent) is used for git push — it bypasses
# branch protection and triggers downstream workflows (unlike GITHUB_TOKEN)
ref: ${{ inputs.branch || 'main' }}
fetch-depth: 0
ssh-key: ${{ secrets.DEPLOY_KEY }}
# Fetch submodules separately via HTTPS — the deploy key is scoped to
# Ghost only and can't authenticate against Casper/Source over SSH
- run: git submodule update --init
- uses: pnpm/action-setup@b906affcce14559ad1aafd4ab0e942779e9f58b1 # v4
- uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
env:
FORCE_COLOR: 0
with:
node-version: ${{ env.NODE_VERSION }}
cache: pnpm
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Set up Git
run: |
git config user.name "Ghost CI"
git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
- name: Set up schedule defaults
if: github.event_name == 'schedule'
run: |
echo "RELEASE_BRANCH=main" >> "$GITHUB_ENV"
echo "RELEASE_BUMP_TYPE=auto" >> "$GITHUB_ENV"
echo "RELEASE_DRY_RUN=" >> "$GITHUB_ENV"
echo "RELEASE_SKIP_CHECKS=" >> "$GITHUB_ENV"
- name: Set up workflow_dispatch inputs
if: github.event_name == 'workflow_dispatch'
run: |
echo "RELEASE_BRANCH=${INPUT_BRANCH}" >> "$GITHUB_ENV"
echo "RELEASE_BUMP_TYPE=${INPUT_BUMP_TYPE}" >> "$GITHUB_ENV"
echo "RELEASE_DRY_RUN=${INPUT_DRY_RUN}" >> "$GITHUB_ENV"
echo "RELEASE_SKIP_CHECKS=${INPUT_SKIP_CHECKS}" >> "$GITHUB_ENV"
env:
INPUT_BRANCH: ${{ inputs.branch }}
INPUT_BUMP_TYPE: ${{ inputs.bump-type }}
INPUT_DRY_RUN: ${{ inputs.dry-run }}
INPUT_SKIP_CHECKS: ${{ inputs.skip-checks }}
- name: Run release script
run: |
ARGS="--branch=${{ env.RELEASE_BRANCH }} --bump-type=${{ env.RELEASE_BUMP_TYPE }}"
if [ "${{ env.RELEASE_DRY_RUN }}" = "true" ]; then
ARGS="$ARGS --dry-run"
fi
if [ "${{ env.RELEASE_SKIP_CHECKS }}" = "true" ]; then
ARGS="$ARGS --skip-checks"
fi
node scripts/release.js $ARGS
env:
GITHUB_TOKEN: ${{ secrets.CANARY_DOCKER_BUILD }} # PAT for GitHub API (check polling)
- name: Notify on failure
if: failure()
uses: tryghost/actions/actions/slack-build@20b5ae5f266e86f7b5f0815d92731d6388b8ce46 # main
with:
status: ${{ job.status }}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
+26
@@ -0,0 +1,26 @@
name: 'Close stale i18n PRs'
on:
workflow_dispatch:
schedule:
- cron: '0 6 * * *'
jobs:
stale:
if: github.repository_owner == 'TryGhost'
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10
with:
stale-pr-message: |
Thanks for contributing to Ghost's i18n :)
This PR has been automatically marked as stale because there has not been any activity here in 3 weeks.
I18n PRs tend to get out of date quickly, so we're closing them to keep the PR list clean.
If you're still interested in working on this PR, please let us know. Otherwise this PR will be closed shortly, but can always be reopened later. Thank you for understanding 🙂
only-labels: 'affects:i18n'
days-before-pr-stale: 21
days-before-pr-close: 7
exempt-pr-labels: 'feature,pinned,needs:triage'
stale-pr-label: 'stale'
close-pr-message: |
This PR has been automatically closed due to inactivity. If you'd like to continue working on it, feel free to open a new PR.
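The timing above means an inactive i18n PR is labeled stale after 21 days and closed 7 days later, i.e. 28 days of total inactivity. A quick sketch of the timeline (dates are illustrative):

```python
from datetime import date, timedelta

DAYS_BEFORE_STALE = 21  # days-before-pr-stale
DAYS_BEFORE_CLOSE = 7   # days-before-pr-close

def stale_timeline(last_activity):
    """Return the dates the stale label lands and the PR is closed."""
    stale_on = last_activity + timedelta(days=DAYS_BEFORE_STALE)
    closed_on = stale_on + timedelta(days=DAYS_BEFORE_CLOSE)
    return stale_on, closed_on
```

Any activity (a comment, a push) before either date resets the clock, and the exempt labels opt a PR out entirely.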
+29
@@ -0,0 +1,29 @@
name: 'Close stale issues and PRs'
on:
workflow_dispatch:
schedule:
- cron: '0 6 * * *'
jobs:
stale:
if: github.repository_owner == 'TryGhost'
runs-on: ubuntu-latest
steps:
- uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10
with:
stale-issue-message: |
Our bot has automatically marked this issue as stale because there has not been any activity here in some time.
The issue will be closed soon if there are no further updates, however we ask that you do not post comments to keep the issue open if you are not actively working on a PR.
We keep the issue list minimal so we can keep focus on the most pressing issues. Closed issues can always be reopened if a new contributor is found. Thank you for understanding 🙂
stale-pr-message: |
Our bot has automatically marked this PR as stale because there has not been any activity here in some time.
If we've missed reviewing your PR & you're still interested in working on it, please let us know. Otherwise this PR will be closed shortly, but can always be reopened later. Thank you for understanding 🙂
exempt-issue-labels: 'feature,pinned,needs:triage'
exempt-pr-labels: 'feature,pinned,needs:triage'
days-before-stale: 113
days-before-pr-stale: 358
stale-issue-label: 'stale'
stale-pr-label: 'stale'
close-issue-reason: 'not_planned'