---
name: Add Admin API Endpoint
description: Add a new endpoint or endpoints to Ghost's Admin API at `ghost/api/admin/**`.
---

# Create Admin API Endpoint

## Instructions

1. If creating an endpoint for an entirely new resource, create a new endpoint file in `ghost/core/core/server/api/endpoints/`. Otherwise, locate the existing endpoint file in the same directory.
2. The endpoint file should create a controller object using the `Controller` JSDoc type from `@tryghost/api-framework`, including at minimum a `docName` and a single endpoint definition, e.g. `browse`.
3. Add routes for each endpoint to `ghost/core/core/server/web/api/endpoints/admin/routes.js`.
4. Add basic `e2e-api` tests for the endpoint in `ghost/core/test/e2e-api/admin` to ensure the new endpoints function as expected.
5. Run the tests and iterate until they pass: `cd ghost/core && pnpm test:single test/e2e-api/admin/{test-file-name}`.

## Reference

For a detailed reference on Ghost's API framework and how to create API controllers, see [reference.md](reference.md).

# API Controller Permissions Guide

This guide explains how to configure permissions in api-framework controllers, covering all available patterns and best practices.

## Table of Contents

- [Overview](#overview)
- [Permission Patterns](#permission-patterns)
  - [Boolean `true` - Default Permission Check](#pattern-1-boolean-true---default-permission-check)
  - [Boolean `false` - Skip Permissions](#pattern-2-boolean-false---skip-permissions)
  - [Function - Custom Permission Logic](#pattern-3-function---custom-permission-logic)
  - [Configuration Object - Default with Hooks](#pattern-4-configuration-object---default-with-hooks)
- [The Frame Object](#the-frame-object)
- [Configuration Object Properties](#configuration-object-properties)
- [Complete Examples](#complete-examples)
- [Best Practices](#best-practices)
- [Error Types](#error-types)
- [Adding Permissions via Migrations](#adding-permissions-via-migrations)

---

## Overview

The api-framework uses a **pipeline-based permission system** where permissions are handled as one of five request processing stages:

1. Input validation
2. Input serialization
3. **Permissions** ← You are here
4. Query (controller execution)
5. Output serialization

**Important**: Every controller method **MUST** explicitly define the `permissions` property. This is a security requirement that prevents accidental security holes and makes permission handling explicit.

```javascript
// This will throw an IncorrectUsageError
edit: {
    query(frame) {
        return models.Post.edit(frame.data, frame.options);
    }
    // Missing permissions property!
}
```

---

## Permission Patterns

### Pattern 1: Boolean `true` - Default Permission Check

The most common pattern; it delegates to the default permission handler.

```javascript
edit: {
    headers: {
        cacheInvalidate: true
    },
    options: ['include'],
    validation: {
        options: {
            include: {
                required: true,
                values: ['tags']
            }
        }
    },
    permissions: true,
    query(frame) {
        return models.Post.edit(frame.data, frame.options);
    }
}
```

**When to use:**
- Standard CRUD operations
- When the default permission handler meets your needs
- Most common case for authenticated endpoints

#### How the Default Permission Handler Works

When you set `permissions: true`, the framework delegates to the default permission handler at `ghost/core/core/server/api/endpoints/utils/permissions.js`. Here's what happens:

1. **Singular Name Derivation**: The handler converts the `docName` to singular form:
   - `posts` → `post`
   - `automated_emails` → `automated_email`
   - `categories` → `category` (handles `ies` → `y`)

2. **Permission Check**: It calls the permissions service:

   ```javascript
   permissions.canThis(frame.options.context)[method][singular](identifier, unsafeAttrs)
   ```

   For example, with `docName: 'posts'` and method `edit`:

   ```javascript
   permissions.canThis(context).edit.post(postId, unsafeAttrs)
   ```

3. **Database Lookup**: The permissions service checks the `permissions` and `permissions_roles` tables:
   - Looks for a permission with `action_type` matching the method (e.g., `edit`)
   - And `object_type` matching the singular docName (e.g., `post`)
   - Verifies the user's role has that permission assigned
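
The singular-name derivation above can be sketched as follows (an approximation for illustration; the actual logic lives in `permissions.js`):

```javascript
// Approximate docName → singular mapping as described above.
function singularize(docName) {
    if (docName.endsWith('ies')) {
        return docName.slice(0, -3) + 'y'; // categories → category
    }
    if (docName.endsWith('s')) {
        return docName.slice(0, -1); // posts → post
    }
    return docName;
}

console.log(singularize('automated_emails')); // automated_email
```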

#### Required Database Setup

For the default handler to work, you must have:

1. **Permission records** in the `permissions` table:

   ```sql
   INSERT INTO permissions (name, action_type, object_type) VALUES
       ('Browse posts', 'browse', 'post'),
       ('Read posts', 'read', 'post'),
       ('Edit posts', 'edit', 'post'),
       ('Add posts', 'add', 'post'),
       ('Delete posts', 'destroy', 'post');
   ```

2. **Role-permission mappings** in `permissions_roles` linking permissions to roles like Administrator, Editor, etc.

These are typically added via:
- Initial fixtures in `ghost/core/core/server/data/schema/fixtures/fixtures.json`
- Database migrations using `addPermissionWithRoles()` from `ghost/core/core/server/data/migrations/utils/permissions.js`

---

### Pattern 2: Boolean `false` - Skip Permissions

Completely bypasses the permissions stage.

```javascript
browse: {
    options: ['page', 'limit'],
    permissions: false,
    query(frame) {
        return models.PublicResource.findAll(frame.options);
    }
}
```

**When to use:**
- Public endpoints that don't require authentication
- Health check or status endpoints
- Resources that should be accessible to everyone

**Warning**: Use with caution. Only disable permissions when you're certain the endpoint should be publicly accessible.

---

### Pattern 3: Function - Custom Permission Logic

Allows complete control over permission validation.

```javascript
delete: {
    options: ['id'],
    permissions: async function (frame) {
        const {UnauthorizedError, NoPermissionError} = require('@tryghost/errors');

        // Ensure user is authenticated
        if (!frame.user || !frame.user.id) {
            return Promise.reject(new UnauthorizedError({
                message: 'You must be logged in to perform this action'
            }));
        }

        // Only the owner or an admin can delete
        const resource = await models.Resource.findOne({id: frame.options.id});

        if (resource.get('author_id') !== frame.user.id && frame.user.role !== 'admin') {
            return Promise.reject(new NoPermissionError({
                message: 'You do not have permission to delete this resource'
            }));
        }

        return Promise.resolve();
    },
    query(frame) {
        return models.Resource.destroy(frame.options);
    }
}
```

**When to use:**
- Complex permission logic that varies by resource
- Owner-based permissions
- Role-based access control beyond the default handler
- When you need to query the database for permission decisions

---

### Pattern 4: Configuration Object - Default with Hooks

Combines default permission handling with configuration options and hooks.

```javascript
edit: {
    options: ['include'],
    permissions: {
        unsafeAttrs: ['author', 'status'],
        before: async function (frame) {
            // Load additional user data needed for permission checks
            frame.user.permissions = await loadUserPermissions(frame.user.id);
        }
    },
    query(frame) {
        return models.Post.edit(frame.data, frame.options);
    }
}
```

**When to use:**
- Default permission handler is sufficient but needs configuration
- You have attributes that require special permission handling
- You need to prepare data before permission checks run

---

## The Frame Object

Permission handlers receive a `frame` object containing complete request context:

```javascript
Frame {
    // Request data
    original: {},       // Original untransformed input
    options: {},        // Query/URL parameters
    data: {},           // Request body

    // User context
    user: {},           // Logged-in user object

    // File uploads
    file: {},           // Single uploaded file
    files: [],          // Multiple uploaded files

    // API context
    apiType: String,    // 'content' or 'admin'
    docName: String,    // Endpoint name (e.g., 'posts')
    method: String,     // Method name (e.g., 'browse', 'add', 'edit')

    // HTTP context (added by HTTP wrapper)
    context: {
        api_key: {},        // API key information
        user: userId,       // User ID or null
        integration: {},    // Integration details
        member: {}          // Member information or null
    }
}
```

---

## Configuration Object Properties

When using Pattern 4, these properties are available:

### `unsafeAttrs` (Array)

Specifies attributes that require special permission handling.

```javascript
permissions: {
    unsafeAttrs: ['author', 'visibility', 'status']
}
```

These attributes are passed to the permission handler for additional validation. Use this for fields that only certain users should be able to modify (e.g., only admins can change the author of a post).

### `before` (Function)

A hook that runs before the default permission handler.

```javascript
permissions: {
    before: async function (frame) {
        // Prepare data needed for permission checks
        const membership = await loadMembership(frame.user.id);
        frame.user.membershipLevel = membership.level;
    }
}
```

---

## Complete Examples

### Example 1: Public Browse Endpoint

```javascript
module.exports = {
    docName: 'articles',

    browse: {
        options: ['page', 'limit', 'filter'],
        validation: {
            options: {
                limit: {
                    values: [10, 25, 50, 100]
                }
            }
        },
        permissions: false,
        query(frame) {
            return models.Article.findPage(frame.options);
        }
    }
};
```

### Example 2: Authenticated CRUD Controller

```javascript
module.exports = {
    docName: 'posts',

    browse: {
        options: ['include', 'page', 'limit', 'filter', 'order'],
        permissions: true,
        query(frame) {
            return models.Post.findPage(frame.options);
        }
    },

    read: {
        options: ['include'],
        data: ['id', 'slug'],
        permissions: true,
        query(frame) {
            return models.Post.findOne(frame.data, frame.options);
        }
    },

    add: {
        headers: {
            cacheInvalidate: true
        },
        options: ['include'],
        permissions: {
            unsafeAttrs: ['author_id']
        },
        query(frame) {
            return models.Post.add(frame.data.posts[0], frame.options);
        }
    },

    edit: {
        headers: {
            cacheInvalidate: true
        },
        options: ['include', 'id'],
        permissions: {
            unsafeAttrs: ['author_id', 'status']
        },
        query(frame) {
            return models.Post.edit(frame.data.posts[0], frame.options);
        }
    },

    destroy: {
        headers: {
            cacheInvalidate: true
        },
        options: ['id'],
        permissions: true,
        statusCode: 204,
        query(frame) {
            return models.Post.destroy(frame.options);
        }
    }
};
```

### Example 3: Owner-Based Permissions

```javascript
module.exports = {
    docName: 'user_settings',

    read: {
        options: ['user_id'],
        permissions: async function (frame) {
            const {NoPermissionError} = require('@tryghost/errors');

            // Users can only read their own settings
            if (frame.options.user_id !== frame.user.id) {
                return Promise.reject(new NoPermissionError({
                    message: 'You can only view your own settings'
                }));
            }
            return Promise.resolve();
        },
        query(frame) {
            return models.UserSetting.findOne({user_id: frame.options.user_id});
        }
    },

    edit: {
        options: ['user_id'],
        permissions: async function (frame) {
            const {NoPermissionError} = require('@tryghost/errors');

            // Users can only edit their own settings
            if (frame.options.user_id !== frame.user.id) {
                return Promise.reject(new NoPermissionError({
                    message: 'You can only edit your own settings'
                }));
            }
            return Promise.resolve();
        },
        query(frame) {
            return models.UserSetting.edit(frame.data, frame.options);
        }
    }
};
```

### Example 4: Role-Based Access Control

```javascript
module.exports = {
    docName: 'admin_settings',

    browse: {
        permissions: async function (frame) {
            const {NoPermissionError} = require('@tryghost/errors');
            const allowedRoles = ['Owner', 'Administrator'];

            if (!frame.user || !allowedRoles.includes(frame.user.role)) {
                return Promise.reject(new NoPermissionError({
                    message: 'Only administrators can access these settings'
                }));
            }

            return Promise.resolve();
        },
        query(frame) {
            return models.AdminSetting.findAll();
        }
    },

    edit: {
        permissions: async function (frame) {
            const {NoPermissionError} = require('@tryghost/errors');

            // Only the owner can edit admin settings
            if (!frame.user || frame.user.role !== 'Owner') {
                return Promise.reject(new NoPermissionError({
                    message: 'Only the site owner can modify these settings'
                }));
            }

            return Promise.resolve();
        },
        query(frame) {
            return models.AdminSetting.edit(frame.data, frame.options);
        }
    }
};
```

### Example 5: Permission with Data Preparation

```javascript
module.exports = {
    docName: 'premium_content',

    read: {
        options: ['id'],
        permissions: {
            before: async function (frame) {
                // Load user's subscription status
                if (frame.user) {
                    const subscription = await models.Subscription.findOne({
                        user_id: frame.user.id
                    });
                    frame.user.subscription = subscription;
                }
            }
        },
        async query(frame) {
            const {NoPermissionError} = require('@tryghost/errors');

            // The query can now use frame.user.subscription
            const content = await models.Content.findOne({id: frame.options.id});

            if (content.get('premium') && !frame.user?.subscription?.active) {
                throw new NoPermissionError({
                    message: 'Premium subscription required'
                });
            }

            return content;
        }
    }
};
```

---

## Best Practices

### 1. Always Define Permissions Explicitly

```javascript
// Good - explicit about being public
permissions: false

// Good - explicit about requiring auth
permissions: true

// Bad - missing permissions (will throw an error)
// permissions: undefined
```

### 2. Use the Appropriate Pattern

| Scenario | Pattern |
|----------|---------|
| Public endpoint | `permissions: false` |
| Standard authenticated CRUD | `permissions: true` |
| Need unsafe attrs tracking | `permissions: { unsafeAttrs: [...] }` |
| Complex custom logic | `permissions: async function(frame) {...}` |
| Need pre-processing | `permissions: { before: async function(frame) {...} }` |

### 3. Keep Permission Logic Focused

Permission functions should only check permissions, not perform business logic:

```javascript
// Good - only checks permissions
permissions: async function (frame) {
    if (!frame.user || frame.user.role !== 'admin') {
        throw new NoPermissionError();
    }
}

// Bad - mixes permission check with business logic
permissions: async function (frame) {
    if (!frame.user) {
        throw new NoPermissionError();
    }

    // Don't do this in permissions!
    frame.data.processed = true;
    await sendNotification(frame.user);
}
```

### 4. Use Meaningful Error Messages

```javascript
permissions: async function (frame) {
    if (!frame.user) {
        throw new UnauthorizedError({
            message: 'Please log in to access this resource'
        });
    }

    if (frame.user.role !== 'admin') {
        throw new NoPermissionError({
            message: 'Administrator access required for this operation'
        });
    }
}
```

### 5. Validate Resource Ownership

When resources belong to specific users, always verify ownership:

```javascript
permissions: async function (frame) {
    const resource = await models.Resource.findOne({id: frame.options.id});

    if (!resource) {
        throw new NotFoundError({message: 'Resource not found'});
    }

    const isOwner = resource.get('user_id') === frame.user.id;
    const isAdmin = frame.user.role === 'admin';

    if (!isOwner && !isAdmin) {
        throw new NoPermissionError({
            message: 'You do not have permission to access this resource'
        });
    }
}
```

### 6. Use `unsafeAttrs` for Sensitive Fields

Mark fields that require elevated permissions:

```javascript
permissions: {
    unsafeAttrs: [
        'author_id',    // Only admins should change authorship
        'status',       // Publishing requires special permission
        'visibility',   // Changing visibility is restricted
        'featured'      // Only editors can feature content
    ]
}
```

---

## Error Types

Use appropriate error types from `@tryghost/errors`:

- **UnauthorizedError** - User is not authenticated
- **NoPermissionError** - User is authenticated but lacks permission
- **NotFoundError** - Resource doesn't exist (use carefully to avoid information leakage)
- **ValidationError** - Input validation failed

```javascript
const {
    UnauthorizedError,
    NoPermissionError,
    NotFoundError
} = require('@tryghost/errors');
```

---

## Adding Permissions via Migrations

When creating a new API endpoint that uses the default permission handler (`permissions: true`), you need to add permissions to the database. Ghost provides utilities to make this easy.

### Migration Utilities

Import the permission utilities from `ghost/core/core/server/data/migrations/utils`:

```javascript
const {combineTransactionalMigrations, addPermissionWithRoles} = require('../../utils');
```

### Example: Adding CRUD Permissions for a New Resource

```javascript
// ghost/core/core/server/data/migrations/versions/X.X/YYYY-MM-DD-HH-MM-SS-add-myresource-permissions.js

const {combineTransactionalMigrations, addPermissionWithRoles} = require('../../utils');

module.exports = combineTransactionalMigrations(
    addPermissionWithRoles({
        name: 'Browse my resources',
        action: 'browse',
        object: 'my_resource' // Singular form of docName
    }, [
        'Administrator',
        'Admin Integration'
    ]),
    addPermissionWithRoles({
        name: 'Read my resources',
        action: 'read',
        object: 'my_resource'
    }, [
        'Administrator',
        'Admin Integration'
    ]),
    addPermissionWithRoles({
        name: 'Edit my resources',
        action: 'edit',
        object: 'my_resource'
    }, [
        'Administrator',
        'Admin Integration'
    ]),
    addPermissionWithRoles({
        name: 'Add my resources',
        action: 'add',
        object: 'my_resource'
    }, [
        'Administrator',
        'Admin Integration'
    ]),
    addPermissionWithRoles({
        name: 'Delete my resources',
        action: 'destroy',
        object: 'my_resource'
    }, [
        'Administrator',
        'Admin Integration'
    ])
);
```

### Available Roles

Common roles you can assign permissions to:

- **Administrator** - Full admin access
- **Admin Integration** - API integrations with admin scope
- **Editor** - Can manage all content
- **Author** - Can manage own content
- **Contributor** - Can create drafts only
- **Owner** - Site owner (inherits all Administrator permissions)

### Permission Naming Conventions

- **name**: Human-readable, e.g., `'Browse automated emails'`
- **action**: The API method - `browse`, `read`, `edit`, `add`, `destroy`
- **object**: Singular form of `docName` - `automated_email` (not `automated_emails`)

### Restricting to Administrators Only

To make an endpoint accessible only to administrators (not editors, authors, etc.), only assign permissions to:

- `Administrator`
- `Admin Integration`

```javascript
addPermissionWithRoles({
    name: 'Browse sensitive data',
    action: 'browse',
    object: 'sensitive_data'
}, [
    'Administrator',
    'Admin Integration'
])
```

# Ghost API Framework Reference

## Overview

The API framework is a pipeline-based system that processes HTTP requests through a series of stages before executing the controller logic. It provides consistent validation, serialization, and permission handling across all API endpoints.

## Request Flow

Each request goes through these stages in order:

1. **Input Validation** - Validates query params, URL params, and request body
2. **Input Serialization** - Transforms incoming data (e.g., maps `include` to `withRelated`)
3. **Permissions** - Checks if the user/API key has access to the resource
4. **Query** - Executes the actual business logic (your controller code)
5. **Output Serialization** - Formats the response for the client
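
The five stages above can be sketched as a single async pipeline over a shared frame. This is a deliberately simplified illustration; hook names like `serializeInput` are hypothetical stand-ins, not Ghost's internals:

```javascript
// Simplified pipeline sketch: each stage may mutate the shared frame by
// reference before the controller's query() produces the result.
async function runPipeline(method, frame) {
    if (typeof method.validation === 'function') {
        await method.validation(frame);         // 1. input validation
    }
    if (typeof method.serializeInput === 'function') {
        await method.serializeInput(frame);     // 2. input serialization (hypothetical hook)
    }
    if (typeof method.permissions === 'function') {
        await method.permissions(frame);        // 3. permissions
    }
    const result = await method.query(frame);   // 4. query
    frame.response = result;                    // 5. output serialization (elided here)
    return frame.response;
}

// Usage with a toy controller method:
const browse = {
    permissions: async (frame) => {
        if (!frame.options.context.user) {
            throw new Error('Not authorised');
        }
    },
    query: async frame => ({items: [], limit: frame.options.limit})
};

runPipeline(browse, {options: {limit: 5, context: {user: '1'}}})
    .then(res => console.log(res.limit)); // 5
```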

## The Frame Object

The `Frame` class holds all request information and is passed through each stage. Each stage can modify it by reference.

### Frame Structure

```javascript
{
    original: Object,   // Original input (for debugging)
    options: Object,    // Query params, URL params, context, custom options
    data: Object,       // Request body, or query/URL params if configured via `data`
    user: Object,       // Logged-in user object
    file: Object,       // Single uploaded file
    files: Array,       // Multiple uploaded files
    apiType: String,    // 'content' or 'admin'
    docName: String,    // Endpoint name (e.g., 'posts')
    method: String,     // Method name (e.g., 'browse', 'read', 'add', 'edit')
    response: Object    // Set by output serialization
}
```

### Frame Example

```javascript
{
    original: {
        include: 'tags,authors'
    },
    options: {
        withRelated: ['tags', 'authors'],
        context: {user: '123'}
    },
    data: {
        posts: [{title: 'My Post'}]
    }
}
```

## API Controller Structure

Controllers are objects with a `docName` property and method configurations.

### Basic Structure

```javascript
module.exports = {
    docName: 'posts', // Required: endpoint name

    browse: {
        headers: {},
        options: [],
        data: [],
        validation: {},
        permissions: true,
        query(frame) {}
    },

    read: { /* ... */ },
    add: { /* ... */ },
    edit: { /* ... */ },
    destroy: { /* ... */ }
};
```

## Controller Method Properties

### `headers` (Object)

Configure HTTP response headers.

```javascript
headers: {
    // Invalidate cache after mutation
    cacheInvalidate: true,
    // Or with a specific path
    cacheInvalidate: {value: '/posts/*'},

    // File disposition for downloads
    disposition: {
        type: 'csv',        // 'csv', 'json', 'yaml', or 'file'
        value: 'export.csv' // Can also be a function
    },

    // Location header (auto-generated for 'add' methods)
    location: false // Disable auto-generation
}
```

### `options` (Array)

Allowed query/URL parameters that go into `frame.options`.

```javascript
options: ['include', 'filter', 'page', 'limit', 'order']
```

Can also be a function:

```javascript
options: (frame) => {
    return frame.apiType === 'content'
        ? ['include']
        : ['include', 'filter'];
}
```

### `data` (Array)

Parameters that go into `frame.data` instead of `frame.options`. Useful for READ requests where the model expects `findOne(data, options)`.

```javascript
data: ['id', 'slug', 'email']
```

### `validation` (Object | Function)

Configure input validation. The framework validates against global validators automatically.

```javascript
validation: {
    options: {
        include: {
            required: true,
            values: ['tags', 'authors', 'tiers']
        },
        filter: {
            required: false
        }
    },
    data: {
        slug: {
            required: true,
            values: ['specific-slug'] // Restrict to specific values
        }
    }
}
```

**Global validators** (automatically applied when parameters are present):
- `id` - Must match `/^[a-f\d]{24}$|^1$|me/i`
- `page` - Must be a number
- `limit` - Must be a number or `'all'`
- `uuid` - Must be a valid UUID
- `slug` - Must be a valid slug
- `email` - Must be a valid email
- `order` - Must match `/^[a-z0-9_,. ]+$/i`
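
For example, the `id` pattern (copied verbatim from the list above) accepts 24-hex-character ObjectIDs, the literal `1`, and `me` — note that the `me` alternative is unanchored:

```javascript
// Pattern copied from the global `id` validator above.
const idPattern = /^[a-f\d]{24}$|^1$|me/i;

console.log(idPattern.test('63f5a0bc9d4f1a0001c0ffee')); // true: 24 hex characters
console.log(idPattern.test('me'));                       // true: special 'me' value
console.log(idPattern.test('awesome'));                  // true: contains 'me' (unanchored alternative)
console.log(idPattern.test('not-an-id'));                // false
```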

For custom validation, use a function:

```javascript
validation(frame) {
    if (!frame.data.posts[0].title) {
        return Promise.reject(new errors.ValidationError({
            message: 'Title is required'
        }));
    }
}
```

### `permissions` (Boolean | Object | Function)

**Required field** - you must always specify permissions to avoid security holes.

```javascript
// Use default permission handling
permissions: true,

// Skip permission checking (use sparingly!)
permissions: false,

// With configuration
permissions: {
    // Attributes that require elevated permissions
    unsafeAttrs: ['status', 'authors'],

    // Run code before the permission check
    before(frame) {
        // Modify frame or do pre-checks
    },

    // Specify which resource type to check against
    docName: 'posts',

    // Specify a different method for the permission check
    method: 'browse'
}

// Custom permission handling
permissions: async function (frame) {
    const hasAccess = await checkCustomAccess(frame);
    if (!hasAccess) {
        return Promise.reject(new errors.NoPermissionError());
    }
}
```

### `query` (Function) - Required

The main business logic. Returns the API response.

```javascript
query(frame) {
    // Access validated options
    const {include, filter, page, limit} = frame.options;

    // Access request body
    const postData = frame.data.posts[0];

    // Access context
    const userId = frame.options.context.user;

    // Return model response
    return models.Post.findPage(frame.options);
}
```

### `statusCode` (Number | Function)

Set the HTTP status code. Defaults to 200.

```javascript
// Fixed status code
statusCode: 201,

// Dynamic based on result
statusCode: (result) => {
    return result.posts.length ? 200 : 204;
}
```

### `response` (Object)

Configure the response format.

```javascript
response: {
    format: 'plain' // Send as plain text instead of JSON
}
```

### `cache` (Object)

Enable endpoint-level caching.

```javascript
cache: {
    async get(cacheKey, fallback) {
        const cached = await redis.get(cacheKey);
        return cached || await fallback();
    },
    async set(cacheKey, response) {
        await redis.set(cacheKey, response, 'EX', 3600);
    }
}
```

### `generateCacheKeyData` (Function)

Customize cache key generation.

```javascript
generateCacheKeyData(frame) {
    // Default uses frame.options
    return {
        ...frame.options,
        customKey: 'value'
    };
}
```

## Complete Controller Examples

### Browse Endpoint (List)

```javascript
browse: {
    headers: {
        cacheInvalidate: false
    },
    options: [
        'include',
        'filter',
        'fields',
        'formats',
        'page',
        'limit',
        'order'
    ],
    validation: {
        options: {
            include: {
                values: ['tags', 'authors', 'tiers']
            },
            formats: {
                values: ['html', 'plaintext', 'mobiledoc']
            }
        }
    },
    permissions: true,
    query(frame) {
        return models.Post.findPage(frame.options);
    }
}
```

### Read Endpoint (Single)

```javascript
read: {
    headers: {
        cacheInvalidate: false
    },
    options: ['include', 'fields', 'formats'],
    data: ['id', 'slug'],
    validation: {
        options: {
            include: {
                values: ['tags', 'authors']
            }
        }
    },
    permissions: true,
    query(frame) {
        return models.Post.findOne(frame.data, frame.options);
    }
}
```
### Add Endpoint (Create)
|
||||
|
||||
```javascript
|
||||
add: {
|
||||
headers: {
|
||||
cacheInvalidate: true
|
||||
},
|
||||
options: ['include'],
|
||||
validation: {
|
||||
options: {
|
||||
include: {
|
||||
values: ['tags', 'authors']
|
||||
}
|
||||
},
|
||||
data: {
|
||||
title: { required: true }
|
||||
}
|
||||
},
|
||||
permissions: {
|
||||
unsafeAttrs: ['status', 'authors']
|
||||
},
|
||||
statusCode: 201,
|
||||
query(frame) {
|
||||
return models.Post.add(frame.data.posts[0], frame.options);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Edit Endpoint (Update)
|
||||
|
||||
```javascript
|
||||
edit: {
|
||||
headers: {
|
||||
cacheInvalidate: true
|
||||
},
|
||||
options: ['include', 'id'],
|
||||
validation: {
|
||||
options: {
|
||||
include: {
|
||||
values: ['tags', 'authors']
|
||||
},
|
||||
id: {
|
||||
required: true
|
||||
}
|
||||
}
|
||||
},
|
||||
permissions: {
|
||||
unsafeAttrs: ['status', 'authors']
|
||||
},
|
||||
query(frame) {
|
||||
return models.Post.edit(frame.data.posts[0], frame.options);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Destroy Endpoint (Delete)
|
||||
|
||||
```javascript
|
||||
destroy: {
|
||||
headers: {
|
||||
cacheInvalidate: true
|
||||
},
|
||||
options: ['id'],
|
||||
validation: {
|
||||
options: {
|
||||
id: {
|
||||
required: true
|
||||
}
|
||||
}
|
||||
},
|
||||
permissions: true,
|
||||
statusCode: 204,
|
||||
query(frame) {
|
||||
return models.Post.destroy(frame.options);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### File Upload Endpoint
|
||||
|
||||
```javascript
|
||||
uploadImage: {
|
||||
headers: {
|
||||
cacheInvalidate: false
|
||||
},
|
||||
permissions: {
|
||||
method: 'add'
|
||||
},
|
||||
query(frame) {
|
||||
// Access uploaded file
|
||||
const file = frame.file;
|
||||
|
||||
return imageService.upload({
|
||||
path: file.path,
|
||||
name: file.name,
|
||||
type: file.type
|
||||
});
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### CSV Export Endpoint
|
||||
|
||||
```javascript
|
||||
exportCSV: {
|
||||
headers: {
|
||||
disposition: {
|
||||
type: 'csv',
|
||||
value() {
|
||||
return `members.${new Date().toISOString()}.csv`;
|
||||
}
|
||||
}
|
||||
},
|
||||
options: ['filter'],
|
||||
permissions: true,
|
||||
response: {
|
||||
format: 'plain'
|
||||
},
|
||||
query(frame) {
|
||||
return membersService.export(frame.options);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Using the Framework

### HTTP Wrapper

Wrap controllers for Express routes:

```javascript
const {http} = require('@tryghost/api-framework');

// In routes
router.get('/posts', http(api.posts.browse));
router.get('/posts/:id', http(api.posts.read));
router.post('/posts', http(api.posts.add));
router.put('/posts/:id', http(api.posts.edit));
router.delete('/posts/:id', http(api.posts.destroy));
```

### Internal API Calls

Call controllers programmatically:

```javascript
// With data and options
const result = await api.posts.add(
    { posts: [{ title: 'New Post' }] }, // data
    { context: { user: userId } } // options
);

// Options only
const posts = await api.posts.browse({
    filter: 'status:published',
    include: 'tags',
    context: { user: userId }
});
```

### Custom Validators

Create endpoint-specific validators in the API utils:

```javascript
// In api/utils/validators/input/posts.js
const errors = require('@tryghost/errors');

module.exports = {
    add(apiConfig, frame) {
        // Custom validation for posts.add
        const post = frame.data.posts[0];
        if (post.status === 'published' && !post.title) {
            return Promise.reject(new errors.ValidationError({
                message: 'Published posts must have a title'
            }));
        }
    }
};
```

### Custom Serializers

Create input/output serializers:

```javascript
// Input serializer
module.exports = {
    all(apiConfig, frame) {
        // Transform include to withRelated
        if (frame.options.include) {
            frame.options.withRelated = frame.options.include.split(',');
        }
    }
};

// Output serializer
module.exports = {
    posts: {
        browse(response, apiConfig, frame) {
            // Transform model response to API response
            frame.response = {
                posts: response.data.map(post => serializePost(post)),
                meta: {
                    pagination: response.meta.pagination
                }
            };
        }
    }
};
```
## Common Patterns

### Checking User Context

```javascript
query(frame) {
    const isAdmin = frame.options.context.user;
    const isIntegration = frame.options.context.integration;
    const isMember = frame.options.context.member;

    if (isAdmin) {
        return models.Post.findPage(frame.options);
    } else {
        frame.options.filter = 'status:published';
        return models.Post.findPage(frame.options);
    }
}
```

### Handling Express Response Directly

For streaming or special responses:

```javascript
query(frame) {
    // Return a function to handle the Express response
    return function handler(req, res, next) {
        const stream = generateStream();
        stream.pipe(res);
    };
}
```

### Setting Custom Headers in Query

```javascript
query(frame) {
    // Set headers from within query
    frame.setHeader('X-Custom-Header', 'value');

    return models.Post.findPage(frame.options);
}
```
## Error Handling

Use `@tryghost/errors` for consistent error responses:

```javascript
const errors = require('@tryghost/errors');

query(frame) {
    if (!frame.data.posts[0].title) {
        throw new errors.ValidationError({
            message: 'Title is required'
        });
    }

    if (notFound) {
        throw new errors.NotFoundError({
            message: 'Post not found'
        });
    }

    if (noAccess) {
        throw new errors.NoPermissionError({
            message: 'You do not have permission to access this resource'
        });
    }
}
```
## Best Practices

1. **Always specify `permissions`** - Never omit this field; it's a security requirement
2. **Use `options` to whitelist params** - Only allowed params are passed through
3. **Prefer declarative validation** - Use the validation object over custom functions
4. **Set `cacheInvalidate` appropriately** - `true` for mutations, `false` for reads
5. **Use `unsafeAttrs` for sensitive fields** - Modifying them requires elevated permissions
6. **Return model responses from `query`** - Let serializers handle transformation
7. **Use `data` for READ endpoints** - When the model expects `findOne(data, options)`

@@ -0,0 +1,747 @@

# API Controller Validation Guide

This guide explains how to configure validation in api-framework controllers, covering all available patterns, built-in validators, and best practices.

## Table of Contents

- [Overview](#overview)
- [Validation Patterns](#validation-patterns)
  - [Object-Based Validation](#pattern-1-object-based-validation)
  - [Function-Based Validation](#pattern-2-function-based-validation)
- [Validating Options (Query Parameters)](#validating-options-query-parameters)
- [Validating Data (Request Body)](#validating-data-request-body)
- [Built-in Global Validators](#built-in-global-validators)
- [Method-Specific Validation Behavior](#method-specific-validation-behavior)
- [Complete Examples](#complete-examples)
- [Error Handling](#error-handling)
- [Best Practices](#best-practices)

---

## Overview

The api-framework uses a **pipeline-based validation system** where validation runs as the first processing stage:

1. **Validation** ← You are here
2. Input serialisation
3. Permissions
4. Query (controller execution)
5. Output serialisation

Validation ensures that:
- Required fields are present
- Values are in allowed lists
- Data types are correct (IDs, emails, slugs, etc.)
- The request structure is valid before processing

---

## Validation Patterns

### Pattern 1: Object-Based Validation

The most common pattern, using configuration objects:

```javascript
browse: {
    options: ['include', 'page', 'limit'],
    validation: {
        options: {
            include: {
                values: ['tags', 'authors'],
                required: true
            },
            page: {
                required: false
            }
        }
    },
    permissions: true,
    query(frame) {
        return models.Post.findPage(frame.options);
    }
}
```

**When to use:**
- Standard field validation (required, allowed values)
- The most common case for API endpoints

---

### Pattern 2: Function-Based Validation

Complete control over validation logic:

```javascript
add: {
    validation(frame) {
        const {ValidationError} = require('@tryghost/errors');

        if (!frame.data.posts || !frame.data.posts.length) {
            return Promise.reject(new ValidationError({
                message: 'No posts provided'
            }));
        }

        const post = frame.data.posts[0];

        if (!post.title || post.title.length < 3) {
            return Promise.reject(new ValidationError({
                message: 'Title must be at least 3 characters'
            }));
        }

        return Promise.resolve();
    },
    permissions: true,
    query(frame) {
        return models.Post.add(frame.data.posts[0], frame.options);
    }
}
```

**When to use:**
- Complex validation logic
- Cross-field validation
- Conditional validation rules
- Custom error messages

---

## Validating Options (Query Parameters)

Options are URL query parameters and route params. Define allowed options in the `options` array and configure validation rules.

### Required Fields

```javascript
browse: {
    options: ['filter'],
    validation: {
        options: {
            filter: {
                required: true
            }
        }
    },
    permissions: true,
    query(frame) {
        return models.Post.findAll(frame.options);
    }
}
```

### Allowed Values

Two equivalent syntaxes:

**Object notation:**
```javascript
validation: {
    options: {
        include: {
            values: ['tags', 'authors', 'count.posts']
        }
    }
}
```

**Array shorthand:**
```javascript
validation: {
    options: {
        include: ['tags', 'authors', 'count.posts']
    }
}
```

### Combined Rules

```javascript
validation: {
    options: {
        include: {
            values: ['tags', 'authors'],
            required: true
        },
        status: {
            values: ['draft', 'published', 'scheduled'],
            required: false
        }
    }
}
```

### Special Behavior: Include Parameter

The `include` parameter has special handling - invalid values are silently filtered out instead of causing an error:

```javascript
// Request: ?include=tags,invalid_field,authors
// Result: frame.options.include = 'tags,authors'
```

This allows for graceful degradation when clients request unsupported includes.
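
That filtering behavior can be illustrated with a standalone sketch (`filterInclude` is a hypothetical name for illustration, not the framework's API):

```javascript
// Illustrative sketch of include filtering - not the framework's actual code.
// Splits the comma-separated include parameter and drops any value
// that is not in the allowed list, instead of raising an error.
function filterInclude(includeParam, allowedValues) {
    return includeParam
        .split(',')
        .filter(value => allowedValues.includes(value))
        .join(',');
}
```

For example, `filterInclude('tags,invalid_field,authors', ['tags', 'authors'])` returns `'tags,authors'`.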

---

## Validating Data (Request Body)

Data validation applies to request body content. The structure differs based on the HTTP method.

### For READ Operations

Data comes from query parameters:

```javascript
read: {
    data: ['id', 'slug'],
    validation: {
        data: {
            slug: {
                values: ['featured', 'latest']
            }
        }
    },
    permissions: true,
    query(frame) {
        return models.Post.findOne(frame.data, frame.options);
    }
}
```

### For ADD/EDIT Operations

Data comes from the request body with a root key:

```javascript
add: {
    validation: {
        data: {
            title: {
                required: true
            },
            status: {
                required: false
            }
        }
    },
    permissions: true,
    query(frame) {
        return models.Post.add(frame.data.posts[0], frame.options);
    }
}
```

**Request body structure:**
```json
{
    "posts": [{
        "title": "My Post",
        "status": "draft"
    }]
}
```

### Root Key Validation

For ADD/EDIT operations, the framework automatically validates:
1. Root key exists (e.g., `posts`, `users`)
2. Root key contains an array with at least one item
3. Required fields exist and are not null
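
Conceptually, these three checks behave like the following standalone sketch (a simplified illustration; `validateRootKey` is a hypothetical helper name, and the real logic lives inside the framework's validators):

```javascript
// Simplified sketch of the framework's root-key checks - illustrative only.
// validateRootKey is a hypothetical name; the real implementation is
// internal to @tryghost/api-framework.
function validateRootKey(docName, data, requiredFields = []) {
    const root = data && data[docName];

    // 1. Root key exists
    if (!root) {
        throw new Error(`No root key ('${docName}') provided.`);
    }
    // 2. Root key contains a non-empty array
    if (!Array.isArray(root) || root.length === 0) {
        throw new Error(`Root key ('${docName}') must be a non-empty array.`);
    }
    // 3. Required fields exist and are not null
    for (const field of requiredFields) {
        if (!(field in root[0])) {
            throw new Error(`Validation (FieldIsRequired) failed for ${field}`);
        }
        if (root[0][field] === null) {
            throw new Error(`Validation (FieldIsInvalid) failed for ${field}`);
        }
    }
    return true;
}
```

So `validateRootKey('posts', {posts: [{title: 'My Post'}]}, ['title'])` passes, while a missing `posts` key or a null `title` throws.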

---

## Built-in Global Validators

The framework automatically validates common field types using the `@tryghost/validator` package:

| Field Name | Validation Rule | Example Valid Values |
|------------|-----------------|----------------------|
| `id` | MongoDB ObjectId, `1`, or `me` | `507f1f77bcf86cd799439011`, `me` |
| `uuid` | UUID format | `550e8400-e29b-41d4-a716-446655440000` |
| `slug` | URL-safe slug | `my-post-title` |
| `email` | Email format | `user@example.com` |
| `page` | Numeric | `1`, `25` |
| `limit` | Numeric or `all` | `10`, `all` |
| `from` | Date format | `2024-01-15` |
| `to` | Date format | `2024-12-31` |
| `order` | Sort format | `created_at desc`, `title asc` |
| `columns` | Column list | `id,title,created_at` |

### Fields with No Validation

These fields skip validation by default:
- `filter`
- `context`
- `forUpdate`
- `transacting`
- `include`
- `formats`
- `name`

---

## Method-Specific Validation Behavior

Different HTTP methods have different validation behaviors:

### BROWSE / READ

- Validates `frame.data` against `apiConfig.data`
- Allows empty data
- Uses global validators for field types

### ADD

1. Validates that the root key exists in `frame.data`
2. Checks that required fields are present
3. Checks that required fields are not null

**Error examples:**
- `"No root key ('posts') provided."`
- `"Validation (FieldIsRequired) failed for title"`
- `"Validation (FieldIsInvalid) failed for title"` (when null)

### EDIT

1. Performs all ADD validations
2. Validates ID consistency between URL and body

```javascript
// URL: /posts/123
// Body: { "posts": [{ "id": "456", ... }] }
// Error: "Invalid id provided."
```

### Special Methods

These methods use specific validation behaviors:
- `changePassword()` - Uses ADD rules
- `resetPassword()` - Uses ADD rules
- `setup()` - Uses ADD rules
- `publish()` - Uses BROWSE rules

---

## Complete Examples

### Example 1: Simple Browse with Options

```javascript
module.exports = {
    docName: 'posts',

    browse: {
        options: ['include', 'page', 'limit', 'filter', 'order'],
        validation: {
            options: {
                include: ['tags', 'authors', 'count.posts'],
                page: {
                    required: false
                },
                limit: {
                    required: false
                }
            }
        },
        permissions: true,
        query(frame) {
            return models.Post.findPage(frame.options);
        }
    }
};
```

### Example 2: Read with Data Validation

```javascript
module.exports = {
    docName: 'posts',

    read: {
        options: ['include'],
        data: ['id', 'slug'],
        validation: {
            options: {
                include: ['tags', 'authors']
            },
            data: {
                id: {
                    required: false
                },
                slug: {
                    required: false
                }
            }
        },
        permissions: true,
        query(frame) {
            return models.Post.findOne(frame.data, frame.options);
        }
    }
};
```

### Example 3: Add with Required Fields

```javascript
module.exports = {
    docName: 'users',

    add: {
        validation: {
            data: {
                name: {
                    required: true
                },
                email: {
                    required: true
                },
                password: {
                    required: true
                },
                role: {
                    required: false
                }
            }
        },
        permissions: true,
        query(frame) {
            return models.User.add(frame.data.users[0], frame.options);
        }
    }
};
```

### Example 4: Custom Validation Function

```javascript
module.exports = {
    docName: 'subscriptions',

    add: {
        validation(frame) {
            const {ValidationError} = require('@tryghost/errors');
            const subscription = frame.data.subscriptions?.[0];

            if (!subscription) {
                return Promise.reject(new ValidationError({
                    message: 'No subscription data provided'
                }));
            }

            // Validate email format
            if (!subscription.email || !subscription.email.includes('@')) {
                return Promise.reject(new ValidationError({
                    message: 'Valid email address is required'
                }));
            }

            // Validate plan
            const validPlans = ['free', 'basic', 'premium'];
            if (!validPlans.includes(subscription.plan)) {
                return Promise.reject(new ValidationError({
                    message: `Plan must be one of: ${validPlans.join(', ')}`
                }));
            }

            // Cross-field validation
            if (subscription.plan !== 'free' && !subscription.payment_method) {
                return Promise.reject(new ValidationError({
                    message: 'Payment method required for paid plans'
                }));
            }

            return Promise.resolve();
        },
        permissions: true,
        query(frame) {
            return models.Subscription.add(frame.data.subscriptions[0], frame.options);
        }
    }
};
```

### Example 5: Edit with ID Consistency

```javascript
module.exports = {
    docName: 'posts',

    edit: {
        options: ['id', 'include'],
        validation: {
            options: {
                include: ['tags', 'authors']
            },
            data: {
                title: {
                    required: false
                },
                status: {
                    values: ['draft', 'published', 'scheduled']
                }
            }
        },
        permissions: {
            unsafeAttrs: ['status', 'author_id']
        },
        query(frame) {
            return models.Post.edit(frame.data.posts[0], frame.options);
        }
    }
};
```

### Example 6: Complex Browse with Multiple Validations

```javascript
module.exports = {
    docName: 'analytics',

    browse: {
        options: ['from', 'to', 'interval', 'metrics', 'dimensions'],
        validation: {
            options: {
                from: {
                    required: true
                },
                to: {
                    required: true
                },
                interval: {
                    values: ['hour', 'day', 'week', 'month'],
                    required: false
                },
                metrics: {
                    values: ['pageviews', 'visitors', 'sessions', 'bounce_rate'],
                    required: true
                },
                dimensions: {
                    values: ['page', 'source', 'country', 'device'],
                    required: false
                }
            }
        },
        permissions: true,
        query(frame) {
            return analytics.query(frame.options);
        }
    }
};
```

---

## Error Handling

### Error Types

Validation errors use types from `@tryghost/errors`:
- **ValidationError** - Field validation failed
- **BadRequestError** - Malformed request structure

### Error Message Format

```javascript
// Missing required field
"Validation (FieldIsRequired) failed for title"

// Invalid value
"Validation (AllowedValues) failed for status"

// Field is null when required
"Validation (FieldIsInvalid) failed for title"

// Missing root key
"No root key ('posts') provided."

// ID mismatch
"Invalid id provided."
```

### Custom Error Messages

When using function-based validation:

```javascript
validation(frame) {
    const {ValidationError} = require('@tryghost/errors');

    if (!frame.data.email) {
        return Promise.reject(new ValidationError({
            message: 'Email address is required',
            context: 'Please provide a valid email address to continue',
            help: 'Check that the email field is included in your request'
        }));
    }

    return Promise.resolve();
}
```

---

## Best Practices

### 1. Define All Allowed Options

Always explicitly list allowed options to prevent unexpected parameters:

```javascript
// Good - explicit allowed options
options: ['include', 'page', 'limit', 'filter'],

// Bad - no options defined (might allow anything)
// options: undefined
```

### 2. Use Built-in Validators

Let the framework handle common field types:

```javascript
// Good - framework validates automatically
options: ['id', 'email', 'slug']

// Unnecessary - these are validated by default
validation: {
    options: {
        id: { matches: /^[a-f\d]{24}$/ } // Already built-in
    }
}
```

### 3. Mark Required Fields Explicitly

Be explicit about which fields are required:

```javascript
validation: {
    data: {
        title: { required: true },
        slug: { required: false },
        status: { required: false }
    }
}
```

### 4. Use Array Shorthand for Simple Cases

When only validating allowed values:

```javascript
// Shorter and cleaner
validation: {
    options: {
        include: ['tags', 'authors'],
        status: ['draft', 'published']
    }
}

// Equivalent verbose form
validation: {
    options: {
        include: { values: ['tags', 'authors'] },
        status: { values: ['draft', 'published'] }
    }
}
```

### 5. Combine with Permissions

Validation runs before permissions, ensuring the data structure is valid:

```javascript
edit: {
    validation: {
        data: {
            author_id: { required: false }
        }
    },
    permissions: {
        unsafeAttrs: ['author_id'] // Validated first, then permission-checked
    },
    query(frame) {
        return models.Post.edit(frame.data.posts[0], frame.options);
    }
}
```

### 6. Use Custom Functions for Complex Logic

When validation rules depend on multiple fields or external state:

```javascript
validation(frame) {
    // Date range validation
    if (frame.options.from && frame.options.to) {
        const from = new Date(frame.options.from);
        const to = new Date(frame.options.to);

        if (from > to) {
            return Promise.reject(new ValidationError({
                message: 'From date must be before to date'
            }));
        }

        // Max 30 day range
        const diffDays = (to - from) / (1000 * 60 * 60 * 24);
        if (diffDays > 30) {
            return Promise.reject(new ValidationError({
                message: 'Date range cannot exceed 30 days'
            }));
        }
    }

    return Promise.resolve();
}
```

### 7. Provide Helpful Error Messages

Make errors actionable for API consumers:

```javascript
// Good - specific and actionable
"Status must be one of: draft, published, scheduled"

// Bad - vague
"Invalid status"
```

---

## Validation Flow Diagram

```
HTTP Request
     ↓
Frame Creation
     ↓
Frame Configuration (pick options/data)
     ↓
┌──────────────────────────────┐
│       VALIDATION STAGE       │
├──────────────────────────────┤
│ Is validation a function?    │
│  ├─ Yes → Run custom logic   │
│  └─ No → Framework validation│
│      ├─ Global validators    │
│      ├─ Required fields      │
│      ├─ Allowed values       │
│      └─ Method-specific rules│
└──────────────────────────────┘
     ↓
Input Serialisation
     ↓
Permissions
     ↓
Query Execution
     ↓
Output Serialisation
     ↓
HTTP Response
```

@@ -0,0 +1,28 @@

---
name: add-private-feature-flag
description: Use when adding a new private (developer experiments) feature flag to Ghost, including the backend registration and settings UI toggle.
---

# Add Private Feature Flag

## Overview
Adds a new private feature flag to Ghost. Private flags appear in Labs settings under the "Private features" tab, which is visible only when developer experiments are enabled.

## Steps

1. **Add the flag to `ghost/core/core/shared/labs.js`**
   - Add the flag name (camelCase string) to the `PRIVATE_FEATURES` array.

2. **Add a UI toggle in `apps/admin-x-settings/src/components/settings/advanced/labs/private-features.tsx`**
   - Add a new entry to the `features` array with `title`, `description`, and `flag` (must match the string in `labs.js`).

3. **Run tests and update the config API snapshot**
   - Unit: `cd ghost/core && pnpm test:single test/unit/shared/labs.test.js`
   - Update snapshot and run e2e: `cd ghost/core && UPDATE_SNAPSHOTS=1 pnpm test:single test/e2e-api/admin/config.test.js`
   - Review the diff of `ghost/core/test/e2e-api/admin/__snapshots__/config.test.js.snap` to confirm only your new flag was added.
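
Steps 1 and 2 amount to two small edits. A sketch of what they might look like, where `myNewFeature` is a placeholder flag name and the exact shape of the `features` entry should be matched against the existing entries in `private-features.tsx`:

```javascript
// ghost/core/core/shared/labs.js - sketch: append to the existing array
const PRIVATE_FEATURES = [
    // ...existing flags...
    'myNewFeature' // placeholder name for illustration
];

// apps/admin-x-settings/.../private-features.tsx - sketch: new entry
const features = [
    // ...existing entries...
    {
        title: 'My new feature',
        description: 'Short explanation of what the flag enables',
        flag: 'myNewFeature' // must match the string in labs.js
    }
];
```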

## Notes
- No database migration is needed. Labs flags are stored in a single JSON `labs` setting.
- The flag name must be identical in `labs.js`, `private-features.tsx`, and the snapshot.
- Flags are camelCase strings (e.g. `welcomeEmailDesignCustomization`).
- For public beta flags (visible to all users), add to `PUBLIC_BETA_FEATURES` in `labs.js` instead, and add the toggle to `apps/admin-x-settings/src/components/settings/advanced/labs/beta-features.tsx`.

@@ -0,0 +1,61 @@

---
name: commit
description: Commit message formatting and guidelines
---

# Commit

Use this skill whenever the user asks you to create a git commit for the current work.

## Instructions

1. Review the current git state before committing:
   - `git status`
   - `git diff`
   - `git log -5 --oneline`
2. Only stage files relevant to the requested change. Do not include unrelated untracked files, generated files, or likely-local artifacts.
3. Always follow Ghost's commit conventions (see below) for commit messages.
4. Run `git status --short` after committing and confirm the result.

## Important
- Do not push to remote unless the user explicitly asks
- Keep commits focused and avoid bundling unrelated changes
- If there are no relevant changes, do not create an empty commit
- If hooks fail, fix the issue and create a new commit. Never bypass hooks.

## Commit message format

We have a handful of simple standards for commit messages which help us to generate readable changelogs. Please follow them wherever possible, and mention the associated issue number.

- **1st line:** Max 80 character summary
  - Written in past tense, e.g. "Fixed the thing" not "Fixes the thing"
  - Start with one of: Fixed, Changed, Updated, Improved, Added, Removed, Reverted, Moved, Released, Bumped, Cleaned
- **2nd line:** [Always blank]
- **3rd line:** `ref <issue link>`, `fixes <issue link>`, `closes <issue link>` or blank
- **4th line:** Why this change was made - the code includes the what; the commit message should describe the context of why - why this, why now, why not something else?

If your change is **user-facing**, please prepend the first line of your commit with **an emoji**.

Because emoji commits become the release notes, it's important that anything that gets an emoji is a user-facing change that's significant and relevant for end-users to see.

The first line of an emoji commit message should be written from the perspective of the user. For example, "🐛 Fixed a race condition in the members service" is technical and tells the user nothing, but "🐛 Fixed a bug causing active members to lose access to paid content" tells the user reading the release notes "oh yeah, they fixed that bug I kept hitting."

### Main emojis we are using:

- ✨ Feature
- 🎨 Improvement / change
- 🐛 Bug Fix
- 🌐 i18n (translation) submissions
- 💡 Anything else flagged to users or whoever is writing release notes

### Example

```
✨ Added config flag for disabling page analytics

ref https://linear.app/tryghost/issue/ENG-1234/

- analytics are brand new under development, therefore they need to be behind a flag
- not using the developerExperiments flag as that is already in wide use and we aren't ready to deploy this anywhere yet
- using the term `pageAnalytics` as this was discussed as best reflecting what this does
```

@@ -0,0 +1,24 @@

---
name: Create database migration
description: Create a database migration to add a table, add columns to an existing table, add a setting, or otherwise change the schema of Ghost's MySQL database. Use this skill whenever the task involves modifying Ghost's database schema — including adding, removing, or renaming columns or tables, adding new settings, creating indexes, updating data, or any change that requires a migration file in ghost/core. Also use when the user references schema.js, knex-migrator, the migrations directory, or asks to "add a field" or "add a column" to any Ghost model/table. Even if the user frames it as a feature or Linear issue, if the implementation requires a schema change, this skill applies.
---

# Create Database Migration

## Instructions

1. Create a new, empty migration file: `cd ghost/core && pnpm migrate:create <kebab-case-slug>`. IMPORTANT: do not create the migration file manually; always use this script to create the initial empty migration file. The slug must be kebab-case (e.g. `add-column-to-posts`).
2. The above command will create a new directory in `ghost/core/core/server/data/migrations/versions` if needed, create the empty migration file with the appropriate name, and bump the core and admin package versions to RC if this is the first migration after a release.
3. Update the migration file with the changes you want to make in the database, following the existing patterns in the codebase. Where appropriate, prefer the utility functions in `ghost/core/core/server/data/migrations/utils/*`.
4. Update the schema definition file in `ghost/core/core/server/data/schema/schema.js`, and make sure it aligns with the latest changes from the migration.
5. Test the migration manually: `cd ghost/core && pnpm knex-migrator migrate --v {version directory} --force`
6. If adding or dropping a table, update `ghost/core/core/server/data/exporter/table-lists.js` as appropriate.
7. If adding or dropping a table, also add or remove the table name from the expected tables list in `ghost/core/test/integration/exporter/exporter.test.js`. This test has a hardcoded alphabetically-sorted array of all database tables — it runs in CI integration tests (not unit tests) and will fail if the new table is missing.
8. Run the schema integrity test, and update the hash: `cd ghost/core && pnpm test:single test/unit/server/data/schema/integrity.test.js`
9. Run unit tests in Ghost core, and iterate until they pass: `cd ghost/core && pnpm test:unit`
|
||||
|
||||
## Examples
|
||||
See [examples.md](examples.md) for example migrations.
|
||||
|
||||
## Rules
|
||||
See [rules.md](rules.md) for rules that should always be followed when creating database migrations.
@@ -0,0 +1,17 @@
# Example database migrations

## Create a table

See [add mentions table](../../../ghost/core/core/server/data/migrations/versions/5.31/2023-01-19-07-46-add-mentions-table.js).

## Add column(s) to an existing table

See [add source columns to emails table](../../../ghost/core/core/server/data/migrations/versions/5.24/2022-11-21-09-32-add-source-columns-to-emails-table.js).

## Add a setting

See [add member track source setting](../../../ghost/core/core/server/data/migrations/versions/5.21/2022-10-27-09-50-add-member-track-source-setting.js).

## Manipulate data

See [update newsletter subscriptions](../../../ghost/core/core/server/data/migrations/versions/5.31/2022-12-05-09-56-update-newsletter-subscriptions.js).
@@ -0,0 +1,33 @@
# Rules for creating database migrations

## Migrations must be idempotent

It must be safe to run the migration twice. It's possible for a migration to stop executing due to external factors, so it must be safe to run the migration again successfully.
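
As a sketch of what the idempotency guard looks like in practice (hypothetical table and column names, with a fake in-memory schema standing in for the real knex connection a Ghost migration receives):

```javascript
// Sketch of an idempotent "add column" migration. The table/column names
// are hypothetical; a real Ghost migration operates on a knex connection
// rather than this fake schema object.
async function up(schema) {
    if (await schema.hasColumn('posts', 'show_title')) {
        console.log('Column posts.show_title already exists, skipping');
        return;
    }
    await schema.addColumn('posts', 'show_title');
    console.log('Added posts.show_title');
}

// Fake in-memory schema so the guard is observable without a database
function fakeSchema() {
    const columns = new Set();
    return {
        columns,
        hasColumn: async (table, column) => columns.has(`${table}.${column}`),
        addColumn: async (table, column) => {
            columns.add(`${table}.${column}`);
        }
    };
}

async function demo() {
    const schema = fakeSchema();
    await up(schema); // first run adds the column
    await up(schema); // second run hits the guard and is a safe no-op
    return schema.columns.size;
}
```

Without the `hasColumn` guard, re-running the migration after a partial failure would crash on a duplicate-column error, and a crashed migration prevents Ghost from booting.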

## Migrations must NOT use the model layer

Migrations are written for a specific version, and when they use the model layer, the assumption is that they are using the models at that version. In reality, the models are of the version which is being migrated to, not from. This means that breaking changes in the models can inadvertently break migrations.

## Migrations are immutable

Once migrations are on the `main` branch, they're final. If you need to make further changes after merging to main, create a new migration instead.

## Use utility functions

Wherever possible, use the utility functions in `ghost/core/core/server/data/migrations/utils`, such as `addTable`, `createTransactionalMigration`, and `addSetting`. These util functions have been tested and already include protections for idempotency, as well as log statements where appropriate to make migrations easier to debug.

## Migration PRs should be as minimal as possible

Migration PRs should contain the minimal amount of code to create the migration. Usually this means it should only include:
- the new migration file
- updates to the schema.js file
- updated schema integrity hash tests
- updated exporter table lists (when adding or removing tables)

## Migrations should be defensive

Protect against missing data. If a migration crashes, Ghost cannot boot.

## Migrations should log every code path

If we have to debug a migration, we need to know what it actually did. Without logging, that's impossible, so ensure all code paths and early returns contain logging. Note: when using the utility functions, logging is typically handled in the utility function itself, so no additional logging statements are necessary.
@@ -0,0 +1,60 @@
---
name: Format numbers
description: Format numbers using the formatNumber function from Shade whenever someone edits a TSX file.
autoTrigger:
  - fileEdit: "**/*.tsx"
---

# Format Numbers

When editing `.tsx` files, ensure all user-facing numbers are formatted using the `formatNumber` utility from `@tryghost/shade`.

## Import

```typescript
import {formatNumber} from '@tryghost/shade';
```

## When to use formatNumber

Use `formatNumber()` when rendering any numeric value that is displayed to the user, including:
- Member counts, visitor counts, subscriber counts
- Email engagement metrics (opens, clicks, bounces)
- Revenue amounts (combine with `centsToDollars()` for monetary values)
- Post analytics (views, link clicks)
- Any count or quantity shown in UI

## Correct usage

```tsx
<span>{formatNumber(totalMembers)}</span>
<span>{formatNumber(link.count || 0)}</span>
<span>{`${currencySymbol}${formatNumber(centsToDollars(mrr))}`}</span>
<span>{post.members > 0 ? `+${formatNumber(post.members)}` : '0'}</span>
```

## Antipatterns to avoid

Do NOT use any of these patterns for formatting numbers in TSX files:

```tsx
// BAD: raw .toLocaleString()
<span>{count.toLocaleString()}</span>

// BAD: manual Intl.NumberFormat
<span>{new Intl.NumberFormat('en-US').format(count)}</span>

// BAD: raw number without formatting
<span>{memberCount}</span>

// BAD: manual regex formatting
<span>{count.toString().replace(/\B(?=(\d{3})+(?!\d))/g, ',')}</span>
```

## Related utilities

- `formatPercentage()` - for percentages (e.g., open rates, click rates)
- `abbreviateNumber()` - for compact notation (e.g., 1.2M, 50k)
- `centsToDollars()` - convert cents to dollars before passing to `formatNumber`

All are imported from `@tryghost/shade`.
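
To illustrate how these utilities relate, here are rough local stand-ins (illustrative only, not the actual Shade implementations; in application code always import the real functions from `@tryghost/shade`):

```typescript
// Illustrative stand-ins only; the real implementations live in @tryghost/shade.
const formatNumber = (n: number): string => n.toLocaleString('en-US');

const abbreviateNumber = (n: number): string =>
    n >= 1_000_000 ? `${(n / 1_000_000).toFixed(1)}M`
        : n >= 1_000 ? `${Math.round(n / 1_000)}k`
            : String(n);

const centsToDollars = (cents: number): number => cents / 100;

console.log(formatNumber(1234567));              // full precision: "1,234,567"
console.log(abbreviateNumber(1234567));          // compact: "1.2M"
console.log(formatNumber(centsToDollars(1999))); // monetary: "19.99"
```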
@@ -0,0 +1 @@
../../.agents/skills/add-admin-api-endpoint
@@ -0,0 +1 @@
../../.agents/skills/add-private-feature-flag
@@ -0,0 +1 @@
../../.agents/skills/commit
@@ -0,0 +1 @@
../../.agents/skills/create-database-migration
@@ -0,0 +1 @@
../../.agents/skills/format-number
@@ -0,0 +1,11 @@
# yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json
reviews:
  high_level_summary: false
  collapse_walkthrough: false
  changed_files_summary: false
  sequence_diagrams: false
  estimate_code_review_effort: false
  poem: false
  auto_review:
    base_branches:
      - 6.x
@@ -0,0 +1,8 @@
# THIS IS AUTOGENERATED. DO NOT EDIT MANUALLY
version = 1
name = "Ghost"

[setup]
script = '''
pnpm run setup
'''
@@ -0,0 +1,6 @@
{
    "setup-worktree": [
        "git submodule update --init --recursive",
        "pnpm"
    ]
}
@@ -0,0 +1,34 @@
node_modules

.nxcache
.nx

**/*.log

build
dist

coverage

.eslintcache

test-results

tsconfig.tsbuildinfo

Dockerfile
.dockerignore

.git
.vscode
.editorconfig
compose.yml

docker
!docker/**/*.entrypoint.sh
!docker/**/*entrypoint.sh

ghost/core/core/built/admin

# Ignore local config files (.json and .jsonc)
ghost/core/config.local.json*
@@ -0,0 +1,23 @@
# http://editorconfig.org

root = true

[*]
charset = utf-8
indent_style = space
indent_size = 4
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true

[*.hbs]
insert_final_newline = false

[{package}.json]
indent_size = 2

[*.md]
trim_trailing_whitespace = false

[*.yml]
indent_size = 2
@@ -0,0 +1,24 @@
# Environment variables for Ghost development with docker compose
## Use this file by running `cp .env.example .env` and then editing the values as needed

# Docker Compose profiles to enable
## Run `docker compose config --profiles` to see all available profiles
## See https://docs.docker.com/compose/how-tos/profiles/ for more information
# COMPOSE_PROFILES=stripe

# Debug level to pass to Ghost
# DEBUG=

# Stripe keys - used to forward Stripe webhooks to Ghost
## Stripe Secret Key: sk_test_*******
# STRIPE_SECRET_KEY=
## Stripe Publishable Key: pk_test_*******
# STRIPE_PUBLISHABLE_KEY=
## Stripe Account ID: acct_1*******
# STRIPE_ACCOUNT_ID=

# Mailgun SMTP credentials - used with `yarn dev:mailgun`
## SMTP username from Mailgun (often starts with `postmaster@`)
# MAILGUN_SMTP_USER=
## SMTP password from Mailgun
# MAILGUN_SMTP_PASS=
@@ -0,0 +1,8 @@
# enforce unix style line endings
*.js text eol=lf
*.md text eol=lf
*.json text eol=lf
*.yml text eol=lf
*.hbs text eol=lf

.github/workflows/*.lock.yml linguist-generated=true merge=ours
@@ -0,0 +1,18 @@
# CODEOWNERS for Ghost Repository
# This file defines code ownership for automatic review assignment
# https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners

# E2E Test Ownership
# The top-level e2e directory requires review from designated owners
/e2e/ @9larsons

# Tinybird Analytics
# Tinybird data pipelines and services require review from designated owners
**/tinybird/ @9larsons @cmraible @evanhahn @troyciesco

# @tryghost/parse-email-address
/ghost/parse-email-address/ @EvanHahn

# Inbox Links
ghost/core/core/server/lib/get-inbox-links.ts @EvanHahn
ghost/core/test/unit/server/lib/get-inbox-links.test.ts @EvanHahn
@@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our
community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
  and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
  overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or
  advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
  address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.

Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
report@ghost.org.
All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the
reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series
of actions.

**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within
the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.

Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.
@@ -0,0 +1,84 @@
# Contributing to Ghost

For **help**, **support**, **questions** and **ideas** please use **[our forum](https://forum.ghost.org)** 🚑.

---

## Where to Start

If you're a developer looking to contribute, but you're not sure where to begin: Check out the [good first issue](https://github.com/TryGhost/Ghost/labels/good%20first%20issue) label on GitHub, which contains small pieces of work that have been specifically flagged as being friendly to new contributors.

After that, if you're looking for something a little more challenging to sink your teeth into, there's a broader [help wanted](https://github.com/TryGhost/Ghost/labels/help%20wanted) label encompassing issues which need some love.

If you've got an idea for a new feature, please start by suggesting it in the [forum](https://forum.ghost.org), as adding new features to Ghost first requires generating consensus around a design and spec.


## Working on Ghost Core

If you're going to work on Ghost core you'll need to go through a slightly more involved install and setup process than the usual Ghost CLI version.

First you'll need to fork [Ghost](https://github.com/tryghost/ghost) to your personal GitHub account, and then follow the detailed [install from source](https://ghost.org/docs/install/source/) setup guide.


### Branching Guide

`main` on the main repository always contains the latest changes. This means that it is WIP for the next minor version and should NOT be considered stable. Stable versions are tagged using [semantic versioning](http://semver.org/).

On your local repository, you should always work on a branch to make keeping up-to-date and submitting pull requests easier, but in most cases you should submit your pull requests to `main`. Where necessary, for example if multiple people are contributing on a large feature, or if a feature requires a database change, we make use of feature branches.


### Commit Messages

We have a handful of simple standards for commit messages which help us to generate readable changelogs. Please follow this wherever possible and mention the associated issue number.

- **1st line:** Max 80 character summary
    - Written in past tense e.g. “Fixed the thing” not “Fixes the thing”
    - Start with one of: Fixed, Changed, Updated, Improved, Added, Removed, Reverted, Moved, Released, Bumped, Cleaned
- **2nd line:** [Always blank]
- **3rd line:** `ref <issue link>`, `fixes <issue link>`, `closes <issue link>` or blank
- **4th line:** Why this change was made - the code includes the what, the commit message should describe the context of why - why this, why now, why not something else?
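
Put together, a commit in this shape might look like the following (illustrative summary and bullet points, with the issue link left as a placeholder):

```
Fixed broken pagination on the tags screen

fixes <issue link>

- the tags list stopped loading after the first page due to an off-by-one in the query params
- fixing directly rather than waiting on the list refactor, as this affects all installs
```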

If your change is **user-facing** please prepend the first line of your commit with **an emoji key**. If the commit is for an alpha feature, no emoji is needed. We are following [gitmoji](https://gitmoji.carloscuesta.me/).

**Main emojis we are using:**

- ✨ Feature
- 🎨 Improvement / change
- 🐛 Bug Fix
- 🌐 i18n (translation) submissions [[See Translating Ghost docs for more detail](https://www.notion.so/5af2858289b44f9194f73f8a1e17af59?pvs=25#bef8c9988e294a4b9a6dd624136de36f)]
- 💡 Anything else flagged to users or whoever is writing release notes

Good commit message examples: [new feature](https://github.com/TryGhost/Ghost/commit/61db6defde3b10a4022c86efac29cf15ae60983f), [bug fix](https://github.com/TryGhost/Ghost/commit/6ef835bb5879421ae9133541ebf8c4e560a4a90e) and [translation](https://github.com/TryGhost/Ghost/commit/83904c1611ae7ab3257b3b7d55f03e50cead62d7).

**Bumping @tryghost dependencies**

When bumping `@tryghost/*` dependencies, the first line should follow the above format and say what has changed, not say what has been bumped.

There is no need to include what modules have changed in the commit message, as this is _very_ clear from the contents of the commit. The commit should focus on surfacing the underlying changes from the dependencies - what actually changed as a result of this dependency bump?

[Good example](https://github.com/TryGhost/Ghost/commit/95751a0e5fb719bb5bca74cb97fb5f29b225094f)


### Submitting Pull Requests

We aim to merge any straightforward, well-understood bug fixes or improvements immediately, as long as they pass our tests (run `pnpm test` to check locally). We generally don’t merge new features and larger changes without prior discussion with the core product team for tech/design specification.

Please provide plenty of context and reasoning around your changes, to help us merge quickly. Closing an already open issue is our preferred workflow. If your PR gets out of date, we may ask you to rebase as you are more familiar with your changes than we will be.

### Sharing feedback on Documentation

While the Docs are no longer Open Source, we welcome revisions and ideas on the forum! Please create a Post with your questions or suggestions in the [Contributing to Ghost Category](https://forum.ghost.org/c/contributing/27). Thank you for helping us keep the Docs relevant and up-to-date.

---

## Contributor License Agreement

By contributing your code to Ghost you grant the Ghost Foundation a non-exclusive, irrevocable, worldwide, royalty-free, sublicenseable, transferable license under all of Your relevant intellectual property rights (including copyright, patent, and any other rights), to use, copy, prepare derivative works of, distribute and publicly perform and display the Contributions on any licensing terms, including without limitation:
(a) open source licenses like the MIT license; and (b) binary, proprietary, or commercial licenses. Except for the licenses granted herein, You reserve all right, title, and interest in and to the Contribution.

You confirm that you are able to grant us these rights. You represent that You are legally entitled to grant the above license. If Your employer has rights to intellectual property that You create, You represent that You have received permission to make the Contributions on behalf of that employer, or that Your employer has waived such rights for the Contributions.

You represent that the Contributions are Your original works of authorship, and to Your knowledge, no other person claims, or has the right to claim, any right in any invention or patent related to the Contributions. You also represent that You are not legally obligated, whether by entering into an agreement or otherwise, in any way that conflicts with the terms of this license.

The Ghost Foundation acknowledges that, except as explicitly described in this Agreement, any Contribution which you provide is on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OR CONDITIONS OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE.
@@ -0,0 +1,3 @@
# You can add one username per supported platform and one custom link
github: tryghost
open_collective: ghost
@@ -0,0 +1,76 @@
name: 🐛 Bug report
description: Report reproducible software issues so we can improve
body:
  - type: markdown
    attributes:
      value: |
        ## Welcome 👋
        Thank you for taking the time to fill out a bug report 🙂

        We'll respond as quickly as we can. The more information you provide the easier & quicker it is for us to diagnose the problem.
  - type: textarea
    id: summary
    attributes:
      label: Issue Summary
      description: Explain roughly what's wrong
    validations:
      required: true
  - type: textarea
    id: reproduction
    attributes:
      label: Steps to Reproduce
      description: Also tell us, what did you expect to happen?
      placeholder: |
        1. This is the first step...
        2. This is the second step, etc.
    validations:
      required: true
  - type: input
    id: version
    attributes:
      label: Ghost Version
    validations:
      required: true
  - type: input
    id: node
    attributes:
      label: Node.js Version
    validations:
      required: true
  - type: input
    id: install
    attributes:
      label: How did you install Ghost?
      description: Provide details of your host & operating system
    validations:
      required: true
  - type: dropdown
    id: database
    attributes:
      label: Database type
      options:
        - MySQL 5.7
        - MySQL 8
        - SQLite3
        - Other
    validations:
      required: true
  - type: input
    id: browsers
    attributes:
      label: Browser & OS version
      description: Include this for frontend bugs
  - type: textarea
    id: logs
    attributes:
      label: Relevant log / error output
      description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
      render: shell
  - type: checkboxes
    id: terms
    attributes:
      label: Code of Conduct
      description: By submitting this issue, you agree to follow our [Code of Conduct](https://ghost.org/conduct)
      options:
        - label: I agree to be friendly and polite to people in this repository
          required: true
@@ -0,0 +1,11 @@
blank_issues_enabled: true
contact_links:
  - name: 🚑 Help & Support
    url: https://forum.ghost.org
    about: Please use the community forum for questions
  - name: 💡 Features & Ideas
    url: https://forum.ghost.org/c/Ideas
    about: Please vote for & post new ideas in the forum
  - name: 📖 Documentation
    url: https://ghost.org/docs/
    about: Tutorials & reference guides for themes, the API and more
@@ -0,0 +1,14 @@
Got some code for us? Awesome 🎊!

Please take a minute to explain the change you're making:
- Why are you making it?
- What does it do?
- Why is this something Ghost users or developers need?

Please check your PR against these items:

- [ ] I've read and followed the [Contributor Guide](https://github.com/TryGhost/Ghost/blob/main/.github/CONTRIBUTING.md)
- [ ] I've explained my change
- [ ] I've written an automated test to prove my change works

We appreciate your contribution! 🙏
@@ -0,0 +1,28 @@
# How to get support for Ghost 👨‍👩‍👧‍👦

For **help**, **support**, **questions** and **ideas** please use **[our forum](https://forum.ghost.org)** 🚑.

Please **_do not_** raise an issue on GitHub.

We have a **help** category in our **[forum](https://forum.ghost.org/)** where you can get quick answers,
help with debugging weird issues, and general help with any aspect of Ghost. There's also an **ideas** category for feature requests.

Our extensive **documentation** can be found at https://ghost.org/docs/.

Please go to https://forum.ghost.org and sign up to join our community.
You can create a new account, or sign up using Google, Twitter or Facebook.

Issues which are not bug reports will be closed.

## Using Ghost(Pro)?

**Ghost(Pro)** users have access to email support via the support at ghost dot org address.

## Why not GitHub?

GitHub is our office, it's the place where our development team does its work. We use the issue list
to keep track of bugs and the features that we are working on. We do this openly for transparency.

With the forum, you can leverage the knowledge of our wider community to get help with any problems you are
having with Ghost. Please keep in mind that Ghost is FLOSS, and free support is provided by the goodwill
of our wonderful community members.
|
||||
@@ -0,0 +1,155 @@
|
||||
---
|
||||
description: GitHub Agentic Workflows (gh-aw) - Create, debug, and upgrade AI-powered workflows with intelligent prompt routing
|
||||
disable-model-invocation: true
|
||||
---
|
||||
|
||||
# GitHub Agentic Workflows Agent
|
||||
|
||||
This agent helps you work with **GitHub Agentic Workflows (gh-aw)**, a CLI extension for creating AI-powered workflows in natural language using markdown files.
|
||||
|
||||
## What This Agent Does
|
||||
|
||||
This is a **dispatcher agent** that routes your request to the appropriate specialized prompt based on your task:
|
||||
|
||||
- **Creating new workflows**: Routes to `create` prompt
|
||||
- **Updating existing workflows**: Routes to `update` prompt
|
||||
- **Debugging workflows**: Routes to `debug` prompt
|
||||
- **Upgrading workflows**: Routes to `upgrade-agentic-workflows` prompt
|
||||
- **Creating shared components**: Routes to `create-shared-agentic-workflow` prompt
|
||||
- **Fixing Dependabot PRs**: Routes to `dependabot` prompt — use this when Dependabot opens PRs that modify generated manifest files (`.github/workflows/package.json`, `.github/workflows/requirements.txt`, `.github/workflows/go.mod`). Never merge those PRs directly; instead update the source `.md` files and rerun `gh aw compile --dependabot` to bundle all fixes
|
||||

Workflows may optionally include:

- **Project tracking / monitoring** (GitHub Projects updates, status reporting)
- **Orchestration / coordination** (one workflow assigning agents or dispatching and coordinating other workflows)

## Files This Applies To

- Workflow files: `.github/workflows/*.md` and `.github/workflows/**/*.md`
- Workflow lock files: `.github/workflows/*.lock.yml`
- Shared components: `.github/workflows/shared/*.md`
- Configuration: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/github-agentic-workflows.md

## Problems This Solves

- **Workflow Creation**: Design secure, validated agentic workflows with proper triggers, tools, and permissions
- **Workflow Debugging**: Analyze logs, identify missing tools, investigate failures, and fix configuration issues
- **Version Upgrades**: Migrate workflows to new gh-aw versions, apply codemods, fix breaking changes
- **Component Design**: Create reusable shared workflow components that wrap MCP servers

## How to Use

When you interact with this agent, it will:

1. **Understand your intent** - Determine what kind of task you're trying to accomplish
2. **Route to the right prompt** - Load the specialized prompt file for your task
3. **Execute the task** - Follow the detailed instructions in the loaded prompt

## Available Prompts

### Create New Workflow
**Load when**: User wants to create a new workflow from scratch, add automation, or design a workflow that doesn't exist yet

**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/create-agentic-workflow.md

**Use cases**:
- "Create a workflow that triages issues"
- "I need a workflow to label pull requests"
- "Design a weekly research automation"

### Update Existing Workflow
**Load when**: User wants to modify, improve, or refactor an existing workflow

**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/update-agentic-workflow.md

**Use cases**:
- "Add web-fetch tool to the issue-classifier workflow"
- "Update the PR reviewer to use discussions instead of issues"
- "Improve the prompt for the weekly-research workflow"

### Debug Workflow
**Load when**: User needs to investigate, audit, debug, or understand a workflow, troubleshoot issues, analyze logs, or fix errors

**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/debug-agentic-workflow.md

**Use cases**:
- "Why is this workflow failing?"
- "Analyze the logs for workflow X"
- "Investigate missing tool calls in run #12345"

### Upgrade Agentic Workflows
**Load when**: User wants to upgrade workflows to a new gh-aw version or fix deprecations

**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/upgrade-agentic-workflows.md

**Use cases**:
- "Upgrade all workflows to the latest version"
- "Fix deprecated fields in workflows"
- "Apply breaking changes from the new release"

### Create Shared Agentic Workflow
**Load when**: User wants to create a reusable workflow component or wrap an MCP server

**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/create-shared-agentic-workflow.md

**Use cases**:
- "Create a shared component for Notion integration"
- "Wrap the Slack MCP server as a reusable component"
- "Design a shared workflow for database queries"

### Fix Dependabot PRs
**Load when**: User needs to close or fix open Dependabot PRs that update dependencies in generated manifest files (`.github/workflows/package.json`, `.github/workflows/requirements.txt`, `.github/workflows/go.mod`)

**Prompt file**: https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/dependabot.md

**Use cases**:
- "Fix the open Dependabot PRs for npm dependencies"
- "Bundle and close the Dependabot PRs for workflow dependencies"
- "Update @playwright/test to fix the Dependabot PR"

## Instructions

When a user interacts with you:

1. **Identify the task type** from the user's request
2. **Load the appropriate prompt** from the GitHub repository URLs listed above
3. **Follow the loaded prompt's instructions** exactly
4. **If uncertain**, ask clarifying questions to determine the right prompt

## Quick Reference

```bash
# Initialize repository for agentic workflows
gh aw init

# Generate the lock file for a workflow
gh aw compile [workflow-name]

# Debug workflow runs
gh aw logs [workflow-name]
gh aw audit <run-id>

# Upgrade workflows
gh aw fix --write
gh aw compile --validate
```

## Key Features of gh-aw

- **Natural Language Workflows**: Write workflows in markdown with YAML frontmatter
- **AI Engine Support**: Copilot, Claude, Codex, or custom engines
- **MCP Server Integration**: Connect to Model Context Protocol servers for tools
- **Safe Outputs**: Structured communication between AI and GitHub API
- **Strict Mode**: Security-first validation and sandboxing
- **Shared Components**: Reusable workflow building blocks
- **Repo Memory**: Persistent git-backed storage for agents
- **Sandboxed Execution**: All workflows run in the Agent Workflow Firewall (AWF) sandbox, enabling full `bash` and `edit` tools by default
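As a concrete illustration of the first feature, a minimal workflow file might look like the following. This is a hedged sketch only: the workflow name, prompt text, and the specific frontmatter fields (`on`, `permissions`, `engine`, `safe-outputs`) are illustrative assumptions here; the authoritative field list for your gh-aw version is in the instructions file referenced in this document.

```markdown
---
on:
  issues:
    types: [opened]
permissions:
  contents: read
engine: copilot
safe-outputs:
  add-comment: {}
---

# Issue Greeter

When a new issue is opened, read its title and body, then post one
short, friendly comment summarizing what the issue is about.
```

The natural-language body below the frontmatter is the prompt the AI engine receives; `gh aw compile` turns the whole file into a `.lock.yml` GitHub Actions workflow.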

## Important Notes

- Always reference the instructions file at https://github.com/github/gh-aw/blob/v0.49.3/.github/aw/github-agentic-workflows.md for complete documentation
- Use the MCP tool `agentic-workflows` when running in GitHub Copilot Cloud
- Workflows must be compiled to `.lock.yml` files before running in GitHub Actions
- **Bash tools are enabled by default** - Don't restrict bash commands unnecessarily since workflows are sandboxed by the AWF
- Follow security best practices: minimal permissions, explicit network access, no template injection
- **Single-file output**: When creating a workflow, produce exactly **one** workflow `.md` file. Do not create separate documentation files (architecture docs, runbooks, usage guides, etc.). If documentation is needed, add a brief `## Usage` section inside the workflow file itself.
@@ -0,0 +1,14 @@
{
  "entries": {
    "actions/github-script@v8": {
      "repo": "actions/github-script",
      "version": "v8",
      "sha": "ed597411d8f924073f98dfc5c65a23a2325f34cd"
    },
    "github/gh-aw/actions/setup@v0.51.5": {
      "repo": "github/gh-aw/actions/setup",
      "version": "v0.51.5",
      "sha": "88319be75ab1adc60640307a10e5cf04b3deff1e"
    }
  }
}
@@ -0,0 +1,20 @@
codecov:
  require_ci_to_pass: true
coverage:
  status:
    patch: false
    project:
      default: false
      admin-tests:
        flags:
          - admin-tests
        threshold: 0.2%
      e2e-tests:
        flags:
          - e2e-tests
        threshold: 0.2%
flags:
  admin-tests:
    carryforward: true
  e2e-tests:
    carryforward: true
Executable
+2
@@ -0,0 +1,2 @@
#!/usr/bin/env sh
exec bash "$(dirname "$0")/commit-msg.bash" "$@"
Executable
+97
@@ -0,0 +1,97 @@
#!/usr/bin/env bash

# Get the commit message file path from the first argument
commit_msg_file="$1"

# Read the commit message
commit_msg=$(cat "$commit_msg_file")

# Colors for output
red='\033[0;31m'
yellow='\033[1;33m'
no_color='\033[0m'

# Get the first line (subject)
subject=$(echo "$commit_msg" | head -n1)

# Get the second line
second_line=$(echo "$commit_msg" | sed -n '2p')

# Get the third line
third_line=$(echo "$commit_msg" | sed -n '3p')

# Get the rest of the message (body)
body=$(echo "$commit_msg" | tail -n +4)

# Check subject length (max 80 characters)
if [ ${#subject} -gt 80 ]; then
    echo -e "${yellow}Warning: Commit message subject is too long (max 80 characters)${no_color}"
    echo -e "Current length: ${#subject} characters"
fi

# Check if second line is blank
if [ ! -z "$second_line" ]; then
    echo -e "${yellow}Warning: Second line should be blank${no_color}"
fi

# Check third line format
if [ ! -z "$third_line" ]; then
    if [[ "$third_line" =~ ^(refs|ref:) ]]; then
        echo -e "${red}Error: Third line should not start with 'refs' or 'ref:'${no_color}" >&2
        echo -e "Use 'ref <issue link>', 'fixes <issue link>', or 'closes <issue link>' instead" >&2
        echo -e "${yellow}Press Enter to edit the message...${no_color}" >&2
        read < /dev/tty # Wait for Enter key press from the terminal

        # Get the configured Git editor
        editor=$(git var GIT_EDITOR)
        if [ -z "$editor" ]; then
            editor=${VISUAL:-${EDITOR:-vi}} # Fallback logic similar to Git
        fi

        # Re-open the editor on the commit message file, connected to the terminal
        $editor "$commit_msg_file" < /dev/tty

        # Re-read the potentially modified commit message after editing
        commit_msg=$(cat "$commit_msg_file")
        # Need to update related variables as well
        subject=$(echo "$commit_msg" | head -n1)
        second_line=$(echo "$commit_msg" | sed -n '2p')
        third_line=$(echo "$commit_msg" | sed -n '3p')
        body=$(echo "$commit_msg" | tail -n +4)

        # Re-check the third line *again* after editing
        if [[ "$third_line" =~ ^(refs|ref:) ]]; then
            echo -e "${red}Error: Third line still starts with 'refs' or 'ref:'. Commit aborted.${no_color}" >&2
            exit 1 # Abort commit if still invalid
        fi
        # If fixed, the script will continue to the next checks
    fi

    if ! [[ "$third_line" =~ ^(ref|fixes|closes)\ .*$ ]]; then
        echo -e "${yellow}Warning: Third line should start with 'ref', 'fixes', or 'closes' followed by an issue link${no_color}" >&2
    fi
fi

# Check for body content (why explanation)
if [ -z "$body" ]; then
    echo -e "${yellow}Warning: Missing explanation of why this change was made${no_color}"
    echo -e "The body should explain: why this, why now, why not something else?"
fi

# Check for emoji in user-facing changes
if [[ "$subject" =~ ^[^[:space:]]*[[:space:]] ]]; then
    first_word="${subject%% *}"
    if [[ ! "$first_word" =~ ^[[:punct:]] ]]; then
        echo -e "${yellow}Warning: User-facing changes should start with an emoji${no_color}"
        echo -e "Common emojis: ✨ (Feature), 🎨 (Improvement), 🐛 (Bug Fix), 🌐 (i18n), 💡 (User-facing)"
    fi
fi

# Check for past tense verbs in subject
past_tense_words="Fixed|Changed|Updated|Improved|Added|Removed|Reverted|Moved|Released|Bumped|Cleaned"
if ! echo "$subject" | grep -iE "$past_tense_words" > /dev/null; then
    echo -e "${yellow}Warning: Subject line should use past tense${no_color}"
    echo -e "Use one of: Fixed, Changed, Updated, Improved, Added, Removed, Reverted, Moved, Released, Bumped, Cleaned"
fi

exit 0
Executable
+2
@@ -0,0 +1,2 @@
#!/usr/bin/env sh
exec bash "$(dirname "$0")/pre-commit.bash" "$@"
Executable
+116
@@ -0,0 +1,116 @@
#!/usr/bin/env bash
# Modified from https://github.com/chaitanyagupta/gitutils

[ -n "$CI" ] && exit 0

pnpm lint-staged --relative
lintStatus=$?

if [ $lintStatus -ne 0 ]; then
    echo "❌ Linting failed"
    exit 1
fi

green='\033[0;32m'
no_color='\033[0m'
grey='\033[0;90m'
red='\033[0;31m'

##
## 1) Check and remove submodules before committing
##

ROOT_DIR=$(git rev-parse --show-cdup)
SUBMODULES=$(grep path "${ROOT_DIR}.gitmodules" | sed 's/^.*path = //')
MOD_SUBMODULES=$(git diff --cached --name-only --ignore-submodules=none | grep -F "$SUBMODULES" || true)

echo -e "Checking submodules ${grey}(pre-commit hook)${no_color} "

# If no modified submodules, exit with status code 0, else remove them and continue
if [[ -n "$MOD_SUBMODULES" ]]; then
    echo -e "${grey}Removing submodules from commit...${no_color}"
    for SUB in $MOD_SUBMODULES
    do
        git reset --quiet HEAD "$SUB"
        echo -e "\t${grey}removed:\t$SUB${no_color}"
    done
    echo
    echo -e "${grey}Submodules removed from commit, continuing...${no_color}"

    # If there are no changes to commit after removing submodules, abort to avoid an empty commit
    if output=$(git status --porcelain) && [ -z "$output" ]; then
        echo -e "nothing to commit, working tree clean"
        exit 1
    fi
else
    echo "No submodules in commit, continuing..."
fi

##
## 2) Suggest shipping a new version of @tryghost/activitypub when changes are detected
## The intent is to ship smaller changes more frequently to production
##

increment_version() {
    local package_json_path=$1
    local version_type=$2

    local current_version
    current_version=$(grep '"version":' "$package_json_path" | awk -F '"' '{print $4}')

    IFS='.' read -r major minor patch <<< "$current_version"

    case "$version_type" in
        major) ((major++)); minor=0; patch=0 ;;
        minor) ((minor++)); patch=0 ;;
        patch) ((patch++)) ;;
        *) echo "Invalid version type"; exit 1 ;;
    esac

    new_version="$major.$minor.$patch"

    # Update package.json with new version
    if [[ "$OSTYPE" == "darwin"* ]]; then
        # macOS
        sed -i '' -E "s/\"version\": \"[0-9]+\.[0-9]+\.[0-9]+\"/\"version\": \"$new_version\"/" "$package_json_path"
    else
        # Linux and others
        sed -i -E "s/\"version\": \"[0-9]+\.[0-9]+\.[0-9]+\"/\"version\": \"$new_version\"/" "$package_json_path"
    fi

    echo "Updated version to $new_version in $package_json_path"
}

AP_BUMP_NEEDED=false
MODIFIED_FILES=$(git diff --cached --name-only)

for FILE in $MODIFIED_FILES; do
    if [[ "$FILE" == apps/activitypub/* ]]; then
        AP_BUMP_NEEDED=true
        break
    fi
done

if [[ "$AP_BUMP_NEEDED" == true ]]; then
    echo -e "\nYou have made changes to @tryghost/activitypub."
    echo -e "Would you like to ship a new version? (yes)"
    read -r new_version </dev/tty

    if [[ -z "$new_version" || "$new_version" == "yes" || "$new_version" == "y" ]]; then
        echo -e "Is that a patch, minor or major? (patch)"
        read -r version_type </dev/tty

        # Default to patch
        if [[ -z "$version_type" ]]; then
            version_type="patch"
        fi

        if [[ "$version_type" != "patch" && "$version_type" != "minor" && "$version_type" != "major" ]]; then
            echo -e "${red}Invalid input. Skipping version bump.${no_color}"
        else
            echo "Bumping version ($version_type)..."
            increment_version "apps/activitypub/package.json" "$version_type"
            git add apps/activitypub/package.json
        fi
    fi
fi
@@ -0,0 +1,247 @@
{
  "extends": [
    "github>tryghost/renovate-config"
  ],
  // Limit concurrent branches to keep Renovate runs within the 30-minute
  // Mend timeout and avoid overwhelming CI with dozens of queued jobs.
  // The shared preset disables rate limiting, but Ghost's monorepo is
  // large enough that unlimited branches cause timeouts during rebasing.
  "branchConcurrentLimit": 10,
  // Keep manually-closed immortal/grouped PRs closed unless explicitly
  // reopened from the Dependency Dashboard.
  "recreateWhen": "never",
  // pnpm lockfile generation has been hitting Mend's 3GB memory ceiling.
  // Renovate maintainers suggested starting with toolSettings.nodeMaxMemory
  // set to 1024MB to reduce pnpm's Node heap usage and keep the overall job
  // under the hosted runner limit.
  "toolSettings": {
    "nodeMaxMemory": 1024
  },
  // We have to disable platform-based automerge (forcing Renovate to do it
  // manually), as otherwise Renovate won't follow our schedule
  "platformAutomerge": false,
  "timezone": "Etc/UTC",
  // Restrict Renovate runs to the automerge windows so branch updates
  // (rebasing, force-pushes) happen around the same times automerge
  // can actually complete, not during the working day when CI is busy.
  // Each block starts one hour earlier than the matching automerge
  // window so Renovate has time to rebase and open/refresh PRs before
  // automerge is eligible to run.
  "schedule": [
    // Run all weekend
    "* * * * 0,6",
    // Run on Monday morning (Sun 23:00 is already covered by weekend)
    "* 0-12 * * 1",
    // Run on weekday evenings, starting 1 hour earlier than automerge
    "* 21-23 * * 1-5",
    // Run on early weekday mornings (previous day 23:00 is already
    // covered by the evening block above)
    "* 0-4 * * 2-6"
  ],
  "automergeSchedule": [
    // Allow automerge all weekend
    "* * * * 0,6",
    // Allow automerge on Monday morning
    "* 0-12 * * 1",
    // Allow automerge overnight on weekday evenings (10pm-4:59am UTC)
    "* 22-23 * * 1-5",
    "* 0-4 * * 2-6"
  ],
  "ignoreDeps": [
    // https://github.com/TryGhost/Ghost/commit/2b9e494dfcb95c40f596ccf54ec3151c25d53601
    // `got` 10.x has a Node 10 bug that makes it pretty much unusable for now
    "got",
    // https://github.com/TryGhost/Ghost/commit/2b9e494dfcb95c40f596ccf54ec3151c25d53601
    // `intl-messageformat` 6.0.0 introduced a breaking change in terms of
    // escaping that would be pretty difficult to fix for now
    "intl-messageformat",
    // https://github.com/TryGhost/Ghost/commit/b2fa84c7ff9bf8e21b0791f268f57e92759a87b1
    // no reason given
    "moment",
    // https://github.com/TryGhost/Ghost/pull/10672
    // https://github.com/TryGhost/Ghost/issues/10870
    "moment-timezone",
    // https://github.com/TryGhost/Admin/pull/1111/files
    // Ignored because of a mobiledoc-kit issue, but that's now in koenig, so this can probably be cleaned up
    "simple-dom",
    // https://github.com/TryGhost/Admin/pull/1111/files
    // https://github.com/TryGhost/Ghost/pull/10672
    // These have been ignored since forever
    "ember-drag-drop",
    "normalize.css",
    "validator",

    // https://github.com/TryGhost/Ghost/commit/7ebf2891b7470a1c2ffeddefb2fe5e7a57319df3
    // Changed how modules are loaded, caused a weird error during render
    "@embroider/macros",

    // https://github.com/TryGhost/Ghost/commit/a10ad3767f60ed2c8e56feb49e7bf83d9618b2ab
    // Caused line-spacing issues in the editor, but it's now used in different places,
    // so it's unclear whether this is still relevant - soon we will finish switching to react-codemirror
    "codemirror",

    // https://github.com/TryGhost/Ghost/commit/3236891b80988924fbbdb625d30cb64a7bf2afd1
    // ember-cli-code-coverage@2.0.0 broke our code coverage
    "ember-cli-code-coverage",
    // https://github.com/TryGhost/Ghost/commit/1382e34e42a513c201cb957b7f843369a2ce1b63
    // ember-cli-terser@4.0.2 has a regression that breaks our sourcemaps
    "ember-cli-terser"
  ],
  "ignorePaths": [
    "test",
    "ghost/admin/lib/koenig-editor/package.json"
  ],
  "packageRules": [
    // Always require dashboard approval for major updates.
    // This was largely to avoid the noise of major updates that were ESM-only:
    // the idea was to check and accept major updates if they were NOT ESM,
    // but this hasn't been workable with our capacity.
    // Plus, ESM-only is an edge case in the grand scheme of dependencies.
    {
      "description": "Require dashboard approval for major updates",
      "matchUpdateTypes": [
        "major"
      ],
      "dependencyDashboardApproval": true
    },

    // Group NQL packages separately from other TryGhost packages
    {
      "groupName": "NQL packages",
      "matchPackageNames": [
        "@tryghost/nql",
        "@tryghost/nql-lang"
      ]
    },

    // Split the broad shared TryGhost group into smaller logical lanes so
    // failures in one area (e.g. email rendering) don't block unrelated
    // internal package updates from merging.
    {
      "groupName": "TryGhost runtime packages",
      "matchPackageNames": [
        "@tryghost/adapter-base-cache",
        "@tryghost/admin-api-schema",
        "@tryghost/api-framework",
        "@tryghost/bookshelf-plugins",
        "@tryghost/database-info",
        "@tryghost/debug",
        "@tryghost/domain-events",
        "@tryghost/errors",
        "@tryghost/http-cache-utils",
        "@tryghost/job-manager",
        "@tryghost/logging",
        "@tryghost/metrics",
        "@tryghost/mw-error-handler",
        "@tryghost/mw-vhost",
        "@tryghost/pretty-cli",
        "@tryghost/prometheus-metrics",
        "@tryghost/promise",
        "@tryghost/referrer-parser",
        "@tryghost/root-utils",
        "@tryghost/security",
        "@tryghost/social-urls",
        "@tryghost/tpl",
        "@tryghost/validator",
        "@tryghost/version",
        "@tryghost/zip"
      ]
    },
    {
      "groupName": "TryGhost admin support packages",
      "matchPackageNames": [
        "@tryghost/color-utils",
        "@tryghost/custom-fonts",
        "@tryghost/limit-service",
        "@tryghost/members-csv",
        "@tryghost/timezone-data"
      ]
    },
    {
      "groupName": "TryGhost content and email packages",
      "matchPackageNames": [
        "@tryghost/config-url-helpers",
        "@tryghost/content-api",
        "@tryghost/helpers",
        "@tryghost/html-to-mobiledoc",
        "@tryghost/html-to-plaintext",
        "@tryghost/nodemailer",
        "@tryghost/parse-email-address",
        "@tryghost/request",
        "@tryghost/string",
        "@tryghost/url-utils"
      ]
    },
    {
      "groupName": "TryGhost test support packages",
      "matchPackageNames": [
        "@tryghost/email-mock-receiver",
        "@tryghost/express-test",
        "@tryghost/webhook-mock-receiver"
      ]
    },

    // Always automerge these packages:
    {
      "matchPackageNames": [
        // This is a pre-1.0.0 package, but it provides icons,
        // is very regularly updated, and seems safe to update
        "lucide-react"
      ],
      "automerge": true
    },

    // Allow internal Docker digest pins to automerge once the relevant
    // CI checks have gone green.
    {
      "description": "Automerge internal Docker digest updates after CI passes",
      "matchDatasources": [
        "docker"
      ],
      "matchPackageNames": [
        "ghost/traffic-analytics",
        "tinybirdco/tinybird-local"
      ],
      "matchUpdateTypes": [
        "digest"
      ],
      "automerge": true,
      "automergeType": "pr"
    },

    // Ignore all ember-related packages in admin.
    // Our ember codebase is being replaced with react, and
    // most of the dependencies have breaking changes that are too hard to update,
    // so we'll leave these as-is for now.
    {
      "groupName": "Disable ember updates",
      "matchFileNames": [
        "ghost/admin/package.json"
      ],
      "matchPackageNames": [
        // `ember-foo` style packages
        "/^ember(-|$)/",
        // scoped `@ember/*` packages
        "/^@ember\\//",
        // foo/ember-something style packages
        "/\\/ember(-|$)/"
      ],
      "enabled": false
    },

    // Don't allow css preprocessor updates in admin
    {
      "groupName": "disable css",
      "matchFileNames": [
        "ghost/admin/package.json"
      ],
      "matchPackageNames": [
        "autoprefixer",
        "ember-cli-postcss",
        "/^postcss/",
        "/^css/"
      ],
      "enabled": false
    }
  ]
}
@@ -0,0 +1,44 @@
const fs = require('fs/promises');
const exec = require('util').promisify(require('child_process').exec);
const path = require('path');

const semver = require('semver');

(async () => {
    const core = await import('@actions/core');
    const corePackageJsonPath = path.join(__dirname, '../../ghost/core/package.json');
    const corePackageJson = require(corePackageJsonPath);

    const current_version = corePackageJson.version;
    console.log(`Current version: ${current_version}`);

    const firstArg = process.argv[2];
    console.log('firstArg', firstArg);

    const buildString = await exec('git rev-parse --short HEAD').then(({stdout}) => stdout.trim());

    let newVersion;

    if (firstArg === 'canary' || firstArg === 'six') {
        const bumpedVersion = semver.inc(current_version, 'minor');
        newVersion = `${bumpedVersion}-pre-g${buildString}`;
    } else {
        newVersion = `${current_version}-0-g${buildString}`;
    }

    newVersion += '+moya';
    console.log('newVersion', newVersion);

    corePackageJson.version = newVersion;
    await fs.writeFile(corePackageJsonPath, JSON.stringify(corePackageJson, null, 2));

    const adminPackageJsonPath = path.join(__dirname, '../../ghost/admin/package.json');
    const adminPackageJson = require(adminPackageJsonPath);
    adminPackageJson.version = newVersion;
    await fs.writeFile(adminPackageJsonPath, JSON.stringify(adminPackageJson, null, 2));

    console.log('Version bumped to', newVersion);

    core.setOutput('BUILD_VERSION', newVersion);
    core.setOutput('GIT_COMMIT_HASH', buildString);
})();
@@ -0,0 +1,256 @@
const fs = require('fs');
const path = require('path');
const execFileSync = require('child_process').execFileSync;

const MONITORED_APPS = {
    portal: {
        packageName: '@tryghost/portal',
        path: 'apps/portal'
    },
    sodoSearch: {
        packageName: '@tryghost/sodo-search',
        path: 'apps/sodo-search'
    },
    comments: {
        packageName: '@tryghost/comments-ui',
        path: 'apps/comments-ui'
    },
    announcementBar: {
        packageName: '@tryghost/announcement-bar',
        path: 'apps/announcement-bar'
    },
    signupForm: {
        packageName: '@tryghost/signup-form',
        path: 'apps/signup-form'
    }
};

const MONITORED_APP_ENTRIES = Object.entries(MONITORED_APPS);
const MONITORED_APP_PATHS = MONITORED_APP_ENTRIES.map(([, app]) => app.path);

function runGit(args) {
    try {
        return execFileSync('git', args, {encoding: 'utf8'}).trim();
    } catch (error) {
        const stderr = error.stderr ? error.stderr.toString().trim() : '';
        const stdout = error.stdout ? error.stdout.toString().trim() : '';
        const message = stderr || stdout || error.message;
        throw new Error(`Failed to run "git ${args.join(' ')}": ${message}`);
    }
}

function readVersionFromPackageJson(packageJsonContent, sourceLabel) {
    let parsedPackageJson;

    try {
        parsedPackageJson = JSON.parse(packageJsonContent);
    } catch (error) {
        throw new Error(`Unable to parse ${sourceLabel}: ${error.message}`);
    }

    if (!parsedPackageJson.version || typeof parsedPackageJson.version !== 'string') {
        throw new Error(`${sourceLabel} does not contain a valid "version" field`);
    }

    return parsedPackageJson.version;
}

function parseSemver(version) {
    const match = version.match(/^v?(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-([0-9A-Za-z-]+(?:\.[0-9A-Za-z-]+)*))?(?:\+[0-9A-Za-z-]+(?:\.[0-9A-Za-z-]+)*)?$/);

    if (!match) {
        throw new Error(`Invalid semver version "${version}"`);
    }

    const prerelease = match[4] ? match[4].split('.').map((identifier) => {
        if (/^\d+$/.test(identifier)) {
            return Number(identifier);
        }

        return identifier;
    }) : [];

    return {
        major: Number(match[1]),
        minor: Number(match[2]),
        patch: Number(match[3]),
        prerelease
    };
}

function comparePrereleaseIdentifier(a, b) {
    const isANumber = typeof a === 'number';
    const isBNumber = typeof b === 'number';

    if (isANumber && isBNumber) {
        if (a === b) {
            return 0;
        }

        return a > b ? 1 : -1;
    }

    if (isANumber) {
        return -1;
    }

    if (isBNumber) {
        return 1;
    }

    if (a === b) {
        return 0;
    }

    return a > b ? 1 : -1;
}

function compareSemver(a, b) {
    const aVersion = parseSemver(a);
    const bVersion = parseSemver(b);

    if (aVersion.major !== bVersion.major) {
        return aVersion.major > bVersion.major ? 1 : -1;
    }

    if (aVersion.minor !== bVersion.minor) {
        return aVersion.minor > bVersion.minor ? 1 : -1;
    }

    if (aVersion.patch !== bVersion.patch) {
        return aVersion.patch > bVersion.patch ? 1 : -1;
    }

    const aPrerelease = aVersion.prerelease;
    const bPrerelease = bVersion.prerelease;

    if (!aPrerelease.length && !bPrerelease.length) {
        return 0;
    }

    if (!aPrerelease.length) {
        return 1;
    }

    if (!bPrerelease.length) {
        return -1;
    }

    const maxLength = Math.max(aPrerelease.length, bPrerelease.length);
    for (let i = 0; i < maxLength; i += 1) {
        const aIdentifier = aPrerelease[i];
        const bIdentifier = bPrerelease[i];

        if (aIdentifier === undefined) {
            return -1;
        }

        if (bIdentifier === undefined) {
            return 1;
        }

        const identifierComparison = comparePrereleaseIdentifier(aIdentifier, bIdentifier);
        if (identifierComparison !== 0) {
            return identifierComparison;
        }
    }

    return 0;
}

function getChangedFiles(baseSha, compareSha) {
    let mergeBaseSha;

    try {
        mergeBaseSha = runGit(['merge-base', baseSha, compareSha]);
    } catch (error) {
        throw new Error(`Unable to determine merge-base for ${baseSha} and ${compareSha}. Ensure the base branch history is available in the checkout.\n${error.message}`);
    }

    return runGit(['diff', '--name-only', mergeBaseSha, compareSha, '--', ...MONITORED_APP_PATHS])
        .split('\n')
        .map(file => file.trim())
        .filter(Boolean);
}

function getChangedApps(changedFiles) {
    return MONITORED_APP_ENTRIES
        .filter(([, app]) => {
            return changedFiles.some((file) => {
                return file === app.path || file.startsWith(`${app.path}/`);
            });
        })
        .map(([key, app]) => ({key, ...app}));
}

function getPrVersion(app) {
    const packageJsonPath = path.resolve(__dirname, `../../${app.path}/package.json`);

    if (!fs.existsSync(packageJsonPath)) {
        throw new Error(`${app.path}/package.json does not exist in this PR`);
    }

    return readVersionFromPackageJson(
        fs.readFileSync(packageJsonPath, 'utf8'),
        `${app.path}/package.json from PR`
    );
}

function getMainVersion(app) {
    return readVersionFromPackageJson(
        runGit(['show', `origin/main:${app.path}/package.json`]),
        `${app.path}/package.json from main`
    );
}

function main() {
    const baseSha = process.env.PR_BASE_SHA;
    const compareSha = process.env.PR_COMPARE_SHA || process.env.GITHUB_SHA;

    if (!baseSha) {
        throw new Error('Missing PR_BASE_SHA environment variable');
    }

    if (!compareSha) {
        throw new Error('Missing PR_COMPARE_SHA/GITHUB_SHA environment variable');
    }

    const changedFiles = getChangedFiles(baseSha, compareSha);
    const changedApps = getChangedApps(changedFiles);

    if (changedApps.length === 0) {
        console.log(`No app changes detected. Skipping version bump check.`);
        return;
    }

    console.log(`Checking version bump for apps: ${changedApps.map(app => app.key).join(', ')}`);

    const failedApps = [];

    for (const app of changedApps) {
        const prVersion = getPrVersion(app);
        const mainVersion = getMainVersion(app);

        if (compareSemver(prVersion, mainVersion) <= 0) {
            failedApps.push(
                `${app.key} (${app.packageName}) was changed but version was not bumped above main (${prVersion} <= ${mainVersion}). Please run "pnpm ship" in ${app.path} to bump the package version.`
            );
            continue;
        }

        console.log(`${app.key} version bump check passed (${prVersion} > ${mainVersion})`);
    }

    if (failedApps.length) {
        throw new Error(`Version bump checks failed:\n- ${failedApps.join('\n- ')}`);
    }

    console.log('All monitored app version bump checks passed.');
}

try {
main();
|
||||
} catch (error) {
|
||||
console.error(error.message);
|
||||
process.exit(1);
|
||||
}
|
||||
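The prerelease loop above implements semver's ordering rule that a full release outranks any prerelease of the same version, and that running out of identifiers first ranks lower. A minimal standalone sketch of the same ordering (the function and identifier-comparison logic here are illustrative, not the script's actual `comparePrereleaseIdentifier` helper):

```javascript
// Sketch of semver prerelease ordering: a full release (empty identifier
// list) outranks any prerelease; numeric identifiers compare numerically
// and rank below alphanumeric ones; a shorter list with an equal prefix
// ranks below a longer one.
function comparePrerelease(a, b) {
    if (!a.length && !b.length) return 0;
    if (!a.length) return 1;  // a is a full release
    if (!b.length) return -1; // b is a full release
    const len = Math.max(a.length, b.length);
    for (let i = 0; i < len; i += 1) {
        if (a[i] === undefined) return -1; // a ran out first => lower
        if (b[i] === undefined) return 1;
        const aNum = /^\d+$/.test(a[i]);
        const bNum = /^\d+$/.test(b[i]);
        if (aNum && bNum) {
            const diff = Number(a[i]) - Number(b[i]);
            if (diff !== 0) return Math.sign(diff);
        } else if (aNum !== bNum) {
            return aNum ? -1 : 1; // numeric sorts below alphanumeric
        } else if (a[i] !== b[i]) {
            return a[i] < b[i] ? -1 : 1;
        }
    }
    return 0;
}

console.log(comparePrerelease([], ['beta']));          // → 1 (release > prerelease)
console.log(comparePrerelease(['alpha'], ['alpha', '1'])); // → -1
console.log(comparePrerelease(['1'], ['beta']));       // → -1
```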
@@ -0,0 +1,44 @@
// NOTE: this file can't use any NPM dependencies because it needs to run even if dependencies aren't installed yet or are corrupted
const {execSync} = require('child_process');

resetNxCache();
deleteNodeModules();
deleteBuildArtifacts();
console.log('Cleanup complete!');

function deleteBuildArtifacts() {
    console.log('Deleting all build artifacts...');
    try {
        execSync('find ./ghost -type d -name "build" -exec rm -rf \'{}\' +', {
            stdio: 'inherit'
        });
        execSync('find ./ghost -type f -name "tsconfig.tsbuildinfo" -delete', {
            stdio: 'inherit'
        });
    } catch (error) {
        console.error('Failed to delete build artifacts:', error);
        process.exit(1);
    }
}

function deleteNodeModules() {
    console.log('Deleting all node_modules directories...');
    try {
        execSync('find . -name "node_modules" -type d -prune -exec rm -rf \'{}\' +', {
            stdio: 'inherit'
        });
    } catch (error) {
        console.error('Failed to delete node_modules directories:', error);
        process.exit(1);
    }
}

function resetNxCache() {
    console.log('Resetting NX cache...');
    try {
        execSync('rm -rf .nxcache .nx');
    } catch (error) {
        console.error('Failed to reset NX cache:', error);
        process.exit(1);
    }
}
Executable
+710
@@ -0,0 +1,710 @@
#!/usr/bin/env node

'use strict';

const fs = require('fs');
const path = require('path');
const jsonc = require('jsonc-parser');
const { execSync } = require('child_process');

/**
 * Parse pnpm outdated --json output into an array of
 * [packageName, current, wanted, latest, dependencyType] tuples.
 *
 * pnpm's JSON output is an object keyed by package name:
 * { "pkg": { "wanted": "1.0.1", "latest": "2.0.0", "dependencyType": "dependencies" } }
 *
 * pnpm's JSON output does not include a "current" field — "wanted"
 * represents the lockfile-resolved version, so we use it as current.
 */
function parsePnpmOutdatedOutput(stdout) {
    if (!stdout || !stdout.trim()) {
        return [];
    }

    const data = JSON.parse(stdout);
    return Object.entries(data).map(([name, info]) => [
        name,
        info.wanted,
        info.wanted,
        info.latest,
        info.dependencyType
    ]);
}

/**
 * Smart lockfile drift detector that focuses on actionable updates
 * and avoids API rate limits by using pnpm's built-in commands where possible
 */

class LockfileDriftDetector {
    constructor() {
        this.workspaces = [];
        this.directDeps = new Map();
        this.outdatedInfo = [];
        this.workspaceStats = new Map();
        this.workspaceDepsCount = new Map();
        this.ignoredWorkspaceDeps = new Set();
        this.renovateIgnoredDeps = new Set();

        // Parse command line arguments
        this.args = process.argv.slice(2);
        this.filterSeverity = null;

        // Check for severity filters
        if (this.args.includes('--patch')) {
            this.filterSeverity = 'patch';
        } else if (this.args.includes('--minor')) {
            this.filterSeverity = 'minor';
        } else if (this.args.includes('--major')) {
            this.filterSeverity = 'major';
        }

        // Check for help flag
        if (this.args.includes('--help') || this.args.includes('-h')) {
            this.showHelp();
            process.exit(0);
        }
    }

    /**
     * Show help message
     */
    showHelp() {
        console.log(`
Dependency Inspector - Smart lockfile drift detector

Usage: dependency-inspector.js [options]

Options:
  --patch     Show all packages with patch updates
  --minor     Show all packages with minor updates
  --major     Show all packages with major updates
  --help, -h  Show this help message

Without flags, shows high-priority updates sorted by impact.
With a severity flag, shows all packages with that update type.
`);
    }

    /**
     * Load ignored dependencies from renovate configuration
     */
    loadRenovateConfig() {
        console.log('🔧 Loading renovate configuration...');

        try {
            // Read renovate.json5 from project root (two levels up from .github/scripts/)
            const renovateConfigPath = path.join(__dirname, '../../.github/renovate.json5');
            const renovateConfig = jsonc.parse(fs.readFileSync(renovateConfigPath, 'utf8'));

            if (renovateConfig.ignoreDeps) {
                for (const dep of renovateConfig.ignoreDeps) {
                    this.renovateIgnoredDeps.add(dep);
                }
                console.log(`📝 Loaded ${renovateConfig.ignoreDeps.length} ignored dependencies from renovate.json5`);
                console.log(`   Ignored: ${Array.from(this.renovateIgnoredDeps).join(', ')}`);
            } else {
                console.log('📝 No ignoreDeps found in renovate.json5');
            }
        } catch (error) {
            console.warn('⚠️ Could not load renovate.json5:', error.message);
        }
    }

    /**
     * Get all workspace package.json files
     */
    async findWorkspaces() {
        // Read from project root (two levels up from .github/scripts/)
        const rootDir = path.join(__dirname, '../..');
        const rootPackage = JSON.parse(fs.readFileSync(path.join(rootDir, 'package.json'), 'utf8'));

        // Read workspace patterns from pnpm-workspace.yaml (primary) or package.json (fallback)
        let workspacePatterns = [];
        const pnpmWorkspacePath = path.join(rootDir, 'pnpm-workspace.yaml');
        if (fs.existsSync(pnpmWorkspacePath)) {
            const content = fs.readFileSync(pnpmWorkspacePath, 'utf8');
            let inPackages = false;
            for (const line of content.split('\n')) {
                if (/^packages:/.test(line)) {
                    inPackages = true;
                    continue;
                }
                if (inPackages) {
                    const match = line.match(/^\s+-\s+['"]?([^'"]+)['"]?\s*$/);
                    if (match) {
                        workspacePatterns.push(match[1]);
                    } else if (/^\S/.test(line)) {
                        break;
                    }
                }
            }
        } else {
            workspacePatterns = rootPackage.workspaces || [];
        }

        console.log('📦 Scanning workspaces...');

        // Add root package
        this.workspaces.push({
            name: rootPackage.name || 'root',
            path: '.',
            packageJson: rootPackage
        });

        // Find workspace packages
        for (const pattern of workspacePatterns) {
            const globPattern = path.join(rootDir, pattern.replace(/\*$/, ''));
            try {
                const dirs = fs.readdirSync(globPattern, { withFileTypes: true })
                    .filter(dirent => dirent.isDirectory())
                    .map(dirent => path.join(globPattern, dirent.name));

                for (const dir of dirs) {
                    const packageJsonPath = path.join(dir, 'package.json');
                    if (fs.existsSync(packageJsonPath)) {
                        try {
                            const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf8'));

                            // Skip ghost/admin directory but track its dependencies for filtering
                            if (path.basename(dir) === 'admin' && dir.includes('ghost')) {
                                console.log(`🚫 Ignoring ghost/admin workspace (tracking deps for filtering)`);
                                const deps = {
                                    ...packageJson.dependencies,
                                    ...packageJson.devDependencies,
                                    ...packageJson.peerDependencies,
                                    ...packageJson.optionalDependencies
                                };
                                // Add all ghost/admin dependencies to ignore list
                                for (const depName of Object.keys(deps || {})) {
                                    this.ignoredWorkspaceDeps.add(depName);
                                }
                                continue;
                            }

                            this.workspaces.push({
                                name: packageJson.name || path.basename(dir),
                                path: dir,
                                packageJson
                            });
                        } catch (e) {
                            console.warn(`⚠️ Skipped ${packageJsonPath}: ${e.message}`);
                        }
                    }
                }
            } catch (e) {
                console.warn(`⚠️ Skipped pattern ${pattern}: ${e.message}`);
            }
        }

        console.log(`Found ${this.workspaces.length} workspaces`);
        return this.workspaces;
    }

    /**
     * Extract all direct dependencies from workspaces
     */
    extractDirectDependencies() {
        console.log('🔍 Extracting direct dependencies...');

        for (const workspace of this.workspaces) {
            const { packageJson } = workspace;
            const deps = {
                ...packageJson.dependencies,
                ...packageJson.devDependencies,
                ...packageJson.peerDependencies,
                ...packageJson.optionalDependencies
            };

            // Count total dependencies for this workspace
            const totalDepsForWorkspace = Object.keys(deps || {}).length;
            this.workspaceDepsCount.set(workspace.name, totalDepsForWorkspace);

            for (const [name, range] of Object.entries(deps || {})) {
                if (!this.directDeps.has(name)) {
                    this.directDeps.set(name, new Set());
                }
                this.directDeps.get(name).add({
                    workspace: workspace.name,
                    range,
                    path: workspace.path
                });
            }
        }

        return this.directDeps;
    }

    /**
     * Use pnpm outdated to get comprehensive outdated info
     * This is much faster and more reliable than manual API calls
     */
    async getOutdatedPackages() {
        console.log('🔄 Running pnpm outdated (this may take a moment)...');

        let stdout;
        try {
            stdout = execSync('pnpm outdated --json', {
                encoding: 'utf8',
                maxBuffer: 10 * 1024 * 1024 // 10MB buffer for large output
            });
        } catch (error) {
            // pnpm outdated exits with code 1 when there are outdated packages
            if (error.status === 1 && error.stdout) {
                stdout = error.stdout;
            } else {
                console.error('Failed to run pnpm outdated:', error.message);
                return [];
            }
        }

        return parsePnpmOutdatedOutput(stdout);
    }

    /**
     * Analyze the severity of version differences
     */
    analyzeVersionDrift(current, wanted, latest) {
        const parseVersion = (v) => {
            const match = v.match(/(\d+)\.(\d+)\.(\d+)/);
            if (!match) return { major: 0, minor: 0, patch: 0 };
            return {
                major: parseInt(match[1], 10),
                minor: parseInt(match[2], 10),
                patch: parseInt(match[3], 10)
            };
        };

        const currentVer = parseVersion(current);
        const latestVer = parseVersion(latest);

        const majorDiff = latestVer.major - currentVer.major;
        const minorDiff = latestVer.minor - currentVer.minor;
        const patchDiff = latestVer.patch - currentVer.patch;

        let severity = 'patch';
        let score = patchDiff;

        if (majorDiff > 0) {
            severity = 'major';
            score = majorDiff * 1000 + minorDiff * 100 + patchDiff;
        } else if (minorDiff > 0) {
            severity = 'minor';
            score = minorDiff * 100 + patchDiff;
        }

        return { severity, score, majorDiff, minorDiff, patchDiff };
    }

    /**
     * Process and categorize outdated packages
     */
    processOutdatedPackages(outdatedData) {
        console.log('📊 Processing outdated package information...');

        // Initialize workspace stats
        for (const workspace of this.workspaces) {
            this.workspaceStats.set(workspace.name, {
                total: 0,
                major: 0,
                minor: 0,
                patch: 0,
                packages: [],
                outdatedPackageNames: new Set() // Track unique package names per workspace
            });
        }

        const results = {
            direct: [],
            transitive: [],
            stats: {
                total: 0,
                major: 0,
                minor: 0,
                patch: 0
            }
        };

        for (const [packageName, current, wanted, latest, packageType] of outdatedData) {
            const isDirect = this.directDeps.has(packageName);

            // Skip packages that are only used by ignored workspaces (like ghost/admin)
            if (!isDirect && this.ignoredWorkspaceDeps.has(packageName)) {
                continue;
            }

            // Skip packages that are ignored by renovate configuration
            if (this.renovateIgnoredDeps.has(packageName)) {
                continue;
            }

            const analysis = this.analyzeVersionDrift(current, wanted, latest);

            const packageInfo = {
                name: packageName,
                current,
                wanted,
                latest,
                type: packageType || 'dependencies',
                isDirect,
                ...analysis,
                workspaces: isDirect ? Array.from(this.directDeps.get(packageName)) : []
            };

            // Update workspace statistics for direct dependencies
            if (isDirect) {
                for (const workspaceInfo of packageInfo.workspaces) {
                    const stats = this.workspaceStats.get(workspaceInfo.workspace);
                    if (stats && !stats.outdatedPackageNames.has(packageName)) {
                        // Only count each package once per workspace
                        stats.outdatedPackageNames.add(packageName);
                        stats.total++;
                        stats[analysis.severity]++;
                        stats.packages.push({
                            name: packageName,
                            current,
                            latest,
                            severity: analysis.severity
                        });
                    }
                }
                results.direct.push(packageInfo);
            } else {
                results.transitive.push(packageInfo);
            }

            results.stats.total++;
            results.stats[analysis.severity]++;
        }

        // Deduplicate direct dependencies and count workspace impact
        const directDepsMap = new Map();
        for (const pkg of results.direct) {
            if (!directDepsMap.has(pkg.name)) {
                directDepsMap.set(pkg.name, {
                    ...pkg,
                    workspaceCount: pkg.workspaces.length,
                    impact: pkg.workspaces.length // Number of workspaces affected
                });
            }
        }

        // Sort by impact: workspace count first, then severity, then score
        const sortByImpact = (a, b) => {
            // First by number of workspaces (more workspaces = higher priority)
            if (a.impact !== b.impact) {
                return b.impact - a.impact;
            }
            // Then by severity
            if (a.severity !== b.severity) {
                const severityOrder = { major: 3, minor: 2, patch: 1 };
                return severityOrder[b.severity] - severityOrder[a.severity];
            }
            // Finally by version drift score
            return b.score - a.score;
        };

        results.direct = Array.from(directDepsMap.values()).sort(sortByImpact);
        results.transitive.sort((a, b) => {
            if (a.severity !== b.severity) {
                const severityOrder = { major: 3, minor: 2, patch: 1 };
                return severityOrder[b.severity] - severityOrder[a.severity];
            }
            return b.score - a.score;
        });

        return results;
    }

    /**
     * Display filtered results by severity
     */
    displayFilteredResults(results) {
        const severityEmoji = {
            major: '🔴',
            minor: '🟡',
            patch: '🟢'
        };

        const emoji = severityEmoji[this.filterSeverity];
        const filterTitle = this.filterSeverity.toUpperCase();

        console.log(`${emoji} ${filterTitle} UPDATES ONLY:\n`);

        // Filter direct dependencies
        const filteredDirect = results.direct.filter(pkg => pkg.severity === this.filterSeverity);
        const filteredTransitive = results.transitive.filter(pkg => pkg.severity === this.filterSeverity);

        console.log(`Found ${filteredDirect.length} direct and ${filteredTransitive.length} transitive ${this.filterSeverity} updates.\n`);

        if (filteredDirect.length > 0) {
            console.log('📦 DIRECT DEPENDENCIES:');
            console.log('─'.repeat(80));

            // Sort by workspace impact, then by package name
            filteredDirect.sort((a, b) => {
                if (a.impact !== b.impact) {
                    return b.impact - a.impact;
                }
                return a.name.localeCompare(b.name);
            });

            for (const pkg of filteredDirect) {
                const workspaceList = pkg.workspaces.map(w => w.workspace).join(', ');
                const impactNote = pkg.workspaceCount > 1 ? ` (${pkg.workspaceCount} workspaces)` : '';
                console.log(`  ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest}${impactNote}`);
                console.log(`     Workspaces: ${workspaceList}`);
            }

            console.log('\n🚀 UPDATE COMMANDS:');
            console.log('─'.repeat(80));
            for (const pkg of filteredDirect) {
                console.log(`  pnpm update ${pkg.name}@latest`);
            }
        }

        if (filteredTransitive.length > 0) {
            console.log('\n\n🔄 TRANSITIVE DEPENDENCIES:');
            console.log('─'.repeat(80));
            console.log('  These will likely be updated automatically when you update direct deps.\n');

            // Sort by package name for easier scanning
            filteredTransitive.sort((a, b) => a.name.localeCompare(b.name));

            for (const pkg of filteredTransitive) {
                console.log(`  ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest}`);
            }
        }

        // Show workspace-specific breakdown
        console.log('\n\n🏢 WORKSPACE BREAKDOWN:');
        console.log('─'.repeat(80));

        for (const [workspaceName, stats] of this.workspaceStats.entries()) {
            const severityCount = stats[this.filterSeverity];
            if (severityCount > 0) {
                const packages = stats.packages.filter(p => p.severity === this.filterSeverity);
                console.log(`\n  📦 ${workspaceName}: ${severityCount} ${this.filterSeverity} update${severityCount !== 1 ? 's' : ''}`);

                // Show all packages for this workspace with the selected severity
                for (const pkg of packages) {
                    console.log(`     ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest}`);
                }
            }
        }

        console.log('');
    }

    /**
     * Display results in a helpful format
     */
    displayResults(results) {
        console.log('\n🎯 DEPENDENCY ANALYSIS RESULTS\n');

        // If filtering by severity, show filtered results
        if (this.filterSeverity) {
            this.displayFilteredResults(results);
            return;
        }

        // Workspace-specific statistics
        console.log('🏢 WORKSPACE BREAKDOWN:');
        console.log('   Outdated packages per workspace:\n');

        // Sort workspaces by percentage of outdated packages (descending), then by total count
        const sortedWorkspaces = Array.from(this.workspaceStats.entries())
            .sort(([nameA, a], [nameB, b]) => {
                const totalA = this.workspaceDepsCount.get(nameA) || 0;
                const totalB = this.workspaceDepsCount.get(nameB) || 0;
                const percentageA = totalA > 0 ? (a.total / totalA) * 100 : 0;
                const percentageB = totalB > 0 ? (b.total / totalB) * 100 : 0;

                // Sort by percentage first, then by total count
                if (Math.abs(percentageA - percentageB) > 0.1) {
                    return percentageB - percentageA;
                }
                return b.total - a.total;
            });

        for (const [workspaceName, stats] of sortedWorkspaces) {
            const totalDeps = this.workspaceDepsCount.get(workspaceName) || 0;
            const outdatedCount = stats.total;
            const percentage = totalDeps > 0 ? ((outdatedCount / totalDeps) * 100).toFixed(1) : '0.0';

            if (stats.total === 0) {
                console.log(`  ✅ ${workspaceName}: All ${totalDeps} dependencies up to date! (0% outdated)`);
            } else {
                console.log(`  📦 ${workspaceName}: ${outdatedCount}/${totalDeps} outdated (${percentage}%)`);
                console.log(`     🔴 Major: ${stats.major} | 🟡 Minor: ${stats.minor} | 🟢 Patch: ${stats.patch}`);

                // Show top 3 most outdated packages for this workspace
                const topPackages = stats.packages
                    .sort((a, b) => {
                        const severityOrder = { major: 3, minor: 2, patch: 1 };
                        return severityOrder[b.severity] - severityOrder[a.severity];
                    })
                    .slice(0, 3);

                if (topPackages.length > 0) {
                    console.log(`     Top issues: ${topPackages.map(p => {
                        const emoji = p.severity === 'major' ? '🔴' : p.severity === 'minor' ? '🟡' : '🟢';
                        return `${emoji} ${p.name} (${p.current}→${p.latest})`;
                    }).join(', ')}`);
                }
                console.log('');
            }
        }
        console.log('');

        // Direct dependencies (most actionable)
        if (results.direct.length > 0) {
            console.log('🎯 DIRECT DEPENDENCIES (High Priority):');
            console.log('   Sorted by impact: workspace count → severity → version drift\n');

            const topDirect = results.direct.slice(0, 15);
            for (const pkg of topDirect) {
                const emoji = pkg.severity === 'major' ? '🔴' : pkg.severity === 'minor' ? '🟡' : '🟢';
                const impactEmoji = pkg.workspaceCount >= 5 ? '🌟' : pkg.workspaceCount >= 3 ? '⭐' : '';
                console.log(`  ${emoji} ${impactEmoji} ${pkg.name}`);
                console.log(`     ${pkg.current} → ${pkg.latest} (${pkg.severity})`);
                console.log(`     Used in ${pkg.workspaceCount} workspace${pkg.workspaceCount !== 1 ? 's' : ''}: ${pkg.workspaces.map(w => w.workspace).join(', ')}`);
                console.log('');
            }

            if (results.direct.length > 15) {
                console.log(`  ... and ${results.direct.length - 15} more direct dependencies\n`);
            }
        }

        // Sample of most outdated transitive dependencies
        if (results.transitive.length > 0) {
            console.log('🔄 MOST OUTDATED TRANSITIVE DEPENDENCIES (Lower Priority):');
            console.log('   These will likely be updated automatically when you update direct deps.\n');

            const topTransitive = results.transitive.slice(0, 10);
            for (const pkg of topTransitive) {
                const emoji = pkg.severity === 'major' ? '🔴' : pkg.severity === 'minor' ? '🟡' : '🟢';
                console.log(`  ${emoji} ${pkg.name}: ${pkg.current} → ${pkg.latest} (${pkg.severity})`);
            }

            if (results.transitive.length > 10) {
                console.log(`  ... and ${results.transitive.length - 10} more transitive dependencies\n`);
            }
        }

        // Generate update commands for highest impact packages
        const topUpdates = results.direct.slice(0, 5);
        if (topUpdates.length > 0) {
            console.log('🚀 SUGGESTED COMMANDS (highest impact first):');
            for (const pkg of topUpdates) {
                const impactNote = pkg.workspaceCount > 1 ? ` (affects ${pkg.workspaceCount} workspaces)` : '';
                console.log(`  pnpm update ${pkg.name}@latest${impactNote}`);
            }
            console.log('');
        }

        const generatedAt = new Date().toISOString();
        const latestCommit = this.getLatestCommitRef();

        // Summary at the end
        console.log('📈 SUMMARY:');
        console.log(`   Generated at: ${generatedAt}`);
        console.log(`   Latest commit: ${latestCommit}`);
        console.log(`   Total dependencies: ${this.directDeps.size}`);
        console.log(`   Total outdated: ${results.stats.total}`);
        console.log(`   Major updates: ${results.stats.major}`);
        console.log(`   Minor updates: ${results.stats.minor}`);
        console.log(`   Patch updates: ${results.stats.patch}`);
        console.log(`   Direct deps: ${results.direct.length}`);
        console.log(`   Transitive deps: ${results.transitive.length}\n`);
    }

    /**
     * Get the latest commit reference for the current checkout
     */
    getLatestCommitRef() {
        try {
            return execSync("git log -1 --format='%h %ad %s' --date=iso-strict", {
                encoding: 'utf8'
            }).trim();
        } catch (error) {
            return 'Unavailable';
        }
    }

    /**
     * Run pnpm audit and display a vulnerability summary
     */
    displayAuditSummary() {
        console.log('🔒 SECURITY AUDIT:\n');

        try {
            let stdout = '';
            try {
                stdout = execSync('pnpm audit --json', {
                    encoding: 'utf8',
                    maxBuffer: 10 * 1024 * 1024
                });
            } catch (error) {
                // pnpm audit exits with non-zero when vulnerabilities are found
                stdout = error.stdout || '';
            }

            if (!stdout || !stdout.trim()) {
                console.log('  ⚠️ Could not parse audit summary\n');
                return;
            }

            const data = JSON.parse(stdout);
            if (data.metadata && data.metadata.vulnerabilities) {
                const v = data.metadata.vulnerabilities;
                const total = v.info + v.low + v.moderate + v.high + v.critical;
                console.log(`  Total vulnerabilities: ${total}`);
                console.log(`  🔴 Critical: ${v.critical}`);
                console.log(`  🟠 High: ${v.high}`);
                console.log(`  🟡 Moderate: ${v.moderate}`);
                console.log(`  🟢 Low: ${v.low}`);
                if (v.info > 0) {
                    console.log(`  ℹ️ Info: ${v.info}`);
                }
                console.log(`  Total dependencies scanned: ${data.metadata.totalDependencies}\n`);
            } else {
                console.log('  ⚠️ Could not parse audit summary\n');
            }
        } catch (error) {
            console.log(`  ⚠️ Audit failed: ${error.message}\n`);
        }
    }

    async run() {
        try {
            // Change to project root directory to run commands correctly
            const rootDir = path.join(__dirname, '../..');
            process.chdir(rootDir);

            this.loadRenovateConfig();
            await this.findWorkspaces();
            this.extractDirectDependencies();
            const outdatedData = await this.getOutdatedPackages();

            if (outdatedData.length === 0) {
                console.log('🎉 All packages are up to date!');
                return;
            }

            const results = this.processOutdatedPackages(outdatedData);
            this.displayResults(results);
            this.displayAuditSummary();
        } catch (error) {
            console.error('❌ Error:', error.message);
            process.exit(1);
        }
    }
}

// Run the detector
const detector = new LockfileDriftDetector();
detector.run();
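The JSDoc above `parsePnpmOutdatedOutput` notes that pnpm's JSON output is keyed by package name and lacks a "current" field, so "wanted" is used twice. The transformation can be exercised in isolation against a hard-coded sample of that documented shape (the sample data here is invented for illustration; no pnpm invocation is needed):

```javascript
// Same shape-transformation as the script's parser: object keyed by package
// name -> array of [name, current, wanted, latest, dependencyType] tuples,
// with "wanted" standing in for "current" because pnpm omits that field.
function parsePnpmOutdatedOutput(stdout) {
    if (!stdout || !stdout.trim()) {
        return [];
    }
    const data = JSON.parse(stdout);
    return Object.entries(data).map(([name, info]) => [
        name,
        info.wanted,
        info.wanted,
        info.latest,
        info.dependencyType
    ]);
}

const sample = JSON.stringify({
    lodash: {wanted: '4.17.20', latest: '4.17.21', dependencyType: 'dependencies'}
});
console.log(parsePnpmOutdatedOutput(sample));
// one tuple: ['lodash', '4.17.20', '4.17.20', '4.17.21', 'dependencies']
```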
@@ -0,0 +1,25 @@
const userAgent = process.env.npm_config_user_agent || '';

if (/\bpnpm\//.test(userAgent)) {
    process.exit(0);
}

const detectedPackageManager = userAgent.split(' ')[0] || 'unknown';

console.error(`
Ghost now uses pnpm for dependency installation.

Detected package manager: ${detectedPackageManager}

Use one of these instead:
  corepack enable pnpm
  pnpm install

Common command replacements:
  yarn setup -> pnpm run setup
  yarn dev   -> pnpm dev
  yarn test  -> pnpm test
  yarn lint  -> pnpm lint
`);

process.exit(1);
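The guard above works because npm, yarn, and pnpm all set `npm_config_user_agent` to a string whose first token is `<name>/<version>`. A standalone sketch of the same detection, with the function name and sample user-agent strings invented for illustration:

```javascript
// Sketch of the user-agent check: pass if launched by pnpm, otherwise
// report the first token of the user-agent string as the offending tool.
function detectPackageManager(userAgent) {
    if (/\bpnpm\//.test(userAgent)) {
        return 'pnpm';
    }
    return (userAgent || '').split(' ')[0] || 'unknown';
}

console.log(detectPackageManager('pnpm/9.0.0 npm/? node/v20.11.1 linux x64'));
// → 'pnpm'
console.log(detectPackageManager('yarn/1.22.19 npm/? node/v20.11.1'));
// → 'yarn/1.22.19'
```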
Executable
+215
@@ -0,0 +1,215 @@
const path = require('path');
const fs = require('fs/promises');
const exec = require('util').promisify(require('child_process').exec);
const readline = require('readline/promises');

const semver = require('semver');

// Maps a package name to the config key in defaults.json
const CONFIG_KEYS = {
    '@tryghost/portal': 'portal',
    '@tryghost/sodo-search': 'sodoSearch',
    '@tryghost/comments-ui': 'comments',
    '@tryghost/announcement-bar': 'announcementBar',
    '@tryghost/signup-form': 'signupForm'
};

const CURRENT_DIR = process.cwd();

const packageJsonPath = path.join(CURRENT_DIR, 'package.json');
const packageJson = require(packageJsonPath);

const APP_NAME = packageJson.name;
const APP_VERSION = packageJson.version;

async function safeExec(command) {
    try {
        return await exec(command);
    } catch (err) {
        return {
            stdout: err.stdout,
            stderr: err.stderr
        };
    }
}

async function ensureEnabledApp() {
    const ENABLED_APPS = Object.keys(CONFIG_KEYS);
    if (!ENABLED_APPS.includes(APP_NAME)) {
        console.error(`${APP_NAME} is not enabled, please modify ${__filename}`);
        process.exit(1);
    }
}

async function ensureNotOnMain() {
    const currentGitBranch = await safeExec(`git branch --show-current`);
    if (currentGitBranch.stderr) {
        console.error(`There was an error checking the current git branch`);
        console.error(`${currentGitBranch.stderr}`);
        process.exit(1);
    }

    if (currentGitBranch.stdout.trim() === 'main') {
        console.error(`The release can not be done on the "main" branch`);
        process.exit(1);
    }
}

async function ensureCleanGit() {
    const localGitChanges = await safeExec(`git status --porcelain`);
    if (localGitChanges.stderr) {
        console.error(`There was an error checking the local git status`);
        console.error(`${localGitChanges.stderr}`);
        process.exit(1);
    }

    if (localGitChanges.stdout) {
        console.error(`You have local git changes - are you sure you're ready to release?`);
        console.error(`${localGitChanges.stdout}`);
        process.exit(1);
    }
}

async function getNewVersion() {
    const rl = readline.createInterface({input: process.stdin, output: process.stdout});
    const bumpTypeInput = await rl.question('Is this a patch, minor or major (patch)? ');
    rl.close();
    const bumpType = bumpTypeInput.trim().toLowerCase() || 'patch';
    if (!['patch', 'minor', 'major'].includes(bumpType)) {
        console.error(`Unknown bump type ${bumpTypeInput} - expected one of "patch", "minor", "major"`);
        process.exit(1);
    }
    return semver.inc(APP_VERSION, bumpType);
}

async function updateConfig(newVersion) {
    const defaultConfigPath = path.resolve(__dirname, '../../ghost/core/core/shared/config/defaults.json');
    const defaultConfig = require(defaultConfigPath);

    const configKey = CONFIG_KEYS[APP_NAME];

    defaultConfig[configKey].version = `${semver.major(newVersion)}.${semver.minor(newVersion)}`;

    await fs.writeFile(defaultConfigPath, JSON.stringify(defaultConfig, null, 4) + '\n');
}

async function updatePackageJson(newVersion) {
    const newPackageJson = Object.assign({}, packageJson, {
        version: newVersion
    });

    await fs.writeFile(packageJsonPath, JSON.stringify(newPackageJson, null, 2) + '\n');
}

async function getChangelog(newVersion) {
    const rl = readline.createInterface({input: process.stdin, output: process.stdout});
    const i18nChangesInput = await rl.question('Does this release contain i18n updates (Y/n)? ');
    rl.close();

    const i18nChanges = i18nChangesInput.trim().toLowerCase() !== 'n';

    let changelogItems = [];

    if (i18nChanges) {
        changelogItems.push('Updated i18n translations');
    }

    // Restrict git log to only the current directory (the specific app)
    const lastFiftyCommits = await safeExec(`git log -n 50 --oneline -- .`);

    if (lastFiftyCommits.stderr) {
        console.error(`There was an error getting the last 50 commits`);
        process.exit(1);
    }

    const lastFiftyCommitsList = lastFiftyCommits.stdout.split('\n');
    const releaseRegex = new RegExp(`Released ${APP_NAME} v${APP_VERSION}`);
    const indexOfLastRelease = lastFiftyCommitsList.findIndex((commitLine) => {
        const commitMessage = commitLine.slice(11); // Take the hash off the front
        return releaseRegex.test(commitMessage);
|
||||
});
|
||||
|
||||
if (indexOfLastRelease === -1) {
|
||||
console.warn(`Could not find commit for previous release. Will include recent commits affecting this app.`);
|
||||
|
||||
// Fallback: get recent commits for this app (last 20)
|
||||
const recentCommits = await safeExec(`git log -n 20 --pretty=format:"%h%n%B__SPLIT__" -- .`);
|
||||
if (recentCommits.stderr) {
|
||||
console.error(`There was an error getting recent commits`);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
const recentCommitsList = recentCommits.stdout.split('__SPLIT__');
|
||||
|
||||
const recentCommitsWhichMentionLinear = recentCommitsList.filter((commitBlock) => {
|
||||
return commitBlock.includes('https://linear.app/ghost');
|
||||
});
|
||||
|
||||
const commitChangelogItems = recentCommitsWhichMentionLinear.map((commitBlock) => {
|
||||
const lines = commitBlock.split('\n');
|
||||
if (!lines.length || !lines[0].trim()) {
|
||||
return null; // Skip entries with no hash
|
||||
}
|
||||
const hash = lines[0].trim();
|
||||
return `https://github.com/TryGhost/Ghost/commit/${hash}`;
|
||||
}).filter(Boolean); // Filter out any null entries
|
||||
|
||||
changelogItems.push(...commitChangelogItems);
|
||||
} else {
|
||||
const lastReleaseCommit = lastFiftyCommitsList[indexOfLastRelease];
|
||||
const lastReleaseCommitHash = lastReleaseCommit.slice(0, 10);
|
||||
|
||||
// Also restrict this git log to only the current directory (the specific app)
|
||||
const commitsSinceLastRelease = await safeExec(`git log ${lastReleaseCommitHash}..HEAD --pretty=format:"%h%n%B__SPLIT__" -- .`);
|
||||
if (commitsSinceLastRelease.stderr) {
|
||||
console.error(`There was an error getting commits since the last release`);
|
||||
process.exit(1);
|
||||
}
|
||||
const commitsSinceLastReleaseList = commitsSinceLastRelease.stdout.split('__SPLIT__');
|
||||
|
||||
const commitsSinceLastReleaseWhichMentionLinear = commitsSinceLastReleaseList.filter((commitBlock) => {
|
||||
return commitBlock.includes('https://linear.app/ghost');
|
||||
});
|
||||
|
||||
const commitChangelogItems = commitsSinceLastReleaseWhichMentionLinear.map((commitBlock) => {
|
||||
const lines = commitBlock.split('\n');
|
||||
if (!lines.length || !lines[0].trim()) {
|
||||
return null; // Skip entries with no hash
|
||||
}
|
||||
const hash = lines[0].trim();
|
||||
return `https://github.com/TryGhost/Ghost/commit/${hash}`;
|
||||
}).filter(Boolean); // Filter out any null entries
|
||||
|
||||
changelogItems.push(...commitChangelogItems);
|
||||
}
|
||||
|
||||
const changelogList = changelogItems.map(item => ` - ${item}`).join('\n');
|
||||
return `Changelog for v${APP_VERSION} -> ${newVersion}: \n${changelogList}`;
|
||||
}
|
||||
|
||||
async function main() {
|
||||
await ensureEnabledApp();
|
||||
await ensureNotOnMain();
|
||||
await ensureCleanGit();
|
||||
|
||||
console.log(`Running release for ${APP_NAME}`);
|
||||
console.log(`Current version is ${APP_VERSION}`);
|
||||
|
||||
const newVersion = await getNewVersion();
|
||||
|
||||
console.log(`Bumping to version ${newVersion}`);
|
||||
|
||||
const changelog = await getChangelog(newVersion);
|
||||
|
||||
await updatePackageJson(newVersion);
|
||||
await exec(`git add package.json`);
|
||||
|
||||
await updateConfig(newVersion);
|
||||
await exec(`git add ../../ghost/core/core/shared/config/defaults.json`);
|
||||
|
||||
await exec(`git commit -m 'Released ${APP_NAME} v${newVersion}\n\n${changelog}'`);
|
||||
|
||||
console.log(`Release commit created - please double check it and use "git commit --amend" to make any changes before opening a PR to merge into main`)
|
||||
}
|
||||
|
||||
main();
|
||||
@@ -0,0 +1,30 @@
name: CI (Release)
on:
  push:
    tags:
      - 'v[0-9]*'

# Tags must never be cancelled — each is a public release
concurrency:
  group: ci-release-${{ github.ref_name }}
  cancel-in-progress: false

# Workflow-level permissions set the ceiling for the reusable ci.yml.
# id-token is never in the default token, so it must be granted explicitly
# here — otherwise the ci: job's `permissions:` block exceeds the caller
# workflow's permissions and GitHub rejects the run with startup_failure.
permissions:
  actions: read
  contents: write
  packages: write
  id-token: write

jobs:
  ci:
    uses: ./.github/workflows/ci.yml
    secrets: inherit
    permissions:
      actions: read
      contents: write
      packages: write
      id-token: write
File diff suppressed because it is too large
@@ -0,0 +1,158 @@
name: Cleanup GHCR Images

on:
  schedule:
    - cron: "30 4 * * *" # Daily at 04:30 UTC
  workflow_dispatch:
    inputs:
      dry_run:
        description: "Log what would be deleted without making changes"
        required: false
        default: true
        type: boolean
      retention_days:
        description: "Delete versions older than this many days"
        required: false
        default: 14
        type: number
      min_keep:
        description: "Always keep at least this many versions per package"
        required: false
        default: 10
        type: number

permissions:
  packages: write

env:
  ORG: TryGhost
  RETENTION_DAYS: ${{ inputs.retention_days || 14 }}
  MIN_KEEP: ${{ inputs.min_keep || 10 }}

jobs:
  cleanup:
    name: Cleanup
    runs-on: ubuntu-latest
    strategy:
      matrix:
        package: [ghost, ghost-core, ghost-development]
    steps:
      - name: Delete old non-release versions
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          DRY_RUN: ${{ github.event_name == 'schedule' && 'false' || inputs.dry_run }}
          PACKAGE: ${{ matrix.package }}
        run: |
          set -euo pipefail

          cutoff=$(date -u -d "-${RETENTION_DAYS} days" +%Y-%m-%dT%H:%M:%SZ 2>/dev/null \
            || date -u -v-${RETENTION_DAYS}d +%Y-%m-%dT%H:%M:%SZ)

          echo "Package: ${ORG}/${PACKAGE}"
          echo "Cutoff: ${cutoff} (${RETENTION_DAYS} days ago)"
          echo "Dry run: ${DRY_RUN}"
          echo ""

          # Pagination — collect all versions
          page=1
          all_versions="[]"
          while true; do
            if ! batch=$(gh api \
              "/orgs/${ORG}/packages/container/${PACKAGE}/versions?per_page=100&page=${page}" \
              --jq '.' 2>&1); then
              if [ "$page" = "1" ]; then
                echo "::error::API request failed: ${batch}"
                exit 1
              fi
              echo "::warning::API request failed (page ${page}): ${batch}"
              break
            fi

            count=$(echo "$batch" | jq 'length')
            if [ "$count" = "0" ]; then
              break
            fi

            all_versions=$(echo "$all_versions $batch" | jq -s 'add')
            page=$((page + 1))
          done

          total=$(echo "$all_versions" | jq 'length')
          echo "Total versions: ${total}"

          # Classify versions
          keep=0
          delete=0
          delete_ids=""

          for row in $(echo "$all_versions" | jq -r '.[] | @base64'); do
            _jq() { echo "$row" | base64 -d | jq -r "$1"; }

            id=$(_jq '.id')
            updated=$(_jq '.updated_at')
            tags=$(_jq '[.metadata.container.tags[]] | join(",")')

            # Keep versions with semver tags (v1.2.3, 1.2.3, 1.2)
            if echo "$tags" | grep -qE '(^|,)v?[0-9]+\.[0-9]+\.[0-9]+(,|$)' || \
               echo "$tags" | grep -qE '(^|,)[0-9]+\.[0-9]+(,|$)'; then
              keep=$((keep + 1))
              continue
            fi

            # Keep versions with 'latest' or 'main' or cache-main tags
            if echo "$tags" | grep -qE '(^|,)(latest|main|cache-main)(,|$)'; then
              keep=$((keep + 1))
              continue
            fi

            # Keep versions newer than cutoff
            if [[ "$updated" > "$cutoff" ]]; then
              keep=$((keep + 1))
              continue
            fi

            # This version is eligible for deletion
            delete=$((delete + 1))
            delete_ids="${delete_ids} ${id}"

            tag_display="${tags:-<untagged>}"
            if [ "$DRY_RUN" = "true" ]; then
              echo "[dry-run] Would delete version ${id} (tags: ${tag_display}, updated: ${updated})"
            fi
          done

          echo ""
          echo "Summary: ${keep} kept, ${delete} to delete (of ${total} total)"

          if [ "$delete" = "0" ]; then
            echo "Nothing to delete."
            exit 0
          fi

          # Safety check — run before dry-run exit so users see the warning
          if [ "$keep" -lt "$MIN_KEEP" ]; then
            echo "::error::Safety check failed — only ${keep} versions would remain (minimum: ${MIN_KEEP}). Aborting."
            exit 1
          fi

          if [ "$DRY_RUN" = "true" ]; then
            echo ""
            echo "Dry run — no versions deleted."
            exit 0
          fi

          # Delete eligible versions
          deleted=0
          failed=0
          for id in $delete_ids; do
            if gh api --method DELETE \
              "/orgs/${ORG}/packages/container/${PACKAGE}/versions/${id}" 2>/dev/null; then
              deleted=$((deleted + 1))
            else
              echo "::warning::Failed to delete version ${id}"
              failed=$((failed + 1))
            fi
          done

          echo ""
          echo "Deleted ${deleted} versions (${failed} failed)"
@@ -0,0 +1,26 @@
name: "Copilot Setup Steps"

# This workflow configures the environment for GitHub Copilot Agent with gh-aw MCP server
on:
  workflow_dispatch:
  push:
    paths:
      - .github/workflows/copilot-setup-steps.yml

jobs:
  # The job MUST be called 'copilot-setup-steps' to be recognized by GitHub Copilot Agent
  copilot-setup-steps:
    runs-on: ubuntu-latest

    # Set minimal permissions for setup steps
    # Copilot Agent receives its own token with appropriate permissions
    permissions:
      contents: read

    steps:
      - name: Checkout repository
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
      - name: Install gh-aw extension
        uses: github/gh-aw/actions/setup-cli@ce1794953e0ec42adc41b6fca05e02ab49ee21c3 # v0.68.3
        with:
          version: v0.49.3
@@ -0,0 +1,66 @@
name: Create release branch
on:
  workflow_dispatch:
    inputs:
      base-ref:
        description: 'Git ref to base from (defaults to latest tag)'
        type: string
        default: 'latest'
        required: false
      bump-type:
        description: 'Version bump type (patch, minor)'
        type: string
        required: false
        default: 'patch'
env:
  FORCE_COLOR: 1
permissions:
  contents: write
jobs:
  create-branch:
    if: github.repository == 'TryGhost/Ghost'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        if: inputs.base-ref == 'latest'
        with:
          ref: main
          fetch-depth: 0
          submodules: true

      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        if: inputs.base-ref != 'latest'
        with:
          ref: ${{ inputs.base-ref }}
          fetch-depth: 0
          submodules: true

      - name: Checkout most recent tag
        run: git checkout "$(git describe --tags --abbrev=0 --match=v*)"
        if: inputs.base-ref == 'latest'

      - uses: asdf-vm/actions/install@b7bcd026f18772e44fe1026d729e1611cc435d47 # v4
        with:
          tool_versions: |
            semver 3.3.0

      - run: |
          CURRENT_TAG=$(git describe --tags --abbrev=0 --match=v*)
          NEW_VERSION=$(semver bump "$BUMP_TYPE_INPUT" "$CURRENT_TAG")
          printf 'CURRENT_SHA=%s\n' "$(git rev-parse HEAD)" >> "$GITHUB_ENV"
          printf 'NEW_VERSION=%s\n' "$NEW_VERSION" >> "$GITHUB_ENV"
        env:
          BUMP_TYPE_INPUT: ${{ inputs.bump-type }}

      - name: Create branch
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
        with:
          script: |
            const branchName = `v${process.env.NEW_VERSION}`;
            console.log(`Creating branch: ${branchName}`);
            await github.request('POST /repos/{owner}/{repo}/git/refs', {
              owner: context.repo.owner,
              repo: context.repo.repo,
              ref: `refs/heads/${branchName}`,
              sha: process.env.CURRENT_SHA
            });
@@ -0,0 +1,127 @@
name: Deploy to Staging

# DISABLED: The deploy-to-staging label workflow is currently broken and disabled.
# Problems:
# 1. Admin is global — deploying a PR's admin overwrites admin-forward/ for ALL staging
#    sites, not just demo.ghost.is. Per-site admin versioning is needed first.
# 2. Main merges overwrite — any merge to main triggers a full staging rollout that
#    overwrites both the server version on demo.ghost.is and admin-forward/ globally.
#    The deployment lasts only until the next merge to main, making it unreliable.
# See: https://www.notion.so/ghost/Proposal-Per-site-admin-versioning-31951439c03081daa133eb0215642202

on:
  pull_request_target:
    types: [labeled]

jobs:
  deploy:
    name: Deploy to Staging
    # Runs when the "deploy-to-staging" label is added — requires collaborator write access.
    # Fork PRs are rejected because they don't have GHCR images (CI uses artifact transfer).
    if: >-
      false
      && github.event.label.name == 'deploy-to-staging'
      && github.event.pull_request.head.repo.full_name == github.event.pull_request.base.repo.full_name
    runs-on: ubuntu-latest
    permissions:
      contents: read
      actions: read
    env:
      PR_NUMBER: ${{ github.event.pull_request.number }}
      HEAD_SHA: ${{ github.event.pull_request.head.sha }}
    steps:
      - name: Wait for CI build artifacts
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          echo "Waiting for CI to complete Docker build for $HEAD_SHA..."
          TIMEOUT=1800 # 30 minutes
          INTERVAL=30
          START=$(date +%s)

          while true; do
            ELAPSED=$(( $(date +%s) - START ))
            if [ "$ELAPSED" -ge "$TIMEOUT" ]; then
              echo "::error::Timed out waiting for CI (${TIMEOUT}s)"
              exit 1
            fi

            # Find the CI run for this SHA
            RUN=$(gh api "repos/${{ github.repository }}/actions/workflows/ci.yml/runs?head_sha=${HEAD_SHA}&per_page=1" \
              --jq '.workflow_runs[0] | {id, status, conclusion}' 2>/dev/null || echo "")

            if [ -z "$RUN" ] || [ "$RUN" = "null" ]; then
              echo " No CI run found yet, waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
              sleep "$INTERVAL"
              continue
            fi

            STATUS=$(echo "$RUN" | jq -r '.status')
            CONCLUSION=$(echo "$RUN" | jq -r '.conclusion // empty')
            RUN_ID=$(echo "$RUN" | jq -r '.id')

            if [ "$STATUS" = "completed" ]; then
              if [ "$CONCLUSION" = "success" ] || [ "$CONCLUSION" = "failure" ]; then
                # Check if Docker build job specifically succeeded (paginate — CI has 30+ jobs)
                BUILD_JOB=$(gh api --paginate "repos/${{ github.repository }}/actions/runs/${RUN_ID}/jobs?per_page=100" \
                  --jq '.jobs[] | select(.name == "Build & Publish Artifacts") | .conclusion')
                if [ -z "$BUILD_JOB" ]; then
                  echo "::error::Build & Publish Artifacts job not found in CI run ${RUN_ID}"
                  exit 1
                elif [ "$BUILD_JOB" = "success" ]; then
                  echo "Docker build ready (CI run $RUN_ID)"
                  break
                else
                  echo "::error::Docker build job did not succeed (conclusion: $BUILD_JOB)"
                  exit 1
                fi
              else
                echo "::error::CI run failed (conclusion: $CONCLUSION)"
                exit 1
              fi
            fi

            echo " CI still running ($STATUS), waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
            sleep "$INTERVAL"
          done

      - name: Re-check PR eligibility
        id: recheck
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          PR=$(gh api "repos/${{ github.repository }}/pulls/${{ env.PR_NUMBER }}" \
            --jq '{state, labels: [.labels[].name], head_sha: .head.sha}')

          STATE=$(echo "$PR" | jq -r '.state')
          HAS_LABEL=$(echo "$PR" | jq '.labels | any(. == "deploy-to-staging")')
          CURRENT_SHA=$(echo "$PR" | jq -r '.head_sha')

          if [ "$STATE" != "open" ]; then
            echo "::warning::PR is no longer open ($STATE), skipping dispatch"
            echo "skip=true" >> "$GITHUB_OUTPUT"
          elif [ "$HAS_LABEL" != "true" ]; then
            echo "::warning::deploy-to-staging label was removed, skipping dispatch"
            echo "skip=true" >> "$GITHUB_OUTPUT"
          elif [ "$CURRENT_SHA" != "$HEAD_SHA" ]; then
            echo "::warning::HEAD SHA changed ($HEAD_SHA → $CURRENT_SHA), skipping dispatch (new push will trigger CI)"
            echo "skip=true" >> "$GITHUB_OUTPUT"
          else
            echo "PR still eligible for deploy"
            echo "skip=false" >> "$GITHUB_OUTPUT"
          fi

      - name: Dispatch to Ghost-Moya
        if: steps.recheck.outputs.skip != 'true'
        uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4
        with:
          token: ${{ secrets.CANARY_DOCKER_BUILD }}
          repository: TryGhost/Ghost-Moya
          event-type: ghost-artifacts-ready
          client-payload: >-
            {
              "ref": "${{ env.PR_NUMBER }}",
              "source_repo": "${{ github.repository }}",
              "pr_number": "${{ env.PR_NUMBER }}",
              "deploy": "true"
            }
@@ -0,0 +1,21 @@
name: 'Label Issues & PRs'

on:
  workflow_dispatch:
  issues:
    types: [opened, closed, labeled]
  pull_request_target:
    types: [opened, closed, labeled]
  schedule:
    - cron: '0 * * * *'

permissions:
  issues: write
  pull-requests: write

jobs:
  action:
    runs-on: ubuntu-latest
    if: github.repository_owner == 'TryGhost'
    steps:
      - uses: tryghost/actions/actions/label-actions@20b5ae5f266e86f7b5f0815d92731d6388b8ce46 # main
Generated
+1141
File diff suppressed because it is too large
@@ -0,0 +1,232 @@
---
description: Triage new Linear issues for the Berlin Bureau (BER) team — classify type, assign priority, tag product area, and post reasoning comments.
on:
  workflow_dispatch:
  schedule: daily on weekdays
permissions:
  contents: read
if: github.repository == 'TryGhost/Ghost'
tools:
  cache-memory: true
mcp-servers:
  linear:
    command: "npx"
    args: ["-y", "mcp-remote", "https://mcp.linear.app/mcp", "--header", "Authorization:Bearer ${{ secrets.LINEAR_API_KEY }}"]
    env:
      LINEAR_API_KEY: ${{ secrets.LINEAR_API_KEY }}
network:
  allowed:
    - defaults
    - node
    - mcp.linear.app
safe-outputs:
  create-issue:
  noop:
---

# Linear Issue Triage Agent

You are an AI agent that triages new Linear issues for the **Berlin Bureau (BER)** team. Your goal is to reduce the time a human needs to complete triage by pre-classifying issues, assigning priority, tagging product areas, and recommending code investigations where appropriate.

**You do not move issues out of Triage** — a human still makes the final call on status transitions.

## Your Task

1. Use the Linear MCP tools to find the BER team and list all issues currently in the **Triage** state
2. Check your cache-memory to see which issues you have already triaged — skip those
3. For each untriaged issue, apply the triage rubric below to:
   - Classify the issue type
   - Assign priority (both a priority label and Linear's built-in priority field)
   - Tag the product area
   - Post a triage comment explaining your reasoning
4. Update your cache-memory with the newly triaged issue IDs
5. After processing, call the `noop` safe output with a summary of what you did — e.g. "Triaged 1 issue: BER-3367 (Bug, P3)" or "No new BER issues in Triage state" if there was nothing to triage

## Linear MCP Tools

You have access to the official Linear MCP server. Use its tools to:

- **Find issues**: Search for BER team issues in Triage state
- **Read issue details**: Get title, description, labels, priority, and comments
- **Update issues**: Add labels and set priority
- **Create comments**: Post triage reasoning comments

Start by listing available tools to discover the exact tool names and parameters.

**Important:** When updating labels, preserve existing labels. Fetch the issue's current labels first, then include both old and new label IDs in the update.
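The label-preservation rule above can be sketched as a small helper. This is a hypothetical illustration: the real Linear MCP tool names and payload shapes must be discovered at runtime, and the label IDs below are made up.

```javascript
// Hypothetical sketch: merge existing and new label IDs so an update
// never drops labels. The IDs here are placeholders, not real Linear IDs.
function mergeLabelIds(existingLabelIds, newLabelIds) {
    // A Set union keeps every existing label and de-duplicates new ones,
    // preserving insertion order
    return [...new Set([...existingLabelIds, ...newLabelIds])];
}

// Example: the issue already has Bug + P2; triage adds P2 + Memberships
const merged = mergeLabelIds(
    ['bug-label-id', 'p2-label-id'],
    ['p2-label-id', 'memberships-label-id']
);
console.log(merged); // ['bug-label-id', 'p2-label-id', 'memberships-label-id']
```

The merged array is what would be passed to the issue-update tool, so the update is always additive.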
## Cache-Memory Format

Store and read a JSON file at the **exact path** `cache-memory/triage-cache.json`. Always use this filename — never rename it or create alternative files.

```json
{
    "triaged_issue_ids": ["BER-3150", "BER-3151"],
    "last_run": "2025-01-15T10:00:00Z"
}
```

On each run:
1. Read `cache-memory/triage-cache.json` to get previously triaged issue identifiers
2. Skip any issues already in the list
3. After processing, write the updated list back to `cache-memory/triage-cache.json` (append newly triaged IDs)
## Triage Rubric
|
||||
|
||||
### Decision 1: Type Classification
|
||||
|
||||
Classify each issue based on its title, description, and linked context:
|
||||
|
||||
| Type | Signal words / patterns | Label to apply |
|
||||
|------|------------------------|----------------|
|
||||
| **Bug** | "broken", "doesn't work", "regression", "error", "crash", stack traces, Sentry links, "unexpected behaviour" | `🐛 Bug` (`e51776f7-038e-474b-86ec-66981c9abb4f`) |
|
||||
| **Security** | "vulnerability", "exploit", "bypass", "SSRF", "XSS", "injection", "authentication bypass", "2FA", CVE references | `🔒 Security` (`28c5afc1-8063-4e62-af11-e42d94591957`) — also apply Bug if applicable |
|
||||
| **Feature** | "add support for", "it would be nice", "can we", "new feature", Featurebase links | `✨ Feature` (`db8672e2-1053-4bc7-9aab-9d38c5b01560`) |
|
||||
| **Improvement** | "improve", "enhance", "optimise", "refactor", "clean up", "polish" | `🎨 Improvement` (`b36579e6-62e1-4f55-987d-ee1e5c0cde1a`) |
|
||||
| **Performance** | "slow", "latency", "timeout", "memory", "CPU", "performance", load time complaints | `⚡️ Performance` (`9066d0ea-6326-4b22-b6f5-82fe7ce2c1d1`) |
|
||||
| **Maintenance** | "upgrade dependency", "tech debt", "remove deprecated", "migrate" | `🛠️ Maintenance` (`0ca27922-3646-4ab7-bf03-e67230c0c39e`) |
|
||||
| **Documentation** | "docs", "README", "guide", "tutorial", missing documentation | `📝 Documentation` (`25f8988a-5925-44cd-b0df-c0229463925f`) |
|
||||
|
||||
If an issue matches multiple types (e.g. a security bug), apply all relevant labels.
|
||||
|
||||
### Decision 2: Priority Assignment
|
||||
|
||||
Assign priority to all issue types. Set both the Linear priority field and the corresponding priority label.
|
||||
|
||||
**For bugs and security issues**, use these criteria:
|
||||
|
||||
#### P1 — Urgent (Linear priority: 1, Label: `📊 Priority → P1 - Urgent` `11de115f-3e40-46c6-bf42-2aa2b9195cbd`)
|
||||
- Security vulnerability with a clear exploit path
|
||||
- Data loss or corruption (MySQL, disk) — actual or imminent (exception: small lexical data issues can be P2)
|
||||
- Multiple customers' businesses immediately affected (broken payment collection, broken emails, broken member login)
|
||||
|
||||
#### P2 — High (Linear priority: 2, Label: `📊 Priority → P2 - High` `aeda47fa-9db9-4f4d-a446-3cccf92c8d12`)
|
||||
- Triggering monitoring alerts that wake on-call engineers (if recurring, bump to P1)
|
||||
- Security vulnerability without a clear exploit
|
||||
- Regression that breaks currently working core functionality
|
||||
- Crashes the server or browser
|
||||
- Significantly disrupts customers' members/end-users (e.g. incorrect pricing or access)
|
||||
- Bugs with members, subscriptions, or newsletters without immediate business impact
|
||||
|
||||
#### P3 — Medium (Linear priority: 3, Label: `📊 Priority → P3 - Medium` `10ec8b7b-725f-453f-b5d2-ff160d3b3c1e`)
|
||||
- Bugs with members, subscriptions, or newsletters affecting only a few customers
|
||||
- Bugs in recently released features that significantly affect usability
|
||||
- Issues with setup/upgrade flows
|
||||
- Broken features (dashboards, line charts, analytics, etc.)
|
||||
- Correctness issues (e.g. timezones)
|
||||
|
||||
#### P4 — Low (Linear priority: 4, Label: `📊 Priority → P4 - Low` `411a21ea-c8c0-4cb1-9736-7417383620ff`)
|
||||
- Not quite working as expected, but little overall impact
|
||||
- Not related to payments, email, or security
|
||||
- Significantly more complex to fix than the value of fixing
|
||||
- Purely cosmetic
|
||||
- Has a clear and straightforward workaround
|
||||
|
||||
**For non-bug issues** (features, improvements, performance, maintenance, documentation), assign a **provisional priority** based on estimated impact and urgency. Clearly mark it as provisional in the triage comment.
|
||||
|
||||
#### Bump Modifiers
|
||||
|
||||
**Bump UP one level if:**
|
||||
- It causes regular alerts for on-call engineers
|
||||
- It affects lots of users or VIP customers
|
||||
- It prevents users from carrying out a critical use case or workflow
|
||||
- It prevents rolling back to a previous release
|
||||
|
||||
**Bump DOWN one level if:**
|
||||
- Reported by a single, non-VIP user
|
||||
- Only impacts an edge case or obscure use case
|
||||
|
||||
Note in your comment if a bump modifier was applied and why.
|
||||
|
||||
### Decision 3: Product Area Tagging
|
||||
|
||||
Apply the most relevant `Product Area →` label:
|
||||
|
||||
| Label | Covers |
|
||||
|-------|--------|
|
||||
| `Product Area → Editor` | Post/page editor, Koenig, Lexical, content blocks |
|
||||
| `Product Area → Dashboard` | Admin dashboard, stats, overview |
|
||||
| `Product Area → Analytics` | Analytics, charts, reporting |
|
||||
| `Product Area → Memberships` | Member management, segmentation, member data |
|
||||
| `Product Area → Portal` | Member-facing portal, signup/login flows |
|
||||
| `Product Area → Newsletters` | Email newsletters, sending, email design |
|
||||
| `Product Area → Admin` | General admin UI, settings, navigation |
|
||||
| `Product Area → Settings area` | Settings screens specifically |
|
||||
| `Product Area → Billing App` | Billing, subscription management |
|
||||
| `Product Area → Themes` | Theme system, Handlebars, theme marketplace |
|
||||
| `Product Area → Publishing` | Post publishing, scheduling, distribution |
|
||||
| `Product Area → Growth` | Growth features, recommendations |
|
||||
| `Product Area → Comments` | Comment system |
|
||||
| `Product Area → Imports / Exports` | Data import/export |
|
||||
| `Product Area → Welcome emails / Automations` | Automated emails, welcome sequences |
|
||||
| `Product Area → Social Web` | ActivityPub, federation |
|
||||
| `Product Area → i18n` | Internationalisation, translations |
|
||||
| `Product Area → Sodo Search` | Search functionality |
|
||||
| `Product Area → Admin-X Offers` | Offers system in Admin-X |
|
||||
|
||||
If the issue spans multiple areas, apply all relevant labels. If no product area is clearly identifiable, don't force a label — note this in the comment.
|
||||
|
||||
**Important:** Use the Linear MCP tools to look up product area label IDs before applying them.
|
||||
|
||||
### Decision 4: Triage Comment
|
||||
|
||||
Post a comment on the issue with your reasoning. Use this format:

```
🤖 **Automated Triage**

**Type:** Bug (Security)
**Priority:** P2 — High
**Product Area:** Memberships
**Bump modifiers applied:** UP — affects multiple customers

**Reasoning:**
This appears to be a security vulnerability in the session handling that could allow
2FA bypass. While no clear exploit path has been reported, the potential for
authentication bypass affecting all staff accounts warrants P2. Bumped up from P3
because it affects all customers with 2FA enabled.

**Recommended action:** Code investigation recommended — this is a security bug
that needs code-level analysis.
```

For non-bug issues, mark priority as provisional:

```
🤖 **Automated Triage**

**Type:** Improvement
**Priority:** P3 — Medium *(provisional)*
**Product Area:** Admin
**Bump modifiers applied:** None

**Reasoning:**
This is a refactoring task to share logic between two related functions. No user-facing
impact, but reduces maintenance burden for the retention offers codebase. Provisional
P3 based on moderate codebase impact and alignment with active project work.

**Recommended action:** Code investigation recommended — small refactoring task with
clear scope, no design input needed.
```

### Decision 5: Code Investigation Recommendation

Flag an issue for code investigation in your comment if **all** of these are true:

1. Classified as a bug, security issue, performance issue, or small improvement/maintenance task
2. Does not require design input (no UI mockups needed, no UX decisions)
3. Has enough description to investigate (not just a title with no context)

Do **not** recommend investigation for:
- Feature requests (need product/design input)
- Issues with vague descriptions and no reproduction steps — instead note "Needs more info" in the comment
- Issues that are clearly large architectural changes

## Guidelines

- Process issues one at a time, applying all decisions before moving to the next
- Be concise but include enough reasoning that a human can quickly validate or override
- When in doubt about classification, pick the closest match and note your uncertainty
- If an issue already has triage labels or a triage comment from a previous run, skip it
- Never move issues out of the Triage state
- After processing all issues, update cache-memory with the full list of triaged identifiers
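
The investigation gate in Decision 5 can be sketched as a single predicate. This is a hypothetical illustration only — the `Issue` fields and the 40-character "enough description" threshold are assumptions, not part of the triage spec:

```python
from dataclasses import dataclass

# Hypothetical issue shape for illustration; field names are assumptions.
@dataclass
class Issue:
    kind: str                              # e.g. 'bug', 'security', 'performance', 'improvement', 'feature'
    description: str = ''
    needs_design_input: bool = False       # UI mockups / UX decisions required
    large_architectural_change: bool = False

INVESTIGABLE_KINDS = {'bug', 'security', 'performance', 'improvement', 'maintenance'}

def recommend_investigation(issue: Issue) -> bool:
    """All inclusion criteria must hold; any exclusion vetoes the recommendation."""
    if issue.kind == 'feature':              # feature requests need product/design input
        return False
    if issue.large_architectural_change:     # too big for a code-first pass
        return False
    if len(issue.description.strip()) < 40:  # vague report: note "Needs more info" instead
        return False
    return (issue.kind in INVESTIGABLE_KINDS
            and not issue.needs_design_input)
```

The exclusions are checked first so that a well-described feature request or architectural change is still rejected regardless of its other attributes.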
@@ -0,0 +1,57 @@
name: Migration Review
on:
  pull_request_target:
    types: [opened]
    paths:
      - 'ghost/core/core/server/data/schema/**'
      - 'ghost/core/core/server/data/migrations/versions/**'
jobs:
  createComment:
    runs-on: ubuntu-latest
    if: github.repository_owner == 'TryGhost'
    name: Add migration review requirements
    steps:
      - uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
        with:
          script: |
            github.rest.issues.addLabels({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              labels: ["migration"]
            })

      - uses: peter-evans/create-or-update-comment@57232238742e38b2ccc27136ce596ccae7ca28b4
        with:
          issue-number: ${{ github.event.pull_request.number }}
          body: |
            It looks like this PR contains a migration 👀
            Here's the checklist for reviewing migrations:

            ### General requirements

            - [ ] :warning: Tested performance on staging database servers, as performance on local machines is not comparable to a production environment
            - [ ] Satisfies idempotency requirement (both `up()` and `down()`)
            - [ ] Does not reference models
            - [ ] Filename is in the correct format (and correctly ordered)
            - [ ] Targets the next minor version
            - [ ] All code paths have appropriate log messages
            - [ ] Uses the correct utils
            - [ ] Contains a minimal changeset
            - [ ] Does not mix DDL/DML operations
            - [ ] Tested in MySQL and SQLite

            ### Schema changes

            - [ ] Both schema change and related migration have been implemented
            - [ ] For index changes: has been performance tested for large tables
            - [ ] For new tables/columns: fields use the appropriate predefined field lengths
            - [ ] For new tables/columns: field names follow the appropriate conventions
            - [ ] Does not drop a non-alpha table outside of a major version

            ### Data changes

            - [ ] Mass updates/inserts are batched appropriately
            - [ ] Does not loop over large tables/datasets
            - [ ] Defends against missing or invalid data
            - [ ] For settings updates: follows the appropriate guidelines
@@ -0,0 +1,137 @@
name: PR Preview

on:
  pull_request_target:
    types: [labeled, unlabeled, closed]

jobs:
  deploy:
    name: Deploy Preview
    # Runs when the "preview" label is added — requires collaborator write access
    if: >-
      github.event.action == 'labeled'
      && github.event.label.name == 'preview'
    runs-on: ubuntu-latest
    permissions:
      contents: read
      actions: read
    env:
      HEAD_SHA: ${{ github.event.pull_request.head.sha }}
    steps:
      - name: Wait for Docker build job
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          BUILD_JOB_NAME: Build & Publish Artifacts
        run: |
          echo "Waiting for '${BUILD_JOB_NAME}' job to complete for $HEAD_SHA..."
          TIMEOUT=1800 # 30 minutes
          INTERVAL=30
          START=$(date +%s)

          while true; do
            ELAPSED=$(( $(date +%s) - START ))
            if [ "$ELAPSED" -ge "$TIMEOUT" ]; then
              echo "::error::Timed out waiting for '${BUILD_JOB_NAME}' (${TIMEOUT}s)"
              exit 1
            fi

            # Find the CI run for this SHA
            RUN=$(gh api "repos/${{ github.repository }}/actions/workflows/ci.yml/runs?head_sha=${HEAD_SHA}&per_page=1" \
              --jq '.workflow_runs[0] | {id, status}' 2>/dev/null || echo "")

            if [ -z "$RUN" ] || [ "$RUN" = "null" ]; then
              echo " No CI run found yet, waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
              sleep "$INTERVAL"
              continue
            fi

            RUN_ID=$(echo "$RUN" | jq -r '.id')
            RUN_STATUS=$(echo "$RUN" | jq -r '.status')

            # Look up the build job specifically (paginate — CI has 30+ jobs)
            BUILD_JOB=$(gh api --paginate "repos/${{ github.repository }}/actions/runs/${RUN_ID}/jobs?per_page=100" \
              --jq ".jobs[] | select(.name == \"${BUILD_JOB_NAME}\") | {status, conclusion}")

            if [ -z "$BUILD_JOB" ]; then
              if [ "$RUN_STATUS" = "completed" ]; then
                echo "::error::CI run ${RUN_ID} completed but '${BUILD_JOB_NAME}' job was not found"
                exit 1
              fi
              echo " '${BUILD_JOB_NAME}' job not started yet (run ${RUN_STATUS}), waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
              sleep "$INTERVAL"
              continue
            fi

            JOB_STATUS=$(echo "$BUILD_JOB" | jq -r '.status')
            JOB_CONCLUSION=$(echo "$BUILD_JOB" | jq -r '.conclusion // empty')

            if [ "$JOB_STATUS" = "completed" ]; then
              if [ "$JOB_CONCLUSION" = "success" ]; then
                echo "Docker build ready (CI run $RUN_ID)"
                break
              fi
              echo "::error::'${BUILD_JOB_NAME}' did not succeed (conclusion: $JOB_CONCLUSION)"
              exit 1
            fi

            echo " '${BUILD_JOB_NAME}' still ${JOB_STATUS}, waiting ${INTERVAL}s... (${ELAPSED}s elapsed)"
            sleep "$INTERVAL"
          done

      - name: Re-check PR eligibility
        id: recheck
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          PR=$(gh api "repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}" \
            --jq '{state, labels: [.labels[].name]}')

          STATE=$(echo "$PR" | jq -r '.state')
          HAS_LABEL=$(echo "$PR" | jq '.labels | any(. == "preview")')

          if [ "$STATE" != "open" ]; then
            echo "::warning::PR is no longer open ($STATE), skipping dispatch"
            echo "skip=true" >> "$GITHUB_OUTPUT"
          elif [ "$HAS_LABEL" != "true" ]; then
            echo "::warning::preview label was removed, skipping dispatch"
            echo "skip=true" >> "$GITHUB_OUTPUT"
          else
            echo "PR still eligible for preview deploy"
            echo "skip=false" >> "$GITHUB_OUTPUT"
          fi

      - name: Dispatch deploy to Ghost-Moya
        if: steps.recheck.outputs.skip != 'true'
        uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4
        with:
          token: ${{ secrets.CANARY_DOCKER_BUILD }}
          repository: TryGhost/Ghost-Moya
          event-type: preview-deploy
          client-payload: >-
            {
              "pr_number": "${{ github.event.pull_request.number }}",
              "action": "deploy",
              "seed": "true"
            }

  destroy:
    name: Destroy Preview
    # Runs when "preview" label is removed, or the PR is closed/merged while labeled
    if: >-
      (github.event.action == 'unlabeled' && github.event.label.name == 'preview')
      || (github.event.action == 'closed' && contains(github.event.pull_request.labels.*.name, 'preview'))
    runs-on: ubuntu-latest
    permissions:
      contents: read
    steps:
      - name: Dispatch destroy to Ghost-Moya
        uses: peter-evans/repository-dispatch@28959ce8df70de7be546dd1250a005dd32156697 # v4
        with:
          token: ${{ secrets.CANARY_DOCKER_BUILD }}
          repository: TryGhost/Ghost-Moya
          event-type: preview-destroy
          client-payload: >-
            {
              "pr_number": "${{ github.event.pull_request.number }}",
              "action": "destroy"
            }
@@ -0,0 +1,46 @@
name: Publish tb-cli Image

on:
  workflow_dispatch: # Manual trigger from GitHub UI or CLI
  push:
    branches: [main]
    paths:
      - 'docker/tb-cli/**'

permissions:
  contents: read
  packages: write

jobs:
  publish:
    name: Build and push tb-cli to GHCR
    runs-on: ubuntu-latest
    if: github.repository == 'TryGhost/Ghost' && github.ref == 'refs/heads/main'
    concurrency:
      group: publish-tb-cli
      cancel-in-progress: true
    steps:
      - name: Checkout
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4

      - name: Login to GHCR
        uses: docker/login-action@4907a6ddec9925e35a0a9e82d7399ccc52663121 # v4
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Build and push
        uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7
        with:
          context: .
          file: docker/tb-cli/Dockerfile
          push: true
          tags: |
            ghcr.io/tryghost/tb-cli:latest
            ghcr.io/tryghost/tb-cli:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
@@ -0,0 +1,112 @@
name: Release
run-name: "Release — ${{ inputs.bump-type || 'auto' }} from ${{ inputs.branch || 'main' }}${{ inputs.dry-run && ' (dry run)' || '' }}"

on:
  schedule:
    - cron: '0 15 * * 5' # Friday 3pm UTC
  workflow_dispatch:
    inputs:
      branch:
        description: 'Git branch to release from'
        type: string
        default: 'main'
        required: false
      bump-type:
        description: 'Version bump type (auto, patch, minor)'
        type: string
        required: false
        default: 'auto'
      skip-checks:
        description: 'Skip CI status check verification'
        type: boolean
        default: false
      dry-run:
        description: 'Dry run (version bump without push)'
        type: boolean
        default: false

env:
  FORCE_COLOR: 1
  NODE_VERSION: 22.18.0
concurrency:
  group: ${{ github.workflow }}
  cancel-in-progress: false

jobs:
  release:
    runs-on: ubuntu-latest
    name: Prepare & Push Release
    steps:
      - uses: webfactory/ssh-agent@e83874834305fe9a4a2997156cb26c5de65a8555 # v0.10.0
        with:
          ssh-private-key: ${{ secrets.DEPLOY_KEY }}

      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          # Deploy key (via ssh-agent) is used for git push — it bypasses
          # branch protection and triggers downstream workflows (unlike GITHUB_TOKEN)
          ref: ${{ inputs.branch || 'main' }}
          fetch-depth: 0
          ssh-key: ${{ secrets.DEPLOY_KEY }}

      # Fetch submodules separately via HTTPS — the deploy key is scoped to
      # Ghost only and can't authenticate against Casper/Source over SSH
      - run: git submodule update --init

      - uses: pnpm/action-setup@b906affcce14559ad1aafd4ab0e942779e9f58b1 # v4
      - uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6
        env:
          FORCE_COLOR: 0
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: pnpm

      - name: Install dependencies
        run: pnpm install --frozen-lockfile

      - name: Set up Git
        run: |
          git config user.name "Ghost CI"
          git config user.email "41898282+github-actions[bot]@users.noreply.github.com"

      - name: Set up schedule defaults
        if: github.event_name == 'schedule'
        run: |
          echo "RELEASE_BRANCH=main" >> "$GITHUB_ENV"
          echo "RELEASE_BUMP_TYPE=auto" >> "$GITHUB_ENV"
          echo "RELEASE_DRY_RUN=" >> "$GITHUB_ENV"
          echo "RELEASE_SKIP_CHECKS=" >> "$GITHUB_ENV"

      - name: Set up workflow_dispatch inputs
        if: github.event_name == 'workflow_dispatch'
        run: |
          echo "RELEASE_BRANCH=${INPUT_BRANCH}" >> "$GITHUB_ENV"
          echo "RELEASE_BUMP_TYPE=${INPUT_BUMP_TYPE}" >> "$GITHUB_ENV"
          echo "RELEASE_DRY_RUN=${INPUT_DRY_RUN}" >> "$GITHUB_ENV"
          echo "RELEASE_SKIP_CHECKS=${INPUT_SKIP_CHECKS}" >> "$GITHUB_ENV"
        env:
          INPUT_BRANCH: ${{ inputs.branch }}
          INPUT_BUMP_TYPE: ${{ inputs.bump-type }}
          INPUT_DRY_RUN: ${{ inputs.dry-run }}
          INPUT_SKIP_CHECKS: ${{ inputs.skip-checks }}

      - name: Run release script
        run: |
          ARGS="--branch=${{ env.RELEASE_BRANCH }} --bump-type=${{ env.RELEASE_BUMP_TYPE }}"
          if [ "${{ env.RELEASE_DRY_RUN }}" = "true" ]; then
            ARGS="$ARGS --dry-run"
          fi
          if [ "${{ env.RELEASE_SKIP_CHECKS }}" = "true" ]; then
            ARGS="$ARGS --skip-checks"
          fi
          node scripts/release.js $ARGS
        env:
          GITHUB_TOKEN: ${{ secrets.CANARY_DOCKER_BUILD }} # PAT for GitHub API (check polling)

      - name: Notify on failure
        if: failure()
        uses: tryghost/actions/actions/slack-build@20b5ae5f266e86f7b5f0815d92731d6388b8ce46 # main
        with:
          status: ${{ job.status }}
        env:
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
@@ -0,0 +1,26 @@
name: 'Close stale i18n PRs'
on:
  workflow_dispatch:
  schedule:
    - cron: '0 6 * * *'
jobs:
  stale:
    if: github.repository_owner == 'TryGhost'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10
        with:
          stale-pr-message: |
            Thanks for contributing to Ghost's i18n :)

            This PR has been automatically marked as stale because there has not been any activity here in 3 weeks.
            I18n PRs tend to get out of date quickly, so we're closing them to keep the PR list clean.

            If you're still interested in working on this PR, please let us know. Otherwise this PR will be closed shortly, but can always be reopened later. Thank you for understanding 🙂
          only-labels: 'affects:i18n'
          days-before-pr-stale: 21
          days-before-pr-close: 7
          exempt-pr-labels: 'feature,pinned,needs:triage'
          stale-pr-label: 'stale'
          close-pr-message: |
            This PR has been automatically closed due to inactivity. If you'd like to continue working on it, feel free to open a new PR.
@@ -0,0 +1,29 @@
name: 'Close stale issues and PRs'
on:
  workflow_dispatch:
  schedule:
    - cron: '0 6 * * *'
jobs:
  stale:
    if: github.repository_owner == 'TryGhost'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@b5d41d4e1d5dceea10e7104786b73624c18a190f # v10
        with:
          stale-issue-message: |
            Our bot has automatically marked this issue as stale because there has not been any activity here in some time.

            The issue will be closed soon if there are no further updates, however we ask that you do not post comments to keep the issue open if you are not actively working on a PR.

            We keep the issue list minimal so we can keep focus on the most pressing issues. Closed issues can always be reopened if a new contributor is found. Thank you for understanding 🙂
          stale-pr-message: |
            Our bot has automatically marked this PR as stale because there has not been any activity here in some time.

            If we’ve missed reviewing your PR & you’re still interested in working on it, please let us know. Otherwise this PR will be closed shortly, but can always be reopened later. Thank you for understanding 🙂
          exempt-issue-labels: 'feature,pinned,needs:triage'
          exempt-pr-labels: 'feature,pinned,needs:triage'
          days-before-stale: 113
          days-before-pr-stale: 358
          stale-issue-label: 'stale'
          stale-pr-label: 'stale'
          close-issue-reason: 'not_planned'
@@ -0,0 +1,211 @@
# Node template

# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage
coverage*

# nyc test coverage
.nyc_output

# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# Bower dependency directory (https://bower.io/)
bower_components

# node-waf configuration
.lock-wscript

# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release

# Dependency directories
node_modules/
jspm_packages/

# Typescript v1 declaration files
typings/

# Optional npm cache directory
.npm

# Nx
.nxcache
.nx/cache
.nx/workspace-data

# Optional eslint cache
.eslintcache

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# pnpm store
.pnpm-store

# dotenv environment variables file
.env

# IDE
.idea/
*.iml
*.sublime-*
.vscode/*
!.vscode/launch.json
!.vscode/settings.json

# OSX
.DS_Store

!test/utils/fixtures/**/*.csv

# Ghost DB file
*.db
*.db-journal

/ghost/core/test-results/
/ghost/core/core/server/data/export/exported*
/ghost/core/content/tmp/*
/ghost/core/content/data/*
/ghost/core/content/logs/*
/ghost/core/content/settings/*
/ghost/core/content/apps/**/*
/ghost/core/content/themes/**/*
/ghost/core/content/images/**/*
/ghost/core/content/media/**/*
/ghost/core/content/files/**/*
/ghost/core/content/public/*
/ghost/core/content/adapters/storage/**/*
/ghost/core/content/adapters/scheduling/**/*
/ghost/core/content/themes/casper
/ghost/core/content/themes/source
!/ghost/core/README.md
!/ghost/core/content/**/README.md

# Changelog, which is autogenerated, not committed
/ghost/core/CHANGELOG.md

# Assets bundled into the release but we don't want to commit
/ghost/core/LICENSE
/ghost/core/PRIVACY.md
/ghost/core/README.md
/ghost/core/pnpm-lock.yaml

# Test generated files
test/functional/*.png

# ignore all custom json files for config
/ghost/core/config.*.json
/ghost/core/config.*.jsonc

# Built asset files
/ghost/core/core/built
/ghost/core/core/frontend/public/ghost.min.css
/ghost/core/core/frontend/public/comment-counts.min.js
/ghost/core/core/frontend/public/member-attribution.min.js
/ghost/core/core/frontend/public/ghost-stats.min.js
/ghost/core/core/frontend/public/private.min.js
# Caddyfile - for local development with ssl + caddy
Caddyfile
!docker/caddy/Caddyfile
!docker/dev-gateway/Caddyfile

# Playwright state with cookies it keeps across tests
/ghost/core/playwright-state.json
/ghost/core/playwright-report
/playwright-report
/test-results

# Admin
/ghost/admin/dist

# Comments-UI
/apps/comments-ui/umd
/apps/comments-ui/playwright-report
/ghost/comments-ui/playwright/.cache/
/apps/comments-ui/test-results/

# Portal
!/apps/portal/.env
/apps/portal/umd

# Sodo-Search
/apps/sodo-search/public/main.css
/apps/sodo-search/umd

# Signup Form and local environments
/apps/signup-form/umd
/apps/signup-form/.env*.local
/apps/signup-form/test-results/
/apps/signup-form/playwright-report/
/apps/signup-form/playwright/.cache/

# Announcement-Bar
/apps/announcement-bar/umd

# Build files
/apps/*/build
/ghost/*/build
# Typescript build artifacts
tsconfig.tsbuildinfo

# Admin X
/apps/admin-x-settings/dist
/apps/admin-x-settings/dist-ssr
/apps/admin-x-settings/test-results/
/apps/admin-x-settings/playwright-report/
/apps/admin-x-settings/playwright/.cache/

# Tinybird
.tinyb
.venv
.diff_tmp
temp*.sql

# Docker pnpm Cache
.pnpmhash

# yalc — for linking local packages in a docker compatible way
yalc.lock
.yalc

# A folder for AI generated files
# useful for keeping local plans etc
/ai

# direnv environment loader files
.envrc

# Private Claude Code instructions
*.local.md
.claude/settings.local.json
.mcp.local.json

# e2e test suite
/e2e/test-results
/e2e/playwright-report
/e2e/build
/e2e/playwright
/e2e/data
.env.tinybird
.cursor/rules/nx-rules.mdc
.github/instructions/nx.instructions.md
@@ -0,0 +1,8 @@
[submodule "ghost/core/content/themes/casper"]
	path = ghost/core/content/themes/casper
	url = ../../TryGhost/Casper.git
	ignore = all
[submodule "ghost/core/content/themes/source"]
	path = ghost/core/content/themes/source
	url = ../../TryGhost/Source.git
	ignore = all
@@ -0,0 +1,69 @@
const path = require('path');

const SCOPED_WORKSPACES = [
    'e2e',
    'apps/admin',
    'apps/posts',
    'apps/shade'
];

function normalize(file) {
    return file.split(path.sep).join('/');
}

function isInWorkspace(file, workspace) {
    const normalizedFile = normalize(path.relative(process.cwd(), file));
    const normalizedWorkspace = normalize(workspace);

    return normalizedFile === normalizedWorkspace || normalizedFile.startsWith(`${normalizedWorkspace}/`);
}

function shellQuote(value) {
    return `'${value.replace(/'/g, `'\\''`)}'`;
}

function buildScopedEslintCommand(workspace, files) {
    if (files.length === 0) {
        return null;
    }

    const relativeFiles = files
        .map(file => normalize(path.relative(workspace, file)))
        .map(shellQuote)
        .join(' ');

    return `pnpm --dir ${shellQuote(workspace)} exec eslint --cache ${relativeFiles}`;
}

function buildRootEslintCommand(files) {
    if (files.length === 0) {
        return null;
    }

    const quotedFiles = files.map(file => shellQuote(normalize(file))).join(' ');
    return `eslint --cache ${quotedFiles}`;
}

module.exports = {
    '*.{js,ts,tsx,jsx,cjs}': (files) => {
        const workspaceGroups = new Map(SCOPED_WORKSPACES.map(workspace => [workspace, []]));
        const rootFiles = [];

        for (const file of files) {
            const workspace = SCOPED_WORKSPACES.find(candidate => isInWorkspace(file, candidate));

            if (workspace) {
                workspaceGroups.get(workspace).push(file);
            } else {
                rootFiles.push(file);
            }
        }

        return [
            ...SCOPED_WORKSPACES
                .map(workspace => buildScopedEslintCommand(workspace, workspaceGroups.get(workspace)))
                .filter(Boolean),
            buildRootEslintCommand(rootFiles)
        ].filter(Boolean);
    }
};
@@ -0,0 +1,54 @@
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "args": [
                "--require",
                "./test/utils/overrides.js",
                "-u",
                "bdd",
                "--timeout",
                "999999",
                "--colors",
                "./test/e2e-api/**/*.test.js"
            ],
            "cwd": "${workspaceFolder}/ghost/core",
            "internalConsoleOptions": "openOnSessionStart",
            "name": "E2E API Tests",
            "program": "./node_modules/.bin/_mocha",
            "request": "launch",
            "skipFiles": [
                "<node_internals>/**"
            ],
            "env": {
                "NODE_ENV": "testing-mysql"
            },
            "type": "node"
        },
        {
            "args": [
                "-u",
                "bdd",
                "--timeout",
                "999999",
                "--colors",
                "./test/**/*.test.js"
            ],
            "cwd": "${workspaceFolder}/ghost/email-service/",
            "internalConsoleOptions": "openOnSessionStart",
            "name": "Email Service Unit Tests",
            "program": "./node_modules/.bin/_mocha",
            "request": "launch",
            "skipFiles": [
                "<node_internals>/**"
            ],
            "env": {
                "NODE_ENV": "testing-mysql"
            },
            "type": "node"
        }
    ]
}
@@ -0,0 +1,31 @@
{
    "editor.quickSuggestions": {
        "strings": true
    },
    "eslint.workingDirectories": [
        {
            "pattern": "./apps/*/"
        },
        {
            "pattern": "./ghost/*/"
        }
    ],
    "search.exclude": {
        "**/.git": true,
        "**/build/*": true,
        "**/coverage/**": true,
        "**/dist/**": true,
        "**/ghost.map": true,
        "**/node_modules": true,
        "ghost/core/core/built/**": true,
        "**/config.*.json": false,
        "**/config.*.jsonc": false
    },
    "tailwindCSS.experimental.classRegex": [
        ["clsx\\(([^)]*)\\)", "(?:'|\"|`)([^']*)(?:'|\"|`)"]
    ],
    "git.detectSubmodules": false,
    "[typescript]": {
        "editor.defaultFormatter": "dbaeumer.vscode-eslint"
    }
}
@@ -0,0 +1,283 @@
|
||||
# AGENTS.md
|
||||
|
||||
This file provides guidance to AI Agents when working with code in this repository.
|
||||
|
||||
## Package Manager
|
||||
|
||||
**Always use `pnpm` for all commands.** This repository uses pnpm workspaces, not npm.
|
||||
|
||||
## Monorepo Structure
|
||||
|
||||
Ghost is a pnpm + Nx monorepo with three workspace groups:
|
||||
|
||||
### ghost/* - Core Ghost packages
|
||||
- **ghost/core** - Main Ghost application (Node.js/Express backend)
|
||||
- Core server: `ghost/core/core/server/`
|
||||
- Frontend rendering: `ghost/core/core/frontend/`
|
||||
- **ghost/admin** - Ember.js admin client (legacy, being migrated to React)
|
||||
- **ghost/i18n** - Centralized internationalization for all apps
|
||||
|
||||
### apps/* - React-based UI applications
|
||||
Two categories of apps:
|
||||
|
||||
**Admin Apps** (embedded in Ghost Admin):
|
||||
- `admin-x-settings`, `admin-x-activitypub` - Settings and integrations
|
||||
- `posts`, `stats` - Post analytics and site-wide analytics
|
||||
- Built with Vite + React + `@tanstack/react-query`
|
||||
|
||||
**Public Apps** (served to site visitors):
|
||||
- `portal`, `comments-ui`, `signup-form`, `sodo-search`, `announcement-bar`
|
||||
- Built as UMD bundles, loaded via CDN in site themes
|
||||
|
||||
**Foundation Libraries**:
|
||||
- `admin-x-framework` - Shared API hooks, routing, utilities
|
||||
- `admin-x-design-system` - Legacy design system (being phased out)
|
||||
- `shade` - New design system (shadcn/ui + Radix UI + react-hook-form + zod)
|
||||
|
||||
### e2e/ - End-to-end tests
|
||||
- Playwright-based E2E tests with Docker container isolation
|
||||
- See `e2e/CLAUDE.md` for detailed testing guidance
|
||||
|
||||
## Common Commands
|
||||
|
||||
### Development
|
||||
```bash
|
||||
corepack enable pnpm # Enable corepack to use the correct pnpm version
|
||||
pnpm run setup # First-time setup (installs deps + submodules)
|
||||
pnpm dev # Start development (Docker backend + host frontend dev servers)
|
||||
```
|
||||
|
||||
### Building
|
||||
```bash
|
||||
pnpm build # Build all packages (Nx handles dependencies)
|
||||
pnpm build:clean # Clean build artifacts and rebuild
|
||||
```
|
||||
|
||||
### Testing
|
||||
```bash
|
||||
# Unit tests (from root)
|
||||
pnpm test:unit # Run all unit tests in all packages
|
||||
|
||||
# Ghost core tests (from ghost/core/)
|
||||
cd ghost/core
|
||||
pnpm test:unit # Unit tests only
|
||||
pnpm test:integration # Integration tests
|
||||
pnpm test:e2e # E2E API tests (not browser)
|
||||
pnpm test:all # All test types
|
||||
|
||||
# E2E browser tests (from root)
|
||||
pnpm test:e2e # Run e2e/ Playwright tests
|
||||
|
||||
# Running a single test
|
||||
cd ghost/core
|
||||
pnpm test:single test/unit/path/to/test.test.js
|
||||
```
|
||||
|
||||
### Linting
|
||||
```bash
|
||||
pnpm lint # Lint all packages
|
||||
cd ghost/core && pnpm lint # Lint Ghost core (server, shared, frontend, tests)
|
||||
cd ghost/admin && pnpm lint # Lint Ember admin
|
||||
```
|
||||
|
||||
### Database
|
||||
```bash
|
||||
pnpm knex-migrator migrate # Run database migrations
|
||||
pnpm reset:data # Reset database with test data (1000 members, 100 posts) (requires pnpm dev running)
|
||||
pnpm reset:data:empty # Reset database with no data (requires pnpm dev running)
|
||||
```
|
||||
|
||||
### Docker
|
||||
```bash
|
||||
pnpm docker:build # Build Docker images
|
||||
pnpm docker:clean # Stop containers, remove volumes and local images
|
||||
pnpm docker:down # Stop containers
|
||||
```
|
||||
|
||||
### How `pnpm dev` works
|
||||
|
||||
The `pnpm dev` command uses a **hybrid Docker + host development** setup:
|
||||
|
||||
**What runs in Docker:**
|
||||
- Ghost Core backend (with hot-reload via mounted source)
|
||||
- MySQL, Redis, Mailpit
|
||||
- Caddy gateway/reverse proxy
|
||||
|
||||
**What runs on host:**
|
||||
- Frontend dev servers (Admin, Portal, Comments UI, etc.) in watch mode with HMR
|
||||
- Foundation libraries (shade, admin-x-framework, etc.)
|
||||
|
||||
**Setup:**
|
||||
```bash
|
||||
# Start everything (Docker + frontend dev servers)
|
||||
pnpm dev
|
||||
|
||||
# With optional services (uses Docker Compose file composition)
|
||||
pnpm dev:analytics # Include Tinybird analytics
|
||||
pnpm dev:storage # Include MinIO S3-compatible object storage
|
||||
pnpm dev:all # Include all optional services
|
||||
```
|
||||
|
||||
**Accessing Services:**
|
||||
- Ghost: `http://localhost:2368` (database: `ghost_dev`)
|
||||
- Mailpit UI: `http://localhost:8025` (email testing)
|
||||
- MySQL: `localhost:3306`
|
||||
- Redis: `localhost:6379`
|
||||
- Tinybird: `http://localhost:7181` (when analytics enabled)
|
||||
- MinIO Console: `http://localhost:9001` (when storage enabled)
|
||||
- MinIO S3 API: `http://localhost:9000` (when storage enabled)

## Architecture Patterns

### Admin Apps Integration (Micro-Frontend)

**Build Process:**
1. Admin-x React apps build to `apps/*/dist` using Vite
2. `ghost/admin/lib/asset-delivery` copies them to `ghost/core/core/built/admin/assets/*`
3. Ghost admin serves from `/ghost/assets/{app-name}/{app-name}.js`

**Runtime Loading:**
- Ember admin uses `AdminXComponent` to dynamically import React apps
- React components wrapped in Suspense with error boundaries
- Apps receive config via `additionalProps()` method

### Public Apps Integration

- Built as UMD bundles to `apps/*/umd/*.min.js`
- Loaded via `<script>` tags in theme templates (injected by `{{ghost_head}}`)
- Configuration passed via data attributes

### i18n Architecture

**Centralized Translations:**
- Single source: `ghost/i18n/locales/{locale}/{namespace}.json`
- Namespaces: `ghost`, `portal`, `signup-form`, `comments`, `search`
- 60+ supported locales
- Context descriptions: `ghost/i18n/locales/context.json` — every key must have a non-empty description

**Translation Workflow:**
```bash
pnpm --filter @tryghost/i18n translate # Extract keys from source, update all locale files + context.json
pnpm --filter @tryghost/i18n lint:translations # Validate interpolation variables across locales
```

`translate` is run as part of `pnpm --filter @tryghost/i18n test`. In CI, it fails if translation keys or `context.json` are out of date (`failOnUpdate: process.env.CI`). Always run `pnpm --filter @tryghost/i18n translate` after adding or changing `t()` calls.

**Rules for Translation Keys:**
1. **Never split sentences across multiple `t()` calls.** Translators cannot reorder words across separate keys. Instead, use `@doist/react-interpolate` to embed React elements (links, bold, etc.) within a single translatable string.
2. **Always provide context descriptions.** When adding a new key, add a description in `context.json` explaining where the string appears and what it does. CI will reject empty descriptions.
3. **Use interpolation for dynamic values.** Ghost uses `{variable}` syntax: `t('Welcome back, {name}!', {name: firstname})`
4. **Use `<tag>` syntax for inline elements.** Combined with `@doist/react-interpolate`: `t('Click <a>here</a> to retry')` with `mapping={{ a: <a href="..." /> }}`

**Correct pattern (using Interpolate):**
```jsx
import Interpolate from '@doist/react-interpolate';

<Interpolate
    mapping={{ a: <a href={link} /> }}
    string={t('Could not sign in. <a>Click here to retry</a>')}
/>
```

**Incorrect pattern (split sentences):**
```jsx
// BAD: translators cannot reorder "Click here to retry" relative to the first sentence
{t('Could not sign in.')} <a href={link}>{t('Click here to retry')}</a>
```

See `apps/portal/src/components/pages/email-receiving-faq.js` for a canonical example of correct `Interpolate` usage.

### Build Dependencies (Nx)

Critical build order (Nx handles automatically):
1. `shade` + `admin-x-design-system` build
2. `admin-x-framework` builds (depends on #1)
3. Admin apps build (depend on #2)
4. `ghost/admin` builds (depends on #3, copies via asset-delivery)
5. `ghost/core` serves admin build

## CSS Architecture

### TailwindCSS v4 Setup

Ghost Admin uses **TailwindCSS v4** via the `@tailwindcss/vite` plugin. CSS processing is centralized — only `apps/admin/vite.config.ts` loads the `@tailwindcss/vite` plugin. All embedded React apps (posts, stats, activitypub, admin-x-settings, admin-x-design-system) are scanned from this single entry point.

### Entry Point

`apps/admin/src/index.css` is the main CSS entry point. It contains:
- `@source` directives that scan class usage in shade, posts, stats, activitypub, admin-x-settings, admin-x-design-system, and kg-unsplash-selector
- `@import "@tryghost/shade/styles.css"` which loads the Shade design system styles

### Shade Styles

`apps/shade/styles.css` uses **unlayered** Tailwind imports:
```css
@import "tailwindcss/theme.css";
@import "./preflight.css";
@import "tailwindcss/utilities.css";
@import "tw-animate-css";
@import "./tailwind.theme.css";
```

**Why unlayered:** Ember's legacy CSS (`.flex`, `.hidden`, etc.) is unlayered. If Tailwind utilities were in a `@layer`, they would lose to Ember's unlayered CSS in the cascade. Keeping both unlayered means source order determines specificity.
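
A minimal CSS sketch of the cascade behavior described above (the selectors are illustrative, not Ghost's actual rules):

```css
/* If the utility were layered, the unlayered legacy rule would win,
   even though the layered rule appears later in source order: */
@layer utilities {
    .hidden { display: none; } /* layered — always loses to unlayered author CSS */
}
.hidden { display: block; }    /* unlayered legacy rule — wins */

/* With both unlayered, plain source order decides instead: */
.u-flex { display: block; }    /* legacy rule */
.u-flex { display: flex; }     /* utility loaded later — wins */
```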

Theme tokens/variants/animations are defined in CSS (`apps/shade/tailwind.theme.css` + runtime vars in `styles.css`), so there is no JS `@config` bridge in the Admin runtime lane. `tw-animate-css` is the v4 replacement for `tailwindcss-animate`.

### Critical Rule: Embedded Apps Must NOT Import Shade Independently

Apps consumed via `@source` (posts, stats, activitypub) must **NOT** import `@tryghost/shade/styles.css` in their own CSS. Doing so causes duplicate Tailwind utilities and cascade conflicts. All Tailwind CSS is generated once via the admin entry point.
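
As a hypothetical sketch of this arrangement (the actual file contents and relative paths may differ), the centralized entry point looks roughly like:

```css
/* apps/admin/src/index.css — hypothetical sketch, not the real file */
@import "@tryghost/shade/styles.css"; /* Shade styles, imported exactly once */

/* Scan embedded apps for class usage; utilities are generated only here */
@source "../../shade/src";
@source "../../posts/src";
@source "../../stats/src";
@source "../../activitypub/src";
```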

### Public Apps

Public-facing apps (`comments-ui`, `signup-form`, `sodo-search`, `portal`, `announcement-bar`) remain on **TailwindCSS v3**. They are built as UMD bundles for CDN distribution and are independent of the admin CSS pipeline.

### Legacy Apps

`admin-x-design-system` and `admin-x-settings` are consumed via `@source` in admin's centralized v4 pipeline for production, and both packages build with a CSS-first Tailwind v4 setup.

## Code Guidelines

### Commit Messages
When the user asks you to create a commit or draft a commit message, load and follow the `commit` skill from `.agents/skills/commit`.

### When Working on Admin UI
- **New features:** Build in React (`apps/admin-x-*` or `apps/posts`)
- **Use:** `admin-x-framework` for API hooks (`useBrowse`, `useEdit`, etc.)
- **Use:** `shade` design system for new components (not admin-x-design-system)
- **Translations:** Add to `ghost/i18n/locales/en/ghost.json`

### When Working on Public UI
- **Edit:** `apps/portal`, `apps/comments-ui`, etc.
- **Translations:** Separate namespaces (`portal.json`, `comments.json`)
- **Build:** UMD bundles for CDN distribution

### When Working on Backend
- **Core logic:** `ghost/core/core/server/`
- **Database Schema:** `ghost/core/core/server/data/schema/`
- **API routes:** `ghost/core/core/server/api/`
- **Services:** `ghost/core/core/server/services/`
- **Models:** `ghost/core/core/server/models/`
- **Frontend & theme rendering:** `ghost/core/core/frontend/`

### Design System Usage
- **New components:** Use `shade` (shadcn/ui-inspired)
- **Legacy:** `admin-x-design-system` (being phased out, avoid for new work)

### Analytics (Tinybird)
- **Local development:** `pnpm dev:analytics` (starts Tinybird + MySQL)
- **Config:** Add Tinybird config to `ghost/core/config.development.json`
- **Scripts:** `ghost/core/core/server/data/tinybird/scripts/`
- **Datafiles:** `ghost/core/core/server/data/tinybird/`

## Troubleshooting

### Build Issues
```bash
pnpm fix # Clean cache + node_modules + reinstall
pnpm build:clean # Clean build artifacts
pnpm nx reset # Reset Nx cache
```

### Test Issues
- **E2E failures:** Check `e2e/CLAUDE.md` for debugging tips
- **Docker issues:** `pnpm docker:clean && pnpm docker:build`
@@ -0,0 +1,61 @@
# syntax=docker/dockerfile:1-labs@sha256:7eca9451d94f9b8ad22e44988b92d595d3e4d65163794237949a8c3413fbed5d

# Production Dockerfile for Ghost
# Two targets:
#   core — server + production deps, no admin (Ghost-Pro base)
#   full — core + built admin (self-hosting)
#
# Build context: extracted `npm pack` output from ghost/core

ARG NODE_VERSION=22.18.0

# ---- Core: server + production deps ----
FROM node:$NODE_VERSION-bookworm-slim AS core

ARG GHOST_BUILD_VERSION=""
ENV NODE_ENV=production
ENV GHOST_BUILD_VERSION=${GHOST_BUILD_VERSION}

RUN apt-get update && \
    apt-get install -y --no-install-recommends libjemalloc2 fontconfig && \
    rm -rf /var/lib/apt/lists/* && \
    groupmod -g 1001 node && \
    usermod -u 1001 node && \
    adduser --disabled-password --gecos "" -u 1000 ghost

WORKDIR /home/ghost

COPY --exclude=core/built/admin . .

RUN corepack enable

RUN --mount=type=cache,target=/root/.local/share/pnpm/store,id=pnpm-store \
    apt-get update && \
    apt-get install -y --no-install-recommends build-essential python3 && \
    pnpm install --ignore-scripts --prod --prefer-offline && \
    (cd node_modules/sqlite3 && npm run install) && \
    apt-get purge -y build-essential python3 && \
    apt-get autoremove -y && \
    rm -rf /var/lib/apt/lists/* && \
    mkdir -p default log && \
    cp -R content base_content && \
    cp -R content/themes/casper default/casper && \
    ([ -d content/themes/source ] && cp -R content/themes/source default/source || true) && \
    chown ghost:ghost /home/ghost && \
    chown -R nobody:nogroup /home/ghost/* && \
    chown -R ghost:ghost /home/ghost/content /home/ghost/log

USER ghost
ENV LD_PRELOAD=libjemalloc.so.2

EXPOSE 2368

CMD ["node", "index.js"]

# ---- Full: core + admin ----
FROM core AS full

USER root
COPY core/built/admin core/built/admin
RUN chown -R nobody:nogroup core/built/admin
USER ghost
@@ -0,0 +1,22 @@
Copyright (c) 2013-2026 Ghost Foundation

Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE.
@@ -0,0 +1,104 @@
<p align="center">
  <a href="https://ghost.org/#gh-light-mode-only" target="_blank">
    <img src="https://user-images.githubusercontent.com/65487235/157884383-1b75feb1-45d8-4430-b636-3f7e06577347.png" alt="Ghost" width="200px">
  </a>
  <a href="https://ghost.org/#gh-dark-mode-only" target="_blank">
    <img src="https://user-images.githubusercontent.com/65487235/157849205-aa24152c-4610-4d7d-b752-3a8c4f9319e6.png" alt="Ghost" width="200px">
  </a>
</p>

<p align="center">
  <a href="https://ghost.org/">Ghost.org</a> •
  <a href="https://forum.ghost.org">Forum</a> •
  <a href="https://ghost.org/docs/">Docs</a> •
  <a href="https://github.com/TryGhost/Ghost/blob/main/.github/CONTRIBUTING.md">Contributing</a> •
  <a href="https://twitter.com/ghost">Twitter</a>
  <br /><br />
  <a href="https://ghost.org/">
    <img src="https://img.shields.io/badge/downloads-100M+-brightgreen.svg" alt="Downloads" />
  </a>
  <a href="https://github.com/TryGhost/Ghost/releases/">
    <img src="https://img.shields.io/github/release/TryGhost/Ghost.svg" alt="Latest release" />
  </a>
  <a href="https://github.com/TryGhost/Ghost/actions">
    <img src="https://github.com/TryGhost/Ghost/workflows/CI/badge.svg?branch=main" alt="Build status" />
  </a>
  <a href="https://github.com/TryGhost/Ghost/contributors/">
    <img src="https://img.shields.io/github/contributors/TryGhost/Ghost.svg" alt="Contributors" />
  </a>
</p>

> [!NOTE]
> Love open source? We're hiring! Ghost is looking for staff engineers to [join the team](https://careers.ghost.org) and work with us full-time.

<a href="https://ghost.org/"><img src="https://user-images.githubusercontent.com/353959/169805900-66be5b89-0859-4816-8da9-528ed7534704.png" alt="Fiercely independent, professional publishing. Ghost is the most popular open source, headless Node.js CMS which already works with all the tools you know and love." /></a>

<a href="https://ghost.org/pricing/#gh-light-mode-only" target="_blank"><img src="https://user-images.githubusercontent.com/65487235/157849437-9b8fcc48-1920-4b26-a1e8-5806db0e6bb9.png" alt="Ghost(Pro)" width="165px" /></a>
<a href="https://ghost.org/pricing/#gh-dark-mode-only" target="_blank"><img src="https://user-images.githubusercontent.com/65487235/157849438-79889b04-b7b6-4ba7-8de6-4c1e4b4e16a5.png" alt="Ghost(Pro)" width="165px" /></a>

The easiest way to get a production instance deployed is with our official **[Ghost(Pro)](https://ghost.org/pricing/)** managed service. It takes about 2 minutes to launch a new site with worldwide CDN, backups, security and maintenance all done for you.

For most people this ends up being the best value option because of [how much time it saves](https://ghost.org/docs/hosting/) — and 100% of revenue goes to the Ghost Foundation, funding the maintenance and further development of the project itself. So you’ll be supporting open source software *and* getting a great service!

# Quickstart install

If you want to run your own instance of Ghost, in most cases the best way is to use our **CLI tool**:

```
npm install ghost-cli -g
```

Then, if installing locally add the `local` flag to get up and running in under a minute - [Local install docs](https://ghost.org/docs/install/local/)

```
ghost install local
```

or on a server run the full install, including automatic SSL setup using LetsEncrypt - [Production install docs](https://ghost.org/docs/install/ubuntu/)

```
ghost install
```

Check out our [official documentation](https://ghost.org/docs/) for more information about our [recommended hosting stack](https://ghost.org/docs/hosting/) & properly [upgrading Ghost](https://ghost.org/docs/update/), plus everything you need to develop your own Ghost [themes](https://ghost.org/docs/themes/) or work with [our API](https://ghost.org/docs/content-api/).

### Contributors & advanced developers

For anyone wishing to contribute to Ghost or to hack/customize core files we recommend following our full development setup guides: [Contributor guide](https://ghost.org/docs/contributing/) • [Developer setup](https://ghost.org/docs/install/source/)

# Ghost sponsors

A big thanks to our sponsors and partners who make Ghost possible. If you're interested in sponsoring Ghost and supporting the project, please check out our profile on [GitHub sponsors](https://github.com/sponsors/TryGhost) :heart:

**[DigitalOcean](https://m.do.co/c/9ff29836d717)** • **[Fastly](https://www.fastly.com/)** • **[Tinybird](https://tbrd.co/ghost)** • **[BairesDev](https://www.bairesdev.com)** • **[Hostinger](https://www.hostinger.com/)**

# Getting help

Everyone can get help and support from a large community of developers over on the [Ghost forum](https://forum.ghost.org/). **Ghost(Pro)** customers have access to 24/7 email support.

To stay up to date with all the latest news and product updates, make sure you [subscribe to our changelog newsletter](https://ghost.org/changelog/) — or follow us [on Twitter](https://twitter.com/Ghost), if you prefer your updates bite-sized and facetious. :saxophone::turtle:

# License & trademark

Copyright (c) 2013-2026 Ghost Foundation - Released under the [MIT license](LICENSE).
Ghost and the Ghost Logo are trademarks of Ghost Foundation Ltd. Please see our [trademark policy](https://ghost.org/trademark/) for info on acceptable usage.
@@ -0,0 +1,9 @@
# Reporting Security Vulnerabilities

Potential security vulnerabilities can be reported directly to us at `security@ghost.org`. The Ghost Security Team communicates privately and works in a secured, isolated repository for tracking, testing, and resolving security-related issues.

The full, up-to-date details of our security policy and procedure can always be found in our documentation:

https://ghost.org/docs/security/

Please refer to this before emailing us. Thanks for helping make Ghost safe for everyone 🙏.
@@ -0,0 +1,40 @@
# Adopt Arrange–Act–Assert (AAA) Pattern for All Tests

## Status
Proposed

## Context

Our tests are currently written in different styles, which makes them harder to read, understand, and maintain.

To improve **readability** and make it easier to **debug failing tests**, we want to standardize the structure of tests by following the well-known **Arrange–Act–Assert (AAA)** pattern.

## Decision

We will adopt the AAA pattern for tests. Every test should follow this structure:

1. **Arrange**: Set up data, mocks, page state, or environment
2. **Act**: Perform the action being tested
3. **Assert**: Check the expected outcome

## Guidelines

- ✅ Multiple actions and assertions are **allowed** as long as they belong to a **single AAA flow**
- 🚫 **Repeating the full AAA structure in a single test is discouraged**, except for performance‑sensitive tests where setup cost is prohibitively high
- ✂️ If a test involves multiple unrelated behaviors, **split it into separate test cases**
- 🧼 Keep tests focused and predictable: one test = one scenario

## Example

```ts
test('user can view their post', async ({ page }) => {
    // Arrange
    const user = await userFactory.create();
    const post = await postFactory.create({ userId: user.id });

    // Act
    await page.goto(`/posts/${post.id}`);

    // Assert
    await expect(page.getByText(post.title)).toBeVisible();
});
```
@@ -0,0 +1,101 @@
# Adopt Page Objects Pattern for E2E Test Organization

## Status
Proposed

## Context

Our Playwright tests currently interact directly with page elements using raw selectors and actions scattered throughout test files. This approach leads to several issues:

- **Code duplication**: The same selectors and interactions are repeated across multiple tests
- **Maintenance burden**: When UI changes, we need to update selectors in many places
- **Poor readability**: Tests are cluttered with low-level DOM interactions instead of focusing on business logic
- **Fragile tests**: Direct coupling between tests and implementation details makes tests brittle

To improve **maintainability**, **readability**, and **test stability**, we want to adopt the Page Objects pattern to encapsulate page-specific knowledge and provide a clean API for test interactions.

The Page Objects pattern was originally described by [Martin Fowler](https://martinfowler.com/bliki/PageObject.html) as a way to "wrap an HTML page, or fragment, with an application-specific API, allowing you to manipulate page elements without digging around in the HTML."

## Decision

We will adopt the Page Objects pattern for organizing E2E tests. Every page or major UI component should have a corresponding page object class that:

1. **Encapsulates locators**: All element selectors are defined in one place
2. **Provides semantic methods**: Expose high-level actions like `login()`, `createPost()`, `navigateToSettings()`
3. **Abstracts implementation details**: Tests interact with business concepts, not DOM elements
4. **Centralizes page-specific logic**: Complex interactions and waits are handled within page objects
5. **Keeps assertions in test files**: Page objects may include readiness guards (e.g., `locator.waitFor({state: 'visible'})`) before actions; business assertions (`expect(...)`) belong in tests
6. **Exposes semantic locators, hides selectors**: Page objects should surface public readonly `Locator`s for tests to assert on, while keeping selector strings and construction internal

## Guidelines

Following both [Fowler's original principles](https://martinfowler.com/bliki/PageObject.html) and modern Playwright best practices:

- ✅ **One page object per logical page or major component** (e.g., `LoginPage`, `PostEditor`, `AdminDashboard`)
- ✅ **Model the structure that makes sense to the user**, not necessarily the HTML structure
- ✅ **Use descriptive method names** that reflect user actions (e.g., `fillPostTitle()` not `typeInTitleInput()`)
- ✅ **Return elements or data** for assertions in tests (e.g., `getErrorMessage()` returns a locator)
- ✅ **Include wait methods** for page readiness and async operations (e.g., `waitForErrorMessage()`)
- ✅ **Chain related actions** in fluent interfaces where it makes sense
- ✅ **Keep assertions in test files**: page objects should return data/elements, tests should assert
- ✅ **Handle concurrency issues** within page objects (async operations, loading states)
- ✅ **Expose read-only `Locator`s, not raw selector strings**: tests can assert against public locators (Playwright encourages this with its locator-based assertion helpers)
  - `loginPage.saveButton.click()` instead of `page.locator('[data-testid="save-button"]')`
- ✅ **Selector priority: prefer `getByRole` / `getByLabel` / `data-testid` over CSS or XPath**: add `data-testid` attributes where needed for stability
- ✅ **Use guards, not assertions, in page objects**: prefer `locator.waitFor({state: 'visible'})`
- 🚫 **Don't include expectations/assertions** in page object methods (following Fowler's recommendation)
- 📁 **Organize in `/e2e/helpers/pages/` directory** with clear naming conventions

## Example

```ts
// e2e/helpers/pages/admin/LoginPage.ts
export class LoginPage extends BasePage {
    public readonly emailInput = this.page.locator('[data-testid="email-input"]');
    public readonly passwordInput = this.page.locator('[data-testid="password-input"]');
    public readonly loginButton = this.page.locator('[data-testid="login-button"]');
    public readonly errorMessage = this.page.locator('[data-testid="login-error"]');

    constructor(page: Page) {
        super(page);
        this.pageUrl = '/login';
    }

    async login(email: string, password: string) {
        await this.emailInput.fill(email);
        await this.passwordInput.fill(password);
        await this.loginButton.click();
    }

    async waitForErrorMessage() {
        await this.errorMessage.waitFor({ state: 'visible' });
        return this.errorMessage;
    }

    getErrorMessage() {
        return this.errorMessage;
    }
}

// In test file
test.describe('Login', () => {
    test('invalid credentials', async ({page}) => {
        // Arrange
        const loginPage = new LoginPage(page);

        // Act
        await loginPage.goto();
        await loginPage.login('invalid@email.com', 'wrongpassword');
        const errorMessage = await loginPage.waitForErrorMessage();

        // Assert
        await expect(errorMessage).toHaveText('Invalid credentials');
    });
});
```

## References

- [Page Object - Martin Fowler](https://martinfowler.com/bliki/PageObject.html) - Original pattern definition
- [Selenium Page Objects](https://selenium-python.readthedocs.io/page-objects.html) - Early implementation guidance
- [Playwright Page Object Model](https://playwright.dev/docs/pom) - Modern Playwright-specific approaches
@@ -0,0 +1,22 @@
# Architecture Decision Records (ADRs)

This directory contains Architecture Decision Records (ADRs) specific to the E2E test suite.

ADRs are short, version-controlled documents that capture important architectural and process decisions, along with the reasoning behind them.
They help document **why** something was decided — not just **what** was done — which improves transparency, consistency, and long-term maintainability.

Each ADR includes the following sections:

- `Status` – `Proposed`, `Accepted`, `Rejected`, etc.
- `Context` – Why the decision was needed
- `Decision` – What was decided and why
- `Guidelines` – (Optional) How the decision should be applied
- `Example` – (Optional) Minimal working example to clarify intent
- `References` – (Optional) Documents, links, or resources that informed or support the decision

## Guidelines for contributing

- We follow a simplified and slightly adapted version of the [Michael Nygard ADR format](https://github.com/joelparkerhenderson/architecture-decision-record/tree/main/locales/en/templates/decision-record-template-by-michael-nygard)
- Keep ADRs focused, short, and scoped to one decision
- Start with `Status: Proposed` and update to `Status: Accepted` after code review
- Use sequential filenames with a descriptive slug, for example: `0002-page-objects-pattern.md`
@@ -0,0 +1 @@
tailwind.config.cjs
@@ -0,0 +1,73 @@
/* eslint-env node */
const tailwindCssConfig = `${__dirname}/../admin/src/index.css`;

module.exports = {
    root: true,
    extends: [
        'plugin:ghost/ts',
        'plugin:react/recommended',
        'plugin:react-hooks/recommended'
    ],
    plugins: [
        'ghost',
        'react-refresh',
        'tailwindcss'
    ],
    settings: {
        react: {
            version: 'detect'
        },
        tailwindcss: {
            config: tailwindCssConfig
        }
    },
    rules: {
        'no-shadow': 'off',
        '@typescript-eslint/no-shadow': 'error',

        // Enforce kebab-case (lowercase with hyphens) for all filenames
        'ghost/filenames/match-regex': ['error', '^[a-z0-9.-]+$', false],

        // sort multiple import lines into alphabetical groups
        'ghost/sort-imports-es6-autofix/sort-imports-es6': ['error', {
            memberSyntaxSortOrder: ['none', 'all', 'single', 'multiple']
        }],
        'no-restricted-imports': ['error', {
            paths: [{
                name: '@tryghost/shade',
                message: 'Import from layered subpaths instead (components/primitives/patterns/utils/app/tokens).'
            }]
        }],

        // TODO: re-enable this (maybe fixed fast refresh?)
        'react-refresh/only-export-components': 'off',

        // suppress errors for missing 'import React' in JSX files, as we don't need it
        'react/react-in-jsx-scope': 'off',
        // ignore prop-types for now
        'react/prop-types': 'off',

        // TODO: re-enable these if deemed useful
        '@typescript-eslint/no-non-null-assertion': 'off',
        '@typescript-eslint/no-empty-function': 'off',

        // custom react rules
        'react/jsx-sort-props': ['error', {
            reservedFirst: true,
            callbacksLast: true,
            shorthandLast: true,
            locale: 'en'
        }],
        'react/button-has-type': 'error',
        'react/no-array-index-key': 'error',
        'react/jsx-key': 'off',

        'tailwindcss/classnames-order': 'error',
        'tailwindcss/enforces-negative-arbitrary-values': 'warn',
        'tailwindcss/enforces-shorthand': 'warn',
        'tailwindcss/migration-from-tailwind-2': 'warn',
        'tailwindcss/no-arbitrary-value': 'off',
        'tailwindcss/no-custom-classname': 'off',
        'tailwindcss/no-contradicting-classname': 'error'
    }
};
@@ -0,0 +1,4 @@
dist
types
playwright-report
test-results
@@ -0,0 +1,13 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>AdminX Standalone</title>
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/src/standalone.tsx"></script>
  </body>
</html>
@@ -0,0 +1,95 @@
{
  "name": "@tryghost/activitypub",
  "version": "3.1.13",
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "https://github.com/TryGhost/Ghost/tree/main/apps/activitypub"
  },
  "author": "Ghost Foundation",
  "files": [
    "LICENSE",
    "README.md",
    "dist/"
  ],
  "main": "./dist/activitypub.umd.cjs",
  "module": "./dist/activitypub.js",
  "exports": {
    ".": {
      "import": "./dist/activitypub.js",
      "require": "./dist/activitypub.umd.cjs"
    },
    "./api": "./src/index.tsx"
  },
  "private": false,
  "scripts": {
    "dev": "vite build --watch",
    "dev:start": "vite",
    "build": "tsc && vite build",
    "lint": "pnpm run lint:code && pnpm run lint:test",
    "lint:code": "eslint --ext .js,.ts,.cjs,.tsx --cache src",
    "lint:test": "eslint -c test/.eslintrc.cjs --ext .js,.ts,.cjs,.tsx --cache test",
    "test": "pnpm test:unit",
    "test:unit": "tsc --noEmit && vitest run",
    "test:acceptance": "NODE_OPTIONS='--experimental-specifier-resolution=node --no-warnings' VITE_TEST=true playwright test",
    "test:acceptance:slowmo": "TIMEOUT=100000 PLAYWRIGHT_SLOWMO=100 pnpm test:acceptance --headed",
    "test:acceptance:full": "ALL_BROWSERS=1 pnpm test:acceptance",
    "preview": "vite preview"
  },
  "devDependencies": {
    "@playwright/test": "1.59.1",
    "@testing-library/react": "14.3.1",
    "@types/dompurify": "3.2.0",
    "@types/jest": "29.5.14",
    "@types/react": "18.3.28",
    "@types/react-dom": "18.3.7",
    "jest": "29.7.0",
    "tailwindcss": "^4.2.2",
    "ts-jest": "29.4.9",
    "vite": "5.4.21",
    "vitest": "1.6.1"
  },
  "nx": {
    "targets": {
      "build": {
        "dependsOn": [
          "^build"
        ]
      },
      "dev": {
        "dependsOn": [
          "^build"
        ]
      },
      "test:unit": {
        "dependsOn": [
          "^build",
          "test:unit"
        ]
      },
      "test:acceptance": {
        "dependsOn": [
          "^build",
          "test:acceptance"
        ]
      }
    }
  },
  "dependencies": {
    "@hookform/resolvers": "5.2.2",
    "@radix-ui/react-form": "0.1.8",
    "@tanstack/react-query": "4.36.1",
    "@tryghost/admin-x-framework": "workspace:*",
    "@tryghost/shade": "workspace:*",
    "clsx": "2.1.1",
    "dompurify": "3.3.1",
    "html2canvas-objectfit-fix": "1.2.0",
    "react": "18.3.1",
    "react-dom": "18.3.1",
    "react-hook-form": "7.72.1",
    "react-router": "7.14.0",
    "sonner": "2.0.7",
    "use-debounce": "10.1.1",
    "zod": "4.1.12"
  }
}
|
||||
@@ -0,0 +1,3 @@
import {adminXPlaywrightConfig} from '@tryghost/admin-x-framework/playwright';

export default adminXPlaywrightConfig();
@@ -0,0 +1,29 @@
|
||||
import {FeatureFlagsProvider} from './lib/feature-flags';
|
||||
import {FrameworkProvider, Outlet, RouterProvider, TopLevelFrameworkProps} from '@tryghost/admin-x-framework';
|
||||
import {ShadeApp} from '@tryghost/shade/app';
|
||||
import {routes} from '@src/routes';
|
||||
|
||||
interface AppProps {
|
||||
framework: TopLevelFrameworkProps;
|
||||
activityPubEnabled?: boolean;
|
||||
}
|
||||
|
||||
const App: React.FC<AppProps> = ({framework, activityPubEnabled}) => {
|
||||
if (activityPubEnabled === false) {
|
||||
return null;
|
||||
}
|
||||
|
||||
return (
|
||||
<FrameworkProvider {...framework}>
|
||||
<RouterProvider prefix={'/'} routes={routes}>
|
||||
<FeatureFlagsProvider>
|
||||
<ShadeApp className="shade-activitypub" darkMode={false} fetchKoenigLexical={null}>
|
||||
<Outlet />
|
||||
</ShadeApp>
|
||||
</FeatureFlagsProvider>
|
||||
</RouterProvider>
|
||||
</FrameworkProvider>
|
||||
);
|
||||
};
|
||||
|
||||
export default App;
|
||||
@@ -0,0 +1,7 @@
|
||||
import './styles/index.css';
|
||||
|
||||
export {default as AdminXApp} from './app';
|
||||
|
||||
export {routes} from './routes';
|
||||
export {FeatureFlagsProvider} from './lib/feature-flags';
|
||||
export {useNotificationsCountForUser} from './hooks/use-activity-pub-queries';
|
||||
@@ -0,0 +1,142 @@
|
||||
import AppError from '@components/layout/error';
|
||||
|
||||
import {Navigate, Outlet, RouteObject, lazyComponent} from '@tryghost/admin-x-framework';
|
||||
|
||||
const basePath = import.meta.env.VITE_TEST ? '' : 'activitypub';
|
||||
|
||||
export type CustomRouteObject = RouteObject & {
|
||||
pageTitle?: string;
|
||||
children?: CustomRouteObject[];
|
||||
showBackButton?: boolean;
|
||||
};
|
||||
|
||||
export const routes: CustomRouteObject[] = [
|
||||
{
|
||||
// Root route that defines the app's base path
|
||||
path: basePath,
|
||||
element: <Outlet />,
|
||||
errorElement: <AppError />, // This will catch all errors in child routes
|
||||
handle: 'activitypub-basepath',
|
||||
children: [
|
||||
{
|
||||
index: true,
|
||||
element: <Navigate to="reader" />
|
||||
},
|
||||
{
|
||||
path: 'inbox',
|
||||
element: <Navigate to="../reader" replace />
|
||||
},
|
||||
{
|
||||
path: 'feed',
|
||||
element: <Navigate to="../notes" replace />
|
||||
},
|
||||
{
|
||||
path: 'reader',
|
||||
lazy: lazyComponent(() => import('./views/inbox')),
|
||||
pageTitle: 'Reader'
|
||||
},
|
||||
{
|
||||
path: 'reader/:postId',
|
||||
lazy: lazyComponent(() => import('./views/inbox')),
|
||||
pageTitle: 'Reader'
|
||||
},
|
||||
{
|
||||
path: 'notes',
|
||||
lazy: lazyComponent(() => import('./views/feed/feed')),
|
||||
pageTitle: 'Notes'
|
||||
},
|
||||
{
|
||||
path: 'notes/:postId',
|
||||
lazy: lazyComponent(() => import('./views/feed/note')),
|
||||
pageTitle: 'Note'
|
||||
},
|
||||
{
|
||||
path: 'notifications',
|
||||
lazy: lazyComponent(() => import('./views/notifications')),
|
||||
pageTitle: 'Notifications'
|
||||
},
|
||||
{
|
||||
path: 'explore',
|
||||
lazy: lazyComponent(() => import('./views/explore')),
|
||||
pageTitle: 'Explore'
|
||||
},
|
||||
{
|
||||
path: 'explore/:topic',
|
||||
lazy: lazyComponent(() => import('./views/explore')),
|
||||
pageTitle: 'Explore'
|
||||
},
|
||||
{
|
||||
path: 'profile',
|
||||
lazy: lazyComponent(() => import('./views/profile')),
|
||||
pageTitle: 'Profile'
|
||||
},
|
||||
{
|
||||
path: 'profile/likes',
|
||||
lazy: lazyComponent(() => import('./views/profile')),
|
||||
pageTitle: 'Profile'
|
||||
},
|
||||
{
|
||||
path: 'profile/following',
|
||||
lazy: lazyComponent(() => import('./views/profile')),
|
||||
pageTitle: 'Profile'
|
||||
},
|
||||
{
|
||||
path: 'profile/followers',
|
||||
lazy: lazyComponent(() => import('./views/profile')),
|
||||
pageTitle: 'Profile'
|
||||
},
|
||||
{
|
||||
path: 'profile/:handle/:tab?',
|
||||
lazy: lazyComponent(() => import('./views/profile')),
|
||||
pageTitle: 'Profile'
|
||||
},
|
||||
{
|
||||
path: 'preferences',
|
||||
lazy: lazyComponent(() => import('./views/preferences')),
|
||||
pageTitle: 'Preferences'
|
||||
},
|
||||
{
|
||||
path: 'preferences/moderation',
|
||||
lazy: lazyComponent(() => import('./views/preferences/components/moderation')),
|
||||
pageTitle: 'Moderation',
|
||||
showBackButton: true
|
||||
},
|
||||
{
|
||||
path: 'preferences/bluesky-sharing',
|
||||
lazy: lazyComponent(() => import('./views/preferences/components/bluesky-sharing')),
|
||||
showBackButton: true
|
||||
},
|
||||
{
|
||||
path: 'welcome',
|
||||
lazy: lazyComponent(() => import('./components/layout/onboarding')),
|
||||
pageTitle: 'Welcome',
|
||||
children: [
|
||||
{
|
||||
path: '',
|
||||
element: <Navigate to="1" replace />
|
||||
},
|
||||
{
|
||||
path: '1',
|
||||
lazy: lazyComponent(() => import('./components/layout/onboarding/step-1'))
|
||||
},
|
||||
{
|
||||
path: '2',
|
||||
lazy: lazyComponent(() => import('./components/layout/onboarding/step-2'))
|
||||
},
|
||||
{
|
||||
path: '3',
|
||||
lazy: lazyComponent(() => import('./components/layout/onboarding/step-3'))
|
||||
},
|
||||
{
|
||||
path: '*',
|
||||
element: <Navigate to="1" replace />
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
path: '*',
|
||||
lazy: lazyComponent(() => import('./components/layout/error'))
|
||||
}
|
||||
]
|
||||
}
|
||||
];
|
||||
@@ -0,0 +1,5 @@
|
||||
import './styles/index.css';
|
||||
import App from './app.tsx';
|
||||
import renderStandaloneApp from '@tryghost/admin-x-framework/test/render';
|
||||
|
||||
renderStandaloneApp(App, {});
|
||||
@@ -0,0 +1,10 @@
|
||||
module.exports = {
|
||||
plugins: ['ghost'],
|
||||
extends: [
|
||||
'plugin:ghost/ts-test'
|
||||
],
|
||||
rules: {
|
||||
// Enforce kebab-case (lowercase with hyphens) for all filenames
|
||||
'ghost/filenames/match-regex': ['error', '^[a-z0-9.-]+$', false]
|
||||
}
|
||||
};
|
||||
@@ -0,0 +1,16 @@
|
||||
{
|
||||
"extends": "./tsconfig.json",
|
||||
"compilerOptions": {
|
||||
"noEmit": false,
|
||||
"composite": true,
|
||||
"declaration": true,
|
||||
"declarationMap": true,
|
||||
"declarationDir": "./types",
|
||||
"emitDeclarationOnly": true,
|
||||
"tsBuildInfoFile": "./types/tsconfig.tsbuildinfo",
|
||||
"rootDir": "./src"
|
||||
},
|
||||
"include": ["src"],
|
||||
"exclude": ["src/**/*.stories.tsx", "src/**/*.test.ts", "src/**/*.test.tsx"]
|
||||
}
|
||||
|
||||
@@ -0,0 +1,36 @@
|
||||
{
|
||||
"compilerOptions": {
|
||||
"target": "ESNext",
|
||||
"lib": ["DOM", "DOM.Iterable", "ESNext"],
|
||||
"module": "ESNext",
|
||||
"skipLibCheck": true,
|
||||
"types": ["vite/client", "jest"],
|
||||
|
||||
/* Bundler mode */
|
||||
"moduleResolution": "bundler",
|
||||
"allowImportingTsExtensions": true,
|
||||
"resolveJsonModule": true,
|
||||
"isolatedModules": true,
|
||||
"noEmit": true,
|
||||
"jsx": "react-jsx",
|
||||
|
||||
/* Linting */
|
||||
"strict": true,
|
||||
"noUnusedLocals": true,
|
||||
"noUnusedParameters": true,
|
||||
"erasableSyntaxOnly": true,
|
||||
"noFallthroughCasesInSwitch": true,
|
||||
|
||||
/* Path aliases */
|
||||
"baseUrl": "./src",
|
||||
"paths": {
|
||||
"@src/*": ["*"],
|
||||
"@assets/*": ["assets/*"],
|
||||
"@components/*": ["components/*"],
|
||||
"@hooks/*": ["hooks/*"],
|
||||
"@utils/*": ["utils/*"],
|
||||
"@views/*": ["views/*"]
|
||||
}
|
||||
},
|
||||
"include": ["src", "test"]
|
||||
}
|
||||
@@ -0,0 +1,72 @@
|
||||
import adminXViteConfig from '@tryghost/admin-x-framework/vite';
|
||||
import pkg from './package.json';
|
||||
import {resolve} from 'path';
|
||||
import fs from 'fs';
|
||||
|
||||
const GHOST_CARDS_PATH = resolve(__dirname, '../../ghost/core/core/frontend/src/cards');
|
||||
|
||||
const validateCardsDirectoryPlugin = (cardsPath) => {
|
||||
return {
|
||||
name: 'validate-cards-directory',
|
||||
buildStart() {
|
||||
const jsPath = resolve(cardsPath, 'js');
|
||||
const cssPath = resolve(cardsPath, 'css');
|
||||
|
||||
if (!fs.existsSync(cardsPath)) {
|
||||
throw new Error(`Ghost cards directory not found at: ${cardsPath}`);
|
||||
}
|
||||
|
||||
if (!fs.existsSync(jsPath)) {
|
||||
throw new Error(`Ghost cards JS directory not found at: ${jsPath}`);
|
||||
}
|
||||
|
||||
if (!fs.existsSync(cssPath)) {
|
||||
throw new Error(`Ghost cards CSS directory not found at: ${cssPath}`);
|
||||
}
|
||||
|
||||
const jsFiles = fs.readdirSync(jsPath).filter(f => f.endsWith('.js'));
|
||||
const cssFiles = fs.readdirSync(cssPath).filter(f => f.endsWith('.css'));
|
||||
|
||||
if (jsFiles.length === 0) {
|
||||
throw new Error(`No JavaScript files found in Ghost cards directory: ${jsPath}`);
|
||||
}
|
||||
|
||||
if (cssFiles.length === 0) {
|
||||
throw new Error(`No CSS files found in Ghost cards directory: ${cssPath}`);
|
||||
}
|
||||
|
||||
console.log(`✓ Found ${jsFiles.length} JS and ${cssFiles.length} CSS card files at: ${cardsPath}`);
|
||||
}
|
||||
};
|
||||
};
|
||||
|
||||
export default (function viteConfig() {
|
||||
const config = adminXViteConfig({
|
||||
packageName: pkg.name,
|
||||
entry: resolve(__dirname, 'src/index.tsx'),
|
||||
overrides: {
|
||||
test: {
|
||||
include: [
|
||||
'./test/unit/**/*',
|
||||
'./src/**/*.test.ts'
|
||||
]
|
||||
},
|
||||
resolve: {
|
||||
alias: {
|
||||
'@src': resolve(__dirname, './src'),
|
||||
'@assets': resolve(__dirname, './src/assets'),
|
||||
'@components': resolve(__dirname, './src/components'),
|
||||
'@hooks': resolve(__dirname, './src/hooks'),
|
||||
'@utils': resolve(__dirname, './src/utils'),
|
||||
'@views': resolve(__dirname, './src/views'),
|
||||
'@ghost-cards': GHOST_CARDS_PATH
|
||||
}
|
||||
},
|
||||
plugins: [
|
||||
validateCardsDirectoryPlugin(GHOST_CARDS_PATH)
|
||||
]
|
||||
}
|
||||
});
|
||||
|
||||
return config;
|
||||
});
|
||||
@@ -0,0 +1,49 @@
|
||||
const tailwindCssConfig = `${__dirname}/../admin/src/index.css`;
|
||||
|
||||
module.exports = {
|
||||
extends: [
|
||||
'plugin:ghost/ts',
|
||||
'plugin:react/recommended',
|
||||
'plugin:react-hooks/recommended'
|
||||
],
|
||||
plugins: [
|
||||
'ghost',
|
||||
'react-refresh',
|
||||
'tailwindcss'
|
||||
],
|
||||
settings: {
|
||||
react: {
|
||||
version: 'detect'
|
||||
},
|
||||
tailwindcss: {
|
||||
config: tailwindCssConfig
|
||||
}
|
||||
},
|
||||
rules: {
|
||||
// suppress errors for missing 'import React' in JSX files, as we don't need it
|
||||
'react/react-in-jsx-scope': 'off',
|
||||
// ignore prop-types for now
|
||||
'react/prop-types': 'off',
|
||||
|
||||
'react/jsx-sort-props': ['error', {
|
||||
reservedFirst: true,
|
||||
callbacksLast: true,
|
||||
shorthandLast: true,
|
||||
locale: 'en'
|
||||
}],
|
||||
'react/button-has-type': 'error',
|
||||
'react/no-array-index-key': 'error',
|
||||
'react/jsx-key': 'off',
|
||||
|
||||
// Enforce kebab-case (lowercase with hyphens) for all filenames
|
||||
'ghost/filenames/match-regex': ['error', '^[a-z0-9.-]+$', false],
|
||||
|
||||
'tailwindcss/classnames-order': 'error',
|
||||
'tailwindcss/enforces-negative-arbitrary-values': 'warn',
|
||||
'tailwindcss/enforces-shorthand': 'warn',
|
||||
'tailwindcss/migration-from-tailwind-2': 'warn',
|
||||
'tailwindcss/no-arbitrary-value': 'off',
|
||||
'tailwindcss/no-custom-classname': 'off',
|
||||
'tailwindcss/no-contradicting-classname': 'error'
|
||||
}
|
||||
};
|
||||
@@ -0,0 +1,2 @@
es
types
Binary file not shown.
@@ -0,0 +1,38 @@
|
||||
import {create} from '@storybook/theming/create';
|
||||
|
||||
export default create({
|
||||
base: 'light',
|
||||
// Typography
|
||||
fontBase: '"Inter", sans-serif',
|
||||
fontCode: 'monospace',
|
||||
|
||||
brandTitle: 'AdminX Design System',
|
||||
brandUrl: 'https://ghost.org',
|
||||
brandImage: 'https://github.com/peterzimon/playground/assets/353959/c4358b4e-232f-4dba-8abb-adb3142ccd89',
|
||||
brandTarget: '_self',
|
||||
|
||||
//
|
||||
colorPrimary: '#30CF43',
|
||||
colorSecondary: '#15171A',
|
||||
|
||||
// UI
|
||||
appBg: '#ffffff',
|
||||
appContentBg: '#ffffff',
|
||||
appBorderColor: '#EBEEF0',
|
||||
appBorderRadius: 0,
|
||||
|
||||
// Text colors
|
||||
textColor: '#15171A',
|
||||
textInverseColor: '#ffffff',
|
||||
|
||||
// Toolbar default and active colors
|
||||
barTextColor: '#9E9E9E',
|
||||
barSelectedColor: '#15171A',
|
||||
barBg: '#ffffff',
|
||||
|
||||
// Form colors
|
||||
inputBg: '#ffffff',
|
||||
inputBorder: '#15171A',
|
||||
inputTextColor: '#15171A',
|
||||
inputBorderRadius: 2,
|
||||
});
|
||||
@@ -0,0 +1,27 @@
|
||||
import type { StorybookConfig } from "@storybook/react-vite";
|
||||
|
||||
const config: StorybookConfig = {
|
||||
stories: ["../src/**/*.mdx", "../src/**/*.stories.@(js|jsx|ts|tsx)"],
|
||||
addons: [
|
||||
"@storybook/addon-links",
|
||||
"@storybook/addon-essentials",
|
||||
"@storybook/addon-interactions",
|
||||
{
|
||||
name: '@storybook/addon-styling',
|
||||
},
|
||||
],
|
||||
framework: {
|
||||
name: "@storybook/react-vite",
|
||||
options: {},
|
||||
},
|
||||
docs: {
|
||||
autodocs: "tag",
|
||||
},
|
||||
async viteFinal(config, options) {
|
||||
config.resolve!.alias = {
|
||||
crypto: require.resolve('rollup-plugin-node-builtins')
|
||||
}
|
||||
return config;
|
||||
}
|
||||
};
|
||||
export default config;
|
||||
@@ -0,0 +1,6 @@
|
||||
import {addons} from '@storybook/manager-api';
|
||||
import adminxTheme from './adminx-theme';
|
||||
|
||||
addons.setConfig({
|
||||
theme: adminxTheme
|
||||
});
|
||||
@@ -0,0 +1,107 @@
|
||||
import React from 'react';
|
||||
|
||||
import '../styles.css';
|
||||
import './storybook.css';
|
||||
|
||||
import type { Preview } from "@storybook/react";
|
||||
import DesignSystemProvider from '../src/providers/design-system-provider';
|
||||
import adminxTheme from './adminx-theme';
|
||||
|
||||
// import { MINIMAL_VIEWPORTS } from '@storybook/addon-viewport';
|
||||
|
||||
const customViewports = {
|
||||
sm: {
|
||||
name: 'sm',
|
||||
styles: {
|
||||
width: '480px',
|
||||
height: '801px',
|
||||
},
|
||||
},
|
||||
md: {
|
||||
name: 'md',
|
||||
styles: {
|
||||
width: '640px',
|
||||
height: '801px',
|
||||
},
|
||||
},
|
||||
lg: {
|
||||
name: 'lg',
|
||||
styles: {
|
||||
width: '1024px',
|
||||
height: '801px',
|
||||
},
|
||||
},
|
||||
xl: {
|
||||
name: 'xl',
|
||||
styles: {
|
||||
width: '1320px',
|
||||
height: '801px',
|
||||
},
|
||||
},
|
||||
tablet: {
|
||||
name: 'tablet',
|
||||
styles: {
|
||||
width: '860px',
|
||||
height: '801px',
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const preview: Preview = {
|
||||
parameters: {
|
||||
actions: { argTypesRegex: "^on[A-Z].*" },
|
||||
controls: {
|
||||
matchers: {
|
||||
color: /(background|color)$/i,
|
||||
date: /Date$/,
|
||||
},
|
||||
},
|
||||
options: {
|
||||
storySort: {
|
||||
method: 'alphabetical',
|
||||
order: ['Welcome', 'Foundations', ['Style Guide', 'Colors', 'Icons', 'ErrorHandling'], 'Global', ['Form', 'Chrome', 'Modal', 'Layout', ['View Container', 'Page Header', 'Page'], 'List', 'Table', '*'], 'Settings', ['Setting Section', 'Setting Group', '*'], 'Experimental'],
|
||||
},
|
||||
},
|
||||
docs: {
|
||||
theme: adminxTheme,
|
||||
},
|
||||
viewport: {
|
||||
viewports: {
|
||||
...customViewports,
|
||||
},
|
||||
},
|
||||
},
|
||||
decorators: [
|
||||
(Story, context) => {
|
||||
let {scheme} = context.globals;
|
||||
|
||||
return (
|
||||
<div className={`admin-x-design-system admin-x-base ${scheme === 'dark' ? 'dark' : ''}`} style={{
|
||||
// padding: '24px',
|
||||
// width: 'unset',
|
||||
height: 'unset',
|
||||
// overflow: 'unset',
|
||||
background: (scheme === 'dark' ? '#131416' : '')
|
||||
}}>
|
||||
{/* 👇 Decorators in Storybook also accept a function. Replace <Story/> with Story() to enable it */}
|
||||
<DesignSystemProvider fetchKoenigLexical={async () => {}}>
|
||||
<Story />
|
||||
</DesignSystemProvider>
|
||||
</div>);
|
||||
},
|
||||
],
|
||||
globalTypes: {
|
||||
scheme: {
|
||||
name: "Scheme",
|
||||
description: "Select light or dark mode",
|
||||
defaultValue: "light",
|
||||
toolbar: {
|
||||
icon: "mirror",
|
||||
items: ["light", "dark"],
|
||||
dynamicTitle: true
|
||||
}
|
||||
}
|
||||
}
|
||||
};
|
||||
|
||||
export default preview;
|
||||
@@ -0,0 +1,247 @@
|
||||
/*
|
||||
* We load Inter in Ember admin, so loading it explicitly here makes the final rendering
|
||||
* in Storybook match the final rendering when embedded in Ember
|
||||
*/
|
||||
@font-face {
|
||||
font-family: "Inter";
|
||||
src: url("./Inter.ttf") format("truetype-variations");
|
||||
font-weight: 100 900;
|
||||
}
|
||||
|
||||
:root {
|
||||
font-size: 62.5%;
|
||||
line-height: 1.5;
|
||||
-ms-text-size-adjust: 100%;
|
||||
-webkit-text-size-adjust: 100%;
|
||||
|
||||
text-rendering: optimizeLegibility;
|
||||
-webkit-font-smoothing: antialiased;
|
||||
-moz-osx-font-smoothing: grayscale;
|
||||
-webkit-text-size-adjust: 100%;
|
||||
}
|
||||
|
||||
html, body, #root {
|
||||
width: 100%;
|
||||
height: 100%;
|
||||
margin: 0;
|
||||
letter-spacing: unset;
|
||||
}
|
||||
|
||||
.sbdocs-wrapper {
|
||||
padding: 3vmin !important;
|
||||
}
|
||||
|
||||
.sbdocs-wrapper .sbdocs-content {
|
||||
max-width: 1320px;
|
||||
}
|
||||
|
||||
.sb-doc {
|
||||
max-width: 740px;
|
||||
width: 100%;
|
||||
margin: 0 auto !important;
|
||||
}
|
||||
|
||||
.sb-doc,
|
||||
.sb-doc a,
|
||||
.sb-doc h1,
|
||||
.sb-doc h2,
|
||||
.sb-doc h3,
|
||||
.sb-doc h4,
|
||||
.sb-doc h5,
|
||||
.sb-doc h6,
|
||||
.sb-doc p,
|
||||
.sb-doc ul li,
|
||||
.sbdocs-title,
|
||||
.sb-doc ol li {
|
||||
font-family: Inter, sans-serif !important;
|
||||
padding: 0 !important;
|
||||
}
|
||||
|
||||
.sb-doc a {
|
||||
color: #30CF43;
|
||||
}
|
||||
|
||||
.sb-doc h1 {
|
||||
font-size: 48px !important;
|
||||
letter-spacing: -0.04em !important;
|
||||
margin-bottom: 20px;
|
||||
}
|
||||
|
||||
.sb-doc h2 {
|
||||
margin-top: 40px !important;
|
||||
font-size: 27px;
|
||||
border: none;
|
||||
margin-bottom: 2px;
|
||||
}
|
||||
|
||||
.sb-doc h3 {
|
||||
margin-top: 40px !important;
|
||||
margin-bottom: 4px !important;
|
||||
font-size: 20px;
|
||||
}
|
||||
|
||||
.sb-doc h4 {
|
||||
margin: 0 0 4px !important;
|
||||
}
|
||||
|
||||
.sb-doc p,
|
||||
.sb-doc div,
|
||||
.sb-doc ul li,
|
||||
.sb-doc ol li {
|
||||
font-size: 15px;
|
||||
line-height: 1.5em;
|
||||
}
|
||||
|
||||
.sb-doc ul li,
|
||||
.sb-doc ol li {
|
||||
margin-bottom: 8px;
|
||||
}
|
||||
|
||||
.sb-doc h2 + p,
|
||||
.sb-doc h3 + p {
|
||||
margin-top: 8px;
|
||||
}
|
||||
|
||||
.sb-doc img,
|
||||
.sb-wide img {
|
||||
margin-top: 40px !important;
|
||||
margin-bottom: 40px !important;
|
||||
}
|
||||
|
||||
.sb-doc img.small {
|
||||
max-width: 520px;
|
||||
margin: 0 auto;
|
||||
display: block;
|
||||
}
|
||||
|
||||
.sb-doc p.excerpt {
|
||||
font-size: 19px;
|
||||
letter-spacing: -0.02em;
|
||||
}
|
||||
|
||||
.sb-doc .highlight {
|
||||
padding: 12px 20px;
|
||||
border-radius: 4px;
|
||||
background: #EBEEF0;
|
||||
}
|
||||
|
||||
.sb-doc .highlight.purple {
|
||||
background: #F0E9FA;
|
||||
}
|
||||
|
||||
.sb-doc .highlight.purple a {
|
||||
color: #8E42FF;
|
||||
}
|
||||
|
||||
/* Welcome */
|
||||
.sb-doc img.main-image {
|
||||
margin-top: -2vmin !important;
|
||||
margin-left: -44px;
|
||||
margin-right: -32px;
|
||||
margin-bottom: 0 !important;
|
||||
max-width: unset;
|
||||
width: calc(100% + 64px);
|
||||
}
|
||||
|
||||
.sb-doc .main-structure-container {
|
||||
display: flex;
|
||||
gap: 32px;
|
||||
margin: 32px 0 80px;
|
||||
}
|
||||
|
||||
.sb-doc .main-structure-container div {
|
||||
flex-basis: 33%;
|
||||
}
|
||||
|
||||
.sb-doc .main-structure-container div p {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
gap: 4px;
|
||||
}
|
||||
|
||||
.sb-doc .main-structure-container img {
|
||||
margin: 12px 0 !important;
|
||||
width: 32px;
|
||||
height: 32px;
|
||||
}
|
||||
|
||||
.sb-doc .main-structure-container div h4 {
|
||||
border-bottom: 1px solid #EBEEF0;
|
||||
padding-bottom: 8px !important;
|
||||
margin-bottom: 8px !important;
|
||||
}
|
||||
|
||||
.sb-doc .main-structure-container div p {
|
||||
margin: 0;
|
||||
font-size: 13.5px;
|
||||
}
|
||||
|
||||
/* Colors */
|
||||
.color-grid {
|
||||
display: flex;
|
||||
gap: 20px;
|
||||
flex-wrap: wrap;
|
||||
margin-top: 20px;
|
||||
}
|
||||
|
||||
.color-grid div {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
align-items: center;
|
||||
gap: 8px;
|
||||
padding: 12px;
|
||||
border-radius: 4px;
|
||||
border: 1px solid #EBEEF0;
|
||||
}
|
||||
|
||||
.color-grid .swatch {
|
||||
display: block;
|
||||
background: #EFEFEF;
|
||||
border-radius: 100%;
|
||||
width: 28px;
|
||||
height: 28px;
|
||||
}
|
||||
|
||||
.swatch.green {
|
||||
background: #30CF43;
|
||||
}
|
||||
|
||||
.swatch.black {
|
||||
background: #15171A;
|
||||
}
|
||||
|
||||
.swatch.white {
|
||||
background: #FFFFFF;
|
||||
border: 1px solid #EBEEF0;
|
||||
}
|
||||
|
||||
.swatch.lime {
|
||||
background: #B5FF18;
|
||||
}
|
||||
.swatch.blue {
|
||||
background: #14B8FF;
|
||||
}
|
||||
.swatch.purple {
|
||||
background: #8E42FF;
|
||||
}
|
||||
.swatch.pink {
|
||||
background: #FB2D8D;
|
||||
}
|
||||
.swatch.yellow {
|
||||
background: #FFB41F;
|
||||
}
|
||||
.swatch.red {
|
||||
background: #F50B23;
|
||||
}
|
||||
|
||||
/* Icons */
|
||||
|
||||
.sb-doc .streamline {
|
||||
display: grid;
|
||||
grid-template-columns: auto 240px;
|
||||
gap: 32px;
|
||||
}
|
||||
|
||||
.sbdocs-a {
|
||||
color: #30CF43 !important;
|
||||
}
|
||||
@@ -0,0 +1,17 @@
|
||||
# Admin X Design
|
||||
|
||||
Components, design guidelines and documentation for building apps in Ghost Admin
|
||||
|
||||
## Develop
|
||||
|
||||
This is a monorepo package.
|
||||
|
||||
Follow the instructions for the top-level repo.
|
||||
1. `git clone` this repo & `cd` into it as usual
|
||||
2. Run `pnpm` to install top-level dependencies.
|
||||
|
||||
## Test
|
||||
|
||||
- `pnpm lint` run just eslint
|
||||
- `pnpm test` run lint and tests
|
||||
|
||||
@@ -0,0 +1,112 @@
|
||||
{
|
||||
"name": "@tryghost/admin-x-design-system",
|
||||
"type": "module",
|
||||
"version": "0.0.0",
|
||||
"repository": "https://github.com/TryGhost/Ghost/tree/main/packages/admin-x-design-system",
|
||||
"author": "Ghost Foundation",
|
||||
"private": true,
|
||||
"main": "es/index.js",
|
||||
"types": "types/index.d.ts",
|
||||
"sideEffects": false,
|
||||
"scripts": {
|
||||
"dev": "vite build --watch",
|
||||
"build": "tsc -p tsconfig.declaration.json && vite build",
|
||||
"test": "pnpm test:unit",
|
||||
"test:unit": "pnpm test:types && vitest run",
|
||||
"test:types": "tsc --noEmit",
|
||||
"lint:code": "eslint --ext .js,.ts,.cjs,.tsx src/ --cache",
|
||||
"lint": "pnpm lint:code && pnpm lint:test",
|
||||
"lint:test": "eslint -c test/.eslintrc.cjs --ext .js,.ts,.cjs,.tsx test/ --cache",
|
||||
"storybook": "storybook dev -p 6006",
|
||||
"build-storybook": "storybook build"
|
||||
},
|
||||
"files": [
|
||||
"es",
|
||||
"types"
|
||||
],
|
||||
"devDependencies": {
|
||||
"@codemirror/lang-html": "6.4.11",
|
||||
"@codemirror/state": "6.6.0",
|
||||
"@dnd-kit/utilities": "^3.2.2",
|
||||
"@radix-ui/react-tooltip": "1.2.8",
|
||||
"@storybook/addon-essentials": "8.6.14",
|
||||
"@storybook/addon-interactions": "8.6.14",
|
||||
"@storybook/addon-links": "8.6.14",
|
||||
"@storybook/addon-styling": "1.3.7",
|
||||
"@storybook/blocks": "8.6.14",
|
||||
"@storybook/preview-api": "^8.6.14",
|
||||
"@storybook/react": "8.6.14",
|
||||
"@storybook/react-vite": "8.6.14",
|
||||
"@storybook/testing-library": "0.2.2",
|
||||
"@tailwindcss/postcss": "4.2.1",
|
||||
"@testing-library/react": "14.3.1",
|
||||
"@testing-library/react-hooks": "8.0.1",
|
||||
"@types/lodash-es": "4.17.12",
|
||||
"@types/react": "18.3.28",
|
||||
"@types/react-dom": "18.3.7",
|
||||
"@types/validator": "13.15.10",
|
||||
"@vitejs/plugin-react": "4.7.0",
|
||||
"autoprefixer": "10.4.21",
|
||||
"c8": "10.1.3",
|
||||
"chai": "4.5.0",
|
||||
"eslint": "catalog:",
|
||||
"eslint-plugin-react-hooks": "4.6.2",
|
||||
"eslint-plugin-react-refresh": "0.4.24",
|
||||
"eslint-plugin-tailwindcss": "4.0.0-beta.0",
|
||||
"glob": "^10.5.0",
|
||||
"jsdom": "28.1.0",
|
||||
"lodash-es": "4.18.1",
|
||||
"postcss": "8.5.6",
|
||||
"postcss-import": "16.1.1",
|
||||
"react": "18.3.1",
|
||||
"react-dom": "18.3.1",
|
||||
"rollup-plugin-node-builtins": "2.1.2",
|
||||
"sinon": "18.0.1",
|
||||
"storybook": "8.6.15",
|
||||
"tailwindcss": "4.2.1",
|
||||
"typescript": "5.9.3",
|
||||
"validator": "13.12.0",
|
||||
"vite": "5.4.21",
|
||||
"vite-plugin-svgr": "3.3.0",
|
||||
"vitest": "1.6.1"
|
||||
},
|
||||
"dependencies": {
|
||||
"@dnd-kit/core": "6.3.1",
|
||||
"@dnd-kit/sortable": "7.0.2",
|
||||
"@ebay/nice-modal-react": "1.2.13",
|
||||
"@radix-ui/react-avatar": "1.1.11",
|
||||
"@radix-ui/react-checkbox": "1.3.3",
|
||||
"@radix-ui/react-form": "0.1.8",
|
||||
"@radix-ui/react-popover": "1.1.15",
|
||||
"@radix-ui/react-radio-group": "1.3.8",
|
||||
"@radix-ui/react-separator": "1.1.8",
|
||||
"@radix-ui/react-switch": "1.2.6",
|
||||
"@radix-ui/react-tabs": "1.1.13",
|
||||
"@radix-ui/react-tooltip": "1.2.8",
|
||||
"@sentry/react": "7.120.4",
|
||||
"@tryghost/shade": "workspace:*",
|
||||
"@uiw/react-codemirror": "4.25.2",
|
||||
"clsx": "2.1.1",
|
||||
"react-colorful": "5.6.1",
|
||||
"react-hot-toast": "2.6.0",
|
||||
"react-select": "5.10.2"
|
||||
},
|
||||
"peerDependencies": {
|
||||
"react": "^18.2.0",
|
||||
"react-dom": "^18.2.0"
|
||||
},
|
||||
"nx": {
|
||||
"targets": {
|
||||
"build": {
|
||||
"dependsOn": [
|
||||
"^build"
|
||||
]
|
||||
},
|
||||
"test:unit": {
|
||||
"dependsOn": [
|
||||
"^build"
|
||||
]
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,7 @@
|
||||
module.exports = {
|
||||
plugins: {
|
||||
'postcss-import': {},
|
||||
'@tailwindcss/postcss': {},
|
||||
autoprefixer: {}
|
||||
}
|
||||
};
|
||||
@@ -0,0 +1,381 @@
|
||||
.admin-x-base {
|
||||
/*
|
||||
1. Prevent padding and border from affecting element width. (https://github.com/mozdevs/cssremedy/issues/4)
|
||||
2. Allow adding a border to an element by just adding a border-width. (https://github.com/tailwindcss/tailwindcss/pull/116)
|
||||
*/
|
||||
|
||||
*,
|
||||
::before,
|
||||
::after {
|
||||
box-sizing: border-box; /* 1 */
|
||||
max-width: revert;
|
||||
max-height: revert;
|
||||
min-width: revert;
|
||||
min-height: revert;
|
||||
border-width: 0; /* 2 */
|
||||
border-style: solid; /* 2 */
|
||||
border-color: theme('borderColor.DEFAULT', currentColor); /* 2 */
|
||||
}
|
||||
|
||||
::before,
|
||||
::after {
|
||||
--tw-content: '';
|
||||
}
|
||||
|
||||
/*
|
||||
1. Use a consistent sensible line-height in all browsers.
|
||||
2. Prevent adjustments of font size after orientation changes in iOS.
|
||||
3. Use a more readable tab size.
|
||||
4. Use the user's configured `sans` font-family by default.
|
||||
*/
|
||||
|
||||
html {
|
||||
line-height: 1.5; /* 1 */
|
||||
-webkit-text-size-adjust: 100%; /* 2 */
|
||||
-moz-tab-size: 4; /* 3 */
|
||||
tab-size: 4; /* 3 */
|
||||
font-family: theme('fontFamily.sans', ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, "Noto Sans", sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Noto Color Emoji"); /* 4 */
|
||||
}
|
||||
|
||||
/*
|
||||
1. Remove the margin in all browsers.
|
||||
2. Inherit line-height from `html` so users can set them as a class directly on the `html` element.
|
||||
*/
|
||||
|
||||
body {
|
||||
margin: 0; /* 1 */
|
||||
line-height: inherit; /* 2 */
|
||||
}
|
||||
|
||||
/*
|
||||
1. Add the correct height in Firefox.
|
||||
2. Correct the inheritance of border color in Firefox. (https://bugzilla.mozilla.org/show_bug.cgi?id=190655)
|
||||
3. Ensure horizontal rules are visible by default.
|
||||
*/
|
||||
|
||||
hr {
|
||||
height: 0; /* 1 */
|
||||
color: inherit; /* 2 */
|
||||
border-top-width: 1px; /* 3 */
|
||||
}
|
||||
|
||||
/*
|
||||
Add the correct text decoration in Chrome, Edge, and Safari.
|
||||
*/
|
||||
|
||||
abbr:where([title]) {
|
||||
text-decoration: underline dotted;
|
||||
}
|
||||
|
||||
/*
|
||||
Remove the default font size and weight for headings.
|
||||
*/
|
||||
|
||||
h1,
|
||||
h2,
|
||||
h3,
|
||||
h4,
|
||||
h5,
|
||||
h6 {
|
||||
margin: 0;
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
/*
|
||||
Reset links to optimize for opt-in styling instead of opt-out.
|
||||
*/
|
||||
|
||||
a {
|
||||
color: inherit;
|
||||
text-decoration: inherit;
|
||||
}
|
||||
|
||||
/*
|
||||
Add the correct font weight in Edge and Safari.
|
||||
*/
|
||||
|
||||
b,
|
||||
strong {
|
||||
font-weight: bolder;
|
||||
}
|
||||
|
||||
/*
|
||||
1. Use the user's configured `mono` font family by default.
|
||||
2. Correct the odd `em` font sizing in all browsers.
|
||||
*/
|
||||
|
||||
code,
|
||||
kbd,
|
||||
samp,
|
||||
pre {
|
||||
font-family: theme('fontFamily.mono', ui-monospace, SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace); /* 1 */
|
||||
font-size: 1em; /* 2 */
|
||||
}
|
||||
|
||||
/*
|
||||
Add the correct font size in all browsers.
|
||||
*/
|
||||
|
||||
small {
|
||||
font-size: 80%;
|
||||
}
|
||||
|
||||
/*
|
||||
Prevent `sub` and `sup` elements from affecting the line height in all browsers.
|
||||
*/
|
||||
|
||||
sub,
|
||||
sup {
|
||||
font-size: 75%;
|
||||
line-height: 0;
|
||||
position: relative;
|
||||
vertical-align: baseline;
|
||||
}
|
||||
|
||||
sub {
|
||||
bottom: -0.25em;
|
||||
}
|
||||
|
||||
sup {
|
||||
top: -0.5em;
|
||||
}
|
||||
|
||||
/*
|
||||
1. Remove text indentation from table contents in Chrome and Safari. (https://bugs.chromium.org/p/chromium/issues/detail?id=999088, https://bugs.webkit.org/show_bug.cgi?id=201297)
|
||||
2. Correct table border color inheritance in all Chrome and Safari. (https://bugs.chromium.org/p/chromium/issues/detail?id=935729, https://bugs.webkit.org/show_bug.cgi?id=195016)
|
||||
3. Remove gaps between table borders by default.
|
||||
*/
|
||||
|
||||
table {
|
||||
text-indent: 0; /* 1 */
|
||||
border-color: inherit; /* 2 */
|
||||
border-collapse: collapse; /* 3 */
|
||||
margin: 0;
|
||||
width: auto;
|
||||
max-width: auto;
|
||||
}
|
||||
|
||||
table td, table th {
|
||||
padding: unset;
|
||||
vertical-align: middle;
|
||||
text-align: left;
|
||||
line-height: auto;
|
||||
-webkit-user-select: text;
|
||||
-moz-user-select: text;
|
||||
user-select: text;
|
||||
}
|

/*
1. Change the font styles in all browsers.
2. Remove the margin in Firefox and Safari.
3. Remove default padding in all browsers.
*/

button,
input,
optgroup,
select,
textarea {
    font-family: inherit; /* 1 */
    font-size: 100%; /* 1 */
    font-weight: inherit; /* 1 */
    line-height: inherit; /* 1 */
    color: inherit; /* 1 */
    margin: 0; /* 2 */
    padding: 0; /* 3 */
    outline: none;
}

/*
Remove the inheritance of text transform in Edge and Firefox.
*/

button,
select {
    text-transform: none;
    letter-spacing: inherit;
    border-radius: inherit;
    appearance: auto;
    -webkit-appearance: auto;
    background: unset;
}

/*
1. Correct the inability to style clickable types in iOS and Safari.
2. Remove default button styles.
*/

button,
/* [type='button'], */
[type='reset'],
[type='submit'] {
    -webkit-appearance: button; /* 1 */
    background-color: transparent; /* 2 */
    background-image: none; /* 2 */
}

/*
Use the modern Firefox focus style for all focusable elements.
*/

:-moz-focusring {
    outline: none;
}

/*
Remove the additional `:invalid` styles in Firefox. (https://github.com/mozilla/gecko-dev/blob/2f9eacd9d3d995c937b4251a5557d95d494c9be1/layout/style/res/forms.css#L728-L737)
*/

:-moz-ui-invalid {
    box-shadow: none;
}

/*
Add the correct vertical alignment in Chrome and Firefox.
*/

progress {
    vertical-align: baseline;
}

/*
Correct the cursor style of increment and decrement buttons in Safari.
*/

::-webkit-inner-spin-button,
::-webkit-outer-spin-button {
    height: auto;
}

/*
1. Correct the odd appearance in Chrome and Safari.
2. Correct the outline style in Safari.
*/

[type='search'] {
    -webkit-appearance: textfield; /* 1 */
    outline-offset: -2px; /* 2 */
}

/*
Remove the inner padding in Chrome and Safari on macOS.
*/

::-webkit-search-decoration {
    -webkit-appearance: none;
}

/*
1. Correct the inability to style clickable types in iOS and Safari.
2. Change font properties to `inherit` in Safari.
*/

::-webkit-file-upload-button {
    -webkit-appearance: button; /* 1 */
    font: inherit; /* 2 */
}

/*
Add the correct display in Chrome and Safari.
*/

summary {
    display: list-item;
}

/*
Removes the default spacing and border for appropriate elements.
*/

blockquote,
dl,
dd,
h1,
h2,
h3,
h4,
h5,
h6,
hr,
figure,
p,
pre {
    margin: 0;
}

fieldset {
    margin: 0;
    padding: 0;
}

legend {
    padding: 0;
}

ol,
ul,
menu {
    list-style: none;
    margin: 0;
    padding: 0;
}

li {
    margin: unset;
    line-height: unset;
}

/*
Prevent resizing textareas horizontally by default.
*/

textarea {
    resize: vertical;
}

/*
1. Reset the default placeholder opacity in Firefox. (https://github.com/tailwindlabs/tailwindcss/issues/3300)
2. Set the default placeholder color to the user's configured grey 500 color.
*/

input::placeholder,
textarea::placeholder {
    opacity: 1; /* 1 */
    @apply text-grey-500; /* 2 */
}

button:focus-visible,
input:focus-visible {
    outline: none;
}

/*
1. Make replaced elements `display: block` by default. (https://github.com/mozdevs/cssremedy/issues/14)
2. Add `vertical-align: middle` to align replaced elements more sensibly by default. (https://github.com/jensimmons/cssremedy/issues/14#issuecomment-634934210)
This can trigger a poorly considered lint error in some tools but is included by design.
*/

img,
svg,
video,
canvas,
audio,
iframe,
embed,
object {
    display: block; /* 1 */
    vertical-align: middle; /* 2 */
}

/*
Constrain images and videos to the parent width and preserve their intrinsic aspect ratio. (https://github.com/mozdevs/cssremedy/issues/14)
*/

img,
video {
    max-width: 100%;
    height: auto;
}
}