Monorepo: consolidate 7 repos into one
All checks were successful
Build and Deploy / build-and-deploy (push) Successful in 1m5s

Combines shared, blog, market, cart, events, federation, and account
into a single repository. Eliminates submodule sync, sibling model
copying at build time, and per-app CI orchestration.

Changes:
- Remove per-app .git, .gitmodules, .gitea, submodule shared/ dirs
- Remove stale sibling model copies from each app
- Update all 6 Dockerfiles for monorepo build context (root = .)
- Add build directives to docker-compose.yml
- Add single .gitea/workflows/ci.yml with change detection
- Add .dockerignore for monorepo build context
- Create __init__.py for federation and account (cross-app imports)
Author: giles
Date:   2026-02-24 19:44:17 +00:00
Commit: f42042ccb7
895 changed files with 61147 additions and 0 deletions

shared/.gitignore — new vendored file (2 lines)

@@ -0,0 +1,2 @@
__pycache__/
*.pyc

shared/README.md — new file (91 lines)

@@ -0,0 +1,91 @@
# Shared
Shared infrastructure, models, contracts, services, and templates used by all five Rose Ash microservices (blog, market, cart, events, federation). Formerly included as a git submodule in each app; in the monorepo it is a single top-level `shared/` package.
## Structure
```
shared/
  db/
    base.py                   # SQLAlchemy declarative Base
    session.py                # Async session factory (get_session, register_db)
  models/                     # Canonical domain models
    user.py                   # User
    magic_link.py             # MagicLink (auth tokens)
    (domain_event.py removed — table dropped, see migration n4l2i8j0k1)
    kv.py                     # KeyValue (key-value store)
    menu_item.py              # MenuItem (deprecated — use MenuNode)
    menu_node.py              # MenuNode (navigation tree)
    container_relation.py     # ContainerRelation (parent-child content)
    ghost_membership_entities.py  # GhostNewsletter, UserNewsletter
    federation.py             # ActorProfile, APActivity, APFollower, APFollowing,
                              # RemoteActor, APRemotePost, APLocalPost,
                              # APInteraction, APNotification, APAnchor, IPFSPin
  contracts/
    dtos.py                   # Frozen dataclasses for cross-domain data transfer
    protocols.py              # Service protocols (Blog, Calendar, Market, Cart, Federation)
    widgets.py                # Widget types (NavWidget, CardWidget, AccountPageWidget)
  services/
    registry.py               # Typed singleton: services.blog, .calendar, .market, .cart, .federation
    blog_impl.py              # SqlBlogService
    calendar_impl.py          # SqlCalendarService
    market_impl.py            # SqlMarketService
    cart_impl.py              # SqlCartService
    federation_impl.py        # SqlFederationService
    federation_publish.py     # try_publish() — inline AP publication helper
    stubs.py                  # No-op stubs for absent domains
    navigation.py             # get_navigation_tree()
    relationships.py          # attach_child, get_children, detach_child
    widget_registry.py        # Widget registry singleton
    widgets/                  # Per-domain widget registration
  infrastructure/
    factory.py                # create_base_app() — Quart app factory
    cart_identity.py          # current_cart_identity() (user_id or session_id)
    cart_loader.py            # Cart data loader for context processors
    context.py                # Jinja2 context processors
    jinja_setup.py            # Jinja2 template environment setup
    urls.py                   # URL helpers (blog_url, market_url, etc.)
    user_loader.py            # Load current user from session
    http_utils.py             # HTTP utility functions
  events/
    bus.py                    # emit_activity(), register_activity_handler()
    processor.py              # EventProcessor (polls ap_activities, runs handlers)
    handlers/                 # Shared activity handlers
      container_handlers.py   # Navigation rebuild on attach/detach
      login_handlers.py       # Cart/entry adoption on login
      order_handlers.py       # Order lifecycle events
      ap_delivery_handler.py  # AP activity delivery to follower inboxes (wildcard)
  utils/
    __init__.py
    calendar_helpers.py       # Calendar period/entry utilities
    http_signatures.py        # RSA keypair generation, HTTP signature signing/verification
    ipfs_client.py            # Async IPFS client (add_bytes, add_json, pin_cid)
    anchoring.py              # Merkle trees + OpenTimestamps Bitcoin anchoring
    webfinger.py              # WebFinger actor resolution
  browser/
    app/                      # Middleware, CSRF, errors, Redis caching, authz, filters
    templates/                # ~300 Jinja2 templates shared across all apps
  containers.py               # ContainerType, container_filter, content_filter helpers
  config.py                   # YAML config loader
  log_config/setup.py         # Logging configuration (JSON formatter)
  static/                     # Shared static assets (CSS, JS, images, FontAwesome)
  editor/                     # Koenig (Ghost) rich text editor build
  alembic/                    # Database migrations
```
## Key Patterns
- **App factory:** All apps call `create_base_app()` which sets up DB sessions, CSRF, error handling, event processing, logging, widget registration, and domain service wiring.
- **Service contracts:** Cross-domain communication via typed Protocols + frozen DTO dataclasses. Apps call `services.calendar.method()`, never import models from other domains.
- **Service registry:** Typed singleton (`services.blog`, `.calendar`, `.market`, `.cart`, `.federation`). Apps wire their own domain + stubs for others via `register_domain_services()`.
- **Activity bus:** `emit_activity()` writes to `ap_activities` table in the caller's transaction. `EventProcessor` polls pending activities and dispatches to registered handlers. Internal events use `visibility="internal"`; federation activities use `visibility="public"` and are delivered to follower inboxes by the wildcard delivery handler.
- **Widget registry:** Domain services register widgets (nav, card, account); templates consume via `widgets.container_nav`, `widgets.container_cards`.
- **Cart identity:** `current_cart_identity()` returns `{"user_id": int|None, "session_id": str|None}` from the request session.
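The contract + registry pattern above can be sketched as follows. This is an illustrative reduction, not the actual `shared/` API — `PostDTO`, `recent_posts`, and `_Registry` are invented names standing in for the real DTOs, protocols, and `register_domain_services()` wiring:

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass(frozen=True)
class PostDTO:
    """Frozen DTO: cross-domain payloads are immutable value objects."""
    id: int
    title: str


class BlogService(Protocol):
    """Structural contract; consumers depend on this, never on blog's models."""
    def recent_posts(self, limit: int) -> list[PostDTO]: ...


class SqlBlogService:
    """Stand-in implementation wired in by the owning app."""
    def recent_posts(self, limit: int) -> list[PostDTO]:
        return [PostDTO(id=1, title="hello")][:limit]


class _Registry:
    blog: BlogService


services = _Registry()
services.blog = SqlBlogService()  # roughly what register_domain_services() does
```

Because `BlogService` is a `typing.Protocol`, any object with a matching `recent_posts` signature satisfies the contract — no inheritance needed, which is what lets stubs substitute for absent domains.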
## Alembic Migrations
All apps share one PostgreSQL database. Migrations are managed here and run from the blog app's entrypoint (other apps skip migrations on startup).
```bash
alembic -c shared/alembic.ini upgrade head
```
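The "only the blog app migrates" startup gate can be sketched like this. `APP_NAME` and the function name are assumptions for illustration — the repo's actual entrypoint wiring may differ:

```python
import os


def run_migrations_if_blog(app_name: str = "") -> bool:
    """Run the shared Alembic migrations only from the blog entrypoint."""
    app = app_name or os.getenv("APP_NAME", "")
    if app != "blog":
        return False  # every other app skips migrations on startup
    # Imported lazily so non-migrating apps need nothing beyond the check.
    from alembic import command
    from alembic.config import Config

    command.upgrade(Config("shared/alembic.ini"), "head")
    return True
```

Gating on a single app avoids concurrent `alembic upgrade head` runs racing against each other when all six containers start at once against the shared database.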

shared/__init__.py — new file (1 line)

@@ -0,0 +1 @@
# shared package — infrastructure, models, contracts, and services

shared/alembic.ini — new file (35 lines)

@@ -0,0 +1,35 @@
[alembic]
script_location = alembic
sqlalchemy.url =

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s

shared/alembic/env.py — new file (69 lines)

@@ -0,0 +1,69 @@
from __future__ import annotations

import os
import sys
from logging.config import fileConfig

from alembic import context
from sqlalchemy import engine_from_config, pool

config = context.config
if config.config_file_name is not None:
    try:
        fileConfig(config.config_file_name)
    except Exception:
        pass

# Add project root so all app model packages are importable
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), "../..")))

from shared.db.base import Base

# Import ALL models so Base.metadata sees every table
import shared.models  # noqa: F401  (User, KV, MagicLink, MenuItem, Ghost*)

for _mod in ("blog.models", "market.models", "cart.models", "events.models", "federation.models"):
    try:
        __import__(_mod)
    except ImportError:
        pass  # OK in Docker — only needed for autogenerate

target_metadata = Base.metadata


def _get_url() -> str:
    # Do not log this value: it can contain credentials.
    return os.getenv(
        "ALEMBIC_DATABASE_URL",
        os.getenv("DATABASE_URL", config.get_main_option("sqlalchemy.url") or ""),
    )


def run_migrations_offline() -> None:
    context.configure(
        url=_get_url(),
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
        compare_type=True,
    )
    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    url = _get_url()
    if url:
        config.set_main_option("sqlalchemy.url", url)
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )
    with connectable.connect() as connection:
        context.configure(connection=connection, target_metadata=target_metadata, compare_type=True)
        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()


@@ -0,0 +1,24 @@
<%text>
# Alembic migration script template
</%text>
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    ${downgrades if downgrades else "pass"}


@@ -0,0 +1,33 @@
"""Initial database schema from schema.sql"""
from alembic import op
import sqlalchemy as sa
import pathlib
# revision identifiers, used by Alembic
revision = '0001_initial_schema'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
return
schema_path = pathlib.Path(__file__).parent.parent.parent / "schema.sql"
with open(schema_path, encoding="utf-8") as f:
sql = f.read()
conn = op.get_bind()
conn.execute(sa.text(sql))
def downgrade():
return
# Drop all user-defined tables in the 'public' schema
conn = op.get_bind()
conn.execute(sa.text("""
DO $$ DECLARE
r RECORD;
BEGIN
FOR r IN (SELECT tablename FROM pg_tables WHERE schemaname = 'public') LOOP
EXECUTE 'DROP TABLE IF EXISTS public.' || quote_ident(r.tablename) || ' CASCADE';
END LOOP;
END $$;
"""))


@@ -0,0 +1,78 @@
"""Add cart_items table for shopping cart"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "0002_add_cart_items"
down_revision = "0001_initial_schema"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"cart_items",
sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
# Either a logged-in user *or* an anonymous session_id
sa.Column(
"user_id",
sa.Integer(),
sa.ForeignKey("users.id", ondelete="CASCADE"),
nullable=True,
),
sa.Column("session_id", sa.String(length=128), nullable=True),
# IMPORTANT: reference products.id (PK), not slug
sa.Column(
"product_id",
sa.Integer(),
sa.ForeignKey("products.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column(
"quantity",
sa.Integer(),
nullable=False,
server_default="1",
),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.text("now()"),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.text("now()"),
),
sa.Column(
"deleted_at",
sa.DateTime(timezone=True),
nullable=True,
),
)
# Indexes to speed up cart lookups
op.create_index(
"ix_cart_items_user_product",
"cart_items",
["user_id", "product_id"],
unique=False,
)
op.create_index(
"ix_cart_items_session_product",
"cart_items",
["session_id", "product_id"],
unique=False,
)
def downgrade() -> None:
op.drop_index("ix_cart_items_session_product", table_name="cart_items")
op.drop_index("ix_cart_items_user_product", table_name="cart_items")
op.drop_table("cart_items")


@@ -0,0 +1,118 @@
"""Add orders and order_items tables for checkout"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "0003_add_orders"
down_revision = "0002_add_cart_items"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"orders",
sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
sa.Column("user_id", sa.Integer(), sa.ForeignKey("users.id"), nullable=True),
sa.Column("session_id", sa.String(length=64), nullable=True),
sa.Column(
"status",
sa.String(length=32),
nullable=False,
server_default="pending",
),
sa.Column(
"currency",
sa.String(length=16),
nullable=False,
server_default="GBP",
),
sa.Column(
"total_amount",
sa.Numeric(12, 2),
nullable=False,
),
# SumUp integration fields
sa.Column("sumup_checkout_id", sa.String(length=128), nullable=True),
sa.Column("sumup_status", sa.String(length=32), nullable=True),
sa.Column("sumup_hosted_url", sa.Text(), nullable=True),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
)
# Indexes to match model hints (session_id + sumup_checkout_id index=True)
op.create_index(
"ix_orders_session_id",
"orders",
["session_id"],
unique=False,
)
op.create_index(
"ix_orders_sumup_checkout_id",
"orders",
["sumup_checkout_id"],
unique=False,
)
op.create_table(
"order_items",
sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
sa.Column(
"order_id",
sa.Integer(),
sa.ForeignKey("orders.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column(
"product_id",
sa.Integer(),
sa.ForeignKey("products.id"),
nullable=False,
),
sa.Column("product_title", sa.String(length=512), nullable=True),
sa.Column(
"quantity",
sa.Integer(),
nullable=False,
server_default="1",
),
sa.Column(
"unit_price",
sa.Numeric(12, 2),
nullable=False,
),
sa.Column(
"currency",
sa.String(length=16),
nullable=False,
server_default="GBP",
),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.func.now(),
),
)
def downgrade() -> None:
op.drop_table("order_items")
op.drop_index("ix_orders_sumup_checkout_id", table_name="orders")
op.drop_index("ix_orders_session_id", table_name="orders")
op.drop_table("orders")


@@ -0,0 +1,27 @@
"""Add sumup_reference to orders"""
from alembic import op
import sqlalchemy as sa
revision = "0004_add_sumup_reference"
down_revision = "0003_add_orders"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"orders",
sa.Column("sumup_reference", sa.String(length=255), nullable=True),
)
op.create_index(
"ix_orders_sumup_reference",
"orders",
["sumup_reference"],
unique=False,
)
def downgrade() -> None:
op.drop_index("ix_orders_sumup_reference", table_name="orders")
op.drop_column("orders", "sumup_reference")


@@ -0,0 +1,27 @@
"""Add description field to orders"""
from alembic import op
import sqlalchemy as sa
revision = "0005_add_description"
down_revision = "0004_add_sumup_reference"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"orders",
sa.Column("description", sa.Text(), nullable=True),
)
op.create_index(
"ix_orders_description",
"orders",
["description"],
unique=False,
)
def downgrade() -> None:
op.drop_index("ix_orders_description", table_name="orders")
op.drop_column("orders", "description")


@@ -0,0 +1,28 @@
from alembic import op
import sqlalchemy as sa

revision = '0006_update_calendar_entries'
down_revision = '0005_add_description'
branch_labels = None
depends_on = None


def upgrade():
    # Add user_id and session_id columns
    op.add_column('calendar_entries', sa.Column('user_id', sa.Integer(), nullable=True))
    op.create_foreign_key('fk_calendar_entries_user_id', 'calendar_entries', 'users', ['user_id'], ['id'])
    op.add_column('calendar_entries', sa.Column('session_id', sa.String(length=128), nullable=True))
    # Add state and cost columns
    op.add_column('calendar_entries', sa.Column('state', sa.String(length=20), nullable=False, server_default='pending'))
    op.add_column('calendar_entries', sa.Column('cost', sa.Numeric(10, 2), nullable=False, server_default='10'))
    # Create indexes on the new columns
    op.create_index('ix_calendar_entries_user_id', 'calendar_entries', ['user_id'])
    op.create_index('ix_calendar_entries_session_id', 'calendar_entries', ['session_id'])


def downgrade():
    op.drop_index('ix_calendar_entries_session_id', table_name='calendar_entries')
    op.drop_index('ix_calendar_entries_user_id', table_name='calendar_entries')
    op.drop_column('calendar_entries', 'cost')
    op.drop_column('calendar_entries', 'state')
    op.drop_column('calendar_entries', 'session_id')
    op.drop_constraint('fk_calendar_entries_user_id', 'calendar_entries', type_='foreignkey')
    op.drop_column('calendar_entries', 'user_id')


@@ -0,0 +1,50 @@
from alembic import op
import sqlalchemy as sa

revision = "0007_add_oid_entries"
down_revision = "0006_update_calendar_entries"
branch_labels = None
depends_on = None


def upgrade():
    # Add order_id column
    op.add_column(
        "calendar_entries",
        sa.Column("order_id", sa.Integer(), nullable=True),
    )
    op.create_foreign_key(
        "fk_calendar_entries_order_id",
        "calendar_entries",
        "orders",
        ["order_id"],
        ["id"],
        ondelete="SET NULL",
    )
    op.create_index(
        "ix_calendar_entries_order_id",
        "calendar_entries",
        ["order_id"],
        unique=False,
    )
    # Index on state for faster queries by state
    op.create_index(
        "ix_calendar_entries_state",
        "calendar_entries",
        ["state"],
        unique=False,
    )


def downgrade():
    # Drop indexes and FK in reverse order
    op.drop_index("ix_calendar_entries_state", table_name="calendar_entries")
    op.drop_index("ix_calendar_entries_order_id", table_name="calendar_entries")
    op.drop_constraint(
        "fk_calendar_entries_order_id",
        "calendar_entries",
        type_="foreignkey",
    )
    op.drop_column("calendar_entries", "order_id")


@@ -0,0 +1,33 @@
"""add flexible flag to calendar_slots
Revision ID: 0008_add_flexible_to_calendar_slots
Revises: 0007_add_order_id_to_calendar_entries
Create Date: 2025-12-06 12:34:56.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "0008_add_flexible_to_slots"
down_revision = "0007_add_oid_entries"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"calendar_slots",
sa.Column(
"flexible",
sa.Boolean(),
nullable=False,
server_default=sa.false(), # set existing rows to False
),
)
# Optional: drop server_default so future inserts must supply a value
op.alter_column("calendar_slots", "flexible", server_default=None)
def downgrade() -> None:
op.drop_column("calendar_slots", "flexible")


@@ -0,0 +1,54 @@
"""add slot_id to calendar_entries
Revision ID: 0009_add_slot_id_to_entries
Revises: 0008_add_flexible_to_slots
Create Date: 2025-12-06 13:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "0009_add_slot_id_to_entries"
down_revision = "0008_add_flexible_to_slots"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Add slot_id column as nullable initially
op.add_column(
"calendar_entries",
sa.Column(
"slot_id",
sa.Integer(),
nullable=True,
),
)
# Add foreign key constraint
op.create_foreign_key(
"fk_calendar_entries_slot_id_calendar_slots",
"calendar_entries",
"calendar_slots",
["slot_id"],
["id"],
ondelete="SET NULL",
)
# Add index for better query performance
op.create_index(
"ix_calendar_entries_slot_id",
"calendar_entries",
["slot_id"],
)
def downgrade() -> None:
op.drop_index("ix_calendar_entries_slot_id", table_name="calendar_entries")
op.drop_constraint(
"fk_calendar_entries_slot_id_calendar_slots",
"calendar_entries",
type_="foreignkey",
)
op.drop_column("calendar_entries", "slot_id")


@@ -0,0 +1,64 @@
"""Add post_likes table for liking blog posts
Revision ID: 0010_add_post_likes
Revises: 0009_add_slot_id_to_entries
Create Date: 2025-12-07 13:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "0010_add_post_likes"
down_revision = "0009_add_slot_id_to_entries"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"post_likes",
sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
sa.Column(
"user_id",
sa.Integer(),
sa.ForeignKey("users.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column(
"post_id",
sa.Integer(),
sa.ForeignKey("posts.id", ondelete="CASCADE"),
nullable=False,
),
sa.Column(
"created_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.text("now()"),
),
sa.Column(
"updated_at",
sa.DateTime(timezone=True),
nullable=False,
server_default=sa.text("now()"),
),
sa.Column(
"deleted_at",
sa.DateTime(timezone=True),
nullable=True,
),
)
# Index for fast user+post lookups
op.create_index(
"ix_post_likes_user_post",
"post_likes",
["user_id", "post_id"],
unique=False,
)
def downgrade() -> None:
op.drop_index("ix_post_likes_user_post", table_name="post_likes")
op.drop_table("post_likes")


@@ -0,0 +1,43 @@
"""Add ticket_price and ticket_count to calendar_entries
Revision ID: 0011_add_entry_tickets
Revises: 0010_add_post_likes
Create Date: 2025-12-07 14:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import NUMERIC
# revision identifiers, used by Alembic.
revision = "0011_add_entry_tickets"
down_revision = "0010_add_post_likes"
branch_labels = None
depends_on = None
def upgrade() -> None:
# Add ticket_price column (nullable - NULL means no tickets)
op.add_column(
"calendar_entries",
sa.Column(
"ticket_price",
NUMERIC(10, 2),
nullable=True,
),
)
# Add ticket_count column (nullable - NULL means unlimited)
op.add_column(
"calendar_entries",
sa.Column(
"ticket_count",
sa.Integer(),
nullable=True,
),
)
def downgrade() -> None:
op.drop_column("calendar_entries", "ticket_count")
op.drop_column("calendar_entries", "ticket_price")


@@ -0,0 +1,41 @@
# Alembic migration script template
"""add ticket_types table
Revision ID: 47fc53fc0d2b
Revises: a9f54e4eaf02
Create Date: 2025-12-08 07:29:11.422435
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql

# revision identifiers, used by Alembic.
revision = '47fc53fc0d2b'
down_revision = 'a9f54e4eaf02'
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        'ticket_types',
        sa.Column('id', sa.Integer(), nullable=False),
        sa.Column('entry_id', sa.Integer(), nullable=False),
        sa.Column('name', sa.String(length=255), nullable=False),
        sa.Column('cost', sa.Numeric(precision=10, scale=2), nullable=False),
        sa.Column('count', sa.Integer(), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
        sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True),
        sa.ForeignKeyConstraint(['entry_id'], ['calendar_entries.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index('ix_ticket_types_entry_id', 'ticket_types', ['entry_id'], unique=False)
    op.create_index('ix_ticket_types_name', 'ticket_types', ['name'], unique=False)


def downgrade() -> None:
    op.drop_index('ix_ticket_types_name', table_name='ticket_types')
    op.drop_index('ix_ticket_types_entry_id', table_name='ticket_types')
    op.drop_table('ticket_types')


@@ -0,0 +1,36 @@
# Alembic migration script template
"""Add calendar_entry_posts association table
Revision ID: 6cb124491c9d
Revises: 0011_add_entry_tickets
Create Date: 2025-12-07 03:40:49.194068
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import TIMESTAMP

# revision identifiers, used by Alembic.
revision = '6cb124491c9d'
down_revision = '0011_add_entry_tickets'
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        'calendar_entry_posts',
        sa.Column('id', sa.Integer(), primary_key=True, autoincrement=True),
        sa.Column('entry_id', sa.Integer(), sa.ForeignKey('calendar_entries.id', ondelete='CASCADE'), nullable=False),
        sa.Column('post_id', sa.Integer(), sa.ForeignKey('posts.id', ondelete='CASCADE'), nullable=False),
        sa.Column('created_at', TIMESTAMP(timezone=True), nullable=False, server_default=sa.func.now()),
        sa.Column('deleted_at', TIMESTAMP(timezone=True), nullable=True),
    )
    op.create_index('ix_entry_posts_entry_id', 'calendar_entry_posts', ['entry_id'])
    op.create_index('ix_entry_posts_post_id', 'calendar_entry_posts', ['post_id'])


def downgrade() -> None:
    op.drop_index('ix_entry_posts_post_id', table_name='calendar_entry_posts')
    op.drop_index('ix_entry_posts_entry_id', table_name='calendar_entry_posts')
    op.drop_table('calendar_entry_posts')


@@ -0,0 +1,74 @@
"""add page_configs table
Revision ID: a1b2c3d4e5f6
Revises: f6d4a1b2c3e7
Create Date: 2026-02-10
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import text
revision = 'a1b2c3d4e5f6'
down_revision = 'f6d4a1b2c3e7'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
'page_configs',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('post_id', sa.Integer(), nullable=False),
sa.Column('features', sa.JSON(), server_default='{}', nullable=False),
sa.Column('sumup_merchant_code', sa.String(64), nullable=True),
sa.Column('sumup_api_key', sa.Text(), nullable=True),
sa.Column('sumup_checkout_prefix', sa.String(64), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True),
sa.ForeignKeyConstraint(['post_id'], ['posts.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('post_id'),
)
# Backfill: create PageConfig for every existing page
conn = op.get_bind()
# 1. Pages with calendars -> features={"calendar": true}
conn.execute(text("""
INSERT INTO page_configs (post_id, features, created_at, updated_at)
SELECT p.id, '{"calendar": true}'::jsonb, now(), now()
FROM posts p
WHERE p.is_page = true
AND p.deleted_at IS NULL
AND EXISTS (
SELECT 1 FROM calendars c
WHERE c.post_id = p.id AND c.deleted_at IS NULL
)
"""))
# 2. Market page (slug='market', is_page=true) -> features={"market": true}
# Only if not already inserted above
conn.execute(text("""
INSERT INTO page_configs (post_id, features, created_at, updated_at)
SELECT p.id, '{"market": true}'::jsonb, now(), now()
FROM posts p
WHERE p.slug = 'market'
AND p.is_page = true
AND p.deleted_at IS NULL
AND p.id NOT IN (SELECT post_id FROM page_configs)
"""))
# 3. All other pages -> features={}
conn.execute(text("""
INSERT INTO page_configs (post_id, features, created_at, updated_at)
SELECT p.id, '{}'::jsonb, now(), now()
FROM posts p
WHERE p.is_page = true
AND p.deleted_at IS NULL
AND p.id NOT IN (SELECT post_id FROM page_configs)
"""))
def downgrade() -> None:
op.drop_table('page_configs')


@@ -0,0 +1,37 @@
# Alembic migration script template
"""add menu_items table
Revision ID: a9f54e4eaf02
Revises: 6cb124491c9d
Create Date: 2025-12-07 17:38:54.839296
"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = 'a9f54e4eaf02'
down_revision = '6cb124491c9d'
branch_labels = None
depends_on = None


def upgrade() -> None:
    op.create_table(
        'menu_items',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('post_id', sa.Integer(), nullable=False),
        sa.Column('sort_order', sa.Integer(), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
        sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True),
        sa.ForeignKeyConstraint(['post_id'], ['posts.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_menu_items_post_id'), 'menu_items', ['post_id'], unique=False)
    op.create_index(op.f('ix_menu_items_sort_order'), 'menu_items', ['sort_order'], unique=False)


def downgrade() -> None:
    op.drop_index(op.f('ix_menu_items_sort_order'), table_name='menu_items')
    op.drop_index(op.f('ix_menu_items_post_id'), table_name='menu_items')
    op.drop_table('menu_items')


@@ -0,0 +1,97 @@
"""add market_places table and nav_tops.market_id
Revision ID: b2c3d4e5f6a7
Revises: a1b2c3d4e5f6
Create Date: 2026-02-10
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import text
revision = 'b2c3d4e5f6a7'
down_revision = 'a1b2c3d4e5f6'
branch_labels = None
depends_on = None
def upgrade() -> None:
# 1. Create market_places table
op.create_table(
'market_places',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('post_id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(255), nullable=False),
sa.Column('slug', sa.String(255), nullable=False),
sa.Column('description', sa.Text(), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True),
sa.ForeignKeyConstraint(['post_id'], ['posts.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id'),
)
op.create_index('ix_market_places_post_id', 'market_places', ['post_id'])
op.create_index(
'ux_market_places_slug_active',
'market_places',
[sa.text('lower(slug)')],
unique=True,
postgresql_where=sa.text('deleted_at IS NULL'),
)
# 2. Add market_id column to nav_tops
op.add_column(
'nav_tops',
sa.Column('market_id', sa.Integer(), nullable=True),
)
op.create_foreign_key(
'fk_nav_tops_market_id',
'nav_tops',
'market_places',
['market_id'],
['id'],
ondelete='SET NULL',
)
op.create_index('ix_nav_tops_market_id', 'nav_tops', ['market_id'])
# 3. Backfill: create default MarketPlace for the 'market' page
conn = op.get_bind()
# Find the market page
result = conn.execute(text("""
SELECT id FROM posts
WHERE slug = 'market' AND is_page = true AND deleted_at IS NULL
LIMIT 1
"""))
row = result.fetchone()
if row:
post_id = row[0]
# Insert the default market
conn.execute(text("""
INSERT INTO market_places (post_id, name, slug, created_at, updated_at)
VALUES (:post_id, 'Suma Market', 'suma-market', now(), now())
"""), {"post_id": post_id})
# Get the new market_places id
market_row = conn.execute(text("""
SELECT id FROM market_places
WHERE slug = 'suma-market' AND deleted_at IS NULL
LIMIT 1
""")).fetchone()
if market_row:
market_id = market_row[0]
# Assign all active nav_tops to this market
conn.execute(text("""
UPDATE nav_tops SET market_id = :market_id
WHERE deleted_at IS NULL
"""), {"market_id": market_id})
def downgrade() -> None:
op.drop_index('ix_nav_tops_market_id', table_name='nav_tops')
op.drop_constraint('fk_nav_tops_market_id', 'nav_tops', type_='foreignkey')
op.drop_column('nav_tops', 'market_id')
op.drop_index('ux_market_places_slug_active', table_name='market_places')
op.drop_index('ix_market_places_post_id', table_name='market_places')
op.drop_table('market_places')


@@ -0,0 +1,35 @@
"""add snippets table
Revision ID: c3a1f7b9d4e5
Revises: 47fc53fc0d2b
Create Date: 2026-02-07
"""
from alembic import op
import sqlalchemy as sa
revision = 'c3a1f7b9d4e5'
down_revision = '47fc53fc0d2b'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
'snippets',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=255), nullable=False),
sa.Column('value', sa.Text(), nullable=False),
sa.Column('visibility', sa.String(length=20), server_default='private', nullable=False),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['users.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user_id', 'name', name='uq_snippets_user_name'),
)
op.create_index('ix_snippets_visibility', 'snippets', ['visibility'])
def downgrade() -> None:
op.drop_index('ix_snippets_visibility', table_name='snippets')
op.drop_table('snippets')


@@ -0,0 +1,55 @@
"""add page_config_id to orders, market_place_id to cart_items
Revision ID: c3d4e5f6a7b8
Revises: b2c3d4e5f6a7
Create Date: 2026-02-10
"""
from alembic import op
import sqlalchemy as sa
revision = 'c3d4e5f6a7b8'
down_revision = 'b2c3d4e5f6a7'
branch_labels = None
depends_on = None
def upgrade() -> None:
# 1. Add market_place_id to cart_items
op.add_column(
'cart_items',
sa.Column('market_place_id', sa.Integer(), nullable=True),
)
op.create_foreign_key(
'fk_cart_items_market_place_id',
'cart_items',
'market_places',
['market_place_id'],
['id'],
ondelete='SET NULL',
)
op.create_index('ix_cart_items_market_place_id', 'cart_items', ['market_place_id'])
# 2. Add page_config_id to orders
op.add_column(
'orders',
sa.Column('page_config_id', sa.Integer(), nullable=True),
)
op.create_foreign_key(
'fk_orders_page_config_id',
'orders',
'page_configs',
['page_config_id'],
['id'],
ondelete='SET NULL',
)
op.create_index('ix_orders_page_config_id', 'orders', ['page_config_id'])
def downgrade() -> None:
op.drop_index('ix_orders_page_config_id', table_name='orders')
op.drop_constraint('fk_orders_page_config_id', 'orders', type_='foreignkey')
op.drop_column('orders', 'page_config_id')
op.drop_index('ix_cart_items_market_place_id', table_name='cart_items')
op.drop_constraint('fk_cart_items_market_place_id', 'cart_items', type_='foreignkey')
op.drop_column('cart_items', 'market_place_id')


@@ -0,0 +1,45 @@
"""add post user_id, author email, publish_requested
Revision ID: d4b2e8f1a3c7
Revises: c3a1f7b9d4e5
Create Date: 2026-02-08
"""
from alembic import op
import sqlalchemy as sa
revision = 'd4b2e8f1a3c7'
down_revision = 'c3a1f7b9d4e5'
branch_labels = None
depends_on = None
def upgrade() -> None:
# Add author.email
op.add_column('authors', sa.Column('email', sa.String(255), nullable=True))
# Add post.user_id FK
op.add_column('posts', sa.Column('user_id', sa.Integer(), nullable=True))
op.create_foreign_key('fk_posts_user_id', 'posts', 'users', ['user_id'], ['id'], ondelete='SET NULL')
op.create_index('ix_posts_user_id', 'posts', ['user_id'])
# Add post.publish_requested
op.add_column('posts', sa.Column('publish_requested', sa.Boolean(), server_default='false', nullable=False))
# Backfill: match posts to users via primary_author email
op.execute("""
UPDATE posts
SET user_id = u.id
FROM authors a
JOIN users u ON lower(a.email) = lower(u.email)
WHERE posts.primary_author_id = a.id
AND posts.user_id IS NULL
AND a.email IS NOT NULL
""")
def downgrade() -> None:
op.drop_column('posts', 'publish_requested')
op.drop_index('ix_posts_user_id', table_name='posts')
op.drop_constraint('fk_posts_user_id', 'posts', type_='foreignkey')
op.drop_column('posts', 'user_id')
op.drop_column('authors', 'email')
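The backfill above joins `authors` to `users` on `lower(email)` and only fills posts whose `user_id` is still NULL. The same matching, sketched in plain Python over in-memory rows (the dict row shape is illustrative):

```python
def backfill_post_user_ids(posts, authors, users):
    """Mirror the UPDATE ... FROM join: match each post to a user via
    the primary author's email, case-insensitively, filling only NULLs."""
    authors_by_id = {a["id"]: a for a in authors}
    users_by_email = {u["email"].lower(): u["id"] for u in users if u.get("email")}
    for post in posts:
        if post.get("user_id") is not None:
            continue  # AND posts.user_id IS NULL
        author = authors_by_id.get(post.get("primary_author_id"))
        if not author or not author.get("email"):
            continue  # AND a.email IS NOT NULL
        uid = users_by_email.get(author["email"].lower())
        if uid is not None:
            post["user_id"] = uid
    return posts
```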


@@ -0,0 +1,45 @@
"""add tag_groups and tag_group_tags
Revision ID: e5c3f9a2b1d6
Revises: d4b2e8f1a3c7
Create Date: 2026-02-08
"""
from alembic import op
import sqlalchemy as sa
revision = 'e5c3f9a2b1d6'
down_revision = 'd4b2e8f1a3c7'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
'tag_groups',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('name', sa.String(length=255), nullable=False),
sa.Column('slug', sa.String(length=191), nullable=False),
sa.Column('feature_image', sa.Text(), nullable=True),
sa.Column('colour', sa.String(length=32), nullable=True),
sa.Column('sort_order', sa.Integer(), nullable=False, server_default='0'),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('slug'),
)
op.create_table(
'tag_group_tags',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('tag_group_id', sa.Integer(), nullable=False),
sa.Column('tag_id', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(['tag_group_id'], ['tag_groups.id'], ondelete='CASCADE'),
sa.ForeignKeyConstraint(['tag_id'], ['tags.id'], ondelete='CASCADE'),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('tag_group_id', 'tag_id', name='uq_tag_group_tag'),
)
def downgrade() -> None:
op.drop_table('tag_group_tags')
op.drop_table('tag_groups')


@@ -0,0 +1,40 @@
"""add domain_events table
Revision ID: f6d4a0b2c3e7
Revises: e5c3f9a2b1d6
Create Date: 2026-02-11
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
revision = 'f6d4a0b2c3e7'
down_revision = 'e5c3f9a2b1d6'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
'domain_events',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('event_type', sa.String(128), nullable=False),
sa.Column('aggregate_type', sa.String(64), nullable=False),
sa.Column('aggregate_id', sa.Integer(), nullable=False),
sa.Column('payload', postgresql.JSONB(astext_type=sa.Text()), nullable=True),
sa.Column('state', sa.String(20), server_default='pending', nullable=False),
sa.Column('attempts', sa.Integer(), server_default='0', nullable=False),
sa.Column('max_attempts', sa.Integer(), server_default='5', nullable=False),
sa.Column('last_error', sa.Text(), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('processed_at', sa.DateTime(timezone=True), nullable=True),
sa.PrimaryKeyConstraint('id'),
)
op.create_index('ix_domain_events_event_type', 'domain_events', ['event_type'])
op.create_index('ix_domain_events_state', 'domain_events', ['state'])
def downgrade() -> None:
op.drop_index('ix_domain_events_state', table_name='domain_events')
op.drop_index('ix_domain_events_event_type', table_name='domain_events')
op.drop_table('domain_events')
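The `state`/`attempts`/`max_attempts` columns describe a transactional-outbox retry lifecycle. An in-memory sketch of one dispatcher step (the terminal state names beyond `'pending'` are assumptions, chosen to match the `'completed'` default used by the later event-bus migration):

```python
def process_event(event, handler):
    """Advance one domain_events row through the retry state machine.
    'completed' and 'failed' are assumed terminal state names."""
    if event["state"] != "pending":
        return event
    event["attempts"] += 1
    try:
        handler(event["payload"])
        event["state"] = "completed"
    except Exception as exc:
        event["last_error"] = str(exc)
        if event["attempts"] >= event["max_attempts"]:
            event["state"] = "failed"
    return event
```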


@@ -0,0 +1,47 @@
"""add tickets table
Revision ID: f6d4a1b2c3e7
Revises: e5c3f9a2b1d6
Create Date: 2026-02-09
"""
from alembic import op
import sqlalchemy as sa
revision = 'f6d4a1b2c3e7'
down_revision = 'e5c3f9a2b1d6'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
'tickets',
sa.Column('id', sa.Integer(), primary_key=True),
sa.Column('entry_id', sa.Integer(), sa.ForeignKey('calendar_entries.id', ondelete='CASCADE'), nullable=False),
sa.Column('ticket_type_id', sa.Integer(), sa.ForeignKey('ticket_types.id', ondelete='SET NULL'), nullable=True),
sa.Column('user_id', sa.Integer(), sa.ForeignKey('users.id'), nullable=True),
sa.Column('session_id', sa.String(64), nullable=True),
sa.Column('order_id', sa.Integer(), sa.ForeignKey('orders.id', ondelete='SET NULL'), nullable=True),
sa.Column('code', sa.String(64), unique=True, nullable=False),
sa.Column('state', sa.String(20), nullable=False, server_default=sa.text("'reserved'")),
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.Column('checked_in_at', sa.DateTime(timezone=True), nullable=True),
)
op.create_index('ix_tickets_entry_id', 'tickets', ['entry_id'])
op.create_index('ix_tickets_ticket_type_id', 'tickets', ['ticket_type_id'])
op.create_index('ix_tickets_user_id', 'tickets', ['user_id'])
op.create_index('ix_tickets_session_id', 'tickets', ['session_id'])
op.create_index('ix_tickets_order_id', 'tickets', ['order_id'])
op.create_index('ix_tickets_code', 'tickets', ['code'], unique=True)
op.create_index('ix_tickets_state', 'tickets', ['state'])
def downgrade() -> None:
op.drop_index('ix_tickets_state', table_name='tickets')
op.drop_index('ix_tickets_code', table_name='tickets')
op.drop_index('ix_tickets_order_id', table_name='tickets')
op.drop_index('ix_tickets_session_id', table_name='tickets')
op.drop_index('ix_tickets_user_id', table_name='tickets')
op.drop_index('ix_tickets_ticket_type_id', table_name='tickets')
op.drop_index('ix_tickets_entry_id', table_name='tickets')
op.drop_table('tickets')
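The tickets table carries a `state` string (server default `'reserved'`) plus a `checked_in_at` timestamp. One plausible transition guard, assuming a reserved → paid → checked_in lifecycle (only `'reserved'` is fixed by the migration; the other states are hypothetical):

```python
from datetime import datetime, timezone

# Assumed lifecycle; the migration only fixes the 'reserved' default.
TRANSITIONS = {
    "reserved": {"paid", "cancelled"},
    "paid": {"checked_in", "cancelled"},
}

def transition(ticket, new_state):
    """Move a ticket row to new_state, stamping checked_in_at on check-in."""
    allowed = TRANSITIONS.get(ticket["state"], set())
    if new_state not in allowed:
        raise ValueError(f"cannot move {ticket['state']!r} -> {new_state!r}")
    ticket["state"] = new_state
    if new_state == "checked_in":
        ticket["checked_in_at"] = datetime.now(timezone.utc)
    return ticket
```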


@@ -0,0 +1,115 @@
"""replace post_id FKs with container_type + container_id
Revision ID: g7e5b1c3d4f8
Revises: f6d4a0b2c3e7
Create Date: 2026-02-11
"""
from alembic import op
import sqlalchemy as sa
revision = 'g7e5b1c3d4f8'
down_revision = 'f6d4a0b2c3e7'
branch_labels = None
depends_on = None
def upgrade() -> None:
# --- calendars: post_id → container_type + container_id ---
op.add_column('calendars', sa.Column('container_type', sa.String(32), nullable=True))
op.add_column('calendars', sa.Column('container_id', sa.Integer(), nullable=True))
op.execute("UPDATE calendars SET container_type = 'page', container_id = post_id")
op.alter_column('calendars', 'container_type', nullable=False, server_default=sa.text("'page'"))
op.alter_column('calendars', 'container_id', nullable=False)
op.drop_index('ix_calendars_post_id', table_name='calendars')
op.drop_index('ux_calendars_post_slug_active', table_name='calendars')
op.drop_constraint('calendars_post_id_fkey', 'calendars', type_='foreignkey')
op.drop_column('calendars', 'post_id')
op.create_index('ix_calendars_container', 'calendars', ['container_type', 'container_id'])
op.create_index(
'ux_calendars_container_slug_active',
'calendars',
['container_type', 'container_id', sa.text('lower(slug)')],
unique=True,
postgresql_where=sa.text('deleted_at IS NULL'),
)
# --- market_places: post_id → container_type + container_id ---
op.add_column('market_places', sa.Column('container_type', sa.String(32), nullable=True))
op.add_column('market_places', sa.Column('container_id', sa.Integer(), nullable=True))
op.execute("UPDATE market_places SET container_type = 'page', container_id = post_id")
op.alter_column('market_places', 'container_type', nullable=False, server_default=sa.text("'page'"))
op.alter_column('market_places', 'container_id', nullable=False)
op.drop_index('ix_market_places_post_id', table_name='market_places')
op.drop_constraint('market_places_post_id_fkey', 'market_places', type_='foreignkey')
op.drop_column('market_places', 'post_id')
op.create_index('ix_market_places_container', 'market_places', ['container_type', 'container_id'])
# --- page_configs: post_id → container_type + container_id ---
op.add_column('page_configs', sa.Column('container_type', sa.String(32), nullable=True))
op.add_column('page_configs', sa.Column('container_id', sa.Integer(), nullable=True))
op.execute("UPDATE page_configs SET container_type = 'page', container_id = post_id")
op.alter_column('page_configs', 'container_type', nullable=False, server_default=sa.text("'page'"))
op.alter_column('page_configs', 'container_id', nullable=False)
op.drop_constraint('page_configs_post_id_fkey', 'page_configs', type_='foreignkey')
op.drop_column('page_configs', 'post_id')
op.create_index('ix_page_configs_container', 'page_configs', ['container_type', 'container_id'])
# --- calendar_entry_posts: post_id → content_type + content_id ---
op.add_column('calendar_entry_posts', sa.Column('content_type', sa.String(32), nullable=True))
op.add_column('calendar_entry_posts', sa.Column('content_id', sa.Integer(), nullable=True))
op.execute("UPDATE calendar_entry_posts SET content_type = 'post', content_id = post_id")
op.alter_column('calendar_entry_posts', 'content_type', nullable=False, server_default=sa.text("'post'"))
op.alter_column('calendar_entry_posts', 'content_id', nullable=False)
op.drop_index('ix_entry_posts_post_id', table_name='calendar_entry_posts')
op.drop_constraint('calendar_entry_posts_post_id_fkey', 'calendar_entry_posts', type_='foreignkey')
op.drop_column('calendar_entry_posts', 'post_id')
op.create_index('ix_entry_posts_content', 'calendar_entry_posts', ['content_type', 'content_id'])
def downgrade() -> None:
# --- calendar_entry_posts: restore post_id ---
op.add_column('calendar_entry_posts', sa.Column('post_id', sa.Integer(), nullable=True))
op.execute("UPDATE calendar_entry_posts SET post_id = content_id WHERE content_type = 'post'")
op.alter_column('calendar_entry_posts', 'post_id', nullable=False)
op.create_foreign_key('calendar_entry_posts_post_id_fkey', 'calendar_entry_posts', 'posts', ['post_id'], ['id'], ondelete='CASCADE')
op.create_index('ix_entry_posts_post_id', 'calendar_entry_posts', ['post_id'])
op.drop_index('ix_entry_posts_content', table_name='calendar_entry_posts')
op.drop_column('calendar_entry_posts', 'content_id')
op.drop_column('calendar_entry_posts', 'content_type')
# --- page_configs: restore post_id ---
op.add_column('page_configs', sa.Column('post_id', sa.Integer(), nullable=True))
op.execute("UPDATE page_configs SET post_id = container_id WHERE container_type = 'page'")
op.alter_column('page_configs', 'post_id', nullable=False)
op.create_foreign_key('page_configs_post_id_fkey', 'page_configs', 'posts', ['post_id'], ['id'], ondelete='CASCADE')
op.drop_index('ix_page_configs_container', table_name='page_configs')
op.drop_column('page_configs', 'container_id')
op.drop_column('page_configs', 'container_type')
# --- market_places: restore post_id ---
op.add_column('market_places', sa.Column('post_id', sa.Integer(), nullable=True))
op.execute("UPDATE market_places SET post_id = container_id WHERE container_type = 'page'")
op.alter_column('market_places', 'post_id', nullable=False)
op.create_foreign_key('market_places_post_id_fkey', 'market_places', 'posts', ['post_id'], ['id'], ondelete='CASCADE')
op.create_index('ix_market_places_post_id', 'market_places', ['post_id'])
op.drop_index('ix_market_places_container', table_name='market_places')
op.drop_column('market_places', 'container_id')
op.drop_column('market_places', 'container_type')
# --- calendars: restore post_id ---
op.add_column('calendars', sa.Column('post_id', sa.Integer(), nullable=True))
op.execute("UPDATE calendars SET post_id = container_id WHERE container_type = 'page'")
op.alter_column('calendars', 'post_id', nullable=False)
op.create_foreign_key('calendars_post_id_fkey', 'calendars', 'posts', ['post_id'], ['id'], ondelete='CASCADE')
op.create_index('ix_calendars_post_id', 'calendars', ['post_id'])
op.create_index(
'ux_calendars_post_slug_active',
'calendars',
['post_id', sa.text('lower(slug)')],
unique=True,
postgresql_where=sa.text('deleted_at IS NULL'),
)
op.drop_index('ux_calendars_container_slug_active', table_name='calendars')
op.drop_index('ix_calendars_container', table_name='calendars')
op.drop_column('calendars', 'container_id')
op.drop_column('calendars', 'container_type')
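After this migration a row points at its parent through a (`container_type`, `container_id`) pair instead of a hard `post_id` FK, so resolution becomes a dispatch on the type string. A sketch with an in-memory registry standing in for per-type repositories (the registry contents are illustrative):

```python
# Illustrative resolver for (container_type, container_id) pairs.
# Real code would dispatch to per-type repositories; dicts stand in here.
REGISTRY = {
    "page": {1: {"id": 1, "title": "Home"}},
}

def resolve_container(container_type, container_id):
    """Look up the row a soft (type, id) reference points at."""
    table = REGISTRY.get(container_type)
    if table is None:
        raise KeyError(f"unknown container_type: {container_type!r}")
    # None mimics a dangling soft reference: no FK enforces existence.
    return table.get(container_id)
```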


@@ -0,0 +1,23 @@
"""merge heads
Revision ID: h8f6c2d4e5a9
Revises: c3d4e5f6a7b8, g7e5b1c3d4f8
Create Date: 2026-02-11 00:00:00.000000
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'h8f6c2d4e5a9'
down_revision = ('c3d4e5f6a7b8', 'g7e5b1c3d4f8')
branch_labels = None
depends_on = None
def upgrade() -> None:
pass
def downgrade() -> None:
pass


@@ -0,0 +1,98 @@
"""add glue layer tables (container_relations + menu_nodes)
Revision ID: i9g7d3e5f6
Revises: h8f6c2d4e5a9
Create Date: 2026-02-11
"""
from alembic import op
import sqlalchemy as sa
revision = 'i9g7d3e5f6'
down_revision = 'h8f6c2d4e5a9'
branch_labels = None
depends_on = None
def upgrade() -> None:
# --- container_relations ---
op.create_table(
'container_relations',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('parent_type', sa.String(32), nullable=False),
sa.Column('parent_id', sa.Integer(), nullable=False),
sa.Column('child_type', sa.String(32), nullable=False),
sa.Column('child_id', sa.Integer(), nullable=False),
sa.Column('sort_order', sa.Integer(), nullable=False, server_default='0'),
sa.Column('label', sa.String(255), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint(
'parent_type', 'parent_id', 'child_type', 'child_id',
name='uq_container_relations_parent_child',
),
)
op.create_index('ix_container_relations_parent', 'container_relations', ['parent_type', 'parent_id'])
op.create_index('ix_container_relations_child', 'container_relations', ['child_type', 'child_id'])
# --- menu_nodes ---
op.create_table(
'menu_nodes',
sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
sa.Column('container_type', sa.String(32), nullable=False),
sa.Column('container_id', sa.Integer(), nullable=False),
sa.Column('parent_id', sa.Integer(), nullable=True),
sa.Column('sort_order', sa.Integer(), nullable=False, server_default='0'),
sa.Column('depth', sa.Integer(), nullable=False, server_default='0'),
sa.Column('label', sa.String(255), nullable=False),
sa.Column('slug', sa.String(255), nullable=True),
sa.Column('href', sa.String(1024), nullable=True),
sa.Column('icon', sa.String(64), nullable=True),
sa.Column('feature_image', sa.Text(), nullable=True),
sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('now()'), nullable=False),
sa.Column('deleted_at', sa.DateTime(timezone=True), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.ForeignKeyConstraint(['parent_id'], ['menu_nodes.id'], ondelete='SET NULL'),
)
op.create_index('ix_menu_nodes_container', 'menu_nodes', ['container_type', 'container_id'])
op.create_index('ix_menu_nodes_parent_id', 'menu_nodes', ['parent_id'])
# --- Backfill container_relations from existing container-pattern tables ---
op.execute("""
INSERT INTO container_relations (parent_type, parent_id, child_type, child_id, sort_order)
SELECT 'page', container_id, 'calendar', id, 0
FROM calendars
WHERE deleted_at IS NULL AND container_type = 'page'
""")
op.execute("""
INSERT INTO container_relations (parent_type, parent_id, child_type, child_id, sort_order)
SELECT 'page', container_id, 'market', id, 0
FROM market_places
WHERE deleted_at IS NULL AND container_type = 'page'
""")
op.execute("""
INSERT INTO container_relations (parent_type, parent_id, child_type, child_id, sort_order)
SELECT 'page', container_id, 'page_config', id, 0
FROM page_configs
WHERE deleted_at IS NULL AND container_type = 'page'
""")
# --- Backfill menu_nodes from existing menu_items + posts ---
op.execute("""
INSERT INTO menu_nodes (container_type, container_id, label, slug, feature_image, sort_order)
SELECT 'page', mi.post_id, p.title, p.slug, p.feature_image, mi.sort_order
FROM menu_items mi
JOIN posts p ON mi.post_id = p.id
WHERE mi.deleted_at IS NULL
""")
def downgrade() -> None:
op.drop_index('ix_menu_nodes_parent_id', table_name='menu_nodes')
op.drop_index('ix_menu_nodes_container', table_name='menu_nodes')
op.drop_table('menu_nodes')
op.drop_index('ix_container_relations_child', table_name='container_relations')
op.drop_index('ix_container_relations_parent', table_name='container_relations')
op.drop_table('container_relations')
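menu_nodes stores a materialized `depth` alongside `parent_id`. Recomputing depth from the parent pointers (e.g. after a re-parent) can be sketched as:

```python
def compute_depths(nodes):
    """Fill node['depth'] as distance from a root (parent_id is None).
    Rows are dicts shaped like menu_nodes; cycles raise ValueError."""
    by_id = {n["id"]: n for n in nodes}

    def depth_of(node, seen=()):
        if node["parent_id"] is None:
            return 0
        if node["id"] in seen:
            raise ValueError("cycle in menu tree")
        parent = by_id[node["parent_id"]]
        return 1 + depth_of(parent, seen + (node["id"],))

    for n in nodes:
        n["depth"] = depth_of(n)
    return nodes
```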


@@ -0,0 +1,51 @@
"""drop cross-domain FK constraints (events → cart)
Merge the two remaining heads and remove:
- calendar_entries.order_id FK → orders.id
- tickets.order_id FK → orders.id
Columns are kept as plain integers.
Revision ID: j0h8e4f6g7
Revises: i9g7d3e5f6, f6d4a1b2c3e7
Create Date: 2026-02-14
"""
from alembic import op
import sqlalchemy as sa
revision = 'j0h8e4f6g7'
down_revision = ('i9g7d3e5f6', 'f6d4a1b2c3e7')
branch_labels = None
depends_on = None
def upgrade() -> None:
op.drop_constraint(
'fk_calendar_entries_order_id',
'calendar_entries',
type_='foreignkey',
)
op.drop_constraint(
'tickets_order_id_fkey',
'tickets',
type_='foreignkey',
)
def downgrade() -> None:
op.create_foreign_key(
'fk_calendar_entries_order_id',
'calendar_entries',
'orders',
['order_id'],
['id'],
ondelete='SET NULL',
)
op.create_foreign_key(
'tickets_order_id_fkey',
'tickets',
'orders',
['order_id'],
['id'],
ondelete='SET NULL',
)
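With the FK gone, `tickets.order_id` (and `calendar_entries.order_id`) is a plain integer, so referential checks move into the application. A hedged sketch of the kind of consistency scan that replaces the constraint (`order_exists` stands in for whatever lookup the cart service exposes; that protocol is an assumption):

```python
def dangling_order_refs(tickets, order_exists):
    """Return ids of tickets whose order_id no longer resolves.
    order_exists is an assumed callable backed by the cart service."""
    return [
        t["id"]
        for t in tickets
        if t.get("order_id") is not None and not order_exists(t["order_id"])
    ]
```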


@@ -0,0 +1,142 @@
"""add federation tables
Revision ID: k1i9f5g7h8
Revises: j0h8e4f6g7
Create Date: 2026-02-21
Creates:
- ap_actor_profiles — AP identity per user
- ap_activities — local + remote AP activities
- ap_followers — remote followers
- ap_inbox_items — raw incoming AP activities
- ap_anchors — OpenTimestamps merkle batches
- ipfs_pins — IPFS content tracking (platform-wide)
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import postgresql
revision = "k1i9f5g7h8"
down_revision = "j0h8e4f6g7"
branch_labels = None
depends_on = None
def upgrade() -> None:
# -- ap_anchors (referenced by ap_activities) ----------------------------
op.create_table(
"ap_anchors",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("merkle_root", sa.String(128), nullable=False),
sa.Column("tree_ipfs_cid", sa.String(128), nullable=True),
sa.Column("ots_proof_cid", sa.String(128), nullable=True),
sa.Column("activity_count", sa.Integer(), nullable=False, server_default="0"),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.Column("confirmed_at", sa.DateTime(timezone=True), nullable=True),
sa.Column("bitcoin_txid", sa.String(128), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
# -- ap_actor_profiles ---------------------------------------------------
op.create_table(
"ap_actor_profiles",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("preferred_username", sa.String(64), nullable=False),
sa.Column("display_name", sa.String(255), nullable=True),
sa.Column("summary", sa.Text(), nullable=True),
sa.Column("public_key_pem", sa.Text(), nullable=False),
sa.Column("private_key_pem", sa.Text(), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("preferred_username"),
sa.UniqueConstraint("user_id"),
)
op.create_index("ix_ap_actor_user_id", "ap_actor_profiles", ["user_id"], unique=True)
op.create_index("ix_ap_actor_username", "ap_actor_profiles", ["preferred_username"], unique=True)
# -- ap_activities -------------------------------------------------------
op.create_table(
"ap_activities",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("activity_id", sa.String(512), nullable=False),
sa.Column("activity_type", sa.String(64), nullable=False),
sa.Column("actor_profile_id", sa.Integer(), nullable=False),
sa.Column("object_type", sa.String(64), nullable=True),
sa.Column("object_data", postgresql.JSONB(), nullable=True),
sa.Column("published", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.Column("signature", postgresql.JSONB(), nullable=True),
sa.Column("is_local", sa.Boolean(), nullable=False, server_default="true"),
sa.Column("source_type", sa.String(64), nullable=True),
sa.Column("source_id", sa.Integer(), nullable=True),
sa.Column("ipfs_cid", sa.String(128), nullable=True),
sa.Column("anchor_id", sa.Integer(), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.ForeignKeyConstraint(["actor_profile_id"], ["ap_actor_profiles.id"], ondelete="CASCADE"),
sa.ForeignKeyConstraint(["anchor_id"], ["ap_anchors.id"], ondelete="SET NULL"),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("activity_id"),
)
op.create_index("ix_ap_activity_actor", "ap_activities", ["actor_profile_id"])
op.create_index("ix_ap_activity_source", "ap_activities", ["source_type", "source_id"])
op.create_index("ix_ap_activity_published", "ap_activities", ["published"])
# -- ap_followers --------------------------------------------------------
op.create_table(
"ap_followers",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("actor_profile_id", sa.Integer(), nullable=False),
sa.Column("follower_acct", sa.String(512), nullable=False),
sa.Column("follower_inbox", sa.String(512), nullable=False),
sa.Column("follower_actor_url", sa.String(512), nullable=False),
sa.Column("follower_public_key", sa.Text(), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.ForeignKeyConstraint(["actor_profile_id"], ["ap_actor_profiles.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("actor_profile_id", "follower_acct", name="uq_follower_acct"),
)
op.create_index("ix_ap_follower_actor", "ap_followers", ["actor_profile_id"])
# -- ap_inbox_items ------------------------------------------------------
op.create_table(
"ap_inbox_items",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("actor_profile_id", sa.Integer(), nullable=False),
sa.Column("raw_json", postgresql.JSONB(), nullable=False),
sa.Column("activity_type", sa.String(64), nullable=True),
sa.Column("from_actor", sa.String(512), nullable=True),
sa.Column("state", sa.String(20), nullable=False, server_default="pending"),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.Column("processed_at", sa.DateTime(timezone=True), nullable=True),
sa.ForeignKeyConstraint(["actor_profile_id"], ["ap_actor_profiles.id"], ondelete="CASCADE"),
sa.PrimaryKeyConstraint("id"),
)
op.create_index("ix_ap_inbox_state", "ap_inbox_items", ["state"])
op.create_index("ix_ap_inbox_actor", "ap_inbox_items", ["actor_profile_id"])
# -- ipfs_pins -----------------------------------------------------------
op.create_table(
"ipfs_pins",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("content_hash", sa.String(128), nullable=False),
sa.Column("ipfs_cid", sa.String(128), nullable=False),
sa.Column("pin_type", sa.String(64), nullable=False),
sa.Column("source_type", sa.String(64), nullable=True),
sa.Column("source_id", sa.Integer(), nullable=True),
sa.Column("size_bytes", sa.BigInteger(), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("ipfs_cid"),
)
op.create_index("ix_ipfs_pin_source", "ipfs_pins", ["source_type", "source_id"])
op.create_index("ix_ipfs_pin_cid", "ipfs_pins", ["ipfs_cid"], unique=True)
def downgrade() -> None:
op.drop_table("ipfs_pins")
op.drop_table("ap_inbox_items")
op.drop_table("ap_followers")
op.drop_table("ap_activities")
op.drop_table("ap_actor_profiles")
op.drop_table("ap_anchors")
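ap_anchors stores a `merkle_root` over a batch of activities for OpenTimestamps anchoring. One conventional construction is pairwise SHA-256 with the last node duplicated on odd-sized levels, Bitcoin-style; the exact scheme the app uses is not specified in this migration, so treat this as a sketch:

```python
import hashlib

def merkle_root(leaves):
    """Hex merkle root over a list of raw leaf bytes; duplicates the
    last node on odd-sized levels, as Bitcoin-style trees do."""
    if not leaves:
        raise ValueError("empty batch")
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [
            hashlib.sha256(level[i] + level[i + 1]).digest()
            for i in range(0, len(level), 2)
        ]
    return level[0].hex()
```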


@@ -0,0 +1,138 @@
"""add fediverse social tables
Revision ID: l2j0g6h8i9
Revises: k1i9f5g7h8
Create Date: 2026-02-22
Creates:
- ap_remote_actors — cached profiles of remote actors
- ap_following — outbound follows (local → remote)
- ap_remote_posts — ingested posts from remote actors
- ap_local_posts — native posts composed in federation UI
- ap_interactions — likes and boosts
- ap_notifications — follow/like/boost/mention/reply notifications
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import JSONB
revision = "l2j0g6h8i9"
down_revision = "k1i9f5g7h8"
branch_labels = None
depends_on = None
def upgrade() -> None:
# -- ap_remote_actors --
op.create_table(
"ap_remote_actors",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("actor_url", sa.String(512), unique=True, nullable=False),
sa.Column("inbox_url", sa.String(512), nullable=False),
sa.Column("shared_inbox_url", sa.String(512), nullable=True),
sa.Column("preferred_username", sa.String(255), nullable=False),
sa.Column("display_name", sa.String(255), nullable=True),
sa.Column("summary", sa.Text, nullable=True),
sa.Column("icon_url", sa.String(512), nullable=True),
sa.Column("public_key_pem", sa.Text, nullable=True),
sa.Column("domain", sa.String(255), nullable=False),
sa.Column("fetched_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
)
op.create_index("ix_ap_remote_actor_url", "ap_remote_actors", ["actor_url"], unique=True)
op.create_index("ix_ap_remote_actor_domain", "ap_remote_actors", ["domain"])
# -- ap_following --
op.create_table(
"ap_following",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("actor_profile_id", sa.Integer, sa.ForeignKey("ap_actor_profiles.id", ondelete="CASCADE"), nullable=False),
sa.Column("remote_actor_id", sa.Integer, sa.ForeignKey("ap_remote_actors.id", ondelete="CASCADE"), nullable=False),
sa.Column("state", sa.String(20), nullable=False, server_default="pending"),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column("accepted_at", sa.DateTime(timezone=True), nullable=True),
sa.UniqueConstraint("actor_profile_id", "remote_actor_id", name="uq_following"),
)
op.create_index("ix_ap_following_actor", "ap_following", ["actor_profile_id"])
op.create_index("ix_ap_following_remote", "ap_following", ["remote_actor_id"])
# -- ap_remote_posts --
op.create_table(
"ap_remote_posts",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("remote_actor_id", sa.Integer, sa.ForeignKey("ap_remote_actors.id", ondelete="CASCADE"), nullable=False),
sa.Column("activity_id", sa.String(512), unique=True, nullable=False),
sa.Column("object_id", sa.String(512), unique=True, nullable=False),
sa.Column("object_type", sa.String(64), nullable=False, server_default="Note"),
sa.Column("content", sa.Text, nullable=True),
sa.Column("summary", sa.Text, nullable=True),
sa.Column("url", sa.String(512), nullable=True),
sa.Column("attachment_data", JSONB, nullable=True),
sa.Column("tag_data", JSONB, nullable=True),
sa.Column("in_reply_to", sa.String(512), nullable=True),
sa.Column("conversation", sa.String(512), nullable=True),
sa.Column("published", sa.DateTime(timezone=True), nullable=True),
sa.Column("fetched_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
)
op.create_index("ix_ap_remote_post_actor", "ap_remote_posts", ["remote_actor_id"])
op.create_index("ix_ap_remote_post_published", "ap_remote_posts", ["published"])
op.create_index("ix_ap_remote_post_object", "ap_remote_posts", ["object_id"], unique=True)
# -- ap_local_posts --
op.create_table(
"ap_local_posts",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("actor_profile_id", sa.Integer, sa.ForeignKey("ap_actor_profiles.id", ondelete="CASCADE"), nullable=False),
sa.Column("content", sa.Text, nullable=False),
sa.Column("visibility", sa.String(20), nullable=False, server_default="public"),
sa.Column("in_reply_to", sa.String(512), nullable=True),
sa.Column("published", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.Column("updated_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
)
op.create_index("ix_ap_local_post_actor", "ap_local_posts", ["actor_profile_id"])
op.create_index("ix_ap_local_post_published", "ap_local_posts", ["published"])
# -- ap_interactions --
op.create_table(
"ap_interactions",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("actor_profile_id", sa.Integer, sa.ForeignKey("ap_actor_profiles.id", ondelete="CASCADE"), nullable=True),
sa.Column("remote_actor_id", sa.Integer, sa.ForeignKey("ap_remote_actors.id", ondelete="CASCADE"), nullable=True),
sa.Column("post_type", sa.String(20), nullable=False),
sa.Column("post_id", sa.Integer, nullable=False),
sa.Column("interaction_type", sa.String(20), nullable=False),
sa.Column("activity_id", sa.String(512), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
)
op.create_index("ix_ap_interaction_post", "ap_interactions", ["post_type", "post_id"])
op.create_index("ix_ap_interaction_actor", "ap_interactions", ["actor_profile_id"])
op.create_index("ix_ap_interaction_remote", "ap_interactions", ["remote_actor_id"])
# -- ap_notifications --
op.create_table(
"ap_notifications",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("actor_profile_id", sa.Integer, sa.ForeignKey("ap_actor_profiles.id", ondelete="CASCADE"), nullable=False),
sa.Column("notification_type", sa.String(20), nullable=False),
sa.Column("from_remote_actor_id", sa.Integer, sa.ForeignKey("ap_remote_actors.id", ondelete="SET NULL"), nullable=True),
sa.Column("from_actor_profile_id", sa.Integer, sa.ForeignKey("ap_actor_profiles.id", ondelete="SET NULL"), nullable=True),
sa.Column("target_activity_id", sa.Integer, sa.ForeignKey("ap_activities.id", ondelete="SET NULL"), nullable=True),
sa.Column("target_remote_post_id", sa.Integer, sa.ForeignKey("ap_remote_posts.id", ondelete="SET NULL"), nullable=True),
sa.Column("read", sa.Boolean, nullable=False, server_default="false"),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
)
op.create_index("ix_ap_notification_actor", "ap_notifications", ["actor_profile_id"])
op.create_index("ix_ap_notification_read", "ap_notifications", ["actor_profile_id", "read"])
op.create_index("ix_ap_notification_created", "ap_notifications", ["created_at"])
def downgrade() -> None:
op.drop_table("ap_notifications")
op.drop_table("ap_interactions")
op.drop_table("ap_local_posts")
op.drop_table("ap_remote_posts")
op.drop_table("ap_following")
op.drop_table("ap_remote_actors")


@@ -0,0 +1,113 @@
"""add unified event bus columns to ap_activities
Revision ID: m3k1h7i9j0
Revises: l2j0g6h8i9
Create Date: 2026-02-22
Adds processing and visibility columns so ap_activities can serve as the
unified event bus for both internal domain events and federation delivery.
"""
revision = "m3k1h7i9j0"
down_revision = "l2j0g6h8i9"
branch_labels = None
depends_on = None
from alembic import op
import sqlalchemy as sa
def upgrade() -> None:
# Add new columns with defaults so existing rows stay valid
op.add_column(
"ap_activities",
sa.Column("actor_uri", sa.String(512), nullable=True),
)
op.add_column(
"ap_activities",
sa.Column(
"visibility", sa.String(20),
nullable=False, server_default="public",
),
)
op.add_column(
"ap_activities",
sa.Column(
"process_state", sa.String(20),
nullable=False, server_default="completed",
),
)
op.add_column(
"ap_activities",
sa.Column(
"process_attempts", sa.Integer(),
nullable=False, server_default="0",
),
)
op.add_column(
"ap_activities",
sa.Column(
"process_max_attempts", sa.Integer(),
nullable=False, server_default="5",
),
)
op.add_column(
"ap_activities",
sa.Column("process_error", sa.Text(), nullable=True),
)
op.add_column(
"ap_activities",
sa.Column(
"processed_at", sa.DateTime(timezone=True), nullable=True,
),
)
# Backfill actor_uri from the related actor_profile
op.execute(
"""
UPDATE ap_activities a
SET actor_uri = CONCAT(
'https://',
COALESCE(current_setting('app.ap_domain', true), 'rose-ash.com'),
'/users/',
p.preferred_username
)
FROM ap_actor_profiles p
WHERE a.actor_profile_id = p.id
AND a.actor_uri IS NULL
"""
)
# Make actor_profile_id nullable (internal events have no actor profile)
op.alter_column(
"ap_activities", "actor_profile_id",
existing_type=sa.Integer(),
nullable=True,
)
# Index for processor polling
op.create_index(
"ix_ap_activity_process", "ap_activities", ["process_state"],
)
def downgrade() -> None:
op.drop_index("ix_ap_activity_process", table_name="ap_activities")
# Restore actor_profile_id NOT NULL (remove any rows without it first)
op.execute(
"DELETE FROM ap_activities WHERE actor_profile_id IS NULL"
)
op.alter_column(
"ap_activities", "actor_profile_id",
existing_type=sa.Integer(),
nullable=False,
)
op.drop_column("ap_activities", "processed_at")
op.drop_column("ap_activities", "process_error")
op.drop_column("ap_activities", "process_max_attempts")
op.drop_column("ap_activities", "process_attempts")
op.drop_column("ap_activities", "process_state")
op.drop_column("ap_activities", "visibility")
op.drop_column("ap_activities", "actor_uri")


@@ -0,0 +1,46 @@
"""drop domain_events table
Revision ID: n4l2i8j0k1
Revises: m3k1h7i9j0
Create Date: 2026-02-22
The domain_events table is no longer used — all events now flow through
ap_activities with the unified activity bus.
"""
revision = "n4l2i8j0k1"
down_revision = "m3k1h7i9j0"
branch_labels = None
depends_on = None
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects.postgresql import JSONB
def upgrade() -> None:
op.drop_index("ix_domain_events_state", table_name="domain_events")
op.drop_index("ix_domain_events_event_type", table_name="domain_events")
op.drop_table("domain_events")
def downgrade() -> None:
op.create_table(
"domain_events",
sa.Column("id", sa.Integer(), primary_key=True, autoincrement=True),
sa.Column("event_type", sa.String(128), nullable=False),
sa.Column("aggregate_type", sa.String(64), nullable=False),
sa.Column("aggregate_id", sa.Integer(), nullable=False),
sa.Column("payload", JSONB(), nullable=True),
sa.Column("state", sa.String(20), nullable=False, server_default="pending"),
sa.Column("attempts", sa.Integer(), nullable=False, server_default="0"),
sa.Column("max_attempts", sa.Integer(), nullable=False, server_default="5"),
sa.Column("last_error", sa.Text(), nullable=True),
sa.Column(
"created_at", sa.DateTime(timezone=True),
nullable=False, server_default=sa.func.now(),
),
sa.Column("processed_at", sa.DateTime(timezone=True), nullable=True),
)
op.create_index("ix_domain_events_event_type", "domain_events", ["event_type"])
op.create_index("ix_domain_events_state", "domain_events", ["state"])


@@ -0,0 +1,35 @@
"""Add origin_app column to ap_activities
Revision ID: o5m3j9k1l2
Revises: n4l2i8j0k1
Create Date: 2026-02-22
"""
from alembic import op
import sqlalchemy as sa
from sqlalchemy import inspect as sa_inspect
revision = "o5m3j9k1l2"
down_revision = "n4l2i8j0k1"
branch_labels = None
depends_on = None
def upgrade() -> None:
conn = op.get_bind()
inspector = sa_inspect(conn)
columns = [c["name"] for c in inspector.get_columns("ap_activities")]
if "origin_app" not in columns:
op.add_column(
"ap_activities",
sa.Column("origin_app", sa.String(64), nullable=True),
)
# Index is idempotent with if_not_exists
op.create_index(
"ix_ap_activity_origin_app", "ap_activities", ["origin_app"],
if_not_exists=True,
)
def downgrade() -> None:
op.drop_index("ix_ap_activity_origin_app", table_name="ap_activities")
op.drop_column("ap_activities", "origin_app")


@@ -0,0 +1,37 @@
"""Add oauth_codes table
Revision ID: p6n4k0l2m3
Revises: o5m3j9k1l2
Create Date: 2026-02-23
"""
from alembic import op
import sqlalchemy as sa
revision = "p6n4k0l2m3"
down_revision = "o5m3j9k1l2"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"oauth_codes",
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
sa.Column("code", sa.String(128), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("client_id", sa.String(64), nullable=False),
sa.Column("redirect_uri", sa.String(512), nullable=False),
sa.Column("expires_at", sa.DateTime(timezone=True), nullable=False),
sa.Column("used_at", sa.DateTime(timezone=True), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), server_default=sa.func.now(), nullable=False),
sa.PrimaryKeyConstraint("id"),
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
)
op.create_index("ix_oauth_code_code", "oauth_codes", ["code"], unique=True)
op.create_index("ix_oauth_code_user", "oauth_codes", ["user_id"])
def downgrade() -> None:
op.drop_index("ix_oauth_code_user", table_name="oauth_codes")
op.drop_index("ix_oauth_code_code", table_name="oauth_codes")
op.drop_table("oauth_codes")


@@ -0,0 +1,41 @@
"""Add oauth_grants table
Revision ID: q7o5l1m3n4
Revises: p6n4k0l2m3
"""
from alembic import op
import sqlalchemy as sa
revision = "q7o5l1m3n4"
down_revision = "p6n4k0l2m3"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"oauth_grants",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("token", sa.String(128), unique=True, nullable=False),
sa.Column("user_id", sa.Integer, sa.ForeignKey("users.id", ondelete="CASCADE"), nullable=False),
sa.Column("client_id", sa.String(64), nullable=False),
sa.Column("issuer_session", sa.String(128), nullable=False),
sa.Column("device_id", sa.String(128), nullable=True),
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.Column("revoked_at", sa.DateTime(timezone=True), nullable=True),
)
op.create_index("ix_oauth_grant_token", "oauth_grants", ["token"], unique=True)
op.create_index("ix_oauth_grant_issuer", "oauth_grants", ["issuer_session"])
op.create_index("ix_oauth_grant_user", "oauth_grants", ["user_id"])
op.create_index("ix_oauth_grant_device", "oauth_grants", ["device_id", "client_id"])
# Add grant_token column to oauth_codes to link code → grant
op.add_column("oauth_codes", sa.Column("grant_token", sa.String(128), nullable=True))
def downgrade():
op.drop_column("oauth_codes", "grant_token")
op.drop_index("ix_oauth_grant_user", table_name="oauth_grants")
op.drop_index("ix_oauth_grant_issuer", table_name="oauth_grants")
op.drop_index("ix_oauth_grant_token", table_name="oauth_grants")
op.drop_table("oauth_grants")


@@ -0,0 +1,29 @@
"""Add device_id column to oauth_grants
Revision ID: r8p6m2n4o5
Revises: q7o5l1m3n4
"""
from alembic import op
import sqlalchemy as sa
revision = "r8p6m2n4o5"
down_revision = "q7o5l1m3n4"
branch_labels = None
depends_on = None
def upgrade():
    # device_id was added to the create_table migration after it had already
    # run, so the column is missing from the live DB. Guard with an inspector
    # check so fresh databases (where q7o5l1m3n4 already created both the
    # column and the index) don't fail on a duplicate.
    conn = op.get_bind()
    inspector = sa.inspect(conn)
    columns = [c["name"] for c in inspector.get_columns("oauth_grants")]
    if "device_id" not in columns:
        op.add_column(
            "oauth_grants",
            sa.Column("device_id", sa.String(128), nullable=True),
        )
    indexes = [i["name"] for i in inspector.get_indexes("oauth_grants")]
    if "ix_oauth_grant_device" not in indexes:
        op.create_index(
            "ix_oauth_grant_device", "oauth_grants", ["device_id", "client_id"]
        )
def downgrade():
op.drop_index("ix_oauth_grant_device", table_name="oauth_grants")
op.drop_column("oauth_grants", "device_id")


@@ -0,0 +1,30 @@
"""Add ap_delivery_log table for idempotent federation delivery
Revision ID: s9q7n3o5p6
Revises: r8p6m2n4o5
"""
from alembic import op
import sqlalchemy as sa
revision = "s9q7n3o5p6"
down_revision = "r8p6m2n4o5"
branch_labels = None
depends_on = None
def upgrade():
op.create_table(
"ap_delivery_log",
sa.Column("id", sa.Integer, primary_key=True, autoincrement=True),
sa.Column("activity_id", sa.Integer, sa.ForeignKey("ap_activities.id", ondelete="CASCADE"), nullable=False),
sa.Column("inbox_url", sa.String(512), nullable=False),
sa.Column("status_code", sa.Integer, nullable=True),
sa.Column("delivered_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.func.now()),
sa.UniqueConstraint("activity_id", "inbox_url", name="uq_delivery_activity_inbox"),
)
op.create_index("ix_ap_delivery_activity", "ap_delivery_log", ["activity_id"])
def downgrade():
op.drop_index("ix_ap_delivery_activity", table_name="ap_delivery_log")
op.drop_table("ap_delivery_log")


@@ -0,0 +1,51 @@
"""Add app_domain to ap_followers for per-app AP actors
Revision ID: t0r8n4o6p7
Revises: s9q7n3o5p6
"""
from alembic import op
import sqlalchemy as sa
revision = "t0r8n4o6p7"
down_revision = "s9q7n3o5p6"
branch_labels = None
depends_on = None
def upgrade():
# Add column as nullable first so we can backfill
op.add_column(
"ap_followers",
sa.Column("app_domain", sa.String(64), nullable=True),
)
# Backfill existing rows: all current followers are aggregate
op.execute("UPDATE ap_followers SET app_domain = 'federation' WHERE app_domain IS NULL")
# Now make it NOT NULL with a default
op.alter_column(
"ap_followers", "app_domain",
nullable=False, server_default="federation",
)
# Replace old unique constraint with one that includes app_domain
op.drop_constraint("uq_follower_acct", "ap_followers", type_="unique")
op.create_unique_constraint(
"uq_follower_acct_app",
"ap_followers",
["actor_profile_id", "follower_acct", "app_domain"],
)
op.create_index(
"ix_ap_follower_app_domain",
"ap_followers",
["actor_profile_id", "app_domain"],
)
def downgrade():
op.drop_index("ix_ap_follower_app_domain", table_name="ap_followers")
op.drop_constraint("uq_follower_acct_app", "ap_followers", type_="unique")
op.create_unique_constraint(
"uq_follower_acct",
"ap_followers",
["actor_profile_id", "follower_acct"],
)
op.alter_column("ap_followers", "app_domain", nullable=True, server_default=None)
op.drop_column("ap_followers", "app_domain")


@@ -0,0 +1,33 @@
"""Add app_domain to ap_delivery_log for per-domain idempotency
Revision ID: u1s9o5p7q8
Revises: t0r8n4o6p7
"""
from alembic import op
import sqlalchemy as sa
revision = "u1s9o5p7q8"
down_revision = "t0r8n4o6p7"
def upgrade() -> None:
op.add_column(
"ap_delivery_log",
sa.Column("app_domain", sa.String(128), nullable=False, server_default="federation"),
)
op.drop_constraint("uq_delivery_activity_inbox", "ap_delivery_log", type_="unique")
op.create_unique_constraint(
"uq_delivery_activity_inbox_domain",
"ap_delivery_log",
["activity_id", "inbox_url", "app_domain"],
)
def downgrade() -> None:
op.drop_constraint("uq_delivery_activity_inbox_domain", "ap_delivery_log", type_="unique")
op.drop_column("ap_delivery_log", "app_domain")
op.create_unique_constraint(
"uq_delivery_activity_inbox",
"ap_delivery_log",
["activity_id", "inbox_url"],
)


@@ -0,0 +1 @@
# suma_browser package


@@ -0,0 +1,12 @@
# The monolith has been split into three apps (apps/blog, apps/market, apps/cart).
# This package remains for shared infrastructure modules (middleware, redis_cacher,
# csrf, errors, authz, filters, utils, bp/*).
#
# To run individual apps:
# hypercorn apps.blog.app:app --bind 0.0.0.0:8000
# hypercorn apps.market.app:app --bind 0.0.0.0:8001
# hypercorn apps.cart.app:app --bind 0.0.0.0:8002
#
# Legacy single-process:
# hypercorn suma_browser.app.app:app --bind 0.0.0.0:8000
# (runs the old monolith from app.py, which still works)

shared/browser/app/authz.py

@@ -0,0 +1,152 @@
from __future__ import annotations
from functools import wraps
from typing import Any, Dict, Iterable, Optional
import inspect
from quart import g, abort, redirect, request, current_app
from shared.infrastructure.urls import login_url
def require_rights(*rights: str, any_of: bool = True):
"""
Decorator for routes that require certain user rights.
"""
if not rights:
raise ValueError("require_rights needs at least one right name")
required_set = frozenset(rights)
def decorator(view_func):
@wraps(view_func)
async def wrapper(*args: Any, **kwargs: Any):
# Not logged in → go to login, with ?next=<current path>
user = g.get("user")
if not user:
return redirect(login_url(request.url))
rights_dict = g.get("rights") or {}
if any_of:
allowed = any(rights_dict.get(name) for name in required_set)
else:
allowed = all(rights_dict.get(name) for name in required_set)
if not allowed:
abort(403)
result = view_func(*args, **kwargs)
if inspect.isawaitable(result):
return await result
return result
# ---- expose access requirements on the wrapper ----
wrapper.__access_requires__ = {
"rights": required_set,
"any_of": any_of,
}
return wrapper
return decorator
def require_login(view_func):
"""
Decorator for routes that require any logged-in user.
"""
@wraps(view_func)
async def wrapper(*args: Any, **kwargs: Any):
user = g.get("user")
if not user:
return redirect(login_url(request.url))
result = view_func(*args, **kwargs)
if inspect.isawaitable(result):
return await result
return result
return wrapper
def require_admin(view_func=None):
"""
Shortcut for routes that require the 'admin' right.
"""
if view_func is None:
return require_rights("admin")
return require_rights("admin")(view_func)
def require_post_author(view_func):
"""Allow admin or post owner."""
@wraps(view_func)
async def wrapper(*args, **kwargs):
user = g.get("user")
if not user:
return redirect(login_url(request.url))
is_admin = bool((g.get("rights") or {}).get("admin"))
if is_admin:
result = view_func(*args, **kwargs)
if inspect.isawaitable(result):
return await result
return result
post = getattr(g, "post_data", {}).get("original_post")
if post and post.user_id == user.id:
result = view_func(*args, **kwargs)
if inspect.isawaitable(result):
return await result
return result
abort(403)
return wrapper
def _get_access_meta(view_func) -> Optional[Dict[str, Any]]:
"""
Walk the wrapper chain looking for __access_requires__ metadata.
"""
func = view_func
seen: set[int] = set()
while func is not None and id(func) not in seen:
seen.add(id(func))
meta = getattr(func, "__access_requires__", None)
if meta is not None:
return meta
func = getattr(func, "__wrapped__", None)
return None
def has_access(endpoint: str) -> bool:
"""
Return True if the current user has access to the given endpoint.
Example:
has_access("settings.home")
has_access("settings.clear_cache_view")
"""
view = current_app.view_functions.get(endpoint)
if view is None:
# Unknown endpoint: be conservative
return False
meta = _get_access_meta(view)
# If the route has no rights metadata, treat it as public:
if meta is None:
return True
required: Iterable[str] = meta["rights"]
any_of: bool = meta["any_of"]
# Must be in a request context; if no user, they don't have access
user = g.get("user")
if not user:
return False
rights_dict = g.get("rights") or {}
if any_of:
return any(rights_dict.get(name) for name in required)
else:
return all(rights_dict.get(name) for name in required)
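The metadata lookup that `has_access` relies on can be sketched without Quart. This is a condensed, framework-free restatement, not the module itself; `log_calls` is a hypothetical extra decorator (not part of the codebase) showing that the metadata stays reachable through stacked decorators, because `functools.wraps` both copies the attribute dict and sets `__wrapped__`:

```python
# Minimal sketch of the __access_requires__ lookup pattern used above.
from functools import wraps

def require_rights(*rights, any_of=True):
    def decorator(view):
        @wraps(view)
        def wrapper(*a, **kw):
            return view(*a, **kw)
        # Attach the access metadata the same way the real decorator does
        wrapper.__access_requires__ = {"rights": frozenset(rights), "any_of": any_of}
        return wrapper
    return decorator

def get_access_meta(view):
    # Walk the __wrapped__ chain until metadata is found (mirrors _get_access_meta)
    seen = set()
    while view is not None and id(view) not in seen:
        seen.add(id(view))
        meta = getattr(view, "__access_requires__", None)
        if meta is not None:
            return meta
        view = getattr(view, "__wrapped__", None)
    return None

def log_calls(view):
    # Hypothetical extra decorator; wraps() keeps the metadata reachable
    @wraps(view)
    def wrapper(*a, **kw):
        return view(*a, **kw)
    return wrapper

@log_calls
@require_rights("admin", "editor")
def settings_home():
    return "ok"

meta = get_access_meta(settings_home)
```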


@@ -0,0 +1,99 @@
from __future__ import annotations
import secrets
from typing import Callable, Awaitable, Optional
from quart import (
abort,
current_app,
request,
session as qsession,
)
SAFE_METHODS = {"GET", "HEAD", "OPTIONS", "TRACE"}
def generate_csrf_token() -> str:
"""
Per-session CSRF token.
In Jinja:
<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
"""
token = qsession.get("csrf_token")
if not token:
token = secrets.token_urlsafe(32)
qsession["csrf_token"] = token
return token
def _is_exempt_endpoint() -> bool:
endpoint = request.endpoint
if not endpoint:
return False
view = current_app.view_functions.get(endpoint)
# Walk decorator stack (__wrapped__) to find csrf_exempt
while view is not None:
if getattr(view, "_csrf_exempt", False):
return True
view = getattr(view, "__wrapped__", None)
return False
async def protect() -> None:
"""
Enforce CSRF on unsafe methods.
Supports:
* Forms: hidden input "csrf_token"
* JSON: "csrf_token" or "csrfToken" field
* HTMX/AJAX: "X-CSRFToken" or "X-CSRF-Token" header
"""
if request.method in SAFE_METHODS:
return
if _is_exempt_endpoint():
return
session_token = qsession.get("csrf_token")
if not session_token:
abort(400, "Missing CSRF session token")
supplied_token: Optional[str] = None
# JSON body
if request.mimetype == "application/json":
data = await request.get_json(silent=True) or {}
supplied_token = data.get("csrf_token") or data.get("csrfToken")
# Form body
if not supplied_token and request.mimetype != "application/json":
form = await request.form
supplied_token = form.get("csrf_token")
# Headers (HTMX / fetch)
if not supplied_token:
supplied_token = (
request.headers.get("X-CSRFToken")
or request.headers.get("X-CSRF-Token")
)
if not supplied_token or supplied_token != session_token:
abort(400, "Invalid CSRF token")
def csrf_exempt(view: Callable[..., Awaitable]) -> Callable[..., Awaitable]:
"""
Mark a view as CSRF-exempt.
from suma_browser.app.csrf import csrf_exempt
@csrf_exempt
@blueprint.post("/hook")
async def webhook():
...
"""
setattr(view, "_csrf_exempt", True)
return view
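The token flow in `protect()` can be restated without Quart as a pure function over the request parts. This is an illustrative sketch, not the module's API; it also uses `secrets.compare_digest` for the comparison, a small hardening over the plain `!=` above:

```python
# Framework-free sketch of the CSRF check: session token vs supplied token.
import secrets

SAFE_METHODS = {"GET", "HEAD", "OPTIONS", "TRACE"}

def ensure_token(session: dict) -> str:
    # Same lazy per-session generation as generate_csrf_token()
    if "csrf_token" not in session:
        session["csrf_token"] = secrets.token_urlsafe(32)
    return session["csrf_token"]

def check(method, session, form=None, json_body=None, headers=None) -> bool:
    """Return True if the request would pass CSRF protection."""
    if method in SAFE_METHODS:
        return True
    expected = session.get("csrf_token")
    if not expected:
        return False
    supplied = None
    if json_body:  # JSON body: either field spelling
        supplied = json_body.get("csrf_token") or json_body.get("csrfToken")
    if not supplied and form:  # form body
        supplied = form.get("csrf_token")
    if not supplied and headers:  # HTMX / fetch headers
        supplied = headers.get("X-CSRFToken") or headers.get("X-CSRF-Token")
    return bool(supplied) and secrets.compare_digest(supplied, expected)

session = {}
token = ensure_token(session)
```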


@@ -0,0 +1,126 @@
from werkzeug.exceptions import HTTPException
from shared.utils import hx_fragment_request
from quart import (
request,
render_template,
make_response,
current_app
)
from markupsafe import escape
class AppError(ValueError):
"""
Base class for app-level, client-safe errors.
Behaves like ValueError so existing except ValueError: still works.
"""
status_code: int = 400
def __init__(self, message, *, status_code: int | None = None):
# Support a single message or a list/tuple of messages
if isinstance(message, (list, tuple, set)):
self.messages = [str(m) for m in message]
msg = self.messages[0] if self.messages else ""
else:
self.messages = [str(message)]
msg = str(message)
super().__init__(msg)
if status_code is not None:
self.status_code = status_code
def errors(app):
def _info(e):
return {
"exception": e,
"method": request.method,
"url": str(request.url),
"base_url": str(request.base_url),
"root_path": request.root_path,
"path": request.path,
"full_path": request.full_path,
"endpoint": request.endpoint,
"url_rule": str(request.url_rule) if request.url_rule else None,
"headers": {k: v for k, v in request.headers.items()
if k.lower().startswith("x-forwarded") or k in ("Host",)},
}
@app.errorhandler(404)
async def not_found(e):
current_app.logger.warning("404 %s", _info(e))
if hx_fragment_request():
html = await render_template(
"_types/root/exceptions/hx/_.html",
errnum='404'
)
else:
html = await render_template(
"_types/root/exceptions/_.html",
errnum='404',
)
return await make_response(html, 404)
@app.errorhandler(403)
async def not_allowed(e):
current_app.logger.warning("403 %s", _info(e))
if hx_fragment_request():
html = await render_template(
"_types/root/exceptions/hx/_.html",
errnum='403'
)
else:
html = await render_template(
"_types/root/exceptions/_.html",
errnum='403',
)
return await make_response(html, 403)
@app.errorhandler(AppError)
async def app_error(e: AppError):
# App-level, client-safe errors
current_app.logger.info("AppError %s", _info(e))
status = getattr(e, "status_code", 400)
messages = getattr(e, "messages", [str(e)])
if request.headers.get("HX-Request") == "true":
# Build a little styled <ul><li>...</li></ul> snippet
lis = "".join(
f"<li>{escape(m)}</li>"
for m in messages if m
)
html = (
"<ul class='list-disc pl-5 space-y-1 text-sm text-red-600'>"
f"{lis}"
"</ul>"
)
return await make_response(html, status)
# Non-HTMX: show a nicer page with error messages
html = await render_template(
"_types/root/exceptions/app_error.html",
messages=messages,
)
return await make_response(html, status)
@app.errorhandler(Exception)
async def error(e):
current_app.logger.exception("Exception %s", _info(e))
status = 500
if isinstance(e, HTTPException):
status = e.code or 500
if request.headers.get("HX-Request") == "true":
# Generic message for unexpected/untrusted errors
return await make_response(
"Something went wrong. Please try again.",
status,
)
html = await render_template("_types/root/exceptions/error.html")
return await make_response(html, status)


@@ -0,0 +1,17 @@
def register(app):
from .highlight import highlight
app.jinja_env.filters["highlight"] = highlight
from .qs import register as qs
from .url_join import register as url_join
from .combine import register as combine
from .currency import register as currency
from .truncate import register as truncate
    from .getattr import register as getattr_filter  # avoid shadowing builtins.getattr
    qs(app)
    url_join(app)
    combine(app)
    currency(app)
    getattr_filter(app)
    # truncate(app)


@@ -0,0 +1,25 @@
from __future__ import annotations
from typing import Any, Mapping
def _deep_merge(dst: dict, src: Mapping) -> dict:
out = dict(dst)
for k, v in src.items():
if isinstance(v, Mapping) and isinstance(out.get(k), Mapping):
out[k] = _deep_merge(out[k], v) # type: ignore[arg-type]
else:
out[k] = v
return out
def register(app):
@app.template_filter("combine")
def combine_filter(a: Any, b: Any, deep: bool = False, drop_none: bool = False) -> Any:
"""
Jinja filter: merge two dict-like objects.
- Non-dict inputs: returns `a` unchanged.
- If drop_none=True, keys in `b` with value None are ignored.
- If deep=True, nested dicts are merged recursively.
"""
if not isinstance(a, Mapping) or not isinstance(b, Mapping):
return a
b2 = {k: v for k, v in b.items() if not (drop_none and v is None)}
return _deep_merge(a, b2) if deep else {**a, **b2}
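The difference between the shallow and deep modes is easiest to see on a nested dict. A condensed restatement of the merge helper, with illustrative data:

```python
# Standalone sketch of the combine filter's merge semantics (no Jinja needed).
from typing import Mapping

def deep_merge(dst: dict, src: Mapping) -> dict:
    out = dict(dst)
    for k, v in src.items():
        if isinstance(v, Mapping) and isinstance(out.get(k), Mapping):
            out[k] = deep_merge(out[k], v)  # recurse into nested dicts
        else:
            out[k] = v
    return out

a = {"ui": {"theme": "dark", "font": "mono"}, "page": 1}
b = {"ui": {"theme": "light"}, "page": None}

shallow = {**a, **b}       # deep=False: b's "ui" replaces a's wholesale
deep = deep_merge(a, b)    # deep=True: nested dicts merged key-by-key
drop_none = {k: v for k, v in b.items() if v is not None}  # drop_none=True pre-filter
```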


@@ -0,0 +1,12 @@
from decimal import Decimal
def register(app):
@app.template_filter("currency")
def currency_filter(value, code="GBP"):
if value is None:
return ""
# ensure decimal-ish
if isinstance(value, float):
value = Decimal(str(value))
symbol = "£" if code == "GBP" else code
return f"{symbol}{value:.2f}"


@@ -0,0 +1,6 @@
def register(app):
@app.template_filter("getattr")
def jinja_getattr(obj, name, default=None):
# Safe getattr: returns default if the attribute is missing
return getattr(obj, name, default)


@@ -0,0 +1,21 @@
# ---------- misc helpers / filters ----------
from markupsafe import Markup, escape
def highlight(text: str, needle: str, cls: str = "bg-yellow-200 rounded") -> Markup:
"""
Wraps case-insensitive matches of `needle` inside <mark class="...">.
Escapes everything safely.
"""
import re
if not text or not needle:
return Markup(escape(text or ""))
pattern = re.compile(re.escape(needle), re.IGNORECASE)
def repl(m: re.Match) -> str:
return f'<mark class="{escape(cls)}">{escape(m.group(0))}</mark>'
esc = escape(text)
    result = pattern.sub(repl, esc)
return Markup(result)
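Because matching happens on the already-escaped text, surrounding markup characters stay escaped while only the match is wrapped. A behaviour sketch using stdlib `html.escape` in place of `markupsafe.escape` (a stand-in for illustration only):

```python
# Behaviour sketch of the highlight filter: case-insensitive match wrapping.
import re
from html import escape  # stands in for markupsafe.escape in this sketch

def highlight(text: str, needle: str, cls: str = "bg-yellow-200 rounded") -> str:
    if not text or not needle:
        return escape(text or "")
    pattern = re.compile(re.escape(needle), re.IGNORECASE)
    def repl(m):
        return f'<mark class="{escape(cls)}">{escape(m.group(0))}</mark>'
    # Escape first, then wrap matches found in the escaped text
    return pattern.sub(repl, escape(text))

out = highlight("Hello <World>", "world")
```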


@@ -0,0 +1,13 @@
from typing import Dict
from quart import g
def register(app):
    @app.template_filter("qs")
    def qs_filter(params: Dict):
        # `params` (was `dict`) avoids shadowing the builtin
        factory = getattr(g, "makeqs_factory", None)
        if factory:
            return factory()(**params)
        return ""

View File

@@ -0,0 +1,78 @@
"""
Shared query-string primitives used by blog, market, and order qs modules.
"""
from __future__ import annotations
from urllib.parse import urlencode
# Sentinel meaning "leave value as-is" (used as default arg in makeqs)
KEEP = object()
def _iterify(x):
"""Normalize *x* to a list: None → [], scalar → [scalar], iterable → as-is."""
if x is None:
return []
if isinstance(x, (list, tuple, set)):
return x
return [x]
def _norm(s: str) -> str:
"""Strip + lowercase — used for case-insensitive filter dedup."""
return s.strip().lower()
def make_filter_set(
base: list[str],
add,
remove,
clear_filters: bool,
*,
single_select: bool = False,
) -> list[str]:
"""
Build a deduplicated, sorted filter list.
Parameters
----------
base : list[str]
Current filter values.
add : str | list | None
Value(s) to add.
remove : str | list | None
Value(s) to remove.
clear_filters : bool
If True, start from empty instead of *base*.
single_select : bool
If True, *add* **replaces** the list (blog tags/authors).
If False, *add* is **appended** (market brands/stickers/labels).
"""
add_list = [s for s in _iterify(add) if s is not None]
if single_select:
# Blog-style: adding replaces the entire set
if add_list:
table = {_norm(s): s for s in add_list}
else:
table = {_norm(s): s for s in base if not clear_filters}
else:
# Market-style: adding appends to the existing set
table = {_norm(s): s for s in base if not clear_filters}
for s in add_list:
k = _norm(s)
if k not in table:
table[k] = s
for s in _iterify(remove):
if s is None:
continue
table.pop(_norm(s), None)
return [table[k] for k in sorted(table)]
def build_qs(params: list[tuple[str, str]], *, leading_q: bool = True) -> str:
"""URL-encode *params* and optionally prepend ``?``."""
qs = urlencode(params, doseq=True)
return ("?" + qs) if (qs and leading_q) else qs
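The two modes of `make_filter_set` behave quite differently for `add`. A worked example, using a condensed restatement of the function above with illustrative filter values:

```python
# Worked example of make_filter_set's replace vs append modes.
def _iterify(x):
    if x is None:
        return []
    return x if isinstance(x, (list, tuple, set)) else [x]

def _norm(s):
    return s.strip().lower()

def make_filter_set(base, add, remove, clear_filters, *, single_select=False):
    add_list = [s for s in _iterify(add) if s is not None]
    if single_select:
        # Blog-style: adding replaces the entire set
        table = ({_norm(s): s for s in add_list} if add_list
                 else {_norm(s): s for s in base if not clear_filters})
    else:
        # Market-style: adding appends to the existing set
        table = {_norm(s): s for s in base if not clear_filters}
        for s in add_list:
            table.setdefault(_norm(s), s)
    for s in _iterify(remove):
        if s is not None:
            table.pop(_norm(s), None)
    return [table[k] for k in sorted(table)]

market = make_filter_set(["Blue", "Red"], "Green", None, False)                     # append
blog = make_filter_set(["Blue", "Red"], "Green", None, False, single_select=True)   # replace
```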

View File

@@ -0,0 +1,33 @@
"""
NamedTuple types returned by each blueprint's ``decode()`` function.
"""
from __future__ import annotations
from typing import NamedTuple
class BlogQuery(NamedTuple):
page: int
search: str | None
sort: str | None
selected_tags: tuple[str, ...]
selected_authors: tuple[str, ...]
liked: str | None
view: str | None
drafts: str | None
selected_groups: tuple[str, ...]
class MarketQuery(NamedTuple):
page: int
search: str | None
sort: str | None
selected_brands: tuple[str, ...]
selected_stickers: tuple[str, ...]
selected_labels: tuple[str, ...]
liked: str | None
class OrderQuery(NamedTuple):
page: int
search: str | None


@@ -0,0 +1,22 @@
from __future__ import annotations
def register(app):
@app.template_filter("truncate")
def truncate(text, max_length=100):
"""
Truncate text to max_length characters and add an ellipsis character (…)
if it was longer.
"""
if text is None:
return ""
text = str(text)
if len(text) <= max_length:
return text
# Leave space for the ellipsis itself
if max_length <= 1:
return ""
        return text[:max_length - 1] + "…"
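The expected behaviour per the docstring, restated standalone with a couple of edge cases:

```python
# Standalone restatement of the truncate filter's intended behaviour.
def truncate(text, max_length=100):
    if text is None:
        return ""
    text = str(text)
    if len(text) <= max_length:
        return text
    if max_length <= 1:
        return ""  # no room for content plus the ellipsis
    return text[:max_length - 1] + "…"  # reserve one slot for the ellipsis

short = truncate("hello", 10)
cut = truncate("hello world", 6)
```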


@@ -0,0 +1,19 @@
from typing import Iterable, Union
from shared.utils import join_url, host_url, _join_url_parts, route_prefix
# --- Register as a Jinja filter (Quart / Flask) ---
def register(app):
@app.template_filter("urljoin")
def urljoin_filter(value: Union[str, Iterable[str]]):
return join_url(value)
@app.template_filter("urlhost")
def urlhost_filter(value: Union[str, Iterable[str]]):
return host_url(value)
@app.template_filter("urlhost_no_slash")
def urlhost_no_slash_filter(value: Union[str, Iterable[str]]):
return host_url(value, True)
@app.template_filter("host")
def host_filter(value: str):
return _join_url_parts([route_prefix(), value])


@@ -0,0 +1,58 @@
def register(app):
import json
from typing import Any
def _decode_headers(scope) -> dict[str, str]:
out = {}
for k, v in scope.get("headers", []):
try:
out[k.decode("latin1")] = v.decode("latin1")
except Exception:
out[repr(k)] = repr(v)
return out
def _safe(obj: Any):
# make scope json-serialisable; fall back to repr()
try:
json.dumps(obj)
return obj
except Exception:
return repr(obj)
class ScopeDumpMiddleware:
def __init__(self, app, *, log_bodies: bool = False):
self.app = app
self.log_bodies = log_bodies # keep False; bodies aren't needed for routing
async def __call__(self, scope, receive, send):
if scope["type"] in ("http", "websocket"):
# Build a compact view of keys relevant to routing
scope_view = {
"type": scope.get("type"),
"asgi": scope.get("asgi"),
"http_version": scope.get("http_version"),
"scheme": scope.get("scheme"),
"method": scope.get("method"),
"server": scope.get("server"),
"client": scope.get("client"),
"root_path": scope.get("root_path"),
"path": scope.get("path"),
"raw_path": scope.get("raw_path").decode("latin1") if scope.get("raw_path") else None,
"query_string": scope.get("query_string", b"").decode("latin1"),
"headers": _decode_headers(scope),
}
print("\n=== ASGI SCOPE (routing) ===")
print(json.dumps({_safe(k): _safe(v) for k, v in scope_view.items()}, indent=2))
print("=== END SCOPE ===\n", flush=True)
return await self.app(scope, receive, send)
# wrap LAST so you see what hits Quart
#app.asgi_app = ScopeDumpMiddleware(app.asgi_app)
from hypercorn.middleware import ProxyFixMiddleware
# trust a single proxy hop; use legacy X-Forwarded-* headers
app.asgi_app = ProxyFixMiddleware(app.asgi_app, mode="legacy", trusted_hops=1)


@@ -0,0 +1 @@


@@ -0,0 +1,133 @@
from __future__ import annotations
import os
from typing import Any, Dict, TYPE_CHECKING
import httpx
from quart import current_app
from shared.config import config
if TYPE_CHECKING:
from shared.models.order import Order
SUMUP_BASE_URL = "https://api.sumup.com/v0.1"
def _sumup_settings() -> Dict[str, str]:
cfg = config()
sumup_cfg = cfg.get("sumup", {}) or {}
api_key_env = sumup_cfg.get("api_key_env", "SUMUP_API_KEY")
api_key = os.getenv(api_key_env)
if not api_key:
raise RuntimeError(f"Missing SumUp API key in environment variable {api_key_env}")
merchant_code = sumup_cfg.get("merchant_code")
prefix = sumup_cfg.get("checkout_prefix", "")
if not merchant_code:
raise RuntimeError("Missing 'sumup.merchant_code' in app-config.yaml")
currency = sumup_cfg.get("currency", "GBP")
return {
"api_key": api_key,
"merchant_code": merchant_code,
"currency": currency,
"checkout_reference_prefix": prefix,
}
async def create_checkout(
order: Order,
redirect_url: str,
webhook_url: str | None = None,
description: str | None = None,
page_config: Any | None = None,
) -> Dict[str, Any]:
settings = _sumup_settings()
# Per-page SumUp credentials override globals
if page_config and getattr(page_config, "sumup_api_key", None):
settings["api_key"] = page_config.sumup_api_key
if page_config and getattr(page_config, "sumup_merchant_code", None):
settings["merchant_code"] = page_config.sumup_merchant_code
# Use stored reference if present, otherwise build it
checkout_reference = order.sumup_reference or f"{settings['checkout_reference_prefix']}{order.id}"
payload: Dict[str, Any] = {
"checkout_reference": checkout_reference,
"amount": float(order.total_amount),
"currency": settings["currency"],
"merchant_code": settings["merchant_code"],
"description": description or f"Order {order.id} at {current_app.config.get('APP_TITLE', 'Rose Ash')}",
"return_url": webhook_url or redirect_url,
"redirect_url": redirect_url,
"hosted_checkout": {"enabled": True},
}
headers = {
"Authorization": f"Bearer {settings['api_key']}",
"Content-Type": "application/json",
}
# Optional: log for debugging
current_app.logger.info(
"Creating SumUp checkout %s for Order %s amount %.2f",
checkout_reference,
order.id,
float(order.total_amount),
)
async with httpx.AsyncClient(timeout=15.0) as client:
resp = await client.post(f"{SUMUP_BASE_URL}/checkouts", json=payload, headers=headers)
if resp.status_code == 409:
# Duplicate checkout — retrieve the existing one by reference
current_app.logger.warning(
"SumUp duplicate checkout for ref %s order %s, fetching existing",
checkout_reference,
order.id,
)
list_resp = await client.get(
f"{SUMUP_BASE_URL}/checkouts",
params={"checkout_reference": checkout_reference},
headers=headers,
)
list_resp.raise_for_status()
items = list_resp.json()
if isinstance(items, list) and items:
return items[0]
if isinstance(items, dict) and items.get("items"):
return items["items"][0]
# Fallback: re-raise original error
resp.raise_for_status()
if resp.status_code >= 400:
current_app.logger.error(
"SumUp checkout error for ref %s order %s: %s",
checkout_reference,
order.id,
resp.text,
)
resp.raise_for_status()
data = resp.json()
return data
async def get_checkout(checkout_id: str, page_config: Any | None = None) -> Dict[str, Any]:
"""Fetch checkout status/details from SumUp."""
settings = _sumup_settings()
if page_config and getattr(page_config, "sumup_api_key", None):
settings["api_key"] = page_config.sumup_api_key
headers = {
"Authorization": f"Bearer {settings['api_key']}",
"Content-Type": "application/json",
}
async with httpx.AsyncClient(timeout=10.0) as client:
resp = await client.get(f"{SUMUP_BASE_URL}/checkouts/{checkout_id}", headers=headers)
resp.raise_for_status()
return resp.json()

View File

@@ -0,0 +1,346 @@
from __future__ import annotations
from functools import wraps
from typing import Optional, Literal
import asyncio
from quart import (
Quart,
request,
Response,
g,
current_app,
)
from redis import asyncio as aioredis
Scope = Literal["user", "global", "anon"]
TagScope = Literal["all", "user"] # for clear_cache
# ---------------------------------------------------------------------------
# Redis setup
# ---------------------------------------------------------------------------
def register(app: Quart) -> None:
@app.before_serving
async def setup_redis() -> None:
if app.config["REDIS_URL"] and app.config["REDIS_URL"] != 'no':
app.redis = aioredis.Redis.from_url(
app.config["REDIS_URL"],
encoding="utf-8",
decode_responses=False, # store bytes
)
else:
app.redis = False
@app.after_serving
async def close_redis() -> None:
if app.redis:
await app.redis.close()
# optional: await app.redis.connection_pool.disconnect()
def get_redis():
return current_app.redis
# ---------------------------------------------------------------------------
# Key helpers
# ---------------------------------------------------------------------------
def get_user_id() -> str:
"""
Returns a string id or 'anon'.
Adjust based on your auth system.
"""
user = getattr(g, "user", None)
if user:
return str(user.id)
return "anon"
def make_cache_key(cache_user_id: str) -> str:
"""
Build a cache key for this (user/global/anon) + path + query + HTMX status.
HTMX requests and normal requests get different cache keys because they
return different content (partials vs full pages).
Keys are namespaced by app name (from CACHE_APP_PREFIX) to avoid
collisions between apps that may share the same paths.
"""
app_prefix = current_app.config.get("CACHE_APP_PREFIX", "app")
path = request.path
qs = request.query_string.decode() if request.query_string else ""
# Check if this is an HTMX request
is_htmx = request.headers.get("HX-Request", "").lower() == "true"
htmx_suffix = ":htmx" if is_htmx else ""
if qs:
return f"cache:{app_prefix}:page:{cache_user_id}:{path}?{qs}{htmx_suffix}"
else:
return f"cache:{app_prefix}:page:{cache_user_id}:{path}{htmx_suffix}"
def user_set_key(user_id: str) -> str:
"""
Redis set that tracks all cache keys for a given user id.
Only used when scope='user'.
"""
return f"cache:user:{user_id}"
def tag_set_key(tag: str) -> str:
"""
Redis set that tracks all cache keys associated with a tag
(across all scopes/users).
"""
return f"cache:tag:{tag}"
# ---------------------------------------------------------------------------
# Invalidation helpers
# ---------------------------------------------------------------------------
async def invalidate_user_cache(user_id: str) -> None:
"""
Delete all cached pages for a specific user (scope='user' caches).
"""
r = get_redis()
if r:
s_key = user_set_key(user_id)
keys = await r.smembers(s_key) # set of bytes
if keys:
await r.delete(*keys)
await r.delete(s_key)
async def invalidate_tag_cache(tag: str) -> None:
"""
Delete all cached pages associated with this tag, for all users/scopes.
"""
r = get_redis()
if r:
t_key = tag_set_key(tag)
keys = await r.smembers(t_key) # set of bytes
if keys:
await r.delete(*keys)
await r.delete(t_key)
async def invalidate_tag_cache_for_user(tag: str, cache_uid: str) -> None:
r = get_redis()
if not r:
return
t_key = tag_set_key(tag)
keys = await r.smembers(t_key) # set of bytes
if not keys:
return
app_prefix = current_app.config.get("CACHE_APP_PREFIX", "app")
# Match the namespaced key format produced by make_cache_key()
prefix = f"cache:{app_prefix}:page:{cache_uid}:".encode("utf-8")
# Filter keys belonging to this cache_uid only
to_delete = [k for k in keys if k.startswith(prefix)]
if not to_delete:
return
# Delete those page entries
await r.delete(*to_delete)
# Remove them from the tag set (leave other users' keys intact)
await r.srem(t_key, *to_delete)
async def invalidate_tag_cache_for_current_user(tag: str) -> None:
"""
Convenience helper: delete tag cache for the current user_id (scope='user').
"""
uid = get_user_id()
await invalidate_tag_cache_for_user(tag, uid)
# ---------------------------------------------------------------------------
# Cache decorator for GET
# ---------------------------------------------------------------------------
def cache_page(
ttl: int = 0,
tag: Optional[str] = None,
scope: Scope = "user",
):
"""
Cache GET responses in Redis.
ttl:
Seconds to keep the cache. 0 = no expiry.
tag:
Optional tag name used for bulk invalidation via invalidate_tag_cache().
scope:
"user" → cache per-user (includes 'anon'), tracked in cache:user:{id}
"global" → single cache shared by everyone (no per-user tracking)
"anon" → cache only for anonymous users; logged-in users bypass cache
"""
def decorator(view):
@wraps(view)
async def wrapper(*args, **kwargs):
r = get_redis()
if not r or request.method != "GET":
return await view(*args, **kwargs)
uid = get_user_id()
# Decide who the cache key is keyed on
if scope == "global":
cache_uid = "global"
elif scope == "anon":
# Only cache for anonymous users
if uid != "anon":
return await view(*args, **kwargs)
cache_uid = "anon"
else: # scope == "user"
cache_uid = uid
key = make_cache_key(cache_uid)
cached = await r.hgetall(key)
if cached:
body = cached[b"body"]
status = int(cached[b"status"].decode())
content_type = cached.get(b"content_type", b"text/html").decode()
return Response(body, status=status, content_type=content_type)
# Not cached, call the view
resp = await view(*args, **kwargs)
# Normalise: if the view returned a string/bytes, wrap it
if not isinstance(resp, Response):
resp = Response(resp, content_type="text/html")
# Only cache successful responses
if resp.status_code == 200:
body = await resp.get_data() # bytes
pipe = r.pipeline()
pipe.hset(
key,
mapping={
"body": body,
"status": str(resp.status_code),
"content_type": resp.content_type or "text/html",
},
)
if ttl:
pipe.expire(key, ttl)
# Track per-user keys only when scope='user'
if scope == "user":
pipe.sadd(user_set_key(cache_uid), key)
# Track per-tag keys (all scopes)
if tag:
pipe.sadd(tag_set_key(tag), key)
await pipe.execute()
resp.set_data(body)
return resp
return wrapper
return decorator
# ---------------------------------------------------------------------------
# Clear cache decorator for POST (or any method)
# ---------------------------------------------------------------------------
def clear_cache(
*,
tag: Optional[str] = None,
tag_scope: TagScope = "all",
clear_user: bool = False,
):
"""
Decorator for routes that should clear cache after they run.
Use on POST/PUT/PATCH/DELETE handlers.
Params:
tag:
If set, will clear caches for this tag.
tag_scope:
"all" → invalidate_tag_cache(tag) (all users/scopes)
"user" → invalidate_tag_cache_for_current_user(tag)
clear_user:
If True, also run invalidate_user_cache(current_user_id).
Typical usage:
@bp.post("/posts/<slug>/edit")
@clear_cache(tag="post.post_detail", tag_scope="all")
async def edit_post(slug):
...
@bp.post("/prefs")
@clear_cache(tag="dashboard", tag_scope="user", clear_user=True)
async def update_prefs():
...
"""
def decorator(view):
@wraps(view)
async def wrapper(*args, **kwargs):
# Run the view first
resp = await view(*args, **kwargs)
if get_redis():
# Only clear cache if the view succeeded (2xx)
status = getattr(resp, "status_code", None)
if status is None:
# Non-Response return (string, dict) -> treat as success
success = True
else:
success = 200 <= status < 300
if not success:
return resp
# Perform invalidations
tasks = []
if clear_user:
uid = get_user_id()
tasks.append(invalidate_user_cache(uid))
if tag:
if tag_scope == "all":
tasks.append(invalidate_tag_cache(tag))
else: # tag_scope == "user"
tasks.append(invalidate_tag_cache_for_current_user(tag))
if tasks:
# Run them concurrently
await asyncio.gather(*tasks)
return resp
return wrapper
return decorator
async def clear_all_cache(prefix: str = "cache:") -> None:
r = get_redis()
if not r:
return
cursor = 0
pattern = f"{prefix}*"
while True:
cursor, keys = await r.scan(cursor=cursor, match=pattern, count=500)
if keys:
await r.delete(*keys)
if cursor == 0:
break
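
The key layout that `cache_page` reads and writes can be sketched in isolation. This is a simplified, pure-Python replica of `make_cache_key()` (no Quart request object), useful for seeing how scope, path, query string, and the HTMX suffix combine:

```python
def sketch_cache_key(app_prefix: str, cache_uid: str, path: str,
                     qs: str = "", is_htmx: bool = False) -> str:
    # Replica of make_cache_key(): namespaced by app, keyed on the
    # user/global/anon scope id and path+query, with an ":htmx" suffix
    # so partial responses never collide with full-page responses.
    suffix = ":htmx" if is_htmx else ""
    target = f"{path}?{qs}" if qs else path
    return f"cache:{app_prefix}:page:{cache_uid}:{target}{suffix}"
```

Because the HTMX flag is part of the key, a cached partial can never be served to a full-page navigation or vice versa.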

View File

@@ -0,0 +1,12 @@
from .parse import (
parse_time,
parse_cost,
parse_dt
)
from .utils import (
current_route_relative_path,
current_url_without_page,
vary,
)
from .utc import utcnow

View File

@@ -0,0 +1,46 @@
"""HTMX utilities for detecting and handling HTMX requests."""
from quart import request
def is_htmx_request() -> bool:
"""
Check if the current request is an HTMX request.
Returns:
bool: True if HX-Request header is present and true
"""
return request.headers.get("HX-Request", "").lower() == "true"
def get_htmx_target() -> str | None:
"""
Get the target element ID from HTMX request headers.
Returns:
str | None: Target element ID or None
"""
return request.headers.get("HX-Target")
def get_htmx_trigger() -> str | None:
"""
Get the trigger element ID from HTMX request headers.
Returns:
str | None: Trigger element ID or None
"""
return request.headers.get("HX-Trigger")
def should_return_fragment() -> bool:
"""
Determine if we should return a fragment vs full page.
For HTMX requests, return fragment.
For normal requests, return full page.
Returns:
bool: True if fragment should be returned
"""
return is_htmx_request()

View File

@@ -0,0 +1,36 @@
from datetime import datetime, time, timezone
def parse_time(val: str | None):
if not val:
return None
try:
h, m = val.split(':', 1)
return time(int(h), int(m))
except Exception:
return None
def parse_cost(val: str | None):
if not val:
return None
try:
return float(val)
except Exception:
return None
def parse_dt(val: str | None) -> datetime | None:
if not val:
return None
dt = datetime.fromisoformat(val)
if dt.tzinfo is None:
dt = dt.replace(tzinfo=timezone.utc)
return dt

View File

@@ -0,0 +1,6 @@
from datetime import datetime, timezone
def utcnow() -> datetime:
return datetime.now(timezone.utc)

View File

@@ -0,0 +1,51 @@
from quart import (
Response,
request,
g,
)
from shared.utils import host_url
from urllib.parse import urlencode
def current_route_relative_path() -> str:
"""
Returns the current request path relative to the app's mount point (script_root).
"""
path = request.path # excludes query string
if g.root and path.startswith(f"/{g.root}"):
rel = path[len(g.root) + 1:]
return rel if rel.startswith("/") else "/" + rel
return path # app at /
def current_url_without_page() -> str:
"""
Build current URL (host+path+qs) but with ?page= removed.
Used for Hx-Push-Url.
"""
base = host_url(current_route_relative_path())
params = request.args.to_dict(flat=False) # keep multivals
params.pop("page", None)
qs = urlencode(params, doseq=True)
return f"{base}?{qs}" if qs else base
def vary(resp: Response) -> Response:
"""
Ensure caches/CDNs vary on HX headers so htmx/non-htmx versions don't get mixed.
"""
v = resp.headers.get("Vary", "")
parts = [p.strip() for p in v.split(",") if p.strip()]
for h in ("HX-Request", "X-Origin"):
if h not in parts:
parts.append(h)
if parts:
resp.headers["Vary"] = ", ".join(parts)
return resp
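
The header-merge rule inside `vary()` operates on the raw comma-separated string; a standalone sketch of that merge (same splitting and de-duplication, minus the Response object):

```python
def merge_vary(existing: str,
               extra: tuple[str, ...] = ("HX-Request", "X-Origin")) -> str:
    # Split the existing Vary value, then append each required header
    # once, preserving anything a proxy or earlier middleware already set.
    parts = [p.strip() for p in existing.split(",") if p.strip()]
    for h in extra:
        if h not in parts:
            parts.append(h)
    return ", ".join(parts)
```

Keeping pre-existing entries intact matters: clobbering `Vary` would undo caching hints set elsewhere in the stack.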

View File

@@ -0,0 +1,33 @@
{% extends oob.oob_extends %}
{# OOB elements for HTMX navigation - all elements that need updating #}
{# Import shared OOB macros #}
{% from '_types/root/_oob_menu.html' import mobile_menu with context %}
{% block oobs %}
{% from '_types/root/_n/macros.html' import oob_header with context %}
{{oob_header(
oob.parent_id,
oob.child_id,
oob.header,
)}}
{% from oob.parent_header import header_row with context %}
{{ header_row(oob=True) }}
{% endblock %}
{# Mobile menu - from market/index.html _main_mobile_menu block #}
{% set mobile_nav %}
{% include oob.nav %}
{% endset %}
{{ mobile_menu(mobile_nav) }}
{% block content %}
{% include oob.main %}
{% endblock %}

View File

@@ -0,0 +1,11 @@
{% set href=account_url('/') %}
<a
href="{{ href }}"
class="justify-center cursor-pointer flex flex-row items-center p-3 gap-2 rounded bg-stone-200 text-black {{select_colours}}"
data-close-details
>
<i class="fa-solid fa-user"></i>
<span>{{g.user.email}}</span>
</a>

View File

@@ -0,0 +1,13 @@
<div class="md:hidden bg-stone-200 rounded">
<svg class="h-12 w-12 transition-transform group-open/root:hidden block self-start" viewBox="0 0 24 24" fill="none" stroke="currentColor">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2"
d="M4 6h16M4 12h16M4 18h16" />
</svg>
<svg aria-hidden="true" viewBox="0 0 24 24"
class="w-12 h-12 rotate-180 transition-transform group-open/root:block hidden self-start">
<path d="M6 9l6 6 6-6" fill="currentColor"/>
</svg>
</div>
<!-- Desktop nav -->

View File

@@ -0,0 +1,67 @@
<style>
@media (min-width: 768px) { .js-mobile-sentinel { display:none !important; } }
</style>
<link rel="stylesheet" type="text/css" href="{{asset_url('styles/basics.css')}}">
<link rel="stylesheet" type="text/css" href="{{asset_url('styles/cards.css')}}">
<link rel="stylesheet" type="text/css" href="{{asset_url('styles/blog-content.css')}}">
<script src="https://unpkg.com/htmx.org@2.0.8"></script>
<script src="https://unpkg.com/hyperscript.org@0.9.12"></script>
<script src="https://cdn.tailwindcss.com"></script>
<link rel="stylesheet" href="{{asset_url('fontawesome/css/all.min.css')}}">
<link rel="stylesheet" href="{{asset_url('fontawesome/css/v4-shims.min.css')}}">
<link href="https://unpkg.com/prismjs/themes/prism.css" rel="stylesheet" />
<script src="https://unpkg.com/prismjs/prism.js"></script>
<script src="https://unpkg.com/prismjs/components/prism-javascript.min.js"></script>
<script src="https://unpkg.com/prismjs/components/prism-python.min.js"></script>
<script src="https://unpkg.com/prismjs/components/prism-bash.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/sweetalert2@11"></script>
<script>
if (matchMedia('(hover: hover) and (pointer: fine)').matches) {
document.documentElement.classList.add('hover-capable');
}
</script>
<script>
document.addEventListener('click', function (e) {
const closeTarget = e.target.closest('[data-close-details]');
if (!closeTarget) return;
const details = closeTarget.closest('details');
if (details) {
details.removeAttribute('open');
}
});
</script>
<style>
/* hide disclosure glyph */
details[data-toggle-group="mobile-panels"] > summary {
list-style: none;
}
details[data-toggle-group="mobile-panels"] > summary::-webkit-details-marker {
display: none;
}
/* Desktop hover/focus dropdowns */
@media (min-width: 768px) {
.nav-group:focus-within .submenu,
.nav-group:hover .submenu { display:block }
}
img { max-width: 100%; height: auto; }
.clamp-2 { display: -webkit-box; -webkit-line-clamp: 2; -webkit-box-orient: vertical; overflow: hidden; }
.clamp-3 { display: -webkit-box; -webkit-line-clamp: 3; -webkit-box-orient: vertical; overflow: hidden; }
.no-scrollbar::-webkit-scrollbar { display: none; }
.no-scrollbar { -ms-overflow-style: none; scrollbar-width: none; }
details.group { overflow: hidden; }
details.group > summary { list-style: none; }
details.group > summary::-webkit-details-marker { display:none; }
.htmx-indicator { display: none; }
.htmx-request .htmx-indicator { display: inline-flex; }
</style>
<style>
.js-wrap.open .js-pop { display:block; }
.js-wrap.open .js-backdrop { display:block; }
</style>

View File

@@ -0,0 +1,13 @@
{% extends '_types/root/index.html' %}
{% from 'macros/glyphs.html' import opener %}
{% from 'macros/title.html' import title with context %}
{% block main_mobile_menu %}
<div class="flex flex-col gap-2 md:hidden z-40">
{% block _main_mobile_menu %}
{% include '_types/root/_nav.html' %}
{% include '_types/root/_nav_panel.html' %}
{% endblock %}
</div>
{% endblock %}

View File

@@ -0,0 +1,35 @@
{% macro header(id=False, oob=False) %}
<div
{% if id %}id="{{id}}"{% endif %}
{% if oob %}hx-swap-oob="outerHTML"{% endif %}
class="w-full"
>
{{ caller() }}
</div>
{% endmacro %}
{% macro oob_header(id, child_id, row_macro) %}
{% call header(id=id, oob=True) %}
{% call header() %}
{% from row_macro import header_row with context %}
{{header_row()}}
<div id="{{child_id}}">
</div>
{% endcall %}
{% endcall %}
{% endmacro %}
{% macro index_row(id, row_macro) %}
{% from '_types/root/_n/macros.html' import header with context %}
{% set _caller = caller %}
{% call header() %}
{% from row_macro import header_row with context %}
{{ header_row() }}
<div id="{{id}}">
{{_caller()}}
</div>
{% endcall %}
{% endmacro %}

View File

@@ -0,0 +1,29 @@
{% set _app_slugs = {
'cart': cart_url('/'),
'market': market_url('/'),
'events': events_url('/'),
'federation': federation_url('/'),
'account': account_url('/'),
} %}
{% set _first_seg = request.path.strip('/').split('/')[0] %}
<div class="flex flex-col sm:flex-row sm:items-center gap-2 border-r border-stone-200 mr-2 sm:max-w-2xl"
id="menu-items-nav-wrapper">
{% from 'macros/scrolling_menu.html' import scrolling_menu with context %}
{% call(item) scrolling_menu('menu-items-container', menu_items) %}
{% set _href = _app_slugs.get(item.slug, blog_url('/' + item.slug + '/')) %}
<a
href="{{ _href }}"
aria-selected="{{ 'true' if (item.slug == _first_seg or item.slug == app_name) else 'false' }}"
class="{{styles.nav_button_less_pad}}"
>
{% if item.feature_image %}
<img src="{{ item.feature_image }}"
alt="{{ item.label }}"
class="w-8 h-8 rounded-full object-cover flex-shrink-0" />
{% else %}
<div class="w-8 h-8 rounded-full bg-stone-200 flex-shrink-0"></div>
{% endif %}
<span>{{ item.label }}</span>
</a>
{% endcall %}
</div>

View File

@@ -0,0 +1,7 @@
{% import 'macros/links.html' as links %}
{% if g.rights.admin %}
<a href="{{ blog_url('/settings/') }}" class="{{styles.nav_button}}">
<i class="fa fa-cog" aria-hidden="true"></i>
</a>
{% endif %}

View File

@@ -0,0 +1,46 @@
{#
Shared mobile menu for both base templates and OOB updates
This macro can be used in two modes:
- oob=true: Outputs full wrapper with hx-swap-oob attribute (for OOB updates)
- oob=false: Outputs just content, assumes wrapper exists (for base templates)
The caller can pass section-specific nav items via section_nav parameter.
#}
{% macro mobile_menu(section_nav='', oob=true) %}
{% if oob %}
<div id="root-menu" hx-swap-oob="outerHTML" class="md:hidden">
{% endif %}
<nav id="nav-panel" {% if oob %}hx-swap-oob="true"{% endif %} class="flex flex-col gap-2 mt-2 px-2 pb-2" role="listbox">
{% if not g.user %}
{% include '_types/root/_sign_in.html' %}
{% endif %}
{% include '_types/root/_nav.html' %}
{# Section-specific mobile nav #}
{% if section_nav %}
{{ section_nav }}
{% else %}
{% include "_types/root/_nav_panel.html"%}
{% endif %}
</nav>
{% if oob %}
</div>
{% endif %}
{% endmacro %}
{% macro oob_mobile_menu() %}
<div id="root-menu" hx-swap-oob="outerHTML" class="md:hidden">
<nav id="nav-panel" class="flex flex-col gap-2 mt-2 px-2 pb-2" role="listbox">
{{caller()}}
</nav>
</div>
{% endmacro %}

View File

@@ -0,0 +1,10 @@
<a
href="{{ account_url('/') }}"
aria-selected="{{ 'true' if '/auth/login' in request.path else 'false' }}"
class="justify-center cursor-pointer flex flex-row items-center p-3 gap-2 rounded bg-stone-200 text-black {{select_colours}}"
data-close-details
>
<i class="fa-solid fa-key"></i>
<span>sign in or register</span>
</a>

View File

@@ -0,0 +1 @@
{{asset_url('errors/403.gif')}}

View File

@@ -0,0 +1 @@
YOU CAN'T DO THAT

View File

@@ -0,0 +1 @@
{{asset_url('errors/404.gif')}}

View File

@@ -0,0 +1 @@
NOT FOUND

View File

@@ -0,0 +1,12 @@
{% extends '_types/root/exceptions/base.html' %}
{% block error_summary %}
<div>
{% include '_types/root/exceptions/' + errnum + '/message.html' %}
</div>
{% endblock %}
{% block error_content %}
<img src="{% include '_types/root/exceptions/' + errnum + '/img.html' %}" width="300px" height="300px"/>
{% endblock %}

View File

@@ -0,0 +1,42 @@
{% extends '_types/root/_index.html' %}
{% block content %}
<div class="flex flex-col items-center justify-center min-h-[50vh] p-8">
<div class="max-w-md w-full bg-white rounded-lg shadow-lg p-6">
<div class="flex items-center justify-center w-12 h-12 mx-auto mb-4 rounded-full bg-red-100">
<svg class="w-6 h-6 text-red-600" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 9v2m0 4h.01m-6.938 4h13.856c1.54 0 2.502-1.667 1.732-3L13.732 4c-.77-1.333-2.694-1.333-3.464 0L3.34 16c-.77 1.333.192 3 1.732 3z"/>
</svg>
</div>
<h1 class="text-xl font-semibold text-center text-stone-800 mb-4">
Something went wrong
</h1>
{% if messages %}
<div class="space-y-2 mb-6">
{% for message in messages %}
<div class="p-3 bg-red-50 border border-red-200 rounded text-sm text-red-700">
{{ message }}
</div>
{% endfor %}
</div>
{% endif %}
<div class="flex flex-col sm:flex-row gap-3 justify-center">
<button
onclick="history.back()"
class="px-4 py-2 border border-stone-300 text-stone-700 rounded hover:bg-stone-50 transition-colors"
>
← Go Back
</button>
<a
href="{{ blog_url('/') }}"
class="px-4 py-2 bg-stone-800 text-white rounded hover:bg-stone-700 transition-colors text-center"
>
Home
</a>
</div>
</div>
</div>
{% endblock %}

View File

@@ -0,0 +1,17 @@
{% extends '_types/root/index.html' %}
{% block content %}
<div
class="w-full flex justify-center font-bold text-2xl md:text-4xl px-2 flex-1 text-red-500"
>
{% block error_summary %}
{% endblock %}
</div>
<div
class="w-full flex justify-center"
>
{% block error_content %}
{% endblock %}
</div>
{% endblock %}

View File

@@ -0,0 +1,12 @@
{% extends '_types/root/exceptions/base.html' %}
{% block error_summary %}
<div>
WELL THIS IS EMBARRASSING...
</div>
{% endblock %}
{% block error_content %}
<img src="{{asset_url('errors/error.gif')}}" width="300px" height="300px"/>
{% endblock %}

View File

@@ -0,0 +1,8 @@
<div class="flex flex-col gap-2 items-center">
<div>
{% include '_types/root/exceptions/' + errnum + '/message.html' %}
</div>
<img src="{% include '_types/root/exceptions/' + errnum + '/img.html' %}" width="300px" height="300px"/>
</div>

View File

@@ -0,0 +1,41 @@
{% set select_colours = "
[.hover-capable_&]:hover:bg-yellow-300
aria-selected:bg-stone-500 aria-selected:text-white
[.hover-capable_&[aria-selected=true]:hover]:bg-orange-500
"%}
{% import 'macros/links.html' as links %}
{% macro header_row(oob=False) %}
{% call links.menu_row(id='root-row', oob=oob) %}
<div class="w-full flex flex-row items-top">
{# Cart mini — fetched from cart app as fragment #}
{% if cart_mini_html %}
{{ cart_mini_html | safe }}
{% endif %}
{# Site title #}
<div class="font-bold text-5xl flex-1">
{% from 'macros/title.html' import title with context %}
{{ title('flex justify-center md:justify-start')}}
</div>
{# Desktop nav #}
<nav class="hidden md:flex gap-4 text-sm ml-2 justify-end items-center flex-0">
{% if nav_tree_html %}
{{ nav_tree_html | safe }}
{% endif %}
{% if auth_menu_html %}
{{ auth_menu_html | safe }}
{% endif %}
{% include "_types/root/_nav_panel.html"%}
</nav>
{% include '_types/root/_hamburger.html' %}
</div>
{% endcall %}
{# Mobile user info #}
<div class="block md:hidden text-md font-bold">
{% if auth_menu_html %}
{{ auth_menu_html | safe }}
{% endif %}
</div>
{% endmacro %}

View File

@@ -0,0 +1,67 @@
{#
Shared root header for both base templates and OOB updates
This macro can be used in two modes:
- oob=true: Outputs full div with hx-swap-oob attribute (for OOB updates)
- oob=false: Outputs just content, assumes wrapper div exists (for base templates)
Usage:
1. Call root_header_start(oob=true/false)
2. Add any section-specific headers
3. Call root_header_end(oob=true/false)
#}
{% macro root_header_start(oob=true) %}
{% set select_colours = "
[.hover-capable_&]:hover:bg-yellow-300
aria-selected:bg-stone-500 aria-selected:text-white
[.hover-capable_&[aria-selected=true]:hover]:bg-orange-500
"%}
{% if oob %}
<div id="root-header" hx-swap-oob="outerHTML" class="flex items-start gap-2 p-1 bg-{{ menu_colour }}-{{ (500-(level()*100))|string }}">
{% endif %}
<div class="flex flex-col items-center flex-1">
<div class="flex w-full justify-center md:justify-start">
{# Cart mini — rendered via fragment #}
{% if cart_mini_html %}
{{ cart_mini_html | safe }}
{% endif %}
{# Site title #}
<div class="font-bold text-5xl flex-1">
{% from 'macros/title.html' import title with context %}
{{ title('flex justify-center md:justify-start')}}
</div>
{# Desktop nav #}
<nav class="hidden md:flex gap-4 text-sm ml-2 justify-end items-center flex-0">
{% include '_types/root/_nav.html' %}
{% if not g.user %}
{% include '_types/root/_sign_in.html' %}
{% else %}
{% include '_types/root/_full_user.html' %}
{% endif %}
{% include "_types/root/_nav_panel.html"%}
</nav>
{% include '_types/root/_hamburger.html' %}
</div>
{# Mobile user info #}
<div class="block md:hidden text-md font-bold">
{% if g.user %}
{% include '_types/root/mobile/_full_user.html' %}
{% else %}
{% include '_types/root/mobile/_sign_in.html' %}
{% endif %}
</div>
{# Section-specific headers go here (caller adds them between start and end) #}
{% endmacro %}
{% macro root_header_end(oob=true) %}
</div>
{% if oob %}
</div>
{% endif %}
{% endmacro %}

View File

@@ -0,0 +1,38 @@
{#
Shared root header for both base templates and OOB updates
This macro can be used in two modes:
- oob=true: Outputs full div with hx-swap-oob attribute (for OOB updates)
- oob=false: Outputs just content, assumes wrapper div exists (for base templates)
Usage: call root_header(oob=true/false); section-specific headers go in the
caller block.
#}
{% macro root_header(oob=true) %}
{% set select_colours = "
[.hover-capable_&]:hover:bg-yellow-300
aria-selected:bg-stone-500 aria-selected:text-white
[.hover-capable_&[aria-selected=true]:hover]:bg-orange-500
"%}
{% if oob %}
<div id="root-header"
hx-swap-oob="outerHTML"
class="flex items-start gap-2 p-1 bg-{{ menu_colour }}-{{ (500-(level()*100))|string }}">
{% endif %}
<div class="flex flex-col items-center flex-1">
{% from '_types/root/header/_header.html' import header_row with context %}
{{ header_row() }}
{{ caller() }}
</div>
{% if oob %}
</div>
{% endif %}
{% endmacro %}

View File

@@ -0,0 +1,84 @@
{% import 'macros/layout.html' as layout %}
{% from '_types/root/header/_oob.html' import root_header_start, root_header_end with context %}
{% from '_types/root/_oob_menu.html' import mobile_menu with context %}
<!doctype html>
<html lang="en">
<head>
{% block meta %}
{% include 'social/meta_site.html' %}
{% endblock %}
{% include '_types/root/_head.html' %}
</head>
<body class="bg-stone-50 text-stone-900">
<div class="max-w-screen-2xl mx-auto py-1 px-1">
{% block header %}
{% from '_types/root/_n/macros.html' import header with context %}
{% call header() %}
{% call layout.details('/root-header') %}
{% call layout.summary(
'root-header-summary',
_class='flex items-start gap-2 p-1 bg-' + menu_colour + '-' + (500-(level()*100))|string,
)
%}
<div class="flex flex-col w-full items-center">
{% from '_types/root/header/_header.html' import header_row with context %}
{{ header_row() }}
<div id="root-header-child" class="flex flex-col w-full items-center">
{% block root_header_child %}
{% endblock %}
</div>
</div>
{% endcall %}
{% call layout.menu('root-menu', 'md:hidden bg-yellow-100') %}
{% block main_mobile_menu %}
{% endblock %}
{% endcall %}
{% endcall %}
{% endcall %}
{% endblock %}
<div
id="filter"
>
{% block filter %}
{% endblock %}
</div>
<main
id="root-panel"
class="max-w-full">
<div class="md:min-h-0">
<div class="flex flex-row md:h-full md:min-h-0">
<aside
id="aside"
class="hidden md:flex md:flex-col max-w-xs md:h-full md:min-h-0 mr-3"
>
{% block aside %}
{% endblock %}
</aside>
<section
id="main-panel"
class="flex-1 md:h-full md:min-h-0 overflow-y-auto overscroll-contain js-grid-viewport"
>
{% block content %}
{% endblock %}
<div class="pb-8"></div>
</section>
</div>
</div>
</main>
</div>
<script src="{{asset_url('scripts/body.js')}}"></script>
</body>
</html>

View File

@@ -0,0 +1,10 @@
{% set href=account_url('/') %}
<a
href="{{ href }}"
data-close-details
>
<i class="fa-solid fa-user"></i>
<span>{{g.user.email}}</span>
</a>

View File

@@ -0,0 +1,8 @@
<a
href="{{ account_url('/') }}"
aria-selected="{{ 'true' if '/auth/login' in request.path else 'false' }}"
>
<i class="fa-solid fa-key"></i>
<span>sign in or register</span>
</a>

View File

@@ -0,0 +1,21 @@
{#
Shared admin navigation macro
Use this instead of duplicate _nav.html files
#}
{% macro admin_nav_item(href, icon='cog', label='', select_colours='', aclass=styles.nav_button) %}
{% import 'macros/links.html' as links %}
{% call links.link(href, hx_select_search, select_colours, True, aclass=aclass) %}
<i class="fa fa-{{ icon }}" aria-hidden="true"></i>
{{ label }}
{% endcall %}
{% endmacro %}
{% macro placeholder_nav() %}
{# Placeholder for admin sections without specific nav items #}
<div class="relative nav-group">
<span class="block px-3 py-2 text-stone-400 text-sm italic">
Admin options
</span>
</div>
{% endmacro %}

View File

@@ -0,0 +1,31 @@
{# Cart icon/badge — shows logo when empty, cart icon with count when items present #}
{% macro cart_icon(count=0, oob=False) %}
<div id="cart-mini" {% if oob %}hx-swap-oob="{{oob}}"{% endif %}>
{% if count == 0 %}
<div class="h-12 w-12 rounded-full overflow-hidden border border-stone-300 flex-shrink-0">
<a
href="{{ blog_url('/') }}"
class="h-full w-full font-bold text-5xl flex-shrink-0 flex flex-row items-center gap-1"
>
<img
src="{{ site().logo }}"
class="h-full w-full rounded-full object-cover border border-stone-300 flex-shrink-0"
>
</a>
</div>
{% else %}
<a
href="{{ cart_url('/') }}"
class="relative inline-flex items-center justify-center text-stone-700 hover:text-emerald-700"
>
<i class="fa fa-shopping-cart text-5xl" aria-hidden="true"></i>
<span
class="absolute top-1/2 left-1/2 -translate-x-1/2 -translate-y-1/2 inline-flex items-center justify-center rounded-full bg-emerald-600 text-white text-sm w-5 h-5"
>
{{ count }}
</span>
</a>
{% endif %}
</div>
{% endmacro %}

View File

@@ -0,0 +1,17 @@
{% macro opener(group=False) %}
<svg
class="h-4 w-4 transition-transform group-open{{ '/' + group if group else ''}}:rotate-180"
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 24 24"
fill="none"
stroke="currentColor"
>
<path
stroke-linecap="round"
stroke-linejoin="round"
stroke-width="2"
d="M6 9l6 6 6-6"
/>
</svg>
{% endmacro %}


@@ -0,0 +1,61 @@
{# templates/macros/layout.html #}
{% macro details(group = '', _class='') %}
<details
class="group{{group}} p-2 {{_class}}" data-toggle-group="mobile-panels">
{{ caller() }}
</details>
{%- endmacro %}
{% macro summary(id, _class=None, oob=False) %}
<summary>
<header class="z-50">
<div
id="{{id}}"
{% if oob %}
hx-swap-oob="true"
{% endif %}
class="{{'flex justify-between items-start gap-2' if not _class else _class}}">
{{ caller() }}
</div>
</header>
</summary>
{%- endmacro %}
{% macro filter_summary(id, current_local_href, search, search_count, hx_select, oob=True) %}
<summary class="bg-white/90">
<div class="flex flex-row items-start">
<div>
<div class="md:hidden mx-2 bg-stone-200 rounded">
<span class="flex items-center justify-center text-stone-600 text-lg h-12 w-12 transition-transform group-open/filter:hidden self-start">
<i class="fa-solid fa-filter"></i>
</span>
<span>
<svg aria-hidden="true" viewBox="0 0 24 24"
class="w-12 h-12 rotate-180 transition-transform group-open/filter:block hidden self-start">
<path d="M6 9l6 6 6-6" fill="currentColor"/>
</svg>
</span>
</div>
</div>
<div
id="{{id}}"
class="flex-1 md:hidden grid grid-cols-12 items-center gap-3"
>
<div class="flex flex-col items-start gap-2">
{{ caller() }}
</div>
</div>
{% from 'macros/search.html' import search_mobile %}
{{ search_mobile(current_local_href, search, search_count, hx_select) }}
</div>
</summary>
{%- endmacro %}
{% macro menu(id, _class="") %}
<div id="{{id}}" hx-swap-oob="outerHTML" class="{{_class}}">
{{ caller() }}
</div>
{%- endmacro %}
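A minimal usage sketch for these layout macros (the import path comes from the comment at the top of the file; the `/filter` group name is chosen to match the `group-open/filter:` variants that `filter_summary` styles against):

```jinja
{% import 'macros/layout.html' as layout %}

{% call layout.details(group='/filter') %}
  {% call layout.summary('filter-head') %}
    <h2>Filters</h2>
  {% endcall %}
  <div>filter controls go here</div>
{% endcall %}
```

Passing `group='/filter'` makes `details` emit `class="group/filter …"`, so Tailwind's named-group open variants in the summary toggle correctly.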


@@ -0,0 +1,59 @@
{% macro link(url, select, select_colours='', highlight=True, _class='', aclass='') %}
{% set href = url|host %}
<div class="relative nav-group {{_class}}">
<a
href="{{ href }}"
hx-get="{{ href }}"
hx-target="#main-panel"
hx-select="{{select}}"
hx-swap="outerHTML"
hx-push-url="true"
aria-selected="{{ 'true' if (request.path|host).startswith(href) else 'false' }}"
{% if aclass %}
class="{{aclass}}"
{% elif select_colours %}
class="whitespace-normal flex gap-2 px-3 py-2 rounded
text-center break-words leading-snug
bg-stone-200 text-black
{{select_colours if highlight else ''}}
"
{% else %}
class="w-full whitespace-normal flex items-center gap-2 font-bold text-2xl px-3 py-2"
{% endif %}
>
{{ caller() }}
</a>
</div>
{% endmacro %}
{% macro menu_row(id=False, oob=False) %}
<div
{% if id %}
id="{{id}}"
{% endif %}
{% if oob %}
hx-swap-oob="outerHTML"
{% endif %}
class="flex flex-col items-center md:flex-row justify-center md:justify-between w-full p-1 bg-{{menu_colour}}-{{(500-(level()*100))|string}}"
>
{{ caller() }}
</div>
{{level_up()}}
{% endmacro %}
{% macro desktop_nav() %}
<nav class="hidden md:flex gap-4 text-sm ml-2 justify-end items-center flex-0">
{{ caller() }}
</nav>
{% endmacro %}
{% macro admin() %}
<i class="fa fa-cog" aria-hidden="true"></i>
<div>
settings
</div>
{% endmacro %}


@@ -0,0 +1,68 @@
{#
Scrolling menu macro with arrow navigation
Creates a horizontally scrollable menu (desktop) or vertically scrollable (mobile)
with arrow buttons that appear/hide based on content overflow.
Parameters:
- container_id: Unique ID for the scroll container
- items: List of items to iterate over
- item_content: Caller block that renders each item (receives 'item' variable)
- wrapper_class: Optional additional classes for outer wrapper
- container_class: Optional additional classes for scroll container
- item_class: Optional additional classes for each item wrapper
#}
{% macro scrolling_menu(container_id, items, wrapper_class='', container_class='', item_class='') %}
{% if items %}
{# Left scroll arrow - desktop only #}
<button
class="scrolling-menu-arrow-{{ container_id }} hidden flex-shrink-0 p-2 hover:bg-stone-200 rounded"
aria-label="Scroll left"
_="on click
set #{{ container_id }}.scrollLeft to #{{ container_id }}.scrollLeft - 200">
<i class="fa fa-chevron-left"></i>
</button>
{# Scrollable container #}
<div id="{{ container_id }}"
class="overflow-y-auto sm:overflow-x-auto sm:overflow-y-visible scrollbar-hide max-h-[50vh] sm:max-h-none {{ container_class }}"
style="scroll-behavior: smooth;"
_="on load or scroll
-- Show arrows if content overflows (desktop only)
if window.innerWidth >= 640 and my.scrollWidth > my.clientWidth
remove .hidden from .scrolling-menu-arrow-{{ container_id }}
add .flex to .scrolling-menu-arrow-{{ container_id }}
else
add .hidden to .scrolling-menu-arrow-{{ container_id }}
remove .flex from .scrolling-menu-arrow-{{ container_id }}
end">
<div class="flex flex-col sm:flex-row gap-1 {{ wrapper_class }}">
{% for item in items %}
<div class="{{ item_class }}">
{{ caller(item) }}
</div>
{% endfor %}
</div>
</div>
<style>
.scrollbar-hide::-webkit-scrollbar {
display: none;
}
.scrollbar-hide {
-ms-overflow-style: none;
scrollbar-width: none;
}
</style>
{# Right scroll arrow - desktop only #}
<button
class="scrolling-menu-arrow-{{ container_id }} hidden flex-shrink-0 p-2 hover:bg-stone-200 rounded"
aria-label="Scroll right"
_="on click
set #{{ container_id }}.scrollLeft to #{{ container_id }}.scrollLeft + 200">
<i class="fa fa-chevron-right"></i>
</button>
{% endif %}
{% endmacro %}
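A call-block usage sketch (the import path, `categories` list, and item attributes are hypothetical), showing how each item is rendered through `caller(item)`:

```jinja
{% from 'macros/scrolling_menu.html' import scrolling_menu %}

{% call(item) scrolling_menu('category-scroll', categories, item_class='px-2') %}
  <a href="{{ item.url }}">{{ item.title }}</a>
{% endcall %}
```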


@@ -0,0 +1,83 @@
{# Shared search input macros for filter UIs #}
{% macro search_mobile(current_local_href, search, search_count, hx_select) -%}
<div
id="search-mobile-wrapper"
class="flex flex-row gap-2 items-center flex-1 min-w-0 pr-2"
>
<input
id="search-mobile"
type="text"
name="search"
aria-label="search"
class="text-base md:text-sm col-span-5 rounded-md px-3 py-2 mb-2 w-full min-w-0 max-w-full border-2 border-stone-200 placeholder-shown:border-stone-200 [&:not(:placeholder-shown)]:border-yellow-200"
hx-preserve
value="{{ search|default('', true) }}"
placeholder="search"
hx-trigger="input changed delay:300ms"
hx-target="#main-panel"
hx-select="{{hx_select}}, #search-mobile-wrapper, #search-desktop-wrapper"
hx-get="{{ (current_local_href ~ {'search': None}|qs)|host }}"
hx-swap="outerHTML"
hx-push-url="true"
hx-headers='{"X-Origin":"search-mobile", "X-Search":"true"}'
hx-sync="this:replace"
autocomplete="off"
>
<div
id="search-count-mobile"
aria-label="search count"
{% if not search_count %}
class="text-xl text-red-500"
{% endif %}
>
{% if search %}
{{search_count}}
{% endif %}
</div>
</div>
{%- endmacro %}
{% macro search_desktop(current_local_href, search, search_count, hx_select) -%}
<div
id="search-desktop-wrapper"
class="flex flex-row gap-2 items-center"
>
<input
id="search-desktop"
type="text"
name="search"
aria-label="search"
class="w-full mx-1 my-3 px-3 py-2 text-md rounded-xl border-2 shadow-sm border-white placeholder-shown:border-white [&:not(:placeholder-shown)]:border-yellow-200"
hx-preserve
value="{{ search|default('', true) }}"
placeholder="search"
hx-trigger="input changed delay:300ms"
hx-target="#main-panel"
hx-select="{{hx_select}}, #search-mobile-wrapper, #search-desktop-wrapper"
hx-get="{{ (current_local_href ~ {'search': None}|qs)|host}}"
hx-swap="outerHTML"
hx-push-url="true"
hx-headers='{"X-Origin":"search-desktop", "X-Search":"true"}'
hx-sync="this:replace"
autocomplete="off"
>
<div
id="search-count-desktop"
aria-label="search count"
{% if not search_count %}
class="text-xl text-red-500"
{% endif %}
>
{% if search %}
{{search_count}}
{% endif %}
{{zap_filter}}
</div>
</div>
{%- endmacro %}
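Both macros take the same four arguments; a filter panel might render them as below (assuming a `results` list in context and that the custom `qs` and `host` filters used inside the macros are registered on the Jinja environment):

```jinja
{% from 'macros/search.html' import search_desktop, search_mobile %}

{{ search_desktop(request.path, search, results|length, '#results') }}
{{ search_mobile(request.path, search, results|length, '#results') }}
```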


@@ -0,0 +1,24 @@
{% macro sticker(src, title, enabled, size=40, found=false) -%}
<span class="relative inline-flex items-center justify-center group/tt"
tabindex="0" aria-label="{{ title|capitalize }}">
<!-- sticker icon -->
<img
src="{{ src }}"
width="{{size}}" height="{{size}}"
alt="{{ title|capitalize }}"
title="{{ title|capitalize }}"
class="{% if found %}border-2 border-yellow-200 bg-yellow-300{% endif %} {%if enabled %} opacity-100 {% else %} opacity-40 saturate-0 {% endif %}"
/>
<!-- tooltip -->
<span role="tooltip"
class="pointer-events-none absolute z-50 bottom-full left-1/2 -translate-x-1/2 mb-2 hidden group-hover/tt:block group-focus-visible/tt:block whitespace-nowrap rounded-md bg-stone-900 text-white text-xs px-2 py-1 shadow-lg">
{{ title|capitalize if title|lower != 'sugarfree' else 'Sugar' }}
<!-- little arrow -->
<span class="absolute top-full left-1/2 -translate-x-1/2 border-4 border-transparent border-t-stone-900"></span>
</span>
</span>
{%- endmacro -%}

Some files were not shown because too many files have changed in this diff