Compare commits: 4c2e716558...51ebf347ba (343 commits)

New file: `.claude/plans/flickering-gathering-wilkes.md` (+236 lines)

@@ -0,0 +1,236 @@

# Ticket Purchase Through Cart

## Context

Tickets (Ticket model) are currently created with state="reserved" immediately when a user clicks "Buy" (`POST /tickets/buy/`). They bypass the cart and checkout entirely — no cart display, no SumUp payment, no order linkage. The user wants tickets to flow through the cart exactly like products and calendar bookings: appear in the cart, go through checkout, get confirmed on payment. Login required. No reservation — if the event sells out before payment completes, the user gets refunded (admin handles refund; we show a notice).

## Current Flow vs Desired Flow

**Now:** Click Buy → Ticket created (state="reserved") → done (no cart, no payment)

**Desired:** Click Buy → Ticket created (state="pending", in cart) → Checkout → SumUp payment → Ticket confirmed

## Approach

Mirror the CalendarEntry pattern: CalendarEntry uses state="pending" to mean "in cart". We add state="pending" for Ticket. Pending tickets don't count toward availability (not allocated). At checkout, pending→reserved + linked to order. On payment, reserved→confirmed.

---

## Step 1: Update TicketDTO

**File:** `shared/contracts/dtos.py`

Add fields needed for cart display and page-grouping:
- `entry_id: int` (for linking back)
- `cost: Decimal` (ticket price — from ticket_type.cost or entry.ticket_price)
- `calendar_container_id: int | None` (for page-grouping in cart)
- `calendar_container_type: str | None`

Also add `ticket_count` and `ticket_total` to `CartSummaryDTO`.

## Step 2: Add ticket methods to CalendarService protocol

**File:** `shared/contracts/protocols.py`

```python
async def pending_tickets(
    self, session: AsyncSession, *, user_id: int,
) -> list[TicketDTO]: ...

async def claim_tickets_for_order(
    self, session: AsyncSession, order_id: int, user_id: int,
    page_post_id: int | None = None,
) -> None: ...

async def confirm_tickets_for_order(
    self, session: AsyncSession, order_id: int,
) -> None: ...
```

## Step 3: Implement in SqlCalendarService

**File:** `shared/services/calendar_impl.py`

- **`pending_tickets`**: Query `Ticket` where `user_id` matches, `state="pending"`, eager-load entry→calendar + ticket_type. Map to TicketDTO with cost from `ticket_type.cost` or `entry.ticket_price`.
- **`claim_tickets_for_order`**: UPDATE Ticket SET state="reserved", order_id=? WHERE user_id=? AND state="pending". If `page_post_id`, filter via entry→calendar→container.
- **`confirm_tickets_for_order`**: UPDATE Ticket SET state="confirmed" WHERE order_id=? AND state="reserved".

Update `_ticket_to_dto` to populate the new fields (entry_id, cost, calendar_container_id/type).

## Step 4: Add stubs

**File:** `shared/services/stubs.py`

Add no-op stubs returning `[]`/`None` for the 3 new methods.

## Step 5: Update SqlCartService

**File:** `shared/services/cart_impl.py`

In `cart_summary()`, also query pending tickets via `services.calendar.pending_tickets()` and include `ticket_count` + `ticket_total` in the returned `CartSummaryDTO`.
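
A sketch of the extended summary, assuming `CartSummaryDTO` is a dataclass; `summarize`, `item_count`, and the stand-in inputs are illustrative, only `ticket_count`/`ticket_total` come from the plan:

```python
from dataclasses import dataclass
from decimal import Decimal

@dataclass
class CartSummaryDTO:
    # existing fields elided; the two new ticket fields are appended
    item_count: int
    ticket_count: int
    ticket_total: Decimal

def summarize(product_items, pending_tickets) -> CartSummaryDTO:
    return CartSummaryDTO(
        item_count=len(product_items),
        ticket_count=len(pending_tickets),
        # sum only tickets that actually carry a cost
        ticket_total=sum((t.cost for t in pending_tickets if t.cost is not None),
                         Decimal("0")),
    )
```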

## Step 6: Update cart internal API

**File:** `cart/bp/cart/api.py`

Add `ticket_count` and `ticket_total` to the JSON summary response. Query via `services.calendar.pending_tickets()`.

## Step 7: Add ticket cart service functions

**File:** `cart/bp/cart/services/calendar_cart.py`

Add:

```python
async def get_ticket_cart_entries(session):
    ident = current_cart_identity()
    if ident["user_id"] is None:
        return []
    return await services.calendar.pending_tickets(session, user_id=ident["user_id"])


def ticket_total(tickets) -> float:
    # the None filter already excludes missing costs
    return sum(t.cost for t in tickets if t.cost is not None)
```

**File:** `cart/bp/cart/services/__init__.py` — export the new functions.

## Step 8: Update cart page grouping

**File:** `cart/bp/cart/services/page_cart.py`

In `get_cart_grouped_by_page()`:
- Fetch ticket cart entries via `get_ticket_cart_entries()`
- Attach tickets to page groups by `calendar_container_id` (same pattern as calendar entries)
- Add `ticket_count` and `ticket_total` to each group dict
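
The attach step can be sketched like this (a hypothetical `attach_tickets_to_groups` helper; the real group dicts carry more keys than shown):

```python
def attach_tickets_to_groups(groups: dict, tickets):
    """Bucket tickets into page groups keyed by calendar_container_id."""
    for t in tickets:
        g = groups.setdefault(t.calendar_container_id, {"tickets": []})
        g.setdefault("tickets", []).append(t)
    for g in groups.values():
        ts = g.get("tickets", [])
        g["ticket_count"] = len(ts)
        g["ticket_total"] = sum(t.cost or 0 for t in ts)
    return groups
```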

## Step 9: Modify ticket buy route

**File:** `events/bp/tickets/routes.py` — `buy_tickets()`

- **Require login**: If `ident["user_id"]` is None, return an error prompting sign-in
- **Create with state="pending"** instead of "reserved"
- **Remove availability check** at buy time (pending tickets are not allocated)
- Update the response template to say "added to cart" instead of "reserved"

## Step 10: Update availability count

**File:** `events/bp/tickets/services/tickets.py` — `get_available_ticket_count()`

Change from counting `state != "cancelled"` to counting `state.in_(("reserved", "confirmed", "checked_in"))`. This excludes "pending" (in-cart) tickets from the sold count.
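
The effect of the new filter as a pure-Python sketch (the real version is a SQL count; the `capacity` handling is an assumption):

```python
SOLD_STATES = {"reserved", "confirmed", "checked_in"}

def available_ticket_count(capacity: int, states: list[str]) -> int:
    # "pending" (in-cart) and "cancelled" tickets no longer reduce availability
    sold = sum(1 for s in states if s in SOLD_STATES)
    return max(capacity - sold, 0)
```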

## Step 11: Update buy form template

**File:** `events/templates/_types/tickets/_buy_form.html`

- If the user is not logged in, show a "Sign in to buy tickets" link instead of the buy form
- Keep the existing form for logged-in users

**File:** `events/templates/_types/tickets/_buy_result.html`

- Change "reserved" messaging to "added to cart"
- Add a link to the cart app
- Add the sold-out refund notice: "If the event sells out before payment, you will be refunded."

## Step 12: Update cart display templates

**File:** `shared/browser/templates/_types/cart/_cart.html`

In the `show_cart()` macro:
- Add an empty check: `{% if not cart and not calendar_cart_entries and not ticket_cart_entries %}`
- Add a tickets section after calendar bookings (same style)
- Add the sold-out notice under the tickets section

In the `summary()` and `cart_grand_total()` macros:
- Include ticket_total in the grand total calculation

**File:** `shared/browser/templates/_types/cart/_mini.html`

- Add the ticket count to the badge total

## Step 13: Update cart overview template

**File:** `cart/templates/_types/cart/overview/_main_panel.html`

- Add a ticket count badge alongside the product and calendar count badges
## Step 14: Update checkout flow

**File:** `cart/bp/cart/global_routes.py` — `checkout()`

- Fetch pending tickets: `get_ticket_cart_entries(g.s)`
- Include the ticket total in the cart_total calculation
- Include `not ticket_entries` in the empty check
- Pass tickets to `create_order_from_cart()` (or claim them separately afterwards)

**File:** `cart/bp/cart/page_routes.py` — `page_checkout()`

Same changes, scoped to the page.

**File:** `cart/bp/cart/services/checkout.py` — `create_order_from_cart()`

- Accept a new param `ticket_total: float` (added to the order total)
- After claiming calendar entries, also claim tickets: `services.calendar.claim_tickets_for_order()`
- Include tickets in `resolve_page_config` page detection

## Step 15: Update payment confirmation

**File:** `cart/bp/cart/services/check_sumup_status.py`

When status == "PAID", also call `services.calendar.confirm_tickets_for_order(session, order.id)` alongside `confirm_entries_for_order`.
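
A minimal async sketch of that branch; the `services.calendar` method names follow the plan, everything else is a stand-in:

```python
async def on_sumup_status(session, order, status, services):
    """On a PAID SumUp status, confirm both calendar entries and tickets."""
    if status == "PAID":
        await services.calendar.confirm_entries_for_order(session, order.id)
        await services.calendar.confirm_tickets_for_order(session, order.id)
```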

## Step 16: Update checkout return page

**File:** `cart/bp/cart/global_routes.py` — `checkout_return()`

- Also fetch tickets for the order: `services.calendar.user_tickets()` filtered by order_id (or add a `get_tickets_for_order` method)

**File:** `shared/browser/templates/_types/order/_calendar_items.html`

- Add a tickets section showing ordered/confirmed tickets.

## Step 17: Sync shared files

Copy all changed shared files to the blog/, cart/, events/, market/ submodules.

---

## Files Modified (Summary)

### Shared contracts/services:
- `shared/contracts/dtos.py` — update TicketDTO, CartSummaryDTO
- `shared/contracts/protocols.py` — add 3 methods to CalendarService
- `shared/services/calendar_impl.py` — implement 3 new methods, update _ticket_to_dto
- `shared/services/stubs.py` — add stubs
- `shared/services/cart_impl.py` — include tickets in cart_summary

### Cart app:
- `cart/bp/cart/api.py` — add ticket info to summary API
- `cart/bp/cart/services/calendar_cart.py` — add ticket functions
- `cart/bp/cart/services/__init__.py` — export new functions
- `cart/bp/cart/services/page_cart.py` — include tickets in grouped view
- `cart/bp/cart/global_routes.py` — include tickets in checkout + return
- `cart/bp/cart/page_routes.py` — include tickets in page checkout
- `cart/bp/cart/services/checkout.py` — include ticket total in order
- `cart/bp/cart/services/check_sumup_status.py` — confirm tickets on payment

### Events app:
- `events/bp/tickets/routes.py` — require login, state="pending"
- `events/bp/tickets/services/tickets.py` — update availability count
- `events/templates/_types/tickets/_buy_form.html` — login gate
- `events/templates/_types/tickets/_buy_result.html` — "added to cart" messaging

### Templates (shared):
- `shared/browser/templates/_types/cart/_cart.html` — ticket section + totals
- `shared/browser/templates/_types/cart/_mini.html` — ticket count in badge
- `cart/templates/_types/cart/overview/_main_panel.html` — ticket badge
- `shared/browser/templates/_types/order/_calendar_items.html` — ticket section

## Verification

1. Go to an event entry with tickets configured (state="confirmed", ticket_price set)
2. Click "Buy Tickets" while not logged in → should see "sign in" prompt
3. Log in, click "Buy Tickets" → ticket created with state="pending"
4. Navigate to cart → ticket appears alongside any products/bookings
5. Proceed to checkout → SumUp payment page
6. Complete payment → ticket state becomes "confirmed"
7. Check cart mini badge shows ticket count
8. Verify availability count doesn't include pending tickets

New file: `.claude/plans/glittery-discovering-kahn.md` (+177 lines)

@@ -0,0 +1,177 @@

# Sexp Fragment Protocol: Component Defs Between Services

## Context

Fragment endpoints return raw sexp source (e.g., `(~blog-nav-wrapper :items ...)`). The consuming service embeds this in its page sexp, which the client evaluates. But blog-specific components like `~blog-nav-wrapper` are only in blog's `_COMPONENT_ENV` — not in market's. So market's `client_components_tag()` never sends them to the client, causing "Unknown component" errors.

The fix: transfer component definitions alongside fragments. Services tell the provider what they already have; the provider sends only what's missing. The consuming service registers received defs into its `_COMPONENT_ENV` so they're included in `client_components_tag()` output for the client.

## Approach: Structured Sexp Request/Response

Replace the current GET + `X-Fragment-Request` header protocol with POST + sexp body. This aligns with the vision in `docs/sexpr-internal-protocol-first.md`.

### Request format (POST body)
```scheme
(fragment-request
  :type "nav-tree"
  :params (:app-name "market" :path "/")
  :components (~blog-nav-wrapper ~blog-nav-item-link ~header-row-sx ...))
```

`:components` lists component names already in the consumer's `_COMPONENT_ENV`. The provider skips these.

### Response format
```scheme
(fragment-response
  :defs ((defcomp ~blog-nav-wrapper (&key ...) ...) (defcomp ~blog-nav-item-link ...))
  :content (~blog-nav-wrapper :items ...))
```

`:defs` contains only components the consumer doesn't have. `:content` is the fragment sexp (same as the current response body).
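
The provider-side diff is plain set logic; a sketch assuming the env is a name-to-source mapping (the real `_COMPONENT_ENV` maps names to `Component` objects):

```python
def missing_defs(provider_env: dict[str, str], consumer_has: set[str]) -> list[str]:
    """Return serialized defs only for components the consumer did not declare."""
    return [src for name, src in provider_env.items() if name not in consumer_has]
```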

## Changes

### 1. `shared/infrastructure/fragments.py` — Client side

**`fetch_fragment()`**: Switch from GET to POST with a sexp body.

- Build the request body using `sexp_call`:
  ```python
  from shared.sexp.helpers import sexp_call, SexpExpr
  from shared.sexp.jinja_bridge import _COMPONENT_ENV

  comp_names = [k for k in _COMPONENT_ENV if k.startswith("~")]
  body = sexp_call("fragment-request",
                   type=fragment_type,
                   params=params or {},
                   components=SexpExpr("(" + " ".join(comp_names) + ")"))
  ```
- POST to the same URL, body as `text/sexp`, keep the `X-Fragment-Request` header for backward compat
- Parse the response: extract `:defs` and `:content` from the sexp response
- Register defs into `_COMPONENT_ENV` via `register_components()`
- Return `:content` wrapped as `SexpExpr`

**New helper `_parse_fragment_response(text)`**:
- `parse()` the response sexp
- Extract keyword args (reuse the keyword-extraction pattern from `evaluator.py`)
- Return a `(defs_source, content_source)` tuple
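
The keyword-extraction step, sketched over an already-parsed form; representing the parsed sexp as a Python list with `:keyword` strings is an assumption about the parser's output:

```python
def extract_kwargs(form: list):
    """Split (head :kw1 v1 :kw2 v2 ...) into the head symbol and a kwargs dict."""
    head, rest = form[0], form[1:]
    kwargs, i = {}, 0
    while i < len(rest):
        token = rest[i]
        if isinstance(token, str) and token.startswith(":") and i + 1 < len(rest):
            kwargs[token[1:]] = rest[i + 1]
            i += 2
        else:
            i += 1
    return head, kwargs
```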

### 2. `shared/sexp/helpers.py` — Response builder

**New `fragment_response(content, request_text)`**:

```python
def fragment_response(content: str, request_text: str) -> str:
    """Build a structured fragment response with missing component defs."""
    from .parser import parse, serialize
    from .types import Keyword, Component
    from .jinja_bridge import _COMPONENT_ENV

    # Parse request to get :components list
    req = parse(request_text)
    loaded = set()
    # extract :components keyword value
    ...

    # Diff against _COMPONENT_ENV, serialize missing defs
    defs_parts = []
    for key, val in _COMPONENT_ENV.items():
        if not isinstance(val, Component):
            continue
        if key in loaded or f"~{val.name}" in loaded:
            continue
        defs_parts.append(_serialize_defcomp(val))

    defs_sexp = "(" + " ".join(defs_parts) + ")" if defs_parts else "nil"
    return sexp_call("fragment-response",
                     defs=SexpExpr(defs_sexp),
                     content=SexpExpr(content))
```

### 3. Fragment endpoints — All services

**Generic change in each `bp/fragments/routes.py`**: Update the route handler to accept POST, read the sexp body, and use `fragment_response()` for the response.

The `get_fragment` handler becomes:
```python
@bp.route("/<fragment_type>", methods=["GET", "POST"])
async def get_fragment(fragment_type: str):
    handler = _handlers.get(fragment_type)
    if handler is None:
        return Response("", status=200, content_type="text/sexp")
    content = await handler()

    # Structured sexp protocol (POST with sexp body)
    request_body = await request.get_data(as_text=True)
    if request_body and request.content_type == "text/sexp":
        from shared.sexp.helpers import fragment_response
        body = fragment_response(content, request_body)
        return Response(body, status=200, content_type="text/sexp")

    # Legacy GET fallback
    return Response(content, status=200, content_type="text/sexp")
```

Since all fragment endpoints follow the identical `_handlers` + `get_fragment` pattern, we can extract this into a shared helper in `fragments.py` or a new `shared/infrastructure/fragment_endpoint.py`.

### 4. Extract shared fragment endpoint helper

To avoid touching every service's fragment routes, create a shared blueprint factory:

**`shared/infrastructure/fragment_endpoint.py`**:
```python
def create_fragment_blueprint(handlers: dict) -> Blueprint:
    """Create a fragment endpoint blueprint with sexp protocol support."""
    bp = Blueprint("fragments", __name__, url_prefix="/internal/fragments")

    @bp.before_request
    async def _require_fragment_header():
        if not request.headers.get(FRAGMENT_HEADER):
            return Response("", status=403)

    @bp.route("/<fragment_type>", methods=["GET", "POST"])
    async def get_fragment(fragment_type: str):
        handler = handlers.get(fragment_type)
        if handler is None:
            return Response("", status=200, content_type="text/sexp")
        content = await handler()

        # Sexp protocol: POST with structured request/response
        if request.method == "POST" and request.content_type == "text/sexp":
            request_body = await request.get_data(as_text=True)
            from shared.sexp.helpers import fragment_response
            body = fragment_response(content, request_body)
            return Response(body, status=200, content_type="text/sexp")

        return Response(content, status=200, content_type="text/sexp")

    return bp
```

Then each service's `register()` just returns `create_fragment_blueprint(_handlers)`. This is a small refactor since they all duplicate the same boilerplate today.

## Files to modify

| File | Change |
|------|--------|
| `shared/infrastructure/fragments.py` | POST sexp body, parse response, register defs |
| `shared/sexp/helpers.py` | `fragment_response()` builder, `_serialize_defcomp()` |
| `shared/infrastructure/fragment_endpoint.py` | **New** — shared blueprint factory |
| `blog/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
| `market/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
| `events/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
| `cart/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
| `account/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
| `orders/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
| `federation/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
| `relations/bp/fragments/routes.py` | Use `create_fragment_blueprint` |
## Verification

1. Start blog + market services: `./dev.sh blog market`
2. Load a market page — should fetch nav-tree from blog with the sexp protocol
3. Check market logs: no "Unknown component" errors
4. Inspect page source: `client_components_tag()` output includes `~blog-nav-wrapper` etc.
5. Cross-domain sx-get navigation (blog → market) works without reload
6. Run sexp tests: `python3 -m pytest shared/sexp/tests/ -x -q`
7. Second page load: the `:components` list in the request includes the blog nav components, and the response `:defs` is empty

New file: `.claude/plans/glittery-zooming-hummingbird.md` (+425 lines)

@@ -0,0 +1,425 @@

# Phase 6: Full Cross-App Decoupling via Glue Services

## Context

Phases 1-5 are complete. All cross-domain FK constraints have been dropped (except `OrderItem.product_id` and `CartItem.product_id`/`market_place_id`/`user_id`, kept as pragmatic exceptions). Cross-domain **writes** go through glue services.

However, **25+ cross-app model imports** remain — apps still `from blog.models.ghost_content import Post`, `from market.models.market import CartItem`, etc. This means every app needs every other app's code on disk to start, making separate databases or independent deployment impossible.

**Goal:** Eliminate all cross-app model imports. Every app only imports from its own `models/`, from `shared/`, and from `glue/`. Cross-domain access goes through glue services. After this phase, each app could theoretically run against its own database.

---

## Inventory of Cross-App Imports to Eliminate

### Cart app imports (9 files, 4 foreign models):
| File | Import | Usage |
|------|--------|-------|
| `cart/bp/cart/api.py` | `market.models.market.CartItem` | Query cart items |
| `cart/bp/cart/api.py` | `market.models.market_place.MarketPlace` | Filter by container |
| `cart/bp/cart/api.py` | `events.models.calendars.CalendarEntry, Calendar` | Query pending entries |
| `cart/bp/cart/api.py` | `blog.models.ghost_content.Post` | Resolve page slug |
| `cart/bp/cart/services/checkout.py` | `market.models.market.Product, CartItem` | Find cart items, validate products |
| `cart/bp/cart/services/checkout.py` | `events.models.calendars.CalendarEntry, Calendar` | Resolve page containers |
| `cart/bp/cart/services/checkout.py` | `market.models.market_place.MarketPlace` | Get container_id |
| `cart/bp/cart/services/page_cart.py` | `market.models.market.CartItem` | Query page cart |
| `cart/bp/cart/services/page_cart.py` | `market.models.market_place.MarketPlace` | Join for container |
| `cart/bp/cart/services/page_cart.py` | `events.models.calendars.CalendarEntry, Calendar` | Query page entries |
| `cart/bp/cart/services/page_cart.py` | `blog.models.ghost_content.Post` | Batch-load posts |
| `cart/bp/cart/services/get_cart.py` | `market.models.market.CartItem` | Query cart items |
| `cart/bp/cart/services/calendar_cart.py` | `events.models.calendars.CalendarEntry` | Query pending entries |
| `cart/bp/cart/services/clear_cart_for_order.py` | `market.models.market.CartItem` | Soft-delete items |
| `cart/bp/cart/services/clear_cart_for_order.py` | `market.models.market_place.MarketPlace` | Filter by page |
| `cart/bp/orders/routes.py` | `market.models.market.Product` | Join for search |
| `cart/bp/order/routes.py` | `market.models.market.Product` | Load product details |
| `cart/app.py` | `blog.models.ghost_content.Post` | Page slug hydration |

### Blog app imports (8 files, 3 foreign models):
| File | Import | Usage |
|------|--------|-------|
| `blog/bp/post/admin/routes.py` | `cart.models.page_config.PageConfig` (3 places) | Load/update page config |
| `blog/bp/post/admin/routes.py` | `events.models.calendars.Calendar` (3 places) | Query calendars |
| `blog/bp/post/admin/routes.py` | `market.models.market_place.MarketPlace` (3 places) | Query/create/delete markets |
| `blog/bp/post/services/markets.py` | `market.models.market_place.MarketPlace` | Create/delete markets |
| `blog/bp/post/services/markets.py` | `cart.models.page_config.PageConfig` | Check feature flag |
| `blog/bp/post/services/entry_associations.py` | `events.models.calendars.CalendarEntry, CalendarEntryPost, Calendar` | Post-entry associations |
| `blog/bp/post/routes.py` | `events.models.calendars.Calendar` | Page context |
| `blog/bp/post/routes.py` | `market.models.market_place.MarketPlace` | Page context |
| `blog/bp/blog/ghost_db.py` | `cart.models.page_config.PageConfig` | Query page configs |
| `blog/bp/blog/ghost/ghost_sync.py` | `cart.models.page_config.PageConfig` | Sync page config |
| `blog/bp/blog/services/posts_data.py` | `events.models.calendars.CalendarEntry, CalendarEntryPost` | Fetch associated entries |

### Events app imports (5 files, 3 foreign models):
| File | Import | Usage |
|------|--------|-------|
| `events/app.py` | `blog.models.ghost_content.Post` | Page slug hydration |
| `events/app.py` | `market.models.market_place.MarketPlace` | Context processor |
| `events/bp/markets/services/markets.py` | `market.models.market_place.MarketPlace` | Create/delete markets |
| `events/bp/markets/services/markets.py` | `blog.models.ghost_content.Post` | Validate post exists |
| `events/bp/markets/routes.py` | `market.models.market_place.MarketPlace` | Query/delete markets |
| `events/bp/calendars/services/calendars.py` | `blog.models.ghost_content.Post` | Validate post exists |
| `events/bp/calendar_entry/services/post_associations.py` | `blog.models.ghost_content.Post` | Manage post-entry assocs |
| `events/bp/payments/routes.py` | `cart.models.page_config.PageConfig` | Load/update SumUp config |
### Market app imports (1 file):

| File | Import | Usage |
|------|--------|-------|
| `market/app.py` | `blog.models.ghost_content.Post` | Page slug hydration |

### Glue layer imports (2 files):

| File | Import | Usage |
|------|--------|-------|
| `glue/services/cart_adoption.py` | `market.models.market.CartItem` | Adopt cart items |
| `glue/services/cart_adoption.py` | `events.models.calendars.CalendarEntry` | Adopt entries |
| `glue/services/order_lifecycle.py` | `events.models.calendars.CalendarEntry, Calendar` | Claim/confirm entries |
---

## Design Decisions

1. **Glue services return ORM objects** (not dicts) when the model is standalone — PageConfig, MarketPlace, Calendar, CalendarEntry. This avoids template changes and keeps SQLAlchemy lazy-load working.

2. **Glue services for Post return dicts** — other apps only need `{id, slug, title, is_page, feature_image}`. Returning the full ORM object would couple them to the blog schema.

3. **CartItem stays in `market/models/market.py`** — it has FKs to `products.id`, `market_places.id`, and `users.id`, plus relationships to `Product`, `MarketPlace`, and `User`. Moving it to cart/ would just reverse the cross-app import direction. Instead, cart reads CartItem through glue.

4. **OrderItem.product relationship uses a string forward-ref** — already works via SQLAlchemy string resolution as long as Product is registered in the mapper. Glue setup handles this.

5. **Glue services are allowed to import from any app's models** — that's the glue layer's job. Apps call glue; glue touches models.

6. **blog/bp/post/services/markets.py and entry_associations.py move to glue** — these are pure cross-domain CRUD (blog writes to MarketPlace, blog reads CalendarEntry). They belong in glue.

---
## Step 1: Glue service for pages (Post access)

New file: `glue/services/pages.py`

Provides dict-based Post access for non-blog apps:

```python
async def get_page_by_slug(session, slug) -> dict | None:
    """Return {id, slug, title, is_page, feature_image, ...} or None."""

async def get_page_by_id(session, post_id) -> dict | None:
    """Return page dict by id."""

async def get_pages_by_ids(session, post_ids) -> dict[int, dict]:
    """Batch-load pages. Returns {id: page_dict}."""

async def page_exists(session, post_id) -> bool:
    """Check if post exists (for validation before creating calendars/markets)."""

async def is_page(session, post_id) -> bool:
    """Check if post exists and is_page=True."""

async def search_posts(session, query, page=1, per_page=10) -> tuple[list[dict], int]:
    """Search posts by title (for events post_associations)."""
```

All functions import `from blog.models.ghost_content import Post` internally.

**Files changed:**

- `market/app.py` — replace `from blog.models.ghost_content import Post` with `from glue.services.pages import get_page_by_slug`
- `events/app.py` — same
- `cart/app.py` — same
- `cart/bp/cart/api.py` — replace Post import with `from glue.services.pages import get_page_by_slug`
- `cart/bp/cart/services/page_cart.py` — replace Post import with `from glue.services.pages import get_pages_by_ids`
- `events/bp/calendars/services/calendars.py` — replace `from blog.models.ghost_content import Post` with `from glue.services.pages import page_exists, is_page`
- `events/bp/markets/services/markets.py` — replace `from blog.models.ghost_content import Post` with `from glue.services.pages import page_exists, is_page`

---
## Step 2: Glue service for page config

New file: `glue/services/page_config.py`

```python
async def get_page_config(session, post_id) -> PageConfig | None:
    """Load PageConfig for a page."""

async def get_or_create_page_config(session, post_id) -> PageConfig:
    """Load or create PageConfig. Emits container.child_attached if created."""

async def get_page_configs_by_ids(session, post_ids) -> dict[int, PageConfig]:
    """Batch-load PageConfigs by container_id."""
```

Imports `from cart.models.page_config import PageConfig` internally.
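
The get-or-create behavior is a guarded insert plus one event emission. The sketch below shows the control flow only, with an in-memory dict standing in for the session and a plain callback standing in for the `container.child_attached` emitter; all names are illustrative:

```python
# Control-flow sketch of get_or_create_page_config. `store` stands in for the
# database; `on_created` stands in for the container.child_attached emitter.
def get_or_create_config(store: dict, post_id: int, on_created=None):
    config = store.get(post_id)
    created = config is None
    if created:
        config = {"container_id": post_id}  # stand-in for a new PageConfig row
        store[post_id] = config
        if on_created is not None:
            on_created(post_id)  # emit container.child_attached exactly once
    return config, created
```

The point of returning the `created` flag is that the event fires exactly once per config, never on subsequent loads.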

**Files changed:**

- `blog/bp/post/admin/routes.py` — replace `from cart.models.page_config import PageConfig` with glue service calls
- `blog/bp/post/services/markets.py` — replace PageConfig import
- `blog/bp/blog/ghost_db.py` — replace PageConfig import
- `blog/bp/blog/ghost/ghost_sync.py` — replace PageConfig import
- `events/bp/payments/routes.py` — replace PageConfig import
- `cart/bp/cart/services/checkout.py` — no change; its PageConfig import stays (same app)

---
## Step 3: Glue service for calendars (events access from blog)

New file: `glue/services/calendars.py`

```python
async def get_calendars_for_page(session, post_id) -> list[Calendar]:
    """Return active calendars for a page."""

async def get_calendar_entries_for_posts(session, post_ids) -> dict[int, list]:
    """Fetch confirmed CalendarEntries associated with posts (via CalendarEntryPost).
    Returns {post_id: [entry, ...]}."""
```

Move and adapt from `blog/bp/post/services/entry_associations.py`:

```python
async def toggle_entry_association(session, post_id, entry_id) -> tuple[bool, str | None]: ...
async def get_post_entry_ids(session, post_id) -> set[int]: ...
async def get_associated_entries(session, post_id, page=1, per_page=10) -> dict: ...
```

These functions import from `events.models.calendars` internally.
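
The `{post_id: [entry, ...]}` return shape of `get_calendar_entries_for_posts` is a plain group-by over the association rows. A sketch of just that step, with tuples standing in for the CalendarEntryPost join results:

```python
from collections import defaultdict

def group_entries_by_post(rows):
    """Group (post_id, entry) pairs into {post_id: [entry, ...]}."""
    grouped = defaultdict(list)
    for post_id, entry in rows:
        grouped[post_id].append(entry)
    return dict(grouped)
```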

**Files changed:**

- `blog/bp/post/routes.py` — replace `from events.models.calendars import Calendar` + `from market.models.market_place import MarketPlace` with glue service calls
- `blog/bp/post/admin/routes.py` — replace Calendar imports with glue service calls
- `blog/bp/post/services/entry_associations.py` — **delete file**, moved to glue
- `blog/bp/blog/services/posts_data.py` — replace `from events.models.calendars import CalendarEntry, CalendarEntryPost` with glue service call

---
## Step 4: Glue service for marketplaces

New file: `glue/services/marketplaces.py`

```python
async def get_marketplaces_for_page(session, post_id) -> list[MarketPlace]:
    """Return active marketplaces for a page."""

async def create_marketplace(session, post_id, name) -> MarketPlace:
    """Create marketplace (validates page exists via pages service)."""

async def soft_delete_marketplace(session, post_slug, market_slug) -> bool:
    """Soft-delete a marketplace."""
```

Move the logic from `blog/bp/post/services/markets.py` and `events/bp/markets/services/markets.py` (they're nearly identical).
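
Since creation validates the page through the pages service first, the merged function is mostly a guard plus an insert. A control-flow sketch with injected callables; both callables are stand-ins, not real APIs:

```python
# Sketch of the create_marketplace validation flow. `page_exists` stands in
# for glue.services.pages.page_exists; `insert` stands in for the actual
# MarketPlace row creation.
async def create_marketplace_sketch(post_id, name, *, page_exists, insert):
    if not await page_exists(post_id):
        raise ValueError(f"post {post_id} does not exist")
    return await insert(post_id=post_id, name=name)
```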

**Files changed:**

- `blog/bp/post/services/markets.py` — **delete file**, moved to glue
- `blog/bp/post/admin/routes.py` — replace MarketPlace imports + service calls with glue
- `blog/bp/post/routes.py` — replace MarketPlace import with glue service
- `events/bp/markets/services/markets.py` — **delete file**, moved to glue
- `events/bp/markets/routes.py` — replace MarketPlace import, use glue
- `events/app.py` — replace MarketPlace import with glue service

---
## Step 5: Glue service for cart items (market model access from cart)

New file: `glue/services/cart_items.py`

```python
async def get_cart_items(session, user_id=None, session_id=None, *, page_post_id=None) -> list[CartItem]:
    """Get cart items for identity, optionally scoped to page."""

async def find_or_create_cart_item(session, product_id, user_id, session_id) -> CartItem | None:
    """Find existing or create new cart item. Returns None if product missing."""

async def clear_cart_for_order(session, order, *, page_post_id=None) -> None:
    """Soft-delete cart items for order identity."""

async def get_calendar_cart_entries(session, user_id=None, session_id=None, *, page_post_id=None) -> list[CalendarEntry]:
    """Get pending calendar entries for identity, optionally scoped to page."""
```

Imports `CartItem`, `Product`, `MarketPlace` from market, and `CalendarEntry`, `Calendar` from events internally.
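
All four functions share one scoping rule: match on `user_id` when the caller is logged in, fall back to `session_id` otherwise, and optionally narrow to a page. A sketch of that rule in isolation, with plain dicts standing in for CartItem rows; the helper name is an assumption:

```python
def matches_identity(item: dict, user_id=None, session_id=None, *, page_post_id=None) -> bool:
    """Decide whether a cart row belongs to the given identity/page scope."""
    if user_id is not None:
        owned = item.get("user_id") == user_id
    else:
        owned = item.get("session_id") == session_id
    if page_post_id is not None:
        return owned and item.get("page_post_id") == page_post_id
    return owned
```

In the real service the same predicate would be expressed as SQLAlchemy filter clauses rather than evaluated in Python.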

**Files changed:**

- `cart/bp/cart/services/get_cart.py` — replace CartItem import with glue call
- `cart/bp/cart/services/calendar_cart.py` — replace CalendarEntry import with glue call
- `cart/bp/cart/services/clear_cart_for_order.py` — replace CartItem/MarketPlace imports with glue call
- `cart/bp/cart/services/checkout.py` — replace CartItem/Product/MarketPlace/CalendarEntry/Calendar imports with glue calls
- `cart/bp/cart/api.py` — replace CartItem/MarketPlace/CalendarEntry/Calendar imports with glue calls
- `cart/bp/cart/services/page_cart.py` — replace CartItem/MarketPlace/CalendarEntry/Calendar imports with glue calls

---
## Step 6: Glue service for products (market access from cart orders)

New file: `glue/services/products.py`

```python
async def get_product(session, product_id) -> Product | None:
    """Get product by ID."""
```

This is minimal — only needed by `cart/bp/order/routes.py` and `cart/bp/orders/routes.py` for search/display. However, the `OrderItem.product` relationship already resolves via string forward-ref. We only need Product for the join-based search in the orders listing.

**Files changed:**

- `cart/bp/orders/routes.py` — replace `from market.models.market import Product` with glue import or use the `OrderItem.product` relationship
- `cart/bp/order/routes.py` — replace `from market.models.market import Product` (already uses the `OrderItem.product` relationship for display)

---
## Step 7: Glue service for post associations (events-side)

Move `events/bp/calendar_entry/services/post_associations.py` into glue:

New additions to `glue/services/pages.py` (or a separate file `glue/services/post_associations.py`):

```python
async def add_post_to_entry(session, entry_id, post_id) -> tuple[bool, str | None]: ...
async def remove_post_from_entry(session, entry_id, post_id) -> tuple[bool, str | None]: ...
async def get_entry_posts(session, entry_id) -> list[dict]: ...
async def search_posts_for_entry(session, query, page=1, per_page=10) -> tuple[list[dict], int]: ...
```

**Files changed:**

- `events/bp/calendar_entry/services/post_associations.py` — **delete file**, moved to glue
- Update any routes in events that call this service to use glue instead

---
## Step 8: Update glue model registration

`glue/setup.py` needs to ensure all models from all apps are registered in SQLAlchemy's mapper when starting any app. This is because string-based relationship references (like `OrderItem.product → "Product"`) need the target model class registered.

```python
def register_models():
    """Import all model modules to register them with SQLAlchemy's mapper."""
    # These are already imported by each app, but ensure completeness:
    try:
        import blog.models.ghost_content  # noqa: F401
    except ImportError:
        pass
    try:
        import market.models.market  # noqa: F401
        import market.models.market_place  # noqa: F401
    except ImportError:
        pass
    try:
        import cart.models.order  # noqa: F401
        import cart.models.page_config  # noqa: F401
    except ImportError:
        pass
    try:
        import events.models.calendars  # noqa: F401
    except ImportError:
        pass
```

Each app's `app.py` calls `register_models()` at startup. The try/except guards handle Docker, where only one app's code is present — but since all apps share `glue/` and the DB, all model files need to be importable.

**Note:** In Docker, each container only has its own app + shared + glue. For glue services that import from other apps' models, those models must be available. This means either:

- (a) Include all model files in each container (symlinks or copies), or
- (b) Have glue services that import other apps' models use try/except at import time

Since all apps already share one DB and all model files are available in development, option (a) is cleaner for production. Alternatively, the current Docker setup could be extended to include cross-app model files in each image.

---
## Step 9: Update existing glue services

**`glue/services/cart_adoption.py`** — already imports from market and events (correct — this is glue's job). No change needed.

**`glue/services/order_lifecycle.py`** — already imports from events. No change needed.

---
## Step 10: Clean up dead imports and update app.py files

After all glue services are wired:

- `cart/app.py` — remove `from blog.models.ghost_content import Post`, use `from glue.services.pages import get_page_by_slug`
- `market/app.py` — remove `from blog.models.ghost_content import Post`, use `from glue.services.pages import get_page_by_slug`
- `events/app.py` — remove `from blog.models.ghost_content import Post` and `from market.models.market_place import MarketPlace`
- Remove any now-empty cross-app model directories if they exist

---
## Files Summary

| Repo | File | Change |
|------|------|--------|
| **glue** | `services/pages.py` | **NEW** — Post access (slug, id, exists, search) |
| **glue** | `services/page_config.py` | **NEW** — PageConfig CRUD |
| **glue** | `services/calendars.py` | **NEW** — Calendar queries + entry associations (from blog) |
| **glue** | `services/marketplaces.py` | **NEW** — MarketPlace CRUD (from blog+events) |
| **glue** | `services/cart_items.py` | **NEW** — CartItem/CalendarEntry queries for cart |
| **glue** | `services/products.py` | **NEW** — Product access for cart orders |
| **glue** | `services/post_associations.py` | **NEW** — Post-CalendarEntry associations (from events) |
| **glue** | `setup.py` | Add `register_models()` |
| **cart** | `app.py` | Replace Post import with glue |
| **cart** | `bp/cart/api.py` | Replace all 4 cross-app imports with glue |
| **cart** | `bp/cart/services/checkout.py` | Replace cross-app imports with glue |
| **cart** | `bp/cart/services/page_cart.py` | Replace all cross-app imports with glue |
| **cart** | `bp/cart/services/get_cart.py` | Replace CartItem import with glue |
| **cart** | `bp/cart/services/calendar_cart.py` | Replace CalendarEntry import with glue |
| **cart** | `bp/cart/services/clear_cart_for_order.py` | Replace CartItem/MarketPlace with glue |
| **cart** | `bp/orders/routes.py` | Replace Product import with glue |
| **cart** | `bp/order/routes.py` | Replace Product import with glue |
| **blog** | `bp/post/admin/routes.py` | Replace PageConfig/Calendar/MarketPlace with glue |
| **blog** | `bp/post/routes.py` | Replace Calendar/MarketPlace with glue |
| **blog** | `bp/post/services/entry_associations.py` | **DELETE** — moved to `glue/services/calendars.py` |
| **blog** | `bp/post/services/markets.py` | **DELETE** — moved to `glue/services/marketplaces.py` |
| **blog** | `bp/blog/ghost_db.py` | Replace PageConfig import with glue |
| **blog** | `bp/blog/ghost/ghost_sync.py` | Replace PageConfig import with glue |
| **blog** | `bp/blog/services/posts_data.py` | Replace CalendarEntry/CalendarEntryPost with glue |
| **events** | `app.py` | Replace Post + MarketPlace imports with glue |
| **events** | `bp/markets/services/markets.py` | **DELETE** — moved to `glue/services/marketplaces.py` |
| **events** | `bp/markets/routes.py` | Replace MarketPlace import, use glue |
| **events** | `bp/calendars/services/calendars.py` | Replace Post import with glue |
| **events** | `bp/calendar_entry/services/post_associations.py` | **DELETE** — moved to `glue/services/post_associations.py` |
| **events** | `bp/payments/routes.py` | Replace PageConfig import with glue |
| **market** | `app.py` | Replace Post import with glue |

---
## Implementation Order

1. **Step 1** (pages.py) — unlocks Steps 2-4, which depend on page validation
2. **Step 2** (page_config.py) — independent after Step 1
3. **Steps 3-4** (calendars.py, marketplaces.py) — can be done in parallel; both use pages.py
4. **Step 5** (cart_items.py) — depends on Steps 1 and 3 for calendar queries
5. **Step 6** (products.py) — independent
6. **Step 7** (post_associations.py) — independent, uses pages.py
7. **Steps 8-10** (registration, cleanup) — after all services exist

---
## What's NOT changing

- **CartItem stays in `market/models/market.py`** — moving it creates equal or worse coupling
- **OrderItem stays in `cart/models/order.py`** with `product_id` FK — pragmatic exception
- **OrderItem.product_id FK** — kept; the denormalized `product_title` makes it non-critical
- **CartItem.product_id FK** — kept, same DB
- **CartItem.market_place_id FK** — kept, same DB
- **CartItem.user_id FK** — kept, shared model
- **Internal HTTP APIs** (cart/summary, coop/*, events/*) — not changing
- **`shared/` models** (User, MagicLink, etc.) — shared across all apps by design

---
## Docker Consideration

For glue services to work in Docker (single app per container), model files from other apps must be importable. Options:

1. **Copy model files** into each Docker image during build (just the `models/` dirs)
2. **Use try/except** in glue services at import time (degrade gracefully)
3. **Mount a shared volume** with all model files

Recommend option 2 for now — glue services that can't import a model simply raise ImportError at call time, which only happens if the service is called from the wrong app (shouldn't happen in practice).
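
Option 2 amounts to moving the cross-app import inside the function body, so an import failure surfaces only when the service is actually called. A sketch of the guard; the function name is illustrative:

```python
def load_marketplace_model():
    """Resolve the MarketPlace model lazily, so this module imports cleanly
    in containers that do not ship the market app's code."""
    try:
        from market.models.market_place import MarketPlace  # cross-app import
    except ImportError as exc:
        raise ImportError(
            "market models are unavailable in this container; "
            "this glue service cannot be called from here"
        ) from exc
    return MarketPlace
```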

---

## Verification

1. `grep -r "from blog\.models" cart/ market/ events/` — should return zero results (Post imports live only in blog/ and glue/)
2. `grep -r "from market\.models" blog/ cart/ events/` — should return zero results (only in market/ and glue/)
3. `grep -r "from cart\.models" blog/ market/ events/` — should return zero results (only in cart/ and glue/)
4. `grep -r "from events\.models" blog/ cart/ market/` — should return zero results (only in events/ and glue/)
5. All 4 apps start without import errors
6. Checkout flow works end-to-end
7. Blog admin: can toggle features, create/delete markets, manage calendar entries
8. Events admin: can create calendars, manage markets, configure payments
9. Market app: markets listing page loads correctly

---

`.claude/plans/hazy-sniffing-sphinx.md` — new file (325 lines)
# Split Cart into Microservices

## Context
The cart app currently owns too much: CartItem, Order/OrderItem, PageConfig, ContainerRelation, plus all checkout/payment logic. We're splitting it into 4 pieces:

1. **Relations service** — internal only, owns ContainerRelation
2. **Likes service** — internal only, unified generic likes replacing ProductLike + PostLike
3. **PageConfig → blog** — move to blog (which already owns pages)
4. **Orders service** — public (orders.rose-ash.com), owns Order/OrderItem + SumUp checkout

After the split, cart becomes a thin CartItem CRUD + inbox service.

---
## Phase 1: Relations Service (internal only)

### 1.1 Scaffold `relations/`
Create minimal internal-only app (no templates, no context_fn):

| File | Notes |
|------|-------|
| `relations/__init__.py` | Empty |
| `relations/path_setup.py` | Copy from cart |
| `relations/app.py` | `create_base_app("relations")`, register data + actions BPs only |
| `relations/services/__init__.py` | Empty `register_domain_services()` |
| `relations/models/__init__.py` | `from shared.models.container_relation import ContainerRelation` |
| `relations/bp/__init__.py` | Export `register_data`, `register_actions` |
| `relations/bp/data/routes.py` | Move `get-children` handler from `cart/bp/data/routes.py:175-198` |
| `relations/bp/actions/routes.py` | Move `attach-child` + `detach-child` from `cart/bp/actions/routes.py:112-153` |
| `relations/alembic.ini` | Copy from cart, adjust path |
| `relations/alembic/env.py` | MODELS=`["shared.models.container_relation"]`, TABLES=`{"container_relations"}` |
| `relations/alembic/versions/0001_initial.py` | Create `container_relations` table |
| `relations/Dockerfile` | Follow cart pattern, `COPY relations/ ./` |
| `relations/entrypoint.sh` | Standard pattern, db=`db_relations` |

### 1.2 Retarget callers (`"cart"` → `"relations"`)

| File | Lines | Change |
|------|-------|--------|
| `events/bp/calendars/services/calendars.py` | 74, 111, 121 | `call_action("cart", ...)` → `call_action("relations", ...)` |
| `blog/bp/menu_items/services/menu_items.py` | 83, 137, 141, 157 | Same |
| `shared/services/market_impl.py` | 96, 109, 133 | Same |

### 1.3 Clean up cart
- Remove `get-children` from `cart/bp/data/routes.py:175-198`
- Remove `attach-child`, `detach-child` from `cart/bp/actions/routes.py:112-153`
- Remove `"shared.models.container_relation"` and `"container_relations"` from `cart/alembic/env.py`

---
## Phase 2: Likes Service (internal only)

### 2.1 New unified model
Single `likes` table in `db_likes`:

```python
from sqlalchemy import Index, Integer, String, UniqueConstraint
from sqlalchemy.orm import Mapped, mapped_column

class Like(Base):
    __tablename__ = "likes"
    __table_args__ = (
        UniqueConstraint("user_id", "target_type", "target_slug"),
        UniqueConstraint("user_id", "target_type", "target_id"),
        Index("ix_likes_target", "target_type", "target_slug"),
    )

    id: Mapped[int] = mapped_column(primary_key=True)
    user_id: Mapped[int] = mapped_column(nullable=False, index=True)
    target_type: Mapped[str] = mapped_column(String(32), nullable=False)  # "product" or "post"
    target_slug: Mapped[str | None] = mapped_column(String(255))  # for products
    target_id: Mapped[int | None] = mapped_column(Integer)  # for posts
    # plus the usual created_at / updated_at / deleted_at timestamp columns
```

Products use `target_type="product"`, `target_slug=slug`. Posts use `target_type="post"`, `target_id=post.id`.
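
The two unique constraints can be mirrored in application code by normalizing a like target into one lookup key and rejecting ambiguous input. A sketch only; the helper is hypothetical:

```python
def like_key(user_id, target_type, target_slug=None, target_id=None):
    """Build the uniqueness key for a like; exactly one target field is allowed."""
    if (target_slug is None) == (target_id is None):
        raise ValueError("provide exactly one of target_slug / target_id")
    target = target_slug if target_slug is not None else target_id
    return (user_id, target_type, target)
```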

### 2.2 Scaffold `likes/`

| File | Notes |
|------|-------|
| `likes/__init__.py` | Empty |
| `likes/path_setup.py` | Standard |
| `likes/app.py` | Internal-only, `create_base_app("likes")`, data + actions BPs |
| `likes/services/__init__.py` | Empty `register_domain_services()` |
| `likes/models/__init__.py` | Import Like |
| `likes/models/like.py` | Generic Like model (above) |
| `likes/bp/__init__.py` | Export register functions |
| `likes/bp/data/routes.py` | `is-liked`, `liked-slugs`, `liked-ids` |
| `likes/bp/actions/routes.py` | `toggle` action |
| `likes/alembic.ini` | Standard |
| `likes/alembic/env.py` | MODELS=`["likes.models.like"]`, TABLES=`{"likes"}` |
| `likes/alembic/versions/0001_initial.py` | Create `likes` table |
| `likes/Dockerfile` | Standard pattern |
| `likes/entrypoint.sh` | Standard, db=`db_likes` |

### 2.3 Data endpoints (`likes/bp/data/routes.py`)
- `is-liked`: params `user_id, target_type, target_slug/target_id` → `{"liked": bool}`
- `liked-slugs`: params `user_id, target_type` → `["slug1", "slug2"]`
- `liked-ids`: params `user_id, target_type` → `[1, 2, 3]`

### 2.4 Action endpoints (`likes/bp/actions/routes.py`)
- `toggle`: payload `{user_id, target_type, target_slug?, target_id?}` → `{"liked": bool}`
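
Toggle is the usual delete-if-present / insert-if-absent round trip. Sketched here against an in-memory set standing in for the `likes` table; names are illustrative:

```python
def toggle_like(existing: set, key) -> bool:
    """Flip a like and return the resulting state (True = now liked)."""
    if key in existing:
        existing.discard(key)
        return False
    existing.add(key)
    return True
```

The returned bool is exactly what the `{"liked": bool}` response needs.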

### 2.5 Retarget market app

**`market/bp/product/routes.py`** (like_toggle, ~line 119):
Replace `toggle_product_like(g.s, user_id, product_slug)` with:
```python
result = await call_action("likes", "toggle", payload={
    "user_id": user_id, "target_type": "product", "target_slug": product_slug,
})
liked = result["liked"]
```

**`market/bp/browse/services/db_backend.py`** (most complex):
- `db_product_full` / `db_product_full_id`: Replace the `ProductLike` subquery with `fetch_data("likes", "is-liked", ...)`. Annotate `is_liked` after the query.
- `db_products_nocounts` / `db_products_counts`: Fetch `liked_slugs` once via `fetch_data("likes", "liked-slugs", ...)`, filter `Product.slug.in_(liked_slugs)` for `?liked=true`, annotate `is_liked` post-query.

**Delete**: `toggle_product_like` from `market/bp/product/services/product_operations.py`

### 2.6 Retarget blog app

**`blog/bp/post/routes.py`** (like_toggle):
Replace `toggle_post_like(g.s, user_id, post_id)` with `call_action("likes", "toggle", payload={...})`.

**Delete**: `toggle_post_like` from `blog/bp/post/services/post_operations.py`

### 2.7 Remove old like models
- Remove `ProductLike` from `shared/models/market.py` (lines 118-131) + the `Product.likes` relationship (lines 110-114)
- Remove `PostLike` from `shared/models/ghost_content.py` + the `Post.likes` relationship
- Remove `product_likes` from market alembic TABLES
- Remove `post_likes` from blog alembic TABLES

---
## Phase 3: PageConfig → Blog

### 3.1 Replace blog proxy endpoints with direct DB queries

**`blog/bp/data/routes.py`** (lines 77-102): Replace the 3 proxy handlers that currently call `fetch_data("cart", ...)` with direct DB queries. Copy logic from `cart/bp/data/routes.py`:
- `page-config` (cart lines 114-134)
- `page-config-by-id` (cart lines 136-149)
- `page-configs-batch` (cart lines 151-172)
- `page-config-ensure` (cart lines 49-81) — add new

Also add the `_page_config_dict` helper (cart lines 203-213).

### 3.2 Move action to blog

**`blog/bp/actions/routes.py`** (~line 40): Replace the `call_action("cart", "update-page-config", ...)` proxy with a direct handler. Copy logic from `cart/bp/actions/routes.py:51-110`.

### 3.3 Blog callers become local

| File | Current | After |
|------|---------|-------|
| `blog/bp/post/admin/routes.py:34` | `fetch_data("cart", "page-config", ...)` | Direct DB query (blog now owns the table) |
| `blog/bp/post/admin/routes.py:87,132` | `call_action("cart", "update-page-config", ...)` | Direct call to local handler |
| `blog/bp/post/services/markets.py:44` | `fetch_data("cart", "page-config", ...)` | Direct DB query |
| `blog/bp/blog/ghost_db.py:295` | `fetch_data("cart", "page-configs-batch", ...)` | Direct DB query |

### 3.4 Retarget cross-service callers (`"cart"` → `"blog"`)

| File | Change |
|------|--------|
| `cart/bp/cart/services/page_cart.py:181` | `fetch_data("cart", "page-configs-batch", ...)` → `fetch_data("blog", "page-configs-batch", ...)` |
| `cart/bp/cart/global_routes.py:274` | `fetch_data("cart", "page-config-by-id", ...)` → `fetch_data("blog", "page-config-by-id", ...)` |

(Note: `checkout.py:117` and `cart/app.py:177` already target `"blog"`.)

### 3.5 Update blog alembic
**`blog/alembic/env.py`**: Add `"shared.models.page_config"` to MODELS and `"page_configs"` to TABLES.

### 3.6 Clean up cart
- Remove all `page-config*` handlers from `cart/bp/data/routes.py` (lines 49-172)
- Remove `update-page-config` from `cart/bp/actions/routes.py` (lines 50-110)
- Remove `"shared.models.page_config"` and `"page_configs"` from `cart/alembic/env.py`

---
## Phase 4: Orders Service (public, orders.rose-ash.com)

### 4.1 Scaffold `orders/`

| File | Notes |
|------|-------|
| `orders/__init__.py` | Empty |
| `orders/path_setup.py` | Standard |
| `orders/app.py` | Public app with `context_fn`, templates, fragments, page slug hydration |
| `orders/services/__init__.py` | `register_domain_services()` |
| `orders/models/__init__.py` | `from shared.models.order import Order, OrderItem` |
| `orders/bp/__init__.py` | Export all BPs |
| `orders/bp/order/` | Move from `cart/bp/order/` (single order: detail, pay, recheck) |
| `orders/bp/orders/` | Move from `cart/bp/orders/` (order list + pagination) |
| `orders/bp/checkout/routes.py` | Webhook + return routes from `cart/bp/cart/global_routes.py` |
| `orders/bp/data/routes.py` | Minimal |
| `orders/bp/actions/routes.py` | `create-order` action (called by cart during checkout) |
| `orders/bp/fragments/routes.py` | `account-nav-item` fragment (orders link) |
| `orders/templates/` | Move `_types/order/`, `_types/orders/`, checkout templates from cart |
| `orders/alembic.ini` | Standard |
| `orders/alembic/env.py` | MODELS=`["shared.models.order"]`, TABLES=`{"orders", "order_items"}` |
| `orders/alembic/versions/0001_initial.py` | Create `orders` + `order_items` tables |
| `orders/Dockerfile` | Standard, public-facing |
| `orders/entrypoint.sh` | Standard, db=`db_orders` |

### 4.2 Move checkout services to orders

**Move to `orders/services/`:**
- `checkout.py` — from `cart/bp/cart/services/checkout.py` (move: `create_order_from_cart`, `resolve_page_config`, `build_sumup_*`, `get_order_with_details`. Keep `find_or_create_cart_item` in cart.)
- `check_sumup_status.py` — from `cart/bp/cart/services/check_sumup_status.py`

**`clear_cart_for_order`** stays in cart as a new action:
- Add `clear-cart-for-order` to `cart/bp/actions/routes.py`
- Orders calls `call_action("cart", "clear-cart-for-order", payload={user_id, session_id, page_post_id})`

### 4.3 `create-order` action endpoint (`orders/bp/actions/routes.py`)
Cart's `POST /checkout/` calls this:
```
Payload: {cart_items: [{product_id, product_title, product_slug, product_image,
                        product_special_price, product_regular_price, product_price_currency,
                        quantity, market_place_container_id}],
          calendar_entries, tickets, user_id, session_id,
          product_total, calendar_total, ticket_total,
          page_post_id, redirect_url, webhook_base_url}
Returns: {order_id, sumup_hosted_url, page_config_id, sumup_reference, description}
```

### 4.4 Refactor cart's checkout route
`cart/bp/cart/global_routes.py` `POST /checkout/`:
1. Load local cart data (get_cart, calendar entries, tickets, totals)
2. Serialize cart items to dicts
3. `result = await call_action("orders", "create-order", payload={...})`
4. Redirect to `result["sumup_hosted_url"]`

Same for page-scoped checkout in `cart/bp/cart/page_routes.py`.
|
||||
|
||||
### 4.5 Move webhook + return routes to orders

- `POST /checkout/webhook/<order_id>/` → `orders/bp/checkout/routes.py`
- `GET /checkout/return/<order_id>/` → `orders/bp/checkout/routes.py`
- SumUp redirect/webhook URLs must now point to orders.rose-ash.com

### 4.6 Move order list/detail routes

- `cart/bp/order/` → `orders/bp/order/`
- `cart/bp/orders/` → `orders/bp/orders/`

### 4.7 Move startup reconciliation

`_reconcile_pending_orders` from `cart/app.py:209-265` → `orders/app.py`

### 4.8 Clean up cart

- Remove `cart/bp/order/`, `cart/bp/orders/`
- Remove checkout webhook/return from `cart/bp/cart/global_routes.py`
- Remove `_reconcile_pending_orders` from `cart/app.py`
- Remove order templates from `cart/templates/`
- Remove `"shared.models.order"` and `"orders", "order_items"` from `cart/alembic/env.py`

---

## Phase 5: Infrastructure (applies to all new services)

### 5.1 docker-compose.yml

Add 3 new services (relations, likes, orders), each with its own DATABASE_URL (db_relations, db_likes, db_orders) and its own REDIS_URL (Redis DB 7, 8, 9).

Add to `x-app-env`:

```yaml
INTERNAL_URL_RELATIONS: http://relations:8000
INTERNAL_URL_LIKES: http://likes:8000
INTERNAL_URL_ORDERS: http://orders:8000
APP_URL_ORDERS: https://orders.rose-ash.com
```

### 5.2 docker-compose.dev.yml

Add all 3 services with dev volumes (ports 8008, 8009, 8010).
Add all 3 new services to `x-sibling-models`.

### 5.3 deploy.sh

Add `relations likes orders` to the APPS list.

### 5.4 Caddyfile (`/root/caddy/Caddyfile`)

Add only orders (public):

```
orders.rose-ash.com { reverse_proxy rose-ash-dev-orders-1:8000 }
```

### 5.5 shared/infrastructure/factory.py

Add to the model import loop: `"relations.models", "likes.models", "orders.models"`

### 5.6 shared/infrastructure/urls.py

Add an `orders_url(path)` helper.

### 5.7 All existing Dockerfiles

Add sibling-model COPY lines for the 3 new services to every existing Dockerfile (blog, market, cart, events, federation, account).

### 5.8 CLAUDE.md

Update the project structure and add notes about the new services.

---

## Data Migration (one-time, run before code switch)

1. `container_relations` from `db_cart` → `db_relations`
2. `product_likes` from `db_market` + `post_likes` from `db_blog` → `db_likes.likes`
3. `page_configs` from `db_cart` → `db_blog`
4. `orders` + `order_items` from `db_cart` → `db_orders`

Use `pg_dump`/`pg_restore` or direct SQL for the migration.

---

## Post-Split Cart State

After all 4 phases, cart owns only:

- **Model**: CartItem (table in db_cart)
- **Alembic**: `cart_items` only
- **Data endpoints**: `cart-summary`, `cart-items`
- **Action endpoints**: `adopt-cart-for-user`, `clear-cart-for-order` (new)
- **Inbox handlers**: Add/Remove/Update `rose:CartItem`
- **Public routes**: cart overview, page cart, add-to-cart, quantity, delete
- **Fragments**: `cart-mini`
- **Checkout**: `POST /checkout/` (creates the order via `call_action("orders", "create-order")`, redirects to SumUp)

---

## Verification

1. **Relations**: Blog attach/detach marketplace to page; events attach/detach calendar
2. **Likes**: Toggle product like on market page; toggle post like on blog; `?liked=true` filter
3. **PageConfig**: Blog admin page config update; cart checkout resolves page config from blog
4. **Orders**: Add to cart → checkout → SumUp redirect → webhook → order paid; order list/detail on orders.rose-ash.com
5. No remaining `call_action("cart", "attach-child|detach-child|update-page-config")`
6. No remaining `fetch_data("cart", "page-config*|get-children")`
7. Cart alembic only manages the `cart_items` table

.claude/plans/rippling-tumbling-cocke.md (new file, 149 lines)
@@ -0,0 +1,149 @@

# Ticket UX Improvements: +/- Buttons, Sold Count, Cart Grouping

## Context

The entry page currently uses a numeric input + "Buy Tickets" button, which replaces itself with a confirmation after purchase. The cart lists each ticket individually. The user wants the ticket UX to match the product pattern: +/- buttons, an "in basket" count, and tickets grouped by event on the cart page.

## Requirements

1. **Entry page**: Show the tickets-sold count + the current user's "in basket" count
2. **Entry page**: Replace the qty input with "Add to basket" / +/- buttons (product pattern)
3. **Entry page**: Keep the form active after adding (don't replace it with a confirmation)
4. **Cart page**: Group tickets by event (entry_id + ticket_type), show quantity with +/- buttons

---

## 1. Add `ticket_type_id` to TicketDTO

**File**: `shared/contracts/dtos.py`
- Add a `ticket_type_id: int | None = None` field to `TicketDTO`

**File**: `shared/services/calendar_impl.py`
- In `_ticket_to_dto()`, populate `ticket_type_id=ticket.ticket_type_id`

**Sync**: Copy to all 4 app submodule copies.

## 2. New ticket service functions

**File**: `events/bp/tickets/services/tickets.py`
- Add `get_user_reserved_count(session, entry_id, user_id, session_id, ticket_type_id=None) -> int`
  - Counts reserved tickets for this user+entry+type
- Add `get_sold_ticket_count(session, entry_id) -> int`
  - Counts all non-cancelled tickets for this entry
- Add `cancel_latest_reserved_ticket(session, entry_id, user_id, session_id, ticket_type_id=None) -> bool`
  - Finds the most recently created reserved ticket for this user+entry+type and sets state='cancelled'. Returns True if one was cancelled.

## 3. Add `adjust_quantity` route to events tickets blueprint

**File**: `events/bp/tickets/routes.py`
- New route: `POST /tickets/adjust/`
- Form fields: `entry_id`, `ticket_type_id` (optional), `count` (target quantity)
- Logic:
  - Get the current user's reserved count for this entry/type
  - If count > current: create `(count - current)` tickets via `create_ticket()`
  - If count < current: cancel `(current - count)` tickets via `cancel_latest_reserved_ticket()` in a loop
  - If count == 0: cancel all
  - Check availability before adding (like the existing `buy_tickets`)
- Response: re-render `_buy_form.html` (HTMX swap replaces the form, keeps it active)
- Include an OOB cart-mini update: `{{ mini(oob='true') }}`
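The create/cancel branching above reduces to a pure delta computation, which keeps the route handler small and testable. A minimal sketch (the helper name `ticket_adjustment` is an assumption, not from the codebase):

```python
def ticket_adjustment(current: int, target: int) -> tuple[int, int]:
    """Return (tickets_to_create, tickets_to_cancel) needed to reach the target quantity.

    target == 0 naturally yields (0, current), i.e. cancel all reserved tickets.
    """
    if target < 0:
        raise ValueError("target quantity must be >= 0")
    if target > current:
        return target - current, 0
    return 0, current - target
```

The route would then loop `create_ticket()` for the first value and `cancel_latest_reserved_ticket()` for the second, checking availability before creating.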
## 4. Inject ticket counts into entry page context

**File**: `events/bp/calendar_entry/routes.py` — `inject_root` context processor
- Add `ticket_sold_count`: total non-cancelled tickets for the entry (via `get_sold_ticket_count`)
- Add `user_ticket_count`: the current user's reserved count (via `get_user_reserved_count`)
- For multi-type entries, add `user_ticket_counts_by_type`: a dict mapping ticket_type_id → count

## 5. Rewrite entry page buy form

**File**: `events/templates/_types/tickets/_buy_form.html`
- Show "X sold" (from `ticket_sold_count`) alongside "X remaining"
- Show "X in basket" for the current user

**For single-price entries (no ticket types)**:
- If `user_ticket_count == 0`: show an "Add to basket" button (posts to `/tickets/adjust/` with count=1)
- If `user_ticket_count > 0`: show `[-]` [count badge] `[+]` buttons
  - Minus: posts count=user_ticket_count-1
  - Plus: posts count=user_ticket_count+1
- All forms: `hx-post`, `hx-target="#ticket-buy-{{ entry.id }}"`, `hx-swap="outerHTML"`

**For multi-type entries**:
- Same pattern per ticket type row, using `user_ticket_counts_by_type[tt.id]`

Style: match the product pattern exactly — emerald circular buttons, w-8 h-8, cart icon with badge.

## 6. Add ticket quantity route to cart app

**File**: `cart/bp/cart/global_routes.py`
- New route: `POST /cart/ticket-quantity/`
- Form fields: `entry_id`, `ticket_type_id` (optional), `count` (target quantity)
- Logic: call into `CalendarService` rather than using the ticket functions directly; the cart app uses service contracts, so add `adjust_ticket_quantity` to the CalendarService protocol

**File**: `shared/contracts/protocols.py` — CalendarService
- Add: `adjust_ticket_quantity(session, entry_id, count, *, user_id, session_id, ticket_type_id=None) -> int`

**File**: `shared/services/calendar_impl.py`
- Implement `adjust_ticket_quantity`:
  - Same logic as the events adjust route (create/cancel to match the target count)
  - Return the new count

**File**: `shared/services/stubs.py`
- Add a stub: returns 0

Response: `HX-Refresh: true` (same as the product quantity route).
## 7. Cart page: group tickets by event with +/- buttons

**File**: `cart/templates/_types/cart/_cart.html` — ticket section (lines 63-95)
- Replace the individual ticket list with a grouped display
- Group `ticket_cart_entries` by `(entry_id, ticket_type_id)`:
  - Use Jinja `groupby` on `entry_id` first, then sub-group by `ticket_type_name`
  - Or pre-group in the route handler and pass as a dict

**Approach**: Pre-group in the route handler for cleaner templates.

**File**: `cart/bp/cart/page_routes.py` — `page_view`
- After getting `page_tickets`, group them into a list of dicts:

```
[{"entry_name": ..., "entry_id": ..., "ticket_type_name": ..., "ticket_type_id": ...,
  "entry_start_at": ..., "entry_end_at": ..., "price": ..., "quantity": N}]
```

- Pass as `ticket_groups` to the template

**File**: `cart/bp/cart/global_routes.py` — overview/checkout routes
- Same grouping for the global cart view if tickets appear there

**Cart ticket group template**: Each group shows:
- Event name + ticket type (if any)
- Date/time
- Price per ticket
- `-` [qty] `+` buttons (posting to `/cart/ticket-quantity/`)
- Line total (price × qty)

Match the product `cart_item` macro style (article card with quantity controls).
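The pre-grouping step could be a small pure helper shaped like this sketch (assumes each ticket row is a plain dict with the fields from the shape above; the helper name `group_tickets` is hypothetical):

```python
def group_tickets(tickets: list[dict]) -> list[dict]:
    """Collapse individual ticket rows into one group per (entry_id, ticket_type_id)."""
    groups: dict = {}
    for t in tickets:
        key = (t["entry_id"], t.get("ticket_type_id"))
        if key not in groups:
            groups[key] = {
                "entry_name": t["entry_name"],
                "entry_id": t["entry_id"],
                "ticket_type_name": t.get("ticket_type_name"),
                "ticket_type_id": t.get("ticket_type_id"),
                "entry_start_at": t.get("entry_start_at"),
                "entry_end_at": t.get("entry_end_at"),
                "price": t["price"],
                "quantity": 0,
            }
        groups[key]["quantity"] += 1  # one row per ticket, so count rows
    return list(groups.values())
```

Insertion order of dicts is preserved, so groups appear in the order tickets were added to the cart.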
## 8. Cart summary update

**File**: `cart/templates/_types/cart/_cart.html` — `summary` macro
- Update the Items count: include ticket quantities in the total (currently just product quantities)

## Files to modify (summary)

- `shared/contracts/dtos.py` — add ticket_type_id to TicketDTO
- `shared/contracts/protocols.py` — add adjust_ticket_quantity to CalendarService
- `shared/services/calendar_impl.py` — implement adjust_ticket_quantity, update _ticket_to_dto
- `shared/services/stubs.py` — add stub
- `events/bp/tickets/services/tickets.py` — add count/cancel functions
- `events/bp/tickets/routes.py` — add adjust route
- `events/bp/calendar_entry/routes.py` — inject sold/user counts
- `events/templates/_types/tickets/_buy_form.html` — rewrite with +/- pattern
- `cart/bp/cart/global_routes.py` — add ticket-quantity route
- `cart/bp/cart/page_routes.py` — group tickets
- `cart/templates/_types/cart/_cart.html` — grouped ticket display with +/-
- All 4 app `shared/` submodule copies synced

## Verification

1. Visit an entry page → see "X sold", "X in basket", "Add to basket" button
2. Click "Add to basket" → form stays, shows `-` [1] `+`, basket count shows "1 in basket"
3. Click `+` → count increases, sold count increases
4. Click `-` → count decreases, ticket cancelled
5. Visit the cart page → tickets grouped by event, +/- buttons work
6. Checkout flow still works (existing tests)

.claude/plans/unified-inventing-kay.md (new file, 171 lines)
@@ -0,0 +1,171 @@

# Social Network Sharing Integration

## Context

Rose Ash already has ActivityPub for federated social sharing. This plan adds OAuth-based sharing to mainstream social networks — Facebook, Instagram, Threads, Twitter/X, LinkedIn, and Mastodon. Users connect their social accounts via the account dashboard, then manually share content (blog posts, events, products) via a share button on content pages.

All social logic lives in the **account** microservice. Content apps get a share button that opens the account share page.

---

## Phase 1: Data Model + Encryption

### 1a. `shared/models/social_connection.py` (NEW)
- SQLAlchemy 2.0 model following the `oauth_grant.py` pattern
- Table `social_connections` in db_account
- Columns: `id`, `user_id` (FK to users.id with CASCADE), `platform` (facebook/instagram/threads/twitter/linkedin/mastodon), `platform_user_id`, `platform_username`, `display_name`, `access_token_enc`, `refresh_token_enc`, `token_expires_at`, `scopes`, `extra_data` (JSONB — mastodon instance URL, facebook page ID, etc.), `created_at`, `updated_at`, `revoked_at`
- Indexes: `(user_id, platform)`, unique `(platform, platform_user_id)`

### 1b. `shared/models/__init__.py` (MODIFY)
- Add `from .social_connection import SocialConnection`

### 1c. `shared/infrastructure/social_crypto.py` (NEW)
- Fernet encrypt/decrypt using the `SOCIAL_ENCRYPTION_KEY` env var
- `encrypt_token(plaintext) -> str`, `decrypt_token(ciphertext) -> str`
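A minimal sketch of what `social_crypto.py` could contain, assuming the `cryptography` package and the env-var key described above (the lazy `_fernet()` helper is an assumption; only the two public function signatures come from the plan):

```python
import os

from cryptography.fernet import Fernet


def _fernet() -> Fernet:
    # SOCIAL_ENCRYPTION_KEY must be a urlsafe-base64 32-byte key,
    # e.g. generated once with Fernet.generate_key().
    return Fernet(os.environ["SOCIAL_ENCRYPTION_KEY"].encode())


def encrypt_token(plaintext: str) -> str:
    """Encrypt an OAuth token for storage in access_token_enc/refresh_token_enc."""
    return _fernet().encrypt(plaintext.encode()).decode()


def decrypt_token(ciphertext: str) -> str:
    """Decrypt a stored token back to plaintext."""
    return _fernet().decrypt(ciphertext.encode()).decode()
```

Reading the key lazily (per call) rather than at import time keeps app startup from failing when the variable is only set in some environments.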
### 1d. Alembic migration (NEW)
- Creates the `social_connections` table

### 1e. `docker-compose.yml` (MODIFY)
- Add to `x-app-env`: `SOCIAL_ENCRYPTION_KEY`, plus per-platform credentials (`SOCIAL_FACEBOOK_APP_ID`, `SOCIAL_FACEBOOK_APP_SECRET`, `SOCIAL_TWITTER_CLIENT_ID`, `SOCIAL_TWITTER_CLIENT_SECRET`, `SOCIAL_LINKEDIN_CLIENT_ID`, `SOCIAL_LINKEDIN_CLIENT_SECRET`)

---

## Phase 2: Platform OAuth Clients

All in `account/services/social_platforms/`:

### 2a. `base.py` (NEW)
- `OAuthResult` dataclass (platform_user_id, tokens, expiry, extra_data)
- `ShareResult` dataclass (success, platform_post_id, platform_post_url, error)
- `SocialPlatform` abstract base class: `get_authorize_url()`, `exchange_code()`, `refresh_access_token()`, `share_link()`, `verify_token()`
### 2b. `meta.py` (NEW) — Facebook + Instagram + Threads
- **Facebook**: OAuth2 via Graph API, `pages_manage_posts` scope, exchange user token → long-lived → page token, post via `/{page_id}/feed`
- **Instagram**: Same Meta OAuth, `instagram_basic` + `instagram_content_publish` scopes, business/creator accounts only, container → publish workflow
- **Threads**: Separate OAuth at threads.net, `threads_basic` + `threads_content_publish` scopes, container → publish

### 2c. `twitter.py` (NEW) — Twitter/X
- OAuth 2.0 with PKCE, `tweet.write` + `offline.access` scopes
- Post via `POST https://api.twitter.com/2/tweets`

### 2d. `linkedin.py` (NEW) — LinkedIn
- OAuth 2.0, `w_member_social` + `openid` scopes
- Post via LinkedIn Posts API

### 2e. `mastodon.py` (NEW) — Mastodon
- Dynamic app registration per instance (`POST /api/v1/apps`)
- OAuth 2.0, `write:statuses` scope
- Post via `POST /api/v1/statuses`
- Instance URL stored in `extra_data["instance_url"]`

### 2f. `__init__.py` (NEW) — Platform registry
- `PLATFORMS` dict, lazy-initialized from env vars
- Mastodon always available (no pre-configured credentials)
- `get_platform(name)`, `available_platforms()`

---

## Phase 3: Account Social Blueprint

### 3a. `account/bp/social/__init__.py` (NEW)

### 3b. `account/bp/social/routes.py` (NEW)

Routes (all require login):

- `GET /social/` — list connected accounts + available platforms
- `GET /social/connect/<platform>/` — start OAuth redirect (Mastodon: accept instance URL param)
- `GET /social/callback/<platform>/` — OAuth callback, exchange code, encrypt & store tokens
- `POST /social/disconnect/<int:id>/` — soft-delete (set revoked_at)
- `GET /social/share/` — share page (params: url, title, description, image)
- `POST /social/share/` — execute share to selected accounts, return results

OAuth state stored in session (nonce + platform + redirect params).
### 3c. `account/bp/__init__.py` (MODIFY)
- Add `from .social.routes import register as register_social_bp`

### 3d. `account/app.py` (MODIFY)
- Register the social blueprint **before** the account blueprint (account has a catch-all `/<slug>/`)

```python
app.register_blueprint(register_auth_bp())
app.register_blueprint(register_social_bp())  # <-- NEW, before account
app.register_blueprint(register_account_bp())
app.register_blueprint(register_fragments())
```

### 3e. `account/templates/_types/auth/_nav.html` (MODIFY)
- Add a "social" link between newsletters and `account_nav_html`

---

## Phase 4: Templates

### 4a. `account/templates/_types/auth/_social_panel.html` (NEW)
- Platform cards with icons (Font Awesome: `fa-facebook`, `fa-instagram`, `fa-threads`, `fa-x-twitter`, `fa-linkedin`, `fa-mastodon`)
- Connected accounts per platform: display name, username, disconnect button
- "Connect" button per platform
- Mastodon: instance URL input before connecting

### 4b. `account/templates/_types/auth/_share_panel.html` (NEW)
- Content preview card (title, image, URL)
- Connected accounts as checkboxes grouped by platform
- Optional message textarea
- Share button → HTMX POST to `/social/share/`

### 4c. `account/templates/_types/auth/_share_result.html` (NEW)
- Per-platform success/failure with links to created posts

### 4d. `account/templates/_types/auth/_mastodon_connect.html` (NEW)
- Instance URL input form

---

## Phase 5: Share Button in Content Apps

### 5a. `account/bp/fragments/routes.py` (MODIFY)
- Add a `share-button` handler: accepts url, title, description, image params
- Returns a share icon/link pointing to `account.rose-ash.com/social/share/?url=...&title=...`

### 5b. `account/templates/fragments/share_button.html` (NEW)
- Small button: `<a href="..." target="_blank"><i class="fa-solid fa-share-nodes"></i> Share</a>`

### 5c. Content app integration
- Blog post detail: fetch the `share-button` fragment from account, render in the post template
- Events detail: same pattern
- Market product detail: same pattern
- Each passes its own public URL, title, description, image to the fragment

---

## Phase 6: Token Refresh + Share History

### 6a. Token refresh in share flow
- Before posting, check `token_expires_at`; if expired, call `refresh_access_token()`
- Update the encrypted tokens in the DB
- If refresh fails, mark the connection with an error and prompt a reconnect
### 6b. `shared/models/social_share.py` (NEW, optional)
- Table `social_shares`: connection_id, shared_url, shared_title, platform_post_id, platform_post_url, status, error_message, created_at
- Prevents duplicate shares, enables a "shared" indicator on content pages

---

## Key Patterns to Follow

| Pattern | Reference File |
|---------|---------------|
| ORM model (mapped_column, FK, indexes) | `shared/models/oauth_grant.py` |
| Blueprint registration + OOB template | `account/bp/account/routes.py` |
| Fragment handler dict | `account/bp/fragments/routes.py` |
| Account nav link | `account/templates/_types/auth/_nav.html` |
| httpx async client | `shared/infrastructure/actions.py` |

## Verification

1. Generate `SOCIAL_ENCRYPTION_KEY`, add it to `.env`
2. Run the Alembic migration
3. Start the account app, navigate to `/social/`
4. Connect a test Mastodon account (easiest — no app review needed)
5. Navigate to a blog post, click Share, select the Mastodon account, verify the post appears
6. Disconnect the account, verify the soft-delete
7. Test token refresh by connecting Facebook with a short-lived token

@@ -1,8 +1,13 @@
.git
.gitea
**/.env
**/.env.gpu
.env
_snapshot
docs
schema.sql
**/.gitmodules
**/.gitignore
**/README.md
**/__pycache__
**/.pytest_cache
**/node_modules
**/*.pyc
test/

@@ -2,11 +2,11 @@ name: Build and Deploy

on:
push:
branches: [main]
branches: ['**']

env:
REGISTRY: registry.rose-ash.com:5000
ARTDAG_DIR: /root/art-dag-mono
APP_DIR: /root/rose-ash

jobs:
build-and-deploy:
@@ -28,87 +28,76 @@ jobs:
chmod 600 ~/.ssh/id_rsa
ssh-keyscan -H "$DEPLOY_HOST" >> ~/.ssh/known_hosts 2>/dev/null || true

- name: Build and deploy
- name: Build and deploy changed apps
env:
DEPLOY_HOST: ${{ secrets.DEPLOY_HOST }}
run: |
ssh "root@$DEPLOY_HOST" "
cd ${{ env.ARTDAG_DIR }}
cd ${{ env.APP_DIR }}

# Save current HEAD before updating
OLD_HEAD=\$(git rev-parse HEAD 2>/dev/null || echo none)

git fetch origin main
git reset --hard origin/main
git fetch origin ${{ github.ref_name }}
git reset --hard origin/${{ github.ref_name }}

NEW_HEAD=\$(git rev-parse HEAD)

# Change detection
BUILD_L1=false
BUILD_L2=false
# Detect what changed
REBUILD_ALL=false
if [ \"\$OLD_HEAD\" = \"none\" ] || [ \"\$OLD_HEAD\" = \"\$NEW_HEAD\" ]; then
BUILD_L1=true
BUILD_L2=true
# First deploy or CI re-run on same commit — rebuild all
REBUILD_ALL=true
else
CHANGED=\$(git diff --name-only \$OLD_HEAD \$NEW_HEAD)
# common/ or core/ change -> rebuild both
if echo \"\$CHANGED\" | grep -qE '^(common|core)/'; then
BUILD_L1=true
BUILD_L2=true
if echo \"\$CHANGED\" | grep -q '^shared/'; then
REBUILD_ALL=true
fi
if echo \"\$CHANGED\" | grep -q '^l1/'; then
BUILD_L1=true
fi
if echo \"\$CHANGED\" | grep -q '^l2/'; then
BUILD_L2=true
fi
if echo \"\$CHANGED\" | grep -q '^client/'; then
BUILD_L1=true
if echo \"\$CHANGED\" | grep -q '^docker-compose.yml'; then
REBUILD_ALL=true
fi
fi

# Build L1
if [ \"\$BUILD_L1\" = true ]; then
echo 'Building L1...'
docker build \
--build-arg CACHEBUST=\$(date +%s) \
-f l1/Dockerfile \
-t ${{ env.REGISTRY }}/celery-l1-server:latest \
-t ${{ env.REGISTRY }}/celery-l1-server:${{ github.sha }} \
.
docker push ${{ env.REGISTRY }}/celery-l1-server:latest
docker push ${{ env.REGISTRY }}/celery-l1-server:${{ github.sha }}
# Map compose service name to source directory
app_dir() {
case \"\$1\" in
sx_docs) echo \"sx\" ;;
*) echo \"\$1\" ;;
esac
}

for app in blog market cart events federation account relations likes orders test sx_docs; do
dir=\$(app_dir \"\$app\")
IMAGE_EXISTS=\$(docker image ls -q ${{ env.REGISTRY }}/\$app:latest 2>/dev/null)
if [ \"\$REBUILD_ALL\" = true ] || echo \"\$CHANGED\" | grep -q \"^\$dir/\" || [ -z \"\$IMAGE_EXISTS\" ]; then
echo \"Building \$app...\"
docker build \
--build-arg CACHEBUST=\$(date +%s) \
-f \$dir/Dockerfile \
-t ${{ env.REGISTRY }}/\$app:latest \
-t ${{ env.REGISTRY }}/\$app:${{ github.sha }} \
.
docker push ${{ env.REGISTRY }}/\$app:latest
docker push ${{ env.REGISTRY }}/\$app:${{ github.sha }}
else
echo \"Skipping \$app (no changes)\"
fi
done

# Deploy swarm stack only on main branch
if [ '${{ github.ref_name }}' = 'main' ]; then
source .env
docker stack deploy -c docker-compose.yml rose-ash
echo 'Waiting for swarm services to update...'
sleep 10
docker stack services rose-ash
else
echo 'Skipping L1 (no changes)'
echo 'Skipping swarm deploy (branch: ${{ github.ref_name }})'
fi

# Build L2
if [ \"\$BUILD_L2\" = true ]; then
echo 'Building L2...'
docker build \
--build-arg CACHEBUST=\$(date +%s) \
-f l2/Dockerfile \
-t ${{ env.REGISTRY }}/l2-server:latest \
-t ${{ env.REGISTRY }}/l2-server:${{ github.sha }} \
.
docker push ${{ env.REGISTRY }}/l2-server:latest
docker push ${{ env.REGISTRY }}/l2-server:${{ github.sha }}
else
echo 'Skipping L2 (no changes)'
fi

# Deploy stacks (--resolve-image always forces re-pull of :latest)
if [ \"\$BUILD_L1\" = true ]; then
cd l1 && source .env && docker stack deploy --resolve-image always -c docker-compose.yml celery && cd ..
echo 'L1 stack deployed'
fi
if [ \"\$BUILD_L2\" = true ]; then
cd l2 && source .env && docker stack deploy --resolve-image always -c docker-compose.yml activitypub && cd ..
echo 'L2 stack deployed'
fi

sleep 10
echo '=== L1 Services ==='
docker stack services celery
echo '=== L2 Services ==='
docker stack services activitypub
# Dev stack always deployed (bind-mounted source + auto-reload)
echo 'Deploying dev stack...'
docker compose -p rose-ash-dev -f docker-compose.yml -f docker-compose.dev.yml up -d
echo 'Dev stack deployed'
docker compose -p rose-ash-dev -f docker-compose.yml -f docker-compose.dev.yml ps
"

.gitignore (vendored, new file, 12 lines)
@@ -0,0 +1,12 @@
__pycache__/
*.pyc
*.pyo
.env
node_modules/
*.egg-info/
dist/
build/
.venv/
venv/
_snapshot/
_debug/
CLAUDE.md (176 changed lines)
@@ -1,72 +1,166 @@
|
||||
# Art DAG Monorepo
|
||||
# Rose Ash Monorepo
|
||||
|
||||
Federated content-addressed DAG execution engine for distributed media processing with ActivityPub ownership and provenance tracking.
|
||||
Cooperative web platform: federated content, commerce, events, and media processing. Each domain runs as an independent Quart microservice with its own database, communicating via HMAC-signed internal HTTP and ActivityPub events.
|
||||
|
||||
## Deployment
|
||||
|
||||
- **Do NOT push** until explicitly told to. Pushes reload code to dev automatically.
|
||||
|
||||
## Project Structure
|
||||
|
||||
```
|
||||
core/ # DAG engine (artdag package) - nodes, effects, analysis, planning
|
||||
l1/ # L1 Celery rendering server (FastAPI + Celery + Redis + PostgreSQL)
|
||||
l2/ # L2 ActivityPub registry (FastAPI + PostgreSQL)
|
||||
common/ # Shared templates, middleware, models (artdag_common package)
|
||||
client/ # CLI client
|
||||
test/ # Integration & e2e tests
|
||||
blog/ # Content management, Ghost CMS sync, navigation, WYSIWYG editor
|
||||
market/ # Product catalog, marketplace pages, web scraping
|
||||
cart/ # Shopping cart CRUD, checkout (delegates order creation to orders)
|
||||
events/ # Calendar & event management, ticketing
|
||||
federation/ # ActivityPub social hub, user profiles
|
||||
account/ # OAuth2 authorization server, user dashboard, membership
|
||||
orders/ # Order history, SumUp payment/webhook handling, reconciliation
|
||||
relations/ # (internal) Cross-domain parent/child relationship tracking
|
||||
likes/ # (internal) Unified like/favourite tracking across domains
|
||||
shared/ # Shared library: models, infrastructure, templates, static assets
|
||||
artdag/ # Art DAG — media processing engine (separate codebase, see below)
|
||||
```
|
||||
|
||||
### Shared Library (`shared/`)
|
||||
|
||||
```
|
||||
shared/
|
||||
models/ # Canonical SQLAlchemy ORM models for all domains
|
||||
db/ # Async session management, per-domain DB support, alembic helpers
|
||||
infrastructure/ # App factory, OAuth, ActivityPub, fragments, internal auth, Jinja
|
||||
services/ # Domain service implementations + DI registry
|
||||
contracts/ # DTOs and service protocols
|
||||
browser/ # Middleware, Redis caching, CSRF, error handlers
|
||||
events/ # Activity bus + background processor (AP-shaped events)
|
||||
config/ # YAML config loading (frozen/readonly)
|
||||
static/ # Shared CSS, JS, images
|
||||
templates/ # Base HTML layouts, partials (inherited by all apps)
|
||||
```
|
||||
|
||||
### Art DAG (`artdag/`)
|
||||
|
||||
Federated content-addressed DAG execution engine for distributed media processing.
|
||||
|
||||
```
|
||||
artdag/
|
||||
core/ # DAG engine (artdag package) — nodes, effects, analysis, planning
|
||||
l1/ # L1 Celery rendering server (FastAPI + Celery + Redis + PostgreSQL)
|
||||
l2/ # L2 ActivityPub registry (FastAPI + PostgreSQL)
|
||||
common/ # Shared templates, middleware, models (artdag_common package)
|
||||
client/ # CLI client
|
||||
test/ # Integration & e2e tests
|
||||
```
|
||||
|
||||
## Tech Stack
|
||||
|
||||
Python 3.11+, FastAPI, Celery, Redis, PostgreSQL (asyncpg for L1), SQLAlchemy, Pydantic, JAX (CPU/GPU), IPFS/Kubo, Docker Swarm, HTMX + Jinja2 for web UI.
|
||||
**Web platform:** Python 3.11+, Quart (async Flask), SQLAlchemy (asyncpg), Jinja2, HTMX, PostgreSQL, Redis, Docker Swarm, Hypercorn.
|
||||
|
||||
**Art DAG:** FastAPI, Celery, JAX (CPU/GPU), IPFS/Kubo, Pydantic.
|
||||
|
||||
## Key Commands

### Development

```bash
./dev.sh                 # Start all services + infra (db, redis, pgbouncer)
./dev.sh blog market     # Start specific services + infra
./dev.sh --build blog    # Rebuild image then start
./dev.sh down            # Stop everything
./dev.sh logs blog       # Tail service logs
```

### Deployment

```bash
./deploy.sh              # Auto-detect changed apps, build + push + restart
./deploy.sh blog market  # Deploy specific apps
./deploy.sh --all        # Deploy everything
```

### Art DAG

```bash
cd artdag/l1 && pytest tests/     # L1 unit tests
cd artdag/core && pytest tests/   # Core unit tests
cd artdag/test && python run.py   # Full integration pipeline
cd artdag/l1 && ruff check .      # Lint (E, F, I, UP rules)
cd artdag/l1 && mypy app/types.py app/routers/recipes.py tests/
docker build -f l1/Dockerfile -t celery-l1-server:latest .
docker build -f l1/Dockerfile.gpu -t celery-l1-gpu:latest .
docker build -f l2/Dockerfile -t l2-server:latest .
./deploy.sh                       # Build, push, deploy stacks
```

- pytest uses `asyncio_mode = "auto"` for async tests
- Test files: `test_*.py`, fixtures in `conftest.py`
- Line length: 100 chars (E501 ignored)
- Mypy: strict on `app/types.py`, `app/routers/recipes.py`, `tests/`; gradual elsewhere
- Mypy ignores imports for: celery, redis, artdag, artdag_common, ipfs_client

## Architecture Patterns

### Art DAG

- **3-Phase Execution:** Analyze → Plan → Execute (tasks in `l1/tasks/`)
- **Content-Addressed:** all data identified by SHA3-256 hashes or IPFS CIDs
- **Services Pattern:** business logic in `app/services/`, API endpoints in `app/routers/`
- **Types Module:** Pydantic models and TypedDicts in `app/types.py`
- **Celery Tasks:** in `l1/tasks/`, decorated with `@app.task`
- **S-Expression Effects:** composable effect language in `l1/sexp_effects/`
- **Storage:** local filesystem, S3, or IPFS backends (`storage_providers.py`)

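The content-addressing idea can be illustrated with a minimal sketch (the `sha3-256:` prefix convention and function name here are assumptions for illustration, not the engine's actual format):

```python
import hashlib

def content_address(data: bytes) -> str:
    """Derive a stable content address from raw bytes (illustrative sketch)."""
    return "sha3-256:" + hashlib.sha3_256(data).hexdigest()

# Identical bytes always map to the same address, so intermediate results
# can be cached and deduplicated by address alone.
a = content_address(b"frame-0001")
b = content_address(b"frame-0001")
c = content_address(b"frame-0002")
```

Because the address is a pure function of the content, any node in the DAG can safely reuse a previously rendered artifact whose address matches.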
### Web Platform

- **App factory:** `create_base_app(name, context_fn, before_request_fns, domain_services_fn)` in `shared/infrastructure/factory.py` — creates a Quart app with DB, Redis, CSRF, OAuth, AP, and session management
- **Blueprint pattern:** each blueprint exposes `register() -> Blueprint`; handlers are stored in a `_handlers` dict
- **Per-service database:** each service has its own PostgreSQL DB via PgBouncer; cross-domain data is fetched via HTTP
- **Alembic per-service:** each service declares `MODELS` and `TABLES` in `alembic/env.py` and delegates to `shared.db.alembic_env.run_alembic()`
- **Inter-service reads:** `fetch_data(service, query, params)` → GET `/internal/data/{query}` (HMAC-signed, 3s timeout)
- **Inter-service writes:** `call_action(service, action, payload)` → POST `/internal/actions/{action}` (HMAC-signed, 5s timeout)
- **Inter-service AP inbox:** `send_internal_activity()` → POST `/internal/inbox` (HMAC-signed, AP-shaped activities for cross-service writes)
- **Fragments:** HTML fragments fetched cross-service via `fetch_fragments()` to compose shared UI (nav, cart mini, auth menu)
- **Soft deletes:** models use the `deleted_at` column pattern
- **Context processors:** each app provides its own `context_fn` that assembles template context from the local DB + cross-service fragments

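The HMAC-signed internal calls can be sketched as follows. This is a minimal illustration only: the secret, header names, and message canonicalization are assumptions, not the platform's actual wire format.

```python
import hashlib
import hmac
import json
import time

INTERNAL_SECRET = b"change-me"  # hypothetical shared secret between services

def sign_internal_request(method: str, path: str, payload: dict) -> dict:
    """Build headers for an HMAC-signed inter-service request (illustrative)."""
    body = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    ts = str(int(time.time()))
    message = f"{method}\n{path}\n{ts}\n{body}".encode()
    sig = hmac.new(INTERNAL_SECRET, message, hashlib.sha256).hexdigest()
    return {"X-Internal-Timestamp": ts, "X-Internal-Signature": sig}

def verify_internal_request(method: str, path: str, body: str,
                            headers: dict, max_skew: float = 300) -> bool:
    """Check signature and reject stale timestamps (replay protection)."""
    ts = headers.get("X-Internal-Timestamp", "0")
    if abs(time.time() - int(ts)) > max_skew:
        return False
    message = f"{method}\n{path}\n{ts}\n{body}".encode()
    expected = hmac.new(INTERNAL_SECRET, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers.get("X-Internal-Signature", ""))
```

Signing method + path + timestamp (not just the body) is what lets the receiver reject both tampered payloads and replayed requests.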
### Auth

- **Account** is the OAuth2 authorization server; all other apps are OAuth clients
- Per-app first-party session cookies (Safari ITP compatible), synchronized via device ID
- Grant verification: apps check grant validity against the account DB (cached in Redis)
- Silent SSO: `prompt=none` OAuth flow for automatic cross-app login
- ActivityPub: RSA signatures, per-app virtual actor projections sharing the same keypair
- Art DAG: L1 ↔ L2 uses scoped JWT tokens (no shared secrets); L2 uses password + OAuth SSO with token revocation in Redis (30-day expiry); federation uses ActivityPub RSA signatures (`core/artdag/activitypub/`)

### SX Rendering Pipeline

The SX system renders component trees defined in s-expressions. The same AST can be evaluated in different modes depending on where the server/client rendering boundary is drawn:

- `render_to_html(name, **kw)` — server-side, produces HTML. Used by route handlers returning full HTML.
- `render_to_sx(name, **kw)` — server-side, produces SX wire format. Component calls stay **unexpanded** (serialized for client-side rendering by sx.js).
- `render_to_sx_with_env(name, env, **kw)` — server-side, **expands the top-level component**, then serializes children as SX wire format. Used by layout components that need Python context (auth state, fragments, URLs) resolved server-side.
- `sx_page(ctx, page_sx)` — produces the full HTML shell (`<!doctype html>...`) with component definitions, CSS, and page SX inlined for client-side boot.

See the docstring in `shared/sx/async_eval.py` for the full evaluation-modes table.

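The idea of an SX wire format can be illustrated with a toy serializer. This is purely illustrative; the real format and evaluator live in `shared/sx/`, and none of these names come from the codebase:

```python
def to_sx(node) -> str:
    """Serialize a nested (tag, attrs, *children) tuple into an
    s-expression string, a toy stand-in for the SX wire format."""
    if isinstance(node, str):
        return '"%s"' % node.replace('"', '\\"')
    tag, attrs, *children = node
    parts = [tag]
    parts += [f':{k} "{v}"' for k, v in attrs.items()]
    parts += [to_sx(c) for c in children]
    return "(" + " ".join(parts) + ")"

wire = to_sx(("nav", {"class": "main"}, ("a", {"href": "/cart"}, "Cart")))
```

A server can send such a string unexpanded and let the client evaluate it, which is the essence of the `render_to_sx` mode described above.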
### Service SX Directory Convention

Each service has two SX-related directories:

- **`{service}/sx/`** — service-specific component definitions (`.sx` files with `defcomp`), loaded at startup by `load_service_components()`. These define layout components, reusable UI fragments, etc.
- **`{service}/sxc/`** — page definitions and Python rendering logic. Contains `defpage` definitions (client-routed pages) and the Python functions that compose headers, layouts, and page content.

Shared components live in `shared/sx/templates/` and are loaded by `load_shared_components()` in the app factory.

## Domains

| Service | Public URL | Dev Port |
|---------|-----------|----------|
| blog | blog.rose-ash.com | 8001 |
| market | market.rose-ash.com | 8002 |
| cart | cart.rose-ash.com | 8003 |
| events | events.rose-ash.com | 8004 |
| federation | federation.rose-ash.com | 8005 |
| account | account.rose-ash.com | 8006 |
| relations | (internal only) | 8008 |
| likes | (internal only) | 8009 |
| orders | orders.rose-ash.com | 8010 |

## Dev Container Mounts

Dev bind mounts in `docker-compose.dev.yml` must mirror the Docker image's COPY paths. When adding a new directory to a service (e.g. `{service}/sx/`), add a corresponding volume mount (`./{service}/sx:/app/sx`) or the directory won't be visible inside the dev container. Hypercorn `--reload` watches for Python file changes; `.sx` file hot-reload is handled by `reload_if_changed()` in `shared/sx/jinja_bridge.py`.

## Key Config Files

- `docker-compose.yml` / `docker-compose.dev.yml` — service definitions, env vars, volumes
- `deploy.sh` / `dev.sh` — deployment and development scripts
- `shared/infrastructure/factory.py` — app factory (all services use this)
- `{service}/alembic/env.py` — per-service migration config
- `_config/app-config.yaml` — runtime YAML config (mounted into containers)
- `artdag/l1/pyproject.toml` — mypy, pytest, ruff config for L1
- `artdag/l1/celery_app.py` — Celery initialization
- `artdag/l1/database.py` / `artdag/l2/db.py` — SQLAlchemy models
- `artdag/l1/docker-compose.yml` / `artdag/l2/docker-compose.yml` — Swarm stacks

## Tools

138
README.md
Normal file
@@ -0,0 +1,138 @@
# Rose Ash

Monorepo for the Rose Ash cooperative platform — six Quart microservices sharing a common infrastructure layer, a single PostgreSQL database, and an ActivityPub federation layer.

## Services

| Service | URL | Description |
|---------|-----|-------------|
| **blog** | blog.rose-ash.com | Content management, Ghost sync, navigation, editor |
| **market** | market.rose-ash.com | Product listings, scraping, market pages |
| **cart** | cart.rose-ash.com | Shopping cart, checkout, orders, SumUp payments |
| **events** | events.rose-ash.com | Calendar, event entries, container widgets |
| **federation** | federation.rose-ash.com | OAuth2 authorization server, ActivityPub hub, social features |
| **account** | account.rose-ash.com | User dashboard, newsletters, tickets, bookings |

All services are Python 3.11 / Quart apps served by Hypercorn, deployed as a Docker Swarm stack.

## Repository structure

```
rose-ash/
├── shared/              # Common code: models, services, infrastructure, templates
│   ├── models/          # Canonical SQLAlchemy ORM models (all domains)
│   ├── services/        # Domain service implementations + registry
│   ├── contracts/       # DTOs, protocols, widget contracts
│   ├── infrastructure/  # App factory, OAuth, ActivityPub, fragments, Jinja setup
│   ├── templates/       # Shared base templates and partials
│   ├── static/          # Shared CSS, JS, images
│   ├── editor/          # Prose editor (Node build, blog only)
│   └── alembic/         # Database migrations
├── blog/                # Blog app
├── market/              # Market app
├── cart/                # Cart app
├── events/              # Events app
├── federation/          # Federation app
├── account/             # Account app
├── docker-compose.yml   # Swarm stack definition
├── deploy.sh            # Local build + restart script
├── .gitea/workflows/    # CI: build changed apps + deploy
├── _config/             # Runtime config (app-config.yaml)
├── schema.sql           # Reference schema snapshot
└── .env                 # Environment variables (not committed)
```

Each app follows the same layout:

```
{app}/
├── app.py               # App entry point (creates Quart app)
├── path_setup.py        # Adds project root + app dir to sys.path
├── entrypoint.sh        # Container entrypoint (wait for DB, run migrations, start)
├── Dockerfile           # Build instructions (monorepo context)
├── bp/                  # Blueprints (routes, handlers)
│   └── fragments/       # Fragment endpoints for cross-app composition
├── models/              # Re-export stubs pointing to shared/models/
├── services/            # App-specific service wiring
├── templates/           # App-specific templates (override shared/)
└── config/              # App-specific config
```

## Key architecture patterns

**Shared models** — All ORM models live in `shared/models/`. Each app's `models/` directory contains thin re-export stubs. `factory.py` imports all six apps' models at startup so SQLAlchemy relationship references resolve across domains.

**Service contracts** — Apps communicate through typed protocols (`shared/contracts/protocols.py`) and frozen dataclass DTOs (`shared/contracts/dtos.py`), wired via a singleton registry (`shared/services/registry.py`). No direct HTTP calls between apps for domain logic.
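The contract pattern above can be sketched roughly like this. All names here are illustrative stand-ins, not the actual contents of `shared/contracts/` or `shared/services/registry.py`:

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass(frozen=True)
class PostDTO:
    """Immutable cross-app transfer object (illustrative)."""
    slug: str
    title: str

class BlogService(Protocol):
    """Typed protocol other apps program against."""
    def get_post(self, slug: str) -> PostDTO: ...

class _Registry:
    """Singleton-style registry wiring service names to implementations."""
    def __init__(self) -> None:
        self._services: dict[str, object] = {}
    def register(self, name: str, impl: object) -> None:
        self._services[name] = impl
    def get(self, name: str) -> object:
        return self._services[name]

registry = _Registry()

class InProcessBlogService:
    """Concrete implementation satisfying BlogService structurally."""
    def get_post(self, slug: str) -> PostDTO:
        return PostDTO(slug=slug, title=slug.replace("-", " ").title())

registry.register("blog", InProcessBlogService())
```

Because `Protocol` is structural, the consumer never imports the implementing app, only the contract, which is what keeps the domains decoupled.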

**Fragment composition** — Apps expose HTML fragments at `/internal/fragments/<type>` for cross-app UI composition. The blog fetches cart, account, navigation, and event fragments to compose its pages. Fragments are cached in Redis with short TTLs.
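The short-TTL fragment cache can be sketched as follows (a dict stands in for Redis; function names are illustrative):

```python
import time

_cache: dict[str, tuple[str, float]] = {}

def get_fragment(kind: str, fetch, ttl: float = 30.0, now=None) -> str:
    """Return a cached fragment, re-fetching once the TTL lapses."""
    t = time.time() if now is None else now
    hit = _cache.get(kind)
    if hit and hit[1] > t:
        return hit[0]                 # fresh: serve from cache
    html = fetch(kind)                # cross-app HTTP call in the real system
    _cache[kind] = (html, t + ttl)
    return html

calls = []
def fake_fetch(kind):
    calls.append(kind)
    return f"<div id='{kind}'>…</div>"

first = get_fragment("auth-menu", fake_fetch, ttl=30, now=0)
second = get_fragment("auth-menu", fake_fetch, ttl=30, now=10)   # cache hit
third = get_fragment("auth-menu", fake_fetch, ttl=30, now=100)   # expired, refetch
```

A short TTL bounds staleness of widgets like the auth menu while still absorbing most of the cross-app request traffic.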

**OAuth SSO** — Federation is the OAuth2 authorization server. All other apps are OAuth clients with per-app first-party session cookies (Safari ITP compatible). Login/callback/logout routes are auto-registered via `shared/infrastructure/oauth.py`.

**ActivityPub** — Each app has its own AP actor (a virtual projection of the same keypair). The federation app is the social hub (timeline, compose, follow, notifications). Activities are emitted to the `ap_activities` table and processed by `EventProcessor`.

## Development

### Quick deploy (skip CI)

```bash
# Rebuild + restart one app
./deploy.sh blog

# Rebuild + restart multiple apps
./deploy.sh blog market

# Rebuild all
./deploy.sh --all

# Auto-detect changes from git
./deploy.sh
```

### Full stack deploy

```bash
source .env
docker stack deploy -c docker-compose.yml coop
```

### Build a single app image

```bash
docker build -f blog/Dockerfile -t registry.rose-ash.com:5000/blog:latest .
```

### Run migrations

Migrations run automatically on **blog** service startup when `RUN_MIGRATIONS=true` is set (only blog runs migrations; all other apps skip them).

```bash
# Manual migration
docker exec -it $(docker ps -qf name=coop_blog) bash -c "cd shared && alembic upgrade head"
```

## CI/CD

A single Gitea Actions workflow (`.gitea/workflows/ci.yml`) handles all six apps:

1. Detects which files changed since the last deploy
2. If `shared/` or `docker-compose.yml` changed, rebuilds all apps
3. Otherwise rebuilds only apps with changes (or missing images)
4. Pushes images to the private registry
5. Runs `docker stack deploy` to update the swarm
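Steps 1–3 amount to a small selection rule, sketched here in Python (hedged: the workflow's actual logic lives in `.gitea/workflows/ci.yml` and may differ in detail):

```python
ALL_APPS = ["blog", "market", "cart", "events", "federation", "account"]
GLOBAL_PATHS = ("shared/", "docker-compose.yml")

def apps_to_rebuild(changed_files, missing_images=()):
    """Pick which app images to rebuild for a set of changed paths."""
    if any(f.startswith(GLOBAL_PATHS) for f in changed_files):
        return ALL_APPS                      # shared code changed: rebuild everything
    touched = {f.split("/", 1)[0] for f in changed_files}
    missing = set(missing_images)
    return [a for a in ALL_APPS if a in touched or a in missing]
```

Treating `shared/` as a global trigger is the conservative choice: any app may depend on it, so partial rebuilds are only safe for app-local changes.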

### Required secrets

| Secret | Value |
|--------|-------|
| `DEPLOY_SSH_KEY` | Private SSH key for root access to the deploy host |
| `DEPLOY_HOST` | Hostname or IP of the deploy server |

## Infrastructure

- **Runtime**: Python 3.11, Quart (async Flask), Hypercorn
- **Database**: PostgreSQL 16 (shared by all apps)
- **Cache**: Redis 7 (page cache, fragment cache, sessions)
- **Orchestration**: Docker Swarm
- **Registry**: `registry.rose-ash.com:5000`
- **CI**: Gitea Actions
- **Reverse proxy**: Caddy (external, not in this repo)
86
_config/app-config.yaml
Normal file
@@ -0,0 +1,86 @@
root: "/rose-ash-wholefood-coop"   # no trailing slash needed (we normalize it)
host: "https://rose-ash.com"
base_host: "wholesale.suma.coop"
base_login: https://wholesale.suma.coop/customer/account/login/
base_url: https://wholesale.suma.coop/
title: ROSE-ASH 2.0
market_root: /market
market_title: Market
blog_root: /
blog_title: all the news
cart_root: /cart

app_urls:
  blog: "https://blog.rose-ash.com"
  market: "https://market.rose-ash.com"
  cart: "https://cart.rose-ash.com"
  events: "https://events.rose-ash.com"
  federation: "https://federation.rose-ash.com"
  account: "https://account.rose-ash.com"
  sx: "https://sx.rose-ash.com"
  test: "https://test.rose-ash.com"
  orders: "https://orders.rose-ash.com"

cache:
  fs_root: /app/_snapshot   # <- absolute path to your snapshot dir

categories:
  allow:
    Basics: basics
    Branded Goods: branded-goods
    Chilled: chilled
    Frozen: frozen
    Non-foods: non-foods
    Supplements: supplements
    Christmas: christmas

slugs:
  skip:
    - ""
    - customer
    - account
    - checkout
    - wishlist
    - sales
    - contact
    - privacy-policy
    - terms-and-conditions
    - delivery
    - catalogsearch
    - quickorder
    - apply
    - search
    - static
    - media

section-titles:
  - ingredients
  - allergy information
  - allergens
  - nutritional information
  - nutrition
  - storage
  - directions
  - preparation
  - serving suggestions
  - origin
  - country of origin
  - recycling
  - general information
  - additional information
  - a note about prices

blacklist:
  category:
    - branded-goods/alcoholic-drinks
    - branded-goods/beers
    - branded-goods/ciders
    - branded-goods/wines
  product:
    - list-price-suma-current-suma-price-list-each-bk012-2-html
  product-details:
    - General Information
    - A Note About Prices

sumup:
  merchant_code: "ME4J6100"
  currency: "GBP"
  # Name of the environment variable that holds your SumUp API key
  api_key_env: "SUMUP_API_KEY"
  webhook_secret: "jfwlekjfwef798ewf769ew8f679ew8f7weflwef"
11
_config/init-databases.sql
Normal file
@@ -0,0 +1,11 @@
-- Per-domain databases for the coop stack.
-- Run once on fresh deployments (not needed for existing single-DB setups
-- that use the split-databases.sh migration script instead).
--
-- Usage: psql -U postgres -f init-databases.sql

CREATE DATABASE db_account;
CREATE DATABASE db_blog;
CREATE DATABASE db_market;     -- also houses cart tables (commerce bounded context)
CREATE DATABASE db_events;
CREATE DATABASE db_federation;
17
_config/move-page-configs.sql
Normal file
@@ -0,0 +1,17 @@
-- Move page_configs data from db_events to db_blog.
-- Run after split-databases.sh if page_configs data ended up in db_events.
--
-- Usage:
--   PGHOST=db PGUSER=postgres PGPASSWORD=change-me psql -f move-page-configs.sql
--

-- Step 1: Copy page_configs from db_events into db_blog via CSV
\c db_events
COPY page_configs TO '/tmp/page_configs.csv' WITH CSV HEADER;

\c db_blog
TRUNCATE page_configs;
COPY page_configs FROM '/tmp/page_configs.csv' WITH CSV HEADER;

-- Step 2: Verify
SELECT count(*) AS blog_page_configs FROM page_configs;
153
_config/split-databases.sh
Executable file
@@ -0,0 +1,153 @@
#!/usr/bin/env bash
#
# split-databases.sh — Migrate from a single appdb to per-domain databases.
#
# Prerequisites:
#   - All apps stopped (5-min maintenance window)
#   - init-databases.sql already run (CREATE DATABASE db_*)
#   - Run from a host that can reach the Postgres container
#
# Usage:
#   PGHOST=db PGUSER=postgres PGPASSWORD=change-me bash split-databases.sh
#
set -euo pipefail

SOURCE_DB="${SOURCE_DB:-appdb}"

# ── Table → database mapping ───────────────────────────────────────────────

declare -A DB_TABLES

DB_TABLES[db_account]="
users
magic_links
oauth_codes
oauth_grants
ghost_labels
user_labels
ghost_newsletters
user_newsletters
ghost_tiers
ghost_subscriptions
kv
"

DB_TABLES[db_blog]="
authors
tags
posts
post_authors
post_tags
post_likes
menu_items
menu_nodes
container_relations
page_configs
"

DB_TABLES[db_market]="
products
product_images
product_sections
product_labels
product_stickers
product_attributes
product_nutrition
product_allergens
product_likes
product_logs
market_places
nav_tops
nav_subs
listings
listing_items
link_errors
link_externals
subcategory_redirects
cart_items
orders
order_items
"

# db_cart merged into db_market — cart and market share the same bounded context
# (commerce). Cart needs direct read access to products/market_places.

DB_TABLES[db_events]="
calendars
calendar_slots
calendar_entries
calendar_entry_posts
ticket_types
tickets
"

DB_TABLES[db_federation]="
ap_anchors
ap_actor_profiles
ap_activities
ap_followers
ap_inbox_items
ap_remote_actors
ap_following
ap_remote_posts
ap_local_posts
ap_interactions
ap_notifications
ap_delivery_log
ipfs_pins
"

# ── Migrate each domain ────────────────────────────────────────────────────

for target_db in db_account db_blog db_market db_events db_federation; do
  tables="${DB_TABLES[$target_db]}"
  table_list=""
  for t in $tables; do
    table_list="$table_list --table=$t"
  done

  echo "=== Migrating $target_db ==="
  echo "    Tables: $(echo $tables | tr '\n' ' ')"

  # Dump schema + data for these tables from the source DB
  pg_dump "$SOURCE_DB" $table_list --no-owner --no-privileges \
    | psql -q "$target_db"

  echo "    Done."
done

# ── Stamp Alembic head in each domain DB ──────────────────────────────────

echo ""
echo "=== Stamping Alembic head in each DB ==="
for target_db in db_account db_blog db_market db_events db_federation; do
  # Create alembic_version table and stamp current head
  psql -q "$target_db" <<'SQL'
CREATE TABLE IF NOT EXISTS alembic_version (
    version_num VARCHAR(32) NOT NULL,
    CONSTRAINT alembic_version_pkc PRIMARY KEY (version_num)
);
DELETE FROM alembic_version;
INSERT INTO alembic_version (version_num) VALUES ('w3u1q9r0s1');
SQL
  echo "    $target_db stamped at w3u1q9r0s1"
done

echo ""
echo "=== Migration complete ==="
echo ""
echo "Next steps:"
echo "  1. Update docker-compose.yml — set per-app DATABASE_URL to the new DBs"
echo "  2. Remove schema_sql config (no longer needed)"
echo "  3. Redeploy all services"
echo ""
echo "Per-app DATABASE_URL values:"
echo "  blog:       postgresql+asyncpg://postgres:change-me@db:5432/db_blog"
echo "  market:     postgresql+asyncpg://postgres:change-me@db:5432/db_market"
echo "  cart:       postgresql+asyncpg://postgres:change-me@db:5432/db_market (shared with market)"
echo "  events:     postgresql+asyncpg://postgres:change-me@db:5432/db_events"
echo "  federation: postgresql+asyncpg://postgres:change-me@db:5432/db_federation"
echo "  account:    postgresql+asyncpg://postgres:change-me@db:5432/db_account"
echo ""
echo "  DATABASE_URL_ACCOUNT:    postgresql+asyncpg://postgres:change-me@db:5432/db_account"
echo "  DATABASE_URL_FEDERATION: postgresql+asyncpg://postgres:change-me@db:5432/db_federation"
56
account/Dockerfile
Normal file
@@ -0,0 +1,56 @@
# syntax=docker/dockerfile:1

# ---------- Python application ----------
FROM python:3.11-slim AS base

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    PYTHONPATH=/app \
    PIP_NO_CACHE_DIR=1 \
    APP_PORT=8000 \
    APP_MODULE=app:app

WORKDIR /app

# Install system deps + psql client
RUN apt-get update && apt-get install -y --no-install-recommends \
    ca-certificates \
    postgresql-client \
    && rm -rf /var/lib/apt/lists/*

COPY shared/requirements.txt ./requirements.txt
RUN pip install -r requirements.txt

# Shared code (replaces submodule)
COPY shared/ ./shared/

# App code
COPY account/ ./

# Sibling models for cross-domain SQLAlchemy imports
COPY blog/__init__.py ./blog/__init__.py
COPY blog/models/ ./blog/models/
COPY market/__init__.py ./market/__init__.py
COPY market/models/ ./market/models/
COPY cart/__init__.py ./cart/__init__.py
COPY cart/models/ ./cart/models/
COPY events/__init__.py ./events/__init__.py
COPY events/models/ ./events/models/
COPY federation/__init__.py ./federation/__init__.py
COPY federation/models/ ./federation/models/
COPY relations/__init__.py ./relations/__init__.py
COPY relations/models/ ./relations/models/
COPY likes/__init__.py ./likes/__init__.py
COPY likes/models/ ./likes/models/
COPY orders/__init__.py ./orders/__init__.py
COPY orders/models/ ./orders/models/

# ---------- Runtime setup ----------
COPY account/entrypoint.sh /usr/local/bin/entrypoint.sh
RUN chmod +x /usr/local/bin/entrypoint.sh

RUN useradd -m -u 10001 appuser && chown -R appuser:appuser /app
USER appuser

EXPOSE ${APP_PORT}
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
36
account/README.md
Normal file
@@ -0,0 +1,36 @@
# Account App

User dashboard for the Rose Ash cooperative. Provides account management, newsletter preferences, and widget pages for tickets and bookings.

## Structure

```
app.py           # Application factory (create_base_app + blueprints)
path_setup.py    # Adds project root + app dir to sys.path
entrypoint.sh    # Container entrypoint (Redis flush, start)
bp/
  account/       # Dashboard, newsletters, widget pages (tickets, bookings)
  auth/          # OAuth client routes + HTTP token exchange for non-coop clients
  fragments/     # auth-menu fragment (sign-in button / user menu)
models/          # Re-export stubs pointing to shared/models/
services/        # register_domain_services() — wires all domains
templates/       # Account-specific templates (override shared/)
```

## Auth menu

Account serves the `auth-menu` fragment consumed by all other apps' headers. It renders either a sign-in button (anonymous) or the user's email with a dropdown (authenticated), for both desktop and mobile layouts.

## OAuth token exchange

`POST /auth/oauth/token` provides HTTP-based token exchange for non-coop OAuth clients (e.g., Artdag).

## Cross-domain communication

- `services.blog.*` — post queries for page context
- `services.calendar.*` — calendar/entry queries for the bookings panel
- `services.cart.*` — cart summary + orders for the tickets panel

## Fragments served

- **auth-menu** — sign-in button or user email menu (desktop + mobile)
4
account/actions.sx
Normal file
@@ -0,0 +1,4 @@
;; Account service — inter-service action endpoints
;;
;; ghost-sync-member and ghost-push-member use local service imports,
;; so they remain as Python fallbacks.
35
account/alembic.ini
Normal file
@@ -0,0 +1,35 @@
[alembic]
script_location = alembic
sqlalchemy.url =

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
18
account/alembic/env.py
Normal file
@@ -0,0 +1,18 @@
from alembic import context
from shared.db.alembic_env import run_alembic

MODELS = [
    "shared.models.user",
    "shared.models.ghost_membership_entities",
    "shared.models.magic_link",
    "shared.models.oauth_code",
    "shared.models.oauth_grant",
]

TABLES = frozenset({
    "users", "user_labels", "user_newsletters",
    "magic_links", "oauth_codes", "oauth_grants",
    "ghost_labels", "ghost_newsletters", "ghost_tiers", "ghost_subscriptions",
})

run_alembic(context.config, MODELS, TABLES)
209
account/alembic/versions/0001_initial.py
Normal file
@@ -0,0 +1,209 @@
"""Initial account tables

Revision ID: acct_0001
Revises: -
Create Date: 2026-02-26
"""

import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects.postgresql import JSONB

revision = "acct_0001"
down_revision = None
branch_labels = None
depends_on = None


def _table_exists(conn, name):
    result = conn.execute(sa.text(
        "SELECT 1 FROM information_schema.tables WHERE table_schema='public' AND table_name=:t"
    ), {"t": name})
    return result.scalar() is not None
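The `_table_exists` guard makes the migration idempotent: re-running it against a database that already has `users` is a no-op. The same pattern can be demonstrated against SQLite's catalog (illustrative only; the migration itself queries PostgreSQL's `information_schema`):

```python
import sqlite3

def table_exists(conn: sqlite3.Connection, name: str) -> bool:
    # sqlite_master plays the role information_schema.tables plays in Postgres
    row = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type='table' AND name=?", (name,)
    ).fetchone()
    return row is not None

def upgrade(conn: sqlite3.Connection) -> None:
    if table_exists(conn, "users"):
        return                      # already migrated: do nothing
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")

conn = sqlite3.connect(":memory:")
upgrade(conn)
upgrade(conn)  # second run is a safe no-op
```

Guards like this matter when a pre-existing schema (here, tables created from `schema.sql`) may already contain the objects the first revision would otherwise create.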

def upgrade():
    if _table_exists(op.get_bind(), "users"):
        return

    # 1. users
    op.create_table(
        "users",
        sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
        sa.Column("email", sa.String(255), nullable=False),
        sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.text("now()")),
        sa.Column("last_login_at", sa.DateTime(timezone=True), nullable=True),
        sa.Column("ghost_id", sa.String(64), nullable=True),
        sa.Column("name", sa.String(255), nullable=True),
        sa.Column("ghost_status", sa.String(50), nullable=True),
        sa.Column("ghost_subscribed", sa.Boolean(), nullable=False, server_default=sa.true()),
        sa.Column("ghost_note", sa.Text(), nullable=True),
        sa.Column("avatar_image", sa.Text(), nullable=True),
        sa.Column("stripe_customer_id", sa.String(255), nullable=True),
        sa.Column("ghost_raw", JSONB(), nullable=True),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index("ix_user_email", "users", ["email"], unique=True)
    op.create_index(op.f("ix_users_ghost_id"), "users", ["ghost_id"], unique=True)
    op.create_index(op.f("ix_users_stripe_customer_id"), "users", ["stripe_customer_id"])

    # 2. ghost_labels
    op.create_table(
        "ghost_labels",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("ghost_id", sa.String(64), nullable=False),
        sa.Column("name", sa.String(255), nullable=False),
        sa.Column("slug", sa.String(255), nullable=True),
        sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
        sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_ghost_labels_ghost_id"), "ghost_labels", ["ghost_id"], unique=True)

    # 3. user_labels
    op.create_table(
        "user_labels",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("user_id", sa.Integer(), nullable=True),
        sa.Column("label_id", sa.Integer(), nullable=True),
        sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
        sa.ForeignKeyConstraint(["label_id"], ["ghost_labels.id"], ondelete="CASCADE"),
        sa.PrimaryKeyConstraint("id"),
        sa.UniqueConstraint("user_id", "label_id", name="uq_user_label"),
    )
    op.create_index(op.f("ix_user_labels_user_id"), "user_labels", ["user_id"])
    op.create_index(op.f("ix_user_labels_label_id"), "user_labels", ["label_id"])

    # 4. ghost_newsletters
    op.create_table(
        "ghost_newsletters",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("ghost_id", sa.String(64), nullable=False),
        sa.Column("name", sa.String(255), nullable=False),
        sa.Column("slug", sa.String(255), nullable=True),
        sa.Column("description", sa.Text(), nullable=True),
        sa.Column("created_at", sa.DateTime(timezone=True), nullable=False),
        sa.Column("updated_at", sa.DateTime(timezone=True), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_ghost_newsletters_ghost_id"), "ghost_newsletters", ["ghost_id"], unique=True)

    # 5. user_newsletters
    op.create_table(
        "user_newsletters",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("user_id", sa.Integer(), nullable=True),
        sa.Column("newsletter_id", sa.Integer(), nullable=True),
        sa.Column("subscribed", sa.Boolean(), nullable=False),
        sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
        sa.ForeignKeyConstraint(["newsletter_id"], ["ghost_newsletters.id"], ondelete="CASCADE"),
        sa.PrimaryKeyConstraint("id"),
        sa.UniqueConstraint("user_id", "newsletter_id", name="uq_user_newsletter"),
    )
    op.create_index(op.f("ix_user_newsletters_user_id"), "user_newsletters", ["user_id"])
    op.create_index(op.f("ix_user_newsletters_newsletter_id"), "user_newsletters", ["newsletter_id"])

    # 6. ghost_tiers
    op.create_table(
        "ghost_tiers",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("ghost_id", sa.String(64), nullable=False),
        sa.Column("name", sa.String(255), nullable=False),
        sa.Column("slug", sa.String(255), nullable=True),
        sa.Column("type", sa.String(50), nullable=True),
        sa.Column("visibility", sa.String(50), nullable=True),
        sa.PrimaryKeyConstraint("id"),
    )
    op.create_index(op.f("ix_ghost_tiers_ghost_id"), "ghost_tiers", ["ghost_id"], unique=True)

    # 7. ghost_subscriptions
    op.create_table(
        "ghost_subscriptions",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("ghost_id", sa.String(64), nullable=False),
        sa.Column("user_id", sa.Integer(), nullable=True),
        sa.Column("status", sa.String(50), nullable=True),
|
||||
sa.Column("tier_id", sa.Integer(), nullable=True),
|
||||
sa.Column("cadence", sa.String(50), nullable=True),
|
||||
sa.Column("price_amount", sa.Integer(), nullable=True),
|
||||
sa.Column("price_currency", sa.String(10), nullable=True),
|
||||
sa.Column("stripe_customer_id", sa.String(255), nullable=True),
|
||||
sa.Column("stripe_subscription_id", sa.String(255), nullable=True),
|
||||
sa.Column("raw", JSONB(), nullable=True),
|
||||
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
|
||||
sa.ForeignKeyConstraint(["tier_id"], ["ghost_tiers.id"], ondelete="SET NULL"),
|
||||
sa.PrimaryKeyConstraint("id"),
|
||||
)
|
||||
op.create_index(op.f("ix_ghost_subscriptions_ghost_id"), "ghost_subscriptions", ["ghost_id"], unique=True)
|
||||
op.create_index(op.f("ix_ghost_subscriptions_user_id"), "ghost_subscriptions", ["user_id"])
|
||||
op.create_index(op.f("ix_ghost_subscriptions_tier_id"), "ghost_subscriptions", ["tier_id"])
|
||||
op.create_index(op.f("ix_ghost_subscriptions_stripe_customer_id"), "ghost_subscriptions", ["stripe_customer_id"])
|
||||
op.create_index(op.f("ix_ghost_subscriptions_stripe_subscription_id"), "ghost_subscriptions", ["stripe_subscription_id"])
|
||||
|
||||
# 8. magic_links
|
||||
op.create_table(
|
||||
"magic_links",
|
||||
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
|
||||
sa.Column("token", sa.String(128), nullable=False),
|
||||
sa.Column("user_id", sa.Integer(), nullable=False),
|
||||
sa.Column("purpose", sa.String(32), nullable=False),
|
||||
sa.Column("expires_at", sa.DateTime(timezone=True), nullable=False),
|
||||
sa.Column("used_at", sa.DateTime(timezone=True), nullable=True),
|
||||
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.text("now()")),
|
||||
sa.Column("ip", sa.String(64), nullable=True),
|
||||
sa.Column("user_agent", sa.String(256), nullable=True),
|
||||
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
|
||||
sa.PrimaryKeyConstraint("id"),
|
||||
)
|
||||
op.create_index("ix_magic_link_token", "magic_links", ["token"], unique=True)
|
||||
op.create_index("ix_magic_link_user", "magic_links", ["user_id"])
|
||||
|
||||
# 9. oauth_codes
|
||||
op.create_table(
|
||||
"oauth_codes",
|
||||
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
|
||||
sa.Column("code", sa.String(128), nullable=False),
|
||||
sa.Column("user_id", sa.Integer(), nullable=False),
|
||||
sa.Column("client_id", sa.String(64), nullable=False),
|
||||
sa.Column("redirect_uri", sa.String(512), nullable=False),
|
||||
sa.Column("expires_at", sa.DateTime(timezone=True), nullable=False),
|
||||
sa.Column("used_at", sa.DateTime(timezone=True), nullable=True),
|
||||
sa.Column("grant_token", sa.String(128), nullable=True),
|
||||
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.text("now()")),
|
||||
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
|
||||
sa.PrimaryKeyConstraint("id"),
|
||||
)
|
||||
op.create_index("ix_oauth_code_code", "oauth_codes", ["code"], unique=True)
|
||||
op.create_index("ix_oauth_code_user", "oauth_codes", ["user_id"])
|
||||
|
||||
# 10. oauth_grants
|
||||
op.create_table(
|
||||
"oauth_grants",
|
||||
sa.Column("id", sa.Integer(), autoincrement=True, nullable=False),
|
||||
sa.Column("token", sa.String(128), nullable=False),
|
||||
sa.Column("user_id", sa.Integer(), nullable=False),
|
||||
sa.Column("client_id", sa.String(64), nullable=False),
|
||||
sa.Column("issuer_session", sa.String(128), nullable=False),
|
||||
sa.Column("device_id", sa.String(128), nullable=True),
|
||||
sa.Column("created_at", sa.DateTime(timezone=True), nullable=False, server_default=sa.text("now()")),
|
||||
sa.Column("revoked_at", sa.DateTime(timezone=True), nullable=True),
|
||||
sa.ForeignKeyConstraint(["user_id"], ["users.id"], ondelete="CASCADE"),
|
||||
sa.PrimaryKeyConstraint("id"),
|
||||
)
|
||||
op.create_index("ix_oauth_grant_token", "oauth_grants", ["token"], unique=True)
|
||||
op.create_index(op.f("ix_oauth_grants_user_id"), "oauth_grants", ["user_id"])
|
||||
op.create_index("ix_oauth_grant_issuer", "oauth_grants", ["issuer_session"])
|
||||
op.create_index("ix_oauth_grant_device", "oauth_grants", ["device_id", "client_id"])
|
||||
|
||||
|
||||
def downgrade():
|
||||
op.drop_table("oauth_grants")
|
||||
op.drop_table("oauth_codes")
|
||||
op.drop_table("magic_links")
|
||||
op.drop_table("ghost_subscriptions")
|
||||
op.drop_table("ghost_tiers")
|
||||
op.drop_table("user_newsletters")
|
||||
op.drop_table("ghost_newsletters")
|
||||
op.drop_table("user_labels")
|
||||
op.drop_table("ghost_labels")
|
||||
op.drop_table("users")
|
||||
account/alembic/versions/0002_hash_oauth_tokens.py (new file, 86 lines)
@@ -0,0 +1,86 @@
"""Add token_hash columns to oauth_grants and oauth_codes

Revision ID: acct_0002
Revises: acct_0001
Create Date: 2026-02-26
"""

import hashlib

import sqlalchemy as sa
from alembic import op

revision = "acct_0002"
down_revision = "acct_0001"
branch_labels = None
depends_on = None


def _hash(token: str) -> str:
    return hashlib.sha256(token.encode()).hexdigest()


def upgrade():
    # Add new hash columns
    op.add_column("oauth_grants", sa.Column("token_hash", sa.String(64), nullable=True))
    op.add_column("oauth_codes", sa.Column("code_hash", sa.String(64), nullable=True))
    op.add_column("oauth_codes", sa.Column("grant_token_hash", sa.String(64), nullable=True))

    # Backfill hashes from existing plaintext tokens
    conn = op.get_bind()
    grants = conn.execute(sa.text("SELECT id, token FROM oauth_grants WHERE token IS NOT NULL"))
    for row in grants:
        conn.execute(
            sa.text("UPDATE oauth_grants SET token_hash = :h WHERE id = :id"),
            {"h": _hash(row.token), "id": row.id},
        )

    codes = conn.execute(sa.text("SELECT id, code, grant_token FROM oauth_codes WHERE code IS NOT NULL"))
    for row in codes:
        params = {"id": row.id, "ch": _hash(row.code)}
        params["gh"] = _hash(row.grant_token) if row.grant_token else None
        conn.execute(
            sa.text("UPDATE oauth_codes SET code_hash = :ch, grant_token_hash = :gh WHERE id = :id"),
            params,
        )

    # Create unique indexes on hash columns
    op.create_index("ix_oauth_grant_token_hash", "oauth_grants", ["token_hash"], unique=True)
    op.create_index("ix_oauth_code_code_hash", "oauth_codes", ["code_hash"], unique=True)

    # Make original token columns nullable (keep for rollback safety)
    op.alter_column("oauth_grants", "token", nullable=True)
    op.alter_column("oauth_codes", "code", nullable=True)

    # Drop old unique indexes on plaintext columns
    try:
        op.drop_index("ix_oauth_grant_token", "oauth_grants")
    except Exception:
        pass
    try:
        op.drop_index("ix_oauth_code_code", "oauth_codes")
    except Exception:
        pass


def downgrade():
    # Restore original NOT NULL constraints
    op.alter_column("oauth_grants", "token", nullable=False)
    op.alter_column("oauth_codes", "code", nullable=False)

    # Drop hash columns and indexes
    try:
        op.drop_index("ix_oauth_grant_token_hash", "oauth_grants")
    except Exception:
        pass
    try:
        op.drop_index("ix_oauth_code_code_hash", "oauth_codes")
    except Exception:
        pass

    op.drop_column("oauth_grants", "token_hash")
    op.drop_column("oauth_codes", "code_hash")
    op.drop_column("oauth_codes", "grant_token_hash")

    # Restore original unique indexes
    op.create_index("ix_oauth_grant_token", "oauth_grants", ["token"], unique=True)
    op.create_index("ix_oauth_code_code", "oauth_codes", ["code"], unique=True)
account/alembic/versions/0003_add_user_profile_fields.py (new file, 43 lines)
@@ -0,0 +1,43 @@
"""Add author profile fields to users table.

Merges Ghost Author profile data into User — bio, profile_image, cover_image,
website, location, facebook, twitter, slug, is_admin.

Revision ID: acct_0003
Revises: acct_0002
"""
from alembic import op
import sqlalchemy as sa

revision = "acct_0003"
down_revision = "acct_0002"
branch_labels = None
depends_on = None


def upgrade():
    op.add_column("users", sa.Column("slug", sa.String(191), nullable=True))
    op.add_column("users", sa.Column("bio", sa.Text(), nullable=True))
    op.add_column("users", sa.Column("profile_image", sa.Text(), nullable=True))
    op.add_column("users", sa.Column("cover_image", sa.Text(), nullable=True))
    op.add_column("users", sa.Column("website", sa.Text(), nullable=True))
    op.add_column("users", sa.Column("location", sa.Text(), nullable=True))
    op.add_column("users", sa.Column("facebook", sa.Text(), nullable=True))
    op.add_column("users", sa.Column("twitter", sa.Text(), nullable=True))
    op.add_column("users", sa.Column(
        "is_admin", sa.Boolean(), nullable=False, server_default=sa.text("false"),
    ))
    op.create_index("ix_users_slug", "users", ["slug"], unique=True)


def downgrade():
    op.drop_index("ix_users_slug", "users")
    op.drop_column("users", "is_admin")
    op.drop_column("users", "twitter")
    op.drop_column("users", "facebook")
    op.drop_column("users", "location")
    op.drop_column("users", "website")
    op.drop_column("users", "cover_image")
    op.drop_column("users", "profile_image")
    op.drop_column("users", "bio")
    op.drop_column("users", "slug")
account/app.py (new file, 118 lines)
@@ -0,0 +1,118 @@
from __future__ import annotations

import path_setup  # noqa: F401  # adds shared/ to sys.path

from pathlib import Path

from quart import g, request
from jinja2 import FileSystemLoader, ChoiceLoader

from shared.infrastructure.factory import create_base_app

from bp import register_account_bp, register_auth_bp


async def account_context() -> dict:
    """Account app context processor."""
    from shared.infrastructure.context import base_context
    from shared.infrastructure.cart_identity import current_cart_identity
    from shared.infrastructure.fragments import fetch_fragments
    from shared.infrastructure.data_client import fetch_data
    from shared.contracts.dtos import CartSummaryDTO, dto_from_dict

    ctx = await base_context()

    # menu_nodes lives in db_blog; nav-tree fragment provides the real nav
    ctx["menu_items"] = []

    # Cart data via internal data endpoint
    ident = current_cart_identity()
    summary_params = {}
    if ident["user_id"] is not None:
        summary_params["user_id"] = ident["user_id"]
    if ident["session_id"] is not None:
        summary_params["session_id"] = ident["session_id"]
    raw = await fetch_data("cart", "cart-summary", params=summary_params, required=False)
    summary = dto_from_dict(CartSummaryDTO, raw) if raw else CartSummaryDTO()
    ctx["cart_count"] = summary.count + summary.calendar_count + summary.ticket_count
    ctx["cart_total"] = float(summary.total + summary.calendar_total + summary.ticket_total)

    # Pre-fetch cross-app HTML fragments concurrently
    user = getattr(g, "user", None)
    cart_params = {}
    if ident["user_id"] is not None:
        cart_params["user_id"] = ident["user_id"]
    if ident["session_id"] is not None:
        cart_params["session_id"] = ident["session_id"]

    cart_mini, auth_menu, nav_tree = await fetch_fragments([
        ("cart", "cart-mini", cart_params or None),
        ("account", "auth-menu", {"email": user.email} if user else None),
        ("blog", "nav-tree", {"app_name": "account", "path": request.path}),
    ])
    ctx["cart_mini"] = cart_mini
    ctx["auth_menu"] = auth_menu
    ctx["nav_tree"] = nav_tree

    return ctx


def create_app() -> "Quart":
    from services import register_domain_services

    app = create_base_app(
        "account",
        context_fn=account_context,
        domain_services_fn=register_domain_services,
    )

    # App-specific templates override shared templates
    app_templates = str(Path(__file__).resolve().parent / "templates")
    app.jinja_loader = ChoiceLoader([
        FileSystemLoader(app_templates),
        app.jinja_loader,
    ])

    # Load .sx component files and set up defpage routes
    from shared.sx.jinja_bridge import load_service_components
    load_service_components(str(Path(__file__).resolve().parent), service_name="account")
    from sxc.pages import setup_account_pages
    setup_account_pages()

    # --- blueprints ---
    app.register_blueprint(register_auth_bp())

    account_bp = register_account_bp()
    app.register_blueprint(account_bp)

    from shared.sx.pages import auto_mount_pages
    auto_mount_pages(app, "account")

    from shared.sx.handlers import auto_mount_fragment_handlers
    auto_mount_fragment_handlers(app, "account")

    from bp.actions.routes import register as register_actions
    app.register_blueprint(register_actions())

    from bp.data.routes import register as register_data
    app.register_blueprint(register_data())

    # --- Ghost membership sync at startup (background) ---
    # Runs as a background task to avoid blocking Hypercorn's startup timeout.
    @app.before_serving
    async def _schedule_ghost_membership_sync():
        import asyncio

        async def _sync():
            from services.ghost_membership import sync_all_membership_from_ghost
            from shared.db.session import get_session
            try:
                async with get_session() as s:
                    await sync_all_membership_from_ghost(s)
                    await s.commit()
                print("[account] Ghost membership sync complete")
            except Exception as e:
                print(f"[account] Ghost membership sync failed (non-fatal): {e}")

        # before_serving runs inside the event loop, so get_running_loop()
        # is safe here and avoids the deprecated get_event_loop() call.
        asyncio.get_running_loop().create_task(_sync())

    return app


app = create_app()
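The context processor above folds three cart segments (items, calendar, tickets) into one badge count and money total. A minimal standalone sketch of that aggregation; `CartSummary` is a hypothetical stand-in for `shared.contracts.dtos.CartSummaryDTO`, with field names taken from the code above:

```python
from dataclasses import dataclass


@dataclass
class CartSummary:
    # Stand-in for CartSummaryDTO; the zero defaults mirror the empty-cart
    # fallback used when fetch_data returns nothing.
    count: int = 0
    calendar_count: int = 0
    ticket_count: int = 0
    total: float = 0.0
    calendar_total: float = 0.0
    ticket_total: float = 0.0


def badge_values(summary: CartSummary) -> tuple[int, float]:
    """Combined item count and total, as the context processor computes them."""
    count = summary.count + summary.calendar_count + summary.ticket_count
    total = float(summary.total + summary.calendar_total + summary.ticket_total)
    return count, total
```

Keeping the three segments separate in the DTO and summing only at render time lets each downstream app (cart, events) own its own totals.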
account/bp/__init__.py (new file, 2 lines)
@@ -0,0 +1,2 @@
from .account.routes import register as register_account_bp
from .auth.routes import register as register_auth_bp
account/bp/account/__init__.py (new empty file)

account/bp/account/routes.py (new file, 79 lines)
@@ -0,0 +1,79 @@
"""Account pages blueprint.

Moved from federation/bp/auth — newsletters, fragment pages (tickets, bookings).
Mounted at root /. GET page handlers replaced by defpage.
"""
from __future__ import annotations

from quart import (
    Blueprint,
    g,
)
from sqlalchemy import select

from shared.models import UserNewsletter
from shared.infrastructure.fragments import fetch_fragments
from shared.sx.helpers import sx_response, sx_call


def register(url_prefix="/"):
    account_bp = Blueprint("account", __name__, url_prefix=url_prefix)

    @account_bp.before_request
    async def _prepare_page_data():
        """Fetch account_nav fragments for layout."""
        events_nav, cart_nav, artdag_nav = await fetch_fragments([
            ("events", "account-nav-item", {}),
            ("cart", "account-nav-item", {}),
            ("artdag", "nav-item", {}),
        ], required=False)
        g.account_nav = events_nav + cart_nav + artdag_nav

    @account_bp.post("/newsletter/<int:newsletter_id>/toggle/")
    async def toggle_newsletter(newsletter_id: int):
        if not g.get("user"):
            return "", 401

        result = await g.s.execute(
            select(UserNewsletter).where(
                UserNewsletter.user_id == g.user.id,
                UserNewsletter.newsletter_id == newsletter_id,
            )
        )
        un = result.scalar_one_or_none()

        if un:
            un.subscribed = not un.subscribed
        else:
            un = UserNewsletter(
                user_id=g.user.id,
                newsletter_id=newsletter_id,
                subscribed=True,
            )
            g.s.add(un)

        await g.s.flush()

        # Render toggle directly — no sx_components intermediary
        from shared.browser.app.csrf import generate_csrf_token
        from shared.infrastructure.urls import account_url

        nid = un.newsletter_id
        url_fn = getattr(g, "_account_url", None) or account_url
        toggle_url = url_fn(f"/newsletter/{nid}/toggle/")
        csrf = generate_csrf_token()
        bg = "bg-emerald-500" if un.subscribed else "bg-stone-300"
        translate = "translate-x-6" if un.subscribed else "translate-x-1"
        checked = "true" if un.subscribed else "false"

        return sx_response(sx_call(
            "account-newsletter-toggle",
            id=f"nl-{nid}", url=toggle_url,
            hdrs=f'{{"X-CSRFToken": "{csrf}"}}',
            target=f"#nl-{nid}",
            cls=f"relative inline-flex h-6 w-11 items-center rounded-full transition-colors focus:outline-none focus:ring-2 focus:ring-emerald-500 focus:ring-offset-2 {bg}",
            checked=checked,
            knob_cls=f"inline-block h-4 w-4 rounded-full bg-white shadow transform transition-transform {translate}",
        ))

    return account_bp
account/bp/actions/__init__.py (new empty file)

account/bp/actions/routes.py (new file, 37 lines)
@@ -0,0 +1,37 @@
"""Account app action endpoints.

All actions remain as Python fallbacks (local service imports).
"""
from __future__ import annotations

from quart import Blueprint, g, request

from shared.infrastructure.query_blueprint import create_action_blueprint


def register() -> Blueprint:
    bp, _handlers = create_action_blueprint("account")

    async def _ghost_sync_member():
        data = await request.get_json()
        ghost_id = data.get("ghost_id")
        if not ghost_id:
            return {"error": "ghost_id required"}, 400
        from services.ghost_membership import sync_single_member
        await sync_single_member(g.s, ghost_id)
        return {"ok": True}

    _handlers["ghost-sync-member"] = _ghost_sync_member

    async def _ghost_push_member():
        data = await request.get_json()
        user_id = data.get("user_id")
        if not user_id:
            return {"error": "user_id required"}, 400
        from services.ghost_membership import sync_member_to_ghost
        result_id = await sync_member_to_ghost(g.s, int(user_id))
        return {"ok": True, "ghost_id": result_id}

    _handlers["ghost-push-member"] = _ghost_push_member

    return bp
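The blueprint above registers coroutines into a `_handlers` dict keyed by action slug. A minimal sketch of that registry-dispatch shape, under the assumption (not verified here) that `create_action_blueprint` routes incoming action names through such a dict; all names below are illustrative:

```python
import asyncio
from typing import Awaitable, Callable, Dict

# An action handler takes no arguments and returns a JSON-serializable dict.
Handler = Callable[[], Awaitable[dict]]


async def dispatch(handlers: Dict[str, Handler], name: str) -> dict:
    """Look up a handler by slug and await it; unknown slugs yield an error dict."""
    handler = handlers.get(name)
    if handler is None:
        return {"error": f"unknown action: {name}"}
    return await handler()
```

The dict-of-coroutines pattern keeps route registration in one place while letting each app append its own named actions, as `routes.py` does with `ghost-sync-member` and `ghost-push-member`.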
account/bp/auth/__init__.py (new empty file)

account/bp/auth/routes.py (new file, 764 lines)
@@ -0,0 +1,764 @@
|
||||
"""Authentication routes for the account app.
|
||||
|
||||
Account is the OAuth authorization server. Owns magic link login/logout,
|
||||
OAuth2 authorize endpoint, grant verification, and SSO logout.
|
||||
"""
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
import secrets
|
||||
from datetime import datetime, timezone, timedelta
|
||||
|
||||
from quart import (
|
||||
Blueprint,
|
||||
request,
|
||||
redirect,
|
||||
url_for,
|
||||
session as qsession,
|
||||
g,
|
||||
current_app,
|
||||
jsonify,
|
||||
)
|
||||
from sqlalchemy import select, update
|
||||
from sqlalchemy.exc import SQLAlchemyError
|
||||
|
||||
from shared.db.session import get_session
|
||||
from shared.models import User
|
||||
from shared.models.oauth_code import OAuthCode
|
||||
from shared.models.oauth_grant import OAuthGrant, hash_token
|
||||
from shared.infrastructure.urls import account_url, app_url
|
||||
from shared.infrastructure.cart_identity import current_cart_identity
|
||||
from shared.infrastructure.rate_limit import rate_limit, check_poll_backoff
|
||||
from shared.events import emit_activity
|
||||
|
||||
from .services import (
|
||||
pop_login_redirect_target,
|
||||
store_login_redirect_target,
|
||||
send_magic_email,
|
||||
find_or_create_user,
|
||||
create_magic_link,
|
||||
validate_magic_link,
|
||||
validate_email,
|
||||
)
|
||||
|
||||
SESSION_USER_KEY = "uid"
|
||||
ACCOUNT_SESSION_KEY = "account_sid"
|
||||
|
||||
|
||||
async def _render_auth_page(component: str, title: str, **kwargs) -> str:
|
||||
"""Render an auth page with root layout — replaces sx_components helpers."""
|
||||
from shared.sx.helpers import sx_call, full_page_sx, root_header_sx
|
||||
from shared.sx.page import get_template_context
|
||||
ctx = await get_template_context()
|
||||
hdr = await root_header_sx(ctx)
|
||||
content = sx_call(component, **{k: v for k, v in kwargs.items() if v})
|
||||
return await full_page_sx(ctx, header_rows=hdr, content=content,
|
||||
meta_html=f"<title>{title}</title>")
|
||||
|
||||
ALLOWED_CLIENTS = {"blog", "market", "cart", "events", "federation", "orders", "test", "sx", "artdag", "artdag_l2"}
|
||||
|
||||
|
||||
def register(url_prefix="/auth"):
|
||||
auth_bp = Blueprint("auth", __name__, url_prefix=url_prefix)
|
||||
|
||||
# --- OAuth2 authorize endpoint -------------------------------------------
|
||||
|
||||
@auth_bp.get("/oauth/authorize")
|
||||
@auth_bp.get("/oauth/authorize/")
|
||||
async def oauth_authorize():
|
||||
client_id = request.args.get("client_id", "")
|
||||
redirect_uri = request.args.get("redirect_uri", "")
|
||||
state = request.args.get("state", "")
|
||||
device_id = request.args.get("device_id", "")
|
||||
prompt = request.args.get("prompt", "")
|
||||
|
||||
if client_id not in ALLOWED_CLIENTS:
|
||||
return "Invalid client_id", 400
|
||||
|
||||
expected_redirect = app_url(client_id, "/auth/callback")
|
||||
if redirect_uri != expected_redirect:
|
||||
return "Invalid redirect_uri", 400
|
||||
|
||||
# Account's own device id — always available via factory hook
|
||||
account_did = g.device_id
|
||||
|
||||
# Not logged in
|
||||
if not g.get("user"):
|
||||
if prompt == "none":
|
||||
# Silent check — pass account_did so client can watch for future logins
|
||||
sep = "&" if "?" in redirect_uri else "?"
|
||||
return redirect(
|
||||
f"{redirect_uri}{sep}error=login_required"
|
||||
f"&state={state}&account_did={account_did}"
|
||||
)
|
||||
authorize_path = request.full_path
|
||||
store_login_redirect_target()
|
||||
return redirect(url_for("auth.login_form", next=authorize_path))
|
||||
|
||||
# Logged in — create grant + authorization code
|
||||
account_sid = qsession.get(ACCOUNT_SESSION_KEY)
|
||||
if not account_sid:
|
||||
account_sid = secrets.token_urlsafe(32)
|
||||
qsession[ACCOUNT_SESSION_KEY] = account_sid
|
||||
|
||||
grant_token = secrets.token_urlsafe(48)
|
||||
code = secrets.token_urlsafe(48)
|
||||
now = datetime.now(timezone.utc)
|
||||
expires = now + timedelta(minutes=5)
|
||||
|
||||
async with get_session() as s:
|
||||
async with s.begin():
|
||||
grant = OAuthGrant(
|
||||
token=None,
|
||||
token_hash=hash_token(grant_token),
|
||||
user_id=g.user.id,
|
||||
client_id=client_id,
|
||||
issuer_session=account_sid,
|
||||
device_id=device_id or None,
|
||||
)
|
||||
s.add(grant)
|
||||
|
||||
oauth_code = OAuthCode(
|
||||
code=None,
|
||||
code_hash=hash_token(code),
|
||||
user_id=g.user.id,
|
||||
client_id=client_id,
|
||||
redirect_uri=redirect_uri,
|
||||
expires_at=expires,
|
||||
grant_token=None,
|
||||
grant_token_hash=hash_token(grant_token),
|
||||
)
|
||||
s.add(oauth_code)
|
||||
|
||||
sep = "&" if "?" in redirect_uri else "?"
|
||||
return redirect(
|
||||
f"{redirect_uri}{sep}code={code}&state={state}"
|
||||
f"&account_did={account_did}&grant_token={grant_token}"
|
||||
)
|
||||
|
||||
# --- OAuth2 token exchange (for external clients like artdag) -------------
|
||||
|
||||
from shared.browser.app.csrf import csrf_exempt
|
||||
|
||||
@csrf_exempt
|
||||
@auth_bp.post("/oauth/token")
|
||||
@auth_bp.post("/oauth/token/")
|
||||
async def oauth_token():
|
||||
"""Exchange an authorization code for user info + grant token.
|
||||
|
||||
Used by clients that don't share the coop database (e.g. artdag).
|
||||
Accepts JSON: {code, client_id, redirect_uri}
|
||||
Returns JSON: {user_id, username, display_name, grant_token}
|
||||
"""
|
||||
data = await request.get_json()
|
||||
if not data:
|
||||
return jsonify({"error": "invalid_request"}), 400
|
||||
|
||||
code = data.get("code", "")
|
||||
client_id = data.get("client_id", "")
|
||||
redirect_uri = data.get("redirect_uri", "")
|
||||
|
||||
if client_id not in ALLOWED_CLIENTS:
|
||||
return jsonify({"error": "invalid_client"}), 400
|
||||
|
||||
now = datetime.now(timezone.utc)
|
||||
|
||||
code_h = hash_token(code)
|
||||
async with get_session() as s:
|
||||
async with s.begin():
|
||||
# Look up by hash first (new grants), fall back to plaintext (migration)
|
||||
result = await s.execute(
|
||||
select(OAuthCode)
|
||||
.where(
|
||||
(OAuthCode.code_hash == code_h) | (OAuthCode.code == code)
|
||||
)
|
||||
.with_for_update()
|
||||
)
|
||||
oauth_code = result.scalar_one_or_none()
|
||||
|
||||
if not oauth_code:
|
||||
return jsonify({"error": "invalid_grant"}), 400
|
||||
|
||||
if oauth_code.used_at is not None:
|
||||
return jsonify({"error": "invalid_grant"}), 400
|
||||
|
||||
if oauth_code.expires_at < now:
|
||||
return jsonify({"error": "invalid_grant"}), 400
|
||||
|
||||
if oauth_code.client_id != client_id:
|
||||
return jsonify({"error": "invalid_grant"}), 400
|
||||
|
||||
if oauth_code.redirect_uri != redirect_uri:
|
||||
return jsonify({"error": "invalid_grant"}), 400
|
||||
|
||||
oauth_code.used_at = now
|
||||
user_id = oauth_code.user_id
|
||||
grant_token = oauth_code.grant_token
|
||||
|
||||
user = await s.get(User, user_id)
|
||||
if not user:
|
||||
return jsonify({"error": "invalid_grant"}), 400
|
||||
|
||||
return jsonify({
|
||||
"user_id": user_id,
|
||||
"username": user.email or "",
|
||||
"display_name": user.name or "",
|
||||
"grant_token": grant_token,
|
||||
})
|
||||
|
||||
# --- Grant verification (internal endpoint) ------------------------------
|
||||
|
||||
@auth_bp.get("/internal/verify-grant")
|
||||
async def verify_grant():
|
||||
"""Called by client apps to check if a grant is still valid."""
|
||||
token = request.args.get("token", "")
|
||||
if not token:
|
||||
return jsonify({"valid": False}), 200
|
||||
|
||||
token_h = hash_token(token)
|
||||
async with get_session() as s:
|
||||
grant = await s.scalar(
|
||||
select(OAuthGrant).where(
|
||||
(OAuthGrant.token_hash == token_h) | (OAuthGrant.token == token)
|
||||
)
|
||||
)
|
||||
if not grant or grant.revoked_at is not None:
|
||||
return jsonify({"valid": False}), 200
|
||||
user = await s.get(User, grant.user_id)
|
||||
return jsonify({
|
||||
"valid": True,
|
||||
"user_id": grant.user_id,
|
||||
"username": user.email if user else "",
|
||||
"display_name": user.name if user else "",
|
||||
}), 200
|
||||
|
||||
@auth_bp.get("/internal/check-device")
|
||||
async def check_device():
|
||||
"""Called by client apps to check if a device has an active auth.
|
||||
|
||||
Looks up the most recent grant for (device_id, client_id).
|
||||
If the grant is active → {active: true}.
|
||||
If revoked but user has logged in since → {active: true} (re-auth needed).
|
||||
Otherwise → {active: false}.
|
||||
"""
|
||||
device_id = request.args.get("device_id", "")
|
||||
app_name = request.args.get("app", "")
|
||||
if not device_id or not app_name:
|
||||
return jsonify({"active": False}), 200
|
||||
|
||||
async with get_session() as s:
|
||||
# Find the most recent grant for this device + app
|
||||
result = await s.execute(
|
||||
select(OAuthGrant)
|
||||
.where(OAuthGrant.device_id == device_id)
|
||||
.where(OAuthGrant.client_id == app_name)
|
||||
.order_by(OAuthGrant.created_at.desc())
|
||||
.limit(1)
|
||||
)
|
||||
grant = result.scalar_one_or_none()
|
||||
|
||||
if not grant:
|
||||
return jsonify({"active": False}), 200
|
||||
|
||||
# Grant still active
|
||||
if grant.revoked_at is None:
|
||||
return jsonify({"active": True}), 200
|
||||
|
||||
# Grant revoked — check if user logged in since
|
||||
user = await s.get(User, grant.user_id)
|
||||
if user and user.last_login_at and user.last_login_at > grant.revoked_at:
|
||||
return jsonify({"active": True}), 200
|
||||
|
||||
return jsonify({"active": False}), 200

# --- Magic link login flow -----------------------------------------------

@auth_bp.get("/login/")
async def login_form():
    store_login_redirect_target()
    cross_cart_sid = request.args.get("cart_sid")
    if cross_cart_sid:
        import re
        # Validate cart_sid is a hex token (32 chars from token_hex(16))
        if re.fullmatch(r"[0-9a-f]{32}", cross_cart_sid):
            qsession["cart_sid"] = cross_cart_sid
    if g.get("user"):
        redirect_url = pop_login_redirect_target()
        return redirect(redirect_url)

    return await _render_auth_page("account-login-content", "Login \u2014 Rose Ash")

@rate_limit(
    key_func=lambda: request.headers.get("X-Forwarded-For", request.remote_addr),
    max_requests=10, window_seconds=900, scope="magic_ip",
)
@auth_bp.post("/start/")
async def start_login():
    form = await request.form
    email_input = form.get("email") or ""

    is_valid, email = validate_email(email_input)
    if not is_valid:
        return await _render_auth_page(
            "account-login-content", "Login \u2014 Rose Ash",
            error="Please enter a valid email address.", email=email_input,
        ), 400

    # Per-email rate limit: 5 magic links per 15 minutes
    from shared.infrastructure.rate_limit import _check_rate_limit
    try:
        allowed, _ = await _check_rate_limit(f"magic_email:{email}", 5, 900)
        if not allowed:
            return await _render_auth_page(
                "account-check-email-content", "Check your email \u2014 Rose Ash",
                email=email,
            ), 200
    except Exception:
        pass  # Redis down — allow the request

    user = await find_or_create_user(g.s, email)
    token, expires = await create_magic_link(g.s, user.id)

    from shared.utils import host_url
    magic_url = host_url(url_for("auth.magic", token=token))

    email_error = None
    try:
        await send_magic_email(email, magic_url)
    except Exception as e:
        current_app.logger.error("EMAIL SEND FAILED: %r", e)
        email_error = (
            "We couldn't send the email automatically. "
            "Please try again in a moment."
        )

    return await _render_auth_page(
        "account-check-email-content", "Check your email \u2014 Rose Ash",
        email=email, email_error=email_error,
    )
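`_check_rate_limit` comes from shared infrastructure and its implementation is not part of this diff; judging from the call site it returns an (allowed, remaining) style tuple keyed on a string. A minimal in-memory fixed-window sketch of that assumed contract:

```python
import time

# In-memory stand-in for the Redis-backed _check_rate_limit used above.
# The (allowed, remaining) return shape is an assumption; the diff only
# shows the first tuple element being used.
_windows: dict = {}  # key -> (window_start, count)

def check_rate_limit(key: str, max_requests: int, window_seconds: int):
    now = time.monotonic()
    start, count = _windows.get(key, (None, 0))
    if start is None or now - start >= window_seconds:
        _windows[key] = (now, 1)          # new window: first request
        return True, max_requests - 1
    if count < max_requests:
        _windows[key] = (start, count + 1)  # still under the cap
        return True, max_requests - (count + 1)
    return False, 0                        # cap reached for this window

assert check_rate_limit("magic_email:a@b.c", 2, 900) == (True, 1)
assert check_rate_limit("magic_email:a@b.c", 2, 900) == (True, 0)
assert check_rate_limit("magic_email:a@b.c", 2, 900) == (False, 0)
```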

@auth_bp.get("/magic/<token>/")
async def magic(token: str):
    now = datetime.now(timezone.utc)
    user_id: int | None = None

    try:
        async with get_session() as s:
            async with s.begin():
                user, error = await validate_magic_link(s, token)

                if error:
                    return await _render_auth_page(
                        "account-login-content", "Login \u2014 Rose Ash",
                        error=error,
                    ), 400
                user_id = user.id

    except Exception:
        return await _render_auth_page(
            "account-login-content", "Login \u2014 Rose Ash",
            error="Could not sign you in right now. Please try again.",
        ), 502

    assert user_id is not None

    ident = current_cart_identity()
    anon_session_id = ident.get("session_id")

    try:
        async with get_session() as s:
            async with s.begin():
                u2 = await s.get(User, user_id)
                if u2:
                    u2.last_login_at = now
                if anon_session_id:
                    await emit_activity(
                        s,
                        activity_type="rose:Login",
                        actor_uri="internal:system",
                        object_type="Person",
                        object_data={
                            "user_id": user_id,
                            "session_id": anon_session_id,
                        },
                    )
                # Notify external services of device login
                await emit_activity(
                    s,
                    activity_type="rose:DeviceAuth",
                    actor_uri="internal:system",
                    object_type="Device",
                    object_data={
                        "device_id": g.device_id,
                        "action": "login",
                    },
                )
    except SQLAlchemyError:
        current_app.logger.exception(
            "[auth] non-fatal DB update failed for user_id=%s", user_id
        )

    qsession[SESSION_USER_KEY] = user_id
    # Fresh account session ID for grant tracking
    qsession[ACCOUNT_SESSION_KEY] = secrets.token_urlsafe(32)

    # Signal login for this device so client apps can detect it
    try:
        from shared.infrastructure.auth_redis import get_auth_redis
        import time as _time
        _auth_r = await get_auth_redis()
        await _auth_r.set(
            f"did_auth:{g.device_id}",
            str(_time.time()).encode(),
            ex=30 * 24 * 3600,
        )
    except Exception:
        current_app.logger.exception("[auth] failed to set did_auth in Redis")

    redirect_url = pop_login_redirect_target()
    return redirect(redirect_url, 303)

@auth_bp.post("/logout/")
async def logout():
    # Revoke all grants issued by this account session
    account_sid = qsession.get(ACCOUNT_SESSION_KEY)
    if account_sid:
        try:
            async with get_session() as s:
                async with s.begin():
                    await s.execute(
                        update(OAuthGrant)
                        .where(OAuthGrant.issuer_session == account_sid)
                        .where(OAuthGrant.revoked_at.is_(None))
                        .values(revoked_at=datetime.now(timezone.utc))
                    )
        except SQLAlchemyError:
            current_app.logger.exception("[auth] failed to revoke grants")

    # Clear login signal for this device
    try:
        from shared.infrastructure.auth_redis import get_auth_redis
        _auth_r = await get_auth_redis()
        await _auth_r.delete(f"did_auth:{g.device_id}")
    except Exception:
        pass

    # Notify external services of device logout
    try:
        async with get_session() as s:
            async with s.begin():
                await emit_activity(
                    s,
                    activity_type="rose:DeviceAuth",
                    actor_uri="internal:system",
                    object_type="Device",
                    object_data={
                        "device_id": g.device_id,
                        "action": "logout",
                    },
                )
    except Exception:
        current_app.logger.exception("[auth] failed to emit DeviceAuth logout")

    qsession.pop(SESSION_USER_KEY, None)
    qsession.pop(ACCOUNT_SESSION_KEY, None)
    from shared.infrastructure.urls import blog_url
    return redirect(blog_url("/"))

@auth_bp.get("/sso-logout/")
async def sso_logout():
    """SSO logout called by client apps: revoke grants, clear session."""
    account_sid = qsession.get(ACCOUNT_SESSION_KEY)
    if account_sid:
        try:
            async with get_session() as s:
                async with s.begin():
                    await s.execute(
                        update(OAuthGrant)
                        .where(OAuthGrant.issuer_session == account_sid)
                        .where(OAuthGrant.revoked_at.is_(None))
                        .values(revoked_at=datetime.now(timezone.utc))
                    )
        except SQLAlchemyError:
            current_app.logger.exception("[auth] failed to revoke grants")

    # Clear login signal for this device
    try:
        from shared.infrastructure.auth_redis import get_auth_redis
        _auth_r = await get_auth_redis()
        await _auth_r.delete(f"did_auth:{g.device_id}")
    except Exception:
        pass

    # Notify external services of device logout
    try:
        async with get_session() as s:
            async with s.begin():
                await emit_activity(
                    s,
                    activity_type="rose:DeviceAuth",
                    actor_uri="internal:system",
                    object_type="Device",
                    object_data={
                        "device_id": g.device_id,
                        "action": "logout",
                    },
                )
    except Exception:
        current_app.logger.exception("[auth] failed to emit DeviceAuth logout")

    qsession.pop(SESSION_USER_KEY, None)
    qsession.pop(ACCOUNT_SESSION_KEY, None)
    from shared.infrastructure.urls import blog_url
    return redirect(blog_url("/"))

@auth_bp.get("/clear/")
async def clear():
    """One-time migration helper: clear all session cookies."""
    qsession.clear()
    resp = redirect(account_url("/"))
    resp.delete_cookie("blog_session", domain=".rose-ash.com", path="/")
    return resp

# --- Device Authorization Flow (RFC 8628) ---------------------------------

_DEVICE_ALPHABET = "ABCDEFGHJKMNPQRSTVWXYZ"
_DEVICE_CODE_TTL = 900  # 15 minutes
_DEVICE_POLL_INTERVAL = 5

def _generate_user_code() -> str:
    """Generate an unambiguous 8-char user code like KBMN-TWRP."""
    chars = [secrets.choice(_DEVICE_ALPHABET) for _ in range(8)]
    return "".join(chars[:4]) + "-" + "".join(chars[4:])

async def _approve_device(device_code: str, user) -> bool:
    """Approve a pending device flow and create an OAuthGrant."""
    from shared.infrastructure.auth_redis import get_auth_redis

    r = await get_auth_redis()
    raw = await r.get(f"devflow:{device_code}")
    if not raw:
        return False

    blob = json.loads(raw)
    if blob.get("status") != "pending":
        return False

    account_sid = qsession.get(ACCOUNT_SESSION_KEY)
    if not account_sid:
        account_sid = secrets.token_urlsafe(32)
        qsession[ACCOUNT_SESSION_KEY] = account_sid

    grant_token = secrets.token_urlsafe(48)

    async with get_session() as s:
        async with s.begin():
            grant = OAuthGrant(
                token=None,
                token_hash=hash_token(grant_token),
                user_id=user.id,
                client_id=blob["client_id"],
                issuer_session=account_sid,
            )
            s.add(grant)

    # Update Redis blob
    blob["status"] = "approved"
    blob["user_id"] = user.id
    blob["grant_token"] = grant_token
    user_code = blob["user_code"]

    ttl = await r.ttl(f"devflow:{device_code}")
    if ttl and ttl > 0:
        await r.set(f"devflow:{device_code}", json.dumps(blob).encode(), ex=ttl)
    else:
        await r.set(f"devflow:{device_code}", json.dumps(blob).encode(), ex=_DEVICE_CODE_TTL)

    # Remove reverse lookup (code already used)
    normalized_uc = user_code.replace("-", "").upper()
    await r.delete(f"devflow_uc:{normalized_uc}")

    return True
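The Redis-side state transition performed above can be isolated as a pure function over the JSON blob (grant creation and TTL handling omitted); a sketch:

```python
import json

# Pure sketch of the blob transition _approve_device performs in Redis.
# Returns the updated blob bytes, or None if the flow is no longer pending.
def approve_blob(raw: bytes, user_id: int, grant_token: str):
    blob = json.loads(raw)
    if blob.get("status") != "pending":
        return None  # already approved, denied, or consumed
    blob.update(status="approved", user_id=user_id, grant_token=grant_token)
    return json.dumps(blob).encode()

pending = json.dumps({"client_id": "cli", "user_code": "KBMN-TWRP",
                      "status": "pending", "user_id": None,
                      "grant_token": None}).encode()
approved = approve_blob(pending, 42, "tok")
assert approved is not None
assert json.loads(approved)["status"] == "approved"
assert approve_blob(approved, 42, "tok") is None  # second approval is rejected
```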

@rate_limit(
    key_func=lambda: request.headers.get("X-Forwarded-For", request.remote_addr),
    max_requests=10, window_seconds=3600, scope="dev_auth",
)
@csrf_exempt
@auth_bp.post("/device/authorize")
@auth_bp.post("/device/authorize/")
async def device_authorize():
    """RFC 8628 — CLI requests a device code."""
    data = await request.get_json(silent=True) or {}
    client_id = data.get("client_id", "")

    if client_id not in ALLOWED_CLIENTS:
        return jsonify({"error": "invalid_client"}), 400

    device_code = secrets.token_urlsafe(32)
    user_code = _generate_user_code()

    from shared.infrastructure.auth_redis import get_auth_redis

    r = await get_auth_redis()

    blob = json.dumps({
        "client_id": client_id,
        "user_code": user_code,
        "status": "pending",
        "user_id": None,
        "grant_token": None,
    }).encode()

    normalized_uc = user_code.replace("-", "").upper()
    pipe = r.pipeline()
    pipe.set(f"devflow:{device_code}", blob, ex=_DEVICE_CODE_TTL)
    pipe.set(f"devflow_uc:{normalized_uc}", device_code.encode(), ex=_DEVICE_CODE_TTL)
    await pipe.execute()

    verification_uri = account_url("/auth/device")

    return jsonify({
        "device_code": device_code,
        "user_code": user_code,
        "verification_uri": verification_uri,
        "expires_in": _DEVICE_CODE_TTL,
        "interval": _DEVICE_POLL_INTERVAL,
    })

@csrf_exempt
@auth_bp.post("/device/token")
@auth_bp.post("/device/token/")
async def device_token():
    """RFC 8628 — CLI polls for the grant token."""
    data = await request.get_json(silent=True) or {}
    device_code = data.get("device_code", "")
    client_id = data.get("client_id", "")

    if not device_code or client_id not in ALLOWED_CLIENTS:
        return jsonify({"error": "invalid_request"}), 400

    # Enforce polling backoff per RFC 8628
    try:
        poll_ok, interval = await check_poll_backoff(device_code)
        if not poll_ok:
            return jsonify({"error": "slow_down", "interval": interval}), 400
    except Exception:
        pass  # Redis down — allow the request

    from shared.infrastructure.auth_redis import get_auth_redis

    r = await get_auth_redis()
    raw = await r.get(f"devflow:{device_code}")
    if not raw:
        return jsonify({"error": "expired_token"}), 400

    blob = json.loads(raw)

    if blob.get("client_id") != client_id:
        return jsonify({"error": "invalid_request"}), 400

    if blob["status"] == "pending":
        return jsonify({"error": "authorization_pending"}), 428

    if blob["status"] == "denied":
        return jsonify({"error": "access_denied"}), 400

    if blob["status"] == "approved":
        async with get_session() as s:
            user = await s.get(User, blob["user_id"])
        if not user:
            return jsonify({"error": "access_denied"}), 400

        # Clean up Redis
        await r.delete(f"devflow:{device_code}")

        return jsonify({
            "access_token": blob["grant_token"],
            "token_type": "bearer",
            "user_id": blob["user_id"],
            "username": user.email or "",
            "display_name": user.name or "",
        })

    return jsonify({"error": "invalid_request"}), 400
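The endpoint above is the server half of the RFC 8628 polling exchange. A sketch of the client half, with the HTTP transport injected as a callable so the loop can be exercised without a server (a real CLI would POST JSON to the device-token route and sleep `interval` seconds between attempts):

```python
# post(payload) -> (status_code, body_dict); injected so the loop is testable.
def poll_for_token(post, device_code: str, client_id: str, max_polls: int = 60):
    interval = 5
    for _ in range(max_polls):
        status, body = post({"device_code": device_code, "client_id": client_id})
        if status == 200:
            return body["access_token"]
        err = body.get("error")
        if err == "authorization_pending":
            continue  # a real client would time.sleep(interval) here
        if err == "slow_down":
            interval = body.get("interval", interval + 5)  # server-driven backoff
            continue
        raise RuntimeError(err or "unknown_error")
    raise TimeoutError("device flow expired")

# Fake transport: pending, then slow_down, then success.
responses = iter([
    (428, {"error": "authorization_pending"}),
    (400, {"error": "slow_down", "interval": 10}),
    (200, {"access_token": "tok123", "token_type": "bearer"}),
])
assert poll_for_token(lambda payload: next(responses), "dc", "cli") == "tok123"
```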

@auth_bp.get("/device")
@auth_bp.get("/device/")
async def device_form():
    """Browser form where the user enters the code displayed in the terminal."""
    code = request.args.get("code", "")
    return await _render_auth_page(
        "account-device-content", "Authorize Device \u2014 Rose Ash",
        code=code,
    )

@auth_bp.post("/device")
@auth_bp.post("/device/")
async def device_submit():
    """Browser submit — validates the code, approves if logged in."""
    form = await request.form
    user_code = (form.get("code") or "").strip().replace("-", "").upper()

    if not user_code or len(user_code) != 8:
        return await _render_auth_page(
            "account-device-content", "Authorize Device \u2014 Rose Ash",
            error="Please enter a valid 8-character code.", code=form.get("code", ""),
        ), 400

    from shared.infrastructure.auth_redis import get_auth_redis

    r = await get_auth_redis()
    device_code = await r.get(f"devflow_uc:{user_code}")
    if not device_code:
        return await _render_auth_page(
            "account-device-content", "Authorize Device \u2014 Rose Ash",
            error="Code not found or expired. Please try again.", code=form.get("code", ""),
        ), 400

    if isinstance(device_code, bytes):
        device_code = device_code.decode()

    # Not logged in — redirect to login, then come back to complete
    if not g.get("user"):
        complete_url = url_for("auth.device_complete", code=device_code)
        store_login_redirect_target()
        return redirect(url_for("auth.login_form", next=complete_url))

    # Logged in — approve immediately
    ok = await _approve_device(device_code, g.user)
    if not ok:
        return await _render_auth_page(
            "account-device-content", "Authorize Device \u2014 Rose Ash",
            error="Code expired or already used.",
        ), 400

    return await _render_auth_page(
        "account-device-approved", "Device Authorized \u2014 Rose Ash",
    )

@auth_bp.get("/device/complete")
@auth_bp.get("/device/complete/")
async def device_complete():
    """Post-login redirect — completes approval after magic link auth."""
    device_code = request.args.get("code", "")

    if not device_code:
        return redirect(url_for("auth.device_form"))

    if not g.get("user"):
        store_login_redirect_target()
        return redirect(url_for("auth.login_form"))

    ok = await _approve_device(device_code, g.user)
    if not ok:
        return await _render_auth_page(
            "account-device-content", "Authorize Device \u2014 Rose Ash",
            error="Code expired or already used. Please start the login process again in your terminal.",
        ), 400

    return await _render_auth_page(
        "account-device-approved", "Device Authorized \u2014 Rose Ash",
    )

return auth_bp

account/bp/auth/services/__init__.py (Normal file, 24 lines)
@@ -0,0 +1,24 @@
from .login_redirect import pop_login_redirect_target, store_login_redirect_target
from .auth_operations import (
    get_app_host,
    get_app_root,
    send_magic_email,
    load_user_by_id,
    find_or_create_user,
    create_magic_link,
    validate_magic_link,
    validate_email,
)

__all__ = [
    "pop_login_redirect_target",
    "store_login_redirect_target",
    "get_app_host",
    "get_app_root",
    "send_magic_email",
    "load_user_by_id",
    "find_or_create_user",
    "create_magic_link",
    "validate_magic_link",
    "validate_email",
]

account/bp/auth/services/auth_operations.py (Normal file, 156 lines)
@@ -0,0 +1,156 @@
"""Auth operations for the account app.

Owns magic-link login. Shared models, shared config.
"""
from __future__ import annotations

import os
import secrets
from datetime import datetime, timedelta, timezone
from typing import Optional, Tuple

from quart import current_app, render_template, request, g
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload

from shared.models import User, MagicLink
from shared.config import config


def get_app_host() -> str:
    host = (
        config().get("host") or os.getenv("APP_HOST") or "http://localhost:8000"
    ).rstrip("/")
    return host


def get_app_root() -> str:
    root = (g.root).rstrip("/")
    return root


async def send_magic_email(to_email: str, link_url: str) -> None:
    host = os.getenv("SMTP_HOST")
    port = int(os.getenv("SMTP_PORT") or "587")
    username = os.getenv("SMTP_USER")
    password = os.getenv("SMTP_PASS")
    mail_from = os.getenv("MAIL_FROM") or "no-reply@example.com"

    site_name = config().get("title", "Rose Ash")
    subject = f"Your sign-in link \u2014 {site_name}"

    tpl_vars = dict(site_name=site_name, link_url=link_url)
    text_body = await render_template("_email/magic_link.txt", **tpl_vars)
    html_body = await render_template("_email/magic_link.html", **tpl_vars)

    if not host or not username or not password:
        current_app.logger.warning(
            "SMTP not configured. Printing magic link to console for %s: %s",
            to_email,
            link_url,
        )
        print(f"[DEV] Magic link for {to_email}: {link_url}")
        return

    import aiosmtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = mail_from
    msg["To"] = to_email
    msg["Subject"] = subject
    msg.set_content(text_body)
    msg.add_alternative(html_body, subtype="html")

    is_secure = port == 465
    if is_secure:
        smtp = aiosmtplib.SMTP(
            hostname=host, port=port, use_tls=True,
            username=username, password=password,
        )
    else:
        smtp = aiosmtplib.SMTP(
            hostname=host, port=port, start_tls=True,
            username=username, password=password,
        )

    async with smtp:
        await smtp.send_message(msg)


async def load_user_by_id(session: AsyncSession, user_id: int) -> Optional[User]:
    stmt = (
        select(User)
        .options(selectinload(User.labels))
        .where(User.id == user_id)
    )
    result = await session.execute(stmt)
    return result.scalar_one_or_none()


async def find_or_create_user(session: AsyncSession, email: str) -> User:
    result = await session.execute(select(User).where(User.email == email))
    user = result.scalar_one_or_none()

    if user is None:
        user = User(email=email)
        session.add(user)
        await session.flush()

    return user


async def create_magic_link(
    session: AsyncSession,
    user_id: int,
    purpose: str = "signin",
    expires_minutes: int = 15,
) -> Tuple[str, datetime]:
    token = secrets.token_urlsafe(32)
    expires = datetime.now(timezone.utc) + timedelta(minutes=expires_minutes)

    ml = MagicLink(
        token=token,
        user_id=user_id,
        purpose=purpose,
        expires_at=expires,
        ip=request.headers.get("x-forwarded-for", request.remote_addr),
        user_agent=request.headers.get("user-agent"),
    )
    session.add(ml)

    return token, expires


async def validate_magic_link(
    session: AsyncSession,
    token: str,
) -> Tuple[Optional[User], Optional[str]]:
    now = datetime.now(timezone.utc)

    ml = await session.scalar(
        select(MagicLink)
        .where(MagicLink.token == token)
        .with_for_update()
    )

    if not ml or ml.purpose != "signin":
        return None, "Invalid or expired link."

    if ml.used_at or ml.expires_at < now:
        return None, "This link has expired. Please request a new one."

    user = await session.get(User, ml.user_id)
    if not user:
        return None, "User not found."

    ml.used_at = now
    return user, None


def validate_email(email: str) -> Tuple[bool, str]:
    email = email.strip().lower()
    if not email or "@" not in email:
        return False, email
    return True, email
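`validate_email` accepts any string containing an `@`. A slightly stricter regex-based variant with the same (ok, normalized) return shape, shown only as an illustration, not as the project's validator:

```python
import re

# Requires non-empty local part, non-empty domain, and at least one dot
# in the domain; still far short of full RFC 5322, which is intentional.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_email_strict(email: str) -> tuple[bool, str]:
    email = email.strip().lower()
    return (bool(EMAIL_RE.match(email)), email)

assert validate_email_strict("  User@Example.COM ") == (True, "user@example.com")
assert validate_email_strict("not-an-email") == (False, "not-an-email")
assert validate_email_strict("a@b") == (False, "a@b")  # no dot in the domain
```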

account/bp/auth/services/login_redirect.py (Normal file, 45 lines)
@@ -0,0 +1,45 @@
from urllib.parse import urlparse
from quart import session

from shared.infrastructure.urls import account_url


LOGIN_REDIRECT_SESSION_KEY = "login_redirect_to"


def store_login_redirect_target() -> None:
    from quart import request

    target = request.args.get("next")
    if not target:
        ref = request.referrer or ""
        try:
            parsed = urlparse(ref)
            target = parsed.path or ""
        except Exception:
            target = ""

    if not target:
        return

    # Accept both relative paths and absolute URLs (cross-app redirects)
    if target.startswith("http://") or target.startswith("https://"):
        session[LOGIN_REDIRECT_SESSION_KEY] = target
    elif target.startswith("/") and not target.startswith("//"):
        session[LOGIN_REDIRECT_SESSION_KEY] = target


def pop_login_redirect_target() -> str:
    path = session.pop(LOGIN_REDIRECT_SESSION_KEY, None)
    if not path or not isinstance(path, str):
        return account_url("/")

    # Absolute URL: return as-is (cross-app redirect)
    if path.startswith("http://") or path.startswith("https://"):
        return path

    # Relative path: must start with / and not //
    if path.startswith("/") and not path.startswith("//"):
        return account_url(path)

    return account_url("/")
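The `//` check is what keeps this from being an open redirect: browsers treat `//evil.example` as a protocol-relative URL, so a bare `startswith("/")` test is not enough. The acceptance rule as a standalone predicate:

```python
# Mirrors the relative-path branch above; absolute http(s) URLs are
# handled by a separate branch in the real code.
def is_safe_relative_path(target: str) -> bool:
    return target.startswith("/") and not target.startswith("//")

assert is_safe_relative_path("/account/settings")
assert not is_safe_relative_path("//evil.example/phish")   # protocol-relative
assert not is_safe_relative_path("https://evil.example")   # not a relative path
```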

account/bp/data/__init__.py (Normal file, 0 lines)

account/bp/data/routes.py (Normal file, 14 lines)
@@ -0,0 +1,14 @@
"""Account app data endpoints.

All queries are defined in ``account/queries.sx``.
"""
from __future__ import annotations

from quart import Blueprint

from shared.infrastructure.query_blueprint import create_data_blueprint


def register() -> Blueprint:
    bp, _handlers = create_data_blueprint("account")
    return bp

account/entrypoint.sh (Executable file, 62 lines)
@@ -0,0 +1,62 @@
#!/usr/bin/env bash
set -euo pipefail

# Optional: wait for Postgres to be reachable
if [[ -n "${DATABASE_HOST:-}" && -n "${DATABASE_PORT:-}" ]]; then
  echo "Waiting for Postgres at ${DATABASE_HOST}:${DATABASE_PORT}..."
  for i in {1..60}; do
    (echo > /dev/tcp/${DATABASE_HOST}/${DATABASE_PORT}) >/dev/null 2>&1 && break || true
    sleep 1
  done
fi

# Create own database + run own migrations
if [[ "${RUN_MIGRATIONS:-}" == "true" && -n "${ALEMBIC_DATABASE_URL:-}" ]]; then
  python3 -c "
import os, re
url = os.environ['ALEMBIC_DATABASE_URL']
m = re.match(r'postgresql\+\w+://([^:]+):([^@]+)@([^:]+):(\d+)/(.+)', url)
if not m:
    print('Could not parse ALEMBIC_DATABASE_URL, skipping DB creation')
    exit(0)
user, password, host, port, dbname = m.groups()

import psycopg
conn = psycopg.connect(
    f'postgresql://{user}:{password}@{host}:{port}/postgres',
    autocommit=True,
)
cur = conn.execute('SELECT 1 FROM pg_database WHERE datname = %s', (dbname,))
if not cur.fetchone():
    conn.execute(f'CREATE DATABASE {dbname}')
    print(f'Created database {dbname}')
else:
    print(f'Database {dbname} already exists')
conn.close()
" || echo "DB creation failed (non-fatal), continuing..."

  echo "Running account Alembic migrations..."
  if [ -d account ]; then (cd account && alembic upgrade head); else alembic upgrade head; fi
fi

# Clear Redis page cache on deploy
if [[ -n "${REDIS_URL:-}" && "${REDIS_URL}" != "no" ]]; then
  echo "Flushing Redis cache..."
  python3 -c "
import redis, os
r = redis.from_url(os.environ['REDIS_URL'])
r.flushdb()
print('Redis cache cleared.')
" || echo "Redis flush failed (non-fatal), continuing..."
fi

# Start the app
RELOAD_FLAG=""
if [[ "${RELOAD:-}" == "true" ]]; then
  RELOAD_FLAG="--reload"
  python3 -m shared.dev_watcher &
  echo "Starting Hypercorn (${APP_MODULE:-app:app}) with auto-reload..."
else
  echo "Starting Hypercorn (${APP_MODULE:-app:app})..."
fi
PYTHONUNBUFFERED=1 exec hypercorn "${APP_MODULE:-app:app}" --bind 0.0.0.0:${PORT:-8000} --workers ${WORKERS:-2} --keep-alive 75 ${RELOAD_FLAG}

account/models/__init__.py (Normal file, 0 lines)

account/path_setup.py (Normal file, 9 lines)
@@ -0,0 +1,9 @@
import sys
import os

_app_dir = os.path.dirname(os.path.abspath(__file__))
_project_root = os.path.dirname(_app_dir)

for _p in (_project_root, _app_dir):
    if _p not in sys.path:
        sys.path.insert(0, _p)

account/queries.sx (Normal file, 9 lines)
@@ -0,0 +1,9 @@
;; Account service — inter-service data queries

(defquery user-by-email (&key email)
  "Return user_id for a given email address."
  (service "account" "user-by-email" :email email))

(defquery newsletters ()
  "Return all Ghost newsletters."
  (service "account" "newsletters"))

account/services/__init__.py (Normal file, 12 lines)
@@ -0,0 +1,12 @@
"""Account app service registration."""
from __future__ import annotations


def register_domain_services() -> None:
    """Register services for the account app."""
    from shared.services.registry import services
    from .account_page import AccountPageService
    services.register("account_page", AccountPageService())

    from shared.services.account_impl import SqlAccountDataService
    services.register("account", SqlAccountDataService())

account/services/account_page.py (Normal file, 40 lines)
@@ -0,0 +1,40 @@
"""Account page data service — provides serialized dicts for .sx defpages."""
from __future__ import annotations


class AccountPageService:
    """Service for account page data, callable via (service "account-page" ...)."""

    async def newsletters_data(self, session, **kw):
        """Return newsletter list with user subscription status."""
        from quart import g
        from sqlalchemy import select
        from shared.models import UserNewsletter
        from shared.models.ghost_membership_entities import GhostNewsletter

        result = await session.execute(
            select(GhostNewsletter).order_by(GhostNewsletter.name)
        )
        all_newsletters = result.scalars().all()

        sub_result = await session.execute(
            select(UserNewsletter).where(
                UserNewsletter.user_id == g.user.id,
            )
        )
        user_subs = {un.newsletter_id: un for un in sub_result.scalars().all()}

        newsletter_list = []
        for nl in all_newsletters:
            un = user_subs.get(nl.id)
            newsletter_list.append({
                "newsletter": {"id": nl.id, "name": nl.name, "description": nl.description},
                "un": {"newsletter_id": un.newsletter_id, "subscribed": un.subscribed} if un else None,
                "subscribed": un.subscribed if un else False,
            })

        from shared.infrastructure.urls import account_url
        return {
            "newsletter_list": newsletter_list,
            "account_url": account_url(""),
        }
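The query pair above is effectively a left join materialized in Python. The merge step as a pure function over plain dicts (field names follow the service's output shape):

```python
# Left-join all newsletters against the user's subscription rows;
# unsubscribed newsletters get un=None and subscribed=False.
def build_newsletter_list(newsletters, user_subs):
    by_id = {un["newsletter_id"]: un for un in user_subs}
    out = []
    for nl in newsletters:
        un = by_id.get(nl["id"])
        out.append({
            "newsletter": nl,
            "un": un,
            "subscribed": un["subscribed"] if un else False,
        })
    return out

nls = [{"id": 1, "name": "Weekly"}, {"id": 2, "name": "Monthly"}]
subs = [{"newsletter_id": 2, "subscribed": True}]
result = build_newsletter_list(nls, subs)
assert result[0]["subscribed"] is False
assert result[1]["subscribed"] is True
```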

account/services/ghost_membership.py (Normal file, 621 lines)
@@ -0,0 +1,621 @@
"""Ghost membership sync — account-owned.

Handles Ghost ↔ DB sync for user/membership data:
- Ghost → DB: fetch members from Ghost API, upsert into account tables
- DB → Ghost: push local user changes back to Ghost API

All tables involved (users, ghost_labels, user_labels, ghost_newsletters,
user_newsletters, ghost_tiers, ghost_subscriptions) live in db_account.
"""
from __future__ import annotations

import os
import re
import asyncio
from datetime import datetime
from typing import Dict, Any, Optional

import httpx
from sqlalchemy import select, delete, or_, and_
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm.attributes import flag_modified

from shared.models import User
from shared.models.ghost_membership_entities import (
    GhostLabel, UserLabel,
    GhostNewsletter, UserNewsletter,
    GhostTier, GhostSubscription,
)

from shared.infrastructure.ghost_admin_token import make_ghost_admin_jwt
from urllib.parse import quote

GHOST_ADMIN_API_URL = os.environ.get("GHOST_ADMIN_API_URL", "")


def _auth_header() -> dict[str, str]:
    return {"Authorization": f"Ghost {make_ghost_admin_jwt()}"}


def _iso(val: str | None) -> datetime | None:
    if not val:
        return None
    return datetime.fromisoformat(val.replace("Z", "+00:00"))


def _to_str_or_none(v) -> Optional[str]:
    if v is None:
        return None
    if isinstance(v, (dict, list, set, tuple, bytes, bytearray)):
        return None
    s = str(v).strip()
    return s or None


def _sanitize_member_payload(payload: dict) -> dict:
    """Coerce types Ghost expects and drop empties to avoid 422/500 quirks."""
    out: dict = {}

    email = _to_str_or_none(payload.get("email"))
    if email:
        out["email"] = email.lower()

    name = _to_str_or_none(payload.get("name"))
    if name is not None:
        out["name"] = name

    note = _to_str_or_none(payload.get("note"))
    if note is not None:
        out["note"] = note

    if "subscribed" in payload:
        out["subscribed"] = bool(payload.get("subscribed"))

    labels = []
    for item in payload.get("labels") or []:
        gid = _to_str_or_none(item.get("id"))
        gname = _to_str_or_none(item.get("name"))
        if gid:
            labels.append({"id": gid})
        elif gname:
            labels.append({"name": gname})
    if labels:
        out["labels"] = labels

    newsletters = []
    for item in payload.get("newsletters") or []:
        gid = _to_str_or_none(item.get("id"))
        gname = _to_str_or_none(item.get("name"))
        row = {"subscribed": bool(item.get("subscribed", True))}
        if gid:
            row["id"] = gid
            newsletters.append(row)
        elif gname:
            row["name"] = gname
            newsletters.append(row)
    if newsletters:
        out["newsletters"] = newsletters

    gid = _to_str_or_none(payload.get("id"))
    if gid:
        out["id"] = gid

    return out
|
||||
|
||||
|
||||
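To illustrate the coercion rules above, here is a tiny self-contained restatement (a simplified sketch, not a call into this module; the function name is hypothetical): blank values are dropped and email is normalised to lowercase.

```python
def sanitize_scalars_demo(payload: dict) -> dict:
    """Illustrative restatement of the scalar rules in _sanitize_member_payload."""
    out = {}
    raw_email = payload.get("email")
    if isinstance(raw_email, str) and raw_email.strip():
        # strip surrounding whitespace, then lowercase for a stable key
        out["email"] = raw_email.strip().lower()
    raw_name = payload.get("name")
    if isinstance(raw_name, str) and raw_name.strip():
        out["name"] = raw_name.strip()
    return out
```

For example, `sanitize_scalars_demo({"email": " Foo@Bar.COM ", "name": "  "})` keeps only the lowercased email and drops the blank name.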
def _member_email(m: dict[str, Any]) -> Optional[str]:
    email = (m.get("email") or "").strip().lower() or None
    return email


# ---- upsert helpers for related entities ----

async def _upsert_label(sess: AsyncSession, data: dict) -> GhostLabel:
    res = await sess.execute(select(GhostLabel).where(GhostLabel.ghost_id == data["id"]))
    obj = res.scalar_one_or_none()
    if not obj:
        obj = GhostLabel(ghost_id=data["id"])
        sess.add(obj)
    obj.name = data.get("name") or obj.name
    obj.slug = data.get("slug") or obj.slug
    await sess.flush()
    return obj


async def _upsert_newsletter(sess: AsyncSession, data: dict) -> GhostNewsletter:
    res = await sess.execute(select(GhostNewsletter).where(GhostNewsletter.ghost_id == data["id"]))
    obj = res.scalar_one_or_none()
    if not obj:
        obj = GhostNewsletter(ghost_id=data["id"])
        sess.add(obj)
    obj.name = data.get("name") or obj.name
    obj.slug = data.get("slug") or obj.slug
    obj.description = data.get("description") or obj.description
    await sess.flush()
    return obj


async def _upsert_tier(sess: AsyncSession, data: dict) -> GhostTier:
    res = await sess.execute(select(GhostTier).where(GhostTier.ghost_id == data["id"]))
    obj = res.scalar_one_or_none()
    if not obj:
        obj = GhostTier(ghost_id=data["id"])
        sess.add(obj)
    obj.name = data.get("name") or obj.name
    obj.slug = data.get("slug") or obj.slug
    obj.type = data.get("type") or obj.type
    obj.visibility = data.get("visibility") or obj.visibility
    await sess.flush()
    return obj


def _price_cents(sd: dict) -> Optional[int]:
    try:
        return int((sd.get("price") or {}).get("amount"))
    except Exception:
        return None


# ---- find/create user by ghost_id or email ----

async def _find_or_create_user_by_ghost_or_email(sess: AsyncSession, data: dict) -> User:
    ghost_id = data.get("id")
    email = _member_email(data)

    if ghost_id:
        res = await sess.execute(select(User).where(User.ghost_id == ghost_id))
        u = res.scalar_one_or_none()
        if u:
            return u

    if email:
        res = await sess.execute(select(User).where(User.email.ilike(email)))
        u = res.scalar_one_or_none()
        if u:
            if ghost_id and not u.ghost_id:
                u.ghost_id = ghost_id
            return u

    u = User(email=email or f"_ghost_{ghost_id}@invalid.local")
    if ghost_id:
        u.ghost_id = ghost_id
    sess.add(u)
    await sess.flush()
    return u


# ---- apply membership data to user ----

async def _apply_user_membership(sess: AsyncSession, user: User, m: dict) -> User:
    """Apply Ghost member payload to local User."""
    sess.add(user)

    user.name = m.get("name") or user.name
    user.ghost_status = m.get("status") or user.ghost_status
    user.ghost_subscribed = bool(m.get("subscribed", True))
    user.ghost_note = m.get("note") or user.ghost_note
    user.avatar_image = m.get("avatar_image") or user.avatar_image
    user.stripe_customer_id = (
        (m.get("stripe") or {}).get("customer_id")
        or (m.get("customer") or {}).get("id")
        or m.get("stripe_customer_id")
        or user.stripe_customer_id
    )
    user.ghost_raw = dict(m)
    flag_modified(user, "ghost_raw")

    await sess.flush()

    # Labels join
    label_ids: list[int] = []
    for ld in m.get("labels") or []:
        lbl = await _upsert_label(sess, ld)
        label_ids.append(lbl.id)
    await sess.execute(delete(UserLabel).where(UserLabel.user_id == user.id))
    for lid in label_ids:
        sess.add(UserLabel(user_id=user.id, label_id=lid))
    await sess.flush()

    # Newsletters join with subscribed flag
    nl_rows: list[tuple[int, bool]] = []
    for nd in m.get("newsletters") or []:
        nl = await _upsert_newsletter(sess, nd)
        nl_rows.append((nl.id, bool(nd.get("subscribed", True))))
    await sess.execute(delete(UserNewsletter).where(UserNewsletter.user_id == user.id))
    for nl_id, subbed in nl_rows:
        sess.add(UserNewsletter(user_id=user.id, newsletter_id=nl_id, subscribed=subbed))
    await sess.flush()

    # Subscriptions
    for sd in m.get("subscriptions") or []:
        sid = sd.get("id")
        if not sid:
            continue

        tier_id: Optional[int] = None
        if sd.get("tier"):
            tier = await _upsert_tier(sess, sd["tier"])
            await sess.flush()
            tier_id = tier.id

        res = await sess.execute(select(GhostSubscription).where(GhostSubscription.ghost_id == sid))
        sub = res.scalar_one_or_none()
        if not sub:
            sub = GhostSubscription(ghost_id=sid, user_id=user.id)
            sess.add(sub)

        sub.user_id = user.id
        sub.status = sd.get("status") or sub.status
        sub.cadence = (sd.get("plan") or {}).get("interval") or sd.get("cadence") or sub.cadence
        price = _price_cents(sd)
        if price is not None:  # keep the previous amount when the payload has no parsable price
            sub.price_amount = price
        sub.price_currency = (sd.get("price") or {}).get("currency") or sub.price_currency
        sub.stripe_customer_id = (
            (sd.get("customer") or {}).get("id")
            or (sd.get("stripe") or {}).get("customer_id")
            or sub.stripe_customer_id
        )
        sub.stripe_subscription_id = (
            sd.get("stripe_subscription_id")
            or (sd.get("stripe") or {}).get("subscription_id")
            or sub.stripe_subscription_id
        )
        if tier_id is not None:
            sub.tier_id = tier_id
        sub.raw = dict(sd)
        flag_modified(sub, "raw")

    await sess.flush()
    return user


# =====================================================
# PUSH MEMBERS FROM LOCAL DB -> GHOST (DB -> Ghost)
# =====================================================

def _ghost_member_payload_base(u: User) -> dict:
    email = _to_str_or_none(getattr(u, "email", None))
    payload: dict = {}
    if email:
        payload["email"] = email.lower()

    name = _to_str_or_none(getattr(u, "name", None))
    if name:
        payload["name"] = name

    note = _to_str_or_none(getattr(u, "ghost_note", None))
    if note:
        payload["note"] = note

    subscribed = getattr(u, "ghost_subscribed", True)
    payload["subscribed"] = bool(subscribed)

    return payload


async def _newsletters_for_user(sess: AsyncSession, user_id: int) -> list[dict]:
    q = await sess.execute(
        select(GhostNewsletter.ghost_id, UserNewsletter.subscribed, GhostNewsletter.name)
        .join(UserNewsletter, UserNewsletter.newsletter_id == GhostNewsletter.id)
        .where(UserNewsletter.user_id == user_id)
    )
    seen = set()
    out: list[dict] = []
    for gid, subscribed, name in q.all():
        gid = (gid or "").strip() or None
        name = (name or "").strip() or None
        row: dict = {"subscribed": bool(subscribed)}
        if gid:
            key = ("id", gid)
            if key in seen:
                continue
            row["id"] = gid
            seen.add(key)
            out.append(row)
        elif name:
            key = ("name", name.lower())
            if key in seen:
                continue
            row["name"] = name
            seen.add(key)
            out.append(row)
    return out


async def _labels_for_user(sess: AsyncSession, user_id: int) -> list[dict]:
    q = await sess.execute(
        select(GhostLabel.ghost_id, GhostLabel.name)
        .join(UserLabel, UserLabel.label_id == GhostLabel.id)
        .where(UserLabel.user_id == user_id)
    )
    seen = set()
    out: list[dict] = []
    for gid, name in q.all():
        gid = (gid or "").strip() or None
        name = (name or "").strip() or None
        if gid:
            key = ("id", gid)
            if key not in seen:
                out.append({"id": gid})
                seen.add(key)
        elif name:
            key = ("name", name.lower())
            if key not in seen:
                out.append({"name": name})
                seen.add(key)
    return out


async def _ghost_find_member_by_email(email: str) -> Optional[dict]:
    if not email:
        return None
    async with httpx.AsyncClient(timeout=30) as client:
        resp = await client.get(
            f"{GHOST_ADMIN_API_URL}/members/?filter=email:{quote(email)}&limit=1",
            headers=_auth_header(),
        )
        resp.raise_for_status()
        members = resp.json().get("members") or []
        return members[0] if members else None


async def _ghost_upsert_member(payload: dict, ghost_id: str | None = None) -> dict:
    """Create/update a member, with sanitization + 5xx retry/backoff."""
    safe_keys = ("email", "name", "note", "subscribed", "labels", "newsletters", "id")
    pl_raw = {k: v for k, v in payload.items() if k in safe_keys}
    pl = _sanitize_member_payload(pl_raw)

    async def _request_with_retry(client: httpx.AsyncClient, method: str, url: str, json: dict) -> httpx.Response:
        delay = 0.5
        for attempt in range(3):
            r = await client.request(method, url, headers=_auth_header(), json=json)
            if r.status_code >= 500:
                if attempt < 2:
                    await asyncio.sleep(delay)
                    delay *= 2
                    continue
                return r
            return r
        return r

    async with httpx.AsyncClient(timeout=30) as client:

        async def _put(mid: str, p: dict) -> dict:
            r = await _request_with_retry(
                client, "PUT",
                f"{GHOST_ADMIN_API_URL}/members/{mid}/",
                {"members": [p]},
            )
            if r.status_code == 404:
                existing = await _ghost_find_member_by_email(p.get("email", ""))
                if existing and existing.get("id"):
                    r2 = await _request_with_retry(
                        client, "PUT",
                        f"{GHOST_ADMIN_API_URL}/members/{existing['id']}/",
                        {"members": [p]},
                    )
                    r2.raise_for_status()
                    return (r2.json().get("members") or [None])[0] or {}
                r3 = await _request_with_retry(
                    client, "POST",
                    f"{GHOST_ADMIN_API_URL}/members/",
                    {"members": [p]},
                )
                r3.raise_for_status()
                return (r3.json().get("members") or [None])[0] or {}

            if r.status_code == 422:
                body = (r.text or "").lower()
                retry = dict(p)
                dropped = False
                if '"note"' in body or "for note" in body:
                    retry.pop("note", None); dropped = True
                if '"name"' in body or "for name" in body:
                    retry.pop("name", None); dropped = True
                if "labels.name" in body:
                    retry.pop("labels", None); dropped = True
                if dropped:
                    r2 = await _request_with_retry(
                        client, "PUT",
                        f"{GHOST_ADMIN_API_URL}/members/{mid}/",
                        {"members": [retry]},
                    )
                    if r2.status_code == 404:
                        existing = await _ghost_find_member_by_email(retry.get("email", ""))
                        if existing and existing.get("id"):
                            r3 = await _request_with_retry(
                                client, "PUT",
                                f"{GHOST_ADMIN_API_URL}/members/{existing['id']}/",
                                {"members": [retry]},
                            )
                            r3.raise_for_status()
                            return (r3.json().get("members") or [None])[0] or {}
                        r3 = await _request_with_retry(
                            client, "POST",
                            f"{GHOST_ADMIN_API_URL}/members/",
                            {"members": [retry]},
                        )
                        r3.raise_for_status()
                        return (r3.json().get("members") or [None])[0] or {}
                    r2.raise_for_status()
                    return (r2.json().get("members") or [None])[0] or {}
            r.raise_for_status()
            return (r.json().get("members") or [None])[0] or {}

        async def _post_upsert(p: dict) -> dict:
            r = await _request_with_retry(
                client, "POST",
                f"{GHOST_ADMIN_API_URL}/members/?upsert=true",
                {"members": [p]},
            )
            if r.status_code == 422:
                lower = (r.text or "").lower()

                retry = dict(p)
                changed = False
                if '"note"' in lower or "for note" in lower:
                    retry.pop("note", None); changed = True
                if '"name"' in lower or "for name" in lower:
                    retry.pop("name", None); changed = True
                if "labels.name" in lower:
                    retry.pop("labels", None); changed = True

                if changed:
                    r2 = await _request_with_retry(
                        client, "POST",
                        f"{GHOST_ADMIN_API_URL}/members/?upsert=true",
                        {"members": [retry]},
                    )
                    if r2.status_code != 422:
                        r2.raise_for_status()
                        return (r2.json().get("members") or [None])[0] or {}
                    lower = (r2.text or "").lower()

                if "already exists" in lower and "email address" in lower:
                    existing = await _ghost_find_member_by_email(p.get("email", ""))
                    if existing and existing.get("id"):
                        return await _put(existing["id"], p)

                raise httpx.HTTPStatusError(
                    "Validation error, cannot edit member.",
                    request=r.request,
                    response=r,
                )
            r.raise_for_status()
            return (r.json().get("members") or [None])[0] or {}

        if ghost_id:
            return await _put(ghost_id, pl)
        return await _post_upsert(pl)


async def sync_member_to_ghost(sess: AsyncSession, user_id: int) -> Optional[str]:
    """Push a single user's membership data to Ghost."""
    res = await sess.execute(select(User).where(User.id == user_id))
    user = res.scalar_one_or_none()
    if not user:
        return None

    payload = _ghost_member_payload_base(user)

    labels = await _labels_for_user(sess, user.id)
    if labels:
        payload["labels"] = labels

    ghost_member = await _ghost_upsert_member(payload, ghost_id=user.ghost_id)

    if ghost_member:
        gm_id = ghost_member.get("id")
        if gm_id and user.ghost_id != gm_id:
            user.ghost_id = gm_id
        user.ghost_raw = dict(ghost_member)
        flag_modified(user, "ghost_raw")
        await sess.flush()
        return user.ghost_id or gm_id
    return user.ghost_id


async def sync_members_to_ghost(
    sess: AsyncSession,
    changed_since: Optional[datetime] = None,
    limit: Optional[int] = None,
) -> int:
    """Upsert a batch of users to Ghost. Returns count processed."""
    stmt = select(User.id)
    if changed_since:
        stmt = stmt.where(
            or_(
                User.created_at >= changed_since,
                and_(User.last_login_at.is_not(None), User.last_login_at >= changed_since),
            )
        )
    if limit:
        stmt = stmt.limit(limit)

    ids = [row[0] for row in (await sess.execute(stmt)).all()]
    processed = 0
    for uid in ids:
        try:
            await sync_member_to_ghost(sess, uid)
            processed += 1
        except httpx.HTTPStatusError as e:
            print(f"[ghost sync] failed upsert for user {uid}: {e.response.status_code} {e.response.text}")
        except Exception as e:
            print(f"[ghost sync] failed upsert for user {uid}: {e}")
    return processed


# =====================================================
# Membership fetch/sync (Ghost -> DB) bulk + single
# =====================================================

async def fetch_all_members_from_ghost() -> list[dict[str, Any]]:
    async with httpx.AsyncClient(timeout=60) as client:
        resp = await client.get(
            f"{GHOST_ADMIN_API_URL}/members/?include=labels,subscriptions,tiers,newsletters&limit=all",
            headers=_auth_header(),
        )
        resp.raise_for_status()
        return resp.json().get("members", [])


async def sync_all_membership_from_ghost(sess: AsyncSession) -> None:
    """Bulk sync: fetch all members from Ghost, upsert into DB."""
    members = await fetch_all_members_from_ghost()

    label_bucket: Dict[str, dict[str, Any]] = {}
    tier_bucket: Dict[str, dict[str, Any]] = {}
    newsletter_bucket: Dict[str, dict[str, Any]] = {}

    for m in members:
        for l in m.get("labels") or []:
            label_bucket[l["id"]] = l
        for n in m.get("newsletters") or []:
            newsletter_bucket[n["id"]] = n
        for s in m.get("subscriptions") or []:
            t = s.get("tier")
            if isinstance(t, dict) and t.get("id"):
                tier_bucket[t["id"]] = t

    for L in label_bucket.values():
        await _upsert_label(sess, L)
    for T in tier_bucket.values():
        await _upsert_tier(sess, T)
    for N in newsletter_bucket.values():
        await _upsert_newsletter(sess, N)

    for gm in members:
        user = await _find_or_create_user_by_ghost_or_email(sess, gm)
        await _apply_user_membership(sess, user, gm)


async def fetch_single_member_from_ghost(ghost_id: str) -> Optional[dict[str, Any]]:
    async with httpx.AsyncClient(timeout=30) as client:
        resp = await client.get(
            f"{GHOST_ADMIN_API_URL}/members/{ghost_id}/?include=labels,newsletters,subscriptions,tiers",
            headers=_auth_header(),
        )
        if resp.status_code == 404:
            return None
        resp.raise_for_status()
        data = resp.json()
        items = data.get("members") or data.get("member") or []
        if isinstance(items, dict):
            return items
        return (items[0] if items else None)


async def sync_single_member(sess: AsyncSession, ghost_id: str) -> None:
    """Sync a single member from Ghost into DB."""
    m = await fetch_single_member_from_ghost(ghost_id)
    if m is None:
        return

    for l in m.get("labels") or []:
        await _upsert_label(sess, l)
    for n in m.get("newsletters") or []:
        await _upsert_newsletter(sess, n)
    for s in m.get("subscriptions") or []:
        if isinstance(s.get("tier"), dict):
            await _upsert_tier(sess, s["tier"])

    user = await _find_or_create_user_by_ghost_or_email(sess, m)
    await _apply_user_membership(sess, user, m)
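The `_request_with_retry` helper above retries only on server errors (5xx), doubling the delay between attempts and handing the final 5xx response back to the caller. A self-contained sketch of that backoff policy, with an illustrative `send` callable standing in for the httpx request (all names here are hypothetical, not from the module):

```python
import asyncio


async def request_with_retry(send, attempts: int = 3, base_delay: float = 0.01):
    """Retry send() while it reports a 5xx status, doubling the delay.

    Illustrative sketch only: `send` is any zero-argument coroutine returning
    an object with a .status_code attribute, mirroring the module's policy.
    """
    delay = base_delay
    for attempt in range(attempts):
        resp = await send()
        if resp.status_code >= 500:
            if attempt < attempts - 1:
                await asyncio.sleep(delay)
                delay *= 2
                continue
            return resp  # attempts exhausted: hand the 5xx back to the caller
        return resp


class FakeResponse:
    def __init__(self, status_code: int):
        self.status_code = status_code


async def demo():
    calls = {"n": 0}

    async def flaky():
        # fail twice with 500, then succeed
        calls["n"] += 1
        return FakeResponse(500 if calls["n"] < 3 else 200)

    resp = await request_with_retry(flaky)
    return calls["n"], resp.status_code
```

Running `asyncio.run(demo())` makes three calls and ends with a 200; note that non-5xx responses, including 4xx, return immediately without retrying, which is why the module layers its 404 and 422 fallbacks on top of this helper rather than inside it.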
0
account/sx/__init__.py
Normal file
51
account/sx/auth.sx
Normal file
@@ -0,0 +1,51 @@
;; Auth page components (device auth — account-specific)
;; Login and check-email components are shared: see shared/sx/templates/auth.sx

(defcomp ~account-device-error (&key error)
  (when error
    (div :class "bg-red-50 border border-red-200 text-red-700 p-3 rounded mb-4"
      error)))

(defcomp ~account-device-form (&key error action csrf-token code)
  (div :class "py-8 max-w-md mx-auto"
    (h1 :class "text-2xl font-bold mb-6" "Authorize device")
    (p :class "text-stone-600 mb-4" "Enter the code shown in your terminal to sign in.")
    error
    (form :method "post" :action action :class "space-y-4"
      (input :type "hidden" :name "csrf_token" :value csrf-token)
      (div
        (label :for "code" :class "block text-sm font-medium mb-1" "Device code")
        (input :type "text" :name "code" :id "code" :value code :placeholder "XXXX-XXXX"
          :required true :autofocus true :maxlength "9" :autocomplete "off" :spellcheck "false"
          :class "w-full border border-stone-300 rounded px-3 py-3 text-center text-2xl tracking-widest font-mono uppercase focus:outline-none focus:ring-2 focus:ring-stone-500"))
      (button :type "submit"
        :class "w-full bg-stone-800 text-white py-2 px-4 rounded hover:bg-stone-700 transition"
        "Authorize"))))

(defcomp ~account-device-approved ()
  (div :class "py-8 max-w-md mx-auto text-center"
    (h1 :class "text-2xl font-bold mb-4" "Device authorized")
    (p :class "text-stone-600" "You can close this window and return to your terminal.")))

;; Assembled auth page content — replaces Python _login_page_content etc.

(defcomp ~account-login-content (&key error email)
  (~auth-login-form
    :error (when error (~auth-error-banner :error error))
    :action (url-for "auth.start_login")
    :csrf-token (csrf-token)
    :email (or email "")))

(defcomp ~account-device-content (&key error code)
  (~account-device-form
    :error (when error (~account-device-error :error error))
    :action (url-for "auth.device_submit")
    :csrf-token (csrf-token)
    :code (or code "")))

(defcomp ~account-check-email-content (&key email email-error)
  (~auth-check-email
    :email (escape (or email ""))
    :error (when email-error
             (~auth-check-email-error :error (escape email-error)))))
60
account/sx/dashboard.sx
Normal file
@@ -0,0 +1,60 @@
;; Account dashboard components

(defcomp ~account-error-banner (&key error)
  (when error
    (div :class "rounded-lg border border-red-200 bg-red-50 text-red-800 px-4 py-3 text-sm"
      error)))

(defcomp ~account-user-email (&key email)
  (when email
    (p :class "text-sm text-stone-500 mt-1" email)))

(defcomp ~account-user-name (&key name)
  (when name
    (p :class "text-sm text-stone-600" name)))

(defcomp ~account-logout-form (&key csrf-token)
  (form :action "/auth/logout/" :method "post"
    (input :type "hidden" :name "csrf_token" :value csrf-token)
    (button :type "submit"
      :class "inline-flex items-center gap-2 rounded-full border border-stone-300 px-4 py-2 text-sm font-medium text-stone-700 hover:bg-stone-50 transition"
      (i :class "fa-solid fa-right-from-bracket text-xs") " Sign out")))

(defcomp ~account-label-item (&key name)
  (span :class "inline-flex items-center rounded-full border border-stone-200 px-3 py-1 text-xs font-medium bg-white/60"
    name))

(defcomp ~account-labels-section (&key items)
  (when items
    (div
      (h2 :class "text-base font-semibold tracking-tight mb-3" "Labels")
      (div :class "flex flex-wrap gap-2" items))))

(defcomp ~account-main-panel (&key error email name logout labels)
  (div :class "w-full max-w-3xl mx-auto px-4 py-6"
    (div :class "bg-white/70 backdrop-blur rounded-2xl shadow border border-stone-200 p-6 sm:p-8 space-y-8"
      error
      (div :class "flex items-center justify-between"
        (div
          (h1 :class "text-xl font-semibold tracking-tight" "Account")
          email
          name)
        logout)
      labels)))

;; Assembled dashboard content — replaces Python _account_main_panel_sx
(defcomp ~account-dashboard-content (&key error)
  (let* ((user (current-user))
         (csrf (csrf-token)))
    (~account-main-panel
      :error (when error (~account-error-banner :error error))
      :email (when (get user "email")
               (~account-user-email :email (get user "email")))
      :name (when (get user "name")
              (~account-user-name :name (get user "name")))
      :logout (~account-logout-form :csrf-token csrf)
      :labels (when (not (empty? (or (get user "labels") (list))))
                (~account-labels-section
                  :items (map (lambda (label)
                                (~account-label-item :name (get label "name")))
                              (get user "labels")))))))
9
account/sx/handlers/auth-menu.sx
Normal file
@@ -0,0 +1,9 @@
;; Account auth-menu fragment handler
;; returns: sx
;;
;; Renders the desktop + mobile auth menu (sign-in or user link).

(defhandler auth-menu (&key email)
  (~auth-menu
    :user-email (when email email)
    :account-url (app-url "account" "")))
20
account/sx/layouts.sx
Normal file
@@ -0,0 +1,20 @@
;; Account layout defcomps — fully self-contained via IO primitives.
;; Registered via register_sx_layout("account", ...) in __init__.py.

;; Full page: root header + auth header row in header-child
(defcomp ~account-layout-full ()
  (<> (~root-header-auto)
      (~header-child-sx
        :inner (~auth-header-row-auto))))

;; OOB (HTMX): auth row + root header, both with oob=true
(defcomp ~account-layout-oob ()
  (<> (~auth-header-row-auto true)
      (~root-header-auto true)))

;; Mobile menu: auth section + root nav
(defcomp ~account-layout-mobile ()
  (<> (~mobile-menu-section
        :label "account" :href "/" :level 1 :colour "sky"
        :items (~auth-nav-items-auto))
      (~root-mobile-auto)))
62
account/sx/newsletters.sx
Normal file
@@ -0,0 +1,62 @@
;; Newsletter management components

(defcomp ~account-newsletter-desc (&key description)
  (when description
    (p :class "text-xs text-stone-500 mt-0.5 truncate" description)))

(defcomp ~account-newsletter-toggle (&key id url hdrs target cls checked knob-cls)
  (div :id id :class "flex items-center"
    (button :sx-post url :sx-headers hdrs :sx-target target :sx-swap "outerHTML"
      :class cls :role "switch" :aria-checked checked
      (span :class knob-cls))))


(defcomp ~account-newsletter-item (&key name desc toggle)
  (div :class "flex items-center justify-between py-4 first:pt-0 last:pb-0"
    (div :class "min-w-0 flex-1"
      (p :class "text-sm font-medium text-stone-800" name)
      desc)
    (div :class "ml-4 flex-shrink-0" toggle)))

(defcomp ~account-newsletter-list (&key items)
  (div :class "divide-y divide-stone-100" items))

(defcomp ~account-newsletter-empty ()
  (p :class "text-sm text-stone-500" "No newsletters available."))

(defcomp ~account-newsletters-panel (&key list)
  (div :class "w-full max-w-3xl mx-auto px-4 py-6"
    (div :class "bg-white/70 backdrop-blur rounded-2xl shadow border border-stone-200 p-6 sm:p-8 space-y-6"
      (h1 :class "text-xl font-semibold tracking-tight" "Newsletters")
      list)))

;; Assembled newsletters content — replaces Python _newsletters_panel_sx
;; Takes pre-fetched newsletter-list from page helper
(defcomp ~account-newsletters-content (&key newsletter-list account-url)
  (let* ((csrf (csrf-token)))
    (if (empty? newsletter-list)
        (~account-newsletter-empty)
        (~account-newsletters-panel
          :list (~account-newsletter-list
                  :items (map (lambda (item)
                                (let* ((nl (get item "newsletter"))
                                       (un (get item "un"))
                                       (nid (get nl "id"))
                                       (subscribed (get item "subscribed"))
                                       (toggle-url (str (or account-url "") "/newsletter/" nid "/toggle/"))
                                       (bg (if subscribed "bg-emerald-500" "bg-stone-300"))
                                       (translate (if subscribed "translate-x-6" "translate-x-1"))
                                       (checked (if subscribed "true" "false")))
                                  (~account-newsletter-item
                                    :name (get nl "name")
                                    :desc (when (get nl "description")
                                            (~account-newsletter-desc :description (get nl "description")))
                                    :toggle (~account-newsletter-toggle
                                              :id (str "nl-" nid)
                                              :url toggle-url
                                              :hdrs (str "{\"X-CSRFToken\": \"" csrf "\"}")
                                              :target (str "#nl-" nid)
                                              :cls (str "relative inline-flex h-6 w-11 items-center rounded-full transition-colors focus:outline-none focus:ring-2 focus:ring-emerald-500 focus:ring-offset-2 " bg)
                                              :checked checked
                                              :knob-cls (str "inline-block h-4 w-4 rounded-full bg-white shadow transform transition-transform " translate)))))
                              newsletter-list))))))
0
account/sxc/__init__.py
Normal file
19
account/sxc/pages/__init__.py
Normal file
@@ -0,0 +1,19 @@
"""Account defpage setup — registers layouts and loads .sx pages."""
from __future__ import annotations


def setup_account_pages() -> None:
    """Register account-specific layouts and load page definitions."""
    _register_account_layouts()
    _load_account_page_files()


def _load_account_page_files() -> None:
    import os
    from shared.sx.pages import load_page_dir
    load_page_dir(os.path.dirname(__file__), "account")


def _register_account_layouts() -> None:
    from shared.sx.layouts import register_sx_layout
    register_sx_layout("account", "account-layout-full", "account-layout-oob", "account-layout-mobile")
40
account/sxc/pages/account.sx
Normal file
@@ -0,0 +1,40 @@
;; Account app — declarative page definitions

;; ---------------------------------------------------------------------------
;; Account dashboard
;; ---------------------------------------------------------------------------

(defpage account-dashboard
  :path "/"
  :auth :login
  :layout :account
  :content (~account-dashboard-content))

;; ---------------------------------------------------------------------------
;; Newsletters
;; ---------------------------------------------------------------------------

(defpage newsletters
  :path "/newsletters/"
  :auth :login
  :layout :account
  :data (service "account-page" "newsletters-data")
  :content (~account-newsletters-content
             :newsletter-list newsletter-list
             :account-url account-url))

;; ---------------------------------------------------------------------------
;; Fragment pages (tickets, bookings, etc. from events service)
;; ---------------------------------------------------------------------------

(defpage fragment-page
  :path "/<slug>/"
  :auth :login
  :layout :account
  :content (let* ((user (current-user))
                  (result (frag "events" "account-page"
                                :slug slug
                                :user-id (str (get user "id")))))
             (if (or (nil? result) (empty? result))
                 (abort 404)
                 result)))
33
account/templates/_email/magic_link.html
Normal file
@@ -0,0 +1,33 @@
<!DOCTYPE html>
<html lang="en">
<head><meta charset="utf-8"></head>
<body style="margin:0;padding:0;background:#f5f5f4;font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',Roboto,sans-serif;">
  <table width="100%" cellpadding="0" cellspacing="0" style="background:#f5f5f4;padding:40px 0;">
    <tr><td align="center">
      <table width="480" cellpadding="0" cellspacing="0" style="background:#ffffff;border-radius:12px;border:1px solid #e7e5e4;padding:40px;">
        <tr><td>
          <h1 style="margin:0 0 8px;font-size:20px;font-weight:600;color:#1c1917;">{{ site_name }}</h1>
          <p style="margin:0 0 24px;font-size:15px;color:#57534e;">Sign in to your account</p>
          <p style="margin:0 0 24px;font-size:15px;line-height:1.5;color:#44403c;">
            Click the button below to sign in. This link will expire in 15 minutes.
          </p>
          <table cellpadding="0" cellspacing="0" style="margin:0 0 24px;"><tr><td style="border-radius:8px;background:#1c1917;">
            <a href="{{ link_url }}" target="_blank"
               style="display:inline-block;padding:12px 32px;font-size:15px;font-weight:500;color:#ffffff;text-decoration:none;border-radius:8px;">
              Sign in
            </a>
          </td></tr></table>
          <p style="margin:0 0 8px;font-size:13px;color:#78716c;">Or copy and paste this link into your browser:</p>
          <p style="margin:0 0 24px;font-size:13px;word-break:break-all;">
            <a href="{{ link_url }}" style="color:#1c1917;">{{ link_url }}</a>
          </p>
          <hr style="border:none;border-top:1px solid #e7e5e4;margin:24px 0;">
          <p style="margin:0;font-size:12px;color:#a8a29e;">
            If you did not request this email, you can safely ignore it.
          </p>
        </td></tr>
      </table>
    </td></tr>
  </table>
</body>
</html>
8
account/templates/_email/magic_link.txt
Normal file
@@ -0,0 +1,8 @@
Hello,

Click this link to sign in:
{{ link_url }}

This link will expire in 15 minutes.

If you did not request this, you can ignore this email.
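The `{{ link_url }}` placeholders in both email templates are Jinja-style; the app almost certainly renders them with a real template engine, but as a sketch of the substitution semantics, a minimal stdlib-only renderer (hypothetical `render` helper) could look like:

```python
import re


def render(template: str, context: dict) -> str:
    # Substitute {{ name }} placeholders from context;
    # unknown names are left in place untouched.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(context.get(m.group(1), m.group(0))),
        template,
    )


body = render("Click this link to sign in:\n{{ link_url }}",
              {"link_url": "https://example.com/magic"})
```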
0
account/tests/__init__.py
Normal file
39
account/tests/test_auth_operations.py
Normal file
@@ -0,0 +1,39 @@
"""Unit tests for account auth operations."""
from __future__ import annotations

import pytest

from account.bp.auth.services.auth_operations import validate_email


class TestValidateEmail:
    def test_valid_email(self):
        ok, email = validate_email("user@example.com")
        assert ok is True
        assert email == "user@example.com"

    def test_uppercase_lowered(self):
        ok, email = validate_email("USER@EXAMPLE.COM")
        assert ok is True
        assert email == "user@example.com"

    def test_whitespace_stripped(self):
        ok, email = validate_email("  user@example.com  ")
        assert ok is True
        assert email == "user@example.com"

    def test_empty_string(self):
        ok, email = validate_email("")
        assert ok is False

    def test_no_at_sign(self):
        ok, email = validate_email("notanemail")
        assert ok is False

    def test_just_at(self):
        ok, email = validate_email("@")
        assert ok is True  # has "@", passes the basic check

    def test_spaces_only(self):
        ok, email = validate_email("   ")
        assert ok is False
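The tests above fully pin down the observable behavior of `validate_email`: strip, lowercase, and a deliberately loose "@" check. A minimal implementation consistent with every assertion (hypothetical — the real `auth_operations` module may do more) is:

```python
def validate_email(raw: str) -> tuple[bool, str]:
    # Normalize: strip surrounding whitespace, then lowercase.
    email = raw.strip().lower()
    # Deliberately loose validity check: any "@" passes,
    # even a bare "@", as test_just_at documents.
    return "@" in email, email
```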
164
account/tests/test_ghost_membership.py
Normal file
@@ -0,0 +1,164 @@
"""Unit tests for Ghost membership helpers."""
from __future__ import annotations

from datetime import datetime

import pytest

from account.services.ghost_membership import (
    _iso, _to_str_or_none, _member_email,
    _price_cents, _sanitize_member_payload,
)


class TestIso:
    def test_none(self):
        assert _iso(None) is None

    def test_empty(self):
        assert _iso("") is None

    def test_z_suffix(self):
        result = _iso("2024-06-15T12:00:00Z")
        assert isinstance(result, datetime)
        assert result.year == 2024

    def test_offset(self):
        result = _iso("2024-06-15T12:00:00+00:00")
        assert isinstance(result, datetime)


class TestToStrOrNone:
    def test_none(self):
        assert _to_str_or_none(None) is None

    def test_dict(self):
        assert _to_str_or_none({"a": 1}) is None

    def test_list(self):
        assert _to_str_or_none([1, 2]) is None

    def test_bytes(self):
        assert _to_str_or_none(b"hello") is None

    def test_empty_string(self):
        assert _to_str_or_none("") is None

    def test_whitespace_only(self):
        assert _to_str_or_none("   ") is None

    def test_valid_string(self):
        assert _to_str_or_none("hello") == "hello"

    def test_int(self):
        assert _to_str_or_none(42) == "42"

    def test_strips_whitespace(self):
        assert _to_str_or_none("  hi  ") == "hi"

    def test_set(self):
        assert _to_str_or_none({1, 2}) is None

    def test_tuple(self):
        assert _to_str_or_none((1,)) is None

    def test_bytearray(self):
        assert _to_str_or_none(bytearray(b"x")) is None


class TestMemberEmail:
    def test_normal(self):
        assert _member_email({"email": "USER@EXAMPLE.COM"}) == "user@example.com"

    def test_none(self):
        assert _member_email({"email": None}) is None

    def test_empty(self):
        assert _member_email({"email": ""}) is None

    def test_whitespace(self):
        assert _member_email({"email": "   "}) is None

    def test_missing_key(self):
        assert _member_email({}) is None

    def test_strips(self):
        assert _member_email({"email": " a@b.com "}) == "a@b.com"


class TestPriceCents:
    def test_valid(self):
        assert _price_cents({"price": {"amount": 1500}}) == 1500

    def test_string_amount(self):
        assert _price_cents({"price": {"amount": "2000"}}) == 2000

    def test_missing_price(self):
        assert _price_cents({}) is None

    def test_missing_amount(self):
        assert _price_cents({"price": {}}) is None

    def test_none_amount(self):
        assert _price_cents({"price": {"amount": None}}) is None

    def test_nested_none(self):
        assert _price_cents({"price": None}) is None


class TestSanitizeMemberPayload:
    def test_email_lowercased(self):
        result = _sanitize_member_payload({"email": "USER@EXAMPLE.COM"})
        assert result["email"] == "user@example.com"

    def test_empty_email_excluded(self):
        result = _sanitize_member_payload({"email": ""})
        assert "email" not in result

    def test_name_included(self):
        result = _sanitize_member_payload({"name": "Alice"})
        assert result["name"] == "Alice"

    def test_note_included(self):
        result = _sanitize_member_payload({"note": "VIP"})
        assert result["note"] == "VIP"

    def test_subscribed_bool(self):
        result = _sanitize_member_payload({"subscribed": 1})
        assert result["subscribed"] is True

    def test_labels_with_id(self):
        result = _sanitize_member_payload({
            "labels": [{"id": "abc"}, {"name": "VIP"}]
        })
        assert result["labels"] == [{"id": "abc"}, {"name": "VIP"}]

    def test_labels_empty_items_excluded(self):
        result = _sanitize_member_payload({
            "labels": [{"id": None, "name": None}]
        })
        assert "labels" not in result

    def test_newsletters_with_id(self):
        result = _sanitize_member_payload({
            "newsletters": [{"id": "n1", "subscribed": True}]
        })
        assert result["newsletters"] == [{"subscribed": True, "id": "n1"}]

    def test_newsletters_default_subscribed(self):
        result = _sanitize_member_payload({
            "newsletters": [{"name": "Weekly"}]
        })
        assert result["newsletters"][0]["subscribed"] is True

    def test_dict_email_excluded(self):
        result = _sanitize_member_payload({"email": {"bad": "input"}})
        assert "email" not in result

    def test_id_passthrough(self):
        result = _sanitize_member_payload({"id": "ghost-member-123"})
        assert result["id"] == "ghost-member-123"

    def test_empty_payload(self):
        result = _sanitize_member_payload({})
        assert result == {}
8
artdag/.dockerignore
Normal file
@@ -0,0 +1,8 @@
.git
.gitea
**/.env
**/.env.gpu
**/__pycache__
**/.pytest_cache
**/*.pyc
test/
114
artdag/.gitea/workflows/ci.yml
Normal file
@@ -0,0 +1,114 @@
name: Build and Deploy

on:
  push:
    branches: [main]

env:
  REGISTRY: registry.rose-ash.com:5000
  ARTDAG_DIR: /root/art-dag-mono

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install tools
        run: |
          apt-get update && apt-get install -y --no-install-recommends openssh-client

      - name: Set up SSH
        env:
          SSH_KEY: ${{ secrets.DEPLOY_SSH_KEY }}
          DEPLOY_HOST: ${{ secrets.DEPLOY_HOST }}
        run: |
          mkdir -p ~/.ssh
          echo "$SSH_KEY" > ~/.ssh/id_rsa
          chmod 600 ~/.ssh/id_rsa
          ssh-keyscan -H "$DEPLOY_HOST" >> ~/.ssh/known_hosts 2>/dev/null || true

      - name: Build and deploy
        env:
          DEPLOY_HOST: ${{ secrets.DEPLOY_HOST }}
        run: |
          ssh "root@$DEPLOY_HOST" "
            cd ${{ env.ARTDAG_DIR }}

            OLD_HEAD=\$(git rev-parse HEAD 2>/dev/null || echo none)

            git fetch origin main
            git reset --hard origin/main

            NEW_HEAD=\$(git rev-parse HEAD)

            # Change detection
            BUILD_L1=false
            BUILD_L2=false
            if [ \"\$OLD_HEAD\" = \"none\" ] || [ \"\$OLD_HEAD\" = \"\$NEW_HEAD\" ]; then
              BUILD_L1=true
              BUILD_L2=true
            else
              CHANGED=\$(git diff --name-only \$OLD_HEAD \$NEW_HEAD)
              # common/ or core/ change -> rebuild both
              if echo \"\$CHANGED\" | grep -qE '^(common|core)/'; then
                BUILD_L1=true
                BUILD_L2=true
              fi
              if echo \"\$CHANGED\" | grep -q '^l1/'; then
                BUILD_L1=true
              fi
              if echo \"\$CHANGED\" | grep -q '^l2/'; then
                BUILD_L2=true
              fi
              if echo \"\$CHANGED\" | grep -q '^client/'; then
                BUILD_L1=true
              fi
            fi

            # Build L1
            if [ \"\$BUILD_L1\" = true ]; then
              echo 'Building L1...'
              docker build \
                --build-arg CACHEBUST=\$(date +%s) \
                -f l1/Dockerfile \
                -t ${{ env.REGISTRY }}/celery-l1-server:latest \
                -t ${{ env.REGISTRY }}/celery-l1-server:${{ github.sha }} \
                .
              docker push ${{ env.REGISTRY }}/celery-l1-server:latest
              docker push ${{ env.REGISTRY }}/celery-l1-server:${{ github.sha }}
            else
              echo 'Skipping L1 (no changes)'
            fi

            # Build L2
            if [ \"\$BUILD_L2\" = true ]; then
              echo 'Building L2...'
              docker build \
                --build-arg CACHEBUST=\$(date +%s) \
                -f l2/Dockerfile \
                -t ${{ env.REGISTRY }}/l2-server:latest \
                -t ${{ env.REGISTRY }}/l2-server:${{ github.sha }} \
                .
              docker push ${{ env.REGISTRY }}/l2-server:latest
              docker push ${{ env.REGISTRY }}/l2-server:${{ github.sha }}
            else
              echo 'Skipping L2 (no changes)'
            fi

            # Deploy stacks (--resolve-image always forces re-pull of :latest)
            if [ \"\$BUILD_L1\" = true ]; then
              cd l1 && source .env && docker stack deploy --resolve-image always -c docker-compose.yml celery && cd ..
              echo 'L1 stack deployed'
            fi
            if [ \"\$BUILD_L2\" = true ]; then
              cd l2 && source .env && docker stack deploy --resolve-image always -c docker-compose.yml activitypub && cd ..
              echo 'L2 stack deployed'
            fi

            sleep 10
            echo '=== L1 Services ==='
            docker stack services celery
            echo '=== L2 Services ==='
            docker stack services activitypub
          "
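The change-detection logic in the shell above maps changed top-level paths to image rebuilds: `common/` or `core/` rebuild both images, `l1/` and `client/` rebuild only L1 (the client ships inside the L1 image), and `l2/` rebuilds only L2. As a readable cross-check, the same mapping in Python — a hypothetical helper, not part of the repo, and deliberately not modeling the first-deploy/no-diff case where both images are built unconditionally:

```python
def builds_needed(changed_paths):
    """Return (build_l1, build_l2) for a list of changed file paths."""
    build_l1 = build_l2 = False
    for path in changed_paths:
        top = path.split("/", 1)[0]
        if top in ("common", "core"):   # shared code -> rebuild both images
            build_l1 = build_l2 = True
        elif top in ("l1", "client"):   # client code ships inside the L1 image
            build_l1 = True
        elif top == "l2":
            build_l2 = True
    return build_l1, build_l2
```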
74
artdag/CLAUDE.md
Normal file
@@ -0,0 +1,74 @@
# Art DAG Monorepo

Federated content-addressed DAG execution engine for distributed media processing with ActivityPub ownership and provenance tracking.

## Project Structure

```
core/    # DAG engine (artdag package) - nodes, effects, analysis, planning
l1/      # L1 Celery rendering server (FastAPI + Celery + Redis + PostgreSQL)
l2/      # L2 ActivityPub registry (FastAPI + PostgreSQL)
common/  # Shared templates, middleware, models (artdag_common package)
client/  # CLI client
test/    # Integration & e2e tests
```

## Tech Stack

Python 3.11+, FastAPI, Celery, Redis, PostgreSQL (asyncpg for L1), SQLAlchemy, Pydantic, JAX (CPU/GPU), IPFS/Kubo, Docker Swarm, HTMX + Jinja2 for web UI.

## Key Commands

### Testing
```bash
cd l1 && pytest tests/      # L1 unit tests
cd core && pytest tests/    # Core unit tests
cd test && python run.py    # Full integration pipeline
```
- pytest uses `asyncio_mode = "auto"` for async tests
- Test files: `test_*.py`, fixtures in `conftest.py`

### Linting & Type Checking (L1)
```bash
cd l1 && ruff check .       # Lint (E, F, I, UP rules)
cd l1 && mypy app/types.py app/routers/recipes.py tests/
```
- Line length: 100 chars (E501 ignored)
- Mypy: strict on `app/types.py`, `app/routers/recipes.py`, `tests/`; gradual elsewhere
- Mypy ignores imports for: celery, redis, artdag, artdag_common, ipfs_client

### Docker
```bash
docker build -f l1/Dockerfile -t celery-l1-server:latest .
docker build -f l1/Dockerfile.gpu -t celery-l1-gpu:latest .
docker build -f l2/Dockerfile -t l2-server:latest .
./deploy.sh    # Build, push, deploy stacks
```

## Architecture Patterns

- **3-Phase Execution**: Analyze -> Plan -> Execute (tasks in `l1/tasks/`)
- **Content-Addressed**: All data identified by SHA3-256 hashes or IPFS CIDs
- **Services Pattern**: Business logic in `app/services/`, API endpoints in `app/routers/`
- **Types Module**: Pydantic models and TypedDicts in `app/types.py`
- **Celery Tasks**: In `l1/tasks/`, decorated with `@app.task`
- **S-Expression Effects**: Composable effect language in `l1/sexp_effects/`
- **Storage**: Local filesystem, S3, or IPFS backends (`storage_providers.py`)

## Auth

- L1 <-> L2: scoped JWT tokens (no shared secrets)
- L2: password + OAuth SSO, token revocation in Redis (30-day expiry)
- Federation: ActivityPub RSA signatures (`core/artdag/activitypub/`)

## Key Config Files

- `l1/pyproject.toml` - mypy, pytest, ruff config for L1
- `l1/celery_app.py` - Celery initialization
- `l1/database.py` / `l2/db.py` - SQLAlchemy models
- `l1/docker-compose.yml` / `l2/docker-compose.yml` - Swarm stacks

## Tools

- Use Context7 MCP for up-to-date library documentation
- Playwright MCP is available for browser automation/testing
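The content-addressed pattern noted under Architecture Patterns (data identified by SHA3-256 hashes) reduces to hashing the bytes themselves; a minimal stdlib sketch, with an illustrative function name not taken from the repo:

```python
import hashlib


def content_address(data: bytes) -> str:
    # A content-addressed ID is the SHA3-256 hex digest of the bytes,
    # so identical content always resolves to the same key.
    return hashlib.sha3_256(data).hexdigest()
```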
Some files were not shown because too many files have changed in this diff.