Hermes Web UI — Sprints 11-14: multi-provider models, settings, session QoL, alerts, polish
Sprint 11 (v0.13): multi-provider model support, streaming smoothness
- Dynamic model dropdown populated from configured API keys (OpenAI, Anthropic, Google, DeepSeek, GLM, Kimi, MiniMax, OpenRouter, Nous Portal)
- Scroll pinning during streaming (no forced scroll when user has scrolled up)
- All route handlers extracted to api/routes.py (server.py now ~76 lines)

Sprint 12 (v0.14): settings panel, SSE reconnect, session QoL
- Settings panel (gear icon) -- persist default model and workspace server-side
- SSE auto-reconnect on network blips
- Pin/star sessions to top of sidebar
- Import session from JSON export

Sprint 13 (v0.15): cron alerts, background errors, session duplicate, tab title
- Cron completion alerts: toast per completion + unread badge on Tasks tab
- Background agent error banner when a non-active session errors mid-stream
- Session duplicate button
- Browser tab title reflects active session name

Sprint 14 (v0.16): Mermaid diagrams, file ops, session archive/tags, timestamps
- Mermaid diagram rendering inline (dark theme, lazy CDN load)
- File rename (double-click in file tree) and create folder
- Session archive (hide without deleting, toggle to show)
- Session tags -- #hashtag in title becomes colored chip + click-to-filter
- Message timestamps (HH:MM on hover, full date as tooltip)

Test suite: 224 tests across 14 sprint files + regression gate, 0 failures.
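Sprint 11's dynamic model dropdown is driven by server-side provider discovery from configured API keys. A minimal sketch of the idea, assuming a simple env-var scan (the `PROVIDER_KEYS` mapping and `discover_providers` name are hypothetical, not the actual api/config.py code, which also reads config.yaml and auth.json):

```python
import os

# Hypothetical mapping: env var that signals a configured provider -> provider name.
PROVIDER_KEYS = {
    "OPENAI_API_KEY": "openai",
    "ANTHROPIC_API_KEY": "anthropic",
    "GEMINI_API_KEY": "google",
    "DEEPSEEK_API_KEY": "deepseek",
    "OPENROUTER_API_KEY": "openrouter",
}

def discover_providers(env=None):
    """Return the providers whose API keys are present and non-empty."""
    env = os.environ if env is None else env
    return sorted(name for var, name in PROVIDER_KEYS.items() if env.get(var))
```

The real endpoint (`GET /api/models`) would turn this list into per-provider model entries and fall back to a hardcoded OpenRouter list when nothing is detected.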
.gitignore (1 change, vendored)
@@ -16,6 +16,7 @@ archive/
 .env
 .env.*
 !.env.example
+.claude/*
 
 # Generated screenshots and transient artifacts
 screenshot-*.png
ARCHITECTURE.md (272 changes)
@@ -11,39 +11,63 @@
 
 ## 1. Overview and Purpose
 
-The Hermes Web UI is a lightweight, single-file web application that gives you
-a browser-based interface to the Hermes agent that is functionally equivalent to the CLI.
-It is modeled on the Claude interface: a three-panel layout with a sidebar for
-session management, a central chat area, and a right panel for workspace file browsing.
+The Hermes Web UI is a lightweight web application that gives you a browser-based
+interface to the Hermes agent that is functionally equivalent to the CLI. It is modeled on
+the Claude-style interface: a three-panel layout with a sidebar for session management,
+a central chat area, and a right panel for workspace file browsing.
 
 The design philosophy is deliberately minimal. There is no build step, no bundler, no
-frontend framework. Everything ships from a single Python file. This makes the code easy
-to modify from a terminal or by an agent, but it creates architectural debt that grows as
-the feature set expands.
+frontend framework. The Python server is split into a routing shell (server.py) and
+business logic modules (api/). The frontend is six vanilla JS modules loaded from static/.
+This makes the code easy to modify from a terminal or by an agent.
 
 ---
 
 ## 2. File Inventory
 
-    <agent-dir>/webui-mvp/
-      server.py             Main server file. ~1150 lines. Pure Python.
-                            HTTP server, all API handlers, Session model, SSE engine,
-                            approval wiring, file upload parser. No inline HTML/CSS/JS.
-                            (Phase A+E complete: HTML/CSS/JS all extracted to static/)
-      server.py.bak         Backup from a prior iteration. Kept for reference.
-      server_new.py         Intermediate ~900-line draft. Superseded by server.py.
-                            Safe to delete once Wave 1 begins.
-      start.sh              Convenience script: kills running instance, starts server.py
-                            via nohup, writes stdout/stderr to /tmp/webui-mvp.log
+    <repo>/
+      server.py             Thin routing shell + HTTP Handler. ~76 lines. Pure Python.
+                            Delegates all route handling to api/routes.py.
+      start.sh              Discovery script: finds agent dir, Python, starts server.
+      api/
+        __init__.py         Package marker
+        routes.py           All GET + POST route handlers (~802 lines)
+        config.py           Shared configuration, constants, global state, model discovery (~453 lines)
+        helpers.py          HTTP helpers: j(), bad(), require(), safe_resolve() (~57 lines)
+        models.py           Session model + CRUD (~114 lines)
+        workspace.py        File ops: list_dir, read_file_content, workspace helpers (~77 lines)
+        upload.py           Multipart parser, file upload handler (~77 lines)
+        streaming.py        SSE engine, run_agent integration, cancel support (~218 lines)
+      static/
+        index.html          HTML template (served from disk)
+        style.css           All CSS
+        ui.js               DOM helpers, renderMd, tool cards, model dropdown (~671 lines)
+        workspace.js        File tree, preview, file ops (~168 lines)
+        sessions.js         Session CRUD, list rendering, search (~206 lines)
+        messages.js         send(), SSE event handlers, approval, transcript (~310 lines)
+        panels.js           Cron, skills, memory, workspace, todo, switchPanel (~600 lines)
+        boot.js             Event wiring + boot IIFE (~154 lines)
+      tests/
+        conftest.py         Isolated test server (port 8788, separate HERMES_HOME) (~240 lines)
+        test_sprint1-11.py  Feature tests per sprint (13 files)
+        test_regressions.py Permanent regression gate
       AGENTS.md             Instruction file for agents working in this directory.
       ROADMAP.md            Feature and product roadmap document.
+      SPRINTS.md            Forward sprint plan with CLI + Claude parity targets.
       ARCHITECTURE.md       THIS FILE.
+      TESTING.md            Manual browser test plan and automated coverage reference.
+      CHANGELOG.md          Release notes per sprint.
+      PORTABILITY.md        Portability design spec for download-and-run installs.
+      requirements.txt      Python dependencies.
+      .env.example          Sample environment variable overrides.
 
 State directory (runtime data, separate from source):
 
     ~/.hermes/webui-mvp/
       sessions/             One JSON file per session: {session_id}.json
-      test-workspace/       Default empty workspace used during development
+      workspaces.json       Registered workspaces list
+      last_workspace.txt    Last-used workspace path
+      settings.json         (future) User settings
 
 Log file:
 
@@ -301,13 +325,21 @@ read_file_content(workspace, rel):
 
 ### 5.1 Structure
 
-The entire frontend is ~750 lines inside the HTML Python raw string.
-Structure: <head> with CSS only (no external stylesheets), <body> with three-panel layout,
-<script> with all JavaScript (no external libraries).
+The frontend is served from static/ as separate files: one HTML template, one CSS file,
+and six JavaScript modules (~2,025 lines total). External dependency: Prism.js from CDN
+(syntax highlighting, loaded async/deferred).
 
-Three-panel layout:
+Six JS modules loaded in order at end of <body>:
+
+    1. ui.js        (~589 lines)  DOM helpers, renderMd, tool card rendering, global state
+    2. workspace.js (~168 lines)  File tree, preview, file operations
+    3. sessions.js  (~206 lines)  Session CRUD, list rendering, search
+    4. messages.js  (~310 lines)  send(), SSE event handlers, approval, transcript
+    5. panels.js    (~600 lines)  Cron, skills, memory, workspace, todo, switchPanel
+    6. boot.js      (~152 lines)  Event wiring + boot IIFE
+
+Three-panel layout (in static/index.html):
 
-    <aside class="sidebar">    Left panel: session list, model selector, workspace path
+    <aside class="sidebar">    Left panel: session list, nav tabs, model selector
     <main class="main">        Center: topbar, messages area, approval card, composer
     <aside class="rightpanel"> Right panel: workspace file tree and file preview
 
@@ -477,13 +509,17 @@ Step-by-step trace of what happens when you type a message and press Send:
 
 ## 7. Dependency Map
 
-Direct imports in server.py:
-
-    run_agent.AIAgent    Main agent class. Wraps LLM + tool execution.
-    tools.approval.*     Module-level approval state.
-    yaml                 Config loading.
-    Standard library: json, os, re, sys, threading, time, traceback, uuid,
-      http.server, pathlib, urllib.parse, email.parser, queue
+server.py imports from api/ modules (config, helpers, models, workspace, upload, streaming).
+The api/ modules in turn import Hermes internals:
+
+    api/streaming.py imports:
+      run_agent.AIAgent  Main agent class. Wraps LLM + tool execution.
+    api/config.py imports:
+      yaml               Config loading.
+    server.py imports:
+      tools.approval.*   Module-level approval state (with graceful fallback).
+
+Standard library across all modules: json, os, re, sys, threading, time, traceback,
+uuid, http.server, pathlib, urllib.parse, email.parser, queue, collections
 
 AIAgent constructor parameters used:
 
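The "graceful fallback" noted above for the tools.approval import can be sketched with the standard optional-import pattern. This is an illustrative sketch, not the actual server.py code; the no-op stub bodies are assumptions:

```python
# Optional import: the web UI should still start when Hermes internals
# (tools.approval) are unavailable, e.g. in an isolated test environment.
try:
    from tools.approval import has_pending, resolve  # real approval state
except ImportError:
    def has_pending(session_id):
        # Stub: no approval machinery available, so nothing is ever pending.
        return False

    def resolve(session_id, approved):
        # Stub: silently ignore approval decisions.
        pass
```

Route handlers can then call `has_pending()`/`resolve()` unconditionally, and only the behavior degrades when the agent package is absent.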
@@ -561,52 +597,33 @@ restriction from the UI yet (see ROADMAP.md Wave 4 for the plan).
 These phases run in parallel with the feature roadmap. Each phase targets software
 quality: testability, resilience, maintainability, and modularity.
 
-### Phase A: File Separation (Priority: High, Effort: Medium)
+### Phase A: File Separation -- COMPLETE
 
-Split server.py into a proper package.
+Split server.py into a proper package. Completed across Sprints 4-10.
 
-Target structure:
+Current structure:
 
-    webui-mvp/
-      server.py          Entry point: starts server, imports api/
+    <repo>/
+      server.py          Entry point + HTTP Handler routing (~704 lines)
       api/
         __init__.py
-        handlers.py      do_GET / do_POST routing and dispatch
-        session_store.py Session class, get_session, new_session, all_sessions, SESSIONS
-        streaming.py     _run_agent_streaming, STREAMS, STREAMS_LOCK, _sse()
-        upload.py        parse_multipart, handle_upload
-        files.py         safe_resolve, list_dir, read_file_content
-        approval.py      Thin wrapper around tools.approval for the HTTP API
-        config.py        Configuration loading (env vars, config.yaml)
+        config.py        Configuration, constants, global state (~273 lines)
+        helpers.py       HTTP helpers: j(), bad(), require(), safe_resolve() (~57 lines)
+        models.py        Session model + CRUD (~114 lines)
+        workspace.py     File ops, workspace management (~77 lines)
+        upload.py        Multipart parser, file upload handler (~77 lines)
+        streaming.py     SSE engine, run_agent, cancel support (~218 lines)
       static/
-        index.html       HTML document (served directly from disk)
+        index.html       HTML document (served from disk)
         style.css        All CSS
-        [app.js deleted] Replaced by 6 modules: ui.js, workspace.js, sessions.js,
-                         messages.js, panels.js, boot.js
+        ui.js, workspace.js, sessions.js, messages.js, panels.js, boot.js
       tests/
-        test_session_crud.py
-        test_upload.py
-        test_streaming.py
-        test_approval.py
-        test_files.py
-        frontend/
-          test_markdown.html
-          test_session_state.html
+        conftest.py         Isolated test server on port 8788
+        test_sprint1-10.py  Feature tests per sprint (12 files)
+        test_regressions.py Permanent regression gate
 
-Implementation steps:
-1. Extract CSS and HTML to static/style.css and static/index.html. No content changes.
-   Server serves index.html from disk: handler reads Path('static/index.html').read_text()
-2. Extract JS to 6 static modules (complete -- app.js deleted Sprint 9)
-   Add GET /static/* handler in do_GET.
-3. Extract Session class and helpers to api/session_store.py
-4. Extract _run_agent_streaming and SSE helpers to api/streaming.py
-5. Extract parse_multipart and handle_upload to api/upload.py
-6. Extract list_dir and friends to api/files.py
-7. Refactor handlers.py to import from the above modules
-8. server.py becomes: config setup, start server, import Handler from handlers.py
-
-Benefit: Each file is under ~200 lines. Agents can read and modify individual files
-without loading the full 1100-line blob.
+Remaining: server.py still has all 49 route handlers in one do_GET/do_POST class.
+Sprint 11 plans extracting these to api/routes.py, making server.py a ~50-line shell.
 
 ### Phase B: Thread-Safe Request Context (Priority: Critical, Effort: Medium)
 
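Phase B's interim "Option 3" (serialize the env-var block per request) can be sketched as a lock around the mutation-and-restore of process-global state. This is a sketch of the technique only; the `_ENV_LOCK` name and the `MODEL` variable are assumptions, not the actual codebase identifiers:

```python
import os
import threading

# os.environ is process-global, so two request threads selecting different
# models can race. Serializing the mutate-call-restore window avoids that
# for a single-user deployment, at the cost of concurrency.
_ENV_LOCK = threading.Lock()

def with_model_env(model, fn):
    """Run fn() with MODEL temporarily set in os.environ, one thread at a time."""
    with _ENV_LOCK:
        prev = os.environ.get("MODEL")
        os.environ["MODEL"] = model
        try:
            return fn()
        finally:
            # Restore the previous value so other sessions never see leaked state.
            if prev is None:
                os.environ.pop("MODEL", None)
            else:
                os.environ["MODEL"] = prev
```

Thread-local storage or explicit parameter passing (the full Phase B fix) removes the global entirely instead of guarding it.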
@@ -637,72 +654,36 @@ Option 3 (interim, safe for single-user): Wrap the env var block in a per-sessio
 Phase B also includes: review all other os.environ reads/writes in the codebase for
 similar thread-safety issues.
 
-### Phase C: Session Store Improvements (Priority: Medium, Effort: Medium)
+### Phase C: Session Store Improvements -- COMPLETE
 
-Three problems to fix:
+All three problems fixed in Sprint 5:
 
-1. Unbounded SESSIONS cache:
-   Replace dict with functools.lru_cache wrapper or a simple OrderedDict with max size.
-   Evict LRU entries when size exceeds 100.
-
-2. No locking around SESSIONS:
-   Wrap all SESSIONS dict reads and writes with LOCK (already defined, just unused).
-   Pattern: with LOCK: s = SESSIONS.get(sid)
-
-3. O(n) directory scan in all_sessions():
-   Add an index file: SESSION_DIR/index.json
-   Contents: list of compact() dicts, sorted by updated_at
-   Maintained on every Session.save() and every delete.
-   all_sessions() reads index.json (one file read) instead of scanning all JSONs.
-   get_session() still loads the full {session_id}.json on cache miss.
-   Index rebuild tool: a function that regenerates index.json from all *.json files.
+1. SESSIONS cache: OrderedDict with LRU cap of 100, oldest evicted automatically.
+2. LOCK: all SESSIONS dict reads/writes wrapped with LOCK (from Sprint 1).
+3. Session index: `sessions/_index.json` maintained on every save/delete.
+   `all_sessions()` reads the index file (O(1)) instead of scanning all JSONs.
 
-### Phase D: Input Validation and Error Handling (Priority: Medium, Effort: Low)
+### Phase D: Input Validation and Error Handling -- COMPLETE
 
-1. Add a validate() helper:
-   def validate(body, *required_fields):
-       missing = [f for f in required_fields if not body.get(f)]
-       if missing: raise ValueError(f"Missing required fields: {missing}")
-
-2. Refine the outer try/except in do_GET and do_POST:
-   except ValueError as e:
-       return j(self, {'error': str(e)}, status=400)
-   except KeyError as e:
-       return j(self, {'error': f'Not found: {e}'}, status=404)
-   except Exception as e:
-       log.exception('Unhandled error')
-       return j(self, {'error': 'Internal server error'}, status=500)
-   # Never expose tracebacks to the client (security risk even on localhost)
-
-3. Add request duration logging:
-   Log at INFO level: {method} {path} -> {status} in {duration}ms
+Completed in Sprint 4-6:
+
+1. `require()` and `bad()` helpers in `api/helpers.py` for parameter validation.
+2. All endpoints return clean 400/404 responses instead of tracebacks.
+3. Structured JSON request logging via `log_request()` override (Sprint 1).
 
-### Phase E: Frontend Modularization (Priority: Medium, Effort: High)
+### Phase E: Frontend Modularization -- COMPLETE
 
-After Phase A splits the HTML/JS into files, Phase E improves the JavaScript itself.
+Completed across Sprints 5, 6, and 9:
 
-1. Switch to ES Modules (type="module"):
-   app.js deleted Sprint 9 -- replaced by 6 modules:
-   - state.js: export S, INFLIGHT
-   - sessions.js: session CRUD functions
-   - chat.js: send(), SSE handling
-   - files.js: loadDir(), openFile()
-   - upload.js: uploadPendingFiles(), addFiles(), renderTray()
-   - approval.js: approval card and polling
-   - markdown.js: renderMd()
-   - ui.js: setStatus, setBusy, showToast, syncTopbar
-   Each module imports what it needs from state.js and other modules.
-
-2. Replace renderMd with marked.js:
-   CDN: https://cdn.jsdelivr.net/npm/marked/marked.min.js
-   No bundler needed, ~50KB, handles tables, nested lists, HTML sanitization.
-   Usage: marked.parse(raw) -- drop-in replacement.
-   Add DOMPurify alongside for XSS sanitization of rendered HTML.
-
-3. Add Prism.js for syntax highlighting:
-   CDN: https://cdn.jsdelivr.net/npm/prismjs
-   Apply after renderMd: Prism.highlightAllUnder(element)
-   Supports 200+ languages with auto-detection.
+1. HTML extracted to `static/index.html` (Sprint 6).
+2. CSS extracted to `static/style.css` (Sprint 4).
+3. `app.js` deleted Sprint 9, replaced by 6 focused modules:
+   `ui.js`, `workspace.js`, `sessions.js`, `messages.js`, `panels.js`, `boot.js`.
+   Loaded as standard `<script>` tags (not ES modules) in dependency order.
+4. Prism.js added for syntax highlighting (Sprint 8) via CDN, deferred load.
+
+Remaining: renderMd() is still a hand-rolled regex chain. Tables partially supported.
+Replacing with marked.js + DOMPurify is a future improvement (not blocking).
 
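Phase C's bounded SESSIONS cache maps directly onto `collections.OrderedDict`. A self-contained sketch of the eviction behavior described above (the helper names are illustrative; the real api/models.py code also holds LOCK around these operations):

```python
from collections import OrderedDict

MAX_CACHED = 100          # LRU cap from Phase C

SESSIONS = OrderedDict()  # sid -> session object

def cache_session(sid, session):
    """Insert or refresh a session, evicting the least recently used past the cap."""
    SESSIONS[sid] = session
    SESSIONS.move_to_end(sid)         # mark as most recently used
    while len(SESSIONS) > MAX_CACHED:
        SESSIONS.popitem(last=False)  # evict the oldest entry

def get_cached(sid):
    session = SESSIONS.get(sid)
    if session is not None:
        SESSIONS.move_to_end(sid)     # a cache hit also refreshes recency
    return session
```

On a miss, the store would still load the full `{session_id}.json` from disk and re-insert it via `cache_session()`.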
 ### Phase F: API Design Cleanup (Priority: Low, Effort: Medium)
 
@@ -719,16 +700,11 @@ After Phase A splits the HTML/JS into files, Phase E improves the JavaScript its
 
 4. Consistent naming: use snake_case for all JSON keys.
 
-### Phase G: Observability (Priority: Low, Effort: Low)
+### Phase G: Observability -- MOSTLY COMPLETE
 
-1. Structured JSON logging to /tmp/webui-mvp.log:
-   {"ts": "...", "method": "POST", "path": "/api/chat/start", "status": 200, "ms": 12}
-
-2. Enhanced /health response:
-   {"status": "ok", "sessions": 10, "active_streams": 2, "uptime_s": 3600, "version": "0.3"}
-
-3. GET /api/debug/stats (localhost only):
-   {"sessions_cached": N, "streams_active": M, "memory_mb": X}
+1. Structured JSON logging: COMPLETE (Sprint 1). Per-request JSON to /tmp/webui-mvp.log.
+2. Enhanced /health: COMPLETE (Sprint 7). Returns `active_streams`, `uptime_seconds`.
+3. GET /api/debug/stats: NOT YET IMPLEMENTED. Low priority.
 
 ### Phase H: Authentication (Priority: Low, Effort: Medium)
 
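The per-request JSON logging from Phase G hinges on overriding `log_request()` on the handler class. A sketch under assumptions (the `request_record` helper and its field names are illustrative, mirroring the record shape shown above):

```python
import json
import time
from http.server import BaseHTTPRequestHandler

def request_record(method, path, status, ms):
    """Build one structured log record per request (field names illustrative)."""
    return {"ts": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "method": method, "path": path, "status": status, "ms": ms}

class LoggingHandler(BaseHTTPRequestHandler):
    # Replace the default "host - - [date] request" access line with one
    # JSON object per request, suitable for /tmp/webui-mvp.log.
    def log_request(self, code="-", size="-"):
        started = getattr(self, "_started", None)  # set at start of do_GET/do_POST
        ms = round((time.time() - started) * 1000) if started else None
        print(json.dumps(request_record(self.command, self.path, int(code), ms)))
```

Writing each record as a single JSON line keeps the log greppable and machine-parseable without any logging framework.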
@@ -740,29 +716,15 @@ Optional password gate for non-SSH-tunnel deployments.
 
 4. All API endpoints check cookie if HERMES_WEBUI_PASSWORD is set
 5. Cookie validity: 30 days from last activity
 
-### Phase I: Test Infrastructure (Priority: High, Effort: High)
+### Phase I: Test Infrastructure -- COMPLETE
 
-No tests exist today. This is the highest-risk technical debt.
+190 tests across 12 test files + regression gate. Isolated test server on port 8788
+with separate HERMES_HOME, wiped per run. Production data never touched.
 
-1. Python unit tests (pytest):
-   - tests/test_session_crud.py: Session class, get_session, new_session, all_sessions
-   - tests/test_upload.py: parse_multipart directly with known byte payloads
-   - tests/test_files.py: safe_resolve, list_dir, read_file_content with tmp dirs
-   - tests/test_streaming.py: mock AIAgent, verify event sequence
-   - tests/test_approval.py: approval state machine
-
-2. HTTP integration tests:
-   - Start a test server on a random port
-   - Drive it with httpx or requests
-   - Verify all API endpoints return correct shapes and status codes
-
-3. Frontend tests (no build step):
-   - tests/frontend/test_markdown.html: known input -> expected HTML output assertions
-   - Run via: python3 -m http.server and open in browser, or use playwright
-
-4. CI (GitHub Actions):
-   - .github/workflows/test.yml: on push, run pytest + ruff lint
-   - Target: zero test failures before merging any feature branch
+Test files: `test_sprint1.py` through `test_sprint10.py`, `test_regressions.py`.
+Fixtures in `conftest.py`: auto-cleanup, cron isolation, workspace reset.
+
+Remaining: no CI (GitHub Actions), no frontend tests (browser-based).
 
 ### Phase J: Performance (Priority: Low, Effort: High)
 
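The Phase I isolation trick (a throwaway HERMES_HOME so production sessions and crons are never touched) can be sketched as an environment builder the conftest server-spawning fixture would use. The `HERMES_WEBUI_PORT` variable name is an assumption; only the separate-HERMES_HOME behavior is described in the source:

```python
import os
import tempfile

def isolated_env(base_env=None, port=8788):
    """Environment for a throwaway test server: a fresh HERMES_HOME per run,
    so the tests read and write state far away from ~/.hermes."""
    env = dict(os.environ if base_env is None else base_env)
    env["HERMES_HOME"] = tempfile.mkdtemp(prefix="hermes-test-")
    env["HERMES_WEBUI_PORT"] = str(port)  # hypothetical override variable
    return env
```

A session-scoped pytest fixture would pass this env to `subprocess.Popen(["python3", "server.py"], env=...)`, poll `/health` until the server answers, and delete the temp directory at teardown.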
@@ -985,7 +947,7 @@ Resolution: Phase B replaces with thread-local or explicit parameter passing.
 Bug fix: Escape from file editor now cancels edits
 New endpoints: POST /api/crons/create, GET /api/session/export
 Tests: 16 new, 106/106 total
-v0.0.6 Sprint 8 (March 31, 2026):
+v0.10 Sprint 8 (March 31, 2026):
 Features: edit+regenerate messages, regenerate last response, clear conversation,
 Prism.js syntax highlighting, message queue (MSG_QUEUE + drain on idle),
 INFLIGHT-first loadSession (message persists on switch-away/back)
@@ -994,22 +956,22 @@ Resolution: Phase B replaces with thread-local or explicit parameter passing.
 Tests: 14 new, 139/139 total
 JS: MSG_QUEUE global, updateQueueBadge(), setBusy drain logic, send() queues when busy,
 loadSession checks INFLIGHT before server fetch
-v0.1.0 Concurrency sweeps (March 31, 2026):
+v0.12.2 Concurrency sweeps (March 31, 2026):
 R10-R15: approval cross-session, activity bar per-session, live card
 restore on switch-back, settled cards after done, model source,
 newSession card clear. 190/190 tests.
-v0.0.8 Sprint 10 (March 31, 2026):
+v0.12 Sprint 10 (March 31, 2026):
 Arch: server.py split into api/ modules (config, helpers, models, workspace, upload, streaming)
 Features: background task cancel, cron run history, tool card UX polish
 Post-sprint fixes: SSE cancel event breaks loop, Cancel button always hidden on setBusy(false),
-S.activeStreamId initialized, tool-card show-more uses data attributes, version label v0.0.8,
+S.activeStreamId initialized, tool-card show-more uses data attributes, version label v0.12,
 Session.__init__ **kwargs forward-compat, test cron isolation via HERMES_HOME,
 last_workspace reset in conftest between tests, tool cards grouped by assistant turn
 Tests: 18 new, 167/167 total
 Regressions fixed: uuid, AIAgent, has_pending, SSE cancel loop, Session.__init__ tool_calls
 test_regressions.py: 10 tests -- one per introduced bug, permanent regression gate
 Total after fixes: 177/177
-v0.0.7 Sprint 9 (March 31, 2026):
+v0.11 Sprint 9 (March 31, 2026):
 Arch: app.js deleted; replaced by ui.js, workspace.js, sessions.js, messages.js, panels.js, boot.js
 Features: tool call cards (inline collapsible, live + history), attachment persistence,
 todo list panel (parses tool results from session history)
CHANGELOG.md (114 changes)
@@ -1,4 +1,4 @@
-# Hermes WebUI -- Changelog
+# Hermes Web UI -- Changelog
 
 > Living document. Updated at the end of every sprint.
 > Source: <repo>/
@@ -6,7 +6,103 @@
|
|||||||
|
|
||||||
---
|
---
|
||||||
|
|
||||||
## [v0.1.0] Concurrency + Correctness Sweeps
|
## [v0.16] Sprint 14 -- Visual Polish + Workspace Ops + Session Organization
|
||||||
|
*March 30, 2026 | 233 tests*
|
||||||
|
|
||||||
|
### Features
|
||||||
|
- **Mermaid diagram rendering.** Code blocks tagged `mermaid` render as
|
||||||
|
diagrams inline. Mermaid.js loaded lazily from CDN on first encounter.
|
||||||
|
Dark theme with matching colors. Falls back to code block on parse error.
|
||||||
|
- **Message timestamps.** Subtle HH:MM time next to each role label. Full
|
||||||
|
date/time on hover tooltip. User messages get `_ts` field when sent.
|
||||||
|
- **File rename.** Double-click any filename in workspace panel to rename
|
||||||
|
inline. `POST /api/file/rename` endpoint with path traversal protection.
|
||||||
|
- **Folder create.** Folder icon button in workspace panel header. Prompt
|
||||||
|
for name, `POST /api/file/create-dir` endpoint.
|
||||||
|
- **Session tags.** Add `#tag` to session titles. Tags shown as colored
|
||||||
|
chips in sidebar. Click a tag to filter the session list.
|
||||||
|
- **Session archive.** Archive icon on each session. Archived sessions
|
||||||
|
hidden by default; "Show N archived" toggle at top of list. Backend
|
||||||
|
`POST /api/session/archive` with `archived` field on Session model.
|
||||||
|
|
||||||
|
### Bug Fixes
|
||||||
|
- **Date grouping fix.** Session list groups (Today/Yesterday/Earlier) now
|
||||||
|
use `created_at` instead of `updated_at`, preventing sessions from jumping
|
||||||
|
between groups when auto-titling touches `updated_at`.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## [v0.15] Sprint 13 -- Alerts + Session QoL + Polish
|
||||||
|
*March 30, 2026 | 221 tests*
|
||||||
|
|
||||||
|
### Features
|
||||||
|
- **Cron completion alerts.** New `GET /api/crons/recent` endpoint. UI polls every
|
||||||
|
30s (pauses when tab is hidden). Toast notification per completion with status icon.
|
||||||
|
Red badge count on Tasks nav tab, cleared when tab is opened.
|
||||||
|
- **Background agent error alerts.** When a streaming session errors out and the user
|
||||||
|
is viewing a different session, a persistent red banner appears above the messages:
|
||||||
|
"Session X has encountered an error." View button navigates, Dismiss clears.
|
||||||
|
- **Session duplicate.** Copy icon on each session in the sidebar (visible on hover).
|
||||||
|
Creates a new session with the same workspace and model, titled "(copy)".
|
||||||
|
- **Browser tab title.** `document.title` updates to show the active session title
|
||||||
|
(e.g. "My Task -- Hermes"). Resets to "Hermes" when no session is active.
|
||||||
|
|
||||||
|
### Bug Fixes
|
||||||
|
- Click guard added for duplicate button to prevent accidental session navigation.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## [v0.14] Sprint 12 -- Settings Panel + Reliability + Session QoL
|
||||||
|
*March 30, 2026 | 211 tests*
|
||||||
|
|
||||||
|
### Features
|
||||||
|
- **Settings panel.** Gear icon in topbar opens slide-in overlay. Persist default
|
||||||
|
model and workspace server-side in `settings.json`. Server reads on startup.
|
||||||
|
- **SSE auto-reconnect.** When EventSource drops mid-stream, attempts one reconnect
|
||||||
|
using the same stream_id after 1.5s. Shared `_wireSSE()` function eliminates
|
||||||
|
handler duplication.
|
||||||
|
- **Pin sessions.** Star icon on each session. Pinned sessions float to top of sidebar
|
||||||
|
under a gold "Pinned" header. Persisted in session JSON.
|
||||||
|
- **Import session from JSON.** Upload button in sidebar. Creates new session with
|
||||||
|
fresh ID from exported JSON file.
|
||||||
|
|
||||||
|
### Bug Fixes
|
||||||
|
- `models.py` uses `_cfg.DEFAULT_MODEL` module reference so `save_settings()` changes
|
||||||
|
take effect for `new_session()`.
|
||||||
|
- Full-scan fallback sort in `all_sessions()` now accounts for pinned sessions.
|
||||||
|
- `save_settings()` whitelists known keys only, rejecting arbitrary data.
|
||||||
|
- Escape key closes settings overlay.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## [v0.13] Sprint 11 -- Multi-Provider Models + Streaming Smoothness
|
||||||
|
*March 30, 2026 | 201 tests*
|
||||||
|
|
||||||
|
### Features
|
||||||
- **Multi-provider model support.** A new `GET /api/models` endpoint discovers configured
  providers from `config.yaml`, `auth.json`, and API key environment variables. The model
  dropdown now populates dynamically from whatever providers the user has set up (Anthropic,
  OpenAI, Google, DeepSeek, Nous Portal, OpenRouter, etc.), falling back to the hardcoded
  OpenRouter list when no providers are detected. Sessions with unlisted models auto-add
  them to the dropdown.
- **Smooth scroll pinning.** During streaming, auto-scroll only when the user is near the
  bottom of the message area. If the user scrolls up to read earlier content, new tokens
  no longer yank them back down. Pinning resumes when they scroll back to the bottom.

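Discovery from environment variables reduces to a filter with a fallback. An illustrative sketch (the variable-to-provider table below is an assumption for the example; the real endpoint also reads config.yaml and auth.json):

```python
import os

# Assumed env-var -> (provider, models) table, for illustration only.
ENV_KEYS = {
    "ANTHROPIC_API_KEY": ("anthropic", ["claude-sonnet-4"]),
    "OPENAI_API_KEY": ("openai", ["gpt-4o"]),
    "DEEPSEEK_API_KEY": ("deepseek", ["deepseek-chat"]),
}
FALLBACK = [("openrouter", ["anthropic/claude-sonnet-4", "openai/gpt-4o"])]

def discover_models(environ=None):
    """Group models by provider for every configured API key; fall back to
    the hardcoded OpenRouter list when no provider is detected."""
    env = os.environ if environ is None else environ
    found = [spec for var, spec in ENV_KEYS.items() if env.get(var)]
    return found or FALLBACK
```

The fallback keeps the dropdown populated on a fresh install, which is why an empty provider scan is not treated as an error.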
### Architecture

- **Routes extracted to api/routes.py.** All 49 GET/POST route handlers moved from
  server.py into `api/routes.py` (802 lines). server.py is now a 76-line thin shell: a
  Handler class with structured logging, dispatch to `handle_get()`/`handle_post()`, and
  `main()`. This completes the server split started in Sprint 10.
- **Cleaned up duplicate dead-code routes** that existed in the old `do_GET` (skills/save,
  skills/delete, and memory/write were duplicated in both GET and POST handlers).

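The thin-shell split can be pictured as a dispatch table: the Handler only looks up a path, and the route module owns all the logic. A toy sketch of the pattern (names are illustrative, not the real api/routes.py API):

```python
# Illustrative route registry: handlers registered by decorator, plus a
# shell-level dispatcher that knows nothing about individual routes.
GET_ROUTES = {}

def get(path):
    """Register a handler function for a GET path."""
    def register(fn):
        GET_ROUTES[path] = fn
        return fn
    return register

@get("/api/health")
def health(params):
    return {"ok": True}

def handle_get(path, params=None):
    """All the shell does: find the handler and call it."""
    handler = GET_ROUTES.get(path)
    if handler is None:
        return {"status": 404, "error": "not found"}
    return handler(params or {})
```

With this shape, adding a route never touches the shell, which is what keeps server.py at a fixed handful of lines.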
### Bug Fixes

- Regression tests updated for the new route module structure.

---

## [v0.12.2] Concurrency + Correctness Sweeps

*March 31, 2026 | 190 tests*

Two systematic audits of all concurrent multi-session scenarios. Each finding
became a regression test so it cannot silently return.

[...]

---

## [v0.12.1] Sprint 10 Post-Release Fixes

*March 31, 2026 | 177 tests*

Critical regressions introduced during the server.py split, caught by users and
fixed immediately.

[...]

---

## [v0.12] Sprint 10 -- Server Health + Operational Polish

*March 31, 2026 | 167 tests*

### Post-sprint Bug Fixes

- `setBusy(false)` now always hides the Cancel button
- `S.activeStreamId` properly initialized in the S global state object
- Tool card "Show more" button uses data attributes instead of inline JSON.stringify (XSS/parse safety)
- Version label updated to v0.12
- `Session.__init__` accepts `**kwargs` for forward compatibility with future JSON fields
- Test cron jobs now isolated via the `HERMES_HOME` env var in conftest (no more pollution of the real jobs.json)
- `last_workspace` reset after each test in conftest (prevents workspace state bleed between tests)

[...]

---

## [v0.11] Sprint 9 -- Codebase Health + Daily Driver Gaps

*March 31, 2026 | 149 tests*

The sprint that closed the last gaps for heavy agentic use.

[...]

---

## [v0.10] Sprint 8 -- Daily Driver Finish Line

*March 31, 2026 | 139 tests*

### Features

[...]

- Send button guard while an inline edit is active
- Escape closes the dropdown, clears search, cancels an active edit
- Approval polling not restarted on INFLIGHT session switch-back
- Version label updated to v0.10

### Hotfix: Message Queue + INFLIGHT

- **Message queue.** Sending while busy queues the message with a toast + badge.

[...]

Three-panel layout: sessions sidebar, chat area, workspace panel.

---

*Last updated: Sprint 14, March 31, 2026 | Tests: 224/224*
49
README.md

If you prefer to launch the server directly:

```bash
cd /path/to/hermes-agent   # or wherever sys.path can find Hermes modules
HERMES_WEBUI_PORT=8787 venv/bin/python /path/to/hermes-webui/server.py
```

Note: use the agent venv Python (or any Python environment that has the Hermes agent dependencies installed). System Python will be missing `openai`, `httpx`, and other required packages.

Health check:

[...]

```bash
python -m pytest tests/ -v
```

Or using the agent venv explicitly:

```bash
/path/to/hermes-agent/venv/bin/python -m pytest tests/ -v   # or any Python with deps installed
```

Tests run against an isolated server on port 8788 with a separate state directory.
Production data and real cron jobs are never touched.

### Chat and agent

- Streaming responses via SSE (tokens appear as they are generated)
- Multi-provider model support -- any Hermes API provider (OpenAI, Anthropic, Google, DeepSeek, Nous Portal, OpenRouter); dynamic model dropdown populated from configured keys
- Send a message while one is processing -- it queues automatically
- Edit any past user message inline and regenerate from that point
- Retry the last assistant response with one click
- Cancel a running task from the activity bar
- Tool call cards inline -- each shows the tool name, args, and result snippet
- Mermaid diagram rendering inline (flowcharts, sequence diagrams, Gantt charts)
- Approval card for dangerous shell commands (allow once / session / always / deny)
- SSE auto-reconnect on network blips (SSH tunnel resilience)
- File attachments persist across page reloads
- Message timestamps (HH:MM next to each message, full date on hover)

### Sessions

- Create, rename, duplicate, delete, search by title and message content
- Pin/star sessions to the top of the sidebar
- Archive sessions (hide without deleting, toggle to show)
- Session tags -- add #tag to titles for colored chips and click-to-filter
- Grouped by Today / Yesterday / Earlier in the sidebar
- Download as Markdown transcript, full JSON export, or import from JSON
- Sessions persist across page reloads and SSH tunnel reconnects
- Browser tab title reflects the active session name

### Workspace file browser

- Browse directory tree with type icons
- Preview text, code, Markdown (rendered), and images inline
- Edit, create, delete, and rename files; create folders
- Right panel is drag-resizable
- Syntax highlighted code preview (Prism.js)

### Settings and configuration

- Settings panel (gear icon in topbar) -- persist default model and default workspace server-side
- Cron completion alerts -- toast notifications and an unread badge on the Tasks tab
- Background agent error alerts -- banner when a non-active session encounters an error

### Panels

- **Chat** -- session list, search, pin, archive, new conversation
- **Tasks** -- view, create, edit, run, pause/resume, delete cron jobs; completion alerts
- **Skills** -- list all skills by category, search, preview, create/edit/delete
- **Memory** -- view and edit MEMORY.md and USER.md inline
- **Todos** -- live task list from the current session
- **Spaces** -- add, rename, remove workspaces; quick-switch from topbar

## Architecture

```
server.py              HTTP routing shell (~76 lines)
api/
  routes.py            All GET + POST route handlers
  config.py            Discovery + globals + model provider detection
  helpers.py           HTTP helpers: j(), bad(), require(), safe_resolve()
  models.py            Session model + CRUD
  workspace.py         File ops: list_dir, read_file_content, workspace helpers
  ...
static/
  index.html           HTML template
  style.css            All CSS
  ui.js                DOM helpers, renderMd, Mermaid, tool cards, file tree
  workspace.js         File tree, preview, file ops
  sessions.js          Session CRUD, list rendering, search, tags, archive
  messages.js          send(), SSE event handlers, approval, transcript
  panels.js            Cron, skills, memory, workspace, todo, switchPanel, alerts
  boot.js              Event wiring + boot IIFE
tests/
  conftest.py          Isolated test server (port 8788, separate HERMES_HOME)
  test_sprint1-14.py   Feature tests per sprint
  test_regressions.py  Permanent regression gate
```

State lives outside the repo at `~/.hermes/webui-mvp/` by default
(sessions, workspaces, settings, last_workspace). Override with `HERMES_WEBUI_STATE_DIR`.

---

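Resolving the state directory is a one-env-var decision; a sketch of the override described above (the function name is illustrative):

```python
import os
from pathlib import Path

def state_dir(environ=None) -> Path:
    """HERMES_WEBUI_STATE_DIR wins; otherwise fall back to ~/.hermes/webui-mvp/."""
    env = os.environ if environ is None else environ
    override = env.get("HERMES_WEBUI_STATE_DIR")
    if override:
        return Path(override).expanduser()
    return Path.home() / ".hermes" / "webui-mvp"
```

Keeping the override as a plain path (with `~` expansion) is what lets the test suite point the server at a throwaway directory.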
34
ROADMAP.md

# Hermes Web UI: Full Parity Roadmap

> Goal: Full 1:1 parity with the Hermes CLI experience via a clean dark web UI.
> Everything you can do from the CLI terminal, you can do from this UI.
>
> Last updated: Sprint 14 (March 30, 2026)
> Tests: 226/226 passing
> Source: <repo>/

---

| Sprint 10 | Server health + operational polish | server.py split into api/ modules, background task cancel, cron run history viewer, tool card UX polish | 167 |
| Sprint 10 fixes | Import regressions + regression tests | uuid, AIAgent, has_pending, SSE cancel loop, Session.__init__ tool_calls; test_regressions.py | 177 |
| Concurrency sweeps | Multi-session correctness | Approval cross-session (R10), activity bar per-session (R11), live cards on switch-back (R12), tool cards after done (R13), session model authoritative (R14), newSession cards (R15) | 190 |
| Sprint 11 | Multi-provider models + streaming | Dynamic model dropdown (any Hermes provider), smooth scroll pinning, routes extracted to api/routes.py (server.py 704→76 lines) | 201 |
| Sprint 12 | Settings + reliability + session QoL | Settings panel (gear icon, settings.json), SSE auto-reconnect, pin sessions, import session from JSON | 211 |
| Sprint 13 | Alerts + polish | Cron completion alerts (polling + badge), background error banner, session duplicate, browser tab title | 221 |

---

| Layer | Location | Status |
|-------|----------|--------|
| Python server | <repo>/server.py (~76 lines) + api/ modules (~1900 lines) | Thin shell + business logic in api/ |
| HTML template | <repo>/static/index.html | Served from disk |
| CSS | <repo>/static/style.css | Served from disk |
| JavaScript | <repo>/static/{ui,workspace,sessions,messages,panels,boot}.js | 6 modules, ~2250 lines total |
| Runtime state | ~/.hermes/webui-mvp/sessions/ | Session JSON files |
| Test server | Port 8788, state dir ~/.hermes/webui-mvp-test/ | Isolated, wiped per run |
| Production server | Port 8787 | SSH tunnel from Mac |

### Chat and Agent

- [x] Send messages, get SSE-streaming responses
- [x] Switch models per session (10 models, grouped by provider)
- [x] Multi-provider API support: use any Hermes agent API provider (OpenAI, Anthropic, Google, etc.) directly, not just OpenRouter (Sprint 11)
- [x] Upload files to workspace (drag-drop, click, clipboard paste)
- [x] File tray with remove button
- [x] Tool progress shown in activity bar above composer

- [x] File name truncation with tooltip for long names
- [x] Right panel resizable (drag inner edge)
- [x] Syntax highlighted code preview (Prism.js)
- [x] Rename file (Sprint 14)
- [x] Create folder (Sprint 14)

### Sessions

- [x] Create session (+ button or Cmd/Ctrl+K)

- [x] Export session as JSON (full messages + metadata)
- [x] Session inherits last-used workspace on creation
- [x] Session content search (search message text across sessions)
- [x] Session tags / labels (Sprint 14)
- [x] Archive sessions (Sprint 14)
- [x] Clear conversation (wipe messages, keep session) (Wave 3)
- [x] Import session from JSON (Sprint 12)
- [x] Pin/star sessions to top of list (Sprint 12)
- [x] Duplicate session (Sprint 13)

### Workspace Management

- [x] Add workspace with path validation (must be an existing directory)

- [x] Add/edit memory entry inline

### Configuration

- [x] Settings panel (default model, default workspace) (Sprint 12)
- [ ] Enable/disable toolsets per session (deferred)

### Notifications

- [x] Cron job completion alerts (Sprint 13)
- [x] Background agent error alerts (Sprint 13)

### Advanced / Future

- [ ] Voice input via Whisper (Wave 6)

203
SPRINTS.md

# Hermes Web UI -- Forward Sprint Plan

> Current state: v0.15 | 221 tests | Daily driver ready
> This document plans the path from here to two targets:
>
> Target A: 1:1 feature parity with the Hermes CLI (everything you can do from the
> [...]

---

## Where we are now (v0.15)

**CLI parity: ~80% complete.** Core agent loop, all tools visible, workspace
file ops, cron/skills/memory CRUD, session management, streaming, cancel --
present. Gaps are project organization, artifacts, voice, sharing, mobile.

---

## Sprint 11 -- Multi-Provider Models + Streaming Smoothness (COMPLETED)

**Theme:** Use any Hermes-supported model provider from the UI, and make
heavy agentic work feel fast and fluid.

**Why now:** Two high-impact gaps converge here. First, the model dropdown is
hardcoded to ~10 OpenRouter model strings. If Hermes is configured with direct
Anthropic, OpenAI, Google, or other API providers, the web UI can't use them.
This means users who set up Hermes with native API keys are locked out of
their own models in the browser. Second, the streaming render path rebuilds
the entire message list on every tool event, causing visible flicker during
heavy agentic work.

### Track A: Bugs

- Tool card DOM thrash: renderMessages() rebuilds all cards on each tool event.
- Scroll position lost on re-render during streaming (messages jump).

### Track B: Features

- **Multi-provider model support:** Query the Hermes agent's configured providers
  and available models at startup via a new `GET /api/models` endpoint. The
  model dropdown populates dynamically from whatever providers the user has
  configured (OpenRouter, direct OpenAI, direct Anthropic, Google, DeepSeek,
  etc.). Group by provider. Fall back to the current hardcoded list if the
  agent query fails. This ensures the web UI can use any model the CLI can.
- **Incremental tool card streaming:** Instead of renderMessages() on each
  tool event, maintain a live card group element per turn and append/update
  cards in place. The assistant text row below the cards already updates
  incrementally (via assistantBody.innerHTML).
- **Smooth scroll:** Pin the scroll to the bottom during streaming unless the user
  has manually scrolled up (read-back mode). Resume pinning when the user scrolls
  back to the bottom.
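The pinning decision reduces to a single comparison. The real implementation is client-side JavaScript; here is the predicate expressed in Python, with an assumed slack threshold:

```python
def should_autoscroll(scroll_top: float, client_height: float,
                      scroll_height: float, slack: float = 40.0) -> bool:
    """True when the viewport bottom is within `slack` px of the content
    bottom -- i.e. the user has not scrolled up into read-back mode."""
    return scroll_height - (scroll_top + client_height) <= slack
```

On each streamed token the renderer appends, then scrolls only if the predicate held before the append; scrolling back to the bottom makes it true again, which resumes pinning.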

### Track C: Architecture

- Extract routes from server.py, leaving a ~50-line shell: imports, a Handler stub
  that delegates to routes, and main(). Completes the server split started in Sprint 10.

**Tests:** ~15 new. Total: ~205.
**Hermes CLI parity impact:** High (model provider parity is a major CLI gap)
**Claude parity impact:** Low (streaming smoothness)

---

## Sprint 12 -- Settings Panel + Reliability + Session QoL

**Theme:** Persist your preferences, survive network blips, and organize sessions.

**Why now:** Three daily-driver friction points converge. First, the default model
and workspace aren't persisted server-side -- every restart loses them. Second,
SSH tunnel hiccups during long agent runs silently kill the response with no
recovery. Third, after 50+ sessions the flat chronological list makes it hard
to keep important conversations accessible.

### Track A: Bugs

- Workspace validation on add doesn't check symlinks (shows as invalid when
  it's actually a valid symlink to a directory).

### Track B: Features

- **Settings panel:** A gear icon in the topbar opens a slide-in settings panel.
  Sections: Default Model, Default Workspace. Persisted server-side in
  `~/.hermes/webui-mvp/settings.json`. The server reads settings on startup and
  uses them as defaults. `GET /api/settings` + `POST /api/settings` endpoints.
- **SSE auto-reconnect:** When the EventSource connection drops mid-stream
  (network blip, SSH tunnel hiccup), auto-reconnect once using the same
  `stream_id`. The server-side queue holds undelivered events. If the reconnect
  fails after 5s, show an error banner. This is the #1 reliability gap for
  remote VPS usage.
- **Pin sessions:** A star icon on any session in the sidebar. Pinned sessions
  float to the top of the list above the date groups. Persisted on the session
  JSON as `pinned: true`. Toggled on click. Simple and high quality-of-life.
- **Import session from JSON:** Drag a `.json` export file into the sidebar
  (or click the import button) to restore it as a new session. Mirrors the
  existing JSON export. Useful for moving sessions between machines.

### Track C: Architecture

- Settings schema: `settings.json` with typed fields, validated on load, with
  sane defaults. Served via `GET /api/settings`, written via `POST /api/settings`.
- SSE reconnect: the server keeps `STREAMS[stream_id]` alive for 60s after
  a client disconnect, allowing reconnect with the same stream_id.
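The reconnect window in Track C can be sketched as a registry that orphans streams on disconnect rather than deleting them (the structure and names here are assumptions, not the real module):

```python
GRACE_SECONDS = 60
STREAMS: dict = {}  # stream_id -> {"queue": [...], optional "orphaned_at": float}

def on_disconnect(stream_id, now):
    """Mark the stream orphaned instead of deleting it immediately."""
    if stream_id in STREAMS:
        STREAMS[stream_id]["orphaned_at"] = now

def reap(now):
    """Drop streams whose reconnect grace period has expired."""
    dead = [sid for sid, st in STREAMS.items()
            if "orphaned_at" in st and now - st["orphaned_at"] > GRACE_SECONDS]
    for sid in dead:
        del STREAMS[sid]

def reconnect(stream_id):
    """Return undelivered events if the stream survived, else None."""
    stream = STREAMS.get(stream_id)
    if stream is None:
        return None
    stream.pop("orphaned_at", None)  # the client is back; cancel the clock
    return stream["queue"]
```

A client that reconnects within the grace period drains the queued events and resumes; one that misses it gets None and falls back to an error banner.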

**Tests:** ~15 new. Total: ~216.
**Hermes CLI parity impact:** Medium (settings persistence, reliability)
**Claude parity impact:** Medium (settings panel, pinned conversations)

---

## Sprint 13 -- Alerts, Session QoL, Polish

**Theme:** Know what Hermes is doing, and small quality-of-life wins.

**Why now:** Cron jobs run silently. Background errors surface nowhere. You have
no way to know a long-running task finished (or failed) while you were on another
tab. Meanwhile, a few small UX gaps (no session duplicate, no tab title) add up
to daily friction.

### Track A: Bugs

- Symlink workspace validation -- confirmed already fixed (`.resolve()` follows
  symlinks before the `is_dir()` check).

### Track B: Features

- **Cron completion alerts:** When a cron job finishes (success or error), push
|
- **Cron completion alerts:** `GET /api/crons/recent?since=TIMESTAMP` endpoint.
|
||||||
a toast notification to the UI. Use a polling endpoint (`GET /api/crons/status`)
|
UI polls every 30s (only when tab is focused). Toast notification on each
|
||||||
that the UI checks every 30s while the window is focused. Badge count on the
|
completion. Red badge count on Tasks nav tab, cleared when tab is opened.
|
||||||
Tasks tab icon when there are unread completions.
|
- **Background agent error alerts:** When a streaming session errors out and
|
||||||
- **Background agent error alerts:** When a streaming session errors out (network
|
the user is on a different session, show a persistent red banner above the
|
||||||
drop, model error, tool failure), and the user is not currently viewing that
|
message area: "Session X encountered an error." Click "View" to navigate,
|
||||||
session, show a persistent banner: "Session X encountered an error." Clicking
|
"Dismiss" to clear.
|
||||||
it navigates to that session.
|
- **Session duplicate:** Copy icon on each session in the sidebar (visible on
|
||||||
- **Virtual scroll for session list:** Session list currently renders all sessions
|
hover). Creates a new session with same workspace/model, titled "(copy)".
|
||||||
in the DOM. Above ~100 sessions, the sidebar gets slow. Implement simple virtual
|
- **Browser tab title:** `document.title` updates to show the active session
|
||||||
scroll: render only ~20 visible rows, reuse DOM nodes on scroll.
|
title (e.g. "My Task — Hermes"). Resets to "Hermes" when no session active.
|
||||||
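The `/api/crons/recent?since=TIMESTAMP` contract above amounts to filtering a completion log by timestamp. A minimal server-side sketch, assuming an in-memory record shape; the real store would live with the cron scheduler, and `record_completion`/`recent_completions` are hypothetical helper names:

```python
import time

# Hypothetical in-memory record of finished cron runs.
CRON_COMPLETIONS = []  # each: {'job': str, 'ok': bool, 'finished_at': float}

def record_completion(job, ok):
    # Called by the scheduler when a run finishes (success or error).
    CRON_COMPLETIONS.append({'job': job, 'ok': ok, 'finished_at': time.time()})

def recent_completions(since: float):
    # Backs GET /api/crons/recent?since=TIMESTAMP -- return only runs the
    # client has not yet seen, oldest first, so the UI can toast each one.
    return sorted(
        (c for c in CRON_COMPLETIONS if c['finished_at'] > since),
        key=lambda c: c['finished_at'],
    )
```

The client keeps the largest `finished_at` it has seen and passes it back as `since` on the next 30s poll, so each completion is delivered exactly once.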
-### Track C: Architecture
-- SSE reconnect: if the SSE connection drops mid-stream, auto-reconnect once
-  (with the same stream_id). Currently a network blip ends the response silently.
-
-**Tests:** ~14 new. Total: ~225.
-**Hermes CLI parity impact:** High (cron visibility, error surfacing)
-**Claude parity impact:** Medium (Claude has notification panel)
+**Tests:** ~10 new. Total: ~221.
+**Hermes CLI parity impact:** Medium (cron visibility, error surfacing)
+**Claude parity impact:** Low

---
-## Sprint 14 -- Project Organization + Session Management
+## Sprint 14 -- Visual Polish + Workspace Ops + Session Organization
+
+**Theme:** Polish the visual experience, close workspace file gaps, and
+organize sessions properly.
+
+### Track B: Features
+- **Mermaid diagram rendering:** Code blocks tagged `mermaid` render as
+  diagrams inline. Mermaid.js loaded lazily from CDN. Dark theme. Falls
+  back to code block on parse error.
+- **Message timestamps:** Subtle HH:MM time next to each role label. Full
+  date/time on hover. User messages tagged with `_ts` on send.
+- **Date grouping fix:** Session list uses `created_at` for groups instead
+  of `updated_at`. Prevents sessions jumping between groups on auto-title.
+- **File rename:** Double-click any filename in the workspace panel to
+  rename inline (same pattern as session rename). `POST /api/file/rename`.
+- **Folder create:** Folder icon button in workspace panel header.
+  `POST /api/file/create-dir`. Prompt for folder name.
+- **Session tags:** Add `#tag` to session titles. Tags extracted and shown
+  as colored chips in the sidebar. Click a tag to filter the session list.
+- **Session archive:** Archive button on each session (box icon). Archived
+  sessions hidden from sidebar by default. "Show N archived" toggle at top
+  of list. `POST /api/session/archive` endpoint.
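The session-tags feature boils down to pulling `#tag` tokens out of a title and keeping the remainder as the display text. A minimal sketch, assuming a simple alphanumeric tag charset (the real charset is not specified in the plan) and a hypothetical helper name:

```python
import re

_TAG_RE = re.compile(r'#([A-Za-z0-9_-]+)')  # assumed tag charset

def split_title_tags(title: str):
    """Extract #tags from a session title; return (display_title, tags)."""
    tags = _TAG_RE.findall(title)
    display = _TAG_RE.sub('', title).strip()
    display = re.sub(r'\s{2,}', ' ', display)  # collapse gaps left by removal
    return display, tags
```

Doing this purely over the title string means no schema change is needed for display: the sidebar renders `display` plus one chip per entry in `tags`, and filtering is a membership test on `tags`.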
+### Candidates for next sprints
+- Workspace reorder (drag-and-drop)
+- View skill linked files
+- Voice input via Whisper
+- Subagent delegation cards (enhanced tool card rendering)
+
+**Tests:** ~12 new. Total: ~233.
+**Hermes CLI parity impact:** Medium (file rename, folder create)
+**Claude parity impact:** Medium (Mermaid, tags, archive)
+
+---
+
+## Sprint 15 -- Project Organization + Session Management

**Theme:** Organize work the way you think, not just chronologically.
@@ -158,21 +202,19 @@ daily organizational gap vs. Claude's project folders.
Each project is a named group. Sessions can be dragged into projects or
assigned via right-click. Stored in `projects.json`. Projects collapse/expand.
This is the single biggest Claude parity feature missing.
-- **Pin sessions:** Star icon on any session to pin it to the top of the list
-  above date groups. Persisted on the session JSON as `pinned: true`.
-- **Session tags:** Inline `#tag` syntax in session titles gets extracted and
-  shown as colored chips. Clicking a tag filters the list. No backend change
-  needed -- parsed client-side from title text.
-- **Archive sessions:** A "More" overflow menu on each session (right-click or
-  long-press) with: Archive (hides from main list, accessible via filter),
-  Duplicate (new session with same workspace/model), Export JSON.
-- **Import session from JSON:** Drag a `.json` export file into the sidebar to
-  restore it as a new session. Mirrors the existing JSON export.
+- ~~Pin sessions~~ (DONE Sprint 12)
+- ~~Import session from JSON~~ (DONE Sprint 12)
+
+### Deferred to later sprints
+- Session tags / labels
+- Archive sessions
+- Rename file / Create folder (can be done through the agent)
+- Toolset control per session
+- Virtual scroll for session list

### Track C: Architecture
-- Session index v2: extend `_index.json` to include `tags`, `pinned`, and
-  `project_id` fields. Rebuild on session save. Enables fast client-side
-  filtering without disk reads.
+- Session index v2: extend `_index.json` to include `project_id` field.
+  Rebuild on session save. Enables fast client-side filtering without disk reads.

**Tests:** ~16 new. Total: ~241.
**Hermes CLI parity impact:** Low (CLI has no session organization)
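The "session index v2" bullet can be sketched as a rebuild pass that writes one compact record per session file, now carrying `project_id`. This is an assumed shape, not the actual `_write_session_index()` implementation; only `project_id` and the `_index.json` filename come from the plan, the rest mirrors the existing compact-record fields:

```python
import json
from pathlib import Path

def rebuild_index(session_dir: Path):
    # Sketch of the "_index.json v2" rebuild: one compact record per session,
    # including project_id so the client can filter without per-session disk reads.
    index = []
    for p in session_dir.glob('*.json'):
        if p.name == '_index.json':
            continue  # skip the index itself
        data = json.loads(p.read_text(encoding='utf-8'))
        index.append({
            'session_id': data.get('session_id'),
            'title': data.get('title', 'Untitled'),
            'updated_at': data.get('updated_at', 0),
            'project_id': data.get('project_id'),  # new in v2; None if unassigned
        })
    (session_dir / '_index.json').write_text(
        json.dumps(index, ensure_ascii=False, indent=2), encoding='utf-8')
    return index
```

Running this on every session save keeps the index authoritative; the client then groups the list by `project_id` entirely in memory.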
@@ -344,7 +386,7 @@ address.
|-------------|--------|
| Chat / agent loop | Done (v0.3) |
| Streaming responses | Done (v0.5) |
-| Tool call visibility | Done (v0.0.7) |
+| Tool call visibility | Done (v0.11) |
| File ops (read/write/search/patch) | Done (v0.6) |
| Terminal commands | Done via workspace |
| Cron job management | Done (v0.9) |
@@ -353,6 +395,7 @@ address.
| Session history | Done (v0.3) |
| Workspace switching | Done (v0.7) |
| Model selection | Done (v0.3) |
+| Multi-provider model support | Sprint 11 |
| Toolset control | Sprint 12 |
| Settings persistence | Sprint 12 |
| Subagent visibility | Sprint 17 |
@@ -369,9 +412,9 @@ address.
| Streaming chat | Done (v0.5) |
| Model switching | Done (v0.3) |
| File attachments | Done (v0.6) |
-| Syntax highlighting | Done (v0.0.6) |
-| Tool use visibility | Done (v0.0.7) |
-| Edit/regenerate messages | Done (v0.0.6) |
+| Syntax highlighting | Done (v0.10) |
+| Tool use visibility | Done (v0.11) |
+| Edit/regenerate messages | Done (v0.10) |
| Session management | Done (v0.6) |
| Artifacts (HTML/SVG preview) | Sprint 15 |
| Code execution inline | Sprint 15 |
@@ -402,6 +445,6 @@ address.

---

-*Last updated: March 31, 2026*
-*Current version: v0.1.0 | 190 tests*
-*Next sprint: Sprint 11 (streaming smoothness + api/routes.py split)*
+*Last updated: March 30, 2026*
+*Current version: v0.13 | 201 tests*
+*Next sprint: Sprint 14 (visual polish + small QoL)*
@@ -1,4 +1,4 @@
-# Hermes WebUI: Browser Testing Plan
+# Hermes Web UI: Browser Testing Plan

> This document is for manual browser testing by you or by a Claude browser agent.
> It covers every user-facing feature of the UI through Sprint 2.

@@ -1 +1 @@
-"""Hermes WebUI -- API modules."""
+"""Hermes Web UI -- API modules."""
247 api/config.py
@@ -1,5 +1,5 @@
"""
-Hermes WebUI -- Shared configuration, constants, and global state.
+Hermes Web UI -- Shared configuration, constants, and global state.
Imported by all other api/* modules and by server.py.

Discovery order for all paths:
@@ -37,6 +37,7 @@ STATE_DIR = Path(os.getenv(
SESSION_DIR = STATE_DIR / 'sessions'
WORKSPACES_FILE = STATE_DIR / 'workspaces.json'
SESSION_INDEX_FILE = SESSION_DIR / '_index.json'
+SETTINGS_FILE = STATE_DIR / 'settings.json'
LAST_WORKSPACE_FILE = STATE_DIR / 'last_workspace.txt'

# ── Hermes agent directory discovery ─────────────────────────────────────────
@@ -238,6 +239,203 @@ CLI_TOOLSETS = cfg.get('platform_toolsets', {}).get('cli', [
    'web', 'webhook',
])
+# ── Model / provider discovery ───────────────────────────────────────────────
+
+# Hardcoded fallback models (used when no config.yaml or agent is available)
+_FALLBACK_MODELS = [
+    {'provider': 'OpenAI', 'id': 'openai/gpt-5.4-mini', 'label': 'GPT-5.4 Mini'},
+    {'provider': 'OpenAI', 'id': 'openai/gpt-4o', 'label': 'GPT-4o'},
+    {'provider': 'OpenAI', 'id': 'openai/o3', 'label': 'o3'},
+    {'provider': 'OpenAI', 'id': 'openai/o4-mini', 'label': 'o4-mini'},
+    {'provider': 'Anthropic', 'id': 'anthropic/claude-sonnet-4.6', 'label': 'Claude Sonnet 4.6'},
+    {'provider': 'Anthropic', 'id': 'anthropic/claude-sonnet-4-5', 'label': 'Claude Sonnet 4.5'},
+    {'provider': 'Anthropic', 'id': 'anthropic/claude-haiku-3-5', 'label': 'Claude Haiku 3.5'},
+    {'provider': 'Other', 'id': 'google/gemini-2.5-pro', 'label': 'Gemini 2.5 Pro'},
+    {'provider': 'Other', 'id': 'deepseek/deepseek-chat-v3-0324', 'label': 'DeepSeek V3'},
+    {'provider': 'Other', 'id': 'meta-llama/llama-4-scout', 'label': 'Llama 4 Scout'},
+]
+
+# Provider display names for known Hermes provider IDs
+_PROVIDER_DISPLAY = {
+    'nous': 'Nous Portal', 'openrouter': 'OpenRouter', 'anthropic': 'Anthropic',
+    'openai': 'OpenAI', 'openai-codex': 'OpenAI Codex', 'copilot': 'GitHub Copilot',
+    'zai': 'Z.AI / GLM', 'kimi-coding': 'Kimi / Moonshot', 'deepseek': 'DeepSeek',
+    'minimax': 'MiniMax', 'google': 'Google', 'meta-llama': 'Meta Llama',
+    'huggingface': 'HuggingFace', 'alibaba': 'Alibaba',
+}
+
+# Well-known models per provider (used to populate dropdown for direct API providers)
+_PROVIDER_MODELS = {
+    'anthropic': [
+        {'id': 'claude-opus-4.6', 'label': 'Claude Opus 4.6'},
+        {'id': 'claude-sonnet-4.6', 'label': 'Claude Sonnet 4.6'},
+        {'id': 'claude-sonnet-4-5', 'label': 'Claude Sonnet 4.5'},
+        {'id': 'claude-haiku-3-5', 'label': 'Claude Haiku 3.5'},
+    ],
+    'openai': [
+        {'id': 'gpt-5.4-mini', 'label': 'GPT-5.4 Mini'},
+        {'id': 'gpt-4o', 'label': 'GPT-4o'},
+        {'id': 'o3', 'label': 'o3'},
+        {'id': 'o4-mini', 'label': 'o4-mini'},
+    ],
+    'openai-codex': [
+        {'id': 'codex-mini-latest', 'label': 'Codex Mini'},
+    ],
+    'google': [
+        {'id': 'gemini-2.5-pro', 'label': 'Gemini 2.5 Pro'},
+    ],
+    'deepseek': [
+        {'id': 'deepseek-chat-v3-0324', 'label': 'DeepSeek V3'},
+        {'id': 'deepseek-reasoner', 'label': 'DeepSeek Reasoner'},
+    ],
+    'nous': [
+        {'id': 'claude-opus-4.6', 'label': 'Claude Opus 4.6 (via Nous)'},
+        {'id': 'claude-sonnet-4.6', 'label': 'Claude Sonnet 4.6 (via Nous)'},
+        {'id': 'gpt-5.4-mini', 'label': 'GPT-5.4 Mini (via Nous)'},
+        {'id': 'gemini-2.5-pro', 'label': 'Gemini 2.5 Pro (via Nous)'},
+    ],
+    'zai': [
+        {'id': 'glm-4-plus', 'label': 'GLM-4 Plus'},
+        {'id': 'glm-4-air', 'label': 'GLM-4 Air'},
+        {'id': 'glm-z1-flash', 'label': 'GLM-Z1 Flash'},
+    ],
+    'kimi-coding': [
+        {'id': 'moonshot-v1-8k', 'label': 'Moonshot v1 8k'},
+        {'id': 'moonshot-v1-32k', 'label': 'Moonshot v1 32k'},
+        {'id': 'moonshot-v1-128k', 'label': 'Moonshot v1 128k'},
+        {'id': 'kimi-latest', 'label': 'Kimi Latest'},
+    ],
+    'minimax': [
+        {'id': 'abab6.5s-chat', 'label': 'MiniMax ABAB 6.5S'},
+        {'id': 'abab6.5g-chat', 'label': 'MiniMax ABAB 6.5G'},
+    ],
+}
+
+
+def get_available_models() -> dict:
+    """
+    Return available models grouped by provider.
+
+    Discovery order:
+    1. Read config.yaml 'model' section for active provider info
+    2. Check for known API keys in env or ~/.hermes/.env
+    3. Fall back to hardcoded model list (OpenRouter-style)
+
+    Returns: {
+        'active_provider': str|None,
+        'default_model': str,
+        'groups': [{'provider': str, 'models': [{'id': str, 'label': str}]}]
+    }
+    """
+    active_provider = None
+    default_model = DEFAULT_MODEL
+    groups = []
+
+    # 1. Read config.yaml model section
+    model_cfg = cfg.get('model', {})
+    if isinstance(model_cfg, str):
+        default_model = model_cfg
+    elif isinstance(model_cfg, dict):
+        active_provider = model_cfg.get('provider')
+        cfg_default = model_cfg.get('default', '')
+        if cfg_default:
+            default_model = cfg_default
+
+    # 2. Also check env vars for model override
+    env_model = os.getenv('HERMES_MODEL') or os.getenv('OPENAI_MODEL') or os.getenv('LLM_MODEL')
+    if env_model:
+        default_model = env_model.strip()
+
+    # 3. Try to read auth store for active provider (if hermes is installed)
+    if not active_provider:
+        auth_store_path = HOME / '.hermes' / 'auth.json'
+        if auth_store_path.exists():
+            try:
+                import json as _j
+                auth_store = _j.loads(auth_store_path.read_text())
+                active_provider = auth_store.get('active_provider')
+            except Exception:
+                pass
+
+    # 4. Check for API keys that imply available providers
+    hermes_env_path = HOME / '.hermes' / '.env'
+    env_keys = {}
+    if hermes_env_path.exists():
+        try:
+            for line in hermes_env_path.read_text().splitlines():
+                line = line.strip()
+                if line and not line.startswith('#') and '=' in line:
+                    k, v = line.split('=', 1)
+                    env_keys[k.strip()] = v.strip().strip('"').strip("'")
+        except Exception:
+            pass
+
+    # Merge with actual env
+    all_env = {**env_keys}
+    for k in ('ANTHROPIC_API_KEY', 'OPENAI_API_KEY', 'OPENROUTER_API_KEY',
+              'GOOGLE_API_KEY', 'GLM_API_KEY', 'KIMI_API_KEY', 'DEEPSEEK_API_KEY'):
+        val = os.getenv(k)
+        if val:
+            all_env[k] = val
+
+    detected_providers = set()
+    if active_provider:
+        detected_providers.add(active_provider)
+    if all_env.get('ANTHROPIC_API_KEY'):
+        detected_providers.add('anthropic')
+    if all_env.get('OPENAI_API_KEY'):
+        detected_providers.add('openai')
+    if all_env.get('OPENROUTER_API_KEY'):
+        detected_providers.add('openrouter')
+    if all_env.get('GOOGLE_API_KEY'):
+        detected_providers.add('google')
+    if all_env.get('GLM_API_KEY'):
+        detected_providers.add('zai')
+    if all_env.get('KIMI_API_KEY'):
+        detected_providers.add('kimi-coding')
+    if all_env.get('MINIMAX_API_KEY') or all_env.get('MINIMAX_CN_API_KEY'):
+        detected_providers.add('minimax')
+    if all_env.get('DEEPSEEK_API_KEY'):
+        detected_providers.add('deepseek')
+
+    # 5. Build model groups
+    if detected_providers:
+        for pid in sorted(detected_providers):
+            provider_name = _PROVIDER_DISPLAY.get(pid, pid.title())
+            if pid == 'openrouter':
+                # OpenRouter uses provider/model format -- show the fallback list
+                groups.append({
+                    'provider': 'OpenRouter',
+                    'models': [{'id': m['id'], 'label': m['label']} for m in _FALLBACK_MODELS],
+                })
+            elif pid in _PROVIDER_MODELS:
+                groups.append({
+                    'provider': provider_name,
+                    'models': _PROVIDER_MODELS[pid],
+                })
+            else:
+                # Unknown provider with key -- add a placeholder with the default model
+                groups.append({
+                    'provider': provider_name,
+                    'models': [{'id': default_model, 'label': default_model.split('/')[-1]}],
+                })
+    else:
+        # No providers detected -- use fallback grouped list
+        by_provider = {}
+        for m in _FALLBACK_MODELS:
+            by_provider.setdefault(m['provider'], []).append(
+                {'id': m['id'], 'label': m['label']}
+            )
+        for provider_name, models in by_provider.items():
+            groups.append({'provider': provider_name, 'models': models})
+
+    return {
+        'active_provider': active_provider,
+        'default_model': default_model,
+        'groups': groups,
+    }
+
+
# ── Static file path ─────────────────────────────────────────────────────────
_INDEX_HTML_PATH = REPO_ROOT / 'static' / 'index.html'
@@ -269,5 +467,52 @@ def _get_session_agent_lock(session_id: str) -> threading.Lock:
        SESSION_AGENT_LOCKS[session_id] = threading.Lock()
    return SESSION_AGENT_LOCKS[session_id]

+# ── Settings persistence ─────────────────────────────────────────────────────
+
+_SETTINGS_DEFAULTS = {
+    'default_model': DEFAULT_MODEL,
+    'default_workspace': str(DEFAULT_WORKSPACE),
+}
+
+def load_settings() -> dict:
+    """Load settings from disk, merging with defaults for any missing keys."""
+    settings = dict(_SETTINGS_DEFAULTS)
+    if SETTINGS_FILE.exists():
+        try:
+            stored = json.loads(SETTINGS_FILE.read_text(encoding='utf-8'))
+            if isinstance(stored, dict):
+                settings.update(stored)
+        except Exception:
+            pass
+    return settings
+
+_SETTINGS_ALLOWED_KEYS = set(_SETTINGS_DEFAULTS.keys())
+
+def save_settings(settings: dict) -> dict:
+    """Save settings to disk. Returns the merged settings. Ignores unknown keys."""
+    current = load_settings()
+    for k, v in settings.items():
+        if k in _SETTINGS_ALLOWED_KEYS:
+            current[k] = v
+    SETTINGS_FILE.write_text(
+        json.dumps(current, ensure_ascii=False, indent=2),
+        encoding='utf-8',
+    )
+    # Update runtime defaults so new sessions use them immediately
+    global DEFAULT_MODEL, DEFAULT_WORKSPACE
+    if 'default_model' in current:
+        DEFAULT_MODEL = current['default_model']
+    if 'default_workspace' in current:
+        DEFAULT_WORKSPACE = Path(current['default_workspace']).expanduser().resolve()
+    return current
+
+# Apply saved settings on startup (override env-derived defaults)
+_startup_settings = load_settings()
+if SETTINGS_FILE.exists():
+    if _startup_settings.get('default_model'):
+        DEFAULT_MODEL = _startup_settings['default_model']
+    if _startup_settings.get('default_workspace'):
+        DEFAULT_WORKSPACE = Path(_startup_settings['default_workspace']).expanduser().resolve()
+
# ── SESSIONS in-memory cache (LRU OrderedDict) ───────────────────────────────
SESSIONS: collections.OrderedDict = collections.OrderedDict()
@@ -1,5 +1,5 @@
"""
-Hermes WebUI -- HTTP helper functions.
+Hermes Web UI -- HTTP helper functions.
"""
import json as _json
from pathlib import Path
@@ -1,5 +1,5 @@
"""
-Hermes WebUI -- Session model and in-memory session store.
+Hermes Web UI -- Session model and in-memory session store.
"""
import collections
import json
@@ -7,6 +7,7 @@ import time
import uuid
from pathlib import Path

+import api.config as _cfg
from api.config import (
    SESSION_DIR, SESSION_INDEX_FILE, SESSIONS, SESSIONS_MAX,
    LOCK, DEFAULT_WORKSPACE, DEFAULT_MODEL
@@ -33,8 +34,8 @@ def _write_session_index():


class Session:
-    def __init__(self, session_id=None, title='Untitled', workspace=str(DEFAULT_WORKSPACE), model=DEFAULT_MODEL, messages=None, created_at=None, updated_at=None, tool_calls=None, **kwargs):
-        self.session_id = session_id or uuid.uuid4().hex[:12]; self.title = title; self.workspace = str(Path(workspace).expanduser().resolve()); self.model = model; self.messages = messages or []; self.tool_calls = tool_calls or []; self.created_at = created_at or time.time(); self.updated_at = updated_at or time.time()
+    def __init__(self, session_id=None, title='Untitled', workspace=str(DEFAULT_WORKSPACE), model=DEFAULT_MODEL, messages=None, created_at=None, updated_at=None, tool_calls=None, pinned=False, archived=False, **kwargs):
+        self.session_id = session_id or uuid.uuid4().hex[:12]; self.title = title; self.workspace = str(Path(workspace).expanduser().resolve()); self.model = model; self.messages = messages or []; self.tool_calls = tool_calls or []; self.created_at = created_at or time.time(); self.updated_at = updated_at or time.time(); self.pinned = bool(pinned); self.archived = bool(archived)
    @property
    def path(self): return SESSION_DIR / f'{self.session_id}.json'
    def save(self): self.updated_at = time.time(); self.path.write_text(json.dumps(self.__dict__, ensure_ascii=False, indent=2), encoding='utf-8'); _write_session_index()
@@ -43,7 +44,7 @@ class Session:
        p = SESSION_DIR / f'{sid}.json'
        if not p.exists(): return None
        return cls(**json.loads(p.read_text(encoding='utf-8')))
-    def compact(self): return {'session_id': self.session_id, 'title': self.title, 'workspace': self.workspace, 'model': self.model, 'message_count': len(self.messages), 'created_at': self.created_at, 'updated_at': self.updated_at}
+    def compact(self): return {'session_id': self.session_id, 'title': self.title, 'workspace': self.workspace, 'model': self.model, 'message_count': len(self.messages), 'created_at': self.created_at, 'updated_at': self.updated_at, 'pinned': self.pinned, 'archived': self.archived}

def get_session(sid):
    with LOCK:
@@ -61,7 +62,8 @@ def get_session(sid):
    raise KeyError(sid)

def new_session(workspace=None, model=None):
-    s = Session(workspace=workspace or get_last_workspace(), model=model or DEFAULT_MODEL)
+    # Use _cfg.DEFAULT_MODEL (not the import-time snapshot) so save_settings() changes take effect
+    s = Session(workspace=workspace or get_last_workspace(), model=model or _cfg.DEFAULT_MODEL)
    with LOCK:
        SESSIONS[s.session_id] = s
        SESSIONS.move_to_end(s.session_id)
@@ -80,7 +82,7 @@ def all_sessions():
    with LOCK:
        for s in SESSIONS.values():
            index_map[s.session_id] = s.compact()
-    result = sorted(index_map.values(), key=lambda s: s['updated_at'], reverse=True)
+    result = sorted(index_map.values(), key=lambda s: (s.get('pinned', False), s['updated_at']), reverse=True)
    # Hide empty Untitled sessions from the UI (created by tests, page refreshes, etc.)
    result = [s for s in result if not (s.get('title','Untitled')=='Untitled' and s.get('message_count',0)==0)]
    return result
@@ -97,7 +99,7 @@ def all_sessions():
            pass
    for s in SESSIONS.values():
        if all(s.session_id != x.session_id for x in out): out.append(s)
-    out.sort(key=lambda s: s.updated_at, reverse=True)
+    out.sort(key=lambda s: (getattr(s, 'pinned', False), s.updated_at), reverse=True)
    return [s.compact() for s in out if not (s.title=='Untitled' and len(s.messages)==0)]
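The pinned-first ordering in the diff above relies on tuple sort keys: `reverse=True` over `(pinned, updated_at)` puts `True` before `False`, and newest-first within each group. A small demonstration with made-up session records:

```python
sessions = [
    {'title': 'old pinned', 'pinned': True,  'updated_at': 100},
    {'title': 'newest',     'pinned': False, 'updated_at': 300},
    {'title': 'new pinned', 'pinned': True,  'updated_at': 200},
]
# Same key as the Session code: pinned sessions first, then most-recent first.
ordered = sorted(sessions,
                 key=lambda s: (s.get('pinned', False), s['updated_at']),
                 reverse=True)
# → 'new pinned', 'old pinned', 'newest'
```

Using `s.get('pinned', False)` keeps the key total even for sessions written before the field existed.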
932
api/routes.py
Normal file
932
api/routes.py
Normal file
@@ -0,0 +1,932 @@
|
|||||||
|
"""
|
||||||
|
Hermes Web UI -- Route handlers for GET and POST endpoints.
|
||||||
|
Extracted from server.py (Sprint 11) so server.py is a thin shell.
|
||||||
|
"""
|
||||||
|
import json
|
||||||
|
import os
|
||||||
|
import queue
|
||||||
|
import sys
|
||||||
|
import threading
|
||||||
|
import time
|
||||||
|
import uuid
|
||||||
|
from pathlib import Path
|
||||||
|
from urllib.parse import parse_qs
|
||||||
|
|
||||||
|
from api.config import (
|
||||||
|
STATE_DIR, SESSION_DIR, DEFAULT_WORKSPACE, DEFAULT_MODEL,
|
||||||
|
SESSIONS, SESSIONS_MAX, LOCK, STREAMS, STREAMS_LOCK, CANCEL_FLAGS,
|
||||||
|
SERVER_START_TIME, CLI_TOOLSETS, _INDEX_HTML_PATH, get_available_models,
|
||||||
|
IMAGE_EXTS, MD_EXTS, MIME_MAP, MAX_FILE_BYTES, MAX_UPLOAD_BYTES,
|
||||||
|
CHAT_LOCK, load_settings, save_settings,
|
||||||
|
)
|
||||||
|
from api.helpers import require, bad, safe_resolve, j, t, read_body
|
||||||
|
from api.models import (
|
||||||
|
Session, get_session, new_session, all_sessions, title_from,
|
||||||
|
_write_session_index, SESSION_INDEX_FILE,
|
||||||
|
)
|
||||||
|
from api.workspace import (
|
||||||
|
load_workspaces, save_workspaces, get_last_workspace, set_last_workspace,
|
||||||
|
list_dir, read_file_content, safe_resolve_ws,
|
||||||
|
)
|
||||||
|
from api.upload import handle_upload
|
||||||
|
from api.streaming import _sse, _run_agent_streaming, cancel_stream
|
||||||
|
|
||||||
|
# Approval system (optional -- graceful fallback if agent not available)
|
||||||
|
try:
|
||||||
|
from tools.approval import (
|
||||||
|
has_pending, pop_pending, submit_pending,
|
||||||
|
approve_session, approve_permanent, save_permanent_allowlist,
|
||||||
|
is_approved, _pending, _lock, _permanent_approved,
|
||||||
|
)
|
||||||
|
except ImportError:
|
||||||
|
has_pending = lambda *a, **k: False
|
||||||
|
pop_pending = lambda *a, **k: None
|
||||||
|
submit_pending = lambda *a, **k: None
|
||||||
|
approve_session = lambda *a, **k: None
|
||||||
|
approve_permanent = lambda *a, **k: None
|
||||||
|
save_permanent_allowlist = lambda *a, **k: None
|
||||||
|
is_approved = lambda *a, **k: True
|
||||||
|
_pending = {}
|
||||||
|
_lock = threading.Lock()
|
||||||
|
_permanent_approved = set()
|
||||||
|
|
||||||
|
|
||||||
|
# ── GET routes ────────────────────────────────────────────────────────────────

def handle_get(handler, parsed):
    """Handle all GET routes. Returns True if handled, False for 404."""

    if parsed.path in ('/', '/index.html'):
        return t(handler, _INDEX_HTML_PATH.read_text(encoding='utf-8'),
                 content_type='text/html; charset=utf-8')

    if parsed.path == '/favicon.ico':
        handler.send_response(204); handler.end_headers(); return True

    if parsed.path == '/health':
        with STREAMS_LOCK: n_streams = len(STREAMS)
        return j(handler, {
            'status': 'ok', 'sessions': len(SESSIONS),
            'active_streams': n_streams,
            'uptime_seconds': round(time.time() - SERVER_START_TIME, 1),
        })

    if parsed.path == '/api/models':
        return j(handler, get_available_models())

    if parsed.path == '/api/settings':
        return j(handler, load_settings())

    if parsed.path.startswith('/static/'):
        return _serve_static(handler, parsed)

    if parsed.path == '/api/session':
        sid = parse_qs(parsed.query).get('session_id', [''])[0]
        if not sid:
            return j(handler, {'error': 'session_id is required'}, status=400)
        s = get_session(sid)
        return j(handler, {'session': s.compact() | {
            'messages': s.messages,
            'tool_calls': getattr(s, 'tool_calls', []),
        }})

    if parsed.path == '/api/sessions':
        return j(handler, {'sessions': all_sessions()})

    if parsed.path == '/api/session/export':
        return _handle_session_export(handler, parsed)

    if parsed.path == '/api/workspaces':
        return j(handler, {'workspaces': load_workspaces(), 'last': get_last_workspace()})

    if parsed.path == '/api/sessions/search':
        return _handle_sessions_search(handler, parsed)

    if parsed.path == '/api/list':
        return _handle_list_dir(handler, parsed)

    if parsed.path == '/api/chat/stream/status':
        stream_id = parse_qs(parsed.query).get('stream_id', [''])[0]
        return j(handler, {'active': stream_id in STREAMS, 'stream_id': stream_id})

    if parsed.path == '/api/chat/cancel':
        stream_id = parse_qs(parsed.query).get('stream_id', [''])[0]
        if not stream_id:
            return bad(handler, 'stream_id required')
        cancelled = cancel_stream(stream_id)
        return j(handler, {'ok': True, 'cancelled': cancelled, 'stream_id': stream_id})

    if parsed.path == '/api/chat/stream':
        return _handle_sse_stream(handler, parsed)

    if parsed.path == '/api/file/raw':
        return _handle_file_raw(handler, parsed)

    if parsed.path == '/api/file':
        return _handle_file_read(handler, parsed)

    if parsed.path == '/api/approval/pending':
        return _handle_approval_pending(handler, parsed)

    if parsed.path == '/api/approval/inject_test':
        return _handle_approval_inject(handler, parsed)

    # ── Cron API (GET) ──
    if parsed.path == '/api/crons':
        sys.path.insert(0, str(Path(__file__).parent.parent))
        from cron.jobs import list_jobs
        return j(handler, {'jobs': list_jobs(include_disabled=True)})

    if parsed.path == '/api/crons/output':
        return _handle_cron_output(handler, parsed)

    if parsed.path == '/api/crons/recent':
        return _handle_cron_recent(handler, parsed)

    # ── Skills API (GET) ──
    if parsed.path == '/api/skills':
        from tools.skills_tool import skills_list as _skills_list
        raw = _skills_list()
        data = json.loads(raw) if isinstance(raw, str) else raw
        return j(handler, {'skills': data.get('skills', [])})

    if parsed.path == '/api/skills/content':
        from tools.skills_tool import skill_view as _skill_view
        name = parse_qs(parsed.query).get('name', [''])[0]
        if not name: return j(handler, {'error': 'name required'}, status=400)
        raw = _skill_view(name)
        data = json.loads(raw) if isinstance(raw, str) else raw
        return j(handler, data)

    # ── Memory API (GET) ──
    if parsed.path == '/api/memory':
        return _handle_memory_read(handler)

    return False  # 404


# ── POST routes ───────────────────────────────────────────────────────────────

def handle_post(handler, parsed):
    """Handle all POST routes. Returns True if handled, False for 404."""

    if parsed.path == '/api/upload':
        return handle_upload(handler)

    body = read_body(handler)

    if parsed.path == '/api/session/new':
        s = new_session(workspace=body.get('workspace'), model=body.get('model'))
        return j(handler, {'session': s.compact() | {'messages': s.messages}})

    if parsed.path == '/api/sessions/cleanup':
        return _handle_sessions_cleanup(handler, body, zero_only=False)

    if parsed.path == '/api/sessions/cleanup_zero_message':
        return _handle_sessions_cleanup(handler, body, zero_only=True)

    if parsed.path == '/api/session/rename':
        try: require(body, 'session_id', 'title')
        except ValueError as e: return bad(handler, str(e))
        try: s = get_session(body['session_id'])
        except KeyError: return bad(handler, 'Session not found', 404)
        s.title = str(body['title']).strip()[:80] or 'Untitled'
        s.save()
        return j(handler, {'session': s.compact()})

    if parsed.path == '/api/session/update':
        try: require(body, 'session_id')
        except ValueError as e: return bad(handler, str(e))
        try: s = get_session(body['session_id'])
        except KeyError: return bad(handler, 'Session not found', 404)
        new_ws = str(Path(body.get('workspace', s.workspace)).expanduser().resolve())
        s.workspace = new_ws; s.model = body.get('model', s.model); s.save()
        set_last_workspace(new_ws)
        return j(handler, {'session': s.compact() | {'messages': s.messages}})

    if parsed.path == '/api/session/delete':
        sid = body.get('session_id', '')
        if not sid: return bad(handler, 'session_id is required')
        with LOCK: SESSIONS.pop(sid, None)
        p = SESSION_DIR / f'{sid}.json'
        try: p.unlink(missing_ok=True)
        except Exception: pass
        try: SESSION_INDEX_FILE.unlink(missing_ok=True)
        except Exception: pass
        return j(handler, {'ok': True})

    if parsed.path == '/api/session/clear':
        try: require(body, 'session_id')
        except ValueError as e: return bad(handler, str(e))
        try: s = get_session(body['session_id'])
        except KeyError: return bad(handler, 'Session not found', 404)
        s.messages = []; s.tool_calls = []; s.title = 'Untitled'; s.save()
        return j(handler, {'ok': True, 'session': s.compact()})

    if parsed.path == '/api/session/truncate':
        try: require(body, 'session_id')
        except ValueError as e: return bad(handler, str(e))
        if body.get('keep_count') is None:
            return bad(handler, 'Missing required field(s): keep_count')
        try: s = get_session(body['session_id'])
        except KeyError: return bad(handler, 'Session not found', 404)
        keep = int(body['keep_count'])
        s.messages = s.messages[:keep]; s.save()
        return j(handler, {'ok': True, 'session': s.compact() | {'messages': s.messages}})

    if parsed.path == '/api/chat/start':
        return _handle_chat_start(handler, body)

    if parsed.path == '/api/chat':
        return _handle_chat_sync(handler, body)

    # ── Cron API (POST) ──
    if parsed.path == '/api/crons/create':
        return _handle_cron_create(handler, body)

    if parsed.path == '/api/crons/update':
        return _handle_cron_update(handler, body)

    if parsed.path == '/api/crons/delete':
        return _handle_cron_delete(handler, body)

    if parsed.path == '/api/crons/run':
        return _handle_cron_run(handler, body)

    if parsed.path == '/api/crons/pause':
        return _handle_cron_pause(handler, body)

    if parsed.path == '/api/crons/resume':
        return _handle_cron_resume(handler, body)

    # ── File ops (POST) ──
    if parsed.path == '/api/file/delete':
        return _handle_file_delete(handler, body)

    if parsed.path == '/api/file/save':
        return _handle_file_save(handler, body)

    if parsed.path == '/api/file/create':
        return _handle_file_create(handler, body)

    if parsed.path == '/api/file/rename':
        return _handle_file_rename(handler, body)

    if parsed.path == '/api/file/create-dir':
        return _handle_create_dir(handler, body)

    # ── Workspace management (POST) ──
    if parsed.path == '/api/workspaces/add':
        return _handle_workspace_add(handler, body)

    if parsed.path == '/api/workspaces/remove':
        return _handle_workspace_remove(handler, body)

    if parsed.path == '/api/workspaces/rename':
        return _handle_workspace_rename(handler, body)

    # ── Approval (POST) ──
    if parsed.path == '/api/approval/respond':
        return _handle_approval_respond(handler, body)

    # ── Skills (POST) ──
    if parsed.path == '/api/skills/save':
        return _handle_skill_save(handler, body)

    if parsed.path == '/api/skills/delete':
        return _handle_skill_delete(handler, body)

    # ── Memory (POST) ──
    if parsed.path == '/api/memory/write':
        return _handle_memory_write(handler, body)

    # ── Settings (POST) ──
    if parsed.path == '/api/settings':
        return j(handler, save_settings(body))

    # ── Session pin (POST) ──
    if parsed.path == '/api/session/pin':
        try: require(body, 'session_id')
        except ValueError as e: return bad(handler, str(e))
        try: s = get_session(body['session_id'])
        except KeyError: return bad(handler, 'Session not found', 404)
        s.pinned = bool(body.get('pinned', True))
        s.save()
        return j(handler, {'ok': True, 'session': s.compact()})

    # ── Session archive (POST) ──
    if parsed.path == '/api/session/archive':
        try: require(body, 'session_id')
        except ValueError as e: return bad(handler, str(e))
        try: s = get_session(body['session_id'])
        except KeyError: return bad(handler, 'Session not found', 404)
        s.archived = bool(body.get('archived', True))
        s.save()
        return j(handler, {'ok': True, 'session': s.compact()})

    # ── Session import from JSON (POST) ──
    if parsed.path == '/api/session/import':
        return _handle_session_import(handler, body)

    return False  # 404


# ── GET route helpers ─────────────────────────────────────────────────────────

def _serve_static(handler, parsed):
    base = (Path(__file__).parent.parent).resolve()
    static_file = (base / parsed.path.lstrip('/')).resolve()
    # Reject paths that escape the project root (e.g. /static/../../etc/passwd).
    if base not in static_file.parents:
        return j(handler, {'error': 'not found'}, status=404)
    if not static_file.exists() or not static_file.is_file():
        return j(handler, {'error': 'not found'}, status=404)
    ext = static_file.suffix.lower()
    ct = {'css': 'text/css', 'js': 'application/javascript',
          'html': 'text/html'}.get(ext.lstrip('.'), 'text/plain')
    handler.send_response(200)
    handler.send_header('Content-Type', f'{ct}; charset=utf-8')
    handler.send_header('Cache-Control', 'no-store')
    raw = static_file.read_bytes()
    handler.send_header('Content-Length', str(len(raw)))
    handler.end_headers()
    handler.wfile.write(raw)
    return True


def _handle_session_export(handler, parsed):
    sid = parse_qs(parsed.query).get('session_id', [''])[0]
    if not sid: return bad(handler, 'session_id is required')
    try: s = get_session(sid)
    except KeyError: return bad(handler, 'Session not found', 404)
    payload = json.dumps(s.__dict__, ensure_ascii=False, indent=2)
    handler.send_response(200)
    handler.send_header('Content-Type', 'application/json; charset=utf-8')
    handler.send_header('Content-Disposition', f'attachment; filename="hermes-{sid}.json"')
    handler.send_header('Content-Length', str(len(payload.encode('utf-8'))))
    handler.send_header('Cache-Control', 'no-store')
    handler.end_headers()
    handler.wfile.write(payload.encode('utf-8'))
    return True


def _handle_sessions_search(handler, parsed):
    qs = parse_qs(parsed.query)
    q = qs.get('q', [''])[0].lower().strip()
    content_search = qs.get('content', ['1'])[0] == '1'
    depth = int(qs.get('depth', ['5'])[0])
    if not q: return j(handler, {'sessions': all_sessions()})
    results = []
    for s in all_sessions():
        title_match = q in (s.get('title') or '').lower()
        if title_match:
            results.append(dict(s, match_type='title'))
            continue
        if content_search:
            try:
                sess = get_session(s['session_id'])
                msgs = sess.messages[:depth] if depth else sess.messages
                for m in msgs:
                    c = m.get('content') or ''
                    if isinstance(c, list):
                        c = ' '.join(p.get('text', '') for p in c
                                     if isinstance(p, dict) and p.get('type') == 'text')
                    if q in str(c).lower():
                        results.append(dict(s, match_type='content'))
                        break
            except Exception:
                # Skip sessions that fail to load.
                pass
    return j(handler, {'sessions': results, 'query': q, 'count': len(results)})


def _handle_list_dir(handler, parsed):
    qs = parse_qs(parsed.query)
    sid = qs.get('session_id', [''])[0]
    if not sid: return bad(handler, 'session_id is required')
    try: s = get_session(sid)
    except KeyError: return bad(handler, 'Session not found', 404)
    try:
        return j(handler, {
            'entries': list_dir(Path(s.workspace), qs.get('path', ['.'])[0]),
            'path': qs.get('path', ['.'])[0],
        })
    except (FileNotFoundError, ValueError) as e:
        return bad(handler, str(e), 404)


def _handle_sse_stream(handler, parsed):
    stream_id = parse_qs(parsed.query).get('stream_id', [''])[0]
    q = STREAMS.get(stream_id)
    if q is None: return j(handler, {'error': 'stream not found'}, status=404)
    handler.send_response(200)
    handler.send_header('Content-Type', 'text/event-stream; charset=utf-8')
    handler.send_header('Cache-Control', 'no-cache')
    handler.send_header('X-Accel-Buffering', 'no')
    handler.send_header('Connection', 'keep-alive')
    handler.end_headers()
    try:
        while True:
            try:
                event, data = q.get(timeout=30)
            except queue.Empty:
                handler.wfile.write(b': heartbeat\n\n')
                handler.wfile.flush()
                continue
            _sse(handler, event, data)
            if event in ('done', 'error', 'cancel'):
                break
    except (BrokenPipeError, ConnectionResetError):
        pass
    return True


def _handle_file_raw(handler, parsed):
    qs = parse_qs(parsed.query)
    sid = qs.get('session_id', [''])[0]
    if not sid: return bad(handler, 'session_id is required')
    try: s = get_session(sid)
    except KeyError: return bad(handler, 'Session not found', 404)
    rel = qs.get('path', [''])[0]
    force_download = qs.get('download', [''])[0] == '1'
    target = safe_resolve(Path(s.workspace), rel)
    if not target.exists() or not target.is_file():
        return j(handler, {'error': 'not found'}, status=404)
    ext = target.suffix.lower()
    mime = MIME_MAP.get(ext, 'application/octet-stream')
    raw_bytes = target.read_bytes()
    import urllib.parse as _up
    safe_name = _up.quote(target.name, safe='')
    handler.send_response(200)
    handler.send_header('Content-Type', mime)
    handler.send_header('Content-Length', str(len(raw_bytes)))
    handler.send_header('Cache-Control', 'no-store')
    if force_download:
        handler.send_header('Content-Disposition',
                            f'attachment; filename="{target.name}"; filename*=UTF-8\'\'{safe_name}')
    handler.end_headers()
    handler.wfile.write(raw_bytes)
    return True


def _handle_file_read(handler, parsed):
    qs = parse_qs(parsed.query)
    sid = qs.get('session_id', [''])[0]
    if not sid: return bad(handler, 'session_id is required')
    try: s = get_session(sid)
    except KeyError: return bad(handler, 'Session not found', 404)
    rel = qs.get('path', [''])[0]
    if not rel: return bad(handler, 'path is required')
    try: return j(handler, read_file_content(Path(s.workspace), rel))
    except (FileNotFoundError, ValueError) as e: return bad(handler, str(e), 404)


def _handle_approval_pending(handler, parsed):
    sid = parse_qs(parsed.query).get('session_id', [''])[0]
    if has_pending(sid):
        with _lock:
            p = dict(_pending.get(sid, {}))
        return j(handler, {'pending': p})
    return j(handler, {'pending': None})


def _handle_approval_inject(handler, parsed):
    qs = parse_qs(parsed.query)
    sid = qs.get('session_id', [''])[0]
    key = qs.get('pattern_key', ['test_pattern'])[0]
    cmd = qs.get('command', ['rm -rf /tmp/test'])[0]
    if sid:
        submit_pending(sid, {
            'command': cmd, 'pattern_key': key,
            'pattern_keys': [key], 'description': 'test pattern',
        })
        return j(handler, {'ok': True, 'session_id': sid})
    return j(handler, {'error': 'session_id required'}, status=400)


def _handle_cron_output(handler, parsed):
    from cron.jobs import OUTPUT_DIR as CRON_OUT
    qs = parse_qs(parsed.query)
    job_id = qs.get('job_id', [''])[0]
    limit = int(qs.get('limit', ['5'])[0])
    if not job_id: return j(handler, {'error': 'job_id required'}, status=400)
    out_dir = CRON_OUT / job_id
    outputs = []
    if out_dir.exists():
        files = sorted(out_dir.glob('*.md'), reverse=True)[:limit]
        for f in files:
            try:
                txt = f.read_text(encoding='utf-8', errors='replace')
                outputs.append({'filename': f.name, 'content': txt[:8000]})
            except Exception:
                pass
    return j(handler, {'job_id': job_id, 'outputs': outputs})


def _handle_cron_recent(handler, parsed):
    """Return cron jobs that have completed since a given timestamp."""
    import datetime
    qs = parse_qs(parsed.query)
    since = float(qs.get('since', ['0'])[0])
    try:
        sys.path.insert(0, str(Path(__file__).parent.parent))
        from cron.jobs import list_jobs
        jobs = list_jobs(include_disabled=True)
        completions = []
        for job in jobs:
            last_run = job.get('last_run_at')
            if not last_run:
                continue
            if isinstance(last_run, str):
                try:
                    ts = datetime.datetime.fromisoformat(last_run.replace('Z', '+00:00')).timestamp()
                except (ValueError, TypeError):
                    continue
            else:
                ts = float(last_run)
            if ts > since:
                completions.append({
                    'job_id': job.get('id', ''),
                    'name': job.get('name', 'Unknown'),
                    'status': job.get('last_status', 'unknown'),
                    'completed_at': ts,
                })
        return j(handler, {'completions': completions, 'since': since})
    except ImportError:
        return j(handler, {'completions': [], 'since': since})


def _handle_memory_read(handler):
    mem_dir = Path.home() / '.hermes' / 'memories'
    mem_file = mem_dir / 'MEMORY.md'
    user_file = mem_dir / 'USER.md'
    memory = mem_file.read_text(encoding='utf-8', errors='replace') if mem_file.exists() else ''
    user = user_file.read_text(encoding='utf-8', errors='replace') if user_file.exists() else ''
    return j(handler, {
        'memory': memory, 'user': user,
        'memory_path': str(mem_file), 'user_path': str(user_file),
        'memory_mtime': mem_file.stat().st_mtime if mem_file.exists() else None,
        'user_mtime': user_file.stat().st_mtime if user_file.exists() else None,
    })


# ── POST route helpers ────────────────────────────────────────────────────────

def _handle_sessions_cleanup(handler, body, zero_only=False):
    cleaned = 0
    for p in SESSION_DIR.glob('*.json'):
        if p.name.startswith('_'): continue
        try:
            s = Session.load(p.stem)
            if zero_only:
                should_delete = s and len(s.messages) == 0
            else:
                should_delete = s and s.title == 'Untitled' and len(s.messages) == 0
            if should_delete:
                with LOCK: SESSIONS.pop(p.stem, None)
                p.unlink(missing_ok=True)
                cleaned += 1
        except Exception:
            pass
    if SESSION_INDEX_FILE.exists():
        SESSION_INDEX_FILE.unlink(missing_ok=True)
    return j(handler, {'ok': True, 'cleaned': cleaned})


def _handle_chat_start(handler, body):
    try: require(body, 'session_id')
    except ValueError as e: return bad(handler, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(handler, 'Session not found', 404)
    msg = str(body.get('message', '')).strip()
    if not msg: return bad(handler, 'message is required')
    attachments = [str(a) for a in (body.get('attachments') or [])][:20]
    workspace = str(Path(body.get('workspace') or s.workspace).expanduser().resolve())
    model = body.get('model') or s.model
    s.workspace = workspace; s.model = model; s.save()
    set_last_workspace(workspace)
    stream_id = uuid.uuid4().hex
    q = queue.Queue()
    with STREAMS_LOCK: STREAMS[stream_id] = q
    thr = threading.Thread(
        target=_run_agent_streaming,
        args=(s.session_id, msg, model, workspace, stream_id, attachments),
        daemon=True,
    )
    thr.start()
    return j(handler, {'stream_id': stream_id, 'session_id': s.session_id})


def _handle_chat_sync(handler, body):
    """Fallback synchronous chat endpoint (POST /api/chat). Not used by the frontend."""
    from api.config import _get_session_agent_lock
    try: require(body, 'session_id')
    except ValueError as e: return bad(handler, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(handler, 'Session not found', 404)
    msg = str(body.get('message', '')).strip()
    if not msg: return j(handler, {'error': 'empty message'}, status=400)
    workspace = Path(body.get('workspace') or s.workspace).expanduser().resolve()
    s.workspace = str(workspace); s.model = body.get('model') or s.model
    old_cwd = os.environ.get('TERMINAL_CWD')
    os.environ['TERMINAL_CWD'] = str(workspace)
    old_exec_ask = os.environ.get('HERMES_EXEC_ASK')
    old_session_key = os.environ.get('HERMES_SESSION_KEY')
    os.environ['HERMES_EXEC_ASK'] = '1'
    os.environ['HERMES_SESSION_KEY'] = s.session_id
    try:
        from run_agent import AIAgent
        with CHAT_LOCK:
            agent = AIAgent(model=s.model, platform='cli', quiet_mode=True,
                            enabled_toolsets=CLI_TOOLSETS, session_id=s.session_id)
            workspace_ctx = f"[Workspace: {s.workspace}]\n"
            workspace_system_msg = (
                f"Active workspace at session start: {s.workspace}\n"
                "Every user message is prefixed with [Workspace: /absolute/path] indicating the "
                "workspace the user has selected in the web UI at the time they sent that message. "
                "This tag is the single authoritative source of the active workspace and updates "
                "with every message. It overrides any prior workspace mentioned in this system "
                "prompt, memory, or conversation history. Always use the value from the most recent "
                "[Workspace: ...] tag as your default working directory for ALL file operations: "
                "write_file, read_file, search_files, terminal workdir, and patch. "
                "Never fall back to a hardcoded path when this tag is present."
            )
            result = agent.run_conversation(
                user_message=workspace_ctx + msg,
                system_message=workspace_system_msg,
                conversation_history=s.messages,
                task_id=s.session_id,
                persist_user_message=msg,
            )
    finally:
        if old_cwd is None: os.environ.pop('TERMINAL_CWD', None)
        else: os.environ['TERMINAL_CWD'] = old_cwd
        if old_exec_ask is None: os.environ.pop('HERMES_EXEC_ASK', None)
        else: os.environ['HERMES_EXEC_ASK'] = old_exec_ask
        if old_session_key is None: os.environ.pop('HERMES_SESSION_KEY', None)
        else: os.environ['HERMES_SESSION_KEY'] = old_session_key
    s.messages = result.get('messages') or s.messages
    s.title = title_from(s.messages, s.title); s.save()
    return j(handler, {
        'answer': result.get('final_response') or '',
        'status': 'done' if result.get('completed', True) else 'partial',
        'session': s.compact() | {'messages': s.messages},
        'result': {k: v for k, v in result.items() if k != 'messages'},
    })


def _handle_cron_create(handler, body):
    try: require(body, 'prompt', 'schedule')
    except ValueError as e: return bad(handler, str(e))
    try:
        from cron.jobs import create_job
        job = create_job(
            prompt=body['prompt'], schedule=body['schedule'],
            name=body.get('name') or None, deliver=body.get('deliver') or 'local',
            skills=body.get('skills') or [], model=body.get('model') or None,
        )
        return j(handler, {'ok': True, 'job': job})
    except Exception as e:
        return j(handler, {'error': str(e)}, status=400)


def _handle_cron_update(handler, body):
    try: require(body, 'job_id')
    except ValueError as e: return bad(handler, str(e))
    from cron.jobs import update_job
    updates = {k: v for k, v in body.items() if k != 'job_id' and v is not None}
    job = update_job(body['job_id'], updates)
    if not job: return bad(handler, 'Job not found', 404)
    return j(handler, {'ok': True, 'job': job})


def _handle_cron_delete(handler, body):
    try: require(body, 'job_id')
    except ValueError as e: return bad(handler, str(e))
    from cron.jobs import remove_job
    ok = remove_job(body['job_id'])
    if not ok: return bad(handler, 'Job not found', 404)
    return j(handler, {'ok': True, 'job_id': body['job_id']})


def _handle_cron_run(handler, body):
    job_id = body.get('job_id', '')
    if not job_id: return bad(handler, 'job_id required')
    from cron.jobs import get_job
    from cron.scheduler import run_job
    job = get_job(job_id)
    if not job: return bad(handler, 'Job not found', 404)
    threading.Thread(target=run_job, args=(job,), daemon=True).start()
    return j(handler, {'ok': True, 'job_id': job_id, 'status': 'triggered'})


def _handle_cron_pause(handler, body):
    job_id = body.get('job_id', '')
    if not job_id: return bad(handler, 'job_id required')
    from cron.jobs import pause_job
    result = pause_job(job_id, reason=body.get('reason'))
    if result: return j(handler, {'ok': True, 'job': result})
    return bad(handler, 'Job not found', 404)


def _handle_cron_resume(handler, body):
    job_id = body.get('job_id', '')
    if not job_id: return bad(handler, 'job_id required')
    from cron.jobs import resume_job
    result = resume_job(job_id)
    if result: return j(handler, {'ok': True, 'job': result})
    return bad(handler, 'Job not found', 404)


def _handle_file_delete(handler, body):
|
||||||
|
try: require(body, 'session_id', 'path')
|
||||||
|
except ValueError as e: return bad(handler, str(e))
|
||||||
|
try: s = get_session(body['session_id'])
|
||||||
|
except KeyError: return bad(handler, 'Session not found', 404)
|
||||||
|
try:
|
||||||
|
target = safe_resolve(Path(s.workspace), body['path'])
|
||||||
|
if not target.exists(): return bad(handler, 'File not found', 404)
|
||||||
|
if target.is_dir(): return bad(handler, 'Cannot delete directories via this endpoint')
|
||||||
|
target.unlink()
|
||||||
|
return j(handler, {'ok': True, 'path': body['path']})
|
||||||
|
except (ValueError, PermissionError) as e: return bad(handler, str(e))
|
||||||
|
|
||||||
|
|
||||||
|
def _handle_file_save(handler, body):
|
||||||
|
try: require(body, 'session_id', 'path')
|
||||||
|
except ValueError as e: return bad(handler, str(e))
|
||||||
|
try: s = get_session(body['session_id'])
|
||||||
|
except KeyError: return bad(handler, 'Session not found', 404)
|
||||||
|
try:
|
||||||
|
target = safe_resolve(Path(s.workspace), body['path'])
|
||||||
|
if not target.exists(): return bad(handler, 'File not found', 404)
|
||||||
|
if target.is_dir(): return bad(handler, 'Cannot save: path is a directory')
|
||||||
|
target.write_text(body.get('content', ''), encoding='utf-8')
|
||||||
|
return j(handler, {'ok': True, 'path': body['path'], 'size': target.stat().st_size})
|
||||||
|
except (ValueError, PermissionError) as e: return bad(handler, str(e))
|
||||||
|
|
||||||
|
|
||||||
|
def _handle_file_create(handler, body):
    try: require(body, 'session_id', 'path')
    except ValueError as e: return bad(handler, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(handler, 'Session not found', 404)
    try:
        target = safe_resolve(Path(s.workspace), body['path'])
        if target.exists(): return bad(handler, 'File already exists')
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_text(body.get('content', ''), encoding='utf-8')
        return j(handler, {'ok': True, 'path': str(target.relative_to(Path(s.workspace)))})
    except (ValueError, PermissionError) as e: return bad(handler, str(e))

def _handle_file_rename(handler, body):
    try: require(body, 'session_id', 'path', 'new_name')
    except ValueError as e: return bad(handler, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(handler, 'Session not found', 404)
    try:
        source = safe_resolve(Path(s.workspace), body['path'])
        if not source.exists(): return bad(handler, 'File not found', 404)
        new_name = body['new_name'].strip()
        if not new_name or '/' in new_name or '..' in new_name:
            return bad(handler, 'Invalid file name')
        dest = source.parent / new_name
        if dest.exists(): return bad(handler, f'A file named "{new_name}" already exists')
        source.rename(dest)
        new_rel = str(dest.relative_to(Path(s.workspace)))
        return j(handler, {'ok': True, 'old_path': body['path'], 'new_path': new_rel})
    except (ValueError, PermissionError, OSError) as e: return bad(handler, str(e))

def _handle_create_dir(handler, body):
    try: require(body, 'session_id', 'path')
    except ValueError as e: return bad(handler, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(handler, 'Session not found', 404)
    try:
        target = safe_resolve(Path(s.workspace), body['path'])
        if target.exists(): return bad(handler, 'Path already exists')
        target.mkdir(parents=True)
        return j(handler, {'ok': True, 'path': str(target.relative_to(Path(s.workspace)))})
    except (ValueError, PermissionError, OSError) as e: return bad(handler, str(e))

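All of the file handlers above funnel user-supplied paths through `safe_resolve` before touching disk. Its implementation lives elsewhere (api/helpers.py) and is not shown in this diff; the following is only a minimal sketch of the containment check it presumably performs — the function body here is an assumption, not the shipped code.

```python
from pathlib import Path

def safe_resolve_sketch(workspace: Path, rel: str) -> Path:
    """Resolve rel against workspace; reject anything that escapes it.

    Hypothetical stand-in for api.helpers.safe_resolve -- illustration only.
    """
    target = (workspace / rel).resolve()
    # is_relative_to requires Python 3.9+
    if not target.is_relative_to(workspace.resolve()):
        raise ValueError(f'Path escapes workspace: {rel}')
    return target
```

Raising ValueError fits the `except (ValueError, PermissionError)` arms in the handlers, which translate it into a 400 response.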
def _handle_workspace_add(handler, body):
    path_str = body.get('path', '').strip()
    name = body.get('name', '').strip()
    if not path_str: return bad(handler, 'path is required')
    p = Path(path_str).expanduser().resolve()
    if not p.exists(): return bad(handler, f'Path does not exist: {p}')
    if not p.is_dir(): return bad(handler, f'Path is not a directory: {p}')
    wss = load_workspaces()
    if any(w['path'] == str(p) for w in wss):
        return bad(handler, 'Workspace already in list')
    wss.append({'path': str(p), 'name': name or p.name})
    save_workspaces(wss)
    return j(handler, {'ok': True, 'workspaces': wss})

def _handle_workspace_remove(handler, body):
    path_str = body.get('path', '').strip()
    if not path_str: return bad(handler, 'path is required')
    wss = load_workspaces()
    wss = [w for w in wss if w['path'] != path_str]
    save_workspaces(wss)
    return j(handler, {'ok': True, 'workspaces': wss})

def _handle_workspace_rename(handler, body):
    path_str = body.get('path', '').strip()
    name = body.get('name', '').strip()
    if not path_str or not name: return bad(handler, 'path and name are required')
    wss = load_workspaces()
    for w in wss:
        if w['path'] == path_str:
            w['name'] = name; break
    else:
        return bad(handler, 'Workspace not found', 404)
    save_workspaces(wss)
    return j(handler, {'ok': True, 'workspaces': wss})

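The three workspace handlers above share one pattern: load the JSON-backed list, mutate it, save it back. A self-contained sketch of that round-trip, with the on-disk shape inferred from the handlers (`load_workspaces`/`save_workspaces` in api/workspace.py are the real implementations; the helpers below are illustrative only):

```python
import json
import tempfile
from pathlib import Path

def load_ws(path: Path) -> list:
    # Inferred shape: a JSON array of {'path': ..., 'name': ...} dicts.
    if not path.exists():
        return []
    return json.loads(path.read_text(encoding='utf-8'))

def save_ws(path: Path, wss: list) -> None:
    path.write_text(json.dumps(wss, indent=2), encoding='utf-8')

# Round-trip: missing file reads as [], append, persist.
store = Path(tempfile.mkdtemp()) / 'workspaces.json'
wss = load_ws(store)
wss.append({'path': '/tmp/demo', 'name': 'demo'})
save_ws(store, wss)
```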
def _handle_approval_respond(handler, body):
    sid = body.get('session_id', '')
    if not sid: return bad(handler, 'session_id is required')
    choice = body.get('choice', 'deny')
    if choice not in ('once', 'session', 'always', 'deny'):
        return bad(handler, f'Invalid choice: {choice}')
    with _lock:
        pending = _pending.pop(sid, None)
    if pending:
        keys = pending.get('pattern_keys') or [pending.get('pattern_key', '')]
        if choice in ('once', 'session'):
            for k in keys: approve_session(sid, k)
        elif choice == 'always':
            for k in keys:
                approve_session(sid, k); approve_permanent(k)
            save_permanent_allowlist(_permanent_approved)
    return j(handler, {'ok': True, 'choice': choice})

def _handle_skill_save(handler, body):
    try: require(body, 'name', 'content')
    except ValueError as e: return bad(handler, str(e))
    skill_name = body['name'].strip().lower().replace(' ', '-')
    if not skill_name or '/' in skill_name or '..' in skill_name:
        return bad(handler, 'Invalid skill name')
    category = body.get('category', '').strip()
    from tools.skills_tool import SKILLS_DIR
    if category:
        skill_dir = SKILLS_DIR / category / skill_name
    else:
        skill_dir = SKILLS_DIR / skill_name
    skill_dir.mkdir(parents=True, exist_ok=True)
    skill_file = skill_dir / 'SKILL.md'
    skill_file.write_text(body['content'], encoding='utf-8')
    return j(handler, {'ok': True, 'name': skill_name, 'path': str(skill_file)})

def _handle_skill_delete(handler, body):
    try: require(body, 'name')
    except ValueError as e: return bad(handler, str(e))
    from tools.skills_tool import SKILLS_DIR
    import shutil
    matches = list(SKILLS_DIR.rglob(f'{body["name"]}/SKILL.md'))
    if not matches: return bad(handler, 'Skill not found', 404)
    skill_dir = matches[0].parent
    shutil.rmtree(str(skill_dir))
    return j(handler, {'ok': True, 'name': body['name']})

def _handle_memory_write(handler, body):
    try: require(body, 'section', 'content')
    except ValueError as e: return bad(handler, str(e))
    mem_dir = Path.home() / '.hermes' / 'memories'
    mem_dir.mkdir(parents=True, exist_ok=True)
    section = body['section']
    if section == 'memory':
        target = mem_dir / 'MEMORY.md'
    elif section == 'user':
        target = mem_dir / 'USER.md'
    else:
        return bad(handler, 'section must be "memory" or "user"')
    target.write_text(body['content'], encoding='utf-8')
    return j(handler, {'ok': True, 'section': section, 'path': str(target)})

def _handle_session_import(handler, body):
    """Import a session from a JSON export. Creates a new session with a new ID."""
    if not body or not isinstance(body, dict):
        return bad(handler, 'Request body must be a JSON object')
    messages = body.get('messages')
    if not isinstance(messages, list):
        return bad(handler, 'JSON must contain a "messages" array')
    title = body.get('title', 'Imported session')
    workspace = body.get('workspace', str(DEFAULT_WORKSPACE))
    model = body.get('model', DEFAULT_MODEL)
    s = Session(
        title=title, workspace=workspace, model=model,
        messages=messages,
        tool_calls=body.get('tool_calls', []),
    )
    s.pinned = body.get('pinned', False)
    with LOCK:
        SESSIONS[s.session_id] = s
        SESSIONS.move_to_end(s.session_id)
        while len(SESSIONS) > SESSIONS_MAX:
            SESSIONS.popitem(last=False)
    s.save()
    return j(handler, {'ok': True, 'session': s.compact() | {'messages': s.messages}})
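`_handle_session_import` above accepts a JSON export and only hard-requires the "messages" array; everything else falls back to defaults. A sketch of the payload shape and the handler's validation order (the field names are read off the handler; the model string and workspace path are illustrative):

```python
import json

# Shape of a /api/session/import payload, inferred from the handler above.
payload = {
    'title': 'Imported session',
    'workspace': '/tmp/ws',          # illustrative path
    'model': 'example-model',        # illustrative model name
    'messages': [{'role': 'user', 'content': 'hello'}],
    'tool_calls': [],
    'pinned': False,
}

def validate_import(body):
    """Mirror the handler's checks; returns an error string or None."""
    if not body or not isinstance(body, dict):
        return 'Request body must be a JSON object'
    if not isinstance(body.get('messages'), list):
        return 'JSON must contain a "messages" array'
    return None
```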
@@ -1,5 +1,5 @@
 """
-Hermes WebUI -- SSE streaming engine and agent thread runner.
+Hermes Web UI -- SSE streaming engine and agent thread runner.
 Includes Sprint 10 cancel support via CANCEL_FLAGS.
 """
 import json

@@ -1,5 +1,5 @@
 """
-Hermes WebUI -- File upload: multipart parser and upload handler.
+Hermes Web UI -- File upload: multipart parser and upload handler.
 """
 import re as _re
 import email.parser

@@ -1,5 +1,5 @@
 """
-Hermes WebUI -- Workspace and file system helpers.
+Hermes Web UI -- Workspace and file system helpers.
 """
 import json
 import os
668 server.py
@@ -1,62 +1,24 @@
 """
-Hermes WebUI -- Main server entry point.
+Hermes Web UI -- Main server entry point.
-HTTP Handler (routing) + startup. All business logic lives in api/*.
+Thin routing shell: imports Handler, delegates to api/routes.py, runs server.
+All business logic lives in api/*.
 """
-import json
-import os
-import queue
-import sys
-import threading
 import time
 import traceback
-import uuid
 from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
-from pathlib import Path
+from urllib.parse import urlparse
-from urllib.parse import parse_qs, urlparse
 
-# ── API modules ───────────────────────────────────────────────────────────────
+from api.config import HOST, PORT, STATE_DIR, SESSION_DIR, DEFAULT_WORKSPACE
-from api.config import (
+from api.helpers import j
-    HOST, PORT, STATE_DIR, SESSION_DIR, DEFAULT_WORKSPACE, DEFAULT_MODEL,
+from api.routes import handle_get, handle_post
-    SESSIONS, SESSIONS_MAX, LOCK, STREAMS, STREAMS_LOCK, CANCEL_FLAGS,
-    SERVER_START_TIME, CLI_TOOLSETS, _INDEX_HTML_PATH,
-    IMAGE_EXTS, MD_EXTS, MIME_MAP, MAX_FILE_BYTES, MAX_UPLOAD_BYTES,
-    _get_session_agent_lock, SESSION_AGENT_LOCKS, SESSION_AGENT_LOCKS_LOCK,
-)
-from api.helpers import require, bad, safe_resolve, j, t, read_body
-from api.models import (
-    Session, get_session, new_session, all_sessions, title_from,
-    _write_session_index, SESSION_INDEX_FILE,
-)
-from api.workspace import (
-    load_workspaces, save_workspaces, get_last_workspace, set_last_workspace,
-    list_dir, read_file_content, safe_resolve_ws,
-)
-from api.upload import parse_multipart, handle_upload
-from api.streaming import _sse, _run_agent_streaming, cancel_stream
 
-# Approval system
-try:
-    from tools.approval import (
-        has_pending, pop_pending, submit_pending,
-        approve_session, approve_permanent, save_permanent_allowlist,
-        is_approved,
-    )
-except ImportError:
-    def has_pending(*a, **k): return False
-    def pop_pending(*a, **k): return None
-    def submit_pending(*a, **k): pass
-    def approve_session(*a, **k): pass
-    def approve_permanent(*a, **k): pass
-    def save_permanent_allowlist(*a, **k): pass
-    def is_approved(*a, **k): return True
 
 
 class Handler(BaseHTTPRequestHandler):
-    server_version = 'HermesWebUI/0.1.0'
+    server_version = 'HermesWebUI/0.2'
     def log_message(self, fmt, *args): pass  # suppress default Apache-style log
 
     def log_request(self, code='-', size='-'):
-        """Override BaseHTTPRequestHandler.log_request to emit structured JSON logs."""
+        """Structured JSON logs for each request."""
         import json as _json
         duration_ms = round((time.time() - getattr(self, '_req_t0', time.time())) * 1000, 1)
         record = _json.dumps({
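`log_request` above serializes one JSON record per request and prefixes it with `[webui]`; the full field dict is truncated by the hunk boundary. A sketch of what one such log line looks like — the field names below are illustrative, not the shipped record:

```python
import json
import time

t0 = time.time()
# ... request handling would run here ...
duration_ms = round((time.time() - t0) * 1000, 1)
record = json.dumps({
    'method': 'GET',        # illustrative fields; the real dict is truncated above
    'path': '/health',
    'status': 200,
    'duration_ms': duration_ms,
})
line = f'[webui] {record}'
```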
@@ -68,624 +30,34 @@ class Handler(BaseHTTPRequestHandler):
         })
         print(f'[webui] {record}', flush=True)
 
-    def _log_request(self, method, path, status, duration_ms):
-        pass  # kept for backward compat with error path calls; log_request handles it now
     def do_GET(self):
-        _t0 = time.time()
+        self._req_t0 = time.time()
-        self._req_t0 = _t0
         try:
             parsed = urlparse(self.path)
-            if parsed.path in ('/', '/index.html'): return t(self, _INDEX_HTML_PATH.read_text(encoding='utf-8'), content_type='text/html; charset=utf-8')
+            result = handle_get(self, parsed)
-            if parsed.path == '/favicon.ico':
+            if result is False:
-                self.send_response(204); self.end_headers(); return
-            if parsed.path == '/health':
-                with STREAMS_LOCK: n_streams = len(STREAMS)
-                return j(self, {'status':'ok','sessions':len(SESSIONS),'active_streams':n_streams,'uptime_seconds':round(time.time()-SERVER_START_TIME,1)})
-            if parsed.path.startswith('/static/'):
-                # Phase A: serve static assets from disk
-                static_file = Path(__file__).parent / parsed.path.lstrip('/')
-                if not static_file.exists() or not static_file.is_file():
-                    return j(self, {'error': 'not found'}, status=404)
+                return j(self, {'error': 'not found'}, status=404)
-                ext = static_file.suffix.lower()
-                ct = {'css': 'text/css', 'js': 'application/javascript', 'html': 'text/html'}.get(ext.lstrip('.'), 'text/plain')
-                self.send_response(200)
-                self.send_header('Content-Type', f'{ct}; charset=utf-8')
-                self.send_header('Cache-Control', 'no-store')
-                raw = static_file.read_bytes()
-                self.send_header('Content-Length', str(len(raw)))
-                self.end_headers()
-                self.wfile.write(raw)
-                return
-            if parsed.path == '/api/session':
-                sid = parse_qs(parsed.query).get('session_id', [''])[0]
-                if not sid:
-                    return j(self, {'error': 'session_id is required'}, status=400)
-                s = get_session(sid); return j(self, {'session': s.compact() | {'messages': s.messages, 'tool_calls': getattr(s, 'tool_calls', [])}})
-            if parsed.path == '/api/sessions': return j(self, {'sessions': all_sessions()})
-            if parsed.path == '/api/session/export':
-                sid = parse_qs(parsed.query).get('session_id', [''])[0]
-                if not sid: return bad(self, 'session_id is required')
-                try: s = get_session(sid)
-                except KeyError: return bad(self, 'Session not found', 404)
-                import json as _json_exp
-                payload = _json_exp.dumps(s.__dict__, ensure_ascii=False, indent=2)
-                self.send_response(200)
-                self.send_header('Content-Type', 'application/json; charset=utf-8')
-                self.send_header('Content-Disposition', f'attachment; filename="hermes-{sid}.json"')
-                self.send_header('Content-Length', str(len(payload.encode('utf-8'))))
-                self.send_header('Cache-Control', 'no-store')
-                self.end_headers()
-                self.wfile.write(payload.encode('utf-8'))
-                return
-            if parsed.path == '/api/workspaces':
-                return j(self, {'workspaces': load_workspaces(), 'last': get_last_workspace()})
-            if parsed.path == '/api/sessions/search':
-                qs2 = parse_qs(parsed.query)
-                q = qs2.get('q', [''])[0].lower().strip()
-                content_search = qs2.get('content', ['1'])[0] == '1'  # default: search message content too
-                depth = int(qs2.get('depth', ['5'])[0])  # max messages per session to scan
-                if not q: return j(self, {'sessions': all_sessions()})
-                results = []
-                for s in all_sessions():
-                    title_match = q in (s.get('title') or '').lower()
-                    if title_match:
-                        results.append(dict(s, match_type='title'))
-                        continue
-                    if content_search:
-                        # Load full session to search message content
-                        try:
-                            sess = get_session(s['session_id'])
-                            msgs = sess.messages[:depth] if depth else sess.messages
-                            for m in msgs:
-                                c = m.get('content') or ''
-                                if isinstance(c, list):
-                                    c = ' '.join(p.get('text','') for p in c if isinstance(p,dict) and p.get('type')=='text')
-                                if q in str(c).lower():
-                                    results.append(dict(s, match_type='content'))
-                                    break
-                        except (KeyError, Exception):
-                            pass
-                return j(self, {'sessions': results, 'query': q, 'count': len(results)})
-            if parsed.path == '/api/list':
-                qs2 = parse_qs(parsed.query)
-                sid2 = qs2.get('session_id', [''])[0]
-                if not sid2: return bad(self, 'session_id is required')
-                try: s = get_session(sid2)
-                except KeyError: return bad(self, 'Session not found', 404)
-                try: return j(self, {'entries': list_dir(Path(s.workspace), qs2.get('path', ['.'])[0]), 'path': qs2.get('path', ['.'])[0]})
-                except (FileNotFoundError, ValueError) as e: return bad(self, str(e), 404)
-            if parsed.path == '/api/chat/stream/status':
-                stream_id = parse_qs(parsed.query).get('stream_id', [''])[0]
-                active = stream_id in STREAMS
-                return j(self, {'active': active, 'stream_id': stream_id})
-            if parsed.path == '/api/chat/cancel':
-                # Sprint 10: cancel an in-flight stream
-                stream_id = parse_qs(parsed.query).get('stream_id', [''])[0]
-                if not stream_id:
-                    return bad(self, 'stream_id required')
-                cancelled = cancel_stream(stream_id)
-                return j(self, {'ok': True, 'cancelled': cancelled, 'stream_id': stream_id})
-            if parsed.path == '/api/chat/stream':
-                stream_id = parse_qs(parsed.query).get('stream_id', [''])[0]
-                q = STREAMS.get(stream_id)
-                if q is None: return j(self, {'error': 'stream not found'}, status=404)
-                self.send_response(200)
-                self.send_header('Content-Type', 'text/event-stream; charset=utf-8')
-                self.send_header('Cache-Control', 'no-cache')
-                self.send_header('X-Accel-Buffering', 'no')
-                self.send_header('Connection', 'keep-alive')
-                self.end_headers()
-                try:
-                    while True:
-                        try:
-                            event, data = q.get(timeout=30)
-                        except queue.Empty:
-                            self.wfile.write(b': heartbeat\n\n'); self.wfile.flush(); continue
-                        _sse(self, event, data)
-                        if event in ('done', 'error', 'cancel'): break
-                except (BrokenPipeError, ConnectionResetError): pass
-                return
-            if parsed.path == '/api/file/raw':
-                # Serve raw file bytes (for images and downloads).
-                # Pass ?download=1 to force Content-Disposition: attachment (save to disk).
-                qs = parse_qs(parsed.query)
-                _raw_sid = qs.get('session_id', [''])[0]
-                if not _raw_sid: return bad(self, 'session_id is required')
-                try: s = get_session(_raw_sid)
-                except KeyError: return bad(self, 'Session not found', 404)
-                rel = qs.get('path', [''])[0]
-                force_download = qs.get('download', [''])[0] == '1'
-                target = safe_resolve(Path(s.workspace), rel)
-                if not target.exists() or not target.is_file():
-                    return j(self, {'error': 'not found'}, status=404)
-                ext = target.suffix.lower()
-                mime = MIME_MAP.get(ext, 'application/octet-stream')
-                raw_bytes = target.read_bytes()
-                import urllib.parse as _up
-                safe_name = _up.quote(target.name, safe='')
-                self.send_response(200)
-                self.send_header('Content-Type', mime)
-                self.send_header('Content-Length', str(len(raw_bytes)))
-                self.send_header('Cache-Control', 'no-store')
-                if force_download:
-                    self.send_header('Content-Disposition', f'attachment; filename="{target.name}"; filename*=UTF-8\'\'{safe_name}')
-                self.end_headers()
-                self.wfile.write(raw_bytes)
-                return
-            if parsed.path == '/api/file':
-                qs3 = parse_qs(parsed.query)
-                sid3 = qs3.get('session_id', [''])[0]
-                if not sid3: return bad(self, 'session_id is required')
-                try: s = get_session(sid3)
-                except KeyError: return bad(self, 'Session not found', 404)
-                rel3 = qs3.get('path', [''])[0]
-                if not rel3: return bad(self, 'path is required')
-                try: return j(self, read_file_content(Path(s.workspace), rel3))
-                except (FileNotFoundError, ValueError) as e: return bad(self, str(e), 404)
-            if parsed.path == '/api/approval/pending':
-                sid = parse_qs(parsed.query).get('session_id', [''])[0]
-                if has_pending(sid):
-                    # peek without removing
-                    import threading as _t
-                    from tools.approval import _pending, _lock
-                    with _lock:
-                        p = dict(_pending.get(sid, {}))
-                    return j(self, {'pending': p})
-                return j(self, {'pending': None})
-            # Test-only: inject a pending approval entry directly (no agent needed)
-            if parsed.path == '/api/approval/inject_test':
-                qs2 = parse_qs(parsed.query)
-                sid = qs2.get('session_id', [''])[0]
-                key = qs2.get('pattern_key', ['test_pattern'])[0]
-                cmd = qs2.get('command', ['rm -rf /tmp/test'])[0]
-                if sid:
-                    submit_pending(sid, {
-                        'command': cmd, 'pattern_key': key,
-                        'pattern_keys': [key], 'description': 'test pattern',
-                    })
-                    return j(self, {'ok': True, 'session_id': sid})
-                return j(self, {'error': 'session_id required'}, status=400)
-            self._log_request(self.command, self.path, 404, (time.time()-_t0)*1000)
-            # ── Cron API ──
-            if parsed.path == '/api/crons':
-                sys.path.insert(0, str(Path(__file__).parent.parent))
-                from cron.jobs import list_jobs, OUTPUT_DIR as CRON_OUT
-                jobs = list_jobs(include_disabled=True)
-                return j(self, {'jobs': jobs})
-            if parsed.path == '/api/crons/output':
-                from cron.jobs import OUTPUT_DIR as CRON_OUT
-                job_id = parse_qs(parsed.query).get('job_id', [''])[0]
-                limit = int(parse_qs(parsed.query).get('limit', ['5'])[0])
-                if not job_id: return j(self, {'error': 'job_id required'}, status=400)
-                out_dir = CRON_OUT / job_id
-                outputs = []
-                if out_dir.exists():
-                    files = sorted(out_dir.glob('*.md'), reverse=True)[:limit]
-                    for f in files:
-                        try:
-                            txt_content = f.read_text(encoding='utf-8', errors='replace')
-                            outputs.append({'filename': f.name, 'content': txt_content[:8000]})
-                        except Exception: pass
-                return j(self, {'job_id': job_id, 'outputs': outputs})
-            # ── Skills API ──
-            if parsed.path == '/api/skills':
-                from tools.skills_tool import skills_list as _skills_list
-                import json as _j
-                raw = _skills_list()
-                data = _j.loads(raw) if isinstance(raw, str) else raw
-                return j(self, {'skills': data.get('skills', [])})
-            if parsed.path == '/api/skills/content':
-                from tools.skills_tool import skill_view as _skill_view
-                import json as _j
-                name = parse_qs(parsed.query).get('name', [''])[0]
-                if not name: return j(self, {'error': 'name required'}, status=400)
-                raw = _skill_view(name)
-                data = _j.loads(raw) if isinstance(raw, str) else raw
-                return j(self, data)
-            # ── Memory API ──
-            if parsed.path == '/api/memory':
-                mem_dir = Path.home() / '.hermes' / 'memories'
-                mem_file = mem_dir / 'MEMORY.md'
-                user_file = mem_dir / 'USER.md'
-                memory = mem_file.read_text(encoding='utf-8', errors='replace') if mem_file.exists() else ''
-                user = user_file.read_text(encoding='utf-8', errors='replace') if user_file.exists() else ''
-                return j(self, {
-                    'memory': memory, 'user': user,
-                    'memory_path': str(mem_file), 'user_path': str(user_file),
-                    'memory_mtime': mem_file.stat().st_mtime if mem_file.exists() else None,
-                    'user_mtime': user_file.stat().st_mtime if user_file.exists() else None,
-                })
-            if parsed.path == '/api/crons/run':
-                job_id = body.get('job_id', '')
-                if not job_id: return bad(self, 'job_id required')
-                from cron.jobs import get_job
-                from cron.scheduler import run_job
-                import threading as _threading
-                job = get_job(job_id)
-                if not job: return bad(self, 'Job not found', 404)
-                # Run in a background thread so the request returns immediately
-                _threading.Thread(target=run_job, args=(job,), daemon=True).start()
-                return j(self, {'ok': True, 'job_id': job_id, 'status': 'triggered'})
-            if parsed.path == '/api/crons/pause':
-                job_id = body.get('job_id', '')
-                if not job_id: return bad(self, 'job_id required')
-                from cron.jobs import pause_job
-                result = pause_job(job_id, reason=body.get('reason'))
-                if result: return j(self, {'ok': True, 'job': result})
-                return bad(self, 'Job not found', 404)
-            if parsed.path == '/api/crons/resume':
-                job_id = body.get('job_id', '')
-                if not job_id: return bad(self, 'job_id required')
-                from cron.jobs import resume_job
-                result = resume_job(job_id)
-                if result: return j(self, {'ok': True, 'job': result})
-                return bad(self, 'Job not found', 404)
-            self._log_request(self.command, self.path, 404, (time.time()-_t0)*1000)
-            if parsed.path == '/api/skills/save':
-                # Create or update a skill's SKILL.md content
-                try: require(body, 'name', 'content')
-                except ValueError as e: return bad(self, str(e))
-                skill_name = body['name'].strip().lower().replace(' ', '-')
-                if not skill_name or '/' in skill_name or '..' in skill_name:
-                    return bad(self, 'Invalid skill name')
-                category = body.get('category', '').strip()
-                from tools.skills_tool import SKILLS_DIR
-                if category:
-                    skill_dir = SKILLS_DIR / category / skill_name
-                else:
-                    skill_dir = SKILLS_DIR / skill_name
-                skill_dir.mkdir(parents=True, exist_ok=True)
-                skill_file = skill_dir / 'SKILL.md'
-                skill_file.write_text(body['content'], encoding='utf-8')
-                return j(self, {'ok': True, 'name': skill_name, 'path': str(skill_file)})
-            if parsed.path == '/api/skills/delete':
-                try: require(body, 'name')
-                except ValueError as e: return bad(self, str(e))
-                from tools.skills_tool import SKILLS_DIR
-                import shutil as _shutil
-                # Search for the skill directory by name
-                matches = list(SKILLS_DIR.rglob(f'{body["name"]}/SKILL.md'))
-                if not matches: return bad(self, 'Skill not found', 404)
-                skill_dir = matches[0].parent
-                _shutil.rmtree(str(skill_dir))
-                return j(self, {'ok': True, 'name': body['name']})
-            if parsed.path == '/api/memory/write':
-                # Write to MEMORY.md or USER.md
-                try: require(body, 'section', 'content')
-                except ValueError as e: return bad(self, str(e))
-                mem_dir = Path.home() / '.hermes' / 'memories'
-                mem_dir.mkdir(parents=True, exist_ok=True)
-                section = body['section']
-                if section == 'memory':
-                    target = mem_dir / 'MEMORY.md'
-                elif section == 'user':
-                    target = mem_dir / 'USER.md'
-                else:
-                    return bad(self, 'section must be "memory" or "user"')
-                target.write_text(body['content'], encoding='utf-8')
-                return j(self, {'ok': True, 'section': section, 'path': str(target)})
-            return j(self, {'error':'not found'}, status=404)
         except Exception as e:
-            self._log_request(self.command, self.path, 500, (time.time()-_t0)*1000)
             return j(self, {'error': str(e), 'trace': traceback.format_exc()}, status=500)
 
def do_POST(self):
|
def do_POST(self):
|
||||||
_t0 = time.time()
|
self._req_t0 = time.time()
|
||||||
self._req_t0 = _t0
|
|
||||||
try:
|
try:
|
||||||
parsed = urlparse(self.path)
|
parsed = urlparse(self.path)
|
||||||
if parsed.path == '/api/upload':
|
result = handle_post(self, parsed)
|
||||||
return handle_upload(self)
|
if result is False:
|
||||||
body = read_body(self)
|
return j(self, {'error': 'not found'}, status=404)
|
||||||
if parsed.path == '/api/session/new':
|
|
||||||
s = new_session(workspace=body.get('workspace'), model=body.get('model')); return j(self, {'session': s.compact() | {'messages': s.messages}})
|
|
||||||
if parsed.path == '/api/sessions/cleanup':
|
|
||||||
# Delete all sessions with no messages and title == Untitled (legacy)
|
|
||||||
cleaned = 0
|
|
||||||
for p in SESSION_DIR.glob('*.json'):
|
|
||||||
if p.name.startswith('_'): continue
|
|
||||||
try:
|
|
||||||
s = Session.load(p.stem)
|
|
||||||
if s and s.title == 'Untitled' and len(s.messages) == 0:
|
|
||||||
with LOCK: SESSIONS.pop(p.stem, None)
|
|
||||||
p.unlink(missing_ok=True)
|
|
||||||
cleaned += 1
|
|
||||||
except Exception: pass
|
|
||||||
if SESSION_INDEX_FILE.exists():
|
|
||||||
SESSION_INDEX_FILE.unlink(missing_ok=True)
|
|
||||||
return j(self, {'ok': True, 'cleaned': cleaned})
|
|
||||||
if parsed.path == '/api/sessions/cleanup_zero_message':
|
|
||||||
# Delete ALL sessions with 0 messages (used by test teardown)
|
|
||||||
cleaned = 0
|
|
||||||
for p in SESSION_DIR.glob('*.json'):
|
|
||||||
if p.name.startswith('_'): continue
|
|
||||||
try:
|
|
||||||
s = Session.load(p.stem)
|
|
||||||
if s and len(s.messages) == 0:
|
|
||||||
with LOCK: SESSIONS.pop(p.stem, None)
|
|
||||||
p.unlink(missing_ok=True)
|
|
||||||
cleaned += 1
|
|
||||||
except Exception: pass
|
|
||||||
if SESSION_INDEX_FILE.exists():
|
|
||||||
SESSION_INDEX_FILE.unlink(missing_ok=True)
|
|
||||||
return j(self, {'ok': True, 'cleaned': cleaned})
|
|
if parsed.path == '/api/session/rename':
    try: require(body, 'session_id', 'title')
    except ValueError as e: return bad(self, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    s.title = str(body['title']).strip()[:80] or 'Untitled'
    s.save()
    return j(self, {'session': s.compact()})

if parsed.path == '/api/session/update':
    try: require(body, 'session_id')
    except ValueError as e: return bad(self, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    new_ws = str(Path(body.get('workspace', s.workspace)).expanduser().resolve())
    s.workspace = new_ws; s.model = body.get('model', s.model); s.save()
    set_last_workspace(new_ws)  # persist for new session inheritance
    return j(self, {'session': s.compact() | {'messages': s.messages}})

if parsed.path == '/api/session/delete':
    sid = body.get('session_id', '')
    if not sid: return bad(self, 'session_id is required')
    with LOCK: SESSIONS.pop(sid, None)
    p = SESSION_DIR / f'{sid}.json'
    try: p.unlink(missing_ok=True)
    except Exception: pass
    # Invalidate index so the deleted session stops appearing in lists
    try: SESSION_INDEX_FILE.unlink(missing_ok=True)
    except Exception: pass
    return j(self, {'ok': True})

if parsed.path == '/api/session/clear':
    # Wipe all messages from a session, keep session metadata
    try: require(body, 'session_id')
    except ValueError as e: return bad(self, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    s.messages = []
    s.tool_calls = []
    s.title = 'Untitled'
    s.save()
    return j(self, {'ok': True, 'session': s.compact()})
if parsed.path == '/api/session/truncate':
    # Truncate messages at a given index (keep messages[:index]).
    # Used by edit+regenerate: trim everything from the edited message onward.
    try: require(body, 'session_id')
    except ValueError as e: return bad(self, str(e))
    if body.get('keep_count') is None: return bad(self, 'Missing required field(s): keep_count')
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    keep = int(body['keep_count'])
    s.messages = s.messages[:keep]
    s.save()
    return j(self, {'ok': True, 'session': s.compact() | {'messages': s.messages}})
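The edit+regenerate flow this endpoint serves boils down to a simple slice on the message list. A minimal standalone sketch (the helper name `truncate_for_edit` is invented here for illustration; the real client just sends `keep_count` to the endpoint above):

```python
def truncate_for_edit(messages: list[dict], keep_count: int) -> list[dict]:
    """Keep everything before the edited message; the edited message is
    re-sent as a fresh user turn, so it is dropped along with all replies."""
    if not 0 <= keep_count <= len(messages):
        raise IndexError(f'keep_count {keep_count} out of range')
    return messages[:keep_count]

history = [
    {'role': 'user', 'content': 'hi'},
    {'role': 'assistant', 'content': 'hello'},
    {'role': 'user', 'content': 'typo here'},
    {'role': 'assistant', 'content': 'reply to typo'},
]
# Editing message index 2: keep_count=2 drops the edited turn and its reply.
kept = truncate_for_edit(history, 2)
```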
if parsed.path == '/api/chat/start':
    try: require(body, 'session_id')
    except ValueError as e: return bad(self, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    msg = str(body.get('message', '')).strip()
    if not msg: return bad(self, 'message is required')
    attachments = [str(a) for a in (body.get('attachments') or [])][:20]
    workspace = str(Path(body.get('workspace') or s.workspace).expanduser().resolve())
    model = body.get('model') or s.model
    s.workspace = workspace; s.model = model; s.save()
    set_last_workspace(workspace)  # persist for new session inheritance
    stream_id = uuid.uuid4().hex
    q = queue.Queue()
    with STREAMS_LOCK: STREAMS[stream_id] = q
    t = threading.Thread(target=_run_agent_streaming,
                         args=(s.session_id, msg, model, workspace, stream_id, attachments), daemon=True)
    t.start()
    return j(self, {'stream_id': stream_id, 'session_id': s.session_id})
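The handler above registers a `queue.Queue` per stream and hands it to a daemon worker thread; the SSE endpoint then drains that queue. A self-contained sketch of this producer/consumer bridge, under the assumption that the real `_run_agent_streaming` pushes dict events and some end-of-stream marker (names here are illustrative, not the codebase's):

```python
import queue
import threading

DONE = object()  # sentinel marking end of stream

def worker(q: queue.Queue) -> None:
    # Stand-in for the agent thread: emit tokens, then signal completion.
    for tok in ('Hel', 'lo'):
        q.put({'event': 'token', 'text': tok})
    q.put(DONE)

def drain(q: queue.Queue) -> list[dict]:
    # Stand-in for the SSE endpoint: block on the queue until the sentinel.
    events = []
    while True:
        item = q.get()
        if item is DONE:
            return events
        events.append(item)

q = queue.Queue()
threading.Thread(target=worker, args=(q,), daemon=True).start()
events = drain(q)
```

Because `Queue.get()` blocks, the HTTP thread serving the SSE response needs no polling loop; it simply forwards each event as it arrives.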
if parsed.path == '/api/chat':
    s = get_session(body['session_id']); msg = str(body.get('message', '')).strip()
    if not msg: return j(self, {'error': 'empty message'}, status=400)
    workspace = Path(body.get('workspace') or s.workspace).expanduser().resolve(); s.workspace = str(workspace); s.model = body.get('model') or s.model
    old_cwd = os.environ.get('TERMINAL_CWD'); os.environ['TERMINAL_CWD'] = str(workspace)
    old_exec_ask = os.environ.get('HERMES_EXEC_ASK')
    old_session_key = os.environ.get('HERMES_SESSION_KEY')
    os.environ['HERMES_EXEC_ASK'] = '1'
    os.environ['HERMES_SESSION_KEY'] = s.session_id
    try:
        with CHAT_LOCK:
            agent = AIAgent(model=s.model, platform='cli', quiet_mode=True, enabled_toolsets=CLI_TOOLSETS, session_id=s.session_id)
            workspace_ctx = f"[Workspace: {s.workspace}]\n"
            workspace_system_msg = (
                f"Active workspace at session start: {s.workspace}\n"
                "Every user message is prefixed with [Workspace: /absolute/path] indicating the "
                "workspace the user has selected in the web UI at the time they sent that message. "
                "This tag is the single authoritative source of the active workspace and updates "
                "with every message. It overrides any prior workspace mentioned in this system "
                "prompt, memory, or conversation history. Always use the value from the most recent "
                "[Workspace: ...] tag as your default working directory for ALL file operations: "
                "write_file, read_file, search_files, terminal workdir, and patch. "
                "Never fall back to a hardcoded path when this tag is present."
            )
            result = agent.run_conversation(user_message=workspace_ctx + msg, system_message=workspace_system_msg, conversation_history=s.messages, task_id=s.session_id, persist_user_message=msg)
    finally:
        if old_cwd is None: os.environ.pop('TERMINAL_CWD', None)
        else: os.environ['TERMINAL_CWD'] = old_cwd
        if old_exec_ask is None: os.environ.pop('HERMES_EXEC_ASK', None)
        else: os.environ['HERMES_EXEC_ASK'] = old_exec_ask
        if old_session_key is None: os.environ.pop('HERMES_SESSION_KEY', None)
        else: os.environ['HERMES_SESSION_KEY'] = old_session_key
    s.messages = result.get('messages') or s.messages; s.title = title_from(s.messages, s.title); s.save()
    return j(self, {'answer': result.get('final_response') or '', 'status': 'done' if result.get('completed', True) else 'partial', 'session': s.compact() | {'messages': s.messages}, 'result': {k: v for k, v in result.items() if k != 'messages'}})
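The save-set-restore dance around `TERMINAL_CWD`, `HERMES_EXEC_ASK`, and `HERMES_SESSION_KEY` above is a classic temporary-environment pattern and can be expressed as a context manager. A sketch only (`temp_env` and `HERMES_WEBUI_TEST_VAR` are names invented here, not part of the codebase):

```python
import os
from contextlib import contextmanager

@contextmanager
def temp_env(**overrides: str):
    # Remember the prior value (or absence) of each variable we touch.
    saved = {k: os.environ.get(k) for k in overrides}
    os.environ.update(overrides)
    try:
        yield
    finally:
        for k, old in saved.items():
            if old is None:
                os.environ.pop(k, None)   # was unset before: unset again
            else:
                os.environ[k] = old       # restore the previous value

with temp_env(HERMES_WEBUI_TEST_VAR='1'):
    inside = os.environ.get('HERMES_WEBUI_TEST_VAR')
after = os.environ.get('HERMES_WEBUI_TEST_VAR')
```

The `finally` clause guarantees restoration even if the agent raises, which is exactly what the handler's explicit `finally` block achieves.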
if parsed.path == '/api/crons/create':
    try: require(body, 'prompt', 'schedule')
    except ValueError as e: return bad(self, str(e))
    try:
        from cron.jobs import create_job
        job = create_job(
            prompt=body['prompt'],
            schedule=body['schedule'],
            name=body.get('name') or None,
            deliver=body.get('deliver') or 'local',
            skills=body.get('skills') or [],
            model=body.get('model') or None,
        )
        return j(self, {'ok': True, 'job': job})
    except Exception as e:
        return j(self, {'error': str(e)}, status=400)

if parsed.path == '/api/crons/update':
    try: require(body, 'job_id')
    except ValueError as e: return bad(self, str(e))
    from cron.jobs import update_job
    updates = {k: v for k, v in body.items() if k != 'job_id' and v is not None}
    job = update_job(body['job_id'], updates)
    if not job: return bad(self, 'Job not found', 404)
    return j(self, {'ok': True, 'job': job})

if parsed.path == '/api/crons/delete':
    try: require(body, 'job_id')
    except ValueError as e: return bad(self, str(e))
    from cron.jobs import remove_job
    ok = remove_job(body['job_id'])
    if not ok: return bad(self, 'Job not found', 404)
    return j(self, {'ok': True, 'job_id': body['job_id']})
if parsed.path == '/api/file/delete':
    try: require(body, 'session_id', 'path')
    except ValueError as e: return bad(self, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    try:
        target = safe_resolve(Path(s.workspace), body['path'])
        if not target.exists(): return bad(self, 'File not found', 404)
        if target.is_dir(): return bad(self, 'Cannot delete directories via this endpoint')
        target.unlink()
        return j(self, {'ok': True, 'path': body['path']})
    except (ValueError, PermissionError) as e: return bad(self, str(e))

if parsed.path == '/api/file/save':
    try: require(body, 'session_id', 'path')
    except ValueError as e: return bad(self, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    try:
        target = safe_resolve(Path(s.workspace), body['path'])
        if not target.exists(): return bad(self, 'File not found', 404)
        if target.is_dir(): return bad(self, 'Cannot save: path is a directory')
        target.write_text(body.get('content', ''), encoding='utf-8')
        return j(self, {'ok': True, 'path': body['path'], 'size': target.stat().st_size})
    except (ValueError, PermissionError) as e: return bad(self, str(e))

if parsed.path == '/api/file/create':
    try: require(body, 'session_id', 'path')
    except ValueError as e: return bad(self, str(e))
    try: s = get_session(body['session_id'])
    except KeyError: return bad(self, 'Session not found', 404)
    try:
        target = safe_resolve(Path(s.workspace), body['path'])
        if target.exists(): return bad(self, 'File already exists')
        target.parent.mkdir(parents=True, exist_ok=True)
        target.write_text(body.get('content', ''), encoding='utf-8')
        return j(self, {'ok': True, 'path': str(target.relative_to(Path(s.workspace)))})
    except (ValueError, PermissionError) as e: return bad(self, str(e))
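All three file endpoints funnel user-supplied paths through `safe_resolve` before touching disk. A minimal sketch of what such a guard typically does — resolve the joined path and refuse anything that escapes the workspace root. The real `safe_resolve` may differ; `safe_resolve_sketch` is a name invented here:

```python
from pathlib import Path

def safe_resolve_sketch(root: Path, rel: str) -> Path:
    root = root.resolve()
    target = (root / rel).resolve()
    # Reject paths whose resolved form is not under the workspace root.
    if target != root and root not in target.parents:
        raise ValueError(f'Path escapes workspace: {rel}')
    return target

ws = (Path('.') / 'sandbox_ws').resolve()
ok = safe_resolve_sketch(ws, 'notes/todo.txt')   # stays inside the root
try:
    safe_resolve_sketch(ws, '../outside.txt')    # traversal attempt
    blocked = False
except ValueError:
    blocked = True
```

Resolving *before* the containment check is the important part: it collapses `..` segments and symlinked prefixes, so the comparison operates on the real filesystem location.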
if parsed.path == '/api/workspaces/add':
    path_str = body.get('path', '').strip()
    name = body.get('name', '').strip()
    if not path_str: return bad(self, 'path is required')
    p = Path(path_str).expanduser().resolve()
    if not p.exists(): return bad(self, f'Path does not exist: {p}')
    if not p.is_dir(): return bad(self, f'Path is not a directory: {p}')
    wss = load_workspaces()
    if any(w['path'] == str(p) for w in wss):
        return bad(self, 'Workspace already in list')
    wss.append({'path': str(p), 'name': name or p.name})
    save_workspaces(wss)
    return j(self, {'ok': True, 'workspaces': wss})

if parsed.path == '/api/workspaces/remove':
    path_str = body.get('path', '').strip()
    if not path_str: return bad(self, 'path is required')
    wss = load_workspaces()
    wss = [w for w in wss if w['path'] != path_str]
    save_workspaces(wss)
    return j(self, {'ok': True, 'workspaces': wss})

if parsed.path == '/api/workspaces/rename':
    path_str = body.get('path', '').strip()
    name = body.get('name', '').strip()
    if not path_str or not name: return bad(self, 'path and name are required')
    wss = load_workspaces()
    for w in wss:
        if w['path'] == path_str:
            w['name'] = name; break
    else:
        return bad(self, 'Workspace not found', 404)
    save_workspaces(wss)
    return j(self, {'ok': True, 'workspaces': wss})
if parsed.path == '/api/approval/respond':
    sid = body.get('session_id', '')
    if not sid: return bad(self, 'session_id is required')
    choice = body.get('choice', 'deny')
    if choice not in ('once', 'session', 'always', 'deny'):
        return bad(self, f'Invalid choice: {choice}')
    from tools.approval import _pending, _lock, _permanent_approved
    with _lock:
        pending = _pending.pop(sid, None)
        if pending:
            keys = pending.get('pattern_keys') or [pending.get('pattern_key', '')]
            if choice in ('once', 'session'):
                for k in keys: approve_session(sid, k)
            elif choice == 'always':
                for k in keys:
                    approve_session(sid, k); approve_permanent(k)
                save_permanent_allowlist(_permanent_approved)
    return j(self, {'ok': True, 'choice': choice})
if parsed.path == '/api/skills/save':
    # Create or update a skill's SKILL.md content
    try: require(body, 'name', 'content')
    except ValueError as e: return bad(self, str(e))
    skill_name = body['name'].strip().lower().replace(' ', '-')
    if not skill_name or '/' in skill_name or '..' in skill_name:
        return bad(self, 'Invalid skill name')
    category = body.get('category', '').strip()
    from tools.skills_tool import SKILLS_DIR
    if category:
        skill_dir = SKILLS_DIR / category / skill_name
    else:
        skill_dir = SKILLS_DIR / skill_name
    skill_dir.mkdir(parents=True, exist_ok=True)
    skill_file = skill_dir / 'SKILL.md'
    skill_file.write_text(body['content'], encoding='utf-8')
    return j(self, {'ok': True, 'name': skill_name, 'path': str(skill_file)})

if parsed.path == '/api/skills/delete':
    try: require(body, 'name')
    except ValueError as e: return bad(self, str(e))
    from tools.skills_tool import SKILLS_DIR
    import shutil as _shutil
    matches = list(SKILLS_DIR.rglob(f'{body["name"]}/SKILL.md'))
    if not matches: return bad(self, 'Skill not found', 404)
    skill_dir = matches[0].parent
    _shutil.rmtree(str(skill_dir))
    return j(self, {'ok': True, 'name': body['name']})
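The slug-and-validate step in `/api/skills/save` doubles as a path-safety check, since the name becomes a directory under `SKILLS_DIR`. Pulled out as a standalone helper for illustration (`slugify_skill_name` is a name invented here):

```python
def slugify_skill_name(raw: str) -> str:
    """Normalize a display name into a directory-safe slug,
    rejecting anything that could traverse out of SKILLS_DIR."""
    name = raw.strip().lower().replace(' ', '-')
    if not name or '/' in name or '..' in name:
        raise ValueError(f'Invalid skill name: {raw!r}')
    return name

slug = slugify_skill_name('  PDF Report Builder ')
try:
    slugify_skill_name('../escape')
    rejected = False
except ValueError:
    rejected = True
```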
if parsed.path == '/api/memory/write':
    # Write to MEMORY.md or USER.md
    try: require(body, 'section', 'content')
    except ValueError as e: return bad(self, str(e))
    mem_dir = Path.home() / '.hermes' / 'memories'
    mem_dir.mkdir(parents=True, exist_ok=True)
    section = body['section']
    if section == 'memory':
        target = mem_dir / 'MEMORY.md'
    elif section == 'user':
        target = mem_dir / 'USER.md'
    else:
        return bad(self, 'section must be "memory" or "user"')
    target.write_text(body['content'], encoding='utf-8')
    return j(self, {'ok': True, 'section': section, 'path': str(target)})
    return j(self, {'error': 'not found'}, status=404)

except Exception as e:
    self._log_request(self.command, self.path, 500, (time.time() - _t0) * 1000)
    return j(self, {'error': str(e), 'trace': traceback.format_exc()}, status=500)

def main():
    from api.config import print_startup_config, verify_hermes_imports, _HERMES_FOUND

    print_startup_config()

    ok, missing = verify_hermes_imports()
    if not ok and _HERMES_FOUND:
        print(f'[!!] Warning: Hermes agent found but missing modules: {missing}', flush=True)
        print('     Agent features may not work correctly.', flush=True)
@@ -693,7 +65,7 @@ def main():
    SESSION_DIR.mkdir(parents=True, exist_ok=True)
    DEFAULT_WORKSPACE.mkdir(parents=True, exist_ok=True)
    httpd = ThreadingHTTPServer((HOST, PORT), Handler)
    print(f' Hermes Web UI listening on http://{HOST}:{PORT}', flush=True)
    if HOST == '127.0.0.1':
        print(f' Remote access: ssh -N -L {PORT}:127.0.0.1:{PORT} <user>@<your-server>', flush=True)
        print(f' Then open: http://localhost:{PORT}', flush=True)
@@ -24,6 +24,24 @@ $('btnExportJSON').onclick=()=>{
  const a=document.createElement('a');a.href=url;
  a.download=`hermes-${S.session.session_id}.json`;a.click();
};
$('btnImportJSON').onclick=()=>$('importFileInput').click();
$('importFileInput').onchange=async(e)=>{
  const file=e.target.files[0];
  if(!file)return;
  e.target.value='';
  try{
    const text=await file.text();
    const data=JSON.parse(text);
    const res=await api('/api/session/import',{method:'POST',body:JSON.stringify(data)});
    if(res.ok&&res.session){
      await loadSession(res.session.session_id);
      await renderSessionList();
      showToast('Session imported');
    }
  }catch(err){
    showToast('Import failed: '+(err.message||'Invalid JSON'));
  }
};
// btnRefreshFiles is now panel-icon-btn in header (see HTML)
$('btnClearPreview').onclick=()=>{
  $('previewArea').classList.remove('visible');
@@ -50,6 +68,9 @@ document.addEventListener('keydown',async e=>{
    if(!S.busy){await newSession();await renderSessionList();$('msg').focus();}
  }
  if(e.key==='Escape'){
    // Close settings overlay if open
    const settingsOverlay=$('settingsOverlay');
    if(settingsOverlay&&settingsOverlay.style.display!=='none'){toggleSettings();return;}
    // Close workspace dropdown
    closeWsDropdown();
    // Clear session search
@@ -130,6 +151,8 @@ document.querySelectorAll('.suggestion').forEach(btn=>{
})();

(async()=>{
  // Fetch available models from server and populate dropdown dynamically
  await populateModelDropdown();
  // Restore last-used model preference
  const savedModel=localStorage.getItem('hermes-webui-model');
  if(savedModel && $('modelSelect')){
@@ -13,7 +13,7 @@
<body>
<div class="layout">
<aside class="sidebar">
<div class="sidebar-header"><div class="logo">H</div><div><h1 style="margin:0;font-size:15px;font-weight:700;letter-spacing:-.01em">Hermes</h1><div style="font-size:10px;color:var(--muted);opacity:.8;margin-top:1px">v0.2</div></div></div>
<div class="sidebar-nav">
<button class="nav-tab active" data-panel="chat" data-label="Chat" onclick="switchPanel('chat')" title="Chat">💬</button>
<button class="nav-tab" data-panel="tasks" data-label="Tasks" onclick="switchPanel('tasks')" title="Tasks">📅</button>
@@ -135,6 +135,8 @@
<div class="sidebar-actions">
<button class="sm-btn" id="btnDownload" title="Download as Markdown">↓ Transcript</button>
<button class="sm-btn" id="btnExportJSON" title="Export full session as JSON">❬/❭ JSON</button>
<button class="sm-btn" id="btnImportJSON" title="Import session from JSON">↑ Import</button>
<input type="file" id="importFileInput" accept=".json" style="display:none">
</div>
</div>
<div class="resize-handle" id="sidebarResize"></div>
@@ -149,6 +151,7 @@
<div class="ws-dropdown" id="wsDropdown"></div>
</div>
<button class="chip clear-btn" id="btnClearConv" onclick="clearConversation()" title="Clear all messages in this conversation" style="display:none">🗑 Clear</button>
<button class="chip gear-btn" id="btnSettings" onclick="toggleSettings()" title="Settings">⚙</button>
</div>
</div>
<div class="messages" id="messages">
@@ -234,6 +237,7 @@
<span>Workspace</span>
<div class="panel-actions">
<button class="panel-icon-btn" id="btnNewFile" title="New file" onclick="promptNewFile()">+</button>
<button class="panel-icon-btn" id="btnNewFolder" title="New folder" onclick="promptNewFolder()">📁</button>
<button class="panel-icon-btn" id="btnRefreshPanel" title="Refresh" onclick="if(S.session)loadDir('.')">↻</button>
<button class="panel-icon-btn close-preview" id="btnClearPreview" title="Close preview">✕</button>
</div>
@@ -253,6 +257,25 @@
</div>
</aside>
</div>
<div class="settings-overlay" id="settingsOverlay" style="display:none">
  <div class="settings-panel">
    <div class="settings-header">
      <h3 style="margin:0;font-size:16px">Settings</h3>
      <button class="panel-icon-btn" onclick="toggleSettings()" title="Close">✕</button>
    </div>
    <div class="settings-body">
      <div class="settings-field">
        <label for="settingsModel">Default Model</label>
        <select id="settingsModel" style="width:100%;padding:8px;background:var(--code-bg);color:var(--text);border:1px solid var(--border2);border-radius:6px"></select>
      </div>
      <div class="settings-field">
        <label for="settingsWorkspace">Default Workspace</label>
        <select id="settingsWorkspace" style="width:100%;padding:8px;background:var(--code-bg);color:var(--text);border:1px solid var(--border2);border-radius:6px"></select>
      </div>
      <button class="sm-btn" onclick="saveSettings()" style="margin-top:12px;width:100%;padding:8px;font-weight:600">Save Settings</button>
    </div>
  </div>
</div>
<div class="toast" id="toast"></div>
<script src="/static/ui.js"></script>
<script src="/static/workspace.js"></script>
@@ -29,7 +29,7 @@ async function send(){

  $('msg').value='';autoResize();
  const displayText=text||(uploaded.length?`Uploaded: ${uploaded.join(', ')}`:'(file upload)');
  const userMsg={role:'user',content:displayText,attachments:uploaded.length?uploaded:undefined,_ts:Date.now()/1000};
  S.toolCalls=[]; // clear tool calls from previous turn
  clearLiveToolCards(); // clear any leftover live cards from last turn
  S.messages.push(userMsg);renderMessages();appendThinking();setBusy(true); // activity bar shown via setBusy
@@ -96,57 +96,52 @@ async function send(){
    $('msgInner').appendChild(assistantRow);
  }

  // ── Shared SSE handler wiring (used for initial connection and reconnect) ──
  let _reconnectAttempted=false;

  function _wireSSE(source){
    source.addEventListener('token',e=>{
      if(!S.session||S.session.session_id!==activeSid) return;
      const d=JSON.parse(e.data);
      assistantText+=d.text;
      ensureAssistantRow();
      assistantBody.innerHTML=renderMd(assistantText);
      scrollIfPinned();
    });

    source.addEventListener('tool',e=>{
      const d=JSON.parse(e.data);
      if(S.session&&S.session.session_id===activeSid){
        setStatus(`${d.name}${d.preview?' · '+d.preview.slice(0,55):''}`);
      }
      if(!S.session||S.session.session_id!==activeSid) return;
      removeThinking();
      const oldRow=$('toolRunningRow');if(oldRow)oldRow.remove();
      const tc={name:d.name, preview:d.preview||'', args:d.args||{}, snippet:'', done:false};
      S.toolCalls.push(tc);
      appendLiveToolCard(tc);
      scrollIfPinned();
    });

    source.addEventListener('approval',e=>{
      const d=JSON.parse(e.data);
      d._session_id=activeSid;
      showApprovalCard(d);
    });

    source.addEventListener('done',e=>{
      source.close();
      const d=JSON.parse(e.data);
      delete INFLIGHT[activeSid];
      clearInflight();
      stopApprovalPolling();
      if(!_approvalSessionId || _approvalSessionId===activeSid) hideApprovalCard();
      if(S.session&&S.session.session_id===activeSid){
        S.activeStreamId=null;
        const _cb=$('btnCancel');if(_cb)_cb.style.display='none';
      }
      if(S.session&&S.session.session_id===activeSid){
        S.session=d.session;S.messages=d.session.messages||[];
        if(d.session.tool_calls&&d.session.tool_calls.length){
          S.toolCalls=d.session.tool_calls.map(tc=>({...tc,done:true}));
        } else {
@@ -157,82 +152,70 @@ async function send(){
          if(lastUser)lastUser.attachments=uploaded;
        }
        clearLiveToolCards();
        S.busy=false;
        syncTopbar();renderMessages();loadDir('.');
      }
      renderSessionList();setBusy(false);setStatus('');
    });

    source.addEventListener('error',e=>{
      source.close();
      // Attempt one reconnect if the stream is still active server-side
      if(!_reconnectAttempted && streamId){
        _reconnectAttempted=true;
        setStatus('Connection lost \u2014 reconnecting\u2026');
        setTimeout(async()=>{
          try{
            const st=await api(`/api/chat/stream/status?stream_id=${encodeURIComponent(streamId)}`);
            if(st.active){
              setStatus('Reconnected');
              _wireSSE(new EventSource(`/api/chat/stream?stream_id=${encodeURIComponent(streamId)}`));
              return;
            }
          }catch(_){}
          _handleStreamError();
        },1500);
        return;
      }
      _handleStreamError();
    });

    source.addEventListener('cancel',e=>{
      source.close();
      delete INFLIGHT[activeSid];clearInflight();stopApprovalPolling();
      if(!_approvalSessionId||_approvalSessionId===activeSid) hideApprovalCard();
      if(S.session&&S.session.session_id===activeSid){
        S.activeStreamId=null;const _cbc=$('btnCancel');if(_cbc)_cbc.style.display='none';
      }
      if(S.session&&S.session.session_id===activeSid){
        clearLiveToolCards();if(!assistantText)removeThinking();
        S.messages.push({role:'assistant',content:'*Task cancelled.*'});renderMessages();
      }
      renderSessionList();
      if(!S.session||!INFLIGHT[S.session.session_id]){setBusy(false);setStatus('');}
    });
  }

  function _handleStreamError(){
    delete INFLIGHT[activeSid];clearInflight();stopApprovalPolling();
    if(!_approvalSessionId||_approvalSessionId===activeSid) hideApprovalCard();
|
||||||
delete INFLIGHT[activeSid];
|
|
||||||
stopApprovalPolling();
|
|
||||||
// Only hide approval card if it belongs to the session that just finished
|
|
||||||
if(!_approvalSessionId || _approvalSessionId===activeSid) hideApprovalCard();
|
|
||||||
if(S.session&&S.session.session_id===activeSid){
|
if(S.session&&S.session.session_id===activeSid){
|
||||||
S.activeStreamId=null;
|
S.activeStreamId=null;const _cbe=$('btnCancel');if(_cbe)_cbe.style.display='none';
|
||||||
const _cbo=$('btnCancel');if(_cbo)_cbo.style.display='none';
|
clearLiveToolCards();if(!assistantText)removeThinking();
|
||||||
}
|
S.messages.push({role:'assistant',content:'**Error:** Connection lost'});renderMessages();
|
||||||
if(!assistantText&&S.session&&S.session.session_id===activeSid){
|
}else{
|
||||||
removeThinking();
|
// User switched away — show background error banner
|
||||||
S.messages.push({role:'assistant',content:'**Error:** Connection lost'});
|
if(typeof trackBackgroundError==='function'){
|
||||||
renderMessages();
|
// Look up session title from the session list cache so the banner names it correctly
|
||||||
}
|
const _errTitle=(typeof _allSessions!=='undefined'&&_allSessions.find(s=>s.session_id===activeSid)||{}).title||null;
|
||||||
if(!S.session || !INFLIGHT[S.session.session_id]){
|
trackBackgroundError(activeSid,_errTitle,'Connection lost');
|
||||||
setBusy(false);setStatus('');
|
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
};
|
if(!S.session||!INFLIGHT[S.session.session_id]){setBusy(false);setStatus('Error: Connection lost');}
|
||||||
|
}
|
||||||
|
|
||||||
|
_wireSSE(new EventSource(`/api/chat/stream?stream_id=${encodeURIComponent(streamId)}`));
|
||||||
|
|
||||||
}
|
}
|
||||||
|
|
||||||
function transcript(){
|
function transcript(){
|
||||||
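The reconnect path above is deliberately one-shot: a dropped connection triggers at most one server-side status check, and the stream is re-attached only if the server still reports it active. A minimal sketch of that decision, independent of `EventSource` (the function name `reconnectAction` is illustrative, not part of the codebase):

```javascript
// One-shot reconnect policy: reconnect only on the first drop and only when
// the server still considers the stream active; otherwise surface the error.
function reconnectAction(alreadyAttempted, streamActive) {
  if (!alreadyAttempted && streamActive) return 'reconnect'; // re-open the SSE stream
  return 'fail'; // fall through to the error handler (inline message or banner)
}
```

In the real handler the status check happens asynchronously after a short delay, but the branch structure is the same.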
171 static/panels.js
@@ -597,4 +597,175 @@ document.addEventListener('dragenter',e=>{e.preventDefault();if(e.dataTransfer.t
document.addEventListener('dragleave',e=>{dragCounter--;if(dragCounter<=0){dragCounter=0;wrap.classList.remove('drag-over');}});
document.addEventListener('drop',e=>{e.preventDefault();dragCounter=0;wrap.classList.remove('drag-over');const files=Array.from(e.dataTransfer.files);if(files.length){addFiles(files);$('msg').focus();}});

// ── Settings panel ───────────────────────────────────────────────────────────

function toggleSettings(){
  const overlay=$('settingsOverlay');
  if(!overlay) return;
  if(overlay.style.display==='none'){
    overlay.style.display='';
    loadSettingsPanel();
  } else {
    overlay.style.display='none';
  }
}

async function loadSettingsPanel(){
  try{
    const settings=await api('/api/settings');
    // Populate model dropdown from /api/models
    const modelSel=$('settingsModel');
    if(modelSel){
      modelSel.innerHTML='';
      try{
        const models=await api('/api/models');
        for(const g of (models.groups||[])){
          const og=document.createElement('optgroup');
          og.label=g.provider;
          for(const m of g.models){
            const opt=document.createElement('option');
            opt.value=m.id;opt.textContent=m.label;
            og.appendChild(opt);
          }
          modelSel.appendChild(og);
        }
      }catch(e){}
      modelSel.value=settings.default_model||'';
    }
    // Populate workspace dropdown from /api/workspaces
    const wsSel=$('settingsWorkspace');
    if(wsSel){
      wsSel.innerHTML='';
      try{
        const wsData=await api('/api/workspaces');
        for(const w of (wsData.workspaces||[])){
          const opt=document.createElement('option');
          opt.value=w.path;opt.textContent=w.name||w.path;
          wsSel.appendChild(opt);
        }
      }catch(e){}
      wsSel.value=settings.default_workspace||'';
    }
  }catch(e){
    showToast('Failed to load settings: '+e.message);
  }
}

async function saveSettings(){
  const model=($('settingsModel')||{}).value;
  const workspace=($('settingsWorkspace')||{}).value;
  const body={};
  if(model) body.default_model=model;
  if(workspace) body.default_workspace=workspace;
  try{
    await api('/api/settings',{method:'POST',body:JSON.stringify(body)});
    showToast('Settings saved');
    toggleSettings();
  }catch(e){
    showToast('Save failed: '+e.message);
  }
}

// Close settings on overlay click (not panel click)
document.addEventListener('click',e=>{
  const overlay=$('settingsOverlay');
  if(overlay&&e.target===overlay) toggleSettings();
});
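The settings POST sends only the fields the user actually picked, so the server can merge rather than overwrite. That payload-building step, as a standalone sketch (`buildSettingsBody` is an illustrative helper name, not part of the codebase):

```javascript
// Build the /api/settings request body: omit unset fields so the server-side
// settings record is patched, not clobbered.
function buildSettingsBody(model, workspace) {
  const body = {};
  if (model) body.default_model = model;
  if (workspace) body.default_workspace = workspace;
  return body;
}
```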
// ── Cron completion alerts ────────────────────────────────────────────────────

let _cronPollSince=Date.now()/1000; // track from page load
let _cronPollTimer=null;
let _cronUnreadCount=0;

function startCronPolling(){
  if(_cronPollTimer) return;
  _cronPollTimer=setInterval(async()=>{
    if(document.hidden) return; // don't poll when tab is in background
    try{
      const data=await api(`/api/crons/recent?since=${_cronPollSince}`);
      if(data.completions&&data.completions.length>0){
        for(const c of data.completions){
          const icon=c.status==='error'?'\u274c':'\u2705';
          showToast(`${icon} Cron "${c.name}" ${c.status==='error'?'failed':'completed'}`,4000);
          _cronPollSince=Math.max(_cronPollSince,c.completed_at);
        }
        _cronUnreadCount+=data.completions.length;
        updateCronBadge();
      }
    }catch(e){}
  },30000);
}

function updateCronBadge(){
  const tab=document.querySelector('.nav-tab[data-panel="tasks"]');
  if(!tab) return;
  let badge=tab.querySelector('.cron-badge');
  if(_cronUnreadCount>0){
    if(!badge){
      badge=document.createElement('span');
      badge.className='cron-badge';
      tab.style.position='relative';
      tab.appendChild(badge);
    }
    badge.textContent=_cronUnreadCount>9?'9+':_cronUnreadCount;
    badge.style.display='';
  }else if(badge){
    badge.style.display='none';
  }
}

// Clear cron badge when Tasks tab is opened
const _origSwitchPanel=switchPanel;
switchPanel=async function(name){
  if(name==='tasks'){_cronUnreadCount=0;updateCronBadge();}
  return _origSwitchPanel(name);
};

// Start polling on page load
startCronPolling();
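The unread badge caps its label at "9+", mirroring the `updateCronBadge()` logic above as a tiny pure helper (`cronBadgeLabel` is an illustrative name):

```javascript
// Badge label for unread cron completions: hidden at zero, capped at "9+".
function cronBadgeLabel(unread) {
  if (unread <= 0) return ''; // badge hidden
  return unread > 9 ? '9+' : String(unread);
}
```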
// ── Background agent error tracking ──────────────────────────────────────────

const _backgroundErrors=[]; // {session_id, title, message, ts}

function trackBackgroundError(sessionId, title, message){
  // Only track if user is NOT currently viewing this session
  if(S.session&&S.session.session_id===sessionId) return;
  _backgroundErrors.push({session_id:sessionId, title:title||'Untitled', message, ts:Date.now()});
  showErrorBanner();
}

function showErrorBanner(){
  let banner=$('bgErrorBanner');
  if(!banner){
    banner=document.createElement('div');
    banner.id='bgErrorBanner';
    banner.className='bg-error-banner';
    const msgs=document.querySelector('.messages');
    if(msgs) msgs.parentNode.insertBefore(banner,msgs);
    else document.body.appendChild(banner);
  }
  const latest=_backgroundErrors[0]; // FIFO: show oldest (first) error
  if(!latest){banner.style.display='none';return;}
  const count=_backgroundErrors.length;
  banner.innerHTML=`<span>\u26a0 ${count>1?count+' sessions have':'"'+esc(latest.title)+'" has'} encountered an error</span><div style="display:flex;gap:6px;flex-shrink:0"><button class="reconnect-btn" onclick="navigateToErrorSession()">View</button><button class="reconnect-btn" onclick="dismissErrorBanner()">Dismiss</button></div>`;
  banner.style.display='';
}

function navigateToErrorSession(){
  const latest=_backgroundErrors.shift(); // FIFO: show oldest error first
  if(latest){
    loadSession(latest.session_id);renderSessionList();
  }
  if(_backgroundErrors.length===0) dismissErrorBanner();
  else showErrorBanner();
}

function dismissErrorBanner(){
  _backgroundErrors.length=0;
  const banner=$('bgErrorBanner');
  if(banner) banner.style.display='none';
}

// Event wiring
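The background-error banner drains its queue oldest-first: "View" navigates to the first error recorded and re-shows the banner if more remain. A DOM-free model of that FIFO behaviour (`drainOldest` is an illustrative name):

```javascript
// Pop the oldest background error; return it plus the remaining queue.
function drainOldest(errors) {
  const next = errors.length ? errors[0] : null;
  return [next, errors.slice(1)];
}
```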
@@ -55,6 +55,7 @@ async function loadSession(sid){

let _allSessions = []; // cached for search filter
let _renamingSid = null; // session_id currently being renamed (blocks list re-renders)
let _showArchived = false; // toggle to show archived sessions

async function renderSessionList(){
try{
@@ -92,14 +93,36 @@ function renderSessionListFromCache(){
  const titleMatches=q?_allSessions.filter(s=>(s.title||'Untitled').toLowerCase().includes(q)):_allSessions;
  // Merge content matches (deduped): content matches appended after title matches
  const titleIds=new Set(titleMatches.map(s=>s.session_id));
  const allMatched=q?[...titleMatches,..._contentSearchResults.filter(s=>!titleIds.has(s.session_id))]:titleMatches;
  // Filter archived unless toggle is on
  const sessions=_showArchived?allMatched:allMatched.filter(s=>!s.archived);
  const archivedCount=allMatched.filter(s=>s.archived).length;
  const list=$('sessionList');list.innerHTML='';
  // Show/hide archived toggle if there are archived sessions
  if(archivedCount>0){
    const toggle=document.createElement('div');
    toggle.style.cssText='font-size:10px;padding:4px 10px;color:var(--muted);cursor:pointer;text-align:center;opacity:.7;';
    toggle.textContent=_showArchived?'Hide archived':'Show '+archivedCount+' archived';
    toggle.onclick=()=>{_showArchived=!_showArchived;renderSessionListFromCache();};
    list.appendChild(toggle);
  }
  // Separate pinned from unpinned
  const pinned=sessions.filter(s=>s.pinned);
  const unpinned=sessions.filter(s=>!s.pinned);
  // Date grouping: Pinned / Today / Yesterday / Earlier
  const now=Date.now();
  const ONE_DAY=86400000;
  let lastGroup='';
  const ordered=[...pinned,...unpinned].slice(0,50);
  if(pinned.length){
    const hdr=document.createElement('div');
    hdr.style.cssText='font-size:10px;font-weight:700;text-transform:uppercase;letter-spacing:.08em;color:#f5c542;padding:10px 10px 4px;opacity:.9;';
    hdr.textContent='\u2605 Pinned';
    list.appendChild(hdr);
  }
  for(const s of ordered){
    if(!s.pinned){
      const ts=(s.updated_at||s.created_at||0)*1000; // group by last activity, not creation
      const group=ts>now-ONE_DAY?'Today':ts>now-2*ONE_DAY?'Yesterday':'Earlier';
      if(group!==lastGroup){
        lastGroup=group;
@@ -108,13 +131,31 @@ function renderSessionListFromCache(){
        hdr.textContent=group;
        list.appendChild(hdr);
      }
    }
    const el=document.createElement('div');
    const isActive=S.session&&s.session_id===S.session.session_id;
    el.className='session-item'+(isActive?' active':'')+(isActive&&S.session&&S.session._flash?' new-flash':'')+(s.archived?' archived':'');
    if(isActive&&S.session&&S.session._flash)delete S.session._flash;
    const rawTitle=s.title||'Untitled';
    const tags=(rawTitle.match(/#[\w-]+/g)||[]);
    const cleanTitle=tags.length?rawTitle.replace(/#[\w-]+/g,'').trim():rawTitle;
    const title=document.createElement('span');
    title.className='session-title';
    title.textContent=cleanTitle||'Untitled';
    title.title='Double-click to rename';
    // Append tag chips after the title text
    for(const tag of tags){
      const chip=document.createElement('span');
      chip.className='session-tag';
      chip.textContent=tag;
      chip.title='Click to filter by '+tag;
      chip.onclick=(e)=>{
        e.stopPropagation();
        const searchBox=$('sessionSearch');
        if(searchBox){searchBox.value=tag;filterSessions();}
      };
      title.appendChild(chip);
    }
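The tag chips are parsed straight out of the session title with the same `#[\w-]+` regex shown above. As a pure function it can be exercised without the DOM (`splitTitleTags` is an illustrative name):

```javascript
// Split a session title into its display text and its #hashtag chips.
function splitTitleTags(rawTitle) {
  const tags = rawTitle.match(/#[\w-]+/g) || [];
  const clean = tags.length ? rawTitle.replace(/#[\w-]+/g, '').trim() : rawTitle;
  return { clean, tags };
}
```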
    // Rename: called directly when we confirm it's a double-click
    const startRename=()=>{
@@ -149,10 +190,50 @@ function renderSessionListFromCache(){
      setTimeout(()=>{inp.focus();inp.select();},10);
    };

    const pin=document.createElement('span');
    pin.className='session-pin'+(s.pinned?' pinned':'');
    pin.innerHTML=s.pinned?'★':'☆';
    pin.title=s.pinned?'Unpin':'Pin to top';
    pin.onclick=async(e)=>{
      e.stopPropagation();e.preventDefault();
      const newPinned=!s.pinned;
      try{
        await api('/api/session/pin',{method:'POST',body:JSON.stringify({session_id:s.session_id,pinned:newPinned})});
        s.pinned=newPinned;
        if(S.session&&S.session.session_id===s.session_id) S.session.pinned=newPinned;
        renderSessionList();
      }catch(err){showToast('Pin failed: '+err.message);}
    };
    const archive=document.createElement('button');
    archive.className='session-action-btn';archive.innerHTML=s.archived?'✉':'📦';
    archive.title=s.archived?'Unarchive':'Archive';
    archive.onclick=async(e)=>{
      e.stopPropagation();e.preventDefault();
      try{
        await api('/api/session/archive',{method:'POST',body:JSON.stringify({session_id:s.session_id,archived:!s.archived})});
        s.archived=!s.archived;
        if(S.session&&S.session.session_id===s.session_id) S.session.archived=s.archived;
        await renderSessionList();
        showToast(s.archived?'Session archived':'Session restored');
      }catch(err){showToast('Archive failed: '+err.message);}
    };
    const dup=document.createElement('button');
    dup.className='session-dup';dup.innerHTML='⧉';dup.title='Duplicate';
    dup.onclick=async(e)=>{
      e.stopPropagation();e.preventDefault();
      try{
        const res=await api('/api/session/new',{method:'POST',body:JSON.stringify({workspace:s.workspace,model:s.model})});
        if(res.session){
          await api('/api/session/rename',{method:'POST',body:JSON.stringify({session_id:res.session.session_id,title:(s.title||'Untitled')+' (copy)'})});
          await loadSession(res.session.session_id);await renderSessionList();
          showToast('Session duplicated');
        }
      }catch(err){showToast('Duplicate failed: '+err.message);}
    };
    const trash=document.createElement('button');
    trash.className='session-trash';trash.innerHTML='🗑';trash.title='Delete';
    trash.onclick=async(e)=>{e.stopPropagation();e.preventDefault();await deleteSession(s.session_id);};
    el.appendChild(pin);el.appendChild(title);el.appendChild(archive);el.appendChild(dup);el.appendChild(trash);

    // Use a click timer to distinguish single-click (navigate) from double-click (rename).
    // This prevents loadSession from firing on the first click of a double-click,
@@ -160,7 +241,7 @@ function renderSessionListFromCache(){
    let _clickTimer=null;
    el.onclick=async(e)=>{
      if(_renamingSid) return; // ignore while any rename is active
      if([trash,dup,archive].some(b=>e.target===b||b.contains(e.target))) return;
      clearTimeout(_clickTimer);
      _clickTimer=setTimeout(async()=>{
        _clickTimer=null;
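The sidebar ordering in this hunk has two steps: pinned sessions float above everything, and unpinned ones are bucketed by last activity. A pure sketch of both decisions (function names are illustrative):

```javascript
const ONE_DAY = 86400000;

// Bucket a session by last-activity timestamp, matching the ternary above.
function dateGroup(updatedAtMs, nowMs) {
  if (updatedAtMs > nowMs - ONE_DAY) return 'Today';
  if (updatedAtMs > nowMs - 2 * ONE_DAY) return 'Yesterday';
  return 'Earlier';
}

// Pinned sessions first, original relative order otherwise preserved.
function orderSessions(sessions) {
  return [...sessions.filter(s => s.pinned), ...sessions.filter(s => !s.pinned)];
}
```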
@@ -448,3 +448,53 @@ body.resizing{user-select:none;cursor:col-resize;}
::-webkit-scrollbar-thumb{background:rgba(255,255,255,.12);border-radius:3px;}
::-webkit-scrollbar-thumb:hover{background:rgba(255,255,255,.22);}
* { scrollbar-width: thin; scrollbar-color: rgba(255,255,255,.12) transparent; }

/* ── Settings overlay ── */
.settings-overlay{position:fixed;inset:0;background:rgba(0,0,0,.5);z-index:1000;display:flex;align-items:center;justify-content:center;}
.settings-panel{background:var(--bg);border:1px solid var(--border);border-radius:12px;padding:0;width:380px;max-width:90vw;max-height:80vh;overflow-y:auto;box-shadow:0 12px 40px rgba(0,0,0,.5);}
.settings-header{display:flex;align-items:center;justify-content:space-between;padding:16px 20px 12px;border-bottom:1px solid var(--border);}
.settings-body{padding:20px;}
.settings-field{margin-bottom:16px;}
.settings-field label{display:block;font-size:11px;font-weight:600;letter-spacing:.05em;text-transform:uppercase;color:var(--muted);margin-bottom:6px;}
/* Save button inside the settings panel */
.settings-panel .settings-btn{background:var(--accent);color:#fff;border:none;border-radius:6px;padding:8px 16px;cursor:pointer;font-weight:600;font-size:13px;}
.settings-panel .settings-btn:hover{opacity:.9;}
/* Gear icon in topbar -- muted chip, no red */
.gear-btn{font-size:13px;cursor:pointer;transition:color .15s,background .15s;}
.gear-btn:hover{color:var(--text);background:rgba(255,255,255,.08);}

/* ── Session pin star ── */
.session-pin{font-size:12px;cursor:pointer;opacity:0;transition:opacity .15s;padding:2px 4px;flex-shrink:0;}
.session-item:hover .session-pin,.session-pin.pinned{opacity:1;}
.session-pin.pinned{color:#f5c542;}

/* ── Session duplicate button ── */
.session-dup,.session-action-btn{background:none;border:none;color:var(--muted);font-size:11px;cursor:pointer;opacity:0;transition:opacity .15s;padding:2px 4px;flex-shrink:0;}
.session-item:hover .session-dup,.session-item:hover .session-action-btn{opacity:1;}
.session-dup:hover,.session-action-btn:hover{color:var(--text);}

/* ── Cron alert badge ── */
.cron-badge{position:absolute;top:2px;right:2px;background:#e53e3e;color:#fff;font-size:9px;font-weight:700;min-width:14px;height:14px;line-height:14px;text-align:center;border-radius:7px;padding:0 3px;}

/* ── Background error banner ── */
.bg-error-banner{background:rgba(229,62,62,.15);border:1px solid rgba(229,62,62,.3);color:#fca5a5;padding:8px 16px;font-size:12px;display:flex;align-items:center;justify-content:space-between;gap:12px;border-radius:0;}

/* ── Archived sessions ── */
.session-item.archived{opacity:.5;}
.session-item.archived .session-title{font-style:italic;}

/* ── Session tags ── */
.session-tag{display:inline-block;font-size:9px;font-weight:600;padding:1px 5px;margin-left:4px;border-radius:3px;background:rgba(99,179,237,.2);color:#63b3ed;cursor:pointer;vertical-align:middle;}
.session-tag:hover{background:rgba(99,179,237,.35);}

/* ── File rename input ── */
.file-rename-input{background:rgba(255,255,255,.08);border:1px solid var(--accent);border-radius:4px;color:var(--text);font-size:12px;padding:1px 4px;width:100%;outline:none;min-width:0;}

/* ── Message timestamps ── */
.msg-time{font-size:10px;color:var(--muted);opacity:.6;margin-left:6px;}
.msg-role:hover .msg-time{opacity:1;}

/* ── Mermaid diagrams ── */
.mermaid-block{background:var(--code-bg);border-radius:8px;padding:16px;margin:8px 0;overflow-x:auto;}
.mermaid-rendered{background:transparent;padding:8px 0;}
.mermaid-rendered svg{max-width:100%;height:auto;}
194 static/ui.js
@@ -4,8 +4,88 @@ const MSG_QUEUE=[]; // messages queued while a request is in-flight
const $=id=>document.getElementById(id);
const esc=s=>String(s??'').replace(/[&<>"']/g,c=>({'&':'&amp;','<':'&lt;','>':'&gt;','"':'&quot;',"'":'&#39;'}[c]));

// Dynamic model labels -- populated by populateModelDropdown(), fallback to static map
let _dynamicModelLabels={};

async function populateModelDropdown(){
  const sel=$('modelSelect');
  if(!sel) return;
  try{
    const data=await fetch('/api/models').then(r=>r.json());
    if(!data.groups||!data.groups.length) return; // keep HTML defaults
    // Clear existing options
    sel.innerHTML='';
    _dynamicModelLabels={};
    for(const g of data.groups){
      const og=document.createElement('optgroup');
      og.label=g.provider;
      for(const m of g.models){
        const opt=document.createElement('option');
        opt.value=m.id;
        opt.textContent=m.label;
        og.appendChild(opt);
        _dynamicModelLabels[m.id]=m.label;
      }
      sel.appendChild(og);
    }
    // Set default model from server if no localStorage preference
    if(data.default_model && !localStorage.getItem('hermes-webui-model')){
      sel.value=data.default_model;
      // If the default isn't in the list, add it
      if(sel.value!==data.default_model){
        const opt=document.createElement('option');
        opt.value=data.default_model;
        opt.textContent=data.default_model.split('/').pop();
        sel.insertBefore(opt,sel.firstChild);
        sel.value=data.default_model;
      }
    }
  }catch(e){
    // API unavailable -- keep the hardcoded HTML options as fallback
    console.warn('Failed to load models from server:',e.message);
  }
}
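Besides filling the dropdown, `populateModelDropdown()` flattens the grouped `/api/models` response into an id-to-label map for the topbar chip. That flattening step, as a self-contained sketch (`buildModelLabels` is an illustrative name):

```javascript
// Flatten provider groups from /api/models into an {id: label} lookup.
function buildModelLabels(groups) {
  const labels = {};
  for (const g of groups) {
    for (const m of g.models) labels[m.id] = m.label;
  }
  return labels;
}
```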
// ── Scroll pinning ──────────────────────────────────────────────────────────
// When streaming, auto-scroll only if the user hasn't manually scrolled up.
// Once the user scrolls back to within 80px of the bottom, re-pin.
let _scrollPinned=true;
(function(){
  const el=document.getElementById('messages');
  if(!el) return;
  el.addEventListener('scroll',()=>{
    const nearBottom=el.scrollHeight-el.scrollTop-el.clientHeight<80;
    _scrollPinned=nearBottom;
  });
})();
function scrollIfPinned(){
  if(!_scrollPinned) return;
  const el=$('messages');
  if(el) el.scrollTop=el.scrollHeight;
}
function scrollToBottom(){
  _scrollPinned=true;
  const el=$('messages');
  if(el) el.scrollTop=el.scrollHeight;
}
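The pinning rule above reduces to a single inequality on the scroll metrics: stay pinned while the viewport bottom is within 80px of the content bottom. Extracted as a pure predicate (`isNearBottom` is an illustrative name):

```javascript
// True when the scroll position is within `threshold` px of the bottom.
function isNearBottom(scrollHeight, scrollTop, clientHeight, threshold = 80) {
  return scrollHeight - scrollTop - clientHeight < threshold;
}
```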
function getModelLabel(modelId){
  if(!modelId) return 'Unknown';
  // Check dynamic labels first, then fall back to splitting the ID
  if(_dynamicModelLabels[modelId]) return _dynamicModelLabels[modelId];
  // Static fallback for common models
  const STATIC_LABELS={'openai/gpt-5.4-mini':'GPT-5.4 Mini','openai/gpt-4o':'GPT-4o','openai/o3':'o3','openai/o4-mini':'o4-mini','anthropic/claude-sonnet-4.6':'Sonnet 4.6','anthropic/claude-sonnet-4-5':'Sonnet 4.5','anthropic/claude-haiku-3-5':'Haiku 3.5','google/gemini-2.5-pro':'Gemini 2.5 Pro','deepseek/deepseek-chat-v3-0324':'DeepSeek V3','meta-llama/llama-4-scout':'Llama 4 Scout'};
  if(STATIC_LABELS[modelId]) return STATIC_LABELS[modelId];
  return modelId.split('/').pop()||'Unknown';
}
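When neither the dynamic nor the static map knows a model id, `getModelLabel()` falls back to the segment after the final "/". That last branch in isolation (`fallbackLabel` is an illustrative name):

```javascript
// Last-resort display label: everything after the final slash of the model id.
function fallbackLabel(modelId) {
  if (!modelId) return 'Unknown';
  return modelId.split('/').pop() || 'Unknown';
}
```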
function renderMd(raw){
|
function renderMd(raw){
|
||||||
let s=raw||'';
|
let s=raw||'';
|
||||||
|
// Mermaid blocks: render as diagram containers (processed after DOM insertion)
|
||||||
|
s=s.replace(/```mermaid\n?([\s\S]*?)```/g,(_,code)=>{
|
||||||
|
const id='mermaid-'+Math.random().toString(36).slice(2,10);
|
||||||
|
return `<div class="mermaid-block" data-mermaid-id="${id}">${esc(code.trim())}</div>`;
|
||||||
|
});
|
||||||
s=s.replace(/```([\w+-]*)\n?([\s\S]*?)```/g,(_,lang,code)=>{const h=lang?`<div class="pre-header">${esc(lang)}</div>`:'';return `${h}<pre><code>${esc(code.replace(/\n$/,''))}</code></pre>`;});
|
s=s.replace(/```([\w+-]*)\n?([\s\S]*?)```/g,(_,lang,code)=>{const h=lang?`<div class="pre-header">${esc(lang)}</div>`:'';return `${h}<pre><code>${esc(code.replace(/\n$/,''))}</code></pre>`;});
|
||||||
s=s.replace(/`([^`\n]+)`/g,(_,c)=>`<code>${esc(c)}</code>`);
|
s=s.replace(/`([^`\n]+)`/g,(_,c)=>`<code>${esc(c)}</code>`);
|
||||||
s=s.replace(/\*\*\*(.+?)\*\*\*/g,'<strong><em>$1</em></strong>');
|
s=s.replace(/\*\*\*(.+?)\*\*\*/g,'<strong><em>$1</em></strong>');
|
||||||
@@ -174,6 +254,7 @@ async function checkInflightOnBoot(sid) {
|
|||||||
|
|
||||||
function syncTopbar(){
|
function syncTopbar(){
|
||||||
if(!S.session){
|
if(!S.session){
|
||||||
|
document.title='Hermes';
|
||||||
// Show default workspace name even without a session
|
// Show default workspace name even without a session
|
||||||
const sidebarName=$('sidebarWsName');
|
const sidebarName=$('sidebarWsName');
|
||||||
if(sidebarName && sidebarName.textContent==='Workspace'){
|
if(sidebarName && sidebarName.textContent==='Workspace'){
|
||||||
@@ -181,17 +262,26 @@ function syncTopbar(){
 }
 return;
 }
-$('topbarTitle').textContent=S.session.title||'Untitled';
+const sessionTitle=S.session.title||'Untitled';
+$('topbarTitle').textContent=sessionTitle;
+document.title=sessionTitle+' \u2014 Hermes';
 const vis=S.messages.filter(m=>m&&m.role&&m.role!=='tool');
 $('topbarMeta').textContent=`${vis.length} messages`;
 const m=S.session.model||'';
-const MODEL_LABELS={'openai/gpt-5.4-mini':'GPT-5.4 Mini','openai/gpt-4o':'GPT-4o','openai/o3':'o3','openai/o4-mini':'o4-mini','anthropic/claude-sonnet-4.6':'Sonnet 4.6','anthropic/claude-sonnet-4-5':'Sonnet 4.5','anthropic/claude-haiku-3-5':'Haiku 3.5','google/gemini-2.5-pro':'Gemini 2.5 Pro','deepseek/deepseek-chat-v3-0324':'DeepSeek V3','meta-llama/llama-4-scout':'Llama 4 Scout'};
 $('modelSelect').value=m; // set dropdown first so chip reads consistent value
+// If session model isn't in the dropdown, add it dynamically
+if(m && $('modelSelect').value!==m){
+const opt=document.createElement('option');
+opt.value=m;
+opt.textContent=getModelLabel(m);
+$('modelSelect').appendChild(opt);
+$('modelSelect').value=m;
+}
 // Show Clear button only when session has messages
 const clearBtn=$('btnClearConv');
-if(clearBtn) clearBtn.style.display=(S.messages&&S.messages.filter(m=>m.role!=='tool').length>0)?'':'none';
+if(clearBtn) clearBtn.style.display=(S.messages&&S.messages.filter(msg=>msg.role!=='tool').length>0)?'':'none';
 const displayModel=$('modelSelect').value||m;
-$('modelChip').textContent=MODEL_LABELS[displayModel]||(displayModel.split('/').pop()||'Unknown');
+$('modelChip').textContent=getModelLabel(displayModel);
 const ws=S.session.workspace||'';
 $('wsChip').textContent=ws.split('/').slice(-2).join('/')||ws;
 // Update workspace chip in topbar with friendly name from workspace list
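getModelLabel() is called in the hunk above but defined elsewhere in the file. A plausible shape, inferred from the removed inline MODEL_LABELS map and its old `split('/').pop()||'Unknown'` fallback (a sketch under those assumptions, not the actual implementation), mirrored here in Python:

```python
# Hypothetical mirror of the getModelLabel() helper referenced above.
# Entries come from the removed inline map; the fallback strips the
# provider prefix, as the old modelChip code did.
MODEL_LABELS = {
    "openai/gpt-4o": "GPT-4o",
    "anthropic/claude-sonnet-4-5": "Sonnet 4.5",
}

def get_model_label(model_id: str) -> str:
    if model_id in MODEL_LABELS:
        return MODEL_LABELS[model_id]
    # "provider/model" -> "model"; empty input falls through to "Unknown"
    return model_id.split("/")[-1] or "Unknown"

print(get_model_label("openai/gpt-4o"))           # GPT-4o
print(get_model_label("deepseek/deepseek-chat"))  # deepseek-chat
```

Keeping the lookup behind one function is what lets the dropdown code above label models that were never in the static map.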
@@ -250,7 +340,9 @@ function renderMessages(){
 // Action buttons for this bubble
 const editBtn = isUser ? `<button class="msg-action-btn" title="Edit message" onclick="editMessage(this)">✎</button>` : '';
 const retryBtn = isLastAssistant ? `<button class="msg-action-btn" title="Regenerate response" onclick="regenerateResponse(this)">↻</button>` : '';
-row.innerHTML=`<div class="msg-role ${m.role}"><div class="role-icon ${m.role}">${isUser?'Y':'H'}</div><span style="font-size:12px">${isUser?'You':'Hermes'}</span><span class="msg-actions">${editBtn}<button class="msg-copy-btn msg-action-btn" title="Copy" onclick="copyMsg(this)">📋</button>${retryBtn}</span></div>${filesHtml}<div class="msg-body">${bodyHtml}</div>`;
+const tsVal=m._ts||m.timestamp;
+const tsTitle=tsVal?new Date(tsVal*1000).toLocaleString():'';
+row.innerHTML=`<div class="msg-role ${m.role}" ${tsTitle?`title="${esc(tsTitle)}"`:''}><div class="role-icon ${m.role}">${isUser?'Y':'H'}</div><span style="font-size:12px">${isUser?'You':'Hermes'}</span>${tsTitle?`<span class="msg-time">${new Date(tsVal*1000).toLocaleTimeString([],{hour:'2-digit',minute:'2-digit'})}</span>`:''}<span class="msg-actions">${editBtn}<button class="msg-copy-btn msg-action-btn" title="Copy" onclick="copyMsg(this)">📋</button>${retryBtn}</span></div>${filesHtml}<div class="msg-body">${bodyHtml}</div>`;
 row.dataset.rawText = String(content).trim();
 inner.appendChild(row);
 }
@@ -286,9 +378,9 @@ function renderMessages(){
 else inner.appendChild(frag);
 }
 }
-$('messages').scrollTop=$('messages').scrollHeight;
+scrollToBottom();
 // Apply syntax highlighting after DOM is built
-requestAnimationFrame(()=>highlightCode());
+requestAnimationFrame(()=>{highlightCode();renderMermaidBlocks();});
 // Refresh todo panel if it's currently open
 if(typeof loadTodos==='function' && document.getElementById('panelTodos') && document.getElementById('panelTodos').classList.contains('active')){
 loadTodos();
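scrollToBottom() replaces the old unconditional scrollTop assignment; per the commit message, Sprint 11's scroll pinning means it only follows the stream when the user is already at the bottom. The decision can be sketched as a pure predicate (the function name and 80px threshold are assumptions, not the actual implementation):

```python
def should_autoscroll(scroll_top: int, client_height: int,
                      scroll_height: int, threshold: int = 80) -> bool:
    """True only when the viewport is already near the bottom of the log."""
    distance_from_bottom = scroll_height - (scroll_top + client_height)
    return distance_from_bottom <= threshold

# Pinned near the bottom: keep following the stream.
print(should_autoscroll(920, 600, 1540))   # True  (20px from bottom)
# User scrolled up to read: do not yank the view back down.
print(should_autoscroll(100, 600, 1540))   # False
```

Checking the distance before every scroll is what stops streaming tokens from fighting a reader who scrolled up.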
@@ -463,15 +555,54 @@ function highlightCode(container) {
 if(typeof Prism === 'undefined' || !Prism.highlightAllUnder) return;
 const el = container || $('msgInner');
 if(!el) return;
-// Prism autoloader handles language detection via class="language-xxx"
 Prism.highlightAllUnder(el);
 }
 
+let _mermaidLoading=false;
+let _mermaidReady=false;
+
+function renderMermaidBlocks(){
+const blocks=document.querySelectorAll('.mermaid-block:not([data-rendered])');
+if(!blocks.length) return;
+if(!_mermaidReady){
+if(!_mermaidLoading){
+_mermaidLoading=true;
+const script=document.createElement('script');
+script.src='https://cdn.jsdelivr.net/npm/mermaid@10/dist/mermaid.min.js';
+script.onload=()=>{
+if(typeof mermaid!=='undefined'){
+mermaid.initialize({startOnLoad:false,theme:'dark',themeVariables:{
+primaryColor:'#4a6fa5',primaryTextColor:'#e2e8f0',lineColor:'#718096',
+secondaryColor:'#2d3748',tertiaryColor:'#1a202c',primaryBorderColor:'#4a5568',
+}});
+_mermaidReady=true;
+renderMermaidBlocks();
+}
+};
+document.head.appendChild(script);
+}
+return;
+}
+blocks.forEach(async(block)=>{
+block.dataset.rendered='true';
+const code=block.textContent;
+const id=block.dataset.mermaidId||('m-'+Math.random().toString(36).slice(2));
+try{
+const {svg}=await mermaid.render(id,code);
+block.innerHTML=svg;
+block.classList.add('mermaid-rendered');
+}catch(e){
+// Fall back to showing as a code block
+block.innerHTML=`<div class="pre-header">mermaid</div><pre><code>${esc(code)}</code></pre>`;
+}
+});
+}
+
 function appendThinking(){
 $('emptyState').style.display='none';
 const row=document.createElement('div');row.className='msg-row';row.id='thinkingRow';
 row.innerHTML=`<div class="msg-role assistant"><div class="role-icon assistant">H</div>Hermes</div><div class="thinking"><div class="dot"></div><div class="dot"></div><div class="dot"></div></div>`;
-$('msgInner').appendChild(row);$('messages').scrollTop=$('messages').scrollHeight;
+$('msgInner').appendChild(row);scrollToBottom();
 }
 function removeThinking(){const el=$('thinkingRow');if(el)el.remove();}
 
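The _mermaidLoading/_mermaidReady pair above is a load-once guard: the first call injects the CDN script and bails out, and the onload callback flips the ready flag and re-invokes the renderer. A DOM-free Python sketch of the same state machine (class and method names are illustrative; the real code re-queries the DOM instead of queuing blocks explicitly):

```python
class LazyRenderer:
    """Defer rendering until a one-time loader has completed."""

    def __init__(self, loader):
        self._loader = loader        # injects the library, later calls on_loaded()
        self._loading = False
        self._ready = False
        self._pending = []
        self.rendered = []

    def render(self, blocks):
        if not self._ready:
            if not self._loading:    # inject the script at most once
                self._loading = True
                self._loader(self)
            self._pending = blocks   # remember what to draw once ready
            return
        self.rendered.extend(blocks)

    def on_loaded(self):             # the script's onload handler
        self._ready = True
        self.render(self._pending)   # re-run with the queued blocks

r = LazyRenderer(loader=lambda self: None)  # load completes asynchronously
r.render(["diagram-1"])
assert r.rendered == []                      # queued; library not ready yet
r.on_loaded()                                # async load finishes
assert r.rendered == ["diagram-1"]
```

The early `return` in the not-ready branch is the key detail: without it, render would touch a library that has not finished loading.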
@@ -501,7 +632,37 @@ function renderFileTree(){
 
 // Name -- takes all remaining space, truncates with ellipsis
 const nameEl=document.createElement('span');
-nameEl.className='file-name';nameEl.textContent=item.name;nameEl.title=item.name;
+nameEl.className='file-name';nameEl.textContent=item.name;nameEl.title='Double-click to rename';
+// Inline rename on double-click
+nameEl.ondblclick=(e)=>{
+e.stopPropagation();
+const inp=document.createElement('input');
+inp.className='file-rename-input';inp.value=item.name;
+inp.onclick=(e2)=>e2.stopPropagation();
+const finish=async(save)=>{
+inp.onblur=null; // prevent double-call: Enter triggers blur after replaceWith
+if(save){
+const newName=inp.value.trim();
+if(newName&&newName!==item.name){
+try{
+await api('/api/file/rename',{method:'POST',body:JSON.stringify({
+session_id:S.session.session_id,path:item.path,new_name:newName
+})});
+showToast(`Renamed to ${newName}`);
+await loadDir('.');
+}catch(err){showToast('Rename failed: '+err.message);}
+}
+}
+inp.replaceWith(nameEl);
+};
+inp.onkeydown=(e2)=>{
+if(e2.key==='Enter'){e2.preventDefault();finish(true);}
+if(e2.key==='Escape'){e2.preventDefault();finish(false);}
+};
+inp.onblur=()=>finish(false);
+nameEl.replaceWith(inp);
+setTimeout(()=>{inp.focus();inp.select();},10);
+};
 el.appendChild(nameEl);
 
 // Size -- only for files, right-aligned, shrinks but never wraps
@@ -512,7 +673,7 @@ function renderFileTree(){
 el.appendChild(sizeEl);
 }
 
-// Delete button -- only for files, shown as a CSS class toggle on hover
+// Delete button -- for files, shown on hover
 if(item.type==='file'){
 const del=document.createElement('button');
 del.className='file-del-btn';del.title='Delete';del.textContent='×';
@@ -550,6 +711,17 @@ async function promptNewFile(){
 }catch(e){setStatus('Create failed: '+e.message);}
 }
 
+async function promptNewFolder(){
+if(!S.session)return;
+const name=prompt('New folder name:','');
+if(!name||!name.trim())return;
+try{
+await api('/api/file/create-dir',{method:'POST',body:JSON.stringify({session_id:S.session.session_id,path:name.trim()})});
+showToast(`Created folder ${name.trim()}`);
+await loadDir('.');
+}catch(e){setStatus('Create folder failed: '+e.message);}
+}
+
 function renderTray(){
 const tray=$('attachTray');tray.innerHTML='';
 if(!S.pendingFiles.length){tray.classList.remove('has-files');return;}
@@ -156,14 +156,17 @@ def test_cancel_nonexistent_stream_returns_not_cancelled(cleanup_test_sessions):
 
 
 def test_server_py_sse_loop_breaks_on_cancel(cleanup_test_sessions):
-    """R5b: server.py SSE loop must include 'cancel' in the break condition.
+    """R5b: SSE loop must include 'cancel' in the break condition.
     When missing, the connection hung after the cancel event was processed.
+    Sprint 11: logic moved from server.py to api/routes.py -- check both.
     """
-    src = (REPO_ROOT / "server.py").read_text()
-    # Find the SSE break condition
     import re
-    m = re.search(r"if event in \([^)]+\):\s*break", src)
-    assert m, "SSE break condition not found in server.py"
+    # Check server.py first, then api/routes.py (Sprint 11 extracted routes)
+    src = (REPO_ROOT / "server.py").read_text()
+    routes_src = (REPO_ROOT / "api" / "routes.py").read_text() if (REPO_ROOT / "api" / "routes.py").exists() else ""
+    combined = src + routes_src
+    m = re.search(r"if event in \([^)]+\):\s*break", combined)
+    assert m, "SSE break condition not found in server.py or api/routes.py"
    assert "cancel" in m.group(), \
        f"'cancel' missing from SSE break condition: {m.group()}"
 
@@ -275,16 +278,21 @@ def test_deleted_session_does_not_appear_in_list(cleanup_test_sessions):
 
 
 def test_server_delete_invalidates_index(cleanup_test_sessions):
-    """R8b: server.py session/delete handler must unlink _index.json.
+    """R8b: session/delete handler must unlink _index.json.
     Static check that the fix is in place.
+    Sprint 11: handler moved from server.py to api/routes.py -- check both.
     """
     src = (REPO_ROOT / "server.py").read_text()
-    # Find the delete handler and verify it unlinks the index
-    delete_idx = src.find("if parsed.path == '/api/session/delete':")
-    assert delete_idx >= 0, "session/delete handler not found"
-    delete_block = src[delete_idx:delete_idx+600]
-    assert "SESSION_INDEX_FILE" in delete_block, "server.py session/delete must invalidate SESSION_INDEX_FILE"
+    routes_src = (REPO_ROOT / "api" / "routes.py").read_text() if (REPO_ROOT / "api" / "routes.py").exists() else ""
+    # Find the delete handler in either file
+    for label, text in [("server.py", src), ("api/routes.py", routes_src)]:
+        delete_idx = text.find("if parsed.path == '/api/session/delete':")
+        if delete_idx >= 0:
+            delete_block = text[delete_idx:delete_idx+600]
+            assert "SESSION_INDEX_FILE" in delete_block, \
+                f"{label} session/delete must invalidate SESSION_INDEX_FILE"
+            return
+    assert False, "session/delete handler not found in server.py or api/routes.py"
 
 
 # ── R9: Token/tool SSE events write to wrong session after switch ─────────────
 
@@ -292,25 +300,36 @@ def test_token_handler_guards_session_id(cleanup_test_sessions):
     """R9a: The SSE token event handler must check activeSid before writing to DOM.
     When missing, tokens from session A would render into session B's message area
     if the user switched sessions mid-stream.
+    Sprint 12: handler moved into _wireSSE(source), so search source.addEventListener.
     """
     src = (REPO_ROOT / "static/messages.js").read_text()
-    # Find the token event handler
-    token_idx = src.find("es.addEventListener('token'")
+    # Sprint 12 refactored es.addEventListener -> source.addEventListener inside _wireSSE()
+    token_idx = src.find("source.addEventListener('token'")
+    if token_idx < 0:
+        token_idx = src.find("es.addEventListener('token'")
     assert token_idx >= 0, "token event handler not found"
     token_block = src[token_idx:token_idx+300]
-    assert "activeSid" in token_block, "token handler must check activeSid before writing to DOM"
-    assert "S.session.session_id!==activeSid" in token_block or "S.session.session_id===activeSid" in token_block, "token handler must compare current session to activeSid"
+    assert "activeSid" in token_block, \
+        "token handler must check activeSid before writing to DOM"
+    assert "S.session.session_id!==activeSid" in token_block or \
+        "S.session.session_id===activeSid" in token_block, \
+        "token handler must compare current session to activeSid"
 
 
 def test_tool_handler_guards_session_id(cleanup_test_sessions):
     """R9b: The SSE tool event handler must check activeSid before writing to DOM.
     When missing, tool cards from session A would render into session B's message area.
+    Sprint 12: handler moved into _wireSSE(source), so search source.addEventListener.
     """
     src = (REPO_ROOT / "static/messages.js").read_text()
-    tool_idx = src.find("es.addEventListener('tool'")
+    tool_idx = src.find("source.addEventListener('tool'")
+    if tool_idx < 0:
+        tool_idx = src.find("es.addEventListener('tool'")
     assert tool_idx >= 0, "tool event handler not found"
     tool_block = src[tool_idx:tool_idx+400]
-    assert "activeSid" in tool_block, "tool handler must check activeSid before writing to DOM"
+    assert "activeSid" in tool_block, \
+        "tool handler must check activeSid before writing to DOM"
 
 
 # ── R10: respondApproval uses wrong session_id after switch (multi-session) ─
 
@@ -337,7 +356,9 @@ def test_tool_status_only_shown_for_current_session(cleanup_test_sessions):
     When missing, session A's tool names would appear in session B's activity bar.
     """
     src = (REPO_ROOT / "static/messages.js").read_text()
-    # Find the tool event handler
-    tool_idx = src.find("es.addEventListener('tool'")
+    # Sprint 12: handler moved into _wireSSE(source)
+    tool_idx = src.find("source.addEventListener('tool'")
+    if tool_idx < 0:
+        tool_idx = src.find("es.addEventListener('tool'")
     assert tool_idx >= 0
     tool_block = src[tool_idx:tool_idx+400]
@@ -347,8 +368,8 @@ def test_tool_status_only_shown_for_current_session(cleanup_test_sessions):
     assert guard_pos >= 0, "tool handler must guard with activeSid check"
     # The guard must appear BEFORE or AROUND the setStatus call
     # (status only fires for the current session)
-    assert status_pos > tool_block.find("activeSid"), "setStatus in tool handler must be inside the activeSid guard"
+    assert status_pos > tool_block.find("activeSid"), \
+        "setStatus in tool handler must be inside the activeSid guard"
 
 
 # ── R12: Live tool cards lost on switch-away and switch-back ──────────────
 
@@ -375,6 +396,9 @@ def test_done_handler_sets_busy_false_before_renderMessages(cleanup_test_session
     tool cards are skipped entirely after a response completes.
     """
     src = (REPO_ROOT / "static/messages.js").read_text()
-    done_idx = src.find("es.addEventListener('done'")
+    # Sprint 12: handler moved into _wireSSE(source)
+    done_idx = src.find("source.addEventListener('done'")
+    if done_idx < 0:
+        done_idx = src.find("es.addEventListener('done'")
     assert done_idx >= 0
     done_block = src[done_idx:done_idx+1500]
@@ -1,5 +1,5 @@
 """
-Sprint 1 test suite for the Hermes WebUI.
+Sprint 1 test suite for the Hermes Web UI.
 
 Tests use the ISOLATED test server running on http://127.0.0.1:8788.
 Production server (port 8787) and your real conversations are never touched.
96  tests/test_sprint11.py  Normal file
@@ -0,0 +1,96 @@
+"""
+Sprint 11 Tests: multi-provider model support, streaming smoothness, routes extraction.
+"""
+import json, pathlib, urllib.error, urllib.request, urllib.parse
+REPO_ROOT = pathlib.Path(__file__).parent.parent.resolve()
+
+BASE = "http://127.0.0.1:8788"
+
+def get(path):
+    with urllib.request.urlopen(BASE + path, timeout=10) as r:
+        return json.loads(r.read()), r.status
+
+def post(path, body=None):
+    data = json.dumps(body or {}).encode()
+    req = urllib.request.Request(BASE + path, data=data,
+                                 headers={"Content-Type": "application/json"})
+    try:
+        with urllib.request.urlopen(req, timeout=10) as r:
+            return json.loads(r.read()), r.status
+    except urllib.error.HTTPError as e:
+        return json.loads(e.read()), e.code
+
+
+# ── /api/models endpoint ──────────────────────────────────────────────────
+
+def test_models_endpoint_returns_200():
+    """GET /api/models returns a valid response."""
+    d, status = get("/api/models")
+    assert status == 200
+
+def test_models_has_required_fields():
+    """Response includes groups, default_model, and active_provider."""
+    d, _ = get("/api/models")
+    assert 'groups' in d
+    assert 'default_model' in d
+    assert 'active_provider' in d
+
+def test_models_groups_structure():
+    """Each group has provider name and models list."""
+    d, _ = get("/api/models")
+    assert isinstance(d['groups'], list)
+    assert len(d['groups']) > 0
+    for group in d['groups']:
+        assert 'provider' in group
+        assert 'models' in group
+        assert isinstance(group['models'], list)
+        assert len(group['models']) > 0
+
+def test_models_model_structure():
+    """Each model has id and label."""
+    d, _ = get("/api/models")
+    for group in d['groups']:
+        for model in group['models']:
+            assert 'id' in model
+            assert 'label' in model
+            assert isinstance(model['id'], str)
+            assert isinstance(model['label'], str)
+            assert len(model['id']) > 0
+            assert len(model['label']) > 0
+
+def test_models_default_model_not_empty():
+    """Default model should be a non-empty string."""
+    d, _ = get("/api/models")
+    assert isinstance(d['default_model'], str)
+    assert len(d['default_model']) > 0
+
+def test_models_at_least_one_provider():
+    """At least one provider group should exist (fallback list at minimum)."""
+    d, _ = get("/api/models")
+    providers = [g['provider'] for g in d['groups']]
+    assert len(providers) >= 1
+
+def test_models_no_duplicate_ids():
+    """Model IDs should not be duplicated within a single group."""
+    d, _ = get("/api/models")
+    for group in d['groups']:
+        ids = [m['id'] for m in group['models']]
+        assert len(ids) == len(set(ids)), f"Duplicate model IDs in {group['provider']}: {ids}"
+
+def test_session_preserves_unlisted_model():
+    """A session with a model not in the dropdown should still load correctly."""
+    # Create a session with a custom model string
+    d, _ = post("/api/session/new", {})
+    sid = d['session']['session_id']
+    try:
+        custom_model = 'custom-provider/test-model-999'
+        post("/api/session/update", {
+            'session_id': sid,
+            'model': custom_model,
+            'workspace': d['session']['workspace']
+        })
+        # Reload and verify model persisted
+        d2, _ = get(f"/api/session?session_id={sid}")
+        assert d2['session']['model'] == custom_model
+    finally:
+        post("/api/session/delete", {'session_id': sid})
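The Sprint 11 tests above pin down the /api/models response contract. A minimal payload satisfying every shape assertion (provider names and model IDs here are illustrative, not the server's actual catalog):

```python
# Hypothetical /api/models payload matching the shape the tests assert.
payload = {
    "groups": [
        {"provider": "OpenAI", "models": [
            {"id": "openai/gpt-4o", "label": "GPT-4o"},
            {"id": "openai/o3", "label": "o3"},
        ]},
        {"provider": "Anthropic", "models": [
            {"id": "anthropic/claude-sonnet-4-5", "label": "Sonnet 4.5"},
        ]},
    ],
    "default_model": "openai/gpt-4o",
    "active_provider": "OpenAI",
}

# The same invariants the tests check:
assert payload["default_model"]                    # non-empty default
for group in payload["groups"]:
    ids = [m["id"] for m in group["models"]]
    assert ids and len(ids) == len(set(ids))       # non-empty, no duplicate IDs

print(payload["active_provider"])  # OpenAI
```

Grouping by provider is what lets the dropdown render one labelled section per configured API key.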
179  tests/test_sprint12.py  Normal file
@@ -0,0 +1,179 @@
+"""
+Sprint 12 Tests: settings panel, session pinning, session import, SSE reconnect.
+"""
+import json, pathlib, urllib.error, urllib.request, urllib.parse
+
+BASE = "http://127.0.0.1:8788"
+
+
+def get(path):
+    with urllib.request.urlopen(BASE + path, timeout=10) as r:
+        return json.loads(r.read()), r.status
+
+
+def post(path, body=None):
+    data = json.dumps(body or {}).encode()
+    req = urllib.request.Request(BASE + path, data=data,
+                                 headers={"Content-Type": "application/json"})
+    try:
+        with urllib.request.urlopen(req, timeout=10) as r:
+            return json.loads(r.read()), r.status
+    except urllib.error.HTTPError as e:
+        return json.loads(e.read()), e.code
+
+
+def make_session(created_list):
+    d, _ = post("/api/session/new", {})
+    sid = d["session"]["session_id"]
+    created_list.append(sid)
+    return sid
+
+
+# ── Settings API ──────────────────────────────────────────────────────────
+
+def test_settings_get_returns_defaults():
+    """GET /api/settings returns default settings."""
+    d, status = get("/api/settings")
+    assert status == 200
+    assert 'default_model' in d
+    assert 'default_workspace' in d
+
+def test_settings_post_persists():
+    """POST /api/settings saves and returns merged settings."""
+    d, status = post("/api/settings", {"default_model": "test/model-123"})
+    assert status == 200
+    assert d['default_model'] == 'test/model-123'
+    # Verify it persisted
+    d2, _ = get("/api/settings")
+    assert d2['default_model'] == 'test/model-123'
+    # Restore
+    post("/api/settings", {"default_model": "openai/gpt-5.4-mini"})
+
+def test_settings_partial_update():
+    """POST /api/settings with partial data doesn't clobber other fields."""
+    d1, _ = get("/api/settings")
+    original_ws = d1['default_workspace']
+    post("/api/settings", {"default_model": "anthropic/claude-sonnet-4.6"})
+    d2, _ = get("/api/settings")
+    assert d2['default_model'] == 'anthropic/claude-sonnet-4.6'
+    assert d2['default_workspace'] == original_ws
+    # Restore
+    post("/api/settings", {"default_model": "openai/gpt-5.4-mini"})
+
+
+# ── Session Pinning ───────────────────────────────────────────────────────
+
+def test_pin_session():
+    """POST /api/session/pin sets pinned=true."""
+    created = []
+    try:
+        sid = make_session(created)
+        d, status = post("/api/session/pin", {"session_id": sid, "pinned": True})
+        assert status == 200
+        assert d['ok'] is True
+        assert d['session']['pinned'] is True
+    finally:
+        for sid in created:
+            post("/api/session/delete", {"session_id": sid})
+
+def test_unpin_session():
+    """POST /api/session/pin with pinned=false unpins."""
+    created = []
+    try:
+        sid = make_session(created)
+        post("/api/session/pin", {"session_id": sid, "pinned": True})
+        d, status = post("/api/session/pin", {"session_id": sid, "pinned": False})
+        assert status == 200
+        assert d['session']['pinned'] is False
+    finally:
+        for sid in created:
+            post("/api/session/delete", {"session_id": sid})
+
+def test_pinned_in_session_list():
+    """Pinned sessions include pinned field in session list."""
+    created = []
+    try:
+        sid = make_session(created)
+        # Pin it and give it a title so it shows in the list
+        post("/api/session/rename", {"session_id": sid, "title": "Pinned Test"})
+        post("/api/session/pin", {"session_id": sid, "pinned": True})
+        d, _ = get("/api/sessions")
+        match = [s for s in d['sessions'] if s['session_id'] == sid]
+        assert len(match) == 1
+        assert match[0]['pinned'] is True
+    finally:
+        for sid in created:
+            post("/api/session/delete", {"session_id": sid})
+
+def test_pinned_persists_on_reload():
+    """Pin status survives session reload from disk."""
+    created = []
+    try:
+        sid = make_session(created)
+        post("/api/session/pin", {"session_id": sid, "pinned": True})
+        d, _ = get(f"/api/session?session_id={sid}")
+        assert d['session']['pinned'] is True
+    finally:
+        for sid in created:
+            post("/api/session/delete", {"session_id": sid})
+
+
+# ── Session Import ────────────────────────────────────────────────────────
+
+def test_import_session_basic():
+    """POST /api/session/import creates a new session from JSON."""
+    payload = {
+        "title": "Imported Test",
+        "messages": [
+            {"role": "user", "content": "Hello from import"},
+            {"role": "assistant", "content": "Hi there!"},
+        ],
+        "model": "test/import-model",
+    }
+    d, status = post("/api/session/import", payload)
+    assert status == 200
+    assert d['ok'] is True
+    sid = d['session']['session_id']
+    try:
+        assert d['session']['title'] == 'Imported Test'
+        assert len(d['session']['messages']) == 2
+        # Verify it loads correctly
+        d2, _ = get(f"/api/session?session_id={sid}")
+        assert d2['session']['model'] == 'test/import-model'
+    finally:
+        post("/api/session/delete", {"session_id": sid})
+
+def test_import_requires_messages():
+    """Import fails without a messages array."""
+    d, status = post("/api/session/import", {"title": "No messages"})
+    assert status == 400
+
+def test_import_creates_new_id():
+    """Imported session gets a new session_id, not reusing any from the payload."""
+    payload = {
+        "session_id": "should_be_ignored",
+        "title": "ID Test",
+        "messages": [{"role": "user", "content": "test"}],
+    }
+    d, _ = post("/api/session/import", payload)
+    sid = d['session']['session_id']
+    try:
+        # The import should create a new ID, not use the one from the payload
+        assert sid != "should_be_ignored"
+    finally:
+        post("/api/session/delete", {"session_id": sid})
+
+def test_import_with_pinned():
+    """Imported session can be pinned."""
+    payload = {
+        "title": "Pinned Import",
+        "messages": [{"role": "user", "content": "test"}],
+        "pinned": True,
+    }
+    d, _ = post("/api/session/import", payload)
+    sid = d['session']['session_id']
+    try:
+        d2, _ = get(f"/api/session?session_id={sid}")
+        assert d2['session']['pinned'] is True
+    finally:
+        post("/api/session/delete", {"session_id": sid})
120  tests/test_sprint13.py  Normal file
@@ -0,0 +1,120 @@
"""
Sprint 13 Tests: cron recent endpoint, session duplicate, background alerts.
"""
import json, pathlib, urllib.error, urllib.request

BASE = "http://127.0.0.1:8788"


def get(path):
    with urllib.request.urlopen(BASE + path, timeout=10) as r:
        return json.loads(r.read()), r.status


def post(path, body=None):
    data = json.dumps(body or {}).encode()
    req = urllib.request.Request(BASE + path, data=data,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=10) as r:
            return json.loads(r.read()), r.status
    except urllib.error.HTTPError as e:
        return json.loads(e.read()), e.code


def make_session(created_list):
    d, _ = post("/api/session/new", {})
    sid = d["session"]["session_id"]
    created_list.append(sid)
    return sid, d["session"]


# ── Cron recent endpoint ──────────────────────────────────────────────────

def test_crons_recent_returns_200():
    """GET /api/crons/recent returns completions list."""
    d, status = get("/api/crons/recent?since=0")
    assert status == 200
    assert 'completions' in d
    assert isinstance(d['completions'], list)
    assert 'since' in d


def test_crons_recent_with_future_since():
    """Completions list is empty when since is in the future."""
    import time
    d, _ = get(f"/api/crons/recent?since={time.time() + 99999}")
    assert d['completions'] == []


def test_crons_recent_default_since():
    """Default since=0 returns all completions."""
    d, status = get("/api/crons/recent")
    assert status == 200
    assert 'completions' in d


# ── Session duplicate ─────────────────────────────────────────────────────

def test_duplicate_session():
    """Duplicating a session creates a new one with same workspace/model."""
    created = []
    try:
        sid, sess = make_session(created)
        # Set a specific model on the session
        post("/api/session/update", {
            "session_id": sid, "model": "test/dup-model",
            "workspace": sess["workspace"]
        })
        # Duplicate: create new session with same workspace/model
        d2, status = post("/api/session/new", {
            "workspace": sess["workspace"], "model": "test/dup-model"
        })
        assert status == 200
        new_sid = d2["session"]["session_id"]
        created.append(new_sid)
        assert new_sid != sid
        assert d2["session"]["model"] == "test/dup-model"
        assert d2["session"]["workspace"] == sess["workspace"]
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


# ── Session pinned field preserved across operations ──────────────────────

def test_pinned_survives_update():
    """Pinned status survives session update."""
    created = []
    try:
        sid, sess = make_session(created)
        post("/api/session/pin", {"session_id": sid, "pinned": True})
        # Update workspace/model
        post("/api/session/update", {
            "session_id": sid, "model": "test/other",
            "workspace": sess["workspace"]
        })
        d, _ = get(f"/api/session?session_id={sid}")
        assert d["session"]["pinned"] is True
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


# ── Workspace symlink validation ──────────────────────────────────────────

def test_workspace_add_rejects_nonexistent():
    """Adding a non-existent path returns 400."""
    d, status = post("/api/workspaces/add", {"path": "/nonexistent/path/12345"})
    assert status == 400


def test_workspace_add_accepts_real_dir():
    """Adding a real directory succeeds."""
    import tempfile
    tmp = tempfile.mkdtemp()
    try:
        d, status = post("/api/workspaces/add", {"path": tmp, "name": "test-ws"})
        assert status == 200
        assert d["ok"] is True
    finally:
        post("/api/workspaces/remove", {"path": tmp})
        import shutil
        shutil.rmtree(tmp, ignore_errors=True)
153  tests/test_sprint14.py  Normal file
@@ -0,0 +1,153 @@
"""
Sprint 14 Tests: file rename, folder create, session archive, session tags, mermaid, timestamps.
"""
import json, os, pathlib, shutil, tempfile, urllib.error, urllib.request

BASE = "http://127.0.0.1:8788"


def get(path):
    with urllib.request.urlopen(BASE + path, timeout=10) as r:
        return json.loads(r.read()), r.status


def post(path, body=None):
    data = json.dumps(body or {}).encode()
    req = urllib.request.Request(BASE + path, data=data,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=10) as r:
            return json.loads(r.read()), r.status
    except urllib.error.HTTPError as e:
        return json.loads(e.read()), e.code


def make_session(created_list):
    d, _ = post("/api/session/new", {})
    sid = d["session"]["session_id"]
    created_list.append(sid)
    return sid, d["session"]


# ── File rename ───────────────────────────────────────────────────────────

def test_file_rename():
    """Renaming a file changes its name on disk."""
    created = []
    try:
        sid, sess = make_session(created)
        # Create a file first
        post("/api/file/create", {"session_id": sid, "path": "rename_test.txt", "content": "hello"})
        d, status = post("/api/file/rename", {
            "session_id": sid, "path": "rename_test.txt", "new_name": "renamed.txt"
        })
        assert status == 200
        assert d["ok"] is True
        assert "renamed.txt" in d["new_path"]
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


def test_file_rename_rejects_path_traversal():
    """Rename rejects names with path separators."""
    created = []
    try:
        sid, sess = make_session(created)
        post("/api/file/create", {"session_id": sid, "path": "safe.txt", "content": ""})
        d, status = post("/api/file/rename", {
            "session_id": sid, "path": "safe.txt", "new_name": "../evil.txt"
        })
        assert status == 400
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


def test_file_rename_rejects_existing():
    """Rename fails if target name already exists."""
    created = []
    try:
        sid, sess = make_session(created)
        post("/api/file/create", {"session_id": sid, "path": "a.txt", "content": "a"})
        post("/api/file/create", {"session_id": sid, "path": "b.txt", "content": "b"})
        d, status = post("/api/file/rename", {
            "session_id": sid, "path": "a.txt", "new_name": "b.txt"
        })
        assert status == 400
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


# ── Folder create ─────────────────────────────────────────────────────────

def test_create_dir():
    """Creating a folder succeeds."""
    created = []
    try:
        sid, sess = make_session(created)
        d, status = post("/api/file/create-dir", {
            "session_id": sid, "path": "test_folder"
        })
        assert status == 200
        assert d["ok"] is True
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


def test_create_dir_rejects_existing():
    """Creating a folder that already exists fails."""
    created = []
    try:
        sid, sess = make_session(created)
        post("/api/file/create-dir", {"session_id": sid, "path": "dup_folder"})
        d, status = post("/api/file/create-dir", {"session_id": sid, "path": "dup_folder"})
        assert status == 400
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


# ── Session archive ───────────────────────────────────────────────────────

def test_archive_session():
    """Archiving a session sets archived=true."""
    created = []
    try:
        sid, _ = make_session(created)
        d, status = post("/api/session/archive", {"session_id": sid, "archived": True})
        assert status == 200
        assert d["session"]["archived"] is True
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


def test_unarchive_session():
    """Unarchiving a session sets archived=false."""
    created = []
    try:
        sid, _ = make_session(created)
        post("/api/session/archive", {"session_id": sid, "archived": True})
        d, status = post("/api/session/archive", {"session_id": sid, "archived": False})
        assert status == 200
        assert d["session"]["archived"] is False
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})


def test_archived_in_compact():
    """Archived field appears in session list."""
    created = []
    try:
        sid, _ = make_session(created)
        post("/api/session/rename", {"session_id": sid, "title": "Archive Test"})
        post("/api/session/archive", {"session_id": sid, "archived": True})
        d, _ = get(f"/api/session?session_id={sid}")
        assert d["session"]["archived"] is True
    finally:
        for s in created:
            post("/api/session/delete", {"session_id": s})