The Complete Platform

A production proxy layer for OpenAI-compatible clients that adds persistent memory, context trimming, and identity-safe multi-user behavior.

✂️

Context Trimming Engine

Soft/hard limits, pressure-aware compression, and token budgeting prevent context blowups while preserving conversation quality across long sessions.
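As a rough illustration of what soft/hard token budgeting means in practice, here is a minimal sketch (not UPtrim's actual code): the limits and the 4-characters-per-token estimate are assumptions, and a real deployment would use a proper tokenizer.

```python
SOFT_LIMIT = 6000   # hypothetical: start compressing above this
HARD_LIMIT = 8000   # hypothetical: never send more than this

def estimate_tokens(message: dict) -> int:
    """Rough token estimate: ~4 characters per token."""
    return len(message.get("content", "")) // 4

def trim_messages(messages: list[dict]) -> list[dict]:
    """Drop the oldest non-system messages until under the hard limit."""
    system = [m for m in messages if m["role"] == "system"]
    chat = [m for m in messages if m["role"] != "system"]
    total = sum(estimate_tokens(m) for m in messages)
    while chat and total > HARD_LIMIT:
        total -= estimate_tokens(chat.pop(0))  # evict the oldest turn first
    return system + chat
```

The system prompt survives trimming unconditionally; only conversation turns are evicted, oldest first.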

👤

Identity Resolution

OWUI headers, custom headers, bearer token paths, and trust-level controls map every request to the correct user memory boundary.

🔒

Memory Isolation

Per-user read/write gating with identity modes: strict, required, quarantine, and legacy. Zero cross-user bleed.
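One plausible reading of how these modes gate memory access, sketched below; the mode names come from the list above, but the per-mode behavior is an assumption for illustration only.

```python
def memory_access(mode: str, user_resolved: bool) -> str:
    """Decide memory access for a request under a given identity mode (assumed semantics)."""
    if mode == "strict":
        # Only verified identities touch memory; others get none at all.
        return "allow" if user_resolved else "deny"
    if mode == "required":
        # Unidentified requests are rejected outright.
        return "allow" if user_resolved else "reject"
    if mode == "quarantine":
        # Unidentified clients write to an isolated quarantine space.
        return "allow" if user_resolved else "quarantine"
    # legacy: shared memory pool with no per-user boundary
    return "allow"
```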

🔄

Memory Lifecycle

Dedup, contradiction handling, pinning, TTL expiry, quality sweep, archive, audit trails, and repair routines keep memory healthy long-term.

📄

File Upload + Retrieval

Document ingest with quotas, content extraction, TOC/excerpt injection, and optional embedding search for deep knowledge retrieval.

🎯

Intent-Aware Injection

Intent DNA classification and sandbox scoring determine which memories surface at query time. Right memory, right moment.

Your LLM Finally Remembers

UPtrim sits between your chat UI and LLM backend. Users get persistent memory; operators get full control over what is stored and surfaced.

  • Extracts facts automatically from conversations
  • Deduplicates and resolves contradictions
  • Intent-aware injection — right memory, right time
  • Category-based retrieval with TTL rules

Built for Shared Deployments

Run one LLM backend for your whole team. Each user gets their own isolated memory space with zero bleed between accounts.

  • Strict identity-mode enforcement
  • Per-user memory boundaries
  • Quarantine mode for untrusted clients
  • HMAC-signed identity resolution
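The HMAC-signed identity idea can be sketched as follows (hedged: the header format and key handling here are assumptions, not UPtrim's actual scheme). The trusted front end signs the user id with a shared secret, and the proxy verifies the signature so a client cannot forge another user's identity.

```python
import hashlib
import hmac

SECRET = b"operator-configured-shared-secret"  # hypothetical key

def sign_identity(user_id: str) -> str:
    """Produce an HMAC-SHA256 signature over the user id."""
    return hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()

def verify_identity(user_id: str, signature: str) -> bool:
    """Constant-time comparison prevents timing attacks on the signature."""
    return hmac.compare_digest(sign_identity(user_id), signature)
```

Without the secret, a client can claim any user id but cannot produce a valid signature for it, so spoofed identities are rejected before they reach memory.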

Operator Surface

Three interfaces for managing your deployment — from full admin dashboards to user self-service portals.

📊

Web Dashboard

Real-time stats, config controls, user/token management, logs, and memory operations. Full operator visibility from the browser.

🧑

My Memory Portal

User self-service page for personal memory visibility, file uploads, profile management, and memory correction.

🔧

Tool Server

External tool endpoints for memory and file operations when mounted in Open WebUI. Extends your LLM with proxy-powered tools.

See It in Action

Dive deeper into every capability or compare against alternatives.