# Uxopian AI overview
Uxopian AI is a framework for embedding AI assistants inside legacy enterprise applications, particularly content management and document-centric systems. Rather than replacing the host application, it adds a conversational interface, connects to multiple LLM providers, and exposes tools the LLM can call to operate on enterprise data.
## Who this documentation is for
| Audience | Entry point |
|---|---|
| System administrator deploying the stack | Quickstart with Docker Compose |
| Integration developer embedding the chat UI | Embed in a web application |
| Solution architect evaluating the system | System architecture |
| ARender integrator | Integrate with ARender |
| Alfresco integrator | Integrate with Alfresco |
| FlowerDocs integrator | Integrate with FlowerDocs |
| Prompt or assistant developer | Prompts and templating |
| Administrator configuring gateway routes | Configure gateway routes |
## System components
Figure: High-level component topology.
### uxopian-ai
The primary application. Java 21, Spring Boot (reactive WebFlux stack). Handles all business logic: conversations, requests, prompt rendering, tool execution, LLM calls, and the REST/WebSocket API.
### uxopian-gateway
Spring Cloud Gateway acting as a reverse proxy. The only public entry point. Authenticates every request using a configurable AuthProvider and forwards identity headers to uxopian-ai. uxopian-ai must never be exposed directly to browsers.
### OpenSearch
Primary persistence store. Stores conversations, requests, prompts, goal configurations, LLM provider configurations, and usage metrics. All data is tenant-scoped.
### Hazelcast
Distributed in-memory cache used by the gateway for session token caching. Configured via `hazelcast.yml`.
## Supported LLM providers
Nine providers are supported out of the box:
| Provider | Key identifier |
|---|---|
| OpenAI | `openai` |
| Anthropic | `anthropic` |
| Azure OpenAI | `azure-openai` |
| AWS Bedrock | `bedrock` |
| Google Gemini | `gemini` |
| Mistral AI | `mistral-ai` |
| HuggingFace | `huggingface` |
| Ollama | `ollama` |
| NuExtract | `nu-extract` |
See LLM providers for configuration details.
## Integration paths
Three integration paths are available:
- Generic web application: load the JavaScript and CSS bundles from the gateway's `/api/web-components/chat/script` and `/api/web-components/chat/style` endpoints, then call `window.createChat()`. See Embed in a web application.
- ARender document viewer: adds an AI menu to the ARender top panel. Documents are accessed via the ARender DSB API. See Integrate with ARender.
- FlowerDocs ECM: embeds the chat panel via FlowerDocs scope files. Uses `FlowerDocsProvider` in the gateway. See Integrate with FlowerDocs.
## Extension mechanisms
- Custom tool plugins: write a `@ToolService` class, package it as a shaded JAR, and drop it in `plugins/`. The LLM can then call those methods as tools.
- Custom ServiceHelpers: write a `@HelperService` class and expose it as a named expression object in Thymeleaf prompt templates.
- Custom auth providers: implement the `AuthProvider` interface in the gateway to support any identity system.
- Prompt and goal customization: define per-tenant overrides in `prompts.yml` and `goals.yml`, or manage them live via the Admin API.
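The tool-plugin mechanism above can be sketched roughly as follows. Only the `@ToolService` annotation name comes from this documentation; the stand-in annotation definition, the class and method names, and the lookup logic are illustrative assumptions, not the framework's actual API.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

public class InvoiceToolPlugin {

    // Stand-in for the framework's @ToolService annotation, defined here only
    // so this sketch compiles on its own. The real annotation ships with Uxopian AI.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.TYPE)
    @interface ToolService {}

    @ToolService
    static class InvoiceTools {
        // A hypothetical method the LLM could invoke as a tool.
        String lookupInvoice(String invoiceId) {
            return "invoice:" + invoiceId; // placeholder for a real ECM lookup
        }
    }

    public static void main(String[] args) {
        // A plugin loader would discover the class via its runtime annotation.
        System.out.println(InvoiceTools.class.isAnnotationPresent(ToolService.class));
        System.out.println(new InvoiceTools().lookupInvoice("42"));
    }
}
```

In a real plugin the class would be packaged as a shaded JAR and placed in `plugins/` so it is picked up at startup, as described above.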
## Key concepts
| Concept | Description |
|---|---|
| Tenant | Primary isolation unit. All data is scoped to a tenant ID. |
| Conversation | A chat session. Contains a sequence of Requests. |
| Request | A single LLM round-trip: inputs, rendered prompt, response, token usage. |
| Prompt | A named Thymeleaf template defining a role and content. |
| Goal | A named group of ordered prompt references with optional filters. |
| Plugin | A shaded JAR in `plugins/` loaded at startup by `IntegrationLoader`. |
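As a rough illustration of how these concepts nest, the table can be sketched as plain Java records. The record shapes and field names below are assumptions for illustration only, not the framework's actual data model.

```java
import java.util.List;

public class ConceptModelSketch {

    // All data is tenant-scoped, so each top-level object carries a tenant ID.
    record Prompt(String tenantId, String name, String role, String template) {}

    // A goal groups ordered prompt references under one name.
    record Goal(String tenantId, String name, List<String> promptNames) {}

    // A request is one LLM round-trip: rendered prompt, response, token usage.
    record Request(String renderedPrompt, String response, int tokensUsed) {}

    // A conversation is a chat session containing a sequence of requests.
    record Conversation(String tenantId, List<Request> requests) {}

    public static void main(String[] args) {
        Request r = new Request("Hello", "Hi there", 12);
        Conversation c = new Conversation("acme", List.of(r));
        System.out.println(c.requests().size());
    }
}
```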
## Next steps
- Follow Quickstart with Docker Compose to run a local stack.
- Read System architecture for the full component diagram and request flow.
- Read Authentication and gateway before any deployment.