Welcome to uxopian-ai
uxopian-ai is a complete, standalone framework designed to accelerate and simplify the integration of powerful AI features into any enterprise application.
Built on a solid foundation of Java 21 LTS and Spring Boot 3.5, it goes far beyond a simple library: it provides a full suite of tools, from backend services to frontend components, for building sophisticated, reliable, and scalable AI solutions.
The uxopian-ai Advantage: More Than Just a Library
While uxopian-ai uses the excellent Langchain4j library as its core for LLM interactions, it builds a complete enterprise-ready ecosystem around it. Here's the added value:
- Standalone Service, Not Just Code: A pre-packaged, deployable service that saves you months of development and infrastructure setup.
- Ready-to-Use UI Components: Instantly integrate AI with web components (compiled as IIFE bundles, with scoped CSS), plus plug-and-play integration scripts.
- Advanced Orchestration Engine: The unique Goal system selects prompts dynamically based on context, so you never have to build that logic from scratch (see the sketch after this list).
- Complete Conversation Management: Persistent conversations with cost tracking, response regeneration, and user feedback support.
- Data-Driven Insights: A comprehensive admin panel to monitor ROI, token usage, and adoption trends.
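To make the Goal idea concrete before diving into the feature details, here is a plain-Java illustration of context-driven prompt selection with filters and priorities. It is only a conceptual sketch: the record, class, and method names are invented for this page and are not the uxopian-ai API.

```java
import java.util.*;
import java.util.function.Predicate;

// Conceptual sketch only: not the uxopian-ai API, just the idea of a Goal
// picking the best-matching prompt from context filters and priorities.
record PromptCandidate(String name, int priority, Predicate<Map<String, String>> filter) {}

class GoalSelectionSketch {

    // Keep the candidates whose filter matches the context; the highest priority wins.
    static Optional<PromptCandidate> select(List<PromptCandidate> candidates,
                                            Map<String, String> context) {
        return candidates.stream()
                .filter(c -> c.filter().test(context))
                .max(Comparator.comparingInt(PromptCandidate::priority));
    }

    public static void main(String[] args) {
        List<PromptCandidate> comparisonGoal = List.of(
                new PromptCandidate("legal-comparison", 10,
                        ctx -> "contract".equals(ctx.get("documentType"))),
                new PromptCandidate("generic-comparison", 0, ctx -> true));

        Map<String, String> context = Map.of("documentType", "contract");
        // Prints "legal-comparison"; any other documentType falls back to "generic-comparison".
        System.out.println(select(comparisonGoal, context)
                .map(PromptCandidate::name)
                .orElse("no matching prompt"));
    }
}
```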
Key Features at a Glance
Effortless & Scalable Integration
- Standalone Service: Deployable via Docker or as a Java 21 application.
- Multi-Tenant Architecture: Designed for internal deployments, with logical separation between tenants and per-tenant management.
- Web-Component UI: Lightweight, embeddable components for any web app.
- Rich REST API: Fully documented with Swagger for seamless integration (a minimal call is sketched below).
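Because the API is plain REST, any HTTP client works. The sketch below uses the JDK's built-in HttpClient; the endpoint path, port, and request body are assumptions made for illustration, so check your deployment's Swagger UI for the actual contract.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RestApiSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/api/v1/conversations")) // hypothetical endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"message\": \"Summarize this contract\"}"))          // hypothetical payload
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```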
Powerful Admin & Analytics
- Granular Token Monitoring: Visualize input and output token consumption globally, by specific users, or per conversation.
- ROI & Efficiency Tracking: Dedicated metrics show how often each prompt is used and estimate the total time saved.
- Usage Trends: Analyze activity over time (requests per week), monitor LLM model distribution, and track the adoption of advanced features like multi-modal capabilities.
Intelligent Orchestration
- Goal System: Define context-aware workflows using filters and priorities. Example: a "comparison" goal automatically picks a legal prompt for contracts and a generic one for other documents.
- Templating Engine: Dynamic data injection, custom Java services, and conditional logic with Thymeleaf (see the sketch after this list).
- Template Helpers: Add your own Java functions to enrich prompts.
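To give a feel for what such templates look like, here is a minimal sketch that renders a prompt with Thymeleaf's TEXT template mode. The template wording and variable names are illustrative, not taken from uxopian-ai; the snippet uses standard Thymeleaf classes only.

```java
import org.thymeleaf.TemplateEngine;
import org.thymeleaf.context.Context;
import org.thymeleaf.templatemode.TemplateMode;
import org.thymeleaf.templateresolver.StringTemplateResolver;

public class PromptTemplateSketch {
    public static void main(String[] args) {
        // TEXT mode suits plain-text prompts (no HTML escaping or markup).
        StringTemplateResolver resolver = new StringTemplateResolver();
        resolver.setTemplateMode(TemplateMode.TEXT);

        TemplateEngine engine = new TemplateEngine();
        engine.setTemplateResolver(resolver);

        // Illustrative prompt template: conditional logic plus injected variables.
        String template = """
                Summarize the document "[[${title}]]".
                [# th:if="${documentType == 'contract'}"]Highlight obligations and deadlines.[/]
                [# th:unless="${documentType == 'contract'}"]Keep the summary under 100 words.[/]
                """;

        Context context = new Context();
        context.setVariable("title", "Master Service Agreement");
        context.setVariable("documentType", "contract");

        System.out.println(engine.process(template, context)); // prints the resolved prompt
    }
}
```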
Robust LLM Interaction
- Broad Support: Compatible with many LLM providers out of the box.
- Custom Connectors: Add private or fine-tuned models easily.
- Advanced Features: Native support for function calling, multi-modal requests (text + image), and streaming or non-streaming responses (a function-calling sketch follows this list).
- MCP Client: Acts as a client for Model Context Protocol (MCP) servers.
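Since uxopian-ai builds on Langchain4j, function calling ultimately boils down to exposing annotated Java methods as tools. The sketch below shows the general Langchain4j pattern, not the uxopian-ai wiring: the tool itself is hypothetical, and builder method names vary slightly between Langchain4j releases (this follows the 0.3x naming).

```java
import dev.langchain4j.agent.tool.Tool;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

public class FunctionCallingSketch {

    // Hypothetical tool: in a real integration this would query a document repository.
    static class DocumentTools {
        @Tool("Returns the number of pages of a stored document")
        int pageCount(String documentId) {
            return 42; // stand-in for a real lookup
        }
    }

    interface Assistant {
        String chat(String userMessage);
    }

    public static void main(String[] args) {
        var model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        Assistant assistant = AiServices.builder(Assistant.class)
                .chatLanguageModel(model)   // chatModel(...) in newer Langchain4j releases
                .tools(new DocumentTools())
                .build();

        // The model may decide to call pageCount(...) before answering.
        System.out.println(assistant.chat("How long is document DOC-123?"));
    }
}
```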
Complete Conversation Management
- Persistent History: Conversations and messages are stored with full context.
- Feedback Loop: Gather specific user feedback (Good/Bad/Neutral) on responses to improve prompt quality.
- Rich UX: Regenerate, copy, and manage conversation content easily.
Reading Paths
Choose the path that matches your role:
New to uxopian-ai?
- Quick Start — Your first AI exchange in 5 minutes.
- Core Concepts — Understand Prompts, Goals, and Conversations.
- Architecture Overview — See how the components fit together.
Operator / DevOps?
- Deploy with Docker — Set up the full stack.
- Configuration Files — YAML reference for all config files.
- Environment Variables — Quick reference for Docker deployments.
- Backup and Recovery — Protect your data.
Integrator?
- Architecture Overview — Understand the BFF pattern.
- Embedding in a Web Page — Add AI to any web app.
- Integrating with ARender — Add AI buttons in ARender.
- Integrating with FlowerDocs — Add AI features in FlowerDocs.
Java Developer?
- Core Concepts — Understand the domain model.
- The Templating Engine — Master dynamic prompt authoring.
- Creating Custom Helpers — Inject your own data into prompts.
- Creating Custom Tools — Give the LLM the ability to take actions.