
Quick Start: Your First AI Exchange in 5 Minutes

This guide gets you from zero to a working AI response as fast as possible using the Docker Starter Kit.


Prerequisites

  • Docker and Docker Compose installed.
  • An OpenAI API Key (or any supported LLM provider key).
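
Both tools expose a version command, so you can confirm they are installed before going further:

docker --version
docker-compose --version   # Compose v2 users can run: docker compose version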

Step 1: Download the Starter Kit

Download the starter kit archive and extract it. You should have the following structure:

.
├── config/
│   ├── application.yml
│   ├── goals.yml
│   ├── llm-clients-config.yml
│   ├── llm-clients-config.yml.example
│   ├── mcp-server.yml
│   ├── metrics.yml
│   ├── opensearch.yml
│   └── prompts.yml
├── gateway-application.yaml
└── uxopian-ai-stack.yml

Step 2: Configure Your API Key

The LLM configuration uses environment variables. The simplest approach is to set your API key in the uxopian-ai-stack.yml file by adding it to the environment section of the uxopian-ai-standalone service:

environment:
  - OPENAI_API_KEY=sk-YOUR_OPENAI_API_KEY_HERE

Alternatively, you can replace ${OPENAI_API_KEY:} directly in config/llm-clients-config.yml with your key.
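
For reference, the placeholder lives in an entry of config/llm-clients-config.yml. The excerpt below is only a sketch: the ${OPENAI_API_KEY:} placeholder comes from the file, but the surrounding key names are hypothetical and may differ from your copy (compare with llm-clients-config.yml.example):

# Hypothetical excerpt of config/llm-clients-config.yml; key names are illustrative.
openai:
  api-key: ${OPENAI_API_KEY:}   # replace with your sk-... key to hard-code it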

Other Providers

You can use any supported provider (Azure, Anthropic, Gemini, Mistral, Ollama...). See Configuration Files for the full reference.


Step 3: Start the Stack

Start all services:

docker-compose -f uxopian-ai-stack.yml up -d

Docker Images

The compose file references images from artifactory.arondor.cloud:5001/. If your registry differs, update the image: fields in uxopian-ai-stack.yml accordingly.
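
Once the command returns, you can confirm the containers are running with the standard Compose status command:

docker-compose -f uxopian-ai-stack.yml ps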

Wait a few seconds for OpenSearch to initialize, then check the service health:

curl http://localhost:8085/actuator/health
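
The /actuator/health endpoint is the standard Spring Boot health check: it reports {"status":"UP"} once the service is ready (the exact payload may include more detail). If it is not up yet, you can follow the startup logs:

docker-compose -f uxopian-ai-stack.yml logs -f uxopian-ai-standalone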

Port Mapping

The starter kit maps the service's internal port 8080 to port 8085 on the host (8085:8080). All API calls from your machine therefore go to localhost:8085.

Why No Gateway?

The starter kit runs in development mode (SPRING_PROFILES_ACTIVE=dev), which lets you call the AI service directly without a BFF Gateway. The service automatically fills in default values for missing authentication headers. In production, a BFF Gateway sits in front of the service and handles authentication, role injection, and tenant isolation.


Step 4: Create a Prompt

Tell the AI what to do by creating your first prompt:

curl -X POST "http://localhost:8085/api/v1/admin/prompts" \
  -H "Content-Type: application/json" \
  -H "X-User-TenantId: quickstart" \
  -H "X-User-Id: admin" \
  -H "X-User-Roles: admin" \
  -d '{
    "id": "helloAI",
    "role": "system",
    "content": "You are a helpful assistant. Answer concisely.",
    "defaultLlmProvider": "openai",
    "defaultLlmModel": "gpt-5.1"
  }'

Note

The starter kit already includes pre-defined prompts in config/prompts.yml (e.g., summarizeDocumentText, translate). You can use them directly or create new ones via the API.
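
If you want to confirm that helloAI was stored, you can query the admin prompts API. The read endpoint is not shown in this guide, so the GET below is an assumption; check the API reference if it does not match:

# Assumed endpoint: GET on the same /api/v1/admin/prompts path used above.
curl "http://localhost:8085/api/v1/admin/prompts" \
  -H "X-User-TenantId: quickstart" \
  -H "X-User-Id: admin" \
  -H "X-User-Roles: admin"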


Step 5: Create a Conversation and Send a Message

First, create a conversation:

curl -X POST "http://localhost:8085/api/v1/conversations" \
  -H "Content-Type: application/json" \
  -H "X-User-TenantId: quickstart" \
  -H "X-User-Id: demo-user"

Copy the id from the response, then send your first message:

curl -X POST "http://localhost:8085/api/v1/requests?conversation=YOUR_CONVERSATION_ID" \
  -H "Content-Type: application/json" \
  -H "X-User-TenantId: quickstart" \
  -H "X-User-Id: demo-user" \
  -d '{
    "inputs": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "value": "What is the capital of France?"
          }
        ]
      }
    ]
  }'

You should receive a JSON response with the AI's answer.
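
Conversations are what tie messages together, so a follow-up request in the same conversation should be able to build on the previous exchange. For example:

curl -X POST "http://localhost:8085/api/v1/requests?conversation=YOUR_CONVERSATION_ID" \
  -H "Content-Type: application/json" \
  -H "X-User-TenantId: quickstart" \
  -H "X-User-Id: demo-user" \
  -d '{
    "inputs": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "value": "And what is its population?"
          }
        ]
      }
    ]
  }'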


What's Next?

You've just completed your first AI exchange with uxopian-ai. Here's where to go next:

  • Understand the system: Read Core Concepts to learn about Prompts, Goals, and Conversations.
  • Manage prompts: Learn how to manage prompts and goals with the API and the admin UI.
  • Explore the Admin Panel: See your usage stats in the Dashboard.
  • Go deeper: Discover the Templating Engine to build dynamic, context-aware prompts.
  • Understand security: Learn how the BFF Gateway handles authentication in production and why this starter kit can skip it.