Option 1: Docker Compose (Recommended)
The fastest way to get Ryumem running with all components.
1. Clone the Repository
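The repository URL below is inferred from the container image path (`ghcr.io/predictable-labs/ryumem`) and may differ; verify it before use:

```shell
# Clone the Ryumem repository (URL inferred, not confirmed)
git clone https://github.com/predictable-labs/ryumem.git
cd ryumem
```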
2. Configure Environment
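Typically this means copying a template env file and filling in your keys; the exact template file name is an assumption:

```shell
# Copy the example environment file (name assumed) and edit it
cp .env.example .env
# Set at least an LLM key and an admin key, for example:
#   GOOGLE_API_KEY=your-gemini-key
#   ADMIN_API_KEY=choose-a-strong-admin-key
```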
3. Start All Services
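Assuming the repository ships a `docker-compose.yml` at its root, starting everything might look like:

```shell
# Build and start all services in the background
docker compose up -d
# Confirm the containers are up
docker compose ps
```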
- API Server: http://localhost:8000
- Dashboard: http://localhost:3000
- API Docs: http://localhost:8000/docs
4. Generate Your API Key
Register a customer to get your API key. The response includes your API key (starts with `ryu_`); save it securely.

Option 2: Helm Chart (Kubernetes)
Deploy Ryumem to Kubernetes using the official Helm chart.
1. Add the Helm Repository
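The chart repository URL below is an assumption; substitute the one published by the project:

```shell
# Add the Ryumem chart repo (URL assumed) and refresh the local index
helm repo add ryumem https://predictable-labs.github.io/ryumem
helm repo update
```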
2. Configure Values
Create a custom values file:
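A minimal custom values file, using keys from the configuration table below:

```yaml
# my-values.yaml -- overrides for the Ryumem chart
replicaCount: 2
secrets:
  googleApiKey: "your-gemini-key"
  adminApiKey: "choose-a-strong-admin-key"
ryumem:
  llm:
    provider: gemini
    model: gemini-2.0-flash-exp
persistence:
  enabled: true
  size: 20Gi
```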
3. Install the Chart
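The chart reference `ryumem/ryumem` is an assumption based on the repo name added above:

```shell
# Install into a dedicated namespace using the custom values file
helm install ryumem ryumem/ryumem \
  --namespace ryumem --create-namespace \
  -f my-values.yaml
```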
4. Verify Installation
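Assuming the `ryumem` namespace from the install step:

```shell
# Pods should reach Running/Ready state
kubectl get pods -n ryumem
# Inspect the release status
helm status ryumem -n ryumem
```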
5. Generate Your API Key
Port-forward and register:
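A sketch of the two steps; the service name, registration endpoint path, header, and payload are assumptions, so check the live API docs at `/docs`:

```shell
# Forward the API service to localhost (service name assumed)
kubectl port-forward -n ryumem svc/ryumem 8000:8000 &

# Register a customer using the admin key (endpoint and payload assumed)
curl -X POST http://localhost:8000/customers/register \
  -H "Authorization: Bearer $ADMIN_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "my-app"}'
```

The response includes your `ryu_`-prefixed API key.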
Helm Chart Configuration
Key configuration options in `values.yaml`:
| Parameter | Default | Description |
|---|---|---|
| `replicaCount` | `1` | Number of API server replicas |
| `image.repository` | `ghcr.io/predictable-labs/ryumem` | Container image |
| `image.tag` | `latest` | Image tag |
| `secrets.googleApiKey` | `""` | Google Gemini API key |
| `secrets.openaiApiKey` | `""` | OpenAI API key |
| `secrets.adminApiKey` | `""` | Admin API key for registration |
| `ryumem.llm.provider` | `gemini` | LLM provider |
| `ryumem.llm.model` | `gemini-2.0-flash-exp` | LLM model |
| `ryumem.embedding.provider` | `gemini` | Embedding provider |
| `ryumem.embedding.model` | `text-embedding-004` | Embedding model |
| `persistence.enabled` | `true` | Enable persistent storage |
| `persistence.size` | `10Gi` | Storage size |
| `dashboard.enabled` | `true` | Deploy the dashboard |
| `ingress.enabled` | `false` | Enable API ingress |
| `dashboard.ingress.enabled` | `false` | Enable dashboard ingress |
For production deployments, use external secret management (like Kubernetes Secrets, Vault, or External Secrets Operator) instead of storing API keys in values files.
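For example, keys can be created as a Kubernetes Secret out of band; whether the chart can reference an existing secret depends on its templates, so treat this as a sketch:

```shell
# Create the secret outside of values.yaml
kubectl create secret generic ryumem-keys -n ryumem \
  --from-literal=googleApiKey="$GOOGLE_API_KEY" \
  --from-literal=adminApiKey="$ADMIN_API_KEY"
```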
Option 3: Local Development
For development, or when you want more control over individual components.

Prerequisites
- Python 3.10+
- Node.js 18+
- An LLM API key (Google Gemini, OpenAI, or local Ollama)
Start the API Server
1. Install the SDK
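The package name below is assumed from the project name:

```shell
# Install the Python SDK (package name assumed)
pip install ryumem
```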
2. Install Server Dependencies
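Assuming a standard Python layout with a requirements file in the `server` directory:

```shell
# From the repository root (requirements file name assumed)
cd server
pip install -r requirements.txt
```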
3. Configure Environment
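A `server/.env` using variables from the table below:

```shell
# server/.env
GOOGLE_API_KEY=your-gemini-key
ADMIN_API_KEY=choose-a-strong-admin-key
RYUMEM_DB_FOLDER=./data
LLM_PROVIDER=gemini
LLM_MODEL=gemini-2.0-flash-exp
CORS_ORIGINS=http://localhost:3000
```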
4. Start the Server
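The `/docs` endpoint suggests a FastAPI-style server, which is typically run with uvicorn; the module path is an assumption, so check the server's README:

```shell
# Run the API server on port 8000 (module path assumed)
uvicorn main:app --host 0.0.0.0 --port 8000
```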
Start the Dashboard
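The dashboard appears to be a Next.js app (judging by `NEXT_PUBLIC_API_URL`); a typical development start:

```shell
cd dashboard
npm install
npm run dev   # serves on http://localhost:3000 by default
```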
Environment Variables
Server Configuration (server/.env)
| Variable | Required | Default | Description |
|---|---|---|---|
| `GOOGLE_API_KEY` | Yes* | - | Google Gemini API key |
| `OPENAI_API_KEY` | Yes* | - | OpenAI API key |
| `RYUMEM_DB_FOLDER` | Yes | `./data` | Database storage path |
| `ADMIN_API_KEY` | Yes | - | Admin key for registration |
| `LLM_PROVIDER` | No | `gemini` | LLM provider (`gemini`, `openai`, `ollama`, `litellm`) |
| `LLM_MODEL` | No | `gemini-2.0-flash-exp` | LLM model name |
| `EMBEDDING_PROVIDER` | No | `gemini` | Embedding provider |
| `EMBEDDING_MODEL` | No | `text-embedding-004` | Embedding model |
| `CORS_ORIGINS` | No | `http://localhost:3000` | Allowed CORS origins |

*Required only for the matching provider: set the key for whichever LLM/embedding provider you configure.
Dashboard Configuration (dashboard/.env)
| Variable | Required | Default | Description |
|---|---|---|---|
| `NEXT_PUBLIC_API_URL` | Yes | `http://localhost:8000` | Ryumem API server URL |
Quick Start with Python SDK
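A minimal session might look like the sketch below; the `ryumem` package name, the `Ryumem` client class, and its method names are assumptions, not a documented API:

```python
# Hypothetical SDK usage -- all names below are assumptions
from ryumem import Ryumem  # package/class name assumed

client = Ryumem(
    base_url="http://localhost:8000",
    api_key="ryu_your_key_here",  # key returned at registration
)

# Store an episode, then query the knowledge graph
client.add_episode("Alice moved to Berlin in 2024.")
results = client.search("Where does Alice live?")
print(results)
```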
Once your server is running, install the Python SDK.

Accessing the Dashboard
Once Ryumem is running:
1. Navigate to http://localhost:3000
2. Enter your API key (starts with `ryu_`)
3. Click “Sign in”

From the dashboard you can:
- Search and query your knowledge graph
- Visualize entities and relationships
- View episodes and memories
- Track tool execution analytics
- Configure system settings
Dashboard Guide
Complete guide to dashboard features and navigation
Using with Ollama (Local LLMs)
For fully local deployment without cloud API keys.

Local LLMs may have different performance characteristics than cloud providers; test with your use case.
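Switching to Ollama means overriding the provider variables from the table above; the model names below are examples and must already be pulled in a locally running Ollama:

```shell
# server/.env overrides for a fully local stack
LLM_PROVIDER=ollama
LLM_MODEL=llama3.1                # example model, pull it in Ollama first
EMBEDDING_PROVIDER=ollama
EMBEDDING_MODEL=nomic-embed-text  # example embedding model
```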