Doku: Open Source Observability for LLMs

Documentation | Quickstart | Python SDK | Node SDK | Helm Chart


Doku is an open-source LLMOps tool that gives developers comprehensive capabilities to monitor, analyze, and optimize LLM applications. It provides valuable real-time data on LLM usage, performance, and costs. Through seamless integrations with leading LLM platforms, including OpenAI, Cohere, and Anthropic, Doku acts as a central command center for all your LLM needs, ensuring that your LLM applications not only operate at peak efficiency but also scale successfully.

Why use Doku?

Get advanced monitoring and evaluation for your LLM applications with these key benefits:

  • Granular Usage Insights of your LLM Applications: Assess your LLM's performance and costs with fine-grained control, breaking down metrics by environment (such as staging or production) or application, to optimize for efficiency and scalability.
  • Real-Time Data Streaming: Unlike other platforms where you might wait minutes to see your data due to data being sent in batches, Doku is able to display data as it streams. This immediate insight enables quick decision-making and adjustments.
  • Zero Added Latency: Doku's smart data handling ensures rapid data processing without impacting your application's performance, maintaining the responsiveness of your LLM applications.
  • Connect to Observability Platforms: Doku seamlessly connects with leading observability platforms like Grafana Cloud and Datadog, among others to automatically export data.

How it Works?

Step 1: Instrument your Application

Integrating the dokumetry SDK into LLM applications is straightforward, with SDKs designed for Python and NodeJS. Start monitoring your LLM application with just two lines of code:

For Python

import dokumetry
dokumetry.init(llm=openai, doku_url="YOUR_DOKU_INGESTER_URL", api_key="YOUR_DOKU_TOKEN")

For NodeJS

import DokuMetry from 'dokumetry';
DokuMetry.init({llm: openai, dokuUrl: "YOUR_DOKU_INGESTER_URL", apiKey: "YOUR_DOKU_TOKEN"})

Step 2: Data processed by Doku Ingester

Once the dokumetry SDKs are configured in your LLM application, monitoring data starts streaming to the Doku Ingester, which processes it and safely stores it in ClickHouse, keeping your LLM monitoring data secure and compliant within your environment.

You can choose to use a new ClickHouse database setup or connect to your existing one to work with Doku.

Step 3: Visualize and analyze

With your LLM monitoring data processed and securely stored, you can now leverage the Doku UI for in-depth visualization and analysis. Doku UI allows you to explore LLM costs, token usage, performance metrics, and user interactions in an intuitive interface. This powerful tool enhances your ability to observe and optimize your LLM applications, ensuring you make data-driven decisions for improvement.

For those with a preferred observability platform, you can also integrate and visualize this data elsewhere with ease. This flexibility ensures optimal monitoring workflow integration, regardless of your platform choice. For more details on how to set up these connections, check out the Connections guide.

🚀 Getting Started with Doku

💿 Installation

Jumpstart your journey with Doku by deploying it with Docker Compose, or via our Helm chart, designed to simplify the installation process on any Kubernetes cluster.

Docker

To install Doku using Docker, follow these steps:

  1. Create a docker-compose.yml file:

version: '3.8'
services:
  clickhouse:
    image: clickhouse/clickhouse-server:24.1.5
    container_name: clickhouse
    environment:
      CLICKHOUSE_PASSWORD: ${DOKU_DB_PASSWORD:-DOKU}
      CLICKHOUSE_USER: ${DOKU_DB_USER:-default}
    volumes:
      - clickhouse-data:/var/lib/clickhouse
    ports:
      - "9000:9000"
      - "8123:8123"
    restart: always

  doku-ingester:
    image: ghcr.io/dokulabs/doku-ingester:0.1.0
    container_name: doku-ingester
    environment:
      DOKU_DB_HOST: clickhouse
      DOKU_DB_PORT: 9000
      DOKU_DB_NAME: ${DOKU_DB_NAME:-default}
      DOKU_DB_USER: ${DOKU_DB_USER:-default}
      DOKU_DB_PASSWORD: ${DOKU_DB_PASSWORD:-DOKU}
    ports:
      - "9044:9044"
    depends_on:
      - clickhouse
    restart: always

  doku-client:
    image: ghcr.io/dokulabs/doku-client:0.1.0
    container_name: doku-client
    environment:
      INIT_DB_HOST: clickhouse
      INIT_DB_PORT: 8123
      INIT_DB_DATABASE: ${DOKU_DB_NAME:-default}
      INIT_DB_USERNAME: ${DOKU_DB_USER:-default}
      INIT_DB_PASSWORD: ${DOKU_DB_PASSWORD:-DOKU}
      SQLITE_DATABASE_URL: file:/app/client/data/data.db
    ports:
      - "3000:3000"
    depends_on:
      - clickhouse
    volumes:
      - doku-client-data:/app/client/data
    restart: always

volumes:
  clickhouse-data:
  doku-client-data:

  2. Start Docker Compose:

docker-compose up -d
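Once the containers are up, you can confirm the services are reachable before wiring up the SDKs. The sketch below polls an HTTP endpoint until it responds; the ports come from the compose file above (ClickHouse exposes a built-in /ping endpoint on 8123, and the Doku client UI listens on 3000). This is an illustrative helper, not part of Doku itself.

```python
import time
import urllib.request
import urllib.error


def is_ready(url: str, timeout: float = 30.0, interval: float = 1.0) -> bool:
    """Poll `url` until it returns an HTTP 2xx response or `timeout` elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=interval) as resp:
                if 200 <= resp.status < 300:
                    return True
        except (urllib.error.URLError, OSError):
            # Service not accepting connections yet; retry after a pause.
            pass
        time.sleep(interval)
    return False


# Ports taken from the docker-compose.yml above:
# is_ready("http://127.0.0.1:8123/ping")  # ClickHouse HTTP interface
# is_ready("http://127.0.0.1:3000/")      # Doku client UI
```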

Kubernetes

To install the Doku Helm chart, follow these steps:

  1. Add the Doku Helm repository to your Helm setup:
helm repo add dokulabs https://dokulabs.github.io/helm/
  2. Update your Helm repositories to fetch the latest chart information:
helm repo update
  3. Install the Doku chart with the release name doku:
helm install doku dokulabs/doku

For a detailed list of configurable parameters for the Helm chart, refer to the values.yaml file in the Helm chart.

🔑 Access Doku UI and Generate an API Key

With Doku running, the next step is to access the Doku UI and generate an API key for secure communication between your applications and Doku.

  1. Open your browser and go to the Doku UI at 127.0.0.1:3000/login
  2. Log in using the default credentials:
    • Email: user@dokulabs.com
    • Password: dokulabsuser
  3. Once you are logged into the Doku UI, go to the API Keys page and create an API Key. Copy the generated key.

💡 Tip: Alternatively, you can use the HTTP API to create your Doku API Key. For further details, take a look at the API Reference section.

⚡️ Instrument your Application

Choose the appropriate SDK for your LLM application's programming language and follow the steps to integrate monitoring with just two lines of code.

Python

Install the dokumetry Python SDK using pip:

pip install dokumetry

Add the following two lines to your application code:

import dokumetry
dokumetry.init(llm=client, doku_url="YOUR_DOKU_INGESTER_URL", api_key="YOUR_DOKU_TOKEN")
Example usage for monitoring OpenAI:

from openai import OpenAI
import dokumetry

client = OpenAI(
    api_key="YOUR_OPENAI_KEY"
)

# Pass the `client` object along with your Doku Ingester URL and API key;
# this ensures that all OpenAI calls are automatically tracked.
dokumetry.init(llm=client, doku_url="YOUR_DOKU_INGESTER_URL", api_key="YOUR_DOKU_TOKEN")

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "What is LLM Observability",
        }
    ],
    model="gpt-3.5-turbo",
)

Refer to the dokumetry Python SDK repository for more advanced configurations and use cases.
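The snippets above hardcode the ingester URL and token for brevity; in practice you would typically read them from the environment. A minimal sketch, assuming environment variable names of your own choosing (DOKU_INGESTER_URL and DOKU_API_KEY here are illustrative, not an official dokumetry convention):

```python
import os


def load_doku_config() -> dict:
    """Read the Doku Ingester URL and API key from environment variables."""
    config = {
        # Default port 9044 matches the ingester port in the compose file above.
        "doku_url": os.environ.get("DOKU_INGESTER_URL", "http://127.0.0.1:9044"),
        "api_key": os.environ.get("DOKU_API_KEY"),
    }
    if not config["api_key"]:
        raise RuntimeError("DOKU_API_KEY is not set")
    return config


# Usage, with `client` being an OpenAI client as in the example above:
# import dokumetry
# cfg = load_doku_config()
# dokumetry.init(llm=client, doku_url=cfg["doku_url"], api_key=cfg["api_key"])
```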

Node

Install the dokumetry NodeJS SDK using npm:

npm install dokumetry

Add the following two lines to your application code:

import DokuMetry from 'dokumetry';
DokuMetry.init({llm: openai, dokuUrl: "YOUR_DOKU_INGESTER_URL", apiKey: "YOUR_DOKU_TOKEN"})
Example usage for monitoring OpenAI:

import OpenAI from 'openai';
import DokuMetry from 'dokumetry';

const openai = new OpenAI({
  apiKey: 'My API Key', // defaults to process.env["OPENAI_API_KEY"]
});

// Pass the `openai` object along with your Doku Ingester URL and API key;
// this ensures that all OpenAI calls are automatically tracked.
DokuMetry.init({llm: openai, dokuUrl: "YOUR_DOKU_INGESTER_URL", apiKey: "YOUR_DOKU_TOKEN"})

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    messages: [{ role: 'user', content: 'What are the keys to effective observability?' }],
    model: 'gpt-3.5-turbo',
  });
}

main();

Refer to the dokumetry NodeJS SDK repository for more advanced configurations and use cases.

Visualize and Analyze

Once you have the Doku Ingester and DokuMetry SDKs set up, you can instantly get insights into how your LLM applications are performing in the Doku Client UI. Just head over to 127.0.0.1:3000 in your browser to start exploring.

Doku Client UI

With Doku, you get a simple, powerful view into important info like how much you’re spending on LLMs, which parts of your app are using them the most, and how well they’re performing. Find out which LLM models are favorites among your applications, and dive deep into performance details to make smart decisions. This setup is perfect for optimizing your app performance and keeping an eye on costs.

Security

Doku uses a key-based authentication mechanism to ensure the security of your data, and because Doku is self-hosted, the data stays within your environment.

Contributing

We welcome contributions to the Doku project. Please refer to CONTRIBUTING for detailed guidelines on how you can participate.

License

Doku is available under the Apache-2.0 license.

Support

For support, issues, or feature requests, submit an issue through the GitHub issues associated with this repository.

Visualize! Analyze! Optimize!

Join us on this voyage to reshape the future of AI Observability. Share your thoughts, suggest features, and explore contributions. Engage with us on GitHub and be part of Doku's community-led innovation.
