open-webui + openlit #560
Hello, I want to push metrics from open-webui to the openlit dashboard. I was trying to find out how to push the metrics into the ClickHouse database, but that is already done by openlit via the Python method openlit.init(). Is there any way to monitor the requests sent from open-webui in real time, instead of sending them from Python code? Thanks
Replies: 2 comments 5 replies
The only way possible would be to add openlit within the open-webui code itself. I will reach out to the open-webui maintainers to ask if they are open to adding it.
Any news? How does openlit actually work? I am running the code below, a simple script that sends a request to a vLLM server:

```python
import os

import openlit

OTEL_ENDPOINT = "http://<my-ip>:4318"
openlit.init(
    otlp_endpoint=OTEL_ENDPOINT,
    collect_gpu_stats=True,
)

def send_rqst():
    rqst = """
    curl http://localhost:8000/v1/completions \
      -H "Content-Type: application/json" \
      -H "Authorization: Bearer token-abc123" \
      -d '{
        "model": "MyModel",
        "prompt": "San Francisco is a",
        "max_tokens": 7,
        "temperature": 0
      }'
    """
    os.system(rqst)

send_rqst()
```

I can receive the request on the vLLM server, but openlit shows nothing. However, it works fine when I run vLLM inference directly, without the vLLM server. My goal: please have a look at the code above and tell me what is wrong with it. Thank you
Beta Was this translation helpful? Give feedback.

I have found a solution.
To monitor open-webui with openlit you need to install Pipelines, and put the openlit example code inside your pipeline's Python code. An example of a pipeline is here; you can paste my example code into it and adapt it a bit, of course. So instead of sending "hello!" as a prompt, you pass the variable called `user_message`, as shown.
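(Editor's note: a hedged sketch of such a pipeline, assuming the `pipe()` interface used in the open-webui/pipelines examples; the model name, port, token, and OTLP endpoint are taken from the question above, and `<my-ip>` remains a placeholder. The third-party imports are deferred to runtime so the class definition itself has no dependencies; check your installed Pipelines version, since the interface may differ:)

```python
from typing import Generator, Iterator, List, Union

class Pipeline:
    """Hypothetical Open WebUI pipeline that reports each request to openlit."""

    def __init__(self):
        self.name = "OpenLIT Monitoring Pipeline"

    async def on_startup(self):
        # Initialize openlit once, when the pipelines server starts.
        import openlit
        openlit.init(otlp_endpoint="http://<my-ip>:4318")

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # Call the vLLM server through the openai client library, which
        # openlit auto-instruments; the user's prompt arrives here as
        # user_message instead of a hard-coded string.
        from openai import OpenAI

        client = OpenAI(base_url="http://localhost:8000/v1", api_key="token-abc123")
        completion = client.completions.create(
            model="MyModel", prompt=user_message, max_tokens=7, temperature=0
        )
        return completion.choices[0].text
```

Because the request is made in-process by an instrumented client library rather than by a curl subprocess, each chat turn should now appear on the openlit dashboard.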