Just don't get it: OpenAI API in Open WebUI

Hi, I just subscribed to HF Pro and want to migrate my Open WebUI setup from Groq.com to HF models, and of course contribute to the community.
Now, a very basic question: I want to add HF models using an OpenAI connection. I created an API key and pointed the connection at the API URL https://api-inference.huggingface.co/models/Qwen/Qwen3-235B-A22B. However, no connection is possible. What is my error?
Thanks, Robert


It seems Hugging Face models can also be used in a similar way to the Groq and OpenAI APIs. The Inference API on Hugging Face was significantly revamped recently and integrated into Inference Providers, so I recommend looking into that.

Thanks for your prompt reply, John!

What finally worked was putting the API URL
https://router.huggingface.co/novita/v3/openai into the Open WebUI connections.

This gives me access to ~50 models on Novita. I tried some other
inference providers without luck, so for now I'm good to go and will start
testing; however, the documentation on the API could still be improved.
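In case it helps anyone debugging the same connection, here is a minimal sketch of what an OpenAI-compatible request against that router URL looks like. The model id, the HF_TOKEN environment variable, and the placeholder token are assumptions; substitute whatever your provider actually lists. The request is only constructed here, not sent, so it can be inspected before wiring it into a client:

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible endpoint on the Hugging Face router;
# swap in whichever provider route works for you.
BASE_URL = "https://router.huggingface.co/novita/v3/openai"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    token = os.environ.get("HF_TOKEN", "hf_xxx")  # placeholder token
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("qwen/qwen3-235b-a22b", "Hello!")
print(req.full_url)
```

Open WebUI does essentially the same thing under the hood, which is why the base URL has to be the OpenAI-style route rather than the per-model serverless URL I tried first.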

Best, Robert
