GPU Community Grant Request - Docker Model Runner
#1 opened by likhonsheikhdev
GPU Community Grant Application
Project Overview
Docker Model Runner is an open-source, self-hosted API server that provides full Anthropic Messages API compatibility, enabling developers to run local LLMs with Claude Code, Cursor, and other AI tools.
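As a sketch of what Messages API compatibility means in practice, a request to the server follows Anthropic's request schema. The host, port, and model name below are illustrative assumptions, not documented values for this project:

```python
import json

# Hypothetical local endpoint; adjust host/port to your deployment (assumption).
BASE_URL = "http://localhost:8080/v1/messages"

# Minimal request body following the Anthropic Messages API schema.
# The model name "llama3" is illustrative; use whatever model the
# server has loaded locally.
payload = {
    "model": "llama3",
    "max_tokens": 256,
    "stream": True,  # streaming is listed among the server's features
    "messages": [
        {"role": "user", "content": "Summarize this repo in one sentence."}
    ],
}

print(json.dumps(payload, indent=2))
```

A client such as Claude Code can then be pointed at the local base URL instead of api.anthropic.com; the exact configuration mechanism depends on the tool.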
Why GPU Access?
The project currently runs on CPU Basic (2 vCPU, 16 GB RAM). With GPU access we could:
- Run larger models (Llama 3, Mistral, Phi-3)
- Serve inference roughly 10-100x faster
- Stream responses in real time for Claude Code
- Handle more concurrent users
Open Source
- GitHub: https://github.com/mariarudushibd/docker-model-runner
- License: MIT
- Features: Anthropic API, OpenAI API, Streaming, Tool Calling, Interleaved Thinking
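Since streaming is among the supported features, here is a minimal sketch of decoding one server-sent event in the Anthropic streaming format. The event payload shown is a hand-written example of that format, not captured output from this server:

```python
import json

# One SSE line in the Anthropic streaming format (illustrative example,
# not captured output from Docker Model Runner).
line = 'data: {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "Hello"}}'

# Strip the SSE "data: " prefix and decode the JSON event.
event = json.loads(line[len("data: "):])
text = event["delta"]["text"]
print(text)
```

A real client would read such lines from the HTTP response stream and concatenate the `text_delta` fragments into the final message.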
Requested Hardware
A T4 small or L4 GPU is preferred.
Thank you for considering this request! @huggingface
Built for the open-source AI community