---
license: apache-2.0
language:
- en
- lt
pipeline_tag: text-generation
tags:
- gguf
- llama.cpp
- openwebui
- lithuanian
- bilingual
- local-ai
---
# 🧠 ZygAI — Neutral Bilingual AI Engine (LT / EN)
**ZygAI** is a neutral, locally hosted AI engine designed for **Lithuanian 🇱🇹 and English 🇬🇧** language tasks.
It is built for **local inference**, **server-based usage**, and **runtime personas**.
> ZygAI is a **base engine**, not a chatbot persona.
> Behavior and specialization are applied at runtime (OpenWebUI / API).
# ✨ Key Features
- 🇱🇹 / 🇬🇧 **True bilingual support**
- ⚡ Optimized **GGUF** models for `llama.cpp`
- 🧩 Supports **runtime personas** (MiniGPTs, system prompts)
- 🧠 Clean identity — **no vendor branding**
- 🖥️ Designed for **systemd + server deployments**
- 🔀 Supports **GGUF shards** (no merge required)
# 🧠 Architecture Overview
```
ZygAI (base engine)
├── Q4 → fast / high throughput
├── Q5 → balanced / general usage
└── Q8 → high quality / reasoning
```
- **ZygAI** = neutral engine
- No hardcoded system prompt in the model
# 📦 Available Quantizations
| Quantization | Purpose | Notes |
|-------------|--------|------|
| **Q4_K_M** | Fast | Best speed, low memory |
| **Q5_K_M** | Balanced | Default general use |
| **Q8_0** | High quality | Best reasoning, higher RAM |
> Models may be provided as **GGUF shards** (`-00001-of-00002.gguf`).
> `llama.cpp` loads shards automatically — **no merge required**.
# 🚀 Running ZygAI (llama.cpp server)
# Example: Q4 (shard-based)
```
./llama-server \
-m ZygAI-q4_k_m-00001-of-00002.gguf \
--host 0.0.0.0 \
--port 8081 \
--ctx-size 4096 \
--threads 4 \
--batch-size 2048 \
--jinja
```
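Because ZygAI is meant to run as a long-lived local service, the same command can be wrapped in a systemd unit. The sketch below is an assumption, not part of this repository: the unit name, install paths, and model filename are placeholders and should be adapted to your layout.
```
# /etc/systemd/system/zygai-q4.service  (hypothetical paths)
[Unit]
Description=ZygAI Q4 llama.cpp server
After=network.target

[Service]
WorkingDirectory=/opt/zygai
ExecStart=/opt/zygai/llama-server \
  -m /opt/zygai/models/ZygAI-q4_k_m-00001-of-00002.gguf \
  --host 0.0.0.0 --port 8081 \
  --ctx-size 4096 --threads 4 --batch-size 2048 --jinja
Restart=on-failure

[Install]
WantedBy=multi-user.target
```
Enable it with `systemctl enable --now zygai-q4`.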
# Multiple models (recommended)
| Model | Port |
| --- | --- |
| Q4 | 8081 |
| Q5 | 8082 |
| Q8 | 8083 |
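Each quantization is simply a separate `llama-server` instance on its own port. A rough sketch (the GGUF filenames are assumptions; match them to the files you actually downloaded):
```
# Q5 on 8082
./llama-server -m ZygAI-q5_k_m-00001-of-00002.gguf --host 0.0.0.0 --port 8082 --ctx-size 4096 --jinja &

# Q8 on 8083
./llama-server -m ZygAI-q8_0.gguf --host 0.0.0.0 --port 8083 --ctx-size 4096 --jinja &
```
For unattended servers, each instance would get its own systemd unit along the lines of the sketch above.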
# 🧰 Using with OpenWebUI
* Provider: **OpenAI (local)**
* Base URL: `http://127.0.0.1:PORT/v1`
* Auth: none
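Before pointing OpenWebUI at a port, the endpoint can be sanity-checked with a plain OpenAI-style request (port 8081 taken from the Q4 example above):
```
curl http://127.0.0.1:8081/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ZygAI",
        "messages": [
          {"role": "user", "content": "Kada Lietuva įstojo į Europos Sąjungą?"}
        ]
      }'
```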
# Important
ZygAI is designed for the **llama.cpp backend**.
System prompts and personas work **correctly only with llama.cpp**, not with Ollama.
# 🎭 Personas (Recommended)
ZygAI is intentionally **neutral**.
Specialization is applied at runtime through personas, i.e. system prompts set in OpenWebUI or sent via the API; a sketch follows.
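As an illustration, a persona is nothing more than a system prompt applied on top of the neutral engine. The name and wording below are purely hypothetical:
```
You are "Dėstytojas", a concise tutoring assistant built on ZygAI.
Answer in the same language as the user (Lithuanian or English).
Keep explanations short, factual, and free of marketing tone.
```
In OpenWebUI this text goes into the model's system prompt field; via the API it is sent as the `system` message.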
# 🌍 Language Behavior
* Responds **in the same language as the user**
* No automatic language switching
* No mixed-language replies unless requested
```
Examples:
User (EN):
> What is Lithuania?
Assistant:
> Lithuania is a country located in the Baltic region of Eastern Europe.
User (LT):
> Kada Lietuva įstojo į Europos Sąjungą?
Assistant:
> Lietuva įstojo į Europos Sąjungą 2004 m. gegužės 1 d.
```
# 📜 License
Apache 2.0
This repository provides **inference-only model files**.
Base model weights originate from publicly available sources and are redistributed according to their respective licenses.
# 🔒 Notes
* ZygAI is **not** ChatGPT
* ZygAI is **not** a vendor-branded assistant
* ZygAI is designed for **local-first, privacy-respecting AI**
# 📖 Citation
If you use **ZygAI** in research, development, or documentation, please cite it as follows:
```
@software{zygai-7b,
  title     = {ZygAI: Neutral Bilingual AI Engine for Lithuanian and English},
  author    = {Mažeika, Žygimantas},
  year      = {2025},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/ZygAI},
  license   = {Apache-2.0},
  note      = {Local-first GGUF models optimized for llama.cpp with runtime personas}
}
```