Llama3.2-1B Collection
Small VLM -> VLA models for running on edge devices. 3 items.
| Model | Parameters (B) | VQAv2 (%) | GQA (%) | RefCOCO+ (%) | RefCOCO (%) | RefCOCOg (%) | OCID-Ref (%) | VSR (%, most important) | Average (%) |
|---|---|---|---|---|---|---|---|---|---|
| qwen3 | 0.6 | 74.13 | 61.05 | 59.50 | 67.90 | 63.60 | 40.50 | 54.20 | 65.24 |
| LLaMA3.2-1B | 1.0 | 73.84 | 61.08 | 61.40 | 69.40 | 62.90 | 43.00 | 49.10 | 65.72 |
| Llama-2 | 7.0 | 77.08 | 62.44 | 59.47 | - | - | 43.89 | 63.67 | 66.33 |

Note: Average (%) matches the mean over VQAv2, GQA, RefCOCO+, RefCOCO, and RefCOCOg only (OCID-Ref and VSR are excluded, and missing entries are skipped).
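For reference, the Average column can be reproduced from the per-benchmark scores. This is a minimal sketch; the choice of benchmarks included in the average (VQAv2, GQA, and the three RefCOCO variants, with unreported entries skipped) is inferred from the reported numbers, not stated on this card.

```python
# Scores in column order: VQAv2, GQA, RefCOCO+, RefCOCO, RefCOCOg.
# None marks a benchmark the row does not report ("-" in the table).
scores = {
    "qwen3":       [74.13, 61.05, 59.50, 67.90, 63.60],
    "LLaMA3.2-1B": [73.84, 61.08, 61.40, 69.40, 62.90],
    "Llama-2":     [77.08, 62.44, 59.47, None, None],
}

def average(vals):
    """Mean over the reported (non-None) benchmark scores, rounded to 2 dp."""
    present = [v for v in vals if v is not None]
    return round(sum(present) / len(present), 2)

for model, vals in scores.items():
    print(model, average(vals))
```

Running this reproduces the table's 65.24 / 65.72 / 66.33 values.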
Training setup: trained on LLaVA-Instruct-150K and the LLaVA-v1.5 instruct mix (`llava-v1.5-instruct/llava_v1_5_mix665k.json`) for 2 epochs.
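For anyone preparing the same data, a minimal loader sketch follows. It assumes the layout of the public LLaVA release, where `llava_v1_5_mix665k.json` is a JSON list of records with `id`, `image`, and `conversations` fields; the sample record below is illustrative, not taken from the actual file.

```python
import json
import os
import tempfile

# Illustrative record in the LLaVA instruct-mix layout (assumed, not
# copied from the real llava_v1_5_mix665k.json).
sample = [
    {
        "id": "000000033471",
        "image": "coco/train2017/000000033471.jpg",
        "conversations": [
            {"from": "human", "value": "<image>\nWhat is in the picture?"},
            {"from": "gpt", "value": "A bus on a street."},
        ],
    }
]

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "mix.json")
    with open(path, "w") as f:
        json.dump(sample, f)

    # The file is one JSON array, so a single json.load suffices.
    with open(path) as f:
        records = json.load(f)

    # Pair each human turn with the assistant turn that follows it.
    for rec in records:
        turns = rec["conversations"]
        pairs = [
            (turns[i]["value"], turns[i + 1]["value"])
            for i in range(0, len(turns) - 1, 2)
            if turns[i]["from"] == "human"
        ]
        print(rec["id"], len(pairs))
```

Each (prompt, response) pair can then be tokenized and fed to the trainer of choice.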