How to use FoundationVision/groma-7b-finetune with Transformers:
# Load model directly
from transformers import AutoModel

# Groma is a custom vision-language architecture; if loading fails because the
# "groma" model type is not registered in your transformers version, you may
# need to pass trust_remote_code=True (assumption, not confirmed by this page).
model = AutoModel.from_pretrained(
    "FoundationVision/groma-7b-finetune",
    dtype="auto",
)