Gemma-merged-2B-ties

Gemma-merged-2B-ties is a merge of the following models using mergekit:

- google/gemma-2b
- google/gemma-2b-it

🧩 Configuration

models:
  - model: google/gemma-2b
    parameters:
      density: 0.5
      weight: 0.5
  - model: google/gemma-2b-it
    parameters:
      density: 0.5
      weight: 0.5 # weight gradient
merge_method: ties
base_model: google/gemma-2b
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
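
💻 Usage

To reproduce the merge, the configuration above can be saved to a YAML file and passed to mergekit's `mergekit-yaml` command (for example, `mergekit-yaml config.yaml ./Gemma-merged-2B-ties`). The resulting checkpoint can then be loaded like any other Gemma model with transformers. The snippet below is a minimal sketch, assuming the model is published as arcee-ai/Gemma-merged-2B-ties and that a recent transformers release with Gemma support (plus accelerate for `device_map="auto"`) is installed.

```python
# Minimal usage sketch; model id and generation settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Gemma-merged-2B-ties"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype set in the merge config
    device_map="auto",
)

prompt = "Explain model merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```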