# 91cbcc5f6ad7c31e90bf7c977c225302
This model is a fine-tuned version of facebook/opt-125m on the google/boolq dataset. It achieves the following results on the evaluation set:
- Loss: 1.3196
- Data Size: 1.0
- Epoch Runtime: 23.4913
- Accuracy: 0.6734
- F1 Macro: 0.6517
- Rouge1: 0.6740
- Rouge2: 0.0
- Rougel: 0.6731
- Rougelsum: 0.6728
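Below is a minimal usage sketch. It assumes the checkpoint is published under the repo id `contemmcm/91cbcc5f6ad7c31e90bf7c977c225302` and that the fine-tuned model is loaded as a causal language model that generates a yes/no answer for a BoolQ-style (passage, question) prompt; the prompt template shown is an illustrative assumption, not the exact format used during fine-tuning.

```python
# Minimal inference sketch (assumptions: causal-LM head, illustrative prompt template).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "contemmcm/91cbcc5f6ad7c31e90bf7c977c225302"  # assumed published repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

passage = "BoolQ is a question answering dataset for yes/no questions."
question = "is boolq a yes/no question answering dataset"
prompt = f"{passage}\nQuestion: {question}?\nAnswer:"  # hypothetical template

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=3, do_sample=False)
# Strip the prompt tokens and keep only the generated answer.
answer = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer.strip())
```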
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
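Although the split details are not documented here, the card names google/boolq as the dataset. A minimal loading sketch, assuming the standard Hub splits:

```python
# Load the google/boolq dataset from the Hugging Face Hub.
from datasets import load_dataset

boolq = load_dataset("google/boolq")
print(boolq)                   # train and validation splits
print(boolq["validation"][0])  # fields: question, answer (bool), passage
```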
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: constant
- num_epochs: 50
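For reference, the hyperparameters above roughly correspond to the following `TrainingArguments`. This is a hedged reconstruction rather than the published training script; `output_dir` and the progressive data-size schedule visible in the results table are placeholders/assumptions.

```python
# Hedged reconstruction of the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="opt-125m-boolq",       # hypothetical output directory
    learning_rate=5e-05,
    per_device_train_batch_size=8,     # 4 devices -> total train batch size 32
    per_device_eval_batch_size=8,      # 4 devices -> total eval batch size 32
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="constant",
    num_train_epochs=50,
)
```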
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime | Accuracy | F1 Macro | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0 | 0 | 0.7327 | 0 | 3.2602 | 0.6081 | 0.4083 | 0.6078 | 0.0 | 0.6078 | 0.6078 |
| No log | 1 | 294 | 0.7389 | 0.0078 | 3.6040 | 0.4473 | 0.4417 | 0.4473 | 0.0 | 0.4476 | 0.4473 |
| No log | 2 | 588 | 0.6716 | 0.0156 | 3.5576 | 0.6137 | 0.4317 | 0.6137 | 0.0 | 0.6131 | 0.6140 |
| No log | 3 | 882 | 0.6609 | 0.0312 | 4.1385 | 0.6198 | 0.3879 | 0.6198 | 0.0 | 0.6192 | 0.6195 |
| 0.0292 | 4 | 1176 | 0.6609 | 0.0625 | 4.7772 | 0.6204 | 0.3859 | 0.6204 | 0.0 | 0.6201 | 0.6203 |
| 0.0546 | 5 | 1470 | 0.6584 | 0.125 | 6.0451 | 0.6219 | 0.3850 | 0.6219 | 0.0 | 0.6216 | 0.6216 |
| 0.093 | 6 | 1764 | 0.6289 | 0.25 | 8.8599 | 0.6547 | 0.5885 | 0.6547 | 0.0 | 0.6547 | 0.6555 |
| 0.5788 | 7 | 2058 | 0.6185 | 0.5 | 13.7694 | 0.6596 | 0.6478 | 0.6593 | 0.0 | 0.6599 | 0.6599 |
| 0.4664 | 8 | 2352 | 0.6135 | 1.0 | 24.4934 | 0.6774 | 0.6672 | 0.6774 | 0.0 | 0.6771 | 0.6774 |
| 0.2417 | 9 | 2646 | 0.9035 | 1.0 | 23.2813 | 0.6927 | 0.6741 | 0.6930 | 0.0 | 0.6924 | 0.6927 |
| 0.1696 | 10 | 2940 | 1.5665 | 1.0 | 23.5260 | 0.6673 | 0.6585 | 0.6673 | 0.0 | 0.6673 | 0.6667 |
| 0.1116 | 11 | 3234 | 1.7915 | 1.0 | 23.3186 | 0.6900 | 0.6749 | 0.6900 | 0.0 | 0.6896 | 0.6900 |
| 0.0833 | 12 | 3528 | 1.3196 | 1.0 | 23.4913 | 0.6734 | 0.6517 | 0.6740 | 0.0 | 0.6731 | 0.6728 |
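The reported metrics (accuracy, macro F1, ROUGE) can be reproduced approximately with the `evaluate` library. The exact evaluation script is not published, so the mapping of generated "true"/"false" strings to class ids below is an assumption.

```python
# Sketch of computing accuracy, macro F1, and ROUGE over generated answers.
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")
rouge = evaluate.load("rouge")

predictions = ["true", "false", "true"]   # hypothetical model outputs
references = ["true", "false", "false"]   # hypothetical gold answers

to_id = {"false": 0, "true": 1}           # assumed label mapping
pred_ids = [to_id[p] for p in predictions]
ref_ids = [to_id[r] for r in references]

print(accuracy.compute(predictions=pred_ids, references=ref_ids))
print(f1.compute(predictions=pred_ids, references=ref_ids, average="macro"))
print(rouge.compute(predictions=predictions, references=references))
```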
### Framework versions
- Transformers 4.57.0
- PyTorch 2.8.0+cu128
- Datasets 4.3.0
- Tokenizers 0.22.1
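To check that a local environment matches the versions above (purely illustrative):

```python
# Print installed versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.57.0
print("PyTorch:", torch.__version__)              # expected 2.8.0+cu128
print("Datasets:", datasets.__version__)          # expected 4.3.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.22.1
```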