ybabakhin michaelfeil committed on
Commit 46b6d10 · 1 Parent(s): d6f618a

"use_bidirectional_attention": true flag (#13)

- "use_bidirectional_attention": true flag (99553f4bd18a68b164c857ef75198651b13e7c19)


Co-authored-by: Michael <michaelfeil@users.noreply.huggingface.co>

Files changed (1)
  1. config.json +2 -1
config.json CHANGED
@@ -37,5 +37,6 @@
   "torch_dtype": "bfloat16",
   "transformers_version": "4.44.2",
   "use_cache": true,
-  "vocab_size": 128256
+  "vocab_size": 128256,
+  "use_bidirectional_attention": true
 }
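For context, a minimal sketch of how downstream code might read the new flag. The JSON fragment reproduces only the tail of config.json shown in this diff (the full file has more keys), and the branching on `attention_mode` is a hypothetical consumer, not code from this repository:

```python
import json

# Tail of config.json after this commit (fragment from the diff above).
config_text = """
{
  "torch_dtype": "bfloat16",
  "transformers_version": "4.44.2",
  "use_cache": true,
  "vocab_size": 128256,
  "use_bidirectional_attention": true
}
"""

config = json.loads(config_text)

# A consumer could branch on the flag to pick the attention mask;
# defaulting to causal keeps older configs (without the key) working.
if config.get("use_bidirectional_attention", False):
    attention_mode = "bidirectional"  # full (non-causal) attention
else:
    attention_mode = "causal"

print(attention_mode)  # -> bidirectional
```

Defaulting to `False` when the key is absent mirrors how optional config flags are usually handled, so configs predating this commit still load.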