How to use uclanlp/plbart-multi_task-dynamic with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("uclanlp/plbart-multi_task-dynamic")
model = AutoModelForSeq2SeqLM.from_pretrained("uclanlp/plbart-multi_task-dynamic")
```
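Once the tokenizer and model are loaded, generation follows the standard `transformers` seq2seq pattern. The sketch below is an assumption about typical usage (the input snippet and `max_length` value are illustrative, not taken from the model card); the first run downloads the model weights.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("uclanlp/plbart-multi_task-dynamic")
model = AutoModelForSeq2SeqLM.from_pretrained("uclanlp/plbart-multi_task-dynamic")

# Illustrative input: a small Python function (PLBART is trained on code).
code = "def maximum(a, b):\n    return max(a, b)"
inputs = tokenizer(code, return_tensors="pt")

# Generate a sequence from the encoder-decoder model.
outputs = model.generate(**inputs, max_length=64)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```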