Bugs encountered while using transformers
1. The following model_kwargs are not used by the model: ['encoder_hidden_states', 'encoder_attention_mask'] (note: typos in the generate arguments will also show up in this list)
Calling generate() on the text_decoder (a BertGenerationDecoder, built as shown below) raises the error above. The cause is a transformers version incompatibility: judging from the working ranges listed in the fix below, the strict model_kwargs validation that generate() gained in 4.22.0 wrongly rejects the cross-attention arguments encoder_hidden_states and encoder_attention_mask, and the check was fixed again in 4.25.0.
from transformers import AutoConfig, BertGenerationDecoder

# Build the decoder from the checkpoint's config
decoder_config = AutoConfig.from_pretrained(args['text_checkpoint'])
text_decoder = BertGenerationDecoder(config=decoder_config)

# On affected versions, passing the cross-attention kwargs triggers the error
output = text_decoder.generate(
    input_ids=cls_input_ids,
    encoder_hidden_states=encoder_hidden_states,
    encoder_attention_mask=encoder_attention_mask,
    max_length=args['max_seq_length'],
    do_sample=True,
    num_beams=args['beam_size'],
    length_penalty=1.0,
    use_cache=True,
)
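Independently of the version issue, BertGenerationDecoder only consumes encoder_hidden_states when its config enables decoder mode and cross-attention. If the checkpoint was saved as a plain encoder, the flags can be set before constructing the model (a sketch reusing args['text_checkpoint'] from the snippet above; whether your checkpoint needs this depends on how it was saved):

from transformers import AutoConfig, BertGenerationDecoder

decoder_config = AutoConfig.from_pretrained(args['text_checkpoint'])
decoder_config.is_decoder = True           # causal masking in self-attention
decoder_config.add_cross_attention = True  # adds layers that attend to encoder_hidden_states
text_decoder = BertGenerationDecoder(config=decoder_config)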
Solution: switch transformers to a version in one of the following ranges: 4.15.0 <= transformers < 4.22.0, or transformers >= 4.25.0.
For example: pip install transformers==4.25.1 or pip install transformers==4.20.1
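To catch an incompatible environment early rather than at generation time, a small startup guard can check the installed version (a minimal sketch; the version ranges are copied from the workaround above, and packaging is assumed to be available since transformers depends on it):

import transformers
from packaging.version import Version

# Reject the transformers versions whose generate() model_kwargs validation is broken
v = Version(transformers.__version__)
ok = Version("4.15.0") <= v < Version("4.22.0") or v >= Version("4.25.0")
if not ok:
    raise RuntimeError(
        f"transformers {v} rejects encoder_hidden_states in generate(); "
        "install 4.15.0<=transformers<4.22.0 or transformers>=4.25.0"
    )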