XLM-RoBERTa-large-XNLI

by Hugging Face

Main use cases: A model fine-tuned for intent recognition and text classification. It is based on XLM-RoBERTa, the multilingual variant of RoBERTa (itself a refinement of BERT), and can recognize intents even in complex emails when only the names of the candidate intents are supplied (zero-shot); see the usage sketch below the specs.

 
Input length: 512 tokens (approx. 384 words)  

 
Languages: English, French, German, Spanish, Greek & 10 others  

 
Model size: ~550 million parameters
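
Example: the zero-shot intent recognition described above can be exercised with the zero-shot-classification pipeline in the Hugging Face transformers library. The sketch below is illustrative only; the checkpoint name "joeddav/xlm-roberta-large-xnli" (a commonly used XNLI fine-tune of XLM-RoBERTa-large on the Hugging Face Hub), the sample email, and the intent names are assumptions rather than part of this description.

from transformers import pipeline

# Zero-shot classification pipeline backed by an XNLI fine-tune of XLM-RoBERTa-large.
# The checkpoint name is an assumption; substitute your own if the model is hosted elsewhere.
classifier = pipeline(
    "zero-shot-classification",
    model="joeddav/xlm-roberta-large-xnli",
)

# Hypothetical customer email (kept well under the 512-token input limit).
email = (
    "Hello, I moved last month and would like to update the delivery "
    "address on my account before my next order ships."
)

# Only the names of the candidate intents are supplied (zero-shot).
intents = ["change of address", "cancellation", "complaint", "billing question"]

result = classifier(email, candidate_labels=intents)
print(result["labels"][0], result["scores"][0])  # highest-scoring intent and its score

Because the underlying model is multilingual, the email and the intent names do not have to be in the same language; a German email can, for instance, be scored against English intent names.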
