Models, tokenizers, and preprocessing layers for RoBERTa, as described in "RoBERTa: A Robustly Optimized BERT Pretraining Approach".
For a full list of available presets, see the models page.
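As a quick illustration of how these pieces fit together, the sketch below loads a tokenizer and an end-to-end classifier from the same preset and runs prediction on raw strings. It assumes the `keras_nlp` package and the `roberta_base_en` preset name; check the presets list for the names actually available in your installed version.

```python
import keras_nlp

# Load a tokenizer and a classifier from the same RoBERTa preset.
tokenizer = keras_nlp.models.RobertaTokenizer.from_preset("roberta_base_en")
classifier = keras_nlp.models.RobertaClassifier.from_preset(
    "roberta_base_en",
    num_classes=2,
)

# Task models bundle their own preprocessing layer, so raw strings
# can be passed directly without manual tokenization.
predictions = classifier.predict([
    "What an amazing movie!",
    "A total waste of my time.",
])
```

When only the raw contextual embeddings are needed, the backbone can be loaded on its own (e.g. `keras_nlp.models.RobertaBackbone.from_preset(...)`) and combined with a preprocessing layer, rather than going through a task model.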