Models, tokenizers, and preprocessing layers for BERT, as described in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
For a full list of available presets, see the models page.
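As a minimal sketch of how these pieces fit together, the snippet below loads a preset tokenizer and an end-to-end classifier, then fine-tunes on raw strings. It assumes the `keras_nlp` package and the `bert_base_en_uncased` preset name; check the models page for the presets actually available.

```python
import keras_nlp

# Load a preset tokenizer; it maps raw strings to token IDs
# using BERT's WordPiece vocabulary.
tokenizer = keras_nlp.models.BertTokenizer.from_preset("bert_base_en_uncased")
token_ids = tokenizer("The quick brown fox jumped.")

# Load an end-to-end classifier with pretrained backbone weights and a
# fresh classification head. The bundled preprocessor lets it consume
# raw strings directly.
classifier = keras_nlp.models.BertClassifier.from_preset(
    "bert_base_en_uncased",
    num_classes=2,
)

# Fine-tune on a toy labeled dataset (illustrative data only).
classifier.fit(
    x=["The movie was great!", "Total waste of time."],
    y=[1, 0],
    batch_size=2,
)
predictions = classifier.predict(["What an amazing movie!"])
```

To work with raw hidden states instead of a task head, the backbone can be loaded on its own via `keras_nlp.models.BertBackbone.from_preset(...)` and paired with the matching preprocessing layer.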