Tokenization 4.3.1 API

Utility Packages

- Core definitions for tokenization.
- Simple fixed-rule tokenizers.
- An implementation of a Wordpiece tokenizer that conforms to the Tribuo Tokenizer API.
- OLCUT Options implementations that can construct Tokenizers of various types.
- An implementation of a "universal" tokenizer that splits on word boundaries, or on character boundaries for languages where word boundaries are contextual.
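The fixed-rule tokenizers mentioned above split text using simple deterministic rules. As a rough sketch of that idea, here is a minimal self-contained whitespace tokenizer; note this is an illustration only, and the `WhitespaceTokenizer` class and its `split` method are hypothetical, not Tribuo's actual Tokenizer interface.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical illustration of a fixed-rule tokenizer: splits on
// whitespace and discards empty tokens. Tribuo's real Tokenizer
// interface differs; this only sketches the general pattern.
public final class WhitespaceTokenizer {
    // Returns the non-empty whitespace-separated tokens of the input.
    public List<String> split(CharSequence text) {
        List<String> tokens = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (int i = 0; i < text.length(); i++) {
            char c = text.charAt(i);
            if (Character.isWhitespace(c)) {
                // A whitespace character closes the current token, if any.
                if (current.length() > 0) {
                    tokens.add(current.toString());
                    current.setLength(0);
                }
            } else {
                current.append(c);
            }
        }
        // Flush a trailing token that was not followed by whitespace.
        if (current.length() > 0) {
            tokens.add(current.toString());
        }
        return tokens;
    }

    public static void main(String[] args) {
        WhitespaceTokenizer tok = new WhitespaceTokenizer();
        System.out.println(tok.split("Hello  Tribuo world"));
    }
}
```

Rule-based tokenizers like this are fast and predictable, which is why libraries typically offer them alongside subword schemes such as Wordpiece for languages and models that need them.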