Creating a custom pre-tokenizer · Issue #269 · huggingface/tokenizers

Hello, I was trying to subclass the PreTokenizer class from tokenizers.pre_tokenizers in order to create my own custom pre-tokenizer that splits text at the word level, but this doesn't seem to work ...
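
For context, a minimal sketch of the kind of custom word-level pre-tokenizer the Python bindings support. This is not the original (truncated) code from the report: it assumes a recent `tokenizers` release that exposes `PreTokenizer.custom`, and the `WordSplitter` class and its method names are hypothetical illustrations.

```python
import re
from typing import List

from tokenizers import NormalizedString, PreTokenizedString, Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import PreTokenizer


class WordSplitter:
    """Hypothetical pre-tokenizer that splits text into whitespace-separated words."""

    def split_words(self, i: int, normalized: NormalizedString) -> List[NormalizedString]:
        # Slice the NormalizedString (rather than plain strings) so offsets stay aligned
        return [normalized[m.start():m.end()] for m in re.finditer(r"\S+", str(normalized))]

    def pre_tokenize(self, pretok: PreTokenizedString):
        # Apply the splitting function to every current piece of the PreTokenizedString
        pretok.split(self.split_words)


tokenizer = Tokenizer(BPE())
# Wrap the Python object so the Rust core can call back into it
tokenizer.pre_tokenizer = PreTokenizer.custom(WordSplitter())
```

Note that, as far as I understand, a tokenizer carrying a custom Python pre-tokenizer cannot be serialized, since the Rust side cannot save the Python object.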