```python
tok_exp = nlp.tokenizer.explain("(don't)")
assert [t[0] for t in tok_exp] == ["PREFIX", "SPECIAL-1", "SPECIAL-2", "SUFFIX"]
assert [t[1] for t in tok_exp] == ["(", "do", "n't", ")"]
```

NEW:

- Official Python 3.8 wheels for spaCy and its dependencies.
- Base language support for Korean.
- Add `Scorer.las_per_type` (labelled dependency scores per label).
- Rework Chinese language initialization and tokenization.
- Improve language data for Luxembourgish.

🔴 Bug fixes

- Fix issue #4573, #4645: Improve tokenizer usage docs.
- Fix issue #4575: Add error in `debug-data` if no dev docs are available.
- Fix issue #4582: Make `as_tuples=True` in `Language.pipe` work with multiprocessing.
- Fix issue #4590: Correctly call `on_match` in `DependencyMatcher`.
- Fix issue #4593: Build wheels for Python 3.8.
- Fix issue #4604: Fix realloc in `Retokenizer.split`.
- Fix issue #4656: Fix `conllu2json` converter when `-n > 1`.
- Fix ...
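To illustrate what "labelled dependency scores per label" means for the new `Scorer.las_per_type`, here is a minimal pure-Python sketch (not spaCy's actual implementation — the `las_per_type` helper and its dict-based arc format are assumptions for illustration): a token counts as correct for its gold label only when both its predicted head and dependency label match the gold annotation.

```python
from collections import defaultdict

def las_per_type(gold, pred):
    # gold/pred map token index -> (head index, dependency label).
    # Accumulate [correct, total] per gold label, then convert to percentages.
    counts = defaultdict(lambda: [0, 0])
    for i, (g_head, g_dep) in gold.items():
        counts[g_dep][1] += 1
        if pred.get(i) == (g_head, g_dep):
            counts[g_dep][0] += 1
    return {dep: 100.0 * c / t for dep, (c, t) in counts.items()}

gold = {1: (0, "nsubj"), 2: (2, "ROOT"), 3: (2, "dobj")}
pred = {1: (0, "nsubj"), 2: (2, "ROOT"), 3: (1, "dobj")}  # wrong head for token 3
print(las_per_type(gold, pred))  # {'nsubj': 100.0, 'ROOT': 100.0, 'dobj': 0.0}
```

Scoring per label like this makes it easy to spot which dependency relations a parser systematically gets wrong.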
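For context on the #4582 fix, this is a quick sketch of the `as_tuples=True` pattern in `Language.pipe`, which pairs each processed `Doc` with its original context object (the texts and the `"id"` context key here are illustrative; the same pattern now also works with `n_process > 1`):

```python
import spacy

# A blank pipeline (tokenizer only) keeps the example self-contained;
# no pretrained model download is required.
nlp = spacy.blank("en")

data = [("A first text.", {"id": 1}), ("A second text.", {"id": 2})]

# With as_tuples=True, pipe() consumes (text, context) pairs and
# yields (Doc, context) pairs, preserving order.
for doc, ctx in nlp.pipe(data, as_tuples=True):
    print(ctx["id"], doc.text)
```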