https://github.com/huggingface/transformers/blame/638d49983f36af910934b38771b4e55c835c1774/src/transformers/tokenization_utils_base.py#L2253

I was trying to use [text-generation-webui](https://github.com/oobabooga/text-generation-webui/issues/4370), and tokenizer loading is broken across quite a few loaders that depend on the transformers library.

@ArthurZucker Were you just trying to override the special token values in ``init_kwargs``? I changed the statement as below, locally, to get things working again:

``init_kwargs[key] = added_tokens_map.get(key, init_kwargs[key])``

I didn't open a PR because I'm not sure what the overall intent is; prior to this change, the flow was different. Appreciate your attention on this. Thanks :)
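
For reference, here is a minimal, self-contained sketch of what the changed line does, with made-up dict contents (this is not the actual ``from_pretrained`` code, just the lookup pattern in isolation): ``.get(key, init_kwargs[key])`` only overrides a value when ``added_tokens_map`` actually has an entry for that key, and otherwise leaves the existing ``init_kwargs`` value untouched.

```python
# Minimal sketch (not the actual transformers source) of the lookup pattern.
# The dict contents below are invented purely for illustration.

init_kwargs = {"pad_token": "<pad>", "eos_token": "</s>"}  # hypothetical values
added_tokens_map = {"eos_token": "<|endoftext|>"}          # no "pad_token" entry

for key in list(init_kwargs):
    # A direct lookup, added_tokens_map[key], would raise KeyError for
    # "pad_token" in this example. Using .get() keeps the existing
    # init_kwargs value whenever the map has no override for that key.
    init_kwargs[key] = added_tokens_map.get(key, init_kwargs[key])

print(init_kwargs)
# {'pad_token': '<pad>', 'eos_token': '<|endoftext|>'}
```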