
[2304.12404] Semantic Tokenizer for Enhanced Natural Language Processing

source link: https://arxiv.org/abs/2304.12404

[Submitted on 24 Apr 2023]

Semantic Tokenizer for Enhanced Natural Language Processing


Traditionally, NLP performance improvements have focused on better models and larger parameter counts, while vocabulary construction has remained focused on maximizing the number of words represented through subword regularization. We present a novel tokenizer that uses semantics to drive vocabulary construction. The tokenizer includes a trainer that uses stemming to enhance subword formation; further optimizations and adaptations minimize the number of words that cannot be encoded, and the encoder is updated to integrate with the trainer. The tokenizer is implemented as a drop-in replacement for the SentencePiece tokenizer. The new tokenizer more than doubles the number of wordforms represented in the vocabulary. This enhanced vocabulary significantly improves NLP model convergence and the quality of word and sentence embeddings. Our experimental results show top performance on two GLUE tasks using BERT-base, outperforming models more than 50x its size.
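The abstract gives no implementation details, but the core idea of stemming-driven vocabulary construction can be illustrated. Below is a minimal, hypothetical Python sketch of that idea, not the paper's actual trainer or encoder: it uses NLTK's PorterStemmer and a BERT-style "##" suffix marker purely for illustration (the paper's tokenizer integrates with SentencePiece, whose piece conventions differ), and all names and heuristics are assumptions.

from collections import Counter
from nltk.stem import PorterStemmer  # pure-Python; no corpus download required

stemmer = PorterStemmer()

def train_vocab(corpus_words, vocab_size=8):
    # Hypothetical trainer: count stems and suffixes as subword candidates.
    counts = Counter()
    for word in corpus_words:
        stem = stemmer.stem(word)
        if word != stem and word.startswith(stem):
            counts[stem] += 1
            counts["##" + word[len(stem):]] += 1  # suffix piece
        else:
            counts[word] += 1  # irregular form: keep the whole word
    return {piece for piece, _ in counts.most_common(vocab_size)}

def encode(word, vocab):
    # Emit stem + suffix when both pieces are in the vocabulary.
    stem = stemmer.stem(word)
    if word != stem and word.startswith(stem):
        suffix = "##" + word[len(stem):]
        if stem in vocab and suffix in vocab:
            return [stem, suffix]
    return [word]  # a real tokenizer would back off to finer pieces

words = ["run", "running", "runs", "runner", "walk", "walked", "walking"]
vocab = train_vocab(words)
print(encode("walking", vocab))  # -> ['walk', '##ing']

In this sketch, whole words and stem+suffix splits compete for vocabulary slots by frequency, which is one plausible way a stem-aware trainer could represent more wordforms within the same vocabulary budget.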

Subjects: Computation and Language (cs.CL)
Cite as: arXiv:2304.12404 [cs.CL]
  (or arXiv:2304.12404v1 [cs.CL] for this version)
  https://doi.org/10.48550/arXiv.2304.12404
