LongRoPE: Extending LLM Context Window Beyond 2M Tokens
source link: https://arxiv.org/abs/2402.13753
A large context window is a desirable feature in large language models (LLMs). However, due to high fine-tuning costs, the scarcity of long texts, and catastrophic values introduced by new token positions, current extended context windows are limited to around 128k tokens. This paper introduces LongRoPE, which, for the first time, extends the context window of pre-trained LLMs to an impressive 2048k tokens, with only up to 1k fine-tuning steps at training lengths within 256k, while maintaining performance at the original short context window. This is achieved by three key innovations: (i) we identify and exploit two forms of non-uniformity in positional interpolation through an efficient search, providing a better initialization for fine-tuning and enabling an 8x extension in non-fine-tuning scenarios; (ii) we introduce a progressive extension strategy that first fine-tunes a 256k-length LLM and then conducts a second positional interpolation on the fine-tuned extended LLM to achieve a 2048k context window; (iii) we readjust LongRoPE on 8k length to recover the short-context-window performance. Extensive experiments on LLaMA2 and Mistral across various tasks demonstrate the effectiveness of our method. Models extended via LongRoPE retain the original architecture with minor modifications to the positional embedding, and can reuse most pre-existing optimizations.
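To make the idea of non-uniform positional interpolation concrete, the following is a minimal illustrative sketch (not the paper's implementation): standard RoPE computes rotary angles from per-dimension frequencies, uniform positional interpolation divides all frequencies by a single extension factor, and LongRoPE-style non-uniformity instead assigns a different rescale factor to each dimension pair. The `rescale` values below are made-up placeholders, not the factors found by the paper's evolutionary search.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0, rescale=None):
    """Rotary angles theta(p, i) = p * inv_freq[i] / lambda[i].

    `rescale` holds one interpolation factor per dimension pair
    (lambda[i] >= 1). A single shared value reproduces uniform
    positional interpolation; per-dimension values illustrate the
    non-uniform rescaling that LongRoPE searches for.
    """
    inv_freq = 1.0 / (base ** (np.arange(dim // 2) * 2.0 / dim))
    if rescale is not None:
        # Dividing a frequency by lambda[i] stretches positions in
        # that dimension, interpolating them into the trained range.
        inv_freq = inv_freq / np.asarray(rescale, dtype=float)
    return np.outer(positions, inv_freq)  # shape: (len(positions), dim // 2)

positions = np.arange(16)

# Uniform PI: every dimension shares one factor (e.g. an 8x extension).
uniform = rope_angles(positions, dim=8, rescale=[8.0] * 4)

# Non-uniform: high-frequency (low-index) dimensions are interpolated
# less, preserving fine-grained local position information; these
# particular factors are illustrative only.
non_uniform = rope_angles(positions, dim=8, rescale=[1.0, 2.0, 4.0, 8.0])
```

The key observation this sketch encodes is that the factors need not be equal across dimensions; the paper finds good per-dimension factors with an efficient search rather than setting them by hand.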
Subjects: Computation and Language (cs.CL)
Cite as: arXiv:2402.13753 [cs.CL] (or arXiv:2402.13753v1 [cs.CL] for this version)
DOI: https://doi.org/10.48550/arXiv.2402.13753