Official release of InternLM2.5 base and chat models. 1M context support
Code and documentation for LongLoRA and LongAlpaca (ICLR 2024 Oral)
LongWriter: Unleashing 10,000+ Word Generation from Long Context LLMs
[ACL 2024] LongBench: A Bilingual, Multitask Benchmark for Long Context Understanding
Transformers with Arbitrarily Large Context
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch
Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in Pytorch
LongCite: Enabling LLMs to Generate Fine-grained Citations in Long-context QA
Implementation of Recurrent Memory Transformer, NeurIPS 2022 paper, in Pytorch
The code of our paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"
Code for the paper "∞Bench: Extending Long Context Evaluation Beyond 100K Tokens": https://arxiv.org/abs/2402.13718
PyTorch implementation of Infini-Transformer from "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" (https://arxiv.org/abs/2404.07143)
[COLM 2024] TriForce: Lossless Acceleration of Long Sequence Generation with Hierarchical Speculative Decoding
[EMNLP 2024] LongAlign: A Recipe for Long Context Alignment of LLMs
[ACL 2024] LooGLE: Long Context Evaluation for Long-Context Language Models
LongQLoRA: Extend the Context Length of LLMs Efficiently
Open-source code for the paper "Retrieval Head Mechanistically Explains Long-Context Factuality"
Awesome LLM Plaza: daily tracking of all sorts of awesome LLM topics, e.g. LLMs for coding, robotics, reasoning, multimodality, etc.
Implementation of NAACL 2024 Outstanding Paper "LM-Infinite: Simple On-the-Fly Length Generalization for Large Language Models"
ShadowKV: KV Cache in Shadows for High-Throughput Long-Context LLM Inference