"Efficiently Training 7B LLM with 1 Million Sequence Length on 8 GPUs."

Pinxue Zhao et al. (2024)

DOI: 10.48550/arXiv.2407.12117

access: open

type: Informal or Other Publication

metadata version: 2024-08-23