# Natural-Language-Processing Papers

NLP-related papers and reviews of each paper. (Posts written before 2022 are available at the following link. [link])

| Date | Keywords | Institute | Paper | Review | Publication |
|------|----------|-----------|-------|--------|-------------|
| 2023-03 | | | A Survey of Large Language Models | link-1, link-2 | |
| 2023-02 | LLaMA-1 | Meta | LLaMA: Open and Efficient Foundation Language Models | link | |
| 2023-07 | LLaMA-2 | Meta | Llama 2: Open Foundation and Fine-Tuned Chat Models | link | |
| 2024-04 | LLaMA-3 | Meta | LLaMA3 | link | |
| 2024-01 | TinyLlama | | TinyLlama: An Open-Source Small Language Model | link | |
| 2023-05 | GQA | Google Research | GQA: Training Generalized Multi-Query Transformer Models from Multi-Head Checkpoints | link | EMNLP 2023 |
| 2024-05 | | Anthropic | Mapping the Mind of a Large Language Model | link | |
| 2023-05 | TinyStories | Microsoft | TinyStories: How Small Can Language Models Be and Still Speak Coherent English? | link | |
| 2023-06 | Phi-1 | Microsoft | Phi-1: Textbooks Are All You Need | link | |
| 2023-09 | Phi-1.5 | Microsoft | Textbooks Are All You Need II: phi-1.5 technical report | link | |
| 2023-12 | Phi-2 | Microsoft | Phi-2: The surprising power of small language models | link | |
| 2024-05 | Phi-3 | Microsoft | Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone | link | |
| 2023-07 | | | Chinchilla's Death | link | |
| 2024-03 | | | RAFT: Adapting Language Model to Domain Specific RAG | link | |

📒 LLM series: from basic concepts to the latest LLMs

- [1] Deep learning fundamentals
- [2] Networks - CNN, RNN, LSTM, Transformer
- [3-1] A Survey of Large Language Models - from the various LLMs to data, architecture, and training
- [3-2] A Survey of Large Language Models - Adaptation of LLMs
- [4] RAG (Retrieval-Augmented Generation)