Issues: AkihikoWatanabe/paper_notes
OpenDevin: Code Less, Make More, 2024
Labels: LanguageModel, LLMAgent, NaturalLanguageGeneration, NLP, Repository
#1325 opened Jul 4, 2024 by AkihikoWatanabe
Building a Better Transformer (より良いTransformerをつくる), Shun Kiyono, 2022
Labels: LanguageModel, NLP, Tutorial
#1324 opened Jul 3, 2024 by AkihikoWatanabe
Release of RetrievaBERT (RetrievaBERTの公開), 2024
Labels: LanguageModel, LongSequence, NLP, RetrievalAugmentedGeneration
#1323 opened Jul 3, 2024 by AkihikoWatanabe
Using and Evaluating User Directed Summaries to Improve Information Access
Labels: NLP, PersonalizedDocumentSummarization
#1318 opened May 30, 2024 by AkihikoWatanabe
T5Score: Discriminative Fine-tuning of Generative Evaluation Metrics, Yiwei Qin+, N/A, arXiv'22
Labels: Pocket
#1317 opened May 28, 2024 by AkihikoWatanabe
ChatEval: Towards Better LLM-based Evaluators through Multi-Agent Debate, Chi-Min Chan+, N/A, arXiv'23
Labels: Pocket
#1315 opened May 28, 2024 by AkihikoWatanabe
COMET: A Neural Framework for MT Evaluation, Ricardo Rei+, N/A, arXiv'20
Labels: Pocket
#1312 opened May 26, 2024 by AkihikoWatanabe
GLU Variants Improve Transformer, Noam Shazeer, N/A, arXiv'20
Labels: LanguageModel, Neural, NLP, Transformer
#1311 opened May 24, 2024 by AkihikoWatanabe
RoFormer: Enhanced Transformer with Rotary Position Embedding, Jianlin Su+, N/A, arXiv'21
Labels: Pocket
#1310 opened May 24, 2024 by AkihikoWatanabe
Does Fine-Tuning LLMs on New Knowledge Encourage Hallucinations?, Zorik Gekhman+, N/A, arXiv'24
Labels: Pocket
#1308 opened May 20, 2024 by AkihikoWatanabe
ReFT: Representation Finetuning for Language Models, Zhengxuan Wu+, N/A, arXiv'24
Labels: Pocket
#1307 opened May 18, 2024 by AkihikoWatanabe