Awesome-LLMs-in-Graph-tasks

A curated collection of research papers exploring the utilization of LLMs for graph-related tasks.

If you like our project, please give it a star ⭐ on GitHub to receive the latest updates.

This is a collection of papers on leveraging Large Language Models in Graph Tasks. It's based on our survey paper: A Survey of Graph Meets Large Language Model: Progress and Future Directions.

We will try to keep this list updated frequently. If you find any errors or missing papers, please don't hesitate to open an issue or pull request.

Our survey has been accepted by the IJCAI 2024 Survey Track.

How can LLMs help improve graph-related tasks?

LLMs have brought a notable shift in the way we interact with graphs, particularly those whose nodes carry text attributes. Integrating LLMs with traditional GNNs can be mutually beneficial and enhance graph learning. While GNNs are proficient at capturing structural information, they typically rely on semantically constrained embeddings as node features, which limits their ability to express the full complexity of the nodes. By incorporating LLMs, GNNs can be equipped with stronger node features that capture both structural and contextual aspects. Conversely, LLMs excel at encoding text but often struggle to capture the structural information present in graph data. Combining the two leverages the robust textual understanding of LLMs while harnessing the GNNs' ability to model structural relationships, leading to more comprehensive and powerful graph learning.

Figure 1. The overview of Graph Meets LLMs.
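
To make the idea concrete, below is a minimal, hypothetical sketch of the "LLM as enhancer" pattern: a pretrained language model (here bert-base-uncased, chosen only for illustration) encodes each node's text attribute into a dense feature vector, and a small two-layer GCN (our own toy TextGCN, not a model from this list) propagates those features over the graph structure. All model names, dimensions, and the toy graph are illustrative assumptions, not the method of any particular paper.

# Minimal sketch: LLM-derived node features feeding a GNN (illustrative only).
# Assumptions (not from any specific paper): bert-base-uncased as the text
# encoder, a two-layer GCN from PyTorch Geometric, and a toy 3-node graph.
import torch
from transformers import AutoTokenizer, AutoModel
from torch_geometric.nn import GCNConv

node_texts = [
    "Paper A: graph neural networks for citation analysis.",
    "Paper B: large language models for text classification.",
    "Paper C: combining GNNs and LLMs on text-attributed graphs.",
]
# Edges as a 2 x num_edges index tensor (0-1 and 1-2, plus reverse directions).
edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]], dtype=torch.long)

# 1) LLM as enhancer: encode each node's text into a dense feature vector.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
with torch.no_grad():
    tokens = tokenizer(node_texts, padding=True, truncation=True, return_tensors="pt")
    # Use the [CLS] token embedding as the node feature: shape [num_nodes, 768].
    node_features = encoder(**tokens).last_hidden_state[:, 0]

# 2) GNN: propagate the LLM-derived features over the graph structure.
class TextGCN(torch.nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)

model = TextGCN(in_dim=node_features.size(-1), hidden_dim=64, num_classes=2)
logits = model(node_features, edge_index)  # [num_nodes, num_classes] node-level predictions
print(logits.shape)

Papers in the "LLM as Enhancer" list below build on this basic recipe in various ways, for example by fine-tuning the text encoder on the graph or by prompting an LLM to generate additional textual signals before encoding.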

Summarization based on the proposed taxonomy

Table 1. A summary of models that leverage LLMs to assist graph-related tasks in the literature, ordered by release time. Fine-tuning denotes whether the parameters of the LLM need to be fine-tuned; ♥ indicates that the model employs a parameter-efficient fine-tuning (PEFT) strategy such as LoRA or prefix tuning. Prompting indicates the use of text-formatted prompts in LLMs, crafted either manually or automatically. Acronyms in Task: Node refers to node-level tasks; Link refers to link-level tasks; Graph refers to graph-level tasks; Reasoning refers to Graph Reasoning; Retrieval refers to Graph-Text Retrieval; Captioning refers to Graph Captioning.

Table of Contents

  • LLM as Enhancer
  • LLM as Predictor
  • GNN-LLM Alignment
  • Others
  • Other Repos
  • Contributing
  • Cite Us

LLM as Enhancer

  • (2022.03) [ICLR' 2022] Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction [Paper | Code]
    GIANT

    The framework of GIANT.

  • (2023.02) [ICLR' 2023] Edgeformers: Graph-Empowered Transformers for Representation Learning on Textual-Edge Networks [Paper | Code]
    Edgeformers

    The framework of Edgeformers.

  • (2023.05) [KDD' 2023] Graph-Aware Language Model Pre-Training on a Large Graph Corpus Can Help Multiple Graph Applications [Paper]
    GALM

    The framework of GALM.

  • (2023.06) [KDD' 2023] Heterformer: Transformer-based Deep Node Representation Learning on Heterogeneous Text-Rich Networks [Paper | Code]
    Heterformer

    The framework of Heterformer.

  • (2023.05) [ICLR' 2024] Harnessing Explanations: LLM-to-LM Interpreter for Enhanced Text-Attributed Graph Representation Learning [Paper | Code]
    TAPE

    The framework of TAPE.

  • (2023.08) [Arxiv' 2023] Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs [Paper]
    KEA

    The framework of KEA.

  • (2023.07) [Arxiv' 2023] Can Large Language Models Empower Molecular Property Prediction? [Paper | Code]
    LLM4Mol

    The framework of LLM4Mol.

  • (2023.08) [Arxiv' 2023] SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [Paper | Code]
    SimTeG

    The framework of SimTeG.

  • (2023.09) [Arxiv' 2023] Prompt-based Node Feature Extractor for Few-shot Learning on Text-Attributed Graphs [Paper]
    G-Prompt

    The framework of G-Prompt.

  • (2023.09) [Arxiv' 2023] TouchUp-G: Improving Feature Representation through Graph-Centric Finetuning [Paper]
    TouchUp-G

    The framework of TouchUp-G.

  • (2023.09) [ICLR' 2024] One for All: Towards Training One Graph Model for All Classification Tasks [Paper | Code]
    OFA

    The framework of OFA.

  • (2023.10) [Arxiv' 2023] Learning Multiplex Embeddings on Text-rich Networks with One Text Encoder [Paper | Code]
    METERN

    The framework of METERN.

  • (2023.11) [WSDM' 2024] LLMRec: Large Language Models with Graph Augmentation for Recommendation [Paper | Code]
    LLMRec

    The framework of LLMRec.

  • (2023.11) [NeurIPS' 2023] WalkLM: A Uniform Language Model Fine-tuning Framework for Attributed Graph Embedding [Paper | Code]
    WalkLM

    The framework of WalkLM.

  • (2024.01) [Arxiv' 2024] Efficient Tuning and Inference for Large Language Models on Textual Graphs [Paper]
    ENGINE

    The framework of ENGINE.

  • (2024.02) [Arxiv' 2024] ZeroG: Investigating Cross-dataset Zero-shot Transferability in Graphs [Paper]
    ZeroG

    The framework of ZeroG.

  • (2024.02) [Arxiv' 2024] UniGraph: Learning a Cross-Domain Graph Foundation Model From Natural Language [Paper]
    UniGraph

    The framework of UniGraph.

LLM as Predictor

  • (2023.05) [NeurIPS' 2023] Can Language Models Solve Graph Problems in Natural Language? [Paper | Code]
    NLGraph

    The framework of NLGraph.

  • (2023.05) [Arxiv' 2023] GPT4Graph: Can Large Language Models Understand Graph Structured Data? An Empirical Evaluation and Benchmarking [Paper | Code]
    GPT4Graph

    The framework of GPT4Graph.

  • (2023.06) [NeurIPS' 2023] GIMLET: A Unified Graph-Text Model for Instruction-Based Molecule Zero-Shot Learning [Paper | Code]
    GIMLET

    The framework of GIMLET.

  • (2023.07) [Arxiv' 2023] Exploring the Potential of Large Language Models (LLMs) in Learning on Graphs [Paper | Code]
    Framework

    The designed prompts of Chen et al.

  • (2023.08) [Arxiv' 2023] GIT-Mol: A Multi-modal Large Language Model for Molecular Science with Graph, Image, and Text [Paper]
    GIT-Mol

    The framework of GIT-Mol.

  • (2023.08) [Arxiv' 2023] Natural Language is All a Graph Needs [Paper | Code]
    InstructGLM

    The framework of InstructGLM.

  • (2023.08) [Arxiv' 2023] Evaluating Large Language Models on Graphs: Performance Insights and Comparative Analysis [Paper | Code]
    Framework

    The designed prompts of Liu et al.

  • (2023.09) [Arxiv' 2023] Can LLMs Effectively Leverage Graph Structural Information: When and Why [Paper | Code]
    Framework

    The designed prompts of Huang et al.

  • (2023.10) [Arxiv' 2023] GraphText: Graph Reasoning in Text Space [Paper | Code]
    GraphText

    The framework of GraphText.

  • (2023.10) [Arxiv' 2023] Talk like a Graph: Encoding Graphs for Large Language Models [Paper]
    Framework

    The designed prompts of Fatemi et al.

  • (2023.10) [Arxiv' 2023] GraphLLM: Boosting Graph Reasoning Ability of Large Language Model [Paper | Code]
    GraphLLM

    The framework of GraphLLM.

  • (2023.10) [Arxiv' 2023] Beyond Text: A Deep Dive into Large Language Models' Ability on Understanding Graph Data [Paper]
    Framework

    The designed prompts of Hu et al.

  • (2023.10) [EMNLP' 2023] MolCA: Molecular Graph-Language Modeling with Cross-Modal Projector and Uni-Modal Adapter [Paper | Code]
    MolCA

    The framework of MolCA.

  • (2023.10) [Arxiv' 2023] GraphGPT: Graph Instruction Tuning for Large Language Models [Paper | Code]
    GraphGPT

    The framework of GraphGPT.

  • (2023.10) [EMNLP' 2023] ReLM: Leveraging Language Models for Enhanced Chemical Reaction Prediction [Paper | Code]
    ReLM

    The framework of ReLM.

  • (2023.10) [Arxiv' 2023] LLM4DyG: Can Large Language Models Solve Problems on Dynamic Graphs? [Paper]
    LLM4DyG

    The framework of LLM4DyG.

  • (2023.10) [Arxiv' 2023] Disentangled Representation Learning with Large Language Models for Text-Attributed Graphs [Paper]
    DGTL

    The framework of DGTL.

  • (2023.11) [Arxiv' 2023] Which Modality Should I Use -- Text, Motif, or Image?: Understanding Graphs with Large Language Models [Paper]
    Framework

    The framework of Das et al.

  • (2023.11) [Arxiv' 2023] InstructMol: Multi-Modal Integration for Building a Versatile and Reliable Molecular Assistant in Drug Discovery [Paper]
    InstructMol

    The framework of InstructMol.

  • (2023.12) [Arxiv' 2023] When Graph Data Meets Multimodal: A New Paradigm for Graph Understanding and Reasoning [Paper]
    Framework

    The framework of Ai et al.

  • (2024.02) [Arxiv' 2024] Let Your Graph Do the Talking: Encoding Structured Data for LLMs [Paper]
    GraphToken

    The framework of GraphToken.

  • (2024.02) [Arxiv' 2024] Rendering Graphs for Graph Reasoning in Multimodal Large Language Models [Paper]
    GITA

    The framework of GITA.

  • (2024.02) [WWW' 2024] GraphTranslator: Aligning Graph Model to Large Language Model for Open-ended Tasks [Paper | Code]
    GraphTranslator

    The framework of GraphTranslator.

  • (2024.02) [Arxiv' 2024] InstructGraph: Boosting Large Language Models via Graph-centric Instruction Tuning and Preference Alignment [Paper | Code]
    InstructGraph

    The framework of InstructGraph.

  • (2024.02) [Arxiv' 2024] LLaGA: Large Language and Graph Assistant [Paper | Code]
    LLaGA

    The framework of LLaGA.

  • (2024.02) [WWW' 2024] Can GNN be Good Adapter for LLMs? [Paper]
    GraphAdapter

    The framework of GraphAdapter.

  • (2024.02) [Arxiv' 2024] HiGPT: Heterogeneous Graph Language Model [Paper | Code]
    HiGPT

    The framework of HiGPT.

  • (2024.02) [Arxiv' 2024] GraphWiz: An Instruction-Following Language Model for Graph Problems [Paper | Code]
    GraphWiz

    The framework of GraphWiz.

  • (2024.03) [Arxiv' 2024] OpenGraph: Towards Open Graph Foundation Models [Paper | Code]
    OpenGraph

    The framework of OpenGraph.

GNN-LLM Alignment

  • (2020.08) [Arxiv' 2020] Graph-based Modeling of Online Communities for Fake News Detection [Paper | Code]
    SAFER

    The framework of SAFER.

  • (2021.05) [NeurIPS' 2021] GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph [Paper | Code]
    GraphFormers

    The framework of GraphFormers.

  • (2021.11) [EMNLP' 2021] Text2Mol: Cross-Modal Molecule Retrieval with Natural Language Queries [Paper | Code]
    Text2Mol

    The framework of Text2Mol.

  • (2022.07) [ACL' 2023] Hidden Schema Networks [Paper | Code]
    HSN

    The framework of HSN.

  • (2022.09) [Arxiv' 2022] A Molecular Multimodal Foundation Model Associating Molecule Graphs with Natural Language [Paper | Code]
    MoMu

    The framework of MoMu.

  • (2022.10) [ICLR' 2023] Learning on Large-scale Text-attributed Graphs via Variational Inference [Paper | Code]
    GLEM

    The framework of GLEM.

  • (2022.12) [NMI' 2023] Multi-modal Molecule Structure-text Model for Text-based Editing and Retrieval [Paper | Code]
    MoleculeSTM

    The framework of MoleculeSTM.

  • (2023.04) [Arxiv' 2023] Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs [Paper | Code]
    GRAD

    The framework of GRAD.

  • (2023.05) [ACL' 2023] PATTON: Language Model Pretraining on Text-Rich Networks [Paper | Code]
    Patton

    The framework of Patton.

  • (2023.05) [Arxiv' 2023] ConGraT: Self-Supervised Contrastive Pretraining for Joint Graph and Text Embeddings [Paper | Code]
    ConGraT

    The framework of ConGraT.

  • (2023.07) [Arxiv' 2023] Prompt Tuning on Graph-augmented Low-resource Text Classification [Paper | Code]
    G2P2

    The framework of G2P2.

  • (2023.10) [EMNLP' 2023] GRENADE: Graph-Centric Language Model for Self-Supervised Representation Learning on Text-Attributed Graphs [Paper | Code]
    GRENADE

    The framework of GRENADE.

  • (2023.10) [WWW' 2024] Representation Learning with Large Language Models for Recommendation [Paper | Code]
    RLMRec

    The framework of RLMRec.

  • (2023.10) [EMNLP' 2023] Pretraining Language Models with Text-Attributed Heterogeneous Graphs [Paper | Code]
    THLM

    The framework of THLM.

Others

LLM as Annotator

  • (2023.10) [ICLR' 2024] Label-free Node Classification on Graphs with Large Language Models (LLMs) [Paper | Code]
    LLM-GNN

    The framework of LLM-GNN.

LLM as Controller

  • (2023.10) [Arxiv' 2023] Graph Neural Architecture Search with GPT-4 [Paper]
    GPT4GNAS

    The framework of GPT4GNAS.

LLM as Sample Generator

  • (2023.10) [Arxiv' 2023] Empower Text-Attributed Graphs Learning with Large Language Models (LLMs) [Paper]
    ENG

    The framework of ENG.

LLM as Similarity Analyzer

  • (2023.11) [Arxiv' 2023] Large Language Models as Topological Structure Enhancers for Text-Attributed Graphs [Paper]
    Framework

    The framework of Sun et al.

Other Repos

We note that several other repositories also summarize papers on the integration of LLMs and graphs. We differentiate ourselves by organizing these papers according to a new and more granular taxonomy. We encourage researchers to explore these repositories as well for a comprehensive view of the field.

We also highly recommend a repository that summarizes work on Graph Prompt, a topic closely related to Graph-LLM research.

Contributing

If you have come across relevant resources, feel free to open an issue or submit a pull request using the following format:

* (_time_) [conference] **paper_name** [[Paper](link) | [Code](link)]
   <details close>
   <summary>Model name</summary>
   <p align="center"><img width="75%" src="Figures/xxx.jpg" /></p>
   <p align="center"><em>The framework of model name.</em></p>
   </details>

Cite Us

Feel free to cite this work if you find it useful!

@article{li2023survey,
  title={A Survey of Graph Meets Large Language Model: Progress and Future Directions},
  author={Li, Yuhan and Li, Zhixun and Wang, Peisong and Li, Jia and Sun, Xiangguo and Cheng, Hong and Yu, Jeffrey Xu},
  journal={arXiv preprint arXiv:2311.12399},
  year={2023}
}
