Artefacts related to the SIGIR 2025 paper "MINTT: Memory Inductive Transfer for Temporal Graph Neural Networks"

data-iitd/mintt

Dataset Preparation

  1. FourSquare - https://drive.google.com/file/d/0BwrgZ-IdrTotZ0U0ZER2ejI3VVk/view?usp=sharing&resourcekey=0-rlHp_JcRyFAxN7v5OAGldw
  2. Yelp - https://www.yelp.com/dataset
  3. Wikipedia Multilingual Edits - https://figshare.com/ndownloader/files/5051242
  4. Download the above zip files and place their unzipped contents in a directory named raw_data.
  5. Create a directory data in the main folder, along with a subdirectory data/Meta.
  6. Run the notebooks preprocess_4square.ipynb, preprocess_yelp.ipynb, and preprocess_wiki.ipynb.
  7. After execution, the generated files will be stored in raw_data, data, and data/Meta.
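Steps 4 and 5 above amount to a couple of shell commands; a minimal sketch (mkdir -p also creates the parent data directory and is harmless if the directories already exist):

```shell
# Create the directory layout the preprocessing notebooks expect.
# -p creates parents as needed and ignores already-existing dirs.
mkdir -p raw_data data/Meta
```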

Example Usage

Preliminaries:

  1. Suppose our source city is Jakarta and we want to transfer to various target cities (all listed on line 20 of transfer_script.sh).
  2. These are all the cities we evaluated on in the paper. Targets for the wiki and yelp datasets can likewise be found in the same file.

Step 0:

Create the directories models and transfers.

Step 1:

Run python train_fgat.py 4square 0 to train the F-GAT model. This stores the trained model in models; here 0 is the GPU index.

Step 2:

Run bash src_agg_script.sh 4square 0 to train TGN on the sources and to save the F-GAT meta information for them. The generated models and data are stored in models.

Step 3:

Run bash mapping_script.sh 4square 0 Jakarta GAT2 to create node mappings from Jakarta to all targets.

Step 4:

  1. Run bash transfer_script.sh 0 NT 4square Jakarta 2024 1 to evaluate NT-TGN (no transfer) on the targets.
  2. Run bash transfer_script.sh 0 WT 4square Jakarta 2024 1 to evaluate WT-TGN (weight transfer) on the targets.
  3. Run bash transfer_script.sh 0 GAT2 4square Jakarta 2024 1 to evaluate MINTT (our approach) on the targets.
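Steps 1 through 4 can be sketched as the dry-run driver below. It only echoes each command for the 4square dataset with source Jakarta, so remove the echo prefixes (and the PLAN capture) to actually execute the pipeline:

```shell
#!/usr/bin/env bash
# Dry-run sketch of Steps 1-4 for the 4square dataset with source
# Jakarta; the commands are collected and printed, not executed.
GPU=0; DATASET=4square; SRC=Jakarta

PLAN=$(
  echo "python train_fgat.py $DATASET $GPU"               # Step 1: train F-GAT
  echo "bash src_agg_script.sh $DATASET $GPU"             # Step 2: TGN on sources
  echo "bash mapping_script.sh $DATASET $GPU $SRC GAT2"   # Step 3: node mappings
  for MODE in NT WT GAT2; do                              # Step 4: NT-TGN, WT-TGN, MINTT
    echo "bash transfer_script.sh $GPU $MODE $DATASET $SRC 2024 1"
  done
)
echo "$PLAN"
```

The trailing 2024 1 arguments are copied verbatim from the example commands above.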

Note:

  1. To evaluate on another dataset, repeat steps 1 and 2 with 4square replaced by wiki or yelp.
  2. For steps 3 and 4, additionally replace Jakarta with a valid source from the corresponding dataset.
  3. The list of valid sources for each dataset is given below:
    • 4square: ('Jakarta' 'Moscow' 'Tokyo')
    • wiki: ('en' 'fr' 'de')
    • yelp: ('Philadelphia' 'New Orleans' 'Tampa')
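For reference, the step-3 mapping command for every (dataset, source) pair above can be enumerated as a dry run; a '|' delimiter keeps multi-word city names such as New Orleans intact (GPU 0 assumed, echo dropped to execute for real):

```shell
#!/usr/bin/env bash
# Echo the mapping command for each valid (dataset, source) pair.
PLAN=$(while IFS='|' read -r DATASET SRC; do
  echo "bash mapping_script.sh $DATASET 0 $SRC GAT2"
done <<'EOF'
4square|Jakarta
4square|Moscow
4square|Tokyo
wiki|en
wiki|fr
wiki|de
yelp|Philadelphia
yelp|New Orleans
yelp|Tampa
EOF
)
echo "$PLAN"
```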

Results Interpretation

  1. The training logs appear in the directories transfers/{SOURCE_DOMAIN}/{TRANSFER_TYPE}.
  2. NT, WT, and GAT2 correspond to NT-TGN, WT-TGN, and MINTT as described in the paper.
  3. Inside each of these folders there is a .txt file for each city, containing the training logs and the final evaluation metrics.
  4. To generate an overall tabulated summary of the results, run python tabulate.py 4square Jakarta (replace with the appropriate dataset and source names).
  5. The summaries are stored in a generated folder results.

Baselines

To run LightGCN and NGCF do as follows:

  1. First change to the baselines directory with cd baselines.
  2. Run bash LightGCN_script.sh {gpu} {dataset} {run} to get the results for LightGCN in LightGCN_results/ for any dataset among 4square, wiki, or yelp.
  3. Run bash NGCF_script.sh {gpu} {dataset} {run} to get the results for NGCF in NGCF_results/ for any dataset among 4square, wiki, or yelp.
  4. Use python3 compileResults.py {model} to get a summary of the final results in {model}_results.csv, where model is LightGCN or NGCF.
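The four baseline steps above can be sketched as a dry-run loop over every dataset/model combination (remove the echo prefixes and run inside baselines/ to execute; GPU 0 and run 1 assumed):

```shell
#!/usr/bin/env bash
# Echo the baseline commands for every dataset/model combination,
# then the summary step for each model.
GPU=0; RUN=1
PLAN=$(
  for DATASET in 4square wiki yelp; do
    for MODEL in LightGCN NGCF; do
      echo "bash ${MODEL}_script.sh $GPU $DATASET $RUN"
    done
  done
  echo "python3 compileResults.py LightGCN"
  echo "python3 compileResults.py NGCF"
)
echo "$PLAN"
```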

To run TGAT and GraphMixer do as follows:

For evaluating these baselines, we used the code provided in the repo https://github.com/yule-BUAA/DyGLib. We added transfer support and MRR/recall@20 computation on top of the original code. Refer to the original repo for more detailed logs and implementation instructions.

  1. First change to the DyGLib directory with cd baselines/DyGLib-master.
  2. Run bash train_src.sh <src> <dataset> <gpu> to train the model on src for any dataset among 4square, wiki, or yelp.
  3. Run bash script.sh {tgt} {src} {dataset} {gpu} to transfer from src to tgt for the given dataset. The final results appear in out.txt; to see the detailed logs, follow the README.md present in baselines/DyGLib-master.
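As a dry-run sketch for a single pair (Tokyo is used here only as an illustrative target city, not one prescribed by the scripts; remove the echo prefixes and the PLAN capture to execute inside baselines/DyGLib-master):

```shell
#!/usr/bin/env bash
# Dry-run of the two DyGLib baseline steps for one (src, tgt) pair
# on 4square; TGT=Tokyo is a hypothetical example target.
GPU=0; DATASET=4square; SRC=Jakarta; TGT=Tokyo
PLAN=$(
  echo "bash train_src.sh $SRC $DATASET $GPU"       # train on the source
  echo "bash script.sh $TGT $SRC $DATASET $GPU"     # transfer src -> tgt
)
echo "$PLAN"
```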
