- FourSquare - https://drive.google.com/file/d/0BwrgZ-IdrTotZ0U0ZER2ejI3VVk/view?usp=sharing&resourcekey=0-rlHp_JcRyFAxN7v5OAGldw
- Yelp - https://www.yelp.com/dataset
- Wikipedia Multilingual Edits - https://figshare.com/ndownloader/files/5051242
- Download the above zip files and place their unzipped contents in a directory named `raw_data`. The following image shows the expected state of `raw_data` before proceeding:
- Create a directory `data` in the main folder, along with `data/Meta`
- Run the notebooks `preprocess_4square.ipynb`, `preprocess_yelp.ipynb`, and `preprocess_wiki.ipynb`
- After execution, the generated files will be stored in `raw_data`, `data`, and `data/Meta`
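As a quick sketch, the directory setup above can be done from the repo root as follows (the archive names in the comments are only illustrative — use whatever the downloads are actually called):

```shell
# Create the directories expected by the preprocessing notebooks
mkdir -p raw_data data/Meta

# Place the unzipped dataset contents inside raw_data, e.g.:
# unzip foursquare.zip -d raw_data      # archive names are illustrative
# unzip yelp_dataset.zip -d raw_data
```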
- Suppose our source city is Jakarta and we want to transfer to various target cities. (They are all listed in line 20 of `transfer_script.sh`.) These are all the cities evaluated in the paper. Similarly, targets for the `wiki` and `yelp` datasets can be found in the same file.
- Create the directories `models` and `transfers`
1. Run `python train_fgat.py 4square 0` to train the F-GAT model. This stores the model in `models`; here `0` is the specified GPU.
2. Run `bash src_agg_script.sh 4square 0` to train TGN on the sources and save F-GAT meta information for them. Generated models and data are stored in `models`.
3. Run `bash mapping_script.sh 4square 0 Jakarta GAT2` to create node mappings from Jakarta to all targets.
4. Run the transfer evaluations:
   - `bash transfer_script.sh 0 NT 4square Jakarta 2024 1` to evaluate `NT-TGN` on the targets (no transfer)
   - `bash transfer_script.sh 0 WT 4square Jakarta 2024 1` to evaluate `WT-TGN` on the targets (weight transfer)
   - `bash transfer_script.sh 0 GAT2 4square Jakarta 2024 1` to evaluate `MINTT` on the targets (our approach)
- To evaluate on another dataset, repeat steps 1 and 2 with `4square` replaced by `wiki` or `yelp`
- For steps 3 and 4, additionally replace `Jakarta` with a valid source from the corresponding dataset
- The list of valid sources for each dataset is given below:
    - `4square`: `Jakarta`, `Moscow`, `Tokyo`
    - `wiki`: `en`, `fr`, `de`
    - `yelp`: `Philadelphia`, `New Orleans`, `Tampa`
- The corresponding logs appear in the directories `transfers/{SOURCE_DOMAIN}/{TRANSFER_TYPE}`. Here is an example view of the results folder for Jakarta:

- `NT/`, `WT/`, and `GAT2/` correspond to `NT-TGN`, `WT-TGN`, and `MINTT` as described in the paper
- Inside each of these folders there is a `.txt` file for each city, containing the training logs and the final evaluated metrics
- To generate an overall tabulated summary of the results, run `python tabulate.py 4square Jakarta` (replace with the appropriate dataset and source name)
- The summaries are stored in a generated folder `results`
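The four steps above can be sketched end to end for source Jakarta on `4square` (GPU `0` assumed). The commands are echoed here so the sequence is easy to inspect; drop the `echo` to actually run them:

```shell
set -eu
echo "python train_fgat.py 4square 0"                 # step 1: train F-GAT
echo "bash src_agg_script.sh 4square 0"               # step 2: TGN on sources + meta info
echo "bash mapping_script.sh 4square 0 Jakarta GAT2"  # step 3: node mappings
for variant in NT WT GAT2; do                         # step 4: NT-TGN, WT-TGN, MINTT
  echo "bash transfer_script.sh 0 $variant 4square Jakarta 2024 1"
done
echo "python tabulate.py 4square Jakarta"             # summarize into results/
```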
To run LightGCN and NGCF, do as follows:
- First change to the `baselines` directory using `cd baselines`
- Run `bash LightGCN_script.sh {gpu} {dataset} {run}` to get the results for LightGCN in `LightGCN_results/` for any dataset among `4square`, `wiki`, or `yelp`
- Run `bash NGCF_script.sh {gpu} {dataset} {run}` to get the results for NGCF in `NGCF_results/` for any dataset among `4square`, `wiki`, or `yelp`
- Use `python3 compileResults.py {model}` to get a summary of the final results in `{model}_results.csv`, with `{model}` being `LightGCN` or `NGCF`
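As a sketch, the two baseline scripts can be looped over all three datasets (GPU `0` and run id `1` are assumed placeholders). Commands are echoed for inspection; run them from inside `baselines/` without the `echo`:

```shell
# Echo the baseline runs for every dataset, then the summary commands
for ds in 4square wiki yelp; do
  echo "bash LightGCN_script.sh 0 $ds 1"
  echo "bash NGCF_script.sh 0 $ds 1"
done
echo "python3 compileResults.py LightGCN"   # -> LightGCN_results.csv
echo "python3 compileResults.py NGCF"       # -> NGCF_results.csv
```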
To run TGAT and GraphMixer, do as follows:
For these baselines, we used the code provided in the repo https://github.com/yule-BUAA/DyGLib. We added transfer support and MRR/recall@20 computation on top of the original code. Feel free to refer to the original repo for more detailed logs and implementation instructions.
- First change to the `baselines/DyGLib-master` directory using `cd baselines/DyGLib-master`
- Run `bash train_src.sh <src> <dataset> <gpu>` to train the model on the source for any dataset among `4square`, `wiki`, or `yelp`
- Run `bash script.sh {tgt} {src} {dataset} {gpu}` to perform the transfer from the source to the target for the given dataset. You can view the final results in `out.txt`. To see the detailed logs, follow the `README.md` present in `baselines/DyGLib-master`
