This is the official implementation of the paper "Leveraging Dual Process Theory in Language Agent Framework for Real-time Simultaneous Human-AI Collaboration."
📄 Paper | 🌐 Website | 📘 机器之心 | 🧪 AGI-Eval
- [2025/03/18] Our work is featured by 机器之心 on WeChat!
- [2025/03/06] We have established a partnership with the AGI-Eval platform. Benchmark results for the Overcooked Challenge are now available at AGI-Eval - Overcooked Challenge.
Create a new environment and install the dependencies:

```bash
# Create a new environment (Python <= 3.10 is supported)
conda create -n dptagent python=3.10 -y
conda activate dptagent

# Install PyTorch (no GPU is required, so the CPU build is sufficient)
pip install torch torchvision torchaudio

# Install the remaining dependencies
bash ./install.sh
```
Start the LiteLLM proxy server:

```bash
litellm -c llms/litellm/config.yml --port 40000

# Check that the proxy is healthy
curl --location 'http://127.0.0.1:40000/health' -H "Authorization: Bearer sk-1234"
```
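The proxy reads its model routing and authentication settings from `llms/litellm/config.yml`, which is referenced above. If you need to point it at a different provider, a minimal LiteLLM proxy config looks roughly like the sketch below; the `gpt-4o` model name, the `OPENAI_API_KEY` environment variable, and the `sk-1234` master key are illustrative assumptions, not the repository's actual settings (the master key must match the Bearer token used in the health check).

```yaml
# Minimal illustrative LiteLLM proxy config (not the repo's actual file)
model_list:
  - model_name: gpt-4o                      # name that clients request (assumption)
    litellm_params:
      model: openai/gpt-4o                  # provider/model that LiteLLM routes to
      api_key: os.environ/OPENAI_API_KEY    # read the API key from the environment

general_settings:
  master_key: sk-1234                       # must match the Bearer token used above
```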
For the single-agent experiment (Exp 0), first edit `config/envs/overcooked.yaml`:

```yaml
mode: burger_exp1
...
num_agents: 1
```

Then run:

```bash
sh scripts/exp0/openai.sh
```
For the single-agent experiment (Exp 1), first edit `config/envs/overcooked.yaml`:

```yaml
mode: burger_exp1
...
num_agents: 1
```

Then run:

```bash
sh scripts/exp1/openai.sh
```
For the collaboration experiment (Exp 2), first edit `config/envs/overcooked.yaml`:

```yaml
mode: burger_exp1
...
num_agents: 2
```

Then run:

```bash
sh scripts/exp2/openai.sh
```
To use map 2, edit `config/envs/overcooked.yaml`:

```yaml
mode: burger_aa_new
...
num_agents: 2
```

Then run:

```bash
sh scripts/overcooked/human_llm_app.sh
```

and open http://localhost:5001 in your browser.
For more information, please run:

```bash
python llm_agent_run_act.py --help
```
We recommend using pre-commit to unify the code format before committing.

```bash
# Install the git hook (one-time setup)
pre-commit install

# Run manually (you may need to run it more than once until all files pass)
pre-commit run --all-files
```

pre-commit also runs automatically when you commit; this can be slow and may appear to be stuck.
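pre-commit reads its hook list from a `.pre-commit-config.yaml` at the repository root. Purely as an illustration of that format, such a file looks like the sketch below; the specific hooks and versions are assumptions, not the repository's actual settings.

```yaml
# Illustrative example only; the repo's actual .pre-commit-config.yaml may differ
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      - id: trailing-whitespace   # strip trailing spaces
      - id: end-of-file-fixer     # ensure files end with a newline
  - repo: https://github.com/psf/black
    rev: 24.3.0
    hooks:
      - id: black                 # format Python code
```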
```bibtex
@misc{zhang2025ldpt,
      title={Leveraging Dual Process Theory in Language Agent Framework for Real-time Simultaneous Human-AI Collaboration},
      author={Shao Zhang and Xihuai Wang and Wenhao Zhang and Chaoran Li and Junru Song and Tingyu Li and Lin Qiu and Xuezhi Cao and Xunliang Cai and Wen Yao and Weinan Zhang and Xinbing Wang and Ying Wen},
      year={2025},
      eprint={2502.11882},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2502.11882},
}
```