command:
(opencompass) root@node1:~/user/OpenCompass/opencompass# python run.py configs/vllm-glm4-9b-chat-custom.py
vllm-glm4-9b-chat-custom.py
from opencompass.models import OpenAISDK
from mmengine.config import read_base

with read_base():
    from .datasets.cmmlu.cmmlu_gen import cmmlu_datasets

datasets = cmmlu_datasets

api_meta_template = dict(
    round=[
        dict(role='HUMAN', api_role='HUMAN'),
        dict(role='BOT', api_role='BOT', generate=True),
    ],
    reserved_roles=[dict(role='SYSTEM', api_role='SYSTEM')],
)

models = [
    dict(
        abbr='glm-4-9b-chat-vllm-API',
        type=OpenAISDK,
        key='EMPTY',
        openai_api_base='http://localhost:port/v1',
        path='glm-4-9b-chat',
        tokenizer_path='/root/user/models/glm-4-9b-chat',
        rpm_verbose=True,
        meta_template=api_meta_template,
        query_per_second=10,
        max_out_len=1024,
        max_seq_len=4096,
        temperature=0.01,
        batch_size=8,
        retry=3,
    )
]
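For context, the config above expects an OpenAI-compatible endpoint at openai_api_base (the port is left as a placeholder in the issue). A minimal sketch of exposing glm-4-9b-chat through vLLM's OpenAI-compatible server is shown below; the port number (8000) is an assumption, not taken from the issue:

# Sketch: serve glm-4-9b-chat behind an OpenAI-compatible API with vLLM.
python -m vllm.entrypoints.openai.api_server \
    --model /root/user/models/glm-4-9b-chat \
    --served-model-name glm-4-9b-chat \
    --trust-remote-code \
    --port 8000

With a server like this running, openai_api_base in the config would point at http://localhost:8000/v1.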
I want to do this without modifying the OpenCompass code, but the documentation example only shows the following usage:
python run.py \
    --models hf_llama2_7b \
    --custom-dataset-path xxx/test_qa.jsonl \
    --custom-dataset-data-type qa \
    --custom-dataset-infer-method gen
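If the goal is to run a custom dataset against the API model above without touching OpenCompass internals, one option suggested by the custom-dataset guide is to declare the dataset as a plain dict inside the same config file instead of passing CLI flags. Treat this as a sketch under that assumption; the keys simply mirror the CLI flags, and the path is the same placeholder as in the docs example:

# Hypothetical change to vllm-glm4-9b-chat-custom.py: replace the cmmlu_datasets
# import with a custom dataset entry, keeping the OpenAISDK `models` list unchanged.
datasets = [
    dict(
        path='xxx/test_qa.jsonl',   # placeholder path from the docs example
        data_type='qa',             # mirrors --custom-dataset-data-type
        infer_method='gen',         # mirrors --custom-dataset-infer-method
    ),
]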
acylam: You can swap the models in that command for API models.
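One reading of that suggestion, kept as a sketch: move the OpenAISDK models list above into its own model config (the name glm4_9b_chat_api below is hypothetical) and pass it to --models alongside the documented custom-dataset flags, assuming run.py resolves --models names against config files under configs/models:

python run.py \
    --models glm4_9b_chat_api \
    --custom-dataset-path xxx/test_qa.jsonl \
    --custom-dataset-data-type qa \
    --custom-dataset-infer-method gen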