Commit 9d21620

[Fix][Tests] TP param used in tests unconditionally (#315)
# Description

The `--tensor-parallel-size` flag and `tp_size` value were being used unconditionally, regardless of the block that checks whether `tp_size` exists in the test parameters.

Signed-off-by: Rafael Vasquez <[email protected]>
1 parent 8a9a548 commit 9d21620

File tree

1 file changed: +1 addition, −1 deletion

tests/conftest.py

Lines changed: 1 addition & 1 deletion
@@ -126,7 +126,7 @@ def remote_openai_server(request):
     if 'tp_size' in params:
         tp_size = params['tp_size']
         skip_unsupported_tp_size(int(tp_size), backend)
-    server_args.extend(["--tensor-parallel-size", str(tp_size)])
+        server_args.extend(["--tensor-parallel-size", str(tp_size)])
 
     try:
         with RemoteOpenAIServer(model, server_args,
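The pattern the commit fixes can be sketched as below: the tensor-parallel flag should be appended to the server arguments only inside the block that checks for `tp_size`, so the flag (and the `tp_size` variable itself) is never used when the parameter is absent. This is a minimal standalone sketch, not the actual fixture; `build_server_args` and the stubbed `skip_unsupported_tp_size` are hypothetical names mirroring the diff for illustration.

```python
def skip_unsupported_tp_size(tp_size: int, backend: str) -> None:
    # Stub for illustration: the real helper in tests/conftest.py skips
    # the test when the backend cannot run the requested TP size.
    pass


def build_server_args(params: dict, backend: str) -> list:
    """Hypothetical helper showing the corrected control flow."""
    server_args = []
    if 'tp_size' in params:
        tp_size = params['tp_size']
        skip_unsupported_tp_size(int(tp_size), backend)
        # Fixed behavior: extend only when tp_size was supplied, so the
        # flag is never added (and tp_size never referenced) otherwise.
        server_args.extend(["--tensor-parallel-size", str(tp_size)])
    return server_args


print(build_server_args({"tp_size": 2}, "cpu"))  # ['--tensor-parallel-size', '2']
print(build_server_args({}, "cpu"))              # []
```

Before the fix, the `extend` call sat outside the `if` body, so a test with no `tp_size` parameter would still pass the flag (or hit a `NameError` on an undefined `tp_size`).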
