
cannot import name 'shard_checkpoint' from 'transformers.modeling_utils' (/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/modeling_utils.py) #2725

Open

c935289832 opened this issue Dec 31, 2024 · 8 comments

System Info

transformers=4.47.1
vllm=0.4.3
python=3.11
Ubuntu-20.04

Running Xinference with Docker?

  • docker
  • pip install
  • installation from source

Version info

v1.1.1

The command used to start Xinference

XINFERENCE_MODEL_SRC=modelscope xinference-local --host 0.0.0.0 --port 9997

Reproduction

2024-12-31 09:41:26,761 xinference.core.supervisor 1692062 INFO Xinference supervisor 0.0.0.0:19033 started
2024-12-31 09:41:26,908 xinference.core.worker 1692062 INFO Starting metrics export server at 0.0.0.0:None
2024-12-31 09:41:26,910 xinference.core.worker 1692062 INFO Checking metrics export server...
2024-12-31 09:41:27,524 xinference.core.worker 1692062 INFO Metrics server is started at: http://0.0.0.0:40235
2024-12-31 09:41:27,525 xinference.core.worker 1692062 INFO Purge cache directory: /root/.xinference/cache
2024-12-31 09:41:27,526 xinference.core.utils 1692062 INFO Remove empty directory: /root/.xinference/cache/Qwen2-gptq-7b
2024-12-31 09:41:27,530 xinference.core.worker 1692062 INFO Connected to supervisor as a fresh worker
2024-12-31 09:41:27,536 xinference.core.worker 1692062 INFO Xinference worker 0.0.0.0:19033 started
2024-12-31 09:41:32,547 xinference.api.restful_api 1691973 INFO Starting Xinference at endpoint: http://0.0.0.0:9997
2024-12-31 09:41:32,631 uvicorn.error 1691973 INFO Uvicorn running on http://0.0.0.0:9997 (Press CTRL+C to quit)
2024-12-31 09:41:49,343 xinference.core.worker 1692062 INFO [request 670721fe-c718-11ef-986b-00155d18eb80] Enter launch_builtin_model, args: <xinference.core.worker.WorkerActor object at 0x7fededd0b110>, kwargs: model_uid=zpoint_large_zh-0,model_name=zpoint_large_zh,model_size_in_billions=None,model_format=None,quantization=None,model_engine=None,model_type=embedding,n_gpu=None,request_limits=None,peft_model_config=None,gpu_idx=None,download_hub=None,model_path=None
2024-12-31 09:41:49,762 xinference.model.utils 1692062 INFO Model caching from URI: /mnt/d/zpoint_large_zh
2024-12-31 09:41:49,762 xinference.model.utils 1692062 INFO cache /root/.xinference/cache/zpoint_large_zh exists
2024-12-31 09:41:52,202 xinference.core.model 1692469 INFO Start requests handler.
2024-12-31 09:41:52,452 xinference.core.worker 1692062 ERROR Failed to load model zpoint_large_zh-0
Traceback (most recent call last):
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1793, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/root/miniconda3/envs/xinference/lib/python3.11/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/trainer.py", line 226, in <module>
from peft import PeftModel
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/__init__.py", line 22, in <module>
from .auto import (
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/auto.py", line 32, in <module>
from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/mapping.py", line 22, in <module>
from .mixed_model import PeftMixedModel
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/mixed_model.py", line 26, in <module>
from peft.tuners.mixed import COMPATIBLE_TUNER_TYPES
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/tuners/__init__.py", line 21, in <module>
from .lora import LoraConfig, LoraModel, LoftQConfig
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/tuners/lora/__init__.py", line 20, in <module>
from .model import LoraModel
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/tuners/lora/model.py", line 50, in <module>
from .awq import dispatch_awq
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/tuners/lora/awq.py", line 26, in <module>
from awq.modules.linear import WQLinear_GEMM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/__init__.py", line 2, in <module>
from awq.models.auto import AutoAWQForCausalLM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/models/__init__.py", line 1, in <module>
from .mpt import MptAWQForCausalLM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/models/mpt.py", line 1, in <module>
from .base import BaseAWQForCausalLM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/models/base.py", line 13, in <module>
from transformers.modeling_utils import shard_checkpoint
ImportError: cannot import name 'shard_checkpoint' from 'transformers.modeling_utils' (/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/modeling_utils.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xinference/core/worker.py", line 897, in launch_builtin_model
await model_ref.load()
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xoscar/backends/context.py", line 231, in send
return self._process_result_message(result)
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xoscar/backends/context.py", line 102, in _process_result_message
raise message.as_instanceof_cause()
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xoscar/backends/pool.py", line 667, in send
result = await self._run_coro(message.message_id, coro)
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xoscar/backends/pool.py", line 370, in _run_coro
return await coro
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xoscar/api.py", line 384, in on_receive
return await super().on_receive(message)  # type: ignore
File "xoscar/core.pyx", line 558, in on_receive
raise ex
File "xoscar/core.pyx", line 520, in xoscar.core._BaseActor.on_receive
async with self._lock:
File "xoscar/core.pyx", line 521, in xoscar.core._BaseActor.on_receive
with debug_async_timeout('actor_lock_timeout',
File "xoscar/core.pyx", line 526, in xoscar.core._BaseActor.on_receive
result = await result
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xinference/core/model.py", line 409, in load
self._model.load()
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/xinference/model/embedding/core.py", line 145, in load
import sentence_transformers
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/sentence_transformers/__init__.py", line 18, in <module>
from sentence_transformers.trainer import SentenceTransformerTrainer
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/sentence_transformers/trainer.py", line 12, in <module>
from transformers import EvalPrediction, PreTrainedTokenizerBase, Trainer, TrainerCallback
File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1781, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/utils/import_utils.py", line 1795, in _get_module
raise RuntimeError(
RuntimeError: [address=0.0.0.0:35141, pid=1692469] Failed to import transformers.trainer because of the following error (look up to see its traceback):
cannot import name 'shard_checkpoint' from 'transformers.modeling_utils' (/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/modeling_utils.py)
2024-12-31 09:41:52,476 xinference.core.worker 1692062 ERROR [request 670721fe-c718-11ef-986b-00155d18eb80] Leave launch_builtin_model, error: [address=0.0.0.0:35141, pid=1692469] Failed to import transformers.trainer because of the following error (look up to see its traceback):
cannot import name 'shard_checkpoint' from 'transformers.modeling_utils' (/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/modeling_utils.py), elapsed time: 3 s
[The same ImportError traceback is repeated here verbatim as the worker logs the failed launch_builtin_model request.]
2024-12-31 09:41:52,481 xinference.api.restful_api 1691973 ERROR [address=0.0.0.0:35141, pid=1692469] Failed to import transformers.trainer because of the following error (look up to see its traceback):
cannot import name 'shard_checkpoint' from 'transformers.modeling_utils' (/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/modeling_utils.py)
[The same ImportError traceback is repeated a third time as the error propagates through the supervisor to the REST API layer.]

Expected behavior

The embedding model should load and run.

XprobeBot added the gpu label Dec 31, 2024
XprobeBot added this to the v1.x milestone Dec 31, 2024
qinxuye (Contributor) commented Dec 31, 2024

What version of sentence_transformers are you using?

c935289832 (Author) replied:

> What version of sentence_transformers are you using?

3.2.1

qinxuye (Contributor) commented Dec 31, 2024

File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/tuners/lora/model.py", line 50, in <module>
from .awq import dispatch_awq
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/peft/tuners/lora/awq.py", line 26, in <module>
from awq.modules.linear import WQLinear_GEMM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/__init__.py", line 2, in <module>
from awq.models.auto import AutoAWQForCausalLM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/models/__init__.py", line 1, in <module>
from .mpt import MptAWQForCausalLM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/models/mpt.py", line 1, in <module>
from .base import BaseAWQForCausalLM
File "/root/miniconda3/envs/xinference/lib/python3.11/site-packages/awq/models/base.py", line 13, in <module>
from transformers.modeling_utils import shard_checkpoint
ImportError: cannot import name 'shard_checkpoint' from 'transformers.modeling_utils' (/root/miniconda3/envs/xinference/lib/python3.11/site-packages/transformers/modeling_utils.py)

This looks related to the peft and autoawq packages. Check the versions of both.
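To gather those versions in one place, a small standard-library sketch can help (the `report_versions` helper is my own, not part of Xinference or any of these packages):

```python
import importlib.metadata as md

def report_versions(packages):
    """Map each distribution name to its installed version, or a
    'not installed' marker if importlib.metadata cannot find it."""
    report = {}
    for pkg in packages:
        try:
            report[pkg] = md.version(pkg)
        except md.PackageNotFoundError:
            report[pkg] = "not installed"
    return report

# The distributions involved in the failing import chain above.
print(report_versions(["transformers", "peft", "autoawq", "sentence-transformers"]))
```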

c935289832 (Author) commented Dec 31, 2024

> This looks related to the peft and autoawq packages. Check the versions of both.

autoawq 0.2.5
peft 0.11.1

However, if I downgrade transformers below 4.47.0 it runs fine; transformers is currently at 4.46.3.

qinxuye (Contributor) commented Dec 31, 2024

> shard_checkpoint

The problem is most likely that transformers removed the shard_checkpoint function in 4.47, which breaks anything that still imports it.
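A quick way to confirm that diagnosis on a given environment is to probe for the symbol directly. The probe function below is my own sketch, not part of any of these packages:

```python
def has_shard_checkpoint() -> bool:
    """Return True if the installed transformers still exposes the
    shard_checkpoint helper that awq/models/base.py imports; it was
    removed in transformers 4.47, so this returns False on 4.47+."""
    try:
        # ImportError also covers transformers not being installed at all.
        from transformers.modeling_utils import shard_checkpoint  # noqa: F401
        return True
    except ImportError:
        return False

print("shard_checkpoint present:", has_shard_checkpoint())
```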

c935289832 (Author) replied:

> The problem is most likely that transformers removed the shard_checkpoint function in 4.47, which breaks anything that still imports it.

Should I file an issue against transformers, or is there another way to get this running?

qinxuye (Contributor) commented Dec 31, 2024

> Should I file an issue against transformers, or is there another way to get this running?

Try upgrading peft and see.

c935289832 (Author) replied:

> Try upgrading peft and see.

Tried it: after upgrading peft to 0.14.0, the custom embedding model runs.
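The thread's outcome can be summarized as a compatibility rule: transformers >= 4.47 (shard_checkpoint removed) fails in combination with peft < 0.14, whose LoRA code imports autoawq, which in turn imports the removed helper. A sketch of that check, with a helper name of my own invention:

```python
def known_broken(transformers_version: str, peft_version: str) -> bool:
    """True for the combination reported in this issue: transformers >= 4.47
    (shard_checkpoint removed) together with peft < 0.14 (whose import chain
    still reaches autoawq's shard_checkpoint import)."""
    def major_minor(version: str) -> tuple:
        return tuple(int(part) for part in version.split(".")[:2])
    return (major_minor(transformers_version) >= (4, 47)
            and major_minor(peft_version) < (0, 14))

print(known_broken("4.47.1", "0.11.1"))  # the reporter's failing combo -> True
# Either workaround from the thread clears the flag:
print(known_broken("4.46.3", "0.11.1"))  # downgrade transformers -> False
print(known_broken("4.47.1", "0.14.0"))  # upgrade peft -> False
```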
