phi3.5-vision fails on CPU #1146
The images can be of any format. Here's how images are loaded: onnxruntime-genai/src/models/prompt_image_processor.cpp, lines 107 to 116 (at commit 7735e10).
You can find more information about that here.
I added the options as below:
but I get the error:
Is this correct?
Can you try
Hi @kunal-vaishnavi, I tried it in genai_config.json:
but got this error:
It has to be without the quotes around the value.
The PR linked above should help make these JSON mismatch errors clearer in the future.
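Based on this thread, the flag lives under the decoder's session options in genai_config.json, with the value written as a bare JSON boolean rather than a quoted string. The exact key placement shown here is an assumption; adjust it to match your config file:

```json
{
  "model": {
    "decoder": {
      "session_options": {
        "enable_cpu_mem_arena": false
      }
    }
  }
}
```

A quoted value such as `"enable_cpu_mem_arena": "false"` parses as a string, which is the type mismatch the parser rejects.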
@kunal-vaishnavi Do you have any suggestions to overcome this?
onnxruntime-genai/benchmark/python/benchmark_e2e.py, lines 40 to 56 (at commit d129274)
This will tell you where the error occurs in your inference.

`og.set_log_options(enabled=True, model_input_values=True, model_output_values=True, ansi_tags=True)`

This will tell you which stage within ONNX Runtime GenAI causes the error.

If the memory usage is significantly more than when running with PyTorch, then there may be an issue that needs to be investigated. If the memory usage is close, you can try resizing the image so that the model doesn't run out of memory, or use a machine with more RAM.
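As a rough stand-in for the benchmark's memory sampling, peak resident memory can be read from the standard library on Unix-like systems. This is a sketch, not the benchmark's actual code: `ru_maxrss` is reported in kilobytes on Linux but bytes on macOS, and the 50 MB allocation merely stands in for model inference.

```python
# Sketch: observe peak resident set size (RSS) growth around a workload.
# On Linux, ru_maxrss is in kilobytes; on macOS, it is in bytes.
import resource

def peak_rss_kb() -> int:
    """Peak RSS of the current process (KB on Linux)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss_kb()
workload = bytearray(50 * 1024 * 1024)  # stand-in for model inference
after = peak_rss_kb()
print(f"peak RSS grew by roughly {(after - before) // 1024} MB")
```

Comparing this growth against the same workload under PyTorch is one way to judge whether the usage is expected or a leak worth reporting.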
This changes the JSON parsing to use a std::variant so that there is a single OnValue handler instead of separate OnString/OnNumber/OnBool/OnNull handlers. Previously, a mismatched type would report `JSON Error: Unknown value: name at line 3 index 19`, or just `JSON Error: Unknown value: name` if the name was known but the type of its value was wrong (example: #1146). Now it gives a much better error message, showing first the full path of the field being parsed and then exactly how the types mismatch: `JSON Error: model:type - Expected a number but saw a string at line 3 index 19`
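The error-reporting scheme the PR describes can be illustrated in a few lines: walk the parsed document against a table of expected types while threading the field path along, and report both where and how the first mismatch occurs. This is an illustrative Python sketch, not the repo's C++ implementation; the schema and document are made up.

```python
# Sketch: type-check a parsed JSON document against an expected-type
# table, tracking the field path so mismatches report their location.
import json

def type_name(value):
    """Map a parsed Python value to a JSON-ish type name."""
    return {bool: "bool", int: "number", float: "number",
            str: "string", type(None): "null"}.get(type(value), "object")

def check(value, expected, path):
    """Return an error string for the first type mismatch, else None."""
    if isinstance(expected, dict):
        for key, sub in expected.items():
            if key in value:
                err = check(value[key], sub, f"{path}:{key}" if path else key)
                if err:
                    return err
        return None
    if type_name(value) != expected:
        return (f"JSON Error: {path} - Expected a {expected} "
                f"but saw a {type_name(value)}")
    return None

schema = {"model": {"type": "string", "context_length": "number"}}
doc = json.loads('{"model": {"type": "phi3v", "context_length": "4096"}}')
print(check(doc, schema, ""))
# A quoted "4096" is a string where a number was expected, so the
# reported path pinpoints model:context_length.
```

The path-threading is the key design choice: without it, the checker could only say that some value had the wrong type, which is exactly the old behavior the PR replaces.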
Hi,
I am using a Linux aarch64 device with ORT and onnxruntime-genai v0.5.2.
I am executing the phi3.5-vision model on CPU, following the steps at https://onnxruntime.ai/docs/genai/tutorials/phi3-v.html#run-on-cpu
The program gets killed with an OOM error. My device has 16 GB of memory. I can easily run the phi3.5-mini models on my device, but phi3.5-vision fails due to the oom-kill error.
My error log is as follows:
Are there any specific image formats the model takes in?
I have faced this 'Killed' issue with ORT before. With ORT, I have to set the flag
enable_cpu_mem_arena
to False.
How do I do the same using the provided Python script https://github.com/microsoft/onnxruntime-genai/blob/rel-0.5.2/examples/python/phi3v.py
Does ORT GenAI also have such flags when executing generator models?
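The session options are read from the model folder's genai_config.json rather than passed through the example script, so one way to set the flag is to edit that file. A minimal sketch, assuming the config layout discussed in this thread; the `disable_cpu_mem_arena` helper name is mine, not part of the library:

```python
# Sketch: set enable_cpu_mem_arena to false in a model's genai_config.json.
# The config path and session_options key are assumptions based on this
# thread; adjust them to your model folder.
import json
from pathlib import Path

def disable_cpu_mem_arena(model_dir: str) -> dict:
    config_path = Path(model_dir) / "genai_config.json"
    config = json.loads(config_path.read_text())
    session_options = config["model"]["decoder"].setdefault("session_options", {})
    session_options["enable_cpu_mem_arena"] = False  # boolean, not the string "False"
    config_path.write_text(json.dumps(config, indent=4))
    return config
```

Writing the value as a JSON boolean matters here: a quoted `"False"` is a string and fails to parse as the expected type.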