
Image inference error #47

Open
NorthLatitudeOne opened this issue Apr 18, 2019 · 4 comments

Comments

@NorthLatitudeOne

Hi, could somebody take a look at my problem? Thanks!
Image inference from the web menu's test image inference page returns:
Predict result: [{"error": "Inference error <class 'KeyError'>: 'input_image'"}, 400]

Image inference from the command line:

```
E:\Anaconda\Lib\site-packages\simple_tensorflow_serving>curl -X POST -F 'image=images/NG0093.jpg' -F "model_version=3" 127.0.0.1:8500
<title>500 Internal Server Error</title>
Internal Server Error
The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
```

Simple TensorFlow Serving log:

```
2019-04-18 09:00:46 ERROR Need to set image or images for form-data
2019-04-18 09:00:46 ERROR Exception on / [POST]
Traceback (most recent call last):
  File "e:\anaconda\lib\site-packages\flask\app.py", line 2292, in wsgi_app
    response = self.full_dispatch_request()
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1815, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "e:\anaconda\lib\site-packages\flask_cors\extension.py", line 161, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1718, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "e:\anaconda\lib\site-packages\flask\_compat.py", line 35, in reraise
    raise value
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1813, in full_dispatch_request
    rv = self.dispatch_request()
  File "e:\anaconda\lib\site-packages\flask\app.py", line 1799, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "e:\anaconda\lib\site-packages\simple_tensorflow_serving\server.py", line 184, in decorated
    return f(*decorator_args, **decorator_kwargs)
  File "e:\anaconda\lib\site-packages\simple_tensorflow_serving\server.py", line 308, in inference
    json_result, status_code = do_inference()
  File "e:\anaconda\lib\site-packages\simple_tensorflow_serving\server.py", line 349, in do_inference
    if "model_name" in json_data:
TypeError: argument of type 'NoneType' is not iterable
2019-04-18 09:00:46 INFO 127.0.0.1 - - [18/Apr/2019 09:00:46] "POST / HTTP/1.1" 500 -
```
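For context, the traceback shows two failures in sequence: the form-data branch logged "Need to set image or images for form-data", and the JSON fallback then received a `None` body, producing the `TypeError`. One likely cause is the curl invocation itself: Windows cmd.exe passes single quotes through literally (so the field name arrives as `'image'` with quotes), and without the `@` prefix curl sends the path as a plain string instead of uploading the file. A minimal sketch of the equivalent request with Python `requests`, assuming the server reads an `image` file field plus a `model_version` form field (field names taken from the curl command above; the file bytes here are placeholders):

```python
import requests

# Build (without sending) the multipart request that curl's
# -F "image=@images/NG0093.jpg" -F "model_version=3" would produce.
req = requests.Request(
    "POST",
    "http://127.0.0.1:8500",
    files={"image": ("NG0093.jpg", b"\xff\xd8\xff\xe0fake-jpeg-bytes")},
    data={"model_version": "3"},
).prepare()

# The body is multipart/form-data with the file contents inlined.
print(req.headers["Content-Type"])
```

Calling `requests.post` with the same `files=`/`data=` arguments would send it; the point is that the file must be uploaded as a part named `image`, not passed as a path string.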

@jeffin07

jeffin07 commented Jul 3, 2019

@NorthLatitudeOne I got a similar error:
{'error': "Inference error <class 'KeyError'>: 'in'"}
Did you find any solution for this?
I also tried the model given in the README and still get the same type of error:
{"error": "Inference error <class 'KeyError'>: 'features'"}

@serlina

serlina commented Jul 24, 2019

I used Docker to run simple_tensorflow_serving locally, and I can open 127.0.0.1:8500:
docker run -d -p 8500:8500 tobegit3hub/simple_tensorflow_serving
The Python version is 2.7.

When I used Postman to send the POST request for the image inference client call, it raised the exception below:

[screenshot: exception shown in Postman]

The code for converting the jpeg to base64 is:

```python
from PIL import Image
import cv2
import cStringIO  # Python 2 only; on Python 3 use io.BytesIO instead
import base64

def base64_encode_img(img):
    """Encode a BGR image (as read by cv2) as a base64 JPEG string.

    :param img: numpy array in BGR channel order
    :return: base64-encoded JPEG bytes
    """
    img_rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
    print('shape of img_rgb:', img_rgb.shape)
    pil_img = Image.fromarray(img_rgb)

    buf = cStringIO.StringIO()
    pil_img.save(buf, format="JPEG", quality=100)
    b64code = base64.urlsafe_b64encode(buf.getvalue())  # web-safe
    # b64code = base64.b64encode('abcdefgdisoaufd,0.342,0.456,0.987')  # not web-safe
    print(b64code)
    return b64code

if __name__ == "__main__":
    img_BGR = cv2.imread('./mew.jpg')
    base64_encode_img(img_BGR)
```

Could someone help me with the image client call for this model?

@serlina

serlina commented Jul 24, 2019

Sorry, to correct my code above: I just used the code below to convert the local jpg to base64, and then used that base64 string in the Postman POST body:

```python
import base64

with open("./mew.jpg", "rb") as image_file:
    encoded_string = base64.b64encode(image_file.read())
    print encoded_string
```

Note: if I use base64.urlsafe_b64encode to encode instead, it raises another error.
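For what it's worth, the two encodings differ only in two alphabet characters ('+' becomes '-', '/' becomes '_'), which is why a serving graph built around `tf.decode_base64` (URL-safe) rejects standard base64 input and vice versa. A small self-contained illustration; the input bytes are chosen so every output character lands on a position where the two alphabets differ:

```python
import base64

# Bytes whose 6-bit groups are all 62, the index where the alphabets diverge.
raw = b"\xfb\xef\xbe"

std = base64.b64encode(raw)          # standard alphabet: '+' and '/'
url = base64.urlsafe_b64encode(raw)  # URL-safe alphabet: '-' and '_'

print(std)  # b'++++'
print(url)  # b'----'

# Translating just those two characters maps one encoding onto the other.
assert std.translate(bytes.maketrans(b"+/", b"-_")) == url
```

So whichever variant the client uses must match what the model graph decodes.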

@tobegit3hub
Owner

Hi @serlina, it depends on your TensorFlow SavedModel's ops. If you use tf.decode_base64(model_base64_placeholder) to process the input data, you may try this client code, which has been tested in our environment.

```python
import requests
import base64

def main():
  image_file_name = "../../images/mew.jpg"
  image_b64_string = base64.urlsafe_b64encode(
      open(image_file_name, "rb").read())

  endpoint = "http://127.0.0.1:8500"
  input_data = {
      "data": {
          "image": [image_b64_string]
      }
  }
  result = requests.post(endpoint, json=input_data)
  print(result.json())

if __name__ == "__main__":
  main()
```
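One caveat if you run a client like this on Python 3 (the thread mentions Python 2.7): `base64.urlsafe_b64encode` returns bytes, and the json machinery refuses to serialize bytes, so decode to str before building the payload. A sketch, with placeholder bytes standing in for the real file contents:

```python
import base64

# Placeholder for open(image_file_name, "rb").read()
image_bytes = b"\xff\xd8\xff\xe0fake-jpeg-bytes"

# .decode() turns the base64 bytes into a JSON-serializable str on Python 3.
image_b64_string = base64.urlsafe_b64encode(image_bytes).decode("utf-8")

input_data = {"data": {"image": [image_b64_string]}}
# requests.post(endpoint, json=input_data) would now serialize cleanly.
```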

4 participants