Releases · alexrudall/ruby-openai
v5.2.0
Fixed
- Added more spec-compliant SSE parsing; see the WHATWG event-stream interpretation spec: https://html.spec.whatwg.org/multipage/server-sent-events.html#event-stream-interpretation
- Fixes an issue where OpenAI or an intermediary returns only partial JSON per chunk of streamed data (a simplified parsing sketch follows below).
- Huge thanks to @atesgoral for this important fix!
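For context, the event-stream format terminates each event with a blank line and a single JSON payload may arrive split across several network reads. The following is a simplified, illustrative sketch of that buffering approach, not the gem's actual implementation (it ignores `\r\n` line endings and non-`data:` fields):

```ruby
require "json"

buffer = +""
events = []

handle_chunk = proc do |chunk|
  buffer << chunk
  # Events are terminated by a blank line; only parse once an event is complete.
  while buffer.include?("\n\n")
    event, buffer = buffer.split("\n\n", 2)
    # Concatenate the payloads of the event's "data:" lines.
    data = event.lines.grep(/\Adata:/).map { |line| line.sub(/\Adata: ?/, "").chomp }.join("\n")
    next if data.empty? || data == "[DONE]"
    events << JSON.parse(data)
  end
end

# JSON split across two reads is only parsed after the terminating blank line arrives:
handle_chunk.call(%(data: {"choices":[{"delta":{"content":"He))
handle_chunk.call(%(llo"}}]}\n\n))
events.first.dig("choices", 0, "delta", "content") # => "Hello"
```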
5.1.0
Added
- Added rough_token_count to estimate tokens in a string according to OpenAI's "rules of thumb". Thank you to @jamiemccarthy for the idea and implementation!
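A quick usage sketch, assuming the helper is exposed at the module level (check the README for the exact entry point):

```ruby
require "openai"

text = "OpenAI bills by the token, so a rough estimate can help with budgeting."
OpenAI.rough_token_count(text) # => an approximate count, per OpenAI's rules of thumb
```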
5.0.0
Added
- Support multi-tenant use of the gem! Each client now holds its own config, so you can create unlimited clients in the same project, for example to Azure and OpenAI, or for different headers, access keys, etc.
- [BREAKING-ish] This change should only break your usage of ruby-openai if you are directly calling class methods like `OpenAI::Client.get` for some reason, as they are now instance methods. Normal usage of the gem should be unaffected: you can make new clients and they'll keep their own config if you want, overriding the global config (a sketch follows below).
- Huge thanks to @petergoldstein for his original work on this, @cthulhu for testing, and many others for reviews and suggestions.
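A sketch of the per-client configuration described above; the keyword arguments are assumed to mirror the global config options, and the second access token is a hypothetical per-tenant key:

```ruby
require "openai"

# Each client carries its own configuration, overriding the global one.
default_client = OpenAI::Client.new(access_token: ENV.fetch("OPENAI_ACCESS_TOKEN"))

tenant_client = OpenAI::Client.new(
  access_token: ENV.fetch("TENANT_OPENAI_ACCESS_TOKEN"), # hypothetical per-tenant key
  request_timeout: 240
)

# Formerly class-level calls are now instance methods on each client:
default_client.models.list
tenant_client.models.list
```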
Changed
- [BREAKING] Move audio-related methods to the Audio model from the Client model. You will need to update your code to handle this change, changing `client.translate` to `client.audio.translate` and `client.transcribe` to `client.audio.transcribe`.
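A before/after sketch of the rename, assuming a configured client and a local speech.mp3 file:

```ruby
# Before 5.0.0:
# client.transcribe(parameters: { model: "whisper-1", file: File.open("speech.mp3", "rb") })
# client.translate(parameters: { model: "whisper-1", file: File.open("speech.mp3", "rb") })

# From 5.0.0:
client.audio.transcribe(parameters: { model: "whisper-1", file: File.open("speech.mp3", "rb") })
client.audio.translate(parameters: { model: "whisper-1", file: File.open("speech.mp3", "rb") })
```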
4.3.2
Fixed
- Don't overwrite config extra-headers when making a client without different ones. Thanks to @swistaczek for raising this!
- Include extra-headers for Azure requests.
4.3.1
Fixed
- Tempfiles can now be sent to the API as well as Files, eg for Whisper. Thanks to @codergeek121 for the fix!
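A sketch of passing a Tempfile (e.g. audio received from an upload) where a File previously worked; at this version the method was still client.transcribe (renamed to client.audio.transcribe in 5.0.0), and uploaded_audio_bytes is a hypothetical variable:

```ruby
require "tempfile"

Tempfile.create(["speech", ".mp3"]) do |tempfile|
  tempfile.binmode
  tempfile.write(uploaded_audio_bytes) # hypothetical: raw audio received elsewhere
  tempfile.rewind
  client.transcribe(parameters: { model: "whisper-1", file: tempfile })
end
```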
4.3.0
Added
- Add extra-headers to config to allow setting openai-caching-proxy-worker TTL, Helicone Auth and anything else ya need. Ty to @deltaguita and @marckohlbrugge for the PR!
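A configuration sketch, assuming the option is exposed as extra_headers; the header names and values here are illustrative:

```ruby
OpenAI.configure do |config|
  config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
  config.extra_headers = {
    "X-Proxy-TTL" => "43200", # TTL for openai-caching-proxy-worker, in seconds
    "Helicone-Auth" => "Bearer #{ENV.fetch('HELICONE_API_KEY')}"
  }
end
```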
4.2.0
Added
- Add Azure OpenAI Service support. Thanks to @rmachielse and @steffansluis for the PR and to everyone who requested this feature!
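A configuration sketch for Azure OpenAI Service; the api_version and uri_base values depend on your Azure resource and deployment and are illustrative here:

```ruby
OpenAI.configure do |config|
  config.access_token = ENV.fetch("AZURE_OPENAI_API_KEY")
  config.uri_base = ENV.fetch("AZURE_OPENAI_URI") # e.g. https://<resource>.openai.azure.com/openai/deployments/<deployment>
  config.api_type = :azure
  config.api_version = "2023-03-15-preview"
end
```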
4.1.0
4.0.0
Added
- Add the ability to stream Chat responses from the API! Thanks to everyone who requested this and made suggestions (a usage sketch follows below).
- Added instructions for streaming to the README.
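A usage sketch of streaming Chat responses: pass a proc as the stream parameter and it is called with each parsed chunk as it arrives (an already-configured client is assumed):

```ruby
client.chat(
  parameters: {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Describe a character called Anna!" }],
    stream: proc do |chunk, _bytesize|
      print chunk.dig("choices", 0, "delta", "content")
    end
  }
)
```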
Changed
- Switch HTTP library from HTTParty to Faraday to allow streaming and future feature and performance improvements.
- [BREAKING] Endpoints now return JSON rather than HTTParty objects. You will need to update your code to handle this change, changing `JSON.parse(response.body)["key"]` and `response.parsed_response["key"]` to just `response["key"]`.
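A migration sketch for the response change, using chat as an example endpoint:

```ruby
response = client.chat(
  parameters: { model: "gpt-3.5-turbo", messages: [{ role: "user", content: "Hi!" }] }
)

# Before 4.0.0 (HTTParty response objects):
# JSON.parse(response.body)["choices"]
# response.parsed_response["choices"]

# From 4.0.0 (plain parsed JSON):
response["choices"]
```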
3.7
Added
- Allow the usage of proxy base URIs like https://www.helicone.ai/. Thanks to @mmmaia for the PR!
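A configuration sketch of pointing the gem at a proxy base URI; the Helicone URL shown is illustrative, so check their docs for the current endpoint:

```ruby
OpenAI.configure do |config|
  config.access_token = ENV.fetch("OPENAI_ACCESS_TOKEN")
  config.uri_base = "https://oai.hconeai.com/"
end
```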