
Enable tool support for ollama #164

Merged 1 commit on Feb 28, 2025
Conversation

@alappe (Contributor) commented Jul 26, 2024

Adds basic support for tools with ollama v0.3, tested with the hairbrush example:

alias LangChain.Function
alias LangChain.Message
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOllamaAI


# map of data we want to be passed as `context` to the function when
# executed.
custom_context = %{
  "user_id" => 123,
  "hairbrush" => "drawer",
  "dog" => "backyard",
  "sandwich" => "kitchen"
}

# a custom Elixir function made available to the LLM
custom_fn =
  Function.new!(%{
    name: "custom",
    description: "Returns the location of the requested element or item.",
    parameters_schema: %{
      type: "object",
      properties: %{
        thing: %{
          type: "string",
          description: "The thing whose location is being requested."
        }
      },
      required: ["thing"]
    },
    function: fn %{"thing" => thing} = _arguments, context ->
      # our context is a pretend item/location map
      {:ok, context[thing]}
    end
  })

# create and run the chain
{:ok, updated_chain, %Message{} = message} =
  LLMChain.new!(%{
    llm: ChatOllamaAI.new!(%{model: "llama3.1", verbose: true}),
    custom_context: custom_context,
    verbose: true
  })
  |> LLMChain.add_tools(custom_fn)
  |> LLMChain.add_message(Message.new_user!("Where is the hairbrush located?"))
  |> LLMChain.run(mode: :while_needs_response)

# print the LLM's answer
IO.puts(message.content)
#=> "The hairbrush is located in the drawer."

@brainlid (Owner) left a comment

Just reviewing the code, it looks good!

Can you add some tests? I'd like to see tests for the following:

  • for_api
  • get_parameters
  • call using Mimic to avoid doing a live call
  • do_process_response

Having tests really helps the maintainability of the project since I'm not easily able to test all the different chat models.

Feel free to ask questions if you need help with any of that. Thanks!

@alappe force-pushed the ollama_support_tools branch from 1792a81 to daedf50 on August 5, 2024 09:44
@alappe (Contributor, Author) commented Aug 5, 2024

I added several tests to the best of my knowledge (I'm new to LangChain). Please have a look…

@alappe force-pushed the ollama_support_tools branch from daedf50 to 85f00b9 on August 5, 2024 13:05
@brainlid (Owner) left a comment

Thank you for adding tests! Sorry I was slow to respond.

I added some specific requests. The one example I gave about `assert [%{"function" => _} | _] = data.tools` applies to all the other tests as well.


data = ChatOllamaAI.for_api(ollama_ai, [], [fun])

assert [%{"function" => _} | _] = data.tools
@brainlid (Owner) left a comment

One of the main goals with the test is to verify it's formatting the data as the targeted LLM expects it. Not just that it's a map with a "function" key. Also, because of the test setup, you can safely make assumptions about the expected data. For example...

assert [%{} = result_data] = data.tools

# now create a result_data assertion

We want to test that the LangChain.Function data structure is structured how the supported Ollama server expects it to be. What does it expect things to be called? Anything different or unusual?
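To illustrate, a full-structure assertion for the `custom` function from the PR description could look like the sketch below. This is only a sketch: the nested `"type"`/`"function"`/`"parameters"` field names are assumed from the Ollama chat API's tool schema and should be verified against what `for_api/3` actually emits.

```elixir
# Sketch of asserting the full tool structure (field names assumed
# from the Ollama chat API tool schema; check for_api/3's real output).
data = ChatOllamaAI.for_api(ollama_ai, [], [fun])

assert [
         %{
           "type" => "function",
           "function" => %{
             "name" => "custom",
             "description" => "Returns the location of the requested element or item.",
             "parameters" => %{
               "type" => "object",
               "properties" => %{
                 "thing" => %{
                   "type" => "string",
                   "description" => "The thing whose location is being requested."
                 }
               },
               "required" => ["thing"]
             }
           }
         }
       ] = data.tools
```

Pinning down every key this way catches silent renames (e.g. "parameters" vs. OpenAI-style variations) that a loose `%{"function" => _}` match would let through.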

@alappe (Contributor, Author) replied

I now assert against the full structure.

@mpope9 (Contributor) commented Jan 3, 2025

Will this PR be part of the 0.3.0 release?

@brainlid (Owner) commented

I'd love to get this merged in if someone wants to pick it up and help finish the changes!

@brainlid added the "help wanted" label on Feb 27, 2025
@alappe (Contributor, Author) commented Feb 28, 2025

Sorry, I had a lot on my plate… I can take another shot. Since then, ollama has added async tool support, which my previous changes do not cover. But maybe I'll clean up this PR first, then look at async later.

@alappe force-pushed the ollama_support_tools branch 2 times, most recently from de67662 to 58a01df on February 28, 2025 08:52
@alappe force-pushed the ollama_support_tools branch from 58a01df to bfee9e3 on February 28, 2025 09:18
@alappe (Contributor, Author) commented Feb 28, 2025

@brainlid Rebased and made the changes.

@brainlid merged commit 40ce7f6 into brainlid:main on Feb 28, 2025
1 check passed
@brainlid (Owner) commented

Thanks for the contribution and for making the changes! That was a large chunk of work.
❤️💛💙💜

@mustela commented Feb 28, 2025

Hey guys, I'm testing these changes in the demo project and I'm getting:

[error] Task #PID<0.5203.0> started from #PID<0.5193.0> terminating
** (WithClauseError) no with clause matching: %{"function" => %{"arguments" => %{"activity" => nil, "days" => 7}, "name" => "get_fitness_logs"}}
    (langchain 0.3.1) lib/message.ex:242: anonymous fn/1 in LangChain.Message.validate_and_parse_tool_calls/1
    (elixir 1.16.3) lib/enum.ex:1700: Enum."-map/2-lists^map/1-1-"/2
    (langchain 0.3.1) lib/message.ex:241: LangChain.Message.validate_and_parse_tool_calls/1
    (langchain 0.3.1) lib/message.ex:162: LangChain.Message.common_validations/1
    (langchain 0.3.1) lib/message.ex:132: LangChain.Message.new/1
    (langchain 0.3.1) lib/message_delta.ex:254: LangChain.MessageDelta.to_message/1
    (langchain 0.3.1) lib/chains/llm_chain.ex:673: LangChain.Chains.LLMChain.delta_to_message_when_complete/1
    (elixir 1.16.3) lib/enum.ex:2528: Enum."-reduce/3-lists^foldl/2-0-"/3
    (langchain 0.3.1) lib/chains/llm_chain.ex:571: LangChain.Chains.LLMChain.do_run/1
    (langchain 0.3.1) lib/chains/llm_chain.ex:520: LangChain.Chains.LLMChain.run_while_needs_response/1
    (langchain 0.3.1) lib/chains/llm_chain.ex:398: LangChain.Chains.LLMChain.run/2
    (langchain_demo 0.1.0) lib/langchain_demo_web/live/agent_chat_live/index.ex:321: anonymous fn/1 in LangChainDemoWeb.AgentChatLive.Index.run_chain/1
    (phoenix_live_view 0.20.17) lib/phoenix_live_view/async.ex:220: Phoenix.LiveView.Async.do_async/5
    (elixir 1.16.3) lib/task/supervised.ex:101: Task.Supervised.invoke_mfa/2
Function: #Function<7.104768383/0 in Phoenix.LiveView.Async.run_async_task/5>
    Args: []
[error] GenServer #PID<0.5193.0> terminating
** (WithClauseError) no with clause matching: %{"function" => %{"arguments" => %{"activity" => nil, "days" => 7}, "name" => "get_fitness_logs"}}
    (langchain 0.3.1) lib/message.ex:242: anonymous fn/1 in LangChain.Message.validate_and_parse_tool_calls/1
    (elixir 1.16.3) lib/enum.ex:1700: Enum."-map/2-lists^map/1-1-"/2
    (langchain 0.3.1) lib/message.ex:241: LangChain.Message.validate_and_parse_tool_calls/1
    (langchain 0.3.1) lib/message.ex:162: LangChain.Message.common_validations/1
    (langchain 0.3.1) lib/message.ex:132: LangChain.Message.new/1
    (langchain 0.3.1) lib/message_delta.ex:254: LangChain.MessageDelta.to_message/1
    (langchain 0.3.1) lib/chains/llm_chain.ex:673: LangChain.Chains.LLMChain.delta_to_message_when_complete/1
    (langchain_demo 0.1.0) lib/langchain_demo_web/live/agent_chat_live/index.ex:102: LangChainDemoWeb.AgentChatLive.Index.handle_info/2
    (phoenix_live_view 0.20.17) lib/phoenix_live_view/channel.ex:360: Phoenix.LiveView.Channel.handle_info/2
    (stdlib 5.2.3) gen_server.erl:1095: :gen_server.try_handle_info/3
    (stdlib 5.2.3) gen_server.erl:1183: :gen_server.handle_msg/6
    (stdlib 5.2.3) proc_lib.erl:241: :proc_lib.init_p_do_apply/3

Wondering if some of the interfaces have changed.

@alappe (Contributor, Author) commented Feb 28, 2025

@mustela Did you set stream: false when creating the chain's llm? e.g. here: https://github.com/brainlid/langchain_demo/blob/main/lib/langchain_demo_web/live/agent_chat_live/index.ex#L271

The code does not yet support tool calls while streaming.
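For reference, a minimal sketch of what disabling streaming when building the chain's llm might look like, assuming `ChatOllamaAI` accepts a `stream` option as discussed above (the model name is illustrative):

```elixir
# Sketch: create the chain's llm with streaming disabled, since
# tool calls are not supported while streaming yet.
llm = ChatOllamaAI.new!(%{model: "llama3.1", stream: false})
```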

@mustela commented Feb 28, 2025

That did the trick! Thank you @alappe ❤️! Looking forward to having streaming tool support 🙏🏼

Labels: help wanted
Participants: 4