1.17.1

@ilvalerione ilvalerione released this 24 Aug 17:51

Stream Tool Call

Neuron Agent now streams not only the LLM response but also tool calls. You can use this to give UI feedback about the tool calls running behind the scenes:

use App\Neuron\MyAgent;
use NeuronAI\Chat\Messages\ToolCallMessage;
use NeuronAI\Chat\Messages\UserMessage;
use NeuronAI\Tools\Tool;
use NeuronAI\Tools\ToolInterface;

$stream = MyAgent::make()
    ->addTool(
        Tool::make(
            'get_server_configuration',
            'retrieve the server network configuration'
        )->addProperty(...)->setCallable(...)
    )
    ->stream(
        new UserMessage("What's the IP address of the server?")
    );

// Iterate chunks
foreach ($stream as $chunk) {
    if ($chunk instanceof ToolCallMessage) {
        // Output the ongoing tool call
        echo PHP_EOL.\array_reduce(
            $chunk->getTools(),
            fn(string $carry, ToolInterface $tool): string
                => $carry.'- Calling tool: '.$tool->getName().PHP_EOL,
            ''
        );
    } else {
        echo $chunk;
    }
}

// Let me retrieve the server configuration. 
// - Calling tool: get_server_configuration
// The IP address of the server is: 192.168.0.10
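To push this feedback to a browser in real time, the loop above can be adapted into a server-sent events endpoint. This is a minimal sketch, not part of Neuron's API: the route, the `tool_call`/`token` event names, and the query parameter are illustrative assumptions.

```php
<?php
// Hypothetical SSE endpoint that forwards stream chunks to the browser.
// MyAgent and the event names below are assumptions for illustration.

use App\Neuron\MyAgent;
use NeuronAI\Chat\Messages\ToolCallMessage;
use NeuronAI\Chat\Messages\UserMessage;

header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

$stream = MyAgent::make()->stream(
    new UserMessage($_GET['q'] ?? "What's the IP address of the server?")
);

foreach ($stream as $chunk) {
    if ($chunk instanceof ToolCallMessage) {
        // Tell the UI which tools are running so it can show a status label.
        foreach ($chunk->getTools() as $tool) {
            echo "event: tool_call\n";
            echo 'data: '.json_encode(['name' => $tool->getName()])."\n\n";
        }
    } else {
        // Forward each text chunk as it arrives.
        echo "event: token\n";
        echo 'data: '.json_encode(['text' => (string) $chunk])."\n\n";
    }
    flush();
}
```

On the client, an `EventSource` listener for the `tool_call` event can render a "Calling tool: …" indicator while `token` events append text to the response.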

Documentation