Hi,
I am trying to chat with Ollama using the llama3.2-vision model, and I have a model class for my structured output. When I use a Docker container locally with Ollama it works fine, but once I try to connect to my Ollama instance over HTTPS I get the error:
NeuronAI\Exceptions\AgentException
The LLM wasn't able to generate a structured response for the class App\Models\Cheque.
When I tried to connect to my local Docker container through HTTPS I got the same issue.
Has this happened to anyone else?
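For anyone debugging the same thing: as I understand it, the agent ultimately posts to Ollama's /api/chat endpoint with the JSON schema in the format field, so replaying that request against the HTTPS endpoint directly can help isolate the proxy from the library. A minimal Python sketch of such a payload (schema shortened to two fields for illustration; field names follow the Ollama API, the base64 value is a placeholder):

```python
import json

# Structured-output request body like the one the agent sends.
payload = {
    "model": "llama3.2-vision",
    "stream": False,
    "format": {  # JSON schema constraining the model's reply
        "type": "object",
        "properties": {
            "employer": {"type": "string"},
            "hours": {"type": "integer"},
        },
        "required": ["employer", "hours"],
    },
    "messages": [
        {
            "role": "user",
            "content": "Here is a cheque to process.",
            "images": ["<base64 jpeg>"],  # raw base64, no data: prefix
        }
    ],
}

body = json.dumps(payload)
```

Posting `body` to both the local and the HTTPS URL with the same client should show whether the remote side returns the same JSON.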
Here are my classes for context:
class Cheque {
    #[SchemaProperty(description: "employer's LAST NAME and first name", required: true)]
    public string $employer;
    #[SchemaProperty(description: 'contract number', required: true)]
    public string $contract_nb;
    #[SchemaProperty(description: "employee's LAST NAME and first name", required: true)]
    public string $employee;
    #[SchemaProperty(description: 'number of hours', required: true)]
    public int $hours;
    #[SchemaProperty(description: 'net hourly wage', required: true)]
    public float $salary;
    #[SchemaProperty(description: 'total net amount paid', required: true)]
    public float $total_salary;
    #[SchemaProperty(description: 'month the cheque covers', required: true)]
    public string $cheque_month;
    #[SchemaProperty(description: 'year the cheque covers', required: true)]
    public int $cheque_year;
    #[SchemaProperty(description: 'reason for absence during the month', required: false)]
    public ?string $absence_reason = null; // optional field, so nullable with a default
    #[SchemaProperty(description: 'day of the month the cheque was submitted', required: true)]
    public int $submit_day;
    #[SchemaProperty(description: 'month the cheque was submitted', required: true)]
    public int $submit_month;
    #[SchemaProperty(description: 'year the cheque was submitted', required: true)]
    public int $submit_year;
}
class ChequeAgent extends Agent {
    protected function provider(): AIProviderInterface
    {
        return new Ollama(
            url: 'https://ollama.ia.inetis.ch/api',
            model: 'llama3.2-vision',
            parameters: [
                'keep_alive' => '60m',
                'options' => [
                    'temperature' => 0.0,
                ]
            ]
        );
    }
    public function instructions(): string
    {
        return new SystemPrompt(
            background: [
                "Extract the information from a form for domestic work. It summarizes an employee's activity for a given month",
                "The labels are printed and the values are handwritten",
                "The document is made up of four parts.",
                "The first part contains 3 pieces of information: on the first line, the employer's last and first name; on the second line, the contract number; and finally the employee's last and first name. Last names are in all caps.",
                "The second part contains the number of hours worked, the net hourly wage and the total net amount paid",
                "The third part contains the month and year this cheque covers. It also contains the reason for absence, if there is one.",
                "The last part contains the day, month and year the cheque was submitted"
            ],
        );
    }
    protected function getOutputClass(): string
    {
        return Cheque::class;
    }
}
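My reading of the exception above is that it is thrown when the model's reply cannot be deserialized into the output class, so a reverse proxy that truncates or buffers the response would trigger it even when the model itself answers correctly. A rough sketch of that kind of check (my own illustration, not NeuronAI's actual code):

```python
import json

def check_required(reply_text: str, required: list[str]) -> list[str]:
    """Return the required fields missing from the model's JSON reply.

    Illustrative stand-in for the deserialization step that raises
    the AgentException when fields cannot be recovered.
    """
    try:
        data = json.loads(reply_text)
    except json.JSONDecodeError:
        return required  # not valid JSON at all -> everything is missing
    return [f for f in required if f not in data]

# A reply cut short mid-stream (e.g. by a proxy timeout) fails to parse:
missing = check_required('{"employer": "DUPONT Marie", "contract_nb"',
                         ["employer", "contract_nb"])
```

In other words, comparing the raw HTTP response from the local and HTTPS endpoints is worth doing before suspecting the model or the schema.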
class SendCheque extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'cheque:send {file?*}';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Sends a cheque for analysis; returns the cheque values as JSON';

    protected $valid_ext = ['jpg' => 'JPEG'];

    /**
     * Execute the console command.
     *
     * @return int
     */
    public function handle()
    {
        $files = $this->argument('file');
        if (count($files) == 0) {
            $this->error('Missing argument: file');
            return Command::FAILURE;
        }
        if (count($files) == 1 && is_dir($files[0])) {
            return match ($this->ask('The path you provided is a directory. Do you want to read all the .jpg files in it? [Y/N]', 'Y')) {
                'y', 'Y' => $this->handleDir(),
                'n', 'N' => Command::SUCCESS
            };
        }
        $cheques = $this->handleFiles(collect($files)->map(fn ($file) => pathinfo($file))->toArray());
        dump($cheques);
        return Command::SUCCESS;
    }
    public function handleDir()
    {
        $files = [];
        $dir = new DirectoryIterator($this->argument('file')[0]);
        foreach ($dir as $file) {
            if ($file->isDot() || $file->isDir()) continue;
            $info = pathinfo($file->getPathname());
            if (isset($info['extension']) && isset($this->valid_ext[$info['extension']])) {
                $files[] = $info;
            }
        }
        return $this->handleFiles($files);
    }
    public function handleFiles(array $files)
    {
        $bar = $this->output->createProgressBar(count($files));
        $bar->start();
        $agent = ChequeAgent::make();
        $cheques = [];
        foreach ($files as $file) {
            $content = base64_encode(file_get_contents($file['dirname'] . DIRECTORY_SEPARATOR . $file['basename']));
            $message = (new UserMessage('Here is a cheque to process.'))
                ->addAttachment(new Image(
                    $content,
                    AttachmentContentType::BASE64,
                    'image/jpeg' // standard JPEG media type
                ));
            $response = $agent->structured($message);
            $cheques[] = $response;
            $bar->advance();
        }
        $bar->finish();
        return $cheques;
    }
}
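For completeness, the attachment itself is just the file's bytes base64-encoded with a JPEG media type; a tiny Python sketch of that step (illustrative only, using dummy bytes rather than a real scan):

```python
import base64

def encode_image(jpeg_bytes: bytes) -> tuple[str, str]:
    """Return (base64 payload, media type) for an image attachment.

    The payload is raw base64 with no 'data:' prefix, and the
    standard media type for JPEG is 'image/jpeg'.
    """
    return base64.b64encode(jpeg_bytes).decode("ascii"), "image/jpeg"

# Dummy bytes standing in for a scanned cheque.
b64, mime = encode_image(b"\xff\xd8\xff\xe0fake-jpeg")
```

Since the encoded attachment is identical in both setups, a difference in behavior between local and HTTPS really should come from the transport, not from this step.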