After 2.4 Run Fine-Tuning Loop, I got the following error when running 2.6 Evaluate Performance (USMLE):
```
Traceback (most recent call last):
  File "llama2.py", line 450, in <module>
    accuracy = evaluate_model(model, tokenizer, dataset, "")
  File "llama2.py", line 284, in evaluate_model
    output = model.generate(input_ids, num_beams=4)
  File ". pytorch/1.13.1/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File ".local/unknown/pytorch1.13.1/lib/python3.9/site-packages/transformers/generation/utils.py", line 1665, in generate
    return self.beam_sample(
  File ".local/unknown/pytorch1.13.1/lib/python3.9/site-packages/transformers/generation/utils.py", line 3309, in beam_sample
    next_tokens = torch.multinomial(probs, num_samples=2 * num_beams)
RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
```
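For context, this error is raised by `torch.multinomial` when the next-token probability distribution already contains `nan`/`inf` (often the result of fp16 overflow in the logits after fine-tuning). The `beam_sample` frame suggests `do_sample=True` is set in the model's generation config, since plain beam search would not sample. Below is a minimal sketch of the failure mode and a workaround; the `safe_probs` helper is hypothetical (not part of `transformers`), and sanitizing probabilities only papers over the crash rather than fixing the underlying NaN logits:

```python
import torch

def safe_probs(probs: torch.Tensor) -> torch.Tensor:
    """Replace inf/nan entries and renormalize so torch.multinomial
    accepts the tensor. Diagnostic workaround only; it does not fix
    the fp16 overflow that produced the bad values."""
    # zero out nan/inf, then clamp away any negative entries
    probs = torch.nan_to_num(probs, nan=0.0, posinf=0.0, neginf=0.0)
    probs = probs.clamp(min=0.0)
    # rows that became all-zero fall back to a uniform distribution
    row_sums = probs.sum(dim=-1, keepdim=True)
    uniform = torch.full_like(probs, 1.0 / probs.shape[-1])
    return torch.where(row_sums > 0, probs / row_sums.clamp(min=1e-12), uniform)

# a probability row containing nan, like the one that crashes beam_sample
bad = torch.tensor([[0.5, float("nan"), 0.25, 0.25]])
clean = safe_probs(bad)
tokens = torch.multinomial(clean, num_samples=2)  # no longer raises
```

In practice the more common remedies are loading the model in full precision (`torch_dtype=torch.float32`) or passing `do_sample=False` to `generate`, so the `beam_sample` path is never entered.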