
Hi @mo3az-14,

What happens if you use some_tensor.argmax() instead of torch.argmax()? Have you tried that?

For example:

# Assumes epochs, model_4, X_blob_train, y_blob_train, loss_fn, optimizer and
# accuracy_fn are all defined earlier in the notebook
for epoch in range(epochs):
    ### Training
    model_4.train()

    # 1. Forward pass
    y_logits = model_4(X_blob_train) # model outputs raw logits
    y_pred = torch.softmax(y_logits, dim=1).argmax(dim=1) # go from logits -> prediction probabilities -> prediction labels
    # print(y_logits)

    # 2. Calculate loss and accuracy
    loss = loss_fn(y_logits, y_blob_train)
    acc = accuracy_fn(y_true=y_blob_train,
                      y_pred=y_pred)

    # 3. Zero the optimizer gradients
    optimizer.zero_grad()

    # 4. Backpropagate the loss
    loss.backward()

    # 5. Step the optimizer
    optimizer.step()
I just ran through all of the code in notebook 02 and it functions as expected: https://github.com/mrdbourke/pytorch-d…
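
In case it helps with debugging, accuracy_fn isn't defined in the snippet above. A minimal sketch of one way to write it (my own assumption, not necessarily the exact helper from the course repo):

import torch

def accuracy_fn(y_true: torch.Tensor, y_pred: torch.Tensor) -> float:
    # Compare predicted labels against ground-truth labels and return accuracy as a percentage
    correct = torch.eq(y_true, y_pred).sum().item()
    return (correct / len(y_pred)) * 100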

Answer selected by mrdbourke