
Beam Search: Error in attn_decoder_input_fn in concat statement #8


Description

@ravibansal

https://github.com/JayParks/tf-seq2seq/blob/master/seq2seq_model.py#L368
The concat at this line fails with an error that dimension 0 of inputs and attention does not match (since the attention tensors are tiled with tile_batch to batch_size * beam_width). Didn't you run into this error when running with beam_search?
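For context, here is a minimal sketch of where the batch_size * beam_width leading dimension comes from and where the concat happens. It assumes the usual TF 1.x contrib beam-search setup; variable names such as encoder_outputs, encoder_last_state, and hidden_units are placeholders for illustration, not the exact ones in seq2seq_model.py:

```python
import tensorflow as tf

batch_size, max_time, hidden_units, beam_width = 16, 20, 128, 5

# Hypothetical encoder tensors; stand-ins for the real encoder outputs/state.
encoder_outputs = tf.placeholder(tf.float32, [None, max_time, hidden_units])
encoder_last_state = tf.placeholder(tf.float32, [None, hidden_units])
encoder_inputs_length = tf.placeholder(tf.int32, [None])

# For beam search, everything the attention mechanism reads is tiled so that
# its leading dimension becomes batch_size * beam_width.
tiled_outputs = tf.contrib.seq2seq.tile_batch(encoder_outputs, multiplier=beam_width)
tiled_state = tf.contrib.seq2seq.tile_batch(encoder_last_state, multiplier=beam_width)
tiled_lengths = tf.contrib.seq2seq.tile_batch(encoder_inputs_length, multiplier=beam_width)

attention_mechanism = tf.contrib.seq2seq.BahdanauAttention(
    num_units=hidden_units,
    memory=tiled_outputs,
    memory_sequence_length=tiled_lengths)

input_layer = tf.layers.Dense(hidden_units, name='attn_input_feeding')

def attn_decoder_input_fn(inputs, attention):
    # The concat in question: if `inputs` still has leading dimension
    # batch_size while `attention` has batch_size * beam_width (or vice
    # versa), tf.concat raises the dimension-0 mismatch described above.
    return input_layer(tf.concat([inputs, attention], axis=-1))

decoder_cell = tf.contrib.seq2seq.AttentionWrapper(
    cell=tf.contrib.rnn.GRUCell(hidden_units),
    attention_mechanism=attention_mechanism,
    attention_layer_size=hidden_units,
    cell_input_fn=attn_decoder_input_fn,
    initial_cell_state=tiled_state)
```

With a setup like this, the decoder's initial state also has to be built with the tiled batch size (e.g. decoder_cell.zero_state(batch_size * beam_width, tf.float32)); otherwise the same mismatch shows up inside the cell_input_fn.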
