Add token length limitation to Conversation Memory class #147

@li2109

Description

Currently, the Conversation Memory class can store an arbitrary number of strings with no limit on their total length.

However, the language model has a fixed maximum prompt length, so an unbounded memory can overflow the prompt.

Feature Requirements:
Add a parameter for the maximum token length in the Memory Class. If not specified, the Memory Class should default to the current behavior with no maximum string length.

If a string added to the Memory class causes the stored messages to exceed this maximum length, the Memory class should truncate the messages to fit within the limit by removing the oldest messages first.
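A minimal sketch of the requested behavior, assuming a Python implementation. All names here (`ConversationMemory`, `max_token_length`, `add`) are hypothetical, and the whitespace-split token counter is a stand-in for whatever tokenizer the model actually uses:

```python
from collections import deque


class ConversationMemory:
    """Hypothetical sketch: conversation memory with an optional token budget."""

    def __init__(self, max_token_length=None):
        # None preserves the current behavior: unbounded memory.
        self.max_token_length = max_token_length
        self.messages = deque()

    @staticmethod
    def _count_tokens(text):
        # Stand-in tokenizer: whitespace split. A real implementation
        # should use the language model's own tokenizer here.
        return len(text.split())

    def _total_tokens(self):
        return sum(self._count_tokens(m) for m in self.messages)

    def add(self, message):
        self.messages.append(message)
        if self.max_token_length is None:
            return
        # Drop the oldest messages until the total fits the budget,
        # always keeping at least the newest message.
        while len(self.messages) > 1 and self._total_tokens() > self.max_token_length:
            self.messages.popleft()
```

With a budget of 5 tokens, adding `"one two three"` then `"four five six"` (6 tokens total) evicts the older message, leaving only the newest; with no budget, nothing is ever evicted.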
