
When should we set example_prefix to True? And what is the difference between putting ICL examples into the system prompt versus a multi-turn user-assistant chat? #49

Description

@peter-peng-w

I understand that example_prefix is used during ICL so that we can put examples into the system prompt, especially when we are using GPT-4. However, I have several questions regarding this feature:

  1. What is the difference between putting ICL examples into the system prompt versus putting them into a multi-turn conversation between user and assistant after the system prompt? (See the first sketch after this list for the two layouts I have in mind.)
  2. Is it recommended only for GPT-4, or for both GPT-4 and GPT-3.5? I found some discussion about how GPT-4 and GPT-3.5 differ in how they handle system messages with different name attributes; what do the authors suggest regarding this issue?
  3. If my understanding is correct, when using open-source models such as Llama-2 we shouldn't set example_prefix to True, as this will cause issues with the prompt. More concretely, when calling the HFChat class in src/dt/chat.py at line 395, if we use example_prefix we will construct multiple system prompts (two per ICL example, with the names example_user and example_assistant). However, at line 395 the implementation seems to overwrite the previous system prompt with the latest one (i.e., conv.system) rather than concatenating the system prompts together, which will cause problems when loading multiple system prompts (see the second sketch after this list).
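For concreteness, here is a minimal sketch of the two layouts I have in mind for question 1. The task description, demo strings, and message dicts are illustrative placeholders, not the repository's exact construction:

```python
# A minimal sketch (not the repository's exact code) contrasting the two ways
# the ICL demonstrations can be packed into an OpenAI-style chat request.
task_desc = "Classify the sentiment of the sentence as positive or negative."
demos = [("I love this movie.", "positive"), ("The food was terrible.", "negative")]
query = "The service was wonderful."

# (a) example_prefix=True: demonstrations live in the system section,
#     tagged with the special names `example_user` / `example_assistant`.
messages_prefix = [{"role": "system", "content": task_desc}]
for q, a in demos:
    messages_prefix.append({"role": "system", "name": "example_user", "content": q})
    messages_prefix.append({"role": "system", "name": "example_assistant", "content": a})
messages_prefix.append({"role": "user", "content": query})

# (b) example_prefix=False: demonstrations are replayed as ordinary
#     multi-turn user/assistant exchanges after the system prompt.
messages_chat = [{"role": "system", "content": task_desc}]
for q, a in demos:
    messages_chat.append({"role": "user", "content": q})
    messages_chat.append({"role": "assistant", "content": a})
messages_chat.append({"role": "user", "content": query})
```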
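And for question 3, here is a minimal, self-contained sketch of the overwrite-vs-concatenate behavior I am worried about, assuming a FastChat-style conversation object whose system prompt is a plain string. The Conv class and messages below are hypothetical stand-ins, not the actual code at src/dt/chat.py line 395:

```python
# A hypothetical stand-in for a FastChat-style conversation object whose
# system prompt is a single string field.
class Conv:
    def __init__(self):
        self.system = ""

system_messages = [
    "You are a helpful assistant.",
    "example_user: I love this movie.",
    "example_assistant: positive",
]

# Overwrite (what I believe happens now): each assignment to conv.system
# replaces the previous one, so only the last demonstration survives.
conv = Conv()
for msg in system_messages:
    conv.system = msg
print(conv.system)  # -> "example_assistant: positive"

# Concatenate (what I would expect with example_prefix): the task
# description and every demonstration are kept.
conv = Conv()
for msg in system_messages:
    conv.system = (conv.system + "\n" + msg) if conv.system else msg
print(conv.system)  # -> all three lines joined together
```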

Please help me verify my understanding here. Thanks!
