
Update openai.py #536

Open

MrEchoFi wants to merge 1 commit into langchain-ai:main from MrEchoFi:patch-1

Conversation

@MrEchoFi

  • PR title: fix(community): mask openai_api_key in ChatOpenAI.__repr__ to prevent leakage

  • PR message:

    • Description: This PR prevents accidental leakage of OpenAI API keys when printing or logging
      ChatOpenAI instances by overriding __repr__ to mask the openai_api_key
      field.

This affects common usage patterns such as:

  • print(llm)

  • logging model objects (see the logging sketch after this list)

  • debugging in notebooks or error traces
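
As a concrete illustration of the logging path, here is a minimal sketch (not part of this PR) assuming the current, unpatched repr and a fake key:

    import logging
    from langchain_community.chat_models import ChatOpenAI

    logging.basicConfig(level=logging.INFO)
    llm = ChatOpenAI(api_key="FAKEKEY123", model="gpt-4o")
    # %r interpolation calls repr() on the model, so the unmasked key
    # ends up verbatim in the log record
    logging.info("initialized model: %r", llm)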

  • Lint and test:

  • Reproduction with the current code:

from langchain_community.chat_models import ChatOpenAI

llm = ChatOpenAI(api_key="FAKEKEY123", model="gpt-4o")
print(llm)

  • Current version output:
client=<openai.resources.chat.completions.completions.Completions object at 0x7593c05cee40> async_client=<openai.resources.chat.completions.completions.AsyncCompletions object at 0x7593c05cf8c0> model_name='gpt-4o' model_kwargs={} openai_api_key='FAKEKEY123' openai_proxy=''
  • After the fix (proposed __repr__ override):

    def __repr__(self) -> str:
        """Safe string representation that avoids leaking API keys."""
        masked_key = "***" if self.openai_api_key else None
        return (
            f"ChatOpenAI("
            f"model_name={self.model_name!r}, "
            f"openai_api_key={masked_key!r}, "
            f"openai_proxy={self.openai_proxy!r})"
        )
    


  • Output after the fix:

ChatOpenAI(model_name='gpt-4o', openai_api_key='***', openai_proxy='')
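
A hypothetical pytest-style check (not part of this PR) of the masked repr shown above; it assumes the override has been applied and that constructing ChatOpenAI with a fake key makes no API call:

    from langchain_community.chat_models import ChatOpenAI

    def test_repr_masks_api_key():
        llm = ChatOpenAI(api_key="FAKEKEY123", model="gpt-4o")
        text = repr(llm)
        assert "FAKEKEY123" not in text            # the raw key must never appear
        assert "openai_api_key='***'" in text      # the masked placeholder is shown instead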

