
[Bug]: Agent Requests Above Maximum Token Limit #2888

Closed
2 tasks done
wTaylorBickelmann opened this issue Jul 10, 2024 · 4 comments
Labels
bug Something isn't working medium effort Estimated medium effort severity:medium Affecting multiple users

@wTaylorBickelmann

Is there an existing issue for the same bug?

Describe the bug

I'm not sure whether this is better understood as a bug or a feature request, but while using OpenDevin I got the following error in the logs:

litellm.exceptions.ContextWindowExceededError: litellm.BadRequestError: litellm.ContextWindowExceededError: ContextWindowExceededError: OpenAIException - Error code: 400 - {'error': {'message': "This model's maximum context length is 8192 tokens. However, you requested 8641 tokens (4545 in the messages, 4096 in the completion). Please reduce the length of the messages or completion.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}

It seems like this could be solved by chunking or truncating the conversation whenever the context window is exceeded (and maybe that is already the intent?).
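As a rough illustration of the truncation idea, one could drop the oldest non-system messages until the prompt plus the reserved completion budget fits inside the model's context window. This is only a sketch, not OpenDevin's actual implementation, and `estimate_tokens` is a crude heuristic (~4 characters per token) rather than a real tokenizer such as tiktoken; the limits are taken from the error message above.

```python
MAX_CONTEXT = 8192        # gpt-4's context window, per the error message
COMPLETION_BUDGET = 4096  # tokens reserved for the model's reply


def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.

    A real implementation would use the model's tokenizer instead.
    """
    return max(1, len(text) // 4)


def trim_messages(messages: list[dict]) -> list[dict]:
    """Drop the oldest non-system messages until the request fits."""
    budget = MAX_CONTEXT - COMPLETION_BUDGET
    trimmed = list(messages)
    while sum(estimate_tokens(m["content"]) for m in trimmed) > budget:
        # Drop the earliest message that is not the system prompt.
        for i, m in enumerate(trimmed):
            if m["role"] != "system":
                del trimmed[i]
                break
        else:
            break  # only system messages remain; nothing more to drop
    return trimmed
```

A fancier variant would summarize ("condense") the dropped turns instead of discarding them outright, which is closer to what the linked PRs discuss.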

Current OpenDevin version

ghcr.io/opendevin/opendevin:latest (which on 7/10/24 I think would be 0.7.1)

Installation and Configuration

WORKSPACE_BASE=$(pwd)/workspace
docker run -it \
    --pull=always \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name opendevin-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/opendevin/opendevin

Model and Agent

gpt4
CodeActAgent

Operating System

WSL

Reproduction Steps

I asked it to fix a Flask program.

Logs, Errors, Screenshots, and Additional Context

error_log.txt

@wTaylorBickelmann wTaylorBickelmann added the bug Something isn't working label Jul 10, 2024
@SmartManoj
Contributor

#2021 will solve this.

@mamoodi mamoodi added the severity:medium Affecting multiple users label Jul 14, 2024
@mamoodi mamoodi added the medium effort Estimated medium effort label Jul 24, 2024
@enyst enyst self-assigned this Aug 14, 2024
@enyst enyst added this to the 2024-08 milestone Aug 14, 2024

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale Inactive for 30 days label Sep 15, 2024
@enyst enyst removed the Stale Inactive for 30 days label Sep 15, 2024

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale Inactive for 30 days label Oct 16, 2024
@enyst enyst removed the Stale Inactive for 30 days label Oct 16, 2024
@enyst
Collaborator

enyst commented Nov 14, 2024

This error should not happen anymore. #4977

We are tracking how to do this much better in other issues, so I'll close this one.

@enyst enyst closed this as completed Nov 14, 2024