Support for platform fap buffering command #4918

Open
dgonzalez85 opened this issue Jan 23, 2025 · 0 comments
Labels
type: enhancement New feature or request

Enhancement summary

We are looking to replicate an AI deployment configuration.

By default, R3 platforms have buffers balanced across both multicast and unicast queues. Multicast is not in use, so the egress buffer profile is changed to "unicast" to adjust the buffers in favour of unicast traffic. This is done with:

platform fap buffering egress profile unicast

(config)#platform fap buffering egress profile ?
  balanced  equal unicast and multicast buffers
  unicast   increase unicast buffers

However, this command is not currently supported in eos_cli_config_gen:
https://avd.arista.com/5.1/roles/eos_cli_config_gen/docs/input-variables.html#platform

Which component of AVD is impacted

eos_cli_config_gen

Use case example

AI deployments that require adjusting buffers in favour of unicast traffic on R3 platforms.

Describe the solution you would like

Add support for configuring this command in the platform section.
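
For example, the option could be nested under platform.sand, since the fap buffering command applies to the Sand/FAP-based R-series platforms. The buffering / egress_profile key names below are only a suggestion and are not part of the current schema:

platform:
  sand:
    # Suggested keys only - not in the current AVD schema
    buffering:
      egress_profile: unicast   # balanced | unicast

This would render the EOS CLI line shown above.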

Describe alternatives you have considered

custom_structured_configuration_eos_cli: |
  platform fap buffering egress profile unicast
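
If I understand the behaviour correctly, this workaround renders the raw line through eos_cli at the end of the generated configuration, but it bypasses the schema validation that a native key under platform would provide.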

Additional context

No response

Contributing Guide

  • I agree to follow this project's Code of Conduct
dgonzalez85 added the type: enhancement (New feature or request) label on Jan 23, 2025