tips: Working around 'config replace' madness #32

wdoekes opened this issue Nov 14, 2024 · 1 comment
Labels
purely-informational This contains logs but not an issue per se

wdoekes commented Nov 14, 2024

Description

Let's say I want to remove a Loopback0 IP while swss/orchagent/syncd are down.

We cannot do:

config loopback remove Loopback0 1.2.3.4/32

because that command needs to talk to those backends, and they are down.

Instead, we'll want to edit /etc/sonic/config_db.json, remove the offending config, and then replace the entire config.
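The JSON edit itself can be scripted. A minimal Python sketch, assuming the stock SONiC layout where Loopback IPs live in the LOOPBACK_INTERFACE table under keys like "Loopback0|10.1.0.1/32" (the names here are illustrative, not taken from a real device):

```python
import json

def drop_loopback_ips(config, iface="Loopback0"):
    """Remove all IP entries for `iface` from the LOOPBACK_INTERFACE table,
    keeping the bare interface entry (if present) intact."""
    table = config.get("LOOPBACK_INTERFACE", {})
    for key in [k for k in table if k.startswith(iface + "|")]:
        del table[key]
    return config

# Toy stand-in for a parsed /etc/sonic/config_db.json:
config = {"LOOPBACK_INTERFACE": {"Loopback0": {},
                                 "Loopback0|10.1.0.1/32": {}}}
print(sorted(drop_loopback_ips(config)["LOOPBACK_INTERFACE"]))  # ['Loopback0']
```

In practice you would `json.load()` the real file, drop the keys, and `json.dump()` it back before running `config replace`.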

However, we run into madness like this:

# config replace -v /etc/sonic/config_db.json
...
Error: Given patch will produce invalid config. Error: Data Loading Failed
All Keys are not parsed in LOGGER
dict_keys(['xcvrd'])
exceptionList:["'require_manual_refresh'"]

Removing that from config_db.json is insufficient: the same value is stored in a backend too, and validation fails on that copy as well.

Actions:

# redis-cli -n 0 keys '*' | sort -k1
COPP_TABLE:default
COPP_TABLE:queue1_group1
COPP_TABLE:queue4_group1
...
TUNNEL_DECAP_TABLE:IPINIP_TUNNEL
TUNNEL_DECAP_TERM_TABLE:IPINIP_TUNNEL:10.1.0.1
TUNNEL_DECAP_TERM_TABLE:IPINIP_TUNNEL:192.168.0.1
TUNNEL_DECAP_TERM_TABLE:IPINIP_TUNNEL:192.168.8.1
...

^- not sure what this is

# redis-cli -n 0 'hgetall' 'TUNNEL_DECAP_TERM_TABLE:IPINIP_TUNNEL:192.168.8.1'
# redis-cli -n 0 'hdel' 'TUNNEL_DECAP_TERM_TABLE:IPINIP_TUNNEL:192.168.8.1' 'term_type'
...

^- this is an option

# redis-cli -n 4 keys '*' | sort
AUTO_TECHSUPPORT_FEATURE|bgp
AUTO_TECHSUPPORT_FEATURE|database
AUTO_TECHSUPPORT_FEATURE|dhcp_relay
...
LOGGER|xcvrd
LOOPBACK_INTERFACE|Loopback0
...

^- here is the offending config

# redis-cli -n 4 del 'LOGGER|xcvrd'

Now we can finally replace the config:

# config replace -v /etc/sonic/config_db.json
...
Config Replacer: Generating patch between target config and current config db.
Config Replacer: Generated patch: [{"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.1~132"}, {"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.4~132"}, {"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.2~132"}, {"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.3~132"}].
Config Replacer: Applying patch using 'Patch Applier'.
Patch Applier: localhost: Patch application starting.
Patch Applier: localhost: Patch: [{"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.1~132"}, {"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.4~132"}, {"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.2~132"}, {"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.3~132"}]
Patch Applier: localhost getting current config db.
...
Patch Applier: localhost: applying 4 changes in order:
Patch Applier:   * [{"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.1~132"}]
Patch Applier:   * [{"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.2~132"}]
Patch Applier:   * [{"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.3~132"}]
Patch Applier:   * [{"op": "remove", "path": "/LOOPBACK_INTERFACE/Loopback0|10.1.0.4~132"}]
Patch Applier: localhost: verifying patch updates are reflected on ConfigDB.
Patch Applier: localhost patch application completed.
Config Replacer: Verifying config replacement is reflected on ConfigDB.
Config Replacer: Config replacement completed.
Config replaced successfully.

Which build are we running (if any)

SONiC-OS-ossomain.0-41ea968fc (2024-11-12)

Upstream issues/PRs

wdoekes commented Nov 14, 2024

Extra info in /var/run/redis/sonic-db/database_config.json:

    "DATABASES": {
        "APPL_DB": {
            "id": 0,
            "separator": ":",
            "instance": "redis2"
        },
        "ASIC_DB": {
            "id": 1,
            "separator": ":",
            "instance": "redis3"
        },
        "COUNTERS_DB": {
            "id": 2,
            "separator": ":",
            "instance": "redis6"
        },
        "LOGLEVEL_DB": {
            "id": 3,
            "separator": ":",
            "instance": "redis"
        },
        "CONFIG_DB": {
            "id": 4,
            "separator": "|",
            "instance": "redis"
        },
        "PFC_WD_DB": {
            "id": 5,
            "separator": ":",
            "instance": "redis"
        },
        "FLEX_COUNTER_DB": {
            "id": 5,
            "separator": ":",
            "instance": "redis"
        },
        "STATE_DB": {
            "id": 6,
            "separator": "|",
            "instance": "redis"
        },
        "SNMP_OVERLAY_DB": {
            "id": 7,
            "separator": "|",
            "instance": "redis"
        },
        "ERROR_DB": {
            "id": 8,
            "separator": ":",
            "instance": "redis"
        },
        "RESTAPI_DB": {
            "id": 9,
            "separator": "|",
            "instance": "redis"
        },
        "GB_ASIC_DB": {
            "id": 10,
            "separator": "|",
            "instance": "redis"
        },
        "GB_COUNTERS_DB": {
            "id": 11,
            "separator": "|",
            "instance": "redis"
        },
        "GB_FLEX_COUNTER_DB": {
            "id": 12,
            "separator": "|",
            "instance": "redis"
        },
        "EVENT_DB": {
            "id": 15,
            "separator": "|",
            "instance": "redis4"
        }

That explains why we're in database 4 (CONFIG_DB, with the | separator).
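A quick way to cross-check which -n number and separator a given table needs is to parse that file. A sketch, inlining a fragment of the mapping shown above instead of reading /var/run/redis/sonic-db/database_config.json:

```python
import json

# Abridged copy of the DATABASES mapping from database_config.json:
db_config = json.loads("""
{"DATABASES": {
    "APPL_DB":   {"id": 0, "separator": ":", "instance": "redis2"},
    "CONFIG_DB": {"id": 4, "separator": "|", "instance": "redis"}
}}
""")

cfg = db_config["DATABASES"]["CONFIG_DB"]
# Build the key exactly as it appears in redis-cli output:
key = cfg["separator"].join(["LOGGER", "xcvrd"])
print(cfg["id"], key)  # 4 LOGGER|xcvrd
```

The same lookup gives the ':' separator for APPL_DB keys like TUNNEL_DECAP_TERM_TABLE:IPINIP_TUNNEL:192.168.8.1.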

wdoekes added the purely-informational label Nov 19, 2024