Mismatched error message when exceeding buffer size #109

Closed
nwoolmer opened this issue Jun 4, 2025 · 2 comments · Fixed by #110
Labels
bug Something isn't working

Comments

@nwoolmer
Contributor

nwoolmer commented Jun 4, 2025

repro:

import pandas as pd
import questdb.ingress
from questdb.ingress import Sender, ServerTimestamp

header = ["x", "y"]
x = list(range(10000))
y = list(range(10000))


df = pd.DataFrame(zip(x, y), columns=header)

with Sender.from_conf("http::addr=localhost:9000;auto_flush_rows=1000;max_buf_size=1024;") as sender:
    sender.dataframe(df, table_name='test_df', at=ServerTimestamp)
    sender.flush()

Error:

Traceback (most recent call last):
  File "src/questdb/dataframe.pxi", line 2365, in questdb.ingress._dataframe
  File "src/questdb/dataframe.pxi", line 2244, in questdb.ingress._dataframe_handle_auto_flush
questdb.ingress.IngressError: Could not flush buffer: Buffer size of 21780 exceeds maximum configured allowed size of 1024 bytes. - See https://py-questdb-client.readthedocs.io/en/v2.0.3/troubleshooting.html#inspecting-and-debugging-errors#flush-failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/nick/PycharmProjects/questdb-scratch/df_scratch.py", line 17, in <module>
    sender.dataframe(df, table_name='test_df', at=ServerTimestamp)
  File "src/questdb/ingress.pyx", line 2403, in questdb.ingress.Sender.dataframe
  File "src/questdb/dataframe.pxi", line 2396, in questdb.ingress._dataframe
  File "src/questdb/dataframe.pxi", line 2375, in questdb.ingress._dataframe
questdb.ingress.IngressError: Bad dataframe row at index 999: All values are nulls. Ensure at least one column is not null.
nwoolmer added the bug (Something isn't working) label on Jun 4, 2025
@kafka1991
Collaborator

kafka1991 commented Jun 5, 2025

This is because the configuration sets auto_flush_rows=1000 while also setting max_buf_size=1024 (in bytes).

When the auto-flush rule of 1000 rows is triggered, the buffered data (about 21,780 bytes in this repro, roughly 22 bytes per row) clearly exceeds the 1024-byte limit.

In this case, the max_buf_size setting can be removed from the configuration string:

Sender.from_conf("http::addr=localhost:9000;auto_flush_rows=1000;") as sender:

If you instead want a flush to trigger at 1024 bytes, use auto_flush_bytes:

Sender.from_conf("http::addr=localhost:9000;auto_flush_bytes=1000;") as sender:

The error message here is incorrect and is being fixed.
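
For illustration, a minimal sketch (assuming the same local QuestDB endpoint as the repro) where the byte-based flush threshold stays well below max_buf_size, so the hard buffer limit is never reached:

import pandas as pd
from questdb.ingress import Sender, ServerTimestamp

df = pd.DataFrame({"x": range(10000), "y": range(10000)})

# Flush whenever the buffer reaches 1024 bytes; this is far below the
# default max_buf_size (100 MiB), so the hard limit is never hit.
conf = "http::addr=localhost:9000;auto_flush_bytes=1024;"
with Sender.from_conf(conf) as sender:
    sender.dataframe(df, table_name='test_df', at=ServerTimestamp)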

@nwoolmer
Contributor Author

nwoolmer commented Jun 5, 2025

I am not sure max_buf_size can be removed (except perhaps by setting it to 0, I don't recall). We default it to 100 MB, and it serves a different purpose from auto_flush_bytes.

Though this example was artificial, it is easily triggered with the default config when using a wide dataframe (140 columns, long strings).

Thanks for taking a look!
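
For illustration, a sketch of keeping max_buf_size but raising it to accommodate a wide dataframe (the table and column names are hypothetical and the 200 MiB limit is only an illustrative value):

import pandas as pd
from questdb.ingress import Sender, ServerTimestamp

# A wide frame similar to the one described: 140 string columns.
wide_df = pd.DataFrame(
    {f"col_{i}": ["a fairly long string value"] * 1000 for i in range(140)}
)

# Keep the safety limit, but raise it above what 1000 wide rows occupy.
conf = (
    "http::addr=localhost:9000;"
    "auto_flush_rows=1000;"
    "max_buf_size=209715200;"  # 200 MiB
)
with Sender.from_conf(conf) as sender:
    sender.dataframe(wide_df, table_name='wide_table', at=ServerTimestamp)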
