DataStore PG Dump #191
base: canada-v2.10
Conversation
- pg_dump sql format for datastore dump.
  Conflicts: ckanext/datastore/blueprint.py (resolved).
- Datastore pg_dump endpoint.
- Add change log file.
This is an interesting idea; if someone has a local postgres, it would make it easier for them to work with the data, with the correct data types and everything. Just curious, do the column comments (the data dictionary) also get exported by dumping this way?
- Use subprocess.run for timeouts.
- Added max execution for sql dump config.
- Made sql dump pluggable.
- Used max buffer sizes for subprocess and byte chunks.
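A minimal sketch of the mechanism those commits describe, assuming hypothetical names (`dump_table_sql`, `MAX_EXECUTION_SECONDS`, `CHUNK_SIZE`); the actual implementation lives in this PR's changes to ckanext/datastore/blueprint.py:

```python
import subprocess

# Hypothetical limits; in the PR these would come from CKAN config.
MAX_EXECUTION_SECONDS = 300  # cap on how long pg_dump may run
CHUNK_SIZE = 64 * 1024       # size of byte chunks yielded to the response


def dump_table_sql(datastore_url, resource_id):
    """Run pg_dump for a single datastore table and yield byte chunks.

    Sketch only: subprocess.run enforces the execution timeout, and the
    captured output is yielded in fixed-size chunks so a web response
    can stream it back to the client.
    """
    completed = subprocess.run(
        ['pg_dump', '--dbname', datastore_url,
         '--table', resource_id, '--no-owner', '--no-privileges'],
        capture_output=True,
        timeout=MAX_EXECUTION_SECONDS,
    )
    completed.check_returncode()
    for i in range(0, len(completed.stdout), CHUNK_SIZE):
        yield completed.stdout[i:i + CHUNK_SIZE]
```

(The "pluggable" part of the commits is omitted here; the sketch only covers the timeout and chunking behaviour.)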
@wardi okay, I have done all the above changes. I am currently having some issues in our setup with full text search stuff on giant DS tables, so I am just fixing that up and will then see if the column comments get exported or not.
@wardi yup! the comments from the Data Dictionary do in fact get exported:
[screenshot of the dumped SQL showing the exported comments]
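For context, pg_dump exports column comments as standard `COMMENT ON COLUMN` statements, so a quick check against the dump (reusing the hypothetical `dump_table_sql` sketch above) could look like:

```python
# Reuses the hypothetical dump_table_sql() sketch above.
datastore_url = 'postgresql://ckan@localhost/datastore_default'  # hypothetical DSN
resource_id = '<resource-id>'  # datastore table names are resource ids

dump_sql = b''.join(dump_table_sql(datastore_url, resource_id))

# pg_dump emits column comments as SQL statements such as:
#   COMMENT ON COLUMN "<resource-id>"."price" IS 'Unit price';
assert b'COMMENT ON COLUMN' in dump_sql
```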
May not be accepted upstream. There's no …
@wardi yeah, figured I would not put this upstream. Will show this to the Data and Biz team this Friday and see what they think.
Adds a `.sql` format option to the datastore dump endpoint which streams back the contents of a customized `pg_dump` of the table. Is this useful at all? I have no idea, but the idea was to give users who wanted `datastore_search_sql` an option to get the SQL table so they can query it locally.
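A hedged usage sketch, assuming the new option hangs off the existing `/datastore/dump/<resource-id>` route with a `format=sql` parameter (the exact route and parameter name are whatever the PR's blueprint defines):

```python
import requests

# Hypothetical site and resource id.
url = 'https://ckan.example.com/datastore/dump/<resource-id>'

# Stream the pg_dump output straight to a local file.
with requests.get(url, params={'format': 'sql'}, stream=True) as resp:
    resp.raise_for_status()
    with open('table.sql', 'wb') as out:
        for chunk in resp.iter_content(chunk_size=64 * 1024):
            out.write(chunk)

# Then restore into a local postgres and query there, e.g.:
#   createdb localdata && psql -d localdata -f table.sql
```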