feat: undefined json keys as defaults when using ?columns and a Prefer header #2672
Conversation
When using ?columns, JSON keys that are undefined in the payload currently end up as NULL instead of using the column defaults, which is not ideal. When omitting a column on a view INSERT:

INSERT INTO test.complex_items_view(id,name) VALUES (11, 'Eleven') RETURNING *;

 id |  name  | settings | arr_data | field-with_sep
----+--------+----------+----------+----------------
 11 | Eleven |          |          |              1

PostgreSQL correctly infers the default.
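For comparison, whether the default kicks in depends only on whether the column is mentioned in the INSERT at all. A minimal sketch against the same view (only the column names and the default of 1 come from this thread, the rest is assumed):

-- explicit NULL: the default is not applied, the row keeps NULL
INSERT INTO test.complex_items_view (id, name, "field-with_sep")
VALUES (12, 'Twelve', NULL)
RETURNING *;

-- column omitted: PostgreSQL applies the default (1) during the view insert
INSERT INTO test.complex_items_view (id, name)
VALUES (13, 'Thirteen')
RETURNING *;

PostgREST's ?columns inserts always mention every listed column, which is why keys that are undefined in the payload currently surface as NULLs rather than defaults.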
Force-pushed from 55f9854 to 0f906e8
So I was thinking that doing this (consider the request below):

http POST "localhost:3000/complex_items?columns=id,name,field-with_sep,arr_data" "Prefer: return=representation, resolution=merge-duplicates" <<JSON
[
  {"id": 3, "name": "Drei"},
  {"id": 4, "name": "Vier"},
  {"id": 5, "name": "Funf", "arr_data": null},
  {"id": 6, "name": "Sechs", "field-with_sep": 6, "arr_data": [1,2,3]}
]
JSON

would only UPDATE the … I was thinking to do this automatically, but:
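For reference, a resolution=merge-duplicates request like the one above roughly maps to an INSERT … ON CONFLICT DO UPDATE upsert. A rough sketch of what that boils down to, assuming id is the primary key of complex_items (column types assumed; this is not the literal statement PostgREST generates):

INSERT INTO test.complex_items (id, name, "field-with_sep", arr_data)
VALUES
  (3, 'Drei',  NULL, NULL),        -- keys undefined in the JSON arrive as NULL today
  (4, 'Vier',  NULL, NULL),
  (5, 'Funf',  NULL, NULL),        -- arr_data was an explicit null in the payload
  (6, 'Sechs', 6,    '{1,2,3}')
ON CONFLICT (id) DO UPDATE
SET name             = EXCLUDED.name,
    "field-with_sep" = EXCLUDED."field-with_sep",
    arr_data         = EXCLUDED.arr_data;

The open question here is whether keys that are undefined in the payload should pick up the column defaults (e.g. 1 for field-with_sep) instead of NULL.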
So I just made a pgbench test on the old insert query compared to the new one:

old

$ postgrest-with-postgresql-15 -f test/pgbench/fixtures.sql pgbench -T 30 -f test/pgbench/1567/old.sql
pgbench (15.1)
starting vacuum...pgbench: error: ERROR: relation "pgbench_branches" does not exist
pgbench: detail: (ignoring this error and continuing anyway)
pgbench: error: ERROR: relation "pgbench_tellers" does not exist
pgbench: detail: (ignoring this error and continuing anyway)
pgbench: error: ERROR: relation "pgbench_history" does not exist
pgbench: detail: (ignoring this error and continuing anyway)
end.
transaction type: test/pgbench/1567/old.sql
scaling factor: 1
query mode: simple
number of clients: 1
number of threads: 1
maximum number of tries: 1
duration: 30 s
number of transactions actually processed: 146236
number of failed transactions: 0 (0.000%)
latency average = 0.205 ms
initial connection time = 1.234 ms
tps = 4874.710773 (without initial connection time)

new results

So there's …
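For orientation, a simplified sketch of the kind of statement the two scripts exercise; the real queries live in test/pgbench/1567/ and their exact shape may differ (column types are assumed):

-- "old"-style: columns listed in ?columns are selected as-is, so a key
-- that is undefined in the JSON payload comes back as NULL
INSERT INTO test.complex_items (id, name, "field-with_sep")
SELECT id, name, "field-with_sep"
FROM json_to_recordset('[{"id": 1, "name": "One"}]'::json)
     AS _(id bigint, name text, "field-with_sep" integer);

-- "new"-style: one possible way to fall back to the column default (1 here)
-- when the key is undefined; not necessarily the exact query this PR emits
INSERT INTO test.complex_items (id, name, "field-with_sep")
SELECT (obj->>'id')::bigint,
       obj->>'name',
       CASE WHEN obj ? 'field-with_sep'
            THEN (obj->>'field-with_sep')::integer
            ELSE 1
       END
FROM jsonb_array_elements('[{"id": 1, "name": "One"}]'::jsonb) AS t(obj);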
The above is adequate, but I'm also thinking maybe we should add a new …

The above wouldn't be right if this feature is JSON-specific; it would have to be something like:

Maybe it could be added to CSV as well though, in which case the … Also, both options could be done.
Force-pushed from fa50be2 to 1688c18
So I've added a Prefer header for applying defaults.
Force-pushed from ee2bd01 to 412b414
Having to add both ?columns and the Prefer header …
Force-pushed from a1bd971 to da55594
add prefer header for applying defaults
Force-pushed from da55594 to 955d3c6
So since this feature requires … To avoid this, I'm only switching to …

To avoid complicating the interface, I've enabled this Prefer header to work without ?columns.

Also, I went ahead with the above change. Seems clearer.
Force-pushed from 0ddf96c to fa14742
Tried to appease codecov with the method mentioned on #2671; it cleared up one warning but not the other two, no idea how to solve those. Will leave it like that, since they're only record fields, which codecov doesn't understand.
Force-pushed from 56fad3d to 1560fbd
Looking good to merge now.
I agree with putting it as a Prefer header. But this header is still JSON-specific: "undefined" is a concept in JSON, I think. What about something like …
Great idea! It looks more generic. Doing that on #2723.
Closes #1567.

(field-with_sep has default 1 on the following)

ALTER VIEW test.complex_items_view ALTER COLUMN name SET DEFAULT 'Default'
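For context, a sketch of the fixture objects referenced in this PR, reconstructed from the details above; the real definitions live in the PostgREST test fixtures and will differ (types and constraints are assumed):

CREATE TABLE test.complex_items (
  id               bigint PRIMARY KEY,
  name             text,
  settings         jsonb,
  arr_data         integer[],
  "field-with_sep" integer DEFAULT 1   -- default 1, picked up by the view INSERT above
);

CREATE VIEW test.complex_items_view AS
  SELECT * FROM test.complex_items;

-- and, per the description, a default on the view's name column as well:
ALTER VIEW test.complex_items_view ALTER COLUMN name SET DEFAULT 'Default';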
Pending