orm: add bulk insert/update support with CASE WHEN batch updates, MySQL time conversion, and checker/codegen validation #27132
Jengro777 wants to merge 4 commits into vlang:master
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: d7ba71c209
```v
mut batch := fill_data_with_struct[T](values[0], qb.meta)
batch.batch_rows = values.len
```
Preserve per-row default handling in insert_many
insert_many now unconditionally batches by setting batch_rows, which changes behavior for tables with serial/default columns: rows that previously omitted the auto field on a per-row basis are now forced through one batched shape, so a mix of explicit and zero IDs no longer preserves defaults and can write literal zeros that break default handling. This regresses the semantics of the previous per-row loop; insert_many should fall back to row-by-row inserts when auto/default fields are present (as the compiler SQL bulk path already does).
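The suggested fallback can be sketched as follows (a minimal Python illustration, not the actual V implementation; the `has_auto_or_default_fields` flag is a hypothetical stand-in for whatever the table metadata exposes):

```python
def plan_insert(rows, has_auto_or_default_fields):
    # Batch only when every row can share one SQL shape. When the table has
    # serial/default columns, fall back to per-row inserts so each row keeps
    # its own treatment of those columns (hypothetical planning helper).
    if has_auto_or_default_fields:
        return [('single', row) for row in rows]  # per-row: defaults preserved
    return [('batch', rows)]                      # one multi-row INSERT

# A mix of zero and explicit IDs with a serial column must not be forced
# into one batched shape:
plan = plan_insert([{'id': 0}, {'id': 7}], has_auto_or_default_fields=True)
```

This mirrors the decision the compiler SQL bulk path already makes, applied to the function-call API.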
```v
if sql_dialect in [.sqlite, .pg, .h2] && are_values_empty {
	str += 'DEFAULT VALUES'
}
```
Generate all rows for empty-field batch inserts
In the batch insert path, when all insert fields are skipped and batch_rows > 1, sqlite/pg/h2 still emit a single DEFAULT VALUES statement, so only one row is inserted instead of the requested batch size. Any caller that batches records containing only default/serial columns will silently lose rows; the SQL generation needs to account for row_count in this branch (or execute per-row statements).
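One way to account for the row count in this branch is to execute the statement once per row, since a multi-row all-defaults insert is not portable across these dialects. A hedged Python sketch of that fix (illustrative only, not the actual generator):

```python
def default_values_statements(table, batch_rows):
    # When every insert field is skipped, emit one DEFAULT VALUES statement
    # per requested row; a single statement would insert only one row and
    # silently drop the rest of the batch.
    return [f'INSERT INTO "{table}" DEFAULT VALUES;'] * max(batch_rows, 1)
```

The alternative of generating one multi-row statement would need dialect-specific syntax, which is why per-row execution is the simpler repair here.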
The failing CI check doesn't seem to be related to this PR.
fix #26993
Summary
Add bulk (batch) INSERT and UPDATE support to the V ORM, enabling efficient multi-row operations through a single SQL statement instead of individual per-row round trips.
Both the compiler-level `sql { }` syntax and the function-call API (`QueryBuilder`) are supported.

Changes
- `vlib/orm/orm.v` — Core batch SQL generation
  - `QueryData`: add `batch_rows` and `batch_key` fields for batch operations
  - `orm_stmt_gen` (`.insert`): generate multi-value `VALUES (?, ?), (?, ?)` tuples when `batch_rows > 0`
  - `orm_stmt_gen` (`.update`): generate `CASE "key" WHEN ? THEN ? ... ELSE "col" END` expressions for batch updates when `batch_rows > 0`
  - `prepare_insert_query_data`: handle the batch data layout (interleaved row-by-row), filter the `types`/`kinds`/`auto_fields` arrays correctly, and auto-skip serial fields only when all batch rows have zero values
  - `clone_query_data`: propagate `parentheses`, `batch_rows`, `batch_key`
- `vlib/orm/orm_func.v` — Function-call API
  - `insert_many` (`QueryBuilder`): changed from N individual `conn.insert()` calls to a single batch INSERT with interleaved data and `batch_rows` set
  - `update_many` (new standalone function): batch UPDATE using `CASE WHEN` with a key field, accepting `field_names` to select which columns to update and `key_field` as the matching column; the WHERE clause uses `IN`
- `vlib/orm/orm_fn_test.v` — SQL generation unit tests
  - `test_orm_stmt_gen_bulk_insert`: verify multi-value tuple generation for the default, pg, and mysql dialects
  - `test_orm_stmt_gen_bulk_update`: verify `CASE WHEN` batch UPDATE generation for all three dialects
  - `test_orm_stmt_gen_insert_default_values_mysql`: verify batch `() VALUES (), ()` for MySQL with auto-fields
- `vlib/orm/orm_func_test.v` — Integration tests
  - `test_orm_func_update_many`: end-to-end SQLite test for `update_many` with a 2-row batch, a single-row batch, and the empty-values error
- `vlib/v/ast/ast.v` — AST node fields
  - `SqlStmtLine`: add `is_array_insert`, `is_array_update`, `array_update_var`, `array_update_key`
- `vlib/v/checker/orm.v` — Compile-time validation of `sql { insert array into T }` and `sql { update T set ... = array.field where key == array.key }`
- `vlib/v/gen/c/orm.v` — C codegen for bulk operations
  - `write_orm_bulk_insert`: for non-serial fields, build a single `QueryData` with `batch_rows` and call `conn.insert()` once; for tables with serial fields, fall back to per-row inserts so each row gets its own auto-generated ID
  - `write_orm_bulk_update`: build per-column, per-row key+value data for `CASE WHEN`, with OR-connected WHERE conditions
- `vlib/db/mysql/orm.c.v` — MySQL backend fix
  - `convert_query_data_to_primitives`: use `i % data.fields.len` to correctly map batch data elements back to field names for `time.Time` conversion
  - propagate all `QueryData` fields (including `batch_rows`, `batch_key`, `parentheses`, `auto_fields`)
- `vlib/v/tests/orm_bulk_insert_update_test.v` — End-to-end compiler tests
  - `test_orm_bulk_insert_and_update`: insert 2 rows, verify, update both with `CASE WHEN`, verify
  - `test_orm_bulk_insert_preserves_all_default_rows`: insert 3 default rows, verify all get distinct IDs
  - `test_orm_bulk_insert_with_mixed_serial_values_keeps_defaults`: mix zero and explicit ID values in a batch
  - `test_orm_bulk_update_with_renamed_column`: batch update with a column renamed via `@[sql: 'display_name_text']`
- `vlib/v/checker/tests/orm_bulk_pointer_array_error.vv` — Checker error test: reject arrays of pointers (`[&T]`) in bulk insert/update

Testing
- `orm_bulk_insert_update_test.v` (4 test functions) passes
- `vlib/v/compiler_errors_test.v`: 1565 passed, 5 skipped
- `vlib/v/gen/c/coutput_test.v`: 95/95 `.out` files match, 117/117 `.c.must_have` patterns match
- `update_many` API
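The MySQL backend fix described above can be illustrated with a small sketch of the interleaved batch layout (Python, illustrative only; the actual code lives in `convert_query_data_to_primitives`):

```python
def field_for_index(fields, i):
    # Batch data is interleaved row by row: [r0.f0, r0.f1, r1.f0, r1.f1, ...],
    # so element i belongs to field i % len(fields). Without the modulo, field
    # lookup runs past the field list after the first row, which is what broke
    # time.Time conversion for batched MySQL inserts.
    return fields[i % len(fields)]

fields = ['name', 'created_at']
data = ['a', 't0', 'b', 't1']  # two interleaved rows
mapped = [field_for_index(fields, i) for i in range(len(data))]
```

Here `mapped` pairs every data element with its column name, so dialect-specific conversions (such as `time.Time`) can be applied per field across all rows.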