Currently, when validation is triggered, Precognition sends the raw form data to be validated. This conflicts with the transformed data that is sent when the form is submitted, so validation and submission end up operating on different payloads and validation receives the wrong data.
The use case that led me to this problem: I need to stringify an array into a single input to avoid PHP's max_input_vars error in a form-data request.
const form = useForm('post', route('campaigns.store'), {
    messages: [],
    contact_ids: [],
}).transform((data) => ({
    messages: data.messages, // should not be encoded, as it contains media files
    contact_ids: JSON.stringify(data.contact_ids), // must be sent as a string to avoid exceeding max_input_vars
}));

function submit() {
    form.submit({ forceFormData: true });
}
When calling form.validate(), contact_ids is sent as an array, exceeding PHP's max_input_vars. When submitting the form, however, everything works as expected.
I looked over the inertiajs and precognition code: the transform callback is not used by Precognition, nor is it exposed from inertiajs in a way that could be monkey patched :/.
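The only workaround I can see right now is to keep the serialized value in the form state itself, so that validation and submission always send the same shape, but that pushes the JSON handling into every place the field is updated. A rough sketch, mirroring the snippet above and assuming the backend json_decodes contact_ids before validating it:

// Rough workaround: keep contact_ids as a JSON string inside the form itself,
// so validate() and submit() send the same shape.
// Assumes the backend json_decodes contact_ids before validating it.
const form = useForm('post', route('campaigns.store'), {
    messages: [],
    contact_ids: JSON.stringify([]),
});

// serialize eagerly whenever the selection changes, instead of in transform()
function setContactIds(ids) {
    form.contact_ids = JSON.stringify(ids);
}

function submit() {
    form.submit({ forceFormData: true });
}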
Do you have any suggestions for this situation? Would you be open to a pull request in both inertiajs and precognition with a proposed solution for this?
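To make the proposal concrete, this is roughly the behaviour I have in mind. Everything below is a hypothetical sketch (the function name buildValidationPayload is made up, not the packages' actual internals): the idea is simply that the precognitive request would be built from the transformed data, exactly as the submit request is.

// Hypothetical sketch only, not the packages' real code: it just illustrates
// where the transform callback would be applied before a precognitive
// validation request is sent.
function buildValidationPayload(rawData, transformCallback) {
    // apply the same transform Inertia runs on submit, so validation
    // and submission always send identical data
    return typeof transformCallback === 'function'
        ? transformCallback(rawData)
        : rawData;
}

// example: the payload validation would receive for the form above
const payload = buildValidationPayload(
    { messages: [], contact_ids: [1, 2, 3] },
    (data) => ({ ...data, contact_ids: JSON.stringify(data.contact_ids) }),
);

console.log(payload.contact_ids); // "[1,2,3]" (a single string input)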
PS: there's also a question on the Laracasts forum about the same problem.