Remove serde_json arbitrary precision feature #691
Comments
Hi @trevor-crypto! Unfortunately, a Plutus data integer can be up to 64 bytes (not bits) in size, and serde can't serialize values bigger than a u64 (8 bytes) without this feature. If you just have a serde_json version conflict, you can override the serde version for a specific package in your project's .toml file.
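(Editor's note: to make the size point concrete, here is a minimal sketch, not from the thread, of round-tripping a JSON integer wider than u64 through serde_json::Value. The value is just an illustrative 128-bit number, i.e. u128::MAX; Plutus data integers can be larger still.)

```rust
use serde_json::Value;

fn main() {
    // Illustrative integer wider than u64::MAX (this is u128::MAX, 16 bytes;
    // Plutus data integers can be even larger).
    let json = r#"{"int":340282366920938463463374607431768211455}"#;

    let value: Value = serde_json::from_str(json).expect("valid JSON");
    let round_trip = value.to_string();

    println!("original:   {}", json);
    println!("round trip: {}", round_trip);

    // With serde_json's `arbitrary_precision` feature the digits are kept
    // verbatim and this assertion holds; without it the out-of-range integer
    // is parsed into an f64 and the round trip loses precision.
    assert_eq!(json, round_trip);
}
```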
There's no way to do this, because when you're using a workspace the feature is enabled across the whole workspace, so every dependency (and its dependencies) gets serde_json compiled with arbitrary_precision.
@lisicky Can you explain this more, or show me a test that checks this? It should be possible with your own deserializer/serializer that handles a string <-> BigInt conversion.
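(Editor's note: the string-based approach suggested above might look roughly like the sketch below, using num_bigint::BigInt as a hypothetical stand-in for the library's own BigInt type. As the follow-up comments explain, this route was not adopted because the JSON representation has to stay a Number rather than a String.)

```rust
// Sketch only: assumes the `serde` (with derive) and `num-bigint` crates.
use num_bigint::BigInt;
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct PlutusInt {
    // Route this field through the custom string-based (de)serializer below.
    #[serde(with = "bigint_as_string")]
    value: BigInt,
}

mod bigint_as_string {
    use super::BigInt;
    use serde::{de::Error, Deserialize, Deserializer, Serializer};

    pub fn serialize<S: Serializer>(v: &BigInt, s: S) -> Result<S::Ok, S::Error> {
        // Emit the integer as a JSON string, e.g. "340282366920938463463374607431768211455".
        s.serialize_str(&v.to_string())
    }

    pub fn deserialize<'de, D: Deserializer<'de>>(d: D) -> Result<BigInt, D::Error> {
        // Parse the JSON string back into a BigInt, surfacing parse failures as serde errors.
        let raw = String::deserialize(d)?;
        raw.parse::<BigInt>().map_err(Error::custom)
    }
}
```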
@trevor-crypto I have pushed a test: 85398ed
What do you mean? We do not implement our own serializer for the Rust structure <-> JSON conversion; we use serde for it.
We will add an "arbitrary-precision-json" feature to our crate (you can see it in the test). It enables "arbitrary_precision" for serde_json. It is a default feature, but you can simply disable it in your project.
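(Editor's note: a hedged sketch of the Cargo.toml change being described; the feature and version should be checked against the released crate before relying on them.)

```toml
# Cargo.toml of the downstream project (sketch only).
[dependencies]
# Opt out of the crate's default "arbitrary-precision-json" feature so that
# serde_json is not compiled with `arbitrary_precision` across the workspace.
# Pin a real version instead of "*".
cardano-serialization-lib = { version = "*", default-features = false }
serde_json = "1"
```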
@lisicky I understand now. So the BigInt cannot be a String and must be a Number in JSON format. That fix works for me. Thanks!
Is it possible to revert or modify the changes in this commit so that the serde_json arbitrary_precision feature is not enabled? This can cause problems when using cardano-serialization-lib alongside other crates that also depend on serde_json but do not need arbitrary_precision. The Rust feature resolver cannot compile two separate versions of serde_json, one with the feature and one without.