
Prefetch JUMPDESTs through RPC with progressive proving #765

Draft: wants to merge 113 commits into base: develop

Changes from all commits (113 commits)
- 7b01f5d: Implement JumpDest fetching from RPC. (einar-polygon, Jul 15, 2024)
- 4591482: feedback + cleanups (einar-polygon, Sep 15, 2024)
- 91c2945: cleanups (einar-polygon, Sep 15, 2024)
- b58c5d6: fix overflow (einar-polygon, Sep 15, 2024)
- 037fb57: fmt (einar-polygon, Sep 16, 2024)
- 85ee8c2: fix testscripts (einar-polygon, Sep 16, 2024)
- 1243768: refactor (einar-polygon, Sep 16, 2024)
- e7244c6: for testing (einar-polygon, Sep 16, 2024)
- 16e9c26: extract initcode (einar-polygon, Sep 17, 2024)
- f3871d9: improve test script (einar-polygon, Sep 17, 2024)
- 4fd6b8b: fix stack issue (einar-polygon, Sep 18, 2024)
- 88eb73d: random fixes (einar-polygon, Sep 18, 2024)
- 39cd26c: fix CREATE2 (einar-polygon, Sep 18, 2024)
- 8a964b8: fmt, clippy (einar-polygon, Sep 18, 2024)
- 32e68bf: investigate 15,35 (einar-polygon, Sep 19, 2024)
- 71b003e: merge (einar-polygon, Sep 19, 2024)
- 76df518: Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra… (einar-polygon, Sep 19, 2024)
- 184878d: fix scripts (einar-polygon, Sep 19, 2024)
- c000b5a: remove redtape for JUMP/I (einar-polygon, Sep 19, 2024)
- ec81701: misc (einar-polygon, Sep 19, 2024)
- bff471e: fix ci (einar-polygon, Sep 20, 2024)
- ca9620d: minimize diff (einar-polygon, Sep 20, 2024)
- 4c97c0f: include whole function in timeout (einar-polygon, Sep 20, 2024)
- 8bce013: avoid ensure macro (einar-polygon, Sep 20, 2024)
- b0ebc2c: fix CREATE (einar-polygon, Sep 20, 2024)
- 62c7053: small adjustments (einar-polygon, Sep 23, 2024)
- 74b86fd: fmt (einar-polygon, Sep 23, 2024)
- c8be888: feedback (einar-polygon, Sep 23, 2024)
- d00439f: feedback (einar-polygon, Sep 24, 2024)
- 6876c07: Add JumpdestSrc parameter (einar-polygon, Sep 24, 2024)
- 60efef9: Refactor (einar-polygon, Sep 24, 2024)
- b07752d: Add jmp src to native (einar-polygon, Sep 24, 2024)
- 66ea811: Feedback (einar-polygon, Sep 24, 2024)
- f230b84: fixup! Feedback (einar-polygon, Sep 24, 2024)
- 90722a3: feedback (einar-polygon, Sep 25, 2024)
- a0e0879: fix missing code for CREATE (einar-polygon, Sep 25, 2024)
- 6bff4e4: fix (einar-polygon, Sep 26, 2024)
- 5e4162d: Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra… (einar-polygon, Sep 26, 2024)
- c26f475: fix arguments (einar-polygon, Sep 26, 2024)
- f367409: feedback (einar-polygon, Sep 30, 2024)
- 464dbc0: fix (einar-polygon, Sep 30, 2024)
- 2783ccd: debugging 460 (einar-polygon, Sep 30, 2024)
- abee812: debugging 460 (einar-polygon, Oct 1, 2024)
- 8bfccdd: dbg (einar-polygon, Oct 1, 2024)
- f9c2f76: bugfix (einar-polygon, Oct 1, 2024)
- 313d78d: dbg (einar-polygon, Oct 1, 2024)
- 09079e6: fix (einar-polygon, Oct 1, 2024)
- 7c84a63: batching working (einar-polygon, Oct 1, 2024)
- 4202ece: cleanups (einar-polygon, Oct 1, 2024)
- e1124d3: feedback docs (einar-polygon, Oct 2, 2024)
- 1f8d476: feedback (einar-polygon, Oct 2, 2024)
- 7319a15: feedback filtermap (einar-polygon, Oct 2, 2024)
- d4838e0: review (einar-polygon, Oct 2, 2024)
- 27b5719: fmt (einar-polygon, Oct 3, 2024)
- eaf3ed7: fix set_jumpdest_analysis_inputs_rpc (einar-polygon, Oct 3, 2024)
- 10b6a22: discuss: deser in #427 (#681) (0xaatif, Oct 3, 2024)
- c11d17d: feat: block structlog retrieval (#682) (atanmarko, Oct 7, 2024)
- 06b1913: better tracing (einar-polygon, Oct 9, 2024)
- 339f6af: bug fix (einar-polygon, Oct 9, 2024)
- 61a6b6a: json (einar-polygon, Oct 9, 2024)
- f8f0a85: reinstantiate timeout (einar-polygon, Oct 9, 2024)
- 8d609ad: merge (einar-polygon, Oct 9, 2024)
- dbb65ea: ignore None (einar-polygon, Oct 9, 2024)
- d415d22: feedback (einar-polygon, Oct 9, 2024)
- 54a7df8: feedback: rustdoc (einar-polygon, Oct 9, 2024)
- 44b421c: feedback: add user-specified timeout (einar-polygon, Oct 10, 2024)
- 98b9c8e: feedback (einar-polygon, Oct 11, 2024)
- 4707d38: fix: addresses (einar-polygon, Oct 14, 2024)
- 4843501: todo: fix todo (einar-polygon, Oct 14, 2024)
- 8f980d2: testing: improve prove_stdio script (einar-polygon, Oct 14, 2024)
- ee7e5f3: testing: improve test_native script (einar-polygon, Oct 14, 2024)
- 36557d1: Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra… (einar-polygon, Oct 14, 2024)
- 5451399: fmt (einar-polygon, Oct 14, 2024)
- e9a8702: Round 5 (einar-polygon, Oct 14, 2024)
- b2f66ed: testing (einar-polygon, Oct 15, 2024)
- cfb293c: testing: improve reporting, add error cases (einar-polygon, Oct 15, 2024)
- 3c497cc: change exit code (einar-polygon, Oct 15, 2024)
- 2dc52cb: don't panic! (einar-polygon, Oct 16, 2024)
- dd89251: fix type 5 errors (einar-polygon, Oct 18, 2024)
- 6c59c41: Fix: 19548491 (einar-polygon, Oct 19, 2024)
- 0d7f6b7: add stats (einar-polygon, Oct 19, 2024)
- b8cf325: dbg (einar-polygon, Oct 21, 2024)
- 0e02af5: rename a (einar-polygon, Oct 21, 2024)
- d04f357: add derive_more and add docs (einar-polygon, Oct 21, 2024)
- cc1d1d9: clean up (einar-polygon, Oct 21, 2024)
- a6ba0e5: remove todo (einar-polygon, Oct 21, 2024)
- 1a46a43: add derive_more and add docs (einar-polygon, Oct 21, 2024)
- d31530d: clean up (einar-polygon, Oct 21, 2024)
- 6adcc9b: use Hash2code (einar-polygon, Oct 21, 2024)
- f39af6e: mv derive_more (einar-polygon, Oct 21, 2024)
- 0b9bf0a: cleanup (einar-polygon, Oct 21, 2024)
- d1e6efa: Optimize zkVM Proving by Skipping Unused Keccak Tables (#690) (sai-deng, Oct 15, 2024)
- a85080d: Assign specific jobs to dedicated workers (#564) (temaniarpit27, Oct 16, 2024)
- 09c4248: feat: SMT support in `trace_decoder` ignores storage (#693) (0xaatif, Oct 16, 2024)
- 60e32fb: Merge remote-tracking branch 'origin/develop' into einar/prefetch_tra… (einar-polygon, Oct 21, 2024)
- 029983c: stats (einar-polygon, Oct 25, 2024)
- 462fd1e: Revert "remove todo" (einar-polygon, Oct 29, 2024)
- 1f9ad53: wip (einar-polygon, Oct 30, 2024)
- 01df87a: readd CREATE1/2 (einar-polygon, Oct 30, 2024)
- aecb2eb: don't insert code (einar-polygon, Nov 2, 2024)
- 1f61ee4: WIP (einar-polygon, Nov 2, 2024)
- a1f688e: progressive (einar-polygon, Nov 3, 2024)
- ce98430: test_new_chain (einar-polygon, Nov 3, 2024)
- 99aba90: . (einar-polygon, Nov 3, 2024)
- b2aa933: . (einar-polygon, Nov 3, 2024)
- 97f5684: . (einar-polygon, Nov 3, 2024)
- 54f50e9: . (einar-polygon, Nov 3, 2024)
- 6a097ee: . (einar-polygon, Nov 3, 2024)
- 75d6d80: . (einar-polygon, Nov 3, 2024)
- 7b3a9af: . (einar-polygon, Nov 3, 2024)
- 551aa3e: . (einar-polygon, Nov 3, 2024)
- 0572f3d: . (einar-polygon, Nov 4, 2024)
- 3b9eae7: Merge branch 'develop' into einar/prefetch_transaction_jumps/mem-opt (einar-polygon, Nov 4, 2024)
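The commit history tracks the PR's central idea: rather than simulating the CPU to discover jump destinations, fetch the transaction's structlogs over RPC (cf. c11d17d, "feat: block structlog retrieval") and read the executed JUMPDEST offsets directly. A minimal sketch of that extraction step follows; the `StructLog` type here is an illustrative stand-in, not the alloy type the PR actually uses:

```rust
use std::collections::BTreeSet;

/// Illustrative stand-in for one EVM structlog entry: the opcode name
/// and the program counter at which it executed.
struct StructLog<'a> {
    op: &'a str,
    pc: usize,
}

/// Collects the program counters of every executed JUMPDEST. These are
/// exactly the offsets a jumpdest table must cover for this trace.
fn jumpdests_from_structlogs(logs: &[StructLog]) -> BTreeSet<usize> {
    logs.iter()
        .filter(|entry| entry.op == "JUMPDEST")
        .map(|entry| entry.pc)
        .collect()
}
```

Keying the resulting offset sets by code hash and call context then yields the witness table shape (`JumpDestTableWitness`) that the diff below threads through the interpreter.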
13 changes: 12 additions & 1 deletion .cargo/config.toml
@@ -1,7 +1,18 @@
[build]
# https://github.com/rust-lang/rust/pull/124129
# https://github.com/dtolnay/linkme/pull/88
rustflags = ["-Z", "linker-features=-lld"]

[env]
RUST_BACKTRACE = "1"
RUST_TEST_NOCAPTURE = "1"

[term]
verbose = true
color = 'auto'

[target.x86_64-unknown-linux-gnu]
linker = "clang"
rustflags = ["-Z", "linker-features=-lld", "-C", "target-cpu=native"] #, "-C", "link-arg=-fuse-ld=/usr/bin/mold", "-C", "debuginfo=2"]

[alias]
xtask = ["run", "--package=xtask", "--"]
26 changes: 26 additions & 0 deletions Cargo.lock


32 changes: 30 additions & 2 deletions Cargo.toml
@@ -36,6 +36,8 @@ alloy = { version = '0.3.0', default-features = false, features = [
"transport-http",
"rpc-types-debug",
] }
alloy-primitives = "0.8.0"
alloy-serde = "0.3.0"
anyhow = "1.0.86"
async-stream = "0.3.5"
axum = "0.7.5"
@@ -47,6 +49,7 @@ ciborium-io = "0.2.2"
clap = { version = "4.5.7", features = ["derive", "env"] }
compat = { path = "compat" }
criterion = "0.5.1"
derive_more = "1.0.0"
dotenvy = "0.15.7"
either = "1.12.0"
enum-as-inner = "0.6.0"
@@ -86,6 +89,7 @@ ruint = "1.12.3"
serde = "1.0.203"
serde_json = "1.0.118"
serde_path_to_error = "0.1.16"
serde_with = "3.8.1"
serde-big-array = "0.5.1"
sha2 = "0.10.8"
static_assertions = "1.1.0"
@@ -94,8 +98,8 @@ thiserror = "1.0.61"
tiny-keccak = "2.0.2"
tokio = { version = "1.38.0", features = ["full"] }
tower = "0.4"
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
tracing = { version = "0.1", features = ["attributes"] }
tracing-subscriber = { version = "0.3", features = ["env-filter", "json"] }
trybuild = "1.0"
u4 = "0.1.0"
uint = "0.9.5"
@@ -119,3 +123,27 @@ starky = { git = "https://github.com/0xPolygonZero/plonky2.git", rev = "2488cdac

[workspace.lints.clippy]
too_long_first_doc_paragraph = "allow"

[profile.release]
opt-level = 3
debug = true
incremental = true
debug-assertions = true
lto = false
overflow-checks = false

[profile.test]
opt-level = 3
debug = true
incremental = true
debug-assertions = true
lto = false
overflow-checks = false

[profile.dev]
opt-level = 3
debug = true
incremental = true
debug-assertions = true
lto = false
overflow-checks = false
1 change: 1 addition & 0 deletions evm_arithmetization/Cargo.toml
@@ -17,6 +17,7 @@ keywords.workspace = true
[dependencies]
anyhow.workspace = true
bytes.workspace = true
derive_more.workspace = true
env_logger.workspace = true
ethereum-types.workspace = true
hashbrown.workspace = true
1 change: 1 addition & 0 deletions evm_arithmetization/benches/fibonacci_25m_gas.rs
@@ -192,6 +192,7 @@ fn prepare_setup() -> anyhow::Result<GenerationInputs<F>> {
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: None,
})
}

68 changes: 61 additions & 7 deletions evm_arithmetization/src/cpu/kernel/interpreter.rs
@@ -5,11 +5,13 @@
//! the future execution and generate nondeterministically the corresponding
//! jumpdest table, before the actual CPU carries on with contract execution.

use core::option::Option::None;
use std::collections::{BTreeSet, HashMap};

use anyhow::anyhow;
use ethereum_types::{BigEndianHash, U256};
use log::Level;
use keccak_hash::H256;
use log::{trace, Level};
use mpt_trie::partial_trie::PartialTrie;
use plonky2::hash::hash_types::RichField;
use serde::{Deserialize, Serialize};
@@ -19,8 +21,10 @@
use crate::cpu::kernel::aggregator::KERNEL;
use crate::cpu::kernel::constants::global_metadata::GlobalMetadata;
use crate::generation::debug_inputs;
use crate::generation::jumpdest::{JumpDestTableProcessed, JumpDestTableWitness};
use crate::generation::linked_list::LinkedListsPtrs;
use crate::generation::mpt::{load_linked_lists_and_txn_and_receipt_mpts, TrieRootPtrs};
use crate::generation::prover_input::get_proofs_and_jumpdests;
use crate::generation::rlp::all_rlp_prover_inputs_reversed;
use crate::generation::state::{
all_ger_prover_inputs, all_withdrawals_prover_inputs_reversed, GenerationState,
@@ -54,6 +58,7 @@
/// The interpreter will halt only if the current context matches
/// halt_context
pub(crate) halt_context: Option<usize>,
/// A table of call contexts and the JUMPDEST offsets that they jumped to.
jumpdest_table: HashMap<usize, BTreeSet<usize>>,
/// `true` if we are currently carrying out a jumpdest analysis.
pub(crate) is_jumpdest_analysis: bool,
@@ -73,9 +78,9 @@
pub(crate) fn simulate_cpu_and_get_user_jumps<F: RichField>(
final_label: &str,
state: &GenerationState<F>,
) -> Option<HashMap<usize, Vec<usize>>> {
) -> Option<(JumpDestTableProcessed, JumpDestTableWitness)> {
match state.jumpdest_table {
Some(_) => None,
Some(_) => Default::default(),
None => {
let halt_pc = KERNEL.global_labels[final_label];
let initial_context = state.registers.context;
@@ -94,16 +99,22 @@

let clock = interpreter.get_clock();

interpreter
let (jdtp, jdtw) = interpreter
.generation_state
.set_jumpdest_analysis_inputs(interpreter.jumpdest_table);
.get_jumpdest_analysis_inputs(interpreter.jumpdest_table.clone());

log::debug!(
"Simulated CPU for jumpdest analysis halted after {:?} cycles.",
clock
);

interpreter.generation_state.jumpdest_table
// if let Some(cc) = interpreter.generation_state.jumpdest_table {
// interpreter.generation_state.jumpdest_table =
// Some(JumpDestTableProcessed::merge([&cc, &jdtp]));
// } else {
// interpreter.generation_state.jumpdest_table = Some(jdtp.clone());
// }
Some((jdtp, jdtw))
}
}
}
@@ -116,7 +127,7 @@
pub(crate) withdrawal_prover_inputs: Vec<U256>,
pub(crate) ger_prover_inputs: Vec<U256>,
pub(crate) trie_root_ptrs: TrieRootPtrs,
pub(crate) jumpdest_table: Option<HashMap<usize, Vec<usize>>>,
pub(crate) jumpdest_table: Option<JumpDestTableProcessed>,
pub(crate) access_lists_ptrs: LinkedListsPtrs,
pub(crate) state_ptrs: LinkedListsPtrs,
pub(crate) next_txn_index: usize,
@@ -152,6 +163,49 @@
interpreter.run()
}

/// Computes the JUMPDEST proofs for each context.
///
/// # Arguments
///
/// - `jumpdest_table_rpc`: The raw table received from RPC.
/// - `generation_state`: The current generation state, whose `inputs.contract_code` holds the contract code used in the trace.
///
/// # Output
///
/// Returns a [`JumpDestTableProcessed`].
pub(crate) fn get_jumpdest_analysis_inputs_rpc_progressive<F: RichField>(
jumpdest_table_rpc: &JumpDestTableWitness,
generation_state: &GenerationState<F>,
) -> JumpDestTableProcessed {
let current_ctx = generation_state.registers.context;
let current_code = generation_state.get_current_code().unwrap();
let current_code_hash = generation_state.get_current_code_hash().unwrap();
let code_map: &HashMap<H256, Vec<u8>> = &generation_state.inputs.contract_code;

trace!(
"current_code: {:?}, current_code_hash: {:?}, {:?} <============",
&current_code,
&current_code_hash,
code_map.contains_key(&current_code_hash),
);
trace!("code_map: {:?}", &code_map);
dbg!(current_ctx, current_code_hash, jumpdest_table_rpc.clone());
let mut ctx_proof = HashMap::<usize, Vec<_>>::new();
if jumpdest_table_rpc.contains_key(&current_code_hash) {
let cc = &(*jumpdest_table_rpc)[&current_code_hash].0;
if cc.contains_key(&current_ctx) {
let current_offsets = cc[&current_ctx].clone();
//let ctx_proof = prove_context_jumpdests(&current_code, &offsets);
let largest_address = current_offsets.last().unwrap().clone();

[CI annotation, GitHub Actions / clippy: check failure on line 199 in evm_arithmetization/src/cpu/kernel/interpreter.rs: using `clone` on type `usize` which implements the `Copy` trait]
let offset_proofs =
get_proofs_and_jumpdests(&current_code, largest_address, current_offsets);
ctx_proof.insert(current_ctx, offset_proofs);
}
}

JumpDestTableProcessed::new(ctx_proof)
}

impl<F: RichField> Interpreter<F> {
/// Returns an instance of `Interpreter` given `GenerationInputs`, and
/// assuming we are initializing with the `KERNEL` code.
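The interpreter changes above all revolve around the jumpdest table: which byte offsets in a contract's code are valid jump targets. For reference, here is a standalone sketch of static JUMPDEST discovery (illustrative, not the PR's kernel code). The subtlety is that a `0x5b` byte inside a PUSH immediate is data, not a destination:

```rust
use std::collections::BTreeSet;

/// Returns the offsets of all valid JUMPDEST (0x5b) opcodes in `code`.
/// Bytes inside PUSH1..=PUSH32 immediates are skipped: a 0x5b there is
/// push data, not a jump destination.
fn collect_jumpdests(code: &[u8]) -> BTreeSet<usize> {
    let mut dests = BTreeSet::new();
    let mut pc = 0;
    while pc < code.len() {
        let op = code[pc];
        match op {
            0x5b => {
                dests.insert(pc);
                pc += 1;
            }
            // PUSH1 (0x60) through PUSH32 (0x7f) carry 1..=32 immediate bytes.
            0x60..=0x7f => pc += 1 + (op - 0x5f) as usize,
            _ => pc += 1,
        }
    }
    dests
}
```

The RPC-prefetched witness table only records destinations that were actually jumped to, so a scan like this bounds, but does not equal, the witness contents.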
2 changes: 2 additions & 0 deletions evm_arithmetization/src/cpu/kernel/tests/add11.rs
@@ -193,6 +193,7 @@ fn test_add11_yml()
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: None,
};

let initial_stack = vec![];
@@ -370,6 +371,7 @@ fn test_add11_yml_with_exception()
prev_hashes: vec![H256::default(); 256],
cur_hash: H256::default(),
},
jumpdest_table: None,
};

let initial_stack = vec![];
@@ -10,13 +10,15 @@ use plonky2::hash::hash_types::RichField;
use crate::cpu::kernel::aggregator::KERNEL;
use crate::cpu::kernel::interpreter::Interpreter;
use crate::cpu::kernel::opcodes::{get_opcode, get_push_opcode};
use crate::generation::jumpdest::JumpDestTableProcessed;
use crate::memory::segments::Segment;
use crate::witness::memory::MemoryAddress;
use crate::witness::operation::CONTEXT_SCALING_FACTOR;

impl<F: RichField> Interpreter<F> {
pub(crate) fn set_jumpdest_analysis_inputs(&mut self, jumps: HashMap<usize, BTreeSet<usize>>) {
self.generation_state.set_jumpdest_analysis_inputs(jumps);
let (jdtp, _jdtw) = self.generation_state.get_jumpdest_analysis_inputs(jumps);
self.generation_state.jumpdest_table = Some(jdtp);
}

pub(crate) fn get_jumpdest_bit(&self, offset: usize) -> U256 {
@@ -106,7 +108,10 @@ fn test_jumpdest_analysis() -> Result<()>
interpreter.generation_state.jumpdest_table,
// Context 3 has jumpdest 1, 5, 7. All have proof 0 and hence
// the list [proof_0, jumpdest_0, ... ] is [0, 1, 0, 5, 0, 7, 8, 40]
Some(HashMap::from([(3, vec![0, 1, 0, 5, 0, 7, 8, 40])]))
Some(JumpDestTableProcessed::new(HashMap::from([(
3,
vec![0, 1, 0, 5, 0, 7, 8, 40]
)])))
);

// Run jumpdest analysis with context = 3
@@ -175,7 +180,9 @@ fn test_packed_verification() -> Result<()>
let mut interpreter: Interpreter<F> =
Interpreter::new(write_table_if_jumpdest, initial_stack.clone(), None);
interpreter.set_code(CONTEXT, code.clone());
interpreter.generation_state.jumpdest_table = Some(HashMap::from([(3, vec![1, 33])]));
interpreter.generation_state.jumpdest_table = Some(JumpDestTableProcessed::new(HashMap::from(
[(3, vec![1, 33])],
)));

interpreter.run()?;

@@ -188,7 +195,9 @@
let mut interpreter: Interpreter<F> =
Interpreter::new(write_table_if_jumpdest, initial_stack.clone(), None);
interpreter.set_code(CONTEXT, code.clone());
interpreter.generation_state.jumpdest_table = Some(HashMap::from([(3, vec![1, 33])]));
interpreter.generation_state.jumpdest_table = Some(JumpDestTableProcessed::new(
HashMap::from([(3, vec![1, 33])]),
));

assert!(interpreter.run().is_err());

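The `test_jumpdest_analysis` change above expects the processed table for context 3 to be the interleaved list `[0, 1, 0, 5, 0, 7, 8, 40]`, i.e. `[proof_0, jumpdest_0, proof_1, jumpdest_1, ...]`. A toy sketch of that interleaving for the case where every proof is already known (the trailing `8, 40` pair in the real test comes from the kernel's proof scheme and is not modeled here):

```rust
/// Interleaves each proof with its jumpdest offset, producing
/// [proof_0, jumpdest_0, proof_1, jumpdest_1, ...].
fn interleave_proofs(proofs: &[usize], offsets: &[usize]) -> Vec<usize> {
    proofs
        .iter()
        .zip(offsets)
        .flat_map(|(&proof, &dest)| [proof, dest])
        .collect()
}
```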
@@ -101,6 +101,7 @@ fn test_init_exc_stop()
cur_hash: H256::default(),
},
ger_data: None,
jumpdest_table: None,
};
let initial_stack = vec![];
let initial_offset = KERNEL.global_labels["init"];