0xPolygonZero / plonky2


CTL's looking table grouping is wonky #1582

Open matthiasgoergens opened 5 months ago


ctl_helper_zs_cols in starky/src/cross_table_lookup.rs looks like this:

/// Computes helper columns and Z polynomials for all looking tables
/// of one cross-table lookup (i.e. for one looked table).
fn ctl_helper_zs_cols<F: Field, const N: usize>(
    all_stark_traces: &[Vec<PolynomialValues<F>>; N],
    looking_tables: Vec<TableWithColumns<F>>,
    challenge: GrandProductChallenge<F>,
    constraint_degree: usize,
) -> Vec<(usize, Vec<PolynomialValues<F>>)> {
    let grouped_lookups = looking_tables.iter().group_by(|a| a.table);

    grouped_lookups
        .into_iter()
        .map(|(table, group)| {
            let columns_filters = group
                .map(|table| (&table.columns[..], &table.filter))
                .collect::<Vec<(&[Column<F>], &Filter<F>)>>();
            (
                table,
                partial_sums(
                    &all_stark_traces[table],
                    &columns_filters,
                    challenge,
                    constraint_degree,
                ),
            )
        })
        .collect::<Vec<(usize, Vec<PolynomialValues<F>>)>>()
}

The line let grouped_lookups = looking_tables.iter().group_by(|a| a.table); is rather questionable: itertools' group_by only merges consecutive elements with equal keys, so two lookups into the same table that are separated by a lookup into a different table end up in separate groups (and hence get separate helper columns).
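To illustrate the pitfall with std only, here is a sketch where group_adjacent is a hand-rolled stand-in for itertools' group_by (it is not the itertools API), applied to a sequence of table indices:

```rust
/// Hand-rolled stand-in for Itertools::group_by: merges only *consecutive*
/// elements that share a key, returning (key, run_length) pairs.
fn group_adjacent(keys: &[usize]) -> Vec<(usize, usize)> {
    let mut runs: Vec<(usize, usize)> = Vec::new();
    for &k in keys {
        match runs.last_mut() {
            Some((last, count)) if *last == k => *count += 1,
            _ => runs.push((k, 1)),
        }
    }
    runs
}

fn main() {
    // Table 2 appears twice, but the two occurrences are not adjacent...
    let tables = [2, 0, 2];
    // ...so group_by-style grouping produces *two* groups for table 2:
    assert_eq!(group_adjacent(&tables), vec![(2, 1), (0, 1), (2, 1)]);

    // Only a pre-sorted input collapses them into a single group:
    let mut sorted = tables;
    sorted.sort();
    assert_eq!(group_adjacent(&sorted), vec![(0, 1), (2, 2)]);
}
```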

I suggest we replace it with something like this:

diff --git a/starky/src/cross_table_lookup.rs b/starky/src/cross_table_lookup.rs
index 73caf28c..dc662fcd 100644
--- a/starky/src/cross_table_lookup.rs
+++ b/starky/src/cross_table_lookup.rs
@@ -396,14 +396,14 @@ fn ctl_helper_zs_cols<F: Field, const N: usize>(
     challenge: GrandProductChallenge<F>,
     constraint_degree: usize,
 ) -> Vec<(usize, Vec<PolynomialValues<F>>)> {
-    let grouped_lookups = looking_tables.iter().group_by(|a| a.table);
-
-    grouped_lookups
+    looking_tables
+        .iter()
+        .map(|a| (a.table, (&a.columns[..], &a.filter)))
+        .into_group_map()
         .into_iter()
-        .map(|(table, group)| {
-            let columns_filters = group
-                .map(|table| (&table.columns[..], &table.filter))
-                .collect::<Vec<(&[Column<F>], &Filter<F>)>>();
+        // We sort for determinism:
+        .sorted_by_key(|(table, _)| *table)
+        .map(|(table, columns_filters)| {
             (
                 table,
                 partial_sums(

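The diff above leans on itertools (into_group_map plus sorted_by_key for determinism). A std-only BTreeMap gives both properties at once, since its keys iterate in sorted order; a sketch where group_by_table is a hypothetical helper over (table, item) pairs standing in for the real column/filter data:

```rust
use std::collections::BTreeMap;

/// Group (table, item) pairs by table index regardless of adjacency.
/// BTreeMap iterates its keys in sorted order, so the grouping is
/// deterministic without an explicit sort step.
fn group_by_table(pairs: &[(usize, &'static str)]) -> BTreeMap<usize, Vec<&'static str>> {
    let mut grouped: BTreeMap<usize, Vec<&'static str>> = BTreeMap::new();
    for &(table, item) in pairs {
        grouped.entry(table).or_default().push(item);
    }
    grouped
}

fn main() {
    // The two table-2 entries are not adjacent, yet land in one group.
    let pairs = [(2, "f0"), (0, "f1"), (2, "f2")];
    let grouped = group_by_table(&pairs);
    assert_eq!(grouped[&2], vec!["f0", "f2"]);
    // Iteration order is sorted by table index: deterministic.
    assert_eq!(grouped.keys().copied().collect::<Vec<_>>(), vec![0, 2]);
}
```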
But I don't know which other parts of the code have to change. Perhaps something in the verifier, too?

Currently, we work around this problem in our client code by pre-sorting our looking tables before we hand them over to starky.
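That workaround amounts to a stable sort by table index before the call, so same-table lookups become adjacent and group_by behaves as intended. A minimal sketch, with LookingTable and presort as hypothetical stand-ins for the client-side types and helper:

```rust
// Hypothetical stand-in for starky's TableWithColumns; only the table
// index matters for the sort.
#[derive(Debug)]
struct LookingTable {
    table: usize,
}

/// Pre-sort looking tables by table index so entries for the same table
/// become adjacent before starky's group_by sees them.
fn presort(tables: &mut [LookingTable]) {
    // sort_by_key is stable, so the relative order of same-table
    // lookups is preserved.
    tables.sort_by_key(|t| t.table);
}

fn main() {
    let mut looking_tables = vec![
        LookingTable { table: 2 },
        LookingTable { table: 0 },
        LookingTable { table: 2 },
    ];
    presort(&mut looking_tables);
    let order: Vec<usize> = looking_tables.iter().map(|t| t.table).collect();
    assert_eq!(order, vec![0, 2, 2]);
}
```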