diprism / fggs

Factor Graph Grammars in Python

Allow setting torch_semiring_einsum block_size #170

Open ccshan opened 1 year ago

ccshan commented 1 year ago

The default was AutomaticBlockSize(max_cpu_bytes = 1 << 30), which means to use up to 1 GB of memory (unless even the smallest block size would use more than 1 GB). Passing -B 64 to bin/sum_product.py makes a noticeable difference when parsing long strings.
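For context, here is a minimal sketch of how the block size reaches torch_semiring_einsum, assuming its compile_equation / einsum / AutomaticBlockSize API; the equation and tensors are just illustrative:

```python
import torch
import torch_semiring_einsum as tse

eq = tse.compile_equation('ij,jk->ik')
A, B = torch.rand(8, 8), torch.rand(8, 8)

# Default behavior: pick the largest block size that fits in ~1 GB of CPU memory.
out_auto = tse.einsum(eq, A, B,
                      block_size=tse.AutomaticBlockSize(max_cpu_bytes=1 << 30))

# Fixed block size, roughly what `bin/sum_product.py -B 64` would request.
out_fixed = tse.einsum(eq, A, B, block_size=64)
```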

davidweichiang commented 1 year ago

Is there any way to do this without passing the block_size option through so many functions? Like hide multiple options inside an "Options" object?
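One hypothetical shape for such an Options object (SumProductOptions and the sum_product signature below are made up for illustration, not the repo's actual API):

```python
from dataclasses import dataclass, field
from typing import Optional, Union
import torch_semiring_einsum as tse

@dataclass
class SumProductOptions:
    # Bundle the tuning knobs once; callers pass one object instead of many arguments.
    block_size: Union[int, tse.AutomaticBlockSize] = field(
        default_factory=lambda: tse.AutomaticBlockSize(max_cpu_bytes=1 << 30))
    # other options could be added here without touching every call site

def sum_product(fgg, opts: Optional[SumProductOptions] = None):
    opts = opts if opts is not None else SumProductOptions()
    # inner torch_semiring_einsum calls would read opts.block_size
    ...
```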

davidweichiang commented 1 year ago

Which way (extra argument vs. hidden inside Semiring) do you like better? Could the argument you made about different einsums in the same computation requiring different block sizes apply to hiding the block size inside Semiring?

ccshan commented 1 year ago

> Which way (extra argument vs. hidden inside Semiring) do you like better? Could the argument you made about different einsums in the same computation requiring different block sizes apply to hiding the block size inside Semiring?

The hidden-inside-Semiring way is growing on me; it helps that it makes the diff much smaller. I thought about different einsums in the same computation, and I think they would use different Semirings anyway, so they could still have different block sizes.
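A rough sketch of the hidden-inside-Semiring idea (the RealSemiring here is a stand-in, not the class actually defined in fggs): each semiring object owns a block size and uses it for every einsum it performs, so two computations that need different block sizes simply construct two semiring objects.

```python
import torch
import torch_semiring_einsum as tse

class RealSemiring:
    """Stand-in semiring that carries its own einsum block size."""
    def __init__(self, block_size=tse.AutomaticBlockSize(max_cpu_bytes=1 << 30)):
        self.block_size = block_size

    def einsum(self, equation, *args):
        # Every einsum done under this semiring uses the stored block size.
        return tse.einsum(equation, *args, block_size=self.block_size)

eq = tse.compile_equation('ij,jk->ik')
coarse = RealSemiring()             # automatic, memory-bounded blocks
fine = RealSemiring(block_size=64)  # fixed small blocks, as with -B 64
x = coarse.einsum(eq, torch.rand(4, 4), torch.rand(4, 4))
y = fine.einsum(eq, torch.rand(4, 4), torch.rand(4, 4))
```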