google / latexify_py

A library to generate LaTeX expressions from Python code.
Apache License 2.0
7.25k stars 387 forks

Add ability to generate latex from string that has the function definition #108

Open huu4ontocord opened 2 years ago

huu4ontocord commented 2 years ago

Maybe an option to get_latex where you can pass in a string instead:

from __future__ import annotations

import ast
import textwrap

from latexify import codegen
from latexify import exceptions
from latexify import transformers

def parse_function_str(fn_def: str) -> ast.Module:
    """Parses the given function definition string.

    Args:
        fn_def: Source string containing the function definition.

    Returns:
        AST tree representing `fn_def`.
    """

    # Remove extra indentation so that ast.parse runs correctly.
    source = textwrap.dedent(fn_def)

    tree = ast.parse(source)
    if not tree.body or not isinstance(tree.body[0], ast.FunctionDef):
        raise exceptions.LatexifySyntaxError("Not a function.")

    return tree
def get_latex_str(
    fn_def: str,
    *,
    identifiers: dict[str, str] | None = None,
    reduce_assignments: bool = False,
    use_math_symbols: bool = False,
    use_raw_function_name: bool = False,
    use_signature: bool = True,
    use_set_symbols: bool = False,
) -> str:
    """Obtains LaTeX description from the function's source.

    Args:
        fn: Reference to a function to analyze.
        identifiers: If set, the mapping to replace identifier names in the function.
            Keys are the original names of the identifiers, and corresponding values are
            the replacements.
            Both keys and values have to represent valid Python identifiers:
            ^[A-Za-z_][A-Za-z0-9_]*$
        reduce_assignments: If True, assignment statements are used to synthesize
            the final expression.
        use_math_symbols: Whether to convert identifiers with a math symbol surface
            (e.g., "alpha") to the LaTeX symbol (e.g., "\\alpha").
        use_raw_function_name: Whether to keep underscores "_" in the function name,
            or convert it to subscript.
        use_signature: Whether to add the function signature before the expression or
            not.
        use_set_symbols: Whether to use set symbols or not.

    Returns:
        Generated LaTeX description.

    Raises:
        latexify.exceptions.LatexifyError: Something went wrong during conversion.
    """
    # Obtains the source AST.
    tree = parse_function_str(fn_def)

    # Applies AST transformations.
    if identifiers is not None:
        tree = transformers.IdentifierReplacer(identifiers).visit(tree)
    if reduce_assignments:
        tree = transformers.AssignmentReducer().visit(tree)

    # Generates LaTeX.
    return codegen.FunctionCodegen(
        use_math_symbols=use_math_symbols,
        use_raw_function_name=use_raw_function_name,
        use_signature=use_signature,
        use_set_symbols=use_set_symbols,
    ).visit(tree)
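For quick experimentation, the string-parsing step above can be exercised standalone with only the standard library (the plain SyntaxError here stands in for latexify's LatexifySyntaxError, and the sample function is arbitrary):

```python
import ast
import textwrap


def parse_function_str(fn_def: str) -> ast.Module:
    """Parse a function definition given as a source string."""
    # Remove extra indentation so that ast.parse runs correctly
    # even when the source was copied from a nested scope.
    source = textwrap.dedent(fn_def)
    tree = ast.parse(source)
    if not tree.body or not isinstance(tree.body[0], ast.FunctionDef):
        raise SyntaxError("Not a function.")
    return tree


# Indented source, as it might appear inside a class or another function.
src = """
    def solve(a, b, c):
        return (-b + (b**2 - 4*a*c)**0.5) / (2*a)
"""
tree = parse_function_str(src)
print(tree.body[0].name)  # -> solve
```

The same `tree` could then be fed to the transformer and codegen passes shown in the proposal.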
odashi commented 2 years ago

@ontocord Yes, this kind of functionality is definitely useful for (especially) other tools that want to pretty-print Python pseudo-code as LaTeX. And it can be used in get_latex too.

I considered providing latexify.compile(str) a few days ago, which offers the same functionality as above. My local environment actually has some implementation already. Let me take some time to polish the interface.

huu4ontocord commented 1 year ago

Thank you @odashi !

lverweijen commented 3 months ago

It would be nice to also support ast.Node, because I created a package, uneval, which generates ASTs.

Or even support uneval directly, so I can write:

from uneval import quote as q

to_latex(q.x**2 + 3)
x^2 + 3
odashi commented 3 months ago

It would be nice to also support ast.Node

Yeah, this is easier because we can split such a subroutine out of get_latex without complex tweaks, AFAIK.
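As a sketch of what an AST-accepting entry point would consume, the expression q.x**2 + 3 from the comment above can be built as a plain ast node with the standard library (the manual construction here only illustrates the input shape; a real to_latex(node) routine and uneval's own output may differ):

```python
import ast

# Build the expression x**2 + 3 directly as an AST node, similar to
# what a quoting package could hand to an AST-based to_latex entry point.
expr = ast.BinOp(
    left=ast.BinOp(
        left=ast.Name(id="x", ctx=ast.Load()),
        op=ast.Pow(),
        right=ast.Constant(value=2),
    ),
    op=ast.Add(),
    right=ast.Constant(value=3),
)

# Round-trip through ast.unparse as a sanity check that the node is
# well-formed; a codegen visitor would walk the same structure.
print(ast.unparse(ast.fix_missing_locations(expr)))  # -> x ** 2 + 3
```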