Open Technologicat opened 6 years ago
One thing I've thought about since I posted this is that having seen Racket, what Python is missing is a separation of the concepts of `require` (import) and `provide` (export). In Python there's just `import`, which does both. All regular names that are imported into a module's top-level namespace are automatically exported from that module, in addition to any names the module itself defines.
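To illustrate the point (the module name `mymod` is invented here, and the module is synthesized in memory just so the snippet is self-contained), a module that merely imports a name also exposes it to its own importers:

```python
import sys
import types

# Build a throwaway module in memory; it imports sqrt but does not define it.
mod = types.ModuleType("mymod")
exec("from math import sqrt\ndef area(r):\n    return 3.14159 * r * r",
     mod.__dict__)
sys.modules["mymod"] = mod

# The merely-imported name is re-exported right along with area:
from mymod import sqrt, area
print(sqrt(9.0))   # 3.0
```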
There's the magic list `__all__` that can be used to limit which names a module exports, but unfortunately it's only consulted for star-imports (talk about almost getting it right :)).
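To make the "only consulted for star-imports" point concrete (the demo module is synthesized in memory, so nothing here refers to a real package):

```python
import sys
import types

# A demo module whose __all__ restricts exports to 'public' only.
mod = types.ModuleType("demo_mod")
exec("__all__ = ['public']\npublic = 1\nprivate = 2", mod.__dict__)
sys.modules["demo_mod"] = mod

# A star-import honors __all__: only 'public' lands in our namespace.
ns = {}
exec("from demo_mod import *", ns)
print("public" in ns, "private" in ns)   # True False

# But any other form of access ignores __all__ completely:
from demo_mod import private
print(private)   # 2
```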
For me, having this capability in one form or another is starting to become important, as `unpythonic.syntax` has grown to over 2.2k lines. Even though these macros are designed to work together, for maintainability `syntax` really wants to be a set of 4-5 smaller modules. Particularly, the TCO machinery shared by `tco` and `continuations` (call/cc for Python, sort of!) alone is so complex that it would benefit from living in its own module.
Perhaps, as you said in email, just an import statement can't solve this. Maybe the `macropy.core.macros.Macros` instance could have a method to register names of macros to be exported from the calling module? It could also allow specifying macros imported from other modules by `from mymod import macros, ...` statements.
This would require some added complexity in the logic that imports macro definitions (it would have to perform some form of lookup), but maybe it's doable?
If this issue is not relevant for what you currently need from MacroPy yourself, I might take a look at some point if I find the time - I'll keep you updated.
> There's the magic list `__all__` that can be used to limit which names a module exports, but unfortunately it's only consulted for star-imports (talk about almost getting it right :)).
Star-imports are frequently discouraged in Python; they are mostly used only in the REPL and in tests.
> For me, having this capability in one form or another is starting to become important, as `unpythonic.syntax` has grown to over 2.2k lines. Even though these macros are designed to work together, for maintainability `syntax` really wants to be a set of 4-5 smaller modules. Particularly, the TCO machinery shared by `tco` and `continuations` (call/cc for Python, sort of!) alone is so complex that it would benefit from living in its own module.
Frankly, I have yet to understand why you can't split such a module into smaller ones and/or:

- tell your users to import the macros from the respective submodules
- refactor your code so that the macro registration is just a front-end to your code, thus enabling you to structure your code in a way free from "macropy" strings
> Perhaps, as you said in email, just an import statement can't solve this. Maybe the `macropy.core.macros.Macros` instance could have a method to register names of macros to be exported from the calling module? It could also allow specifying macros imported from other modules by `from mymod import macros, ...` statements.
Maybe, but maybe, if you are trying to replace an entire language with another, a better platform to do that would be PyPy ;-)
> If this issue is not relevant for what you currently need from MacroPy yourself, I might take a look at some point if I find the time - I'll keep you updated.
As of now it's not relevant (yet). While I understand its general utility, I don't see its real, broader usefulness emerging here, sorry.
Star-imports, exactly; that's what I meant by Python "almost getting it right". The `__all__` magic is currently useless except for use with the REPL, whereas it could have been a general solution to mark names for export.
About splitting, in this particular case it's so far purely a maintainability consideration. Also, I want the import statements at the use site to be as brief as reasonable. The name of the library aside, a `syntax` submodule is sufficient to say that's where macros live. I'm fine with the syntax `from mymod import macros, ...`, as the word `macros` is a further explicit (very pythonic!) warning that there could be dragons in whatever is being imported.
I suppose it's possible to refactor it that way. I think I'll try that, thanks for the suggestion!
I'm not sure if I'm trying to replace a language with another, or just extend an existing one. With macros, I'm not even sure if "programming language" is a well-defined concept. Is Python plus `let` still Python, or a new language? What about Python plus implicit return statements and TCO? Python plus autocurry?
I suppose what I really want is a building material, like Lisp, so I can replace repetitive patterns with syntactic abstractions, and borrow such abstractions from other languages when I see one that could simplify my code. Whatever the language is, it should run Python libraries, not have a separate namespace for functions and data (so no Hy), and I really wouldn't mind if the code mostly looked like Python. ;)
If the re-export feature is not relevant for you, don't worry - we both know how open-source development works :)
Thanks for the refactor suggestion, it was excellent! It solved the issue for me.
Thinking back, I suppose the problem was that I was thinking of all code as one kind of thing - whether it be regular functions or macros. (I mean, aside from the obvious difference of AST transformation vs. regular run-time computation and all the consequences of that.)
I think one possible way to resolve this issue would be to mention in the noob tutorial that the way to keep things composable (and maintainable) is to manually separate the syntax transformers (regular functions processing ASTs) from the macro interface, because macros cannot currently be re-exported (whereas for regular functions there are no such limitations). Therefore, to provide the macro interface in one central module even when the implementation is split into several modules, ...
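A minimal sketch of that layering (the transformer below is a deliberately toy example, and the commented-out macro front-end assumes MacroPy's `@macros.expr` decorator as used elsewhere in this thread):

```python
import ast

# --- transformers.py: a plain function operating on ASTs, freely
# importable and re-exportable with no macro machinery involved ---
def swap_add_to_mul(tree):
    """Rewrite every Add into Mult -- a toy stand-in for a real transformer."""
    class T(ast.NodeTransformer):
        def visit_BinOp(self, node):
            self.generic_visit(node)
            if isinstance(node.op, ast.Add):
                node.op = ast.Mult()
            return node
    return ast.fix_missing_locations(T().visit(tree))

# --- syntax.py would then be only the thin macro front-end, e.g.: ---
# @macros.expr
# def swapmac(tree, **kw):
#     return swap_add_to_mul(tree)

# The plain function is testable without importing any macros:
tree = ast.parse("2 + 3", mode="eval")
code = compile(swap_add_to_mul(tree), "<demo>", "eval")
print(eval(code))   # 6
```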
A downside is that `gen_sym` again needs to be passed around, because only the macro interface layer has the ability to ask for it, but this is exactly the problem dynamic assignment was created to solve, so:
```python
from unpythonic.dynscope import dyn, make_dynvar

def nogensym(*args, **kwargs):
    raise RuntimeError("No gen_sym function set")

make_dynvar(gen_sym=nogensym)

@macros.expr
def mymac(tree, gen_sym, **kw):
    with dyn.let(gen_sym=gen_sym):
        return _mymacimpl(tree)  # or "return (yield from _mymacimpl(tree))" if needed
```
...and presto, no spam in the call signatures. It's then available as `dyn.gen_sym` where needed.
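For readers unfamiliar with dynamic assignment, here is a minimal pure-Python sketch of the mechanism (this is not unpythonic's actual implementation, just an illustration of the idea: a binding is pushed for the duration of a `with` block, and any function in the dynamic extent can read it without parameter spam):

```python
from contextlib import contextmanager

class Dyn:
    """Minimal sketch of a dynamic-variable store (illustrative only)."""
    def __init__(self):
        self._stack = [{}]

    @contextmanager
    def let(self, **bindings):
        # Push a new frame that shadows the current one for the with-block.
        self._stack.append({**self._stack[-1], **bindings})
        try:
            yield self
        finally:
            self._stack.pop()

    def __getattr__(self, name):
        # Look the name up in the innermost dynamic frame.
        try:
            return self._stack[-1][name]
        except KeyError:
            raise AttributeError(name)

dyn = Dyn()

def helper():
    # Deep in a call chain: read the dynamically bound value directly.
    return dyn.gen_sym("tmp")

with dyn.let(gen_sym=lambda base: base + "_1"):
    print(helper())   # tmp_1
```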
To improve both maintainability and ease of use of macro libraries, it would be useful if a macro-definition module could re-export macros defined in another macro-definition module. This would allow the macro library maintainer to structure the source tree logically, and then just re-export all macros from the library's `__init__.py` (just like it is possible to do for functions in pure Python). This would allow the users of the library to just `from somelibrary import macros, ...`, without caring about in which particular submodule of the library each of the macros is defined. There are probably some pitfalls I'm not seeing here; thoughts?
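For comparison, the pure-Python re-export pattern referred to above looks like this (package and function names are invented, and the package is simulated in memory so the snippet is self-contained):

```python
import sys
import types

# Simulate somelibrary/impl.py (a hypothetical submodule):
impl = types.ModuleType("somelibrary.impl")
exec("def useful():\n    return 42", impl.__dict__)
sys.modules["somelibrary.impl"] = impl

# Simulate somelibrary/__init__.py, which just re-exports the name:
pkg = types.ModuleType("somelibrary")
pkg.useful = impl.useful          # i.e. "from .impl import useful"
sys.modules["somelibrary"] = pkg

# Users import from the package root, ignorant of the submodule layout:
from somelibrary import useful
print(useful())   # 42
```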