Open my1e5 opened 3 months ago
Relevant comment from another issue: https://github.com/astral-sh/uv/issues/5632#issuecomment-2267115729
PDM supports this: https://pdm-project.org/latest/usage/scripts/
Yeah we plan to support something like this! We haven't spent time on the design yet.
The pyproject standard already supports [project.scripts], so uv may not need to use its own table.
https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#creating-executable-scripts
[project.scripts] is a little different -- that's used to expose specific Python functions as executable scripts, and we do support that already.
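For reference, an entry-point declaration of that kind maps a command name to a module:function path (the names here are illustrative, not from this thread):
[project.scripts]
hello-cli = "my_package.cli:main"
Once the project is installed into the environment, the hello-cli command invokes my_package.cli:main -- it cannot run an arbitrary shell command.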
Perhaps naming this section tool.uv.tasks or tool.uv.aliases could help disambiguate that.
Or maybe [tool.uv.run] to be consistent with the command uv run. Or we could even think about [tool.uv.commands].
I'm not a big fan of [tool.uv.scripts] since it conflicts with [project.scripts] and I myself got confused before.
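For concreteness, a hypothetical table under one of the proposed names might look like the following (purely illustrative -- uv does not implement any such table today, and the task names are made up):
[tool.uv.tasks]
lint = "ruff check ."
test = "pytest -q"
These could then be invoked via uv run lint / uv run test, or via a dedicated subcommand, depending on which design is chosen.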
This is the main thing I missed coming from hatch: https://hatch.pypa.io/dev/config/environment/overview/#scripts
+1 to @nikhilweee's suggestions. I think "command" reflects the intent/concept.
Hatch has an "environment" concept and supports running commands namespaced to an environment, like hatch run test:cov, where "test" is a user-defined environment (with a dependency group) and "cov" is a user-defined command for that environment.
[tool.hatch.envs.test]
dependencies = [
"pytest",
"pytest-cov",
"pytest-mock",
"freezegun",
]
[tool.hatch.envs.test.scripts]
cov = 'pytest --cov-report=term-missing --cov-config=pyproject.toml --cov=src'
[[tool.hatch.envs.test.matrix]]
python = ["3.8", "3.9", "3.10", "3.11", "3.12"]
I would be curious to hear the use cases for nesting dependency groups and commands into "environments" like this rather than defining them at the top level (i.e. [tool.uv.commands] / [tool.uv.dev-dependencies]).
Since it has not been mentioned yet, adding pixi as another possible inspiration for the design of tasks: https://pixi.sh/latest/features/advanced_tasks/
I happen to be writing a cross-project task runner that supports a bunch of formats (e.g. rye, pdm, package.json, Cargo.toml; even uv's workspace config).
For what it's worth, almost all Python runners use tool.<name>.scripts for task config (presumably inspired by npm's package.json format), so it's somewhat of an easier upgrade path for people coming from other tools.
Also related to this thread, I wish uvx was uv run instead of uv tool run when inside a project.
uv tool run seems like something you would do to play with a tool, like ruff. But then you will eventually uv add ruff --dev and forever keep writing uv run ruff check instead of uvx ruff check, which won't respect the locked version and will be in a different virtualenv. It also means the tool could be running on a different Python (does not apply to ruff, but any other Python lib) and all sorts of weird stuff can happen.
I know uv run stuff is still short, but it could be 4 keystrokes shorter.
Regarding uv run being a task runner, it means people will type it waaaaay more often than uv tool run.
I would appreciate a dedicated command like uvr
@inoa-jboliveira I opened a dedicated issue for that https://github.com/astral-sh/uv/issues/7186
Putting together some thoughts about semantics. This issue is about adding support for running arbitrary instructions specified in pyproject.toml. I deliberately use the term "instruction" to avoid using any of the other terms under consideration (command, tool, script, etc).
Lots of existing tools refer to them as "scripts".
npm and yarn have first class support for scripts. Users can define them in package.json.
composer also uses the same term. Users can define them in composer.json.
pdm takes inspiration from npm and also uses the term scripts. Users can define them in [tool.pdm.scripts].
rye follows suit. Custom scripts are defined in [tool.rye.scripts].
hatch also uses the term scripts, although they are tied to environments. Defined in [tool.hatch.envs.<env>.scripts].
It seems advantageous to just go with the term "scripts" because it is the de-facto standard. As noted by another user https://github.com/astral-sh/uv/issues/5903#issuecomment-2316413223, this would also reduce friction for users coming to uv from other package managers. That said, this approach has a major flaw because it overlaps with the concept of entry points defined in [project.scripts]. Entry points expose certain Python functions as executable scripts, but do not allow arbitrary commands. Furthermore, [project.scripts] has already been established in PEP-0621 as the official spec. So what about other terms?
Another option is to use the term "tasks".
pixi uses the term "tasks". Users can define them in the [tasks] table in pixi.toml.
bundler uses rake tasks. Although it resembles entry points, sh is supported.
grunt and gulp also use the term "tasks", although they are task runners, not package managers.
gradle uses the term "tasks", defined in build.gradle.
Another option is to call them "executables". dart uses this term in pubspec.yaml.
We could also use "commands", although I wasn't able to find existing tools which use this term.
After settling on a name, an obvious thing to do is to let users define instructions in the [tool.uv.<name>] table.
There are two options here:
uv run <instruction> (follows from npm run <script>)
uv invoke / uv command / uv task
PDM's documentation around user scripts is pretty evolved, with support for a bunch of features: cmd mode, shell mode, composite mode, and call (calling a function from a Python script, i.e. an entry point).
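A rough sketch of what these modes look like in PDM's format (loosely adapted from its documentation; the script names and commands are only examples):
[tool.pdm.scripts]
lint = "ruff check ."                                       # plain string: cmd mode
start = {cmd = "flask run -p 54321"}                        # explicit cmd mode
filter_error = {shell = "cat error.log | grep CRITICAL"}    # shell mode: pipes, && etc. are allowed
seed = {call = "myapp.seed:main"}                           # call mode: invoke a Python function (entry-point style)
all = {composite = ["lint", "start"]}                       # composite mode: run other scripts in sequence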
Rye has its own format, which is a subset of PDM features: chain multiple scripts one after the other, or call a function from a Python script (entry point).
I hope this serves as a starter for discussing additional details for this feature.
(Nice comment, thank you!)
One nice thing about PDM is that if a command is not recognized as a built-in, it is treated as pdm run. Thus, pdm foobar would be shorthand for pdm run foobar, which executes the command defined in [tool.pdm.scripts.foobar].
IMHO, pdm has the most extensive support for these scripts and I would personally like to see the same support in uv too. And if so, best to support the same pyproject section too :scream:. It's especially nice when one doesn't have to use any other tools like Makefiles (ugh).
Alas, one can argue that the pdm support for scripts is feature-creep for uv, in which case the rye model works too :)
Furthermore, [project.scripts] has already been established in PEP-0621, as the official spec. So what about other terms?
I wonder if it would be a good time to maybe standardise this? I don't know if a PEP is required, but since we have so many package managers for Python it would be nice if we could have one way to define these instructions.
I think "tasks" works fine as a name, but this is definitely a needed feature. Otherwise we need to fall back on a Makefile or custom Python/bash scripts, which gets needlessly clumsy.
I think "tasks" works fine as a name
For reference, VS Code uses the "tasks" nomenclature, and tasks are specified in JSON.
From https://code.visualstudio.com/docs/editor/tasks
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"tasks": [
{
"label": "Run tests",
"type": "shell",
"command": "./scripts/test.sh",
"windows": {
"command": ".\\scripts\\test.cmd"
},
"group": "test",
"presentation": {
"reveal": "always",
"panel": "new"
}
}
]
}
There are a lot of implementations, as others in the thread have pointed out. I am partial to PDM's approach, although I think it can be greatly simplified. For example, specifying the command type as cmd, call, or composite seems too granular, and this could just be abstracted away from the user. If given a string like uv run ruff check, assume a command. If given a list of strings, assume a composite command. If given a Python module path and function, assume a function call.
I think this tasks concept can be aided by the development dependency group concept and should be developed with it in mind. A task is usually a development-related action and always relies on dependencies, whether it is an OS built-in such as echo or a Python library like ruff, so having a way to specify them for a command (similar to the way dependencies can now be specified in comments for standalone Python scripts) should be strongly considered. A way to specify a dependency group, platform, and Python version for a named command could add a lot of value.
Supporting arbitrary shell syntax could be tricky because it is not portable and attempts to work around it are usually not DRY. Python directly solves the shell problem by being the cross platform scripting language that Bash and Batch are not, and I think that its use should be encouraged over the latter.
There is a related discussion here: https://discuss.python.org/t/a-new-pep-to-specify-dev-scripts-and-or-dev-scripts-providers-in-pyproject-toml/11457
Unsure if there are any formal PEPs on the topic yet.
This is a good example of what I see as good practice using poetry and poe:
To add to @Kludex's comment, I have been using poe for ages, and still do with uv. An example from one of my current projects:
[tool.poe.tasks]
pre.cmd = "pre-commit run --all-files"
pre.help = "Run pre-commit checks"
mypy.cmd = "mypy . --strict"
mypy.help = "Run mypy checks"
format.help = "Format code with Ruff"
format.cmd = "ruff format ."
ruff.help = "Run Ruff checks"
ruff.cmd = "ruff check --output-format=concise ."
test.help = "Run tests using Pytest"
test.cmd = "pytest"
"test:watch".cmd = "ptw . --now --clear"
"test:watch".help = "Run tests using Pytest in watch mode"
changelog.cmd = "github-changelog-md"
changelog.help = "Generate a changelog"
Other projects have more entries for mkdocs and other tools.
Poe is quite widely used; it seems that using [tool.uv.tasks] and similar task syntax would be elegant.
I would love to see built-in task management like poethepoet's; one feature I'd love to see would be first-class workspaces support similar to yarn's (https://yarnpkg.com/cli/workspaces/foreach), e.g. something like
[tool.uv.tasks]
check = "uv workspaces foreach check --topological"
build = "uv workspaces foreach build"
or maybe...
[tool.uv.tasks.check]
foreach_workspace = "check"
workspace_order = "topological"
[tool.uv.tasks.build]
foreach_workspace = "build"
workspace_order = "parallel"
[tool.uv.tasks]
check = ["_ruff", "_pyright"]
build = ["_generate_stuff", "_build_library"]
_ruff = "..."
_pyright = "..."
_generate_stuff = "..."
_build_library = "..."
This was a blocker when I tried to move from poetry & poe to uv today.
Here is what exists in my pyproject.toml:
[tool.poe.tasks]
git-hooks = { shell = "pre-commit install --install-hooks && pre-commit install --hook-type commit-msg" }
format = [
{cmd = "autoflake ."},
{cmd = "black ."},
{cmd = "isort ."},
]
lint = [
{cmd = "black --check ."},
{cmd = "isort --check-only ."},
{cmd = "flake8 ."},
]
test = [
{cmd = "pytest . -vv"},
]
test-cov = [
{cmd = "pytest --version"},
{cmd = "coverage run -m pytest ."},
{cmd = "coverage report --show-missing"},
{cmd = "coverage xml"},
]
build-doc-and-serve = [
{cmd = "mkdocs build"},
{cmd = "mkdocs serve"}
]
Hoping to see this feature added soon! Thanks
This was a blocker when I tried to move from poetry & poe to uv today.
Since poe is standalone and works very well with uv, it's not exactly a 'blocker' (poe is an extra dep even with Poetry). All those scripts will work fine inside your venv; I've recently moved 3 projects from Poetry to uv and everything just works.
It would be nice to have it supported without an extra dependency for sure, though until then just keep using poe 🤷♂️.
Ah very good point! Totally agreed, thanks for dust 🙏
Regarding Poe the Poet... I really like how their Poetry plugin allows it to hook into the builtin poetry commands. Their specific example is using a "pre_build" hook to "Optimise static assets for inclusion in the build", which sounds super useful.
I've been trying to adopt thx as a multi-version task runner.
The basic premise is that you configure jobs in pyproject.toml:
[tool.thx]
python_versions = ["3.10", "3.11", "3.12"]
requirements = ["requirements-dev.txt"]
[tool.thx.jobs]
pytest = "pytest --cov"
Then thx pytest mypy builds a venv per interpreter in parallel, installs the package with pinned requirements, and runs pytest for each interpreter.
It also has a --watch mode that stays running and repeats the operation shortly after anything changes.
I like that it is multi-version by default, parallel by default, and the configuration in pyproject.toml is very simple and flat.
For me, it's interesting to run tasks in parallel with a single command (e.g. uv run task start-development). For example, I want to run an MLflow server and execute a FastAPI project in development mode, and if I cancel it (Ctrl + C on the console), both are stopped.
$ uv run mlflow server --host 127.0.0.1 --port 5000
$ fastapi dev src/main.py
Another interesting thing: when I have a monorepo with N .NET projects, the IDE lets me choose, via the UI, which projects I want to run, without overwhelming me, but I don't know what the equivalent with uv would be.
For me, it's interesting to run tasks in parallel with a single command (e.g. uv run task start-development). For example, I want to run an MLflow server and execute a FastAPI project in development mode, and if I cancel it (Ctrl + C on the console), both are stopped.
@AndreuCodina I once had a similar requirement, and I found that there are many things to consider, such as whether to terminate the other commands after one command is terminated, and whether to allow graceful termination before hard termination. So the scope might be too broad for uv.
If Node is installed in your environment, you can use npx concurrently "uv run mlflow server --host 127.0.0.1 --port 5000" "fastapi dev src/main.py". Honcho and Supervisor can also be considered. (I ended up reinventing the wheel with a Python version of concurrently.)
I would love this
I haven't seen it mentioned, so I think it's worth asking: how would these scripts/tasks be handled with workspaces and nested pyproject.toml files?
Intuitively, I would think scripts/tasks would be run using the pyproject.toml in the closest parent directory. Scripts are always executed in the context of the directory with the pyproject.toml file.
For example, running uv run migrate in any subdirectory of workspace A runs the migrate command in workspace A's pyproject.toml. If no migrate script is found in A's pyproject.toml, uv run complains and errors out (even if there's a migrate command in the root pyproject.toml).
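A sketch of the resolution being described, assuming a hypothetical [tool.uv.tasks] table (the table name and commands are illustrative, not real uv features):
# packages/a/pyproject.toml -- the closest pyproject.toml to the current directory wins
[tool.uv.tasks]                         # hypothetical table
migrate = "python -m a.manage migrate"

# Under this proposal, running `uv run migrate` anywhere inside packages/a/ would use
# this definition, and would error out if it were missing -- even if the workspace
# root's pyproject.toml defined its own `migrate` task.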
NPM allows users to issue a command in the context of a workspace, for example running npm run test --workspace=a. I'm not sure how cargo handles this.
I'm not sure what the best way is, but I'm super glad the astral team has plans to add this feature. My team uses pdm's script section heavily and I don't want to give them whiplash going from pdm -> uv + poe -> uv.
Hi. I created a small, dependency-free tool to manage scripts directly from pyproject.toml: https://github.com/phihung/tomlscript
I came across this thread after implementing my own solution… Silly me. However, there are some differences in the API design between tomlscript and existing solutions, as well as features that may not make it into the official uv implementation.
I hope the following example can provide some useful ideas for the discussion.
[tool.tomlscript]
# Start dev server ==> this line is the documentation of the command
dev = "uv run uvicorn --port {port:5001} myapp:app --reload"
# Linter and test
test = """
uv run ruff check
uv run pytest --inline-snapshot=review
"""
# Generate pyi stubs (python function) ==> Execute python function
gen_pyi = "mypackage.typing:generate_pyi"
# Functions defined here can be reused across multiple commands
source = """
say_() {
echo $1
}
"""
alias tom="uv run tom" # alias tom="uvx tomlscript"
tom # list commands
tom dev --port 8000
tom gen_pyi --mode all # execute python function with arguments
FWIW, I tried to capture some of my wish list here - https://notes.strangemonad.com/Some+thoughts+on+python+workspaces. It's a bit broader in scope than just the task runner aspect but touches on how I'd like to see task runner commands, multi-project workspaces and consistent lifecycle commands work nicely together
Hi. I created a small, dependency-free tool to manage scripts directly from pyproject.toml: https://github.com/phihung/tomlscript
First of all, I have to say that I appreciate the effort, and the terse config style. 👍🏻 The syntax is perhaps a little too magical for my tastes, but it does look nice.
That said, I admire what I find to be a slightly terrifying YOLO implementation of _find_doc() and is_pyfunc(). 👀 Heuristics at their finest! Well done. 😁
I believe this is one of the biggest features that uv is missing that other projects do have.
Adding support for this would bridge a major part of the feature gap!
To add to @Kludex's comment, I have been using poe for ages, and still do with uv. An example from one of my current projects: [...] Poe is quite widely used; it seems that using [tool.uv.tasks] and similar task syntax would be elegant.
Thanks! I was using poethepoet for a while with poetry, didn't know I could use it in any pyproject.toml. This is enough for now
what's the downside of pure shell instead of task runners such as poe?
what's the downside of pure shell instead of task runners such as poe?
Using pure shell might need to activate the Python environment each time, like invoking source ./.venv/bin/activate each time opening the project, or having to add uv run before many commands like uv run mypy ., uv run pre-commit ..., which is frustrating.
And using task runners can also create shorthand for long commands. Like using pdm lint for ruff check --fix && ruff format.
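For reference, that shorthand in PDM's format is roughly the following (a minimal sketch; shell mode is needed for the &&):
[tool.pdm.scripts]
lint = {shell = "ruff check --fix && ruff format"}
pdm lint then expands to pdm run lint, which runs both commands.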
Indeed these can all be done with other approaches, but these tasks are often project-related and it makes more sense to store them as part of the project config. This also allows them to be shared with other collaborators.
Thank you for the comprehensive explanation!
It does help me understand the benefits of task runners.
We've been pondering the choice between pure shell and task runners in my new Python template repo. Unfortunately, I chose pure shell in the end. I'd like to share my perspective to offer others additional insights:
Using pure shell might need to activate the Python environment each time (like invoking source ./.venv/bin/activate) each time opening the project, or having to add uv run before many commands like uv run mypy ., uv run pre-commit ..., which is frustrating.
In local development, terminals opened in VS Code will automatically activate your venv after Python environment selection, so uv run and source ... are not needed in this scenario.
For remote environments like GitHub CI, or when we need to manually activate the environment, a single uv run bash yourshell.sh is sufficient.
And using task runners can also create shorthand for long commands. Like using pdm lint for ruff check --fix && ruff format.
Long commands can be auto-completed from your completion suggestions or command history with zsh plugins after running them once.
Multiple commands look cleaner, are easier to control, and can be extended or combined harmoniously in one shell script (like running lint or mypy before pytest). Writing dynamic execution logic in pyproject.toml, which is meant for static configuration, feels weird to me.
So in my opinion, using a task runner feels redundant.
@AtticusZeller Yes, that's true. There are always multiple approaches to make life easier.
What I mind is that these methods require you to set up your environment separately (such as configuring the IDE to recognize and automatically activate virtual environments, configuring shell auto-completion, etc.). Not everyone has the same configuration. Using a task runner can standardize this and make it independent of the environment.
This way, when I or anyone else checks out the repository in any environment, they only need to have uv installed (which only takes one command) to immediately get started with the development process (linting, testing, dev server, etc.), and don't have to copy-paste long commands first.
Because of these requirements of my use case, the task runner is more attractive to me.
@Xdynix Thank you for sharing your perspective!
I understand your point about standardization and ease of use. However, I'd like to offer a different view.
I believe this might be asking too much from uv. Here's why:
Basic development skills like environment management, IDE configuration, and shell operations should be fundamental knowledge for team development. These are not just Python-specific skills, but universal tools that benefit all development work.
These common development practices can easily solve the standardization problem: team documentation for environment setup, or standard shell scripts.
These universal tools and skills are valuable beyond just Python projects and can be applied to various development scenarios.
Instead of avoiding these basics and implementing task execution in pyproject.toml, wouldn't it be better to focus on building strong fundamental skills, or to keep tools focused on their primary purposes?
The current solution feels like over-packaging - binding script execution tightly with uv when these problems can be naturally solved through common development practices. While task runners are powerful, I believe keeping tools focused on their core responsibilities (like uv on package management) and addressing standardization through team practices is a more sustainable approach.
@AtticusZeller (Just some off-topic chitchat)
A single tool to replace pip, pip-tools, pipx, poetry, pyenv, twine, virtualenv, and more. -- https://docs.astral.sh/uv/
I thought that one selling point of uv was to solve all the needs of a Python environment in one stop, not just package management.
Basic development skills like environment management, IDE configuration, and shell operations should be fundamental knowledge for team development.
I totally agree. But I don't really want to force a particular toolbox on other people, especially when it's an informal collaboration, such as an open source project, a lab course, etc. Some people like VS Code, some like PyCharm, not to mention vim users (and notepad users?). And I'm too lazy to maintain a set of documentation for each common tool. Therefore I prefer to provide only minimal but sufficient developer-experience support.
what's the downside of pure shell instead of task runners such as poe?
Here's the kicker for me -- Windows support. I'm in a mixed team with about half the people on Windows / Visual Studio (the old one, not VS Code), half on macOS with PyCharm or VS Code, and some funny people using Vim/Neovim on WSL or Linux proper. Sure, I can whip out some Makefiles and bash scripts, but they will not work reliably for all developers, on all environments.
One way to tackle this is with policy and tooling. We could force our team to all use Unix. Or we could use Vagrant.
But a better solution, in my opinion, is Python. It's already portable by design. The standard library pathlib, shutil, and subprocess.run are a little more verbose than shell scripts, but not by too much. Writing our project admin tasks in portable Python and baking them into pyproject.toml or another task runner just works. We've started using nox to build our projects instead of Makefiles for this reason.
uv offers a unified experience across all platforms, combining the aforementioned tools in a single binary, which is amazing. But IMHO, a task runner should be language-agnostic, allowing to join forces for its development and maintenance, having the same features in all projects no matter the language, and avoiding the need to learn a new tool (for essentially the same thing) for each language. A popular and powerful cross-platform task runner is Task, and it is also available as a Python package (merely shipping the compiled Go binary) to ease its installation in Python projects.
Although it appears convenient to have yet another task runner integrated with uv, I'd vote against it for the mentioned reasons.
The current solution feels like over-packaging - binding script execution tightly with uv when these problems can be naturally solved through common development practices. While task runners are powerful, I believe keeping tools focused on their core responsibilities (like uv on package management) and addressing standardization through team practices is a more sustainable approach.
I absolutely do not agree that scripts or tasks would be "over-packaging".
Rye has scripts built in (and uv is the successor/replacement for rye), and it also ships pre-defined commands: rye fmt, rye lint, rye test (in addition, of course, to rye build and rye publish). This is similar to cargo's pre-defined build commands.
Poetry has a plugin system; the poethepoet plugin, for example, can hook into poetry build. And installing it is as simple as poetry self add 'poethepoet[poetry_plugin]'.
(See also the previous overview by @nikhilweee.)
In other words -- scripts or task runner functionality is not over-packaging. It's standard and expected functionality in every other modern workflow packaging tool for Python. In fact, if uv could not offer user-definable tasks, or a plugin system, it would be worse than every other modern packaging tool in this regard.
Rye has scripts built in (and uv is the successor/replacement for rye)... In fact, if uv could not offer user-definable tasks, or a plugin system, it would be worse than every other modern packaging tool in this regard.
I respectfully disagree with this perspective for several key reasons:
The prevalence of task runners in existing tools doesn't automatically justify their inclusion in new ones. This appears to fall into the "appeal to popularity" fallacy. Just because Rye, Hatch, PDM, and others implement task runners doesn't mean it's the optimal design choice, and a task runner absolutely does not make a tool "modern".
The Unix philosophy advocates for tools that do one thing and do it well. When we examine the practical value of built-in task runners, most "tasks" are simply command aliases that can be handled by shell scripts, and complex workflows often require proper build tools anyway.
Adding task running to uv would dilute its core responsibility of package and project management while providing minimal practical benefit.
...since Rye was intended to be a "cargo for Python"
This comparison overlooks crucial differences: Rust's ecosystem is built around a single, standardized toolchain, but Python's ecosystem is intentionally diverse, with different tools serving different needs, so forcing Cargo-like standardization onto Python goes against its philosophy of "we're all consenting adults here".
People choose and judge tools based on practicality and applicability, rather than opinions. These opinions carry little weight; even if you put it to a vote, very few people would refuse this feature.
Rust's ecosystem is built around a single, standardized toolchain, but Python's ecosystem is intentionally diverse, with different tools serving different needs, so forcing Cargo-like standardization onto Python goes against its philosophy of "we're all consenting adults here"
I'm excited about uv BECAUSE it's one tool to "rule them all" for common Python build needs. I'm so tired of picking five or more tools to lint/format/build/package/etc a new project, and learning how to make them play nicely with each other, all the while knowing that these tools will likely evolve independently and I'll need to re-figure this when I update them. In addition, everyone else will choose a DIFFERENT five tools, and if I want to contribute to their project I'll also have to learn how they do it. Please uv, save me from this!
I think running tasks is a common enough need that it would fit well into uv and be useful for most projects. Let me just learn the uv way to do it. Of course, if a project needs a more specialized way to run tasks, they can use something else, but solve the common case of running command aliases...
For those of us migrating over from Rye, one of its nice features is the built-in task runner using rye run and [tool.rye.scripts]. For example:
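(The original snippet isn't preserved in this excerpt; a minimal [tool.rye.scripts] table in Rye's documented format might look like the following, with illustrative names and commands.)
[tool.rye.scripts]
devserver = "flask run --app ./hello.py --debug"
test = "pytest"
These then run as rye run devserver / rye run test.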
It could have some more features - here is a selection of feature requests from the community:
A lot of these requested features are things that other 3rd party tools currently offer. I thought it might be useful to highlight a few other tools here, in particular because they also integrate with the pyproject.toml ecosystem and can be used with uv today.
Poe the Poet
https://github.com/nat-n/poethepoet
taskipy
https://github.com/taskipy/taskipy
Perhaps these can serve as some inspiration for a future uv run task runner and also, in the meantime, offer a solution for people coming over from Rye looking for a way to run tasks.