You are in for a real treat!
Here is how to step in and get the locals
This technique depends on there being only one return statement
https://logging-strict.readthedocs.io/en/stable/code/tech_niques/context_locals.html
Multiple return statements are unusual. In very rare situations, i understand. But the rule is: never do that.
When there is only one return statement, can step into the function to see the local variables.
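A minimal stdlib-only sketch of the idea (the linked logging-strict tech_niques module is the polished implementation; the names here are invented):

```python
import sys

def get_locals(func, *args, **kwargs):
    """Run func and capture its local variables at the single return."""
    captured = {}

    def tracer(frame, event, arg):
        # "call" fires when func's frame is created; returning tracer
        # opts in to per-frame events, including "return".
        if event == "return" and frame.f_code is func.__code__:
            captured.update(frame.f_locals)
        return tracer

    sys.settrace(tracer)
    try:
        ret = func(*args, **kwargs)
    finally:
        sys.settrace(None)
    return ret, captured

def example(x):
    y = x * 2
    return y

ret, local_vars = get_locals(example, 21)  # ret == 42; local_vars has x and y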
The sample code for lazy imports looks wrong
```python
STRIPE = None

def _stripe():
    global STRIPE
    if STRIPE is None:
        import stripe
        return stripe
    return STRIPE
```
STRIPE is never changed. And two return statements in the same function?!
Anyways can imagine how to do lazy imports without relying on the given code sample.
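A corrected sketch, keeping the caching intent with a single return statement (assuming the stripe package):

```python
STRIPE = None

def _stripe():
    """Import stripe on first use, cache it. Single return statement."""
    global STRIPE
    if STRIPE is None:
        import stripe

        STRIPE = stripe  # the assignment the original sample forgot
    return STRIPE
```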
so does zfs, so does wayland, so does trying out every distro, so does trying out every text editor and associated plugins, so does trying out ventoy, so does GrapheneOS, ...
Everything makes life easier, but comes down to,
Linux isn't free, it costs you your time
Which can be reframed, what do you really want to spend your time on?
If i really really really had to answer that overly honestly, want:
- my GUI apps non-blocking on heavy background processes
- distributing out tasks to many computers and runners
None of which screams or necessitates systemd or zfs or wayland or trying out every distro, every text editor every plugin, ventoy, or GrapheneOS.
Not in a house with a fox with a crow and a bow a knot on a cot or relaxed in the snow, i will not eat never ending random suggestions Sam, i will never eat them Sam i am.
Packaging seems to be a separate skill. Separate from coding. Lots of people are good at coding. Then hit the packaging roadblock.
Can step in and white knight, but too many projects are like that.
How to separate requirements handling and build backend?
wreck handles the requirements, then drain-swamp (+drain-swamp-action) is the build backend, which supports build plugins, including a built-in plugin for specifying the semantic version.
Prefer to tell the builder the desired version rather than to be told. Options: 'tag', 'current', or specify the version. Version bump is silly.
This way can set the semantic version before tagging a release.
Warning
This is hardcore. Not for noobs, the faint of heart, or wise guys. If you don't need build plugins then maybe drain-swamp is not for you. If you are doing something in setup.py that you can't live without, then might have a looksie at drain-swamp.
No, that’s… against community rules :) I don’t like the common use of venvs or .toml very much and I don’t like their use by other people and “timid” is also diplomatic. So you’re getting timid, and we get to get along and we can agree to disagree on the use of .venvs and we can wish each other a pleasant day.
Think you broke the Internet. That's brilliant /nosarc.
Want you to write my code of misconduct!
It’s literally 10x faster
reminds me of the ol' joke
young bull: let's run down the hill and get us a heifer
old bull: let's walk down and do 'em all
wtf is your rush?
It literally doesn't matter how long it takes. Especially when Astral has moved on.
It’s literally 10x faster. I’m not sure what kind of person wouldn’t care about that. On that, let’s agree to disagree.
Thru magic Astral has funding. I don't. So why bet that their magical situation will continue forever?
When Astral goes tits up, which we seem to agree on, and everyone is whining, crying, and screaming, at least there will be one or more fallbacks written in Python, maintainable by people without magical super powers.
I have no need for this kind of tool, because I don’t have version conflicts. Does this manage my dependencies in other ways?
Happily no. wreck attempts to do only one thing. If you don't have version conflicts in your requirements files then whatever you are doing, keep doing that.
No idea what .in is.
requirements-*.in files are placed in folders. So requirements/whatever.in --> requirements/whatever.lock and requirements/whatever.unlock
Are they still .txt or is there a new file standard for .lock and .unlock?
.txt is meaningless, or exceedingly broad. A text file, huh? Well that explains everything.
The standard is what works.
use of venvs
Containerization, especially for GUIs and apps, is better than depending on venvs. Until it's not. Then still need venvs.
The same argument can be made for supporting Windows and MacOS. Don't have these dev environments. But somehow found a way to support these platforms.
If you look into it, pyQt[x] and pySide[x] aren't all that different. The intent of PySide is to keep them for the most part compatible.
Don't have to manage everything, just what is being used.
Doing the wrong thing explains most of my packages:
wreck -- dependency management
drain-swamp with drain-swamp-action -- build backend with build plugins
logging-strict -- strictly validated logging configuration
pytest-logging-strict -- the same thing except a pytest plugin
What else am i not supposed to do?
You are right. I added it. Thank you
There is lots of complexity creep. And i'm one person with a finite lifespan. So had to decide what to spend time on.
systemd is ideal for those running servers. I'm publishing Python packages and wanted to keep focused on that.
If you wish to work for me for free (cuz i have zero access to labor or funding) to upgrade my tech infrastructure, i could be a useful person to know.
Especially if you believe strongly i should be running much better infrastructure.
Why have you been keeping this a secret?
I have. Wanted to see if anyone would find a published pypi.org package organically, w/o any marketing.
Surely with a trillion eye balls and the super powers of AI, it would be impossible to hide something in plain sight, Right? Especially on the most important topic in Python.
Now the question becomes, does the world+dog ignore federated social media? Is every coder required to have a blog?
Dependency management
Market research
This post is only about dependency management, not package management, not build backends.
You know about these:
- uv
- poetry
- pipenv
You are probably not familiar with:
- pip-compile-multi (toposort, pip-tools)
You are definitely unfamiliar with:
- wreck (pip-tools, pip-requirements-parser)
pip-compile-multi creates lock files. Has no concept of unlock files.
wreck produces both lock and unlock files. venv aware.
Both sync dependencies across requirement files
Both act only upon requirements files, not venv(s)
Up to speed with wreck
You are familiar with .in and .txt requirements files. .txt is split out into .lock and .unlock. The latter is for packages which are not apps.
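To make the split concrete, hypothetical contents for both files (package and versions invented):

```
# requirements/prod.lock -- exact pins; what an app ships with
attrs==23.2.0

# requirements/prod.unlock -- loose constraints; what a library declares
attrs>=23.2.0
```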
Create .in files that are interlinked with -r and -c. No editable builds. No urls. (If this is a deal breaker feel free to submit a PR)
pins files
pins-*.in files are for common constraints. The huge advantage here is to document why. Without the documentation, even the devs have no idea whether or not the constraint is still required.
pins-*.in files are split up so each tackles one issue. The beauty is the issue must be documented with enough detail to bring yourself up to speed.
Explain the origin of the issue in terms a 6 year old can understand.
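A hypothetical pins file to make this concrete (package, version, and reason invented for illustration):

```
# pins-myst-parser.in
# Why: myst-parser pins its docs-toolchain dependencies hard rather than
# allowing the latest and greatest. Constrain it in one documented place.
# Remove once myst-parser relaxes its pins.
myst-parser<4.0
```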
Configuration
python -m pip install wreck
This is logging-strict's pyproject.toml:
```
[tool.wreck]
create_pins_unlock = false

[[tool.wreck.venvs]]
venv_base_path = '.venv'
reqs = [
    'requirements/dev',
    'requirements/kit',
    'requirements/pip',
    'requirements/pip-tools',
    'requirements/prod',
    'requirements/manage',
    'requirements/mypy',
    'requirements/tox',
]

[[tool.wreck.venvs]]
venv_base_path = '.doc/.venv'
reqs = [
    'docs/requirements',
]

[project]
dynamic = [
    "optional-dependencies",
    "dependencies",
    "version",
]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements/prod.unlock"] }
optional-dependencies.pip = { file = ["requirements/pip.lock"] }
optional-dependencies.pip_tools = { file = ["requirements/pip-tools.lock"] }
optional-dependencies.dev = { file = ["requirements/dev.lock"] }
optional-dependencies.manage = { file = ["requirements/manage.lock"] }
optional-dependencies.docs = { file = ["docs/requirements.lock"] }
version = {attr = "logging_strict._version.version"}
```
Look how short and simple that is.
The only thing you have to unlearn is being so timid.
More venvs. More constraints and requirements complexity.
Do it
```
mkdir -p .venv || :
pyenv version > .venv/python-version
python -m venv .venv

mkdir -p .doc || :
echo "3.10.14" > .doc/python-version
cd .doc && python -m venv .venv; cd - &>/dev/null

. .venv/bin/activate
python -m pip install wreck

reqs fix --venv-relpath='.venv'
```
There will be no avoidable resolution conflicts.
Preferable to do this within tox-reqs.ini
Details
TOML file format expects paths to be single quoted. The paths are relative without the last file suffix.
If pyproject.toml is not in the cwd, pass --path='path to pyproject.toml'
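Combining the two flags, a hypothetical invocation from outside the project directory:

```
reqs fix --venv-relpath='.venv' --path='/path/to/project/pyproject.toml'
```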
create_pins_unlock = false tells wreck to not produce .unlock files for pins-*.in files.
DANGER
This is not for the faint of heart. Avoid it if you can. This is for the folks who often say, Oh really? Hold my beer!
For pins that span venvs, add the file suffix .shared, e.g. pins-typing.shared.in
wreck deals with one venv at a time. Files that span venvs have to be dealt with manually and carefully.
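For illustration, hypothetical contents of such a file (package and version invented):

```
# pins-typing.shared.in -- applies across venvs; one issue, documented.
# Why: keep the typing backport below the next major in every venv.
typing-extensions<5
```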
Issues
- no support for editable builds
- no url support
- no hashes
- your eyes will tire and your brains will splatter on the wall, from all the eye rolling after sifting thru endless posts on uv and poetry and none about pip-compile-multi or wreck
- Some folks love having all dependencies managed within pyproject.toml. These folks are deranged and it's impossible to convince them otherwise. pyproject.toml is a config file, not a database. It should be read only.
- a docs link on pypi.org is 404. Luckily there are two docs links. Should really just fix that, but it's left like that to see if anyone notices. No one did.
What do you do when there is/are unavoidable package dependency conflict(s)? <-- biggest question in Python
Often times, split out into multiple venvs.
For example, Sphinx requires py310+. My default is py39. myst-parser restricts dependencies rather than allowing the latest and greatest. So there is an unavoidable need for two venvs.
After setting up pyenv, how much setup is required, in pyproject.toml, to set up these two venvs?
Looking at poetry, it's focused on one venv. And it's annoying that it considers pyproject.toml to be a r/w file. I want to configure for an unlimited number of venvs.
The OP author is very familiar with uv, having written multiple articles on it. Pushes it onto his students. So i get it, the article is focused on uv and he is invested in uv. This assessment is a tiny bit unfair, but just enough to justify reading the article with a tiny grain of salt.
For package management, i'm happy with pyenv. So there i have a bias.
The biggest praise i have is, it follows the UNIX philosophy: do one thing and do it well. uv does multiple things; the issue comes down to the resources required to maintain a super complex thing. Especially in a completely different coding language! I DON'T GIVE TWO SHIATS IF IT'S FASTER, I care about maintainability after the original geniuses disappear, and they will.
dependency management
Any blog post which doesn't mention competitors, tiny grain of salt --> giant grain of salt.
If not mentioned, have to assume either don't know about them or haven't bothered to try them.
What are the actual competitors to uv (specifically for dependency management)?
The only package mentioned is: poetry
poetry also violates the UNIX philosophy. It combines build backend with dependency management. I want them separate.
Open up that super powerful doom pr0n machine, AI, and get it to find the other (dependency management packages). Mention them in the blog post.
until it isn't
multiprocessing humbles the plans of mortal men
(Initially thought you were being sarcastic.)
don't break up long functions
Some things are so complex they can only be understood in long functions. Break up the long functions and what's going on is lost.
Only have one example where the complexity is best left alone: the heart of a requirements management package. Where the whole is greater than the sum of its parts. Or the parts lose meaning if taken in isolation.
docstrings
surprised flake8+black even allows a comment as the docstring.
Prefer triple single quotes over triple double quotes. But black slapped me back into conformance. Of course i'm right, but evidently that's not good enough. So triple double quotes it is.
Slap me often and hard enough, i learn to conform.
Except for build backends and requirements management.
i ditched Ubuntu for Void Linux LXDE. Void Linux has runit rather than systemd
This predates snapd
Disclaimer: you have to set up the wifi and enable logind
Feedback on gh profile design
Author of wreck, pytest-logging-strict, sphinx-external-toc-strict, and drain-swamp - msftcangoblowm

Finally got around to creating a gh profile page
The design is to give activity insights on:
- what Issues/PRs working on
- future issues/PRs
- for fun, show off package mascots
All out of ideas. Any suggestions? How did you improve your github profile?
What's in a Python tarball
From helping other projects, have run across a fundamental issue for which web searches have not given appropriate answers.
What should go in a tarball and what should not?
Is it only the build files, python code, and package data and nothing else?
Should it include tests/ folder?
Should it include development and configuration files?
Have seven published packages which include almost all the files and folders. Including:
.gitignore,
.gitattributes,
.github folder tree,
docs/,
tests/,
Makefile,
all config files,
all tox files,
pre-commit config file
My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged. That the tarball is not appropriate for that.
Thoughts?
PEP 735: do dependency groups solve anything?
This PEP specifies a mechanism for storing package requirements in pyproject.toml files such that they are not included in any built distribution of the project.

PEP 735, what is its goal? Does it solve our dependency hell issue?
A deep dive, and out comes this limitation:
The mutual compatibility of Dependency Groups is not guaranteed.
-- https://peps.python.org/pep-0735/#lockfile-generation
Huh?! Why not?
mutual compatibility or go pound sand!
```
pip install -r requirements/dev.lock
pip install -r requirements/kit.lock -r requirements/manage.lock
```
The above code, purposefully, does not afford pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.
Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.
Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!
What if this is scaled further, instead of one package, a chain of packages?!
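A sketch of that brute-force check, assuming lock files live under requirements/ as above; pip's --dry-run flag (pip >= 22.2) is the judge, and the helper name is my own:

```python
import itertools
import subprocess
import sys
from pathlib import Path

def check_mutual_compatibility(reqs_dir="requirements"):
    """Try every combination of lock files; report the ones pip rejects."""
    locks = sorted(Path(reqs_dir).glob("*.lock"))
    for r in range(2, len(locks) + 1):
        for combo in itertools.combinations(locks, r):
            cmd = [sys.executable, "-m", "pip", "install", "--dry-run"]
            for lock in combo:
                cmd += ["-r", str(lock)]
            result = subprocess.run(cmd, capture_output=True, text=True)
            if result.returncode != 0:
                print("incompatible:", ", ".join(p.name for p in combo))

if __name__ == "__main__":
    check_mutual_compatibility()
```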
constraint vs requirement. What's the difference?
In a requirements-*.in file, at the top of the file, are lines with -c and -r flags followed by a requirements-*.in file. Uses relative paths (ignoring URLs).
Say have docs/requirements-pip-tools.in
```
-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in
...
```
The intent is compiling this would produce docs/requirements-pip-tools.txt
But there is confusion as to which flag to use. It's non-obvious.
constraint
Subset of requirements features. Intended to restrict package versions. Does not necessarily (might not) install the package!
Does not support:
- editable mode (-e)
- extras (e.g. coverage[toml])
Personal preference
- always organize requirements files in folder(s)
- don't prefix requirements files with requirements-, just doing it here
- DRY principle applies; split out constraints which are shared (a sketch follows)
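A sketch of that DRY split, with two .in files in different folders sharing one constraints file (file names and packages hypothetical):

```
# requirements/prod.in (hypothetical)
-c pins-base.in
strictyaml

# docs/requirements.in (hypothetical)
-c ../requirements/pins-base.in
sphinx
```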