
logging_strict

logging_strict@programming.dev

Great explanation of the most important difference


UNIX philosophy. One tool that does one thing well

Best to have a damn good reason for breaking this principle (e.g. vendoring), or be funded by Money McBags

Requirements files are requirements files, not venvs. They may be installed into a venv, but they are not venvs themselves. The only things a venv provides that are of interest to your requirements files are the relative folder path (e.g. ‘.venv’) and the Python interpreter path. Nothing more. When using tox, the Python version is hardcoded, so only the relative folder path needs to be provided.
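A minimal sketch of that point (paths and file names here are illustrative): once the venv exists, the requirements side only ever touches the interpreter path inside it.

    # create the venv once; afterwards only its relative path matters
    python -m venv .venv

    # install a requirements file through the venv's interpreter path --
    # no activation, no further venv "management" required
    .venv/bin/python -m pip install -r requirements/dev.txt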

The venv management tools we have are sufficient. The problem is not the venv, it’s managing the requirements files.

Your 1 tool sucks just as much as my 5 tools when it comes to managing requirements files. None of them do the job.


Within the context of resolving dependency conflicts, poetry decided pyproject.toml is a great place to put requirements.

This is what people know.

pyproject.toml or venv management should otherwise never come into the conversation.

My personal opinion is: venv, pip, pyenv, pip-tools, and tox are sufficient to manage venvs.
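A rough sketch of how that stack divides the work (version strings and file names are illustrative, not prescriptive):

    pyenv install 3.12                      # provide an interpreter (exact version string up to you)
    pyenv local 3.12
    python -m venv .venv                    # create the venv
    .venv/bin/python -m pip install pip-tools
    .venv/bin/pip-compile requirements/dev.in -o requirements/dev.txt   # lock
    .venv/bin/python -m pip install -r requirements/dev.txt             # install
    # tox then builds throwaway venvs per Python version for the test matrix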

venvs are not required to manage requirements files. They’re a convenience so dev tools are accessible.

Currently the options are: poetry or uv.

With honorable mention to pip-compile-multi, which locks dependencies.

poetry and uv manage venvs… Why?


That’s a loaded question. I’d like to avoid answering it at the moment. It would lead to a package release announcement, which this post is not, and which I’m not prepared to write right now.

Instead, here is an admittedly unsatisfactory response, which I apologize for.

I wish to keep the option to take that back later and give the straight, exact answer your question deserves.

My use case is your use case and everyone else’s use case.

Avoiding dependency hell while keeping things easily manageable. Breaking up complexity into the smallest pieces possible. And having a CLI tool to fix what’s fixable while reporting on what’s not.

My preference is to do this beforehand.


I was working under the assumption that everyone considered constraints (-c) to be a non-negotiable, required feature.

If you only have requirements (-r) in a centralized pyproject.toml, how do you tackle multiple specific dependency hell issues without causing a huge amount of interconnected clutter?
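For contrast, a sketch of the plain pip mechanics being assumed here (file names made up): requirements (-r) say what to install; constraints (-c) install nothing by themselves and only pin or cap versions if something else pulls those packages in.

    # pip accepts several constraint files alongside one requirements file
    pip install -r requirements/app.in -c pins/sphinx.pin -c pins/cffi.pin

    # a requirements (.in) file can also reference a constraints file on its own line:
    #   -c pins/sphinx.pin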


Woah! I was giving you the benefit of the doubt. You blow my mind.

The locking is very, very specific to apps and the dev environment.

But lacking constraints is like cutting off an arm.


My position is that it’s not messy enough.

Let’s start off by admitting what the goal is.

We all want to avoid dependency hell.

Our primary interest is not merely cleaning up the mess of requirements files.

Cleaning up the mess results in some unintended consequences:

  1. noise
  2. complexity
  3. confusion

noise

All the requirements information is in one place. Sounds great, until you want to tackle and document very specific issues.

Like when Sphinx dropped support for py39: myst-parser restricted the Sphinx upper-bound version, then fixed it in a commit, but did not create a release.

Or cffi, where every single commit just blows our mind, adding support for things we all want. So we want to set a lower-bound cffi version.

My point being: these are all specific issues and should be dealt with separately. And when one is no longer relevant, we know exactly what to remove. Zero noise.
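To make the zero-noise claim concrete, such a per-issue file might look like the following; the version numbers are placeholders, not the actual bounds for those incidents.

    # pins.constraints -- one documented issue per line; delete the line once the
    # upstream fix ships
    sphinx<8          # mirror myst-parser's Sphinx cap, pending a myst-parser release
    cffi>=1.17        # lower bound for the support we want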

complexity

When things go horribly wrong, the wrapper gets in the way. So now we have to deal with both the wrapper and the issue. So there is a learning curve, an extra API surface, and more required know-how.

The simple answer here is, do not do that.

confusion

When a dependency hell issue arises, we have to deal with it, and we find ourselves drawn to the poetry or uv documentation. The issue has nothing to do with either. But we look to them to see how others solve it, in the poetry or uv way.

The only know-how that should be needed is what’s in the pip docs.

What’s your suggestion?

I would prefer to deal with dependency hell before it happens. To do this, the requirements files are broken up, so they are easier to deal with.

Centralizing everything into pyproject.toml does the opposite.

Rather than dealing with the issue beforehand, we get to deal with it, good and hard, afterwards.
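For what that breakup might look like on disk, a hypothetical layout (names illustrative):

    requirements/
        prod.in             # direct runtime deps only
        dev.in              # -r prod.in  plus test/lint tooling
        docs.in             # the Sphinx stack
        pins.constraints    # one line per documented issue, removed when obsolete
        prod.txt dev.txt docs.txt    # locked outputs; regenerated, never hand-edited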


A package’s requirements are left unlocked

An app’s requirements are locked

This doesn’t excuse app devs if a requirements.in file is not provided

e.g. pip freeze > requirements.txt and forget

This produces a lock file, including indirect packages. The direct-package info is lost if a requirements.in is not provided.
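A side-by-side sketch of what gets lost (package names are only an example; pins elided):

    # requirements.in -- hand-maintained, direct dependencies only
    requests

    # requirements.txt -- generated lock file (e.g. pip-compile requirements.in)
    certifi==...              # via requests
    charset-normalizer==...   # via requests
    idna==...                 # via requests
    requests==...
    urllib3==...              # via requests

    # pip freeze > requirements.txt yields roughly the second list, without the
    # "via" notes -- with no .in file, direct vs indirect is anyone's guess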


Betteridge’s law of headlines

Nice catch, and thanks for the teaching moment.


To keep it simple

testing and static type checking – catch all the bugs

linting and formatters – so git diff isn’t pure noise showing trailing and unnecessary whitespace, and collaborators won’t have to go back and correct things that could have been automagically fixed.

in-code documentation – can be extracted by Sphinx as part of the documentation process. Hint: interrogate is your friend.

gh workflows – to have the test suite run against various Python versions, OSes, and maybe architectures (a matrix sketch follows this list). Without that, you can’t even be confident it runs well on your own machine, let alone anywhere else.

requirements.txt – is an output file. Where is requirements.in?

xz hacker sends his love

Makefile – for people who like a ton of shell scripts in their Python packages. Up until you realize that you know which Python interpreter is being run, but can’t have any level of confidence about the shell interpreter. It’s a big unknown, and unknowable. You just have to take it on faith.
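As referenced above, a sketch of such a workflow, assuming tox plus the stock checkout/setup-python actions and the tox-gh-actions plugin (action tags and the version matrix are whatever is current for you):

    name: tests
    on: [push, pull_request]
    jobs:
      test:
        strategy:
          matrix:
            os: [ubuntu-latest, macos-latest, windows-latest]
            python-version: ["3.9", "3.10", "3.11", "3.12"]
        runs-on: ${{ matrix.os }}
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-python@v5
            with:
              python-version: ${{ matrix.python-version }}
          - run: python -m pip install tox tox-gh-actions
          - run: tox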
