Issue 12526: packaging.pypi.Crawler and resulting objects have a circular API

The issue, as best I can describe it, is in how a release list (packaging.pypi.dist.ReleaseList) looks up releases.

Here is a simple example using a random package on PyPI.

>>> crawler = Crawler()
>>> projects = crawler.search_projects('snimpy')
>>> projects
[<Project "snimpy">]
>>> project = projects[0]
>>> [x for x in project]
[]

The results show that project 'snimpy' has no releases, but this is incorrect: the 'snimpy' distribution actually has five releases on PyPI.

Even after calling fetch_releases and sort_releases on the project, both of which refer back to the crawler instance (see the project's _index attribute), the project still fails to get the releases.
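To make the circularity concrete, here is a minimal, self-contained model of the relationship. This is purely an illustration, not the real packaging.pypi code: MockCrawler and MockProject are hypothetical stand-ins, and the assumption being modeled is that the project's fetch_releases consults its _index back-reference but never forces a refresh, while only the crawler-side get_releases actually fills in the release list.

```python
class MockCrawler:
    """Illustrative stand-in for packaging.pypi.Crawler (not the real class)."""

    def __init__(self):
        # Pretend the remote index knows five releases of 'snimpy'.
        self._index_data = {'snimpy': ['0.5', '0.4', '0.3', '0.2.1', '0.2']}
        self._projects = {}  # one shared project object per name

    def _get_project(self, name):
        if name not in self._projects:
            self._projects[name] = MockProject(name, index=self)
        return self._projects[name]

    def search_projects(self, name):
        # Returns project objects that carry a back-reference to us.
        return [self._get_project(name)]

    def get_releases(self, name, force_update=False):
        project = self._get_project(name)
        if force_update:
            # Only the crawler-side call ever fills in the releases.
            project.releases = list(self._index_data.get(name, []))
        return project


class MockProject:
    """Illustrative stand-in for the project / ReleaseList object."""

    def __init__(self, name, index):
        self.name = name
        self._index = index   # circular back-reference to the crawler
        self.releases = []    # starts empty

    def fetch_releases(self):
        # Refers back to self._index but never forces an update,
        # so the list stays empty -- the behaviour reported above.
        return self.releases

    def __iter__(self):
        return iter(self.releases)


crawler = MockCrawler()
project = crawler.search_projects('snimpy')[0]
assert project.fetch_releases() == []           # empty, as reported
crawler.get_releases(project.name, force_update=True)
assert list(project) == ['0.5', '0.4', '0.3', '0.2.1', '0.2']
```

The point of the sketch is that the fix must live somewhere in this loop: the project already holds everything it needs (its name and its crawler) to refresh itself, yet its own API never does.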

>>> project.fetch_releases()
[]
>>> project.sort_releases()
>>> [x for x in project]
[]

In order to get the releases, one is forced to use the crawler's API rather than the resulting project's API.

>>> crawler.get_releases(project.name, force_update=True)
<Project "snimpy" versions: 0.5, 0.4, 0.3, 0.2.1, 0.2>
>>> [x for x in project]
[<snimpy 0.5>, <snimpy 0.4>, <snimpy 0.3>, <snimpy 0.2.1>, <snimpy 0.2>]

So as far as I can gather, we lack the ability to forcibly update the project (or ReleaseList). I don't have a solution at this time, but we may want to look into adding a force_update argument to the get_release method on the Crawler.
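One possible shape for such a fix, sketched on the same illustrative stand-ins as above (FixedProject and FixedCrawler are hypothetical, not the real packaging.pypi classes): give the project-side fetch_releases a force_update parameter that simply delegates to the crawler it already holds in _index, so callers never have to leave the project's API.

```python
class FixedProject:
    """Hypothetical project whose fetch_releases can force a refresh
    by delegating back to its crawler (self._index)."""

    def __init__(self, name, index):
        self.name = name
        self._index = index   # back-reference to the crawler
        self.releases = []

    def fetch_releases(self, force_update=False):
        # Proposed behaviour: delegate to the crawler, so the caller
        # can stay on the project's own API.
        if force_update or not self.releases:
            self._index.get_releases(self.name, force_update=True)
        return self.releases

    def __iter__(self):
        return iter(self.releases)


class FixedCrawler:
    """Minimal crawler stand-in that fills in the shared project."""

    def __init__(self, index_data):
        self._index_data = index_data
        self._projects = {}

    def search_projects(self, name):
        project = self._projects.setdefault(name, FixedProject(name, index=self))
        return [project]

    def get_releases(self, name, force_update=False):
        project = self._projects.setdefault(name, FixedProject(name, index=self))
        if force_update:
            project.releases = list(self._index_data.get(name, []))
        return project


crawler = FixedCrawler({'snimpy': ['0.5', '0.4', '0.3', '0.2.1', '0.2']})
project = crawler.search_projects('snimpy')[0]
project.fetch_releases(force_update=True)   # no detour through the crawler's API
assert list(project) == ['0.5', '0.4', '0.3', '0.2.1', '0.2']
```

Whether the argument belongs on the project method, the crawler method, or both is exactly the design question raised here; the sketch only shows that the delegation is mechanically straightforward.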

Thanks for the report. I noticed similar strangeness in the API when working on the documentation. Alexis is quite busy these weeks, but he will no doubt comment on this later.

I’m not sure the force argument is a good idea; I think we should ask ourselves what is the behavior that would be most intuitive for users, and implement that. If there are performance or caching issues, we’ll see.