tweaks for CI and CD #9
Conversation
```python
name="cs50",
packages=["cs50"],
url="https://github.com/cs50/python-cs50",
version="1.3.0"
```
Any way to factor out version?
I thought we didn't need that anymore, since we're distributing via PyPI only, aren't we?
@dmalan ended up reading the version specified in `setup.py` this way, without needing to factor it out. Generally, I don't think we need to factor it out so long as we don't build multiple distributions (e.g., deb, rpm, sdist, etc.). Factoring it out (e.g., to a file) and parsing that from `setup.py` (or any number of other scripts/config files) is fairly easy, but it just feels unnecessary to me, tbh, especially since we decided to distribute Python packages via PyPI only? Thoughts?
Interesting. Is calling the var `__version__` conventional? This would seem to mean that `setup.py` needs to be installed along with, e.g., `cs50.py` itself. Is that indeed the norm with PyPI packages; does `setup.py` end up getting installed somewhere?

Separately, what's to be the relationship between the version number in `setup.py` and our GitHub repo's branches? Should we automatically (forcibly) create a GitHub tag and release called `v#.#.#` anytime there's a push, extracting the release's version number from `setup.py` too?
> Is calling the var `__version__` conventional?

Yes, see https://www.python.org/dev/peps/pep-0396/#specification:

> When a module (or package) includes a version number, the version SHOULD be available in the `__version__` attribute.
> This would seem to mean that `setup.py` needs to be installed along with, e.g., `cs50.py` itself. Is that indeed the norm with PyPI packages; does `setup.py` end up getting installed somewhere?
The `setup.py` script ends up being part of the source distribution (sdist) that we build and deploy to PyPI, since `pip` uses it to build the package for, and install it on, the target system. I don't think it typically gets installed (it's not in our case, either), but what does get installed is some metadata about the package, based on what we provided in `setup.py`, and that's what's used to extract the version number at runtime in this case.
> Separately, what's to be the relationship between the version number in `setup.py` and our GitHub repo's branches? Should we automatically (forcibly) create a GitHub tag and release called `v#.#.#` anytime there's a push, extracting the release's version number from `setup.py` too?
If you're referring to the problem per cs50/submit50#44, recall that PyPI doesn't allow the same filename (including version) to be uploaded more than once, so we can't re-upload the same version of a package to overwrite it, for example. (And frankly, that wouldn't be ideal anyway, since two users might have the same package version installed but, effectively, different packages.)
But it might be the case that we, for example, have multiple PRs pending, and we want some or all of them to be part of the next release. If we just go ahead and merge these PRs into master, one after the other, only the first one will be deployed to PyPI, and all the others will fail, because they use the same version number. So we can't do that, using the current setup, without merging these PRs into one, and then merging that big PR into master, which seems unnecessary.
For that reason, deployment itself should stay automated, but we should trigger it manually somehow. A couple of solutions might be:
- Deploy on tags instead of commits to master. This should allow us to merge as many PRs as we want before the next release is published. And once we're comfortable releasing, we just tag and push that tag, which should trigger a build and deploy to PyPI.
- Deploy on commits to a production branch (or the equivalent). This also should allow us to merge as many PRs as we want into master, without any problems. And once a release is ready, we merge master into production, which should trigger a build and deploy to PyPI.
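Option 1 could look roughly like this, a sketch assuming Travis CI is the service in use (account name and credentials are placeholders, not from this PR):

```yaml
# Hypothetical .travis.yml fragment: deploy to PyPI only when a tag is pushed
language: python
deploy:
  provider: pypi
  user: cs50                        # placeholder account name
  password:
    secure: "<encrypted-password>"  # travis encrypt output
  on:
    tags: true                      # ordinary commits to master build but don't deploy
```

Cutting a release would then be something like `git tag v1.3.1 && git push origin v1.3.1` (version number illustrative), which triggers the build and deploy.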
Merging for now to release this version!
Thanks!
@dmalan as is? Or maybe […] or so, to stay consistent with the descriptions of the other packages?