Jun-02-2018, 01:46 PM
I have spent days trying to find the best way to package my python project so that it is easy for other people to install. The more possible solutions I have found, the more confused I have become about which way is best. When it comes to packaging a python project properly, I'm a total beginner.
Below is the simplified structure of my project.
my_project
├── bin
│   └── my_program
├── lib
│   └── my_modules
│       └── sub_modules
├── files
│   └── some_files.txt
├── requirements.txt
└── setup.py
I currently have my setup script in setup.py. The idea is to install my_program into /usr/local/bin and my_modules into the python module path, so that my_modules can be imported globally by the user, and to put some configuration files in /etc/my_project. I managed to make it work, but it seems dirty. Below is my setup.py:
from setuptools import setup
from setuptools.command.install import install
import os

class UserInstall(install):
    def run(self):
        install.run(self)
        install_external = 'pip3 install -r requirements.txt'
        os.system(install_external)

setup(
    name='my_program',
    version='0.1.0',
    description='test',
    author='test',
    author_email='[email protected]',
    packages=['my_modules', 'my_modules.sub_modules'],
    package_dir={'': 'lib'},
    url='https://testtest.com',
    python_requires='>=3.4',
    data_files=[('/etc/my_program', ['files/some_files.txt'])],
    scripts=['bin/my_program'],
    cmdclass={'install': UserInstall},
)
My biggest problems have been with easy_install, which is why I use pip in the custom install command instead, even though I have a feeling I shouldn't be overriding the install function at all. I also currently have to run setup.py with root privileges, because otherwise some files can't be installed due to lack of permissions. I have considered keeping my_program's configuration files in the user's home directory instead of /etc, though.
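(For comparison, here is a sketch of the approach I understand to be more common: instead of shelling out to pip3 from a custom install command, declare the dependencies via install_requires so pip resolves them itself at install time. The helper below just parses requirements.txt into that list; the dependency names in the comment are illustrative, not my real ones.)

```python
# Sketch: declare dependencies in setup() instead of overriding install.
# pip then installs them automatically, with no subprocess and no need
# for a custom cmdclass.

def read_requirements(path='requirements.txt'):
    """Parse a pip-style requirements file into an install_requires list.

    Skips blank lines and comment lines starting with '#'.
    """
    with open(path) as f:
        return [line.strip() for line in f
                if line.strip() and not line.startswith('#')]

# In setup.py you would then write, instead of cmdclass={'install': ...}:
#
# setup(
#     name='my_program',
#     ...,
#     install_requires=read_requirements(),
# )
```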
So here's what I have considered:
Option 1:
Instead of putting my_program's configuration files under /etc, I would keep them in a directory in the user's home directory. I would also use the --user parameter when installing requirements with pip, so that the version-pinned requirements don't mess up the global python environment. I would install my_program into a directory in the user's home directory as well, and permanently add that directory to $PATH, so the user could run my_program from anywhere. What I don't know is whether my_modules should also be installed locally, or whether it could be global.
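(If the config files do move out of /etc, resolving a per-user config directory at runtime might look like the sketch below; the directory layout follows the common XDG convention, and the file name is just my example placeholder.)

```python
import os

def config_dir(app='my_program'):
    """Per-user configuration directory, honoring XDG_CONFIG_HOME if set.

    Falls back to ~/.config/<app>, the usual Linux convention.
    """
    base = os.environ.get('XDG_CONFIG_HOME',
                          os.path.join(os.path.expanduser('~'), '.config'))
    return os.path.join(base, app)

def config_path(name, app='my_program'):
    """Full path of a named config file, creating the directory if needed."""
    directory = config_dir(app)
    os.makedirs(directory, exist_ok=True)
    return os.path.join(directory, name)
```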
Option 2:
I keep everything global, so that any user on the system can run the program, and just focus on improving the setup.py script. I have a feeling this is not the best idea, though: installing the required external python modules, some of which are pinned to specific versions, could wreak havoc and break other python programs on the system.
Option 3:
I read about virtualenv. It seems like an easy way to keep specific versions of python modules separated. From what I understood, though, you would have to activate the virtualenv every time before running my_program, which is not exactly what I want.
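(One thing worth knowing here: activation is actually optional. Console scripts that setuptools generates inside a virtualenv get a shebang pointing at that venv's own interpreter, so they can be run, or symlinked onto $PATH, without activating anything. A sketch of the entry_points form this relies on; my_modules.cli:main is a hypothetical entry function, not something from my tree.)

```python
# Sketch: instead of scripts=['bin/my_program'], let setuptools generate
# the launcher script. Inside a virtualenv the generated script's shebang
# points at the venv's python, so no activation step is needed to run it.
# Assumes a hypothetical my_modules.cli module exposing a main() function.
ENTRY_POINTS = {
    'console_scripts': [
        'my_program = my_modules.cli:main',
    ],
}

# In setup.py:
#
# setup(
#     name='my_program',
#     ...,
#     entry_points=ENTRY_POINTS,
# )
```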
Currently I think option 1 is the one I should go for, but I would like to hear if there are better ways to package and install a python project. So far I have built the package with the sdist or bdist command, but I have also read that wheels are widely used nowadays. Is the wheel format necessary if I want to upload my project to PyPI, for example? Could it be turned into binaries, given that all of its external packages are pure python and my_program will only be used on Linux?