Use of libraries and python notebook

Hello,
In the appendix of Project 2 it is written that we should describe what needs to be installed and all the requirements to run our solution.
We have been working with Google Colab, which already has all the required packages installed.
Google Colab comes with many external libraries preinstalled, apparently at the latest versions of Python, TensorFlow, and the rest,
which is probably not the case for most computers. We struggled quite a bit to install all the packages on our own computers for this project,
and there is a possibility that, despite our efforts to write a good README explaining the installation, the setup will not work on a particular computer.
So we would like to know: is it possible to refer to the Google Colab notebook in our project for the execution part?
Or at least to provide the notebook as an alternative in case the assistants fail to install all the libraries?
We don't ask you to list all the libraries you have installed on your system/Colab.
Rather, we ask you for an easy way to install all the Python libraries that are required for your project (i.e. everything that is not in the standard library and that you import with import ...).
E.g., providing a Python requirements.txt (preferably with version numbers, in the project root) is useful, because then anyone who wishes to run your experiments and install the required packages only has to run pip install -r requirements.txt in their environment. Note that this file is not free-form; it follows a structure: https://note.nkmk.me/en/python-pip-install-requirements/
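For example, a minimal requirements.txt might look like the fragment below. The package names and versions are purely illustrative; pin whatever your own imports actually need (e.g. by starting from the output of pip freeze and trimming it down):

```
numpy==1.18.1
tensorflow==2.1.0
matplotlib==3.1.3
```

Each line is simply package==version; pip reads the file top to bottom.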
To test whether anyone can run your code:
- make a new virtual Python environment with virtualenv or conda
- install the packages in your requirements.txt file with pip install -r requirements.txt
- run the code as described in the README that you've written (as in Project 2's description)
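The three steps above can be sketched as a shell session like this (assuming bash and Python 3; the empty requirements.txt and the run.py entry point are placeholders standing in for your project's real files):

```shell
# 1. create a fresh, isolated environment (nothing inherited from system site-packages)
python3 -m venv checkenv
. checkenv/bin/activate

# 2. install exactly the pinned packages from the project root
#    (placeholder file here; your project ships the real one)
: > requirements.txt
pip install -r requirements.txt

# 3. run the code as the README describes (hypothetical entry point)
# python run.py
```

If this sequence succeeds on a machine that has never seen your project, a grader's machine is likely to succeed too.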
If your code must (also) be run on Google Colab, make that easy by providing a link to a Google Colab notebook.
Also, if your training procedure takes a long time, please include a way to use your trained model (auto-download, Google Drive, ...).
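One common way to provide the auto-download option is a small loader that fetches the parameter file only when it is missing locally. This is just a sketch: the filename, the URL, and the pickle format are assumptions, not part of the project.

```python
import pathlib
import pickle
import urllib.request

# Hypothetical names: adapt to your project's actual weight file and mirror.
WEIGHTS_PATH = pathlib.Path("model_params.pkl")
WEIGHTS_URL = "https://example.com/model_params.pkl"  # e.g. a direct download link

def load_params():
    """Return the trained parameters, downloading them once if absent."""
    if not WEIGHTS_PATH.exists():
        # first run: fetch the (large) file instead of shipping it in the archive
        urllib.request.urlretrieve(WEIGHTS_URL, WEIGHTS_PATH)
    with WEIGHTS_PATH.open("rb") as f:
        return pickle.load(f)
```

With something like this, a README line such as "the weights are downloaded automatically on first use" keeps the submitted archive small.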
Our models take a long time to train. Is it OK if we include the parameter files directly, or is it better to make them available online, to avoid an overly large archive file?