### Checklist

### What is the idea?

Introduce features in conda-build that allow for more dynamic versioning and dependency management within the `meta.yaml` file, similar to the capabilities that `pyproject.toml` offers with the setuptools build backend.
### Why is this needed?

Currently, `meta.yaml` requires manually specifying package versions and dependencies. This is cumbersome and error-prone, especially with frequent updates. A more dynamic approach would streamline the packaging process and reduce the risk of inconsistencies. Additionally, `pyproject.toml` is becoming the preferred way to define Python package metadata, while `setup.py` is increasingly considered legacy and harder to maintain.
### What should happen?

- **Dynamic versioning:** allow referencing a package attribute (e.g., `ds_utils.__version__`) within the `meta.yaml` file to set the version automatically.
- **Dynamic dependencies:** enable specifying a path to a requirements file (e.g., `requirements.txt`) in `meta.yaml` from which dependencies are read for both build and runtime environments.
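A partial workaround for the versioning half already exists today: conda-build's Jinja2 context exposes `load_file_regex`, which can pull a version string out of a source file at render time. A sketch of how that might look for this project (the file path and regex below are assumptions based on this repository's layout, not part of the proposal itself):

```yaml
# Workaround sketch using conda-build's existing load_file_regex Jinja2
# function; reads __version__ from the package source at render time.
{% set version_match = load_file_regex(
    load_file="ds_utils/__init__.py",
    regex_pattern='__version__\s*=\s*"(.+)"') %}
{% set version = version_match.group(1) %}

package:
  name: data-science-utils
  version: "{{ version }}"
```

There is, however, no comparable built-in way to load the `requirements:` lists from a `requirements.txt`, which is the second half of this request.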
### Additional Context

Current `meta.yaml`:

```yaml
package:
  name: data-science-utils
  version: "1.8.0"  # Replace with "{{ ds_utils.__version__ }}" for dynamic versioning

# Additional configurations

requirements:
  host:
    - python
    - pip
    - numpy>=1.26.3
    - scipy>=1.11.4
    - pandas>=2.1.4
    - matplotlib>=3.8.4
    - seaborn>=0.12.2
    - scikit-learn>=1.2.2
    - pydotplus>=2.0.2
    - joblib>=1.2.0
    - plotly>=5.24.1  # should read all requirements from file
  run:
    - python
    - numpy>=1.26.3
    - scipy>=1.11.4
    - pandas>=2.1.4
    - matplotlib>=3.8.4
    - seaborn>=0.12.2
    - scikit-learn>=1.2.2
    - pydotplus>=2.0.2
    - joblib>=1.2.0
    - plotly>=5.24.1  # should read all requirements from file

# ... rest of meta.yaml
```
For comparison, the equivalent dynamic metadata in `pyproject.toml` with setuptools (note that keys listed in `dynamic` must be resolved under `[tool.setuptools.dynamic]` rather than set statically in `[project]`):

```toml
[project]
name = "data_science_utils"
dynamic = ["version", "dependencies"]

[tool.setuptools.dynamic]
version = { attr = "ds_utils.__version__" }
dependencies = { file = ["requirements.txt"] }
```
- My build script uses a combination of shell commands and conda-build to build and upload packages for various Python versions and platforms.

Build script:

```bat
:: Extract __version__ from ds_utils\__init__.py and strip the surrounding quotes
for /F "tokens=2 delims== " %%i in ('findstr /R "__version__[^=]*=" ds_utils\__init__.py') do (
    set version=%%i
)
set version=%version:"=%

:: Pin the build to the currently installed numpy version
for /f "delims=" %%v in ('python -c "import numpy; print(numpy.__version__)"') do (
    set numpy_version=%%v
)

if not exist .\outputdir mkdir .\outputdir

:: Build the recipe for each supported Python version
for %%v in (3.9 3.10 3.11 3.12) do (
    call conda build --python %%v data-science-utils --numpy %numpy_version% --output-folder outputdir\
)
call conda build purge

:: Convert the win-64 packages to all other platforms
for %%v in (39 310 311 312) do (
    call conda convert -f --platform all outputdir\win-64\data-science-utils-%version%-py%%v_0.tar.bz2 -o outputdir\
)

:: Upload every converted package to Anaconda.org
anaconda login --username IdanMorad
for %%v in (39 310 311 312) do (
    for %%i in (linux-32 linux-64 linux-aarch64 linux-armv6l linux-armv7l linux-ppc64le osx-64 win-32 win-64) do (
        call anaconda upload outputdir/%%i/data-science-utils-%version%-py%%v_0.tar.bz2
    )
)
anaconda logout
```
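The fragile `findstr`-based version extraction above is exactly the kind of duplication the proposed feature would remove. For illustration, the same two lookups could be done in Python (which the script already shells out to for numpy); the helper names here are invented for this sketch, not an existing API:

```python
# Illustrative sketch only; function names are made up for this example.
import re
from pathlib import Path


def read_version(init_file):
    """Extract the value of __version__ = "x.y.z" from a module file."""
    text = Path(init_file).read_text(encoding="utf-8")
    match = re.search(r'__version__\s*=\s*["\']([^"\']+)["\']', text)
    if match is None:
        raise ValueError(f"no __version__ found in {init_file}")
    return match.group(1)


def read_requirements(req_file):
    """Return non-empty, non-comment lines from a requirements file."""
    lines = Path(req_file).read_text(encoding="utf-8").splitlines()
    return [line.strip() for line in lines
            if line.strip() and not line.strip().startswith("#")]
```

If conda-build resolved these values itself, the build script would only need to drive `conda build`, `conda convert`, and `anaconda upload`.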
anaconda logout
Project layout:

```
DataScienceUtils
├── data-science-utils
│   ├── bld.bat
│   ├── build.sh
│   └── meta.yaml
├── deploy-conda.bat
└── pyproject.toml
```