This repository was archived by the owner on Oct 25, 2021. It is now read-only.
Merged
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -2,7 +2,7 @@
     hooks:
       - id: yapf
         name: yapf
-        entry: bash ./bin/yapf.sh --all-in-place
+        entry: bash ./bin/_yapf.sh --all-in-place
         language: system
         files: \.py$
         require_serial: true
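The hook above fires only on staged paths matching the `files: \.py$` regex. A quick self-contained sketch of that filter using `grep -E`, which applies the same extended regex (the file names are hypothetical):

```shell
# pre-commit matches each staged path against `files: \.py$`;
# grep -E applies the same ERE, so only the .py paths survive.
printf '%s\n' train.py README.md utils/data.py docker/Dockerfile \
  | grep -E '\.py$'
```

This prints `train.py` and `utils/data.py`; the non-Python paths are filtered out before the hook's `entry` script ever runs.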
45 changes: 0 additions & 45 deletions .travis.yml

This file was deleted.

4 changes: 2 additions & 2 deletions Makefile
@@ -6,8 +6,8 @@ check-style:
 codestyle:
 	pre-commit run
 
-docker-build: ./requirements/requirements-docker.txt
-	docker build -t catalyst-segmentation:latest . -f docker/Dockerfile
+docker: ./requirements/
+	docker build -t catalyst-segmentation:latest . -f ./docker/Dockerfile
 
 clean:
 	rm -rf build/
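The renamed `docker` target takes the whole `./requirements/` directory as its prerequisite, so `make` reruns the recipe whenever the directory's timestamp is newer than the target. A minimal sketch of that dependency shape with a throwaway Makefile (the echoed message is a stand-in for the real `docker build`, and `make` is assumed to be on PATH):

```shell
# Build a tiny Makefile whose `docker` target depends on a directory,
# mirroring the rule above; since no file named `docker` exists,
# the recipe always runs.
workdir="$(mktemp -d)"
cd "$workdir"
mkdir requirements
printf 'docker: ./requirements/\n\t@echo building image\n' > Makefile
make -s docker    # prints "building image"
```

Because `docker` is never created as a file, this behaves like a phony target; declaring `.PHONY: docker` would make that explicit.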
4 changes: 2 additions & 2 deletions README.md
@@ -4,7 +4,7 @@
 
 **Accelerated DL & RL**
 
-[![Build Status](http://66.248.205.49:8111/app/rest/builds/buildType:id:Catalyst_Deploy/statusIcon.svg)](http://66.248.205.49:8111/project.html?projectId=Catalyst&tab=projectOverview&guest=1)
+[![Build Status](http://66.248.205.49:8111/app/rest/builds/buildType:id:Segmentation_Tests/statusIcon.svg)](http://66.248.205.49:8111/project.html?projectId=Segmentation&tab=projectOverview&guest=1)
 [![Pipi version](https://img.shields.io/pypi/v/catalyst.svg)](https://pypi.org/project/catalyst/)
 [![Docs](https://img.shields.io/badge/dynamic/json.svg?label=docs&url=https%3A%2F%2Fpypi.org%2Fpypi%2Fcatalyst%2Fjson&query=%24.info.version&colorB=brightgreen&prefix=v)](https://catalyst-team.github.io/catalyst/index.html)
 [![PyPI Status](https://pepy.tech/badge/catalyst)](https://pepy.tech/project/catalyst)
@@ -29,7 +29,7 @@ Part of [PyTorch Ecosystem](https://pytorch.org/ecosystem/). Part of [Catalyst E

 ---
 
-# Catalyst.Segmentation [![Build Status](https://travis-ci.com/catalyst-team/segmentation.svg?branch=master)](https://travis-ci.com/catalyst-team/segmentation) [![Github contributors](https://img.shields.io/github/contributors/catalyst-team/segmentation.svg?logo=github&logoColor=white)](https://github.com/catalyst-team/segmentation/graphs/contributors)
+# Catalyst.Segmentation [![Build Status](http://66.248.205.49:8111/app/rest/builds/buildType:id:Segmentation_Tests/statusIcon.svg)](http://66.248.205.49:8111/project.html?projectId=Segmentation&tab=projectOverview&guest=1) [![Github contributors](https://img.shields.io/github/contributors/catalyst-team/segmentation.svg?logo=github&logoColor=white)](https://github.com/catalyst-team/segmentation/graphs/contributors)
 
 You will learn how to build an image segmentation pipeline with transfer learning using the Catalyst framework.

28 changes: 20 additions & 8 deletions bin/_check_codestyle.sh
@@ -1,29 +1,41 @@
 #!/usr/bin/env bash
-set -e
+
+# Cause the script to exit if a single command fails
+set -eo pipefail -v
 
 # Parse -s flag which tells us that we should skip inplace yapf
 echo 'parse -s flag'
 skip_inplace=""
 while getopts ":s" flag; do
     case "${flag}" in
         s) skip_inplace="true" ;;
     esac
 done
 
+echo 'isort: `isort -rc --check-only --settings-path ./setup.cfg`'
+isort -rc --check-only --settings-path ./setup.cfg
+
 # stop the build if there are any unexpected flake8 issues
-bash ./bin/flake8.sh --count \
+echo 'flake8: `bash ./bin/_flake8.sh`'
+bash ./bin/_flake8.sh --count \
     --config=./setup.cfg \
-    --show-source --statistics
+    --show-source \
+    --statistics
 
 # exit-zero treats all errors as warnings.
-flake8 . --count --exit-zero \
-    --max-complexity=10 \
+echo 'flake8 (warnings): `flake8 .`'
+flake8 . --count \
     --config=./setup.cfg \
-    --statistics
+    --max-complexity=10 \
+    --show-source \
+    --statistics \
+    --exit-zero
 
 # test to make sure the code is yapf compliant
 if [[ -f ${skip_inplace} ]]; then
-    bash ./bin/yapf.sh --all
+    echo 'yapf: `bash ./bin/_yapf.sh --all`'
+    bash ./bin/_yapf.sh --all
 else
-    bash ./bin/yapf.sh --all-in-place
+    echo 'yapf: `bash ./bin/_yapf.sh --all-in-place`'
+    bash ./bin/_yapf.sh --all-in-place
 fi
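The `-s` handling above is the standard `getopts` pattern. Here is a self-contained sketch of the dispatch with yapf replaced by `echo` so it runs without any linters installed (`parse_mode` is our hypothetical wrapper, not a name from the script). Note that the script's own condition, `[[ -f ${skip_inplace} ]]`, tests for a *file* named `true`; `-n` (non-empty string) below expresses the apparent intent of "the flag was set":

```shell
# Sketch of bin/_check_codestyle.sh's -s dispatch; each branch echoes
# the yapf mode the real script would run.
parse_mode() {
  skip_inplace=""
  OPTIND=1
  while getopts ":s" flag "$@"; do
    case "${flag}" in
      s) skip_inplace="true" ;;
    esac
  done
  # -n (non-empty) rather than the original -f (file exists)
  if [ -n "${skip_inplace}" ]; then
    echo "yapf --all"             # -s given: check only
  else
    echo "yapf --all-in-place"    # default: rewrite files
  fi
}
```

Calling `parse_mode -s` prints `yapf --all`, while `parse_mode` with no flags prints `yapf --all-in-place`.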
13 changes: 13 additions & 0 deletions bin/_flake8.sh
@@ -0,0 +1,13 @@
+#!/usr/bin/env bash
+
+# Cause the script to exit if a single command fails
+set -eo pipefail -v
+
+# this stops git rev-parse from failing if we run this from the .git directory
+builtin cd "$(dirname "${BASH_SOURCE:-$0}")"
+
+ROOT="$(git rev-parse --show-toplevel)"
+builtin cd "$ROOT" || exit 1
+
+
+flake8 "$@"
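The `git rev-parse --show-toplevel` idiom above lets the wrapper be invoked from any subdirectory and still run flake8 from the repository root. A sketch of the same dance on a throwaway repository (requires `git`; all paths are temporary):

```shell
# Recreate the cd-to-repo-root idiom from bin/_flake8.sh inside a
# scratch git repo, starting from a subdirectory.
repo="$(mktemp -d)"
git -C "$repo" init -q
mkdir -p "$repo/bin"
cd "$repo/bin"                        # start below the root
ROOT="$(git rev-parse --show-toplevel)"
cd "$ROOT" || exit 1
ls -d bin                             # back at the root: bin/ is visible
```

Starting the script with `cd "$(dirname "${BASH_SOURCE:-$0}")"` first, as the file does, additionally makes the idiom work when the caller's working directory is outside the repo entirely.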
50 changes: 50 additions & 0 deletions bin/_yapf.sh
@@ -0,0 +1,50 @@
+#!/usr/bin/env bash
+
+# Cause the script to exit if a single command fails
+set -eo pipefail -v
+
+# this stops git rev-parse from failing if we run this from the .git directory
+builtin cd "$(dirname "${BASH_SOURCE:-$0}")"
+
+ROOT="$(git rev-parse --show-toplevel)"
+builtin cd "$ROOT" || exit 1
+
+
+YAPF_FLAGS=(
+    '--style' "$ROOT/setup.cfg"
+    '--recursive'
+    '--parallel'
+)
+
+YAPF_EXCLUDES=(
+    '--exclude' 'docker/*'
+)
+
+# Format specified files
+format() {
+    yapf --in-place "${YAPF_FLAGS[@]}" -- "$@"
+}
+
+# Format all files, and print the diff to stdout for travis.
+format_all() {
+    yapf --diff "${YAPF_FLAGS[@]}" "${YAPF_EXCLUDES[@]}" ./**/*.py
+}
+
+format_all_in_place() {
+    yapf --in-place "${YAPF_FLAGS[@]}" "${YAPF_EXCLUDES[@]}" ./**/*.py
+}
+
+# This flag formats individual files. --files *must* be the first command line
+# arg to use this option.
+if [[ "$1" == '--files' ]]; then
+    format "${@:2}"
+# If `--all` is passed, then any further arguments are ignored and the
+# entire python directory is formatted.
+elif [[ "$1" == '--all' ]]; then
+    format_all
+elif [[ "$1" == '--all-in-place' ]]; then
+    format_all_in_place
+else
+    # Format only the files that changed in last commit.
+    exit 1
+fi
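The `--files` / `--all` / `--all-in-place` mode selection above can be exercised in isolation. In this sketch the three `format*` actions are replaced by `echo` so no yapf install is needed (the `dispatch` name is ours, not the script's):

```shell
# Mirror bin/_yapf.sh's mode selection; each branch echoes which
# formatter function the real script would call.
dispatch() {
  if [ "$1" = '--files' ]; then
    shift
    echo "format $*"
  elif [ "$1" = '--all' ]; then
    echo "format_all"
  elif [ "$1" = '--all-in-place' ]; then
    echo "format_all_in_place"
  else
    return 1    # unknown mode, like the script's bare `exit 1`
  fi
}
```

As in the script, `--files` must come first because only `$1` is inspected; `dispatch a.py --files` would fall through to the failure branch.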
45 changes: 0 additions & 45 deletions bin/flake8.sh

This file was deleted.

4 changes: 2 additions & 2 deletions bin/tests/_check_binary.sh
@@ -35,7 +35,7 @@ print(aggregated_loss)
 print(iou_soft)
 print(iou_hard)
 
-assert aggregated_loss < 1.3
-assert iou_soft > 0.30
+assert aggregated_loss < 1.4
+assert iou_soft > 0.25
 assert iou_hard > 0.25
 """
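The relaxed bounds above are plain Python asserts inside the test's heredoc. If the same gate were needed in shell, `[` only compares integers, so a float comparison goes through awk; the thresholds are the ones from the diff, and the sample loss value is hypothetical:

```shell
# A loss of 1.35 passes the new bound (< 1.4) but would have failed
# the old one (< 1.3); awk evaluates the float comparison and exits
# 0 when it holds.
loss="1.35"
if awk "BEGIN { exit !($loss < 1.4) }"; then
  echo "within new bound"
fi
if awk "BEGIN { exit !($loss < 1.3) }"; then
  echo "within old bound"
fi
```

Only `within new bound` is printed, which illustrates why loosening the smoke-test thresholds lets a slightly noisier training run pass.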
96 changes: 0 additions & 96 deletions bin/yapf.sh

This file was deleted.

4 changes: 2 additions & 2 deletions requirements/requirements-dev.txt
@@ -1,6 +1,6 @@
 flake8==3.7.9
 # flake8-docstrings==1.5.0
 flake8-quotes==2.1.1
-yapf==0.28.0
-pre-commit==1.20.0
 isort==4.3.21
+pre-commit==1.20.0
+yapf==0.28.0
9 changes: 4 additions & 5 deletions scripts/index2color.py
@@ -18,10 +18,7 @@ def build_args(parser):
         help="Path to directory with dataset"
     )
     parser.add_argument(
-        "--out-labeling",
-        required=True,
-        type=str,
-        help="Path to output JSON"
+        "--out-labeling", required=True, type=str, help="Path to output JSON"
     )
     parser.add_argument(
         "--num-workers",
@@ -51,11 +48,13 @@ def main(args, _=None):
     with get_pool(args.num_workers) as pool:
         images = os.listdir(args.in_dir)
         colors = tqdm_parallel_imap(colors_in_image, images, pool)
-        unique_colors = functools.reduce(lambda s1, s2: s1 | s2, colors)
 
+    unique_colors = functools.reduce(lambda s1, s2: s1 | s2, colors)
+
     index2color = collections.OrderedDict([
         (index, color) for index, color in enumerate(sorted(unique_colors))
     ])
+
     print("Num classes: ", len(index2color))
 
     with open(args.out_labeling, "w") as fout:
7 changes: 7 additions & 0 deletions teamcity/binary.sh
@@ -0,0 +1,7 @@
+echo "pip install -r requirements/requirements.txt"
+pip install -r requirements/requirements.txt
+
+echo "bash ./bin/tests/_check_binary.sh"
+bash ./bin/tests/_check_binary.sh
+
+rm -rf ./data ./logs