Use pre-commit + black #103

Merged: 7 commits, Aug 5, 2019
4 changes: 4 additions & 0 deletions .flake8
@@ -0,0 +1,4 @@
[flake8]
ignore = E203,W503
max-line-length = 159
exclude = .git,__pycache__
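E203 (whitespace before ':') and W503 (line break before a binary operator) are the two flake8 checks that disagree with black's output, so ignoring them is what lets both tools run over the same code without fighting. A minimal sketch of the combined check, assuming the repository's Python code lives under backend/, bot/ and tools/:

```shell
# black rewrites the code, then flake8 (reading the .flake8 file above) verifies style;
# with E203/W503 ignored the two tools should agree on the result.
black backend/ bot/ tools/
flake8
```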
6 changes: 6 additions & 0 deletions .isort.cfg
@@ -0,0 +1,6 @@
[settings]
known_first_party = code_coverage_backend,code_coverage_bot,code_coverage_tools,conftest
known_third_party = connexion,datadog,dateutil,fakeredis,flask,flask_cors,flask_talisman,google,hglib,jsone,jsonschema,libmozdata,logbook,pytest,pytz,redis,requests,responses,setuptools,structlog,taskcluster,werkzeug,zstandard
force_single_line = True
default_section=FIRSTPARTY
line_length=159
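force_single_line makes isort emit one import per line, and the known_* lists drive section grouping; seed-isort-config (wired up in the pre-commit config below) is what keeps known_third_party up to date. A rough sketch of a local run under isort 4.x, assuming it is invoked from the repository root:

```shell
# Regenerate the known_third_party list, then sort imports recursively
# with the single-line, 159-column style configured above.
seed-isort-config
isort -rc backend/ bot/ tools/
```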
46 changes: 46 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,46 @@
repos:
  - repo: https://github.com/asottile/seed-isort-config
    rev: v1.9.2
    hooks:
      - id: seed-isort-config
  - repo: https://github.com/pre-commit/mirrors-isort
    rev: v4.3.21
    hooks:
      - id: isort
  - repo: https://github.com/ambv/black
    rev: stable
    hooks:
      - id: black
  - repo: https://gitlab.com/pycqa/flake8
    rev: 3.7.8
    hooks:
      - id: flake8
        additional_dependencies:
          - 'flake8-coding==1.3.1'
          - 'flake8-copyright==0.2.2'
          - 'flake8-debugger==3.1.0'
          - 'flake8-mypy==17.8.0'
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v2.2.3
    hooks:
      - id: trailing-whitespace
      - id: check-yaml
      - id: mixed-line-ending
      - id: name-tests-test
        args: ['--django']
      - id: check-json
  - repo: https://github.com/codespell-project/codespell
    rev: v1.15.0
    hooks:
      - id: codespell
        args: ['--exclude-file=bot/tests/fixtures/activedata_chunk_to_tests.json']
  - repo: https://github.com/marco-c/taskcluster_yml_validator
    rev: v0.0.2
    hooks:
      - id: taskcluster_yml
  - repo: meta
    hooks:
      - id: check-useless-excludes

default_language_version:
  python: python3.7
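With this configuration in place the whole lint suite becomes a single pre-commit invocation. A likely local workflow (these are standard pre-commit commands, not something added by this PR):

```shell
pip install pre-commit   # once per environment
pre-commit install       # register the git hook so checks run on every commit
pre-commit run -a        # run every hook against the whole tree, as CI does
```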
32 changes: 6 additions & 26 deletions .taskcluster.yml
@@ -43,7 +43,7 @@ tasks:
 
       taskboot_image: "mozilla/taskboot:0.1.9"
     in:
-      - taskId: {$eval: as_slugid("bot_check_lint")}
+      - taskId: {$eval: as_slugid("check_lint")}
         provisionerId: aws-provisioner-v1
         workerType: github-worker
         created: {$fromNow: ''}
@@ -57,31 +57,11 @@ tasks:
             - "git clone --quiet ${repository} /src && cd /src && git checkout ${head_rev} -b checks &&
                cd /src/tools && python setup.py install &&
                cd /src/bot && pip install --quiet . && pip install --quiet -r requirements-dev.txt &&
-               flake8"
-        metadata:
-          name: "Code Coverage Bot checks: linting"
-          description: Check python code style with flake8
-          owner: [email protected]
-          source: https://github.com/mozilla/code-coverage
-
-      - taskId: {$eval: as_slugid("backend_check_lint")}
-        provisionerId: aws-provisioner-v1
-        workerType: github-worker
-        created: {$fromNow: ''}
-        deadline: {$fromNow: '1 hour'}
-        payload:
-          maxRunTime: 3600
-          image: python:3
-          command:
-            - sh
-            - -lxce
-            - "git clone --quiet ${repository} /src && cd /src && git checkout ${head_rev} -b checks &&
-               cd /src/tools && python setup.py install &&
-               cd /src/backend && pip install --quiet . && pip install --quiet -r requirements-dev.txt &&
-               flake8"
+               cd /src && pre-commit run -a"
         metadata:
-          name: "Code Coverage Backend checks: linting"
-          description: Check python code style with flake8
+          name: "Code Coverage checks: linting"
+          description: Check code style with pre-commit hooks
           owner: [email protected]
           source: https://github.com/mozilla/code-coverage
 
@@ -134,7 +114,7 @@ tasks:
         provisionerId: aws-provisioner-v1
         workerType: releng-svc
         dependencies:
-          - {$eval: as_slugid("backend_check_lint")}
+          - {$eval: as_slugid("check_lint")}
           - {$eval: as_slugid("backend_check_tests")}
         payload:
          capabilities:
@@ -175,7 +155,7 @@ tasks:
         provisionerId: aws-provisioner-v1
         workerType: releng-svc
         dependencies:
-          - {$eval: as_slugid("bot_check_lint")}
+          - {$eval: as_slugid("check_lint")}
           - {$eval: as_slugid("bot_check_tests")}
         payload:
          capabilities:
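The two flake8 tasks (bot and backend) collapse into a single check_lint task that runs every hook from the repository root. A rough local equivalent of that CI command, assuming a scratch clone path and that pre-commit is installed directly rather than through requirements-dev.txt:

```shell
git clone --quiet https://github.com/mozilla/code-coverage /tmp/code-coverage
cd /tmp/code-coverage
pip install --quiet pre-commit
pre-commit run -a   # mirrors the "cd /src && pre-commit run -a" step of the task
```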
7 changes: 3 additions & 4 deletions backend/README.md
@@ -12,20 +12,19 @@ We currently have several endpoints implemented:
 * `/v2/path` provides the code coverage information for a directory or file in a repository, at a given revision.
 
 
-## Setup instructions for developpers
+## Setup instructions for developers
 
 ```shell
 mkvirtualenv -p /usr/bin/python3 ccov-backend
 cd backend/
-pip install -r requirements.txt -r requirements-dev.txt
+pip install -r requirements.txt -r requirements-dev.txt
 pip install -e .
 ```
 
 You should now be able to run tests and linting:
 
 ```shell
 pytest
-flake8
+pre-commit run -a
 ```
 
 ## Run a redis instance through docker
54 changes: 32 additions & 22 deletions backend/code_coverage_backend/api.py
@@ -9,49 +9,46 @@
 from code_coverage_backend.config import COVERAGE_EXTENSIONS
 from code_coverage_backend.gcp import load_cache
 
-DEFAULT_REPOSITORY = 'mozilla-central'
+DEFAULT_REPOSITORY = "mozilla-central"
 logger = structlog.get_logger(__name__)
 
 
 def coverage_supported_extensions():
-    '''
+    """
     List all the file extensions we currently support
-    '''
+    """
     return COVERAGE_EXTENSIONS
 
 
 def coverage_latest(repository=DEFAULT_REPOSITORY):
-    '''
+    """
     List the last 10 reports available on the server
-    '''
+    """
     gcp = load_cache()
     if gcp is None:
-        logger.error('No GCP cache available')
+        logger.error("No GCP cache available")
         abort(500)
 
     try:
         return [
-            {
-                'revision': revision,
-                'push': push_id,
-            }
+            {"revision": revision, "push": push_id}
             for revision, push_id in gcp.list_reports(repository, 10)
         ]
     except Exception as e:
-        logger.warn('Failed to retrieve latest reports: {}'.format(e))
+        logger.warn("Failed to retrieve latest reports: {}".format(e))
         abort(404)
 
 
-def coverage_for_path(path='', changeset=None, repository=DEFAULT_REPOSITORY):
-    '''
+def coverage_for_path(path="", changeset=None, repository=DEFAULT_REPOSITORY):
+    """
     Aggregate coverage for a path, regardless of its type:
     * file, gives its coverage percent
     * directory, gives coverage percent for its direct sub elements
       files and folders (recursive average)
-    '''
+    """
     gcp = load_cache()
     if gcp is None:
-        logger.error('No GCP cache available')
+        logger.error("No GCP cache available")
         abort(500)
 
     try:
@@ -62,28 +59,41 @@ def coverage_for_path(path='', changeset=None, repository=DEFAULT_REPOSITORY):
             # Fallback to latest report
             changeset, _ = gcp.find_report(repository)
     except Exception as e:
-        logger.warn('Failed to retrieve report: {}'.format(e))
+        logger.warn("Failed to retrieve report: {}".format(e))
         abort(404)
 
     # Load tests data from GCP
     try:
         return gcp.get_coverage(repository, changeset, path)
     except Exception as e:
-        logger.warn('Failed to load coverage', repo=repository, changeset=changeset, path=path, error=str(e))
+        logger.warn(
+            "Failed to load coverage",
+            repo=repository,
+            changeset=changeset,
+            path=path,
+            error=str(e),
+        )
         abort(400)
 
 
-def coverage_history(repository=DEFAULT_REPOSITORY, path='', start=None, end=None):
-    '''
+def coverage_history(repository=DEFAULT_REPOSITORY, path="", start=None, end=None):
+    """
     List overall coverage from ingested reports over a period of time
-    '''
+    """
     gcp = load_cache()
     if gcp is None:
-        logger.error('No GCP cache available')
+        logger.error("No GCP cache available")
         abort(500)
 
     try:
         return gcp.get_history(repository, path=path, start=start, end=end)
     except Exception as e:
-        logger.warn('Failed to load history', repo=repository, path=path, start=start, end=end, error=str(e))
+        logger.warn(
+            "Failed to load history",
+            repo=repository,
+            path=path,
+            start=start,
+            end=end,
+            error=str(e),
+        )
         abort(400)
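The hunks in this file, and in the backend modules below, are the mechanical output of black (double quotes, """ docstrings, reflowed call sites) rather than behavioural changes. A hedged way to reproduce or verify that locally, assuming the same package directories as above:

```shell
# Either run the single hook through pre-commit...
pre-commit run black --all-files
# ...or call black directly; --check only reports files that would be reformatted.
black --check backend/ bot/ tools/
```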
18 changes: 8 additions & 10 deletions backend/code_coverage_backend/backend/__init__.py
@@ -19,27 +19,25 @@ def create_app():
     # Load secrets from Taskcluster
     taskcluster.auth()
     taskcluster.load_secrets(
-        os.environ.get('TASKCLUSTER_SECRET'),
+        os.environ.get("TASKCLUSTER_SECRET"),
         code_coverage_backend.config.PROJECT_NAME,
-        required=['GOOGLE_CLOUD_STORAGE', 'APP_CHANNEL'],
-        existing={
-            'REDIS_URL': os.environ.get('REDIS_URL', 'redis://localhost:6379')
-        }
+        required=["GOOGLE_CLOUD_STORAGE", "APP_CHANNEL"],
+        existing={"REDIS_URL": os.environ.get("REDIS_URL", "redis://localhost:6379")},
     )
 
     # Configure logger
     init_logger(
         code_coverage_backend.config.PROJECT_NAME,
-        PAPERTRAIL_HOST=taskcluster.secrets.get('PAPERTRAIL_HOST'),
-        PAPERTRAIL_PORT=taskcluster.secrets.get('PAPERTRAIL_PORT'),
-        SENTRY_DSN=taskcluster.secrets.get('SENTRY_DSN'),
+        PAPERTRAIL_HOST=taskcluster.secrets.get("PAPERTRAIL_HOST"),
+        PAPERTRAIL_PORT=taskcluster.secrets.get("PAPERTRAIL_PORT"),
+        SENTRY_DSN=taskcluster.secrets.get("SENTRY_DSN"),
     )
     logger = structlog.get_logger(__name__)
 
     app = build_flask_app(
         project_name=code_coverage_backend.config.PROJECT_NAME,
         app_name=code_coverage_backend.config.APP_NAME,
-        openapi=os.path.join(os.path.dirname(__file__), '../api.yml')
+        openapi=os.path.join(os.path.dirname(__file__), "../api.yml"),
     )
 
     # Setup datadog stats
@@ -49,6 +47,6 @@ def create_app():
     try:
         code_coverage_backend.gcp.load_cache()
     except Exception as e:
-        logger.warn('GCP cache warmup failed: {}'.format(e))
+        logger.warn("GCP cache warmup failed: {}".format(e))
 
     return app
46 changes: 24 additions & 22 deletions backend/code_coverage_backend/backend/build.py
@@ -24,7 +24,7 @@
 
 TALISMAN_CONFIG = dict(
     # on heroku force https redirect
-    force_https='DYNO' in os.environ,
+    force_https="DYNO" in os.environ,
     force_https_permanent=False,
     force_file_save=False,
     frame_options=flask_talisman.talisman.SAMEORIGIN,
@@ -34,12 +34,12 @@
     strict_transport_security_max_age=flask_talisman.talisman.ONE_YEAR_IN_SECS,
     strict_transport_security_include_subdomains=True,
     content_security_policy={
-        'default-src': '\'none\'',
+        "default-src": "'none'",
         # unsafe-inline is needed for the Swagger UI
-        'script-src': '\'self\' \'unsafe-inline\'',
-        'style-src': '\'self\' \'unsafe-inline\'',
-        'img-src': '\'self\'',
-        'connect-src': '\'self\'',
+        "script-src": "'self' 'unsafe-inline'",
+        "style-src": "'self' 'unsafe-inline'",
+        "img-src": "'self'",
+        "connect-src": "'self'",
     },
     content_security_policy_report_uri=None,
     content_security_policy_report_only=False,
@@ -50,23 +50,23 @@
 
 def handle_default_exceptions(e):
     error = {
-        'type': 'about:blank',
-        'title': str(e),
-        'status': getattr(e, 'code', 500),
-        'detail': getattr(e, 'description', str(e)),
-        'instance': 'about:blank',
+        "type": "about:blank",
+        "title": str(e),
+        "status": getattr(e, "code", 500),
+        "detail": getattr(e, "description", str(e)),
+        "instance": "about:blank",
     }
-    return flask.jsonify(error), error['status']
+    return flask.jsonify(error), error["status"]
 
 
 def build_flask_app(project_name, app_name, openapi):
-    '''
+    """
     Create a new Flask backend application
     app_name is the Python application name, used as Flask import_name
     project_name is a "nice" name, used to identify the application
-    '''
-    assert os.path.exists(openapi), 'Missing openapi file {}'.format(openapi)
-    logger.debug('Initializing', app=app_name, openapi=openapi)
+    """
+    assert os.path.exists(openapi), "Missing openapi file {}".format(openapi)
+    logger.debug("Initializing", app=app_name, openapi=openapi)
 
     # Start OpenAPI app
     app = connexion.App(import_name=app_name)
@@ -79,19 +79,21 @@ def build_flask_app(project_name, app_name, openapi):
 
     # Enable wildcard CORS
     cors = flask_cors.CORS()
-    cors.init_app(app.app, origins=['*'])
+    cors.init_app(app.app, origins=["*"])
 
     # Add exception Json renderer
     for code, exception in werkzeug.exceptions.default_exceptions.items():
         app.app.register_error_handler(exception, handle_default_exceptions)
 
     # Redirect root to API
-    app.add_url_rule('/', 'root', lambda: flask.redirect(app.options.openapi_console_ui_path))
+    app.add_url_rule(
+        "/", "root", lambda: flask.redirect(app.options.openapi_console_ui_path)
+    )
 
     # Dockerflow checks
-    app.add_url_rule('/__heartbeat__', view_func=heartbeat_response)
-    app.add_url_rule('/__lbheartbeat__', view_func=lbheartbeat_response)
-    app.add_url_rule('/__version__', view_func=get_version)
+    app.add_url_rule("/__heartbeat__", view_func=heartbeat_response)
+    app.add_url_rule("/__lbheartbeat__", view_func=lbheartbeat_response)
+    app.add_url_rule("/__version__", view_func=get_version)
 
-    logger.debug('Initialized', app=app.name)
+    logger.debug("Initialized", app=app.name)
     return app