This repo has all the information you need to complete the take-home assignment. Know that we are excited about you as a candidate, and can't wait to see what you build!
- Complete user stories 1 & 2 using the language and database of your choice
- NOTE: For the database, Postgres running as a Docker container is preferred. You can use the provided `docker-compose.yml` file as a starting point. To use it, simply:
  - Copy `.env.sample` to `.env` and set the values appropriately
  - Run the database with the command `docker-compose up -d`
- Provide clear documentation
- Any code you write is clear and well organized
- You spend at least 3-4 hours total on the project (but no more than 6-8 hours)
- BONUS: you provide tests
Implement a data ingestion pipeline that allows you to ingest the 4 CSV files into your database for use with your REST API (see user story number 2). Provide clear documentation on how to invoke your pipeline (e.g., run this script, invoke this Makefile target, etc.). Assume that the pipeline can be run on demand and it should drop any existing data and reload it from the files.
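For illustration, a minimal drop-and-reload pipeline could look like the sketch below. The script name, the choice of psycopg2, and the table names (mirroring the CSV files) are all assumptions, not requirements:

```python
# ingest.py - hypothetical drop-and-reload sketch, not a required design
import os

import psycopg2  # assumes psycopg2 (or psycopg2-binary) is installed

# Load parents before children so foreign keys resolve.
TABLES = ["locations", "equipment", "waybills", "events"]

def main():
    # Reuse the same connection URI the sample app reads (see below).
    conn = psycopg2.connect(os.environ["SQLALCHEMY_DATABASE_URI"])
    with conn, conn.cursor() as cur:
        # Drop any existing data; CASCADE also clears dependent rows.
        cur.execute("TRUNCATE TABLE {} CASCADE".format(", ".join(TABLES)))
        for table in TABLES:
            with open(f"data/{table}.csv") as f:
                # Assumes each table's columns line up with its CSV header.
                cur.copy_expert(f"COPY {table} FROM STDIN WITH CSV HEADER", f)

if __name__ == "__main__":
    main()
```

The documented invocation would then be as simple as `python ingest.py`, or a Makefile target wrapping it.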
Create an API server that features the following endpoints:
- `/equipment` - data from equipment.csv
- `/events` - data from events.csv
- `/locations` - data from locations.csv
- `/waybills` - data from waybills.csv
- `/waybills/{waybill id}` - should return information about a specific waybill
- `/waybills/{waybill id}/equipment` - should return the equipment associated with a specific waybill
- `/waybills/{waybill id}/events` - should return the events associated with a specific waybill
- `/waybills/{waybill id}/locations` - should return the locations associated with a specific waybill
All the routes should return JSON.
Any event route should allow for filtering by the `posting_date` field.
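As a sketch of one of these endpoints, here is roughly what `/events` with `posting_date` filtering could look like in Falcon. The `api.models` import path and the `Event` model are placeholders, not names from the sample project:

```python
import os

import falcon
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from api.models import Event  # hypothetical module holding the SQLAlchemy models

engine = create_engine(os.environ["SQLALCHEMY_DATABASE_URI"])
Session = sessionmaker(bind=engine)

class EventsResource:
    """GET /events, optionally filtered: /events?posting_date=2019-03-01"""

    def on_get(self, req, resp):
        query = Session().query(Event)
        posting_date = req.get_param("posting_date")  # None when absent
        if posting_date is not None:
            # Exact-match filter; a fuller version might accept date ranges.
            query = query.filter(Event.posting_date == posting_date)
        # resp.media is serialized to JSON, satisfying the JSON requirement.
        resp.media = [
            {"id": e.id, "waybill_id": e.waybill_id, "posting_date": str(e.posting_date)}
            for e in query
        ]

app = falcon.App()  # falcon.API() on Falcon 2.x
app.add_route("/events", EventsResource())
```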
Note: This user story is optional, and on an "if-you-have-time" basis.
Provide a `/waybills/{waybill id}/route` endpoint that returns information about the route associated with a specific waybill.
Note: This user story is optional, and on an "if-you-have-time" basis.
Provide a `/waybills/{waybill id}/parties` endpoint that returns information about the parties associated with a specific waybill.
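Since `route` and `parties` are stored as JSON arrays on each waybill row (see the data files below), both optional endpoints can share the same shape. A sketch, reusing the placeholder `Session` factory from the earlier example plus a placeholder `Waybill` model:

```python
import json

import falcon

class WaybillPartiesResource:
    """GET /waybills/{waybill_id}/parties - the /route endpoint is analogous."""

    def on_get(self, req, resp, waybill_id):
        waybill = Session().query(Waybill).get(waybill_id)
        if waybill is None:
            raise falcon.HTTPNotFound()
        # If the column is plain text, parse it; a JSONB column would already
        # come back as a Python list.
        parties = waybill.parties
        resp.media = json.loads(parties) if isinstance(parties, str) else parties

# Falcon's {waybill_id:int} converter binds the URI segment to the on_get
# keyword argument as an int:
# app.add_route("/waybills/{waybill_id:int}/parties", WaybillPartiesResource())
```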
In the `data/` directory are 4 files:

- `locations.csv` - a list of locations. The `id` field is the internal, autogenerated ID for each location.
- `equipment.csv` - a list of equipment (i.e., rail cars). The `id` field is the internal, autogenerated ID for each piece of equipment. The `equipment_id` field should be considered the primary key for creating relations to other files.
- `events.csv` - a list of tracking events. The `id` field is the internal, autogenerated ID for each tracking event. The field `waybill_id` is a foreign key to the waybills file. The field `location_id` is a foreign key to the locations file. The field `equipment_id` is a foreign key to the equipment file.
- `waybills.csv` - a list of waybills. A waybill is a list of goods being carried on a rail car. The `origin_id` and `destination_id` fields are foreign keys to the locations file. The field `equipment_id` is a foreign key to the equipment file. The `id` field is the internal, autogenerated ID for each waybill. The `route` and `parties` fields contain JSON arrays of objects. The `route` field details the rail stations (AKA "scacs") the train will pass through. The `parties` field defines the various companies involved in shipping the item from its origin to its destination (e.g., shippers, etc.).
NOTE: All dates are in UTC.
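If it helps, these relationships could be modeled in SQLAlchemy along the following lines; the column types and the `String` business key are guesses from the CSV descriptions, not a provided schema:

```python
from sqlalchemy import Column, DateTime, ForeignKey, Integer, String, Text
from sqlalchemy.orm import declarative_base  # SQLAlchemy 1.4+

Base = declarative_base()

class Location(Base):
    __tablename__ = "locations"
    id = Column(Integer, primary_key=True)  # internal, autogenerated ID

class Equipment(Base):
    __tablename__ = "equipment"
    id = Column(Integer, primary_key=True)
    equipment_id = Column(String, unique=True)  # business key other files relate to

class Waybill(Base):
    __tablename__ = "waybills"
    id = Column(Integer, primary_key=True)
    origin_id = Column(Integer, ForeignKey("locations.id"))
    destination_id = Column(Integer, ForeignKey("locations.id"))
    equipment_id = Column(String, ForeignKey("equipment.equipment_id"))
    route = Column(Text)    # JSON array of objects; JSONB would also work
    parties = Column(Text)  # JSON array of objects

class Event(Base):
    __tablename__ = "events"
    id = Column(Integer, primary_key=True)
    waybill_id = Column(Integer, ForeignKey("waybills.id"))
    location_id = Column(Integer, ForeignKey("locations.id"))
    equipment_id = Column(String, ForeignKey("equipment.equipment_id"))
    posting_date = Column(DateTime)  # all dates are UTC
```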
We have provided a sample REST API that you can finish implementing. Please note that using this sample project IS NOT REQUIRED. The sample features:
- Python 3.4+
- Postgres OR you can run the database via Docker and Docker-Compose using the provided `docker-compose.yml` file
- Falcon
- SQLAlchemy - database toolkit for Python
- Alembic - database migrations
The Falcon project scaffold is inspired by falcon-sqlalchemy-template
- Fork and clone this repo onto your own computer
- Start the database server, OR:
  - Copy `.env.sample` to `.env` and set the values appropriately
  - Run the database with the command `docker-compose up -d`
- Depending on the values you used in your `.env` file, set the `SQLALCHEMY_DATABASE_URI` environment variable to point to your database. For example, `export SQLALCHEMY_DATABASE_URI=postgresql://candidate:password123@localhost:5432/takehome`
- Change directory to the `webapp` directory and run `pip install -r requirements.txt` to install required dependencies
- In the same directory, run `gunicorn --reload api.wsgi:app` to run the web application
The API will be exposed locally at http://127.0.0.1:8000
Run `curl http://127.0.0.1:8000/health/ping` to test your server. It should return the following JSON:
{"ping": "true"}It is recommended you create a Python virtual environment for running your project
Again, using Alembic is NOT required - it is just provided in case you want to use it to work with the database.
Add new migrations with `alembic revision --autogenerate -m "migration name"`

Upgrade your database with `alembic upgrade head`
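For orientation, a migration file (generated or hand-written) in the Alembic `versions/` directory has roughly this shape; the `locations` table here is just an example:

```python
"""create locations table"""
import sqlalchemy as sa
from alembic import op

# Alembic fills these identifiers in when you run `alembic revision`.
revision = "0001"
down_revision = None

def upgrade():
    op.create_table(
        "locations",
        sa.Column("id", sa.Integer, primary_key=True),
        sa.Column("name", sa.String),  # hypothetical column
    )

def downgrade():
    op.drop_table("locations")
```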