This project is the first phase of my Hacker News crawler. Its main purpose is to automate the mundane task of browsing through Hacker News job postings and applying to them one by one.
```sh
git clone this-project
pip3 install scrapy
```

First, we crawl the data from Hacker News and extract the postings that are relevant to me.
```sh
scrapy runspider crawler.py -o ../data/output.json
```

Second, we clean the data and add weight to fields that correspond more closely to my skillset.
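The weighting pass could score each posting against a weighted skill list; a minimal sketch, where the skill names and weights are made up for illustration:

```python
# Sketch of scoring postings against a weighted skillset.
# The skills and weights below are illustrative, not the real ones.
SKILL_WEIGHTS = {
    "python": 3,
    "scrapy": 2,
    "remote": 1,
}

def score(posting: str) -> int:
    """Sum the weights of every skill mentioned in the posting."""
    text = posting.lower()
    return sum(w for skill, w in SKILL_WEIGHTS.items() if skill in text)

def rank(postings):
    """Return postings sorted best match first, dropping zero scores."""
    scored = [(score(p), p) for p in postings]
    return [p for s, p in sorted(scored, reverse=True) if s > 0]
```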
```sh
python3 main.py
```

In order to send email from your own account, you need to acquire an app password from your Google account:
- Go to "Manage your Google Account".
- Under "Signing in to Google", confirm that "2-Step Verification" is "On" for the account.
- Also under "Signing in to Google", select "App passwords".
- Select "Mail" as the app and "Other (Custom name)" as the device, then give it a name.
- Copy the app password; it appears in a yellow box and looks like "XXXX XXXX XXXX XXXX".
- Paste the app password and your email address into `email-credentials.json`.
```sh
cp email-credentials.json.dist email-credentials.json
```

Lastly, if a job posting includes an email address, we send our application, with resume attached, to the address extracted from the data.
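Sending the application boils down to SMTP with the app password; a sketch of what `applier.py` might do — the JSON key names and the resume filename are assumptions, not the project's actual layout:

```python
# Sketch: build and send the application email via Gmail SMTP.
# The JSON key names and resume filename are assumptions.
import json
import smtplib
from email.message import EmailMessage

def build_message(from_addr: str, to_addr: str, resume_bytes: bytes) -> EmailMessage:
    """Compose the application email with the resume attached."""
    msg = EmailMessage()
    msg["From"] = from_addr
    msg["To"] = to_addr
    msg["Subject"] = "Job application"
    msg.set_content("Hello, please find my resume attached.")
    msg.add_attachment(resume_bytes, maintype="application",
                       subtype="pdf", filename="resume.pdf")
    return msg

def send_application(to_addr: str) -> None:
    with open("email-credentials.json") as f:
        creds = json.load(f)  # assumed keys: "email", "app_password"
    with open("resume.pdf", "rb") as f:  # assumed resume location
        msg = build_message(creds["email"], to_addr, f.read())
    # Gmail accepts the 16-character app password over SMTPS.
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as smtp:
        smtp.login(creds["email"], creds["app_password"])
        smtp.send_message(msg)
```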
```sh
python3 applier.py
```

To contribute:

- Fork it
- Create your feature branch (`git checkout -b my-new-feature`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create a new Pull Request

