A simple shell script to list users with access (in this case read access) to a GitHub repository
This project contains a shell script, list-user.sh, that uses the GitHub API to retrieve and display the list of users who have access to a specified GitHub repository. The script is designed to run on an AWS EC2 instance.
- Runs on an AWS EC2 instance, accessed over SSH.
- Leverages the GitHub API to retrieve user information.
- Filters collaborators based on read access (pull permission).
- Displays a list of users with read access.
No special installation is required beyond a shell environment (e.g., Bash), a tool for making API calls (e.g., curl or wget), and jq for parsing JSON responses.
- Make the script executable and install jq (used to parse the API's JSON responses):
  chmod +x list-user.sh
  sudo apt install jq -y
- Run the script:
  ./list-user.sh organization_name repository_name
- Replace organization_name with the organization (or user) that owns the repository.
- Replace repository_name with the name of the repository.
The script will output a list of usernames, one per line, representing the collaborators with read access to the queried repository.
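The core filtering step can be sketched as follows. This is a hedged illustration, not the exact contents of list-user.sh: the sample JSON mirrors the shape of the GitHub collaborators API response, and in the real script the data would come from an authenticated curl call (shown in the comment) rather than an inlined string.

```shell
# Hypothetical sketch of the filtering logic inside list-user.sh.
# The real script would fetch the JSON from the GitHub API, roughly:
#   curl -s -u "$username:$token" \
#     "https://api.github.com/repos/$org/$repo/collaborators"
# A sample response is inlined here so the sketch runs without credentials.
response='[
  {"login": "alice", "permissions": {"pull": true,  "push": false}},
  {"login": "bob",   "permissions": {"pull": false, "push": false}}
]'

# Keep only collaborators whose pull (read) permission is true,
# printing one login per line.
echo "$response" | jq -r '.[] | select(.permissions.pull == true) | .login'
```

With the sample data above, only alice is printed, since bob lacks the pull permission.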
This project automates the creation of AWS resources and triggers email notifications whenever new objects are uploaded to an S3 bucket.
- Automatically creates:
- IAM role with appropriate permissions
- S3 bucket
- Lambda function triggered by object creation
- SNS topic for email notifications
- Sends email notifications via SNS to a specified email address
- Written in Python and Bash for platform-agnostic deployment
- Python 3.8: Script execution and Lambda function
- Bash: Resource creation and script execution
- AWS CLI: Resource configuration and management
- boto3: AWS SDK for Python (used in Lambda function)
- AWS account
- AWS CLI installed and configured
- Python 3.8 and boto3 installed (a virtual environment is recommended)
- Clone the repository:
git clone https://github.com/your-username/your-repo.git
  cd your-repo
- Run the setup script:
  ./s3-notification-triggers.sh
- The script will prompt for the email address that should receive notifications.
- Upload files to the S3 bucket:
- Any new object upload will trigger the Lambda function and send an email notification.
- Creates the necessary AWS resources:
- IAM role with permissions for S3, Lambda, and SNS
- S3 bucket
- Lambda function (s3-lambda-function/s3-lambda-function.py)
- SNS topic for email notifications
- Email subscription to the SNS topic for the provided email address
- Uploads a sample file to the S3 bucket to demonstrate functionality
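The provisioning steps above could be sketched with the AWS CLI as below. Every resource name, ARN, account ID, and file path is a placeholder, not taken from the actual script. The run() wrapper only echoes each command, so the sketch is safe to execute without an AWS account; remove the wrapper to actually create the resources.

```shell
# Hypothetical sketch of the AWS CLI calls s3-notification-triggers.sh
# might make. All names, ARNs, and paths below are placeholders.
# run() echoes each command instead of executing it.
run() { echo "+ $*"; }

BUCKET="my-notification-bucket"
ROLE="s3-lambda-sns-role"
TOPIC_ARN="arn:aws:sns:us-east-1:123456789012:s3-upload-topic"
EMAIL="you@example.com"

# IAM role that the Lambda function will assume.
run aws iam create-role --role-name "$ROLE" \
    --assume-role-policy-document file://trust-policy.json

# S3 bucket whose uploads trigger the function.
run aws s3 mb "s3://$BUCKET"

# Lambda function packaged from s3-lambda-function.py.
run aws lambda create-function --function-name s3-lambda-function \
    --runtime python3.8 \
    --handler s3-lambda-function.lambda_handler \
    --zip-file fileb://s3-lambda-function.zip \
    --role "arn:aws:iam::123456789012:role/$ROLE"

# SNS topic plus an email subscription for the provided address.
run aws sns create-topic --name s3-upload-topic
run aws sns subscribe --topic-arn "$TOPIC_ARN" \
    --protocol email --notification-endpoint "$EMAIL"
```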
- Python script triggered by object creation in the S3 bucket
- Parses the event data to extract bucket name and object key
- Logs a message about the uploaded file
- Publishes an SNS message with file details to the configured SNS topic
- Returns a success response
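A minimal handler implementing the steps above might look like the following. This is a sketch under stated assumptions, not the actual s3-lambda-function.py: the topic ARN is a placeholder, and the optional sns_client parameter is added here purely so the sketch can be exercised without AWS credentials.

```python
import json

# Placeholder ARN; the real deployment would supply the topic ARN created
# by the setup script, e.g. via an environment variable.
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:s3-upload-topic"

def lambda_handler(event, context, sns_client=None):
    # Parse the S3 event to extract the bucket name and object key.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Log a message about the uploaded file.
    print(f"New object uploaded: s3://{bucket}/{key}")

    # Publish the file details to the configured SNS topic.
    if sns_client is None:
        import boto3  # AWS SDK for Python; only needed outside tests
        sns_client = boto3.client("sns")
    sns_client.publish(
        TopicArn=SNS_TOPIC_ARN,
        Subject="New S3 object uploaded",
        Message=json.dumps({"bucket": bucket, "key": key}),
    )

    # Return a success response.
    return {"statusCode": 200, "body": "Notification sent"}
```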
- Run the s3-notification-triggers.sh script to set up the resources.
- Upload files to the S3 bucket.
- The Lambda function will be triggered for each upload.
- The uploaded file's details will be published to the SNS topic.
- An email notification will be sent to the specified address.
