Some performance ideas for S3
Updated May 10, 2018 - Python
Data Pipeline created from scraping the artsy.net website
AI_RAG is a document-based AI chatbot integrating AWS Bedrock and S3, featuring user/admin dashboards, enabling intelligent querying, document ingestion, and seamless cloud-powered retrieval-augmented generation (RAG).
A serverless pantry manager built with AWS Lambda and Streamlit that tracks item expiry dates and suggests recipes to reduce food waste.
S3 (through boto3) wrapped in a simple dict-like or list-like interface
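A minimal sketch of what such a dict-like interface over S3 could look like. The class and method names here are illustrative, not the repository's actual API; the underlying calls (`get_object`, `put_object`, `delete_object`, `list_objects_v2`) are real boto3 S3 client methods. The client is injected so the idea can be shown without AWS credentials — in real use it would be `boto3.client("s3")`.

```python
class S3Dict:
    """Expose one bucket's objects through a dict-like API (sketch)."""

    def __init__(self, client, bucket):
        self.client = client    # e.g. boto3.client("s3")
        self.bucket = bucket

    def __getitem__(self, key):
        # get_object returns the body as a file-like stream
        resp = self.client.get_object(Bucket=self.bucket, Key=key)
        return resp["Body"].read()

    def __setitem__(self, key, value):
        self.client.put_object(Bucket=self.bucket, Key=key, Body=value)

    def __delitem__(self, key):
        self.client.delete_object(Bucket=self.bucket, Key=key)

    def keys(self):
        # list_objects_v2 omits "Contents" entirely for an empty bucket
        resp = self.client.list_objects_v2(Bucket=self.bucket)
        return [obj["Key"] for obj in resp.get("Contents", [])]
```

With this shape, `d["report.csv"] = data` becomes a put, `d["report.csv"]` a get, and `del d["report.csv"]` a delete.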
This piece of code helps you get started with bots and AWS services (Lambda and S3)
A scraper to retrieve bus arrival information for storage in S3
A Python script used to enable MFA Delete options for AWS S3 Buckets. This script provides an easier way of enabling MFA Delete options by wrapping the boto3 module and using a much simpler API for common problems like: enabling MFA Delete for all S3 Buckets in an account.
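A sketch of the "enable MFA Delete for all buckets" pattern that script wraps. The function name is hypothetical; the boto3 calls (`list_buckets`, `put_bucket_versioning`) and the `MFA="<device-serial> <current-token>"` argument format are real, and MFA Delete additionally requires versioning to be enabled. The client is passed in so the loop can be exercised without AWS.

```python
def enable_mfa_delete_all(s3_client, mfa_serial, mfa_token):
    """Turn on versioning + MFA Delete for every bucket in the account (sketch)."""
    mfa = f"{mfa_serial} {mfa_token}"   # boto3 expects "serial token" in one string
    for bucket in s3_client.list_buckets()["Buckets"]:
        s3_client.put_bucket_versioning(
            Bucket=bucket["Name"],
            MFA=mfa,
            VersioningConfiguration={
                "Status": "Enabled",       # MFA Delete requires versioning
                "MFADelete": "Enabled",
            },
        )
```

Note that AWS only allows MFA Delete to be changed via the S3 API by the bucket owner's root credentials, which is part of why wrapping it in a script is convenient.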
Demo project showcasing a data pipeline built using AWS-native serverless technologies
End-to-end MLOps pipeline for predicting wine quality using ElasticNet regression. MLflow is hosted on an AWS EC2 instance for experiment tracking, with AWS S3 as the remote artifact store. The project includes reproducible pipelines using DVC, model evaluation, and cloud-based storage of metrics, parameters, and trained models.
This is a repo for a real-estate website and rent-management portal
An ETL pipeline that extracts data from S3, stages it in Redshift, and transforms it into a set of dimensional tables for the analytics team to keep finding insights into what songs their users are listening to. The database and ETL pipeline are then tested by running queries provided by the Sparkify analytics team and compa…
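The S3-to-Redshift staging step in pipelines like this is typically a `COPY` statement. A small sketch of building one for JSON log data; the table, bucket path, and IAM role below are placeholders, while the `COPY … FROM … IAM_ROLE … FORMAT AS JSON` syntax is standard Redshift.

```python
def build_copy_sql(table, s3_path, iam_role, json_paths="auto"):
    """Return a Redshift COPY statement loading JSON data from S3 (sketch)."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"FORMAT AS JSON '{json_paths}';"
    )

# Placeholder names for illustration only
sql = build_copy_sql(
    "staging_events",
    "s3://example-bucket/log_data",
    "arn:aws:iam::123456789012:role/redshift-s3-read",
)
```

The resulting string would then be executed against the cluster (e.g. via a psycopg2 cursor) before the dimensional-table inserts run.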
A FastAPI service for indexing S3 buckets into PostgreSQL
This Python script is an AWS Lambda function that reads HTML files from an S3 bucket and sends them as emails using Amazon SES. It extracts metadata from files, composes emails with attachments, sends them, and moves the files to different folders in the bucket based on the sending success.
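The "composes emails with attachments" step can be sketched with only the standard library, since SES's `send_raw_email` accepts an ordinary MIME message. Addresses, subject, and filename below are placeholders, not values from the script.

```python
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def build_email(sender, recipient, subject, html_body,
                attachment_name, attachment_bytes):
    """Build a MIME message with an HTML body and one file attachment (sketch)."""
    msg = MIMEMultipart()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject

    # HTML body read from the S3 object
    msg.attach(MIMEText(html_body, "html"))

    # Attach the original file alongside the body
    part = MIMEApplication(attachment_bytes)
    part.add_header("Content-Disposition", "attachment",
                    filename=attachment_name)
    msg.attach(part)
    return msg

# To actually send via SES (requires boto3 and credentials):
#   ses.send_raw_email(RawMessage={"Data": msg.as_string()})
```

After a successful send, the Lambda would copy the S3 object to a "sent" prefix and delete the original; on failure, to an "error" prefix.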
Docker runs a Flask app that builds services on AWS
A project that seeks to extract, clean, upload, and query retail data on sales, products, and customers.
A serverless AI blog generator using Amazon Bedrock (LLaMA 3), API Gateway, AWS Lambda, and Amazon S3, with a Streamlit frontend for user interaction.
Two client-worker applications that exchange and process numbers and images through an SQS queue.