Category Archives: AWS
Hello everyone, AWS Lambda is phasing out support for Python 3.7, which reached its end of life on June 27, 2023. To ensure the smooth operation of your functions, AWS strongly advises upgrading your Python 3.7 functions to Python 3.10 or Python 3.11 before November 27, 2023. AWS follows a two-stage process for ending support […]
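Before upgrading, you first need to know which functions are still on the old runtime. Below is a minimal sketch of how you might enumerate them with boto3; the helper names are ours, and the call requires credentials with `lambda:ListFunctions` permission.

```python
def find_py37_functions(function_configs):
    """Filter a list of Lambda function configurations down to
    the names of functions still on the python3.7 runtime."""
    return [f["FunctionName"] for f in function_configs
            if f.get("Runtime") == "python3.7"]

def list_affected_functions():
    # Live AWS call: paginate through every function in the account/region.
    import boto3
    client = boto3.client("lambda")
    configs = []
    for page in client.get_paginator("list_functions").paginate():
        configs.extend(page["Functions"])
    return find_py37_functions(configs)
```

Once you have the list, updating each function's `Runtime` to `python3.10` or `python3.11` (after testing) completes the migration.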
import pandas as pd
import boto3
from io import StringIO

data = [['Billu', 31], ['amit', 30], ['Mayank', 14], ['prabhat', 30]]
df = pd.DataFrame(data, columns=['Name', 'Age'])
df

# Credentials redacted -- prefer an IAM role or environment variables over hardcoded keys
ACCESS_KEY = "<your-access-key-id>"
SECRET_KEY = "<your-secret-access-key>"

def upload_s3(df):
    i = "test.csv"
    s3 = boto3.client("s3", aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)
    csv_buf = StringIO()
    df.to_csv(csv_buf, header=True, index=False)
    csv_buf.seek(0)
    s3.put_object(Bucket="test-deltafrog-bucket", Body=csv_buf.getvalue(), Key='2021/' + i)

###################### SNS ###############
sns = boto3.client('sns', aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY, region_name='us-east-1')
sns_topicname_arn = "arn:aws:sns:us-east-1:222161883511:s3_upload_notification"
# Publish the message […]
import pandas as pd
from pretty_html_table import build_table
import boto3
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Credentials redacted -- never commit real keys to source code
ACCESS_KEY = '<your-access-key-id>'
SECRET_KEY = '<your-secret-access-key>'
ses = boto3.client('ses', aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY, region_name='us-east-1')

def send_mail(df):
    body = build_table(df, 'blue_light')
    sender = "sumit8147085086@gmail.com"
    to = 'sumit8147085086@gmail.com'
    cc = 'sumit8147085086@gmail.com'
    rcpt = cc.split(",") + to.split(",")
    # rcpt = to.split(",")
    message = MIMEMultipart() […]
Hey, in this video we are going to learn about the AWS Step Functions service with a real-time scenario. We will trigger a Step Function from Lambda as soon as a file is dropped into an S3 bucket. We have configured two Lambdas in our Step Function's state machine: the first will truncate the Redshift table and the second will copy […]
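The trigger described above can be sketched as a small Lambda handler that starts a state machine execution whenever an S3 object lands. This is a sketch, not the code from the video: the state machine ARN is a placeholder and the input shape (`bucket`/`key`) is an assumption of ours.

```python
import json

def build_execution_input(event):
    """Turn the first record of an S3 put event into the JSON input
    the state machine will receive (assumed shape: bucket + key)."""
    record = event["Records"][0]
    return json.dumps({
        "bucket": record["s3"]["bucket"]["name"],
        "key": record["s3"]["object"]["key"],
    })

def lambda_handler(event, context):
    # Live AWS call; the ARN below is a placeholder, not a real state machine.
    import boto3
    sfn = boto3.client("stepfunctions")
    sfn.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:111111111111:stateMachine:my-machine",
        input=build_execution_input(event),
    )
```

Inside the state machine, each Lambda state then reads `bucket` and `key` from its input to locate the dropped file.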
sudo yum install -y aws-kinesis-agent
cd /etc/aws-kinesis/
sudo vi agent.json
sudo service aws-kinesis-agent start
sudo chkconfig aws-kinesis-agent on
python3 LogGenerator.py 1000
cd /var/log/aws-kinesis-agent/
tail -f aws-kinesis-agent.log

[ec2-user@ip-172-31-24-247 aws-kinesis]$ cat agent.json
{
  "cloudwatch.emitMetrics": true,
  "kinesis.endpoint": "",
  "firehose.endpoint": "",
  "flows": [
    {
      "filePattern": "/home/ec2-user/*.log*",
      "deliveryStream": "kinesis_log_s3"
    }
  ]
}

#########
import names
import random
import time
import […]
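The `LogGenerator.py` script above is truncated in this excerpt. A minimal sketch of what such a generator might look like is shown below; the log line format and the `NAMES` list are our assumptions (the original imports the `names` package), so treat this as illustrative only.

```python
import random
import time

# Stand-in for the `names` package used in the original script
NAMES = ["alice", "bob", "carol", "dave"]

def make_log_line():
    """Return one fake log line: 'timestamp user ACTION status=NNN'.
    The format is assumed, not taken from the original LogGenerator.py."""
    ts = time.strftime("%Y-%m-%d %H:%M:%S")
    return f"{ts} {random.choice(NAMES)} LOGIN status={random.choice([200, 401, 500])}"

def write_logs(path, count):
    """Append `count` fake log lines to `path`, so the Kinesis Agent
    (watching /home/ec2-user/*.log*) can pick them up."""
    with open(path, "a") as f:
        for _ in range(count):
            f.write(make_log_line() + "\n")
```

Running it with a count argument (as in `python3 LogGenerator.py 1000`) would simply call `write_logs` with that number.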
import json
import boto3
import pandas as pd
from datetime import datetime
import s3fs
from urllib.parse import unquote

def lambda_handler(event, context):
    now = datetime.now()
    date_time = now.strftime("%Y-%m-%d_%H-%M-%S")
    s3 = boto3.client('s3')
    bucket = 'deltafrog-training-dev'
    src_Key = 'real_time_data/'
    dest_file = "s3://deltafrog-training-dev/combined_data/anual_combined_"
    archive_Key = 'archive/'
    res = s3.list_objects(Bucket=bucket, Prefix=src_Key)
    print(res)
    fname_list = []
    final_df = pd.DataFrame()
    if "Contents" in res:
        for i in res["Contents"]:
            print(i)
            if "csv" in i['Key']:
                filename = i['Key'].split('/')[-1]
                print(filename)
                fname_list.append(filename) […]
I have created a video explaining the same. Below is the code used in the video tutorial.

######
import json
import boto3
from datetime import datetime
import psycopg2
from env import ENV
from settings import credential, REDSHIFT_ROLE, BUCKET

ENV = 'dev'
# Password redacted -- keep database secrets out of source code
credential = {
    'dbname': 'dev',
    'port': '5439',
    'user': 'awsuser',
    'password': '<your-redshift-password>',
    'host_url': 'redshift-cluster-1.cgymtibgpcfw.us-east-1.redshift.amazonaws.com'
} […]
In Lambda, when we write code in Python, we often need to import modules that are not built in. Because Lambda is a serverless service, we don't have access to the underlying server to run "pip install <module>". In this case we need to create a layer; for that we have to download the required module […]
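The packaging step can be sketched in Python. Lambda layers expect the libraries under a top-level `python/` folder inside the zip; the helper below (our own, not an AWS API) builds that layout from a directory you have already populated with `pip install <module> -t <dir>`.

```python
import zipfile
from pathlib import Path

def package_layer(module_dir, zip_path):
    """Zip `module_dir` into the layout Lambda layers expect:
    every file placed under a top-level 'python/' folder.
    Populate module_dir first, e.g.: pip install pandas -t module_dir
    """
    module_dir = Path(module_dir)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for file in module_dir.rglob("*"):
            if file.is_file():
                # arcname rewrites the path so it sits under python/
                zf.write(file, str(Path("python") / file.relative_to(module_dir)))
```

The resulting zip can then be uploaded as a layer (console or `aws lambda publish-layer-version`) and attached to any function that needs the module.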
Dear Reader, in this blog I will tell you how to capture S3 events using a Lambda function and send a status mail to the required recipients using SES. We will do it in three steps, so let's start. Step 1) Create an IAM role for the Lambda function so that it can call AWS services. Please […]
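The flow described above can be sketched as a small handler: parse the S3 event, then send a notification through SES. This is an illustrative sketch, not the blog's final code; the sender/recipient addresses are placeholders and must be SES-verified identities in your account.

```python
from urllib.parse import unquote_plus

def parse_s3_event(event):
    """Extract (bucket, key) pairs from an S3 event delivered to Lambda.
    Keys arrive URL-encoded (spaces become '+'), so decode them."""
    pairs = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])
        pairs.append((bucket, key))
    return pairs

def lambda_handler(event, context):
    # Live AWS call; both addresses below are placeholder SES identities.
    import boto3
    ses = boto3.client("ses", region_name="us-east-1")
    for bucket, key in parse_s3_event(event):
        ses.send_email(
            Source="sender@example.com",
            Destination={"ToAddresses": ["recipient@example.com"]},
            Message={
                "Subject": {"Data": f"New object in {bucket}"},
                "Body": {"Text": {"Data": f"Object uploaded: s3://{bucket}/{key}"}},
            },
        )
```

The IAM role from Step 1 needs at least `ses:SendEmail` plus the basic Lambda logging permissions for this to run.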
In this digital era, the days are gone when we used physical devices such as external hard drives and pen drives for storage. We should be thankful for cloud data storage services. Two of the biggest players in this market are Microsoft Azure and Amazon Web Services (AWS). We will compare both of the […]