Copy data from S3 to Redshift using Lambda

Posted by Sumit Kumar

I have created a video explaining the same.

Below is the code used in the video tutorial:

######

import psycopg2

# ENV and the settings below can also live in a separate settings module,
# as shown in the video; they are defined inline here to keep the post self-contained.
ENV = 'dev'

# Connection settings for the demo cluster used in the video.
credential = {
    'dbname': 'dev',
    'port': '5439',
    'user': 'awsuser',
    'password': 'Deltafrog#123',
    'host_url': 'redshift-cluster-1.cgymtibgpcfw.us-east-1.redshift.amazonaws.com'
}

# IAM role attached to the Redshift cluster that allows it to read from S3.
REDSHIFT_ROLE = {
    'dev': 'arn:aws:iam::222161883511:role/copy_s3_redshift3'
}

BUCKET = {
    'dev': 's3://test-deltafrog-out'
}

def lambda_handler(event, context):
    conn_string = "dbname='{}' port='{}' user='{}' password='{}' host='{}'".format(
        credential['dbname'], credential['port'], credential['user'],
        credential['password'], credential['host_url'])
    con = psycopg2.connect(conn_string)
    cur = con.cursor()

    # With an S3 trigger, take the bucket and key from the event instead:
    # src_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    # filepath = event["Records"][0]["s3"]["object"]["key"]
    src_bucket = 'test-deltafrog-out'
    filepath = 'annual_final0910.csv'
    s3_file = 's3://' + src_bucket + '/' + filepath

    sql = """copy dev.public.annual
             from '{0}'
             iam_role '{1}'
             delimiter ','
             IGNOREHEADER as 1
             csv;""".format(s3_file, REDSHIFT_ROLE[ENV])
    print(sql)

    # Reload the table from scratch; commit so the truncate and copy persist.
    cur.execute("truncate table dev.public.annual")
    cur.execute(sql)
    con.commit()
    cur.close()
    con.close()

################
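To try the handler locally before wiring up an S3 trigger, you can call it with a hand-built event shaped like an S3 PUT notification. This snippet is my addition, not from the video; it assumes the code above is saved as lambda_function.py and reuses the demo bucket and file names:

######

# Local test: invoke the handler with a fake S3 object-created event.
# The event shape mirrors what S3 sends to Lambda on a PUT trigger.
from lambda_function import lambda_handler

test_event = {
    "Records": [
        {
            "s3": {
                "bucket": {"name": "test-deltafrog-out"},
                "object": {"key": "annual_final0910.csv"}
            }
        }
    ]
}

# context is not used by the handler, so None is fine for a local run.
lambda_handler(test_event, None)

######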
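If the COPY fails, Redshift records the rejected rows and the reason in the system table stl_load_errors. Here is a quick way to inspect recent errors, reusing the same connection details (again a sketch I have added, not part of the video):

######

import psycopg2

# Same demo credentials as above.
con = psycopg2.connect(
    dbname='dev', port='5439', user='awsuser',
    password='Deltafrog#123',
    host='redshift-cluster-1.cgymtibgpcfw.us-east-1.redshift.amazonaws.com'
)
cur = con.cursor()

# Most recent load errors first; err_reason explains why a row was rejected.
cur.execute("""
    select filename, line_number, colname, err_reason
    from stl_load_errors
    order by starttime desc
    limit 5;
""")
for row in cur.fetchall():
    print(row)
con.close()

######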

Thanks. Please let me know if you have any issues.

Posted in AWS.
