AWS Cross-Account S3-to-S3 File Push Using Lambda

CloudSatya By Satya
3 min read · Sep 9, 2020

As many clients move to the cloud along with all their source systems, a common use case is to send files from a source account to a destination account using Lambda.

Overview :

· Bucket creation in the source account and the destination account.

· Create a role in the destination account with S3 access (s3:PutObject).

· Create a role in the source account with Lambda execution and S3 access.

· Trust the source account role in the destination account role.

· Trust the destination account role in the source account role.

· Bucket policy changes in the destination account.

· Create a Lambda function with an S3 event trigger to copy files from the source to the destination S3 bucket.

Steps :

1. Create a bucket in the source account: rawbucket12

2. Create a destination bucket in the target account: destbucket12

3. Create a role s3tos3role in the source account with S3 access and Lambda execution access.
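A minimal sketch of the permissions policy s3tos3role needs, assuming the bucket names from steps 1 and 2 (the CloudWatch Logs permissions for Lambda execution can come from the AWS managed policy AWSLambdaBasicExecutionRole):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::rawbucket12",
        "arn:aws:s3:::rawbucket12/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::destbucket12/*"
    }
  ]
}
```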

4. Create a role dest_role in the destination account with S3 access.
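A minimal sketch of the permissions policy for dest_role, assuming the destination bucket name from step 2 and only the s3:PutObject access mentioned in the overview:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::destbucket12/*"
    }
  ]
}
```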

5. Add the source role (s3tos3role) in the Trust relationships tab of dest_role in the destination account.

6. Add the destination role (dest_role) in the Trust relationships tab of s3tos3role in the source account.

Reference policy :

(replace destination_account_id below with the 12-digit destination account ID)

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::destination_account_id:role/dest_role"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

7. Change the bucket policy of the destination bucket in the destination account. Here you need to reference the source role and the destination bucket.
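A sketch of what that bucket policy could look like, assuming the role and bucket names used above (replace source_account_id with the 12-digit source account ID; the exact actions you grant may differ):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::source_account_id:role/s3tos3role"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::destbucket12/*"
    }
  ]
}
```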

8. Go to the AWS Lambda console in the SOURCE ACCOUNT, click Create Function, and select s3tos3role as the execution role for your function.

9. Now add the trigger. This is an event-based trigger on the S3 bucket: when you put any file in the bucket, the Lambda function is triggered and the file is copied to the destination account.
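To see what the trigger actually hands to the function, here is a small helper (hypothetical, not part of the Lambda code below) that pulls the bucket and key out of a standard S3 event notification payload; the sample values are made up:

```python
def object_from_s3_event(event):
    """Return (bucket, key) for the first record in an S3 put event."""
    record = event['Records'][0]
    return (record['s3']['bucket']['name'],
            record['s3']['object']['key'])

# A trimmed-down example of the event an S3 put trigger delivers.
sample_event = {
    'Records': [{
        's3': {
            'bucket': {'name': 'rawbucket12'},
            'object': {'key': 'incoming/data.csv'},
        }
    }]
}

print(object_from_s3_event(sample_event))  # ('rawbucket12', 'incoming/data.csv')
```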

10. Add the code below to the Lambda function. (courtesy: Internet)

import boto3

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    bucket = s3.Bucket('rawbucket12')        # source bucket
    dest_bucket = s3.Bucket('destbucket12')  # destination bucket
    print(bucket)
    print(dest_bucket)

    # Copy every object in the source bucket to the destination bucket.
    for obj in bucket.objects.all():
        dest_key = obj.key
        print(dest_key)
        s3.Object(dest_bucket.name, dest_key).copy_from(
            CopySource={'Bucket': obj.bucket_name, 'Key': obj.key})

11. Now you can test and check the logs for errors. A common error is "access denied", which usually means the bucket policy is not set up properly.


CloudSatya By Satya

AWS Certified Solutions Architect and Data Analytics Specialty certified