Amazon SQS: Working with large payloads (up to 2 GB) using Python

Ravi Intodia
Mar 29, 2024


Most AWS Python developers are familiar with the 256 KB limit on SQS message payloads. But what are the options when a message needs to carry more data than that?

→ Reimplement the same logic in Java using the Extended Client Library for Java

→ Save the large payload to S3 and pass a reference to the S3 object as the Amazon SQS message body

Sounds easy, but it is not that straightforward:

→ Having Python developers write Java code might not be the best use of their skills, and it raises concerns about the quality of the resulting code.

→ Writing additional logic to save and retrieve payloads from S3, handle all the error scenarios, and clean up afterwards is a significant overhead.

To address these problems, AWS introduced the Extended Client Library for Python in February 2024. It lets us send and process payloads of up to 2 GB from Python without any payload-management overhead.

The Extended Client Library for Python also uses S3 to relay large payloads, but it takes care of all the required operations, so developers can focus on implementing their core application logic.

The Extended Client Library for Python provides features like:

  1. Configuring whether S3 should be used only for messages larger than 256 KB or for all messages
  2. Automatically uploading the message body to S3 and referencing it in the SQS payload
  3. Retrieving and deleting the corresponding message body from S3
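The library is published on PyPI as `amazon-sqs-extended-client` (the module imported in the samples below is `sqs_extended_client`), so it can be installed alongside boto3:

```shell
# Install boto3 and the SQS extended client from PyPI
pip install boto3 amazon-sqs-extended-client
```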

Now let's go into the details of how to implement and use the library in Python code.

Sample 1: Code for sending large messages

import boto3
import sqs_extended_client

# Configure SQS Extended Client properties
sqs_extended_client = boto3.client("sqs", region_name="<region-name>")
sqs_extended_client.large_payload_support = "<bucket-name>"
sqs_extended_client.use_legacy_attribute = False

# Create a large message
sqs_payload = "<large message>"  # Size > 256 KB

# Send the large message
send_message_response = sqs_extended_client.send_message(
    QueueUrl="<SQS-URL>",
    MessageBody=sqs_payload
)
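To exercise the large-payload path locally, you need a body above the standard 256 KB limit. A minimal sketch for building and checking such a payload (the helper name `exceeds_sqs_limit` is ours, not part of the library):

```python
# Standard SQS message size limit, per the 256 KB figure above
SQS_LIMIT_BYTES = 256 * 1024

def exceeds_sqs_limit(payload: str) -> bool:
    """Return True if the UTF-8 encoded payload is larger than 256 KB."""
    return len(payload.encode("utf-8")) > SQS_LIMIT_BYTES

# A ~300 KB test payload that the extended client would offload to S3
large_payload = "x" * (300 * 1024)
print(exceeds_sqs_limit(large_payload))    # True
print(exceeds_sqs_limit("small message"))  # False
```

Any body for which this check returns True would be rejected by the plain SQS client, which is exactly where the extended client steps in.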

Sample 2: Code for retrieving large messages

import boto3
import sqs_extended_client

# Configure SQS Extended Client properties
sqs_extended_client = boto3.client("sqs", region_name="<region-name>")
sqs_extended_client.large_payload_support = "<bucket-name>"
sqs_extended_client.use_legacy_attribute = False

# Receive large messages
sqs_messages = sqs_extended_client.receive_message(
    QueueUrl="<SQS-URL>",
    MessageAttributeNames=['All']
)

original_message = sqs_messages['Messages'][0]['Body']
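Under the hood, an offloaded SQS message body is a small JSON pointer to the S3 object rather than the payload itself; the extended client resolves it transparently on receive, which is why the code above sees the original body. A sketch of parsing such a pointer manually (the field names follow the payload-offloading convention used by the Java extended client, so treat them as an assumption, not a documented contract):

```python
import json

# Illustrative example of what an offloaded message body may look like on the wire
pointer_body = json.dumps([
    "software.amazon.payloadoffloading.PayloadS3Pointer",
    {"s3BucketName": "<bucket-name>", "s3Key": "<uuid-key>"}
])

# Manually extracting the S3 location from the pointer
_, pointer = json.loads(pointer_body)
print(pointer["s3BucketName"])  # <bucket-name>
print(pointer["s3Key"])         # <uuid-key>
```

You never need this when using the extended client, but it is useful to know when inspecting raw queue contents in the console.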

Sample 3: Code for deleting large messages

import boto3
import sqs_extended_client

# Configure SQS Extended Client properties
sqs_extended_client = boto3.client("sqs", region_name="<region-name>")
sqs_extended_client.large_payload_support = "<bucket-name>"
sqs_extended_client.use_legacy_attribute = False
# Set to True to also delete the payload from S3
sqs_extended_client.delete_payload_from_s3 = True

# Receive large messages
sqs_messages = sqs_extended_client.receive_message(
    QueueUrl="<SQS-URL>",
    MessageAttributeNames=['All']
)

# Fetch the receipt handle to identify the message
receipt_handle = sqs_messages['Messages'][0]['ReceiptHandle']

# Delete the large message
delete_message_response = sqs_extended_client.delete_message(
    QueueUrl="<SQS-URL>",
    ReceiptHandle=receipt_handle
)

As the samples above show, you can work with large messages in Python without worrying about the overhead of moving large payloads between different application components.

Conclusion:

The addition of this extended client library for Python simplifies life for Python developers who work with large payloads that need to be processed by different decoupled components of an application.


Written by Ravi Intodia

Solution Architect working on designing and implementing AWS Cloud based solutions.
