I’m not sure whether this is a gap in my understanding of how SQS queues work or a bug, but here is the issue:
I’m trying to use an SQS queue to run multiple Lambdas in parallel to save time. I followed the tutorial and ended up with two Jobs: one that saves entries in the database and adds messages with their IDs to SQS, and another that is triggered by SQS, takes the given ID, and does some processing.
Here are my two Jobs:
class DispatcherJob < ApplicationJob
  include Jets::AwsServices
  iam_policy 'sqs'

  def dispatch
    Jets.logger.info("Dispatched from job: #{event}.")
    queue_url = List.lookup(:waitlist_url)
    message_body = JSON.dump(event)
    sqs.send_message(queue_url:, message_body:)
  end
end
and
class CourierJob < ApplicationJob
  class_timeout 30
  depends_on :list
  sqs_event ref(:waitlist)

  def process
    result = ProcessEmail.new.call(event)
    if result.success?
      Jets.logger.info("Processed email #{result.success}.")
    else
      Jets.logger.error("Failed email #{result.failure}")
    end
  end
end
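Both jobs reference the same shared SQS resource. For completeness, mine is declared roughly like this (a minimal sketch of the shared resource file, assuming the standard Jets shared-resources setup):

# app/shared/resources/list.rb
class List < Jets::Stack
  # Defines the waitlist queue that DispatcherJob writes to and
  # CourierJob consumes via sqs_event ref(:waitlist).
  sqs_queue(:waitlist)
end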
Now I’m logging everything because I can’t find where those messages go missing between the first job and the second. When I send 6 objects to the dispatcher, only 5 objects get processed. With 10, I get 7 or 8 processed… I checked the database and everything seems correct; the records that were never logged are missing the data set in the processing step.
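For reference, I trigger the dispatcher once per saved record, roughly like this (simplified; the records collection is a stand-in for my actual code):

# Enqueue one async dispatch invocation per saved record.
records.each do |record|
  DispatcherJob.perform_later(:dispatch, id: record.id)
end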
Am I even using this as intended? Is there a better way, and what am I missing about SQS queues?
P.S. I tried adding rescue clauses with logging to everything, but no errors are raised.
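The rescued version of the processing method looked roughly like this (a sketch of what I tried):

def process
  result = ProcessEmail.new.call(event)
  if result.success?
    Jets.logger.info("Processed email #{result.success}.")
  else
    Jets.logger.error("Failed email #{result.failure}")
  end
rescue StandardError => e
  Jets.logger.error("CourierJob raised #{e.class}: #{e.message}")
  raise # re-raise so the failed message stays on the queue and SQS can redeliver it
end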