10K Requests Per Second Handling Architecture
Hi,
Suppose I have a static server hosting the .html/.css/.js files for a static website. Alongside this, we have user data (images, JSON files) linked to signed-in users, and each user may access only the files they have created or modified.
To achieve this, I assume the following is a reasonable structure for serving the static content while applying logic that decides whether a user is authorized to access a given file.
Architecture:
All data is stored in an S3 bucket, with a server in front of the bucket that checks authorization for the requested file and allows or denies access for the logged-in user.
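One common way to build that front server is to have it authorize the request and then hand the client a short-lived signed URL, so the file bytes are served by object storage rather than proxied through your compute instance. Below is a minimal, generic sketch of that idea using only the standard library; the domain, the shared secret, and the "objects live under a per-user prefix" ownership convention are all assumptions. With Linode Object Storage you would use its S3-compatible presigned-URL API instead of hand-rolling the signature.

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"replace-with-real-secret"  # assumption: signing secret shared with the file server


def owns(user_id: str, key: str) -> bool:
    # Assumed convention: each object is stored under the owner's prefix,
    # e.g. "alice/photo.png" belongs to user "alice".
    return key.startswith(f"{user_id}/")


def signed_url(user_id: str, key: str, ttl: int = 300):
    """Authorize the request, then return a short-lived signed URL, or None to deny."""
    if not owns(user_id, key):
        return None  # deny: users may only access their own files
    expires = int(time.time()) + ttl
    sig = hmac.new(SECRET, f"{key}:{expires}".encode(), hashlib.sha256).hexdigest()
    query = urlencode({"expires": expires, "sig": sig})
    # files.example.com is a placeholder for wherever the objects are served from
    return f"https://files.example.com/{key}?{query}"
```

The payoff of this pattern is that the auth server only does a cheap ownership check and an HMAC per request, while the actual file transfer never touches it.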
Challenge:
A Linode S3 bucket can handle only 750 requests per second, which I think is too low for our app. I can optimize the request count, but it's still not enough.
How can I handle something like 10k requests per second using a Linode S3 bucket? Do I need to add more compute instances to handle the file-authorization checks? I'd probably need a load balancer on top of the compute instances, too.
In general, I'd like to know whether (S3 + authorization compute instances + load balancer) is the right solution when the app needs to serve 10k or more requests per second, or whether there is a better design for this case.
Thanks
1 Reply
The documentation suggests that the best way to increase your bucket's requests per second (RPS) is a Distributed Bucket Architecture: instead of placing multi-level file paths in a single bucket, spread a limited number of objects across multiple buckets.
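A simple way to spread objects evenly is to pick the bucket deterministically by hashing the object key, so every request for the same key always goes to the same bucket. A minimal sketch, where the bucket names and the four-bucket count are purely illustrative:

```python
import hashlib

# Hypothetical bucket names; in practice you would create as many buckets
# as you need to keep each one under its per-bucket RPS limit.
BUCKETS = ["app-data-00", "app-data-01", "app-data-02", "app-data-03"]


def bucket_for(key: str) -> str:
    """Deterministically map an object key to one of the buckets."""
    digest = hashlib.sha256(key.encode("utf-8")).digest()
    return BUCKETS[digest[0] % len(BUCKETS)]
```

With roughly uniform keys, each bucket then sees about 1/N of the total load, so four buckets at 750 RPS each would get you to about 3,000 RPS, and more buckets scale that further. Note that adding or removing buckets remaps existing keys, so the bucket count should be fixed up front (or a consistent-hashing scheme used instead).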
You can optimize your bucket's performance further by pairing this architecture with a CDN.