How Statsig streams 1 trillion events a day

A behind-the-scenes look at all the operations in place to ensure that Statsig can safely and seamlessly process 1 trillion events every day.

Original article at statsig.com

Hasnain says:

Interesting stuff. This bit in particular was an idea I hadn’t heard of before.

“Even when running in a degraded mode, it is able to continuously and reliably ingest data. Because of this, our data ingestion layer (written in Rust) has been designed to have as few responsibilities as possible. Its sole job is to get data into some semi-persistent storage as quickly as possible, with several cascading storage options in the case of failures or outages.
Even in the event of an auth service outage, it will still record requests, but with a flag to authorize the request async in the processing layer.”
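To make the idea concrete, here is a minimal sketch (not Statsig's actual code) of that pattern in Rust: the ingest path's only job is to persist the raw event somewhere, cascading through storage options on failure, and if the auth service is down it still accepts the event but tags it so the processing layer can authorize it asynchronously. All names (Storage, ingest, auth_pending) are invented for illustration.

```rust
#[derive(Debug)]
struct IncomingEvent {
    api_key: String,
    payload: String,
}

#[derive(Debug)]
struct StoredRecord {
    payload: String,
    // Set when auth could not be verified synchronously; the processing
    // layer authorizes the request async later.
    auth_pending: bool,
}

// Hypothetical storage backends, ordered from preferred to last-resort.
trait Storage {
    fn name(&self) -> &'static str;
    fn write(&self, record: &StoredRecord) -> Result<(), String>;
}

struct Healthy(&'static str);
struct Failing(&'static str);

impl Storage for Healthy {
    fn name(&self) -> &'static str { self.0 }
    fn write(&self, _record: &StoredRecord) -> Result<(), String> { Ok(()) }
}

impl Storage for Failing {
    fn name(&self) -> &'static str { self.0 }
    fn write(&self, _record: &StoredRecord) -> Result<(), String> {
        Err(format!("{} unavailable", self.0))
    }
}

// Stand-in for a call to the auth service; returns None on outage.
fn try_authorize(_api_key: &str) -> Option<bool> {
    None // simulate an auth service outage
}

fn ingest(event: IncomingEvent, storages: &[Box<dyn Storage>]) -> Result<&'static str, String> {
    // If auth is unavailable, accept the event anyway and flag it for async auth.
    let auth_pending = try_authorize(&event.api_key).is_none();
    let record = StoredRecord { payload: event.payload, auth_pending };

    // Cascade through storage options until one accepts the write.
    for storage in storages {
        match storage.write(&record) {
            Ok(()) => return Ok(storage.name()),
            Err(e) => eprintln!("falling back: {}", e),
        }
    }
    Err("all storage options failed".to_string())
}

fn main() {
    let storages: Vec<Box<dyn Storage>> = vec![
        Box::new(Failing("primary queue")),
        Box::new(Healthy("local disk buffer")),
    ];
    let event = IncomingEvent {
        api_key: "client-key".to_string(),
        payload: "{\"event\":\"page_view\"}".to_string(),
    };
    match ingest(event, &storages) {
        Ok(name) => println!("persisted via {} (auth deferred)", name),
        Err(e) => eprintln!("ingest failed: {}", e),
    }
}
```

The design choice being highlighted is that availability of ingestion is decoupled from the availability of every downstream dependency: the only hard requirement is that some storage option accepts the write, and everything else (including authorization) can be reconciled later.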

Posted on 2024-10-12T05:40:55+0000