Architectural Suggestions re: Event Logging

I’m curious whether anyone has suggestions or experience with how to solve this.

I’m going to be sending a large amount of event data to Keen.io - about 100 events a second - using the server-side API. That part is pretty straightforward. I also want to log/store the events (which are small JSON packets) in a secondary data store so I can audit the data that was sent and make sure it is accurate. Reliability of the secondary data store is important.
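For concreteness, here’s roughly the shape of the dual write I have in mind (a rough Python sketch; the Keen call is how I understand their official `keen` client works, and `store_event` is a placeholder for whatever the secondary store ends up being):

```python
import json

import keen  # Keen IO's Python client

keen.project_id = "PROJECT_ID"
keen.write_key = "WRITE_KEY"


def record_event(collection, event):
    """Send the event to Keen and also persist a copy for auditing."""
    # Primary destination: Keen's server-side API.
    keen.add_event(collection, event)

    # Secondary destination: the durable store this thread is about.
    store_event(collection, json.dumps(event))


def store_event(collection, payload):
    # Placeholder -- database table, S3, Splunk, DynamoDB, ...
    raise NotImplementedError
```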

The site is running on Heroku, so I don’t have access to a durable filesystem.

I’ve been thinking about saving them in a database and periodically flushing them to S3, sending them to a custom Splunk source via their API, dropping the JSON into DynamoDB, and all manner of other things.
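For the first option, the flush step could look something like this (a sketch assuming the events are buffered in a Postgres table called `raw_events`, with psycopg2 for the database and boto3 for S3; all names are illustrative):

```python
import os
from datetime import datetime, timezone

import boto3
import psycopg2


def flush_events_to_s3(bucket="event-audit-bucket"):
    """Copy buffered events from Postgres to S3, then delete the copied rows."""
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn, conn.cursor() as cur:
            # Cast to text so this works whether payload is a text or jsonb column.
            cur.execute("SELECT id, payload::text FROM raw_events ORDER BY id LIMIT 10000")
            rows = cur.fetchall()
            if not rows:
                return

            # One JSON document per line, keyed by flush time.
            key = "events/%s.jsonl" % datetime.now(timezone.utc).isoformat()
            body = "\n".join(payload for _, payload in rows)
            boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body)

            # Delete only after the upload has succeeded; `with conn` commits
            # the transaction when this block exits without an exception.
            cur.execute("DELETE FROM raw_events WHERE id <= %s", (rows[-1][0],))
    finally:
        conn.close()
```

Something like Heroku Scheduler or a clock process could run that every few minutes.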

Any other suggestions or ideas?

Thanks,
Eric

Provided your database can handle that write volume (which I assume it can on a production-level Heroku database), I’d definitely start by just writing the events there.

Shipping that data to yet another third party adds substantial complexity: more moving parts and more complicated tests. I’d keep it in your database until you outgrow that approach for some reason.
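Concretely, “just write them to the database” can be as small as one table and one insert per event. A rough sketch, assuming Postgres and psycopg2 (table and column names are arbitrary):

```python
import psycopg2
from psycopg2.extras import Json

DDL = """
CREATE TABLE IF NOT EXISTS raw_events (
    id         bigserial   PRIMARY KEY,
    collection text        NOT NULL,
    payload    jsonb       NOT NULL,
    created_at timestamptz NOT NULL DEFAULT now()
)
"""


def store_event(conn, collection, event):
    """Append one event to the audit table alongside the Keen write."""
    with conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO raw_events (collection, payload) VALUES (%s, %s)",
            (collection, Json(event)),
        )


# Usage: reuse one connection, e.g. psycopg2.connect(os.environ["DATABASE_URL"]),
# run the DDL once, then call store_event(conn, "purchases", {...}) per event.
```

If you later decide you want an S3 archive or anything else, you can build it off that table without touching the write path.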

Thanks for the input. That was my hunch as well. I also like that the durable store effectively has the same uptime as the site itself, since both are built on the same bedrock.