What makes sense depends on the agreement of the programming team.
And the smallest possible team is the programmer and their future self.
Even then, the hard thing is to predict what will actually be better for our future selves. Maybe we will be rusty in our Go skills, or maybe we will have embraced idiomatic Go, and the approach that makes sense now will force our future self to puzzle through the code.
Of course, maybe we will have kept programming the same way because it still feels like the better way, and our future self will be efficient. But again, that only holds for the smallest possible team. If the team expands, all bets are off.
After two years in beta, I’ve just released v2 of “do”, the dependency injection toolkit for Go.
This major version introduces a new scope-based architecture, transient services, interface binding, improved dependency tracking, and circular dependency detection.
Error handling and service naming are more consistent, and based on your feedback, a troubleshooting UI has been added.
New LLM-ready documentation is available, featuring numerous demos you can run in one click: https://do.samber.dev/
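For those who haven't tried it, here is a minimal sketch of wiring two services together with the library. It follows my reading of the v2 API (providers receiving a `do.Injector`, generic `Invoke`/`MustInvoke`); the module path, signatures, and the `Config`/`Repo` services are assumptions for illustration, so check them against the docs above.

```go
package main

import (
	"fmt"

	"github.com/samber/do/v2" // assumed v2 module path; adjust if the released path differs
)

// Illustrative services, not part of the library.
type Config struct{ DSN string }

type Repo struct{ cfg *Config }

func NewConfig(i do.Injector) (*Config, error) {
	return &Config{DSN: "postgres://localhost/app"}, nil
}

func NewRepo(i do.Injector) (*Repo, error) {
	// Resolve the dependency from the container.
	cfg, err := do.Invoke[*Config](i)
	if err != nil {
		return nil, err
	}
	return &Repo{cfg: cfg}, nil
}

func main() {
	injector := do.New()

	// Register constructors; services are built lazily on first Invoke.
	do.Provide(injector, NewConfig)
	do.Provide(injector, NewRepo)

	repo := do.MustInvoke[*Repo](injector)
	fmt.Println("repo ready, dsn:", repo.cfg.DSN)
}
```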
Elasticsearch is a good bet if you need to combine multiple filters in your queries, or once your data outgrows what an in-memory database can reasonably hold.
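To make the "multiple filters" point concrete, here is a sketch of the kind of query that suits Elasticsearch: a bool query whose filter clauses are combined without scoring and can be cached. The index name, field names, and localhost URL are assumptions for the example.

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
)

func main() {
	// Bool query with several filter clauses; adjust index and fields to your data.
	query := `{
	  "query": {
	    "bool": {
	      "filter": [
	        { "term":  { "status": "active" } },
	        { "term":  { "tenant_id": "acme" } },
	        { "range": { "created_at": { "gte": "now-30d/d" } } }
	      ]
	    }
	  }
	}`

	resp, err := http.Post(
		"http://localhost:9200/events/_search",
		"application/json",
		strings.NewReader(query),
	)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```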
1. It's single-node, but DataFusion parallelizes query execution across multiple cores. We do have plans for a distributed architecture, but we've found that you can get very far just by scaling up a single Postgres node.
2. The only information stored in Postgres is the set of options passed to the foreign data wrapper and the schema of the foreign table (this is standard for all Postgres foreign data wrappers).
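To illustrate how little lands in the Postgres catalogs, here is a sketch using the stock postgres_fdw: the server OPTIONS and the foreign table's column definitions are the only metadata Postgres keeps, while the rows stay in the external source. The connection string, host, and table layout are assumptions; an analytics FDW follows the same pattern with its own option names.

```go
package main

import (
	"database/sql"

	_ "github.com/lib/pq" // Postgres driver
)

func main() {
	// Connection string is an assumption for this sketch.
	db, err := sql.Open("postgres", "postgres://localhost/app?sslmode=disable")
	if err != nil {
		panic(err)
	}
	defer db.Close()

	// This DDL is everything Postgres records about the foreign table:
	// the FDW options and the declared schema. No data is copied in.
	stmts := []string{
		`CREATE EXTENSION IF NOT EXISTS postgres_fdw`,
		`CREATE SERVER IF NOT EXISTS remote_srv
		   FOREIGN DATA WRAPPER postgres_fdw
		   OPTIONS (host 'remote.example.com', dbname 'app')`,
		`CREATE FOREIGN TABLE IF NOT EXISTS events (
		   id         bigint,
		   tenant_id  text,
		   created_at timestamptz
		 ) SERVER remote_srv OPTIONS (table_name 'events')`,
	}
	for _, s := range stmts {
		if _, err := db.Exec(s); err != nil {
			panic(err)
		}
	}
}
```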
Reading between the lines, given they're talking about more than 1 million rows per minute, I'd guess on the order of trillions of rows rather than billions (assuming they retain data for more than a couple of weeks).
PG can handle billions of rows for certain use cases, but not easily. Generally you can make things work, but you definitely start entering "heroic effort" territory.
So many Go developers ignore certain tools because they consider them "not idiomatic".
But why not use abstractions when they are available? Did we forget to be productive?