loadTestDataFromDb: Load data in chunks of 100 rows (#90)
With the new data from our first traffic collection on the web (tweaselORG/data.tweasel.org#3), the request database has gotten quite huge, and I have seen debug-adapter runs take a long time and even fail due to timeouts. Luckily, the timeouts at least can be solved quite easily: it turns out that we were trying to load all matching rows in a single request. Now, we instead load them in chunks of 100 rows. Setting the _size parameter was sufficient for that, since we had already implemented handling of the next_url in responses and row limits.

With this change, I haven't seen any timeouts so far, and while loading is still slower than it used to be, it isn't too bad anymore. I still want to work on making data.tweasel.org as a whole faster again, but this is a good change in any case.
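For context, here is a minimal sketch of what chunked loading against a Datasette-style JSON API looks like. The endpoint, filter, and function name are placeholders for illustration, not the actual loadTestDataFromDb implementation:

```typescript
type Row = Record<string, unknown>;

// Load all rows matching a query, fetching at most `chunkSize` rows per request.
async function loadRowsInChunks(endpoint: string, chunkSize = 100): Promise<Row[]> {
    const rows: Row[] = [];

    // Ask for object-shaped rows and cap each response at `chunkSize` rows.
    const firstUrl = new URL(endpoint);
    firstUrl.searchParams.set('_shape', 'objects');
    firstUrl.searchParams.set('_size', String(chunkSize));

    let nextUrl: string | undefined = firstUrl.toString();
    while (nextUrl) {
        const res = await fetch(nextUrl);
        if (!res.ok) throw new Error(`Request failed with status ${res.status}.`);

        const body = (await res.json()) as { rows: Row[]; next_url?: string | null };
        rows.push(...body.rows);

        // Datasette includes `next_url` while more rows match the query; follow
        // it until the result set is exhausted.
        nextUrl = body.next_url ?? undefined;
    }

    return rows;
}

// Hypothetical usage:
// const rows = await loadRowsInChunks('https://data.tweasel.org/data/requests.json?dataset=example');
```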