Replies: 1 comment
It could be a 2-phase implementation:
It would be nice to have a way to auto-resume between process runs and cache responses. Twitter's rate limits are way too low, and you will always get rate-limited eventually.
All responses should be cached to disk along with the `next_token`. The client should wait out the rate limit before retrying, and if the process crashes it can be resumed automatically by reading the persisted `next_token`, which I assume stays valid over a 15-minute wait period. Maybe a wrapper library is best for this. `node-twitter-api-v2` would need to allow passing in `next_token` and returning it from its fetch methods, which it looks like it does, just not documented.

Related: #187