-
The library already does a pretty good job of making rate limits more explicit, but I was wondering if it could go one step further and actually delay your calls (for example with a new …
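To make the idea concrete, here is a very rough sketch of the kind of pre-emptive delay I mean. Everything in it is hypothetical: throttledGet is a made-up helper, and I'm assuming the fullResponse request option and the rateLimit field (limit, remaining, reset) that full responses expose.

import { TwitterApi } from 'twitter-api-v2';

const client = new TwitterApi('<your bearer token>');

function sleep(ms: number) {
  return new Promise<void>(resolve => setTimeout(resolve, ms));
}

// Earliest moment (epoch ms) at which the next request may be sent; 0 means no known limit.
let nextAllowedAt = 0;

// Hypothetical helper: sleeps until the current window resets before sending the request,
// based on the rate-limit info carried by the previous full response.
async function throttledGet(endpoint: string, params?: Record<string, any>) {
  const wait = nextAllowedAt - Date.now();
  if (wait > 0) {
    await sleep(wait);
  }

  const response = await client.v2.get(endpoint, params, { fullResponse: true });

  // rateLimit is { limit, remaining, reset }, where reset is a Unix timestamp in seconds
  if (response.rateLimit && response.rateLimit.remaining === 0) {
    nextAllowedAt = response.rateLimit.reset * 1000;
  }

  return response.data;
}

// Usage (hypothetical): const following = await throttledGet('users/12/following', { max_results: 1000 });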
Replies: 4 comments
-
Hi, I don't personally think it's a good idea to allow a …
-
Thanks. What kind of strategy would you use if you had to potentially run a script for multiple days because of the rate limits, then? A cronjob wouldn't be able to adjust to always respect the shortest possible waiting time, so it wouldn't be very efficient, but I guess it would be the simplest solution…
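For what it's worth, the kind of cron-driven checkpointing I have in mind looks roughly like this. It is only a sketch: the state file and helper names are made up, and I'm assuming the endpoint follows the standard Twitter API v2 pagination fields (pagination_token in the query, meta.next_token in the response).

import { promises as fs } from 'fs';
import { TwitterApi, ApiResponseError } from 'twitter-api-v2';

const STATE_FILE = './following-crawl-state.json'; // made-up checkpoint file
const client = new TwitterApi('<your bearer token>');

interface CrawlState {
  nextToken?: string; // where to resume pagination
  resumeAt?: number;  // epoch ms before which we should not retry
}

async function loadState(): Promise<CrawlState> {
  try {
    return JSON.parse(await fs.readFile(STATE_FILE, 'utf8'));
  } catch {
    return {};
  }
}

async function run() {
  const state = await loadState();

  // If the previous run hit a rate limit, bail out until the window has reset.
  if (state.resumeAt && Date.now() < state.resumeAt) {
    return;
  }

  while (true) {
    try {
      const page = await client.v2.following('12', {
        max_results: 1000,
        pagination_token: state.nextToken,
      });

      // ... handle the users in {page.data} here ...

      if (!page.meta.next_token) {
        await fs.writeFile(STATE_FILE, JSON.stringify({})); // finished, clear the checkpoint
        return;
      }

      state.nextToken = page.meta.next_token;
      await fs.writeFile(STATE_FILE, JSON.stringify(state));
    } catch (error) {
      if (error instanceof ApiResponseError && error.rateLimitError && error.rateLimit) {
        // Remember when the window resets so the next cron run can resume immediately.
        state.resumeAt = error.rateLimit.reset * 1000;
        await fs.writeFile(STATE_FILE, JSON.stringify(state));
        return;
      }

      throw error;
    }
  }
}

run();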
-
You can await when there is a rate limit error, as suggested:

import { ApiResponseError } from 'twitter-api-v2';

function sleep(ms: number) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

async function autoRetryOnRateLimitError<T>(callback: () => T | Promise<T>) {
  while (true) {
    try {
      return await callback();
    } catch (error) {
      if (error instanceof ApiResponseError && error.rateLimitError && error.rateLimit) {
        const resetTimeout = error.rateLimit.reset * 1000; // reset is in seconds, convert to a ms timestamp
        const timeToWait = resetTimeout - Date.now();

        await sleep(timeToWait);
        continue;
      }

      throw error;
    }
  }
}

// Get your paginator here ({client} is your authenticated TwitterApi instance)
const paginator = await client.v2.following('12', { asPaginator: true, max_results: 1000 });
let currentPage = paginator;

while (true) {
  // Handle the results of the current page inside {currentPage.users}
  // Do stuff here...

  if (currentPage.users.length === 0) {
    break; // There's nothing left, the paginator ends :)
  }

  // Fetch the next page (advancing from the page we just handled), and wait if there is a rate limit error
  currentPage = await autoRetryOnRateLimitError(() => currentPage.next());
}
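One caveat worth keeping in mind: rateLimit.reset is a Unix timestamp in seconds, so if your local clock runs slightly ahead of Twitter's, the computed timeToWait can come out a bit short; adding a second or two of buffer before retrying is a cheap safeguard.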
-
Thanks for your help, I'll give this a try :)