After assigning the appropriate API URL endpoint to `socrata_url` in my R code, I have been attempting to write a data frame of about 2.5 million rows to Socrata using:

`write.socrata(data_for_uploading, socrata_url, "REPLACE", keyring::key_get("socrata_user"), keyring::key_get("socrata_pw"))`

But I receive the following error message: `413 Request Entity Too Large`
I tried limiting my import to about 1.5 million rows of data_for_uploading, and those rows were uploaded to Socrata without any problems. Once my data frame exceeds about 1.8 million rows (roughly 250 MB in memory), I receive the 413 error. Is it possible a bug in the RSocrata code is limiting the upload size? If so, is there a workaround other than splitting my data frame into chunks of at most 1,500,000 rows each, inserting the first chunk with the "REPLACE" argument, and then using "UPSERT" for the remaining chunks (sketched below)?
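For reference, this is the kind of chunked workaround I mean. It is a rough, untested sketch that reuses the same `write.socrata()` call and credentials as above; the chunk size is just the largest value that has worked for me so far:

```r
library(RSocrata)

# Illustrative chunk size; anything at or below ~1.5M rows has uploaded cleanly
chunk_size <- 1500000
n <- nrow(data_for_uploading)
starts <- seq(1, n, by = chunk_size)

for (i in seq_along(starts)) {
  rows <- starts[i]:min(starts[i] + chunk_size - 1, n)
  # First chunk replaces the dataset; remaining chunks are upserted onto it
  mode <- if (i == 1) "REPLACE" else "UPSERT"
  write.socrata(data_for_uploading[rows, ],
                socrata_url,
                mode,
                keyring::key_get("socrata_user"),
                keyring::key_get("socrata_pw"))
}
```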
As additional information, I am able to upload the 2.5 million-row CSV file to Socrata directly through the web interface. Socrata support indicated that there shouldn't be any size limitation on their end and advised me to check with the RSocrata developers.
Thanks.