httr GET function runs out of space when downloading a large file

2024-03-07

I'm trying to download a 1.1 GB file with httr, but I hit the following error:

x <- GET( extract.path )
Error in curlPerform(curl = handle$handle, .opts = curl_opts$values) : 
  cannot allocate more space: 1728053248 bytes

My C: drive has 400 GB of free space.

In the RCurl package, I see the maxfilesize and maxfilesize.large options from getCurlOptionsConstants(), but I don't understand whether/how these get passed to httr through config or set_config.. or whether I need to switch to RCurl for this.. and even if I do switch, would increasing the maximum file size actually work?
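For what it's worth, since this version of httr (0.2) is built on RCurl, my understanding is that config() forwards curl options by name, so passing those RCurl options would look something like the sketch below. This is an assumption about how the options propagate, and note that maxfilesize is a curl *limit* (transfers larger than it are refused), so raising it may not address a memory allocation failure at all:

```r
require(httr)

# assumption: config() hands named RCurl curl options straight to the handle,
# so maxfilesize.large could be set per-request like this
x <- GET( extract.path , config( maxfilesize.large = 2e9 ) )

# or globally for the session
set_config( config( maxfilesize.large = 2e9 ) )
```

Again, this only governs curl's own size cap; it would not change how much memory R needs to hold the response.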

Here is my session info..

> sessionInfo()
R version 3.0.0 (2013-04-03)
Platform: i386-w64-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United States.1252    LC_MONETARY=English_United States.1252 LC_NUMERIC=C                           LC_TIME=English_United States.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] XML_3.96-1.1 httr_0.2    

loaded via a namespace (and not attached):
[1] digest_0.6.0   RCurl_1.95-4.1 stringr_0.6.2  tools_3.0.0   

..and (not recommended, only because it will take you a while) if you want to reproduce my error, you can go to https://usa.ipums.org/usa-action/samples, register a new account, choose the 2011 5-year ACS extract, add about a hundred variables, then wait for the extract to be ready. Then edit the first three lines and run the code below. (Again, not recommended.)

your.email <- "[email protected]"
your.password <- "password"
extract.path <- "https://usa.ipums.org/usa-action/downloads/extract_files/some_file.csv.gz"

require(httr)

values <- 
    list(
        "login[email]" = your.email , 
        "login[password]" = your.password , 
        "login[is_for_login]" = 1
    )

POST( "https://usa.ipums.org/usa-action/users/validate_login" , body = values )
GET( "https://usa.ipums.org/usa-action/extract_requests/download" , query = values )

# this line breaks
x <- GET( extract.path )

FYI - a write_disk() control has since been added to httr: https://github.com/hadley/httr/blob/master/man/write_disk.Rd
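The point of write_disk() is that the response body is streamed straight to a file instead of being buffered in memory, which sidesteps the ~1.7 GB allocation failure in a 32-bit R session. A minimal sketch, reusing the extract.path placeholder from the question (the local filename is my own choice):

```r
require(httr)

# write_disk() saves the body to disk as it arrives, so a 1.1 GB download
# no longer requires a 1.1 GB in-memory allocation
x <- GET(
    extract.path ,
    write_disk( "extract.csv.gz" , overwrite = TRUE )
)

# the response object now refers to the file on disk
# rather than holding the content as a raw vector
```

With overwrite = TRUE a partial file from a failed earlier attempt is simply replaced on retry.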
