* Function to download a URL and return it as a string
From: Sean McAfee @ 2012-12-09 5:17 UTC (permalink / raw)
To: help-gnu-emacs
I recently wanted to be able to download a document from a URL and
return it as a string. I couldn't find a function that did that
precisely, but I was able to construct my own:
(defun download (url)
  (with-current-buffer (url-retrieve-synchronously url)
    (prog2
        (when (not (search-forward-regexp "^$" nil t))
          (error "Unable to locate downloaded data"))
        (buffer-substring (1+ (point)) (point-max))
      (kill-buffer))))
This seems rather busy for such a basic-seeming operation--in
particular, having to take care to delete the buffer created by
url-retrieve-synchronously. Is there a better way to do it?
(On the other hand, this way I get to use prog2, which I almost never
do.)
* RE: Function to download a URL and return it as a string
From: Drew Adams @ 2012-12-09 5:38 UTC (permalink / raw)
To: 'Sean McAfee', help-gnu-emacs
> I recently wanted to be able to download a document from a URL and
> return it as a string. I couldn't find a function that did that
> precisely, but I was able to construct my own:
>
> (defun download (url)
>   (with-current-buffer (url-retrieve-synchronously url)
>     (prog2
>         (when (not (search-forward-regexp "^$" nil t))
>           (error "Unable to locate downloaded data"))
>         (buffer-substring (1+ (point)) (point-max))
>       (kill-buffer))))
>
> This seems rather busy for such a basic-seeming operation--in
> particular, having to take care to delete the buffer created by
> url-retrieve-synchronously. Is there a better way to do it?
Dunno anything about whether there is a better way, but consider using
`unwind-protect', with `kill-buffer' as your cleanup part. And to play safe,
pass the actual buffer (returned from `url-*') as arg to `kill-buffer'.
Using `unwind-protect' makes sure the buffer is killed when you're done, no
matter what.
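A sketch of that suggestion (untested; the buffer is bound explicitly so
`kill-buffer' always gets the right argument, even if an error is signaled
before or inside the body):

```elisp
(defun download (url)
  "Download URL and return the response body as a string."
  (let ((buffer (url-retrieve-synchronously url)))
    (unwind-protect
        (with-current-buffer buffer
          (unless (search-forward-regexp "^$" nil t)
            (error "Unable to locate downloaded data"))
          (buffer-substring (1+ (point)) (point-max)))
      ;; Cleanup form: runs whether the body returns normally or errors out.
      (kill-buffer buffer))))
```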
* Re: Function to download a URL and return it as a string
From: Pascal J. Bourguignon @ 2012-12-09 13:12 UTC (permalink / raw)
To: help-gnu-emacs
Sean McAfee <eefacm@gmail.com> writes:
> I recently wanted to be able to download a document from a URL and
> return it as a string. I couldn't find a function that did that
> precisely, but I was able to construct my own:
>
> (defun download (url)
>   (with-current-buffer (url-retrieve-synchronously url)
>     (prog2
>         (when (not (search-forward-regexp "^$" nil t))
>           (error "Unable to locate downloaded data"))
>         (buffer-substring (1+ (point)) (point-max))
>       (kill-buffer))))
>
> This seems rather busy for such a basic-seeming operation--in
> particular, having to take care to delete the buffer created by
> url-retrieve-synchronously. Is there a better way to do it?
>
> (On the other hand, this way I get to use prog2, which I almost never
> do.)
Notice that you cannot just receive the bytes of the resource.  At the
very least, you also need the Content-Type: header.  So don't feel these
headers are a bad thing.
I would have used (search-forward "\n\n"):
(defun download (url)
  (with-current-buffer (url-retrieve-synchronously url)
    (prog1 (buffer-substring (or (search-forward "\n\n" nil t) (point-min))
                             (point-max))
      (kill-buffer))))
--
__Pascal Bourguignon__ http://www.informatimago.com/
A bad day in () is better than a good day in {}.
* Re: Function to download a URL and return it as a string
From: jun net @ 2012-12-09 15:13 UTC (permalink / raw)
To: Sean McAfee; +Cc: help-gnu-emacs
There is a function whose name I can't remember exactly.  Maybe
with-temp-string?  It is like with-current-buffer, but sends all output to
a temporary string.
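For comparison, a short idiom along those lines (a sketch, assuming
`url-insert-file-contents' from url-handlers.el is available, which inserts
the decoded response body with the headers already stripped):

```elisp
(require 'url-handlers)

(defun download (url)
  "Download URL and return the response body as a string."
  (with-temp-buffer
    ;; Inserts the body only; the temp buffer is killed automatically.
    (url-insert-file-contents url)
    (buffer-string)))
```

This avoids both the header-skipping regexp and the manual `kill-buffer'.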
On 2012-12-9, 1:20 PM, "Sean McAfee" <eefacm@gmail.com> wrote:
>
> I recently wanted to be able to download a document from a URL and
> return it as a string. I couldn't find a function that did that
> precisely, but I was able to construct my own:
>
> (defun download (url)
>   (with-current-buffer (url-retrieve-synchronously url)
>     (prog2
>         (when (not (search-forward-regexp "^$" nil t))
>           (error "Unable to locate downloaded data"))
>         (buffer-substring (1+ (point)) (point-max))
>       (kill-buffer))))
>
> This seems rather busy for such a basic-seeming operation--in
> particular, having to take care to delete the buffer created by
> url-retrieve-synchronously. Is there a better way to do it?
>
> (On the other hand, this way I get to use prog2, which I almost never
> do.)
* Re: Function to download a URL and return it as a string
From: Jambunathan K @ 2012-12-09 19:26 UTC (permalink / raw)
To: Sean McAfee; +Cc: help-gnu-emacs
Maybe you can steal some stuff from
M-x find-library RET package.el
specifically
C-h f package--with-work-buffer
Or you can file an enhancement request so that some of this stuff could be
moved out of package.el to some other library.
PS: I know nothing about handling HTTP requests.  Just pitching in with a
few pennies.
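The pattern could also be factored into a macro of one's own, roughly like
this (hypothetical names, loosely modeled on the idea behind
`package--with-work-buffer'; a real version would gensym the buffer
variable to avoid capture):

```elisp
(defmacro with-url-body (url &rest body)
  "Retrieve URL synchronously, run BODY with point at the start of the
response body, then kill the buffer.  Hypothetical helper; `buffer' is
not gensymmed, so BODY must not use that name."
  `(let ((buffer (url-retrieve-synchronously ,url)))
     (unwind-protect
         (with-current-buffer buffer
           (unless (search-forward-regexp "^$" nil t)
             (error "Unable to locate downloaded data"))
           (forward-char)
           ,@body)
       (kill-buffer buffer))))

;; Example use:
;; (with-url-body "https://example.com/doc.txt"
;;   (buffer-substring (point) (point-max)))
```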
Sean McAfee <eefacm@gmail.com> writes:
> I recently wanted to be able to download a document from a URL and
> return it as a string. I couldn't find a function that did that
> precisely, but I was able to construct my own:
>
> (defun download (url)
>   (with-current-buffer (url-retrieve-synchronously url)
>     (prog2
>         (when (not (search-forward-regexp "^$" nil t))
>           (error "Unable to locate downloaded data"))
>         (buffer-substring (1+ (point)) (point-max))
>       (kill-buffer))))
>
> This seems rather busy for such a basic-seeming operation--in
> particular, having to take care to delete the buffer created by
> url-retrieve-synchronously. Is there a better way to do it?
>
> (On the other hand, this way I get to use prog2, which I almost never
> do.)
>
--
* Re: Function to download a URL and return it as a string
From: Sean McAfee @ 2012-12-10 0:59 UTC (permalink / raw)
To: help-gnu-emacs
"Drew Adams" <drew.adams@oracle.com> writes:
>> (defun download (url)
>>   (with-current-buffer (url-retrieve-synchronously url)
>>     (prog2
>>         (when (not (search-forward-regexp "^$" nil t))
>>           (error "Unable to locate downloaded data"))
>>         (buffer-substring (1+ (point)) (point-max))
>>       (kill-buffer))))
> Dunno anything about whether there is a better way, but consider using
> `unwind-protect', with `kill-buffer' as your cleanup part.
My original implementation used unwind-protect, but on reflection it
seemed quite unnecessary when there was no way for the protected form to
ever raise an exception.
I suppose there's a future-proofing argument to be made, but it's hard
to see this code evolving to the point where an exception ever could
occur.