unofficial mirror of emacs-devel@gnu.org 
* url.el blocks Gnus+nnrss
@ 2005-01-13 12:16 Katsumi Yamaoka
  2005-01-13 14:35 ` Stefan Monnier
  0 siblings, 1 reply; 10+ messages in thread
From: Katsumi Yamaoka @ 2005-01-13 12:16 UTC
  Cc: ding

Hi,

Several people have reported that Gnus hangs if they subscribe to
nnrss groups.  nnrss is a back end which uses the url Elisp
package by default and makes it possible to read RSS feeds as if
they were newsgroups.

When starting Gnus or typing `g' (gnus-group-get-new-news), it
gets into an infinite loop.  The reason the problem arises is
stated in a comment in the `url-retrieve-synchronously' section
of the url.el file as follows:

	;; Quoth Stef:
	;; It turns out that the problem seems to be that the (sit-for
	;; 0.1) below doesn't actually process the data: instead it
	;; returns immediately because there is keyboard input
	;; waiting, so we end up spinning endlessly waiting for the
	;; process to finish while not letting it finish.

	;; However, raman claims that it blocks Emacs with Emacspeak
	;; for unexplained reasons.  Put back for his benefit until
	;; someone can understand it.
	;; (sleep-for 0.1)
	(sit-for 0.1))

I don't know why the code sticks to `sleep-for' or `sit-for'.
Although both process data, they have the adverse effects the
comment mentions.  Isn't `accept-process-output' the function
that should be used to process data from the network?  I tried
it there and confirmed that it solves the problem.  Since I don't
have Emacspeak, I cannot tell whether it causes trouble for
Emacspeak users, though.

The patch is here:

*** url.el~	Sun Nov 21 22:23:54 2004
--- url.el	Thu Jan 13 12:15:19 2005
***************
*** 177,194 ****
        (while (not retrieval-done)
  	(url-debug 'retrieval "Spinning in url-retrieve-synchronously: %S (%S)"
  		   retrieval-done asynch-buffer)
! 	;; Quoth Stef:
! 	;; It turns out that the problem seems to be that the (sit-for
! 	;; 0.1) below doesn't actually process the data: instead it
! 	;; returns immediately because there is keyboard input
! 	;; waiting, so we end up spinning endlessly waiting for the
! 	;; process to finish while not letting it finish.
! 
! 	;; However, raman claims that it blocks Emacs with Emacspeak
! 	;; for unexplained reasons.  Put back for his benefit until
! 	;; someone can understand it.
! 	;; (sleep-for 0.1)
! 	(sit-for 0.1))
        asynch-buffer)))
  
  (defun url-mm-callback (&rest ignored)
--- 177,183 ----
        (while (not retrieval-done)
  	(url-debug 'retrieval "Spinning in url-retrieve-synchronously: %S (%S)"
  		   retrieval-done asynch-buffer)
! 	(accept-process-output (get-buffer-process asynch-buffer) 0 100000 1))
        asynch-buffer)))
  
  (defun url-mm-callback (&rest ignored)



^ permalink raw reply	[flat|nested] 10+ messages in thread

* Re: url.el blocks Gnus+nnrss
  2005-01-13 12:16 url.el blocks Gnus+nnrss Katsumi Yamaoka
@ 2005-01-13 14:35 ` Stefan Monnier
  2005-01-13 22:39   ` Katsumi Yamaoka
  2005-01-25 14:32   ` Klaus Straubinger
  0 siblings, 2 replies; 10+ messages in thread
From: Stefan Monnier @ 2005-01-13 14:35 UTC
  Cc: ding, emacs-devel

> Several people reported that Gnus hangs if they subscribe to
> nnrss groups.  nnrss is a back end which uses the url ELisp
> package by default and enables to read rss feeds as if they were
> newsgroups.

I've been using the patch below myself for a while now (it basically
does the same as yours, except that it hoists the get-buffer-process
call outside the loop and removes the timeout, which shouldn't be
needed now that Emacs knows what we're waiting for).

I've just installed it.


        Stefan


--- orig/lisp/url/url.el
+++ mod/lisp/url/url.el
@@ -1,6 +1,7 @@
 ;;; url.el --- Uniform Resource Locator retrieval tool
 
-;; Copyright (c) 1996,1997,1998,1999,2001,2004  Free Software Foundation, Inc.
+;; Copyright (c) 1996, 1997, 1998, 1999, 2001, 2004, 2005
+;;           Free Software Foundation, Inc.
 
 ;; Author: Bill Perry <wmperry@gnu.org>
 ;; Keywords: comm, data, processes, hypermedia
@@ -169,26 +169,24 @@
 			      (url-debug 'retrieval "Synchronous fetching done (%S)" (current-buffer))
 			      (setq retrieval-done t
 				    asynch-buffer (current-buffer)))))
-    (if (not asynch-buffer)
+    (let ((proc (and asynch-buffer (get-buffer-process asynch-buffer))))
+      (if (null proc)
 	;; We do not need to do anything, it was a mailto or something
 	;; similar that takes processing completely outside of the URL
 	;; package.
 	nil
       (while (not retrieval-done)
 	(url-debug 'retrieval "Spinning in url-retrieve-synchronously: %S (%S)"
 		   retrieval-done asynch-buffer)
-	;; Quoth Stef:
-	;; It turns out that the problem seems to be that the (sit-for
-	;; 0.1) below doesn't actually process the data: instead it
-	;; returns immediately because there is keyboard input
-	;; waiting, so we end up spinning endlessly waiting for the
-	;; process to finish while not letting it finish.
-
-	;; However, raman claims that it blocks Emacs with Emacspeak
-	;; for unexplained reasons.  Put back for his benefit until
-	;; someone can understand it.
-	;; (sleep-for 0.1)
-	(sit-for 0.1))
+	  ;; We used to use `sit-for' here, but in some cases it wouldn't
+	  ;; work because apparently pending keyboard input would always
+	  ;; interrupt it before it got a chance to handle process input.
+	  ;; `sleep-for' was tried but it lead to other forms of
+	  ;; hanging.  --Stef
+	  (unless (accept-process-output proc)
+	    ;; accept-process-output returned nil, maybe because the process
+	    ;; exited (and may have been replaced with another).
+	    (setq proc (get-buffer-process asynch-buffer)))))
       asynch-buffer)))
 
 (defun url-mm-callback (&rest ignored)


* Re: url.el blocks Gnus+nnrss
  2005-01-13 14:35 ` Stefan Monnier
@ 2005-01-13 22:39   ` Katsumi Yamaoka
  2005-01-25 14:32   ` Klaus Straubinger
  1 sibling, 0 replies; 10+ messages in thread
From: Katsumi Yamaoka @ 2005-01-13 22:39 UTC
  Cc: emacs-devel, ding

>>>>> In <87pt09v2m1.fsf-monnier+emacs@gnu.org> Stefan Monnier wrote:

> I've myself been using the patch below for a while now (it basically does
> the same as yours, except it hoists the get-buffer-process outside the loop
> and it removes the timeout since it shouldn't be needed now that Emacs knows
> what we're waiting for).

> I've just installed it.

Excellent.  I confirmed that the problem is gone.  Thank you.




* Re: url.el blocks Gnus+nnrss
  2005-01-13 14:35 ` Stefan Monnier
  2005-01-13 22:39   ` Katsumi Yamaoka
@ 2005-01-25 14:32   ` Klaus Straubinger
  2005-01-25 16:51     ` Stefan Monnier
  1 sibling, 1 reply; 10+ messages in thread
From: Klaus Straubinger @ 2005-01-25 14:32 UTC


> I've myself been using the patch below for a while now (it basically does
> the same as yours, except it hoists the get-buffer-process outside the loop
> and it removes the timeout since it shouldn't be needed now that Emacs knows
> what we're waiting for).
>
> I've just installed it.

For me the patch was a big improvement, but I still observed hangs;
as far as I could determine, always on connections without a
specified Content-Length.  I think the reason is that in these cases
url-http-simple-after-change-function is used, which, unlike
url-http-content-length-after-change-function, does not call the
callback function at the end.  This leads to a hang in
accept-process-output when the connection has already been closed.

The following patch helped me, although I am not sure whether it is
the right way to tackle the problem:


--- url.el.orig	2005-01-14 13:36:36.000000000 +0100
+++ url.el	2005-01-25 14:47:52.325350728 +0100
@@ -176,7 +176,7 @@
 	  ;; similar that takes processing completely outside of the URL
 	  ;; package.
 	  nil
-	(while (not retrieval-done)
+	(while (null retrieval-done)
 	  (url-debug 'retrieval
 		     "Spinning in url-retrieve-synchronously: %S (%S)"
 		     retrieval-done asynch-buffer)
@@ -185,11 +185,13 @@
 	  ;; interrupt it before it got a chance to handle process input.
 	  ;; `sleep-for' was tried but it lead to other forms of
 	  ;; hanging.  --Stef
-	  (unless (accept-process-output proc)
-	    ;; accept-process-output returned nil, maybe because the process
-	    ;; exited (and may have been replaced with another).
-	    (setq proc (get-buffer-process asynch-buffer)))))
-      asynch-buffer)))
+	  (if (eq (process-status proc) 'closed)
+	      (setq retrieval-done t)
+	    (unless (accept-process-output proc)
+	      ;; accept-process-output returned nil, maybe because the process
+	      ;; exited (and may have been replaced with another).
+	      (setq proc (get-buffer-process asynch-buffer)))))))
+    asynch-buffer))
 
 (defun url-mm-callback (&rest ignored)
   (let ((handle (mm-dissect-buffer t)))


-- 
Klaus Straubinger


* Re: url.el blocks Gnus+nnrss
  2005-01-25 14:32   ` Klaus Straubinger
@ 2005-01-25 16:51     ` Stefan Monnier
  2005-01-26 15:02       ` Klaus Straubinger
  2005-01-28 23:16       ` Dave Love
  0 siblings, 2 replies; 10+ messages in thread
From: Stefan Monnier @ 2005-01-25 16:51 UTC
  Cc: sds, fx, wmperry, emacs-devel

> For me the patch provided a big improvement, but I still observed hangs -
> as far as I could determine, always in connections without specified
> Content-Length. I think the reason for this is that in these cases
> url-http-simple-after-change-function is used which does not call the
> callback function at the end, as opposed to
> url-http-content-length-after-change-function. This leads to a hang in
> accept-process-output when the connection already has been closed.

OT1H, looking at the docstring of url-retrieve, I'd think that it's a bug in
url-http.el that it sometimes fails to activate the callback.

But, OTOH, looking at the url-http.el code, it seems that it purposefully
activates the callback only when the retrieval was successful.

If someone could explain to me which it is, that would help.

I guess what should really happen is that the callback is always activated
but is given an extra argument (explicit or implicit) so it can tell whether
the retrieval was successful.
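
As a rough sketch of that idea (the argument convention here is invented
for illustration; it is not what url-retrieve passed at the time):

```elisp
;; Hypothetical: the callback is always run, and its first argument
;; describes how the retrieval ended, so the caller can distinguish
;; success from failure instead of sometimes never being called.
(url-retrieve
 "http://example.org/"
 (lambda (status)
   (if (plist-get status :error)
       (message "Retrieval failed: %S" (plist-get status :error))
     (message "Retrieved %d bytes" (buffer-size)))))
```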

In any case, it might very well be that your problem is not related to all
this but is a much more down-to-earth bug.

To debug it, could you try replacing the (setq retrieval-done t) in
your patch with (debug)?
Also do (setq url-debug '(http)).
When the debugger gets called, take a look at the *URL-DEBUG* buffer (and
send it here).

> @@ -176,7 +176,7 @@
>  	  ;; similar that takes processing completely outside of the URL
>  	  ;; package.
>  	  nil
> -	(while (not retrieval-done)
> +	(while (null retrieval-done)
>  	  (url-debug 'retrieval
>  		     "Spinning in url-retrieve-synchronously: %S (%S)"
>  		     retrieval-done asynch-buffer)

`not' and `null' are synonyms.  When to use which is a question of taste,
but I tend to use `not' when it is applied to something that I consider a
boolean value, whereas I tend to use `null' when it is applied to something
that can be nil or anything else (a list, for example).
So in the above case, I think `not' is preferable.
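
A small illustration of that convention (the variable names are made up):

```elisp
;; `not' reads naturally on boolean-ish values:
(let ((retrieval-done nil))
  (not retrieval-done))    ; t -- "we are not done yet"

;; `null' reads naturally on values that can be nil or something
;; else, such as a possibly-empty list:
(let ((pending-requests '()))
  (null pending-requests)) ; t -- "the list is empty"
```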


        Stefan


* Re: url.el blocks Gnus+nnrss
  2005-01-25 16:51     ` Stefan Monnier
@ 2005-01-26 15:02       ` Klaus Straubinger
  2005-01-28 16:58         ` Stefan Monnier
  2005-01-28 23:16       ` Dave Love
  1 sibling, 1 reply; 10+ messages in thread
From: Klaus Straubinger @ 2005-01-26 15:02 UTC


> OT1H, looking at the docstring of url-retrieve, I'd think that it's a
> bug in url-http.el that it sometimes fails to activate the callback.
>
> But, OTHO, looking at the url-http.el code it seems that it purposefully
> only activates the callback when the retrieval was successful.
>
> If someone could explain to me which it is, that would help.

I can't explain it.

Unfortunately, there is no good way to always react correctly,
especially for these connections without a specified Content-Length,
where the only indicator is that the connection has already been
closed.  As far as I know, there is no mechanism in Emacs that triggers
a function at connection-state changes.  And how would you then decide
whether the retrieval has been successful?

I think most of the complexity of the URL package comes from the fact
that it implements asynchronous fetching of data.  This feature is not
necessary for my usage pattern, but obviously the author thought the
complexity worthwhile.

> To debug it, could you try to replace the (setq retrieval-done t) in
> your patch with (debug). Also do (setq url-debug '(http)).

OK, I have done that.

> When the debugger gets called, take a look at the *URL-DEBUG* buffer
> (and send it here).

Here it is:

    http -> Cleaning up dead process: proxy:8080 #<process proxy>
    http -> Cleaning up dead process: proxy:8080 #<process proxy>
    http -> Contacting host: proxy:8080
    http -> Marking connection as busy: proxy:8080 #<process proxy>
    http -> Request is: 
    GET http://www.chesscenter.com/twic/twic.html HTTP/1.1
    MIME-Version: 1.0
    Connection: close
    Extension: Security/Digest Security/SSL
    Host: www.chesscenter.com
    Accept-charset: utf-8;q=1, iso-8859-15;q=0.5, iso-8859-1;q=0.5, iso-8859-2;q=0.5, iso-8859-3;q=0.5, iso-8859-4;q=0.5, iso-8859-5;q=0.5, iso-8859-7;q=0.5, iso-8859-8;q=0.5, iso-8859-9;q=0.5, gb2312;q=0.5, euc-jp;q=0.5, euc-kr;q=0.5, iso-8859-14;q=0.5, big5;q=0.5, iso-2022-jp;q=0.5, koi8-r;q=0.5, shift_jis;q=0.5, viscii;q=0.5, hz-gb-2312;q=0.5, iso-2022-cn-ext;q=0.5, iso-2022-cn;q=0.5, iso-2022-jp-2;q=0.5, iso-2022-kr;q=0.5
    Accept-language: de-DE, de, en-GB, en, it-IT, it, es-ES, es
    Accept: */*
    User-Agent: Emacs-W3/Exp URL/Emacs
    

    http -> Calling after change function `url-http-wait-for-headers-change-function' for `#<process proxy>'
    http -> url-http-wait-for-headers-change-function ( *http proxy:8080*<3>)
    http -> Saw end of headers... ( *http proxy:8080*<3>)
    http -> url-http-parse-response called in ( *http proxy:8080*<3>)
    http -> No content-length, being dumb.
    http -> Calling after change function `url-http-simple-after-change-function' for `#<process proxy>'
    [...
     many identical lines cut
     ...]
    http -> Calling after change function `url-http-simple-after-change-function' for `#<process proxy>'
    http -> url-http-end-of-document-sentinel in buffer ( *http proxy:8080*<3>)
    http -> Marking connection as free: proxy:8080 #<process proxy>
    http -> url-http-parse-headers called in ( *http proxy:8080*<3>)
    http -> url-http-parse-response called in ( *http proxy:8080*<3>)
    http -> Parsed HTTP headers: class=2 status=200
    http -> Finished parsing HTTP headers: t
    http -> Marking connection as free: proxy:8080 #<process proxy>
    http -> Activating callback in buffer ( *http proxy:8080*<3>)

The debugger backtrace looks like this, exactly as one would expect:

    (if (eq (process-status proc) (quote closed)) (debug) (if (accept-process-output proc) nil (setq proc ...)))
    (while (not (symbol-value --retrieval-done--30970)) (url-debug (quote retrieval) "Spinning in url-retrieve-synchronously: %S (%S)" (symbol-value --retrieval-done--30970) (symbol-value --asynch-buffer--30971)) (if (eq ... ...) (debug) (if ... nil ...)))
    (if (null proc) nil (while (not ...) (url-debug ... "Spinning in url-retrieve-synchronously: %S (%S)" ... ...) (if ... ... ...)))
    (let ((proc ...)) (if (null proc) nil (while ... ... ...)))
    (progn (set --asynch-buffer--30971 (url-retrieve url ...)) (let (...) (if ... nil ...)) (symbol-value --asynch-buffer--30971))
    (let ((--retrieval-done--30970 ...) (--asynch-buffer--30971 ...)) (setf (symbol-value --retrieval-done--30970) nil (symbol-value --asynch-buffer--30971) nil) (progn (set --asynch-buffer--30971 ...) (let ... ...) (symbol-value --asynch-buffer--30971)))
    (lexical-let ((retrieval-done nil) (asynch-buffer nil)) (setq asynch-buffer (url-retrieve url ...)) (let (...) (if ... nil ...)) asynch-buffer)
    url-retrieve-synchronously("http://WWW.ChessCenter.Com/twic/twic.html")
    eval((url-retrieve-synchronously "http://WWW.ChessCenter.Com/twic/twic.html"))
    eval-expression((url-retrieve-synchronously "http://WWW.ChessCenter.Com/twic/twic.html") nil)
    call-interactively(eval-expression)


By the way, I have also observed hangs at redirects, where another
url-retrieve is called while the old connection can still receive
data.  An (accept-process-output url-http-process) directly before
the new url-retrieve helped, but I have no clue why.

> not and null are synonyms.  When to use which is a question of taste, but
> I tend to use `not' when applied to something that I consider as a boolean
> value, whereas I tend to use `null' when applied to something that can be
> nil or anything else (a list, for example).
> So in the above case, I think `not' is preferable.

That sounds very reasonable.

-- 
Klaus Straubinger


* Re: url.el blocks Gnus+nnrss
  2005-01-26 15:02       ` Klaus Straubinger
@ 2005-01-28 16:58         ` Stefan Monnier
  2005-02-01  9:35           ` Klaus Straubinger
  0 siblings, 1 reply; 10+ messages in thread
From: Stefan Monnier @ 2005-01-28 16:58 UTC
  Cc: emacs-devel

> As far as I know, there is no mechanism in Emacs that triggers
> a function at connection-state changes.

See set-process-sentinel ;-)
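
For illustration, a sentinel is a function that Emacs calls whenever a
process changes state; a minimal sketch (the function and buffer names
here are made up):

```elisp
;; Attach a sentinel that runs on every state change of the
;; connection.  EVENT is a string such as "open\n" or
;; "connection broken by remote peer\n".
(defun my-url-sentinel (proc event)
  (when (memq (process-status proc) '(closed failed exit signal))
    ;; The connection is gone; this is the place to finish up,
    ;; e.g. parse whatever data arrived in the process buffer.
    (message "Connection %s finished: %s" proc event)))

(let ((proc (open-network-stream "my-http" " *my-http*"
                                 "example.org" 80)))
  (set-process-sentinel proc #'my-url-sentinel))
```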

> I think most of the complexity of the URL package comes from the fact
> that it implements asynchronous fetching of data. This feature is not
> necessary for my usage pattern, but obviously the author thought the
> complexity worthwile.

Actually, Emacs does not provide any direct support for synchronous
"open-connection/send-data/read-data", the way it does for processes with
`call-process'.  So the author (William Perry) had no choice, really.
Furthermore, given the complexity of HTTP (with authentication and stuff),
it's basically impossible for Emacs to provide a synchronous API that would
be usable for HTTP.
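
To sketch why: the only network primitive Emacs offers is an
asynchronous process object, so any "synchronous" retrieval has to be
built as an explicit wait loop on top of it, roughly like this (host
and buffer names are made up, and real HTTP needs far more than this):

```elisp
;; There is no "call-process for sockets": open a connection, send
;; the request, then pump process output until the peer closes.
(let* ((buf (generate-new-buffer " *fetch*"))
       (proc (open-network-stream "fetch" buf "example.org" 80)))
  (process-send-string
   proc "GET / HTTP/1.0\r\nHost: example.org\r\n\r\n")
  (while (eq (process-status proc) 'open)
    ;; Give Emacs a chance to read from the socket.
    (accept-process-output proc 1))
  buf)  ; BUF now contains the raw HTTP response
```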

> Here it is:
>     http -> Cleaning up dead process: proxy:8080 #<process proxy>

Hmm... so you're going through a proxy, good to know.

>      many identical lines cut
>      ...]
>     http -> Calling after change function `url-http-simple-after-change-function' for `#<process proxy>'
>     http -> url-http-end-of-document-sentinel in buffer ( *http proxy:8080*<3>)
>     http -> Marking connection as free: proxy:8080 #<process proxy>
>     http -> url-http-parse-headers called in ( *http proxy:8080*<3>)
>     http -> url-http-parse-response called in ( *http proxy:8080*<3>)
>     http -> Parsed HTTP headers: class=2 status=200
>     http -> Finished parsing HTTP headers: t
>     http -> Marking connection as free: proxy:8080 #<process proxy>
>     http -> Activating callback in buffer ( *http proxy:8080*<3>)

So the sentinel is properly called when the connection is closed and it does
activate "the" callback.  Now why didn't the callback set `retrieval-done'?

> The debugger backtrace looks like this, exactly as one would expect:

>     (if (eq (process-status proc) (quote closed)) (debug) (if (accept-process-output proc) nil (setq proc ...)))
>     (while (not (symbol-value --retrieval-done--30970)) (url-debug (quote retrieval) "Spinning in url-retrieve-synchronously: %S (%S)" (symbol-value --retrieval-done--30970) (symbol-value --asynch-buffer--30971)) (if (eq ... ...) (debug) (if ... nil ...)))
>     (if (null proc) nil (while (not ...) (url-debug ... "Spinning in url-retrieve-synchronously: %S (%S)" ... ...) (if ... ... ...)))
>     (let ((proc ...)) (if (null proc) nil (while ... ... ...)))
>     (progn (set --asynch-buffer--30971 (url-retrieve url ...)) (let (...) (if ... nil ...)) (symbol-value --asynch-buffer--30971))
>     (let ((--retrieval-done--30970 ...) (--asynch-buffer--30971 ...)) (setf (symbol-value --retrieval-done--30970) nil (symbol-value --asynch-buffer--30971) nil) (progn (set --asynch-buffer--30971 ...) (let ... ...) (symbol-value --asynch-buffer--30971)))
>     (lexical-let ((retrieval-done nil) (asynch-buffer nil)) (setq asynch-buffer (url-retrieve url ...)) (let (...) (if ... nil ...)) asynch-buffer)
>     url-retrieve-synchronously("http://WWW.ChessCenter.Com/twic/twic.html")
>     eval((url-retrieve-synchronously "http://WWW.ChessCenter.Com/twic/twic.html"))
>     eval-expression((url-retrieve-synchronously "http://WWW.ChessCenter.Com/twic/twic.html") nil)
>     call-interactively(eval-expression)

Could you in the *Backtrace* buffer check the value of the following
expressions (and post them here), using `e':

  proc

  (process-buffer proc)

  (with-current-buffer (process-buffer proc) url-callback-function)

  --retrieval-done--30970

  (symbol-value --retrieval-done--30970)

The "30970" might change from one run to the other, so check the backtrace
to see which number was used that time.

> By the way, I have observed also hangs at redirects, where another
> url-retrieve is called when the old connection could still receive
> data. (accept-process-output url-http-process) directly before the new
> url-retrieve did help, but I have no clue why.

Could you show a patch of the actual change you tried?
Do you mean just before `url-retrieve' in url-http-parse-headers,
in these lines:

	   (let ((url-request-method url-http-method)
		 (url-request-data url-http-data)
		 (url-request-extra-headers url-http-extra-headers))
	     (url-retrieve redirect-uri url-callback-function
			   url-callback-arguments)
	     (url-mark-buffer-as-dead (current-buffer))))))


-- Stefan


* Re: url.el blocks Gnus+nnrss
  2005-01-25 16:51     ` Stefan Monnier
  2005-01-26 15:02       ` Klaus Straubinger
@ 2005-01-28 23:16       ` Dave Love
  2005-01-29 16:21         ` Richard Stallman
  1 sibling, 1 reply; 10+ messages in thread
From: Dave Love @ 2005-01-28 23:16 UTC
  Cc: sds, Klaus Straubinger, wmperry, emacs-devel

Stefan Monnier <monnier@iro.umontreal.ca> writes:

> OT1H, looking at the docstring of url-retrieve, I'd think that it's a bug in
> url-http.el that it sometimes fails to activate the callback.
>
> But, OTHO, looking at the url-http.el code it seems that it purposefully
> only activates the callback when the retrieval was successful.
>
> If someone could explain to me which it is, that would help.

I'm not sure I understand what this is about, and it's a long time
since I dealt with that code, but I'll try to take a look at it
sometime if no-one else can.


* Re: url.el blocks Gnus+nnrss
  2005-01-28 23:16       ` Dave Love
@ 2005-01-29 16:21         ` Richard Stallman
  0 siblings, 0 replies; 10+ messages in thread
From: Richard Stallman @ 2005-01-29 16:21 UTC
  Cc: emacs-devel, sds, KSNetz, monnier, wmperry

    I'm not sure I understand what this is about, and it's a long time
    since I dealt with that code, but I'll try to take a look at it
    sometime if no-one else can.

Thank you very much for this offer.


* Re: url.el blocks Gnus+nnrss
  2005-01-28 16:58         ` Stefan Monnier
@ 2005-02-01  9:35           ` Klaus Straubinger
  0 siblings, 0 replies; 10+ messages in thread
From: Klaus Straubinger @ 2005-02-01  9:35 UTC


> Could you in the *Backtrace* buffer check the value of the following
> expressions (and post them here), using `e':
>
>   proc

#<process proxy>

>   (process-buffer proc)

nil

(because url-http-activate-callback has already called
url-http-mark-connection-as-free, which resets the process buffer)

>   (with-current-buffer (process-buffer proc) url-callback-function)

After replacing the nil value of (process-buffer proc) with " *http proxy:8080*":
(lambda (&rest --cl-rest--) (apply (quote (lambda (G96930 G96931 &rest ignored) (url-debug (quote retrieval) "Synchronous fetching done (%S)" (current-buffer)) (progn (set G96931 t) (set G96930 (current-buffer))))) (quote --asynch-buffer--) (quote --retrieval-done--) --cl-rest--))

>   --retrieval-done--30970

Symbol's value as variable is void: --retrieval-done--96924

>   (symbol-value --retrieval-done--30970)

... it is t, at least later on in the debugger, although I have not
been able to evaluate it with the "e" command.


>> By the way, I have observed also hangs at redirects, where another
>> url-retrieve is called when the old connection could still receive
>> data. (accept-process-output url-http-process) directly before the new
>> url-retrieve did help, but I have no clue why.
>
> Could you show a patch of the actual change you tried?
> Do you mean just before `url-retrieve' in url-http-parse-headers
> in the lines:
>
> 	   (let ((url-request-method url-http-method)
> 		 (url-request-data url-http-data)
> 		 (url-request-extra-headers url-http-extra-headers))
> 	     (url-retrieve redirect-uri url-callback-function
> 			   url-callback-arguments)
> 	     (url-mark-buffer-as-dead (current-buffer))))))

Yes, exactly.  There I threw in an (accept-process-output
url-http-process) just because it seemed to help.  Unfortunately,
I don't see why.

-- 
Klaus Straubinger


end of thread  [~2005-02-01  9:35 UTC | newest]

Thread overview: 10+ messages
2005-01-13 12:16 url.el blocks Gnus+nnrss Katsumi Yamaoka
2005-01-13 14:35 ` Stefan Monnier
2005-01-13 22:39   ` Katsumi Yamaoka
2005-01-25 14:32   ` Klaus Straubinger
2005-01-25 16:51     ` Stefan Monnier
2005-01-26 15:02       ` Klaus Straubinger
2005-01-28 16:58         ` Stefan Monnier
2005-02-01  9:35           ` Klaus Straubinger
2005-01-28 23:16       ` Dave Love
2005-01-29 16:21         ` Richard Stallman

Code repositories for project(s) associated with this public inbox

	https://git.savannah.gnu.org/cgit/emacs.git
