msg158124
Author: Anrs Hu (Anrs.Hu)
Date: 2012-04-12 12:23
If an HTTP response's Transfer-Encoding is 'chunked', then urllib2.urlopen(URL).readline() will block until 8192 bytes have arrived, even when the first chunk is just a single line. Each chunk should be processed as soon as possible, so readline() should read one line and return immediately, rather than read 8K of data into a buffer and then look for a line in that buffer.
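For reference, a minimal sketch of the call pattern being described (Python 2; URL is only a placeholder for a server that streams a chunked response line by line):

import urllib2

resp = urllib2.urlopen(URL)   # URL: placeholder for a chunked-streaming endpoint
line = resp.readline()        # expected: return as soon as the first chunk (one line) arrives
                              # observed: blocks until roughly 8192 bytes have been received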
|
|
msg158125
Author: Senthil Kumaran (orsenthil) *
Date: 2012-04-12 12:26
I am trying to test this to determine the fault.
|
|
msg158225
Author: Jim Jewett (Jim.Jewett) *
Date: 2012-04-13 19:06
It would be helpful to have a test case, so that it will stay fixed.
|
|
msg158246
Author: Anrs Hu (Anrs.Hu)
Date: 2012-04-14 01:55
Okay, here is a test case using web.py. The server code is:

import time
import web

urls = ('/', 'index')

class index(object):
    def GET(self):
        yield 'hello\n'
        yield 'world\n'
        time.sleep(60)

app = web.application(urls, globals())
app.run()

The client is the Python interpreter:

>>> resp = urllib.urlopen(URL)
>>> resp.readline()  # returns 'hello\n'
>>> resp.readline()  # returns 'world\n'
>>> resp.readline()  # blocks here, which is expected
>>> # but urllib2 behaves differently:
>>> urllib2.urlopen(URL).readline()  # blocks even though 'hello\n' and 'world\n' have already been sent

This is because urllib2 uses an 8 KiB buffer on socket._fileobject within urllib2.py, so it tries to read 8K of data into the buffer before returning a line.
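One hedged way to confirm the buffering described above (Python 2 only; fp and _rbufsize are CPython internals rather than a supported API, so the attribute names may differ between versions):

>>> import urllib2
>>> resp = urllib2.urlopen(URL)   # URL: the web.py server above
>>> resp.fp._rbufsize             # expected to be 8192 here, which is why readline() waits for a full buffer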
|
|
msg165927
Author: Senthil Kumaran (orsenthil) *
Date: 2012-07-20 13:59
I had a discussion with Anrs on this, and it went along these lines: I had confused the buffering issue of urllib2 (encountered with streaming data) with chunked transfer encoding. In this case the flow blocks at the socket level waiting for 8192 bytes. That buffer size is kept for buffered reading in normal read scenarios, but for streaming data it may not be the best choice. It is explained well here - http://stackoverflow.com/questions/1598331/how-to-read-continous-http-streaming-data-in-python

The advice there is to set the socket buffer size to 0:

import socket
socket._fileobject.default_bufsize = 0

Now, as for chunked transfer encoding: it behaves as advertised, sending one chunk at a time, but the readline limit is still set by MAXLINE in httplib.py. For chunked transfer encoding to be recognized, the client has to receive a "Transfer-Encoding: chunked" header from the server; when it does, it follows the path of reading up to MAXLINE at a time and then returning. For small chunks with a blocking server (as you illustrated), we may still need to set default_bufsize to 0 to get quick responses instead of waiting to fill the buffer.

At this point I think the above could be documented in the urllib2 docs for the issue you raised. I am not sure any other approach would be suitable to handle this behavior.

Anrs (the original poster) also responded that the way he overcame this for very small chunks was to set the socket file object's buffer size to 0 locally:

>> resp = opener.open(server, urllib.urlencode(data))
>> resp = opener.open(server)
>> resp.fp._rbufsize = 0
>> for line in iter(resp.readline, ''):
>>     yield line

I think this could be documented in some fashion (e.g. support for streaming without buffering, or transfers of small data sizes without buffering).
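Pulling the per-response workaround above into one self-contained sketch (Python 2; URL and process() are placeholders, and _rbufsize is a CPython-internal attribute of socket._fileobject, so this is not a stable API):

import urllib2

resp = urllib2.urlopen(URL)           # URL: placeholder for the streaming endpoint
resp.fp._rbufsize = 0                 # drop the 8 KiB read buffer for this response only
for line in iter(resp.readline, ''):  # readline() now returns as each line arrives
    process(line)                     # process(): placeholder for whatever consumes the stream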
|
|
msg239099
Author: Martin Panter (martin.panter) *
Date: 2015-03-24 08:52
I can reproduce this with Python 2, but not with current Python 3, nor with v3.3.3. Probably doesn’t affect 3.2 either, but I haven’t tried.
|
|
msg370091
Author: Zackery Spytz (ZackerySpytz) *
Date: 2020-05-27 15:10
Python 2 is EOL, so I think this issue should be closed.
|
|