[PyKDE] QSocket mystery

michael ferraro michael at possibleworlds.com
Sun Sep 5 18:30:40 BST 2004


I implemented a "server" app using QServerSocket/QSocket through
PyQt a while ago and it worked fine, even under a heavy load.

My original tests were done with qt-mac-free-3.2.2 and PyQt/sip snapshots
from around 20031123 under MacOS 10.3.1.

I am now running Qt 3.3.0, PyQt-20040518, and sip-20040515 under 10.3.4,
although I also tested under Qt 3.3.3, PyQt-3.12, and sip-4.0.1 under
10.3.5 with the same results described below.

I recently put the server to work and found that I was incurring major
(for me) delays in network traffic.  My client is sending small packets
at 33ms intervals.  My server was seeing bunches of packets buffered into
a single read, arriving somewhere around every 100ms.  I made sure the
client was flushing its buffers after each packet it sent.
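
Roughly, the reading side looks like the sketch below.  This is just a
minimal reconstruction, assuming PyQt 3 (the old "qt" module) and a
newline-terminated packet format; the class and method names are made up
for illustration:

    from qt import *

    class PacketReader(QObject):
        def __init__(self, sock):
            QObject.__init__(self)
            self.sock = sock    # an already-connected QSocket
            self.connect(self.sock, SIGNAL("readyRead()"), self.readPackets)

        def readPackets(self):
            # QSocket buffers incoming data, so a single readyRead() can
            # deliver several of the client's 33ms packets at once; loop
            # until no complete packet is left in the buffer.
            while self.sock.canReadLine():
                packet = str(self.sock.readLine())
                self.handlePacket(packet)

        def handlePacket(self, packet):
            pass    # application-specific processing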

After doing some searches and closer reading I decided that the problem
was with QSocket and that I should be using QSocketDevice for the network
I/O.  This works, but it requires a fast timer (20-30ms) to poll the
socket, which eats up CPU time and slows the GUI.  I have set the timer
interval to 0 so that I get polled whenever there are no GUI events, but
this leads to wildly varying network response even if I'm just moving the
window around.  (I need a very steady clock pulse from the network.)
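
The polling version is structured more or less like this (again just a
sketch, assuming PyQt 3; dev is an already-connected, non-blocking
QSocketDevice and handleData() stands in for the real read/dispatch code):

    from qt import *

    class PollingReader(QObject):
        def __init__(self, dev, interval_ms=25):
            QObject.__init__(self)
            self.dev = dev
            self.timer = QTimer(self)
            self.connect(self.timer, SIGNAL("timeout()"), self.pollSocket)
            # 20-30ms keeps up with 33ms packets but burns CPU; an
            # interval of 0 only fires when the event loop is otherwise
            # idle, which is why dragging the window starves the polling.
            self.timer.start(interval_ms)

        def pollSocket(self):
            if self.dev.bytesAvailable() > 0:
                self.handleData()

        def handleData(self):
            pass    # read from the QSocketDevice and dispatch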

I tried using a QSocketNotifier and connecting my Read method to the
activated(int) SIGNAL, but this seems to get triggered only every 100ms.
Shouldn't it trigger as soon as data is available?
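
The notifier version, for reference (same caveats: a sketch assuming
PyQt 3, with dev an already-connected, non-blocking QSocketDevice):

    from qt import *

    class NotifierReader(QObject):
        def __init__(self, dev):
            QObject.__init__(self)
            self.dev = dev
            # Watch the OS descriptor directly; in principle activated(int)
            # should fire as soon as the descriptor becomes readable.
            self.notifier = QSocketNotifier(dev.socket(),
                                            QSocketNotifier.Read, self)
            self.connect(self.notifier, SIGNAL("activated(int)"),
                         self.readData)

        def readData(self, fd):
            pass    # drain the socket, then return to the event loop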

None of the Changes files on the Trolltech site indicate that anything
has changed in any of the components I am using, so I'm at a loss as to
what has happened.

M.

More information about the PyQt mailing list