
Persistent HTTPS Connections in Python

I want to make an HTTPS request to a real-time stream and keep the connection open so that I can keep reading content from it and processing it. I want to write the script in Python.

Solution 1:

It looks like your real-time stream is delivered as one endless HTTP GET response, yes? If so, you can just use Python's built-in urllib2.urlopen(). It returns a file-like object, from which you can read as much as you want until the server hangs up on you.

import urllib2

f = urllib2.urlopen('https://encrypted.google.com/')
while True:
    data = f.read(100)
    if not data:  # server closed the connection
        break
    print(data)

Keep in mind that although urllib2 speaks HTTPS, it doesn't validate server certificates, so you might want to try an add-on package like pycurl or urlgrabber for better security. (I'm not sure whether urlgrabber supports HTTPS.)
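For what it's worth, on Python 3 the equivalent stdlib call is urllib.request.urlopen(), which does validate server certificates by default. A minimal sketch of the same chunked-read loop (the stream_chunks helper name and the Google URL are just illustrative):

```python
import urllib.request


def stream_chunks(fileobj, chunk_size=100):
    """Yield successive chunks from a file-like object until EOF,
    i.e. until the server hangs up on you."""
    while True:
        data = fileobj.read(chunk_size)
        if not data:
            break
        yield data


if __name__ == "__main__":
    # On Python 3, urlopen verifies the server certificate by default.
    with urllib.request.urlopen("https://encrypted.google.com/") as f:
        for chunk in stream_chunks(f):
            print(chunk)
```

The helper works on any file-like object, so you can test the chunking logic without touching the network.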

Solution 2:

Connection keep-alive features are not available in any of the Python standard libraries for HTTPS. The most mature option is probably urllib3.
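A rough sketch of what that looks like, assuming urllib3 is installed (the stream_url helper and the example URL are just for illustration): a single PoolManager reuses TCP/TLS connections across requests, and passing preload_content=False lets you consume the body incrementally instead of buffering it all in memory.

```python
import urllib3


def stream_url(url, chunk_size=100):
    """Stream a response body over a pooled (keep-alive) connection."""
    http = urllib3.PoolManager()  # reuses connections across requests
    resp = http.request("GET", url, preload_content=False)
    try:
        for chunk in resp.stream(chunk_size):
            yield chunk
    finally:
        resp.release_conn()  # hand the socket back to the pool for reuse


if __name__ == "__main__":
    for chunk in stream_url("https://example.com/"):
        print(chunk)
```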

Solution 3:

httplib2 supports this. (I'd have thought it the most mature option, but I didn't know about urllib3 yet, so TokenMacGuy may still be right.)

EDIT: while httplib2 does support persistent connections, I don't think you can really consume a stream with it (i.e. one long response, as opposed to multiple requests over the same connection), which I now realise you may need.
