01_Four ways to disguise a web crawler as a browser - Summer儿 - 博客园 (cnblogs)
www.cnblogs.com › summer1019 › p1

# Method 3: add headers via a Request object
import urllib.request

req = urllib.request.Request(url)  # url: the page to fetch
req.add_header('User-Agent', 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.122 Safari/537.36 SE 2.X MetaSr 1.0')
req_data = urllib.request.urlopen(req).read()
print(len(req_data))
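The snippet above attaches the spoofed User-Agent with add_header() after constructing the Request; the same headers can also be passed to the Request constructor directly. A minimal sketch (the example.com URL and the User-Agent string are placeholders), verified without a network round-trip:

```python
import urllib.request

# Placeholder target URL and a desktop-browser User-Agent string.
url = "https://example.com/"
ua = ("Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/38.0.2125.122 Safari/537.36")

# Headers can be supplied directly to the Request constructor
# instead of calling add_header() afterwards.
req = urllib.request.Request(url, headers={"User-Agent": ua})

# urllib normalizes header names via str.capitalize(),
# so the stored key is 'User-agent'.
print(req.get_header("User-agent"))
```

Checking the header on the Request object this way confirms what will be sent, without actually opening a connection.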
urllib3 · PyPI
https://pypi.org/project/urllib3 · 10.10.2011 · Tags: urllib, httplib, threadsafe, filepost, http, https ...
- Cleaner exception chain in Python 3 for _make_request. (Issue #861)
- Fixed installing urllib3[socks] extra. (Issue #864)
- Fixed signature of ConnectionPool.close so it can actually safely be called by subclasses.
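The urllib3 package supports the same header-spoofing idea at the connection-pool level: default headers set on a PoolManager are merged into every request it issues. A minimal sketch, assuming urllib3 is installed (the User-Agent string is a placeholder):

```python
import urllib3

# Placeholder User-Agent string for illustration.
ua = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36"

# Default headers set on the PoolManager are sent with every request,
# so each fetch carries the spoofed User-Agent automatically.
http = urllib3.PoolManager(headers={"User-Agent": ua})

print(http.headers["User-Agent"])
```

Compared with setting the header on each individual request, this keeps the spoofing in one place when a crawler makes many requests through the same pool.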