Commit 330e985

Author: georg.brandl

Fix old urllib/urllib2/urlparse usage.

git-svn-id: http://svn.python.org/projects/python/branches/py3k@64478 6015fed2-1504-0410-9fe1-9d1591cc4771

1 parent b1bd458

7 files changed: 42 additions & 39 deletions

Doc/library/http.cookiejar.rst

Lines changed: 20 additions & 19 deletions
@@ -100,7 +100,7 @@ The following classes are provided:
 
    .. seealso::
 
-      Module :mod:`urllib2`
+      Module :mod:`urllib.request`
         URL opening with automatic cookie handling.
 
      Module :mod:`http.cookies`
@@ -149,11 +149,11 @@ contained :class:`Cookie` objects.
    the :class:`CookieJar`'s :class:`CookiePolicy` instance are true and false
    respectively), the :mailheader:`Cookie2` header is also added when appropriate.
 
-   The *request* object (usually a :class:`urllib2.Request` instance) must support
-   the methods :meth:`get_full_url`, :meth:`get_host`, :meth:`get_type`,
-   :meth:`unverifiable`, :meth:`get_origin_req_host`, :meth:`has_header`,
-   :meth:`get_header`, :meth:`header_items`, and :meth:`add_unredirected_header`, as
-   documented by :mod:`urllib2`.
+   The *request* object (usually a :class:`urllib.request.Request` instance)
+   must support the methods :meth:`get_full_url`, :meth:`get_host`,
+   :meth:`get_type`, :meth:`unverifiable`, :meth:`get_origin_req_host`,
+   :meth:`has_header`, :meth:`get_header`, :meth:`header_items`, and
+   :meth:`add_unredirected_header`, as documented by :mod:`urllib.request`.
 
 
    .. method:: CookieJar.extract_cookies(response, request)
@@ -166,14 +166,15 @@ contained :class:`Cookie` objects.
    as appropriate (subject to the :meth:`CookiePolicy.set_ok` method's approval).
 
    The *response* object (usually the result of a call to
-   :meth:`urllib2.urlopen`, or similar) should support an :meth:`info` method,
-   which returns a :class:`email.message.Message` instance.
+   :meth:`urllib.request.urlopen`, or similar) should support an :meth:`info`
+   method, which returns a :class:`email.message.Message` instance.
 
-   The *request* object (usually a :class:`urllib2.Request` instance) must support
-   the methods :meth:`get_full_url`, :meth:`get_host`, :meth:`unverifiable`, and
-   :meth:`get_origin_req_host`, as documented by :mod:`urllib2`. The request is
-   used to set default values for cookie-attributes as well as for checking that
-   the cookie is allowed to be set.
+   The *request* object (usually a :class:`urllib.request.Request` instance)
+   must support the methods :meth:`get_full_url`, :meth:`get_host`,
+   :meth:`unverifiable`, and :meth:`get_origin_req_host`, as documented by
+   :mod:`urllib.request`. The request is used to set default values for
+   cookie-attributes as well as for checking that the cookie is allowed to be
+   set.
 
 
    .. method:: CookieJar.set_policy(policy)
@@ -715,31 +716,31 @@ Examples
 
 The first example shows the most common usage of :mod:`http.cookiejar`::
 
-    import http.cookiejar, urllib2
+    import http.cookiejar, urllib.request
     cj = http.cookiejar.CookieJar()
-    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
+    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
     r = opener.open("http://example.com/")
 
 This example illustrates how to open a URL using your Netscape, Mozilla, or Lynx
 cookies (assumes Unix/Netscape convention for location of the cookies file)::
 
-    import os, http.cookiejar, urllib2
+    import os, http.cookiejar, urllib.request
     cj = http.cookiejar.MozillaCookieJar()
     cj.load(os.path.join(os.environ["HOME"], ".netscape/cookies.txt"))
-    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
+    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
     r = opener.open("http://example.com/")
 
 The next example illustrates the use of :class:`DefaultCookiePolicy`. Turn on
 RFC 2965 cookies, be more strict about domains when setting and returning
 Netscape cookies, and block some domains from setting cookies or having them
 returned::
 
-    import urllib2
+    import urllib.request
     from http.cookiejar import CookieJar, DefaultCookiePolicy
     policy = DefaultCookiePolicy(
         rfc2965=True, strict_ns_domain=Policy.DomainStrict,
         blocked_domains=["ads.net", ".ads.net"])
     cj = CookieJar(policy)
-    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
+    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
     r = opener.open("http://example.com/")
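The migrated first example can be smoke-tested without touching the network. This sketch (an editor's illustration, not part of the commit) builds the opener with the new names but stops short of `opener.open()`:

```python
import http.cookiejar
import urllib.request

# Build the opener exactly as the migrated doc example does; no request
# is actually issued here.
cj = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))

# The cookie processor is installed among the opener's handlers.
has_cookie_handler = any(
    isinstance(h, urllib.request.HTTPCookieProcessor) for h in opener.handlers
)
print(has_cookie_handler)  # True
```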

Doc/library/urllib.request.rst

Lines changed: 2 additions & 2 deletions
@@ -1077,15 +1077,15 @@ Adding HTTP headers:
 
 Use the *headers* argument to the :class:`Request` constructor, or::
 
-   import urllib
+   import urllib.request
    req = urllib.request.Request('http://www.example.com/')
    req.add_header('Referer', 'http://www.python.org/')
    r = urllib.request.urlopen(req)
 
 :class:`OpenerDirector` automatically adds a :mailheader:`User-Agent` header to
 every :class:`Request`. To change this::
 
-   import urllib
+   import urllib.request
    opener = urllib.request.build_opener()
    opener.addheaders = [('User-agent', 'Mozilla/5.0')]
    opener.open('http://www.example.com/')
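Both corrected snippets can be checked offline (an editor's sketch, not part of the commit): the per-request header and the opener-wide `addheaders` list are observable without issuing a request:

```python
import urllib.request

# Per-request header, as in the first snippet.
req = urllib.request.Request('http://www.example.com/')
req.add_header('Referer', 'http://www.python.org/')
print(req.get_header('Referer'))  # http://www.python.org/

# Opener-wide headers, as in the second snippet; addheaders is consulted
# for every request this opener would issue.
opener = urllib.request.build_opener()
opener.addheaders = [('User-agent', 'Mozilla/5.0')]
print(opener.addheaders)  # [('User-agent', 'Mozilla/5.0')]
```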

Lib/http/cookiejar.py

Lines changed: 1 addition & 1 deletion
@@ -1305,7 +1305,7 @@ def _cookie_attrs(self, cookies):
         return attrs
 
     def add_cookie_header(self, request):
-        """Add correct Cookie: header to request (urllib2.Request object).
+        """Add correct Cookie: header to request (urllib.request.Request object).
 
         The Cookie2 header is also added unless policy.hide_cookie2 is true.
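As an offline illustration (an editor's sketch, not part of the commit), a `urllib.request.Request` supplies everything `add_cookie_header` needs; with an empty jar, no Cookie header is attached:

```python
import http.cookiejar
import urllib.request

cj = http.cookiejar.CookieJar()
req = urllib.request.Request('http://example.com/')

# With no cookies stored, add_cookie_header leaves the request untouched.
cj.add_cookie_header(req)
print(req.has_header('Cookie'))  # False
```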

Lib/logging/handlers.py

Lines changed: 2 additions & 2 deletions
@@ -1002,11 +1002,11 @@ def emit(self, record):
         Send the record to the Web server as an URL-encoded dictionary
         """
         try:
-            import http.client, urllib
+            import http.client, urllib.parse
             host = self.host
             h = http.client.HTTP(host)
             url = self.url
-            data = urllib.urlencode(self.mapLogRecord(record))
+            data = urllib.parse.urlencode(self.mapLogRecord(record))
             if self.method == "GET":
                 if (url.find('?') >= 0):
                     sep = '&'
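The renamed call is easy to exercise offline. This sketch (an editor's illustration, not part of the commit) encodes a made-up stand-in for the dictionary `mapLogRecord()` would build:

```python
import urllib.parse

# A stand-in for the dict HTTPHandler.mapLogRecord() builds from a record;
# the keys and values here are illustrative only.
record_dict = {'name': 'root', 'levelno': 20, 'msg': 'hello world'}
data = urllib.parse.urlencode(record_dict)
print(data)  # name=root&levelno=20&msg=hello+world
```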

Lib/urllib/request.py

Lines changed: 11 additions & 10 deletions
@@ -47,24 +47,25 @@
 
 Example usage:
 
-import urllib2
+import urllib.request
 
 # set up authentication info
-authinfo = urllib2.HTTPBasicAuthHandler()
+authinfo = urllib.request.HTTPBasicAuthHandler()
 authinfo.add_password(realm='PDQ Application',
                       uri='https://mahler:8092/site-updates.py',
                       user='klem',
                       passwd='geheim$parole')
 
-proxy_support = urllib2.ProxyHandler({"http" : "http://ahad-haam:3128"})
+proxy_support = urllib.request.ProxyHandler({"http" : "http://ahad-haam:3128"})
 
 # build a new opener that adds authentication and caching FTP handlers
-opener = urllib2.build_opener(proxy_support, authinfo, urllib2.CacheFTPHandler)
+opener = urllib.request.build_opener(proxy_support, authinfo,
+                                     urllib.request.CacheFTPHandler)
 
 # install it
-urllib2.install_opener(opener)
+urllib.request.install_opener(opener)
 
-f = urllib2.urlopen('http://www.python.org/')
+f = urllib.request.urlopen('http://www.python.org/')
 """
 
 # XXX issues:
@@ -502,7 +503,7 @@ def redirect_request(self, req, fp, code, msg, headers, newurl):
 
         # Strictly (according to RFC 2616), 301 or 302 in response to
         # a POST MUST NOT cause a redirection without confirmation
-        # from the user (of urllib2, in this case).  In practice,
+        # from the user (of urllib.request, in this case).  In practice,
         # essentially all clients do redirect in this case, so we do
         # the same.
         # be conciliant with URIs containing a space
@@ -655,7 +656,7 @@ def proxy_open(self, req, proxy, type):
         if proxy_type is None:
             proxy_type = orig_type
         if user and password:
-            user_pass = '%s:%s' % (unquote(user),
+            user_pass = '%s:%s' % (urllib.parse.unquote(user),
                                    urllib.parse.unquote(password))
             creds = base64.b64encode(user_pass.encode()).decode("ascii")
             req.add_header('Proxy-authorization', 'Basic ' + creds)
@@ -808,7 +809,7 @@ class ProxyBasicAuthHandler(AbstractBasicAuthHandler, BaseHandler):
 
     def http_error_407(self, req, fp, code, msg, headers):
         # http_error_auth_reqed requires that there is no userinfo component in
-        # authority.  Assume there isn't one, since urllib2 does not (and
+        # authority.  Assume there isn't one, since urllib.request does not (and
         # should not, RFC 3986 s. 3.2.1) support requests for URLs containing
         # userinfo.
         authority = req.get_host()
@@ -1194,7 +1195,7 @@ def open_local_file(self, req):
             return urllib.response.addinfourl(open(localfile, 'rb'),
                                               headers, 'file:'+file)
         except OSError as msg:
-            # urllib2 users shouldn't expect OSErrors coming from urlopen()
+            # users shouldn't expect OSErrors coming from urlopen()
             raise urllib.error.URLError(msg)
         raise urllib.error.URLError('file not on local host')
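The proxy-credential change in `proxy_open` can be reproduced offline (an editor's sketch, not part of the commit; the sample user and password are made up):

```python
import base64
import urllib.parse

# Hypothetical percent-encoded credentials as they might appear in a proxy
# URL's userinfo; proxy_open unquotes them before base64-encoding.
user, password = 'klem%40example.com', 'geheim%24parole'
user_pass = '%s:%s' % (urllib.parse.unquote(user),
                       urllib.parse.unquote(password))
creds = base64.b64encode(user_pass.encode()).decode("ascii")
print(user_pass)  # klem@example.com:geheim$parole
```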

Mac/BuildScript/build-installer.py

Lines changed: 3 additions & 2 deletions
@@ -9,7 +9,8 @@
 
 Usage: see USAGE variable in the script.
 """
-import platform, os, sys, getopt, textwrap, shutil, urllib2, stat, time, pwd
+import platform, os, sys, getopt, textwrap, shutil, stat, time, pwd
+import urllib.request
 import grp
 
 INCLUDE_TIMESTAMP = 1

@@ -442,7 +443,7 @@ def downloadURL(url, fname):
     if KNOWNSIZES.get(url) == size:
         print("Using existing file for", url)
         return
-    fpIn = urllib2.urlopen(url)
+    fpIn = urllib.request.urlopen(url)
     fpOut = open(fname, 'wb')
     block = fpIn.read(10240)
     try:
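`downloadURL`'s block-copy loop is independent of which opener produced the file object. In this offline sketch (an editor's illustration, not part of the commit), `io.BytesIO` stands in for the file-like object `urllib.request.urlopen()` returns:

```python
import io

payload = b'x' * 25000
fpIn = io.BytesIO(payload)   # stand-in for urllib.request.urlopen(url)
fpOut = io.BytesIO()         # stand-in for open(fname, 'wb')

# Copy in 10240-byte blocks, as downloadURL does.
block = fpIn.read(10240)
while block:
    fpOut.write(block)
    block = fpIn.read(10240)
print(fpOut.getvalue() == payload)  # True
```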

Misc/cheatsheet

Lines changed: 3 additions & 3 deletions
@@ -1889,7 +1889,6 @@ random Random variable generators
 re          Regular Expressions.
 reprlib     Redo repr() but with limits on most sizes.
 rlcompleter Word completion for GNU readline 2.0.
-robotparser Parse robots.txt files, useful for web spiders.
 sched       A generally useful event scheduler class.
 shelve      Manage shelves of pickled objects.
 shlex       Lexical analyzer class for simple shell-like syntaxes.

@@ -1920,8 +1919,9 @@ turtle LogoMation-like turtle graphics
 types       Define names for all type symbols in the std interpreter.
 tzparse     Parse a timezone specification.
 unicodedata Interface to unicode properties.
-urllib      Open an arbitrary URL.
-urlparse    Parse URLs according to latest draft of standard.
+urllib.parse       Parse URLs according to latest draft of standard.
+urllib.request     Open an arbitrary URL.
+urllib.robotparser Parse robots.txt files, useful for web spiders.
 user        Hook to allow user-specified customization code to run.
 uu          UUencode/UUdecode.
 unittest    Utilities for implementing unit testing.
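The relocated robotparser can be exercised entirely in memory (an editor's sketch, not part of the commit; the rules and URLs are made up):

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# Parse an in-memory robots.txt instead of fetching one over the network.
rp.parse(['User-agent: *', 'Disallow: /private/'])
print(rp.can_fetch('*', 'http://example.com/private/page'))  # False
print(rp.can_fetch('*', 'http://example.com/public/page'))   # True
```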
