RUN: /usr/share/launchpad-buildd/slavebin/unpack-chroot ['unpack-chroot', '167406-93', '/home/buildd/filecache-default/9ed7ca12e2ce6a2f98c7c7f4080f0de4e6b1fe83']
Unpacking chroot for build 167406-93
RUN: /usr/share/launchpad-buildd/slavebin/mount-chroot ['mount-chroot', '167406-93']
Mounting chroot for build 167406-93
RUN: /usr/share/launchpad-buildd/slavebin/apply-ogre-model ['apply-ogre-model', '167406-93', 'universe']
Attempting OGRE for universe in build-167406-93
RUN: /usr/share/launchpad-buildd/slavebin/update-debian-chroot ['update-debian-chroot', '167406-93']
Updating debian chroot for build 167406-93
Get:1 http://ftpmaster.internal dapper Release.gpg [189B]
Get:2 http://ftpmaster.internal dapper Release [34.8kB]
Get:3 http://ftpmaster.internal dapper/main Packages [597kB]
Get:4 http://ftpmaster.internal dapper/restricted Packages [5416B]
Get:5 http://ftpmaster.internal dapper/universe Packages [2440kB]
Fetched 3078kB in 3s (927kB/s)
Reading package lists...
Reading package lists...
Building dependency tree...
The following packages will be upgraded:
  bash bsdutils coreutils debconf debconf-i18n diff initscripts libc6 libc6-dev libpam-modules libpam-runtime libpam0g mount sysv-rc sysvinit util-linux
16 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Need to get 0B/11.4MB of archives.
After unpacking 90.1kB of additional disk space will be used.
(Reading database ... 8222 files and directories currently installed.)
Preparing to replace bash 3.1-2ubuntu3 (using .../bash_3.1-2ubuntu4_i386.deb) ...
Unpacking replacement bash ...
Setting up bash (3.1-2ubuntu4) ...
Installing new version of config file /etc/bash_completion ...
(Reading database ... 8222 files and directories currently installed.)
Preparing to replace bsdutils 1:2.12r-4 (using .../bsdutils_1%3a2.12r-4ubuntu1_i386.deb) ...
Unpacking replacement bsdutils ...
Setting up bsdutils (2.12r-4ubuntu1) ...
(Reading database ... 8222 files and directories currently installed.)
Preparing to replace coreutils 5.93-5ubuntu1 (using .../coreutils_5.93-5ubuntu2_i386.deb) ...
No diversion `any diversion of /usr/share/man/man1/md5sum.textutils.1.gz', none removed
No diversion `any diversion of /usr/bin/md5sum.textutils', none removed
Unpacking replacement coreutils ...
Setting up coreutils (5.93-5ubuntu2) ...
(Reading database ... 8222 files and directories currently installed.)
Preparing to replace diff 2.8.1-11ubuntu1 (using .../diff_2.8.1-11ubuntu3_i386.deb) ...
Unpacking replacement diff ...
Setting up diff (2.8.1-11ubuntu3) ...
(Reading database ... 8204 files and directories currently installed.)
Preparing to replace mount 2.12r-4 (using .../mount_2.12r-4ubuntu1_i386.deb) ...
Unpacking replacement mount ...
Setting up mount (2.12r-4ubuntu1) ...
(Reading database ... 8204 files and directories currently installed.)
Preparing to replace sysvinit 2.86.ds1-6ubuntu6 (using .../sysvinit_2.86.ds1-6ubuntu7_i386.deb) ...
Unpacking replacement sysvinit ...
Setting up sysvinit (2.86.ds1-6ubuntu7) ...
init: timeout opening/writing control channel /dev/initctl
(Reading database ... 8204 files and directories currently installed.)
Preparing to replace util-linux 2.12r-4 (using .../util-linux_2.12r-4ubuntu1_i386.deb) ...
Unpacking replacement util-linux ...
Setting up util-linux (2.12r-4ubuntu1) ...
Installing new version of config file /etc/init.d/hwclock.sh ...
(Reading database ... 8203 files and directories currently installed.)
Preparing to replace debconf-i18n 1.4.69ubuntu1 (using .../debconf-i18n_1.4.70ubuntu1_all.deb) ...
Unpacking replacement debconf-i18n ...
Preparing to replace debconf 1.4.69ubuntu1 (using .../debconf_1.4.70ubuntu1_all.deb) ...
Unpacking replacement debconf ...
Preparing to replace libc6-dev 2.3.6-0ubuntu4 (using .../libc6-dev_2.3.6-0ubuntu5_i386.deb) ...
Unpacking replacement libc6-dev ...
Preparing to replace libc6 2.3.6-0ubuntu4 (using .../libc6_2.3.6-0ubuntu5_i386.deb) ...
Unpacking replacement libc6 ...
Setting up libc6 (2.3.6-0ubuntu5) ...
Current default timezone: 'UTC'.
Local time is now: Sat Feb 4 10:27:04 UTC 2006.
Universal Time is now: Sat Feb 4 10:27:04 UTC 2006.
Run 'tzconfig' if you wish to change it.
(Reading database ... 8203 files and directories currently installed.)
Preparing to replace initscripts 2.86.ds1-6ubuntu6 (using .../initscripts_2.86.ds1-6ubuntu7_i386.deb) ...
Unpacking replacement initscripts ...
Setting up initscripts (2.86.ds1-6ubuntu7) ...
Installing new version of config file /etc/init.d/mountvirtfs ...
(Reading database ... 8203 files and directories currently installed.)
Preparing to replace libpam-runtime 0.79-3ubuntu5 (using .../libpam-runtime_0.79-3ubuntu7_all.deb) ...
Unpacking replacement libpam-runtime ...
Setting up libpam-runtime (0.79-3ubuntu7) ...
(Reading database ... 8203 files and directories currently installed.)
Preparing to replace libpam0g 0.79-3ubuntu5 (using .../libpam0g_0.79-3ubuntu7_i386.deb) ...
Unpacking replacement libpam0g ...
Setting up libpam0g (0.79-3ubuntu7) ...
(Reading database ... 8203 files and directories currently installed.)
Preparing to replace libpam-modules 0.79-3ubuntu5 (using .../libpam-modules_0.79-3ubuntu7_i386.deb) ...
Unpacking replacement libpam-modules ...
Setting up libpam-modules (0.79-3ubuntu7) ...
(Reading database ... 8203 files and directories currently installed.)
Preparing to replace sysv-rc 2.86.ds1-6ubuntu6 (using .../sysv-rc_2.86.ds1-6ubuntu7_all.deb) ...
Unpacking replacement sysv-rc ...
Setting up sysv-rc (2.86.ds1-6ubuntu7) ...
Setting up libc6-dev (2.3.6-0ubuntu5) ...
Setting up debconf-i18n (1.4.70ubuntu1) ...
Setting up debconf (1.4.70ubuntu1) ...
RUN: /usr/share/launchpad-buildd/slavebin/sbuild-package ['sbuild-package', '167406-93', '-dautobuild', '--nolog', '--batch', '--archive=ubuntu', '-A', '--comp=universe', 'urlgrabber_2.9.7-2.dsc']
Initiating build
Automatic build of urlgrabber_2.9.7-2 on rothera by sbuild/i386 1.170.5
Build started at 20060204-1027
******************************************************************************
urlgrabber_2.9.7-2.dsc exists in cwd
** Using build dependencies supplied by package:
Build-Depends: debhelper (>= 4.0.0), python2.3-dev, python
Checking for already installed source dependencies...
debhelper: missing
python2.3-dev: missing
python: missing
Checking for source dependency conflicts...
/usr/bin/sudo /usr/bin/apt-get --purge $CHROOT_OPTIONS -q -y install debhelper python2.3-dev python
Reading package lists...
Building dependency tree...
The following extra packages will be installed:
  debconf-utils file gcj-4.0-base gettext gettext-base html2text intltool-debian libbz2-1.0 libgcj-common libgcj6 libmagic1 libreadline5 libssl0.9.8 po-debconf python2.3 python2.4 readline-common
Suggested packages:
  dh-make cvs gettext-doc python-doc python-tk python-profiler python2.3-doc python2.3-profiler python2.4-doc
Recommended packages:
  curl wget lynx libgcj6-jar libmail-sendmail-perl libcompress-zlib-perl python2.3-cjkcodecs python2.3-iconvcodec python2.3-japanese-codecs
The following NEW packages will be installed:
  debconf-utils debhelper file gcj-4.0-base gettext gettext-base html2text intltool-debian libbz2-1.0 libgcj-common libgcj6 libmagic1 libreadline5 libssl0.9.8 po-debconf python python2.3 python2.3-dev python2.4 readline-common
0 upgraded, 20 newly installed, 0 to remove and 0 not upgraded.
Need to get 4182kB/17.7MB of archives.
After unpacking 57.7MB of additional disk space will be used.
Get:1 http://ftpmaster.internal dapper/main python2.3 2.3.5-9ubuntu1 [2794kB]
Get:2 http://ftpmaster.internal dapper/main python2.3-dev 2.3.5-9ubuntu1 [1388kB]
Fetched 4182kB in 0s (23.4MB/s)
Selecting previously deselected package gettext-base.
(Reading database ... 8203 files and directories currently installed.)
Unpacking gettext-base (from .../gettext-base_0.14.5-2ubuntu2_i386.deb) ...
Selecting previously deselected package libbz2-1.0.
Unpacking libbz2-1.0 (from .../libbz2-1.0_1.0.3-0ubuntu1_i386.deb) ...
Selecting previously deselected package readline-common.
Unpacking readline-common (from .../readline-common_5.1-5_all.deb) ...
Selecting previously deselected package libreadline5.
Unpacking libreadline5 (from .../libreadline5_5.1-5_i386.deb) ...
Selecting previously deselected package libssl0.9.8.
Unpacking libssl0.9.8 (from .../libssl0.9.8_0.9.8a-5_i386.deb) ...
Selecting previously deselected package python2.4.
Unpacking python2.4 (from .../python2.4_2.4.2-1ubuntu2_i386.deb) ...
Selecting previously deselected package python.
Unpacking python (from .../python_2.4.2-0ubuntu3_all.deb) ...
Selecting previously deselected package libmagic1.
Unpacking libmagic1 (from .../libmagic1_4.16-0ubuntu1_i386.deb) ...
Selecting previously deselected package file.
Unpacking file (from .../file_4.16-0ubuntu1_i386.deb) ...
Selecting previously deselected package debconf-utils.
Unpacking debconf-utils (from .../debconf-utils_1.4.70ubuntu1_all.deb) ...
Selecting previously deselected package html2text.
Unpacking html2text (from .../html2text_1.3.2a-3_i386.deb) ...
Selecting previously deselected package gcj-4.0-base.
Unpacking gcj-4.0-base (from .../gcj-4.0-base_4.0.2-7ubuntu3_i386.deb) ...
Selecting previously deselected package libgcj-common.
Unpacking libgcj-common (from .../libgcj-common_1%3a4.0.2-7ubuntu3_all.deb) ...
Selecting previously deselected package libgcj6.
Unpacking libgcj6 (from .../libgcj6_4.0.2-7ubuntu3_i386.deb) ...
Selecting previously deselected package gettext.
Unpacking gettext (from .../gettext_0.14.5-2ubuntu2_i386.deb) ...
Selecting previously deselected package intltool-debian.
Unpacking intltool-debian (from .../intltool-debian_0.34.1+20050828_all.deb) ...
Selecting previously deselected package po-debconf.
Unpacking po-debconf (from .../po-debconf_0.9.2_all.deb) ...
Selecting previously deselected package debhelper.
Unpacking debhelper (from .../debhelper_5.0.7ubuntu4_all.deb) ...
Selecting previously deselected package python2.3.
Unpacking python2.3 (from .../python2.3_2.3.5-9ubuntu1_i386.deb) ...
Selecting previously deselected package python2.3-dev.
Unpacking python2.3-dev (from .../python2.3-dev_2.3.5-9ubuntu1_i386.deb) ...
Setting up gettext-base (0.14.5-2ubuntu2) ...
Setting up libbz2-1.0 (1.0.3-0ubuntu1) ...
Setting up readline-common (5.1-5) ...
Setting up libreadline5 (5.1-5) ...
Setting up libssl0.9.8 (0.9.8a-5) ...
Setting up python2.4 (2.4.2-1ubuntu2) ...
Setting up python (2.4.2-0ubuntu3) ...
Setting up libmagic1 (4.16-0ubuntu1) ...
Setting up file (4.16-0ubuntu1) ...
Setting up debconf-utils (1.4.70ubuntu1) ...
Setting up html2text (1.3.2a-3) ...
Setting up gcj-4.0-base (4.0.2-7ubuntu3) ...
Setting up libgcj-common (4.0.2-7ubuntu3) ...
Setting up libgcj6 (4.0.2-7ubuntu3) ...
Setting up gettext (0.14.5-2ubuntu2) ...
Setting up intltool-debian (0.34.1+20050828) ...
Setting up po-debconf (0.9.2) ...
Setting up debhelper (5.0.7ubuntu4) ...
Setting up python2.3 (2.3.5-9ubuntu1) ...
Setting up python2.3-dev (2.3.5-9ubuntu1) ...
Checking correctness of source dependencies...
Toolchain package versions: libc6-dev_2.3.6-0ubuntu5 make_3.80+3.81.b4-1 dpkg-dev_1.13.11ubuntu1 linux-kernel-headers_2.6.11.2-0ubuntu15 gcc-4.0_4.0.2-7ubuntu1 g++-4.0_4.0.2-7ubuntu1 binutils_2.16.1cvs20060117-1ubuntu1 libstdc++6-4.0-dev_4.0.2-7ubuntu1 libstdc++6_4.0.2-7ubuntu1
------------------------------------------------------------------------------
dpkg-source: extracting urlgrabber in urlgrabber-2.9.7
dpkg-buildpackage: source package is urlgrabber
dpkg-buildpackage: source version is 2.9.7-2
dpkg-buildpackage: host architecture i386
/usr/bin/fakeroot debian/rules clean
dh_testdir
dh_testroot
rm -f build-stamp configure-stamp
# Add here commands to clean up after the build process.
#-/usr/bin/make clean
python setup.py clean
running clean
'build/lib' does not exist -- can't clean it
'build/bdist.linux-i686' does not exist -- can't clean it
'build/scripts-2.4' does not exist -- can't clean it
find /build/buildd/urlgrabber-2.9.7 -name "*.pyc" -exec rm -f '{}' \;
dh_clean
debian/rules build
dh_testdir
# Add here commands to configure the package.
touch configure-stamp
dh_testdir
# Add here commands to compile the package.
#/usr/bin/make
python setup.py build
running build
running build_py
creating build
creating build/lib
creating build/lib/urlgrabber
copying urlgrabber/byterange.py -> build/lib/urlgrabber
copying urlgrabber/keepalive.py -> build/lib/urlgrabber
copying urlgrabber/progress.py -> build/lib/urlgrabber
copying urlgrabber/mirror.py -> build/lib/urlgrabber
copying urlgrabber/grabber.py -> build/lib/urlgrabber
copying urlgrabber/__init__.py -> build/lib/urlgrabber
running build_scripts
creating build/scripts-2.4
copying and adjusting scripts/urlgrabber -> build/scripts-2.4
changing mode of build/scripts-2.4/urlgrabber from 644 to 755
#docbook-to-man debian/urlgrabber.sgml > urlgrabber.1
python test/runtests.py
urlgrabber tests
grabber.py tests
BaseProxyTests
Test checkfunc behavior
check for proper args when used with urlgrab ... ERROR
check failure with urlgrab checkfunc ... ok
check success with urlgrab checkfunc ... ERROR
check for proper args when used with urlread ... ERROR
check failure with urlread checkfunc ... ok
check success with urlread checkfunc ... ERROR
CommonRegetTests
exception raised for illegal reget mode ... ok
FTPRegetTests
simple (forced) reget ... skip
Test failure behavior
failure callback is called with the proper args ... ok
failure callback is called on retry ... ok
FileObjectTests
URLGrabberFileObject .read() method ... ok
URLGrabberFileObject .readline() method ... ok
URLGrabberFileObject .readlines() method ... ok
URLGrabberFileObject .read(N) with small N ... ok
FileRegetTests
simple (forced) reget ... ok
test_newer_check_timestamp (test_grabber.FileRegetTests) ... ok
test_older_check_timestamp (test_grabber.FileRegetTests) ... ok
HTTPRegetTests
simple (forced) reget ... ERROR
test_newer_check_timestamp (test_grabber.HTTPRegetTests) ... ERROR
test_older_check_timestamp (test_grabber.HTTPRegetTests) ... ERROR
HTTPTests
do an HTTP post ... ERROR
download refernce file via HTTP ... ERROR
Test interrupt callback behavior
interrupt callback is called on retry ... ERROR
interrupt callback raises an exception ... ERROR
ProFTPDSucksTests
test_restart_workaround (test_grabber.ProFTPDSucksTests) ... skip
ProxyFTPAuthTests
test_bad_password (test_grabber.ProxyFTPAuthTests) ... skip
test_good_password (test_grabber.ProxyFTPAuthTests) ... skip
ProxyFormatTests
test_bad_proxy_formats (test_grabber.ProxyFormatTests) ... ok
test_good_proxy_formats (test_grabber.ProxyFormatTests) ... ok
ProxyHTTPAuthTests
test_bad_password (test_grabber.ProxyHTTPAuthTests) ... skip
test_good_password (test_grabber.ProxyHTTPAuthTests) ... skip
Test module level functions defined in grabber.py
module-level urlgrab() function ... ERROR
module-level urlopen() function ... ERROR
module-level urlread() function ... ERROR
Test grabber.URLGrabber class
grabber.URLGrabber.__init__() **kwargs handling. ... ok
grabber.URLGrabber._make_callback() tests ... ok
grabber.URLGrabber._parse_url() ... ok
grabber.URLGrabber._parse_url('/local/file/path') ... ok
grabber.URLGrabber._parse_url() with .prefix ... ok
byterange.py tests
Test module level functions defined in range.py
byterange.range_header_to_tuple() ... ok
byterange.range_tuple_normalize() ... ok
byterange.range_tuple_to_header() ... ok
Test range.RangeableFileObject class
RangeableFileObject.seek() poor mans version.. ... ok
RangeableFileObject.read() ... ok
RangeableFileObject.read(): to end of file. ... ok
RangeableFileObject.readline() ... ok
RangeableFileObject.seek() ... ok
RangeableFileObject.tell() ... ok
mirror.py tests
ActionTests
test the effects of a callback-returned action ... ok
test default action policy ... ok
test the effects of passed-in default_action ... ok
test the effects of method-level default_action ... ok
BadMirrorTests
test that a bad mirror raises URLGrabError ... ok
BasicTests
MirrorGroup.urlgrab ... ERROR
MirrorGroup.urlopen ... ERROR
MirrorGroup.urlread ... ERROR
CallbackTests
test that the callback can correctly re-raise the exception ... ok
test that MG executes the failure callback correctly ... ERROR
FailoverTests
test that a the MG fails over past a bad mirror ... ERROR
SubclassTests
MGRandomOrder.urlgrab ... ERROR
MGRandomStart.urlgrab ... ERROR
keepalive.py tests
CorruptionTests
download a file with mixed readline() and read(23) calls ... ERROR
download a file with a single call to read() ... ERROR
download a file with multiple calls to readline() ... ERROR
download a file with a single call to readlines() ... ERROR
download a file with multiple calls to read(23) ... ERROR
DroppedConnectionTests
testing connection restarting (20-second delay, ctrl-c to skip) ... ERROR
HTTPErrorTests
test that 200 works without fancy handler ... ERROR
test that 200 works with fancy handler ... ERROR
test that 403 works without fancy handler ... ok
test that 403 works with fancy handler ... ok
test that 404 works without fancy handler ...
ok test that 404 works with fancy handler ... ok ThreadingTests use 3 threads, each getting a file 4 times ... Exception in thread Thread-2: Traceback (most recent call last): File "/usr/lib/python2.4/threading.py", line 442, in __bootstrap self.run() File "test/test_keepalive.py", line 245, in run fo = self.opener.open(self.url) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: Exception in thread Thread-1: Traceback (most recent call last): File "/usr/lib/python2.4/threading.py", line 442, in __bootstrap self.run() File "test/test_keepalive.py", line 245, in run fo = self.opener.open(self.url) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: Exception in thread Thread-3: Traceback (most recent call last): File "/usr/lib/python2.4/threading.py", line 442, in __bootstrap self.run() File "test/test_keepalive.py", line 245, in run fo = self.opener.open(self.url) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: FAIL =============================================================================== ERROR: check for proper args when used with urlgrab ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 341, in test_checkfunc_urlgrab_args self.g.urlgrab(short_ref_http, self.filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: check success with urlgrab checkfunc 
------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 354, in test_checkfunc_urlgrab_success self.g.urlgrab(short_ref_http, self.filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: check for proper args when used with urlread ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 347, in test_checkfunc_urlread_args self.g.urlread(short_ref_http) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 820, in urlread s = self._retry(opts, retryfunc, url, limit) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 799, in retryfunc fo = URLGrabberFileObject(url, filename=None, opts=opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: check success with urlread checkfunc ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 359, in test_checkfunc_urlread_success self.g.urlread(short_ref_http) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 820, in urlread s = self._retry(opts, retryfunc, url, limit) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 799, in retryfunc fo = URLGrabberFileObject(url, filename=None, opts=opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: simple (forced) reget ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 416, in test_basic_reget 
self.grabber.urlgrab(self.url, self.filename, reget='simple') File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: test_newer_check_timestamp (test_grabber.HTTPRegetTests) ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 445, in test_newer_check_timestamp self.grabber.urlgrab(self.url, self.filename, reget='check_timestamp') File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: test_older_check_timestamp (test_grabber.HTTPRegetTests) ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 433, in test_older_check_timestamp self.grabber.urlgrab(self.url, self.filename, reget='check_timestamp') File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: do an HTTP post ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 100, in test_post http_headers=headers) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 593, in 
urlread return default_grabber.urlread(url, limit, **kwargs) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 820, in urlread s = self._retry(opts, retryfunc, url, limit) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 799, in retryfunc fo = URLGrabberFileObject(url, filename=None, opts=opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: download refernce file via HTTP ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 87, in test_reference_file grabber.urlgrab(ref_http, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 573, in urlgrab return default_grabber.urlgrab(url, filename, **kwargs) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: interrupt callback is called on retry ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 286, in test_interrupt_callback_called try: g.urlgrab(ref_http) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: interrupt callback raises an exception ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 296, in test_interrupt_callback_raises self.assertRaises(self.TestException, 
g.urlgrab, ref_http) File "/build/buildd/urlgrabber-2.9.7/test/munittest.py", line 383, in failUnlessRaises callableObj(*args, **kwargs) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: module-level urlgrab() function ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 121, in test_urlgrab filename=outfile) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 573, in urlgrab return default_grabber.urlgrab(url, filename, **kwargs) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 784, in urlgrab return self._retry(opts, retryfunc, url, filename) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 770, in retryfunc fo = URLGrabberFileObject(url, filename, opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: module-level urlopen() function ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 114, in test_urlopen fo = urlgrabber.urlopen('http://www.python.org') File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 583, in urlopen return default_grabber.urlopen(url, **kwargs) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 741, in urlopen return self._retry(opts, retryfunc, url) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 740, in retryfunc return URLGrabberFileObject(url, filename=None, opts=opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: module-level urlread() function 
------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_grabber.py", line 126, in test_urlread s = urlgrabber.urlread('http://www.python.org') File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 593, in urlread return default_grabber.urlread(url, limit, **kwargs) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 820, in urlread s = self._retry(opts, retryfunc, url, limit) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 702, in _retry r = apply(func, (opts,) + args, {}) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 799, in retryfunc fo = URLGrabberFileObject(url, filename=None, opts=opts) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 893, in __init__ self._do_open() File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 960, in _do_open fo, hdr = self._make_request(req, opener) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/grabber.py", line 1067, in _make_request raise URLGrabError(4, _('IOError: %s') % (e, )) URLGrabError: [Errno 4] IOError: =============================================================================== ERROR: MirrorGroup.urlgrab ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_mirror.py", line 54, in test_urlgrab self.mg.urlgrab(url, filename) File "test/../urlgrabber/mirror.py", line 411, in urlgrab return self._mirror_try(func, url, kw) File "test/../urlgrabber/mirror.py", line 389, in _mirror_try mirrorchoice = self._get_mirror(gr) File "test/../urlgrabber/mirror.py", line 286, in _get_mirror raise URLGrabError(256, _('No more mirrors to try.')) URLGrabError: [Errno 256] No more mirrors to try. =============================================================================== ERROR: MirrorGroup.urlopen ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_mirror.py", line 72, in test_urlopen fo = self.mg.urlopen(url) File "test/../urlgrabber/mirror.py", line 416, in urlopen return self._mirror_try(func, url, kw) File "test/../urlgrabber/mirror.py", line 389, in _mirror_try mirrorchoice = self._get_mirror(gr) File "test/../urlgrabber/mirror.py", line 286, in _get_mirror raise URLGrabError(256, _('No more mirrors to try.')) URLGrabError: [Errno 256] No more mirrors to try. =============================================================================== ERROR: MirrorGroup.urlread ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_mirror.py", line 65, in test_urlread data = self.mg.urlread(url) File "test/../urlgrabber/mirror.py", line 422, in urlread return self._mirror_try(func, url, kw) File "test/../urlgrabber/mirror.py", line 389, in _mirror_try mirrorchoice = self._get_mirror(gr) File "test/../urlgrabber/mirror.py", line 286, in _get_mirror raise URLGrabError(256, _('No more mirrors to try.')) URLGrabError: [Errno 256] No more mirrors to try. 
=============================================================================== ERROR: test that MG executes the failure callback correctly ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_mirror.py", line 117, in test_failure_callback data = self.mg.urlread('reference') File "test/../urlgrabber/mirror.py", line 422, in urlread return self._mirror_try(func, url, kw) File "test/../urlgrabber/mirror.py", line 389, in _mirror_try mirrorchoice = self._get_mirror(gr) File "test/../urlgrabber/mirror.py", line 286, in _get_mirror raise URLGrabError(256, _('No more mirrors to try.')) URLGrabError: [Errno 256] No more mirrors to try. =============================================================================== ERROR: test that a the MG fails over past a bad mirror ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_mirror.py", line 153, in test_simple_grab self.mg.urlgrab(url, filename, failure_callback=cb) File "test/../urlgrabber/mirror.py", line 411, in urlgrab return self._mirror_try(func, url, kw) File "test/../urlgrabber/mirror.py", line 389, in _mirror_try mirrorchoice = self._get_mirror(gr) File "test/../urlgrabber/mirror.py", line 286, in _get_mirror raise URLGrabError(256, _('No more mirrors to try.')) URLGrabError: [Errno 256] No more mirrors to try. =============================================================================== ERROR: MGRandomOrder.urlgrab ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_mirror.py", line 102, in test_MGRandomOrder self.fetchwith(MGRandomOrder) File "test/test_mirror.py", line 88, in fetchwith self.mg.urlgrab(url, filename) File "test/../urlgrabber/mirror.py", line 411, in urlgrab return self._mirror_try(func, url, kw) File "test/../urlgrabber/mirror.py", line 389, in _mirror_try mirrorchoice = self._get_mirror(gr) File "test/../urlgrabber/mirror.py", line 286, in _get_mirror raise URLGrabError(256, _('No more mirrors to try.')) URLGrabError: [Errno 256] No more mirrors to try. =============================================================================== ERROR: MGRandomStart.urlgrab ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_mirror.py", line 98, in test_MGRandomStart self.fetchwith(MGRandomStart) File "test/test_mirror.py", line 88, in fetchwith self.mg.urlgrab(url, filename) File "test/../urlgrabber/mirror.py", line 411, in urlgrab return self._mirror_try(func, url, kw) File "test/../urlgrabber/mirror.py", line 389, in _mirror_try mirrorchoice = self._get_mirror(gr) File "test/../urlgrabber/mirror.py", line 286, in _get_mirror raise URLGrabError(256, _('No more mirrors to try.')) URLGrabError: [Errno 256] No more mirrors to try. 
=============================================================================== ERROR: download a file with mixed readline() and read(23) calls ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 51, in setUp self.fo = self.opener.open(self.ref) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== ERROR: download a file with a single call to read() ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 51, in setUp self.fo = self.opener.open(self.ref) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== ERROR: download a file with multiple calls to readline() ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 51, in setUp self.fo = self.opener.open(self.ref) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== ERROR: download a file with a single call to readlines() ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 51, in setUp self.fo = self.opener.open(self.ref) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== ERROR: download a file with multiple calls to read(23) ------------------------------------------------------------------------------- Traceback 
(most recent call last): File "test/test_keepalive.py", line 51, in setUp self.fo = self.opener.open(self.ref) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== ERROR: testing connection restarting (20-second delay, ctrl-c to skip) ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 175, in test_dropped_connection fo = self.opener.open(ref_http) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== ERROR: test that 200 works without fancy handler ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 120, in test_200_handler_off fo = self.opener.open(ref_http) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== ERROR: test that 200 works with fancy handler ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 112, in test_200_handler_on fo = self.opener.open(ref_http) File "/usr/lib/python2.4/urllib2.py", line 358, in open response = self._open(req, data) File "/usr/lib/python2.4/urllib2.py", line 376, in _open '_open', req) File "/usr/lib/python2.4/urllib2.py", line 337, in _call_chain result = func(*args) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 211, in http_open return self.do_open(HTTPConnection, req) File "/build/buildd/urlgrabber-2.9.7/urlgrabber/keepalive.py", line 241, in do_open raise urllib2.URLError(err) URLError: =============================================================================== FAIL: use 3 threads, each getting a file 4 times ------------------------------------------------------------------------------- Traceback (most recent call last): File "test/test_keepalive.py", line 234, in test_basic_threading self.assert_(l == reference_logs) File "/build/buildd/urlgrabber-2.9.7/test/munittest.py", line 372, in failUnless if 
not expr: raise self.failureException, msg
AssertionError
-------------------------------------------------------------------------------
Ran 74 tests in 10583.563s
FAILED (failures=1, errors=29, skipped=6)
touch build-stamp
/usr/bin/fakeroot debian/rules binary
dh_testdir
dh_testroot
dh_clean -k
dh_installdirs
# Add here commands to install the package into debian/urlgrabber.
#/usr/bin/make install DESTDIR=/build/buildd/urlgrabber-2.9.7/debian/urlgrabber
python setup.py install --root=/build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber
running install
running build
running build_py
running build_scripts
running install_lib
creating /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib
creating /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4
creating /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages
creating /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber
copying build/lib/urlgrabber/byterange.py -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber
copying build/lib/urlgrabber/keepalive.py -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber
copying build/lib/urlgrabber/progress.py -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber
copying build/lib/urlgrabber/mirror.py -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber
copying build/lib/urlgrabber/grabber.py -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber
copying build/lib/urlgrabber/__init__.py -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber
byte-compiling /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber/byterange.py to byterange.pyc
byte-compiling /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber/keepalive.py to keepalive.pyc
byte-compiling /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber/progress.py to progress.pyc
byte-compiling /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber/mirror.py to mirror.pyc
byte-compiling /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber/grabber.py to grabber.pyc
byte-compiling /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/lib/python2.4/site-packages/urlgrabber/__init__.py to __init__.pyc
running install_scripts
copying build/scripts-2.4/urlgrabber -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/bin
changing mode of /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/bin/urlgrabber to 755
running install_data
creating /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share
creating /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share/doc
creating /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share/doc/urlgrabber-2.9.7
copying README -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share/doc/urlgrabber-2.9.7
copying LICENSE -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share/doc/urlgrabber-2.9.7
copying TODO -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share/doc/urlgrabber-2.9.7
copying ChangeLog -> /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share/doc/urlgrabber-2.9.7
# rm -rf /build/buildd/urlgrabber-2.9.7/debian/python-urlgrabber/usr/share/doc/urlgrabber-2.9.6
dh_testdir
dh_testroot
dh_installchangelogs ChangeLog
dh_installdocs
dh_installexamples
dh_installman
dh_link
dh_strip
dh_compress
dh_fixperms
dh_python
dh_installdeb
dh_shlibdeps
dh_gencontrol
dpkg-gencontrol: warning: unknown substitution variable ${shlibs:Depends}
dpkg-gencontrol: warning: unknown substitution variable ${misc:Depends}
dh_md5sums
dh_builddeb
pkgstriptranslations: processing control file: ./debian/python-urlgrabber/DEBIAN/control, package python-urlgrabber, directory ./debian/python-urlgrabber
pkgstriptranslations: python-urlgrabber does not contain translations, skipping
pkgstriptranslations: no translation files, not creating tarball
dpkg-deb: building package `python-urlgrabber' in `../python-urlgrabber_2.9.7-2_i386.deb'.
dpkg-genchanges -b -mUbuntu/i386 Build Daemon
dpkg-genchanges: binary-only upload - not including any source code
dpkg-buildpackage: binary only upload (no source included)
******************************************************************************
Build finished at 20060204-1323
chroot-autobuild/build/buildd/python-urlgrabber_2.9.7-2_i386.deb:
 new debian package, version 2.0.
 size 61410 bytes: control archive= 1413 bytes.
 783 bytes, 20 lines control
 1194 bytes, 15 lines md5sums
 358 bytes, 12 lines * postinst #!/bin/sh
 173 bytes, 7 lines * prerm #!/bin/sh
Package: python-urlgrabber
Version: 2.9.7-2
Section: python
Priority: optional
Architecture: i386
Depends: python (<< 2.5), python (>= 2.4)
Installed-Size: 264
Maintainer: Anand Kumria
Source: urlgrabber
Description: A high-level cross-protocol url-grabber.
 Python urlgrabber drastically simplifies the fetching of files. It is
 designed to be used in programs that need common (but not necessarily
 simple) url-fetching features.
 .
 It supports identical behavior for http://, ftp:// and file:/// URIs, HTTP
 keepalive, byte ranges, regets, progress meters, throttling, retries,
 access to authenticated http / ftp servers and proxies and the ability to
 treat a list of mirrors as a single source automatically switching mirrors
 if there is a failure.
 .
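The package description above covers the features the test suite in this log exercised (retries, regets, keepalive, mirror failover). As a rough illustration only, and not taken from this build, the basic 2.9.x API that the tracebacks reference (urlgrabber.urlgrab, URLGrabber, URLGrabError) might be driven like this; the URL and output filename are hypothetical:

# Illustrative sketch, not part of the build log. Python 2.3/2.4-era syntax
# to match the interpreter used in this build; URL and filename are made up.
import urlgrabber
from urlgrabber.grabber import URLGrabber, URLGrabError

try:
    # Module-level convenience function: fetch a URL straight into a file.
    urlgrabber.urlgrab('http://example.com/pub/somefile.tar.gz', 'somefile.tar.gz')

    # Reusable grabber object with retries and simple reget (resume).
    g = URLGrabber(retry=3, reget='simple')
    data = g.urlread('http://example.com/pub/README')
    print 'fetched %d bytes' % len(data)
except URLGrabError, e:
    # Network failures surface as URLGrabError, e.g. the
    # "[Errno 4] IOError" entries seen in the test output above.
    print 'download failed:', e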
chroot-autobuild/build/buildd/python-urlgrabber_2.9.7-2_i386.deb:
drwxr-xr-x root/root 0 2006-02-04 13:23:52 ./
drwxr-xr-x root/root 0 2006-02-04 13:23:51 ./usr/
drwxr-xr-x root/root 0 2006-02-04 13:23:51 ./usr/bin/
-rwxr-xr-x root/root 4857 2006-02-04 10:27:27 ./usr/bin/urlgrabber
drwxr-xr-x root/root 0 2006-02-04 13:23:51 ./usr/sbin/
drwxr-xr-x root/root 0 2006-02-04 13:23:51 ./usr/lib/
drwxr-xr-x root/root 0 2006-02-04 13:23:51 ./usr/lib/python2.4/
drwxr-xr-x root/root 0 2006-02-04 13:23:51 ./usr/lib/python2.4/site-packages/
drwxr-xr-x root/root 0 2006-02-04 13:23:52 ./usr/lib/python2.4/site-packages/urlgrabber/
-rw-r--r-- root/root 16837 2005-10-22 22:57:28 ./usr/lib/python2.4/site-packages/urlgrabber/byterange.py
-rw-r--r-- root/root 20230 2005-10-22 22:57:28 ./usr/lib/python2.4/site-packages/urlgrabber/keepalive.py
-rw-r--r-- root/root 18235 2005-08-19 22:59:07 ./usr/lib/python2.4/site-packages/urlgrabber/progress.py
-rw-r--r-- root/root 18077 2005-10-22 22:57:28 ./usr/lib/python2.4/site-packages/urlgrabber/mirror.py
-rw-r--r-- root/root 52529 2005-10-22 22:57:28 ./usr/lib/python2.4/site-packages/urlgrabber/grabber.py
-rw-r--r-- root/root 2259 2005-10-22 23:05:12 ./usr/lib/python2.4/site-packages/urlgrabber/__init__.py
drwxr-xr-x root/root 0 2006-02-04 13:23:51 ./usr/share/
drwxr-xr-x root/root 0 2006-02-04 13:23:52 ./usr/share/doc/
drwxr-xr-x root/root 0 2006-02-04 13:23:52 ./usr/share/doc/urlgrabber-2.9.7/
-rw-r--r-- root/root 995 2005-02-21 21:40:09 ./usr/share/doc/urlgrabber-2.9.7/README
-rw-r--r-- root/root 2084 2005-10-22 23:05:11 ./usr/share/doc/urlgrabber-2.9.7/TODO
-rw-r--r-- root/root 8640 2004-03-31 18:02:00 ./usr/share/doc/urlgrabber-2.9.7/LICENSE.gz
-rw-r--r-- root/root 9462 2005-10-22 23:05:26 ./usr/share/doc/urlgrabber-2.9.7/ChangeLog.gz
drwxr-xr-x root/root 0 2006-02-04 13:23:52 ./usr/share/doc/python-urlgrabber/
-rw-r--r-- root/root 9462 2005-10-22 23:05:26 ./usr/share/doc/python-urlgrabber/changelog.gz
-rw-r--r-- root/root 2084 2005-10-22 23:05:11 ./usr/share/doc/python-urlgrabber/TODO
-rw-r--r-- root/root 446 2006-02-04 10:27:26 ./usr/share/doc/python-urlgrabber/copyright
-rw-r--r-- root/root 303 2006-02-04 10:27:26 ./usr/share/doc/python-urlgrabber/changelog.Debian.gz
urlgrabber_2.9.7-2_i386.changes:
Format: 1.7
Date: Sat, 31 Dec 2005 15:34:22 +1100
Source: urlgrabber
Binary: python-urlgrabber
Architecture: i386
Version: 2.9.7-2
Distribution: autobuild
Urgency: low
Maintainer: Ubuntu/i386 Build Daemon
Changed-By: Anand Kumria
Description:
 python-urlgrabber - A high-level cross-protocol url-grabber.
Closes: 335340
Changes:
 urlgrabber (2.9.7-2) unstable; urgency=low
 .
 * When I imported urlgrabber into bzr, I somehow lost a Build-Dep: on python. Re-adding it so I can (Closes: #335340)
Files:
 3d4e0e058ac3337f4bde076ccbf4a3a6 61410 python optional python-urlgrabber_2.9.7-2_i386.deb
******************************************************************************
Built successfully
Purging chroot-autobuild/build/buildd/urlgrabber-2.9.7
------------------------------------------------------------------------------
/usr/bin/sudo dpkg --root=/home/buildd/build-167406-93/chroot-autobuild --purge intltool-debian readline-common gettext file libgcj6 python2.4 html2text gettext-base debhelper debconf-utils python2.3 po-debconf python libmagic1 libreadline5 gcj-4.0-base libbz2-1.0 python2.3-dev libssl0.9.8 libgcj-common
(Reading database ... 10161 files and directories currently installed.)
Removing debhelper ...
Removing debconf-utils ...
Removing po-debconf ...
Removing python ...
Removing python2.3-dev ...
Removing intltool-debian ...
Removing gettext ...
Removing file ...
Purging configuration files for file ...
Removing libgcj6 ...
Purging configuration files for libgcj6 ...
Removing python2.4 ...
Purging configuration files for python2.4 ...
Removing html2text ...
Purging configuration files for html2text ...
Removing gettext-base ...
Removing python2.3 ...
Purging configuration files for python2.3 ...
Removing libmagic1 ...
Purging configuration files for libmagic1 ...
Removing libreadline5 ...
Purging configuration files for libreadline5 ...
Removing libbz2-1.0 ...
Purging configuration files for libbz2-1.0 ...
Removing libssl0.9.8 ...
Purging configuration files for libssl0.9.8 ...
Removing libgcj-common ...
Removing readline-common ...
Purging configuration files for readline-common ...
Removing gcj-4.0-base ...
******************************************************************************
Finished at 20060204-1323
Build needed 02:56:27, 1772k disk space
RUN: /usr/share/launchpad-buildd/slavebin/scan-for-processes ['/usr/share/launchpad-buildd/slavebin/scan-for-processes', '167406-93']
Scanning for processes to kill in build 167406-93...
Scanning for processes to kill in build /home/buildd/build-167406-93/chroot-autobuild...
RUN: /usr/share/launchpad-buildd/slavebin/umount-chroot ['umount-chroot', '167406-93']
Unmounting chroot for build 167406-93...
RUN: /usr/share/launchpad-buildd/slavebin/remove-build ['remove-build', '167406-93']
Removing build 167406-93
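Most of the 29 test errors above take one of two forms, URLGrabError "[Errno 4] IOError" and URLGrabError "[Errno 256] No more mirrors to try.", which is consistent with the external HTTP/FTP servers the test suite expects (for example www.python.org in test_grabber.py) not being reachable from the build chroot. For reference, a hedged sketch of the MirrorGroup failover API exercised by test_mirror.py; the mirror URLs and file path are hypothetical:

# Illustrative sketch, not part of the build log; mirror URLs and the
# requested path are made up.
from urlgrabber.grabber import URLGrabber, URLGrabError
from urlgrabber.mirror import MirrorGroup

mirrors = ['http://mirror-a.example.com/pub/',
           'http://mirror-b.example.com/pub/']
mg = MirrorGroup(URLGrabber(retry=2), mirrors)
try:
    # The relative path is tried against each mirror base URL in turn;
    # when one mirror fails, the group moves on to the next.
    mg.urlgrab('dists/dapper/Release', 'Release')
except URLGrabError, e:
    # With no mirror reachable this ends in "[Errno 256] No more mirrors
    # to try.", exactly as in the MirrorGroup test errors above.
    print 'all mirrors failed:', e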