"apt-file update" downloads the whole Contents file every time
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
apt-file (Ubuntu) | Fix Released | Medium | MOTU |
Bug Description
The 'http' definition in /etc/apt/ is:
http = wget -N -P "<cache>" -O "<cache>/<dest>" "<uri>/
The -N flag is supposed to turn on timestamping, so that the file is only downloaded if it has changed on the server. However, -N does not work when -O is also specified, so the whole file is downloaded each time.
It would be better to leave out the -O flag so that timestamping can work.
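For illustration, the difference can be seen with wget alone (the host and file names here are invented, not taken from apt-file.conf):

  # -N alone: wget compares the server's Last-Modified time against the
  # local file's mtime and skips the download when nothing has changed.
  wget -N -P /var/cache/apt-file http://archive.example.com/dists/hardy/Contents-i386.gz

  # -N combined with -O: the timestamp check is skipped and the file is
  # fetched unconditionally (newer wget versions refuse this combination
  # outright).
  wget -N -O /var/cache/apt-file/archive_dists_hardy_Contents-i386.gz \
      http://archive.example.com/dists/hardy/Contents-i386.gz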
The problem is that the local file and the file on the server have different names. To get timestamping to work, they have to have the same name. We can rename the local file before running wget and rename it back after, like this:
http = (if [ -f "<cache>/<dest>" ]; then mv "<cache>/<dest>" "<cache>
That line also filters out a lot of the output from wget, but leaves the progress display in place. The current version shows this:
--20:10:04-- http://
=> `/var/cache/
Resolving archive.
Connecting to archive.
HTTP request sent, awaiting response... 404 Not Found
20:10:05 ERROR 404: Not Found.
for files which don't exist on the server. My version above shows nothing for the missing files, the same as the curl line, and shows this:
Server file no newer than local file `/var/cache/
as confirmation that the server file exists but hasn't changed.
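Since the replacement 'http =' line is truncated above, here is a minimal standalone sketch of the same rename-around-wget idea; the cache path, file names, and URL are invented for illustration, and the output filtering from the one-liner is left out:

  #!/bin/sh
  CACHE=/var/cache/apt-file
  URI=http://archive.example.com/dists/hardy
  REMOTE=Contents-i386.gz                                # name on the server
  DEST=archive.example.com_dists_hardy_Contents-i386.gz  # name apt-file expects

  cd "$CACHE" || exit 1
  # Give the cached copy the server's name so wget -N can compare timestamps.
  [ -f "$DEST" ] && mv "$DEST" "$REMOTE"
  # -N without -O: download only if the server copy is newer.
  wget -N "$URI/$REMOTE"
  # Rename back to the name apt-file expects in its cache.
  [ -f "$REMOTE" ] && mv "$REMOTE" "$DEST"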
Changed in apt-file:
assignee: nobody → motu
Note that the "ftp =" line also combines the -N and -O options, and so probably fails in the same way.