Best way to cache apt downloads on a LAN?


Answer #: 1

I did some research into a bunch of solutions, and some Ubuntu developers came up with a proxy configuration (based on Squid) for 10.04 and later. It's called squid-deb-proxy. It only requires one machine to act as the server. Large organizations usually run their own full mirrors, but for most people on-demand mirroring is enough.

Why squid-deb-proxy?

- No editing of files on the client side.
- Use zeroconf so that clients are "zero config".
- Use an existing, solid proxy solution instead of writing a new tool.
- Easy to set up for a typical Linux administrator.

Server Config

On the machine you want to act as a server install the tool with:

sudo apt-get install squid-deb-proxy avahi-utils

Now start the service bits:

sudo start squid-deb-proxy

And the avahi bits (you don't need this if you're on 12.04+):

sudo start squid-deb-proxy-avahi

This will install the proxy server (which listens on port 8000 by default) and...
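To sanity-check the server from another machine on the LAN, something like this should work (a sketch; it assumes the default port 8000, avahi-utils installed on the client, and 192.168.1.10 as a stand-in for your server's address):

# look for the proxy being advertised over zeroconf
avahi-browse -tr _apt_proxy._tcp

# or simply confirm that something is answering on port 8000
nc -vz 192.168.1.10 8000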



When running multiple machines with the same distribution, it is worth setting up a repository cache on your network: once a package has been downloaded from an official repository, every other machine fetches it from your local area network instead of downloading it again.

Here is the situation: we have one machine called repository-cache, which is going to act as the repository cache; every other machine on your network will use it as its repository.

1. How To Set up a repository cache with apt-cacher

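A minimal apt-cacher setup looks roughly like this (a sketch; it assumes the repository-cache machine from above and apt-cacher's default port, 3142):

# on repository-cache: install the cache daemon
sudo apt-get install apt-cacher
# enable it by setting AUTOSTART=1 in /etc/default/apt-cacher, then:
sudo service apt-cacher restart

# on every other machine: send apt through the cache
echo 'Acquire::http::Proxy "http://repository-cache:3142";' | sudo tee /etc/apt/apt.conf.d/01proxy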

2. Conclusion

apt-cacher is an easy and efficient package which will save you both time and bandwidth when using multiple machines with the same...


TLDR: I now add the following snippet to all my Dockerfiles:

# If host is running squid-deb-proxy on port 8000, populate /etc/apt/apt.conf.d/30proxy
# By default, squid-deb-proxy 403s unknown sources, so apt shouldn't proxy ppa.launchpad.net
RUN route -n | awk '/^0.0.0.0/ {print $2}' > /tmp/host_ip.txt
RUN echo "HEAD /" | nc `cat /tmp/host_ip.txt` 8000 | grep squid-deb-proxy \
  && (echo "Acquire::http::Proxy \"http://$(cat /tmp/host_ip.txt):8000\";" > /etc/apt/apt.conf.d/30proxy) \
  && (echo "Acquire::http::Proxy::ppa.launchpad.net DIRECT;" >> /etc/apt/apt.conf.d/30proxy) \
  || echo "No squid-deb-proxy detected on docker host"

If you're using docker, you probably have a Dockerfile that starts like this:

FROM ubuntu:12.04
RUN apt-get update
# install all my favorite utilities, putting it early to facilitate docker caching
RUN apt-get install -y curl git vim make build-essential
# install all pre-requisite packages for our dockerized application
RUN apt-get install -y...



> On Thu, Jan 17, 2013 at 12:57 AM, Abhishek Dixit [hidden email] wrote:
>> On Wed, Jan 16, 2013 at 10:19 PM, Steve Flynn [hidden email] wrote:
>>> On 16 January 2013 16:31, Abhishek Dixit [hidden email] wrote:
>>> > Now what I want to know is how do I make sure do-release-upgrade uses
>>> > these on my laptop and does not download 700 MB of updates, because
>>> > that will take 8-10 hours at my end, so I want to utilize the upgrades
>>> > which were downloaded during the upgrade process on one of the systems.
>>> > If there is a way in this situation let me know.
>>>
>>> Have you tried putting those package files into
>>> /var/cache/apt/archives and then just firing off the upgrade process?
>>
>> Yes, I have tried this and unfortunately this does not...


The following is from here:

I used to use and recommend approx, but it's growing increasingly unreliable, to the point of being worse than not using a proxy at all, since I have to keep debugging it when it fails to properly update.

So I'm going to try this one next.
apt-cacher-ng is the answer for me - I haven't encountered any problems in smallish environments (around 20 clients), so I suppose the issues @MagicFab mentions were solved in the current version (installed on Ubuntu 10.04 and 10.10). There is no configuration necessary on the server, and you only need to instruct your clients to use the server as their package-manager proxy.

The server is completely installed and configured by installing the apt-cacher-ng package.
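In other words, the entire server-side setup is:

sudo apt-get install apt-cacher-ng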

The clients need to be configured by setting up an APT proxy: add the file /etc/apt/apt.conf.d/01proxy, containing this (where...
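That file holds a single line of this shape (a sketch; 3142 is apt-cacher-ng's default port, and your-apt-server is a placeholder for the server's name or address):

Acquire::http::Proxy "http://your-apt-server:3142";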


There are four steps to setting up a simple repository for yourself:

1. Install dpkg-dev
2. Put the packages in a directory
3. Create a script that will scan the packages and create a file apt-get update can read
4. Add a line to your sources.list pointing at your repository

Install dpkg-dev

Type in a terminal

sudo apt-get install dpkg-dev

The Directory

Create a directory where you will keep your packages. For this example, we'll use /usr/local/mydebs.

sudo mkdir -p /usr/local/mydebs

Now move your packages into the directory you've just created.

Previously downloaded packages are generally stored on your system in the /var/cache/apt/archives directory. If you have installed apt-cacher, you will have additional packages stored in its /packages directory.
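For example, to seed the directory from apt's local cache:

sudo cp /var/cache/apt/archives/*.deb /usr/local/mydebs/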

The Script update-mydebs

It's a simple three-liner:

#!/bin/bash
cd /usr/local/mydebs
dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

...
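For step 4, the sources.list entry would look something like this (a sketch; the [trusted=yes] option is an assumption, added to silence warnings about the unsigned local repository):

deb [trusted=yes] file:/usr/local/mydebs ./

Re-run the script and apt-get update whenever you add packages to the directory.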

When today's traffic and everyday Internet speeds are measured in tens of gigabits passing in the blink of an eye, even for ordinary Internet clients, what's the purpose of setting up a local repository cache on a LAN, you may ask?

Setup Local Repositories in Ubuntu

One reason is to reduce Internet bandwidth usage and to speed up package installs by pulling from a local cache. Another major reason is privacy. Imagine that clients in your organization are Internet-restricted, but their Linux boxes still need regular software and security updates, or just new software packages. To take the picture further, think of a server that runs on a private network, contains and serves sensitive information only to a restricted network segment, and should never be exposed to the public Internet.

These are just a few reasons why you should build a local repository mirror on your LAN, delegate an edge server for the job, and configure internal clients to pull software from its...


I would recommend against using Wireshark to monitor traffic: you'll just get too much data and have a hard time analyzing it. If you need to look at or troubleshoot the interaction between a couple of machines, Wireshark is great. As a monitoring tool, IMHO, Wireshark is not quite what you need.

Profile the network traffic. Try out some actual monitoring tools. You're looking for top types of traffic (likely HTTP, but who knows), top talkers (should be your servers, but who knows), and potentially malformed traffic (a large number of TCP retransmissions, malformed packets, or high rates of very small packets; you probably won't see any, but who knows).
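For a quick first pass, a couple of small tools that ship in the Ubuntu repositories cover the top-talkers view (a sketch; eth0 is a placeholder for your interface):

sudo apt-get install iftop nethogs
sudo iftop -i eth0    # live per-connection bandwidth, sorted by usage
sudo nethogs eth0     # similar, but grouped by process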

At the same time, work with your management to develop a network resource usage policy: in general business terms, what business needs does the computer network exist to meet, and what are appropriate uses of the resource? This thing is costing money, so there has to be a...



ClamAV can be found for Ubuntu in the apt repository. Run this command to install ClamAV:

apt-get install clamav

If you need clamd, you may also want to run:

apt-get install clamav-daemon

If you require support for scanning compressed RAR files you first need to enable the non-free archive, and then you can install the RAR-plugin using:

apt-get install libclamunrar6

There are two classes of clamav packages available for Ubuntu users:

Released Set

The released set (release, *-updates, and *-security) is patched for security updates. Following extensive testing of clamav and the packages that use it in the backports repository, they may be updated to a newer version. These are official Ubuntu packages and are supported by community developers.


Backports Set

The Ubuntu backports repository will contain the newest clamav version that has been at least lightly tested to...
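To opt into that newer version, something like this should do it (a sketch; it assumes the backports pocket is enabled in your sources.list):

sudo apt-get update
sudo apt-get install -t $(lsb_release -sc)-backports clamav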
