
I am trying to keep one of my systems up to date. The intended target is not connected to the internet, so I am trying to get everything it needs via another machine that does have internet access.

After a bit of study, I found a way to achieve this. Can you please tell me if this is correct & safe?

Here is my understanding of the apt-get process:

  1. First, we run the command 'apt-get update'. This connects to every repository listed in '/etc/apt/sources.list', downloads each Packages.gz index file (e.g. in.archive.ubuntu.com/ubuntu/dists/trusty/main/binary-amd64/Packages.gz), and saves it under '/var/lib/apt/lists' with a name derived from the URL (for the URL above, the corresponding file is in.archive.ubuntu.com_ubuntu_dists_trusty_main_binary-amd64_Packages).
  2. Then, when we run 'apt-get upgrade' (or 'apt-get install pkg_name'), apt compares the locally installed package list against the metadata stored in '/var/lib/apt/lists', determines the download URLs from it, and asks for confirmation before downloading and installing the required packages.
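As a side note, the URL-to-filename mapping described in step 1 can be sketched in shell. This is a simplification (real apt also escapes certain special characters), and the URL below is just the example from above:

```shell
# Sketch of how apt derives the cached list filename from an index URL.
# Illustrative only: apt's real encoding handles more special characters.
url="in.archive.ubuntu.com/ubuntu/dists/trusty/main/binary-amd64/Packages.gz"

# apt replaces each '/' in the URL with '_'; after decompression the
# stored file under /var/lib/apt/lists also loses the .gz suffix.
listfile="$(printf '%s' "$url" | tr '/' '_' | sed 's/\.gz$//')"
echo "$listfile"
```

Running this prints in.archive.ubuntu.com_ubuntu_dists_trusty_main_binary-amd64_Packages, matching the filename mentioned above.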

This is my plan to keep the isolated machine up to date:

  1. From the '/etc/apt/sources.list' conf file on the target machine, work out which Packages.gz index files need to be downloaded.
  2. Download those Packages.gz metadata files on the internet-connected machine.
  3. Copy these files to the target machine's '/var/lib/apt/lists' under the appropriate filenames.
  4. Run 'apt-get --print-uris upgrade' (or 'apt-get --print-uris --yes install pkg_name') to get the list of all the packages that machine needs.
  5. Download these packages on the internet-connected machine.
  6. Copy them to the target machine.
  7. Run 'dpkg -i pkg_list' to install all the missing packages.
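Step 4 is the fiddly part: turning the '--print-uris' output into a plain download list for the other machine. A minimal sketch, using a made-up sample line in the general format apt-get prints (quoted URL first, then filename, size, and checksum):

```shell
# Illustrative sample of one line of `apt-get --print-uris` output;
# a real run prints one such line per package to be fetched.
# Package name/version here are hypothetical.
sample="'http://in.archive.ubuntu.com/ubuntu/pool/main/b/bash/bash_4.3-7ubuntu1_amd64.deb' bash_4.3-7ubuntu1_amd64.deb 574056 MD5Sum:abcd"

# Extract the URL: take the first field and strip the surrounding quotes.
# On the target machine the real pipeline would be something like:
#   apt-get --print-uris --yes upgrade | grep "^'" | awk '{print $1}' | tr -d "'" > uris.txt
printf '%s\n' "$sample" | awk '{print $1}' | tr -d "'"
```

The resulting uris.txt can then be taken to the connected machine and fed to 'wget -i uris.txt' to fetch everything in one go.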

I am able to achieve my goal using this process.

My question is : Is this correct & reliable? Or is there an easier way to achieve this?

Chromium

2 Answers


This should work in theory, but I have two suggestions:

Copy the downloaded debs to /var/cache/apt/archives/ instead of installing them with dpkg -i. That's where apt looks for already-downloaded packages (and where it downloads them to), so 'apt-get install' or 'apt-get upgrade' will pick them up without needing a network connection.
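A sketch of that suggestion, using temporary mock directories so it is self-contained; on a real system the destination would be /var/cache/apt/archives/ and the final commands would be run with sudo:

```shell
# Mock directories standing in for the removable media and apt's cache;
# on the real target machine the destination is /var/cache/apt/archives/.
src=$(mktemp -d)   # stands in for the removable media
dst=$(mktemp -d)   # stands in for /var/cache/apt/archives
touch "$src/foo_1.0_amd64.deb" "$src/bar_2.0_amd64.deb"   # fake .debs

# Drop the downloaded packages into the cache directory.
cp "$src"/*.deb "$dst"/
ls "$dst"

# On the real machine one would then run:
#   sudo apt-get upgrade          (or: sudo apt-get install pkg_name)
# so apt resolves dependencies and installs from the cached files.
```

The point of going through apt rather than dpkg -i is that apt works out a dependency-correct installation order for you.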

And probably even better: Check out apt-zip:

APT-ZIP is a package to update a non-networked computer using apt and a (removable) media (harddisk, USB key, ZIP drive...)

The apt-zip-list and apt-zip-inst commands simplify the upgrade process of a non-networked Debian host using apt, by using (preferably high-capacity) removable media, like a ZIP or USB drive.

apt-zip-list produces two files. One is a `fetch' script (supporting wget backends, in a modular, extensible way) that can be used on another host (maybe not running Debian) with good connectivity to the Internet to fetch packages previously selected in dselect(8) or indicated on the command line, and then install the packages on your Debian box; the other, apt-zip.options, saves the options used by apt-zip-list to indicate to apt-zip-inst what action to perform and/or which packages to install.

  • Thank you, @Jakob. I will surely have a look at apt-zip. About copying the packages to /var/cache/apt/archives/: is there any performance difference between using apt-get install and dpkg -i? I was thinking that apt-get internally uses dpkg. – Rangaraj KS Aug 31 '15 at 07:49
  • It'll be easier to find dependency problems, and you'll get a more polished interface showing you what will happen. If you simply install via dpkg -i updates_20150831/*, dpkg will install the files in alphabetical order instead of an order that satisfies dependencies. Nothing bad will happen with dpkg, but you might need to install files in a specific order manually to get around dependency issues. – Jakob Lenfers Aug 31 '15 at 08:07
  • Oh. Didn't know this ordering for installation. Thanks again. – Rangaraj KS Aug 31 '15 at 09:28

The short answer is no.

What you are describing is an Ubuntu system running without internet, which in itself is pretty bad: you don't get over-the-air updates for your release, your bugs will not get fixed, and repository support will not be available.

This is not recommended for Ubuntu, as it will have you running a setup that is not supported by the community. I only mention this because many smaller (but not less important) updates are not available in the lists folder, so they will not be installed on your system.

As for actually doing it, you could change the sources file at /etc/apt/sources.list to point to a location on your own system.
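If you do point sources.list at a local location, an entry for a flat directory of .deb files might look like the line below (the path is hypothetical, and apt also needs a Packages index in that directory, which dpkg-scanpackages can generate):

```
deb file:/media/usb/localrepo ./
```

The trailing "./" tells apt that the .deb files and the Packages index sit directly in that directory, rather than in the usual dists/pool layout.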

However, like I said, this is not going to help you much, because what you are doing is trying to update your Ubuntu, which would technically not be happening. But you could still partially update Ubuntu using what I said above.

As for whether there is a better way to do this: not without a network connection. I don't mean an internet connection, but a "network connection", that is, two of your computers connected with a LAN cable. Then you could point the sources.list file to the network address of the connected computer, which then forwards any incoming requests to the correct server (basically mirroring). But that, too, needs an internet connection on the second computer, so... for your situation, still nothing.

pranay
  • I think I've mentioned that my target is not connected to the internet alone; it is in a network. Sorry if that wasn't clear the first time. Yeah, about pointing to the sources.list on another machine: I had a look at it and came across apt-cacher. I'm not sure if that is what you are referring to. I think it'll be better to use that than mirroring the whole repository, which may be costlier in terms of disk space. – Rangaraj KS Aug 31 '15 at 07:40
  • So I don't see what the problem is; you should point to another machine then. Sorry, I didn't get that you were connected to a network.... Don't get why you downvoted, though. – pranay Aug 31 '15 at 07:51
  • I would prefer having the downloads go through a Windows machine rather than another Ubuntu, unless there is no other choice. Which is why I'm reluctant to use that method. – Rangaraj KS Aug 31 '15 at 08:18
  • Point to an SMB address then. – pranay Aug 31 '15 at 08:21
  • I didn't downvote.. It must've been someone else.. – Rangaraj KS Aug 31 '15 at 08:25
  • About using SMB: won't I have to replicate the same structure as the apt repository on the Windows machine? That's similar to mirroring, isn't it? – Rangaraj KS Aug 31 '15 at 08:27
  • In some ways. But in the end you will have to download the whole repository someplace. The only problem you'll have with a Windows machine is that Ubuntu will automatically delete the files after copying while Windows won't, so mirroring will be a sort of tunnelling connection through the Windows machine, while SMB will copy downloaded files from a location on the Windows machine. – pranay Aug 31 '15 at 09:08
  • But in the end you'll simply be downloading someplace and accessing it from the Ubuntu machine, whereas with mirroring you'll be downloading through the Windows machine. In the end you have to choose between saving space on the Windows machine's HDD or saving read and write speed on the HDD. – pranay Aug 31 '15 at 09:10
  • Glad to be of help; please choose my answer as the correct answer if I was helpful. – pranay Aug 31 '15 at 10:07