Apt-offline A-Z

Keywords: "APT-offline instructions, APT-offline tutorial, APT-offline introduction, APT-offline manual, Linux without internet"
APT-offline: The ultimate tool for making changes to internet-access-impaired Debian Linux installations

Rev 6/10/22a (see Notes)


=========

Warning

Do not attempt to install a .zip or tar.xz version of APT-offline on a full installation of Linux. The resulting APT-offline installation might not be fully functional, and you might need to re-install the OS in order to install APT-offline correctly. (Correct installation is explained in Appendix II.)

=========


> Introduction

This article is intended to allow someone who knows nothing about Linux to read it with minimal effort and know everything required to use APT-offline effectively. It has been revised many times, so you should use the latest version, located at https://anaptofflineblog.blogspot.com.  

The fastest way to learn how to use APT-offline is to fiddle with it, so I recommend reading the relevant sections of the article to get a general idea of how it works, then using APT-offline to make a desired change to a Debian-type installation (preferably in a setting where you have 4G or better internet access, and can perform every step of the process without having to travel to an access point/area), while referring to the article as required. If you get stuck on some point, try to find another place in the article where the same point is covered, perhaps in a different way that makes more sense.

Even if you have good internet access, I recommend that you take the opportunity to use APT-offline to become familiar with it, because it's easiest to install and learn when you have internet access at home, and you might end up in a situation where you don't have it, or at least the same quality of access. Or, you could create a secure "air gap" PC out of a cheap mini-PC running an encrypted flash-drive installation of Linux (see toggwu.blogspot.com for details), and you could use APT-offline to make changes to the installation on the air-gap PC without connecting it to the internet. With such a setup, there's no need to worry about hackers or viruses.

> What APT-offline is/does

APT-offline is the only piece of software which allows any type of change [1] to be made to a Debian installation which has no direct internet connection (perhaps deliberately, as in the case of an air-gap system), or whose direct connection is slow or data-limited. (A "Debian installation" is defined here as any installation of a type of Linux which uses the Debian software manager, known as the Advanced Package Tool, or APT. This type of Linux includes Debian proper, Ubuntu, and their derivatives.)

When Debian installations have direct high-speed data-unlimited internet access, the user just specifies the desired changes, and APT downloads the required modules/files from an online repository, screens them for errors, and assuming that they have no errors, places them in certain directories, and then installs them in various ways, depending on their type (details are provided as required).

APT-offline eliminates the need for a direct internet connection to make changes to Debian installations by providing:

A) the ability to generate a specialized text file known as a signature file, which lists the URLs of the files required to make the desired changes. In some cases the signature file also contains reference checksums and file sizes (obtained from the installation's internal software index), so that software modules/packages can be screened for errors as they are downloaded, along with a name to assign to each package if it is found to be error-free (the name which the package must have in order to be installed). (Checksums for the software-index files themselves are contained in the digitally-signed InRelease file downloaded with them.)

B) a specialized download mode known as the "get" mode, which reads APT-offline signature files, downloads the listed files, screens them for errors, renames them as described above, and stores them in a user-created download-destination folder designated in the get-command. Get mode can run on a variety of popular devices which can readily obtain a direct high-speed internet connection.

C) the ability to install all of the files contained in the aforementioned download-destination folder onto the target installation (or any equivalent installation) by using its copy of APT-offline to perform an "install" operation on the folder.


> How APT-offline functions, basically

APT-offline is installed on the installation to be changed (the "Debian installation"), and on some device which can readily obtain a direct high-speed internet connection (the "download-device"). (Installation-instructions are contained in the appendices.) Then the Debian installation's installation of APT-offline (perhaps including its GUI) is used for performing a "set"-operation, during which the user specifies the desired changes, and APT-offline uses APT to generate a specialized text-file known as a signature file, which contains a list of URLs of files which must be downloaded in order to implement the specified changes.

Every file which APT-offline downloads is screened for errors as part of the get-operation. In some cases, checksums and file sizes are included in the signature file; in the other cases, this information is included in the "InRelease" file, which is downloaded along with the files and digitally signed so that it can be authenticated and screened for errors. If a file passes its screening test, it is renamed with the name which it must have in order to be installed; the new name is listed, along with the checksum, in the signature file or the InRelease file.

The signature file is then transferred via flash memory to the download-device (or directly to a portable device which is being used as the download-device), which is then connected to the internet, after which its installation of APT-offline is used for performing a "get"-operation on the signature file. During the get-operation, APT-offline's downloader reads the list, downloads each of the files, screens it for errors (its checksum is apparently calculated as it's being downloaded, and then compared to the reference checksum from the signature file), and if no errors are found, renames it as described previously, and stores it in a folder which is created by the user, specified in the get-command, and reserved for the files in the list.

The folder containing the downloaded modules/files is then transferred or just connected to the Debian installation, where an APT-offline "install"-operation is performed on the folder. The install-op copies the files to certain directories in the installation, depending on their type, and fully installs any package-index files and software modules which are upgraded versions of previously-installed modules. (The fact that they were previously installed lets APT know that it's OK to install the upgraded version.)

However, the install-op doesn't fully install software modules which are being installed for the first time on that installation as part of an app or apps; the user must subsequently perform a regular installation process (as if the installation had a direct high-speed internet connection) for the corresponding app or apps. This is probably done so that the user will be notified if any required files aren't available to be installed, and to prevent any extraneous files, including malware, from being installed. The alternative would be to simply install all of the files which had been transferred to the installation, possibly discovering only later that some required file wasn't installed, or destroying the installation.

APT-offline is controlled by entering commands into the terminal, including by entering the command to activate APT-offline's GUI ("sudo apt-offline-gui"), and then using the GUI, which itself generates commands in response to user-inputs. Although there are many possible commands, and they can get quite complex (see the APT-offline "manpage"), you can get by with just a few simple commands (which I provide), and they can be stored in clipboard apps (allowing them to be copied with a click or a tap, and pasted into the terminal), and in some cases stored in the terminal's command-history, which allows them to be re-used by opening the terminal, pressing the up-arrow key until the desired command appears, and hitting Enter. (So relax, terminal-phobes.) In fact, due to the simplicity of the system which I recommend for using APT-offline, you don't even need to install the GUI.
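The few commands referred to above can be summarized as follows. This is only a placeholder cheat-sheet, with the commands drawn from the examples given later in this article; the <path> and <app-name> placeholders must be filled in for your own setup:

```
# Create a signature file (run on the Debian installation):
sudo apt-offline set apt-offline.sig --update
sudo apt-offline set apt-offline.sig --install-packages <app-name>

# Download the listed files (run on the download-device; no sudo):
apt-offline get /<path>/apt-offline.sig -d /<path>/DDF

# Install the downloaded files (run on the Debian installation):
sudo apt-offline install /<path>/DDF
```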

============

Note

If you enter a command and get an error-message indicating a problem with the "APT system," chances are that the problem is actually in the command, so examine the command closely.

============  


> A more detailed explanation

From a slightly more detailed perspective, the first step in the change-process is to use the Debian installation's copy of APT-offline to perform a "set" operation, during which the desired changes are specified. The result is a specialized text file known as a "signature file" (SF), created by the set-operation using the name and location specified in the set-command or in the GUI's Generate Signature window. The SF contains a list of URLs of files (potentially hundreds) which must be downloaded, combined in some cases with security-related information. I recommend always using the Home directory as the location, for reasons which will become evident with experience, and "apt-offline.sig" as the name, as part of a system which makes it easy to perform multiple get-ops in a single download-session, even on phones.

After creating the SF, the user would then create two other elements of the aforementioned system: a) a folder known as a download-destination folder (DDF), which I name "DDF" in all cases, to be assigned to a particular SF and reserved for the files listed in it; and b) a folder known as a "change-name folder" (CNF), to contain the SF/DDF pair, in order to identify the changes which the pair is intended to produce, and to isolate it from any other SF/DDF pairs (necessary because in my recommended system all SFs are named "apt-offline.sig" and all DDFs are named "DDF"). So, after creating the CNF and giving it a name (starting with "CNF") which uniquely identifies it (such as by indicating the PC, the installation type and version, the changes, and the date of the signature file), move the SF/DDF pair into the CNF.

To perform a get-op, the CNF would be transferred to a get-process directory (GPD), which is a sub-directory created by the user and placed in some stable, convenient directory on the download-device, or on a flash drive to be connected to the download-device. The idea behind a GPD is to be able to put a CNF in a GPD, and then move the contents of the CNF (an SF/DDF pair) out of the CNF into the GPD, so that a get-op can be performed on the SF/DDF pair by executing a get-command with the GPD's name ("GPD1," etc.) in the path. If the SF/DDF pair were left in the CNF, its path would have to be included in the get-command, which would require a new get-command for every get-op, and creating a new get-command on a portable device (or creating one on a PC and transferring it to a portable device) would be very inconvenient. The reason for moving the CNF into the GPD, and then moving the SF/DDF pair out of the CNF into the GPD, is to make it easy to return the SF/DDF pair to the correct CNF after the get-op. The use of multiple GPDs requires a get-command for each GPD, but there would probably be just a few, and all of the commands would be identical except where they refer to the corresponding GPD (in two places). Each get-command would take the form of "apt-offline get /<path>/GPDn/apt-offline.sig -d /<path>/GPDn/DDF," where <path> is the location of the directory containing the GPDs, and n is the associated GPD's number. (There's no "sudo" in get-commands.)
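As a concrete sketch, the fixed get-command for a given GPD number can be assembled as follows. The base path here is an assumption (substitute whatever stable directory actually holds your GPDs; on an Android download-device it would be the directory required by Userland, per Appendix I):

```shell
# Hypothetical base directory holding GPD1, GPD2, ... (an assumption):
BASE="$HOME/get-process"
n=1   # the GPD number for this get-op

# The GPD's name appears in two places, as described above
# (and there's no "sudo" in get-commands):
CMD="apt-offline get $BASE/GPD$n/apt-offline.sig -d $BASE/GPD$n/DDF"
echo "$CMD"
```

The point of fixing BASE and the folder names is that this one command string never changes for a given GPD, so it can be stored verbatim in a clipboard-manager app and reused indefinitely.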

When using Android devices (and I presume iOS devices) as the download-device, GPDs must be placed in a particular directory (for details, see Appendix I). Another consideration is that it might be necessary to use a file manager installed on the device to move the SF/DDF pair from the CNF into the GPD, and back into the CNF, so that the file manager on the target Debian installation can find the downloaded files in the DDF. Just keep this in mind in case the target-installation's file manager can't find the downloaded files which you know are in the DDF.

Another benefit of using fixed get-commands is that they can be stored permanently in a clipboard-manager app on a portable device. Assuming that there are multiple GPDs, each containing an SF/DDF pair to be used in a get-op, and corresponding get-commands stored in a clipboard-manager app, multiple get-ops could be performed (such as to update one installation, upgrade another, and install apps on another) by simply tapping on a get-command in the clipboard-app to copy the command to the clipboard, then tapping/holding on the command line to paste the command into the command line, then tapping on Enter to initiate a get-op on the SF/DDF pair in the corresponding GPD, then waiting for the get-op to finish, and then repeating this procedure for the next GPD which contains an SF/DDF pair.

In install-mode, APT-offline acts as a convenient interface for various APT- and OS-functions which actually perform the operation. The install-operation fully installs package-index files and software packages which are upgraded versions of previously-installed software packages. But it typically just copies software packages which are being installed for the first time on that installation as part of an app to /var/cache/apt/archives, and then the user must perform a regular installation (as if the installation had a direct internet connection) for each app which was specified in the corresponding set-command, and which should be indicated in the CNF's name. This is probably intended to ensure that all of the required files are installed, and to prevent any malware from being installed - the alternative would be to simply install all of the files which had been transferred to the installation, without providing any indication of any missing files, and possibly installing malware and destroying the installation.

> Installing APT-offline

APT-offline, as far as I know, isn't included by default in any version of Linux, but it doesn't hurt to check, which can be done by entering "apt-offline -h". If it's installed, this will display some basic information on how to use it; otherwise, the system will indicate that the command couldn't be found.

For details on installing APT-offline on Debian installations, see Appendix II. Instructions for installing APT-offline on Android/iOS devices (for performing get-ops) are contained in Appendix I, and instructions for installing APT-offline on Windows PCs (for performing get-ops) can be found in Appendix III.


> The local package index

To introduce APT-offline without overwhelming newbies with details, it is necessary to provide yet another intermediate level of detail.

In response to an APT-offline set-command, APT-offline translates the relatively simple set-command into a series of complex commands for APT itself, which actually performs the set-operation. The set-process translates the changes specified in the set-command into a list of the modules/files required to implement the changes on that installation. It does this by referring to the installation's internal software index, known as the "local package index," to obtain a list of all of the files which the desired changes COULD POSSIBLY require on any Debian-type installation, then comparing this list to the list of files which the installation already contains, and finally generating a list of the files which must be downloaded and installed in order to make the desired changes to that installation in its configuration at that time. (This can be demonstrated by installing an app and then generating a list to install the same app, in which case the list would be empty.) The generated list is stored in a file known as a "signature file," which is created during the set-operation using the name and location specified in the set-command. Depending on the type of changes being made, the signature file might also include security-related information such as checksums, file sizes, and a name to assign to each file when it passes its test. In some cases, these "new" names are the same as the original names (at least this was once the case), but APT-offline simply renames everything with whatever the package index provides. Checksums for package-index files are contained in the "InRelease" file, which is downloaded with the package-index files and digitally signed to allow it to be authenticated and validated (the reference checksum is protected by the digital signature, and the verification key is included in the installation).
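The list arithmetic described above - "everything the change could possibly require" minus "everything already installed" - can be illustrated with a toy example using standard shell tools. The package names are invented for illustration:

```shell
# "Could possibly require" list, as derived from the local package index
# (invented names; inputs to comm(1) must be sorted):
printf 'liba\nlibb\nlibc\n' > could-require.txt

# Files the installation already contains:
printf 'liba\n' > already-installed.txt

# Lines unique to the first file = what must actually be downloaded
# (comm -23 suppresses lines unique to file 2 and lines common to both):
comm -23 could-require.txt already-installed.txt   # prints libb and libc
```

This is why generating a signature file to install an already-installed app yields an empty list: nothing remains after the subtraction.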

The repository (which consists of the software packages/modules and the package index) is updated every day, although when a new revision of a software-module/package is added to the repository, the previous revision is retained (along with its listing in the package index) as long as it's needed by some piece of software. When you "update the package index," you're actually REPLACING the local package index with a newer copy (partial or complete) of the online package index. The effect of downloading software based on an old package index is that you might not get the latest available revisions of some software-modules/packages. (If you let it get too old, the package manager will stop working until you perform an update.) If you need the latest version of some package (such as an app), update the package index after that version has been released.

There is considerable variation in how package indices are organized in the repositories for the various types of Linux. For some types of Linux (such as Ubuntu and its derivatives), the index is divided into sections so that users don't have to download the entire index just to install the latest version of some app. But in other cases there's little choice but to download the entire index to update the local package index. When deciding which sections to download (assuming that you have a choice), I just download everything except what I definitely won't need. For example, I thought that I could get by without some of the updates-sections, but ran into problems as a result. So, if you don't know whether you'll need a particular section, include it.

When the installation has a direct internet connection and software is installed, the software-packages are screened as they're downloaded, and if they pass their screening-tests they are assigned a new name which they must have in order to be installed, then they're stored in /var/cache/apt/archives, and then they're installed. In some types of Linux, they are retained by default in the "archives" directory after installation, but in other types of Linux they're deleted by default, although they can be retained by entering the following command before installing the software to be retained (the command can be copied as usual and pasted into the command line with Ctrl-Shift-V):

echo 'Binary::apt::APT::Keep-Downloaded-Packages "1";' | sudo tee /etc/apt/apt.conf.d/10apt-keep-downloads

I tried to find a command to do the same for package-index files, to no avail. My motive was to be able to re-use the package-index files instead of downloading them again, which in some cases requires a download of 120MB. So, if you want to re-use package-index files, install APT-offline first, before updating the package index, even if you have a direct internet connection, and then use APT-offline to perform an update, so that you'll have the package-index files in a folder (which should be backed up) and can install them on multiple installations which have APT-offline on them.
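For reference, the command above simply creates a one-line configuration file. Afterward, /etc/apt/apt.conf.d/10apt-keep-downloads should contain the following (the "//" comment line is an optional addition for clarity, not something the command writes):

```
// Keep downloaded .deb files in /var/cache/apt/archives after installation.
Binary::apt::APT::Keep-Downloaded-Packages "1";
```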

To install APT-offline before updating the package index, go to the Packages site (such as Ubuntu Packages or Debian Packages, which are interfaces to the corresponding repositories) corresponding to the installation of interest (the type and version of Linux), find the list of packages required by APT-offline, and then go to DistroWatch.com and find the list of packages installed by default on that type and version of Linux. This will tell you which packages you'll need in order to install APT-offline on the installation of interest. Then you would download those packages from the Packages site, manually screen them for errors (copy the reference checksum for each package from the Packages site, calculate the checksum of the corresponding downloaded package, and compare the two), and use the installation's package-installer (such as GDebi) to install them. (This approach doesn't work with plain Ubuntu, because the package index must be updated even before using its "package installer," which is apparently APT.) This is obviously a very inefficient way to install anything, and a good reason to include APT-offline by default in every Debian-type release. For more details, see Appendix II.
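The manual screening step can be sketched as follows. The file and the reference value are stand-ins for illustration (the reference is the well-known SHA-256 of the text "hello" plus a newline, not a real value from a Packages site):

```shell
# Stand-in for a package downloaded with a web browser:
printf 'hello\n' > demo.deb

# Stand-in for the reference checksum copied from the Packages site:
REF="5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03"

# Calculate the downloaded file's checksum and compare:
GOT=$(sha256sum demo.deb | awk '{print $1}')
if [ "$GOT" = "$REF" ]; then
    echo "checksum OK - safe to install (e.g. with GDebi)"
else
    echo "checksum MISMATCH - discard and re-download"
fi
```

The Packages sites typically list several checksum types; whichever you use, be sure to run the matching tool (sha256sum, sha1sum, md5sum) on your download.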

When you use APT-offline to make some change to an installation, the required modules/files are downloaded into a folder, and you can install them on multiple installations of the same type and version of Linux, thus avoiding having to download them for each installation. However, when installing apps or upgrades, this approach can get complicated unless the same apps have been installed on all of the installations, so I'd just install the same apps on all of the installations of the same type and version, even if I wouldn't use all of the apps on every installation. Considering that some package-index downloads can require 120MB of your data allocation, that installing several apps can require over 100MB, and that system upgrades can require hundreds of MB, you can clearly reduce your data consumption by using APT-offline to make changes to multiple installations. (Although many people consider a few hundred MB of internet data to be nothing, there are still places on the planet where it's considered to be a lot.) For more information along these lines, see the section entitled "> Storing and installing the downloaded modules/packages/files".

However, when APT-offline is used, it might not be convenient to download the package-index files and app-files on the same day, because the package-index files must be installed on the installation of interest before generating a signature file for the desired apps. To download the package-index files and the packages for the desired software on the same day, you would have to make two trips to your internet-access point (assuming that you have to travel to get to your access-point). But the fact is that when a newer revision of some package is added to the repository, the previous revision is retained for probably at least a month, although I suppose the precise amount of time depends on various factors. So, it's safe to assume that you could, for example, download a copy of the online package index, and a week later download software based on that copy. If you download a copy of the package index every week, you'd always have a fairly fresh copy which you could immediately install on your offline PC whenever you decide to install some software on it.
 
When using APT-offline to update the package index, you would start by using an app such as Software & Updates (a.k.a. software-properties-gtk) to select the sections which you want to include in the update. (In some cases, you don't have much choice but to download the entire package index, due to the way it's organized.) You can also choose the server, although you should avoid selecting servers intended for small organizations, because they have limited capacity. I just use the default server, which I assume is determined by the location selected during the OS-installation process. Then you would perform an APT-offline set-operation for an update, to create a signature file for an update (a list of package-index files).

Next, you would transfer the aforementioned signature file to your internet-access device. As of this revision (5/22), this would probably be an Android/iOS device with Userland, Ubuntu, and APT-offline installed on it, along with a text-editor app and a clipboard-manager app such as Clipper, and possibly a file-manager app for reasons mentioned elsewhere. Then you would take the device to an internet-access point, and perform an APT-offline get-operation on the signature file to download the package-index sections listed in the signature file into a folder reserved for those files.

Lastly, you would transfer or just connect the folder containing the downloaded files to the target-installation (I'd transfer the folder to a couple of archival flash drives, as described in more detail in "> Storing and installing the downloaded modules/packages/files," in case I needed the files again, and connect one of the flash drives to the target-installation), and use its installation of APT-offline to perform an install-operation on the folder. The install-op fully installs package-index files and upgraded software packages, but files which are being installed on that installation for the first time as part of an app are just copied to the /var/cache/apt/archives directory, where app-files are placed in order to install them. Then, to fully install these software packages, you would perform a regular installation-operation for each new app, as if you had a direct, high-speed internet connection. This ensures that all of the files required to install each app are installed, and prevents other files (possibly including malware) which might have gotten into the "archives" directory from being installed.
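The two commands just described can be sketched as follows. They're printed rather than executed here, since they require root privileges and a real DDF; the mount point, CNF name, and app name in this sketch are all hypothetical:

```shell
# Hypothetical location of the DDF on a connected flash drive:
DDF="/media/${USER:-me}/ARCHIVE/CNF-office-pc-ubuntu-22.04-update-6-10-22/DDF"

# 1) The install-op: fully installs package-index files and upgrades,
#    and copies first-time app packages to /var/cache/apt/archives:
echo "sudo apt-offline install \"$DDF\""

# 2) Then a regular installation for each newly-downloaded app, as if
#    the installation had a direct connection (app name hypothetical):
echo "sudo apt-get install some-app"
```

Step 2 finds the app's packages already waiting in the "archives" directory, so nothing is downloaded; it simply completes the installation and reports any missing files.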

If the app has already been installed, and you just want to upgrade it to a more recent version, you would update the package index, generate a signature file for the app, download it, and perform an install-op. (Evidence of this is that when I try to install an app which is already the latest version, the package manager informs me that it's the latest version, indicating that it would have upgraded it if it weren't.) If upgrading the app requires upgrading other packages, those packages would be automatically upgraded as well, assuming that other software doesn't require the older versions.

> A more detailed description of the get-operation

A get-command causes APT-offline to read the signature file (SF) specified in the get-command, download the files listed in it, process the files according to their type, and store them in the download-destination folder (DDF) specified in the get-command. This processing includes comparison of the downloaded files' checksums and sizes against reference values (obtained from the package index and copied to the signature file during the set-operation), and assigning names (listed in the signature file) to the downloaded files if they make it through the screening-process. (I'm not sure what happens when a file fails the screening-process - perhaps APT-offline just deletes it and downloads it again, up to a certain number of times, and then declares it to be a failure if it can't get a copy that passes the tests.)

The only case in which I would use a generic downloader (such as a web-browser) to download files for installation would be to download the files required to install the text-command version of APT-offline. However, since the APT/APT-offline downloader wouldn't be used for downloading the files, they wouldn't be automatically screened for errors, and would require manual screening (as described in Appendix II) before installing them. Although the chances of getting a corrupted version of APT-offline from an official repository are very low, the effects of a corrupted version could be disastrous, because it might allow other corrupted files to be installed. After the text-command version is installed, I'd use it to install the GUI, which is designated as apt-offline-gui in text-commands etc. The GUI allows you to avoid creating set-commands, unless you need to use an option which the GUI doesn't provide (see the APT-offline "manpage" for a complete list of commands and options). However, the GUI provides all of the set-command options which I need.

> Set-operation details

Set-operations can be performed by entering an APT-offline set-command or by using APT-offline's GUI. Here are a couple of example set-commands:

A) "sudo apt-offline set apt-offline.sig --update" (w/o quotes), which creates a signature file named apt-offline.sig (which is the only name I use in my simple system) for downloading a copy of the selected sections of the package index from the selected server. These selections are made before performing a set-operation for an update (assuming that new selections are desired), by using the installation's software manager or software-sources-selection app. (The terminal can also be used, if you need more complexity.)
B) "sudo apt-offline set apt-offline.sig --install-packages apt-offline-gui" (w/o quotes), which creates a signature file for the packages required to install APT-offline's GUI.

Set-commands are easier to understand by breaking them down into their basic elements: they begin with "sudo" to require the password to be entered before the command will be executed (to prevent unauthorized changes to the installation), then they call apt-offline and tell it to perform a set-operation, then assign a location (path) and name to the signature file to be generated, and then list the desired changes to the installation. When entering app-names into a set-command or the GUI, the apps' special names without caps or spaces must be used (they can be found in various ways, such as by using a package-manager GUI such as Synaptic to perform a search for the app), and when multiple app-names are entered, they must be separated by a comma ONLY (no spaces).

To save time, set-commands can be stored in a note on the desktop, modified to suit each situation, copied, then pasted into the terminal by pressing Ctrl-Shift-V. To use the same one repeatedly, it can be stored in a clipboard-app such as Clipper, clicked or tapped to copy it, and pasted into the terminal. The only situation in which it would be absolutely necessary to use a set-command would be using the text-input version to install the GUI (such as if you're using Ubuntu and have a slow internet connection, and want to avoid using the slow connection for making changes to your Ubuntu installation whenever possible, without getting a temporary direct high-speed connection for the PC).

But assuming that APT-offline and its GUI have already been installed, you could use the GUI to perform the set-operation. When the GUI is installed, a launcher is added to the application menu, but it can also be launched by entering "sudo apt-offline-gui" into the terminal. (Text can be pasted into the terminal with Ctrl-Shift-V, and copied from it by highlighting the text of interest and pressing Ctrl-Shift-C. Once the command has been entered, it can be re-entered by opening the terminal, pressing the up-arrow key, etc.) In either case, the password is required.

When the password is entered and APT-offline's GUI appears, click on the Generate Signature button, and the Generate Signature window will appear. To specify the location and name (path and name) for the signature file, click on the Browse button, etc. (To help keep things organized, I recommend "apt-offline.sig" as the name for every SF, and the Home directory as the location. Root/superuser privileges might be required to delete the original "copy" of the SF from the Home directory, but it can just be copied and pasted into a suitably-named CNF to be put to use, and the original will be overwritten when the next SF is generated.) After entering the path and name the first time, you could copy the resulting contents of the field next to the Browse button, save them in a desktop note, and paste them into the field from then on.

To install apps, you would select "binary packages" (unless you want to use source packages, in which case you're probably an advanced user and don't need me to tell you what to do). Then you would specify the apps to be installed, using the apps' special names, as mentioned previously.


> Package index details

During the set-operation, the user specifies the desired changes, and APT uses the installation's internal ("local") package index to translate the specified changes into a list of all of the packages/files which COULD POSSIBLY be required to make the specified changes to the installation being changed, and to any other installation of the same type and version. APT then compares this list to the list of packages which the installation already contains, and generates a list of packages which must be downloaded and installed to make the specified changes.

The package manager cannot access the online package index directly, for security-related reasons. It uses the local package index (which consists of downloaded, screened, and extracted package-index files) as its reference. The only way for the package manager to access the latest contents of the package index is to download and extract copies of the latest package-index files, and the only way to do that is to perform a package-index update, either with a direct internet connection, or with APT-offline. (Only APT and APT-offline can download package-index files, screen them, and, assuming that they pass the screening-tests, assign the names they are required to have before they can be installed.) However, the user can access the latest information in the package index by going to the Packages website corresponding to the release of interest (such as Debian Packages or Ubuntu Packages).

The local index takes the form of text files known as "package lists" which are stored in /var/lib/apt/lists. An app's listing can be displayed by entering "apt-cache show <app>," assuming that the app's corresponding package-index section is installed.
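As a sketch, the commands for reading an app's listing look like this. They are built as strings rather than executed, since running them requires installed package lists; "vlc" is an arbitrary example app, and the SHA256 field is mentioned here because it becomes useful later for manual checksum screening.

```shell
# Sketch: querying the local package index (strings only; actually running
# apt-cache requires the relevant package lists to be installed).
pkg="vlc"                                    # arbitrary example app
show_cmd="apt-cache show $pkg"               # prints Package, Version, Depends, SHA256, etc.
checksum_cmd="apt-cache show $pkg | grep SHA256"   # pulls out just the reference checksum

echo "$show_cmd"
echo "$checksum_cmd"
```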

The online package index is revised daily to reflect changes to the repository's contents, although when a new revision of a software-module/package is added to the repository, the previous revision is retained (along with its listing in the package index) as long as it's needed by some piece of software. When you "update the package index," you're actually REPLACING the local package index with a newer copy (partial or complete) of the online package index. The effect of downloading software based on an old package index is that you might not get the latest available revisions of some software-modules/packages. (If you let it get too old, the package manager will stop working until you perform an update, and if you try to install an app without updating the package index, APT might tell you that it can't find it, even if the relevant package-index section is installed and you can find the app's listing manually.) If you need the latest version of some package (such as an app), you have to wait until it's been released, then update the package index.

The online index is stored in the repository in the form of compressed archive files, which cannot be read until they have been installed. (To avoid conflicts between different versions of the package index, the previous local index is automatically purged in its entirety when the new index is installed. So, the new local index includes only what is included in the update.) In some cases, the index is divided into sections, and the user can select which sections to include in the update, to avoid having to download the entire index just to install the latest version of a single app, for example. However, this introduces the possibility of excluding sections, such as updates, which are necessary to install the desired app, so I just select everything except what I know I won't need. These selections are typically made by using the installation's software-manager GUI, or small apps such as Ubuntu's Software & Updates app (software-properties-gtk).

If you must travel to gain access to the internet, and can do so only about once per week, I suggest downloading the package index at least every couple of weeks, even if you don't think you'll need it, so that you always have a fairly recent copy which you can install immediately if you want to generate a signature file for an app. In other words, you would generate a signature file for an update, and use it in a get-op whenever convenient. If you use a phone to perform get-operations, and use the same server and the same package-index sections all the time, routinely downloading the package index could be reduced to a few screen-taps (to copy the get-command from a clipboard-app such as Clipper and paste it into the terminal).

It might be a good idea to retain each package-index download until after the next version has been downloaded, in which case the simplest solution might be to append the download-date to the DDF's name after performing the get-operation, and to create a new DDF for the next get-operation. Once the new index has been successfully downloaded, the previous download would be deleted. The same SF would always be used unless you wanted to change something such as the selection of index-sections or the server, in which case an app such as Software & Updates would be used to make the desired changes, and then a set-operation would be performed to create a new update-SF. To keep things as simple as possible, I'd always select all of the package-index sections except those which I definitely wouldn't need for the foreseeable future, and I'd always use the same server as long as it's fast and reliable.

If the installation of interest allows you to select the types of software updates to be included in the package-index update, I recommend including all of the updates-sections. When I performed a package-index update on Xubuntu 16.04 without including the updates-sections of the index, and then tried to install a couple of large apps, I was unable to install either due to "broken packages," meaning that some "dependencies" (lower-level software-modules required by higher-level modules) were missing and couldn't be installed for some reason which is never explained - the error message just says that they're required but aren't going to be installed. But after re-updating the package index with a copy which included the updates-sections, I no longer received any broken-packages messages, and was able to install the apps. So, I now always include all of the updates-sections.

If the package index isn't updated for a long time, you might get locked out of it, and you wouldn't be able to generate a signature file for any version of any app without first updating the index. So, if you must travel to access the internet, don't assume that you can neglect package-index updates without paying a price in the form of long delays to install an app, or extra trips to your internet-access point.

> Using change-name folders to stay organized

The ultimate reason for my recommended system for using APT-offline is to make it easy to perform get-operations even on phones under distracting conditions, by storing a few fixed get-commands in a clipboard-manager app so that they can be copied (one at a time) with a tap and pasted into a Linux terminal (such as the terminal in Ubuntu installed on Userland, an Android/iOS app) with a tap/hold/tap. Get-commands take the form of "apt-offline get <Path>/apt-offline.sig -d <Path>/DDF" ("apt-offline, go to the directory at <Path>, perform a get-operation on apt-offline.sig, and put the downloaded/screened files in the folder named DDF in the same directory"). So, in my recommended system, all signature files are named "apt-offline.sig" (which is also the name which APT-offline assigns to SFs by default), and all download-destination folders are named "DDF," but this requires them to be stored in "change-name folders" (CNFs), each of which has a unique and typically long name, to identify the SF/DDF pair's purposes, and to isolate the pair from any other SF/DDF pairs (since all SFs are named "apt-offline.sig" and all DDFs are named "DDF").

To perform a get-op, the CNF would be transferred to a get-process directory, which is a sub-directory created by the user and placed in some stable, convenient directory on the download-device, or on a flash drive to be connected to the download-device. The idea behind a GPD is to be able to put a CNF in a GPD, and then move the contents of the CNF (an SF/DDF pair) out of the CNF into the GPD, so that a get-op could be performed on the SF/DDF pair by executing a get-command with the GPD's name ("GPD1," etc.) in the path. If the SF/DDF pair were left in the CNF, its path would have to be included in the get-command, which would require a new get-command for every get-op, and creating a new get-command on a portable device (or creating one on a PC and transferring it to a portable device) would be very inconvenient. The reason for moving the CNF into the GPD, and then moving the SF/DDF pair out of the CNF into the GPD, is to make it easy to return the SF/DDF pair to the correct CNF after using the SF/DDF pair in a get-op. The use of multiple GPDs requires a get-command for each GPD, but there would probably be just a few, and all of the commands would be identical except where they refer to the corresponding GPD (in two places). Each get-command would take the form of "apt-offline get /<path>/GPDn/apt-offline.sig -d /<path>/GPDn/DDF," where <path> is the location of the directory containing the GPDs, and n is the associated GPD's number. (There's no "sudo" in get-commands.)
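The CNF/GPD shuffle described above can be sketched as a runnable example. It uses a temporary directory in place of a real download-device, and the CNF name "Brix-MX21-update-2022-06-10" is just an example of the naming convention; the signature file is an empty stand-in.

```shell
# Runnable sketch of the CNF/GPD layout (temporary directory stands in for
# the download-device; the CNF name and signature file are stand-ins).
base=$(mktemp -d)
cnf="$base/Brix-MX21-update-2022-06-10"
mkdir -p "$cnf/DDF"
touch "$cnf/apt-offline.sig"        # stand-in for a real signature file

# Move the CNF into a get-process directory, then move the SF/DDF pair
# out of the CNF into the GPD, so the fixed get-command can find them:
mkdir -p "$base/GPD1"
mv "$cnf" "$base/GPD1/"
mv "$base/GPD1/Brix-MX21-update-2022-06-10/apt-offline.sig" \
   "$base/GPD1/Brix-MX21-update-2022-06-10/DDF" "$base/GPD1/"

# The fixed get-command for this GPD would then be:
echo "apt-offline get $base/GPD1/apt-offline.sig -d $base/GPD1/DDF"
```

The now-empty CNF stays inside the GPD, so the pair can be moved back into it after the get-op.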

The creation of SFs, DDFs, and CNFs (change-name folders) has been covered. At home, each CNF containing an SF/DDF pair to be used in a get-op would be placed in a separate GPD, and then the SF/DDF pair would be moved out of its CNF into the corresponding GPD. Then, once the phone has been connected to the internet, a get-op, or a series of get-ops, would be performed by copying/pasting each of the relevant get-commands into the command line, hitting Enter, waiting for the get-op to finish, then copying the next get-command, etc. After returning home, each SF/DDF pair in a GPD on the phone would be returned to its CNF, and then I would move each CNF to a folder pertaining to the relevant installation, on an archival flash drive or two, and then connect the flash drive to each target-installation to perform an install-op on the relevant DDF in each case.

=======================

Note: A subtle point that might save you a lot of time

To use the SF/DDF pair in a get-operation, it would be moved from its CNF into its get-process directory, and when the get-operation is completed, the pair would be returned to its CNF. If an Android or iOS device is being used for the get-process, it might be necessary to use a file manager installed on the device to move the SF/DDF pair out of its CNF into the get-process directory, and back into its CNF after the get-op, so that when you connect the device to the offline PC, the PC's file manager can find the downloaded files in the DDF, which is obviously necessary to install them on the PC. Just keep this in mind in case your PC's file manager can't find files which you know were downloaded to your device.

=======================


Then the contents of the corresponding DDF would be installed by means of an install-op performed with the target-installation's copy of APT-offline, after which the user would have to perform a regular installation-process on any apps just "installed" by APT-offline - probably to ensure that all required modules are installed, and to prevent any extraneous files from being installed. (On the other hand, an APT-offline install-op fully installs package-index files, which are actually just extracted into /var/lib/apt/lists, and upgrades of modules which have already been installed, since the prior installation lets APT know that it's OK to install the upgraded versions.)

> Get-operation details

To use the signature file in a get-operation, it is transferred or connected to some device which can readily obtain high-speed internet access and which contains an installation of APT-offline. Then the device is connected to the internet, and an APT-offline get-command is executed, which causes APT-offline to read the specified signature file, download and screen the files listed in it, and save those that pass the screening-tests to the folder designated in the get-command. The get-operation uses security-related information in some types of signature files (such as those for apps) to screen the downloaded files for corruption or tampering (such as injected malware), and to rename some types of files to indicate that they are safe to install. (Checksums for package-index files are contained in the InRelease file, which is downloaded with the package-index files and digitally signed to authenticate it.) All of the files are "renamed" if they make it through the screening-process, but some are "renamed" with the same name under which they were downloaded, which means that these files could be downloaded via web-browsers or generic downloaders and installed without screening them, although this would be risky and pointless. If a file fails the screening test, I assume that APT-offline would delete it and download it again, and declare it to be unobtainable if it can't download a copy that passes the tests.

There are a few ways to install APT-offline without a direct internet connection (see Appendix II), one of which involves downloading the required files by means of a web browser or generic downloader, and thus without proper screening. In this case, the files should be screened in a separate process, as described in Appendix II. In some versions of Linux, you can calculate a file's checksum by right-clicking on the file and making some selections. The reference checksums are contained in the package index and can be obtained by entering "apt-cache show <package name, such as "apt-offline">", but the package index might be locked if it hasn't been updated lately.
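Manual checksum screening can also be done in the terminal with sha256sum. Below is a runnable sketch; since there is no package index in this demo, the "reference" checksum is computed from the file itself, whereas in practice it would be copied from the output of "apt-cache show <package>".

```shell
# Runnable sketch of manual checksum screening. In real use, "reference"
# would be the SHA256 value copied from the package index, not recomputed.
f=$(mktemp)
printf 'example package data' > "$f"
reference=$(sha256sum "$f" | awk '{print $1}')   # stand-in for the index value
actual=$(sha256sum "$f" | awk '{print $1}')      # checksum of the downloaded file

if [ "$actual" = "$reference" ]; then
    echo "checksum OK - safe to use"
else
    echo "checksum MISMATCH - discard and re-download" >&2
fi
rm -f "$f"
```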

The get-command, which is basically "apt-offline get <path>/apt-offline.sig -d <path>/DDF" (without the quotes) tells APT-offline to follow <path>, to look in the corresponding directory for signature file apt-offline.sig and a folder named DDF, to download the files listed in apt-offline.sig, and to put them in folder DDF, which is reserved for the files listed in apt-offline.sig.

=============

Get-op error messages

It isn't unusual for error-messages to appear in the terminal during the get-operation, but they can usually be disregarded because they're typically caused by variations in how different servers are organized. Signature files are designed to take these variations into consideration, so that regardless of how a server is organized, the required data is always obtained. The typical error message just indicates that an approach that works for some servers isn't working for the server which is being used.

==============

I've found that the easiest way to perform the get-operation is to use a few fixed get-commands, even if it's feasible to install and use the GUI, because using fixed get-commands reduces the task of initiating a get-operation to a few simple steps (such as a few screen-taps on an Android/iOS device).
 
Using fixed get-commands requires all signature files (SFs) to have the same name, and all download-destination folders (DDFs) to have the same name, and it requires each SF/DDF pair to be placed in a separate "get-process directory" (such as GPD1, GPD2, etc.). A good name for signature files is "apt-offline.sig," since this is the default name used by the set-operation, and I've adopted "DDF" as the name for download-destination folders. Use whatever name you prefer - all that really matters is that you know what it means, that it's not being used for anything else, and that it's easy to type. To allow multiple identically-named SF/DDF pairs to exist in the same file system without any risk of clashing, and to identify their purposes, I place each SF/DDF pair, immediately after creating it, in a separate "change-name folder" (CNF). Each CNF is named after the target-installation (the PC and type/version of Linux, such as "Brix-MXX-21" for an installation of MX-Linux 21 XFCE on a Gigabyte Brix mini-PC), the changes specified in the set-command which generated the corresponding signature file (such as "update" or abbreviated app-names), and the date when the SF was generated (see "> Using change-name folders to stay organized").

To prepare for a get-operation, the CNF is placed in a GPD, and the SF/DDF pair is moved out of the CNF into the GPD. Then, when the download-device is connected to the internet, the get-command for that GPD is executed. After the get-op is completed (when convenient, if not immediately necessary), the SF/DDF pair is moved back into the CNF, the CNF is transferred to an archival directory on a flash drive, and the flash drive is connected to the installation of interest, where an APT-offline "install" op is performed on the relevant DDF on the archival flash drive.

This system allows multiple get-ops to be performed in the same download-session by saving a few fixed get-commands in a clipboard-manager app, so that each one could be copied and pasted into the terminal with a few taps. Each get-command would be "apt-offline get GPDn/apt-offline.sig -d GPDn/DDF," where n would be the GPD's number.
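The few fixed get-commands can be generated mechanically, since they differ only in the GPD number. The sketch below just builds and prints the strings (the base path "/storage/internal" is the Userland example used later in this article; substitute whatever directory holds your GPDs).

```shell
# Sketch: generating the fixed get-commands (one per GPD) that would be
# stored in a clipboard-manager app. The base path is an example.
base="/storage/internal"
for n in 1 2 3; do
    cmd="apt-offline get $base/GPD$n/apt-offline.sig -d $base/GPD$n/DDF"
    echo "$cmd"
done
```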

Many error-messages might appear in the terminal during the get-operation, indicating that certain files can't be found, but these messages are typically due to variations in repository/server-organization and can be ignored, because in that repository, the supposedly missing files would be included in other files listed in the signature file. However, after performing a get-op, I check the DDF to confirm that the expected files were actually downloaded.

> Avoiding get-op interruption

When I performed the initial package-index update on my initial flash-drive installation of Lubuntu 22.04, I used APT-offline so that I could reuse the downloaded package-index files on additional Lubuntu 22.04 installations. (It would probably be possible to use them for installations of Ubuntu 22.04 and its derivatives.) I also placed the download-destination folder in the installation's home folder, without considering that its write-speed would be much slower than that of the SSD on the PC which I normally use for the get-process. (The download apparently goes to DRAM first, to speed up the download-process, and is then written to the destination-folder on the storage device.) The download-progress indicators which appear in the terminal during a get-op apparently do not indicate the progress of the write-op to the storage device, but just the progress of the download to DRAM. (However, the progress-indicators which appear in the terminal during an install-op apparently do indicate the progress of writing the data to the storage device.)

When I installed the aforementioned initial package-index-download onto my initial Lubuntu 22.04 installation, and tried to install some apps, I got a bunch of "cannot authenticate package" warnings. So, I performed a regular package-index update, using a direct internet connection, and then tried to install the same apps again, and did not receive any "cannot authenticate package" warnings. Then I performed another update using APT-offline, then tried to install the same apps, and did not receive any "cannot authenticate package" warnings.

To rule out the possibility that performing the direct update was necessary, I installed the first APT-offline update-packages onto my secondary Lubuntu 22.04 installation (which had not yet been updated), then tried to install some apps, and received "cannot authenticate package" warnings. Next, I installed my second set of update-packages obtained via APT-offline, tried to install the same apps, and did not receive any "cannot authenticate package" warnings. So, I concluded that there was simply something wrong with the initial update-packages obtained via APT-offline, and promptly deleted them, assuming that they were useless - even though they would have been useful for trying to understand what had gone wrong. But at least I didn't delete the package-LISTS, contained in /var/lib/apt/lists, which are the ultimate result of performing an update (in most cases, they're the extracted versions of the downloaded package-index files). I used the command "ls > <filename>.txt" to create lists of the files contained in /var/lib/apt/lists in each of my Lubuntu 22.04 installations. Then I compared the lists, and found quite a few differences, although I could not understand how the differences would cause the "cannot authenticate package" warnings. Ultimately, I concluded that I had terminated the initial get-process before the data had been completely written to the flash-drive installation.
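The comparison described above can be sketched as follows. Temporary directories stand in for /var/lib/apt/lists on the two installations, and the file names are invented stand-ins for real package-list names.

```shell
# Runnable sketch: listing and diffing the package lists of two installations.
# The temp dirs stand in for /var/lib/apt/lists; file names are stand-ins.
a=$(mktemp -d); b=$(mktemp -d)
touch "$a/main_Packages" "$a/updates_Packages"   # installation 1's lists
touch "$b/main_Packages"                         # installation 2's lists

ls "$a" | sort > "$a.txt"
ls "$b" | sort > "$b.txt"

# diff prints the entries present in one installation but not the other:
diff "$a.txt" "$b.txt" || echo "the package lists differ"
```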

So, if you ever use a flash-drive installation to perform a get-op, and designate the installation as the destination for the downloaded package-index files, you won't be able to eject the drive as a means of determining when the process of writing the downloaded files to the installation is complete. In such a case, I recommend using something to monitor the process of writing the download to the flash drive. (If the drive could be ejected, you could eject it and wait for the ejection to complete to know when the write-process was finished, but then you'd have to unplug it and plug it back in if you wanted to continue using the drive. So, a disk-I/O monitor would also be useful in such cases.) The XFCE desktop includes a panel plugin for this purpose (xfce4-diskperf-plugin); otherwise, it appears that as of this writing, the best option is GKrellM (the GNU Krell Monitors system monitor), although GKrellM requires GTK+ 2.4 (or newer). iotop (which runs in the terminal and provides numerical indications of disk I/O) will also do the trick.

> Storing and installing the downloaded modules/packages/files

Assuming that you have just downloaded a bunch of packages to a few DDFs (each along with its associated SF and CNF, in a GPD) on a phone, tablet, or "sneaker-net" flash drive (the flash drive used for transferring data between the Debian installation and a download-device such as a remote PC with a fast internet connection), you would first move each SF/DDF pair back into its CNF (perhaps using a file manager on the phone or tablet if the file manager on the target-installation can't find the downloaded files on the phone or tablet, which has happened to me, apparently due to some quirk of Android). Then I would copy the CNFs to a couple of archival flash drives, and delete them from the GPDs (cutting/pasting might accomplish the same thing, but I'd rather see them on the archival flash drives before deleting them from the GPDs, just to be on the safe side).

Each archival flash drive would have a folder for each of my current installation-types, and each folder would contain package-index files and software. This would allow me to create a new installation of that type in the short term without having to download much if anything. In the long term, the package index will become outdated and need to be updated/replaced when creating a new installation, possibly requiring upgrades to the stored software, although you might be able to re-use a significant portion of the stored data and reduce your internet-data consumption accordingly.

In case you end up needing to use APT-offline to create new installations, keep the packages required to install APT-offline on that type of installation in a separate folder in the corresponding archive, to make it convenient to use a package-installer such as GDebi to install APT-offline before updating the package index, and then use APT-offline to perform updates, upgrades, and software-installation. Unfortunately, this approach can't be used with plain Ubuntu, which uses APT as its package-installer, meaning that the package index would have to be updated before installing any packages. So, the only way to install APT-offline on plain Ubuntu is via a direct internet connection.

To install the downloaded packages on their target-installation (referring to the first update/installation cycle on an installation, with no plans to make any other changes to the installation in the foreseeable future), I'd connect one of the archival flash drives to the target installation and perform an APT-offline install-op on the relevant DDF. If the packages are package-index files, the install-op would fully install them, which includes extracting them in many cases. The installed versions (located in /var/lib/apt/lists) are known as "package lists," and are collectively known as the "local package index." Upgrades (upgraded versions of previously-installed packages) are also fully installed during the install-op (the fact that earlier versions were previously installed constitutes permission to install them). But if they're software packages which are being installed for the first time on that installation, as part of an app, they are just copied to /var/cache/apt/archives, although in some simple cases, they are actually installed. In most cases, to fully install apps for the first time on a particular installation, the user would then have to perform a normal installation-process, such as by entering "sudo apt install <app1> <app2> <app3>" for the apps being installed (separated by single spaces), which should be indicated in the CNF's name. This is apparently intended to ensure that all of the modules required by the apps are installed, and to prevent any malware from being installed. The alternative would be to simply install all of the files which had been transferred to the installation, without receiving any indication of missing modules, possibly installing malware, corrupting the installation, and perhaps destroying your data.
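The two-step process described above can be sketched as follows. The commands are built as strings rather than executed, since both require sudo, and the DDF path and app names ("gimp", "vlc") are examples only.

```shell
# Sketch of the two-step installation (strings only; both steps need sudo).
# The DDF path and app names are examples.
ddf="/media/usb/Brix-MX21-apps-2022-06-10/DDF"

# Step 1: APT-offline copies the downloaded packages into /var/cache/apt/archives:
step1="sudo apt-offline install $ddf"

# Step 2: a normal install, which now works offline because the packages
# are already in the cache:
step2="sudo apt install gimp vlc"

echo "$step1"
echo "$step2"
```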
If the CNF's name doesn't indicate the purposes of the packages in the DDF, you could look over the DDF's contents to remind yourself what you intended to install, but if the DDF contains a lot of packages, you'll wish you had done a better job of naming the CNF.

Over the long term, when using stored packages to create a new installation without a direct internet connection, I would use APT-offline to update the new installation, but I would use the installation's file manager to copy the stored software packages to the installation's /var/cache/apt/archives directory (where software must be placed before APT can install it), to avoid installing old versions of some packages, and then potentially having to replace them with newer versions shortly thereafter. Copying files to the /var/cache/apt/archives directory requires the file manager to be used in super-user mode. To do this, close the file manager, then open a terminal and enter "sudo <file-manager name>." This works with the XFCE file manager Thunar, Ubuntu Mate's file manager Caja, Gnome's file manager Files, and perhaps others. However, to open Files in super-user mode, you would enter "sudo nautilus," because Files was originally named Nautilus, and the command name was never updated. But super-user mode isn't an option with Dolphin, KDE's default file manager, and perhaps others, in which case you'd have to use an APT-offline install-op to copy the stored files to the new installation, or install another file manager.

==============

Warning

Be very careful when using a file manager in superuser mode, because a mistake could destroy the installation. As soon as you have performed the desired operation, close the manager and the terminal to exit super-user mode.

===============


By using the file manager to copy the stored packages to the new installation, they will be visible to APT when generating a signature file to perform an upgrade or to install apps, but upgrades would not be installed, as they would be if they were transferred to the installation by means of APT-offline. (Even installing just an app might entail installing some upgrades, so even if you installed only apps on the original installation, upgrades might be included in the stored files.) For the time being, it would be a waste to install the upgrades, because some of them might have been superseded by upgrades listed in the latest package index. So, I would just use the file manager to copy the stored packages to the "archives" folder, and then use APT-offline to generate a signature file for an upgrade (if desired or required) and for any apps to be installed (including apps which were stored on your archival flash drives and might include packages for which there are upgrades, and apps being installed from the online repository).

Then I'd perform a get-op on the signature file, and then use the target installation's copy of APT-offline to perform an install-op on the folder containing the files just obtained in the get-op. The install-op would fully install the upgrades (if there were multiple versions of an upgraded package in /var/cache/apt/archives, APT would install the latest, unless some app required an older version). Then the apps would be installed by performing a regular installation for each app, which can be done with a single command which lists all of the apps ("sudo apt install app1 app2 app3 etc."), with a space between each app-name, using the apps' special names, which have no caps or spaces (such as "gnome-disk-utility" for the Disks utility, which is excellent overall, and the only app to my knowledge which can create encrypted partitions).


====================

Appendix I. Installing APT-offline, and using it in get-mode, on Android devices (presumably applicable to iPhones with minor modifications)
 

As far as I know, the most practical portable device for running APT-offline in get-mode is a phone with an installation of APT-offline on Ubuntu installed on top of Userland (an Android/iOS app). You have to provide a username and a couple of passwords (I suggest making them easy to remember, and to enter with the on-screen keyboard, because you have to enter one of them each time you start Userland). You also have to update the package index (enter "apt update" after installing Ubuntu) before installing any apps. You could perform an upgrade, but you probably won't need to do so just to install and use APT-offline.

Userland's "home" directory in the Android file system is /storage/emulated/0/Android/data/tech.ula/files, with which users are not allowed to do anything. (In previous revisions, I suggested creating directories inside the "files" directory, but I later realized that Userland doesn't recognize them.) To import data into Userland, place it in the "storage" directory at /storage/emulated/0/Android/data/tech.ula/files/storage, of which there are two "parallel" versions - one in internal memory, and one on any "installed" SD card. In the Ubuntu terminal on Userland, the path to the "storage" directory in internal memory is /storage/internal, and the path to the one on the SD card is /storage/sdcard.

I have an OTG phone, because I wanted to be able to connect a wired keyboard to it, which I do, via an OTG adapter-cable (which shorts two of the pins together to put the phone into "host"-mode), and a powered USB hub, to avoid draining the phone's battery. I can also connect flash drives to the hub and access them with Android file managers, and use my PC's file manager to transfer files to and from the phone, by connecting my phone to my PC with a regular (non-OTG) USB cable. To get the PC to recognize the phone, I first connect the PC and phone, and then swipe down from the top of the screen and select the type of transfer desired. When I selected "files," I was able to access the phone's file system with my PC's file manager.

My recommended system for performing get-operations on Android/iOS devices is based on the use of change-name folders (CNF's) described elsewhere in this article. At home, all of the CNF's containing SF/DDF pairs to be processed would be copied to get-process directories (GPD1, GPD2, etc., explained previously), which you would create in Userland's "storage" directory.

Suppose that you have two GPDs (GPD1 & GPD2) in the Userland storage directory, and two CNFs, each containing an SF/DDF pair to be processed (such as to download package-index files for one installation and software packages for another in a single download-session). While at home, you would transfer one of the CNFs into GPD1 and the other into GPD2, and move each SF/DDF pair out of its CNF into its corresponding GPD. When you get the Android/iOS device connected to the internet, you would execute the fixed get-command corresponding to GPD1 (apt-offline get /storage/internal/GPD1/apt-offline.sig -d /storage/internal/GPD1/DDF), which would be stored in a clipboard-app, as described previously, then wait for the get-op to finish. Then examine the corresponding DDF to ensure that the packages have been received, and if so, enter the get-command for GPD2, wait for it to finish, and examine the contents of its DDF.
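The shuffle described above can be sketched with ordinary shell commands. This is a runnable illustration, not APT-offline itself: a temporary directory stands in for /storage/internal, and the CNF name (CNF-update-pc1) is made up; only the apt-offline.sig and DDF names are the real ones used in this article.

```shell
# Runnable sketch of the SF/DDF shuffle, using a temporary directory
# in place of /storage/internal and a hypothetical CNF name.
BASE="$(mktemp -d)"                              # stands in for /storage/internal
mkdir -p "$BASE/GPD1" "$BASE/CNF-update-pc1/DDF"
touch "$BASE/CNF-update-pc1/apt-offline.sig"     # the signature file (SF)

# At home: move the SF/DDF pair out of its CNF into its GPD.
mv "$BASE/CNF-update-pc1/apt-offline.sig" "$BASE/CNF-update-pc1/DDF" "$BASE/GPD1/"

# At the access point, the fixed get-command would be (not run here,
# since it requires apt-offline and an internet connection):
#   apt-offline get /storage/internal/GPD1/apt-offline.sig -d /storage/internal/GPD1/DDF

# Back home: return the pair (with any downloaded packages) to its CNF.
mv "$BASE/GPD1/apt-offline.sig" "$BASE/GPD1/DDF" "$BASE/CNF-update-pc1/"
```

The same sequence would be repeated with GPD2 and the second CNF.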

Then, when you return home, you would return each SF/DDF pair to its original CNF before transferring or connecting the CNF to the target installation, or to a flash drive used for transferring the CNF to the target installation, and perhaps for maintaining an archive of packages for creating more installations. Just leave the packages in their DDF until their install-op is finished, and then you can transfer them to your backup-archive.

================

Note: A subtle point that might save you a lot of time

It might be necessary to use a file manager installed on your Android/iOS device to move each SF/DDF pair out of its CNF into its corresponding GPD, and back into the CNF after the get-op, so that the PC can find the downloaded files in the DDF, which is obviously necessary to install them on the PC.   

==================

Then you would perform an APT-offline install-operation on the DDF. If the GUI isn't available, you would enter an install-command, "sudo apt-offline install <path>," where <path> is the path to the DDF. To get this path, right-click on the DDF, select Properties in the menu which appears, and copy the path shown in the Properties window. Or you could open a terminal in the CNF and avoid the need for a path in the command. Chances are that the install-operation won't actually install the apps, but just "add" them (put them in /var/cache/apt/archives), and that you would need to enter something like "sudo apt install <app1> <app2> etc." to actually install them. But in some simple cases, apps are completely installed.

To load the get-commands into the clipboard-manager app on the portable device, I suggest the following system, which requires you to install a text-editor app and a clipboard app on your phone. While still at home, where it would be much easier (such as with an external keyboard connected to the phone, the phone mounted in a stand for optimal hands-free viewing, and minimal distractions), you'd enter the command pertaining to GPD1 (apt-offline get /storage/internal/GPD1/apt-offline.sig -d /storage/internal/GPD1/DDF) into a text file. (Another option would be to use your PC to copy this command into a text file reserved for the command, save the file as get-com.txt or whatever you prefer, and copy it to the phone, thus avoiding the need to use a keyboard with the phone for this task.) Then use the text-editor's "select all" function to copy the command from the text file. When you copy the command, it would be added automatically to the clipboard-app's clipboard-history list, from which it could be copied by simply tapping on it, and then pasted into the terminal by pressing on the command line until a menu appears and selecting Paste. (The terminal's command-history is cleared each time Userland is shut down, so it's useless for storing get-commands between sessions.)

When switching between Userland and the clipboard-app, for example, tap on the round button to get Userland out of the way, then use the clipboard-app, tap on the round button again, and then tap on Userland's icon. If you mess up and have to log into Userland again, it will start another session (another terminal, running another process), and the easiest way out of that is to enter the password, then enter "exit." If there are multiple sessions running, you can drag a drawer out from the left side of the screen to switch between them.

To create a get-command pertaining to GPD2, you could use the text-editor app on the portable device to change "GPD1" to "GPD2" in two places in the get-command in get-com.txt. Then you would copy it to add it to the clipboard-manager's clipboard-history.
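If a terminal is handier than the text-editor app, the same substitution can be done with sed. This is a hedged sketch using the get-com.txt file name suggested above; the temporary directory just keeps the example self-contained.

```shell
# Hypothetical sketch: derive the GPD2 get-command from the GPD1 command
# with sed, instead of editing get-com.txt by hand.
cd "$(mktemp -d)"
printf '%s\n' 'apt-offline get /storage/internal/GPD1/apt-offline.sig -d /storage/internal/GPD1/DDF' > get-com.txt

# Change "GPD1" to "GPD2" in both places, saving the result as a second file:
sed 's/GPD1/GPD2/g' get-com.txt > get-com2.txt
cat get-com2.txt
```

The resulting get-com2.txt would then be copied to the clipboard in the same way as the first command.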

Clipper allows you to create lists for arbitrary categories of clips, and I suppose that other clipboard-manager apps also have this capability. So, you could create a list specifically for get-commands, and then copy the get-commands from the clipboard-history list to the get-command list.


Appendix II. Installing APT-offline on Debian installations


==========

WARNING!!

Installing the .zip or tar.xz version of APT-offline on full installations might not produce a fully-functional installation of APT-offline, and you might need to re-install the OS in order to install APT-offline's .deb version. So, it's best to just forget about installing the .zip or tar.xz version on full installations, and to just install the .deb version.

==========

Ideally, APT-offline would be included by default in all relevant types of Linux (as it once was, and might still be, in Xubuntu, despite Xubuntu's minimalism), because installing it without a direct internet connection is complicated or impossible (such as on Ubuntu proper, at least in the past, due to its inability to install even individual packages without first updating the package index). Installing it with a slow connection is complicated, or simple but time-consuming. (However, Linux was apparently designed so that the internet connection can be interrupted while downloading packages, and the download process will resume where it was interrupted when the connection is re-established.) So, you should take any opportunity to install it, even if you don't think you'll need it, by means of a direct high-speed internet connection. That would be a simple matter of selecting the package-index sections to be included in the update (if the default selection doesn't include sections which you might need, or includes sections which you definitely don't need), then updating the local package index, and then entering "sudo apt install apt-offline-gui" (w/o quotes - you can copy the command and use Ctrl-Shift-V to paste it into the terminal), which would install both APT-offline's GUI and APT-offline itself, since the GUI needs APT-offline to do anything.

You might be able to get a temporary high-speed connection just for setting-up the installation (which would naturally include installing APT-offline), such as at a friend's place, your school (perhaps through a computer class or club), your workplace, or at an "install-fest" organized by local Linux users' groups. Modem drivers are automatically installed when creating installations of some types of Linux, but in Ubuntu, for example, you must select the "3rd party software" option during the installation process, or install a driver for your modem after creating the installation, which I was unable to do the last time I tried, although it might be easier now, assuming that you can find the right driver.

If you plan on using the installation as a secure installation, don't use it to access any sensitive data before performing an initial set-up with a direct internet connection. Thereafter, you would use APT-offline to make changes to it, to avoid compromising the security of any data which might be on it. With MX-Linux's MX Snapshot utility, you can turn your original installation into an ISO, and use the ISO to make other installations to use as working copies.

You could get a power inverter and set up your PC in your car (for purposes of performing the initial set-up), which might require the use of a cardboard hood to shield the monitor from sunlight. If you don't even have access to a car, you could power your PC with an uninterruptible power supply, such as the APC Back-UPS (BE425M), which costs approximately US $50 as of 2019, and would apparently provide sufficient power for a mini-PC and a monitor for sufficient time to perform changes to an installation, assuming a reasonably fast internet connection. As battery technology improves, the UPS-units with a given energy-rating will become more portable. A mini-PC with low power consumption would obviously be more suitable for these situations, and it could be used for obtaining the packages required to install APT-offline on a power-hungry desktop PC which would be impractical to turn into a portable PC.
 
> 4G Hotspots, USB Modems, etc.

If you can't tether your PC to your phone, and you can't find a good wi-fi hotspot, you could get a prepaid 4G hotspot, although your data allocation will probably expire before you use all of the data, if you rarely use the hotspot. I have a Moxee K779HSDL 4G hotspot from Straight Talk, which I've found to be a very reliable service provider. The K779HSDL's micro-USB power port doubles as a data port, allowing the use of a simple, secure wired data connection to the PC, which none of the hotspot's literature mentions, as far as I've been able to determine. If you get a hotspot, study its instructions, take notes on the relevant sections, and put all of the information you'll need where it will be easy for you to find when you need it.

The first step in setting-up a hotspot is to activate it, which involves setting up an account and buying some data. To do this, you'll need to provide your email address, perhaps a nickname (although unlikely), an account password (which you might have to enter on your phone), and the hotspot's IMEI/MEID/ESN (serial number), which is a 15-digit number which can be found in the quick-start guide or on the hotspot itself, and which you might want to store in a file on a flash drive or in a clipboard-manager app on your phone to avoid having to type it every time you need it.

Hotspots have an IP address (typically 192.168.0.1 or 192.168.1.1) which you would enter into your web-browser to access the modem's login page, where you would enter your user-name (typically "admin" unless you've changed it) and the administrative password (provided in the instructions, unless you've changed it). If you change the user-name and/or password and lose either, you can reset the modem to restore the default settings.

When you set up the wi-fi connection, use the most recent and secure encryption standard which all of the devices can use. As of 2022, WPA2-PSK is the most secure widely-adopted standard; WPA3-SAE is more secure, but not yet widely adopted. There's also a wi-fi password, which you'll have to enter into each device which you intend to connect to the modem via an encrypted wi-fi connection. The default wi-fi password can be found through the modem's instructions, and you can change it after logging into the administrative account.

> Installing APT-offline without a direct internet connection

The only way to install APT-offline on a new full installation on which the package index is locked until an update is performed, without using a direct internet connection, involves the use of a package-installer utility to install the packages required to install APT-offline on that installation.

You would first obtain a list of modules/packages which would have to be installed in order to install APT-offline, by going to the section of the Packages site (such as Debian Packages or Ubuntu Packages) which pertains to the release of interest, looking up APT-offline, and then referring to a list of all packages installed by default on that release (Distrowatch.com has such a list for every Linux release). Then you'd download the packages which are required to install APT-offline, but aren't installed by default, along with their checksum reference values, from the Packages site. (There are easier ways to simply download the files, but you need the checksum references to authenticate the files to ensure that they're safe to install.)

Although I've never gotten a corrupted package from a Packages site, I always calculate each file's checksum at home using the "sha256sum <path/filename.deb>" command built into Linux, and compare the result to the reference value from the Packages site, because installing corrupted software could be catastrophic to your system and your data (so keep backups). If the calculated and reference checksums are different, the difference won't be subtle, and just glancing at the two is sufficient to compare them. If you want to check the checksums immediately after downloading the files, and you're using an Android/iOS device, you could use a checksum-calculator app and compare the result to the reference checksum. If you're using a PC to download the files, you might be able to calculate a file's checksum by simply right-clicking on the file and making some selections in the menu which appears. But in any case, there's always the "sha256sum <path/filename>" command in Linux, and something similar in Windows. To avoid the need to enter a path in the command, open a terminal in the file's directory by right-clicking in the directory and selecting "open terminal here" or "open in terminal," then enter "sha256sum <filename.ext>", and compare the resulting value to the corresponding reference value. You can copy the file's name by right-clicking on it, then selecting Rename, then pressing Ctrl-A, then Ctrl-C.
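The comparison can be sketched as follows. This is a runnable illustration in which a throwaway file stands in for a downloaded .deb, and REF stands in for the reference value you'd copy from the Packages site (here it's just set equal to the calculated value so the sketch runs).

```shell
# Checksum-comparison sketch; example.deb is a stand-in, not a real package.
cd "$(mktemp -d)"
printf 'pretend this is a package\n' > example.deb

CALC="$(sha256sum example.deb | cut -d' ' -f1)"  # calculated checksum
REF="$CALC"   # in real use, paste the reference value from the Packages site

if [ "$CALC" = "$REF" ]; then
    echo "checksum OK"
else
    echo "checksum MISMATCH - do not install" >&2
fi
```

As the article says, a mismatch won't be subtle, so an eyeball comparison is also fine.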

If the checksums are good, install the packages by simply clicking on them, etc. When you try to install some package from the group, the installer might indicate that others in the group have to be installed first, which is why this method isn't suited for groups of much more than five packages, but in my experience APT-offline requires 7 packages at most, and as few as two. (Its GUI can require several additional packages, so I recommend installing APT-offline without the GUI, and then using APT-offline to update the package index and then to install the GUI along with any other software which you might need in the foreseeable future.) You can use a menu-editor to edit APT-offline GUI's menu-entry, including by selecting a different icon from the installation's collection of icons (in the menu-editor, click on the icon, and eventually the icon-directory will appear).


Appendix III. Using APT-offline to perform get-ops on a Windows device

Note: Due to security enhancements in Windows, it might not be possible to install and run Python and APT-offline on library PC's in the future. However, it will be possible to install and use it on your personal Windows tablet, using essentially the same procedures described below.

If a library PC (Windows PC) is used for the get-process, a USB flash drive would probably be used for transferring data between the Debian installation which is being changed and the Windows PC. The same change-name-folder (CNF)-based system described previously would be used for organizing the change-process, although the CNF's would be stored in a top-level folder named APT-offline or AOL, which would also contain such things as the Python interpreter-installer (.msi) file and the extracted APT-offline zip-file (which are required for installing Python and APT-offline on the PC at the beginning of each download-session, since library-PC's delete all changes made by the user at the end of the session). To reduce the get-operation to a simple routine that can be performed under distracting conditions, the APT-offline/AOL folder would be moved into the PC's Downloads directory during the get-operation (a backup copy of the folder would be stored elsewhere on the flash drive to avoid accidentally erasing your only copy). Moving this folder to the PC's Downloads directory would eliminate the need to edit the commands used for installing APT-offline and those used for performing the get-operation, which might otherwise be necessary because Windows assigns drive-letters to flash drives when they're plugged into the PC; if a different letter is assigned to a flash drive, the path to anything on the drive would be different, and any commands which refer to items on the flash drive would have to be edited.

> Required items

You'll need the following items to perform the get-process on the typical library-PC, all of which would be stored in the aforementioned APT-offline/AOL folder on the flash drive used for transferring data.

1) Python interpreter-installer (.msi file) from the Python-for-Windows download page (https://www.python.org/downloads/windows/). It's about an 18MB download. I recommend the 32-bit (i386) version of whatever happens to be the latest 3.x revision, because the 32-bit version should run on all PCs. You can try the 64-bit version, but even when it runs on a particular PC, it probably won't provide any noticeable performance advantage. Keep a copy of the .msi file on your APT-offline flash drive to have it handy whenever you want to perform a get-operation.

2) APT-offline .zip file from https://github.com/rickysarraf/apt-offline (click on green Download button, etc.), extracted. (I assume that if the latest version of Python 3.x is used, then the latest version of APT-offline can be used.)

3) The change-name folders (CNFs), each containing a signature file (SF) and a download-destination folder (DDF), pertaining to the desired changes to be made to the Debian installation(s) of interest. The CNFs would be stored in the top-level APT-offline (or AOL) folder, and each SF/DDF-pair would be moved out of its CNF into the APT-offline (or AOL) folder, one pair at a time, to be used in a get-operation, and returned to its CNF (with the downloaded/screened files in the DDF) when the get-operation is completed. (To allow the same get-command to be used all the time, the AOL folder would be moved to the Windows Downloads-directory.)

4) The following commands in a batch file named something like APT-offline-install.BAT, stored in the top-level APT-offline directory on the flash drive, for installing APT-offline and preparing to perform any get-operations to be performed. A batch-file is just a text file with a .BAT suffix, containing commands which could be executed manually in the same order in which they are listed in the batch file. Keep in mind that they are for a Windows PC, and thus use Windows-command syntax:

  a) cd <path to the Windows Downloads-directory>\APT-offline\apt-offline. (The idea is to change the default directory to the inside of the folder resulting from extracting the APT-offline .zip-file, which is presumably named "apt-offline" and contained in a folder named "APT-offline" in the Windows Downloads-directory, but if these items are named or organized differently, the cd-command would have to be modified accordingly.)

  b) C:\<Python directory>\python setup.py build  [<Python directory> is the Python directory created when Python is installed, such as C:\Python310 for Python 3.10.x (the exact name depends on the version and the options chosen during installation). By including the directory in the command, it doesn't matter whether the directory is included in the "Windows path," which is a list of directories which Windows automatically searches when executing commands. (Some library-PC's which allow Python to be installed don't allow it to be placed on the Windows path.)]

  c) C:\<Python directory>\python setup.py install

  d) cd ..\ (This command changes the default directory up one level to APT-offline, where each signature-file/DDF pair is placed to be used in a get-operation, which would be initiated by entering the following fixed get-command.)

5) The fixed get-command "apt-offline get apt-offline.sig -d DDF" in a batch file named something like "APT-offline-GetCmd.BAT" and stored in the APT-offline/AOL folder.


> Detailed get-process on Windows PC's/tablets

1) (Assuming that a USB flash drive is being used as the flash device and that a library PC which allows Python to be installed is being used as the download-device) Plug your flash drive with all the APT-offline-related files (listed above) into the Windows PC. If Python 3.x isn't installed, I recommend installing the 32-bit version of the latest 3.x revision (whatever the latest is when you read this) even if a previous revision is installed (assuming that you're allowed to install it), partly because the existing installation might have a problem, and installing it again is easy and will repair any problems with the previous installation. Installation is a matter of clicking on the file's icon to open it, and then clicking a few buttons in the resulting window.

2) Copy the APT-offline/AOL folder (which again, contains the Python interpreter-installer file, the extracted APT-offline .zip-file-folder, the batch files, and CNF's) to the Windows Downloads-directory, and to a backup-folder on the flash drive, to ensure that you can't accidentally erase your only copy.

3) Click on the APT-offline-installation batch-file (in the APT-offline/AOL folder in the Windows Downloads directory) to install APT-offline and to set the default directory to the get-process directory (the APT-offline/AOL folder).

4) Move the signature-file/DDF-pair to be used in the next get-operation from its CNF into the get-process directory (the APT-offline/AOL folder in the Windows' Downloads directory).

5) Click on the get-command batch-file to initiate the get-op. If all goes well, download-progress indicators will appear in the terminal window. Pay attention to the terminal in case something goes wrong, although most error-messages which indicate failure to find some file are due to differences in server-organization and can be ignored because the supposedly missing files are contained in other files which will be downloaded.

6) When the get-operation is complete, move the signature-file/DDF-pair back into its CNF and return to step 4.


Appendix IV. Some tips on using shell scripts

Shell scripts are essentially batch files on steroids, and they would be used instead of batch files (mentioned in Appendix III) when performing the get-process on a Linux PC, or on some portable Linux device. 

More specifically, a shell script is essentially a text file with an extension of .sh and permission to be executed as a program (which means that shell scripts must be stored in a Linux file system, such as Ext4, and can be executed only on a Linux installation), and whose first line is #!/bin/bash, which tells the system to run the script with the bash interpreter, which is stored in the /bin directory, a.k.a. system/bin or computer/bin, etc.

To give a file permission to be run as a program, close the file and right-click on its icon, then click on Properties in the menu which appears, and then select the Permissions tab in the window which appears. In that window, you'll find a check-box for allowing the file to be executed as a program.

But in order to select this permission, the file has to be stored in a Linux file system, such as Ext4.
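The same mechanics can be demonstrated from the terminal; chmod +x is the command-line equivalent of the file manager's Permissions check-box. This is a minimal runnable sketch with an arbitrary file name (hello.sh) and placeholder contents.

```shell
# Create a minimal shell script, mark it executable, and run it.
cd "$(mktemp -d)"
cat > hello.sh <<'EOF'
#!/bin/bash
echo "the get-process commands would go here"
EOF

chmod +x hello.sh   # equivalent of ticking the "execute as program" box
./hello.sh
```

A real get-process script would contain the fixed get-command(s) from the main text in place of the echo line.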

Another discovery is that in order to launch a shell script by clicking on it, you have to change a setting on the FILE MANAGER. To do this, open the file manager, then click on Edit, then Preferences, then Behavior, and then in the resulting window, under Executable Text Files, select whichever selection you prefer. (The exact procedure might be different for your installation, but it would probably be similar.) This setting would have to be set every time you boot a nonpersistent installation and want to run shell scripts by clicking on them.

The weirdest thing I discovered is that it's possible to click on a shell script and to be informed by the OS that the shell script doesn't exist. The specific error message was "Failed to execute child process X.sh (No such file or directory)." I eventually solved the problem by just copying the text into a new file in the same folder. I gather that the OS couldn't find the original file because it wasn't created via the proper channels, and consequently wasn't registered at some level of the system.

The use of an Ext4 file system also has ramifications for the process of creating signature-files. Until I realized that it would be necessary to use an Ext file system in order to use shell scripts, I recommended using a FAT file system for ease of use, since there's no need to be concerned with setting permissions when moving the flash drive between PCs. If APT-offline's set-operation creates a signature-file on a FAT-formatted flash drive, the signature file can be moved, altered, or deleted without obtaining superuser privileges, due to the nature of the FAT format. However, if it's saved to an Ext4-formatted drive directly, it's necessary to obtain superuser privileges to do anything to the resulting file (other than copy it) since the set-process saves the file with superuser privileges. So, when a flash drive or SD card with an Ext4 file system is being used, I recommend having APT-offline's set-process save signature files with the default name (apt-offline.sig) to the default location (the Home directory) every time, and then copying them to the flash drive used for the get-process, to gain control over them.
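The recommended copy step looks like this, with stand-in paths: HOMEDIR plays the role of your Home directory (where the set-op saved apt-offline.sig with its default name) and DRIVE plays the role of the Ext4-formatted flash drive's mount point.

```shell
# Sketch of regaining control of the signature file by copying it.
HOMEDIR="$(mktemp -d)"   # stands in for your Home directory
DRIVE="$(mktemp -d)"     # stands in for the Ext4 flash drive's mount point
touch "$HOMEDIR/apt-offline.sig"   # stands in for the set-op's output

# Copying (rather than moving) gives the copy your own ownership, so no
# superuser privileges are needed later to rename or delete it:
cp "$HOMEDIR/apt-offline.sig" "$DRIVE/"
```

On a real system, DRIVE would be something like /media/<user>/<drive-label>, which you can read off from your file manager.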

------------------------

Notes

[1] The typical types of changes include software-index updates (replacing the installation's internal software index with a newer copy of the online software index, which cannot be used until it has been downloaded and installed, for good reasons, although the latest information and software revisions are available from the Packages website pertaining to the release of interest). The installation's internal or "local" software/package index can be used until its expiration date (at which time the software manager will stop working until an update is performed), although if you want the latest software revisions, you'll need to perform an update after any revisions of particular interest have been released, and before generating a signature file to perform an upgrade or to install apps. (Signature files, and the process of generating them, are explained in the main text.) To upgrade individual applications without performing a general upgrade, I would update the software index and re-install the applications of interest.

Some applications, or at least the latest revisions, are available only from "PPA's," which are small, officially-sanctioned repositories maintained by the application's developers. But before using a PPA, it must be added to the installation's sources-list, which requires a direct internet connection to authenticate the PPA, so that the installation can retrieve the PPA's encryption key without revealing it. Then a package-index update would be performed, and the desired app from the PPA would be installed, both of which could be performed with APT-offline. (So, if you have a slow internet connection at home, you could use it to add PPAs to the sources-list, which doesn't require much data, and then use APT-offline to perform updates and install software. If only a few small modules are required in order to install APT-offline, you might be able to install it via the slow connection, by downloading the required modules from the relevant Ubuntu Packages or Debian Packages site, calculating their checksums and comparing them to the reference values provided by the Package site, and assuming that the checksums are correct, installing them by means of a package-installer such as GDebi  - see Appendix II for details.)

It is possible to install software from PPAs without first using a direct internet connection for authentication, but there are plenty of authoritative and well-written articles on that subject, so fortunately for everyone I don't have to try to explain it.

Revision Notes
 
5/21/22 - Rewrote the section entitled "> Storing and installing the downloaded modules/packages/files".

5/22/22 - Realized that in my haste to wrap things up yesterday, I had published an early draft of the "> Storing and installing the downloaded modules/packages/files" section-rewrite, found the correct version in the trash (I had fortunately resisted the impulse to totally delete it), realized that it still needed a lot of work, fixed it up, and substituted it for the mess which previously occupied the same space.

5/26/22 - A) Rewrote the introduction. B) Rewrote the paragraph beginning with "The repository (which consists of software packages/modules, and the software/package index)," to indicate that using an older package index is OK as long as it's not outdated, and as long as you don't need the latest software revisions. Also split up and rewrote the subsequent three paragraphs. C) Also made similar changes to the paragraph beginning with "The online package index is revised daily".

5/29/22 - Revised Note 1.

6/4/22 - Clarified the paragraph beginning with "To perform a get-op" (two places).

6/10/22 - Added section entitled "> Avoiding get-op interruption".

6/10/22a - Added reference to the Gkrellm system monitor to the section entitled "> Avoiding get-op interruption".