This blog is a subsidiary of toggwu.blogspot.com, and it is dedicated to information related to APT-offline, the ultimate tool for making changes to internet-access-impaired installations of Debian-type Linux, which includes Ubuntu and its derivatives.
Wednesday, August 24, 2022
Indications of key-update process during direct package-index updates found in Ubuntu
In a previous post, I indicated that repository keys are apparently updated during package-index updates performed with a direct internet connection, but without any indication to the user that the keys are being updated. (The keys are not updated when an update is performed via APT-offline, which explains the "cannot authenticate packages" warnings that appear when installing software after an APT-offline update.)
However, I've since created an installation of regular Ubuntu (which I've realized is much better than I previously indicated, although I still think that the workspace-switching scheme is excessive), performed a direct update on it via the terminal, and noticed indications that the keys were updated first. This confirms my suspicion that the keys are updated during direct package-index updates, although the terminal output still doesn't divulge every detail of the process.
Tuesday, July 19, 2022
How to authenticate package indices for PCs which cannot be connected to the internet
Rev 8/6/22 (see Notes)
Every package index includes a Release or InRelease file (actually probably one for each of several sections) which contains a digital signature that must be verified with the public key corresponding to the secret/private key used to create it, in order to authenticate the file, i.e. prove that it's an official package index. This file contains checksums for the rest of the package-index files, and the package-index files contain checksums for every software module/package in the repository. So, this constitutes a secure chain of evidence that the packages are official and safe to install. But even without digital signatures, there would be very little chance that anyone could create counterfeit package indices and substitute them for authentic ones, although there are situations in which it would be possible. In some cases, such as in Ubuntu and its "official flavors," some of the required keys are not included by default. If you create an installation from one of these distros, update the package index via APT-offline, and then, with the installation disconnected from the internet, enter "sudo apt install <app> -y" (where <app> is some large app which requires many packages), "cannot authenticate packages" (CAP) warnings will probably appear in the terminal because some of the keys are missing. I've tried various approaches to install the missing keys and eliminate the CAP-warnings, and the only one that works is to update the package index with a direct internet connection.
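For the curious, this chain of evidence can be inspected by hand. Here's a minimal sketch; the archive URL, release name (jammy), and keyring path are examples for a stock Ubuntu system, and this obviously assumes a connected machine:
wget http://archive.ubuntu.com/ubuntu/dists/jammy/InRelease   # InRelease is the clearsigned variant of Release
gpgv --keyring /usr/share/keyrings/ubuntu-archive-keyring.gpg InRelease   # verify the signature with the archive keyring
grep -A3 SHA256 InRelease | head   # the verified file lists checksums for the other index files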
After creating a full installation, first enter "echo 'Binary::apt::APT::Keep-Downloaded-Packages "1";' | sudo tee /etc/apt/apt.conf.d/10apt-keep-downloads" (copy the command without the beginning and end quotes, and paste it into the command-line with Ctrl-Shift-V) so that any software packages installed via a direct connection are retained in /var/cache/apt/archives instead of being deleted after installation, which is the default behavior in many types of Linux. (It's easier to just enter the command than to try to determine whether it's necessary.)
Next, the package index should be updated, and when this is done with a direct connection (instead of using APT-offline), the first thing that happens is that missing keys are installed and outdated keys are updated. (I realized this after finding debian-archive-keyring in /var/cache/apt/archives, although I had never entered a command to install it, so it was obviously installed in the background during an update.) However, something else happens which doesn't happen when these same keys are installed without a direct internet connection, and whatever it is makes the difference between success and failure. I have tried copying the contents of /usr/share/keyrings from an installation which had been updated ("sudo apt update") online (which eliminated "cannot authenticate packages"/CAP warnings from the online installation) to the same directory in an offline installation, but it didn't eliminate the CAP-warnings from the offline installation. I've also tried installing the missing keyring (debian-archive-keyring) both online and offline, and running "apt-key update" (the apt-key command will no longer be available as of Ubuntu 22.10) both online and offline, but none of these eliminated the CAP-warnings. So, obviously, something happens during an online update that doesn't happen during any of these unsuccessful approaches. Some articles indicate that it's necessary to associate each source in the /etc/apt/sources.list file with a particular key, but I checked the sources.list file after updating Kubuntu 22.04 online, and the list had the conventional format, so clearly this is not necessary, and Kubuntu's developers apparently considered it to be overkill. (It would provide a benefit only if the private keys of the official archive had been stolen, which is extremely unlikely. PPA-keys, on the other hand, might get stolen if the maintainer uses lax security practices.)
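For reference, the per-source key association those articles describe is the "signed-by" option in a sources.list entry. A hypothetical example (the keyring path, mirror, and suite are illustrative, not something my Kubuntu sources.list actually contained):
deb [signed-by=/usr/share/keyrings/ubuntu-archive-keyring.gpg] http://archive.ubuntu.com/ubuntu jammy main restricted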
So, as far as I have been able to determine, a direct connection is required to install or update keys so that the system can make use of them.
Once the proper keys are installed, the actual package-index update begins by attempting to verify the digital signature in each Release/InRelease file (the first package-index files to be downloaded) with the corresponding key, presumably successfully (assuming that the index isn't counterfeit, which is highly unlikely, but not impossible). Even if the proper keys aren't installed and the package index can't be authenticated, it would be installed anyway, because chances are that the authentication failure is due to missing or outdated keys. Still, I'd rather check the index with the proper keys to eliminate even the small chance that the index is counterfeit.
Linux Mint is one of the few Ubuntu-derivatives, if not the only one, in which all of the required keys are installed by default (apparently - I can't find any solid information on this subject), and the package index is not locked by default, so that you can use the package manager (APT) to generate a "download script" for APT-offline. (A download-script is a simple list of the URLs of the packages required to install APT-offline on that installation with the software configuration which existed when the script was generated. Installing APT-offline typically requires python3-magic to be installed first, so it's typically listed in APT-offline download-scripts, but if something else which requires python3-magic were installed first, python3-magic wouldn't be included in the APT-offline download-script.) Then, assuming that the installation doesn't have a direct internet connection, you would go to an access point and download the required packages from the Ubuntu Packages website (an interface to the repository), along with their reference SHA256 checksums, calculate their SHA256 checksums (right-click on each package, select Properties, and then select the tab with buttons for calculating various checksums) and compare each calculated value to the corresponding reference value. If they're different, you can tell with a glance.
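As a sketch, one way APT can produce such a download list from the terminal (assuming the package index is present and readable) is its --print-uris option, which prints the URL, file name, size, and checksum of everything a package would pull in, without downloading or installing anything:
apt-get install --print-uris -qq apt-offline   # list what would be fetched, one line per package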
So, Mint might be a good choice for an offline installation, assuming that it's 100% compatible with your PC. (Mint is notorious for its PC-compatibility issues, but there are lists of compatible PCs, and you can buy PCs with Mint pre-installed.) But you might prefer another type of Ubuntu without the hardware-compatibility issues, such as Kubuntu, on which you can install Kdenlive (the top-rated Linux video editor) with a mere 60MB download, as opposed to the 200-300MB required to install it on other types of Linux.
The problem is that there is apparently no way to update the keys on PCs which have installations of Ubuntu or its main derivatives and which cannot be connected to the internet, such as bulky and/or power-hungry desktop PCs which have no internet connection and are not practical to take to an internet access point. However, I've devised a work-around for such cases. There might be better solutions, but if so, they're very well hidden, and chances are that most people would be able to find some way to obtain a temporary direct high-speed internet connection for even a desktop PC, and avoid my convoluted solution.
My solution (although in practice not in this order) is to use APT-offline to obtain a copy of the package index, then to install a copy of it on an installation in which the package index has been updated by means of a direct internet connection, and then, with the installation disconnected from the internet, enter "sudo apt install <app> -y" for some large app which requires many packages (such as kdenlive). If the package index is counterfeit, you would receive CAP-messages which indicate that many of the required packages cannot be authenticated. In some cases, I've gotten CAP-messages related to just a couple of packages out of many which the app requires, which might mean that a key for a particular section of the package index is missing or outdated, although if an update had been performed, all of the keys should be available and up to date. But if just a couple of packages are affected, you could compare their checksums to the reference values on the corresponding download pages of the corresponding Packages site (i.e. Debian Packages or Ubuntu Packages). If you don't receive any CAP-messages, the package index obtained via APT-offline would have to be authentic, and you could install another copy of it on the "unconnectable" installation, and disregard any CAP-messages which will probably appear when you install software due to the unconnectable installation's missing or outdated keys.
There are two major approaches to accomplishing this goal. One involves taking a mini-PC to an access point, along with a 12VDC-110VAC power inverter or an uninterruptible power supply (if no power outlets are available at the access point), and perhaps a cardboard hood to shield the monitor from sunlight. The other approach involves a couple of trips to the access point with just a phone or a tablet (for details on installing and using APT-offline on an Android or iOS device, see APT-offline A-Z). If the keys in the distro which you use for your "unconnectable" PC are incomplete by default (i.e. in the source file, and when the installation is first created), you'll have to take the first approach. With Linux Mint, however (at least as of 20.3), the latter approach can be used. (But with Mint, the first step is to ensure that your PC is on some list of compatible PCs.)
A) Take mini-PC to access point
The most straightforward approach would be to take your "portable" PC [2], with a flash-drive installation [1] made from the distro that was used for the installation on the "unconnectable" PC, to the access point and perform a direct update using the default repository-settings (unless you really know what you're doing when changing the repository-settings). If some of the required keyrings are missing or outdated, the latest versions would evidently be installed automatically in the background (i.e. with no indication) during the package-index update, based on my experience. For example, I performed a direct update on a flash-drive installation, after entering the aforementioned command to retain downloaded software packages after installing them, and later found the debian-archive-keyring package in the software archive, although I didn't enter a command to install it. So, it must have been installed automatically in the background during the update.
Next, I'd install APT-offline and nmon [3], along with any other software I'd need in the foreseeable future, on the flash-drive installation. Then, if the installation doesn't have a good disk I/O monitor (such as a widget or a configurable system-monitor app), I'd launch nmon by opening a terminal and entering "nmon," then hit "d" to configure it to monitor disk I/O. This comes in handy when writing large amounts of data to a flash-drive installation, to avoid prematurely terminating the write-process, which can take longer than the terminal indicates. Normally, to avoid interrupting the write-process, flash drives are ejected and left plugged in until the ejection-process is complete, but flash-drive installations can't be ejected while the PC is running from them.
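For reference, the monitoring setup amounts to the following; sync is a standard command that I'll mention as a terminal-only alternative, since it returns only once buffered data has been flushed to the drive:
nmon   # launch the monitor, then press 'd' to toggle the disk I/O panel
sync   # optionally, run after a large write; it blocks until pending writes are actually on the drive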
Then I'd use APT-offline to generate a signature file for an update (open a terminal and enter "sudo apt-offline set apt-offline.sig --update," which would create the signature file and place it in the Home directory), and then put the signature file and a folder named DDF (download-destination folder, to contain the downloaded packages) in a folder named "CNF-<distro-abbrev>-update-<date>" (such as "CNF-Kub2204-update-M-D-Y"). CNF stands for "change-name folder," which is part of a system I devised for using APT-offline efficiently, even when using Android/iOS devices for the get-operation - see APT-offline A-Z for details.
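Put together, the setup step might look like this (the folder name, with its example date, just follows the naming scheme above; the signature file lands in whatever directory the terminal is opened in):
sudo apt-offline set apt-offline.sig --update   # generate the signature file for a package-index update
mkdir -p CNF-Kub2204-update-8-24-22/DDF   # create the change-name folder and the download-destination folder inside it
mv apt-offline.sig CNF-Kub2204-update-8-24-22/   # file the signature away next to the DDF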
Next, I'd open a terminal in the CNF and enter a generic get-command ("apt-offline get apt-offline.sig -d DDF" - there is no "sudo" in get-commands). This would download the package-index files and screen them for errors using the reference checksums and file sizes in the Release or InRelease file; if a file passes its screening-test, it is renamed with the name listed in the signature file after the file's URL (the name which it must have in order to be installed) and placed in the DDF.
The final step in updating an installation via APT-offline is to install the downloaded package-index files. To do this, you would normally transfer the flash drive which contains the downloaded files (in the DDF in the relevant CNF) to the target-installation, then use the target-installation to open a terminal in the CNF and enter "sudo apt-offline install DDF," while keeping an eye on the disk I/O monitor to see when the installation is actually complete. But in this case there is no need to transfer the device containing the downloaded files, since it's already connected to the target-installation.
Once the package index obtained via APT-offline is installed, I would perform a mock installation (without an internet connection) of some large app that requires many packages and that's not already installed, by entering "sudo apt install <app> -y." If any CAP-messages appear which indicate that packages in general cannot be authenticated, the package index would be counterfeit, but otherwise it would be authentic, in which case another copy of the package index could be installed on the unconnectable installation, and software could be installed on it without regard for the CAP-messages caused by its missing or outdated keys.
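The check itself is a single command. A sketch, using kdenlive as the large test app (any big, not-yet-installed app would do); disconnect from the internet before running it:
# If the index is genuine, apt fails only on the (unreachable) downloads;
# a counterfeit index would instead trigger cannot-authenticate warnings
# for the packages in general, before any download is attempted.
sudo apt install kdenlive -y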
B) Multiple trips to access point with phone or tablet
Based on my recent experience, the keys in Ubuntu and all of its "official flavors" are incomplete until the relevant installation is updated via a direct internet connection. Linux Mint is the exception, as far as I know, at least as of 20.3, so it might be the best distro for your unconnectable PC (assuming that the PC is 100% Mint-compatible). In that case, you could use APT to identify the packages required to install APT-offline by performing a mock installation as if you had a direct internet connection; APT would indicate what it tried to install but couldn't due to the lack of an internet connection. Then you would download those packages from the Ubuntu Packages site, along with their checksums. (Copy the checksums from the download-page into a text file, because although the Ubuntu Packages download-pages with the checksums can be downloaded like a regular html page, they can't be opened with a web browser. LibreOffice Writer will open them, but it's not as convenient as copying the desired information and saving it in a text file.)
Then you'd check the checksum on each of the packages required to install APT-offline (right-click on each one, select Properties in the menu which appears, and then select the tab which calculates checksums, and the rest is obvious), and if they match the reference values from the Packages site, you would install them by clicking on each of the packages, and following the package-installer's instructions. The next step would be to use APT-offline to perform an update, as described above, and once the update is performed, you could install any software you would need for the foreseeable future. By the time the keys expire, I assume that there would be another Mint release with the new keys.
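If you prefer the terminal to the right-click route, the same check can be done with sha256sum, run in the folder containing the downloads:
sha256sum *.deb   # compare each line against the reference value saved from the Packages site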
If you don't know whether any of the keys required by your installation are missing by default, you could install APT-offline using the package installer, then use APT-offline to update the package index, and then use APT itself to perform a mock installation (without an internet connection) of some large app that's not already installed. Then if you get any CAP-messages indicating that none of the packages can be authenticated, you'll know that you'll have to perform a direct update to install or update the keys, and if you have an unconnectable installation of the same type, you'd have to re-install the package index obtained via APT-offline onto the directly-updated installation and perform another mock installation to check for CAP-messages which indicate that packages in general can't be authenticated. If there are none, the package index would have to be authentic, and you could install it on the unconnectable installation and disregard the CAP-messages which I assume it would produce due to disabled or outdated keys.
So, it's a convoluted process, but being able to use Ubuntu or one of its derivatives (I prefer the derivatives) on an unconnectable desktop PC, for example, is worth the wade.
===========
Notes
8/6/22 - A) In the fourth paragraph, added a list of unsuccessful methods which I used in an attempt to install the required keys. B) Cleaned up some lousy writing in a few places and tweaked several passages. C) Revised Note 1 and added Note 3.
[1] 16GB Sandisk USB3 Ultra-FITs are good for Linux installations, due to their combination of reliability, price, performance and power consumption. Their problem is that they have plastic connector-shrouds, which wear out, but you could effectively give them a metal connector-shroud by plugging them into a USB3 M-F adapter and just leaving them plugged in. Sandisk's small, metal-cased Ultra-Luxe drives are also quite fast, although they seem to dissipate quite a bit of heat, and the only 16GB version which I tried didn't last very long. (Still, other 16GB units might run just on the warm side, and provide excellent performance and reliability.) Sandisk also makes a small, metal-cased "Flair" drive which might be good for Linux installations, although the one I tried didn't last very long at all. So, Sandisk seems to have a problem getting its chips into small metal cases. I've tried a few other major brands, and other than a 16GB Kingston USB3 DataTraveler (no longer available), they were DOA, or Linux wouldn't even run on them. Cheap Chinese drives are notoriously unreliable because they're made from 2nd-rate chips from the periphery of the wafer, so I wouldn't put an installation on them. An article which recommends various flash drives for Linux installations recommends Sandisk Ultra-FITs and PNY Turbos, among others which are either expensive, or USB2 drives which are good for live installations but not full installations. But Amazon reviews indicate that PNY Turbos have a high failure rate. HP flash drives (which are made by PNY) might be more reliable, but they're not cheap.
[2] Laptops and Chromebooks might be convenient, but I doubt that you can use either to create a flash-drive Linux installation and boot from it, which can be done with mini-PCs. Mini-PCs can also be configured as air-gap PCs and connected to a keyboard and monitor through a KVM switch, to share the keyboard and monitor with other PCs, including an online mini-PC and perhaps a desktop PC for heavy-duty data processing. (For details on creating a cheap air-gap PC, see my post on the subject.) The potential problem with mini-PCs is providing them with power at the access point, which can be done with a 12VDC-110VAC power inverter plugged into a car's cigarette lighter, or with an uninterruptible power supply with sufficient capacity to power the PC for about 15 minutes, if no power outlets are available.
[3] Nmon is an excellent, terminal-based system monitor which uses bar-graph indicators and can be easily configured to monitor various processes, although your installation might already have a good widget for monitoring disk I/O (Kubuntu 22.04's is the best in my experience), and then there's Conky with the MX-Linux standard configuration, which I covered in one of my posts, and GKrellM.
Monday, April 18, 2022
Software developer casts doubts on the future of Flatpak, etc.
Rev 4/19/22
On a lark, I decided to try Amberol, a simple music player which is currently available only in the form of a Flatpak. After downloading what must have been about 100MB, which was just the first of six parts, I decided that I was no longer interested in it, and terminated the installation. I've been using the MOC player, although after fiddling with Clementine (standard in MX-Linux XFCE) for a few minutes, I realized that it provides the option of dragging files and folders onto it to play them and put them in a queue, and the ability to clear the queue with a click. By default, it displays a periodic spectrum analysis, but it can display other types of waveform analyses, or none, which I chose in order to reduce the use of system resources, although the analyzer wasn't much of a load on the system. So, for my purposes, Clementine is a great choice. Something like Amberol would be better, if it were available as a Debian package, instead of as a separate OS to run on top of the existing OS.
I decided to see if I'm the only one who thinks that the amount of data required to install Flatpaks is ridiculous, and found a 2021 article entitled Flatpak is Not the Future at ludocode.com, written by an experienced software developer, who indicates that Flatpak, Snappy, etc. are causing more problems than they're solving, and that the solution is to standardize the libraries (the OS-modules used by apps) across Linux distributions, which is supposedly taking place to some extent.
I also looked into the global prevalence of internet access, and found that there are still billions of people on the planet without good internet access, and for them, Debian-type Linux, which includes APT-offline, is a good option. There's also the option of using a laptop or a Chromebook to run Linux, and to take the device to some internet access point (such as a wi-fi hotspot) to access the internet and to make changes to the installation. However, Linux laptops tend to be expensive. (Netbooks, which were cheap little Linux laptops with the potential for extreme security, have apparently been banned, because they're no longer available despite their popularity when they were available, or precisely because of it.) But Chromebooks which can run Linux apps have reasonable prices.
But if you don't like laptops/Chromebooks for some reason (I don't like the cost, and all the compromises made to fit everything into the case), and would rather use a PC that remains at home, and use a phone or a tablet to download web pages to view at home, and to download software/data modules to make changes to the home PC's installation, you could do the former with apps designed for that purpose (since web-pages for phones aren't the same as those for PCs), and the latter with APT-offline.
Tuesday, March 15, 2022
Installing APT-Offline on MX-Linux without an internet connection
3/19/22
Because updating the package index in MX-Linux requires an approximately 80MB download for the XFCE version and 120MB for the KDE version, I wanted to be able to download the package index and the software once and reuse them. So, I wanted to use APT-offline, because when you update an installation with APT-offline, it downloads the package-index files into a folder created by the user for those files only, and it can install them on as many installations (of the same type and version) as you like.
However, there's a catch: APT-offline isn't included with MX-Linux by default, and the software/package manager wouldn't tell me what packages I had to install in order to install APT-offline, because the package manager is locked on new installations (including on live installations) until the package index is updated. So, it looked as if I would have to update the package index by means of a direct internet connection, in order to unlock the package manager, and then install APT-offline and use it to download the package-index files into a folder to retain them. (During a normal package-index update-process, the downloaded package-index files are automatically deleted after being extracted and installed, although I suppose there's some secret, convoluted command which would prevent them from being deleted.)
Then it occurred to me that I could go to the Debian Packages site, find the list of packages required to install APT-offline on MX-Linux 21 (which is based on Debian 11, a.k.a. Debian Bullseye), and then determine which of these had been installed when I installed APT-offline on another installation by means of a direct internet connection. Assuming that only a few packages would be required, I could copy them from the aforementioned installation (thus avoiding the need to validate them, since they had been validated during the normal installation process) and install them with a package-installer such as GDebi, by simply clicking on them. It turned out that, in order to install APT-offline on MXL-21, only two packages totaling 65KB are required: python3-magic, and APT-offline itself (not APT-offline-GUI, which requires several dependencies and would be a pain to install by means of a package-installer). So, I obtained copies which I had installed on another installation by means of a direct internet connection, and which therefore had been validated, and installed them on live versions of MX-Linux 21 XFCE and KDE by just clicking on them (python3-magic first, since APT-offline depends on it), as a test, and found that it worked.
But in general, you could go to the relevant Packages site (such as Ubuntu Packages), get the list of "depends" (required) files for installing APT-offline, and then check Distrowatch's list of packages installed by default on the release of interest. Then you'd go back to the relevant Packages site and download those that aren't installed by default, calculate the checksums of your copies, compare them to the reference values on the Packages site, and if they match, install them by simply clicking on them. The installer would then let you know whether they had already been installed, or whether another of the packages which you had downloaded and screened would have to be installed first. The list would probably be small for APT-offline, so this approach would probably be practical in any case.
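As a sketch of the terminal route (the package-installer route described above works too), dpkg can install downloaded .deb files directly; the order matters because APT-offline depends on python3-magic, and the wildcards assume the downloaded files are in the current directory:
sha256sum python3-magic_*.deb apt-offline_*.deb   # verify against the reference checksums first
sudo dpkg -i python3-magic_*.deb   # install the dependency before the package that needs it
sudo dpkg -i apt-offline_*.deb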
> Creating a signature file for a package-index update
Once APT-offline is installed, you would tell it to create a signature file [1] named apt-offline.sig for a package-index update, by simply opening a terminal wherever you want to put the signature file (right-click in the chosen directory, and select "open terminal here" or "open in terminal") and entering "sudo apt-offline set apt-offline.sig --update." Next, you'd create an "APT-offline change-name folder" named "AOL-CNF-<PC-designation>-<type of installation, such as MX21-X for MX-Linux 21 XFCE>-update-<date>," which would provide you with sufficient information to identify the purpose of the signature file and its age. You would put the signature file inside the CNF-folder, along with a generically-named download-destination folder named DDF to receive the downloaded/screened files produced by the get-process, which is performed with some device with a 4G or better internet connection and an installation of APT-offline (such as Ubuntu installed on top of the Android app UserLAnd - see APT-offline A-Z at AnAptOfflineBlog for details).
> Creating a signature file for installing apps
To create a signature file to install apps, you would enter "sudo apt-offline set apt-offline.sig --install-packages," followed by a list of apps, using their official package-names without caps or spaces, separated by commas ONLY (no spaces). To get the software modules/packages/files, you would process the signature file exactly as you would for getting package-index files - put it in an appropriately-named CNF along with a DDF, transfer the CNF to the download-device, etc.
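For example, a signature file covering two apps (the app choices here are hypothetical) would be created like this - note the comma with no space:
sudo apt-offline set apt-offline.sig --install-packages vlc,kdenlive   # one signature file listing everything both apps need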
> Rationale for my system for using APT-offline
By using generically-named signature files and download-destination folders, and always putting them in the same directory on a phone before initiating the get-process, the get-process can be initiated with a fixed get-command, such as "apt-offline get <pathGPD>/apt-offline.sig -d <pathGPD>/DDF," where pathGPD is the path to the "get-process directory" (any stable, convenient directory into which each apt-offline.sig/DDF pair would be transferred for use in a get-operation, and then returned to their CNF). Once you enter this command into a terminal, you can execute it again by pressing the up-arrow key (or whatever key has this effect) until the command appears on the command-line, and then pressing Enter, which is easy to do even on a phone. Then, if you have another signature file to process, you could move the processed apt-offline.sig/DDF pair back to its CNF, using an Android file manager such as X-plore (my favorite), then move the next apt-offline.sig/DDF pair into the GPD, and perform another get-op by re-entering the fixed get-command.
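Concretely, if the get-process directory were, say, /sdcard/GPD (a hypothetical Android path; the actual mount point depends on the device and on how UserLAnd exposes storage), the fixed command would be:
apt-offline get /sdcard/GPD/apt-offline.sig -d /sdcard/GPD/DDF   # same command every time; only the GPD's contents change between get-ops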
> Just ignore get-op error-messages
You can probably safely ignore the error-messages which typically appear in the terminal during the get-process, because they typically indicate that APT-offline has given up on one approach and is taking another (due to differences between repository-structures), and in my experience it has always found a combination of approaches that works.
> Installing packages obtained via APT-offline
Then you would connect or copy the CNF to the target-installation, open a terminal in the CNF, and enter "sudo apt-offline install DDF." This would install the package index, which could take a while on a flash-drive installation due to the size of the index after extraction (120MB for XFCE, and 200MB for KDE, as I recall) and the typically slow write-speed of flash drives. So, wait for the terminal to return a command-prompt, indicating that it's done.
To install software packages, you would first "add" them to the installation (put them in the "archives" directory, either manually or by using APT-offline to perform an install-op on the folder containing the software), after which you would actually install them by performing a normal installation-process for each of the apps listed in the set-command which created the signature file (and hopefully listed in an abbreviated manner in the CNF's name). The packages should be saved in an archive-backup containing all of the software packages which you have installed on that installation, along with the latest package-index-file download which you have installed on it, because to install the software specified by a particular set of package-index files, you'll have to install those same package-index files. If you forget what's included in the folder, you might have to search through a lot of files to find the actual app-packages, so it might be a good idea to keep a list of installed apps along with the software and index files.
If you want to install more software on the same installation, you could try your old copy of the package index, but you might find that you have to update it, such as if some of the packages specified by your old package index are no longer available from the repository. When you get the packages for the new software, you would add them to your archive-backup which contains all of the software packages you have installed, and your most recent copy of the package index.
If you create a new installation, and try to install your saved software using your latest copy of the package index, you might find that you have to update some of the software to be consistent with the package index. Assuming that the packages are still available from the online software repository for that type and version of Linux, you would then get them and install them, but if they aren't available, you'd have to update the package index and then go through the process of installing all of the software again, which might have the effect of updating some of the software modules (or, if you're using APT-offline to install the software, listing the updated software modules in the signature file). Avoiding this convoluted process is why it's best to install all of the software you'll need at once, but you can't always anticipate what you'll need. If just a few packages are required to add an app, you could try to install them by means of a package-installer, without updating the package index, as described above for installing APT-offline. But if there are conflicts between the requirements for the existing and desired software, it might not be practical to resolve them without using the package manager.
Even if you never want to install more software, but just want to keep making new installations indefinitely (theoretically), the package-index files will probably become too old to use at some point, and the package manager would refuse to install them on a new installation, even though you weren't planning on using it to access the (online) repository. So, if you want to continue to create additional installations without downloading package-index files every time, you would have to get new package index files with APT-offline and save them. You could try to use the same software with the new package index, but you might have to get some upgraded software packages. Then, each time you end up having to download more software packages, you'd add them to your backup-archive for that installation. You wouldn't have to delete packages that have been superseded by updated versions, because you can copy them to the new installation (to /var/cache/apt/archives), and the package manager would just ignore them.
If you want to get fancy, such as to download vast amounts of data (such as an OS-upgrade) very quickly (assuming that you have a suitable internet connection), there are options which you can use with the get command to optimize the get-process. These options are listed on the apt-offline(8) "manpage" (manual-page).
Notes
[1] An APT-offline signature file is a specialized text file which contains a list of URLs of files to download, combined with security-related information such as checksums, file sizes, and names to assign to the files if they pass their tests.
Because updating the package index in MX-Linux requires a download of roughly 80MB for the XFCE version and 120MB for the KDE version, I wanted to download the package index and the software once and reuse them. APT-offline is the tool for this, because when you update an installation with it, it downloads the package-index files into a folder created by the user for those files only, and it can then install them on as many installations (of the same type and version) as you like.
However, there's a catch: APT-offline isn't included with MX-Linux by default, and the software/package manager wouldn't tell me which packages I had to install in order to install APT-offline, because the package manager is locked on new installations (including live installations) until the package index has been updated. So, it looked as if I would have to update the package index by means of a direct internet connection just to unlock the package manager, and then install APT-offline and use it to download the package-index files into a folder for reuse. (During a normal package-index update, the downloaded package-index files are automatically deleted after being extracted and installed, although I suppose there's some obscure command which would prevent them from being deleted.)
Then it occurred to me that I could go to the "Debian packages" site, find the list of packages required for installing APT-offline on MX-Linux 21 (which is based on Debian 11, a.k.a. Debian Bullseye), determine which of them had actually been installed when I installed APT-offline on another installation by means of a direct internet connection, and, assuming that only a few packages would be required, copy them from that installation (thus avoiding the need to validate them, since they had been validated during the normal installation process) and install them with a package-installer such as GDebi, by simply clicking on them. It turned out that only two packages, totaling 65KB, are required to install APT-offline on MXL-21: python3-magic, and APT-offline itself (not APT-offline-GUI, which requires several dependencies and would be a pain to install by means of a package-installer). So, I obtained copies which I had installed on another installation by means of a direct internet connection, and which had therefore been validated, and installed them on live versions of MX-Linux 21 XFCE and KDE by just clicking on them (python3-magic first, since APT-offline depends on it), as a test, and found that it worked.
But in general, you could go to the relevant Packages site (such as Ubuntu Packages), get the list of "depends" (required) packages for installing APT-offline, and then check Distrowatch's list of packages installed by default on the release of interest. Then you'd go back to the Packages site, download those that aren't installed by default, calculate the checksums of your copies, compare them to the reference values on the site, and, if they match, install them by simply clicking on them. The installer would then let you know whether a package had already been installed, or whether another of the packages which you had downloaded and screened would have to be installed first. In any case, the list would probably be small for APT-offline, so this approach should be practical.
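As a minimal sketch of the screening step - the filenames and version numbers here are hypothetical, and GDebi's command-line form (from the gdebi-core package) is used in place of clicking:

    # Calculate the SHA256 checksum of each downloaded package, and compare
    # it by eye to the reference value on the Packages site:
    sha256sum python3-magic_0.4.20-3_all.deb    # hypothetical filename

    # If the checksums match, install the packages (dependency first):
    sudo gdebi python3-magic_0.4.20-3_all.deb
    sudo gdebi apt-offline_1.8.2-2_all.deb      # hypothetical filename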
> Creating a signature file for a package-index update
Once APT-offline is installed, you would tell it to create a signature file [1] named apt-offline.sig for a package-index update, by simply opening a terminal wherever you want to put the signature file (right-click in the chosen directory, and select "open terminal here" or "open in terminal") and entering "sudo apt-offline set apt-offline.sig --update." Next, you'd create an "APT-offline Change-Name Folder" (CNF) named "AOL-CNF-<PC-designation>-<type of installation, such as MX21-X for MX-Linux 21 XFCE>-update-<date>," which would provide sufficient information to identify the purpose of the signature file and its age. You would put the signature file inside the CNF, along with a generically-named download-destination folder named DDF, which would receive the downloaded/screened files produced by the get-process, performed with some device with a 4G-or-better internet connection and an installation of APT-offline (such as on Ubuntu installed on top of the Android app Userland - see APT-Offline A-Z at AnAptOfflineBlog for details).
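A minimal sketch of those steps, with a made-up PC-designation ("PC1") and date:

    # Create the signature file for a package-index update:
    sudo apt-offline set apt-offline.sig --update

    # Create the CNF with a DDF inside it, and move the signature file in:
    mkdir -p AOL-CNF-PC1-MX21-X-update-8-6-22/DDF
    mv apt-offline.sig AOL-CNF-PC1-MX21-X-update-8-6-22/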
> Creating a signature file for installing apps
To create a signature file to install apps, you would enter "sudo apt-offline set apt-offline.sig --install-packages" followed by a list of apps, using their exact package names, without caps or spaces, separated by commas ONLY (no spaces). To get the software modules/packages/files, you would process the signature file exactly as you would for a package-index update - put it in an appropriately-named CNF along with a DDF, transfer the CNF to the download-device, etc.
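For example, to create a signature file for installing two hypothetical apps (note the commas with no spaces, per the format described above):

    # List the apps by their exact package names, comma-separated:
    sudo apt-offline set apt-offline.sig --install-packages vlc,gimp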
> Rationale for my system for using APT-offline
By using generically-named signature files and download-destination folders, and always putting them in the same directory on a phone before initiating the get-process, the get-process can be initiated with a fixed get-command, such as "apt-offline get <pathGPD>/apt-offline.sig -d <pathGPD>/DDF," where <pathGPD> is the path to the "get-process directory" (GPD - any stable, convenient directory into which each apt-offline.sig/DDF pair is transferred for use in a get-operation, and then returned to its CNF). Once you've entered this command into a terminal, you can execute it again by pressing the up-arrow key (or whatever key has this effect) until the command appears on the command-line, and then pressing Enter, which is easy to do even on a phone. Then, if you have another signature file to process, you could move the processed apt-offline.sig/DDF pair back to its CNF using an Android file manager such as X-plore (my favorite), move the next apt-offline.sig/DDF pair into the GPD, and perform another get-op by re-entering the fixed get-command.
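For instance, if the GPD on the phone were ~/aol (a hypothetical path), the fixed get-command would be:

    # The same command works for every get-op, since the sig-file and DDF
    # names never change - only their contents do:
    apt-offline get ~/aol/apt-offline.sig -d ~/aol/DDF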
> Just ignore get-op error-messages
You can probably safely ignore the error-messages which typically appear in the terminal during the get-process; they usually just indicate that APT-offline has given up on one approach and is trying another (due to differences between repository-structures), and in my experience it has always found a combination of approaches that works.
> Installing packages obtained via APT-offline
Then you would connect or copy the CNF to the target-installation, open a terminal in the CNF, and enter "sudo apt-offline install DDF." This would install the package index, which could take a while on a flash-drive installation due to the size of the index after extraction (120MB for XFCE, and 200MB for KDE, as I recall) and the typically slow write-speed of flash drives. So, wait for the terminal to return a command-prompt, indicating that it's done.
To install software packages, you would first "add" them to the installation (put them in the "archives" directory, i.e. /var/cache/apt/archives, either manually or by using APT-offline to perform an install-op on the folder containing the software), after which you would actually install them by performing a normal installation process for each of the apps listed in the set-command which created the signature file (and, hopefully, listed in abbreviated form in the CNF's name). The downloaded packages should be saved in an archive-backup containing all of the software packages you have installed on that installation, along with the latest package-index download you have installed on it, because installing the software specified by a particular set of package-index files requires installing that same set of package-index files. If you forget what's included in the folder, you might have to search through a lot of files to find the actual app-packages, so it's a good idea to keep a list of installed apps along with the software and index files.
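A minimal sketch of the add-and-install sequence, assuming the set-command listed a single hypothetical app named vlc:

    # From a terminal opened in the CNF: "add" the screened packages to the
    # installation (APT-offline copies them into /var/cache/apt/archives):
    sudo apt-offline install DDF

    # Then perform a normal installation; the package manager finds the
    # packages in its cache instead of trying to download them:
    sudo apt install vlc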
If you want to install more software on the same installation, you could try your old copy of the package index, but you might find that you have to update it - for example, if some of the packages specified by your old index are no longer available from the repository. When you get the packages for the new software, you would add them to the archive-backup which contains all of the software packages you have installed, along with your most recent copy of the package index.
If you create a new installation and try to install your saved software using your latest copy of the package index, you might find that you have to update some of the software to be consistent with the index. Assuming that the packages are still available from the online repository for that type and version of Linux, you would then get and install them; but if they aren't available, you'd have to update the package index and then go through the process of installing all of the software again, which might have the effect of updating some of the software modules (or, if you're using APT-offline to install the software, of listing the updated modules in the signature file). Avoiding this convoluted process is why it's best to install all of the software you'll need at once, but you can't always anticipate what you'll need. If just a few packages are required to add an app, you could try to install them by means of a package-installer, without updating the package index, as described above for installing APT-offline. But if there are conflicts between the requirements of the existing and desired software, it might not be practical to resolve them without using the package manager.
Even if you never want to install more software, but just want to keep making new installations indefinitely, the package-index files will probably become too old to use at some point, and the package manager will refuse to install them on a new installation, even if you weren't planning to use it to access the (online) repository. So, if you want to keep creating installations without downloading package-index files every time, you'll eventually have to get new package-index files with APT-offline and save them. You could try to use the same software with the new package index, but you might have to get some upgraded software packages. Then, each time you end up having to download more software packages, you'd add them to your backup-archive for that installation. You don't have to delete packages that have been superseded by updated versions, because you can copy them to the new installation (to /var/cache/apt/archives) and the package manager will just ignore them.
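For example, assuming the backup-archive is a folder of .deb files (the path here is hypothetical):

    # Copy the entire backup-archive into the package manager's cache;
    # superseded versions are simply ignored:
    sudo cp ~/backup-archive/*.deb /var/cache/apt/archives/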
If you want to get fancy - such as to download vast amounts of data (such as an OS-upgrade) very quickly, assuming that you have a suitable internet connection - there are options which you can use with the get-command to optimize the get-process. These options are listed on the apt-offline(8) "manpage" (manual page).
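For instance, the manpage lists a --threads option for parallel downloads and a --bundle option for zipping everything into a single archive; verify the exact option names with "apt-offline get --help" before relying on them:

    # Download with several parallel threads, and bundle the results into
    # one zip file instead of a loose folder:
    apt-offline get apt-offline.sig --threads 4 --bundle apt-offline-bundle.zip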
Notes
[1] An APT-offline signature file is a specialized text file which contains a list of URLs of files to download, combined with security-related information such as checksums, file sizes, and names to assign to the files if they pass their tests.