Sunday, March 17, 2024

New APT-offline-GUI icon


4/19/24 (see Notes)

===================

Warning

There are potential security risks involved in connecting a phone (or, for that matter, any device with wireless capability) to an "air gap" PC. Even if the phone's wireless capability is supposedly disabled via software, it's possible that malware could re-enable it, perhaps in burst-mode, and even if it remained disabled, there would still be the possibility of malware on the phone accessing data on the PC. To be certain that hackers and malware can't access the PC via the phone, a "dumb" storage device such as a flash drive or SD-card should be used to relay data between the PC and the phone. Another possibility would be to use one MX-Linux installation solely for making changes and for creating duplicate installations, and to use the duplicates for accessing/processing data (details below), so that the installation which is connected to the phone would contain no data.

====================

My original APT-offline-GUI icon was too abstract and had a crude appearance by icon-standards, so I devised a replacement which expresses the essential aspects of using APT-offline with a phone as the internet-access device, and I used a combination of standard icons and shapes to give it a more polished appearance. It's more complex than the typical icon, but APT-offline is so unique that it can't be lumped in with anything else and symbolized by a simple icon. The details are too small to be seen in a small icon, but they could be seen by increasing the icon-size, which is an option wherever icons are used.

So, in the unlikely case that the icon were incorporated into APT-offline-GUI, and APT-offline were included in a Linux release, users could examine the icon and get a basic idea of what APT-offline does. They might realize that it would allow them to create a totally offline, totally secure installation, and to make changes to it without compromising its security. [1]   

In order to automatically appear in a panel or dock when the GUI is launched, and to be pinned in place to create a launcher in the panel or dock, the icon would have to be assigned an official name to avoid conflicts with other icon-names, the name would have to be incorporated into the APT-offline-GUI source code, and the icon would have to be placed in the appropriate icon-directory as part of the APT-offline-GUI installation-process. As it is, a generic icon is used.

In the new icon, the black arrow symbolizes the exchange of data between the PC and phone. This data consists of a) "signature files" (SFs), which contain lists of URLs of files to download, along with information which allows the downloaded files to be validated [2]; and b) the resulting files, which the phone's installation of APT-offline, running in "get" mode, downloads, validates, and stores in a folder reserved for the files listed in the corresponding SF, perhaps after the phone has been taken to a suitable access-point, such as an area with 5G coverage or a fast wifi hotspot.

Error messages about files not being found are common during the download-process, but they're typically insignificant because signature files contain redundant listings to compensate for differences in server-organization, and if a file isn't found in one location on a particular server, it's found in another.

The downloaded files consist of software modules/packages, and/or "package index" files. A complete set of package index files is known as a package index, which contains information on all of the packages in the relevant online repositories, including which packages are required by each app. In some types of Linux, certain subsets of the online package index can be downloaded and installed, to reduce the amount of data required for an update, although I would caution against de-selecting any sections from the default selections unless you really know what you're doing, because you might end up needing the de-selected sections to make the desired changes to the PC. For example, I once de-selected the updates-sections (an option in Ubuntu at the time), because I was just installing apps, and it caused a problem because one of the apps I was installing required packages which were listed in one or more of the updates-sections. So, I decided to restore the updates-sections, without knowing whether it would solve the problem, and fortunately it did.

In Linux, a software-installation cycle consists of downloading package index files, then installing them, then specifying the desired software (such as apps or a software-update, which updates the OS and apps, and can require hundreds of MB), which causes the software/package manager to refer to the "local" (installed) package index and generate a list of packages (software modules) which must be downloaded in order to install the specified software. Some of the modules might have already been installed, and wouldn't have to be installed again, which is partly why the list must be generated on the installation of interest, and there are many other things which the package manager takes into consideration when generating this list. Once the list is completed, the packages would be downloaded and installed.
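
For readers who prefer commands, the same cycle maps onto APT-offline's three modes - "set," "get," and "install." Here's a minimal sketch under a few assumptions: gimp is just an example package, the paths are placeholders, and the exact option names may vary slightly between APT-offline versions (the index-update and app requests can also be split into separate SFs - see Note 2):

    # On the offline PC: describe the desired changes in a signature file (set mode);
    # --update requests fresh package-index files, --install-packages requests an app's packages
    apt-offline set apt-offline.sig --update --install-packages gimp

    # On the internet-connected device: download everything the SF lists (get mode)
    apt-offline get apt-offline.sig --threads 4 -d DDF

    # Back on the offline PC: validate and install/stage the downloads (install mode)
    sudo apt-offline install DDF

APT-offline-GUI is a front-end for these same steps; the set-operations and install-ops mentioned in this article correspond to the first and third commands.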

With a direct high-speed internet connection, it's easy to complete this cycle on a single day, while the package index reflects the latest state of the repository. But when using APT-offline, the interval between downloading package-index files and downloading packages might exceed a week, and the package index used for generating the list of packages to download would no longer reflect the state of the repository when the packages are downloaded. So, in previous revisions, I warned of a potential scenario in which packages might be deleted from the repository during the interval between downloading package index files, and using an SF based on those package index files in an attempt to download packages. But then it occurred to me that software modules are probably retained in the repository for at least a few weeks after being replaced by new revisions and de-listed from the package index, to eliminate the potential for such a scenario as long as the installation-cycle is performed in a reasonable amount of time.

In case you're wondering, the online package index can't be used directly, for various reasons, which would be very difficult to explain because APT (Advanced Packaging Tool, the software/package manager) and APT-offline are the result of years of development during which many complex factors had to be considered. Various ways to accomplish the same thing (installing software on a Debian-based system without a direct internet connection) have been tried, and the combination of APT and APT-offline was found to be the optimal approach.

After downloading the files, the phone (or the storage device such as an SD-card being used for passing data between the phone and the PC - see warning above) would be connected to the PC, and the PC's installation of APT-offline-GUI would be used in its install-mode to navigate to the aforementioned folder reserved for the files listed in the corresponding SF, and to initiate an install-op on those files. (There are other ways to use APT-offline to accomplish the same thing, but this is the most convenient and efficient.) This would completely install the files, except for app-packages being installed for the first time on that installation, in which case APT-offline places them in /var/cache/apt/archives, where packages are placed for installation during a normal installation process. To completely install them, the PC's software manager (Advanced Packaging Tool, or APT) would then be given a command to install the apps for which the packages were obtained, such as by entering "sudo apt install <app name>" for each app. This ensures that nothing gets installed which isn't part of an app, such as a piece of malware that somehow got slipped in with the requested files - unlikely, although you should protect the downloaded files until they're installed, and install them at the first opportunity. To make it easy to install the apps, it might help to keep a list of the apps which were requested during the set-operation which ultimately produced the downloaded files. (The name of the corresponding CNF [2] is a logical place for such a list, in an abbreviated form.) You could sift through the downloaded files to remember which apps you had requested, but there might be hundreds of files.
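
As a concrete illustration of that last step, suppose the DDF contains the packages requested for GIMP and Marble. The final commands on the PC might look roughly like this (the path is a placeholder, and the GUI's install-op performs the equivalent of the first command):

    # Validate the downloads; index files are installed, and new app-packages are
    # staged in /var/cache/apt/archives
    sudo apt-offline install /path/to/CNF/DDF

    # Then have APT install the requested apps from the staged packages
    sudo apt install gimp marble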

The sky-blue arrows in the icon represent internet connections, although the PC's arrow has a negation-symbol superimposed to indicate the lack of an internet connection, or at least of one with sufficient bandwidth to download the amount of data typically required to make a change to a Linux installation (updating the package index alone requires a download of 25-80MB, and installing some large apps can require over 100MB).

The list-icon on the phone-icon symbolizes a signature file stored on the phone. The lock-icon symbolizes APT-offline's "bullet-proof" level of security guaranteed to prevent corrupted files or malware from being installed [3].

An icon for Debian packages is used as the icon's background, to symbolize the ultimate goal, which is to install Debian packages.

Notes

Revisions

If the revision-date is within a week of the actual date, there will probably be more revisions soon.

4/16/24 - Revised the warning at the top of the page, and the description of my backup-system (beginning with "for backup"), and possibly a few tweaks not worth mentioning.

4/17/24 - a) Revised the warning at top, again. b) Added a warning about connecting phones directly to full installations, including encrypted, in the first paragraph under the heading "Types of installations for air-gap PCs." c) Revised section starting with "For backup" and ending with "MX-Linux PERSISTENT installations."

4/18/24 - a) Revised the paragraph beginning with "I also keep a record...." b) Clarified the paragraph beginning with "I also keep a record...." c) various tweaks.

4/19/24 - In Note 2, starting at "This allows a fixed get-command," translated the basic get-command into English, and added details about creating get-commands and putting them in a clipboard-app on a phone to allow them to be entered with screen-taps.

[1] To create an "air gap" (EM gap) PC, get a barebones mini-PC with a wifi card, and remove the card. (I assume that if the wifi is simply disabled via software, there is some way to activate it with malware and use the wifi in burst-mode. Call me paranoid, but it doesn't hurt to be paranoid when it comes to data security.) Use long-nose pliers to grab each of the tiny gold-plated push-on RF connectors, and pull it off. Assuming that the wireless card is horizontal, you would pull the connectors straight up. Cover the connectors with tape to prevent any potential shorting. Install memory, but don't install any internal storage (use external storage devices with encrypted and/or unencrypted partitions for storage and backup), and avoid any peripherals with wireless capability, because hackers might be able to use them to get into the OS. Never connect it to the internet when using an installation which has been used for accessing sensitive data, just to be on the safe side. However, a nonpersistent "live" installation which has been shut down probably wouldn't have any session-data on it, and would probably be safe to use. For the keyboard, I use an E-SDS Waterproof Industrial Keyboard with Touchpad, and to share my keyboard and monitor with up to four PCs, I use a CKLau-64H2ua KVM switch. Neither has wireless capability.

Hackers want us to believe that they can spy on air-gap PCs via the power lines, but this is BS intended to prevent us from using air-gap PCs, because they can't be hacked, at least without taking extraordinary measures which require physical access to the PC. There is no way that any useful data can make it out to the power lines. There isn't even any useful data on the power supply pins of any of the chips in a PC. Even if there were, it wouldn't make it through the power supply out to the power line, which is typically noisy.

Authentication-key updates

Each installation contains keys which allow the digital signatures on InRelease files (which contain checksums for package-index files) to be authenticated, and these keys are updated periodically (because the signing keys are changed periodically); with a direct internet connection, this takes place during package index updates. However, key-updates aren't strictly required, as explained in the relevant entry of this blog. If you use a recent release to create an installation, the keys will probably be current for a while, and if you install all of the software right after creating the installation, you might be able to avoid the "cannot authenticate" messages which appear when you install software while the keys aren't up-to-date.

Types of installations for air-gap PCs

You could run the air-gap PC with a "full" installation (preferably encrypted to protect the data which it would retain) on a flash drive, and conceal the drive when not in use. (However, even if a full installation were encrypted, any data on it would be accessible to hackers when it's running, and the warning at the top of this page would apply.) To create a full installation, select a log-in password (or two if you intend to create an encrypted installation), boot the PC from a live installation, run its installation program (which would include an encryption-option), and use a freshly-formatted USB3 flash drive as the destination for the full installation, because full installations require USB3 drives, and some installers won't overwrite a previous installation. In case there's malware on the drive, perform a slow-format to overwrite the drive, which can take hours for large flash drives, and should be done at your convenience before performing the installation-process. In my experience, Sandisk Ultra-Fits and Ultra-Luxes are the only flash drives that consistently work well for full installations, although I tried just a few major brands. Another option would be to use a small nonvolatile memory module, as long as it's easy to conceal when it's not in use.
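
If your disk utility or file manager doesn't offer a full-overwrite ("slow") format, a generic way to overwrite a flash drive from a terminal is sketched below. This isn't specific to any installer; /dev/sdX is a placeholder, and writing to the wrong device will destroy its contents, so identify the drive carefully first:

    # List drives and identify the flash drive (e.g. /dev/sdX) by its size
    lsblk

    # Overwrite the entire drive with zeros (can take hours for a large drive)
    sudo dd if=/dev/zero of=/dev/sdX bs=4M status=progress conv=fsync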

MX-Linux, Snapshots, and NP "live" installations

I prefer to use MX-Linux (which is Debian-based, and thus compatible with APT-offline), because it has a "killer app" known as Snapshot, which makes it easy to turn an MX-Linux installation, with all of its settings and added software, into an ISO, which can then be used to create duplicate installations of various types. So, the full installation could be used as a "master," which would be used only for making changes and Snapshots, and the Snapshots could be used for creating nonpersistent flash-drive installations (NPFDIs), a.k.a. nonpersistent "live" installations, which are ideal for running air-gap PCs, because when they're shut down, they don't retain any trace of data accessed with them. Another advantage of these installations is that they're easy to replace if they fail, or get lost or stolen. I put them on USB2 drives, because they're adequate for this purpose, they're cheap, and they run cool, indicating low power consumption.

For storage, I use separate USB3 flash drives with encrypted and unencrypted partitions, and several backups scattered around.

For backup, I use a system with a) a low-security drive (used on online and offline PCs) with a large unencrypted partition mainly for downloads, and a small encrypted one for text files such as this article; and b) a high-security drive (used on air-gap PCs only) with an encrypted partition. The system also includes a primary backup for each of these drives, plus secondary backups, each of which contains copies of the low- and high-security drives (but not all downloads - only the ones which I consider to be reference material and want to archive), and, in some cases, either a copy of my latest Snapshot-ISO or a nonpersistent live installation made from it (the latter requiring the entire drive). The primary backups are used for backing up everything as soon as it's downloaded, created, or revised. The secondary backups are updated weekly where appropriate.

The high-security drives contain a "heap"-folder (named "disseminate") which contains a) a copy of everything which is added to the high-security drives, and b) a weekly-updated folder which contains a copy of the encrypted partition on the low-security drives; after updating this folder, the heap-folder is copied to each of the secondary backups, after deleting the previous copy.

When the heap-folder becomes too large and takes too long to copy to the secondary backups on a weekly basis, the copy on the main high-security drive is emptied by disseminating its contents to suitable locations in the high-security drive's main file system (and by deleting what isn't worth saving), and the drive is then copied to each of the secondary backups. Assuming that it's been over a year since doing this, I also reformat all of the drives and refresh/update their contents. Each user would adapt this system to their needs.
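
For what it's worth, the weekly copy of the heap-folder to a secondary backup can be reduced to a single rsync command; the mount points below are placeholders for my drives, and --delete removes the previous copy on the destination, as described above:

    # Mirror the heap-folder onto a secondary backup, deleting the stale copy there
    rsync -a --delete --progress /media/high-security/disseminate/ /media/secondary-backup/disseminate/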

Creating Snapshot-ISOs and nonpersistent live installations

When creating a Snapshot-ISO on my air-gap PC, I use an SD-card to store the ISO, which saves considerable time, because Snapshot ISOs are typically huge, and SD cards have a very fast write-speed. Even so, creating a Snapshot takes about ten minutes. Creating NPFDIs is just a matter of copying an ISO to a flash drive in a special format, using a USB-installer program such as Etcher or MX Live USB Maker, an MX Tool on MX-Linux. This process is simple and takes about ten minutes.
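
MX Live USB Maker and Etcher are the friendlier options; for what it's worth, the same result can usually be obtained from a terminal with dd, since MX Snapshot-ISOs can normally be written directly to a drive. As before, /dev/sdX is a placeholder and must be verified (e.g. with lsblk) before writing, because the drive's contents will be destroyed:

    # Write the Snapshot-ISO to the flash drive to create a nonpersistent live installation
    sudo dd if=snapshot.iso of=/dev/sdX bs=4M status=progress conv=fsync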

The disadvantage of using NPFDIs to run the PC is that changing a setting, and making it persistent, requires changing the setting on the full installation, then creating a new ISO and a new NPFDI. So, I usually wait until I have to make a few changes, or an important one, before going through this process.

MX-Linux "master" installations: no data to hack

Another advantage of using the MX-Linux full installation as a "master," i.e. for nothing except making changes and Snapshot-ISOs, is that it can be connected directly to the internet to update the package index and install software, without having to worry about compromising any data. Using a direct connection to update the package index also ensures that the installation's decryption keys (which ultimately authenticate packages) are up-to-date, although this isn't strictly necessary if a major server is used, since any problems with it would be discovered quickly because many others would be constantly connecting to it directly, and the chances that someone would be able to spoof a repository are negligible.

If a master-installation fails (or if it's inadvertently used to run the PC and might contain sensitive data, and you want to be able to connect it directly to the internet), it could be re-created easily from the latest Snapshot-ISO. The new installation wouldn't have a package index, but it wouldn't need one until installing more software, by which time it might need to be updated anyways.

I also keep a record of every change (including every setting) that I make to the MX-Linux master installation, to minimize the number of revision-cycles (i.e. make changes to the master installation, then generate a Snapshot-ISO, then create a working-installation, and then use it for a while and realize that more changes are required) before arriving at a stable installation with a new MX-Linux release.

MX-Linux PERSISTENT installations

MX-Linux uses the "Antix live system," which allows for the creation of PERSISTENT live installations which could probably be used for running an air-gap PC without compromising security, because they can be set up to have "root" (software) persistence (so that you could make persistent changes to the installation), without "home" (data) persistence (so that the installation wouldn't retain data after being shut down), and they can be encrypted.

However, persistent live MX installations are difficult to use (although better-written instructions/explanations, and experience, would help). If a persistent live installation were used to run the PC all the time, it would be more likely to fail than a full installation used just for making changes and Snapshots, and if you wanted a current backup, you'd still have to make a Snapshot every time you made changes. You'd avoid the step of making a new NP live installation, but making NP live installations is simple and takes only about ten minutes. So, I'm sticking with my system of using a full installation just to make changes and Snapshots, and NP live installations made from the Snapshots to run the PC. But for certain applications, the flexibility of persistent MX live installations is a big advantage.

[2] APT-offline signature files (SFs) are text files, but with a .sig filename extension, generated by the PC's installation of APT-offline running in "set" mode. In this mode, APT-offline uses the PC's software/package manager (Advanced Packaging Tool, or APT) to perform the most complex aspects of the set-process. The information which allows the downloaded files to be validated consists of reference checksums and file sizes, along with the names to assign to the files if they are found to be valid. For packages, whose names aren't changed when they're found to be valid, the replacement name is simply the original name; this lets APT-offline treat package-index files and packages the same way, by replacing the original name with the replacement name whenever a file is found to be valid.

APT-offline names every SF "apt-offline.sig" by default. As part of a system I devised to organize and simplify the overall process of using APT-offline, the default name is used for every signature file, but each SF is placed, along with a user-created folder named DDF (download-destination folder, which receives the files listed in the SF as they are downloaded and validated, and is reserved for files listed in the SF), in a user-created "change-name folder" (CNF). The purpose of CNFs is to identify the purpose of the corresponding SF/DDF pair, and to separate SF/DDF pairs from each other, since they're all named alike and would clash if placed in the same directory. A CNF's name might include a description of the target-installation, the date, and an abbreviated description of the changes which the folder's contents are intended to produce. For example, a CNF might be named "CNF-Brix-Xub2404-dd-mm-yy-update," or "CNF-Brix-Xub2404-dd-mm-yy-gimp-marble-update," where Brix is the name of a mini-PC, Xub2404 is the Xubuntu ("zoo-boon-tu") 24.04 installation on the Brix (some PCs have multiple installations on them), and GIMP and Marble are large popular apps. (GIMP is a powerful image-processor, and Marble is a virtual-globe app like Google Earth, although not as sophisticated, but it can be used without an internet connection.)
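
Putting that together, creating a CNF with its SF/DDF pair on the PC might look like this (the folder name is taken from the example above, the package names are just examples, and a separate SF generated with --update would be used first if the package index also needs refreshing - see the end of this note):

    # Create the change-name folder and its download-destination folder
    mkdir -p CNF-Brix-Xub2404-dd-mm-yy-gimp-marble-update/DDF

    # Generate the signature file (default name apt-offline.sig) inside the CNF
    apt-offline set CNF-Brix-Xub2404-dd-mm-yy-gimp-marble-update/apt-offline.sig --install-packages gimp marble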

The basic idea behind my system is to reduce the process of initiating a get-op to a series of screen-taps which can be performed on a phone in a distracting environment, and to be able to consecutively perform multiple get-ops in a single download session, while keeping track of each SF/DDF-pair's purpose. So, at home, the phone would be connected to the PC, and the PC's file manager would be used to move each CNF into a separate get-op "hopper"-directory (a get-process directory, or GPD) on the phone. Then, a file manager installed on the phone would be used to move each SF/DDF pair out of its CNF and directly into the corresponding GPD. This allows a fixed get-command (which translated to English is "go to GPDn, get the files listed in apt-offline.sig, and put them in DDF") to be used for initiating a get-op on the SF/DDF pair in the corresponding GPD (GPD1, GPD2, etc.). So, if you have three PCs, you might want to perform three get-ops in a single download-session, and you would have three GPDs, each with a fixed get-command.
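
After the CNFs have been moved to the phone and the SF/DDF pairs have been moved out of them into the GPDs, the layout on the phone would look roughly like this (the GPDs' parent directory is the one specified in the main article, and is omitted here):

    GPD1/
        apt-offline.sig
        DDF/
    GPD2/
        apt-offline.sig
        DDF/
    GPD3/
        apt-offline.sig
        DDF/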

To create the get-commands, you would use a PC to create a text-file named "get-command-1.txt" containing the basic get-command ("apt-offline get path/GPD1/apt-offline.sig -d path/GPD1/DDF," where "path" would be left empty for the time being), and move the file to the phone. Then, on the phone, you would install a file-manager app, such as X-plore, which allows you to navigate to a directory and copy its path so that it can be pasted where required. Then you would navigate to a certain directory (specified in the main article in this blog) which is the optimal location for GPDs, copy its path, paste it into the aforementioned text file, and move the text file back to the PC, because it's far easier to perform the next steps on a PC than on a phone. Then you would open the text file on the PC, insert the path into the command in two places, delete everything besides the command from the file, and make two copies of the text file (one named "get-command-2.txt" and one named "get-command-3.txt"), so that there's one file for each command, to make it easy to copy each command to a clipboard-app on the phone (I tried selecting individual commands from a single text-file on my phone, and gave up). Next, you would substitute the correct number into each occurrence of GPD1 in each of the copies, and you would have the required get-commands.
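
The finished text files would then each contain a single command along these lines, where "/path/to" stands for whatever path was copied from the phone's file-manager:

    get-command-1.txt:  apt-offline get /path/to/GPD1/apt-offline.sig -d /path/to/GPD1/DDF
    get-command-2.txt:  apt-offline get /path/to/GPD2/apt-offline.sig -d /path/to/GPD2/DDF
    get-command-3.txt:  apt-offline get /path/to/GPD3/apt-offline.sig -d /path/to/GPD3/DDF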

Then you would copy the get-command files to the phone, install a clipboard-app on the phone, open each text file (one at a time), and copy the command, which would enter the command into the clipboard-app, and repeat this for each text file containing a get-command. Then you would put the text-commands in their own group on the clipboard-app, and practice entering them into the terminal with a series of taps. (The terminal is part of the Debian app which must be installed on the phone to install APT-offline on the phone. Another app named Userland must be installed on the phone to install the Debian app. This site's main article provides the details.) After each command is actually used in a get-op, it would become part of the terminal's command-history, and it could be re-entered without having to use the clipboard-app.

After the download-session, the downloaded files could be installed by using the PC's installation of APT-offline-GUI in install-mode to navigate to the DDF of interest and to initiate an install-op. If the changes are being made to an MX-Linux installation which is being used as a "master" installation (explained elsewhere in this article), the phone could be connected directly to the PC without creating security risks. But otherwise, the DDF containing the downloaded files, along with the associated SF, would be returned to their CNF (using the phone's file manager), and the CNF would then be copied to a flash drive or SD-card, which would then be transferred to the PC, to eliminate the security risks.

This probably sounds like a lot, but it's just a series of simple steps, and after performing them a couple of times, it will be easy.

If you want to install software, and it's been a while since the package index was updated, you would generate an SF for a package-index update, then use the SF in a get-op to get the package-index files. Then you would install them, generate an SF for the desired apps, and use that SF in a get-op. You might want to consider downloading package-index files on a weekly basis, if convenient, just to have fairly fresh ones on hand in case you need to install them.
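
In other words, that case involves two SFs and two get-ops - one for the package-index update, and one for the apps, generated only after the fresh index files have been installed. A rough sketch, with hypothetical CNF names and an example package:

    # First cycle: SF for a package-index update only
    apt-offline set CNF-index-update/apt-offline.sig --update
    # ...get-op on the phone, then install the index files on the PC...

    # Second cycle: SF for the desired apps, generated against the freshly installed index
    apt-offline set CNF-gimp/apt-offline.sig --install-packages gimp
    # ...second get-op, then install as usual...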

[3] Never create an installation from an ISO which you have obtained from an outside source, without first authenticating the ISO. This is done by calculating the ISO's checksum (a function built into some file managers, sometimes initiated by right-clicking on the ISO and selecting the calculate-checksum function in the resulting menu), and comparing the calculated checksum to the reference checksum. I use SHA256 checksums, which are the highest-security reference checksums typically provided. If the checksums are different, the difference won't be subtle. In my experience, the easiest way to find the reference checksum of the latest release of some type of Linux is to visit the official website, but if you need a checksum for an older release, I suggest Googling "checksum Linux <type version>." Distrowatch has checksums for older releases, but it can be difficult to find them.
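
From a terminal, the calculation is a one-liner; the filename below is a placeholder for whatever ISO was downloaded, and the output is compared (carefully) against the published reference checksum:

    # Calculate the ISO's SHA256 checksum for comparison with the reference value
    sha256sum downloaded-release.iso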