Rev 8/6/22 (see Notes)
Every package index includes a Release or InRelease file (actually, probably one for each of several sections) which contains a digital signature. To authenticate the file, i.e. prove that it's an official package index, the signature must be verified with the public key corresponding to the secret/private key with which the signature was created. This file contains checksums for the rest of the package-index files, and the package-index files contain checksums for every software module/package in the repository, so this constitutes a secure chain of evidence that the packages are official and safe to install. But even without digital signatures, there would be very little chance that anyone could create counterfeit package indices and substitute them for authentic ones, although there are situations in which it would be possible. In some cases, such as in Ubuntu and its "official flavors," some of the required keys are not included by default. If you create an installation from one of these distros, update the package index via APT-offline, and then, with the installation disconnected from the internet, enter "sudo apt install <app> -y" (where <app> is some large app which requires many packages), "cannot authenticate packages" (CAP) warnings will probably appear in the terminal because some of the keys are missing. I've tried various approaches to install the missing keys and eliminate the CAP-warnings, and the only way that works is to update the package index with a direct internet connection.
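The two-level checksum chain described above can be imitated in miniature with ordinary shell tools. This is only a toy illustration - the file names and contents below are invented, and real APT indices use the Release/Packages formats plus a GPG signature rather than bare checksum files:

```shell
# Toy model of the checksum chain (invented names, not real APT files):
printf 'pretend .deb contents\n' > demo.deb
sha256sum demo.deb > Packages.demo      # level 2: the index lists package checksums
sha256sum Packages.demo > Release.demo  # level 1: "Release" lists index checksums
sha256sum -c Release.demo               # verify the index against "Release"
sha256sum -c Packages.demo              # verify the package against the index
```

In the real system, the top of the chain (the Release/InRelease file) is additionally signed, which is what anchors the whole chain to the archive's keys.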
After creating a full installation, first enter "echo 'Binary::apt::APT::Keep-Downloaded-Packages "1";' | sudo tee /etc/apt/apt.conf.d/10apt-keep-downloads" (copy the command without the beginning and end quotes, and paste it into the command-line with Ctrl-Shift-V) so that any software-packages which are installed via a direct connection are retained in /var/cache/apt/archives after they're installed, instead of being deleted, which is done by default in many types of Linux. (It's easier to just enter the command instead of trying to determine whether it's necessary.)
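For reference, the command above simply creates a one-line APT configuration file. The sketch below writes the same line to a local copy in the current directory so it can be inspected without root; the real file belongs at /etc/apt/apt.conf.d/10apt-keep-downloads:

```shell
# Write the same one-line directive to a local copy for inspection;
# the real command targets /etc/apt/apt.conf.d/10apt-keep-downloads.
echo 'Binary::apt::APT::Keep-Downloaded-Packages "1";' > 10apt-keep-downloads
cat 10apt-keep-downloads
```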
Next, the package index should be updated, and when this is done with a direct connection (instead of using APT-offline), the first thing that happens is that missing keys are installed and outdated keys are updated. (I realized this after finding debian-archive-keyring in /var/cache/apt/archives, although I had never entered a command to install it, so it was obviously installed in the background during an update.) However, something else happens which doesn't happen when these same keys are installed without a direct internet connection, and whatever it is makes the difference between success and failure. I have tried copying the contents of /usr/share/keyrings from an installation which had been updated ("sudo apt update") online (which eliminated the "cannot authenticate packages"/CAP warnings from the online installation) to the same directory in an offline installation, but it didn't eliminate the CAP-warnings from the offline installation. I've also tried installing the missing keyring (debian-archive-keyring) both online and offline, and running "apt-key update" (the apt-key command will no longer be available as of Ubuntu 22.10) both online and offline, but none of these eliminated the CAP-warnings. So, obviously, something happens during an online update that doesn't happen during any of these unsuccessful approaches. Some articles indicate that it's necessary to associate each source in the /etc/apt/sources.list file with a particular key, but I checked the sources.list file after updating Kubuntu 22.04 online, and the list had the conventional format, so clearly this is not necessary, and Kubuntu's developers apparently considered it to be overkill. (It would provide a benefit only if the private keys of the official archive had been stolen, which is extremely unlikely. PPA keys, on the other hand, might get stolen if a maintainer uses lax security practices.)
So, as far as I have been able to determine, a direct connection is required to install or update keys so that the system can make use of them.
Once the proper keys are installed, the actual package-index update begins by attempting to verify the digital signature in each Release/InRelease file (which are the first package-index files to be downloaded) with the corresponding key, presumably successfully (assuming that the index isn't counterfeit, which is highly unlikely, but not impossible). Even if the proper keys aren't installed and the package index can't be authenticated, it would be installed anyway, because chances are that the authentication failure is due to missing or outdated keys. Still, I'd rather check the index with the proper keys to eliminate even the small chance that the index is counterfeit.
Linux Mint is one of the few Ubuntu-derivatives, if not the only one, in which all of the required keys are installed by default (apparently - I can't find any solid information on this subject), and the package index is not locked by default, so you can use the package manager (APT) to generate a "download script" for APT-offline. (A download-script is a simple list of the URLs of the packages required to install APT-offline on that installation, with the software configuration which existed when the script was generated. Installing APT-offline typically requires python3-magic to be installed first, so python3-magic is typically listed in APT-offline download-scripts; but if something else which requires python3-magic were installed first, python3-magic wouldn't be included in the APT-offline download-script.) Then, assuming that the installation doesn't have a direct internet connection, you would go to an access point and download the required packages from the Ubuntu Packages website (an interface to the repository), along with their reference SHA256 checksums, calculate their SHA256 checksums (right-click on each package, select Properties, and then select the tab with buttons for calculating various checksums), and compare each calculated value to the corresponding reference value. If they're different, you can tell at a glance.
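If you'd rather calculate the checksums from a terminal than through the file manager's Properties dialog, sha256sum does the same job. The file below is a stand-in created for illustration, not a real package:

```shell
# Stand-in for a downloaded package (illustrative file, not a real .deb):
printf 'downloaded package bytes\n' > python3-magic_demo.deb
# Print the calculated checksum; compare it to the reference value copied
# from the Ubuntu Packages download page.
sha256sum python3-magic_demo.deb
```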
So, Mint might be a good choice for an offline installation, assuming that it's 100% compatible with your PC. (Mint is notorious for its PC-compatibility issues, but there are lists of compatible PCs, and you can buy PCs with Mint pre-installed.) But you might prefer another type of Ubuntu without the hardware-compatibility issues, such as Kubuntu, which lets you install Kdenlive (the top-rated Linux video editor) with a mere 60MB download, as opposed to the 200-300MB required to install it on other types of Linux.
The problem is that there is apparently no way to update the keys on PCs which have installations of Ubuntu or its main derivatives and which cannot be connected to the internet, such as bulky and/or power-hungry desktop PCs which have no internet connection and are not practical to take to an internet access point. However, I've devised a work-around for such cases. There might be better solutions, but if so, they're very well hidden, and chances are that most people would be able to find some way to obtain a temporary direct high-speed internet connection for even a desktop PC, and avoid my convoluted solution.
My solution (although in practice not in this order) is to use APT-offline to obtain a copy of the package index, then to install a copy of it on an installation in which the package index has been updated by means of a direct internet connection, and then, with the installation disconnected from the internet, enter "sudo apt install <app> -y" for some large app which requires many packages (such as kdenlive). If the package index is counterfeit, you would receive CAP-messages which indicate that many of the required packages cannot be authenticated. In some cases, I've gotten CAP-messages related to just a couple of packages out of many which the app requires, which might mean that a key for a particular section of the package index is missing or outdated, although if an update had been performed, all of the keys should be available and up to date. But if just a couple of packages are affected, you could compare their checksums to the reference values on the corresponding download pages of the corresponding Packages site (i.e. Debian Packages or Ubuntu Packages). If you don't receive any CAP-messages, the package index obtained via APT-offline would have to be authentic, and you could install another copy of it on the "unconnectable" installation, and disregard any CAP-messages which will probably appear when you install software due to the unconnectable installation's missing or outdated keys.
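The CAP check can also be scripted by capturing the install transcript and searching it for the authentication warning. The transcript below is a fabricated sample for illustration; a real one would be captured with something like "sudo apt install kdenlive -y 2>&1 | tee install.log":

```shell
# Fabricated sample transcript standing in for real apt output:
cat > install.log <<'EOF'
WARNING: The following packages cannot be authenticated!
  libexample1 libexample-data
EOF
# Flag the index as suspect if the warning appears; then check whether it
# covers packages in general or just a couple of them:
if grep -q 'cannot be authenticated' install.log; then
  echo 'CAP warning found - check which packages are affected'
fi
```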
There are two major approaches to accomplishing this goal, one of which involves taking a mini PC and possibly a 12VDC-110VAC power inverter or an uninterruptible power supply to an access-point (if no power outlets are available at the access point), and perhaps a cardboard hood to shield the monitor from sunlight. The other approach would involve a couple of trips to the access-point, but with just a phone or a tablet (for details on installing and using APT-offline on an Android or iOS device, see APT-offline A-Z). If the keys in the distro which you use for your "unconnectable" PC are incomplete by default (i.e. in the source-file and when the installation is first created) you'll have to take the first approach. With Linux Mint, however (at least as of 20.3), the latter approach can be used. (But with Mint, the first step is to ensure that your PC is on some list of compatible PCs.)
A) Take mini-PC to access point
The most straightforward approach would be to take your "portable" PC [2] with a flash-drive installation made from the distro that was used for the installation on the "unconnectable" PC to the access point and perform a direct update using the default repository-settings, unless you really know what you're doing when you change the repository-settings. If some of the required keyrings are missing or outdated, the latest versions would evidently be automatically installed in the background (i.e. with no indication) during the package-index update, based on my experience. For example, I performed a direct update on a flash-drive installation, after entering the aforementioned command to retain downloaded software packages after installing them, and later found the debian-archive-keyring package in the software archive, although I didn't enter a command to install it. So, it must have been installed automatically in the background during the update.
Next, I'd install APT-offline and nmon [3], along with any other software I'd need in the foreseeable future, on the flash-drive installation. Then, if the installation doesn't have a good disk I/O monitor (such as a widget or a configurable system-monitor app), I'd launch nmon by opening a terminal and entering "nmon," then hitting "d" to configure it to monitor disk I/O. This comes in handy when writing large amounts of data to a flash-drive installation, because the write-process can take longer than the terminal indicates, and you want to avoid terminating it prematurely. Normally, to avoid interrupting the write-process, flash drives are ejected and left plugged in until the ejection-process is complete, but a flash-drive installation can't be ejected while the PC is running from it.
Then I'd use APT-offline to generate a signature file for an update (open a terminal and enter "sudo apt-offline set apt-offline.sig --update," which would create the signature file and place it in the Home directory), and then put the signature file and a folder named DDF (download-destination folder, to contain the downloaded packages) in a folder named "CNF-<distro-abbrev>-update-<date>" (such as "CNF-Kub2204-update-M-D-Y"). CNF is short for "change-name folder," which is part of a system I devised for using APT-offline efficiently, even when using Android/iOS devices for the get-operation - see APT-offline A-Z for details.
Next, I'd open a terminal in the CNF and enter a generic get-command ("apt-offline get apt-offline.sig -d DDF" - there is no "sudo" in get-commands). This would download the package-index files and screen them for errors using the reference checksums and file sizes in the Release or InRelease file; each file that passes its screening-test is renamed with the name listed in the signature file after the file's URL (the name which it must have in order to be installed) and placed in the DDF.
The final step in updating an installation via APT-offline is to install the downloaded package-index files. To do this, you would normally transfer the flash drive which contains the downloaded files (in the DDF in the relevant CNF) to the target-installation, and then use the target-installation to open a terminal in the CNF and enter "sudo apt-offline install DDF," while keeping an eye on the disk I/O monitor to see when the installation is actually complete. But in this case there is no need to transfer the device containing the downloaded files to the target-installation, since it is already connected to the target-installation.
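The whole set/get/install cycle above can be sketched as a short script. The distro abbreviation and date are example values, and the apt-offline commands are commented out here because they require APT-offline to be installed (and, for set/install, root):

```shell
# Create the change-name folder (CNF) and download-destination folder (DDF);
# "Kub2204" and the date are example values - substitute your own.
CNF="CNF-Kub2204-update-8-6-22"
mkdir -p "$CNF/DDF"
# 1) On the target installation, generate the signature file:
#    sudo apt-offline set "$CNF/apt-offline.sig" --update
# 2) On the connected machine, download the package-index files:
#    apt-offline get "$CNF/apt-offline.sig" -d "$CNF/DDF"   # no "sudo" in get-commands
# 3) Back on the target installation, install the downloaded files:
#    sudo apt-offline install "$CNF/DDF"
ls -d "$CNF/DDF"
```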
Once the package index obtained via APT-offline is installed, I would perform a mock installation (without an internet connection) of some large app that requires many packages and that's not already installed, by entering "sudo apt install <app> -y." If any CAP-messages appear which indicate that packages in general cannot be authenticated, the package index would be counterfeit, but otherwise it would be authentic, in which case another copy of the package index could be installed on the unconnectable installation, and software could be installed on it without regard for the CAP-messages caused by its missing or outdated keys.
B) Multiple trips to access point with phone or tablet
Based on my recent experience, the keys in Ubuntu and all of its "official flavors" are incomplete until the relevant installation is updated via a direct internet connection. Linux Mint is the exception, as far as I know, at least as of 20.3, so it might be the best distro for your unconnectable PC (assuming that the PC is 100% Mint-compatible). In that case, you could use APT to identify the packages required to install APT-offline by performing a mock installation as if you had a direct internet connection; APT would indicate what it tried to install but couldn't due to the lack of an internet connection. Then you would download those packages from the Ubuntu Packages site, along with their checksums. (Copy the checksums from the download-page into a text file, because although the Ubuntu Packages download-pages with the checksums can be downloaded like a regular HTML page, they can't be opened with a web browser. LibreOffice Writer will open them, but it's not as convenient as copying the desired information and saving it in a text file.)
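As an alternative to reading the package names out of the mock installation's error output, apt-get has a --print-uris option that lists the download URLs without attempting the install. The sample line below is fabricated in that general format (quoted URL first) - the package version, checksum, and URL are invented for illustration:

```shell
# Fabricated sample of "apt-get -qq --print-uris install apt-offline" output;
# the URL, version, and checksum are invented for illustration:
cat > uris.txt <<'EOF'
'http://archive.ubuntu.com/ubuntu/pool/universe/a/apt-offline/apt-offline_1.8.2_all.deb' apt-offline_1.8.2_all.deb 123456 SHA256:0000000000000000000000000000000000000000000000000000000000000000
EOF
# Extract just the URLs (the field between the first pair of single quotes):
cut -d"'" -f2 uris.txt
```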
Then you'd check the checksum on each of the packages required to install APT-offline (right-click on each one, select Properties in the menu which appears, and then select the tab which calculates checksums, and the rest is obvious), and if they match the reference values from the Packages site, you would install them by clicking on each of the packages, and following the package-installer's instructions. The next step would be to use APT-offline to perform an update, as described above, and once the update is performed, you could install any software you would need for the foreseeable future. By the time the keys expire, I assume that there would be another Mint release with the new keys.
If you don't know whether any of the keys required by your installation are missing by default, you could install APT-offline using the package installer, then use APT-offline to update the package index, and then use APT itself to perform a mock installation (without an internet connection) of some large app that's not already installed. If you get any CAP-messages indicating that none of the packages can be authenticated, you'll know that you'll have to perform a direct update to install or update the keys. And if you have an unconnectable installation of the same type, you'd have to re-install the package index obtained via APT-offline onto the directly-updated installation and perform another mock installation to check for CAP-messages which indicate that packages in general can't be authenticated. If there are none, the package index would have to be authentic, and you could install it on the unconnectable installation and disregard the CAP-messages which I assume it would produce due to disabled or outdated keys.
So, it's a convoluted process, but being able to use Ubuntu or one of its derivatives (I prefer the derivatives) on an unconnectable desktop PC, for example, is worth the wade.
===========
Notes
8/6/22 - A) In the fourth paragraph, added a list of unsuccessful methods which I used in an attempt to install the required keys. B) Cleaned up some lousy writing in a few places and tweaked several passages. C) Revised Note 1 and added Note 3.
[1] 16GB Sandisk USB3 Ultra-FITs are good for Linux installations, due to their combination of reliability, price, performance and power consumption. Their problem is that they have plastic connector-shrouds, which wear out, but you could effectively give them a metal connector-shroud by plugging them into a USB3 M-F adapter and just leaving them plugged in. Sandisk's small, metal-cased Ultra-Luxe drives are also quite fast, although they seem to dissipate quite a bit of heat, and the only 16GB version which I tried didn't last very long. (Still, other 16GB units might run just on the warm side, and provide excellent performance and reliability.) Sandisk also makes a small metal-cased "Flair" drive which might be good for Linux installations, although the one I tried didn't last very long at all. So, Sandisk seems to have a problem getting its chips into small metal cases. I've tried a few other major brands, and other than a 16GB Kingston USB3 Datatraveler (no longer available), they were DOA, or Linux wouldn't even run on them. Cheap Chinese drives are notoriously unreliable because they're made from 2nd-rate chips from the periphery of the wafer, so I wouldn't put an installation on them. An article which recommends various flash drives for Linux installations recommends Sandisk Ultra-FITs and PNY Turbos, among others which are either expensive, or USB2's which are good for live installations but not full installations. But Amazon reviews indicate that PNY Turbos have a high failure rate. HP flash drives (which are made by PNY) might be more reliable, but they're not cheap.
[2] Laptops and Chromebooks might be convenient, but I doubt that you can use either to create a flash-drive Linux installation and boot from it, which can be done with mini-PCs. Mini-PCs can also be configured as air-gap PCs and connected to a keyboard and monitor through a KM switch to share the keyboard and monitor with other PCs, including an online mini-PC and perhaps a desktop PC for heavy-duty data processing. (For details on creating a cheap air-gap PC, see my post on the subject.) The potential problem with mini-PCs is providing them with power at the access point, which can be done with a 12VDC-110VAC power inverter plugged into a car's cigarette lighter or with an uninterruptible power supply with sufficient capacity to power the PC for about 15 minutes if no power outlets are available.
[3] Nmon is an excellent, terminal-based system monitor which uses bar-graph indicators and can be easily configured to monitor various processes, although your installation might already have a good widget for monitoring disk I/O (Kubuntu 22.04's is the best in my experience), and then there's Conky with the MX-Linux standard configuration, which I covered in one of my posts, and GKrellM.