Monday, December 3, 2018

Anti air-gap-PC hysteria

Rev 12/6/18

Today I did a quick search using the term "air-gap," and found what appears to be a desperate attempt by the NSA to suppress the use of air-gap PC's. The claims that they aren't secure from all possible attacks might apply to systems which are targeted by intelligence agencies, but the average person probably doesn't have to worry about these attacks. For instance, the notion that someone can monitor the power line outside your home and determine what's on your encrypted flash drive is absurd, especially since PC's use regulated power supplies which would prevent any data from getting out to the power line. (The only way to get a signal out would be to modulate the PC's power consumption to such an extent that it could be detected on the power line, which is possible but would probably require noticeable changes in the PC's behavior. A PC's power consumption is low compared to a lot of other things, which might be turning on and off, and then there's power-line noise, so it would be very difficult to get any data out this way.) The claim that there are Windows viruses that copy data to and from flash drives in hopes that they will be plugged into an air-gap PC is plausible, but I don't care about Windows, since nobody who is security-conscious would use Windows for an air-gap PC. Other, even more exotic techniques are also the stuff of intelligence agencies, and not likely to be applied to the average home PC-user who just wants to keep personal data such as shopping lists and correspondence from Big Brother. So don't let these claims frighten you away from using an air-gap PC, which, combined with an online mini-PC and a KVM switch, provides a very high level of security for a very low cost in terms of time and money.


Revision notes

12/6/18- Added "(The only way to get a signal out....)"

Friday, November 23, 2018

Some ideas on gaining temporary internet access for setting up Debian installations for use as internet-access-impaired installations

Rev 11/26/18

KDE Neon is one of the most difficult types of Linux to use without a direct internet connection, due to several factors, although they can be overcome by using a temporary direct connection for the initial set-up: update the package index and add Apt-offline, kdesudo, Synaptic Package Manager, and Software & Updates (software-properties-kde for the KDE desktop, and software-properties-gtk for everything else). (The software manager provided with Neon is clearly intended for use with a direct high-speed internet connection, and not for customizing the package index to your needs.)

Ubuntu itself was also intended for use with a direct high-speed internet connection, although it doesn't require as much initial set-up as Neon to make it suitable to use as an internet-access-impaired installation. (Just update the package index and install Apt-offline-gui and any other apps you want initially.)

If you can afford a laptop PC which can boot Linux from a USB port, setting-up a flash-drive installation would be easy, and the installation could then be used for booting another PC, including a desktop which could be used even for video editing (the video files would be stored on an hdd/ssd, perhaps in unencrypted form due to the amount of processing required to encrypt video). However, such laptops tend to be expensive, and I don't like laptops in general for various reasons. There was a time when you could get a small laptop ("netbook") without an OS (and without built-in wireless!), into which you could plug "Linux on a stick" and boot it, but this didn't sit well with Big Bro, who didn't want it to be convenient for everyone to get a PC which he couldn't monitor. (Based on Snowden's claims, the Thought Police are a secret division of the NSA.)

So, netbooks disappeared from the "free market," and if you want something with the same capabilities without bowing down to the laptop-dictatorship, the best alternative is to use an AMD-based mini-PC without built-in wireless, and with a wired-only keyboard and a wired-only monitor, because it could double as a secure home-PC. (At home, I have cheap "offline" and "online" mini-PC's which run Linux and share the same keyboard and monitor through a KVM switch. I don't have to worry about hackers, viruses, or updates, although I replace the OS on each PC every year or so to stay reasonably current.) Portability is compromised, because you'd probably need an AC outlet. Perhaps you know of an access-point where you would be allowed to use AC outlets, but if not, you could get a DC-to-AC inverter and run it from your car's electrical system. There are small monitors which are designed for use in cars, or you could use a regular monitor when there is little sunlight, or somehow block some of the sunlight (perhaps put a cardboard hood on the monitor - typically on the top and sides, and about a foot deep, although it could be something like a box with a slit in it). You could use drive-in restaurants as your wi-fi access points (you'd use an external wi-fi adapter, some of which plug into Ethernet ports and don't require special drivers). These are just some possibilities, and you can probably come up with some others.

But once the installation is set up, it would be much more convenient to use Apt-offline to make changes, and if you want to make some changes to a secure installation, Apt-offline allows the changes to be made without compromising security.

But then there's always Xubuntu, which includes Apt-offline's text-command version by default, making it easy to install Apt-offline's GUI, which comes in handy at times. The decision to include Apt-offline's text-command version is a reflection of Xubuntu's intelligent, minimalistic overall design. It's a little rough around the edges, but Ubuntu and its derivatives are undergoing major changes (such as the development and incorporation of a new display-server known as Mir), and rough edges are to be expected until the dust settles.


Revision notices

11/24/18 - A) Corrected the 1st sentence in the 4th paragraph. In the initial version, I accidentally wrote "wireless" when I meant "wireless-less," i.e. wired-only. If wireless capability exists, you should assume that it can be surreptitiously enabled, such as in burst-mode, to spy on you. Wi-fi burst-mode is a reality, and it cannot be detected without special equipment. So, a secure PC cannot have any wireless circuitry, period, including in any peripherals. The NSA's TEMPEST-spec takes this to an extreme - it considers even plain wire to be a wireless device. B) Clarified the last paragraph.

11/25/18 - A) Revised the sentence beginning with "If you can afford a laptop PC..." for clarity and to mention the use of an hdd/ssd to store video files.

11/26/18 - A) Tweaked the sentence beginning with "If you can afford a laptop PC which...."  B) In the sentence beginning with "There are small monitors which are designed to use in cars," changed "shroud" to the more common term "hood," and suggested a couple of basic designs for the hood.

Monday, October 1, 2018

Linux for practical secure dual-PC systems


During the interview, Snowden discussed his motivations for releasing the documents to journalists, explaining, "The intelligence capabilities themselves are unregulated, uncontrolled, and dangerous. People at NSA can actually watch internet communications and see our thoughts form as we type. What's more shocking is the dirtiness of the targeting. It's the lack of respect for the public and for the intrusiveness of surveillance."

Edward Snowden, quoted in "In NBC interview, Snowden says NSA watches our digital thoughts develop"
http://arstechnica.com/tech-policy/2014/05/in-nbc-interview-snowden-says-nsa-watches-our-digital-thoughts-develop/


Snowden was apparently referring to a "Thought-Police" black-op within the NSA which snoops on us partly to interfere with our plans to make us feel powerless. Don't plan on getting rid of them, because you can't get rid of something if you can't find it.

Although learning to use Linux can be frustrating, it is worth the effort because it is the most practical OS to my knowledge for creating relatively inexpensive secure PC systems which don't rely on software (other than AES-grade encryption software, which is trustworthy) for protecting data. In such systems, there are two PC's which share a monitor and a keyboard by means of a KVM switch, one of which is used for accessing the internet but not for processing sensitive information, and another PC for processing sensitive information (such as shopping lists, plans, and messages to be encrypted and transferred to the internet-connected PC for transmission), which is unquestionably secure because a) it is completely electromagnetically isolated from the internet, and it has no internal drive which can be used for surreptitiously storing data to be uploaded to the Thought Police when a connection becomes available, or to be retrieved during a "no-knock search" (it stores data only in an encrypted form on physically small USB flash drives which are removed and hidden when not in use); and b) the installation can be maintained by using a unique program named Apt-offline, which combined with the new "containerized" software systems for Linux, allows any available change to be made to the installation (including OS-updates) without connecting it to the internet. Installing the new "containerized" types of Linux software is like installing Windows software, but the new systems won't entirely replace the old systems anytime soon, if ever, so there will still be a place for Apt-offline in the Linux world for the foreseeable future.

You could probably create the same kind of system with Windows, but you would need two copies of Windows, and to update Windows on the secure PC, you'd have to get a copy of the latest version of Windows and use the copy to update the installation (or to create a new installation). This will become more feasible when Windows Core is available, because it will allow installations to be tailored to the PC, instead of using a gargantuan OS which is the same for every device. It might also be possible to create two installations of Windows from a single copy to run on a single PC, and run one on the PC in "secure mode" and the other on the PC in "non-secure mode," but it would be inconvenient, to put it mildly, to switch between the two installations.

Linux is great for accessing the internet, because it's essentially immune to viruses. I've been using it to access the internet for over 7 years, and have never had any problems with viruses, or any need to install anti-virus software. Although OS-updates are made available, I don't use them except in rare cases, and have never had reason to regret it. Instead, I replace the entire OS every couple of years to stay reasonably current.

Linux also allows you to format any drive, including physically tiny, all-metal USB flash drives and micro-SD cards, which can be hidden easily, with an encrypted format which the NSA reportedly won't even try to crack. Instead, they would try to get the password, which I suppose could involve sneaking into someone's home, and circumventing the typical security measures. (Booby traps are illegal for good reasons, so don't get any stupid ideas.)
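For what it's worth, one common way to apply such a format from the command line is with LUKS via cryptsetup (the Disks utility offers the same thing graphically). This is a generic sketch, not my exact procedure; the device name and the mapping name "secret" are placeholders:

```shell
# WARNING: luksFormat destroys everything on the partition.
# /dev/sdX1 is a placeholder - confirm the real device with lsblk first.
sudo cryptsetup luksFormat /dev/sdX1       # prompts for a passphrase
sudo cryptsetup open /dev/sdX1 secret      # unlock as /dev/mapper/secret
sudo mkfs.ext4 /dev/mapper/secret          # put a file system inside
sudo cryptsetup close secret               # lock it again
```

The drive then can't be read at all without the passphrase, which is why the attacker's only practical option is to go after the password rather than the encryption.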

The only reason I would use Windows would be for some application which I couldn't get anywhere else, or to use a piece of hardware that doesn't work with Linux. Windows works well in Windows ads, but recent Chromebook ads indicate that many people still have problems with it. Chromebooks are great for people who don't want to think about their PC beyond using the applications, and think that BB couldn't have any reason to spy on them and/or that taking steps to prevent him from spying on them is un-American, but if you want a PC which you can trust to protect your data, Linux is probably the best OS for you. You might balk at the cost of the required hardware, but you can use inexpensive mini-PC's which use the latest AMD processors, which have a lot of processing-power per buck and per watt, and you can't obtain completely trustworthy security through other means at any cost.

Sunday, September 16, 2018

Full installations of Ubuntu etc on flash drives

Rev 10/7/18 (see Revision Notes)


Another application for Apt-offline is to make changes to what I call "secure" full installations of Ubuntu and its derivatives on such things as USB 3.0 flash drives and SD cards (8GB minimum typically), running on "secure" PC's, in cases where the changes couldn't be made with the new "containerized" software systems (Snappy and Flatpak). When creating such installations, the encryption-option would be selected, so that the retained data would be secure, supposedly, although it should be backed up on separate encrypted flash drives. I can't guarantee that such installations are secure, although they can be hidden just about anywhere when not in use. Persistent live installations are another option, although they would retain data in an unsecured form, and there aren't many USB-installers which can make such installations, perhaps indicating what the experts think of such installations. I gather that there are also security risks in using such installations for accessing the internet, partly because they aren't password-protected.

By "secure" PC's, I mean PC's which have no internal storage which might be surreptitiously used for storing data to be sent to the Thought Police when an internet connection is available (or retrieved by sneaking into your abode), and which are electromagnetically isolated from the internet, meaning no wired connections or wireless circuitry, including in peripherals such as keyboards, monitors, printers, etc., because wireless circuitry might be surreptitiously enabled, such as in burst-mode, to send a dispatch to the Thought Police. This approach (which requires an extra PC for use as the "secure" PC) might seem less convenient and more expensive, but when you consider all the factors, it's actually more convenient and less expensive than trying to secure a single internet-connected PC, which you can never be certain is secure. Mini-PC's with AMD APU's could be used for the optimal combination of price, power-consumption, and performance (units with 7 nm lithography, and even lower power-consumption, will supposedly hit the market in 2019). AMD processors are less likely to have built-in wireless circuitry, and in some mini-PC's, the wireless circuitry is placed on a separate module which can be removed without much difficulty (tip: to disconnect the tiny RF connectors, pull straight up on them with a pair of long-nosed pliers).

"Secure" installations running on "secure" PC's could be used for composing and encrypting secure messages, and decrypting and reading them, so that the messages never exist in unencrypted ("plaintext") form on an internet-connected PC, where you should assume they will find their way to the internet. When  using such high levels of security, the weak link is the recipient (who would also have to use a dual-PC system for security), and you should just assume that they're going to betray your confidence eventually unless there would be a significant penalty, and keep this in mind when deciding what information to provide to them, and when. It might be a good idea to test them with bogus "secrets" before sending any actual secrets. Don't use digital signatures unless necessary, because you can't deny sending something that has your signature on it unless your private key has been compromised.

To share peripherals such as keyboards and monitors between "secure" and internet-connected PC's, a KVM switch would be used. This requires the video standards of the PC's to conform to the standard used by the KVM switch, or to be adapted to it. In some cases, when booting multiple PC's connected to a KVM switch, each PC must be fully booted before switching to another PC and booting it.


Notes

Revisions

10/7/18 - Revised the entire article after some experience with using a full installation on a USB 3.0 flash drive, and hopefully clarified it.

Sunday, June 3, 2018

Apt-offline set-command examples

I realized that my main article on Apt-offline doesn't provide a sufficient selection of samples of set-commands, perhaps because I assume that most people would use the GUI, except when using Apt-offline to install the GUI, in which case you would use the set-command I provided for this case.

In general, the set-command begins with "sudo apt-offline set apt-offline.sig," where "sudo" is a means of requiring the administrative password to be entered in order to begin making any changes (it's also required in order to perform Apt-offline install-ops, which are the final step in the process of making changes via Apt-offline). "Apt-offline set" is the logical beginning for an Apt-offline set-command. "Apt-offline.sig" is the name of the signature file which the command is intended to generate, and I recommend leaving the name as it is, for reasons explained many times elsewhere (it's part of my system for efficiently using Apt-offline). But you could get fancy and insert a path before the signature file name, so that the signature file would end up somewhere else besides the Home directory (wherever you put it, make sure it's a permanent directory, such as Downloads), and you could use a different name for the signature file in each case. But it's just more work.

The rest of the set-command depends on what sort of change you want to make. You can perform an update, install apps, upgrade apps, etc. To do an update, the command would be "sudo apt-offline set apt-offline.sig --update." To install a couple of apps, it would be "sudo apt-offline set apt-offline.sig --install-packages <app1>,<app2>." (Note that there's just a comma between the app-names.) A complete listing of set-command options can be found on Apt-offline's man-page at https://manpages.debian.org/testing/apt-offline/apt-offline.8.en.html.

Apt-offline's command syntax is fairly easy to master, and you could create command-templates containing the options which you normally use, and just copy the template and substitute the particulars in to create the command required for each case. Once you become adept at using the commands, you'll find that it's more efficient to use the commands than to use the GUI.
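For example, a template file might contain lines like the following, with the <...> placeholders filled in per case (the option names come from Apt-offline's man-page):

```shell
sudo apt-offline set apt-offline.sig --update                          # refresh the package index
sudo apt-offline set apt-offline.sig --upgrade                         # fetch available upgrades
sudo apt-offline set apt-offline.sig --install-packages <app1>,<app2>  # install apps (comma, no spaces)
```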

Installing Apt-offline on a new installation which requires an update before installing anything

I just recently learned that it's sometimes necessary to update the package index before installing anything on a new hdd/ssd installation of Debian Linux or some derivative. This requirement has apparently existed for quite a while, but I've never run into it before despite having performed many installations.

But what if you need to install Apt-offline in order to update the package index? There is a way, but it drastically limits the selection of types of Linux which can be used, because they have to include two things by default: the section of the package index pertaining to Apt-offline, and an app which can install individual packages. The only two that I've found which satisfy these criteria are Ubuntu Mate and KDE Neon, and I'm looking into others. Since it's possible to copy the package LISTS (the installed form of the package index, contained in /var/lib/apt/lists) from Ubuntu Mate to other types of Ubuntu, this might improve the selection of types of Ubuntu which can satisfy the aforementioned criteria. Debian in general doesn't include the pertinent section ("contrib"). Debian's KDE edition might, because the 8.7.1 KDE version does even though, in my experience, no other version of Debian does.

OK, so what you do is to use a flash-drive installation (of the type of Linux being installed) to generate a download-script for Apt-offline (or Apt-offline GUI, to avoid having to learn set- and install-command syntax, which is somewhat more difficult than falling off a log), by entering "sudo apt install apt-offline," etc. and copying the list of URL's of the files which APT tries to download but can't due to the lack of an internet connection. (To copy the list from the terminal, highlight the list and press Ctrl-Shift-C.) Copy it to a text file named something like DLS-UM1804-AOL.txt (saved with Windows line endings in case you'll end up using it on a Windows PC), edit out everything except the URL's, and use it to download the files, as described in the main article on Apt-offline in this blog.
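As a rough illustration, the editing step can be automated with grep instead of doing it by hand. The sample APT output below is invented, and real output will differ in detail, but the idea is the same: keep only the lines' URL's ending in .deb.

```shell
# Invented sample of what APT prints when it can't download packages:
cat > apt-errors.txt <<'EOF'
Err:1 http://archive.ubuntu.com/ubuntu bionic/universe amd64 apt-offline all 1.8.1-1
  Temporary failure resolving 'archive.ubuntu.com'
E: Failed to fetch http://archive.ubuntu.com/ubuntu/pool/universe/a/apt-offline/apt-offline_1.8.1-1_all.deb
EOF

# Extract just the package URL's, one per line, into the download-script file:
grep -oE 'https?://[^ ]+\.deb' apt-errors.txt > DLS-UM1804-AOL.txt
cat DLS-UM1804-AOL.txt
```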

When you have the packages (.deb files), you would then install them one at a time using an application which can install single packages, which includes Gdebi and various package manager GUI's. Since the installation-order matters, it's a matter of trial and error - if some dependency is missing, the installer will tell you to install it first. For a small number of packages, this is no big deal, but for a larger number, it can be a PITA, which is why you might want to avoid installing Apt-offline's GUI until after you've installed Apt-offline and can use it to install the GUI.

The first thing you would do after installing Apt-offline would be to update the package index, which requires Apt-offline (for security) and begins by selecting the package-index sections which you might need in the foreseeable future, and various other parameters, using an app such as Software & Updates. Then, if you haven't installed Apt-offline's GUI, you'd use a text command such as "sudo apt-offline set apt-offline.sig --update," which creates a fancy download-script known as a signature file, containing the URL's of the new package-index files and various security-related information, located in the Home directory unless you specify a different location in the set-command, although I would leave it as it is.

Then you'd copy the signature file to what I call a "change-name folder," which is a folder named after the change or changes represented by the signature file. You would also create a folder named "pkgs" (an arbitrary name which has worked well for me) in the change-name folder. (The reason for this approach will become clear as you go through the process of making changes to an installation using Apt-offline. You don't want to wade through an explanation, so just trust me and do this.)

Next, you'd copy the aforementioned change-name folder to a flash drive or smartphone, depending on the device used for performing the Apt-offline get-op. To prepare for the get-op, you would at some point move apt-offline.sig and the "pkgs" folder out of the change-name folder and into the get-process directory, which is just whatever directory is specified in the get-command. The get-command would be something like "apt-offline get <path to get-process directory>/apt-offline.sig -d <path to get-process directory>/pkgs," which means "download the files listed in apt-offline.sig in this directory and put them in the folder pkgs in this directory." (Obviously, pkgs should be empty at the beginning of the process, to keep things organized.)

After the get-op, the signature file and pkgs folder would be moved back to the change-name folder, which would be transferred back to the installation being updated. Finally, an install-op would be performed on the pkgs folder, such as by executing the command "sudo apt-offline install pkgs," assuming that the pkgs folder is located in the Home directory. The update process would be completed, and then you could use Apt-offline to install whatever other available software you desire. I'd start with Apt-offline-gui and Synaptic package manager. The corresponding set-command would be "sudo apt-offline set apt-offline.sig --install-packages apt-offline-gui,synaptic." (Note that the apps are separated by just a comma - no spaces. Each app's special name without caps or spaces can be found by using a software-manager app to perform a search for the app, assuming that the relevant section of the package index is installed.)
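To make the folder-shuffling concrete, here's a sketch of the layout using an invented change-name ("update-1812"); the actual get-op is commented out because it requires a connection:

```shell
# Build the change-name folder with its empty pkgs folder inside:
mkdir -p update-1812/pkgs
# Stand-in for the real signature file produced by the set-op:
touch update-1812/apt-offline.sig

# On the internet-connected machine, move the signature file and pkgs
# into the get-process directory (here, the current directory):
mv update-1812/apt-offline.sig update-1812/pkgs .

# The get-op itself would then be:
# apt-offline get apt-offline.sig -d pkgs
```

Afterward, apt-offline.sig and the filled pkgs folder go back into update-1812, which is carried back to the offline installation for the install-op.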

Once you make a couple of changes to an installation using Apt-offline, it will become routine. It's not exactly convenient, which is why you should make all of the possible changes which you'll probably want for the foreseeable future at once. It helps if you have frequent and convenient access to a variety of wifi hotspots with good download speeds.

There might be a local Linux users group (LUG) in your area, in which case you could contact them about lugging your desktop PC to a location where you could connect it to the internet in order to set up the software on your PC, at least initially, with someone with experience on hand to provide guidance, and from then on use Apt-offline to make changes.

Wednesday, March 14, 2018

Using Apt-offline to routinely install apps on live sessions

To conveniently add apps to live sessions of Debian-based versions of Linux, such as to add an app every time you boot up, you can use a very simple shell script (a program consisting of Linux instructions in a text-file with a ".sh" extension, which is fed into "bash," the Linux command-interpreter), combined with the file extracted from an Apt-offline tar.xz or .zip source-file, and a folder containing the packages to be installed. I came up with this trick as a result of wanting to install Mousepad on live sessions of KDE-Neon. These items would be stored in a folder in a fixed location on a flash drive, and to install the app, all that would be necessary would be to plug in the flash drive, give the shell script permission to run as a program, and then open it with bash (details below).

I decided to name the main folder for this purpose "ForAddingAppsToLiveSessions," and in this folder to put a sub-folder for each case (such as for one named Mousepad-Neon for installing Mousepad on KDE Neon), and put the following items in the sub-folder:

1) a copy of the folder extracted from the Apt-offline tar.xz or zip-file which is downloaded from the official Apt-offline site. I used version 1.7 for this example because I had a copy on hand, and the extracted folder in this case is named "apt-offline-master."
2) a folder named Mousepad to serve as a repository for the Mousepad-related packages to be installed 
3) the aforementioned installation-script, which reads as follows:


#!/bin/bash

# Look in apt-offline-master for the setup.py used by the next two commands:
cd /home/neon/ForAddingAppsToLiveSessions/Mousepad-Neon/apt-offline-master

# Build and install Apt-offline itself (requires Python 2.x):
python setup.py build

sudo python setup.py install

# Copy the Mousepad packages into APT's archives folder:
sudo apt-offline install /home/neon/ForAddingAppsToLiveSessions/Mousepad-Neon/Mousepad

# Perform the actual installation:
sudo apt-get install mousepad

[end of script]

The first line starts bash, which then reads the subsequent lines. The "cd" command tells bash to look in the apt-offline-master folder for the setup.py module used in the subsequent commands (to install Apt-offline). ("Python" tells bash to feed setup.py into the installation's Python interpreter, one of which is included with every Linux distribution I've checked. To check it yourself, enter "python." Apt-offline requires Python 2.x, and is not compatible with 3.x, at least as of this writing.) The "apt-offline install" command tells Apt-offline to "install" the packages contained in Mousepad, i.e. to copy them to the /var/cache/apt/archives folder. (If only one package had to be installed in order to install Mousepad, it could be installed by opening it anywhere, such as on a flash drive, as long as it's connected to the target installation, but when multiple packages have to be installed in order to install a particular app, it is my experience that they must be placed in the aforementioned archives folder, which requires "superuser" privileges and is typically inconvenient at best, and very inconvenient in the case of KDE Neon, if this is done by using the file manager.) The "apt-get install..." command performs the actual installation.

To run the shell script, give it permission to run as a program (right-click on the file, click on Properties in the menu which appears, then on the Permissions tab in the window which appears, etc.). Then close the Properties window, right-click on the script and select "Run with," then in the window which appears, enter "bash" in the box at the top of the window, select "Run in terminal" below, and click on OK. Then all you have to do is enter "y" when the package manager asks whether you want to proceed with the installation. (You could add a "-y" to the end of the "sudo apt-get install mousepad" instruction so that the question would be answered automatically in the affirmative, and avoid the need to run it in the terminal, but I prefer to monitor the progress of the process.) When the process is complete, the terminal will close.

Monday, March 5, 2018

Etcher could be the ultimate USB-installer for Linux

Rev 3/9/18 (see Revisions)

My motive for experimenting with manually creating flash drive installations (which turned out to be a failure in many cases, due to boot-process subtleties which are over my head) was that I had tried Etcher (after trying every other major USB-installer that runs on Linux) and concluded that there is no reliable USB-installer that runs on Linux.

However, it appears that the problem was that I was using a live installation to run Etcher, and to run Brasero for purposes of generating ISO's (from bootable DVD's) to use as inputs for Etcher. Since switching to a full installation of Linux Mint for these tasks, Brasero has so far produced error-free ISO's from bootable DVD's, and Etcher has so far handled everything I've thrown at it except Ubuntu 15.10, perhaps due to a problem with the ISO (it was able to install Ubuntu 16.04 and 17.10). However, Startup Disk Creator (SDC) in Ubuntu 17.10 was able to install 15.10, and it boots (it also works on Debian and Ubuntu-derivatives, although it won't recognize ISO's with certain types of problems). I don't know why Etcher failed in this case, but the important thing is that I was still able to perform the installation without using Rufus and Windows.


Notes

Revisions

3/9/18 - Added reference to my experience with attempting to install Ubuntu 15.10.

Friday, March 2, 2018

Manually creating Linux USB flash drive installations

Rev 3/4/18 (see Recent Revisions at end)


In the interest of becoming less dependent upon USB-installers to create the simple flash-drive installations which I prefer for running my "secure" PC (a Zbox without any internal storage, internet connection, or wireless capability, running a nonpersistent flash drive installation of Linux - data is stored on small, easily-concealable encrypted flash drives), I decided to try to create one manually, and found that it's easy to do in some cases, but that in general it requires a lot of knowledge about the subtleties of the boot process and about configuring bootloaders. In cases where it works, however, changing installations might be as easy as deleting the existing version of Linux from the flash drive and replacing it with another.

The basic procedure is to create a FAT32 partition on an MBR/FAT-formatted flash drive (I use the Disks and/or GParted utility for this), then install the Syslinux bootloader-chain (mbr.bin, ldlinux.sys, and ldlinux.c32 - there are many approaches to accomplishing this), extract an ISO and copy the extracted files to the flash drive (or copy the files from a bootable DVD), and create a text file named syslinux.cfg (contents listed below) to tell the bootloader what to do (if it's different than the default behavior), and place it on the flash drive along with everything else.
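Here is a hedged sketch of one way to install the Syslinux chain from a running Linux system. The device names are placeholders, and the paths to mbr.bin and ldlinux.c32 vary between distributions and Syslinux versions, so check your system before trying this:

```shell
# /dev/sdX is the whole flash drive, /dev/sdX1 its FAT32 partition -
# verify both with lsblk before writing anything.
sudo dd if=/usr/lib/syslinux/mbr/mbr.bin of=/dev/sdX bs=440 count=1  # MBR boot code
sudo syslinux --install /dev/sdX1                                    # writes ldlinux.sys
sudo parted /dev/sdX set 1 boot on                                   # mark the partition bootable

# With some Syslinux versions, ldlinux.c32 must also be copied by hand
# to the mounted partition:
# sudo cp /usr/lib/syslinux/modules/bios/ldlinux.c32 /mnt/usb/
```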

I did this, using a Debian DVD which I purchased from OSDisc.com as the source material. The DVD contains some "symbolic links," which are files whose icons include "shortcut"-arrows, which cannot be copied to a FAT partition. So, what you do in such cases is to right-click on them, identify the item to which they link/point, and copy that item to the flash drive. In one case, the symbolic link pointed to the directory in which it was located, i.e. the "." directory. (".." is the parent directory.) So, I just disregarded the link. Some other symbolic links pointed to a certain folder, so I copied that folder.

To do this with an ISO-file, the file has to be extracted first, by right-clicking on it, etc. The extraction process creates a folder named after the ISO and places the ISO's contents in it. The contents (known as the "image"), not the folder, are what gets copied to the flash drive.
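For those who prefer the command line, bsdtar (part of the libarchive package, and often preinstalled) can unpack an ISO without mounting it. This sketch only echoes the commands rather than running them, since a real ISO would be needed; "debian.iso" is a placeholder name:

```shell
# Sketch only: commands are echoed, not executed, since they need a
# real ISO.  bsdtar can unpack an ISO directly, mimicking the
# right-click extraction described above.
iso=debian.iso
target=${iso%.iso}        # folder named after the ISO, e.g. "debian"
plan=$(mktemp)
{
  echo "mkdir $target"
  echo "bsdtar -xf $iso -C $target"
} > "$plan"
cat "$plan"
```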

The configuration file is a text file named syslinux.cfg. To avoid having to learn how to write these configuration files, which requires considerable education, I decided to try one which I found on an installation which I made with Rufus. The contents of this configuration file are as follows:

DEFAULT loadconfig

LABEL loadconfig
  CONFIG /isolinux/isolinux.cfg
  APPEND /isolinux/

This essentially tells the bootloader to use the isolinux.cfg file in the isolinux folder (included on DVDs and in ISO's) as its configuration file. Note that when APPEND follows a CONFIG directive like this, it doesn't add kernel command-line options; instead, its argument becomes the bootloader's new working directory (here /isolinux/), so that the paths in isolinux.cfg resolve correctly.


> The general Linux boot-process using the Syslinux bootloader

The boot-process begins with a few instructions which are built into the PC's firmware and executed whenever the PC is started. The firmware loads mbr.bin from the flash drive's first sector (which isn't part of the file system) and runs it; mbr.bin passes control to ldlinux.sys, which does some setup and passes control to ldlinux.c32, which reads the configuration file and acts accordingly, typically by running a menu program to give the user various boot options. These pass control to the OS kernel, along with an "initial RAM disk" (initrd) - a temporary file system containing such things as drivers for hardware required during the boot process. The kernel mounts the initrd and starts the core of the OS, which does a lot of work to set up shop and finally passes control to the user interface. The kernel has enough intelligence to find what it needs if you just put everything from the DVD or extracted ISO in the root directory of the flash drive.


> Detailed bootloader installation process

To install the Syslinux bootloader-chain onto a flash drive formatted as described above, enter the following commands. (Don't include the quotation-marks which I use to identify the beginning and end of each command. The commands can be copied to a "scratchpad" text-file, edited as necessary to include the specifics, then copied and pasted into the terminal with Ctrl-Shift-V.)

A) "sudo syslinux -i /dev/sdX1" (X is typically b, c, or d, depending on the PC's drive-configuration at the time. The specific letter can be found by using the Disks or GParted utility included with many types of Linux, although typically not in those with the KDE interface - there are exceptions.) This installs ldlinux.sys and ldlinux.c32 in the flash drive's first partition (which in this case is the only one).

B) "cd" to the directory containing the mbr.bin file. This directory is typically /usr/lib/syslinux/mbr, and if so, the command would be "cd /usr/lib/syslinux/mbr" (without quotes).

C) Enter "sudo dd conv=notrunc bs=440 count=1 if=mbr.bin of=/dev/sdX" (without quotes). (This puts mbr.bin, which is the primary bootloader, into the master boot record of the flash drive, where the BIOS will find it when the PC is booted.)

D) Enter "sudo parted /dev/sdX set 1 boot on" (without quotes). (This sets the partition's boot-flag to make it bootable, which can also be done with GParted.)

E) Simply copy the contents of a bootable Linux DVD or an extracted ISO-file (as mentioned previously) to the flash drive partition/directory where ldlinux.sys and ldlinux.c32 are located.

F) Likewise, copy the aforementioned syslinux.cfg file to the flash-drive partition/directory where ldlinux.sys etc. are located.
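Steps A through D, plus the configuration file, can be collected into a single script. The sketch below is a dry run: the device-touching commands are echoed rather than executed, since /dev/sdX is a placeholder that must be replaced with your flash drive's actual letter, but the syslinux.cfg file from above is written out for real (here to a scratch directory):

```shell
# Dry-run sketch of steps A-D plus the config file.  /dev/sdX is a
# placeholder - substitute your flash drive's letter.  Device-touching
# commands are echoed instead of executed.
dev=/dev/sdX
plan=$(mktemp)
{
  echo "sudo syslinux -i ${dev}1"                  # step A: install ldlinux.sys/.c32
  echo "sudo dd conv=notrunc bs=440 count=1 if=/usr/lib/syslinux/mbr/mbr.bin of=$dev"  # steps B-C
  echo "sudo parted $dev set 1 boot on"            # step D: set the boot flag
} > "$plan"
cat "$plan"
# The syslinux.cfg from earlier in the article, written to a scratch dir:
cfgdir=$(mktemp -d)
cat > "$cfgdir/syslinux.cfg" <<'EOF'
DEFAULT loadconfig

LABEL loadconfig
  CONFIG /isolinux/isolinux.cfg
  APPEND /isolinux/
EOF
```

On a real drive, the echoed commands would be run directly and syslinux.cfg would be written to the flash drive's root along with the extracted ISO contents.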

That's basically what I did, and it worked, but it doesn't work in all cases. If you find that it works for certain types of Linux, though, you might be able to change the installation on a flash drive by simply deleting one version of Linux and replacing it with another.


Notes

Recent Revisions

Rev 3/3/18 - Pretty much rewrote the entire thing to improve its clarity.

Rev 3/4/18 - Revised the first paragraph to reflect additional experience with the system described in this article.

Friday, February 9, 2018

Apt-offline: The Amazing Sneakernet Repo-Link for Web-Access-Impaired Debian Linux Installations

This post has been incorporated into the post entitled "Apt-offline: The Ultimate Tool...." in this blog, because of the amount of work required to keep both posts synchronized, which was causing this post to become inconsistent with, and in some cases contradictory to, the "Ultimate Tool" post, which had become my de facto primary article on Apt-offline.

Saturday, January 27, 2018

Some practical suggestions for getting started with Linux

(from AnAptOfflineBlog.blogspot.com)

Rev 1/29/18


In case you're just at the point where you're considering adopting Linux as an OS for your home PC, your first question is probably how to get it and get booted up with it. There are many types of Linux, and if you don't have a fast internet connection for your home PC, you should get a type which is derived from Debian Linux, because this will give you the best selection of software. Although the new "containerized" Linux software systems (Snappy & Flatpak) make it easy to install software on internet-access-impaired installations, there are quite a few popular apps which have yet to be made available in these new formats. Of the two major older formats, the Debian format is the better one for internet-access-impaired installations, because its software manager has an accessory known as Apt-offline which allows the software manager to use Android devices and some library PC's as its connections to the installation's corresponding online software repository, which is vast. The main alternative to Debian is the "RPM" system, and as far as I know there is no convenient means to install software on internet-access-impaired RPM-based systems.

It's hard to go wrong when choosing a Debian derivative to use as an hdd/ssd-installation on an internet-access-impaired PC, because after decades of development they're all very slick, although I'd avoid KDE Neon, which appears to have been deliberately designed to require a direct high-speed internet connection. There might be others like this, but not that I've used. Linux Mint is also not the best choice, due to the fact that downloading software requires accessing two servers, and when you don't have a direct internet connection, it's trickier to find a combination of two servers that provides good speed than it is to find a single server which does.

To create an hdd/ssd-installation, you need a so-called "bootable" or "live" DVD-installation [1], which can be purchased from OSDisc.com or created by downloading the ISO-file (a typically 1-2 GB archive file which contains the OS and a bunch of apps and utilities) from the website for that type of Linux, and using a DVD-burner program to copy the ISO's "image" (contents) to a blank DVD (in Windows, right click on the ISO, then select "open with," and then select the DVD image-writer). If there is a choice between a "desktop" ISO and other types (such as server), the desktop-ISO is the one you would want for your PC. There might also be a choice between i386 ("32-bit") and AMD-64 ("64-bit") CPU instruction-sets, in which case you'd want the AMD-64 version unless your PC is ancient.

To create a flash-drive installation, you'll need the ISO corresponding to your chosen type of Linux, a PC running a "USB-installer" program (which writes the ISO's "image"/contents to flash drives in a special format), and a flash drive (I use 4GB Kootion flash drives for this purpose, and for flexibility, I don't use them for anything else at the same time). There are usb-installers which run on Windows (my favorite is Rufus, which is covered in more detail below), and various types of usb-installers are included with various types of Linux. If you're starting from scratch, you could boot a PC in Linux with a bootable DVD and use one of the resulting "live" session's usb-installers. ISO's can be downloaded as mentioned previously, or created from a bootable DVD using the Windows program CD Burner XP or the Linux program Brasero. The catch is that you can't boot from a bootable Linux DVD and use Brasero to convert the same DVD to an ISO - you'd have to use another DVD and DVD-drive.
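As an alternative to Brasero or CD Burner XP (this is my addition, not a method from the article), dd can image a disc directly on Linux. On real hardware the input would be the DVD device, typically /dev/sr0; in the runnable sketch below, an ordinary file stands in for the drive so the example works anywhere:

```shell
# Sketch: dd can image a disc to an ISO file (an alternative to
# Brasero).  On real hardware the input would be the DVD device,
# e.g.: dd if=/dev/sr0 of=linux.iso bs=2048
# Here a plain file stands in for /dev/sr0 so the example can run.
src=$(mktemp)                     # stand-in for the optical device
head -c 8192 /dev/urandom > "$src"
dd if="$src" of=copy.iso bs=2048 status=none
```

The same caveat applies as with Brasero: you can't image the DVD you booted from, so a second drive or a second copy of the disc is needed.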

To get started from scratch, you could also go to your local library, download the ISO of interest, and make a copy so you don't have to download it again. To create a bootable DVD, put a blank DVD in the drive, right-click on the ISO, select "open with," and then the DVD-image-burner option. Then you could create a couple of flash-drive installations from the ISO by downloading Rufus and running it (no installation required). So, you'd need to take a flash drive for storing a copy of the ISO, a few blank DVD-ROM's in cases or sleeves (one to use as a bootable DVD and a couple of spares in case something goes wrong), and a couple of 2GB and 4GB flash drives to use as bootable flash drives. 2GB drives would probably be sufficient, but if you choose an ISO larger than 2GB, you'll need the 4GB units. When using Rufus, you have to use the entire flash drive for the installation - you won't be able to use the flash drive to store anything besides the installation.

One of the more elaborate usb-installers is Startup Disk Creator (SDC) in Ubuntu. You can use it to create "nonpersistent" flash-drive installations which don't retain any data or settings upon shut-down, or "persistent" installations, which save settings and data - the latter of which should be backed up if it's important. I've used SDC to install Ubuntu on the 2nd partition of a flash drive with a FAT 1st partition (so that Windows and Linux could access it), a FAT 2nd partition for the installation, and a LUKS (Linux encryption format) 3rd partition. (Not all PC's can boot from the 2nd partition, so it had limited portability.) I did this just to prove to myself that it could be done, not to use it, and I'm not sure that SDC can still create installations on partitions other than the 1st one. (There seems to be a trend toward simplicity, apparently because few people have any use for persistent or elaborate flash-drive installations.) At one time, it was possible to boot from a DVD and use the DVD as the source of the image to be written to the flash drive, but no longer.

To create what I call a "secure" PC, I use a nonpersistent installation for booting a barebones mini-PC with a couple of GB of RAM and no internal storage or wireless capability (remove any wireless modules, and use a wired-only keyboard), and I save data on encrypted flash drives (and back up important data religiously on multiple encrypted backup drives). So, when the PC is shut down, it retains no data except on small encrypted flash drives which are easy to hide. Encrypted flash drives, or rather encrypted partitions on flash drives, are created by first using the GParted utility to create a FAT partition on a flash drive, and then the Disks program to reformat the FAT partition as a LUKS partition. GParted and Disks are commonly included in Linux ISO's. Big Brother can't sneak a peek at the data on such a PC, period, without finding the encrypted flash drives and decrypting them, or forcing you to divulge the password. It should be obvious by now that software (other than AES-grade encryption software) cannot be trusted to isolate data from the internet. The only way to be sure is to isolate it physically and electromagnetically, and otherwise by encrypting it.
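The same LUKS partition that the GParted + Disks procedure produces can also be created with the cryptsetup command-line tool. This is a dry-run sketch (the commands are echoed, not executed, since they need a real flash drive and root privileges); /dev/sdX1 is a placeholder for the drive's partition:

```shell
# Sketch only (commands echoed, not run): a command-line alternative
# to the GParted + Disks method for making an encrypted partition.
# /dev/sdX1 is a placeholder for the flash drive's partition.
part=/dev/sdX1
plan=$(mktemp)
{
  echo "sudo cryptsetup luksFormat $part"    # create the LUKS container (prompts for passphrase)
  echo "sudo cryptsetup open $part secure"   # unlock it as /dev/mapper/secure
  echo "sudo mkfs.ext4 /dev/mapper/secure"   # put a file system inside
  echo "sudo cryptsetup close secure"        # lock it again
} > "$plan"
cat "$plan"
```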

I like Kootion brand no-nonsense 4GB flash drives (plain metal body, a metal connector with a simple cap, and an LED activity-indicator). Although the price is low, they use low-quality chips and the failure rate leaves something to be desired, so that the long-term cost is not so low, especially if your only copy of some important data ends up being lost due to chip failure. (If SanDisk offered a similar no-nonsense style at twice the price, I'd buy them.) They're not waterproof, but they could be stored in so-called "poly vials" to keep them dry even when submerged in water. I've had good luck with booting Linux from the red ones, but the blue ones have given me problems with booting certain types of Linux installed with certain usb-installers, such as long pauses in the boot-process, or complete boot-failures, which are typically remedied by rebooting. Clearly, the blue ones are different under the skin from the red ones (at least the ones I got are different). Another reason I like Kootion drives is that they can be labeled fairly easily. For example, I write on them with a Sharpie or print a label, rub a glue-stick on the back and apply it to the drive, and wrap a piece of wide transparent tape crossways around the drive, then trim it with scissors. The result is a durable label which can be removed fairly easily (with alcohol in the case of labels created with a Sharpie). To create a "window" under which labels can be inserted, tape a piece of regular-width transparent tape, sticky-side up, onto a counter, and apply a piece of wide transparent tape crossways on top of it. Then apply the wide transparent tape crossways onto the flash drive so that the regular-width transparent tape forms a "window" on one side of the drive. To hold the cap on the drive when carrying it around in a pocket, you can use a piece of tape, or insert the drive into a 1.5"-long piece of 5/8" ID heater hose.

My other "offline" PC is a full-sized Acer Windows PC with an extra hdd for Linux (it's risky to mix Windows and Linux on the same drive) and a rotary switch on back to control power to the drives, so that I can select the Windows drive to boot in Windows, the Linux drive to boot in Linux, or neither drive so that I can boot from a flash-drive installation. (Just don't use the switch while the PC is running.) I use the Windows installation mainly to create flash-drive installations of Linux, using the Rufus USB-installer, a no-nonsense installer which creates nonpersistent installations and has never given me any problems, other than perhaps when installing some type of Linux which is newer than the Rufus revision. In such cases, Rufus might need a particular revision of the "Linux system" files (ldlinux.sys and ldlinux.bss), which are available from the Rufus website's /files page, and can be downloaded and stored in a folder along with Rufus (just mimic the website's organization-scheme), so that when Rufus needs these files, it will search the folder and sub-folders and find the files it needs. I use the Linux installation mainly as a PVR, with the HD Homerun tuner (the required software is contained in the repository - install Synaptic Package Manager and use it to search for HDHomerun to find all HDHomerun-related software).

There's nothing like having an internet connection at home to do research, but at least with Apt-offline, you can get by with a slow connection and use Apt-offline to make changes to your Linux installation. A slow connection could also be used for adding so-called PPA's to your Linux installation's selection of software-sources, which can't be done without an internet connection due to security requirements. But if the software available from a PPA is popular, it's typically included in the repository, although perhaps not the most recent revisions.
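For reference, the Apt-offline cycle mentioned above boils down to three commands: "set" on the offline PC to write a signature file, "get" on the online machine to download into a folder, and "install" back on the offline PC. This dry-run sketch echoes the commands rather than running them (they need a Debian-based system and repository access); the file and folder names match the ones used later in this blog:

```shell
# Sketch only (commands echoed, not run): the three-step Apt-offline
# cycle.  apt-offline.sig and pkgs are the names used in this blog.
plan=$(mktemp)
{
  echo "sudo apt-offline set apt-offline.sig --update --upgrade"  # on the offline PC
  echo "apt-offline get apt-offline.sig -d pkgs"                  # on the online machine
  echo "sudo apt-offline install pkgs"                            # back on the offline PC
} > "$plan"
cat "$plan"
```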

For more information (some of dubious usefulness) on getting started with Linux, see the blog toggwu.blogspot.com, of which this blog is a branch.


Notes

[1] Flash drive installations can be used for creating hdd/ssd-installations, but this requires changing the PC's boot-order to try booting from the USB ports first, which I've found to be very difficult if not impossible in some cases, at least if the hdd/ssd already contains an OS. It's easier to boot from a DVD-installation under these circumstances, although it takes a long time.

Recent Revisions

1/29/18 - Rewrote much of the article.

Wednesday, January 10, 2018

Apt-offline get-op performed on an Android device

 The images in this folder consist of frame-grabs from a screen-recording of an Apt-offline get-process performed on an Android device via Apt-offline installed on GnuRoot Debian, an Android app.
Sorry about the messy layout, which was forced on me by Blogger.

A) Shows GnuRoot's home folder containing the Apt-offline signature file apt-offline.sig (basically the list of items to download), a download-destination folder named "pkgs," and a "change name" folder ("AOL-CNF...") where these two items are normally stored, to identify them and isolate them from any other signature-file/destination-folder pairs. All such pairs are named alike, and each is moved into the home folder when it's that pair's turn to be used in a get-op, which allows the use of a fixed get-command for every get-op. The change-name folder, signature-file, and destination folder were created on the PC and "sideloaded" to the Android device using a regular USB-to-micro-USB adapter cable. No special software is required for this transfer.

B) Shows that the download-destination folder was empty before the get-op.

C) Shows the cd /home command being copied from Clipper. It was placed there by creating a text file containing the commands on the PC, side-loading it (as described in step A) onto the Android device, opening it with a text-editor app such as QuickEdit, selecting each command, and copying it.

D & E) Shows the cd /home command being pasted into the command-line (after step C, press on command line until menu appears, and select Paste).

F) Shows get-command being copied from Clipper.

G) Shows get-command being pasted into command line.

H) Shows that get-op has commenced.

I) Shows large file being downloaded.

J) Shows that get-op has been completed.

K) Shows downloaded files in destination-folder "pkgs."
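The folder scheme from step A can be sketched in runnable form: each job lives in its own change-name folder, and when it's that job's turn, its signature file and destination folder are moved into the home directory so the identical get-command works every time. The names below mirror the ones in the frame-grabs; "AOL-CNF-job1" and the temp directory standing in for GnuRoot's /home are my illustrative inventions:

```shell
# Runnable sketch of the "fixed get-command" folder scheme.  A temp
# directory stands in for GnuRoot's /home; AOL-CNF-job1 is a made-up
# change-name folder.
home=$(mktemp -d)
mkdir -p "$home/AOL-CNF-job1/pkgs"
touch "$home/AOL-CNF-job1/apt-offline.sig"
# When it's this job's turn, move its pair into the home folder:
mv "$home/AOL-CNF-job1/apt-offline.sig" "$home/AOL-CNF-job1/pkgs" "$home"
# The fixed get-command is then always the same (echoed only here):
echo "apt-offline get apt-offline.sig -d pkgs"
```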