Syncing an Entire Home Directory with Nextcloud

One of the many things that I enjoy about using Nextcloud is having a trusted space for all of my files to sync across devices. Regardless of which computer I'm using, there's a good chance that everything I want to be present will be present, including all of my bash aliases. Most people that I've spoken to who use Nextcloud or Dropbox to keep files in sync across devices tend to have a dedicated directory just for the application. What I do with Nextcloud is a little different in that I have it sync my entire home directory, including the hidden files and folders. This ensures SSH keys, preferences, and aliases remain in sync, and changing a setting on one machine changes it in all locations. It's a lovely little setup¹. That said, it's not the easiest thing to get going thanks to the enforced rules within apparmor. I'll show you how to get around the problem so that your account's home directory can be completely synchronized with Nextcloud.

What you'll need:

  • an up-to-date installation of Ubuntu
  • Internet access
  • your Nextcloud credentials

How long it'll take:

  • 2 minutes, give or take leap seconds

The Nextcloud client is a snap and, as such, must operate within a certain set of rules enforced by apparmor. Anyone who tries to sync their home account while apparmor is enabled will get a lovely "unable to initialize sync journal" message. This is because apparmor will block any snap from writing to /home/{account}. Some people online will suggest disabling apparmor, but this defeats the purpose and utility of apparmor itself. It's better to leave it in place and change how the Nextcloud client snap is used on the system.

Here's how to get around the problem:

sudo snap remove nextcloud-client
sudo snap install nextcloud-client --devmode

By using the --devmode option, the snap will be able to operate completely outside the sandbox enforced by apparmor. Now it will be possible to connect to your Nextcloud instance, enter your credentials, and sync your home directory. One thing you might want to check in advance is the name of the Pictures directory in Nextcloud. By default, it's Photos, while the one in Ubuntu is Pictures. If you'd like both to be the same, I've found it easier to simply set Nextcloud's folder to "Pictures". Once all of the settings are configured to your liking, the sync will begin. Depending on how much data you choose to keep on the client, it may be a good idea to use a wired network connection.
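If you want to confirm that the client really did land outside the sandbox, the confinement level shows up in snap list. Here's a small sketch of the check, run against a sample output line (the version and revision numbers are invented for illustration; on a real system you'd capture the command's output instead):

```shell
# snap_line stands in for: snap list nextcloud-client | tail -n 1
# (version and revision values below are made up)
snap_line="nextcloud-client  3.1.1  150  latest/stable  nextcloud  devmode"

# A strictly-confined snap has no "devmode" note in the last column
case "$snap_line" in
  *devmode*) confinement="devmode" ;;
  *)         confinement="strict"  ;;
esac
echo "nextcloud-client confinement: $confinement"
```

If the last field doesn't read devmode, the reinstall didn't take and the sync journal error will come right back.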

And that's all there is to it!

  1. I understand that there are reasons why some people would not suggest having their entire home directory in sync with a tool like Nextcloud or Dropbox, but this post isn't about why someone shouldn't do it.

Connecting to a L2TP VPN Network with 17.10

Soon after installing 17.10 on my notebook, I found myself struggling to connect to the corporate VPN in order to do some work from home. While this could be a great excuse to use my home time for things like resting or spending time with the family, being able to connect to servers locked behind firewalls in order to respond to problems is something that comes with the job. Unfortunately, while Artful Aardvark does make it easy to connect to a VPN, it does not make it easy when the VPN in question is L2TP.

Luckily, the solution is pretty simple, and no reboots are required.

First, we need to add a PPA to our system and update apt to get the list of available packages:

sudo add-apt-repository ppa:nm-l2tp/network-manager-l2tp
sudo apt-get update

Then we install two packages:

sudo apt install network-manager-l2tp network-manager-l2tp-gnome

And that's all. Now we'll see the option to add an L2TP VPN to our settings so that we can get on with our lives.

L2TP Configuration in Ubuntu 17.10

There are some concerns that L2TP has been all but abandoned, as it's not the most secure form of VPN out there, but this doesn't mean we can't make it work for the sake of our employer.

Ars Likes Aardvark ... and So Do I

ArsTechnica just recently published a short review of Ubuntu Desktop 17.10, otherwise known as Artful Aardvark, and Scott sounded genuinely happy with how the operating system feels with the GNOME desktop replacing Unity. I've been using the system on my 2011-era MacBook Air and it has been a serious breath of fresh air … no pun intended. While memory usage is up about 7% give or take, performance is way better than it was with stock Ubuntu 16.04 LTS and noticeably nicer than one can expect with the most recent release of macOS (that said, the 2011-era MacBook Air does seem to run macOS High Sierra better with its 4GB of RAM than my 2015-era MacBook Pro does with its faster processors and 16GB of RAM). All in all, this most recent update to Canonical's human-friendly operating system makes me smile the more I dig into it … and that's a wonderful feeling.

Ubuntu Logo on a Grille

One interesting idea Scott posited in his article on Ars was the possibility that Canonical may be positioning Ubuntu Desktop to be given over to the community for further development at some near future date. This would certainly make sense given the company's recent decision to move towards an IPO and focus its efforts on revenue-generating activities such as Ubuntu Server and enterprise support. Mark Shuttleworth has shown that he's not afraid to wind down projects that he once had a vested interest in, such as Ubuntu Mobile, Mir, Convergence and, obviously, Unity 8. By passing the baton on to the very people who champion Ubuntu, Canonical can focus on what it does best while passionate people focus on what they do best. This has worked to a certain degree for other Linux distributions and, unlike many other communities, the Ubuntu crowd is incredibly welcoming of newcomers who want to contribute.

I certainly share Scott Gilbertson's optimism for the future of the platform and look forward to seeing what comes in the 18.04 LTS release.

Ditching /tmp in MySQL

Over the last 20-odd years I've been developing software, very rarely have I had an excess of computing resources. More often than not, I would be asking a simple machine to do far more than the average developer might, and the machine would do the best it could with the resources available, processor fans spinning at full speed the whole time. Since fully loading the Lenovo notebook from work, though, I've yet to run into many instances where I wished the unit were more powerful. A 4th Generation i7 with 32GB of RAM and a terabyte of SSD can make for quite the processing beast, after all. But two decades of being very careful with resources has taught me to consider every setting and test every change … until now.

Last week I was having an issue with some long-running queries in MySQL. Long story short, as part of a project I'm working on at the day job, I need to take roughly 9-million records from two different CMSes and merge them into consistent records in the new system that can then be manipulated by colleagues. These 9 million records are essentially summaries of what was in the bigger CMSes and, when split, would result in close to 23-million records spread across a dozen or so data tables. Nothing crazy at all, given that import processes tend to be very processor and IO intensive. That said, some of the queries that were running were being read from storage, then written back to storage in /tmp, then read back into memory, then written back to /tmp, before finally being output as required. These reads and writes to /tmp struck me as odd, given that the machine does have 32GB of RAM, and rarely did the system use more than 3.8GB for MySQL itself.

I wanted to ditch /tmp.

Luckily, Linux is quite versatile when it comes to situations like this, and I decided the best way to reduce the need to write to storage would be to make a tmpfs directory and have MySQL use that instead. A tmpfs directory will use the system's RAM first, then write to storage when necessary. And, while I am fortunate enough to have a pair of SSDs in place offering write speeds in excess of 800MB/sec, RAM is faster still. More than this, there is typically 10GB that sits completely unused throughout the day, so why not put some of it to work?

First, I created a new tmpfs directory and applied the requisite permissions:

sudo mkdir -p /var/mysqltmp
sudo chmod -R 1755 /var/mysqltmp
sudo chown -R mysql:mysql /var/mysqltmp

Next, I determined the IDs for MySQL on the machine:

id mysql

Which resulted in:

uid=107(mysql) gid=111(mysql) groups=111(mysql)

From here, it was time to edit the fstab file so that the directory would be properly mounted as tmpfs after a reboot. Add these lines to your /etc/fstab file, changing the uid and gid values according to your own MySQL ids:

tmpfs /var/mysqltmp tmpfs rw,uid=107,gid=111,size=8G,nr_inodes=10k,mode=1700 0 0

Also keep in mind that this line will make the directory 8GB in size. Not everyone will need or want a directory of this size, so do set it to the most appropriate value for your MySQL installation. My server¹ at home, for example, is set to 512M, because the machine has just 4GB of RAM installed.
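If you'd rather derive the figure than guess at it, one rough sizing rule is a quarter of physical RAM. Here's a sketch with the MemTotal value hard-coded to a sample 32GB reading (on a live system you'd read it from /proc/meminfo, as the comment shows):

```shell
# Sample MemTotal in kB (roughly 32GB); on a real machine you could use:
#   mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
mem_kb=32883500

# A quarter of RAM, converted to megabytes for the fstab size= option
tmpfs_mb=$(( mem_kb / 1024 / 4 ))
echo "size=${tmpfs_mb}M"
```

Whatever value you land on, it's only a ceiling; tmpfs doesn't reserve the memory up front.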

Now let's mount this directory:

sudo mount -a

If you're running Ubuntu 16.04 LTS or thereabouts, you'll need to update the apparmor service next. If you do not do this, MySQL will never be able to access the tmpfs directory you just set up.

Edit the /etc/apparmor.d/usr.sbin.mysqld file and add the following lines:

# Allow temp dir access
/var/mysqltmp r,
/var/mysqltmp/** rwk,

Then restart apparmor with sudo service apparmor restart. Now it's time to edit the MySQL configuration file.

In your /etc/mysql/mysql.conf.d/mysql.cnf file², look for the following line:

tmpdir = /tmp

and change it to:

tmpdir = /var/mysqltmp

And that's all there is to it. Restart mysql with sudo service mysql restart, and you'll be off to the races. Temporary tables and whatnot will be written into memory and instantly destroyed when not needed, and your IO will drop significantly during those more oppressive queries. Of course, if you specify a tmpfs directory that is too large to fit in memory, Linux is smart enough to start using the storage system in order to complete the tasks. Should this happen, you'll see that the queries are no faster than before.

Again, I understand that most people will not have the luxury of having a beefy Quad-Core i7 with 32GB of RAM available to them. Heck, I don't have anything approaching this myself, and I've worked with databases measuring in the hundreds of gigabytes for years. That said, there's a lot that our systems are capable of if we just know how to set them up right.

  1. I call it a server, but it's really just a re-purposed computer with a bunch of external drives connected to it because I can't quite afford a "real" server at home.

  2. Some people say you should put it in a different file. If you're one of these people, that's cool. Put it wherever you'd like, so long as that config file is read after the file specifying the tmpdir variable.

A Month on Ubuntu

A little more than a month ago I stated that I was running about 60% Ubuntu. Since then, I've made the mistake of accidentally blowing away the OS X installation on my MacBook and taking Ubuntu full time. What have I learned in the intervening weeks of using Canonical's OS full time? It's a powerful platform with a lot of potential that really could be used by anybody still harbouring a Windows XP, Vista, 7, or 8 installation at home. Heck, considering how most of the corporate tools are designed at the day job, I'd go so far as to say that my employer could completely eliminate Windows from 90% of the workforce and nobody would need any sort of retraining in order to be just as efficient as before.

Ubuntu is a really nice platform.

This isn't to say that Ubuntu is perfect, though. There are still a few areas that need some attention, particularly when it comes to drivers and core applications, but this is something that can continue to be tackled as more people give Linux a try. One example of the driver problems I've run into is a lack of Bluetooth on the MacBook Pro. There are a few resources online that talk about how to resolve the issue, but I've yet to get any to work. That said, this is something that will be resolved with time.

There are a number of tools that I do occasionally miss from OS X. Pixelmator, 1Password, Hindenburg, Coda 2, and Rested are five that immediately spring to mind. Aside from 1Password, I doubt any of these will see an Ubuntu-friendly port anytime soon. This does leave the market open for people who would like to take on the challenge of making similar full-featured applications. But there's the problem …

With drivers, I expect something to happen in the future. With software, I hope something will happen in the future. Ubuntu is a great, human-friendly version of Linux with a great deal of potential, but a small developer base. In order for Ubuntu to continue to grow, it needs to attract more developers. In order for developers to target Ubuntu, there needs to be more people. In order for there to be more people, there needs to be more awareness. What makes Ubuntu better than Windows or OS X for the average person? I can think of a number of answers that appeal to me, but I have yet to convince any "normal" person to give Ubuntu a try when they look over my shoulder and ask about the software I'm using. People hear "Linux" and immediately stop listening, as though the mere mention of the word is a trigger to ignore everything after.

I still strongly feel that Ubuntu has a great deal of potential and can fulfill 99% of people's home requirements with a basic installation. We just need to get people to keep listening after the word "Linux" or, better yet, describe the operating system as a viable alternative to the commercial operating systems in the world. My mother doesn't care what operating system she's using so long as there's a browser for Facebook and YouTube. I'm willing to bet a lot of people who don't rely on very specific software for very specific tasks might be in a similar boat.

The Notebook I'm Glad I Didn't Buy

A few months ago I was looking at buying a new notebook computer to replace my MacBook Air and become the centre of my creative and entertainment endeavours. I spent months researching the different options available, reading reviews all over the web and going to big-box stores to get some hands-on time with similar units. Whenever I buy a computer, I first look at two key elements: the screen quality and the keyboard. The screen, because that's what we look at for the vast majority of the time we spend in front of the device, and the keyboard because that's what we touch when interacting with the system. Almost everything else is secondary as CPUs are generally fast enough for what most of us want to do, SSDs are often fast enough to keep up with expectations, and just about every other part of today's computers can do what we ask of them. Well … except for the touchpads, of course. Synaptics has had two decades to get their software right, and it's still a pain in the butt to type on most notebooks without first disabling the touchpad entirely.

During my research, I paid close attention to the expandability and repairability of the notebooks. Apple devices look and feel great, but they're neither upgradeable nor easy to fix without taking the device to Apple or one of their authorized repair centres. I've never been comfortable with the idea of letting another person have physical access to my computers when I'm in the same room, let alone trusting someone who will open the machine while out of sight. I have a lot of digital resources on my computer that fall under strict NDAs, and I won't give strangers the benefit of the doubt when it comes to leaving my storage media alone. What this meant is that Lenovo, with its history of easy hardware upgrades, quickly stood out as a manufacturer to stay close to. The units that caught my eye were the X250, the T450S, and the T550. All of these machines had some clear advantages and a respectable amount of upgrade room.

In addition to looking at Lenovos, I looked at an interesting device from Hewlett-Packard, as they were my preferred hardware vendor a decade ago. The Spectre 13 x360 Limited Edition is basically HP's version of the MacBook Air. The screen is wonderful. The keyboard is … a keyboard. Upgrades and repairs, though, are impossible. Everything is soldered to the motherboard, including the storage device. If I wanted to upgrade to a 1TB SSD in a year or two when the prices came down, I'd be out of luck.

After literally months of deliberations and a handful of podcast episodes where my friend Keita and I discussed moving from Apple to Linux, I decided to buy a MacBook Pro. The extra power over the Air was welcome and, as just about every 13" notebook maxes out at 16GB RAM, I opted to just get a unit that came fully equipped. I can't upgrade the unit, nor will I be able to easily repair components as they fail with age, but the keyboard and screen are wonderful to work with in Ubuntu MATE 16.04.

So with all this said, what notebook am I glad I didn't buy? A Lenovo W541: the machine my employer supplied for development, which I pimped out with 32GB RAM and a terabyte of SSD storage. I'm glad I didn't buy one of these because the screen is awful and the keyboard is frustratingly laid out.

The Screen

First, the screen. I've been using this notebook for about 200 hours based on the information reported by my employer's time tracking tools. This screen is a 15" Full HD (1920x1080) screen. The resolution, while not quite as lovely as a 3K or 4K screen might offer, does show characters with smooth-looking lines and very little pixellation. Unfortunately, the colours are wrong. I just cannot for the life of me adjust the screen to show things without an annoying blue tinge. Yellows look green. Reds look faded. Blacks are gray. As someone who is red-green colour blind, I'm disgusted that stuff like this can even leave the factory. I took my unit to a big-box retailer to put it next to some of the other ThinkPads I had considered buying and found that this screen on the W541 is actually better at showing colour than some of the display model screens.

This is just downright unacceptable.

The Keyboard

I should have called this section "Wyould You Like Tpyos With That?", because the keys on this unit do not lend themselves to bouncing back at the same speed as my fingers. I have actually had to slow down my typing speed when I use the Lenovo to get around the fact that the keyboard is just not good enough. I find it's the keys on the left side that tend to bounce back slower and, as a result, cause problems. On the right side, some keys will not register unless you make sure the key is firmly pressed. The [I], [L], and [;] keys are the worst, and this is particularly problematic when trying to write code, query a database, type a blog post, chat online, search the web, or make notes. So long as I don't want to do any of these things, the keyboard is just fine.

An Afterthought

Look at this. Who thought that a "Print Screen" button would be ideal between Control and Alt? Does nobody test their layouts before sending them to the factory?

These first world problems are, of course, downright unacceptable.

Making Do

I've recently put in a request with my bosses for a very specific external monitor, a 24" Dell P2415Q. It has a 4K (3840 x 2160) resolution and better colour matching than the Lenovo. It's not as expensive as the UltraSharp models, so it might be possible to get the unit without blowing this year's hardware budget. As for keyboards, an external is becoming more and more necessary. The amount of time that's wasted by slowing down and fixing typos is just stupid. I can completely understand why a lot of professionals who use notebooks at the day job hook them up to external devices and use them as desktops.

The Lenovo isn't all bad, of course. It's still the most powerful and capable unit I've ever had the luxury to use. With a quad-core i7 under the hood paired with 32GB RAM and 1TB of SATA3 SSD, there's no task (that I need to do) that is too taxing for the unit. Would I buy one of these to use at home? No. I'd probably just build a custom desktop and call it a day.

Please tell me that there are some good, non-Apple notebooks with beautiful screens and excellent keyboards sold overseas. I've looked at hundreds of models here in Japan over the years, and I've been disappointed every time.

Glances Is top On Steroids

While listening to this week's Ubuntu Podcast, I heard about an interesting little package for the command line that shows a great deal more information than top and htop called glances. I decided to give this a try and, my goodness, it really is top on steroids!

Glances on Ubuntu

Installation can be done with a single command, allowing you to see a great deal of information about your system. I'm particularly impressed with the DISK I/O view and the NETWORK section above it. People who prefer the terminal over a GUI will undoubtedly enjoy watching this instead of System Monitor.
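For the curious, the single command on Ubuntu is just an apt install; glances is also available via pip and as a snap, though I haven't compared those builds. Echoed as a dry run here so nothing is installed by accident:

```shell
# The package in the Ubuntu archive is simply named "glances".
# Drop the echo/variable indirection to actually install it.
install_cmd="sudo apt install glances"
echo "$install_cmd"
```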

Keeping a Hardware MAC Address for VirtualBox

When a computer connects to a TCP/IP¹ network, DHCP (the Dynamic Host Configuration Protocol) is typically used to dynamically assign an IP address to the network device. In the years before DHCP every computer needed to have a manually-assigned network configuration that included information such as an IP address, a subnet mask, network gateway, DNS server addresses, and so on. Most of us haven't had to think about these numbers in well over a decade, but there are times when it's very important that a given computer is always assigned the same IP address.

At my office, where I am on a team to develop a new suite of software for the company, direct access to various development resources is restricted to a handful of computers — mine being one of them. My machine is always assigned the same IP address so long as I'm working at a handful of "approved locations". This is accomplished by sending the MAC address (the media access control address) to the DHCP server at the time of connection. If the MAC address matches a rule, then a specific IP address is given to that network device.
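On the server side, such a rule is often a single line of configuration. As an illustration only (I have no idea what my office actually runs), a MAC-based reservation in dnsmasq looks like this, with made-up addresses:

```ini
# dnsmasq: always hand 192.168.1.50 to the device with this MAC
dhcp-host=54:ee:75:80:49:c4,192.168.1.50
```

Other DHCP servers express the same idea with slightly different syntax, but the mechanism is identical: match the MAC, assign the address.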

My Windows VM depends on this IP being assigned to it as almost all of my development is done from within this virtual container. The problem is that, because I replaced the IT-supplied Windows installation with Ubuntu, the reserved IP address is being given to Ubuntu rather than Windows. Ubuntu does not need direct access to the development resources, so a dynamic IP would be fine. So how can we make sure the reserved address is always given to the Windows VM?

Luckily, it's really very simple.

First, let's fire up the terminal and get the name of the network device with the DHCP reservation.

ifconfig

Listing the Network Adapters

In my case, the wired network adapter is called enp0s25. Yours might be eth0 if it's wired, wlan0 if it's wireless, or something completely different, so be sure to get the proper name. The MAC address for the device can be seen in the ifconfig listing as HWaddr (Hardware Address). That's the number we want to use for the VirtualBox VM.
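If you'd rather grab the value programmatically than eyeball it, the HWaddr field can be pulled out with grep and awk. Here's a sketch against a sample line (the adapter name and address are illustrative; substitute the real output of ifconfig on your machine):

```shell
# sample_output stands in for: ifconfig enp0s25 | grep HWaddr
sample_output="enp0s25   Link encap:Ethernet  HWaddr 54:ee:75:80:49:c4"

# Match the HWaddr label plus the colon-separated hex that follows it,
# then keep only the address itself
mac=$(echo "$sample_output" | grep -o 'HWaddr [0-9a-fA-F:]*' | awk '{print $2}')
echo "$mac"
```

Newer releases that ship the ip tool instead label the same value link/ether in ip link show output.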

Open VirtualBox, select the VM you want to have the custom MAC address and click the "Settings" button.

VirtualBox Network Settings

Expand the "Advanced" tab and set the MAC Address to the same 12-character value as you found in ifconfig. The colons are unnecessary here.

Now that VirtualBox is configured, we need to change the "MAC Address" value on Ubuntu. No two machines on a network can have the same IP address or MAC address. For this reason, we'll make a quick change to the reported address from Ubuntu by incrementing the MAC address by one.

In the terminal:

sudo ifconfig {network device} down
sudo ifconfig {network device} hw ether xx:xx:xx:xx:xx:xx
sudo ifconfig {network device} up

For me, this looked like:

sudo ifconfig enp0s25 down
sudo ifconfig enp0s25 hw ether 54:ee:75:80:49:c5
sudo ifconfig enp0s25 up

One thing to keep in mind with the MAC address value is that hexadecimal values are used. This means we can only use the numbers 0 to 9 and the characters a to f. Anything outside of this range will result in an error.
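The increment-by-one trick itself can be sketched in a few lines of shell: bump the last octet and sanity-check that the result is still valid colon-separated hex. This is a simplification, as it doesn't handle an ff octet rolling over into the next one:

```shell
mac="54:ee:75:80:49:c4"
prefix=${mac%:*}                # everything before the last colon
last=${mac##*:}                 # the last octet, e.g. c4
new_mac="$prefix:$(printf '%02x' $(( 0x$last + 1 )))"

# Only 0-9 and a-f are allowed: two hex digits per octet, six octets
if echo "$new_mac" | grep -Eq '^([0-9a-f]{2}:){5}[0-9a-f]{2}$'; then
  valid=yes
else
  valid=no
fi
echo "$new_mac ($valid)"
```

Feeding the new_mac value into the sudo ifconfig … hw ether command above keeps Ubuntu one address away from the VM without any collisions.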

Once this is done, hit ifconfig again to confirm you've been given a new, dynamic IP address. Now you can launch your virtual machine and confirm you've been assigned the proper reserved IP address there.

  1. TCP/IP is the most commonly-used network protocol in the world. If you're reading this — and it's the early 21st century — you're probably using TCP/IP right now.

Disabling the Fingerprint Reader

Biometric security devices were rare and exciting¹ a decade ago but, as the technology has become cheaper and more common, we're starting to see them in a lot of the devices that we use in place of other locking tools. A fingerprint reader on our phone allows for much faster access to the software and data within. A fingerprint reader on our house doors allows us to enter even when we've forgotten the physical keys inside. The speed and convenience of the technology is unmistakable. However, when they don't work, the frustration they can cause can result in computers being thrown across rooms.

Well … that's the mental image some of us might have, anyway.

One of the first things I did after getting Ubuntu installed on my Lenovo W541 is configure the fingerprint reader to act as a proxy for my typed password. On my iOS device, this tends to work 90% of the time and I rarely need to enter my 19-character password², saving heaps of time. Unfortunately, the fingerprint reader on the Lenovo is — to be completely blunt — garbage. I've managed to get just five successful reads in the last week, and the way the software steals focus and stutters when you try to type the password is nothing short of infuriating. The passwords I use for mobile computers tend to be twice the length of the ones I use for phones, considering the type of data contained within, so having focus taken away mid-entry can be quite unwelcome.

I don't want to use the fingerprint reader anymore, but Ubuntu doesn't give us an easy way to disable the device. Luckily, we can do so with a little command line luv. First we need to find the bus number and the device id for the fingerprint reader. Open the terminal and type:

lsusb

Now we'll have a list of devices connected to the various USB busses. I can see "Validity Sensors, Inc. Fingerprint Reader" is using Bus 003 Device 002 on my notebook, so now I can shut the device off.
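If you'd rather not read those numbers off the screen, the bus and device can be parsed out of the lsusb line. A sketch against a sample line (the ID value here is invented for illustration; check your own output):

```shell
# sample_line stands in for: lsusb | grep -i fingerprint
sample_line="Bus 003 Device 002: ID 138a:0011 Validity Sensors, Inc. Fingerprint Reader"

bus=$(echo "$sample_line" | awk '{print $2}')                 # "003"
dev=$(echo "$sample_line" | awk '{print $4}' | tr -d ':')     # "002"
echo "/dev/bus/usb/$bus/$dev"
```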

cd /dev/bus/usb/{bus number}
sudo rm -f 002

Done and done. Now the fingerprint reader is disabled and I can go about typing my password like an animal again.

Getting biometric software right is hard. Very hard. Back in university this was a project that some classmates and I tried to solve with the technology of the time, and it's not an easy thing to do. That said, Apple could get it right with the iPhone 5S, and Samsung's Galaxy phones are faster and just as accurate. One would think that a quad-core i7 would have no problem parsing a fingerprint, but clearly there is something missing in the actual reading of the finger. Is it due to the oils in our skin? The angle of our finger? Something else? I really don't know. What I do know, however, is that notebook makers really need to step up their game. If cell phones are able to accomplish this goal more accurately with weaker hardware, then there's no reason why powerful notebooks can't do the same.

  1. Well … they were exciting to me, anyway.
  2. Yes, the passwords I use on my phone typically range between 17 and 22 characters in length depending on the week of the year. My passwords are rotated on a 9-day cycle, too. Good luck using my phone for anything more than calling emergency services :P

Taking Back Ctrl in VirtualBox

Keyboard shortcuts are a wonderful thing. They can save a great deal of time and allow a person to quickly navigate across documents, applications, and screens. I love keyboard shortcuts so much that I ditched the mouse in 2008 and haven't gone back¹! That said, people might experience some problems when trying to use keyboard shortcuts that involve the Ctrl key when running Windows in VirtualBox.

In my case, the problem was the result of a single checkbox in the Mouse Preferences.

Mouse Position

When the "Show position of pointer when the Control key is pressed" checkbox is checked, the keystroke is not properly passed through to Windows. As a result, Ctrl+Z will result in a Z appearing on the screen rather than an undo action. Unchecking this option will resolve the problem, allowing us to properly copy, paste, cut, select all, undo, redo, print, and everything else we might want to accomplish with the Ctrl key. It's not a perfect solution, but it does work.

This does give me an idea, though. Perhaps I could make a quick little feature for Ubuntu that will grow the size of the mouse pointer when it's shaken a lot, similar to how it works on the current releases of OS X. It's a little gimmicky, but it does solve the problem of letting people know where the mouse is when working with large screens.