Foreign Keys to custom Primary Key caveats in Ruby on Rails


Image Credit: S Egger, 2007

The Ruby on Rails model convention of automatically including primary keys named id and foreign keys named {primary_key_model}_id works well for the vast majority of models, but what if the object we’re modelling already has a unique numerical property? It would make sense to use that property instead of id, and Rails does allow custom primary keys - but there are some gotchas.

Let’s walk through an example using Books and Chapters.

First of all we need a new Rails app which we’ll call Bookshelf: (I’m using Rails 5.2.1 here)
rails new bookshelf
cd bookshelf
rails db:create

Now we can create our Book model - but instead of id we’ll specify the International Standard Book Number (ISBN) as the primary key ¹
rails generate model Book isbn:integer title:string genre:string

We can’t specify the primary key change with the command-line generator so we need to edit the migration Rails created for us, from this:

# db/migrate/20180825101955_create_books.rb
class CreateBooks < ActiveRecord::Migration[5.2]
  def change
    create_table :books do |t|
      t.integer :isbn
      t.string :title
      t.string :genre

      t.timestamps
    end
  end
end

to this:

# db/migrate/20180825101955_create_books.rb
class CreateBooks < ActiveRecord::Migration[5.2]
  def change
    create_table :books, id: false, primary_key: :isbn do |t|
      t.primary_key :isbn
      t.string :title
      t.string :genre

      t.timestamps
    end
  end
end


Then we can apply it and check the primary key is as expected by creating a Book:
rails db:migrate

== 20180825101955 CreateBooks: migrating ======================================
-- create_table(:books, {:id=>false, :primary_key=>:isbn})
   -> 0.0024s
== 20180825101955 CreateBooks: migrated (0.0025s) =============================

rails c
irb(main):001:0>Book.create(isbn: 9780099518471, title: 'Brave New World', genre: 'Science Fiction')

  Book Load (0.3ms)  SELECT  "books".* FROM "books" WHERE "books"."isbn" = ? LIMIT ?  [["isbn", 9780099518471], ["LIMIT", 1]]
=> #<Book isbn: 9780099518471, title: "Brave New World", genre: "Science Fiction", created_at: "2018-08-25 10:36:37", updated_at: "2018-08-25 10:36:37">

So far so good. Now let’s create our Chapter association: (type quit to exit rails c)
rails generate model Chapter title:string no:integer book:references

If we run the Chapter migration now it will apply, but if we try to create a Chapter and reference our Book we get a rollback transaction error:
rails db:migrate
rails c
irb(main):001:0>Chapter.create!(title: "Chapter 1", no: 1, book: Book.find(9780099518471))
  Book Load (0.5ms)  SELECT  "books".* FROM "books" WHERE "books"."isbn" = ? LIMIT ?  [["isbn", 9780099518471], ["LIMIT", 1]]
   (0.1ms)  begin transaction
  Chapter Create (0.7ms)  INSERT INTO "chapters" ("title", "no", "book_id", "created_at", "updated_at") VALUES (?, ?, ?, ?, ?)  [["title", "Chapter 1"], ["no", 1], ["book_id", 9780099518471], ["created_at", "2018-08-25 11:26:44.984648"], ["updated_at", "2018-08-25 11:26:44.984648"]]
   (0.0ms)  rollback transaction
ActiveRecord::StatementInvalid: SQLite3::SQLException: foreign key mismatch - "chapters" referencing "books": INSERT INTO "chapters" ("title", "no", "book_id", "created_at", "updated_at") VALUES (?, ?, ?, ?, ?)
There's no way to create a Chapter because the foreign key constraint is trying to match chapters.book_id against books.id, which doesn't exist, so we will never get a foreign key match.

There are a couple of manual steps which are pointed to in the foreign key documentation to get this working properly:

If the column names can not be derived from the table names, you can use the :column and :primary_key options

Let’s throw in a little complication at this point - we don’t want our foreign key column name to be chapters.book_id - we’d like it to be chapters.book_isbn as that’s in keeping with the Books model and looks better in the database too.

We need to make a few changes to our Chapters migration, from this:

# db/migrate/20180825111413_create_chapters.rb
class CreateChapters < ActiveRecord::Migration[5.2]
  def change
    create_table :chapters do |t|
      t.string :title
      t.integer :no
      t.references :book, foreign_key: true

      t.timestamps
    end
  end
end


to this:

# db/migrate/20180825111413_create_chapters.rb
class CreateChapters < ActiveRecord::Migration[5.2]
  def change
    create_table :chapters do |t|
      t.string :title
      t.integer :no
      t.references :book_isbn, references: :books, null: false # creates 'book_isbn_id'

      t.timestamps
    end

    rename_column :chapters, :book_isbn_id, :book_isbn
    add_foreign_key :chapters, :books, column: 'book_isbn', primary_key: 'isbn'
  end
end

As you can see we’re taking the default naming convention for the foreign key references column, renaming it, and then adding the foreign key constraint with the column: and primary_key: options as per the documentation.

Now we can apply our migration:
rails db:migrate

== 20180825111413 CreateChapters: migrating ===================================
-- create_table(:chapters)
   -> 0.0032s
-- rename_column(:chapters, :book_isbn_id, :book_isbn)
   -> 0.0328s
-- add_foreign_key(:chapters, :books, {:column=>"book_isbn", :primary_key=>"isbn"})
   -> 0.0000s
== 20180825111413 CreateChapters: migrated (0.0366s) ==========================
If we try to create a Chapter and reference our Book now we get a missing attribute error:
rails c
irb(main):001:0>Chapter.create!(title: "Chapter 1", no: 1, book: Book.find(9780099518471))
  Book Load (0.2ms)  SELECT  "books".* FROM "books" WHERE "books"."isbn" = ? LIMIT ?  [["isbn", 9780099518471], ["LIMIT", 1]]
ActiveModel::MissingAttributeError: can't write unknown attribute `book_id`
We still can't create a Chapter because Rails doesn't know we're using custom column and primary key names - the error message is telling us there's a problem with our Chapter model.

The final piece is to tell Rails about our custom primary key, and while we’re editing the models we can add our associations at the same time.
Change the Book from this:

# app/models/book.rb
class Book < ApplicationRecord
end

to this:

# app/models/book.rb
class Book < ApplicationRecord
  self.primary_key = 'isbn'
  has_many :chapters, primary_key: 'isbn', foreign_key: 'book_isbn'
end

And then change the Chapter from this:

# app/models/chapter.rb
class Chapter < ApplicationRecord
  belongs_to :book
end

to this:

# app/models/chapter.rb
class Chapter < ApplicationRecord
  belongs_to :book, foreign_key: 'book_isbn'
end

Now let’s try adding a Chapter or two..

rails c
irb(main):001:0>Chapter.create!(title: "Chapter 1", no: 1, book: Book.find(9780099518471))

  Book Load (0.5ms)  SELECT  "books".* FROM "books" WHERE "books"."isbn" = ? LIMIT ?  [["isbn", 9780099518471], ["LIMIT", 1]]
   (0.1ms)  begin transaction
  Chapter Create (1.8ms)  INSERT INTO "chapters" ("title", "no", "book_isbn", "created_at", "updated_at") VALUES (?, ?, ?, ?, ?)  [["title", "Chapter 1"], ["no", 1], ["book_isbn", 9780099518471], ["created_at", "2018-08-27 07:29:24.381030"], ["updated_at", "2018-08-27 07:29:24.381030"]]
   (4.0ms)  commit transaction
=> #<Chapter id: 1, title: "Chapter 1", no: 1, book_isbn: 9780099518471, created_at: "2018-08-27 07:29:24", updated_at: "2018-08-27 07:29:24">


And because we’ve added the associations, we can also do:
irb(main):001:0>bravenewworld = Book.find(9780099518471)
irb(main):002:0>bravenewworld.chapters.create!(title: "Chapter 2", no: 2)

And to show the association works the other way..
irb(main):003:0>chapter2 = Chapter.where(book_isbn: 9780099518471, no: 2).first

  1. In reality ISBN probably isn’t a good choice for a primary key, as the specification states that the 10-digit versions can start with a zero, and leading zeros are dropped by integer datatypes.
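To make the footnote’s point concrete, here’s a quick plain-Ruby sketch (no Rails required) showing how the leading zero of a valid ISBN-10 is lost when it’s coerced to an integer:

```ruby
# "0306406152" is a valid ISBN-10 that happens to start with a zero
isbn = "0306406152"

as_integer = isbn.to_i
puts as_integer             # 306406152 - the leading zero is gone
puts as_integer.to_s.size   # 9 - no longer a 10-character ISBN
```

Round-tripping such an ISBN through an integer column would corrupt it, which is why a string column (or a different natural key) is the safer choice.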

Setting up Ubuntu on Digital Ocean


Digital Ocean is an internet hosting service that makes it trivial to spin up virtual servers called Droplets. While the base Ubuntu image Droplets are well configured out of the box, there are a couple of extra steps I take with every new Ubuntu Droplet, documented here as much for my own future reference as to elicit your feedback.

SSH Keys

I’ll typically create a new SSH key pair for each Droplet. Digital Ocean’s community guide is comprehensive if you need a refresher or haven’t done it before.

ssh-keygen -t rsa -b 4096 -C "{your-email-address}"

Droplet creation

After logging into Digital Ocean (or signing up - use this link for an extra $10 USD credit), we click Create Droplet and follow the wizard.

Here are the typical base settings I use:

Distribution: Ubuntu, latest LTS, x64
Size: As per requirements (usually the smallest, $5/mo)
Datacenter region: Best to pick the one closest to the majority of our expected userbase. That might only be us
Select additional options: As per requirements (usually just Monitoring)
Add your SSH keys: Click New SSH Key and paste in the public part of the SSH key generated earlier
Finalise and create: As per requirements

Then we click Create and wait less than a minute while Digital Ocean performs its magic.


For convenience we can give our new Droplet a friendly SSH name by adding the following to our local ~/.ssh/config file (I usually make this the same as the Droplet’s name):

# ~/.ssh/config
Host {droplet-name}
    User root
    HostName {droplet-ip-address}
    IdentityFile "~/.ssh/{our-new-ssh-private-key}"

Now we can SSH into our new Ubuntu Droplet with
ssh {droplet-name}

Set the timezone

dpkg-reconfigure tzdata

Ensure all packages are up-to-date

apt-get update; apt-get -y upgrade; apt-get -y clean

Configure automatic security patches (documentation here and here)

apt-get -y install unattended-upgrades; dpkg-reconfigure unattended-upgrades
Follow the prompts and accept the defaults.
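For reference, accepting the defaults enables the two periodic apt jobs - on Ubuntu this typically writes a small config file along these lines (the exact contents may vary by release):

```
# /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```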

Lock SSH to keys-only

Edit sshd_config to prevent root SSH login with a password - change PermitRootLogin from yes to without-password like so:

# /etc/ssh/sshd_config
# Authentication:
LoginGraceTime 120
PermitRootLogin without-password
StrictModes yes

And finally, reboot the Droplet to ensure our settings are loaded and current, and that it comes back to us, before we start installing or configuring our application stack of choice..

Is there anything you’d add to this list of initial Ubuntu server setup steps? - Please let us know in the comments!

Creating Thumbnails for the Synology DSM PhotoStation


Having updated my Synology NAS box to the latest Disk Station Manager (DSM) - version 6.0.2 as of December 2016 - I read that the PhotoStation thumbnail filenames had changed, and it now generates fewer of them which takes less time and saves space. Given that we’ve somewhere north of 105k photos and videos, it would take the little 1.6GHz ARM CPU in the Synology weeks (if not months) to recreate them so I started looking for a faster way.

After a brief search I found the sterling work of Matthew Phillips’ synothumbs script, which has been tweaked a few times - most recently by one Andrew Harris - and it’s his version I started with.


  • Synology NAS with PhotoStation installed
  • Any PC, Netbook or Laptop that’s more powerful than our Synology NAS, that we don’t mind leaving running for long periods of time (possibly days, depending on how fast it is and how many photos we have..), and ideally with a wired Gigabit LAN connection
  • A Linux install or a live CD - I recommend the latest Linux Mint Cinnamon (18.1 as of writing)
  • An executable copy of the synothumbs Python script in our Linux home directory..

curl -o

chmod +x

1. A little Synology configuration..

First we have to configure the NFS settings in the Synology DSM - log in as ‘Admin’, open up the Package Center and disable PhotoStation - it’s in the Action menu..

Next we need to enable NFS, which we’ll find in the Control Panel under File Services.. we tick the Enable NFS box if it isn’t already and then click Apply

And then we need to configure it.. still in the Control Panel, this time under Shared Folder (just above File Services), we select the photo folder and click Edit. In the ‘Edit Shared Folder photo’ window we need the last tab, NFS Permissions, where we Create a new Read/Write privilege for the IP address of our Linux client. Click Apply and then OK.

2. Connecting from Linux to our Synology

Now to our Linux client.. first we need NFS and the video libraries installed. In a Terminal we run

sudo apt-get install nfs-common ffmpeg libav-tools ufraw

Then we can check if we’ve configured NFS properly by running

showmount -e {synology-ip-address}

If it returns the IP address of our Linux client then we can mount the photo share like so..

cd; mkdir mnt_photo

sudo mount -t nfs {synology-ip-address}:/volume1/photo mnt_photo/

3. (Optional) Remove existing thumbnails

Matthew’s synothumbs script skips media that already has thumbnails, so if we want to recreate them all rather than generate any missing ones we just need to run this first..

find mnt_photo/ -type d -name "@eaDir" -exec rm -rf {} \;
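Since that find command deletes recursively, a cautious first step (my suggestion, not part of the original script) is a dry run that only prints the directories that would be removed:

```shell
# (Demo setup: in real use mnt_photo/ is the NFS mount from step 2)
mkdir -p mnt_photo/album/@eaDir

# List the @eaDir thumbnail directories without deleting anything
find mnt_photo/ -type d -name "@eaDir" -print
```

If the listing looks right, swap `-print` for the `-exec rm -rf {} \;` from above.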

4. Generating thumbnails

./ mnt_photo/

5. Monitoring progress..

As well as watching the image and video filenames fly past, I like to have the System Monitor open with the graphical Resources tab running..

Watching the System Monitor - if the CPUs aren't pegged at 100% then we need a faster network!

6. Finishing up..

A little while later..

Once the script has finished, we need to modify the ownership and permissions of the thumbnails we’ve generated. For this we need an SSH or terminal / telnet session on our Synology, then we can issue the following:

Note: If we've logged in with Terminal / telnet, and our prompt shows us as the admin user, we can issue the following command to become root (it'll prompt us for our Admin password again)
admin@synology:~$ sudo -i

root@synology:~# cd /volume1/photo
root@synology:~# find . -type d -name "@eaDir" -exec chown -R PhotoStation:PhotoStation {} \;
root@synology:~# find . -type d -name "@eaDir" -exec chmod -R 770 {} \;
root@synology:~# find . -type f -name "SYNOPHOTO_THUMB*" -exec chmod 660 {} \;

Next we need to ensure that when PhotoStation starts re-indexing it can’t re-create the thumbnails itself - that would seriously defeat the point of all the work we’ve just done! Fortunately it’s easy to do.. while we’re in our SSH or terminal / telnet session we can point the existing thumbnail links at a simple script that just returns success!

root@synology:~# cd /usr/syno/bin
root@synology:~# echo -e '#!/bin/bash\nexit 0' >
root@synology:~# chmod +x
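The stub itself is trivial; here's a self-contained sketch in /tmp (the filename is hypothetical - the real target path is elided above, and lives in /usr/syno/bin) showing that it succeeds no matter what arguments the indexer passes it:

```shell
# Create a no-op stand-in script (hypothetical name/path, for demonstration only)
printf '#!/bin/bash\nexit 0\n' > /tmp/noop-thumb
chmod +x /tmp/noop-thumb

# It ignores its arguments and exits 0, so callers believe thumbnailing succeeded
/tmp/noop-thumb some-photo.jpg && echo "stub exited 0"
```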

Next we’ll just make a note of the current symbolic links:

Which should yield the following results:

  • convert-thumb -> /usr/bin/convert
  • ffmpeg-thumb -> /usr/bin/ffmpeg
Important! Make a note of these paths if they're different to the ones above as we'll need them later!

Now we can point these symbolic links to our script:

root@synology:~# ln -s convert-thumb
root@synology:~# ln -s ffmpeg-thumb

Back to our Linux thumbnail processing box, we can unmount the drive..

sudo umount mnt_photo

.. and in the DSM we then

  • Remove the NFS Permission we created (Control Panel > Shared Folder > photo > NFS Permissions)
  • Disable the NFS Service if we enabled it
  • Restart the PhotoStation package (Package Center > PhotoStation > Actions > Run)

Once the PhotoStation package is back up and running it should automatically start re-indexing - that is, searching /volume1/photo for new photos. If we’d just copied a lot of new photos and videos onto our Synology box then this process can take longer than the thumbnail generation! - for every media file found, it reads the file’s EXIF data and writes it to a local database.

We can check that the re-indexing has started (and kick it off it hasn’t) by opening PhotoStation in our browser - https://{synology-ip}/photo - and logging in as Admin.

Then we choose Settings > Photos > and click Re-index

Ensuring that PhotoStation knows about any new photos and videos we've added by re-indexing

When the PhotoStation’s re-indexing has finished, we mustn’t forget to revert the thumbnail symbolic links:

root@synology:~# ln -sf /usr/bin/convert /usr/syno/bin/convert-thumb
root@synology:~# ln -sf /usr/bin/ffmpeg /usr/syno/bin/ffmpeg-thumb


Huge thanks to Matthew Phillips for creating the synothumbs script in the first place, to all the forkers for their tweaks and to Francesco Pipia for his comment that helped me get the NFS connection working! Grazie mille!