Aurelia with a Rails API


At the end of Aurelia’s Contact Manager Tutorial the first ‘Next Step’ is:

> Create a real backend for the app and use the http-client or fetch-client to retrieve data
As with most things, there's more than one way to skin a cat. This is not the only way to combine Aurelia and Rails. In all likelihood it's not the best way either - as I write this in December 2016 I'm new to Aurelia, I'm by no means an expert at Rails, and I'm still learning about modern front-end web development.

My aim was to be as in-keeping with both frameworks as possible. This is straightforward because Aurelia borrows some of its core principles from Rails: convention over configuration and clean, simple models.


Installing Rails

Picking up at the end of the Aurelia Contact Manager Tutorial, and assuming our project folder is ~/contact-manager..

Drop back to the parent folder from our Aurelia Contact Manager app

cd ..

gem install rails --no-document

We’ll want to persist the Contacts to a database at some point. Although we’re not going to cover that in this post, we can lay the groundwork by specifying our database engine now..

rails new contact-manager -d postgresql --api --skip-action-cable --skip-turbolinks

Side note: It's likely that the rails new ... will prompt us to overwrite .gitignore. I'd choose no and just paste in the default Rails .gitignore rules, which I've included here for completeness. It's a shame there's no option to merge..
# See https://help.github.com/articles/ignoring-files for more about ignoring files.
#
# If you find yourself ignoring temporary files generated by your text editor
# or operating system, you probably want to add a global ignore instead:
#   git config --global core.excludesfile '~/.gitignore_global'

# Ignore bundler config.
/.bundle

# Ignore all logfiles and tempfiles.
/log/*
/tmp/*
!/log/.keep
!/tmp/.keep

# Ignore Byebug command history file.
.byebug_history

cd contact-manager

Before we create our database, I always give ./config/database.yml a once-over to make sure I’m happy with the database names chosen (in this case, I’d change the dashes to underscores: contact-manager_development becomes contact_manager_development and so on..)

rake db:create

rails server - check it’s working - “Yay! You’re on Rails!”

Yay!

A Rails Contacts API

First, let’s change the code-only WebAPI from the demo to a Rails API backed implementation. The first thing we need to define is our routes, and Rails makes that really easy..

# ./config/routes.rb
Rails.application.routes.draw do
  # For details on the DSL available within this file, see http://guides.rubyonrails.org/routing.html

  resources :contacts

end

Let’s check our route configuration, which tells us which methods we need to implement..

rake routes (we can leave our Rails server running in another Terminal tab)..

$ rake routes
  Prefix Verb   URI Pattern             Controller#Action
contacts GET    /contacts(.:format)     contacts#index
         POST   /contacts(.:format)     contacts#create
 contact GET    /contacts/:id(.:format) contacts#show
         PATCH  /contacts/:id(.:format) contacts#update
         PUT    /contacts/:id(.:format) contacts#update
         DELETE /contacts/:id(.:format) contacts#destroy

Obviously, we now need a Contacts controller..

rails generate controller Contacts index create show update destroy --skip-routes

Now we can update our newly minted Contacts controller so it mimics the functionality of the Aurelia WebAPI by implementing the index and show methods like so:

# ./app/controllers/contacts_controller.rb
class ContactsController < ApplicationController

  # Like the Aurelia demo, we'll start with an array of contacts
  #  as it's not important where Rails gets the data - only that
  #  Aurelia gets it from Rails.
  CONTACTS = [
    {
      id:1,
      firstName:'John',
      lastName:'Tolkien-Rails',
      email:'[email protected]',
      phoneNumber:'867-5309'
    },
    {
      id:2,
      firstName:'Clive',
      lastName:'Lewis-Rails',
      email:'[email protected]',
      phoneNumber:'867-5309'
    },
    {
      id:3,
      firstName:'Owen',
      lastName:'Barfield-Rails',
      email:'[email protected]',
      phoneNumber:'867-5309'
    },
    {
      id:4,
      firstName:'Charles',
      lastName:'Williams-Rails',
      email:'[email protected]',
      phoneNumber:'867-5309'
    },
    {
      id:5,
      firstName:'Roger',
      lastName:'Green-Rails',
      email:'[email protected]',
      phoneNumber:'867-5309'
    }
  ]

  def index
    render json: { status: :ok, data: CONTACTS }
  end

  def create
  end

  def show
    contact = CONTACTS.select { |c| c[:id].to_s == params[:id] }
    if !contact.empty?
      render json: { status: :ok, data: contact }
    else
      render json: { status: :no_content }
    end
  end

  def update
  end

  def destroy
  end
end
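The lookup in show deserves a quick note: params[:id] always arrives as a String, which is why the comparison calls .to_s on the stored Integer id. Here's a plain-Ruby sketch of the same logic (data trimmed to two contacts; the find_contact method name is ours for illustration):

```ruby
# Minimal re-creation of the show action's lookup, assuming the same
# hash shape as the CONTACTS constant above (trimmed for brevity).
contacts = [
  { id: 1, firstName: 'John',  lastName: 'Tolkien-Rails' },
  { id: 2, firstName: 'Clive', lastName: 'Lewis-Rails' }
]

# Rails hands us params[:id] as a String, so we compare against id.to_s
def find_contact(contacts, id_param)
  contacts.select { |c| c[:id].to_s == id_param }
end

puts find_contact(contacts, '2').first[:firstName]  # Clive
puts find_contact(contacts, '99').empty?            # true
```

Note that select returns an Array, which suits the Aurelia side: it calls response.data.map on whatever comes back.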

We can check our handiwork by pointing our browser to http://localhost:3000/contacts where we should see our list of contacts rendered as JSON data into the browser like so:

Browser screenshot showing Rails API Contacts List data in JSON format

Side note: I'm using the JSON Formatter Extension for Google Chrome by Callum Locke. Thanks Callum!

Fetching data from a Rails API with Aurelia

Now let’s modify the tutorial’s WebAPI to pull our data. Following the documentation for Aurelia’s HTTP Services we’ll also need a fetch polyfill, as fetch is still being implemented by the browsers (as of December 2016) - to the command line..

npm install whatwg-fetch aurelia-fetch-client --save

Then we can add them to our dependencies:

// ./aurelia_project/aurelia.json
"build": {
  ...
  "bundles": [
    ...
    "whatwg-fetch",
    "aurelia-fetch-client"
  ]
  ...
 }

Before we start using them, let’s check that the Aurelia bit still builds:

au build == no errors. Good, now we can hook up the front to the back..

cp src/web-api.js src/web-api-fetch.js

Edit the getContactList() method in our new fetch version to look like so:

// ./src/web-api-fetch.js
import { HttpClient } from 'aurelia-fetch-client';
let client = new HttpClient();
...
  getContactList() {
    this.isRequesting = true;
    return new Promise((resolve, reject) => {
      // We'll change this hardcoded URL in a moment..
      client.fetch('http://localhost:3000/contacts')
        .then(response => response.json())
        .then(response => {
          let results = response.data.map(x => { return {
            id: x.id,
            firstName: x.firstName,
            lastName: x.lastName,
            email: x.email
          }});
          resolve(results);
          this.isRequesting = false;
        })
        .catch((ex) => {
          console.log('ERROR', ex);
          reject(ex);
        });
    });
  }
...

Now we can swap out the in-memory WebAPI for our Rails API..

// ./src/contact-list.js
import {WebAPI} from './web-api';         // change this..
import {WebAPI} from './web-api-fetch';   // ..to this
...

If we take a look at our app at http://localhost:9000 (which will have automatically refreshed if we have an Aurelia CLI au run --watch task sitting in a terminal somewhere - start one if you haven’t) we see that it’s..

empty..

Screenshot of our app in the web browser with no data - it's broken

Oops, we've broken it..

A quick look at our browser JavaScript console shows us the reason.. CORS..

Screenshot of the web browser developer tools console showing the Cross Origin Resource Sharing error

Of CORS! ba-dum tish*

Cross-Origin Resource Sharing (CORS)

Essentially, our default Rails 5 API which is being hosted on port 3000 (http://localhost:3000) is refusing to service requests from our Aurelia front-end which is being hosted on port 9000 (http://localhost:9000).

In production this won't be an issue as they'll both be hosted from the same origin, but if we wish to use the Aurelia CLI development toolchain (or WebPack, gulp, grunt, burp, etc..) then we'll need to configure our Rails API to allow CORS in development. Thankfully that's easy too..

First we need to enable the CORS middleware by un-commenting it in our Gemfile and adding the group: constraint:
# ./Gemfile
...
# Use Rack CORS for handling Cross-Origin Resource Sharing (CORS), making cross-origin AJAX possible
gem 'rack-cors', group: :development
...
Then un-comment the default configuration Rails included for us, but we'll enclose it in an if Rails.env.development? then ... end block..
# config/initializers/cors.rb
if Rails.env.development? then
  Rails.application.config.middleware.insert_before 0, Rack::Cors do
    allow do
      origins 'localhost:9000'

      resource '*',
      headers: :any,
      methods: [:get, :post, :put, :patch, :delete, :options, :head]
    end
  end
end

That's it. We just need to install the missing gem and restart our server

  1. Stop our rails server with Ctrl+C
  2. Run bundle install
  3. Then restart it: rails server

Refresh, and we see that the Contact List does indeed show the list of contacts from the Rails API - notice that we appended ‘-Rails’ to the last names so we could tell where our data was coming from..

Screenshot of our app working in development - Contact List is from Rails, but the Profile Details aren't yet

Fixed! Our separately hosted Aurelia frontend and Rails API backend can now talk to each other in development

That’s great, but the Profile panel details on the right are showing the non-Rails-backed data, so next we need to make the same two changes to the Aurelia Contact Details component. First, we’ll update the method to fetch the data from our Rails backend..

// ./src/web-api-fetch.js
...
  getContactDetails(id){
    this.isRequesting = true;
    return new Promise((resolve, reject) => {
      // Normally we'd find the record in a locally held array filled by
      //  an earlier call to getContactList() and then we might consider
      //  asking the server if we didn't find it depending on our
      //  applications requirements.
      // For now though, we'll just leave it hitting the server..
      client.fetch('http://localhost:3000/contacts/' + id)
        .then(response => response.json())
        .then(response => {
          let result = response.data.map(x => { return {
            id: x.id,
            firstName: x.firstName,
            lastName: x.lastName,
            email: x.email,
            phoneNumber: x.phoneNumber
          }});
          resolve(result.pop());
          this.isRequesting = false;
        })
        .catch((ex) => {
          console.log('ERROR', ex);
          reject(ex);
        });
    });
  }
...

Then we point src/contact-detail.js to our new Web API..

// ./src/contact-detail.js
import {WebAPI} from './web-api';         // change this..
import {WebAPI} from './web-api-fetch';   // ..to this
...

In our app we see that the Profile panel shows we’re now fetching the details from Rails, which we can verify in the Network tab of our browser development tools, or by watching the output of our running rails server

Screenshot of our app working in development - Contact List and Contact Details are now fetched from our Rails API

Preparing for Release

Now we’ve hooked up the Aurelia front-end to the Rails 5 API backend, it’s about time to ask Rails to serve the Aurelia front-end files as well so we can deploy the entire application to a single server. But before we do that, we really must take care of the hard-coded API URLs in our src/web-api-fetch.js.

As we’re using the Aurelia CLI to build our front-end, we can use the existing dev, stage and prod configuration files they thoughtfully included.

Side note: For the curious.. the Aurelia build environment is handled by a pre-configured gulp task called configureEnvironment() which you'll find in ./aurelia_project/tasks/transpile.js
// ./aurelia_project/environments/dev.js
export default {
  debug: true,
  testing: true,
  apiBaseUrl: 'http://localhost:3000'
};
// ./aurelia_project/environments/stage.js
// ./aurelia_project/environments/prod.js
export default {
  debug: true,
  testing: true,
  apiBaseUrl: ''
};

Next we can use our new apiBaseUrl parameter to configure an application-wide HttpClient..

// ./src/main.js
...
import { HttpClient } from 'aurelia-fetch-client';
...

export function configure(aurelia) {
  ...
  // Configure an application-wide HttpClient
  configureHttpContainer(aurelia.container);      // <- add this..

  aurelia.start().then(() => aurelia.setRoot());
}

// And this new function..
function configureHttpContainer(container) {
  let httpClient = new HttpClient();
  httpClient.configure(config => {
    config
      .useStandardConfiguration()
      .withBaseUrl(environment.apiBaseUrl)
  });

  container.registerInstance(HttpClient, httpClient);
}

And the last thing to do to get this working is to update our web-api-fetch.js file to use our configured HttpClient..

// ./src/web-api-fetch.js
import { inject } from 'aurelia-framework';
import { HttpClient } from 'aurelia-fetch-client';

let latency = 200;
let id = 0;

function getId(){
  return ++id;
}

@inject(HttpClient)
export class WebAPI {
  isRequesting = false;

  constructor(httpClient) {
    this.httpClient = httpClient;
  }

  getContactList() {
    this.isRequesting = true;
    return new Promise((resolve, reject) => {
      this.httpClient.fetch('/contacts')
        .then(response => response.json())
        .then(response => {
          let results = response.data.map(x => { return {
            id: x.id,
            firstName: x.firstName,
            lastName: x.lastName,
            email: x.email
          }});
          resolve(results);
          this.isRequesting = false;
        })
        .catch((ex) => {
          console.log('ERROR', ex);
          reject(ex);
        });
    });
  }

  getContactDetails(id){
    this.isRequesting = true;
    return new Promise((resolve, reject) => {
      // Normally we'd find the record in an array filled by the
      //  earlier call to getContactList() and then we might consider
      //  asking the server if we didn't find it depending on our
      //  applications requirements.
      // For now though, we'll just leave it hitting the server..
      this.httpClient.fetch('/contacts/' + id)
        .then(response => response.json())
        .then(response => {
          let result = response.data.map(x => { return {
            id: x.id,
            firstName: x.firstName,
            lastName: x.lastName,
            email: x.email,
            phoneNumber: x.phoneNumber
          }});
          resolve(result.pop());
          this.isRequesting = false;
        })
        .catch((ex) => {
          console.log('ERROR', ex);
          reject(ex);
        });
    });
  }

  // We haven't updated this method to use our Rails API yet,
  //  that's left as an exercise for the reader (that's you  :o)
  saveContact(contact){
    this.isRequesting = true;
    return new Promise(resolve => {
      setTimeout(() => {
        let instance = JSON.parse(JSON.stringify(contact));
        let found = contacts.filter(x => x.id == contact.id)[0];

        if(found){
          let index = contacts.indexOf(found);
          contacts[index] = instance;
        }else{
          instance.id = getId();
          contacts.push(instance);
        }
        resolve(instance);
        this.isRequesting = false;
      }, latency);
    });
  }
}

The changes in the file above are:

  1. Imported inject from aurelia-framework: import {inject} from 'aurelia-framework'
  2. Removed the line let client = new HttpClient();
  3. Added an @inject(HttpClient) decorator to the WebAPI class
  4. Saved a reference to the injected HttpClient with a new constructor method
  5. Updated the 2 getContact...() methods to use this.httpClient.fetch() instead of client.fetch(), and stripped out the hard-coded URLs!

Because we’re going to use the Aurelia CLI to bundle our front-end app and lean on the Rails Asset Pipeline to actually serve these assets, there’s one last Aurelia configuration tweak we need to make - switching the output destination from ./scripts to ./assets:

/* ./aurelia_project/aurelia.json */

{
  ...
  "platform": {
    ...
    "output": "assets",
    ...
  },
  ...
  "build": {
    "targets": [
      {
        ...
        "output": "assets",
        ...
      }
    ]
  ...
  }
}
<!-- ./index.html -->
...
    <script src="assets/vendor-bundle.js" data-main="aurelia-bootstrapper"></script>
...

A little housekeeping..

rm -rf ./scripts && mkdir ./assets

And a quick check to make sure our development configuration is still working..

au build --env dev then check our browser: http://localhost:9000 == Works fine

Now let’s try our production configuration..

au build --env prod

Screenshot of our app working in production - no data is shown because the URLs are now relative

No data, but it's working! - look at the /contacts URL - it's relative and all we did was change the au --env flag

It correctly breaks! (?!!) - bear with me.. the URL is /contacts on the same server (notice the port is still 9000 - the same as in the address bar) - this proves it’s working!

Serving the app with the Rails Asset Pipeline

There are just a few things to do at the backend to get the Rails API to serve our front-end.

Enable the Rails Asset Pipeline

# ./config/application.rb
require "rails"
...
# Pick the frameworks you want:
require "active_model/railtie"
require "active_job/railtie"
require "active_record/railtie"
require "action_controller/railtie"
require "action_mailer/railtie"
require "action_view/railtie"
# require "action_cable/engine"
require "sprockets/railtie"        # <-- we want sprockets! (uncomment this line)
require "rails/test_unit/railtie"
...

Next we need to tell the Asset Pipeline / sprockets which javascript files our app uses:

mkdir -p ./app/assets/javascripts && mkdir -p ./vendor/assets/javascripts
touch ./app/assets/javascripts/application.js

// ./app/assets/javascripts/application.js

// These 2 files are generated by running `au build --env prod`
// ** DO NOT MODIFY THESE FILES DIRECTLY **
//= require vendor-bundle
//= require app-bundle

Now we can copy these files.. vendor-bundle.js to vendor assets, and app-bundle.js to app assets - which makes me feel warm and fuzzy inside (we’ll recap the steps needed to release the application at the end of the post..)

cp ./assets/vendor-bundle.js ./vendor/assets/javascripts/
cp ./assets/app-bundle.js ./app/assets/javascripts/

Next we need an HTML controller to serve our index.html..

rails g controller Home index --skip-routes

We skipped the automatic route addition because we need to edit ./config/routes.rb to add our site root..

# ./config/routes.rb
Rails.application.routes.draw do
  # For details on the DSL available within this file, see http://guides.rubyonrails.org/routing.html

  root 'home#index'
  resources :contacts

end

Now we need a quick tweak to our HomeController inheritance so it renders HTML instead of JSON by default..

# ./app/controllers/home_controller.rb
class HomeController < ActionController::Base
  def index
  end
end

And what about that #index? The final step! We copy and then edit the index.html..

mkdir -p ./app/views/home
cp ./index.html ./app/views/home/index.html.erb

<!-- ./app/views/home/index.html.erb -->
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Aurelia</title>
  </head>

  <body aurelia-app="main">
    <%= javascript_include_tag "application", { "data-main" => "aurelia-bootstrapper" } %>
  </body>
</html>

We’ve swapped the <script> tag for the Rails Asset Pipeline generated javascript_include_tag, and now we can test it..

This is the only un-clean part of the integration as it means a manual update to this file if we've changed it during development. I suspect we could use a gulp plugin ([gulp-html-replace](https://www.npmjs.com/package/gulp-html-replace) looks promising) to augment the build process so that we don't have to manually edit ./index.html. We could also get our build process to copy the bundled files into our Rails folders too.. If you can help me with this, please leave a comment!

rails server - hit our Rails server at http://localhost:3000 and..

Screenshot of our Aurelia Contact Manager app completely hosted by Rails!

Et voila! - Our Aurelia Contact Manager application hosted by Ruby on Rails

Wrap up..

Development Workflow

For development, we just spin up the frontend and backend in separate terminal tabs like so:

au run --watch
rails s

Then point our editor to ./src/* and our browser to http://localhost:9000

Release / Deployment Workflow

Stop the Aurelia CLI if it’s running (Ctrl+C our au run --watch task), then

rm ./assets/* - remove our development assets
au build --env prod - generate our production assets
cp ./assets/vendor-bundle.js ./vendor/assets/javascripts/ - copy our vendor bundle
cp ./assets/app-bundle.js ./app/assets/javascripts/ - copy our app bundle
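Those copy steps could be wrapped in a small Ruby helper so a release is one command. This is just a sketch (the copy_bundles method is ours, not part of the app), demonstrated against a throwaway directory layout:

```ruby
require 'fileutils'
require 'tmpdir'

# Hypothetical helper (not part of the app): copies the two Aurelia bundles
# produced by `au build --env prod` into the Rails asset folders.
def copy_bundles(assets_dir, vendor_dest, app_dest)
  FileUtils.mkdir_p(vendor_dest)
  FileUtils.mkdir_p(app_dest)
  FileUtils.cp(File.join(assets_dir, 'vendor-bundle.js'), vendor_dest)
  FileUtils.cp(File.join(assets_dir, 'app-bundle.js'), app_dest)
end

# Dry run against a temporary folder standing in for the project root:
root = Dir.mktmpdir
assets = File.join(root, 'assets')
FileUtils.mkdir_p(assets)
File.write(File.join(assets, 'vendor-bundle.js'), '// vendor')
File.write(File.join(assets, 'app-bundle.js'), '// app')

copy_bundles(assets,
             File.join(root, 'vendor/assets/javascripts'),
             File.join(root, 'app/assets/javascripts'))

puts File.exist?(File.join(root, 'app/assets/javascripts/app-bundle.js'))  # true
```

In the real project we'd point it at ./assets, ./vendor/assets/javascripts and ./app/assets/javascripts, or turn it into a rake task.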

Remember! If we've changed our ./index.html then we need to copy it and update the <script> tag as we did before:
cp ./index.html ./app/views/home/index.html.erb
<!-- ./app/views/home/index.html.erb -->
<!DOCTYPE html>
<html>
  ...
  <body aurelia-app="main">
    <!-- remove this line -->
    <script src="assets/vendor-bundle.js" data-main="aurelia-bootstrapper"></script>

    <!-- add this line -->
    <%= javascript_include_tag "application", { "data-main" => "aurelia-bootstrapper" } %>
  </body>
</html>

.. and we’re done

Generating a Google Map (KML) from GPS-tagged photos


Google world map overlaid with route generated from digital photo metadata

Joining the dots from digital photo metadata..

As I mentioned earlier, my current compact travel-zoom camera is the excellent Panasonic DMC-TZ40. As it saves the GPS coordinates with every photo I take I thought it’d be a fun little exercise to try and script the extraction of this metadata, with a view to plotting the points on a map or even better - drawing a line between them all so I could retrace my steps!

Looping through files is pretty trivial in most programming and scripting languages, and most if not all of them have JPEG libraries that allow for straightforward extraction of photo metadata. The final piece was finding a suitable output format that mapping tools would understand for rendering points and lines. Enter KML:

Keyhole Markup Language (KML) is an XML notation for expressing geographic annotation and visualization within Internet-based, two-dimensional maps and three-dimensional Earth browsers.
- [Wikipedia](https://en.wikipedia.org/wiki/Keyhole_Markup_Language)

Google acquired Keyhole, the company behind the language, in 2004, implemented KML in Google Earth and provides excellent documentation of the format.
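To get a feel for the format before diving into the script, this is the shape of the <Placemark> element it writes for each geotagged photo. The helper method and the coordinates below are illustrative, not taken from the script - note that KML orders coordinates longitude,latitude,altitude:

```ruby
# Build a single KML Placemark for a named point.
# KML coordinate order is longitude,latitude,altitude.
def placemark(name, lon, lat, alt = 0)
  "<Placemark><name>#{name}</name><Point>" \
  "<coordinates>#{lon},#{lat},#{alt}</coordinates></Point></Placemark>"
end

puts placemark('DSC01265.jpg', -1.61, 54.97)
# <Placemark><name>DSC01265.jpg</name><Point><coordinates>-1.61,54.97,0</coordinates></Point></Placemark>
```

A file of these elements, wrapped in the usual <kml><Folder> headers, is all points.kml is.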

The result of pulling this all together is the following Ruby script, which I’ve called photo-mapper - it’s an Open Source project on GitHub..

# This Ruby script generates two Keyhole Markup Language files:
#  1. points.kml  - a point for every photo with GPS coords
#  2. route.kml   - a single line that joins every photo with GPS coords
#
# from a folder (and sub-folders) of digital photos
#
# The intention is to create a chronological map of
#  photographed destinations from the digital photos themselves
#
# Usage: ruby photo-mapper.rb starting_directory
# e.g.:  ruby photo-mapper.rb Photos
#
# Author: Andrew Freemantle			http://www.fatlemon.co.uk/photo-mapper

require 'exifr'
require 'date'

# The Panasonic DMC-TZ40 always saves GPS coords. Even indoors. Exclude photos with these coords..
INVALID_GPS_COORDS = [17056881.853375, 17056881.666666668]
# Exclude the following directories when traversing..
IGNORE_DIR = ['.', '..', '.git', '.DS_Store', '@eaDir']
# List of supported photo filename extensions..
ALLOWED_EXTENSIONS = ['.jpg', '.JPG', '.jpeg', '.JPEG']


# Directory traversing class
#  initialized with a starting path, it recursively descends through
#  any directories it finds that aren't in the IGNORE_DIR array above
class Traverse

	def initialize(path, pointsFile, routeFile)
		puts "in " + path
		@files = Dir.entries(path).sort
		@files.each do |f|
			if !IGNORE_DIR.include? f
				if File.directory?(File.join(path, f))
					@t = Traverse.new(File.join(path, f), pointsFile, routeFile)
				elsif File.file?(File.join(path, f))

					# Is this an allowed file?
					if ALLOWED_EXTENSIONS.include? File.extname(f)
						# Does this file have Geo coords?
						puts "Got allowed file #{File.join(path,f)}"

						begin
							@file = EXIFR::JPEG.new(File.join(path, f))
							if @file.exif?()
								# We have EXIF, but do we have sensible Lat & Long?
								if @file.gps != nil
									if !INVALID_GPS_COORDS.include? @file.gps.latitude
										#puts @file.gps
										pointsFile.puts("<Placemark><name>#{f}</name><Point><coordinates>#{@file.gps.longitude},#{@file.gps.latitude},#{@file.gps.altitude}</coordinates></Point></Placemark>")
										routeFile.puts("#{@file.gps.longitude},#{@file.gps.latitude},0 ")
									else
										#puts "No GPS in " + ARGV[0]
									end
								end
							else
								#puts "No EXIF in " + ARGV[0]
							end
						rescue EOFError
							# End Of File can happen for partially copied or uploaded photos
							#  and there's nothing we can do here but report out and skip
							puts "Reached EOF for #{File.join(path,f)} - skipped."
						end
					end

				end
			end
		end

		pointsFile.flush()
		routeFile.flush()

	end
end


# Start the two output files:
pointsFile = File.open('points.kml', 'w')
routeFile = File.open('route.kml', 'w')

date = Date.today

# write the file headers
pointsFile.puts("<?xml version=\"1.0\" encoding=\"UTF-8\"?>
<kml xmlns=\"http://www.opengis.net/kml/2.2\">
<Folder>
	<name>points</name>
	<description>Generated on #{date.strftime('%a %-d %b %Y')} by photo-mapper - http://www.fatlemon.co.uk/photo-mapper</description>
	<open>1</open>")

routeFile.puts("<?xml version=\"1.0\" encoding=\"UTF-8\"?>
<kml xmlns=\"http://www.opengis.net/kml/2.2\">
<Folder>
  <name>route</name>
	<description>Generated on #{date.strftime('%a %-d %b %Y')} by photo-mapper - http://www.fatlemon.co.uk/photo-mapper</description>
  <open>1</open>
  <Style id=\"linestyle\">
    <LineStyle>
      <color>ff000000</color>
      <width>2</width>
    </LineStyle>
  </Style>
  <Placemark>
    <name>Route</name>
    <styleUrl>#linestyle</styleUrl>
    <LineString>
      <extrude>1</extrude>
      <tessellate>1</tessellate>
      <coordinates>")


go = Traverse.new(File.absolute_path(ARGV[0]), pointsFile, routeFile)


# Close the files
pointsFile.puts("
</Folder>
</kml>")
pointsFile.flush()

routeFile.puts("</coordinates>
    </LineString>
  </Placemark>
</Folder>
</kml>")
routeFile.flush()

# Done  :o)

For the route output, the script assumes that sorting the directories and files alphabetically will result in the same order the photos were taken. This should be true and work fine in most cases. Personally, I organise my photos like so:

  year
    month
      day - with short description of the day or the location
        photo.jpg, photo2.jpeg, etc..

e.g:

  2012
    01 - January
      01 - New Years day dip in the North Sea
        DSC01265.jpg, DSC01266.jpg, etc..
      02 - Discharged from hospital after recovering from hypothermia
        DSC01271.jpg
    02 - February
      29 - Cycle-ride along the coast
        DSC01411.jpg, etc..
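The zero-padding in those folder names matters: Dir.entries(path).sort is a plain string sort, so single-digit prefixes fall out of order once double digits appear. A quick demonstration:

```ruby
# Zero-padded prefixes keep an alphabetical sort chronological..
days = ['02 - Discharged from hospital', '01 - New Years day dip']
p days.sort.first  # "01 - New Years day dip"

# ..but without padding, the ordering breaks once two digits appear:
p ['10 - Ten', '2 - Two'].sort  # ["10 - Ten", "2 - Two"] - 10 sorts before 2!
```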

Viewing with Google Earth

Google earth overlaid with points generated from digital photo metadata

The resulting `points.kml` opened in Google Earth

Google Earth natively supports KML, so once you have it installed and open, just go to File > Open and select either points.kml, route.kml or both!

Viewing with Google Maps

Google world map overlaid with route generated from digital photo metadata

The resulting `route.kml` opened in Google Maps. A little more involved and not without some limitations..

Google Maps also understands KML files, but there are some limitations which I’ll point out in a moment.

  1. First, head over to Google Maps and sign in with your Google+ account, or create one
  2. Next, expand the Google Maps menu by clicking on the 3 horizontal bars inside the maps search box on the left, then choose ‘My Maps’ and ‘Create’
  3. You’ll get a new web-browser tab with a new map in it. Simply click the highlighted ‘Import’ link under the ‘Untitled layer’ and select the points.kml, route.kml or both!


Importing the KML file into Google Maps is pretty straightforward but requires a Google+ account

However..

You’ll likely run into the following message..
Google world map overlaid with route generated from digital photo metadata

Google Maps will only import the first 10 layers and 2000 features from this KML file.

For now at least, Google Maps is limited in the number of points and lines it can process. photo-mapper generates a single layer, but each photo becomes a point, so we’re currently limited to 2,000 geotagged photos.
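Since the script writes one Placemark per photo, a rough pre-flight check against that ceiling is just a matter of counting Placemark elements before uploading. The constant and helper here are our own illustration, not part of photo-mapper:

```ruby
# Google Maps currently imports at most 2,000 features from a KML file.
GOOGLE_MAPS_FEATURE_LIMIT = 2000

# Count the Placemark elements in a points.kml and compare to the cap.
def over_limit?(kml_text)
  kml_text.scan('<Placemark>').size > GOOGLE_MAPS_FEATURE_LIMIT
end

sample = "<Placemark>a</Placemark>\n<Placemark>b</Placemark>\n"
puts over_limit?(sample)  # false
```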

Google’s retired Maps Engine could handle far more data, so I think it’ll just be a matter of time before this restriction is lifted.

Panasonic LUMIX MapTool.pkg – Open Source edition


The Panasonic LUMIX DMC-TZ40 Digital Camera

The Panasonic LUMIX DMC-TZ40 Digital Camera, in a word, excellent!

I recently upgraded my compact travel zoom camera from the tried and trusted Sony DSC-HX9V to the Panasonic DMC-TZ40. It’s quite an upgrade and while I’m delighted by my new purchase, the review will have to wait now that I can use a major feature: MAPS!

Maps! On a Digital Camera! What ever will they think of next!


Yes, as well as GPS geotagging of photographs, the Panasonic LUMIX DMC-TZ30 (ZS20) and Panasonic LUMIX DMC-TZ40 (ZS30) come with map data that you can use to find your way to your next photo shooting location, or back to your hostel if you find yourself lost!

The Map Data comes on the CD-ROM / DVD with the camera, along with a little application called the LUMIX Map Tool. There’s a version for Microsoft Windows and Apple Mac OSX, but not for Linux.

To save on space I’d copied the contents of the DVD onto a USB drive so I could update the map data while travelling, but the only thing that didn’t copy was the Apple Mac OSX version of the LUMIX MapTool.pkg!

It took a bit of Googling, but I eventually found a great blog post by Roland Kluge where he’d written a simple script version of the tool for his LUMIX DMC-TZ31 - and after reading the comments I was able to modify his script to make it work for my new LUMIX DMC-TZ40.

I cannot stress how thankful I am for his work and the comments on his post - thank you Roland, and thank you commenter Falk

Introducing LUMIX Map Tool - Open Source edition :o)

My contribution to Roland Kluge's Simple Replacement for Lumix Map Tool


I’ve forked Roland’s code and added support for the Lumix DMC-TZ40, and I thought I’d make it a little more interactive as I’m not likely to change the Map Data too often.

Simply download the maptool.py file from the GitHub repository and run it with..

$ python maptool.py

.. and it will prompt you for the information it needs to get the Map Data from your DVD onto a formatted SD Card - maps away!