Tag Archives: Linux

Raspberry Pis with camera modules, ready for testing.

Okay, it’s a whole year since my last post on this blog. I’ve been busy.

Flying Monkey TV was selected for an Arts Council England (ACE) funded mentoring scheme in 2010, via The Culture Company’s Artimelt programme, aimed at incubating early-stage projects that might then go on to greater things.

Although the project had some funding early on, it was not much and nothing ongoing. We also applied for funding from Umbro Industries, 4IP (now defunct) and Nesta’s Digital R&D programme, amongst others, but no joy so far. I suspect I am a whole generation too old for Umbro and we didn’t seem to tick the right boxes for Nesta. In the meantime, I’ve decided to stop applying for stuff and start applying stuff, if you see what I mean. Life is short and funding applications are a very onerous task best left to professional administrators.

At least to give ACE their money’s worth, I have been continuing to develop techniques and accumulate equipment to progress the project despite its lack of funding. The downside is that it gets done when it gets done. Anyway, out of my own pocket I’ve just acquired a little testbed of equipment that I can use to test my ideas.

If you were reading my Flying Monkey TV Missing Link blog from a couple of years ago, you will know that some of my technical research of the time, thanks to a small bursary from Access Space, Sheffield UK, was rendered entirely pointless by the launch of the Raspberry Pi platform. It’s taken me a while to get around to it, but late last year I got my first Pi and this week have just accumulated three more, and all four with camera modules. I have only just got together all the various bits and bobs to make it into a functional network, but stay tuned for more shenanigans.

I am also testing battery-powered setups and solar charging, so expect a few passes close to the sun.

Having been flying around the country over the past couple of weeks, I’ve shot a lot of timelapse out of the front windscreen. Just because I can.

It’s largely thanks to CHDK and I have managed to record myself getting lost on routes between Sheffield, Lancaster, Leeds, Newcastle, Nottingham and York. I have a few plans for this media, but one journey is already online here:

This was quite a successful Flying Monkey TV experiment and I had it edited and online within three hours of getting home.

These journeys were mostly shot on a Canon PowerShot A560, timelapse-enabled with CHDK, and mounted on the inside of the front windscreen with a suction mount (see below). I mounted the camera hanging upside down from the top of the screen. This meant that it was in line with the passenger-side roof pillar and hence did not obscure my view.

I don’t use the auto-rotate feature in most cameras because it sometimes gets confused and can give you a few incorrectly rotated frames here and there. Consequently, shooting this way, I end up with an upside-down video that then needs rotating.

I used to drop them into Final Cut Pro, render them and export them to a new movie file, which works fine, but I think it’s time to write a Bash script to do it.

No sooner said than done! Well, not quite, but it was much easier than I imagined. As usual Google managed to find some very helpful resources for me to cannibalise. Writing the code is much faster than documenting it into a usable blog post.

I started by reminding myself how to create a basic for-next loop (detail included here for other noobs). Here is a bit of basic code that lists the files in the present working directory ending in “.JPG” (don’t forget this is case-sensitive) and echoes them to the screen. The semicolons separate the statements and the “done” terminates the loop.

for i in $(ls *.JPG); do echo $i; done

For the next step, instead of just listing the file names to the screen, I added the ImageMagick step to rotate the image and write it over the original.

for i in $(ls *.JPG); do convert $i -rotate 180 $i; done

As you might have already gathered, I like some progress feedback so added an echo with the file count and file name.

for i in $(ls *.JPG); do convert $i -rotate 180 $i; x=$(($x+1)); echo "$i $x"; done

However, a current file number is only of use if you know how many more to go. A bit of googling revealed this forum thread and the code:

ls -l | wc -l

This lists all the files in the current directory and then pipes that listing into wc to give an integer count of the lines. Note that ls -l on a bare directory also prints a “total” header line, so this count is one too high. If you want to filter the file types you can add a wildcard like this (which, usefully, also drops the header line):

ls -l *.JPG | wc -l

I found information about the two commands on linux.about.com and a forum thread which combines the two on unix.com.
http://linux.about.com/od/commands/l/blcmdl1_ls.htm
http://linux.about.com/library/cmd/blcmdl1_wc.htm
http://www.unix.com/unix-dummies-questions-answers/36490-ls-command-listing-number-files.html
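As an aside, a bash array gives the same count without parsing ls output at all. A small sketch:

```shell
# Count the matching files with a glob expanded into a bash array.
# nullglob makes a non-matching pattern expand to nothing (count 0)
# rather than the literal string "*.JPG".
shopt -s nullglob
files=(*.JPG)
echo "${#files[@]} files to process"
```

This also sidesteps the “total” header problem entirely.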

So my final piece of code (for this iteration at least) is here:

x=0; c=$(ls -l *.JPG | wc -l); for i in $(ls *.JPG); do convert $i -rotate 180 $i; x=$(($x+1)); echo "$x/$c $i"; done

This sets x to 0 and c to the number of files with filenames ending in “.JPG” in the current directory, then rotates each one of them by 180 degrees, overwrites the original file and echoes the file number, the file count and the filename. It’s pretty basic and I’ll roll it into a script soon.

There may well be better ways of doing this, but it worked well as an exercise to reinforce my learning.
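In the spirit of that exercise, rolled into a script it might look something like this (rotate180.sh is a name of my own choosing):

```shell
#!/bin/bash
# rotate180.sh -- rotate every .JPG in the current directory in place,
# with a "current / total" progress readout.
c=$(ls *.JPG | wc -l)   # total file count for the progress display
x=0
for i in *.JPG; do
    convert "$i" -rotate 180 "$i"   # ImageMagick, overwrites the original
    x=$((x+1))
    echo "$x / $c $i"
done
```

Looping over the glob directly (for i in *.JPG) rather than over $(ls *.JPG) also copes with filenames containing spaces.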

I’m going a bit off-piste with this post, and looking at the hidden-in-plain-view potential of some of the libraries, very familiar to Linux geeks, but pretty much unknown to the outside world.

Although I talk about the past a lot, I’m not nostalgic. I don’t miss the frustrations of the past. I just like to remind myself how good things are now. By the way, I am by no means a computer historian, and my opinions and experience are subjective. If I post anything that is factually incorrect, please let me know.

I realise some of this might be bleeding obvious to the Linux world, but for those of us cloistered for years in the walled garden of the Mac OS X GUI, it’s quite a revelation.

As I’ve said before, I’m a lapsed programmer and used to write commercial software using framework applications, primarily FileMaker Pro 3.0 amongst others. FileMaker has developed into a very sophisticated rapid application development (RAD) platform now that FileMaker Inc have got around to including the features that us developers were clamouring for in the 1990s.

Given a choice, I am very happy tinkering with nice GUI-based visual programming / scripting tools, but the compelling reason for me to get my hands dirty at the command line is the fact that nothing gets in the way.

One of my most enduring love / hate relationships has been with HyperCard / SuperCard. I remember being excited, almost to the point of peeing myself, when I first got wind of HyperCard. My programming days started with BBC BASIC, then Spectrum BASIC, then the woefully under-implemented Commodore 64 BASIC. After that things just got worse and worse, and Atari ST BASIC was absolutely useless. I can’t even remember if there was an implementation for the Amiga, but I did try CanDo, which was promising but way ahead of its time. As the years passed it became harder and harder to get to a command line, and I rather missed the direct simplicity of BBC or Spectrum BASIC, which were at least broadly complete implementations that you could actually do stuff with.

Update: I completely forgot about STOS on the ST and AMOS on the Amiga, which were both brilliant.
http://en.wikipedia.org/wiki/STOS_BASIC
http://en.wikipedia.org/wiki/AMOS_(programming_language)

HyperCard was a revelation to me as it was on the desktop, and you could easily create GUI-based applications with it, and it had a pretty decent plain-English programming language. You could actually do stuff with it.
http://en.wikipedia.org/wiki/HyperCard

As desktop computers “improved”, programming became more and more remote from the user, and there was a very frustrating period in the late 80s and early 90s where application software was not mature enough, but programming not accessible enough, to plug all the gaps in our productivity.

Fortunately those days are over.

There were many good things about HyperCard, but it was lacking some fundamental functions, and for me, they were colour and structured graphics. HyperCard was strictly monochrome and bitmap-only, and although Apple eventually did include plugins that supported colour graphics, it was an afterthought and not adequately implemented. Apple neglected HyperCard for years and it is now a minority-interest tool. I still use it for programming on old compact Macs as it is one of the few programming tools that will work on machines with no more than 4 MB of RAM.

Silicon Beach SuperCard seemed to be the obvious successor, the so-called “HyperCard on steroids”. Its own language, SuperTalk, was an extended and mostly compatible development of HyperCard’s HyperTalk, and it supported 8 bit colour and script-controllable vector graphics.
http://en.wikipedia.org/wiki/Silicon_Beach_Software

SuperCard, whilst not neglected in the same way, passed from one owner to another for years, developing only gradually. I cannot tell you just how closely related it is to the original, but SuperCard appears to have evolved into something called LiveCode by Runtime Revolution.
http://www.runrev.com/

LiveCode deploys on a number of platforms including Windows, Linux, OS X, iOS and Android, and looks very promising. However, it seems to be lacking one of the fundamental features that I need, and that is to append single images to a movie file. Please correct me if I’m wrong, but I can’t find any mention of it, although I’m still looking into it.

And this is my big frustration when using very-high-level development tools. I tried using Apple Automator and QuickTime Pro player to assemble digital stills into movie files. Again, maybe it’s there but I couldn’t find it without having to operate the menu items in the player, and I really don’t want to go back to the days of using marionette software like Keyquencer (remember that?) or similar to operate programs via their menus. In my experience, automated clicking of buttons and selecting menu items is just not reliable or fast enough.

Adobe Photoshop’s Actions is a very powerful tool but it’s quite slow (although things may well have changed since CS3).

And this is when we get back to Linux, GIMP and ImageMagick. Whilst I have been banging my head against a brick wall for years, looking for the ideal development platform, Linux tools and libraries have been quietly maturing under my nose. I recently looked into using ImageMagick, which can do, well, everything. For the uninitiated, ImageMagick is a “software suite to create, edit, compose, or convert bitmap images” and can also procedurally create images by manipulating graphics or drawing shapes.

There are some very coherent and complete help files, written by Anthony Thyssen, and they seem to go on forever, documenting more and more features, available via command line. And the command line bit is the real killer because it means I can write a program and just call the single function I want, and add it to a Bash script.
http://www.imagemagick.org/Usage/

My own interest is in batch-converting, montaging of multiple images and format conversion of data sets, often comprising tens of thousands of images. I have managed to use gPhoto2 and FFmpeg to assemble the images into movies, and ImageMagick / GIMP will help me to manipulate the images if need be.
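As a taste of the montaging and conversion side, ImageMagick’s montage and mogrify tools cover both jobs. A sketch (the tile width and file names are arbitrary choices of mine):

```shell
# Tile every JPEG in the current directory into a single contact sheet,
# four images wide, with a 2-pixel border around each image.
montage *.jpg -tile 4x -geometry +2+2 contact_sheet.png

# Batch format conversion, e.g. JPEG to PNG, with mogrify.
mogrify -format png *.jpg
```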

More soon…

The script here is only a slightly cleaned up version of the previous one, but also with a major addition. A timer.

It may just be that I’m new to Linux, but getting the syntax correct was very unintuitive. It took a lot of trial and error to get the spaces in the right places, and it reminds me a lot of programming in the 1980s, where you just get a “syntax error” message and nothing else.

This timer function uses the classic technique of storing the system time at the start of the script and then again once the script has finished, subtracting the former from the latter and the difference is the time taken.

This function only returns the number of seconds as an integer, and a result of “1350575350” might as well be in grains of sand. So I have reformatted the output into more human-readable minutes and seconds by dividing the number of seconds elapsed by 60 to get the minutes, and deriving the modulus (what’s left over) for the number of seconds. I could do a similar thing with the value 3600 if I wanted to format it in hours, minutes and seconds.
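For instance, with 3600 thrown in, a count of seconds can be unpacked into hours, minutes and seconds entirely in bash arithmetic:

```shell
# 7385 seconds is 2 hours, 3 minutes and 5 seconds.
t=7385
printf '%d:%02d:%02d\n' $((t / 3600)) $((t % 3600 / 60)) $((t % 60))
# prints 2:03:05
```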

Strangely, when I tried date +%2s (intending to get the current system time in seconds) at the command line in OS X it failed, but the same command worked on the same machine booted into the Linux partition. It turns out it was a typo (it should be date +%s), but Linux is happy to ignore this glaring mistake whilst OS X is not. Strange, when it’s so particular about other syntax.

However, all complaining aside, I have got some actual useful working code up and running much more easily than I had expected.

#!/bin/bash

clear
echo "Hello Monkey Planet"
starttime=$(date +%s)

# Clear out both work folders
cd /home/richard/Desktop/RBTest/newtest
rm *.JPG
rm *.jpg

cd /home/richard/Desktop/RBTest/testtemp
rm *.JPG
rm *.jpg

cd /home/richard/Desktop/RBTest/newtest
#gphoto2 --get-all-files

# Hard-link the images into the temp folder, renamed in chronological order
x=1; for i in $(find $(pwd) -name \*.JPG | xargs ls -t -r); do counter=$(printf %04d $x); ln "$i" /home/richard/Desktop/RBTest/testtemp/img_"$counter".jpg; x=$(($x+1)); done

cd /home/richard/Desktop/RBTest/testtemp
ffmpeg -r 25 -i img_%04d.jpg -s 640x480 -qscale 1 -vcodec mjpeg movie.mov
rm *.jpg

stoptime=$(date +%s)
fulltime=$(expr $stoptime - $starttime)
fulltimesecs=$(expr $fulltime % 60)
fulltimemins=$(expr $fulltime / 60)

echo "$fulltimemins minutes $fulltimesecs seconds"

I suppose it would be good to structure this programming at some point, such as the timer function, so that I could call it from any script. There probably is a function in Linux already.
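Sketching that idea, the timer could live in a pair of shell functions, sourced into any script that wants them (the function names are my own invention):

```shell
# Reusable start/stop timer using the same date +%s technique.
timer_start() { _t0=$(date +%s); }
timer_report() {
    local elapsed=$(( $(date +%s) - _t0 ))
    echo "$((elapsed / 60)) minutes $((elapsed % 60)) seconds"
}

timer_start
sleep 2
timer_report   # e.g. "0 minutes 2 seconds"
```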

More soon…

THE CODE PUBLISHED HERE COMES WITH ABSOLUTELY NO WARRANTY WHATSOEVER, SO PLEASE USE AT YOUR OWN RISK.

I am basing my code on existing bits of open source stuff scavenged from various places and cobbled together. At some point in the future I hope to make some sort of point-and-click front end for it, but in the meantime it’s just command-line code, and at the moment, pretty crude.

Here is a slightly updated version of my previous script, which now clears the work folders and downloads from the camera using gPhoto2.

#!/bin/bash
clear
echo "Hello Monkey world"
#gphoto2 --

# Clear out both work folders
cd /home/richard/Desktop/RBTest/newtest
rm *.JPG
rm *.jpg
cd /home/richard/Desktop/RBTest/testtemp
rm *.JPG
rm *.jpg

# Download everything from the camera
cd /home/richard/Desktop/RBTest/newtest
gphoto2 --get-all-files

x=1; for i in $(ls -t -r *JPG); do counter=$(printf %04d $x); ln "$i" /home/richard/Desktop/RBTest/testtemp/img_"$counter".jpg; x=$(($x+1)); done

cd /home/richard/Desktop/RBTest/testtemp
ffmpeg -r 25 -i img_%04d.jpg -s 640x480 -qscale 1 -vcodec mjpeg movie.avi

This is pretty clunky but at least it’s getting somewhere. At some point I’ll find out how to reference files case-insensitively.
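Two likely candidates for the case-insensitivity problem, sketched here: bash’s nocaseglob option for globs, and find’s -iname test for the recursive case.

```shell
# Make globs case-insensitive, so *.jpg matches IMG_0001.JPG too.
shopt -s nocaseglob
ls *.jpg

# find has a case-insensitive -iname for recursive searches.
find . -iname '*.jpg'
```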

On some of the older cameras, such as my A620, the camera saves images in discrete folders of only 200 images each, which means a lot of effort in post-production if it has to be done by hand.

I think the next logical steps are to compile multiple folders of images and to give the destination movie file a unique name. The name could be created from some format information, a date/time stamp, and the model of the camera, which can be derived from gPhoto2.
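A sketch of that naming idea: gphoto2 --auto-detect prints the model of the attached camera, so a unique output name could be assembled from that plus a timestamp. The sed/tr clean-up of gphoto2’s output here is my own guess and would need checking against a real camera:

```shell
# Take the third line of gphoto2's auto-detect table (the first camera),
# strip the port column, and turn spaces into underscores.
model=$(gphoto2 --auto-detect | sed -n 3p | sed 's/ *usb.*//' | tr ' ' '_')
outfile="${model}_$(date +%Y%m%d_%H%M%S).mov"
echo "$outfile"   # e.g. Canon_PowerShot_A640_20140115_085800.mov
```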

Googling something about recursive folders I found this:

http://stackoverflow.com/questions/245698/list-files-recursively-in-linux-with-path-relative-to-the-current-directory

However, this does not return them in creation time order and, as far as I could google, there doesn’t seem to be an argument to produce that.

After googling something about Linux pipeline commands I found a reference to xargs, and Bob’s your uncle. I can hardly believe I found it so quickly, but I remember Martyn at Access Space telling me about pipelining so I used a bit of intuition and worked out how to take the results from the find command and sort them afterwards.

http://www.cyberciti.biz/faq/linux-unix-bsd-xargs-construct-argument-lists-utility/

Here is my Bride of Frankenstein, which takes a folder full of folders (full of images), lists them in creation order, renames them and compiles them into a movie file. Simples!

#!/bin/bash
clear
echo "Hello Monkey Planet"

cd /home/richard/Desktop/RBTest/testtemp
rm *.JPG
rm *.jpg

cd /home/richard/Desktop/RBTest/newtest

# Find all .JPG files in all subfolders, sort them into chronological
# order, and hard-link them into the temp folder with sequential names
x=1; for i in $(find $(pwd) -name \*.JPG | xargs ls -t -r); do counter=$(printf %04d $x); ln "$i" /home/richard/Desktop/RBTest/testtemp/img_"$counter".jpg; x=$(($x+1)); done

cd /home/richard/Desktop/RBTest/testtemp
ffmpeg -r 25 -i img_%04d.jpg -s 640x480 -qscale 1 -vcodec mjpeg movie.avi

Well, FFmpeg is a monster. Getting it to work with some default settings was easy, but the number of parameters is breathtaking. I work in professional video on a daily basis but can hardly understand any of the plethora of options. There is extensive documentation on ffmpeg.org, but most of it is meaningless unless you already know what it means.

Below is the basic structure of a Bash script. It requires the #!/bin/bash line at the beginning.

#!/bin/bash
clear
echo "Hello world."

This script clears the screen and prints the words “Hello world”.

An excellent beginners’ tutorial is here and I’m not going to duplicate it.
http://linuxcommand.org/wss0010.php

This process reminds me of teaching myself to program back in the 1980s, in both a good way and a bad way. One frustratingly fatal error after another is a real test of application, and that’s the discipline; teaching yourself is very time-consuming, but it is also very educational.

After some head scratching I got this to work. It’s a bit of a Frankenstein lash-up of bits of code I got from various places, but also with a little bit of my own intuition. I discovered by trial and error how to change the target directory on the fly within the script, and this made it much simpler.

It’s still a bit clunky and needs the download-from-camera stage adding but it works. I decided to include absolute paths to the work directories and point the script at them in turn, and so far I am running the script using the command ./myscript from the current directory.

#!/bin/bash
cd /home/richard/Desktop/RBTest/newtest
x=1; for i in $(ls -t -r *JPG); do counter=$(printf %04d $x); ln "$i" /home/richard/Desktop/RBTest/testtemp/img_"$counter".jpg; x=$(($x+1)); done
cd /home/richard/Desktop/RBTest/testtemp
ffmpeg -r 25 -i img_%04d.jpg -s 640x480 -qscale 1 -vcodec mjpeg movie.avi

The next bit I need to add is the download stage using gPhoto2 to get the images from the camera so that I can have a single workflow to download and assemble the thousands of images files in separate directories into a single usable video file.

Some of this coding documentation needs expanding and tidying up, but stay tuned for more soon.

There are some technologies that none of us will miss, such as CRTs, the SCSI interface, VHS and (for me) optical storage media (CDs and DVDs). But one technological era I am particularly glad to leave behind is the 1980s, characterised by cassette-loading or actual typing in of huge BASIC programs, distributed in magazines.

These days we have an embarrassment of riches in the form of the internet. For a programmer, it's a goldmine of free resources. However, something that is often missing is an exhaustive deconstruction of the code, particularly useful to noobs. I'm a lapsed programmer, so I don't find it too hard, but Linux is new to me and has some strange “conventions”.

Anyway, as promised here is some actual code that I found (well, Google found it and I copied it) with some explanation about what the bits and pieces do. I will probably update this a bit and please let me know if you spot any errors.

FFmpeg requires an unbroken sequence of image files starting at 1 and will fail if the numbering is not continuous. This is not optional, so it is best to rename the files as a matter of course.

The following two code snippets are from stackoverflow.com and I've adapted them slightly.

http://stackoverflow.com/questions/2829113/ffmpeg-create-a-video-from-images

This is the script for renaming the files.

x=1; for i in *jpg; do counter=$(printf %03d $x); ln "$i" /tmp/img"$counter".jpg; x=$(($x+1)); done

And here is the original FFMpeg command from the same source.

ffmpeg -f image2 -i /tmp/img%03d.jpg /tmp/a.mpg
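As an aside, newer FFmpeg builds accept a -start_number option for the image2 input, so a sequence that starts somewhere other than 1 can be declared explicitly (gaps in the numbering are still fatal). Worth checking whether your build has it:

```shell
# Tell FFmpeg the image sequence starts at img042.jpg rather than img001.jpg.
ffmpeg -start_number 42 -f image2 -i /tmp/img%03d.jpg /tmp/a.mpg
```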

Note:

24 seconds to initialise the PowerShot A640 and another 9 minutes to download 2,595 images totalling 2.7 GB (over USB 2 on a 1st-generation MacBook (1.83 GHz Intel dual core) running Ubuntu Linux 11.04).

Sort by creation date/time

x=1; for i in $(ls -t -r *JPG); do counter=$(printf %04d $x); ln "$i" /tmp/img_"$counter".jpg; x=$(($x+1)); done

The $( some stuff ) construct evaluates everything within the brackets before using the result. In this case, $(ls -t -r *JPG) returns a list of files ending in “JPG”, sorted into ascending time order. The for i in $(ls -t -r *JPG) is a for..next loop where i is each file ending in “JPG”, in that order.

The snippet counter=$(printf %04d $x) resolves the integer variable x into a 4-digit decimal number with leading zeros and puts it into the variable counter. The leading zeros are necessary in order to maintain the 8-character file names so that they sort alpha-numerically. The snippet x=$(($x+1)) increments the variable x by integer 1 after each renaming operation.

It’s important to get the capitalisation and the destination pathname correct, or it will just fail.

The -t modifier sorts by modification time, most recent first. The -r modifier reverses that order to, in this case, restore the correct chronological order.

Canon PowerShot cameras automatically number image files in the format “IMG_0001.JPG”, create a new folder every 2,000 images, and restart the numbering from 0001 once they reach IMG_9999.JPG.

Consequently, it is possible that for a timelapse shoot the file names will not be in continuous alpha-numeric order. Sometimes they will be, but it’s as well to assume that they won’t. Also, gPhoto2 does not preserve the folder structure if you download using the gphoto2 --get-all-files command, and will dump the whole lot into a single folder.

In this case, it is very likely that the alpha-numeric order will not be strictly chronological. As a result, it was necessary to include an amendment to the code that will order them into chronological order before renaming them.

More soon.

Rockingham Street Q-Park car park, Sheffield, UK.

After a partially voluntary and partially involuntary hiatus, I am back in flight. After significant work commitments, then getting married with a honeymoon in Barcelona, ES, I am back in Sheffield, UK, the greenest city in England. As this project is currently unfunded, and solo, I’m updating things as-and-when.

I promised you some performance tests and here they are. I am perfectly satisfied with the installations of Ubuntu Linux 8.04.1 on blue-and-white G3s and 10.04 on grey-and-white Macs. As suspected, the GUI is a major burden on the system, and opening a folder of about 3,000 JPEG images can take a full 5 minutes because of the overhead of loading all the thumbnails.

The solution is simple. Don’t.

For my purposes this is no problem. I want to use these machines as workhorses and don’t need the GUI.

I’m going to be using FFmpeg to convert and compress a folder full of images into a movie. However, I came across a well-documented problem immediately: FFmpeg will fail if the first file is not numbered xxxxxx1, or if there is a gap in the numbering.

Fortunately, I easily found a Bash script to sort this out, and I found it here:

http://stackoverflow.com/questions/2829113/ffmpeg-create-a-video-from-images

x=1; for i in *jpg; do counter=$(printf %03d $x); ln "$i" /tmp/img"$counter".jpg; x=$(($x+1)); done

(Also see the revision below).

This works, but note that the filename extension is case-sensitive. Helpfully (not), the script just fails if you do not get this correct. It renames all the images in a folder beginning at xxxxxx1, so a sequence that does not start at 1, or that is missing one or more images in the middle, will no longer cause FFmpeg to fail. I’ll deconstruct this script in a later blog post.

Even more usefully, Linux does not actually duplicate the binary data here; ln just creates hard links, which can be safely deleted afterwards.
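That is, ln without -s creates a hard link: a second directory entry pointing at the same data on disk, so removing it never touches the original. A quick sketch:

```shell
# Create a file, hard-link it, then delete the link.
echo "image data" > original.jpg
ln original.jpg link.jpg     # no data copied; the link count goes to 2
rm link.jpg                  # original.jpg and its contents survive
cat original.jpg             # prints "image data"
```

One caveat: hard links only work within a single filesystem, so linking into /tmp will fail if /tmp lives on a different volume from the source images.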

These tests were done on a 1 GHz PowerMac G4 with 640 MB RAM and a 21 GB HD.

Rename 2933 files – 23 seconds

Using FFmpeg:

Encode jpeg files at 1600×1200 to mjpeg – approx 9 min 30 secs – 192.4 MB
Encode jpeg files at 1600×1200 to mjpeg – approx 8 min 11 secs – 192.4 MB
Encode jpeg files at 1600×1200 to mjpeg – approx 8 min 12 secs – 192.4 MB

Encode jpeg files at 640×480 to mjpeg AVI – approx 10 min 15 secs – 40.4 MB – atrocious quality
Encode jpeg files at 640×480 to mjpeg AVI – forgot to time it but similar – 199.5 MB – quality set to 1

This last set of tests was done whilst messing about within the GUI. I don’t really know enough about what is going on under the hood, but it seems that things slow down a lot and this might be due to caching thumbnails or, err, whatever. After a restart performance increased dramatically.

Rename 2,933 items – 1’05’’
Rename 2,933 items – 1’12’’
Rename 2,933 items – 0’20’’ (after restart)

For my purposes the performance seems acceptable at this stage.

One other revision I had to make was to compensate for the original file numbering restarting. Canon Powershot cameras create jpeg images with filenames of the format “IMG_xxxx.JPG” and shooting timelapse with them quickly reaches numbers in excess of 9,999. For consumer-level snap cameras, this number is probably well in excess of a normal user’s needs but I can easily shoot that many in one day. Consequently, the image numbers cannot solely be relied upon to reference the images in the correct chronological order, hence this revision which sorts them into creation order before renaming.

x=1; for i in $(ls -t -r *JPG); do counter=$(printf %04d $x); ln "$i" /Users/richardbolam/Desktop/testout/img_"$counter".jpg; x=$(($x+1)); done

I’m still a bit of a noob when it comes to Bash scripting, but I will write another post soon that disassembles these scripts, partly for your benefit, but also to help me learn more about what I’m doing. This script was revised with the invaluable help of the technical staff at Access Space. Stay tuned.

access-space.org

“Personally, I liked the university, they gave us money and facilities. We didn’t have to produce anything. You’ve never been out of college! You don’t know what it’s like out there! I’ve worked in the private sector, they expect results.” Dr Ray Stanz, Ghostbusters, 1984, Ivan Reitman.

One of the major frustrations with computers is ambiguous error messages. They’re just machines, and no-one expects them to work without failing occasionally, but give us a break. The next biggest frustration is explicit error messages that give you every detail but nothing you have the remotest chance of doing anything about. Linux tends to fall into the latter category. Windows even gives you hexadecimal memory addresses. Gee thanks.

In Final Cut Pro, Apple’s professional video editor software for Mac OS X, I regularly get a message saying “General Error”. That’s no help at all, and whilst it is quite fascinating to see all the text status messages scrolling up a screen, so beloved of science fiction films, much of it is so minutely detailed, but abstracted from anything remotely helpful, that it might as well just say “FAIL!”.

I’ve worked on Macintosh computers since the early ’90s and one of the advantages is the relatively small number of hardware configurations. However, even given identical specifications, sometimes things just don’t work for no reason you can identify. My parents would say “you’ve got to hold your mouth right while you do it”. This means that it’s achievable but not necessarily using logical or sound procedural methods, just luck.

Installing Ubuntu Linux on legacy Macintosh computers has been like that.

It was amazingly easy on my knackered and flaky first-generation MacBook, running OS X 10.6 Snow Leopard. Partially following instructions I found online, I used the included Boot Camp utility (originally intended to allow dual booting of OS X and Windows) to create a second partition, then installed Ubuntu 11.10 on it from a CD boot disk.

However, this is all academic and I have a job to do. Although it’s interesting and valuable to be able to re-purpose otherwise obsolete computers, this geekery is not what the project is about. I need to keep in sight the fact that the only reason I am using the Macs is that they are free and not doing anything else, and the project has no funding. Out in the private sector, where I work, I need to achieve some results and get these machines producing something.

I have four working blue-and-white G3s and a 400 MHz G4. I have managed to get two of the G3s to run Ubuntu 8.04.1 and the G4 to run Ubuntu 10.04. I am certain the problems I have had with the other two are peculiar to faults with the individual machines, rather than PowerPC Linux, but I am yet to solve them. I was hoping to get all four G3s working as I have an idea for an art installation and need three, so this would give me a spare.

Anyway, it’s time to move on.

I’m convinced that Linux (rather than MacPorts on OS X) is the way forward, and I’m not going to give up on these working machines, but I have more to achieve.

Stay tuned for performance tests.

Flight of the Phoenix (1965, dir. Robert Aldrich)

Flight of the Phoenix (1965) is a rollicking good adventure yarn about a group of heroic failures whose plane crashes in the desert and they attempt to build a new one from the leftovers. It’s a great film and I highly recommend it, although please don’t bother with the bizarre, homoerotic and Hollyoaks-esque 2004 remake.
http://www.imdb.com/title/tt0059183/

Here is some detail about my ongoing quest to bequeath a new lease of life to old, wilderness-bound PowerPC-based Macintosh computers.

Having successfully installed Ubuntu 8.04.1 on a blue-and-white G3 (see video), a few days later it refused to boot either from the internal hard drive or the boot CD. I have three other similar G3s and they all refused to boot from the CD. One of them just sat there, making a metallic flapping sound. I still don’t know why.

In a slight aside, sometimes both the G3s and G4s just refuse to start up. No lights, no action, no nothing. A couple of tips I picked up somewhere are 1) try pulling the clock battery and 2) unplug it and leave it for an unspecified amount of time. Both seem to have succeeded for me on occasion.

So, after an initial success, I couldn’t get any of the G3s to boot from the 8.04.1 CD ROM. So I tried the 6.06 Dapper Drake disk and this installed successfully.
http://old-releases.ubuntu.com/releases/6.06/

However, when I tried to install gPhoto2 and FFmpeg, it could not find the repositories. I guess the release is old and they’ve all moved. Happily, the G3 would now boot from the 8.04.1 Hardy Heron alternate install disk and I successfully upgraded the system to 8.04.1.
http://cdimage.ubuntu.com/ports/releases/hardy/release/

Linux MintPPC seems to have many successful installations but I couldn’t get it to work.
http://www.mintppc.org/content/imac-g3-bondi-blue-and-bw-g3

At this point I bet you’re thinking “I’m sorry, but just remind me why you’re telling me all this.”

It’s a long story, but the upshot is that I have a load of old Macs that are doing nothing and I’d like to put them to work. However, Apple OS X has abandoned them although there is life in the old monkeys yet. It’s not just about Macs though, and I have work for lots of computers. Linux is platform-independent and has some very valuable, although almost invisible and ambiguously documented, resources that can be pressed into service. And, in the spirit of open-source, I’m publishing the information as I go along.

I’d like to use the blue-and-white G3s to capture timelapse images from Canon Powershot cameras and for the G4s to collate those images into movies. Although they do not have the horse-power to play the movies back, they do have the monkey-power to capture, collate and compress the stills into usable files that can either be transferred to another playback device, or uploaded to the internet. To this end, I’m testing which systems will work on these old machines and writing shell scripts. Having dismissed the use of MacPorts on OS X (on pre-Intel machines), the next stage will be to write some scripts and test the performance of Linux-based utilities.

Conclusion

Ubuntu 8.04.1 seems viable on blue-and-white G3s, and I have successfully installed FFMpeg, gPhoto2 and ImageMagick.

Ubuntu 10.04 Lucid Lynx seems viable on G4s, and I have successfully installed FFMpeg, gPhoto2 and ImageMagick.

Stay tuned for performance comparisons and some actual code.