

Flying Monkey TV (FMTV) has a busy weekend ahead. On Friday I will be making some micro-docs and mini-promos for the forthcoming Castlegate Festival on the 20th & 21st June. Friday is the set-up day and not open to the public, but I will be interviewing the artists, shooting the set-up and environs, and making a few short videos about the activities to go online before the event.


On Saturday 20th I will be joining the People’s Assembly End Austerity Now public demonstration in London, and documenting that in whatever way I can. I am not planning any editing on-the-fly, but might publish a few clips from my iPhone. Other than the iPhone 5s, I will be taking a basic GoPro Hero, possibly a Canon EOS 550D and the RODE SmartLav+ mics. I have improvised a couple of wrist straps for the GoPro & iPhone and will also be taking a 12,000 mAh NiMH battery to recharge the cameras throughout the day. I don’t have any specific agenda for my shooting and have no official role, so I’m just going to see what happens.

On Sunday 21st I will be back in Castle House in Sheffield to join the Castlegate Festival and will be on-site throughout the day to talk to people about FMTV and how I go about making videos.

The most important aspect of this activity for FMTV is the workflow. The whole point of FMTV is to use easily available technologies (that is, not professional equipment), but still achieve good quality and get it published quickly. Sure, we can all shoot a bit of smartphone video and get it online very quickly, but the aim of FMTV is to make it as near broadcast quality as is reasonably possible.

The smallest hurdles to making half-decent TV are at the beginning and the end, that is the capture and the publishing. The world is awash with cheap, high-quality cameras and I have more online video capacity than I can possibly fill, but it’s the workflow between the two that requires the skill, and this is where it gets more difficult.


A pair of RODE SmartLav+ mics with twin adapter, E-Prance DashCam, GoPro Hero, 12,000 mAh USB battery (with iPhone 5s in the reflection) & clip-on fish-eye lens.

While there's nothing wrong with posting informal, unedited videos online, if you have more purpose in mind it is often desirable to edit out the umms and ahhs, and the WobblyCam shots. I bought the iOS version of iMovie and have looked at it several times but, beyond basic trimming, it doesn't do anything I want. I have no doubt it will mature, but I have decided to edit using Final Cut Pro 6 (FCP6) on an oldish MacBook. There are other video editing apps for iOS, but I don't have an iPad yet and editing on an iPhone screen is nothing less than painful.


FiLMiC Pro

However, on the plus side, when I was looking for apps that would record audio from external mics whilst shooting video (another thing iMovie does not do), I bought an app called FiLMiC Pro by Cinegenix, which is superb (I can't find the price now, but it was only a few pounds). Unlike the built-in Camera app, FiLMiC Pro allows you to create presets that store different combinations of resolution, quality and frame rate, rather than the built-in take-it-or-leave-it settings. I have found some less than complimentary reviews citing its flakiness, and I have had a few problems, but only a few. It allows you to choose where in the frame to focus and which area to expose for, and to lock both settings.
http://www.filmicpro.com/apps/filmic-pro/

So, to shoot the promos / mini-docs, my main camera will be the iPhone 5s, controlled with FiLMiC Pro, with sound for interviews from a pair of RODE SmartLav lapel mics.
http://www.rode.com/microphones/smartlav

Then I will download the video clips to the Mac and transcode them into Apple Intermediate Codec (AIC) for editing. Some apps seem to be able to edit inter-frame encoded video (such as the H264 from the iPhone), but in my experience FCP6 does not handle it well during editing. I don’t know much about the specification of AIC, and it appears to be a lossy format, but it works well for me.

For the edits that I want to get online quickly, I will master to 1280x720p 25fps using a square overlay. This is so that I can edit the videos only once, in 16:9 aspect ratio, but they will be action-safe for uploading to Instagram, which crops everything square by default. The reason that we (the Studio COCOA team) decided to make things Instagram-friendly is because it is populated by a younger audience than us middle-aged Twitter freaks, and is appropriate for the activities taking place over the weekend.
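For illustration, here's the arithmetic of that square overlay (a throwaway snippet, assuming the 1280x720 master described above): the centred square that survives Instagram's crop is 720x720, offset 280 pixels from the left edge.

```shell
# Centred square crop of a 16:9 frame: width, height and horizontal offset of
# the "action-safe" area. Frame size taken from the post (1280x720).
w=1280
h=720
offset=$(( (w - h) / 2 ))
echo "square crop: ${h}x${h}+${offset}+0"   # prints: square crop: 720x720+280+0
```

So anything essential, titles included, has to sit within the middle 720 pixels of the frame.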


I also bought a stabilizer mount, not specifically for this job, but it was very cheap and comes with a smartphone mount, and this might correct some of the unintended WobblyCam shots when operating such a small camera. I also intend to experiment a bit with some other features of the iPhone 5s’ superb built-in camera, such as the panoramic still capture and the 120 fps slow-motion video capture. I also have a clip-on fish-eye lens (which fits any smartphone) that is remarkably good.

And finally, I will also be capturing timelapse, as I usually do, using CHDK-hacked Canon Powershot cameras, and will also try out the timelapse function on the GoPro Hero.
http://chdk.wikia.com/wiki/CHDK

Stand by:
https://instagram.com/bolam360
https://twitter.com/RBDigiMedia
http://flyingmonkey.tv

#CastlegateFest
https://twitter.com/castlegatefest
http://cocoartists.co/



After something of an enforced hiatus, Flying Monkey TV has been invited to be part of the Castlegate Festival in Castle House (the old Co-op building), Sheffield, UK over the weekend of 20th & 21st June. Please note, I will only be there on Sunday 21st but will have some video showing on the 20th.

I will be making micro docs & promo videos, 15 seconds long so that they are Instagram-friendly, and publishing them online. I will be covering the set-up of the event on Friday 19th, and will be on-site for chats and a kind of drop-in nano workshop on the Sunday. It’s not a workshop exactly, but I will be making short videos and showing my methods and how to make the media suitable for online social media channels.


Castle House, a disused department store.


Castle House, the building is huge, both spooky and inspiring.

The event is a collaboration between the University of Sheffield and Studio COCOA (Castlegate Open Community of Artists), a collective of artists organised by current Yorkshire Artspace artist-in-residence Paul Evans.

More information about the Castlegate Festival and Studio COCOA is here:
http://www.sheffield.ac.uk/castlegatefestival/home
http://cocoartists.co/

If you are not already aware of what Flying Monkey TV is all about, it is a project conceived by myself and artist / media producer Matt Lewis back in 2010 to make documentaries using domestic-level, available technology and open-source software to manage the media overload that the proliferation of cameras has produced.

We applied to a number of funding sources and did attract a small amount of money for research and development of both the ideas and techniques. However, "available technology" has advanced so quickly that it has been difficult to know where to place the goal posts, and we have not succeeded in attracting more than a few hundred pounds here and there. Consequently, the project is running on an as-and-when basis. This is one of those whens.


Castle House has seen better days.

This iteration of the project will be using an iPhone 5s (I know, not the most available technology) with RØDE smartLav+ microphones and a stabilizing camera mount in order to shoot the best possible quality video and audio (for a domestic set-up). The iPhone 5s has a superb built-in camera, but the Camera app only gives limited control and does not allow capture of audio from external mics. I will be using an app called FiLMiC Pro to shoot with, and Final Cut Pro 6 to edit the video. Finally I will be compressing the videos for online delivery using Apple QuickTime Pro. The videos will be shot and edited with social media channels in mind.

I will also be capturing the whole thing on CHDK-hacked timelapse cameras (obvs).
http://chdk.wikia.com/wiki/CHDK

This is not a no-cost set-up, but it is intended to show how you can achieve near broadcast quality with only a tiny, one-person set-up. Below are a couple of very quick videos I made recently using the iPhone 5s and the smartLav+ mics (I have ordered a stabilizer to smooth out some of that camera shake). I will be around all day on Sunday 21st so please come along and say hello.

Richard Bolam
Twitter RBDigiMedia
Instagram bolam360


Yogi from Mars 3D

Going off-piste again this week, kinda. Randomly, I wondered just how easy it would be to capture and process stereo 3D timelapse with open-source tools. Having mostly ignored 3D, and being largely unimpressed by its cinema application, I was still wondering what the killer application for stereoscopic photography might be.

These days, a number of single lens cameras have a “3D” function which stitches together a number of exposures into a navigable image that allows the point-of-view (POV) to be changed, interactively. To my mind this is not really 3D, it’s like the moving lenticular images we used to collect in the 1970s. What I am interested in is true stereoscopic imaging, which requires genuine binocular vision to give a convincing effect of depth.

Tim Dashwood has written an excellent introduction to stereo/3D photography that I do not intend to duplicate, but what I am going to cover is the specifics of doing it with CHDK, FFmpeg and ImageMagick.
http://www.dashwood3d.com/blog/beginners-guide-to-shooting-stereoscopic-3d/

This is just an introductory blog post and I’m not going to get to any workflows just yet.

Stereo imaging has been around since 1840, almost as long as photography itself, and here are some amazing stereographs captured during the American Civil War.
http://www.theatlantic.com/infocus/2012/02/the-civil-war-part-3-the-stereographs/100243/

Some of these give away the fact that they were shot with one camera in two positions.


I was introduced to stereoscopic 3D in the 1970s via my sister's View-Master, but this was not much more sophisticated than the widely available stereo viewers of the nineteenth and early twentieth centuries.


The documented optimum lens separation for human vision is 65mm, give or take, and my eyes are pretty much exactly that distance apart. However, according to various sources, larger separations work fine for some subjects, particularly landscapes and distant objects. I will be doing some tests with different sized subjects.

For reference, I am not interested in red/cyan anaglyph images, because of the weird colour effects, but I am going to use the technique for the sake of being able to display them on vanilla monitors. Not everyone can do it, but I can also do the cross-eyed right-left trick too, although it's not practical for any length of time.

Recently, I bought a 3D LED TV that supports passive polarised glasses; I first saw an excellent demo of a Fujifilm W3 3D camera displaying media on LG monitors at a photography trade show in 2010.

There are some excellent existing resources for shooting stereographs with CHDK, including the StereoDataMaker site, and Gentles Ltd, and I’ll add more info about other resources soon.

I have mixed feelings about 3D and am not sure just what I really want to do with it, but how hard can it be? I was not sure how to prepare the media and assumed it would be more difficult than it is. It turns out that processing pairs of images is very easy in ImageMagick and, as far as polarised-light monitors go, all the cleverness is done in the screen, so you just have to give it two images side-by-side.

It never occurred to me that it would be so easy.

So, side-by-side is my eventual destination format, but using red/cyan anaglyph for convenience and online dissemination.

Anyway, I'm running out of time, but I might update this post later. In the meantime here are a few links, and I'll post more soon with some code and practical tests.

Stereoscopy.com
http://www.stereoscopy.com/gallery/

History of Stereography
http://www.arts.rpi.edu/~ruiz/stereo_history/text/historystereog.html


Having been flying around the country over the past couple of weeks, I’ve shot a lot of timelapse out of the front windscreen. Just because I can.

It's largely thanks to CHDK that I have managed to record myself getting lost on routes between Sheffield, Lancaster, Leeds, Newcastle, Nottingham and York. I have a few plans for this media, but one journey is already online here:

This was quite a successful Flying Monkey TV experiment and I had it edited and online within three hours of getting home.

These journeys were mostly shot on a Canon PowerShot A560, timelapse-enabled with CHDK, and mounted on the inside of the front windscreen with a suction mount (see below). I mounted the camera hanging from the top of the screen, upside down. This meant that it was in line with the passenger side roof pillar and hence did not obscure my view.

I don’t use the auto-rotate feature in most cameras because it sometimes gets confused and can give you a few incorrectly rotated frames here and there. Consequently, shooting this way, I end up with an upside-down video that then needs rotating.


I used to drop them into Final Cut Pro, render them and export them to a new movie file, which works fine, but I think it’s time to write a Bash script to do it.

No sooner said than done! Well, not quite, but it was much easier than I imagined. As usual, Google managed to find some very helpful resources for me to cannibalise. Writing the code is much faster than documenting it in a usable blog post.

I started by reminding myself how to create a basic for-next loop (detail included here for other noobs). Here is a bit of basic code which lists the files in the present working directory ending in ".JPG" (don't forget this is case-sensitive) and echoes them to the screen. The semicolons separate the statements and the "done" terminates the loop.

for i in $(ls *.JPG); do echo $i; done

For the next step, instead of just listing the file names to the screen, I added the ImageMagick step to rotate the image and write it over the original.

for i in $(ls *.JPG); do convert $i -rotate 180 $i; done

As you might have already gathered, I like some progress feedback, so I added an echo with the file count and file name.

for i in $(ls *.JPG); do convert $i -rotate 180 $i; x=$(($x+1)); echo "$i $x"; done

However, a current file number is only of use if you know how many more to go. A bit of googling revealed this forum thread and the code:

ls -l | wc -l

This lists all the files in the current directory, then pipes that listing to wc, which counts the lines. Note that when listing a whole directory, ls -l adds a "total" header line, so the count comes out one too high. Filtering isn't strictly necessary if this is just a temporary folder that's cleared between uses, but if you want to restrict the count to particular file types (which also drops the header line) you can add a wildcard search like this:

ls -l *.JPG | wc -l
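A quick demonstration of why the wildcard version matters (safe to run, it only touches a throwaway temporary directory): when listing a whole directory, ls -l prepends a "total" header line, so the plain count comes out one high.

```shell
# Three files in a fresh directory: the "total" header inflates the plain count
d=$(mktemp -d)
cd "$d"
touch a.JPG b.JPG c.JPG
ls -l | wc -l          # prints 4 (three files plus the "total" line)
ls -l *.JPG | wc -l    # prints 3
```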

I found information about the two commands on linux.about.com and a forum thread which combines the two on unix.com.
http://linux.about.com/od/commands/l/blcmdl1_ls.htm
http://linux.about.com/library/cmd/blcmdl1_wc.htm
http://www.unix.com/unix-dummies-questions-answers/36490-ls-command-listing-number-files.html

So my final piece of code (for this iteration at least) is here:

x=0; c=$(ls -l *.JPG | wc -l); for i in $(ls *.JPG); do convert $i -rotate 180 $i; x=$(($x+1)); echo "$x /$c $i"; done

This sets x to 0 and c to the number of files with names ending in ".JPG" in the current directory, then rotates each of them by 180 degrees, overwrites the original file and echoes the file number, the file count and the filename. It's pretty basic and I'll roll it into a script soon.

There may well be better ways of doing this, but it worked well as an exercise to reinforce my learning.

The script here is only a slightly cleaned up version of the previous one, but also with a major addition. A timer.

It may just be that I'm new to Linux, but getting the syntax correct was very unintuitive. It took a lot of trial-and-error to get the spaces in the right place, and it reminds me a lot of programming in the 1980s, where you just get a "syntax error" message and nothing else.

This timer function uses the classic technique of storing the system time at the start of the script and then again once the script has finished, subtracting the former from the latter and the difference is the time taken.

This function only returns the number of seconds as an integer, and a result of "1350575350" might as well be in grains of sand. So I have reformatted the output into more human-readable minutes and seconds, dividing the number of seconds elapsed by 60 to get the minutes and taking the modulus (what's left over) for the seconds. I could do a similar thing with the value 3600 if I wanted to format it in hours, minutes and seconds.
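The arithmetic itself is just integer division and modulus; for example (245 is an illustrative value, not from a real run):

```shell
# Minutes-and-seconds formatting: integer division for the minutes,
# modulus for the remainder
elapsed=245
mins=$((elapsed / 60))
secs=$((elapsed % 60))
echo "$mins minutes $secs seconds"   # prints: 4 minutes 5 seconds
```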

Strangely, when I tried date +%2s (which returns the current system time in seconds) at the command line in OS X it failed, but not on the same machine whilst booted into the Linux partition. Turns out, it was a typo but Linux is happy to ignore this glaring mistake (should be date +%s) whilst OS X is not. Strange when it’s so particular about other syntax.

However, all complaining aside, I have got some actual useful working code up and running much more easily than I had expected.

#!/bin/bash

clear
echo "Hello Monkey Planet"
starttime=$(date +%s)

# Empty both working folders
cd /home/richard/Desktop/RBTest/newtest
rm *.JPG
rm *.jpg

cd /home/richard/Desktop/RBTest/testtemp
rm *.JPG
rm *.jpg

cd /home/richard/Desktop/RBTest/newtest
#gphoto2 --get-all-files

# Hard-link the images into the temp folder in modification-time order,
# renumbered as an unbroken img_0001.jpg sequence for FFmpeg
x=1; for i in $(find $(pwd) -name \*.JPG | xargs ls -t -r); do counter=$(printf %04d $x); ln "$i" /home/richard/Desktop/RBTest/testtemp/img_"$counter".jpg; x=$(($x+1)); done

cd /home/richard/Desktop/RBTest/testtemp
ffmpeg -r 25 -i img_%04d.jpg -s 640x480 -qscale 1 -vcodec mjpeg movie.mov
rm *.jpg

stoptime=$(date +%s)
fulltime=$(expr $stoptime - $starttime)
fulltimesecs=$(expr $fulltime % 60)
fulltimemins=$(expr $fulltime / 60)

echo "$fulltimemins minutes $fulltimesecs seconds"

I suppose it would be good to structure this code at some point, turning the timer into a function, for instance, so that I could call it from any script. There is probably a built-in way to do this already.

More soon…

There are some technologies that none of us will miss, such as CRTs, the SCSI interface, VHS and (for me) optical storage media (CDs and DVDs). But one technological era I am particularly glad to leave behind is the 1980s, characterised by cassette-loading or actual typing in of huge BASIC programs, distributed in magazines.

These days we have an embarrassment of riches in the form of the internet. For a programmer, it's a goldmine of free resources. However, something that is often missing is an exhaustive deconstruction of the code, particularly useful to noobs. I'm a lapsed programmer, so I don't find it too hard, but Linux is new to me and has some strange “conventions”.

Anyway, as promised here is some actual code that I found (well, Google found it and I copied it) with some explanation about what the bits and pieces do. I will probably update this a bit and please let me know if you spot any errors.

FFMpeg requires an unbroken sequence of image files starting at 1 and will fail if the numbers are not continuous. This is not optional and so it is best to rename them as a matter of course.

The following two code snippets are from stackoverflow.com and I've adapted them slightly.

http://stackoverflow.com/questions/2829113/ffmpeg-create-a-video-from-images

This is the script for renaming the files.

x=1; for i in *jpg; do counter=$(printf %03d $x); ln "$i" /tmp/img"$counter".jpg; x=$(($x+1)); done

And here is the original FFMpeg command from the same source.

ffmpeg -f image2 -i /tmp/img%03d.jpg /tmp/a.mpg

Note:

24 seconds to initialise the PowerShot A640 and another 9 minutes to download 2,595 images totalling 2.7 GB, over USB 2 on a 1st-generation MacBook (1.83 GHz Intel dual core) running Ubuntu Linux 11.04.

Sort by creation date/time

x=1; for i in $(ls -t -r *JPG); do counter=$(printf %04d $x); ln "$i" /tmp/img_"$counter".jpg; x=$(($x+1)); done

The $( some stuff ) construct evaluates everything within the brackets before using the result. In this case, $(ls -t -r *JPG) returns a list of files ending in "JPG", sorted into ascending modification-time order. The for i in $(ls -t -r *JPG) is a for..next loop where i takes each file in that list in turn.
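For example (throwaway commands, just to show the evaluation order):

```shell
# The inner command runs first and its output replaces the $( ... )
upper=$(echo monkey | tr a-z A-Z)
echo "$upper"             # prints: MONKEY
here=$(basename $(pwd))   # substitutions can nest; innermost evaluated first
```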

The snippet counter=$(printf %04d $x) resolves the integer variable x into a 4-digit decimal number with leading zeros and puts it into the variable counter. The leading zeros are necessary in order to maintain the 8-character file names so that they sort alpha-numerically. The snippet x=$(($x+1)) increments the variable x by integer 1 after each renaming operation.
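Both snippets can be tried in isolation at the prompt:

```shell
# Zero-padding and incrementing, step by step
x=7
counter=$(printf %04d $x)
echo "$counter"    # prints: 0007
x=$(($x+1))
echo "$x"          # prints: 8
```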

It's important to get the capitalisation and the destination pathname correct, or it will just fail.

The -t modifier sorts by modification time, most recent first. The -r modifier reverses that order to give, in this case, the correct chronological order.

Canon Powershot cameras automatically number image files in the format "IMG_0001.JPG", create a new folder every 2,000 images, and restart the numbering from IMG_0001.JPG after reaching IMG_9999.JPG.

Consequently, it is possible that for a timelapse shoot the file names will not be in continuous alpha-numeric order. Sometimes they will be, but it's as well to assume that they won't. Also, gPhoto2 does not preserve the folder structure if you download using the gphoto2 --get-all-files command, and will dump the whole lot into a single folder.

In this case, it is very likely that the alpha-numeric order will not be strictly chronological. As a result, it was necessary to amend the code to sort the files into chronological order before renaming them.

More soon.

Rockingham Street Q-Park car park, Sheffield, UK.

After a partially voluntary and partially involuntary hiatus, I am back in flight. After significant work commitments, then getting married with a honeymoon in Barcelona, ES, I am back in Sheffield, UK, the greenest city in England. As this project is currently unfunded, and solo, I’m updating things as-and-when.

I promised you some performance tests and here they are. I am perfectly satisfied with the installations of Ubuntu Linux 8.04.1 on blue-and-white G3s and 10.04 on grey-and-white Macs. As suspected, the GUI is a major burden on the system, and opening a folder of about 3,000 jpeg images can take a full 5 minutes because of the overhead of loading all the thumbnails.

The solution is simple. Don’t.

For my purposes this is no problem. I want to use these machines as workhorses and don’t need the GUI.

I'm going to be using FFmpeg to convert and compress a folder full of images into a movie. However, I came across a well-documented problem immediately: FFmpeg will fail if the first file is not numbered xxxxxx1, or if there is a gap in the numbering.

Fortunately, I easily found a Bash script to sort this out, and I found it here:

http://stackoverflow.com/questions/2829113/ffmpeg-create-a-video-from-images

x=1; for i in *jpg; do counter=$(printf %03d $x); ln "$i" /tmp/img"$counter".jpg; x=$(($x+1)); done

(Also see the revision below).

This works, but note that the filename extensions are case-sensitive. Helpfully (not), the script just fails if you do not get this correct. It renames all the images in a folder, numbering from 1, so a sequence that does not start at 1, or is missing one or more images in the middle, will no longer cause FFmpeg to fail. I'll deconstruct this script in a later blog post.

Even more usefully, Linux does not actually duplicate the binary files; ln just creates hard links, additional names for the same data, which can be safely deleted afterwards.
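A quick way to convince yourself of this (safe to run anywhere, it only touches throwaway files in a temporary directory):

```shell
# ln makes a second name for the same data; deleting one name
# does not delete the data
d=$(mktemp -d)
cd "$d"
echo "frame data" > original.jpg
ln original.jpg link.jpg   # second name, not a second copy
rm link.jpg                # deleting the link...
cat original.jpg           # ...leaves the original intact: prints "frame data"
```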

These tests were done on a 1 GHz PowerMac G4 with 640 MB RAM & a 21 GB HD.

Rename 2933 files – 23 seconds

Using FFmpeg:

Encode jpeg files at 1600×1200 to mjpeg – approx 9 min 30 secs – 192.4 MB
Encode jpeg files at 1600×1200 to mjpeg – approx 8 min 11 secs – 192.4 MB
Encode jpeg files at 1600×1200 to mjpeg – approx 8 min 12 secs – 192.4 MB

Encode jpeg files at 640×480 to mjpeg AVI – approx 10 min 15 secs – 40.4 MB – atrocious quality
Encode jpeg files at 640×480 to mjpeg AVI – forgot to time it but similar – 199.5 MB – quality set to 1

This last set of tests was done whilst messing about within the GUI. I don’t really know enough about what is going on under the hood, but it seems that things slow down a lot and this might be due to caching thumbnails or, err, whatever. After a restart performance increased dramatically.

Rename 2,933 items – 1’05’’
Rename 2,933 items – 1’12’’
Rename 2,933 items – 0’20’’ (after restart)

For my purposes the performance seems acceptable at this stage.

One other revision I had to make was to compensate for the original file numbering restarting. Canon Powershot cameras create jpeg images with filenames of the format “IMG_xxxx.JPG” and shooting timelapse with them quickly reaches numbers in excess of 9,999. For consumer-level snap cameras, this number is probably well in excess of a normal user’s needs but I can easily shoot that many in one day. Consequently, the image numbers cannot solely be relied upon to reference the images in the correct chronological order, hence this revision which sorts them into creation order before renaming.

x=1; for i in $(ls -t -r *JPG); do counter=$(printf %04d $x); ln "$i" /Users/richardbolam/Desktop/testout/img_"$counter".jpg; x=$(($x+1)); done

I’m still a bit of a noob when it comes to Bash scripting, but I will write another post soon that disassembles these scripts, partly for your benefit, but also to help me learn more about what I’m doing. This script was revised with the invaluable help of the technical staff at Access Space. Stay tuned.

access-space.org