
Tag Archives: fmtv

COCOA Castle House graphics v1

Flying Monkey TV (FMTV) has a busy weekend ahead. On Friday I will be making some micro-docs and mini-promos for the forthcoming Castlegate Festival on the 20th & 21st June. Friday is the set-up day and not open to the public, but I will be interviewing the artists, shooting the set-up and environs, and making a few short videos about the activities to go online before the event.


On Saturday 20th I will be joining the People’s Assembly End Austerity Now public demonstration in London, and documenting that in whatever way I can. I am not planning any editing on-the-fly, but might publish a few clips from my iPhone. As well as the iPhone 5s, I will be taking a basic GoPro Hero, possibly a Canon EOS 550D, and the RODE SmartLav+ mics. I have improvised a couple of wrist straps for the GoPro & iPhone and will also be taking a 12,000 mAh NiMH battery to recharge the cameras throughout the day. I don’t have any specific agenda for my shooting and have no official role, so I’m just going to see what happens.

On Sunday 21st I will be back in Castle House in Sheffield to join the Castlegate Festival and will be on-site throughout the day to talk to people about FMTV and how I go about making videos.

The most important aspect of this activity for FMTV is the workflow. The whole point of FMTV is to use easily available technologies (that is, not professional equipment), but still achieve good quality and get it published quickly. Sure, we can all shoot a bit of smartphone video and get it online very quickly, but the aim of FMTV is to make it as near to broadcast quality as is reasonably possible.

The smallest hurdles to making half-decent TV are at the beginning and the end: the capture and the publishing. The world is awash with cheap, high-quality cameras and I have more online video capacity than I can possibly fill, but it’s the workflow between the two that requires the skill, and this is where it gets more difficult.


A pair of RODE SmartLav+ mics with twin adapter, E-Prance DashCam, GoPro Hero, 12,000 mAh USB battery (with iPhone 5s in the reflection) & clip-on fish-eye lens.

While there’s nothing wrong with posting informal, unedited videos online, if you have more of a purpose it is often desirable to edit out the umms and ahhs, and the WobblyCam shots. I bought the iOS version of iMovie and have looked at it several times but, beyond basic trimming, it doesn’t do anything I want. I have no doubt it will mature, but I have decided to edit using Final Cut Pro 6 (FCP6) on an oldish MacBook. There are other video edit apps for iOS, but I don’t have an iPad yet and editing on an iPhone screen is nothing short of painful.


FiLMiC Pro

However, on the plus side, when I was looking for apps that would record audio from external mics whilst shooting video (another thing iMovie does not do), I bought an app called FiLMiC Pro by Cinegenix, which is superb (I can’t find the price now, but it was only a few pounds). Unlike the built-in Camera app, FiLMiC Pro allows you to create presets that store different combinations of resolution, quality and frame rate, rather than the take-it-or-leave-it built-in settings. I have found some less than complimentary reviews citing its flakiness, and I have had a few problems, but only a few. It also lets you choose where in the frame to focus and which area to expose for, and it allows you to lock both settings.
http://www.filmicpro.com/apps/filmic-pro/

So, to shoot the promos / mini-docs, my main camera will be the iPhone 5s, controlled with FiLMiC Pro, with sound for interviews from a pair of RODE SmartLav lapel mics.
http://www.rode.com/microphones/smartlav

Then I will download the video clips to the Mac and transcode them into Apple Intermediate Codec (AIC) for editing. Some apps seem to be able to edit inter-frame encoded video (such as the H264 from the iPhone), but in my experience FCP6 does not handle it well during editing. I don’t know much about the specification of AIC, and it appears to be a lossy format, but it works well for me.
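
If you prefer to do the transcode from the command line, FFmpeg offers something comparable. As far as I know it does not encode AIC, but it can write ProRes, which FCP6 also edits natively. This is only a sketch, and the file names are placeholders:

ffmpeg -i IMG_0001.MOV -c:v prores -c:a pcm_s16le IMG_0001_edit.mov

Like AIC, ProRes is intra-frame, so every frame is encoded independently and scrubbing in the edit stays responsive.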

For the edits that I want to get online quickly, I will master to 1280x720p at 25 fps using a square overlay. This is so that I can edit the videos only once, in 16:9 aspect ratio, but they will be action-safe for uploading to Instagram, which crops everything square by default. The reason that we (the Studio COCOA team) decided to make things Instagram-friendly is that it is populated by a younger audience than us middle-aged Twitter freaks, and it suits the activities taking place over the weekend.
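
If you want to check what Instagram’s square crop will actually keep from a 16:9 master, FFmpeg can cut the central square out of a 1280x720 frame. Again, just a sketch with placeholder file names:

ffmpeg -i promo_720p.mov -vf "crop=720:720:280:0" -c:a copy promo_square_check.mov

The crop filter takes width:height:x:y, so 280 is simply (1280 - 720) / 2, which centres the 720-pixel square.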


I also bought a stabilizer mount, not specifically for this job, but it was very cheap, comes with a smartphone mount, and might correct some of the unintended WobblyCam shots that come with operating such a small camera. I intend to experiment a bit with some other features of the iPhone 5s’ superb built-in camera, such as the panoramic still capture and the 120 fps slow-motion video capture, and I have a clip-on fish-eye lens (which fits any smartphone) that is remarkably good.

And finally, I will also be capturing timelapse, as I usually do, using CHDK-hacked Canon Powershot cameras, and will also try out the timelapse function on the GoPro Hero.
http://chdk.wikia.com/wiki/CHDK

Stand by:
https://instagram.com/bolam360
https://twitter.com/RBDigiMedia
http://flyingmonkey.tv

#CastlegateFest
https://twitter.com/castlegatefest
http://cocoartists.co/


COCOA Castle House graphics v1

After something of an enforced hiatus, Flying Monkey TV has been invited to be part of the Castlegate Festival in Castle House (the old Co-op building), Sheffield, UK over the weekend of 20th & 21st June. Please note, I will only be there on Sunday 21st but will have some video showing on the 20th.

I will be making micro-docs & promo videos, 15 seconds long so that they are Instagram-friendly, and publishing them online. I will be covering the set-up of the event on Friday 19th, and will be on-site for chats and a kind of drop-in nano-workshop on the Sunday. It’s not a workshop exactly, but I will be making short videos and showing my methods and how to make the media suitable for online social media channels.


Castle House, a disused department store.


Castle House. The building is huge, both spooky and inspiring.

The event is a collaboration between the University of Sheffield and Studio COCOA (Castlegate Open Community of Artists), a collective of artists organised by current Yorkshire Artspace artist-in-residence Paul Evans.

More information about the Castlegate Festival and Studio COCOA is here:
http://www.sheffield.ac.uk/castlegatefestival/home
http://cocoartists.co/

If you are not already aware of what Flying Monkey TV is all about, it is a project conceived by me and artist / media producer Matt Lewis back in 2010 to make documentaries using domestic-level, available technology and open-source software to manage the media overload that the proliferation of cameras has produced.

We applied to a number of funding sources and did attract a small amount of money for some research and development of both the ideas and the techniques. However, "available technology" has advanced so quickly that it has been difficult to know where to place the goalposts, and we have not succeeded in attracting anything more than a few hundred pounds here and there. Consequently, the project runs on an as-and-when basis. This is one of those whens.


Castle House has seen better days.

This iteration of the project will be using an iPhone 5s (I know, not the most available technology) with RØDE smartLav+ microphones and a stabilizing camera mount in order to shoot the best possible quality video and audio (for a domestic set-up). The iPhone 5s has a superb built-in camera, but the Camera app only gives limited control and does not allow capture of audio from external mics. I will be using an app called FiLMiC Pro to shoot with, and Final Cut Pro 6 to edit the video. Finally I will be compressing the videos for online delivery using Apple QuickTime Pro. The videos will be shot and edited with social media channels in mind.
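
If you don’t have QuickTime Pro, FFmpeg will do a comparable job of the final compression for online delivery. This is only a rough equivalent of those export settings rather than exactly what I use, and the file names are placeholders:

ffmpeg -i castlegate_master.mov -c:v libx264 -crf 20 -preset slow -c:a aac -b:a 160k -movflags +faststart castlegate_web.mp4

The +faststart flag moves the index to the front of the file, so playback can begin before the whole video has downloaded.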

I will also be capturing the whole thing on CHDK-hacked timelapse cameras (obvs).
http://chdk.wikia.com/wiki/CHDK

This is not a no-cost set-up, but it is intended to show how you can achieve near broadcast quality with only a tiny, one-person set-up. Below are a couple of very quick videos I made recently using the iPhone 5s and the smartLav+ mics (I have ordered a stabilizer to smooth out some of that camera shake). I will be around all day on Sunday 21st so please come along and say hello.

Richard Bolam
Twitter RBDigiMedia
Instagram bolam360

I’m going a bit off-piste with this post and looking at the hidden-in-plain-view potential of some of the libraries that are very familiar to Linux geeks but pretty much unknown to the outside world.

Although I talk about the past a lot, I’m not nostalgic. I don’t miss the frustrations of the past. I just like to remind myself how good things are now. By the way, I am by no means a computer historian, and my opinions and experience are subjective. If I post anything that is factually incorrect, please let me know.

I realise some of this might be bleeding obvious to the Linux world, but for those of us cloistered for years in the walled garden of the Mac OS X GUI, it’s quite a revelation.

As I’ve said before, I’m a lapsed programmer and used to write commercial software using framework applications, primarily FileMaker Pro 3.0 amongst others. FileMaker has since developed into a very sophisticated rapid application development (RAD) platform, now that FileMaker Inc has got around to including the features that developers like me were clamouring for in the 1990s.

Given a choice, I am very happy tinkering with nice GUI-based visual programming / scripting tools, but the compelling reason for me to get my hands dirty at the command line is the fact that nothing gets in the way.

One of my most enduring love / hate relationships has been with HyperCard / SuperCard. I remember being excited, almost to the point of peeing myself, when I first got wind of HyperCard. My programming days started with BBC BASIC, then Spectrum BASIC, then the woefully under-implemented Commodore 64 BASIC. After that things just got worse and worse; Atari ST BASIC was absolutely useless. I can’t even remember if there was an implementation for the Amiga, but I did try CanDo, which was promising but way ahead of its time. As the years passed it became harder and harder to get to a command line, and I rather missed the direct simplicity of BBC or Spectrum BASIC, which were at least broadly complete implementations that you could actually do stuff with.

Update: I completely forgot about STOS on the ST and AMOS on the Amiga, which were both brilliant.
http://en.wikipedia.org/wiki/STOS_BASIC
http://en.wikipedia.org/wiki/AMOS_(programming_language)

HyperCard was a revelation to me: it ran on the desktop, you could easily create GUI-based applications with it, and it had a pretty decent plain-English programming language. You could actually do stuff with it.
http://en.wikipedia.org/wiki/HyperCard

As desktop computers “improved”, programming became more and more remote from the user, and there was a very frustrating period in the late 80s and early 90s where application software was not mature enough, but programming not accessible enough, to plug all the gaps in our productivity.

Fortunately those days are over.

There were many good things about HyperCard, but it lacked some fundamental functions, and for me they were colour and structured graphics. HyperCard was strictly monochrome and bitmap-only, and although Apple eventually did include plugins that supported colour graphics, it was an afterthought and not adequately implemented. Apple neglected HyperCard for years and it is now a minority-interest tool. I still use it for programming on old compact Macs, as it is one of the few programming tools that will work on machines with no more than 4 MB of RAM.

Silicon Beach’s SuperCard seemed to be the obvious successor, the so-called “HyperCard on steroids”. Its own language, SuperTalk, was an extended and mostly compatible development of HyperCard’s HyperTalk, and it supported 8-bit colour and script-controllable vector graphics.
http://en.wikipedia.org/wiki/Silicon_Beach_Software

SuperCard, whilst not neglected in the same way, passed from one owner to another for years, developing only gradually. I cannot tell you just how closely related it is to the original, but SuperCard appears to have evolved into something called LiveCode by Runtime Revolution.
http://www.runrev.com/

LiveCode deploys on a number of platforms including Windows, Linux, OS X, iOS and Android, and looks very promising. However, it seems to be lacking one of the fundamental features that I need, and that is to append single images to a movie file. Please correct me if I’m wrong, but I can’t find any mention of it, although I’m still looking into it.

And this is my big frustration when using very-high-level development tools. I tried using Apple Automator and QuickTime Pro player to assemble digital stills into movie files. Again, maybe it’s there but I couldn’t find it without having to operate the menu items in the player, and I really don’t want to go back to the days of using marionette software like Keyquencer (remember that?) or similar to operate programs via their menus. In my experience, automated clicking of buttons and selecting menu items is just not reliable or fast enough.

Adobe Photoshop’s Actions is a very powerful tool but it’s quite slow (although things may well have changed since CS3).

And this is when we get back to Linux, GIMP and ImageMagick. Whilst I have been banging my head against a brick wall for years, looking for the ideal development platform, Linux tools and libraries have been quietly maturing under my nose. I recently looked into using ImageMagick, which can do, well, everything. For the uninitiated, ImageMagick is a “software suite to create, edit, compose, or convert bitmap images” and can also procedurally create images by manipulating graphics or drawing shapes.
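
To give a flavour of the procedural side, here is a trivial sketch (the colours, sizes and text are arbitrary) that creates an image from nothing and draws on it, all from the command line:

convert -size 640x480 xc:midnightblue -fill white -draw "circle 320,240 320,120" -pointsize 36 -annotate +20+60 "Hello Monkey Planet" test_card.png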

There are some very coherent and complete help files, written by Anthony Thyssen, and they seem to go on forever, documenting more and more features available via the command line. And the command-line bit is the real killer, because it means I can call just the single function I want and add it to a Bash script.
http://www.imagemagick.org/Usage/

My own interest is in batch-converting, montaging of multiple images and format conversion of data sets, often comprising tens of thousands of images. I have managed to use gPhoto2 and FFmpeg to assemble the images into movies, and ImageMagick / GIMP will help me to manipulate the images if need be.
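
For example, ImageMagick’s montage tool will build a contact sheet from a folder of stills in a single line, and mogrify will batch-convert and resize them. These are just illustrative one-liners rather than anything from my actual workflow:

montage IMG_*.JPG -tile 6x -geometry +2+2 contact_sheet.jpg
mogrify -format png -resize 640x480 IMG_*.JPG

Because the -format option changes the output file type, mogrify writes a new .png alongside each original JPG rather than overwriting it.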

More soon…

The script here is only a slightly cleaned-up version of the previous one, but with one major addition: a timer.

It may just be that I’m new to Linux, but getting the syntax correct was very unintuitive. It took a lot of trial and error to get the spaces in the right places, and it reminds me a lot of programming in the 1980s, where you just get a “syntax error” message and nothing else.

This timer uses the classic technique of storing the system time at the start of the script and then again once the script has finished; subtracting the former from the latter gives the time taken.

This only returns the number of seconds as an integer, and a result of “1350575350” might as well be in grains of sand. So I have reformatted the output into more human-readable minutes and seconds, by dividing the number of elapsed seconds by 60 to get the minutes and taking the modulus (what’s left over) for the seconds. I could do a similar thing with the value 3600 if I wanted to format it in hours, minutes and seconds.
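
In other words, an hours / minutes / seconds version of the arithmetic would look something like this (a sketch using Bash arithmetic, with a made-up elapsed value):

elapsed=4523                        # example: 4523 seconds
hours=$((elapsed / 3600))           # 1
mins=$(( (elapsed % 3600) / 60 ))   # 15
secs=$((elapsed % 60))              # 23
echo "$hours hours $mins minutes $secs seconds"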

Strangely, when I tried date +%2s (which returns the current system time in seconds) at the command line in OS X it failed, but not on the same machine whilst booted into the Linux partition. Turns out, it was a typo but Linux is happy to ignore this glaring mistake (should be date +%s) whilst OS X is not. Strange when it’s so particular about other syntax.

However, all complaining aside, I have got some actual useful working code up and running much more easily than I had expected.

#!/bin/bash

clear
echo "Hello Monkey Planet"

# Store the system time (in seconds) at the start of the run
starttime=$(date +%s)

# Empty both working folders of any images left over from a previous run
cd /home/richard/Desktop/RBTest/newtest
rm *.JPG
rm *.jpg

cd /home/richard/Desktop/RBTest/testtemp
rm *.JPG
rm *.jpg

# Download the images from the camera (the gphoto2 line is commented out here)
cd /home/richard/Desktop/RBTest/newtest
#gphoto2 --get-all-files

# Hard-link the JPGs into the temp folder, renamed with a zero-padded
# sequence number in shot order (oldest first)
x=1
for i in $(find $(pwd) -name \*.JPG | xargs ls -t -r); do
    counter=$(printf %04d $x)
    ln "$i" /home/richard/Desktop/RBTest/testtemp/img_"$counter".jpg
    x=$(($x+1))
done

# Assemble the numbered stills into a 25 fps Motion JPEG movie
cd /home/richard/Desktop/RBTest/testtemp
ffmpeg -r 25 -i img_%04d.jpg -s 640x480 -qscale 1 -vcodec mjpeg movie.mov
rm *.jpg

# Store the time again and work out how long the whole run took,
# in minutes and seconds
stoptime=$(date +%s)
fulltime=$(expr $stoptime - $starttime)
fulltimesecs=$(expr $fulltime % 60)
fulltimemins=$(expr $fulltime / 60)

echo "$fulltimemins minutes $fulltimesecs seconds"

I suppose it would be good to give this code some structure at some point, for example turning the timer into a function that I could call from any script. There is probably something built into Linux for this already.
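
Something like this would do the job, assuming the function is either pasted into each script or kept in a separate file and pulled in with source. It is just a sketch:

#!/bin/bash

# Report the time elapsed since a given start time, in minutes and seconds
timer_report () {
    local start=$1
    local stop=$(date +%s)
    local elapsed=$(( stop - start ))
    echo "$(( elapsed / 60 )) minutes $(( elapsed % 60 )) seconds"
}

starttime=$(date +%s)
sleep 3    # stand-in for the real work
echo "Run took $(timer_report $starttime)"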

More soon…