Home | Alastair Barber
Update: Check out the FxGuide DigiPro Report
On the 23rd of July, I’ll be presenting a paper at the Digital Production Symposium (DigiPro) in Anaheim called Camera Tracking in Visual Effects, An Industry Perspective of Structure from Motion (Free Download).
The overall aim is to examine why, given how much research the computer vision community has put into automatic camera tracking, we in VFX still spend so much time and human effort recovering camera movement, even with access to the latest tools, equipment and research.
I did this work as an earlier part of my EngD and it’s written up in the Elsevier Computers & Graphics journal (free preprint version). The idea came about from earlier work by Klein & Drummond, who used motion blur to produce a single-frame visual gyroscope. We had a go at extending this to see if you could accurately determine changes in focal length from motion blur.
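To get a feel for the geometry (this is a toy illustration of the intuition, not the method from the paper): under a pinhole model, a point imaged at radius r from the principal point projects at r = f·R/Z, so a focal length change Δf during the exposure moves it radially by roughly r·(Δf/f). Blur streak length therefore grows linearly with distance from the image centre, and a least-squares line through the origin recovers Δf/f. All names below are hypothetical.

```java
// Toy sketch: estimate relative focal length change (df/f) from blur
// streaks, assuming a pure zoom during the exposure so that each point
// at radius r from the principal point smears by approximately r*(df/f).
class ZoomBlurEstimate {
    // radii:   distance of each measured point from the image centre (pixels)
    // streaks: measured radial blur streak length at each point (pixels)
    static double relativeFocalChange(double[] radii, double[] streaks) {
        double num = 0, den = 0;
        for (int i = 0; i < radii.length; i++) {
            num += radii[i] * streaks[i]; // least-squares fit through origin
            den += radii[i] * radii[i];
        }
        return num / den; // estimate of df/f
    }
}
```

For example, streaks of 2, 4 and 6 pixels at radii of 100, 200 and 300 pixels are all consistent with a 2% focal length change.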
JASIOHost is based on the observer pattern of software design. To gather audio data from an audio input, you write a class that handles the data as it is returned, in much the same way you would handle UI events.
Here’s an example…
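As a self-contained sketch of that observer pattern (the class and method names below are illustrative, not the actual JASIOHost API, which defines its own listener interface):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical listener interface, standing in for the callback interface
// an audio host library would define.
interface AudioInputListener {
    // Called each time a buffer of samples is ready.
    void bufferReady(float[] samples);
}

// Minimal "audio input" that notifies registered listeners, the same way
// a UI component notifies its event handlers.
class AudioInput {
    private final List<AudioInputListener> listeners = new ArrayList<>();

    void addListener(AudioInputListener l) {
        listeners.add(l);
    }

    // In a real library this would be driven by the audio driver's callback
    // thread; here we push a buffer by hand to show the flow of control.
    void deliver(float[] samples) {
        for (AudioInputListener l : listeners) {
            l.bufferReady(samples);
        }
    }
}

// A concrete handler: tracks the peak level of the most recent buffer.
class PeakMeter implements AudioInputListener {
    float lastPeak = 0f;

    @Override
    public void bufferReady(float[] samples) {
        float peak = 0f;
        for (float s : samples) {
            peak = Math.max(peak, Math.abs(s));
        }
        lastPeak = peak;
    }
}
```

You register a `PeakMeter` with `addListener`, and every buffer the input delivers updates its `lastPeak` without the input knowing anything about metering.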
For my final year project, I had to design and code a digital audio mixing program with a multi-touch table as its interface. Although I had access to some (very good) FTIR (Frustrated Total Internal Reflection) tables in the lab at my university, I thought it’d be cool to try to build one myself over the winter holiday. Here are some pointers from my experience…