Feb 1

The object of our affection: Panasonic’s AF100

AF100 test shots from Philip Bloom on Vimeo.

To say we’re obsessed with this under-$5,000 (no lens) camera is an understatement. The device looks a bit weird, but with its ability to offer beautiful DSLR-style images along with the audio controls videographers have come to expect, we’ll get over its boxy chassis.

More from Philip Bloom.

Jan 31

Art vs products: What motivates your creativity?

I wrote a lengthy essay about what motivates people, and how those motivations affect the products which they create. At one end is Art, a work which is created exclusively for the enjoyment of the artist and has no external benefits. At the other end is Product, a work which is created solely to satisfy some external validator, which often is money.

It’s a food-for-thought idea, and obviously relates to the creation of video. Check out the full post over on my other blog.

Jan 26

Getting started with Handheld rigs

Since we’re relative noobs in these DSLR parts, we’re digging this post from El Skid with their top 10 tips for handheld rigs. The top things worth considering? Try before you buy; they’re all a bit different. Also, an image-stabilizing lens might be enough to keep your shakycam to a minimum without adding all that bulk.


[Via @ryanbkoo]

Jan 25

Blender + DSLR commercial

Jan 21

Quality curve

Which one of these videos was shot by a record label, and which one was shot by a single dude?

I think most video production nerds can tell the difference, but I won’t spoil it.

The point? With the emergence of sub-$10k cameras that offer shallow depth of field and other high-end production options… we’re extremely close to the tipping point where “pro with no money” can compete effectively with “pro with bankroll.”

Of course, editing and shooting technique and on-camera talent will always distinguish good productions from bad, but money might no longer be a factor in the equation.

Jan 11

Google dropping H.264 support in Chrome

Ruh roh. Google says it’s dropping H.264 support in newer builds of its flagship browser to “enable open innovation”: “support for the codec will be removed and our resources directed towards completely open codec technologies.” That means the two practically unsupported standards (from a publisher’s perspective), WebM and Ogg Theora, are now the only games in town when it comes to HTML5 playback in Chrome.

Earlier, we said that Google would have to be a little evil to get WebM adopted. This move falls in line with that logic.

Will you encode another version of videos, or will you stick to H.264? We’re thinking we like the idea of WebM, but we’re not convinced it has enough momentum to overthrow H.264.
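If you do end up encoding both, the standard hedge is to list multiple sources inside a single video tag and let each browser pick the first codec it can play. A minimal sketch (the filenames here are hypothetical):

```html
<video controls width="640" height="360">
  <!-- Chrome, Firefox and Opera can pick the WebM source -->
  <source src="clip.webm" type="video/webm">
  <!-- Safari and IE9 fall through to H.264 -->
  <source src="clip.mp4" type="video/mp4">
  Your browser doesn't support HTML5 video.
</video>
```

Browsers walk the source list in order and play the first type they support, so the page keeps working on both sides of the codec divide, at the cost of storing and encoding every video twice.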

Jan 11

Capture HDMI, Component and Composite video on a laptop (MacBook Pro)

Update: FB friend Neil Cohen points me toward eBay, where there’s one called the PE4L + EC2C. $52.90 + $15 shipping. Even better!

Wow, folks… this is like a unicorn or something equally rare. I’ve finally discovered a way to capture HDMI video to any laptop without spending over $1,000. The total cost is about half that, though the installation process is a bit precarious.

LeCroy makes this beauty, pictured at right. It’s an ExpressCard adapter which provides a 1-lane PCI Express slot. Fittingly, the BlackMagic Intensity Pro is a 1-lane PCI Express card. Put the two together, and voila.

The card costs $275, and is really freaking difficult to find. I could only find mention of the card on a PDF from the company, but I think you can call them and they’ll ship you one for $275. The BlackMagic Intensity Pro is $200, making the total price $475.

If you end up picking one up, please let me know how it works out. I’m super curious to try it out for myself.

If you don’t have a MacBook Pro with ExpressCard, there’s some hope for you too. I’ve seen other people do similar hacks using less expensive means by removing the WiFi adapter. The unit that does the dirty work, called the pemini2x1-F, costs about $100.

Dec 23

The Cube wireless video transfer system

Cube Streaming HD Video Vehicle to Vehicle from Teradek on Vimeo.

Apparently The Cube does its thing over WiFi, taking an HDMI or HD-SDI signal (with audio) and sending it over its own ad-hoc WiFi network, with a purported 300ft range and low-latency H.264 (level 4.1) encoding.

Consider us intrigued. At $1,600 for an HDMI-equipped version, it’s definitely something worth investigating. Regardless of whether this product is perfect, it’s a great first step.

Dec 15

Use a MIDI device to switch between shots with Wirecast (Our V++ Wirecast switcher.)

Good news, kids: we’ve got a framework to roll out for Wirecast. If you’re not familiar, Wirecast is our favorite choice for broadcasting and recording from your computer. And, smartly, the company has put in a really capable AppleScript implementation.

There are two main reasons to use this framework. One: you want a hardware controller to do the job for you. Two: you want to programmatically select shots and have stuff happen without clicking around. This framework solves both problems.

A video to explain myself:

We’ve gone a bit above and beyond to create our ideal scenario. Here are the required pieces:

  1. Wirecast (either 3.5+ or 4.0 should work.)
  2. MidiPipe (free! Direct download link)
  3. The V++ framework (download!)
  4. A Mac. (Sorry Windows users. We hear there’s some sort of OLE-style scripting available for you, but we don’t use that system much.)

Optionally, you can rock some gear to make things go better. We’re using:

  1. A Korg nanoKONTROL. (If you buy one, use our Amazon affiliate link, so you can hook us up for showing you this rad tutorial.)
  2. Korg’s Kontrol Editor software (direct download link)
  3. An iPad with Midi software. (We like MXNM.)

Here’s the table o’ contents, so you know this isn’t a long tutorial.

  1. Configure your MIDI device
  2. Use our v++.mipi file with MidiPipe
  3. Use our v++.wcst file with Wirecast
  4. Rule the world.

When we’re done, we’ll show you how to customize the scripts so you can create even better actions.

Configure your MIDI device

So we’re going to pretend for a second that everyone has a Korg nanoKONTROL. If you don’t, just skip down a bit to the MidiPipe section.

One of our favorite parts of the Korg nanoKontrol is the ability to flexibly update which MIDI messages are sent by the device. To edit the nanoKontrol (and anything in the nano series) we’ll use some software called Korg Kontrol Editor. (Download link!)

Inside of our zip file, we’ve provided a file called v++.nktrl_set and if you open this file with Kontrol Editor, you’ll see the standard configuration we use for the system. Note that if you’d like to change an individual scene to be non-Wirecast-ey, this is the tool with which to do it. We’re going to pretend like you’re just using our setup as-is, but if you modify anything, modify scenes 2, 3, or 4 by clicking on the “Wirecast” words at the top.

If you want to use some other sort of Midi device with our setup, you can either modify your device to send the control cues listed above, or you can edit the MidiPipe file to work with your controller. It’s up to you.

Configure MidiPipe

OK so you’ve got your Midi Device configured, now it’s time to play with MidiPipe.

The software works by “piping” MIDI data from transformer to transformer. On the left is a library of “patches” which you can use in your composition. On the right is the actual composition, showing the patches which are currently active.

First, open up our v++.mipi file in the software.

Click on the first item in the list, called “Midi In.” From the drop-down menu, select the MIDI device you’ll be using. In our case, it’s the nanoKONTROL.

Next, to test if it’s working, select the “Alist” item. If you’d like to have Alist appear in a separate window (we do), double-click on the word.

You’ll notice some important facts. As you use your MIDI device’s sliders or buttons, the “Data” column fills with information. These are the actual MIDI messages, and they’re what trigger which shots are activated and the like.
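For reference, each row in that Data column is a three-byte MIDI message: a status byte (which encodes the message type and channel) followed by two data bytes. A quick sketch in Python of how those bytes break down (the example message is illustrative, not pulled from our config):

```python
def decode_cc(message):
    """Decode a 3-byte MIDI Control Change message into its parts."""
    status, controller, value = message
    message_type = status & 0xF0   # high nibble: 0xB0 means Control Change
    channel = (status & 0x0F) + 1  # low nibble: MIDI channel, 1-16
    assert message_type == 0xB0, "not a Control Change message"
    return {"channel": channel, "controller": controller, "value": value}

# A control pushed to its maximum on channel 1 might show up as 176 101 127:
print(decode_cc([176, 101, 127]))
# {'channel': 1, 'controller': 101, 'value': 127}
```

This is exactly the shape of data the AppleScript trigger later in this post pattern-matches against.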

Configure Wirecast

OK, so now it’s Wirecast’s turn to get the V++ treatment. From our zip file, open v++.wcst by double clicking. The file will open with Wirecast, and should look like our screenshot below.

So each of these shots, which are currently blank, maps to individual buttons on the controller.

By default, the bottom button of the channel maps directly to the shot named “camX” in Wirecast, where X is the channel number. It also blanks out the “background” layer by default.

The top button maps to “camX+cg”, again where X is the channel number. It also blanks out the “background” and “foreground” layers by default.

If you slide the slider up, it will load the “l3rd-X” shot in the “foreground” layer. If you slide any slider down, the “foreground” layer is set to blank.

If you twist the knob left, it will load “camX-lb” in the “background” layer. It won’t, by default, push the shot live yet. If you turn the knob right, it’ll load “camX-rb” in the “normal” layer and push any queued-up shots live.

If you hit the “record” button, it’ll start recording. If you hit the “play” button, it’ll start broadcasting. Hit either button again, and it’ll stop the previous action.
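Put another way, the whole mapping is a small dispatch table from a control gesture on channel X to a Wirecast layer and shot name. A rough Python sketch of that logic, assuming the shot-naming convention above (the function name and gesture labels are ours, not part of the framework):

```python
def shot_for(control, channel, value=None):
    """Map a control gesture on channel `channel` to a (layer, shot name) pair."""
    if control == "bottom_button":
        return ("normal", f"cam{channel}")       # also blanks the "background" layer
    if control == "top_button":
        return ("normal", f"cam{channel}+cg")    # also blanks "background" and "foreground"
    if control == "slider":
        # up loads the lower-third; down blanks the foreground layer
        return ("foreground", f"l3rd-{channel}" if value > 64 else "Blank Shot")
    if control == "knob":
        # left queues a background; right loads "camX-rb" and pushes queued shots live
        return ("background", f"cam{channel}-lb") if value < 64 else ("normal", f"cam{channel}-rb")
    raise ValueError(f"unknown control: {control}")

print(shot_for("bottom_button", 2))  # ('normal', 'cam2')
print(shot_for("slider", 3, 127))    # ('foreground', 'l3rd-3')
```

The actual implementation lives in the AppleScript trigger inside MidiPipe, covered below, but the table above is all the behavior you need to keep in your head while operating it.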

Rule the world

OK so now that you’ve got everything set up, let me explain some reasoning behind the framework, and how you can use it to rule the world.

I built the framework with a specific use case in mind: live-style interview shows. I used it previously with a show called TechVi, which you can see below.

The framework should be pretty easy to use, especially given the video context above. The entire show was recorded live, and I made it so we had NO post production.

I firmly believe that anyone who’s ever thought it’d be fun to make video programs should have the ability to do so, and further, it shouldn’t take a massive knowledge of coding to make that happen. This framework is a first attempt at lowering the bar.

Over the coming years, I’m making it my personal mission to democratize video. Anyone with a computer and a desire to share something with the world should be able to do so, without having to ask for anyone’s permission.

Are you interested in this idea? Do you code Objective-C? Want to create great video shows? Sign up for our newsletter. We want to know who you are.

If you’re feeling particularly motivated, you can email me too: hello [at] vidplusplus.com and get in touch.


So inevitably, you’re going to want to customize the scripts. I’m going to write a full post about how to do that, but here are the basics.

Edit the scripts by clicking on “AppleScript Trigger” in MidiPipe. The first block outlines a button’s behavior. You should be able to reverse engineer it, if you’re smart. For instance:

on top(channelnumber)
	my switcher's setLayer("normal", "cam" & channelnumber & "+cg")
	my switcher's setLayer("foreground", "Blank Shot")
	my switcher's go()
end top

That sets two layers, then tells the switcher to “go live.”

The actual MIDI stuff is the next block, and it looks like this:

on runme(message)
	-- transport controls
	-- item 1 is the status byte (176 = Control Change on MIDI channel 1);
	-- item 2 is the controller number, item 3 is the value
	if (item 1 of message = 176) and (item 2 of message = 101) and (item 3 of message = 127) then
		my button's play("down")
	end if
end runme

I explain how most of this works in the video at the top of the post. So just watch that and this part should make sense.

Lastly, I have a block containing the actual switcher functions. Here’s the entire block. I’m not going to explain how the methods work in this post, but you should be able to figure it out; it’s pretty self-explanatory.

script switcher
	property myDoc : "1"
	property normal_layer : "1"
	property foreground_layer : "1"
	property background_layer : "1"
	property title_layer : "1"
	property audio_layer : "1"
	property wirecast : application "Wirecast"

	on setup()
		tell application "Wirecast"
			set my myDoc to last document
			set my normal_layer to the layer named "normal" of myDoc
			set my foreground_layer to the layer named "foreground" of myDoc
			set my background_layer to the layer named "background" of myDoc
			set my title_layer to the layer named "text" of myDoc
			set my audio_layer to the layer named "audio" of myDoc
			set my myDoc's auto live to false
		end tell
	end setup

	on setTransition(anumber, speed)
		tell application "Wirecast"
			set my myDoc's active transition popup to anumber
			set my myDoc's transition speed to speed
		end tell
	end setTransition

	on go()
		tell application "Wirecast"
			go my myDoc
		end tell
	end go

	on startBroadcast()
		tell application "Wirecast"
			start broadcasting myDoc
		end tell
	end startBroadcast

	on stopBroadcast()
		tell application "Wirecast"
			stop broadcasting myDoc
		end tell
	end stopBroadcast

	on startRecord()
		tell application "Wirecast"
			start recording myDoc
		end tell
	end startRecord

	on stopRecord()
		tell application "Wirecast"
			stop recording myDoc
		end tell
	end stopRecord

	on setLayer(layername, shotname)
		tell application "Wirecast"
			set the active shot of the layer named layername of my myDoc to the shot named shotname of my myDoc
		end tell
	end setLayer
end script

Hopefully this all makes sense. If it doesn’t, feel free to comment on this post and I’ll try to clear things up as they come up.

If you start using this, I’d love to see it. Please leave a comment with a link to your site, and I’ll create a section where we link to you.

Good luck! (Oh yeah, sign up for our newsletter, too!)

Dec 1

What we’re watching: Tron projection display

Via CDM.