The Good, the Bad, and the Ugly
The Good:
I found out about this awesome GPU-accelerated image processing framework, GPUImage. Thanks to @bradlarson for writing it, and to @roderic for tweeting me the link. It’s an extremely well-designed library, and it took me about 15 minutes to comment out all the OpenCV code and integrate GPUImage into my test app. Of course, my test app didn’t do much other than run Canny edge detection on frames from my video camera, but the results were pleasing to say the least: 640x480 on my iPhone 4 runs at a smooth 20+ fps. And, although I haven’t tried it yet, I’m sure my iPad 3 can do much better than that, and could probably handle 720p pretty well.
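For the curious, the whole camera-to-Canny pipeline really is just a few lines. Here’s a rough sketch of what the integration looked like, assuming a view controller that already has a `GPUImageView` outlet (I’m calling it `previewView` here, which is my own placeholder name, not anything from the library):

```objc
#import <GPUImage/GPUImage.h>

// Grab live frames from the back camera at 640x480.
GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
camera.outputImageOrientation = UIInterfaceOrientationPortrait;

// GPUImage ships a ready-made Canny edge detection filter.
GPUImageCannyEdgeDetectionFilter *canny =
    [[GPUImageCannyEdgeDetectionFilter alloc] init];

// Wire up the chain: camera -> Canny -> on-screen view, then start capturing.
[camera addTarget:canny];
[canny addTarget:self.previewView]; // previewView is a GPUImageView
[camera startCameraCapture];
```

Everything after that, including uploading frames to the GPU and running the shaders, is handled by the library.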
The Bad:
I’m no longer with Bre.ad. I wish that affair had lasted longer, but it wasn’t meant to be. So, if you, or someone you know, is looking to hire a mobile/backend engineer with mad skillz (hopefully inside the 7x7), check out my resume and pass it along. Okay, that wasn’t so bad. Onwards to…
The Ugly:
So, I have this super slick UI in mind for the app: it shows live effect previews overlaid on top of a live video stream. It involves applying different effects to different parts of the video, and making the live previews scrollable (smoothly, at that), to make it appear as if you’re looking at the world through these magical panes of effect-glass.
Anyway, I first had to make sure this was possible, so I hacked out a simple prototype with 4 live effects (Canny, Canny + inverted colors, sepia, and a posterize/saturation filter). And, following in the age-old tradition of ugly prototypes, it’s a pretty ugly implementation. But it’s the beginning of something amazing. Check it out!
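The nice thing about GPUImage is that one camera can feed several filter chains at once, which is all the prototype really does. A sketch of the four-preview setup might look like this (the view properties like `cannyView` and `sepiaView` are my own assumed outlets, one `GPUImageView` per preview pane, and the posterize/saturation parameter values are illustrative, not what I actually shipped):

```objc
#import <GPUImage/GPUImage.h>

GPUImageVideoCamera *camera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];

// 1. Plain Canny edges.
GPUImageCannyEdgeDetectionFilter *canny =
    [[GPUImageCannyEdgeDetectionFilter alloc] init];
[camera addTarget:canny];
[canny addTarget:self.cannyView];

// 2. Canny + inverted colors: chain an invert filter after a second Canny.
GPUImageCannyEdgeDetectionFilter *canny2 =
    [[GPUImageCannyEdgeDetectionFilter alloc] init];
GPUImageColorInvertFilter *invert = [[GPUImageColorInvertFilter alloc] init];
[camera addTarget:canny2];
[canny2 addTarget:invert];
[invert addTarget:self.invertedView];

// 3. Sepia.
GPUImageSepiaFilter *sepia = [[GPUImageSepiaFilter alloc] init];
[camera addTarget:sepia];
[sepia addTarget:self.sepiaView];

// 4. Posterize, then boost saturation.
GPUImagePosterizeFilter *posterize = [[GPUImagePosterizeFilter alloc] init];
posterize.colorLevels = 6; // assumed value for illustration
GPUImageSaturationFilter *saturation = [[GPUImageSaturationFilter alloc] init];
saturation.saturation = 1.5; // assumed value for illustration
[camera addTarget:posterize];
[posterize addTarget:saturation];
[saturation addTarget:self.posterizedView];

[camera startCameraCapture];
```

Each chain renders into its own `GPUImageView`, so the four panes all update live from the same camera feed.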