Feasibility Test: Passed!
That idea I posted about earlier… Yeah, the one I can’t talk about just yet. Yes, that one! All that math and those algorithms that worked out fine in my head take a lot of processing power. And if I’m going to build a “Video” app, ensuring real-time performance is paramount. But of course, you all knew that already. Why else would I be reading up on OpenCV and GPU-accelerated programming?
So, today, I hacked out a little app to make sure that what I want to build can be done with today’s hardware, i.e., it should run on my three-year-old iPhone 4.
Okay, I will be a little less cryptic about it, and let you in on just this one little secret. For what I want to achieve in the final app, I need to be able to detect edges in real time. That means I should be able to apply a Gaussian blur to each video frame (to smooth out unwanted edges), and then run the Canny algorithm on top of it to detect the edges. And do all that while achieving real-time or near real-time performance on my iPhone. I need at least 12 fps for my app. (In fact, for the effect I’m after, I prefer 12 fps over either 24 or 30 fps.)
Coding it in OpenCV couldn’t have been any simpler. Talk about readily available functions with names like cv::GaussianBlur and cv::Canny. It’s as if they wrote the library with my app in mind :) Anyway, skipping over to the good news: It works! I wrote the app not knowing what kind of performance to expect, and when I ran it for the first time… I smiled.
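If you’re curious, the per-frame work really does boil down to those two calls. Here’s a minimal sketch of the blur-then-Canny step; the kernel size and threshold values are just placeholders I’d tune by eye, not the exact numbers from my app:

```cpp
#include <opencv2/imgproc/imgproc.hpp>

// Per-frame edge detection: smooth first, then run Canny.
// The kernel size, sigma, and thresholds below are illustrative
// placeholders, not the values used in the actual app.
void detectEdges(const cv::Mat &frameGray, cv::Mat &edges)
{
    cv::Mat blurred;

    // Gaussian blur suppresses sensor noise and fine texture so that
    // Canny doesn't flag edges I don't care about.
    cv::GaussianBlur(frameGray, blurred, cv::Size(5, 5), 1.5);

    // Canny edge detection with low/high hysteresis thresholds.
    cv::Canny(blurred, edges, 50, 150);
}
```

In the app this runs once for every frame coming off the camera, which is why keeping both calls cheap is what makes or breaks the 12 fps target.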
Here’s a little preview of the app showing my parents’ backyard: