Access to the video camera

I was wondering if there was a way to access the camera's video buffer from Lua.
I think there's a way with the iPhone SDK, even though Apple doesn't really provide it.
It's kind of a hack in the SDK, but it should be feasible to have the same in your framework, right?
I'd be willing to help if needed.

Ben

From what I’ve discussed with ARToolWorks and what I’ve read on http://gamesalfresco.com/2009/10/16/ismar-2009-sketch-and-shape-recognition-preview-from-ben-gurion-university/ … I would suspect that:

a) Apple wouldn't approve the app until they finally publish public APIs that allow marker-based augmented reality apps
b) It's going to be too slow to do anything in Lua anyway. If Objective-C is pushed to get a decent frame rate of recognition out of the buffer, Lua is going to be too slow to be of any practical use.

I did put in a request for Corona access to UIImagePickerController.cameraOverlayView, along the lines of http://mobile-augmented-reality.blogspot.com/2009/09/overlaying-views-on-uiimagepickercontro.html, and Carlos said they'd consider it. That would be useful for location-based AR projects.

Hope that’s useful info.

Cheers,

  • Ian

Yeah, you're totally right. I'm really eager to start doing some AR stuff on the iPhone…
Also, I was wondering why your guess is that Lua would be too slow?
I thought the nice thing about Lua was that it can be really efficient, almost as good as Objective-C, isn't it?
Also, who should I get in touch with to follow more closely how much needs to be done in order to get access to the raw data from the camera?

Lua is efficient (though I haven't done any benchmarks on the device), but nowhere near as bare-metal as Objective-C or plain C running on the device.

To do any kind of marker tracking, you need to be able to run a whole bunch of image-interpolation algorithms per frame - using AS3 on a desktop is not that big a deal, since the typical desktop is so fast these days, but on an embedded device that's a totally different story. I seriously doubt anything like this could be done in Lua (I'd love to be proven wrong, though!), although I have some ideas for a bunch of apps once Core Location is in place for Corona and we have the ability to overlay the camera with Corona sprites. That might be a problem, though - I don't know how the Corona rendering engine is set up.
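To make the cost concrete, here's a minimal Lua sketch of the kind of per-pixel pass a marker tracker has to run on every frame. This is purely illustrative: the `getPixel` accessor is hypothetical (Corona doesn't expose the camera buffer, which is the whole point of this thread), and real trackers chain several passes like this, plus contour and pose estimation, per frame.

```lua
-- Hypothetical sketch: one binarization (threshold) pass over a single
-- camera frame. Marker trackers run several passes like this on every
-- frame, ideally ~30 times per second.
local width, height, threshold = 480, 320, 128

-- getPixel(x, y) is an assumed accessor returning a 0-255 grayscale value;
-- no such API exists in Corona today.
local function binarize(getPixel)
  local out = {}
  for y = 0, height - 1 do
    for x = 0, width - 1 do
      -- 480 * 320 = 153,600 interpreted iterations per pass, per frame.
      -- This inner loop is where interpreted Lua falls far behind compiled C.
      out[y * width + x] = (getPixel(x, y) >= threshold) and 1 or 0
    end
  end
  return out
end
```

Even this trivial pass is roughly 150k interpreted iterations per frame before any actual recognition happens, which is why this kind of work is normally pushed down into C.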

The only way we’ll get raw data from the camera is for Apple to approve it. File a Radar for it … https://bugreport.apple.com/ (-;

I can see at least one dupe on Open Radar: http://openradar.appspot.com/7294400 - file yours along the same lines and it'll help the cause (though I'm sure companies like ARToolWorks, Total Immersion and Metaio have been knocking on the door for months).

  • Ian