Extending camera API

Hi all,

Since I’m test-driving Corona Enterprise and still new to it, I have a simple question which probably has a simple answer.

I want to extend the current Corona camera API so that I can add a timer to take pictures at a certain point in time, like a countdown of 5 or 10 seconds.

So, can I extend the native camera API with Corona Enterprise by bridging the missing functions?

Or am I completely misusing Corona Enterprise by trying to do this?

regards,

Jürgen

I’m an Enterprise user, and it certainly sounds like something that you should be able to do.
Somebody can correct me if I’m wrong, but if you know how to do it with the iOS SDK, you should be able to write a Lua wrapper for it.

Yup, right now, the best way to do this is to write a Lua wrapper around the iOS/Android APIs, e.g. UIImagePickerController for iOS.
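
(For the countdown itself, the delay can live entirely in Lua around the stock media.capturePhoto call. Here is a minimal sketch using only the standard timer.performWithDelay and media.capturePhoto APIs; note that this only opens the native camera UI after the countdown, and firing the shutter automatically is the part that would need the native wrapper.)

-- Countdown in Lua, then hand off to the stock camera UI
local countdownSeconds = 5

local function onPhotoComplete( event )
    local photo = event.target   -- the captured photo is returned as a display object
    if photo then
        photo.x = display.contentWidth * 0.5
        photo.y = display.contentHeight * 0.5
    end
end

local label = display.newText( tostring( countdownSeconds ), display.contentWidth * 0.5, 60, native.systemFontBold, 48 )

timer.performWithDelay( 1000, function( event )
    local remaining = countdownSeconds - event.count
    if remaining > 0 then
        label.text = tostring( remaining )
    else
        label:removeSelf()
        media.capturePhoto( { listener = onPhotoComplete } )
    end
end, countdownSeconds )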

Hi Walter,

Can you provide sample code on how to write this Lua wrapper please?

Thanks,

Caleb

I’d recommend getting the book “Programming in Lua” by Roberto Ierusalimschy.

I bought the latest (third) edition. Part IV goes through the C API, which you’ll use to interface with your Objective-C and C code.

For iOS, what you’d do is create a plugin structured the same way as any Xcode project (apart from the Lua C calls). You can take the plugin from the Enterprise App template and modify it to your needs. If you need to respond to delegates, you can create a new class based on NSObject and set it up to respond to the delegates you need.

I haven’t had the need to tackle Android with Enterprise yet. 
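
To give an idea of the Lua-facing surface such a plugin might expose (all of the names below are placeholders for illustration, not an existing API), using it from main.lua could be as small as:

-- main.lua: consuming a hypothetical camera plugin built as described above
local camera = require "plugin.timedcamera"   -- placeholder module name

local function onCamera( event )
    if event.error then
        print( "camera cancelled or failed" )
    elseif event.photoPath then
        print( "photo saved to " .. event.photoPath )
    end
end

-- placeholder function: present the native camera and fire the shutter after 5 seconds
camera.takePhotoWithDelay( 5, onCamera )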

There’s insanely little (read: none) documentation on how to integrate with the SDK… I couldn’t even get an answer regarding how to access the app’s OpenGL buffers, so you’re not going to get much support from CoronaLabs by barking up that tree, methinks… They couldn’t really support much beyond the interface to the platform anyway, since once you get outside the SDK it’s Apple’s (or Google’s) ball game.

Anyway, here’s some iOS code I cobbled together to access the camera view to take videos, save them out in a specific file format, save out a thumbnail (at a specific JPEG compression, different from Corona’s default), and communicate back to the Lua code what’s going on…

Disclaimer: As I’m a freshman native coder, I’m sure it’s riddled with errors, and you’ll notice various commented-out lines of code, print statements, etc. developed as I stumbled along… it’s terribly inefficient, probably uses deprecated calls, is ugly, and it gets bad gas mileage. But without getting a single question answered, it works. Maybe something in there will give you an idea for a direction to go.

Best wishes


//
// Lets get the imagePicker delegates out of the way…
//
////////////////////
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{       
    NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    
    // force corona to update our lua state…
    // Create event and add message to it
    CoronaLuaNewEvent( _CoronaLuaState, kEvent );    
    lua_pushstring( _CoronaLuaState, [@"PushValue" UTF8String] );
    lua_setfield( _CoronaLuaState, -2, [@"push" UTF8String] );
    // Dispatch event to library’s listener
    CoronaLuaDispatchEvent( _CoronaLuaState, fListener, 0 );    
            
    NSString *mp4Basename = [newFilename stringByAppendingString:@".mp4"];
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:mp4Basename];    
   
    // Let’s save a thumbnail…
    // Generate an image…    
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
    UIImage *thumbnail = [player thumbnailImageAtTime:0.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
    //Player autoplays audio on init
    [player stop];
    [player release];
    
    // Write image to JPG
    NSString *jpgBasename = [newFilename stringByAppendingString:@".jpg"];
    NSString  *jpgPath = [NSTemporaryDirectory() stringByAppendingPathComponent:jpgBasename];
//    NSLog(@"JPG Path: %@", jpgPath);
    [UIImageJPEGRepresentation(thumbnail, 0.6) writeToFile:jpgPath atomically:YES];
//    NSLog(@"JPEG written...");

    NSLog(@"Attempting MP4 conversion");

    AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    exportSession = [[AVAssetExportSession alloc]initWithAsset:avAsset presetName:AVAssetExportPresetPassthrough];      // static… ugh    
//    NSLog (@"created exporter. supportedFileTypes: %@", exportSession.supportedFileTypes);
//    NSLog (@" Compat presets: %@", compatiblePresets);            
    
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
                        
    exportSession.outputURL = exportUrl;    
    exportSession.outputFileType =  AVFileTypeMPEG4;  //  AVFileTypeAppleM4V;  // AVFileType3GPP; // We’ll go with mpeg4 for now… Seems to be good back to android 3.0? CoronaLabs never answered…
    exportSession.shouldOptimizeForNetworkUse = YES;   
            
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void){
                                
        id<CoronaRuntime> runtime = (id<CoronaRuntime>)CoronaLuaGetContext( _CoronaLuaState );                    
        
        switch (exportSession.status) {
        
            case AVAssetExportSessionStatusFailed:
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export failed: %@", [[exportSession error] localizedDescription]);
                
                runtime = (id<CoronaRuntime>)CoronaLuaGetContext( _CoronaLuaState );
                                
                dispatch_async(dispatch_get_main_queue(), ^void{
                    // It should have updated our lua state…
                    [runtime.appViewController dismissModalViewControllerAnimated:YES];                  
                });                                

                [exportSession release];
                
                break;
                
            case AVAssetExportSessionStatusExporting:
                NSLog(@" Status Exporting…");                    
                break;
            
            case AVAssetExportSessionStatusUnknown:
                NSLog(@" Status Unknown…");                    
                break;
            
            case AVAssetExportSessionStatusWaiting:
                NSLog(@" Status waiting…");                    
                break;

            case AVAssetExportSessionStatusCompleted:
                NSLog(@" – AVAssetExportSessionStatusCompleted");

                runtime = (id<CoronaRuntime>)CoronaLuaGetContext( _CoronaLuaState );
                                
                dispatch_async(dispatch_get_main_queue(), ^void{
                    // It should have updated our lua state…
                    [runtime.appViewController dismissModalViewControllerAnimated:YES];                  
                });                
                                                
                // Now push the final event…                
//                dispatch_async(dispatch_get_main_queue(), ^void{
                    CoronaLuaNewEvent( _CoronaLuaState, kEvent );                
                    lua_pushstring( _CoronaLuaState, [[exportSession.outputURL path] UTF8String] );
                    lua_setfield( _CoronaLuaState, -2, [@"videoName" UTF8String] );
                    // Dispatch event to library’s listener
                    CoronaLuaDispatchEvent( _CoronaLuaState, fListener, 0 );                
//                });                
                                                                                
//                [videoURL release];
//                [avAsset release];
                  
                NSLog(@"export session done");
                [exportSession release];
                
                break;
            
            default:
                NSLog(@"Default condition");
                break;
            }                
        }];

    
    NSLog(@"exiting didFinishPicking");
}  

////////////////////

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    id<CoronaRuntime> runtime = (id<CoronaRuntime>)CoronaLuaGetContext( _CoronaLuaState );
    // It should have updated our lua state…
    [runtime.appViewController dismissModalViewControllerAnimated:YES];

    // Create event and add message to it
    CoronaLuaNewEvent( _CoronaLuaState, kEvent );

    lua_pushstring( _CoronaLuaState, [@"Cancelled" UTF8String] );
    lua_setfield( _CoronaLuaState, -2, [@"error" UTF8String] );     // Pass back error as set… Doesn’t matter to what, just that it exists in return struct, as checked by lua code

    // Dispatch event to library’s listener
    CoronaLuaDispatchEvent( _CoronaLuaState, fListener, 0 );   

    // process the cancellation of movie picker here
    NSLog(@"Capture cancelled");

}
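
For context, a Lua-side listener for the events dispatched above could look like the sketch below. The require name and the init/show functions are placeholders (they are whatever your plugin registers), but the push, videoName and error fields match the dispatches in the code.

local library = require "plugin.library"   -- placeholder; use whatever name the library registers

local function videoListener( event )
    if event.error then
        -- sent from imagePickerControllerDidCancel:
        print( "capture cancelled" )
    elseif event.videoName then
        -- sent when the AVAssetExportSession completes; full path to the exported .mp4
        print( "video exported to: " .. event.videoName )
    elseif event.push then
        -- the early "PushValue" event sent as soon as the picker finishes
        print( "picker done, export in progress" )
    end
end

library.init( videoListener )   -- placeholder init that stores the listener (fListener) on the native side
library.show()                  -- placeholder call that presents the UIImagePickerController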
 

@mpappas, you can’t access the OpenGL buffers via Enterprise. As for Enterprise docs, you can find them here: http://docs.coronalabs.com/native/index.html  The integration usually revolves around bridging between Lua and C/Obj-C, so that involves the Lua C API, and ingemar’s book suggestion is spot on.

Thanks for the info, Walter; it’s the first CL response I’ve gotten regarding Enterprise features!

First off, per your comment: it would be useful for Enterprise to have a CoronaLabs-sanctioned function or variable to get at the SDK’s OpenGL back buffers so we can copy pixels out (pixels composed by standard app-side Lua code). It would also be nice to have a CL-defined function or protocol to interface with (read, write, or even point to) an SDK Lua-side display.newImage() buffer, but I digress… (I can throw out five more useful, generic Enterprise <-> SDK functions that would be helpful if you have a lot of spare time!)

Without any info on getting at the app’s OpenGL buffer, I ended up with the following Enterprise code hot mess to grab an (x, y, width, height) area of the app’s screen when called from the app’s Lua-side code.

It took a while, and 8-10 totally different buffer-reading approaches, until a modified version of Brad Larson’s example from Stack Overflow was able to get at the Corona SDK OpenGL buffer (probably a bug that I can at all). But hey, it works, and I get 70% JPEG compression instead of the default iOS JPEG compression.

I’ll probably make the compression percentage a variable in the future, but as they say, “it’s good enough for government work.” (I really hope you don’t “fix” this and break my code, Walter! I’m offering this so you get an idea of what hoops we jump through to get at some things…)

Disclaimer: I’m a freshman native coder, so the following is probably buggy, deprecated, and ugly. And it gets bad gas mileage. Lots of commented-out things that didn’t work, and debug print statements…

int PluginLibrary::saveJPG( lua_State *L )
{
    
    NSLog(@"-saveJPG called.");
    
    int num = lua_gettop( L);    
    NSLog(@" num args: : %d", num);
    
    NSLog(@" --------------");
    
    // Argument 1 passed from lua code… the source filename – not used with final technique, read from screen…
    NSLog(@" --------------");
    int ret = lua_isstring( L, 1);    
    NSLog(@" arg1 is string: %d", ret);    
    const char *sourceFilename = lua_tostring( L, 1 );   
    NSLog(@" source fileName : %s", sourceFilename);

    // Argument 2… the dest filename, code saves a jpg out from screen with this name
    NSLog(@" --------------");
    ret = lua_isstring( L, 2);    
    NSLog(@" arg2 is string: %d", ret);    
    const char *destFilename = lua_tostring( L, 2 );   
    NSLog(@" dest fileName : %s", destFilename);

        
    ret = lua_isnumber( L, 3);    
    NSLog(@" arg3 is number: %d", ret);
    int x1 = lua_tonumber( L, 3);                  // Top Left
    NSLog(@" arg3 is number: %d", x1);

    ret = lua_isnumber( L, 4);    
    NSLog(@" arg4 is number: %d", ret);
    int y1 = lua_tonumber( L, 4);                 // Top Left
    NSLog(@" arg4 is number: %d", y1);

    ret = lua_isnumber( L, 5);    
    NSLog(@" arg5 is number: %d", ret);
    int x2 = lua_tonumber( L, 5);
    NSLog(@" arg5 is number: %d", x2);

    ret = lua_isnumber( L, 6);    
    NSLog(@" arg6 is number: %d", ret);
    int y2 = lua_tonumber( L, 6);
    NSLog(@" arg6 is number: %d", y2);

if( 1)  // Brad Larson example
{
id<CoronaRuntime> runtime = (id<CoronaRuntime>)CoronaLuaGetContext( L );

EAGLContext *coronaContext = nil;
coronaContext = [EAGLContext currentContext];

    GLint width;
    GLint height;    
//    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
//    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height);

    width = x2-x1;   // passed in args
    height = y2-y1;
    
    NSLog(@"GL w,h == %d %d", width, height);

    NSInteger myDataLength = width * height * 4;
    
//    eaglLayer.drawableProperties = @{ kEAGLDrawablePropertyRetainedBacking: [NSNumber numberWithBool:YES], kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8 };    

//    [runtime.appWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
//    [runtime.appViewController.view.layer renderInContext:UIGraphicsGetCurrentContext()];
//    [runtime.appViewController.view.layer renderInContext:coronaContext];
   
    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *) malloc(myDataLength);
    glReadPixels(x1,y1, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
//    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, NULL);
//    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * width;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGImageRef imageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    // then make the uiimage from that
    UIImage *viewImage = [UIImage imageWithCGImage:imageRef];
    
    // Save the file
    NSString *jpgBasename = [[NSString alloc] initWithUTF8String:destFilename];    // Passed in arg
    NSString *jpgPath = [NSTemporaryDirectory() stringByAppendingPathComponent:jpgBasename];      
    NSLog(@"JPG Path: %@", jpgPath);
    [UIImageJPEGRepresentation(viewImage, 0.7) writeToFile:jpgPath atomically:YES];
    NSLog(@"JPEG written (6)?!");    // 6th different technique to get at gl buffer…
    
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpaceRef);
    free(buffer);

// 3. Restore Corona’s context
[EAGLContext setCurrentContext:coronaContext];

}

    return 0;

}
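
A Lua-side call to the function above might look like this; the module name is a placeholder, but the argument order matches how saveJPG reads the stack (source filename, destination filename, then x1, y1, x2, y2 in screen pixels).

local library = require "plugin.library"   -- placeholder module name

-- Grab a 480x320-pixel region starting at pixel (0, 0) and write it to
-- <NSTemporaryDirectory()>/grab.jpg at 70% JPEG quality.
-- Note: the coordinates are raw GL screen pixels (glReadPixels), not Corona content units.
library.saveJPG( "unused.png", "grab.jpg", 0, 0, 480, 320 )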
