display.capture (build 773)

Ok, am I missing something with display.capture?

I’m trying to just capture a square object to the camera roll, so the new display.capture seems like it should be perfect.

The object is centered on the visible screen.

… But if I do a display.capture, I’m getting just a corner of the square object.

… If I do a display.captureScreen, I get the whole object right in the center of the screen, along with everything else and a rectangular image in the camera roll. :wink:

Offhand, does anyone have any ideas what I’m missing, or is this a bug? From the working captureScreen, I’m guessing bug. But I’m wondering if anyone has gotten this to work?
[import]uid: 13859 topic_id: 23885 reply_id: 323885[/import]
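A minimal sketch of the two calls being compared here, assuming a square placeholder object centered on screen (the rectangle, its size, and its color are stand-ins, not the original project code):

local half = 100
local square = display.newRect( display.contentWidth/2 - half, display.contentHeight/2 - half, half*2, half*2 )
square:setFillColor( 255, 0, 0 )

-- give the device at least one rendered frame before capturing
timer.performWithDelay( 100, function()
    -- capture just the object and save it to the camera roll
    local objCapture = display.capture( square, true )
    objCapture:removeSelf()

    -- capture the entire screen and save it to the camera roll
    local screenCapture = display.captureScreen( true )
    screenCapture:removeSelf()
end, 1 )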

I haven’t seen any problems with the display.capture() function yet. We’re currently using it with our storyboard library to handle screen transitions.

Are you capturing a group? Perhaps your group object has a funky size for whatever reason (although we’ve tested it with group objects too). Try doing a display.capture() directly on an image object. [import]uid: 32256 topic_id: 23885 reply_id: 96420[/import]

I tried both, capturing a group and capturing just the image…

I’ve only tested on iPhone 4S / iOS 5.1 … and I’m sure the placement is right, because display.captureScreen gives me the object dead center. Meanwhile, display.capture just gives me a weird fraction of the image.

Dunno what that could be then… [import]uid: 13859 topic_id: 23885 reply_id: 96426[/import]

Hello. My situation is:

On the simulator (build 773),
“display.capture()” works fine.

Then I build it to a device:

On the device (iPhone 4):

  • When I call a function that includes “display.capture()” right at startup (during construction), the captured image doesn’t appear.
  • Capturing is too slow. All I want to do is: tap (phase == "began") -> capture -> drag (phase == "moved") -> drag the captured image -> release (phase == "ended") -> remove the captured image,
    but it takes about 0.5 seconds to complete the capture, so the interaction feels sluggish.

Sorry for my poor English. [import]uid: 96013 topic_id: 23885 reply_id: 96530[/import]
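A rough sketch of the tap, drag, and release flow described above, using a touch listener (the rectangle and variable names are placeholders, not the poster’s actual code):

-- placeholder object to capture and drag around
local target = display.newRect( 100, 100, 120, 120 )
target:setFillColor( 0, 128, 255 )

local captured  -- holds the captured image while the finger is down

local function onTouch( event )
    if event.phase == "began" then
        -- capture the object when the touch starts (no photo library save)
        captured = display.capture( target )
        captured.x, captured.y = event.x, event.y
        display.getCurrentStage():setFocus( target )
    elseif event.phase == "moved" and captured then
        -- drag the captured image under the finger
        captured.x, captured.y = event.x, event.y
    elseif ( event.phase == "ended" or event.phase == "cancelled" ) and captured then
        -- remove the captured image when the touch ends
        captured:removeSelf()
        captured = nil
        display.getCurrentStage():setFocus( nil )
    end
    return true
end

target:addEventListener( "touch", onTouch )

The roughly 0.5 second delay mentioned above would show up in the "began" branch, where display.capture() runs.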

The display.capture() function requires that the object be displayed on screen, because it does a screen capture of that object. Any parts of the object that are off screen will not be captured, thus yielding a fraction of an image. If the object is not on screen at all, then nothing will be captured and nothing will be displayed. Perhaps this is what is happening to you two?

The display.capture() function will be slow if you are having it save to the photo library, because writing to storage is slow. Capturing the screen on high-resolution displays such as the new iPad (aka iPad 3) will also be much slower than on an older iPhone, simply because the function has far more pixels to capture. [import]uid: 32256 topic_id: 23885 reply_id: 96672[/import]
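A minimal sketch of both points, using a throwaway rectangle (everything here is a placeholder, not official sample code):

-- an object that hangs partially off the left edge of the screen
local box = display.newRect( -60, 100, 120, 120 )
box:setFillColor( 255, 255, 0 )

-- only the on-screen half of the box ends up in the captured image
local clipped = display.capture( box )

-- passing true as the second argument also writes the capture to the photo
-- library; that storage write is the slow part
local saved = display.capture( box, true )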

Perhaps this is what is happening to you two?

Hi Josh,

Well, note that I said display.captureScreen shows the object I want to capture perfectly centered on the screen.

Yet if I run display.capture either immediately before or immediately after (I’ve tried both) display.captureScreen, I’m getting just a piece of the object.
If I see the correct positioning on captureScreen, shouldn’t capture give me what I need? :slight_smile:

[import]uid: 13859 topic_id: 23885 reply_id: 96675[/import]

Kenn,

So the object you are trying to capture is centered on screen then? Okay, well, display.capture() should be able to capture the whole object in that case. The returned object will always be positioned in the top-left corner of the screen and not over the captured object, if that helps you any. But if you are saying that the captured image is actually clipped, then I’m unable to reproduce this issue.

Can you send us a small sample project that can reproduce this issue please? You can do so by clicking the “Report a Bug” link at the top of this page. Thanks. [import]uid: 32256 topic_id: 23885 reply_id: 96693[/import]
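A small sketch of the repositioning point above, assuming a placeholder object centered on screen:

local obj = display.newRect( display.contentWidth/2 - 50, display.contentHeight/2 - 50, 100, 100 )
obj:setFillColor( 0, 255, 0 )

-- the returned capture appears in the top-left corner of the screen...
local snapshot = display.capture( obj )

-- ...so move it back over the original if that is where you want it
snapshot.x = obj.x
snapshot.y = obj.y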

I’m getting warped images on my Android device. It works fine in the simulator. I noticed that images get weird if they are moving.

You can see my image here; the trees in the background should be straight up and down.
http://developer.anscamobile.com/forum/2012/03/24/build-772-displaycapture-image-warped-some-andriod-devices

[import]uid: 7177 topic_id: 23885 reply_id: 96723[/import]

Thanks Joshua.

In my case, I make sure to capture a group that holds some display objects.
As a test, I use a 100 ms, one-shot timer to invoke a function that performs the capture, and it works fine.
The capture fails when it is called at the moment main.lua first runs (i.e., during construction).

About the slowness of capturing: I believe “display.capture” does not save to the photo library by default. I also tried setting the 2nd parameter to false, but capturing is still slow on my iPhone 4.

These two issues occur only on the actual device. In the simulator, everything works fine, as I want.

I will try to investigate further. Thank you.
[import]uid: 96013 topic_id: 23885 reply_id: 96743[/import]
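A minimal sketch of the one-shot timer workaround described above (the 100 ms delay and the rectangle are placeholders):

local logo = display.newRect( 50, 50, 200, 200 )
logo:setFillColor( 200, 100, 50 )

local function doCapture()
    -- by now at least one frame has been rendered, so the capture works on device
    local snapshot = display.capture( logo )
end

-- defer the capture instead of calling it directly while main.lua is still constructing the scene
timer.performWithDelay( 100, doCapture, 1 )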

Err … my bad, capture on the GROUP does solve the problem. It is indeed capturing just the image that fails. Bah. I was sure I had tried that both ways. :wink: As long as one of them works, I’m good. *chuckle*

But here’s sample code that lets you test the image-only capture if you want it, since I wrote it to figure this all out. lol :wink:

local tmpFunc = function ( event )

    local fileName = "pic_" .. os.time() .. ".jpg"

    local saveListener = function ( evt )

        -- create a group
        local saveGroup = display.newGroup()

        -- load the downloaded image
        local photo = display.newImage( fileName, system.DocumentsDirectory, 0, 0 )

        -- scale, move and add it to the group
        local xScale = ( display.contentWidth * 0.95 ) / photo.contentWidth
        local yScale = ( display.contentHeight * 0.95 ) / photo.contentHeight
        local scale
        if ( xScale < yScale ) then
            scale = xScale
        else
            scale = yScale
        end
        if ( scale < 1 ) then
            photo:scale( scale, scale )
        end
        photo.x = display.contentWidth / 2
        photo.y = display.contentHeight / 2
        saveGroup:insert( photo )

        -- experiment with captures:

        -- capture the photo object
        display.capture( photo, true )

        -- capture the group
        local groupcap = display.capture( saveGroup, true )
        if ( groupcap and groupcap.removeSelf ) then
            groupcap:removeSelf()
        end

        -- capture the screen
        local screencap = display.captureScreen( true )
        if ( screencap and screencap.removeSelf ) then
            screencap:removeSelf()
        end

        -- remove the photo with a quick timer
        local timerFunc = function ( evt )
            saveGroup:removeSelf()
            saveGroup = nil
            return true
        end
        timer.performWithDelay( 1, timerFunc )

        event.target.txt.text = "Good tap. Check your camera roll."
        event.target.txt.size = 25
        return true

    end

    -- download a cool logo from the net and see if we can put it in the camera roll
    network.download( "http://www.21x20.com/21x20-logo.gif", "GET", saveListener, fileName, system.DocumentsDirectory )

end

-- blah blah.
local tmpGroup = display.newGroup()

local thisRect = display.newRect( -50, -50, display.contentWidth + 100, display.contentHeight + 100 )
thisRect:setFillColor( 150, 150, 150 )
thisRect:addEventListener( "tap", tmpFunc )
tmpGroup:insert( thisRect )

local txt = display.newText( "tap to try it.", 100, 200, "Arial", 30 )
tmpGroup:insert( txt )
thisRect.txt = txt

[import]uid: 13859 topic_id: 23885 reply_id: 96762[/import]

I should also note it’s SUPER COOL that this works. Thanks to all on the team for this one! All of us non-gamer folks sincerely appreciate the attention to this kind of stuff that most gamers likely won’t ever use, but we will. :smiley:

And thanks also for your responsiveness in the forums Josh! :slight_smile:

~~Kenn [import]uid: 13859 topic_id: 23885 reply_id: 96764[/import]

shuhei,

I’ve noticed that all of our capture APIs fail to capture the display on the first drawn frame on iOS and Android, but I believe this works on the Corona Simulator (Mac and Windows). It might be some kind of OpenGL ES limitation that we have to work around; I'm not really sure. In any case, it has been an outstanding issue for a year now, but we flagged it low priority since most apps do not capture anything on the first frame. Yeah, sorry that gave you trouble.

And right, display.capture() without a 2nd argument does not save to the photo library. The larger the captured object is in pixels, the longer it will take. The worst case is the new iPad (aka iPad 3), which I’ve heard takes about 2 seconds to capture the entire display. Unfortunately, we do not have a solution to this… but it is noticeably faster than doing a display.save(). I think this is as fast as our screen capture APIs are going to get at the moment. [import]uid: 32256 topic_id: 23885 reply_id: 96773[/import]
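If you want to see the cost on your own device, a rough timing sketch along these lines might help. It assumes the scene has already been rendered at least once, only measures the synchronous part of each call, and the full-screen rectangle is a placeholder:

local obj = display.newRect( 0, 0, display.contentWidth, display.contentHeight )
obj:setFillColor( 100, 100, 100 )

local startTime = system.getTimer()
local snapshot = display.capture( obj )        -- no photo library save
print( "capture only: " .. ( system.getTimer() - startTime ) .. " ms" )

startTime = system.getTimer()
local saved = display.capture( obj, true )     -- also writes to the photo library
print( "capture + save: " .. ( system.getTimer() - startTime ) .. " ms" )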

Hi Kenn,

I’m glad you got it working and find it useful. We actually use that new display.capture() function for our own storyboard library to make screen transitions faster. We thought it would come in handy for the rest of the community too. :slight_smile:

I’ll have to give your code a go later, because it should work with an image object. I know I’ve tested this… and with sprites too. I’m thinking that scaling the image just before you capture it is what’s causing the issue, as if Corona is confused about the actual position of the object on screen. Something for me to look into later. [import]uid: 32256 topic_id: 23885 reply_id: 96774[/import]

I’ve noticed the cropping issue as well, where only the bottom-right corner of my object group is captured. I believe this is because if you just create a group and add things to it, it’s located at 0,0. Then, when you call capture, it’s only saving the pixels that are at coordinates greater than 0,0, so you only see the bottom-right corner of your object.

My solution is to put the group at the center of the screen before capturing so that the whole thing gets captured. [import]uid: 122310 topic_id: 23885 reply_id: 97144[/import]
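A sketch of that workaround, assuming a group whose contents start at the default 0,0 origin:

local group = display.newGroup()
group:insert( display.newRect( 0, 0, 100, 100 ) )
group:insert( display.newRect( 60, 60, 100, 100 ) )

-- shift the whole group into the middle of the screen so nothing is clipped
group.x = ( display.contentWidth - group.contentWidth ) / 2
group.y = ( display.contentHeight - group.contentHeight ) / 2

local snapshot = display.capture( group )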

Just to reiterate what I’ve mentioned before, our display.capture() and display.save() functions will only capture what you see on screen. If the object lies partially outside of the screen, then it will be clipped. This is by design and we do not consider this to be a bug.

Now, if the object is fully visible on screen and it is still clipped, then that is definitely a bug that needs to be fixed. We will re-test it on our end to be sure. [import]uid: 32256 topic_id: 23885 reply_id: 97164[/import]

I don’t think it’s a bug… it’s just a gotcha that users of the function need to be aware of.

It’s not entirely true that things need to be “on the screen,” since you can create an object, capture it, then remove the original, all before it is visible to the user. It does need to be positioned where it would be visible had the draw loop run, but since you create and destroy it before the draw to screen, it all works without the user ever seeing a flicker. [import]uid: 122310 topic_id: 23885 reply_id: 97170[/import]
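A sketch of the create, capture, and remove pattern being described (the rectangle is a placeholder):

-- build a temporary object at a position that would be visible on screen
local temp = display.newRect( 50, 50, 200, 200 )
temp:setFillColor( 255, 0, 255 )

-- capture it, then remove the original before the next frame is drawn,
-- so the user never sees the temporary object itself
local snapshot = display.capture( temp )
temp:removeSelf()
temp = nil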