Is there an estimate as to when the display.capture and display.captureBounds APIs will work again on-device in Graphics 2.0? They are an essential feature for many of our company’s apps, and they currently work great in the Simulator.
Thanks,
The behavior of display.capture() and display.captureBounds() should be correct in the next Graphics 2.0 daily build. Please let us know if you have any further problems. Thanks!
There is still something wrong with display.captureBounds in the latest G2 build (#2093)
Is Corona aware of this and going to fix it?
Can you describe your problem and provide a sample that demonstrates the problem?
It would also be very useful for us to know which devices are causing you problems.
Try this code on an Android device (I tried it with a Galaxy Tab 3 and an HTC One):
```lua
local img = display.newImage("test.jpeg", 0, 0)

-- Scale the image to fit a 200x200 region and center it on screen.
local scale = math.max(200 / img.width, 200 / img.height)
img:scale(scale, scale)
img.anchorX = 0.5
img.anchorY = 0.5
img.x = display.contentWidth * 0.5
img.y = display.contentHeight * 0.5

-- Capture the 200x200 region at the center of the screen.
local screenBounds = {
    xMin = (display.contentWidth - 200) * 0.5,
    xMax = (display.contentWidth + 200) * 0.5,
    yMin = (display.contentHeight - 200) * 0.5,
    yMax = (display.contentHeight + 200) * 0.5,
}
local newPic = display.captureBounds(screenBounds)

-- Remove the original image and center the captured one in its place.
img:removeSelf()
img = nil
newPic.x = display.contentWidth * 0.5
newPic.y = display.contentHeight * 0.5
```
The test.jpeg can be any graphic image.
The image shows fine at first, but if you press the HOME key and then re-enter the app, the image is either skewed or gone. Some of my users report that images are black or incorrect (it looks like captureBounds captured the wrong position) even without pressing HOME and re-entering the app.
This is not a bug, and it behaved this way before Graphics 2.0 as well. Let me explain why…
Corona’s capture APIs produce a capture image in memory only. When your app is suspended on Android, all OpenGL textures are freed from memory, and when the app is resumed, Corona must reload all images. The problem is that your capture image no longer exists in memory and there is no image file to restore it from… so the image is lost.
So, the solution in your case is to do the following:
1. Save the returned capture image to a file via the display.save() function.
2. Display the saved file via display.newImageRect(), using the capture object’s bounds.
3. Remove the image object that was returned by the capture function.
The above will generate a capture image file that your app can restore from after a suspend/resume. Just note that writing to file/storage is a performance hit. In fact, that’s why our capture APIs do not write to a file by default… to avoid the file I/O cost when you only need a temporary capture image for, say, a storyboard scene transition.
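Roughly, those steps look something like this (a sketch only, reusing the bounds from your sample; the file name is just an example):

```lua
-- The same 200x200 centered bounds as in your sample.
local screenBounds = {
    xMin = (display.contentWidth - 200) * 0.5,
    xMax = (display.contentWidth + 200) * 0.5,
    yMin = (display.contentHeight - 200) * 0.5,
    yMax = (display.contentHeight + 200) * 0.5,
}
local capture = display.captureBounds(screenBounds)

-- 1) Save the capture to a file (example name; pick any you like).
display.save(capture, { filename = "capture.png", baseDir = system.DocumentsDirectory })

-- 2) Display the saved file using the capture object's size.
local newPic = display.newImageRect("capture.png", system.DocumentsDirectory,
                                    capture.width, capture.height)
newPic.x = display.contentCenterX
newPic.y = display.contentCenterY

-- 3) Remove the in-memory capture object. The file on disk survives a
--    suspend/resume, so newPic is restored correctly on Android.
capture:removeSelf()
capture = nil
```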
I hope this helps!
Thanks for the detailed explanation; your points are understood.
I cannot use the save-to-file solution because I have tried it and it’s too slow (the file I/O performance hit you mentioned).
I can actually work around this by using a container in G2 for my case, so if it works as designed, I am fine with it.
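Something along these lines with display.newContainer, in case it helps others… just a rough sketch, not my exact code:

```lua
-- Clip the scaled image to a 200x200 region with a container instead of
-- capturing the screen; nothing is captured, so there is no in-memory
-- texture to lose on suspend/resume.
local container = display.newContainer(200, 200)
container.x = display.contentCenterX
container.y = display.contentCenterY

local img = display.newImage(container, "test.jpeg")
local scale = math.max(200 / img.width, 200 / img.height)
img:scale(scale, scale)
```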
But I still have one question… this problem was never seen before G2, and none of my users ever reported it. Why is that?
Besides, if this is the case, the problem should be very easy to hit, so shouldn’t it be mentioned in the API documentation?
This problem definitely happened before G2 too… but only on Android, because that OS destroys the app’s OpenGL context when the app is suspended. I’m quite sure of this because the issue comes up every few months on Android.
iOS does not have this issue. That OS preserves the app’s OpenGL context when suspended. This also means that iOS can resume apps faster than Android.
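If it helps, you can watch for the suspend/resume cycle with a system event listener… a rough sketch (what you do on resume is up to your app):

```lua
-- Listen for Corona's "system" runtime events to detect suspend/resume.
local function onSystemEvent( event )
    if event.type == "applicationSuspend" then
        -- On Android, OpenGL textures may be destroyed after this point.
    elseif event.type == "applicationResume" then
        -- File-based images are reloaded automatically; a purely
        -- in-memory capture would have to be re-created here.
    end
end
Runtime:addEventListener( "system", onSystemEvent )
```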