Ratio distortion in JPG after screenshot


When taking a screenshot with the take-picture method, the OpenGL view is composited onto the camera view, and I am getting a problem with the aspect ratio of the resulting object in the image.

Before the picture/screenshot is taken, the object on the screen looks normal. For example, the default floating object (light green with a large blue "B" inside it) is as tall as it is wide, a square-like shape. In the resulting JPG, however, the image is stretched and the ratio is distorted: it comes out taller than it is wide.

I tried to fix this by changing the ratio of bitmaps in various places, and I am lost. How do I fix this ratio problem?

Where in the code can I change the width-to-height ratio of the OpenGL view when taking a screenshot?

There is no ratio distortion when viewing through the camera on the tablet screen; everything looks perfect there. The problem only appears when I take the screenshot, which produces a distorted, stretched image of the virtual object in the JPG.

kkkkk:

I made a NEW DISCOVERY while trying to find out what is going on under the hood. I checked what happens in the ScreenShotCallback class, an inner class of ScreenshotHelper.

For the ScreenshotHelper member variables btmCamera and btmGl, I am getting two different sizes. Using getWidth() and getHeight(), btmCamera is W1280 x H736, which is the correct screen size for my device, while btmGl is W640 x H480.

So now I think this is why the image is stretched and distorted: the OpenGL image is a different resolution than the camera view.
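Those two sizes are enough to explain the distortion. A quick back-of-the-envelope check in plain Java (using the sizes reported above) shows that stretching a 640x480 layer to cover a 1280x736 one applies different scale factors to width and height, so a square GL object cannot stay square:

```java
public class RatioCheck {
    public static void main(String[] args) {
        // Sizes reported by getWidth()/getHeight() in ScreenShotCallback
        int camW = 1280, camH = 736; // btmCamera
        int glW = 640, glH = 480;    // btmGl

        double camRatio = (double) camW / camH; // ~1.74
        double glRatio = (double) glW / glH;    // ~1.33, i.e. 4:3

        // Scale factors applied if btmGl is stretched to cover btmCamera
        double scaleX = (double) camW / glW; // 2.0
        double scaleY = (double) camH / glH; // ~1.53

        // Anything other than 1.0 here means non-uniform scaling:
        // a square in the GL layer is no longer square in the JPG
        double distortion = scaleY / scaleX;

        System.out.printf("camRatio=%.2f glRatio=%.2f distortion=%.2f%n",
                camRatio, glRatio, distortion);
        // prints camRatio=1.74 glRatio=1.33 distortion=0.77
    }
}
```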

Last week I made a custom public function to get the camera object from inside the CameraView class. I used this to change the resolution of the CameraView to my device's screen resolution (1280x736), by setting a Camera.Parameters object directly on the camera object.

The size 1280x736 is the size of my Activity's root layout on the device being tested. I got that from getWidth() and getHeight() in the onMeasure callback.

It looks like to fix this problem I have to manually change the size of the OpenGL view to match the camera view when the screenshot/picture is taken, so that both views have the same size.

What is interesting is that both the OpenGL view and the CameraView are 1280x736 when viewed on the device's viewfinder screen. Only when taking a screenshot/picture do the sizes differ: the camera view is 1280x736 while the OpenGL view drops to 640x480.
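Until the GL snapshot and the camera picture can be forced to the same size, one workaround is to scale the smaller bitmap uniformly (the same factor on both axes) so it fits inside the larger one, instead of stretching it to cover. This is a minimal sketch in plain Java; `fitInside` is a hypothetical helper, not part of BeyondAR, and on Android the resulting dimensions would be handed to Bitmap.createScaledBitmap() before compositing:

```java
public class FitInside {
    /**
     * Returns {width, height} for scaling src uniformly so it fits
     * inside dst without changing its aspect ratio.
     */
    static int[] fitInside(int srcW, int srcH, int dstW, int dstH) {
        double scale = Math.min((double) dstW / srcW, (double) dstH / srcH);
        return new int[] { (int) Math.round(srcW * scale),
                           (int) Math.round(srcH * scale) };
    }

    public static void main(String[] args) {
        // GL snapshot 640x480 scaled to fit the 1280x736 camera picture
        int[] size = fitInside(640, 480, 1280, 736);
        System.out.println(size[0] + "x" + size[1]); // prints 981x736
    }
}
```

The trade-off is letterboxing (the GL layer no longer covers the full picture), but the virtual object keeps its on-screen proportions.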

The main question now is: what is the best way to manually set the OpenGL view to a certain size? Is there something like a Camera.Parameters object that I can use for the OpenGL view, and which object in the OpenGL classes sets the overall screen size?

If I can make a method to get the camera object from the CameraView class, then there must be an object I can get from the OpenGL class to set the size.

I am getting close to figuring out how to fix this problem, but any comments or ideas would help.


Beyondar:

In order to support as many devices as possible, the picture size is set as close as possible to the screen size. Some cameras support the same size as the screen, but others don't :(
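For reference, "as close as possible to the screen size" can be done by scanning the sizes the camera driver reports (on Android, via Camera.Parameters.getSupportedPictureSizes()) and keeping the one whose area is nearest the target. This standalone sketch uses a hypothetical Size class standing in for Camera.Size, and a made-up list of supported sizes:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class ClosestSize {
    // Minimal stand-in for android.hardware.Camera.Size (hypothetical)
    static class Size {
        final int width, height;
        Size(int w, int h) { width = w; height = h; }
    }

    /** Picks the supported size whose pixel area is closest to the target. */
    static Size closest(List<Size> supported, int targetW, int targetH) {
        long targetArea = (long) targetW * targetH;
        return supported.stream()
                .min(Comparator.comparingLong(
                        s -> Math.abs((long) s.width * s.height - targetArea)))
                .get();
    }

    public static void main(String[] args) {
        // Made-up list; a real camera reports its own supported sizes
        List<Size> supported = Arrays.asList(
                new Size(640, 480), new Size(1024, 768), new Size(1280, 720));
        Size best = closest(supported, 1280, 736);
        System.out.println(best.width + "x" + best.height); // prints 1280x720
    }
}
```

This also shows why the GL snapshot can end up at 640x480 on some hardware: if that is the largest (or only) size the list offers near the target, that is what gets picked.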

I'll add this info on GitHub and will try to fix it ASAP. (https://github.com/BeyondAR/beyondar/issues/6)

You can also fork the project and open a pull request with your findings :)