android – Camera preview in a FrameLayout is distorting the image

I need help again!

I'm developing an app that uses the smartphone's camera only for display; it will not save photos or videos, only show a live preview.

I have already managed to open the smartphone camera in an Activity. The problem is that when I compared my app's camera preview against other apps and the phone's default camera app, I noticed that my app is distorting the displayed image.

Below are the pictures to illustrate what is happening.

Note: Both images were taken at the same distance and in the same position.

Image 1 – Smartphone's default camera, normal mode (ORIENTATION_LANDSCAPE)


Image 2 – My application's camera, normal mode (ORIENTATION_LANDSCAPE)


Note that photo 1 is sharper and "thinner", i.e. it shows the original image size without zoom. Photo 2, in contrast, is flattened and "chubby", i.e. deformed.

Below are the application's class code, layout XML, and manifest file:





public class MainActivity extends AppCompatActivity {

    Camera camera;
    FrameLayout frameLayout;
    CameraPreview cameraPreview;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        frameLayout = findViewById(R.id.frameLayout);  // view id assumed; lost in the original post
        camera = Camera.open();                        // call reconstructed; lost in the original post
        cameraPreview = new CameraPreview(this, camera);
        frameLayout.addView(cameraPreview);
    }

    @Override
    public void onWindowFocusChanged(boolean hasFocus) {
        super.onWindowFocusChanged(hasFocus);
        View decorView = getWindow().getDecorView();
        if (hasFocus) {
            decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_FULLSCREEN
                    | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION);
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (camera != null) {
            camera.release();
            camera = null;
        }
    }
}


public class CameraPreview extends SurfaceView implements SurfaceHolder.Callback {

    Camera camera;
    SurfaceHolder holder;

    public CameraPreview(Context context, Camera camera) {
        super(context); = camera;
        this.holder = getHolder();
        this.holder.addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Camera.Parameters parameters = camera.getParameters();
        List<Camera.Size> sizes = parameters.getSupportedPictureSizes();
        Camera.Size mSize = null;

        // Ends up keeping the last size in the supported list
        for (Camera.Size size : sizes) {
            mSize = size;
        }

        if (this.getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
            parameters.set("orientation", "portrait");
        } else {
            parameters.set("orientation", "landscape");
        }

        parameters.setPictureSize(mSize.width, mSize.height);
        camera.setParameters(parameters);

        try {
            camera.setPreviewDisplay(holder);
            camera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
    }
}




If someone can shed some light on where I'm making a mistake, thanks a lot!
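For context, a common cause of a stretched preview is choosing a preview size whose aspect ratio does not match the FrameLayout's. A minimal sketch of that selection logic, in plain Java with a stand-in Size class and hypothetical dimensions so it can run off-device:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Stand-in for android.hardware.Camera.Size so the logic runs off-device.
class Size {
    final int width, height;
    Size(int width, int height) { this.width = width; this.height = height; }
}

public class PreviewSizePicker {
    // Pick the supported size whose aspect ratio is closest to the view's ratio.
    static Size bestMatch(List<Size> supported, int viewWidth, int viewHeight) {
        final double target = (double) viewWidth / viewHeight;
        return supported.stream()
                .min(Comparator.comparingDouble(
                        s -> Math.abs((double) s.width / s.height - target)))
                .orElseThrow(IllegalStateException::new);
    }

    public static void main(String[] args) {
        // Hypothetical supported sizes; a real device reports its own list.
        List<Size> sizes = Arrays.asList(
                new Size(640, 480), new Size(1280, 720), new Size(1024, 768));
        Size best = bestMatch(sizes, 1080, 608);  // view close to 16:9
        System.out.println(best.width + "x" + best.height);  // prints "1280x720"
    }
}
```

On a device, the same idea applies with `parameters.getSupportedPreviewSizes()` and `parameters.setPreviewSize(...)` rather than the picture-size list.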

USB camera scope for sculpture

I want one of these scopes that runs off an ordinary USB connection to a laptop or monitor, where I can download the installation software or use a small install disc. I do not want to spend a lot; it is just for a hobby. Where can I get a used one?

Unwanted resizing of camera photos to a larger size on Snapchat

I do not know why Snapchat resizes saved photos, enlarging them, when they are uploaded from the camera roll.

Is there a way to keep camera-roll photos at their original size before sharing them with others via Snapchat?

Please, give your opinion!

unity – Particles and a Screen Space – Camera UI

I'm using LWRP in Unity 2019.1.12f1. The UI is set to Screen Space – Camera. The shader used for the particles is Lightweight Render Pipeline / Particles / Unlit (Transparent, Premultiply).


(screenshot)

indoor – Camera LCD versus computer screen

If your camera's LCD screen displays the image the way you want, check whether your monitor's color settings are properly configured.

The software you use to view the file can also matter. Examine the files in several different programs and see whether there are any differences.

It would be useful to have a photo illustrating the problem.

unity – How do I position a mesh just outside the camera's view?

I want to slide an object into view, entering from the right of the screen.

To do this, I have to position the object just outside the camera's view, so that it is not yet visible but close enough to slide in quickly.

Setting the object to, say, X = 1000000 (so that it is surely outside the camera's view) would not work, because it could not slide in fast enough. For a reasonable slide, it has to start very close to the edge of the camera's field of view.

How could I calculate the position of the object for this?

Thank you.
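The question above reduces to a bit of frustum geometry: at distance z in front of a perspective camera with horizontal field of view θ, the visible half-width is z·tan(θ/2), so placing the object's center just past that (plus its own half-width) puts it immediately off-screen. Unity scripting would use C# (for example via Camera.ViewportToWorldPoint); this is only the underlying math, sketched in Java with hypothetical numbers:

```java
public class OffscreenPosition {
    // Camera-space x coordinate just past the right edge of the view,
    // for a camera with horizontal field of view fovDeg, an object at
    // distance z in front of the camera, and the object's half-width.
    static double offscreenX(double fovDeg, double z, double objectHalfWidth) {
        double halfWidth = z * Math.tan(Math.toRadians(fovDeg) / 2.0);
        return halfWidth + objectHalfWidth;
    }

    public static void main(String[] args) {
        // Hypothetical numbers: 90-degree horizontal FOV, object 10 units
        // from the camera, 1 unit wide -> tan(45) = 1, so x = 10 + 0.5.
        System.out.println(offscreenX(90.0, 10.0, 0.5));  // prints 10.5
    }
}
```

From there the slide-in animation only has to move x back toward the visible range.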

Filters – What is this mirror-like object in front of the lens of this industrial camera?

I have an industrial camera. Its model is BFLY-23S6M, as shown below. The camera is used for license plate recognition. An 850 nm infrared projector is located near the camera. As far as I know, it is better for it to reject visible light.

There is a mirror-like object in front of the sensor; you can see it, silver in color. I want to know what it is and what it is used for. Is it a filter (for example, anti-glare or bandpass)?

From the documentation, I know about the transparent filter (for monochrome cameras) and the non-transparent filter used for color cameras. This mirror-like part appears to be user-installed, not a default part intended solely for dust protection.

Camera with visible reflective surface

lens – Can a camera lens be one kilometer long?

Well, if the light travels exactly parallel from one element to the next, the distance only creates problems. If it does not travel in parallel, you will need huge elements, or so little light will reach the next element that the camera will not gather enough light for a reasonable picture. And you would certainly want to keep stray light out, so you would want a tube around the whole thing.

It does not seem like a good investment.

Should a camera have a contiguous housing?

The lens groups in this image would each have their own protective tube, like a camera's, but one per lens group. What problem would a gap between the lens groups cause, such as flare if it were used at night?

(image)

Related: Can a camera lens be one kilometer long?

lens – What is the farthest that a camera can see?

One answer to this question is not about what existing lenses and sensors can do in practice, but about what an optical system can do in theory. Here, "in theory" means "under perfect seeing conditions, with no atmospheric disturbance at all". I suspect (but am not sure) that for relatively small optical systems like camera lenses, in relatively good atmospheric conditions, the atmosphere is not the limit. It is the limit for large optical systems like telescopes, although there are deeply astonishing techniques going by the name of "adaptive optics" that can deal with it, involving, of course, lasers attached to the telescope. Alternatively, you can simply be in space.

The answer is therefore that the angular resolution limit of an optical system with a front element of diameter d, operating at a wavelength λ, is given by

Δθ = 1.22 λ / d

The numerical fudge factor of 1.22 can be adjusted slightly depending on what you mean by resolution, but not by much. This limit is called the diffraction limit of an optical system.

If Δθ is small (which it is if you have a reasonable lens), then at a distance r the smallest length you can resolve is

Δl = r Δθ = 1.22 r λ / d

Rearranging this, we get

r = Δl d / (1.22 λ)

This is the range at which an optical device with a front element of diameter d can resolve Δl at a wavelength λ.

The wavelength of green light is about 500 nm, and suppose you need Δl = 1 cm to be able to make out the details of a face (I do not know whether you can identify a person at this resolution, but you could tell that it is a face).

Plugging in these numbers, we get r = 16393 d, where r and d are both in cm. If d is 5 cm, then r is a little less than 1 km. This means that no matter how large the magnification, if your front element is 5 cm in diameter, this is the resolution limit at that distance: if you enlarge the image further, you simply magnify the blur.
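The arithmetic above, r = Δl d / (1.22 λ), is easy to check directly; a small Java sketch using the same numbers as the answer:

```java
public class DiffractionLimit {
    // Diffraction-limited range: r = resolvableLength * aperture / (1.22 * wavelength),
    // all quantities in meters.
    static double rangeMeters(double resolvableLengthM, double apertureM, double wavelengthM) {
        return resolvableLengthM * apertureM / (1.22 * wavelengthM);
    }

    public static void main(String[] args) {
        double lambda = 500e-9;  // green light, about 500 nm
        // 5 cm front element, 1 cm resolvable detail: a little under 1 km
        System.out.println(rangeMeters(0.01, 0.05, lambda));   // ~819.7 m
        // 105 mm front element: ~1.7 km
        System.out.println(rangeMeters(0.01, 0.105, lambda));  // ~1721.3 m
    }
}
```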

In another answer, someone mentioned a 150-600 mm Sigma zoom: it seems to have a 105 mm front element. This gives r = 1.7 km, so this lens is probably close to, or actually, diffraction-limited: it comes close to resolving as well as it is physically possible to do.

Someone also mentioned Canon's perhaps-mythical 5200 mm lens. It is difficult to find specifications for it, but I found one place claiming overall dimensions of 500 mm by 600 mm by 1890 mm: if those are correct, the front element is no more than 500 mm in diameter, so we get r = 8 km for this lens. In particular, it will not let you see faces tens of kilometers away, despite what the media hype seems to imply.

You can of course use this formula for any lens: for example, it tells you why you cannot see the Apollo landing sites on the Moon from Earth with any plausible telescope!