Hasselblad: Which lenses mount on which cameras?

I'm interested in the Hasselblad digital system, and I would like to know which lenses mount on which cameras. I'm especially interested in the H4D and H5D cameras. Which lenses are compatible with these cameras?

equipment recommendation – Can smartphones, like iPhone 12 Pro, replace entry-level DSLR cameras?

There is no clear answer: it depends on your type of photography.

If you just want a camera for everyday snapshots, you are probably not missing out on much. Computational photography in phones has come a long way.

However, you will notice some limitations: because of the small sensor, low-light performance is limited. The phone's image enhancement will do its best, but it cannot completely overcome the limitations of a small sensor.

Things you can do with a phone that you probably cannot do as easily with an entry-level DSLR:

  • Shoot video, including slow motion, without knowing much about what you are doing
  • Carry it with you ALL the time
  • Take photos in very wet environments
  • Shoot automated panoramas

Things that are hard or impossible with a phone:

  • Low-light photography
  • Anything that needs long exposures or filters (like ND filters for landscapes)
  • Anything that needs a flash (you will have to use continuous light for any elaborate lighting, but then you may again suffer from more noise, and you cannot easily freeze action with continuous light)
  • Long telephoto reach. The best phones have a 5x optical zoom lens, equivalent to ~130mm. With a DSLR you can get a 400mm-equivalent lens cheaply (a 250mm lens on a crop sensor).
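The "equivalent" focal lengths in the last point come from multiplying the actual focal length by the sensor's crop factor. A minimal sketch of that arithmetic (the 1.6x crop factor and 26mm phone main-camera figure below are assumptions matching the numbers quoted above; Nikon and Sony crop bodies use roughly 1.5x):

```python
# Full-frame equivalent focal length = actual focal length x crop factor.
def ff_equivalent_mm(focal_mm, crop_factor):
    return focal_mm * crop_factor

# A 250mm lens on a 1.6x (Canon-style) APS-C body frames like 400mm on full frame:
print(ff_equivalent_mm(250, 1.6))  # 400.0
# A phone with a ~26mm-equivalent main camera and 5x optical zoom reaches ~130mm:
print(ff_equivalent_mm(26, 5))     # 130
```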

I am assuming that you use a third-party photo app on the phone to get basic access to the settings a photo is taken with.

External Flash triggered by camera’s internal trigger

… can my camera trigger this external flash?

Yes, but not as a Canon wireless master. You have to put the YN-560IV into S1/S2 optical slave mode (use the Lightning/Radio button), and use the pop-up flash as a regular flash. You use S1 if the pop-up flash is in M mode; S2 if it’s in TTL, so you can skip the metering pre-flash.

Keep in mind the S1/S2 “dumb” optical slaving is completely different from the Canon optical wireless system.

Does the pop-up flash on my camera always have to stay up to trigger the external flash?

Yes. Something on the camera has to tell the flash when to fire. And that’s either going to be the pop-up flash, or a radio transmitter. Without the add-on transmitter, you have to use the pop-up flash, and it has to fire a pulse bright enough for the slave sensor to “see”. You can set the power of the pop-up to be minimal, and it should not register in the image at regular (non-macro) subject distances.

If this is possible, will there be any difference between using the built-in trigger and an external one?

Yes. If you get a YN-560-TX transmitter, you won’t have to wrestle with the line-of-sight requirements of optical slaving. The light sensor on the flash has to be able to “see” your pop-up flash to be triggered remotely, so it’s difficult to use a flash behind you, or behind a solid object. The light signal can also be overpowered if you try to use optical slaving outdoors in bright sunlight.

The YN-560-TX is a radio trigger, so it isn’t affected by either of these factors. In addition, it lets you remotely set the power and zoom of the flash from the camera.

However, a Godox TT600 and X1T-C combination would cost roughly the same, would also give you HSS capability on the remote flash, and would give you far better expansion options in the future.

I will sometimes be using a soft dome or a reflective umbrella, so will they be in line of sight when using the internal trigger?

Depends on how you set things up and whether there are bounce surfaces around. The red panel on the front of the flash is where the optical slave sensor is.

Adapter to use DSLR lens on Mirrorless cameras

I am planning to use an adapter for DSLR prime/zoom lenses from Nikon/Canon with a Sony mirrorless A7 III camera.
Will this affect the quality of the photos taken? Would it be advisable to go for Sony E-mount prime/zoom lenses rather than using an adapter?

lens – Mirrorless cameras and Night photography

Let’s talk about lenses first.

You can get cheap lenses in almost any mount, mirrorless or (D)SLR. You can get very expensive lenses for any mount, mirrorless or SLR. You can get a lot of lenses in between for either type of camera. Since most SLR mounts have been around for decades and most mirrorless mounts have existed for less than one, there are far more used lenses for popular SLR mounts (Canon EF, Nikon F, Minolta/Sony A, Pentax K, etc.) in circulation than for the newer mirrorless mounts. But when buying new lenses, you need to look at the particular lens you are interested in and compare similar lenses from one platform to another.

Now let’s talk about cameras.

In very low light, the actual view through the viewfinder of any type of camera – film, DSLR, or mirrorless – can be very different from the resulting photo. This is more a function of the length of exposure normally used in very low light than anything else. Your eye observes the scene in real time through the viewfinder. If the camera is collecting light over a period of several seconds or several minutes, there is no way to see that through the viewfinder near-instantly.

But when we talk about a digital camera’s efficiency, we’re not usually talking about how closely the view in the viewfinder matches the actual image produced. We’re talking about what percentage of the light that enters the camera is converted to energy and recorded as a photograph. In theory there is absolutely no difference in how efficient a camera with a mirror can be versus a camera without one. You can put exactly the same sensor in either type of camera, and some manufacturers have done just that from time to time. For example, the Canon EOS R is a mirrorless camera that uses the same sensor as the Canon EOS 5D Mark IV, which is a DSLR. So again, you’re right back to comparing two specific sensors from two specific cameras, whether both are mirrorless, both are DSLRs, or you’re looking at one of each.

Depending upon exactly what kind of “night photography” you plan to do, Sony sensors may be better or worse equipped for your specific use case. For astrophotography, some Sony models have a reputation as “star eaters” because their aggressive noise reduction routines eliminate dim stars along with digital noise. On the other hand, those same models with aggressive noise reduction might be just the thing for night street photography.

color management – How do cameras undistort photos?

Distortion is produced by the lens, not the sensor. So the distortion correction you describe is also referred to as lens correction.

There are different distortion models (equations). The parameters depend on the specific models used. Different cameras and software may use different models, and camera manufacturers tend not to reveal the ones they use.

The correction parameters may be taken from a lens model (more equations) that describes the behavior of specific lenses, or they may be found empirically, by photographing test scenes and making measurements. Usually a representative copy of the lens is used; the results are generally good enough, and averaging parameters across copies does not necessarily produce desirable results.
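As an illustration of what such a model looks like, here is a sketch of one widely used generic form (polynomial radial distortion, Brown-Conrady style) – not any particular manufacturer's proprietary model. The coefficients `k1` and `k2` are assumed example values; real ones come from a lens profile or a calibration like the one described above:

```python
# Sketch of a simple polynomial radial distortion model (Brown-Conrady style).
# (x, y) are coordinates normalized relative to the optical center.
# k1, k2 are example coefficients -- in practice they come from calibration.
def distort(x, y, k1, k2):
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# Correction software applies the inverse of this mapping.
# The optical center (r = 0) is unchanged:
print(distort(0.0, 0.0, -0.1, 0.02))  # (0.0, 0.0)
# With a negative k1 (barrel distortion), off-center points are pulled inward:
print(distort(0.5, 0.0, -0.1, 0.02))
```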

The distortion models used by Hugin are also used to perform lens correction in open-source software (e.g., lensfun).

To solve your problem, you should be able to use Hugin to calculate the transformation.

Why are iPhone cameras so much better than those in Android phones?

Not always. As ErickP said, it depends on the model. For example, Samsung phones are supposed to have cameras of the same quality as, or even better than, the iPhone’s. Cheaper Android phones will, of course, have cheaper cameras, but if you get an expensive one, the camera will be good.

unity – OnMouse Events stop working when multiple cameras are active

I’m working in Unity 2019.4.9f1 on a 2D project using the Universal Render Pipeline. The camera settings differ from the defaults because of this, so I’m stating it up front.

I wrote OnMouseEnter and OnMouseExit handlers in a script to detect whenever my mouse cursor is over a SpriteRenderer with a collider, located at (0, 0, 0). I have a main camera located at (0, 0, -10). I’ve tested the script that handles the OnMouse events, and it works fine.

Problem: The moment I add a new camera (a UI Camera set to Overlay, located at (0, 0, -10)) and enable it, the OnMouse events stop being called. Disabling the UI Camera makes the OnMouse events work normally.

Solutions that have failed:

  1. I’ve tried setting the tag of the UI Camera to MainCamera and setting
    the Main Camera’s tag to Default.
    https://answers.unity.com/questions/241334/onmousedown-w-multiple-cameras-doesnt-work-right.html
  2. I tried moving the UI Camera a bit closer to the sprites.
    https://answers.unity.com/questions/425478/onmousedown-not-firing.html

I intend to avoid raycasting and get the OnMouse events to work, because I want to fix the problem rather than work around it.

Question: Any possible reason why my OnMouse events aren’t called?

lens – Why aren’t there supertele lenses for APS-C sensor sized cameras?

Because making these lenses for APS-C sensors won’t make them significantly smaller. The focal length and maximum aperture are the main factors determining the diameter of the front element, which in turn drives the size and weight of the lens. So you would have two lens line-ups (full frame and APS-C), and given the smaller market for each, the lenses could end up more expensive than lenses sold for both.

On the other hand, you could consider that APS-C supertele lenses already exist; they are just lenses with shorter focal lengths. The APS-C version of a 600mm f/4 is just the 400mm f/4…
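Both claims above reduce to simple arithmetic: the front optic scales roughly with focal length divided by f-number, and framing equivalence scales with the crop factor (the 1.5x figure below is an assumption consistent with the 600mm-to-400mm example):

```python
# Approximate entrance-pupil (front optic) diameter: focal length / f-number.
def entrance_pupil_mm(focal_mm, f_number):
    return focal_mm / f_number

# A 600mm f/4 needs a ~150mm front optic no matter how small the sensor behind it is:
print(entrance_pupil_mm(600, 4))  # 150.0
# With a 1.5x crop factor, a 400mm lens on APS-C frames like 600mm on full frame:
print(400 * 1.5)                  # 600.0
# ...and its front optic shrinks to ~100mm, which is where the real savings come from:
print(entrance_pupil_mm(400, 4))  # 100.0
```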

For shorter lenses (standard and wide angle), the smaller image circle is easier to take advantage of, and the market is big enough to warrant designing lenses specific to APS-C.