I have been experimenting with Live View on my camera (Nikon D5100) to get a feel for what a mirrorless camera would be like, and also because I want to shoot video clips of distant moving objects such as planes and birds. I realize that video on this camera is really not ideal, but at the moment it’s the only thing I have with the level of zoom required to get close enough to such objects.
But in doing so I have noticed that I find it almost impossible to actually use live view for either planes or (especially) birds, because I can’t get the camera aimed in the right direction for them to show up. With a plane it’s a bit easier, as it’s not moving as fast relative to me, so I can quickly locate it through the lens and then switch to live view. It’s still a bit tricky, though, and in the process of moving the camera away from my eye it shifts and I can lose the aircraft I’m trying to track. For moving birds it has proved impossible – I can get off a few shots through the lens, but if I try to use live view instead, by the time I have it on the bird is gone. Trying to hunt for either object directly in live view, without first looking through the viewfinder, seems too time-consuming.
At first I thought maybe it was the size or brightness of the display, but I turned the brightness to maximum and it didn’t help, and I am not sure how a bigger display would help me point the camera in the right direction.
Is using live view – or, correspondingly, a mirrorless camera – suitable for telephoto shots of moving subjects? I am wondering if there is some trick to doing this or if maybe it’s just me. My on-axis hand–eye coordination seems to be fine – I have never had any problem photographing moving subjects through the lens. But off axis (not looking through the lens), will it improve with lots of practice, or am I just wasting my time?
The suitability of live-view and mirrorless camera options for use with long telephoto lenses comes down to ergonomics, practice, and context.
The biggest factor is likely how you’re viewing, sighting, and stabilizing the camera, rather than what you are photographing.
Viewing a non-tilting rear-mounted screen is probably the worst option in nearly any context.
- You give up stabilization to hold it out from your body enough to view the screen.
- You leave the camera ‘floating’ in front of you more, reducing your bio-mechanical feedback about where the glass is actually pointing, and reducing your ability to reliably sight ‘over the barrel’ to approximate your aim before locking in on target.
- You face an increased risk of issues such as screen glare making the display difficult to see.
An eye level viewfinder is typically the most favourable option for the majority of use cases.
- You gain stability by bringing the camera close into the body.
- You gain bio-mechanical feedback by typically using the same ‘contact points’ with your body, as your hand and eye consistently align while you swing the camera.
However, there are scenarios where one may benefit from using a rear screen, especially when tilt options come into play. [These may also apply to electronic viewfinders that can be viewed independently of the camera, such as ‘goggle mount’ options.]
- Positioning the camera lower than a traditional viewfinder allows.
- Working in hazardous situations where gluing your eye to the back of the camera may make it more difficult to notice a danger in time to react.
All that said, getting a long lens to hold on target is a skill, and it demands practice.
But if photographers can learn to track moving subjects with a waist level finder on an old medium format camera [where the view is flipped left to right], then odds are good that you can learn to adapt to a given setup.
[I know we’re straying into the world of video here, but I feel it’s still relevant]
Why is live view harder than “TTL”?
Because you are not using the senses you were born with – your eye & hands are easily co-ordinated to match direction, distance, speed & focus [of attention, not sharpness of image].
Catch a ball. No maths required, you just do it.
Catch a ball whilst watching your live view… maybe not.
Will you improve over time?
Yes, otherwise the entire movie industry would be full of missed shots. Almost no cinematographer has his eye right up to the lens these days; there’s always some remote aspect to it – in effect, live view. Watch a Steadicam operator, gazing towards his feet, where his monitor is, whilst his camera is pointed firmly at the action in front of him.
There’s a dual purpose in that, he can see what he’s shooting whilst not falling over anything.
Many pro video cameras have the same setup as DSLR or mirrorless cameras: a TTL viewfinder & separate moveable screen(s) for remote viewing. Use of the eyepiece is becoming much rarer; the remote ‘live view’ is more common.
So, for stills what does it mean?
Mirrorless cameras currently still have both options.
The “TTL” viewfinder is a teeny TV screen, but it’s still really “through the lens”, with your face pressed against the camera, the same as ever.
The live view you’ll still have to train for, just like every ‘movie’ cameraman working today.
Your potential pitfall, for fast action, is the frequency with which the live view updates on your screen. Modern high-end cameras have truly ‘as live’ display, with no discernible delay at all; an old entry-level consumer DSLR is going to be pretty laggy by comparison.
As the other answers have hinted at, this isn’t so much about the difference between an electronic viewfinder and an optical viewfinder as it is about the different ergonomics of using an eye level viewfinder held in a constant position relative to your face and eyes versus a screen on the back of a camera held a foot or so in front of your face.
When you are using an eye level viewfinder, whether it is an optical viewfinder such as is the case with your D5100 or an electronic viewfinder as would be the case with a mirrorless camera, you hold the camera against your face. To aim the camera you don’t merely move the camera with your hands. You aim the camera by moving your face towards your target and your hands follow to keep the camera in the same position relative to your face.
When you move the camera away from your face to use a monitor screen that is not in a fixed position relative to your face, you can no longer aim the camera by simply moving your face relative to the target/subject. It’s not an insurmountable problem, but it is a different skill set that takes much practice to master. After all, you’ve been learning to bring distant objects into the center of your field of vision by moving your face towards the desired subject for your entire life! Now you’ve got to unlearn that to some extent.
It’s not easy. It is doable. It will probably never become as instinctive as using an eye level viewfinder can be, though. You’ll always have to be more consciously thinking about how to move the camera to keep it pointed in the right direction when the camera is not pressed against your eye and face.
The difficulty of pointing is a combination of both the different body position when looking at a screen and the inherent latency of an LCD screen.
One option would be to mount an external viewfinder to the hot shoe on the camera. There are many models available; search for “external optical viewfinder” or “hot shoe viewfinder”.
Such a viewfinder will not show the image as it is through the lens, so the framing will be different. Typically the external viewfinder will show a wider area than your telephoto lens, which actually helps in initially finding the target but may make it a bit harder to accurately follow it.