Dantist
Newbie
Hi all,
I have an M4 myself, but I always wanted to also have an RF in my iPhone. Unfortunately, no such thing existed, so I developed one.

Some of the key features:

- Full film rangefinder shooting experience:
  - Manual rangefinder focusing with a proper RF coincidence patch. It's powered by LiDAR, so the app is limited to iPhone Pro models (ideally the 14 Pro or newer, but anything from the 11 Pro onwards should work), sorry
  - Manual shutter and aperture control. The selected aperture defines the bokeh (including cat's-eye highlights, hexagonal shape at closed apertures, etc.)
  - You actually have to wind the film to capture anything, and you actually need to finish the roll to see results (which takes a while). Once the app is released, you'll actually have to buy film too (a subscription option will be available, but it's not recommended, as it ruins the value of each frame you take)
  - 50mm field of view to make results feel different from your stock camera (and yes, I like 50mm more than 35mm)
  - Authentic acoustic feedback mimicking cloth-shutter rangefinders, and Taptic-powered tactile feedback (although here one can only imitate the real thing)
  - Color filters for B&W photography
  - A development cycle at the end of each roll before you can see results, plus a light-table view to inspect your negatives
- The best film emulation in town:
  - Physics-based simulation: an iPhone implementation of the film grain rendering algorithm by Newson et al. 2017 (Alasdair Newson's "Film Grain Rendering"). This means grain is not a texture but is actually rendered according to exposure, including edge effects etc. The algorithm was rewritten from scratch for Metal, so it now runs faster on an iPhone than the original CPU code does on a high-end PC (seconds vs. minutes). It's still a GPU-heavy task that benefits from Pro hardware
  - Halation and glow simulations tuned to each film stock
  - B&W, color, and slide stocks (Kodak-inspired) with exposure-dependent color/density rendering; negatives appear as such when viewed on the light table
- The best bokeh/depth rendering in town:
  - High-resolution LiDAR-assisted ML depth estimation
  - A highly optimized bokeh renderer that operates in real metric space and renders like a 50mm lens (i.e. blur depends on the camera-to-subject and subject-to-background distances). The physics is as close to a real lens as possible without compromising robustness too much
  - The bokeh look is set at capture time via the selected aperture/shutter, just like on a real camera (via gestures, to keep things usable). This includes the physical limitations a real camera imposes: if it's sunny and you're shooting ISO 400 film, you can't really shoot at f/2 without overexposing. Just like a real camera
- Hand-shake simulation. OIS cannot be switched off on iPhones, which yields unrealistic results at slow shutter speeds, so a dedicated motion-blur engine powered by the accelerometer was developed and tuned to match what you get from Leicas with cloth shutters
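For anyone curious what "physics-based grain" means in practice, here's a minimal sketch of the Boolean-model idea behind Newson et al.'s approach (my own simplified illustration in Python, not the app's Metal code): grains are tiny disks scattered by a Poisson process whose local intensity is chosen so that the expected coverage at each pixel equals the exposure, and the rendered image is a Monte Carlo estimate of that coverage.

```python
import numpy as np

rng = np.random.default_rng(0)

def render_grain(u, r=0.8, samples=16):
    """Monte Carlo film-grain rendering sketch (Boolean disk model).

    u: 2D array of normalized exposures in [0, 1). Grain centers follow a
    Poisson process with intensity lam chosen so that the probability a
    point is covered, 1 - exp(-lam * pi * r^2), equals the exposure u.
    """
    h, w = u.shape
    u = np.clip(u, 0.0, 0.999)
    lam = -np.log(1.0 - u) / (np.pi * r * r)
    # Pad intensity by one pixel so grains just outside the frame
    # still cover border pixels (one pixel cell == one unit area).
    lam_p = np.pad(lam, 1, mode="edge")
    counts = rng.poisson(lam_p)
    centers = []
    for y, x in zip(*np.nonzero(counts)):
        n = counts[y, x]
        centers.append(np.stack([x - 1 + rng.random(n),
                                 y - 1 + rng.random(n)], axis=1))
    centers = np.concatenate(centers) if centers else np.empty((0, 2))
    out = np.zeros((h, w))
    for _ in range(samples):
        # Jittered evaluation point inside each pixel.
        ex = np.arange(w)[None, :] + rng.random((h, w))
        ey = np.arange(h)[:, None] + rng.random((h, w))
        if len(centers):
            d2 = ((ex[..., None] - centers[:, 0]) ** 2
                  + (ey[..., None] - centers[:, 1]) ** 2)
            out += (d2 <= r * r).any(axis=-1)  # covered by any grain?
    return out / samples
```

This is the brute-force version; the point of the paper (and, presumably, of the Metal port) is making exactly this sampling fast. The grain radius here controls both the texture scale and the noise variance, which is why grain responds to exposure rather than being a static overlay.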
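To put a number on the ISO 400 / f/2 limitation above: plain EV arithmetic (using the sunny-16 ballpark of EV 15 at ISO 100) shows the overexposure. The function names below are just for illustration, not from the app:

```python
import math

def exposure_value(aperture, shutter_s):
    """Camera-side EV = log2(N^2 / t)."""
    return math.log2(aperture ** 2 / shutter_s)

def required_ev(iso, ev100=15):
    """Scene EV at a given ISO; ev100=15 is the sunny-16 bright-sun value."""
    return ev100 + math.log2(iso / 100)

def stops_over(iso, aperture, shutter_s, ev100=15):
    """Positive result = stops of overexposure."""
    return required_ev(iso, ev100) - exposure_value(aperture, shutter_s)

# ISO 400 film, bright sun, f/2 at a cloth shutter's fastest speed (1/1000 s):
print(round(stops_over(400, 2.0, 1 / 1000), 1))  # → 5.0 (stops overexposed)
```

Roughly five stops over even at 1/1000 s, so the app refusing f/2 in that situation is just honest physics.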
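And a toy version of the hand-shake idea. The post says the engine is driven by the accelerometer; real hand shake is dominated by rotation, so as a simplifying assumption this sketch integrates angular-rate samples into a pixel-space blur path via the small-angle approximation (shift ≈ focal length in pixels × accumulated angle). It is an illustration of the concept, not the app's engine:

```python
def blur_path(angular_rates, dt, focal_px):
    """Integrate angular-rate samples (rad/s, single axis) taken every dt
    seconds over the shutter interval into a 1-D pixel-space blur path.
    Small-angle approximation: pixel shift ~= focal_px * angle."""
    theta = 0.0
    path = [0.0]
    for omega in angular_rates:
        theta += omega * dt  # accumulate rotation during the exposure
        path.append(focal_px * theta)
    return path

# Hypothetical numbers: 0.02 rad/s of drift, 1 kHz sampling, 30 ms shutter,
# ~4000 px focal length for a 50mm-equivalent field of view.
path = blur_path([0.02] * 30, 0.001, 4000)
print(path[-1])  # total smear in pixels at the end of the exposure
```

The resulting path would then be splatted as a motion-blur kernel; the tuning against a real cloth-shutter Leica that the post mentions would live in how the sampled motion is scaled and filtered.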




