Ever wanted to have a rangefinder in your iPhone Pro?

Dantist (Newbie · Joined Dec 28, 2020 · Messages: 2)
Hi all,
I have an M4 myself, but always wanted to also have an RF in my iPhone. Unfortunately, no such thing existed, so I developed one.


Some of the key features:
  • Full film rangefinder shooting experience:
    • Manual rangefinder focusing with a proper RF-coincidence patch. This is powered by LiDAR, so the app is limited to iPhone Pro models (ideally the 14 Pro and later, but anything from the 11 Pro onwards should work), sorry
    • Manual shutter and aperture control. The selected aperture defines the bokeh (including cat's-eye highlights, hexagonal shapes at stopped-down apertures, etc.)
    • You actually have to wind the film to capture anything, and you need to finish the roll to see results (which takes a while). Once the app is released, you'll have to buy film too (a subscription option will be available, but it's not recommended, as it ruins the value of each frame you take)
    • A 50mm FOV to make results feel different from your stock camera (and yes, I like 50mm more than 35)
    • Authentic acoustic feedback mimicking cloth-shutter rangefinders, plus Taptic-powered tactile feedback (although here one can only approximate the real thing :( )
    • Color filters for B&W photography
    • A development cycle at the end of each roll before you can see results, plus a light-table view to inspect your negatives
  • The best film emulation in town:
    • Physics-based simulation, i.e. an implementation of the algorithm by Newson et al. 2017 (Alasdair Newson - Film Grain Rendering) on iPhone. This means grain is not a texture but is actually rendered according to exposure, including edge effects etc. The algorithm was rewritten from scratch for Metal, so it now runs faster on an iPhone than the original CPU code does on a high-end PC (seconds vs. minutes). It is still a GPU-heavy task that benefits from Pro hardware.
    • Halation and glow simulations tuned to each film stock
    • B&W, color, and slide stocks (Kodak-inspired) with exposure-dependent color/density rendering. Negatives appear as such when viewed on the light table
  • The best bokeh/depth rendering in town:
    • High-resolution, LiDAR-assisted ML depth estimation
    • A highly optimized bokeh renderer that operates in real metric space and renders like a 50mm lens (i.e. blur depends on both the camera-to-subject and subject-to-background distances). The physics is as close to a real lens as possible without compromising robustness too much.
    • The bokeh look is specified at capture time via the selected aperture/shutter, just like on a real camera (via gestures to keep things usable). This includes the physical limitations of a real camera: if it's sunny and you're shooting ISO 400 film, you cannot really shoot at f/2 without overexposing. Just like a real camera.
    • Hand-shake simulation. OIS cannot be switched off on iPhones, which yields unrealistically sharp results at slow shutter speeds, so a dedicated motion-blur engine powered by the accelerometer was developed and tuned to match what you get on a cloth-shutter Leica
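For anyone curious how exposure-dependent grain can work without textures: here is a toy, pixel-wise sketch of the Boolean grain model from the Newson et al. paper mentioned above. The real renderer handles inter-pixel grain correlation and edge effects and runs in Metal; every name and number below is mine, for illustration only.

```python
import math
import random

def grain_intensity(u, r):
    """Poisson intensity of grain centers, chosen so that the expected disk
    coverage of the Boolean model equals the input exposure u."""
    u = min(max(u, 0.0), 0.999)  # clamp so the log stays finite
    return math.log(1.0 / (1.0 - u)) / (math.pi * r * r)

def grainy_pixel(u, r=0.05, n_samples=400, rng=None):
    """Monte Carlo estimate of one grainy pixel value: each sample inside
    the pixel is covered by at least one grain with probability
    1 - exp(-lambda * pi * r^2), which by construction equals u. Averaging
    Bernoulli draws gives an exposure-dependent noisy value (inter-pixel
    correlation and edge effects are ignored in this sketch)."""
    rng = rng or random.Random(0)
    lam = grain_intensity(u, r)
    p_cover = 1.0 - math.exp(-lam * math.pi * r * r)
    hits = sum(1 for _ in range(n_samples) if rng.random() < p_cover)
    return hits / n_samples
```

The point of the construction: the mean of the rendered image matches the input exposure, while the variance (the visible grain) falls out of the disk statistics rather than an overlaid texture.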
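The "renders like a 50mm lens in real metric space" idea boils down to thin-lens geometry: blur grows with aperture and with the separation between subject and background. A minimal sketch (parameter names are my own, not the app's API):

```python
def coc_diameter_mm(focal_mm, f_number, subject_m, point_m):
    """Thin-lens circle-of-confusion diameter on the film plane, in mm,
    for a point at distance point_m when the lens is focused at subject_m."""
    f = focal_mm / 1000.0       # focal length, meters
    aperture = f / f_number     # entrance-pupil diameter, meters
    # Standard thin-lens result: c = A * (f / (s - f)) * |b - s| / b
    c = aperture * (f / (subject_m - f)) * abs(point_m - subject_m) / point_m
    return c * 1000.0
```

For a 50mm lens at f/2 focused at 2 m, a background at 10 m blurs far more than one at 3 m, and stopping down to f/8 shrinks the blur fourfold, which is exactly the "depends on camera-to-subject and subject-to-background distance" behavior described above.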
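The motion-blur engine can be imagined as integrating device rotation over the shutter-open interval into an image-space path. The post mentions the accelerometer; this sketch integrates angular-rate samples instead (which CoreMotion also provides), so treat the choice of input signal, and all names here, as my assumptions:

```python
def shake_trajectory(gyro_rad_s, sample_dt, shutter_s, focal_px):
    """Integrate angular-rate samples (rad/s, taken while the shutter is
    open) into an image-space blur path in pixels, using the small-angle
    approximation: pixel_shift ~= focal_length_in_pixels * angle."""
    n = max(1, int(round(shutter_s / sample_dt)))
    x = y = 0.0
    path = [(x, y)]
    for wx, wy in gyro_rad_s[:n]:
        x += wx * sample_dt * focal_px
        y += wy * sample_dt * focal_px
        path.append((x, y))
    return path
```

The renderer would then smear each pixel along this path; slower shutter speeds consume more samples and so produce longer, more realistic blur trails than OIS-stabilized stock capture allows.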
Now I'd like to share the first beta version with the community to get some feedback (general feedback is welcome, but especially bug reports, of which I expect many 🙂). If you are interested and have an iPhone Pro, please DM me or write to support at mkamera.app, and I'll get you on board if there are still places (I plan to limit the number of testers to 100 to have any hope of processing and actually acting on the feedback). Thanks!

 
Hello Dantist! I think you cannot send or receive DMs here until you have a certain number of posts.

I'm curious about the app; sign me up, please! I guess you use TestFlight for distribution?
 
Yes, it's TestFlight; maybe it's easier if I post the link here. I wanted to do that from the start, but it was blocked as spam. Anyway, everyone is welcome to join! I would appreciate feedback of any kind; you can also post it here so others can comment. The beta will run for a couple of weeks, and then I plan to take a break and work on the feedback before release.

 