
As a photographer on the side: nahhhh. I have a “real” camera because I want control. Give me the RAW data, and I can move mountains.

What has made interchangeable-lens cameras increasingly irrelevant: the workflow. On a phone, somewhat obviously, I take a photo, and post it to Instagram, instantly. On a stand-alone camera? I…

…uh, well if I’m shooting RAW I need to either have RAW/JPEG turned on, or convert to JPEG in camera, and then save that image, and then use the awful built-in wifi to connect to a shoddily made app that will randomly not connect for reasons, and then hope the transfer finishes, or use an SD-to-Lightning adapter and dongle up my phone.

The DSLR/mirrorless workflow is still, conceptually, film, except replace a darkroom with Lightroom. It’s terrible for run-and-go casual shooting, which was a huge part of the lower-end market.



And Lightroom has a fair bit of computational photography built into it, as does Photoshop. That said, I mostly shoot raw on my iPhone, as I do on my “real” cameras; the phone is mostly easier for snapshots and sharing.


Distortion correction, chromatic aberration correction, perspective correction…yeah, all nice stuff. Especially the perspective feature, which is relatively recent. It’s nice to be able to straighten up an image quickly, in the Lightroom workflow, without having to round-trip to Photoshop.


I don't know about the latest versions of Lightroom (I still use my perpetual license), but the one I have remains untouched by the icy tendrils of "A.I.", and my output is still worlds better than it


AI is admittedly an overused term. But Lightroom absolutely has algorithms of various types that it uses when you make any number of changes. Do you know exactly what vibrance does, to use one of the more trivial examples?
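Adobe doesn't publish the exact formula, but vibrance is commonly described as a saturation boost weighted toward already-muted colors, so vivid pixels (and, in real implementations, skin tones) are protected. A toy sketch of that idea, my approximation and not Lightroom's actual algorithm:

```python
def vibrance(pixel, amount):
    """Apply a vibrance-style adjustment to one RGB pixel (0-255 ints).

    Unlike a flat saturation boost, the boost is scaled by how
    unsaturated the pixel already is, so muted colors move more
    than vivid ones. Toy approximation only.
    """
    r, g, b = pixel
    mx, mn = max(r, g, b), min(r, g, b)
    sat = 0 if mx == 0 else (mx - mn) / mx       # rough HSV saturation
    boost = amount * (1.0 - sat)                 # weaker for vivid pixels
    avg = (r + g + b) / 3
    # push each channel away from the gray average
    return tuple(
        int(min(255, max(0, avg + (c - avg) * (1.0 + boost))))
        for c in (r, g, b)
    )

muted = vibrance((120, 110, 100), 0.5)  # small spread, big boost
vivid = vibrance((250, 40, 40), 0.5)    # already saturated, small boost
```

Even this trivial version shows the point: "one slider" hides a nonlinear, per-pixel decision you didn't make by hand.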


There are things you cannot do on your DSLR and computer setup. For instance, the phone can take the same picture simultaneously at three different focuses. That's just data you don't have, whether you blend it together by hand or not.


Excuse me? Focus is a property of the lens, no? I understand rapid e-shutter for dynamic range (kinda like bracketing then mixing in post), but how exactly is a single camera taking three focuses? Or do you mean the two+ distinct cameras being fused together? Because I don't think they do that (yet), but I could be wrong.

Personally, I like the control a more manual camera offers, but appreciate the quality of my phone's camera at times too. Generally, I agree, technology that tries to outsmart us feels out of place. Technology that tries to empower us feels just right.


yes, they do fuse different lenses together on pixels


How does it know my finger is over one? I just tested this on my iPhone 13 Mini, and the two photos, one without a finger over a lens and one with a finger over a lens, both look identical.

Given that the focal length of the lenses is different, it would be hard to "fuse" the images together, though I suppose it wouldn't be impossible. All this to accomplish what? A higher effective aperture? Or are you imagining some strange blending of foreground and background in focus, with middle-ground out of focus?


I should have capitalized Pixel, since I meant the phone. The iPhone doesn't do this as far as I know. If I remember correctly, it's to blur the background.


generally, people use a DSLR with a bit more intention, and it makes sense to optimize the sensor and lens to capture a single image as envisioned by the photographer.

however, there are lenses available that combine multiple focuses, focal lengths, and framings by projecting multiple images onto the sensor, to be captured with a single actuation of the shutter.

modern cameras also include features for automatically integrating multiple captures to expand details such as depth of field, dynamic range, resolution, and so on. your phone camera does this too, it is just concealed and non-obvious.
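One of those integrations, focus stacking to extend depth of field, can be sketched naively: per pixel, keep the value from whichever frame is locally sharpest. This toy contrast measure stands in for the real sharpness metrics camera pipelines use:

```python
def focus_stack(frames):
    """Naive focus stack over same-size grayscale frames (lists of lists).

    For each pixel, pick the value from the frame whose local contrast
    (absolute difference from the 4-neighborhood mean) is highest --
    a crude stand-in for real sharpness measures.
    """
    h, w = len(frames[0]), len(frames[0][0])

    def contrast(f, y, x):
        nbrs = [f[y2][x2]
                for y2, x2 in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                if 0 <= y2 < h and 0 <= x2 < w]
        return abs(f[y][x] - sum(nbrs) / len(nbrs))

    return [[max(frames, key=lambda f: contrast(f, y, x))[y][x]
             for x in range(w)]
            for y in range(h)]

blurry = [[100, 100], [100, 100]]   # out-of-focus region: low contrast
sharp = [[0, 255], [255, 0]]        # in-focus region: high contrast
stacked = focus_stack([blurry, sharp])
```

Real stacking also aligns the frames and blends across seams, but the core "pick the sharpest source per region" idea is the same.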


> The DSLR/mirrorless workflow is still, conceptually, film, except replace a darkroom with Lightroom. It’s terrible for run-and-go casual shooting, which was a huge part of the lower-end market.

Yep. And there's a whole culture around it, too. The manufacturers who try to deliver improvements - Olympus, for example, has quite a bit of in-body smarts - generally get shat on for doing so. There's a lot of emotional investment in the "digital darkroom", make-everything-hard mindset.


I know we don’t say “disruption” any longer, as it’s a phrase co-opted by hucksters trying to sell ad tech…but the camera manufacturers got disrupted.

Because of what you said right here. They kept trying to appease their highest-end clientele without actually thinking about the job to be done for 98% of photography, which is to post kid and cat photos instantly.


No camera can compete against a phone for "post cat pic instantly", because a 10 year old camera phone can already produce a good enough picture and a phone has a cellular connection and instagram app to upload it instantly. A camera has to compete in the "better pictures" space.


That's you, but I have a "real" camera because I want higher-quality photos. And it used to be that way.

But when I compared my DSLR with my more modern cellphone - well, the colors were just nicer on the cellphone camera.

If the DSLR had better software, that wouldn't force you to use it. If you want control, do that; if you want incredibly high-quality point-and-shoot photos, you can do that too.


That’s…extremely debatable that a phone produces better colors.

For my particular weird niche of photography: there is no way a phone is going to get anywhere near the richness of tone and color I get in very low light.

https://www.flickr.com/photos/perardi/albums/721577201444334...


Just a heads up: there are a few NSFW or borderline NSFW photos in here. It might not be a bad idea to tag that.

Just out of curiosity, are these flat out of the camera or after processing?


Almost all photos people put up online have some form of processing. Including these. Most cameras don’t have this contrasty of a look OOC.


A fellow Sony user, I see.


Sony user here. I just plug into the USB-C on the camera (a7R IV) to download photos from it, it mounts the SD cards as USB drives. No sense faffing about with their crappy wifi program.


Does your phone read the RAW files, or are you shooting JPEG? (iOS and macOS don’t preview Fuji RAW files, which is not a surprise, given how they require very special handling as they’re not traditional Bayer filter sensors.)

And how is the performance when editing on the phone? Those are some big files.


My iPad Pro reads the RAW files from the Sony A7R3. Performance is fine but I stick to JPG due to file size. The iPad 6 had terrible performance with the same RAW files; apps would frequently crash due to memory limitations.


About what I expected.

I’ve only tried editing Nikon Z7 RAW files on an iPad Pro, and I honestly don’t remember which generation of Pro. It was OK, but not stellar.

The file size, oof, yeah. That’s a good point. My lossless compression RAW files from my Fuji are ~30 megabytes. Given an iOS device averages 128GB of storage (my rough ballpark guess), that’s going to fill up quickly.
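Back-of-the-envelope, using the numbers above (and ignoring OS and app overhead):

```python
# ~30 MB per lossless-compressed RAW, 128 GB of device storage
raw_mb = 30
device_gb = 128
photos = (device_gb * 1024) // raw_mb  # 4369 -- a few thousand raws and the device is full
```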


I don't edit on my phone. The screen isn't color calibrated and it's tiny.


Are you getting this to work straight to mobile? I use the crappy wifi share to get shots onto my phone on the go, but I'd much rather get the RAW files over vs. a downsampled JPEG if possible. Might be worth carrying a small USB-C to lightning cable if it works straight to iPhone.


Sony? Luxury.

Fuji. It’s like trying to connect a Palm Pilot to wifi or something.


I was thinking the same thing.


I get away with Sony Imaging Edge on iPhone + Lightroom mobile to post to Instagram on the fly. It's only 2MP JPEG but it's good enough for on-the-fly Instagram stories.


You need 3-5x exposure bracketed RAW to recreate most phone tricks, and there's no software that can do it. The big one is HDR photos in HEIF; "HDR" photo editing software does tone mapping into sRGB, which is literally the opposite of an HDR photo.

(I'm actually not sure why this is, since pro video workflows handle it. I think it's because phones and movie theaters have HDR displays but PCs don't.)

Presumably they could do something like Deep Fusion for noise reduction/super-resolution even better than what you already get from a real camera, but nobody seems very interested in doing it. Adobe can't even improve their image resizing for some reason (backwards compatibility?); when they come up with a new resizing method, they have to put it in a different place in the UI and give it a silly name like "neural enhance RAW".
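On the tone-mapping point above: a classic global operator like Reinhard's maps scene-referred luminance L to L/(1+L), squeezing everything into [0, 1) for an SDR display. That compression is exactly the headroom an HDR (HEIF/PQ) file is supposed to keep. A minimal sketch:

```python
def reinhard(luminance):
    """Reinhard global tone mapping: L / (1 + L).

    Compresses unbounded scene-referred luminance into [0, 1) for an
    SDR display -- the 'tone mapping into sRGB' step that throws away
    the headroom an HDR-format photo would preserve.
    """
    return luminance / (1.0 + luminance)

# Two exposures 4 stops (16x) apart end up nearly identical after
# tone mapping -- the dynamic range is gone:
lo, hi = reinhard(10.0), reinhard(160.0)
```

Both values land within a tenth of each other near 1.0, which is why merging brackets and then tone mapping is the opposite of producing a true HDR output file.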


I’m curious to hear your take on phones shooting in RAW then?


The workflow is still easier that way.

I do have to go through the step of post-processing…but I'm doing that with JPEGs from the phone regardless.

Completely tangential rant: #nofilter is dumb. Your camera is actually producing a low-contrast black-and-white image that is then demosaiced and interpreted according to an engineer somewhere who is trying to make it look fairly literal. There's filters all the way down.
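To make the "filters all the way down" point concrete: each photosite records a single color channel, and the other two channels are interpolated from neighbors. A toy bilinear demosaic for one interior pixel of an RGGB Bayer mosaic (real pipelines are far more sophisticated):

```python
def demosaic_pixel(bayer, y, x):
    """Bilinear demosaic of one interior pixel from an RGGB Bayer mosaic.

    `bayer` is a 2D list of raw sensor values; each photosite recorded
    only one of R, G, B, so the other two channels are averaged from
    the nearest neighbors that did record them.
    """
    def avg(offsets):
        vals = [bayer[y + dy][x + dx] for dy, dx in offsets]
        return sum(vals) / len(vals)

    here = bayer[y][x]
    cross = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    diag = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    horiz = [(0, -1), (0, 1)]
    vert = [(-1, 0), (1, 0)]

    if (y % 2, x % 2) == (0, 0):            # red photosite
        return (here, avg(cross), avg(diag))
    if (y % 2, x % 2) == (1, 1):            # blue photosite
        return (avg(diag), avg(cross), here)
    if (y % 2, x % 2) == (0, 1):            # green on a red row
        return (avg(horiz), here, avg(vert))
    return (avg(vert), here, avg(horiz))    # green on a blue row

# 3x3 mosaic: R G R / G B G / R G R; demosaic the center (blue) site
mosaic = [[10, 20, 10],
          [20, 30, 20],
          [10, 20, 10]]
rgb = demosaic_pixel(mosaic, 1, 1)
```

Two of the three channels at that pixel were never measured at all, so every color photo is already an interpretation before any slider gets touched.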

Wait, where was I? Right. Post-processing. So once I've edited the RAW file in the Photos app, or in Lightroom…boom, it's right there for me. It's on my phone, it's getting backed up to iCloud, and I can trivially post the photo anywhere.

Whereas, on a stand-alone camera, I have to jump through a series of connecting hoops. So RAW doesn't have much impact on the workflow.


You're not imagining the stuff you can do with raw sensor access.



