What is Happening with iPhone Camera?


Okay, what exactly is happening with the iPhone's camera? We've done years of blind smartphone camera tests in bracket format, and the iPhone, supposedly one of the premium cameras in the entire smartphone industry, consistently loses in the first round. Then we do a scientific version with 20 million-plus votes, and it finishes in the middle of the pack. And yet, Marques, you named it the best overall smartphone camera system for the fourth time running in 2022 and gave it a trophy. What's up with that?

A concerning number of people have started to notice that the iPhone's camera feels like it's taken a few steps back lately, and I agree with them. I think we should take a closer look at this.

So first of all, cameras have come a really long way, to the point where smartphone cameras aren't just cameras anymore. See, back in the day, a camera was basically a sensor that traveled around covered all the time, and when you wanted to take a photo, you would expose that sensitive bit to the environment around it. It would collect the light and close, and the photo would be a representation of how much light hit each part of the sensor. The better the sensor, the better an image you could get: the more light information. Super simple.

These days, though, it's turned into a whole computational event. Your smartphone sensor is sampling the environment not once but often several times in rapid succession, at different speeds. It's taking that light information, merging exposures together, and doing tone mapping, noise reduction, and HDR processing, then putting it all together into what it thinks will be the best-looking image. This, of course, is a very different definition of a picture.
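
To make that "merging exposures" step concrete, here is a minimal sketch of one classic building block, exposure fusion, using OpenCV's Mertens merge. The filenames are hypothetical, and real phone pipelines are far more elaborate (and proprietary) than this:

```python
import cv2
import numpy as np

# Hypothetical filenames -- any aligned burst of the same scene shot
# at different exposures will do.
paths = ["under.jpg", "mid.jpg", "over.jpg"]
frames = [cv2.imread(p) for p in paths]

# Mertens exposure fusion: weight each pixel by contrast, saturation,
# and well-exposedness, then blend the frames into one image. No
# exposure times or camera response calibration needed.
merged = cv2.createMergeMertens().process(frames)

# Output is float32 in roughly [0, 1]; clip and convert back to 8-bit.
result = np.clip(merged * 255, 0, 255).astype(np.uint8)
cv2.imwrite("fused.jpg", result)
```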

So now it's not just about having the best sensor that gathers the most light information. We're at the point where software makes a much bigger difference to how the image looks at the end of the day than anything else. Next time you watch a smartphone reveal event, for example, keep an eye on all the new additions that get made and just how many of them are pure software.

Google basically struck gold when they first started using the IMX363 sensor way back in the day with the Pixel 3's camera, because they got their software tuning with it just right and it was an instant smash hit. So they kept using that great camera combo in every Pixel since: the 3, the 3a, the 4, the 4a, the 5, the 5a, and even the Pixel 6a. Year after year of new phones, same sensor, same software tuning combo, because it just worked. If it ain't broke, don't fix it.

So when you saw the Pixel 6a win December's scientific blind smartphone camera test, what you saw was a four-year-old sensor-and-software-tuning combo that is still so good that, in a postage-stamp-sized comparison of compressed side-by-side images, where you can't really judge sharpness or depth of field too much and you're basically just appreciating the basics, this combo absolutely nailed the basics better than anyone else.
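
As an aside, the video doesn't spell out how those 20 million-plus pairwise votes were turned into a ranking, so treat this as an assumption: one common way to rank items from A-versus-B votes is an Elo-style update, sketched here:

```python
# Elo-style update from a single "photo A beat photo B" vote.
def elo_update(winner: float, loser: float, k: float = 32.0) -> tuple[float, float]:
    # Probability the winner was "expected" to win given current ratings.
    expected = 1.0 / (1.0 + 10 ** ((loser - winner) / 400.0))
    delta = k * (1.0 - expected)  # upsets move ratings more
    return winner + delta, loser - delta

# Example: every phone starts at 1500; replay all votes to build a ranking.
ratings = {"Phone A": 1500.0, "Phone B": 1500.0}
ratings["Phone A"], ratings["Phone B"] = elo_update(ratings["Phone A"], ratings["Phone B"])
print(ratings)  # Phone A climbs, Phone B drops
```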

Now, when the Pixel 6 came along, stay with me, Google finally updated their design and their branding, and they finally changed to a new sensor with this new camera system. They went from the tried-and-true 12 megapixels to this massive new 50-megapixel sensor, and it kind of threw a wrench into things. It looks to me like the Pixel is over-sharpening; I think the one on the left looks too crunchy. The camera on the Pixel 6 does have a habit of making things just look HDR-y, and I don't know if there's really a technical term for that. If you look at all the photos, it's clear the Pixel is still doing Pixel things. I think Google is still running all of their camera algorithms at 11, even when they don't need to anymore.

Right now, new phones with much bigger sensors are still processing like they're the smaller, older ones. The basic principle is this: they were doing all that processing on the old sensors as if they were not getting a lot of light, and then suddenly they had this massive new sensor getting way more light information, but they were still running all of it. They would still do the high-sensitivity stuff, which means noise reduction, because high sensitivity means noise. But then, since they're doing noise reduction, they need sharpening on top of that to make up for it. Overall, you're doing way too much, and the photos are literally over-processed.
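
Here is a rough, hypothetical illustration of that chain in OpenCV terms: denoise hard as if the sensor were tiny, then sharpen hard to compensate. Actual camera pipelines work in the RAW domain with far more sophisticated stages; this just shows why stacking the two steps too aggressively reads as "crunchy":

```python
import cv2

img = cv2.imread("photo.jpg")  # hypothetical input

# Step 1: noise reduction tuned as if the sensor were tiny and starved
# for light. A filter strength of 15 is aggressive; detail gets smeared.
denoised = cv2.fastNlMeansDenoisingColored(img, None, 15, 15, 7, 21)

# Step 2: heavy unsharp mask to win back the detail the denoiser ate.
blurred = cv2.GaussianBlur(denoised, (0, 0), 3)
crunchy = cv2.addWeighted(denoised, 1.8, blurred, -0.8, 0)

# On a big, clean sensor the right move is to back BOTH steps off
# (e.g. strength 3 and weights 1.2 / -0.2), not to run this chain at 11.
cv2.imwrite("overprocessed.jpg", crunchy)
```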

So this fancy new phone came out with a new camera system, but you could argue, legitimately, that the older Pixel still took better-looking photos. Google had to go back to the drawing board and work really hard, making adjustments and software updates to dial in this new sensor. It took a while, but now the Pixel 7 is out, a full year later, with the same huge 50-megapixel sensor, and they're back on track. And hey, would you look at that: the Pixel 7 finished right behind the Pixel 6a in the blind camera test.

So when I see iPhone 14 Pro photos looking a little inconsistent and a little over-processed right now, I actually see a lot of the same stuff Google just went through with the Pixel, because the iPhone story runs along the same lines. They used a small 12-megapixel sensor for years and years and years. The 13 Pro's sensor got a little bigger, but this year the iPhone 14 Pro is the first time they're bumping up to a dramatically larger 48-megapixel sensor. And guess what: some iPhone photos this year are looking a little too processed. It's nothing extreme, but it's real, and they will have to work on it. I suspect that by the time we get to the iPhone 15 Pro, you know, a year later, they'll have some new software stuff they're working on. We already have Deep Fusion and pixel binning and all this stuff; I bet there's one new word they use on stage to explain some software improvement with the camera. But anyway, I think this will keep improving with software updates over time, they'll get it dialed in, and I think it'll be fine.

But that's only half my theory. This doesn't explain why all the previous 12-megapixel iPhones also lost in the first round of all those other bracket-style tests, and that's a separate issue I'm actually a little more curious about. Because, as you might recall, all of our testing photos have been photos of me, and that was on purpose.

We specifically designed the tests to have as many potential factors for judging a photo as possible. If it were just a picture of this figurine in front of a white wall, the winner would probably be whichever one is brighter, or maybe whichever one has a better gold color. But if you shoot the figurine with some falloff in the background, now we're judging both color and background blur. Maybe you add a sky to the background; now you're also testing dynamic range and HDR. So yeah, with our latest photo, it's a lot: two different skin tones, two different-colored shirts, some textures for sharpness, the sky back there for dynamic range, short-range falloff on the left, long-range falloff on the right. With all these factors, whichever one people pick as the winner is ideally closer to the best overall photo.

I also wanted the pictures to be of a human, just because I feel like most of the important pictures people take, the ones they care most about, are of other humans. But as it turns out, using my own face as the subject revealed a lot about how different smartphones handle taking a picture of a human face. Because, as I've already mentioned, these smartphone cameras are so much software now that the photo you get when you hit that shutter button isn't so much reality as it is the computer's best interpretation of what it thinks you want reality to look like.

Each company goes to a different level, making different choices and different optimizations that change how their pictures look. They actually used to be a little more transparent about it: there were phones that would literally identify when you're taking a landscape photo and pump up any greens they could find in the grass, or identify any picture with a sky in it and pump up the blues to make it look nicer. I did a whole video on smartphone cameras versus reality that I'll link below the like button if you want to check it out. But the point is, when you snap that photo on your phone, you're not necessarily getting back a capture of what was really in front of you. They're really bending it in many ways.

The iPhone's thing is that when you take a photo, it likes to identify faces and evenly light them. It tries every time. And this feels like a pretty innocent thing, right? If you ask people what should look good in a photo and you say, "I'll evenly light all the faces in it," that sounds fine. And a lot of the time it looks fine. But it's a subtle thing.
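
Just to illustrate the concept (this is not Apple's actual pipeline, and the threshold and filenames here are made up), a crude sketch of "find a face and fill its shadows" with OpenCV's stock face detector might look like this:

```python
import cv2
import numpy as np

img = cv2.imread("portrait.jpg")  # hypothetical input
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# Stock Haar-cascade face detector that ships with OpenCV -- a stand-in
# for whatever (far better) face detection phone pipelines actually use.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Gamma < 1 lookup table brightens shadows more than highlights.
lut = np.array([255 * (i / 255) ** 0.6 for i in range(256)], dtype=np.uint8)

for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
    face = img[y:y + h, x:x + w]
    lifted = cv2.LUT(face, lut)
    # Feathered mask over just the dark pixels of the face: the fake
    # "bounce fill" lifts only the shadow side, flattening the lighting.
    luma = cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)
    mask = cv2.GaussianBlur((luma < 90).astype(np.float32), (31, 31), 0)
    mask = mask[..., None]  # broadcast over the 3 color channels
    img[y:y + h, x:x + w] = (face * (1 - mask) + lifted * mask).astype(np.uint8)

cv2.imwrite("evenly_lit.jpg", img)
```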

Take a photo where the light is clearly coming from one side. From the Pixel's camera, you can see there's a shadow on the right side of the face. With the iPhone, though, it's almost like someone walked up and added a little bounce fill, a really nice, subtle bounce fill. But sometimes it looks a little off. Look at this low-light photo from our blind camera test: on the left is the Pixel 7 again, which looks like all the other top dogs, and on the right is the iPhone 14 Pro, which finished in the middle of the pack. It might be hard at first to see why it looks so weird, but look at how they completely removed the shadow from half of my face. I am clearly being lit from a source to the side of me, and that's part of reality, but in the iPhone's reality you cannot tell, at least from my face, where the light is coming from. Every once in a while you get weird stuff like this, and it all comes back to the fact that it's software making choices.

The other half of that is skin tones. You've heard me say for a few years in a row that I mostly prefer photos coming from the Pixel's camera, and we've done lots of tests where I'm the sample photo and you can tell it looks really good. It turns out Google has done this thing over the past few years with the Pixel camera called Real Tone.

It doesn't get that much attention, but it turns out to be making a pretty big difference here. Historically, a real issue with film cameras back in the day was that they were calibrated for lighter skin tones, and people with darker skin tones would typically be underexposed in those pictures. Fast-forward to today: cameras are all software now, smartphone cameras especially, so they can all make adjustments to account for a variety of skin tones. But they still all do it to varying degrees. You might have noticed that a lot of phones sold in China will just brighten up faces across the board, because that's what people in that region very often prefer in photos. Google goes the extra mile and trains their camera software on data sets with a large variety of skin tones, to try to represent them correctly across the board, and that's what they call Real Tone. Apple's cameras, from what I've observed, simply like to evenly light faces across the board, and don't necessarily account for the different white balances and exposures necessary to accurately represent different types of skin tones, when I think they totally could.
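
Google hasn't published Real Tone's internals, so what follows is only a hypothetical sketch of the underlying idea: meter the face, but pick the exposure target from the subject's actual skin tone instead of pushing every face toward one fixed brightness:

```python
import cv2
import numpy as np

def face_exposure_comp(img_bgr: np.ndarray, face_box: tuple) -> tuple:
    """Compare two face-metering policies; returns (naive_ev, aware_ev) in stops."""
    x, y, w, h = face_box
    face = cv2.cvtColor(img_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    measured = max(float(face.mean()) / 255.0, 1e-6)

    # Naive policy: push every face toward one fixed brightness target.
    naive_ev = float(np.log2(0.5 / measured))

    # Skin-tone-aware policy (stand-in): derive the target from the face
    # itself -- here, its midtone -- so a darker or lighter complexion is
    # preserved rather than normalized away. A real system would get this
    # from a model trained on a diverse dataset, not from one median.
    target = float(np.clip(np.median(face) / 255.0, 0.2, 0.8))
    aware_ev = float(np.log2(target / measured))

    return naive_ev, aware_ev
```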

So basically, it turns out this is a big part of what we were observing: the phones that do accurately represent my skin tone finished higher in this blind voting thing we did, because they happen to do that really well, and that's something people really considered when they voted. I haven't said this a lot, but I think this is one of the earliest reasons I really liked RED cameras. Obviously the 8K is great and the color science is great, but the way they represent and render my skin tone accurately, compared to a lot of the Sonys and ARRIs and Canons I've tried, is actually one of the things that really drew me to those cameras.

All this software stuff is why photo comparisons between modern smartphones are so hard. There are a lot of channels that do a really good job with side-by-side photo tests, but even as you're trying to pick one over the other, you've probably noticed this: you might like the way one of them renders landscape photos, but the way a different one renders photos with your own skin tone, and the way yet another renders photos of your pet, for example. So I'm sure Apple will defend everything they're doing with their current cameras, as they typically do. But I'm also sure of something else I'll be keeping an eye on: they're for sure working on tuning these new cameras, dialing them in, and eventually getting it better with the iPhone 15 and 15 Pro.

So, back to the original question from the beginning of the video, because I can't leave that unanswered: all right, Marques, you like the Pixel's photos, the Pixel 6a won the blind scientific camera test, but you still gave the trophy for best overall camera system to the iPhone, the very 14 Pro we've been talking about this whole video. Why? If you've been listening carefully, you already have the answer: that scientific test we did tested one specific thing. It tested the small, postage-stamp-sized, general exposure-and-colors thing, with a bunch of different factors. But sharpness and detail, with all the compression we did, weren't tested. The speed and reliability of autofocus weren't tested. The open-close time of the camera app, and how fast and reliably you can get a shot, weren't tested. And video wasn't tested at all: microphone quality, video quality, speed and reliability of autofocus, file formats, sharpness, HDR, none of that was tested. Maybe someday we'll test all of it.
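
On the sharpness point: heavy compression destroys exactly the high-frequency detail a sharpness comparison needs. One quick, standard way to see that numerically (just an illustration with hypothetical files, not how anything was scored) is the variance of the Laplacian:

```python
import cv2

def sharpness(path: str) -> float:
    # Variance of the Laplacian: a standard quick focus/sharpness proxy.
    # Higher = more high-frequency detail survived.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())

# The compressed copy scores far lower even though it's "the same photo."
print(sharpness("original.jpg"), sharpness("compressed.jpg"))
```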

But until then, the lesson learned is that the pretty pictures that come from the Pixel, or whatever phone is in your pocket, are partly photons but primarily processing.

Thanks for watching. Catch you guys in the next one. Peace.
