
At this year's WWDC, Apple presented a great deal of what it is preparing for iOS 8, the new version of its mobile operating system. Not everything made it on stage, though, and some features Craig Federighi mentioned only very briefly, if at all. Developers are taking notice of these features, however, and this week they discovered one of them: manual camera control.

From the very first iPhone to the latest models, users have been used to everything in the Camera app happening automatically. Yes, it is possible to switch to HDR mode, and more recently to panorama or slow-motion mode, but when it came to exposure control, the options have so far been very limited – basically, we could only lock autofocus and exposure metering to one specific point.

This will change with the next mobile system – or at least it can be changed using third-party applications. While the built-in Camera, judging by the current form of iOS 8, will only gain exposure compensation (+/- EV), Apple will give third-party applications considerably more control.

A new API built around the AVCaptureDevice class will offer developers the ability to include the following settings in their apps: sensitivity (ISO), exposure time, white balance, focus, and exposure compensation. For design reasons, the aperture cannot be adjusted, as it is fixed on the iPhone – just like on the vast majority of other phones.
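For illustration, here is a minimal Swift sketch of what such manual control could look like. It uses the present-day Swift names of the AVCaptureDevice calls (the iOS 8 API was introduced in Objective-C) and a modern device-lookup call, and it assumes a back camera that supports custom exposure; the ISO and shutter speed are clamped to the ranges the active format reports:

```swift
import AVFoundation

/// Locks exposure to an explicit shutter speed and ISO on the back camera.
/// Values are clamped to what the active camera format allows.
func setManualExposure(durationSeconds: Double, iso: Float) {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back),
          device.isExposureModeSupported(.custom) else { return }

    let format = device.activeFormat
    let clampedISO = min(max(iso, format.minISO), format.maxISO)
    let requested = CMTime(seconds: durationSeconds, preferredTimescale: 1_000_000)
    let clampedDuration = min(max(requested, format.minExposureDuration),
                              format.maxExposureDuration)

    do {
        try device.lockForConfiguration()
        // Custom exposure mode hands shutter speed and ISO over to the app.
        device.setExposureModeCustom(duration: clampedDuration,
                                     iso: clampedISO,
                                     completionHandler: nil)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock camera for configuration: \(error)")
    }
}
```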

Sensitivity (also known as ISO) refers to how strongly the camera sensor responds to incoming light. A higher ISO lets us shoot in poorer lighting conditions, but on the other hand we have to reckon with increasing image noise. An alternative is to increase the exposure time, which allows more light to reach the sensor; the downside is the risk of blur (a longer exposure is harder to hold steady). White balance sets the color temperature, i.e. whether the entire image tends towards blue or yellow (and, on a second axis, towards green or magenta). Exposure compensation, finally, lets you tell the device that it is misjudging the brightness of the scene and have its automatic metering shifted accordingly.
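The last two settings can be sketched in Swift as well. The snippet below applies an EV correction on top of automatic metering and locks white balance to a chosen color temperature; it assumes an already-obtained AVCaptureDevice and clamps both values to the ranges the device reports:

```swift
import AVFoundation

/// Applies an EV bias and locks white balance to a colour temperature (in kelvin).
func applyCorrections(bias: Float, kelvin: Float, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()

        // Exposure compensation: the device keeps metering automatically,
        // but its target brightness is shifted by `bias` EV.
        let clampedBias = min(max(bias, device.minExposureTargetBias),
                              device.maxExposureTargetBias)
        device.setExposureTargetBias(clampedBias, completionHandler: nil)

        // White balance: convert a colour temperature into per-channel gains
        // and clamp each gain to the valid range before locking it in.
        if device.isWhiteBalanceModeSupported(.locked) {
            let values = AVCaptureDevice.WhiteBalanceTemperatureAndTintValues(
                temperature: kelvin, tint: 0)
            var gains = device.deviceWhiteBalanceGains(for: values)
            let maxGain = device.maxWhiteBalanceGain
            gains.redGain   = min(max(gains.redGain,   1), maxGain)
            gains.greenGain = min(max(gains.greenGain, 1), maxGain)
            gains.blueGain  = min(max(gains.blueGain,  1), maxGain)
            device.setWhiteBalanceModeLocked(with: gains, completionHandler: nil)
        }

        device.unlockForConfiguration()
    } catch {
        print("Could not lock camera for configuration: \(error)")
    }
}
```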

The documentation of the new API also mentions so-called bracketing, i.e. automatically taking several pictures in a row with different exposure settings. This is useful in difficult lighting conditions, where there is a high chance of a badly exposed shot, so it is better to take, say, three pictures and then pick the best one. Bracketing is also used in HDR photography, which iPhone users already know from the built-in Camera app.
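As a rough sketch of how an app might request such a bracket, the snippet below asks for three auto-exposure frames at -2, 0 and +2 EV. It uses the current AVCapturePhotoOutput bracketing API rather than the iOS 8-era still-image-output call, and assumes an already-configured capture session and a delegate that collects the resulting photos:

```swift
import AVFoundation

/// Requests a three-shot auto-exposure bracket from a configured photo output.
/// The delegate receives one photo per bracketed setting.
func captureExposureBracket(with output: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    // One entry per frame; the device meters normally and shifts each frame
    // by the given EV bias.
    let biases: [Float] = [-2, 0, 2]
    let bracketed = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    guard bracketed.count <= output.maxBracketedCapturePhotoCount else { return }

    // A raw pixel format of 0 and a nil processed format mean "default JPEG".
    let settings = AVCapturePhotoBracketSettings(rawPixelFormatType: 0,
                                                 processedFormat: nil,
                                                 bracketedSettings: bracketed)
    output.capturePhoto(with: settings, delegate: delegate)
}
```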

Source: AnandTech, CNET