Apple's WWDC was a quiet affair for me, with no new hardware announced and few new features beyond the Liquid Glass interface for iOS 26. The iPhone 16 Pro already packs one of the best camera setups found on any phone, capable of taking superb images in almost any situation. It's also a powerful video shooter, with ProRes video, log recording and crisp 4K slow motion. It even put up a tough fight against the other best camera phones, including the Galaxy S25 Ultra, Pixel 9 Pro and the Xiaomi 14 Ultra.
Read more: Camera champions face off: iPhone 16 Pro vs. Galaxy S25 Ultra
Still, it's not the perfect camera. And while early reports from industry insiders suggest the phone's video skills will get a boost, the iPhone 17 will need more than that to be an all-round photography powerhouse. As a longtime phone reviewer and a professional photographer, I have high expectations of top-end phone cameras. And after using the iPhone 16 Pro since its launch, I have some thoughts on what needs to change.
Here are the key areas I want to see improved on the iPhone 17 when it likely launches in September 2025.
An accessible pro camera mode
At WWDC, Apple showed off changes coming in iOS 26, including a fundamental overhaul of the interface with Liquid Glass. That simplified style extends to the camera app too, with Apple paring the interface back to the most basic photo, video and zoom controls. The idea is that even the most novice of photographers can open the camera and snap away for Instagram.
The new camera app is incredibly bare-bones.
And that's fine, but what about those of us who buy the Pro models to take advantage of features like exposure control, Photographic Styles and ProRaw formats? It's not yet clear how these features will be accessed in the new camera interface, but they shouldn't be buried out of reach. Many photographers, myself very much included, want to use these tools as standard and use our powerful iPhones in much the same way we use mirrorless cameras from Canon or Sony.
That means leaning on advanced settings to take control of the photo-taking process and craft shots that go beyond simple snaps. If anything, Apple's camera app has always been too simple, with even basic functions like white balance unavailable. It's disappointing to see Apple simplify things even further, and I want to see how the company will keep these phones useful for passionate photographers.
A larger image sensor
The 1/1.28-inch sensor on the iPhone 16 Pro's main camera is already a decent size, and slightly larger than the Galaxy S24 Ultra's 1/1.33-inch sensor, but I want to see Apple go bigger. A larger image sensor can capture more light and offer better dynamic range. It's why pro cameras have at least a full-frame image sensor, while truly high-end cameras, like the stunning Hasselblad 907X, have even bigger medium-format sensors for pristine image quality.
Even on pro cameras, sensor size matters. The full-frame image sensor in the middle is dwarfed by the medium-format sensor on the right. Phone camera sensors don't come close to these sizes.
Xiaomi understands this, equipping the 15 Ultra and the previous 14 Ultra with 1-inch sensors. That's larger than the sensors found on almost any other phone, and it allowed the 15 Ultra to take superb images all over Europe, while the 14 Ultra did a valiant job of capturing Taylor Swift in concert. I want to see Apple at least match Xiaomi's phones with a similar-sized sensor. And while we're making sky-high wishes, the iPhone 17 could be the first smartphone with a full-frame image sensor. I won't hold my breath, though; the phone and its lenses would need to be so big to accommodate it that you might as well just call it a mirrorless camera.
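To put those sensor sizes in rough perspective, here's a back-of-the-envelope comparison. It's a minimal sketch that leans on the loose industry convention that a "1/x-inch type" designation corresponds to a diagonal of roughly 16/x millimeters (actual sensor dimensions vary by model) and assumes light-gathering area scales with the square of the diagonal.

```python
# Rough comparison of sensor sizes, using the loose convention that a
# "1/x-inch type" sensor has a diagonal of about 16/x mm. Actual sensors
# vary, so treat these as ballpark figures only.
SENSOR_DIAGONALS_MM = {
    "iPhone 16 Pro main (1/1.28-inch type)": 16 / 1.28,    # ~12.5 mm nominal
    "Galaxy S24 Ultra main (1/1.33-inch type)": 16 / 1.33, # ~12.0 mm nominal
    "Xiaomi 14/15 Ultra main (1-inch type)": 16.0,         # ~16 mm nominal
    "Full frame (36 x 24 mm)": 43.3,
}

baseline = SENSOR_DIAGONALS_MM["iPhone 16 Pro main (1/1.28-inch type)"]
for name, diag in SENSOR_DIAGONALS_MM.items():
    # For similar aspect ratios, area (and light gathered) scales with diagonal squared.
    print(f"{name}: ~{(diag / baseline) ** 2:.1f}x the area of the iPhone 16 Pro sensor")
```

Even with these rough numbers, a 1-inch type sensor gathers well over half again as much light as the iPhone's, and full frame is an order of magnitude bigger still.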
Variable aperture
Speaking of the Xiaomi 14 Ultra, another reason that phone excels for photography is the variable aperture on its main camera. At its widest it's f/1.6, noticeably wider than the iPhone 16 Pro's f/1.78. That wider aperture lets in a lot more light in dim conditions and produces more natural-looking background blur around your subject.
Outside this pub, the streetlight has been turned into an attractive starburst, thanks to the Xiaomi 14 Ultra's variable aperture.
But the Xiaomi 14 Ultra's aperture can also be closed down to f/4, and at that tighter aperture it creates starbursts around points of light. I love achieving this effect in nighttime images taken with the phone. The resulting shots look much more like they were taken with a professional camera and lens, whereas the same points of light on the iPhone simply look like round blobs. Disappointingly, Xiaomi actually removed this feature from the newer 15 Ultra, so it remains to be seen whether Apple sees the value in implementing such technology.
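The light-gathering difference between those f-numbers is easy to quantify: the light an aperture admits scales with 1/N² for an f-number N. A minimal sketch of that arithmetic (no phone-specific assumptions beyond the f-numbers quoted above):

```python
def light_ratio(n_wide: float, n_narrow: float) -> float:
    """How many times more light the wider aperture (smaller f-number) admits."""
    return (n_narrow / n_wide) ** 2

# Xiaomi's f/1.6 vs. the iPhone 16 Pro's f/1.78: roughly 24% more light.
print(f"f/1.6 vs f/1.78: {light_ratio(1.6, 1.78):.2f}x")

# Wide open at f/1.6 vs. stopped down to f/4 for starbursts: 6.25x less light reaches the sensor.
print(f"f/1.6 vs f/4:    {light_ratio(1.6, 4.0):.2f}x")
```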
More Photographic Styles
Apple has built various styles and effects into its iPhone cameras for some time, but the iPhone 16 range pushed things further, with more control over the effects and more toning options. So much so that CNET senior editor Lisa Eadicicco even declared the new Photographic Styles her "favorite new feature on Apple's latest phone."
I think they're great too. Or rather, they're a great start. The different color tones, like those you get with the Amber and Gold styles, add some beautiful warmth to scenes, and the Quiet effect adds a vintage, film-like matte look. But there still aren't that many of them, and the interface can be a bit slow to work with. I'd love to see Apple introduce more Photographic Styles with different color toning options, or even tones that mimic vintage film stocks from Kodak or Fujifilm.
I like the warm tones created by the iPhone's Amber style in this image, but I'd definitely like to see more options for getting creative with color tones.
Sure, there are plenty of apps like VSCO or Snapseed that let you play around with all the color filters you could want. But using Apple's styles means you can take your photos with a look already applied, then change it afterwards if you don't like it; nothing is baked permanently into your image.
I was recently impressed by Samsung's new tool for creating custom color filters based on the look of other images, and I'd love to see Apple bring that sort of creative option to the iPhone.
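For the curious, the general idea behind building a look from a reference image can be sketched with simple statistics. The snippet below is a minimal, illustrative Reinhard-style color transfer that matches each channel's mean and spread to the reference; it is not how Samsung's or Apple's tools actually work, and production implementations typically operate in a perceptual color space such as Lab rather than raw RGB.

```python
import numpy as np

def transfer_look(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Give `source` the overall color character of `reference` by matching the
    per-channel mean and standard deviation. Both arrays are float RGB in [0, 1],
    shape (H, W, 3)."""
    out = np.empty_like(source)
    for c in range(3):
        s, r = source[..., c], reference[..., c]
        out[..., c] = (s - s.mean()) / (s.std() + 1e-6) * r.std() + r.mean()
    return np.clip(out, 0.0, 1.0)
```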
Better ProRaw integration with Photographic Styles
I think Apple has missed a trick with its Photographic Styles, though, as you can only use them when shooting in HEIF (High Efficiency Image Format). Unfortunately, you can't use them when shooting in ProRaw. I loved using ProRaw on Apple's previous iPhones, as it takes advantage of all of the iPhone's computational photography, including things like HDR image blending, but still outputs a DNG raw file for easier editing.
A DNG file typically offers much more latitude to brighten dark areas of an image or tone down blown-out highlights, making it extremely versatile. Previously, Apple's color presets could be used while shooting in ProRaw, and I loved that. I'd often shoot street-style images using the high-contrast black-and-white mode and then edit the raw file further.
I shoot a lot of my street photography in black and white, and I'd love more flexibility to take ProRaw shots in monochrome.
Now, getting the same black-and-white look means shooting images only in the HEIF format, which eliminates the benefits of using Apple's ProRaw. Oddly, while the older-style "filters" are no longer available in the camera app when taking a raw photo, you can still apply those filters to raw images in the iPhone's Photos app via the editing menu.
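That editing latitude is the whole appeal of raw. As a rough illustration of pulling up shadows in a DNG outside the Photos app, here's a minimal sketch using the rawpy and imageio Python libraries. The file name is hypothetical, and how cleanly a given Apple ProRaw DNG opens depends on the underlying LibRaw version; treat this as a concept sketch rather than a guaranteed workflow.

```python
import rawpy              # LibRaw wrapper; ProRaw DNG support varies with LibRaw version
import imageio.v3 as iio

# "IMG_0001.dng" is a hypothetical ProRaw file exported from the iPhone.
with rawpy.imread("IMG_0001.dng") as raw:
    # Lift the exposure during processing; the extra bit depth in a raw file
    # keeps shadows usable where an 8-bit HEIF would fall apart.
    rgb = raw.postprocess(
        use_camera_wb=True,   # keep the white balance the camera recorded
        no_auto_bright=True,  # disable automatic brightening
        bright=1.5,           # manual push of roughly half a stop
        output_bps=16,        # 16-bit output preserves editing headroom
    )

iio.imwrite("IMG_0001_lifted.tiff", rgb)
```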
LUTs for ProRes video
And while we're on the subject of color presets and filters, Apple needs to bring them to video too. On the iPhone 15 Pro, Apple introduced the ability to shoot video in ProRes Log, which results in very low-contrast, almost gray-looking footage. The idea is that video editors take this log footage and then apply their own adjustments, often contrast and color presets known as LUTs (look-up tables), which give the footage a particular look; think dark, blue tones for moody films.
But Apple doesn't offer any kind of LUTs on the iPhone itself, beyond simply boosting the contrast, which doesn't really do the job properly. Sure, the whole point of log footage is that you take it off the iPhone, drop it into software like DaVinci Resolve and then grade it properly so it looks polished and professional.
Ungraded log footage looks flat and washed out. Apple needs to offer ways to do more with ProRes Log files on the iPhone itself.
But that still leaves the files sitting on your phone, and I'd love to be able to do more with them there. My gallery is full of ungraded video files I'll do very little with, because they need to be color graded externally first. I'd love to be able to transform these files from flat and gray into something vibrant, then share them with my family or on Instagram or WhatsApp.
With the iPhone 17, or even on the iPhone 16 as a software update, I want to see Apple offer a range of its own LUTs that can be applied directly to video files on the iPhone. We didn't see this kind of functionality as part of the company's June WWDC keynote, but that doesn't mean it can't launch alongside the iPhone in September.
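To show what applying a LUT actually involves, here's a minimal Python sketch that parses a basic .cube file and applies it to a frame with nearest-neighbor lookup. It's purely illustrative of the concept; real grading tools (and presumably anything Apple might ship) use trilinear or tetrahedral interpolation and are tuned to the camera's specific log curve.

```python
import numpy as np

def load_cube_lut(path: str):
    """Parse a minimal .cube 3D LUT file. Returns (size, lut) with lut shaped
    (size, size, size, 3); .cube data is listed with red varying fastest."""
    size, rows = 0, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.upper().startswith("LUT_3D_SIZE"):
                size = int(line.split()[-1])
            elif line[0].isdigit() or line[0] == "-":
                rows.append([float(v) for v in line.split()[:3]])
    return size, np.array(rows, dtype=np.float32).reshape(size, size, size, 3)

def apply_lut(frame: np.ndarray, size: int, lut: np.ndarray) -> np.ndarray:
    """Nearest-neighbor LUT lookup. `frame` is float RGB in [0, 1], shape (H, W, 3)."""
    idx = np.clip(np.rint(frame * (size - 1)).astype(int), 0, size - 1)
    # Red varies fastest in .cube files, so the reshaped axes are (blue, green, red).
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]
```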
If Apple manages to implement all of these changes (except, perhaps, the full-frame sensor, which even I can admit is a stretch), it will have an absolute beast of a camera on its hands.


