The likely logo for iOS 19
WWDC is on the horizon, and it will be where the changes in iOS 19 are revealed. Here's what we want Apple to include as part of the iPhone's operating system update.
Apple will hold its annual Worldwide Developers Conference from June 9 to June 13. It's the event where Apple talks in detail about the new features and changes coming to its various operating systems, with iOS as its main talking point.
That often includes tweaks to existing iOS features, as well as entirely new functions and even major strategic shifts. The introduction of the Apple Intelligence initiative at WWDC 2024 was huge news at the time, and it still is, even if the rollout hasn't gone to plan.
Much of what Apple announces during WWDC won't be usable by the public until the fall, arriving alongside the iPhone 17 collection. Even then, not everything will be available at that point, with long-anticipated features often rolling out in the months that follow.
However, Apple's beta testing program will make some elements immediately available to developers. Shortly after Monday's keynote, Apple tends to release the first developer beta, giving a hint of what to expect in September.
While Apple has its own internal list of things it will add to iOS 19, there are plenty of items iOS users will want to push Apple toward. AppleInsider included.
Here's what we want Apple to change in iOS 19, even if it's not necessarily what we'll actually get.
More Apple Intelligence, faster
It's all but certain that Apple Intelligence will be a big WWDC feature. It was a huge part of the 2024 event, and it's something Apple intends to build on.
However, it has a huge amount of catching up to do. Not only with the market as a whole, but with its own goals.
Apple has been criticized throughout the year over its glacial rollout of Apple Intelligence. Many elements were hit by delays, to the point that the Better Business Bureau stepped in to complain about how Apple was promoting it all.
An example of a contextual request that Siri will, eventually, be able to answer. Image Credit: Apple
Then there's Siri to discuss, since Apple promised a revamped version that could understand context to a high degree. We were told Siri would one day understand questions like "When does mom's flight arrive at the airport?" But that's still not a reality.
Apple has taken steps to correct the situation, including an internal shakeup of how the Siri and Apple Intelligence efforts are managed. It is also reportedly trying to avoid promoting future features until they are close to launch.
We can expect an Apple mea culpa on the whole Apple Intelligence and Siri rollout during WWDC. We will probably also get assurances that updates will arrive quickly and steadily.
But we can equally expect Apple to promote where Apple Intelligence will go in the future, perhaps by opening it up to third-party apps.
Let's just hope that everything it says actually ends up shipping.
Continuity Camera, but for iPadOS
Continuity Camera is an extremely useful part of the Apple ecosystem, especially when it comes to video. The iPhone's rear camera is fantastic, and better than many webcams on the market.
For Mac users who don't have the luxury of owning a MacBook Air or MacBook Pro with a camera built into the display, it also saves them from having to buy a webcam in the first place.
It's very useful for video needs on the Mac. So why not apply the same concept elsewhere, such as on the iPad?
You can use Continuity Camera to use an iPad's or iPhone's rear camera on a Mac, but you can't use the iPhone's rear camera in the same way with an iPad.
Continuity Camera turns your iPhone into a Mac webcam.
The iPad has perfectly adequate front and rear cameras of its own, of course, so extending the Continuity Camera features available to macOS apps doesn't seem strictly necessary.
However, it could be useful in cases where you're recording video on the iPhone. You can record on the iPhone and then transfer the footage to the iPad, but you largely have to set everything up on the iPhone itself.
The ability to remotely control an iPhone's camera from another device, whether another iPhone or an iPad, would be very convenient.
Admittedly, you have this to some extent with Final Cut Camera and its use with Live Multicam in Final Cut Pro for iPad. But not everyone wants to pay for a Final Cut Pro subscription, or even to use that tool at all.
Having it as a standalone feature could be a boon for YouTubers looking to upgrade their productions.
And no, previewing the camera's view on an Apple Watch simply isn't enough.
Camera settings in the Camera app
While you can easily access many settings from within the Camera app, there is a lot more you could change. The problem is that those extra options are buried in the Settings app.
Adjusting the camera's zoom settings in iOS
Want to turn the grid or level indicator on or off, or stop the front camera from mirroring your image? Those are options in Settings.
There's also no way to change your capture formats or adjust how photos are taken without going into Settings. If you want to disable lens correction or macro control, you have to do that there as well.
The request here isn't necessarily to add all of these as options within the Camera app itself. We would be satisfied with a control in the Camera app that takes you straight to the camera page in Settings.
For those who spend a lot of time on photography and video, and take it seriously, this is an addition that would be extremely useful.
More button settings
The introduction of the Action button did more than just replace the fiddly mute switch with something pressable. It opened up a world of customization, since you can configure it to open one of many apps, as well as to act in various other ways.
While it's one of the most configurable physical controls on the iPhone, there are relatively few other options available.
The Action button's settings.
Chief among them are the abilities to switch the Lock Screen buttons to other tools, and to turn the volume-up button into a shutter button in the Camera app. If you have a newer iPhone with Camera Control, there are a few more options, but it's still very limited.
What would be nice is if you could make more changes to how the buttons respond to different presses. For example, setting the volume buttons to open apps with a long press, but to actually change the volume only with short presses.
Providing a wider range of options for each button press would also be welcome. Sure, you can configure Camera Control to open the Camera app, a QR code scanner, or Magnifier, but why not offer as many options as the Action button's settings?
Admittedly, this could interfere with some existing press-based functions. But it would be good to have the options available if we really want them.
Split-screen iPhone
Let's face it, iPhone screens are quite big, especially if you go for a model like the iPhone 16 Pro Max. With a 6.9-inch display and a similarly high resolution, there's a lot of digital real estate to play with.
You could say the same about the 6.3-inch Pro model, or even the 6.1-inch iPhone 16e. That's not even taking into account the potentially large display of the long-rumored iPhone Fold.
With screen sizes not actually that far off the 8.3-inch panel of the iPad mini, it's safe to say that iPhone users have a lot of display to work with. So much so that it should be used more efficiently.
Using multiple apps at the same time on an iPad.
The iPad and iPad Pro have features such as Stage Manager, Split View, and Slide Over so users can work with two apps at once. Given the processing performance and display size on offer, surely some form of split-screen mode could be brought to iOS.
You can technically work with two apps at the same time, with Picture in Picture letting you watch a video on top of another app. That's fine, but it would be nice to be able to fully use two apps on screen simultaneously.
It's easy to imagine an app taking up the top or bottom half of the iPhone's display while another is in use. It's probably not that hard to make it a reality.
Clipboard manager
The Apple ecosystem shines, especially when it comes to moving data from one device to another. It's trivial to use Universal Clipboard to copy text or an image on a Mac into a message you're writing on the iPhone.
That's great for the most part, until you need to handle a few bits of data at the same time. You can only copy and paste one thing at a time on Apple hardware, whether it's a Mac, iPad, or iPhone.
To be fair, this isn't really a problem for iPhone users. Unlike on the iPad or Mac, you won't necessarily be doing as much productivity work on the smaller screen, so there's less need to juggle multiple clipboard items.
Despite that, it would be genuinely useful to have some form of clipboard manager that can hold multiple items at once. Sure, it would be more useful on larger-screened hardware than the iPhone, but if it's made for those, it may as well come to iOS for feature parity.
You can get clipboard managers for macOS, though there are relatively few available on iOS. Paste is one of the main options for iPhone users.
It's the sort of feature you would expect Apple to implement in its ecosystem somehow.
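For a sense of what such a tool involves, a clipboard history is little more than a small buffer of recent copies where the newest entry doubles as today's single clipboard. Here is a minimal Swift sketch of the concept; the `ClipboardHistory` type and its methods are hypothetical illustrations, not an Apple API:

```swift
import Foundation

// A conceptual multi-item clipboard: a fixed-capacity history where
// the most recent copy is always the "current" item.
struct ClipboardHistory {
    private(set) var items: [String] = []
    let capacity: Int

    init(capacity: Int = 10) {
        self.capacity = capacity
    }

    // Copying pushes the new item to the front, evicting the oldest
    // entry once the history is full.
    mutating func copy(_ item: String) {
        items.insert(item, at: 0)
        if items.count > capacity {
            items.removeLast()
        }
    }

    // Pasting with no index behaves exactly like today's one-slot
    // clipboard; an index reaches further back into the history.
    func paste(at index: Int = 0) -> String? {
        guard items.indices.contains(index) else { return nil }
        return items[index]
    }
}

var clipboard = ClipboardHistory(capacity: 3)
clipboard.copy("first")
clipboard.copy("second")
clipboard.copy("third")
print(clipboard.paste() ?? "")       // the newest copy
print(clipboard.paste(at: 2) ?? "")  // the oldest item still held
```

A real implementation would store rich pasteboard items rather than strings, but the default-index design shows why such a feature could ship without breaking anyone's copy-and-paste habits.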
App Library on the Home Screen
Apple has, over successive iOS releases, increased the amount of personalization on offer. Users can currently add widgets of various sizes, introduce gaps and place icons wherever they want, and even customize the color scheme of the icons themselves.
That's all well and good, but Apple could take it a step further.
Imagine App Library folders on the main Home Screen pages.
If you swipe past the far-right Home Screen page, you're greeted by the App Library. This section sorts your apps into categories, and does a reasonably good job of it.
Each category square is actually a folder, holding three or four frequently used apps from that category, or three apps plus a quadrant of tiny icons representing the rest of the folder. Tap that, and you see the full list for the category.
This style of folder would be very useful on the main Home Screen pages, as an alternative to the existing folder setup, which requires you to enter the folder to select an app.
It's a little redundant, since you could treat each Home Screen page as a folder in its own right, but it would give users more ways to organize their icons.
For users like the AppleInsider editorial team, who have hundreds of apps installed, we need all the help we can get.
Food logging
Apple has a deep interest in the health of its users. With the Apple Watch and the Health app monitoring fitness and users' vitals, it already does a lot to promote a healthier lifestyle.
Indeed, there have been rumors of changes coming to the Health app in the form of an artificial intelligence agent. One that could offer feedback on ways to improve your health, much like a coach.
However, one area Apple could take on is food tracking. It's one thing to have tools for monitoring how much a person exercises, but it would be useful to track the dietary side of things too.
Admittedly, the Health app does have a nutrition section built into it. Except it doesn't offer a way to log food itself, since it relies on third-party services such as MyFitnessPal.
A food tracking feature would help fill out that category, as well as give any potential AI coach more data to work with.
Automatic Memoji creation
At the moment, Apple Intelligence includes several image generation elements. Give it some prompts and you can get custom Genmoji, or even entire images.
Image Playground, one of the most prominent elements of Apple Intelligence, also uses user-supplied images as the basis for its creations.
If Image Playground can recreate a person's face from a photo, that capability could be extended to another area.
Setting up a Memoji on the iPhone
Memoji is an extension of Animoji, letting users create 3D facial avatars. The process uses a menu system in which you choose each element of the face from a selection of components, resulting in anything from a broadly accurate caricature to something wild, depending on your choices.
With that in mind, it seems plausible that Apple's system for recognizing and recreating a face in Image Playground could be applied to Memoji. One where the user supplies an image of the person to create, and Memoji automatically selects the most suitable facial features.
Even better, imagine updating your Memoji by telling iOS to refresh the model it has of you, and it does so by scanning your face.
Sure, there's a lot of fun in picking the components yourself, but not everyone wants to fuss around when they just want to quickly create a computerized version of their face.