
WWDC 2025 visionOS 3 wishlist: What we'd like to see next for Apple Vision Pro

visionOS 3 could bring some big updates to Apple Vision Pro


The debut of visionOS 3 will be our first glimpse at what the annual update cycle for Apple Vision Pro software looks like. Here are some of our hopes for the platform.

There are calls for a lighter, cheaper Apple Vision Pro, but Apple can't deliver that through a software update. What it can deliver are the many quality-of-life updates that could help boost the platform.

When Apple Vision Pro launched in February 2024, it ran visionOS 1.0, shaped by Apple's notion of what the platform should be. After months of use beyond Apple's own engineers, the company was able to address many problems that hadn't surfaced in beta testing.

visionOS 2, released just a few months later after WWDC 2024, introduced a new way to access Control Center, volume, and other settings. Before that, it was a literal eye-rolling experience trying to reach the controls you needed.

Many of the coming changes depend on how Apple views Apple Vision Pro and its level of importance in the lineup. The company is focused on getting Apple Intelligence off the ground, not to mention a rumored redesign across the ecosystem, so visionOS could take a back seat in 2025.

More native Apple apps

The most basic and obvious item on this wish list is more native apps. Apple hasn't converted any of its iPad-compatible apps into native visionOS apps since the device launched, though it has introduced some entirely new native apps.

This screenshot was taken in visionOS 1.0, and nothing has changed

The following apps still run in iPad-compatible mode:

  • Clock
  • Calendar
  • Shortcuts
  • Photomator
  • News
  • Maps
  • Numbers
  • Pages
  • Books
  • Stocks
  • Voice Memos

It's quite the list considering I use nearly all of these apps on a regular basis. Without Apple setting a good example of how it thinks apps should look and behave, not many developers have followed suit.

We're a year past visionOS going public, but Apple had also worked on the product for many months before launch. Apple Vision Pro was first unveiled two years ago this June, so it seems inexcusable that Apple's apps aren't all native, even with minimal design and support.

Clock, Calendar, Reminders, and Home seem like excellent candidates for apps that support every interaction type: objects, windows, and spaces. Shortcuts could go even further, letting users choose their own USDZ file as an anchor for interaction.

For example, I have a USDZ file of an animated fish, a small blue Pokemon, and it would be great to attach a shortcut to that object. Let me look at it and tap to trigger any number of shortcuts tied to clipboard actions, or even just play a random episode of Pokemon on YouTube.
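Shortcuts doesn't support this today, but the building blocks already exist for third-party apps. A minimal sketch of the idea in a volumetric SwiftUI scene, assuming a bundled `fish.usdz` model and a user shortcut named `Play Pokemon` (both placeholder names of my own), might look like this:

```swift
import SwiftUI
import RealityKit

// Sketch only: a volumetric window that shows a USDZ model and runs a
// shortcut when the user looks at the model and pinches.
// "fish" (fish.usdz in the app bundle) and "Play Pokemon" are placeholders.
struct ShortcutFigureView: View {
    @Environment(\.openURL) private var openURL

    var body: some View {
        RealityView { content in
            if let model = try? await Entity(named: "fish") {
                // Make the entity a target for gaze-and-pinch input.
                model.components.set(InputTargetComponent())
                model.components.set(CollisionComponent(
                    shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
                content.add(model)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { _ in
                    // Shortcuts' existing URL scheme runs a shortcut by name.
                    let name = "Play Pokemon".addingPercentEncoding(
                        withAllowedCharacters: .urlQueryAllowed) ?? ""
                    if let url = URL(string: "shortcuts://run-shortcut?name=\(name)") {
                        openURL(url)
                    }
                }
        )
    }
}
```

The system-level version I'm wishing for would skip the app entirely and let the Shortcuts app itself host the object.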

The Pokemon rendered in 3D as a small figure I can place anywhere in visionOS

There's a lot Apple could do with these apps. Let me place a physical book on my table and tap it to pull up reference material in the Books app, have reminders appear as small sticky notes I can stick to app windows or objects, or show a virtual thermostat I can mount on my wall.

That leads me to my next idea for Apple Vision Pro: immersive environments.

New immersive environments and controls

I love Apple's immersive environments. You can instantly work on a beach, near a cliff's edge, or beneath a massive mountain.

New immersive environments would be great, but third-party access would be even better

There are several ways I think Apple should build on these spaces. Most obviously, Apple should open immersive environments up to third parties so they can exist outside of individual apps.

It would also be interesting if users could design and load their own spaces. Perhaps Apple or a third party could make an immersive environment builder and let users save and use their creations; think about how wallpapers work today.

There's really no good place to mention this, so I'll stick it here: Apple desperately needs to support detecting third-party keyboards so they pass through into environments. Perhaps that could tie into my next idea.

This may not bother anyone but me, and it may entirely miss Apple's intended use of Apple Vision Pro. Still, one of the first things I noticed about working in these virtual environments is a little odd: the lack of a desk or any physical surfaces.

A cubicle simulator probably shouldn't be an immersive environment

Now, I'm not saying there should be a cubicle environment, though I wouldn't be against it as a kind of gag. What I would like to see is the ability to add anchored surfaces to spaces.

For example, there are three desks in my office, and I know Apple Vision Pro knows where they are and where the walls behind them are. I'm suggesting it let me place virtual desks in the exact physical locations of the real ones and use them to hold 3D objects or anchor windows.
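RealityKit's anchoring system already exposes the headset's detected tables and walls to apps, so a rough sketch of what a third-party version of this could look like today (the sizes and placeholder meshes are my own assumptions):

```swift
import SwiftUI
import RealityKit

// Sketch only: an immersive view that drops a virtual "desk mat" onto
// a detected real table and a widget panel onto a detected wall.
struct AnchoredSurfacesView: View {
    var body: some View {
        RealityView { content in
            // Anchor that tracks a horizontal plane classified as a table,
            // at least 0.5m x 0.5m in size.
            let deskAnchor = AnchorEntity(
                .plane(.horizontal,
                       classification: .table,
                       minimumBounds: [0.5, 0.5]))

            // Placeholder slab; a real app would load a USDZ model here.
            let mat = ModelEntity(
                mesh: .generateBox(width: 0.4, height: 0.005, depth: 0.3),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)])
            deskAnchor.addChild(mat)
            content.add(deskAnchor)

            // Same idea for a wall-mounted widget panel.
            let wallAnchor = AnchorEntity(
                .plane(.vertical,
                       classification: .wall,
                       minimumBounds: [0.3, 0.3]))
            let widget = ModelEntity(
                mesh: .generatePlane(width: 0.3, height: 0.2),
                materials: [SimpleMaterial(color: .white, isMetallic: false)])
            wallAnchor.addChild(widget)
            content.add(wallAnchor)
        }
    }
}
```

The missing piece is a system-level version of this: surfaces that persist across apps and inside Apple's own environments, which only Apple can provide.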

Look back at my idea for native Apple apps. If Apple Books let me place a virtual book on a desk, I could glance to the right and see a book permanently tied to that desk, then select it to open the book's window to my saved page.

I could place other objects on desktop surfaces too, say a physical-looking clock for the Clock app, or a virtual notebook that opens a specific note. You could even take it a step further and pin a virtual calendar over your physical desk calendar.

Surfaces shouldn't be limited to desks. Apple should let users have walls in their virtual spaces. Imagine looking to your left and seeing a virtual wall where you could place calendar entries and other widgets.

Apple should build on its immersive environments and object anchoring

I imagine something like how The Sims handles walls. Let me design the space, raise walls, hide walls, and so on. It could make it seem like I'm in a room at a desk, looking through a floor-to-ceiling window out at Bora Bora, rather than sitting right on the sand itself.

Maybe this is getting too skeuomorphic, but it seems like the whole idea of spatial computing. Imagine a day when all of this is visible through a set of glasses; physical object anchoring and digital representations of objects would all come into play.

This leads me to my next idea.

Persistent anchoring for windows and objects

Apple has improved Apple Vision Pro's ability to remember where windows are placed. It isn't always perfect, though, and pressing and holding the Digital Crown at the wrong moment can cause chaos.

Microsoft HoloLens had early examples of objects and windows attached to real-world objects

There should be a way to set a kind of persistent anchor using a physical object. That's where Apple's rumored AirTag 2 technology could come in, but for now, an iPhone or iPad could play that role.

Let the user enter an anchor-reset mode that relies on the U-series Ultra Wideband chip in the iPhone to set an anchor point. That way, a user could set a global anchor based on an iPhone sitting on a MagSafe stand on their desk, and everything in the space would always arrange itself without issue.

Having a persistent anchor point, pinned down using a combination of GPS and precise location, means memory and automation become far more viable. For example, check for the iPhone anchor point, then run a shortcut to enter a work mode where all your windows open at set positions around the anchor.

Saved window positions could be tied to Focus modes. Beyond that, different anchors could trigger different automations.

For example, if your iPhone is on a MagSafe mount in the bedroom, have it check its position as part of an automation that runs a window setup. That way, whether you're focused on work in the office or in the bedroom, your windows arrange themselves accordingly.

Window layouts driven by automation via physical anchor points could open up all kinds of options. If Apple leans heavily on UWB in visionOS, it could mean virtual windows open and close passively as the user moves around the house.

I think that would be compelling as more digital objects become the norm. There could be virtual displays attached to a wall, plant-care instructions attached to plants around the house, and home controls for fans, lamps, and more attached to each object.

The possibilities are endless.

Mac Virtual Display Updates

Getting back to more grounded concepts, Apple could do interesting work with Mac Virtual Display. The ultra-wide display setting was an incredible update to the system, but there's always more that can be done.

Free macOS apps from the virtual display

Listeners of the AppleInsider podcast may have heard me mention this in passing. Apple could let users pull apps out of the virtual Mac environment and treat them as free-floating apps in visionOS.

Now, of course, those interfaces mostly rely on the precision of a mouse pointer, but eye tracking can get close. And I think there's one more step that could help bring this feature home.

Rumors suggest Apple is going to redesign all of its operating systems with a more unified look inspired by visionOS. Whatever that means for macOS 16, it could help those apps behave better in visionOS.

I truly think that no matter how powerful visionOS gets, or how popular Apple's inevitable smart glasses become, the Mac will carry on. There's no reason to expect Apple Vision Pro, or especially glasses, to be powerful enough for every pro-level task.

So imagine being able to launch the Mac virtual environment, then pull apps out into your spatial computing space. Full Xcode, Final Cut Pro, or other apps, driven through a connected mouse on a nearly infinite display.

The visionOS interface is very different from what's on iPhone, iPad, or Mac

Perhaps apps pulled from the virtual Mac window would get some UI adjustments to make them easier to navigate with gaze and pinch.

I believe this is a smart middle step that strengthens both ecosystems. Instead of forcing a paradigm onto a platform that can't or shouldn't handle it, let it work with hardware that's already optimized for it.

We saw this kind of mistake made with the iPad. Too often the discussion was about how the iPad could replace the Mac, rather than how it could augment it.

Just because the iPad and Apple Vision Pro are the future of computing doesn't mean they have to be the only computing endpoint. Apple products work best when they work together.

That's what I was getting at earlier: Apple Vision Pro is an expensive product and shouldn't need constant hardware updates at this price. So, if an M4 Mac can be enlisted to boost an M2 Vision Pro, that's just the ecosystem at work.

Improved eye tracking

Eye tracking technology is core to the Apple Vision Pro experience. In the grand scheme of things, it's also the least accurate and most error-prone input compared to the big sticky iPadOS cursor and the thin, precise macOS pointer.

There are many small elements that sit too close together to select precisely with your eyes

There's a problem where, no matter how intently you stare at a selection in visionOS, you inevitably act on something above it, below it, or in a neighboring app. There has to be some level of specificity and control with a cursor, and our eyes, it seems, aren't enough.

I'm not an engineer, but surely there's a solution. One option I'd suggest is the ability to look, then pinch and hold, then scroll between nearby selection targets using the same scroll gesture used for windows. Release the pinch to select.

Whatever the solution, there are too many mis-taps on Apple Vision Pro. App buttons collide with window controls, and too often you unintentionally close a window when you were trying to grab the adjustment bar.

The problem gets worse on the web in Safari, or in any iPad-compatible app. Places where Apple Vision Pro isn't treated as the primary interaction target constantly fall victim to this.

I expect Apple will have some solution, or at least more accurate eye tracking, via software and algorithm updates.

Other possible updates

I'll round out this wish list with two more obvious picks. First, Apple needs to get back to work on Personas.

Personas haven't improved since the initial launch, when Tim Chaten and I recorded a podcast this way

The slightly-off Personas sit squarely in the uncanny valley, and I'm not sure what the solution is. Apple could either try to make them more realistic and animated, or go back to something more like Memoji.

Personally, I'd love the option to simply use a Memoji as a Persona.

Either way, the current scanned-in Personas are the wrong path. Apple should use the data to build an avatar based on your features, not a raw depth-map copy of them.

Video games figured this out years ago. Some games even used primitive photo recognition to pre-set customization sliders. Apple Vision Pro's 3D cameras and scanners are far more advanced and could easily produce a better result.

My recommendation here is to walk the line between Pixar-like detail in hair, skin, and clothing, without going so far as to make the avatar look like a Call of Duty cutscene.

Making the avatar a true 3D representation rather than a scan would open the door to better customization. Let users pick clothing and hairstyles, or even go fantastical with new eye and skin colors.

Yes, Personas should go full video game character creator. More professional, human defaults for office settings, or a custom avatar for Dungeons & Dragons over Apple Vision Pro.

Apple Vision Pro needs physical controllers for games

Speaking of video games, Apple must deliver third-party controller support for Apple Vision Pro. PlayStation VR2 controllers may be something Apple is working on with Sony to distribute, but that's not a given.

Apple should also work with a controller maker to develop a custom Apple Vision Pro controller. The folks behind Surreal Touch would make great partners, or possibly an acquisition.

And, still on games, there's already a data connector that the Developer Strap plugs into. There's absolutely no reason it couldn't be used to stream 3D games from a PC or Mac to Apple Vision Pro, the way a Valve Index does.

Ship a new strap at WWDC, Apple. Let it be used for broader compatibility with the biggest VR platform: the PC.

Apple Vision Pro is a well-made piece of hardware, but that doesn't make it perfect. Software and operating systems can't be developed in a vacuum, so perfecting visionOS will take many more years.

We're hoping for some surprises at WWDC 2025 with visionOS 3.
