Over the past few months, there have been many rumors and reports about Apple's plans to release AI-powered wearables in 2027, along with camera-equipped AirPods that will offer their own set of AI-enabled features. If those devices do materialize, there's a good chance they'll lean on MLX, Apple's open ML framework designed specifically for Apple Silicon.
Now, Apple has released FastVLM: a Visual Language Model (VLM) that uses MLX to deliver nearly instantaneous processing of high-resolution images while requiring far less compute than comparable models. According to Apple:
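To give a rough sense of what running a VLM locally on Apple Silicon can look like, here is a minimal sketch using the community mlx-vlm package (not Apple's own FastVLM release). The checkpoint name is a placeholder, and exact function signatures may differ between package versions.

```python
# Illustrative sketch only: run a vision-language model locally on Apple Silicon
# with the community mlx-vlm package (pip install mlx-vlm).
# The checkpoint name below is a placeholder, not an official FastVLM identifier,
# and argument order/names may vary across mlx-vlm versions.
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/<some-vlm-checkpoint>"   # placeholder ID
model, processor = load(model_path)
config = load_config(model_path)

images = ["photo.jpg"]                                # local image to describe
prompt = apply_chat_template(
    processor, config, "Describe this image.", num_images=len(images)
)

# Everything runs on-device; no cloud round trip is involved.
output = generate(model, processor, prompt, images, verbose=False)
print(output)
```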
"Based on a comprehensive efficiency analysis of the interplay between image resolution, vision latency, token count, and LLM size, we introduce FastVLM, a model that achieves an optimized trade-off between latency, model size, and accuracy."
At the heart of FastVLM is an encoder called FastViTHD, which was "designed specifically for efficient VLM performance on high-resolution images."
It is up to 3.2 times faster and 3.6 times smaller than comparable models. That matters a lot if you want your device to process information locally rather than relying on the cloud to generate a response to whatever the user just asked about (or pointed the camera at). Apple also says the model delivers an 85 times faster time-to-first-token than comparable models, that is, the delay between the user submitting the first prompt and the model returning the first token of its answer. Fewer tokens on a faster, lighter model translate into speedier processing. The research isn't exactly light reading, but it's definitely worth checking out if you're interested in the more technical aspects of Apple's AI projects.
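For a concrete sense of what "time-to-first-token" measures, here is a small, purely illustrative sketch. The streaming generator is a stand-in for any model that emits tokens one at a time; it is not FastVLM or MLX code. TTFT is simply the gap between sending the prompt and receiving the first piece of the answer, which is exactly the gap a smaller, faster encoder shrinks.

```python
# Illustrative only: what "time-to-first-token" (TTFT) means.
# The generator below is a hypothetical stand-in, not Apple's code.
import time

def fake_streaming_model(prompt):
    """Stand-in model that 'thinks' about the prompt, then streams tokens."""
    time.sleep(0.5)                      # pretend prompt/image processing (prefill)
    for token in ["A", " cat", " on", " a", " sofa", "."]:
        time.sleep(0.05)                 # pretend per-token decoding
        yield token

start = time.perf_counter()
stream = fake_streaming_model("Describe this image.")

first = next(stream)                     # TTFT: delay until the first token arrives
ttft = time.perf_counter() - start
rest = "".join(stream)                   # remaining tokens complete the answer

print(f"time-to-first-token: {ttft:.2f}s")
print("answer:", first + rest)
```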