Calling it a step towards "super intelligence," Meta announced it is releasing Muse Spark, an overhauled and improved AI. This "natively multimodal reasoning model" goes way beyond a chatbot, and it will soon live in your glasses and your social feeds. It's available now in the Meta AI app, with plans to roll out with a smart glasses update in the next few weeks.
Instead of a one-size-fits-all approach, Muse Spark offers three levels of "thinking," and users will be able to control how deep the intelligence goes.
- Instant Mode: For quick questions and everyday chats.
- Thinking Mode: Designed for more complex problems; if you need help with math, science, or logic, this is the mode.
- Contemplating Mode: Muse Spark's highest level engages multiple AI agents that work in parallel and collaborate to complete complex, multi-step tasks.
Meta says Muse Spark's performance matches or exceeds its Llama 4 Maverick model while using over an order of magnitude less computing power. That means, theoretically, high-level reasoning without excessive server use.
While Muse Spark will be accessible in a variety of places, its ground-up integration of visual material seems made for smart glasses. Here are some of the ways Ray-Ban Meta and Oakley Meta users will be able to use the new AI.
AI is now integrated across different tools
One of Muse Spark's main improvements over Meta's previous model is the way the new AI integrates visual information across different tools. So, theoretically, you could point your glasses at a mess of wires and electronic boxes and say "how do I hook up this home theater system?" Or get step-by-step coaching on assembling a piece of IKEA furniture without opening the booklet. The AI would read the instructions and make sure you're not screwing anything in upside down.
Muse Spark will have health reasoning capabilities
Meta said its Meta Superintelligence Lab collaborated with over 1,000 physicians to develop the AI's health reasoning capabilities. Users will be able to do things like generate an interactive display that unpacks the nutritional information about food, and maps out what muscles are activated during a workout.
But how will it actually perform?
All of the above is "in theory." Artificial intelligence hasn't always lived up to its hype, even when it's being hyped in front of a massive audience. It's one thing to perform well in laboratory benchmark tests, but how the tech works in the real world, where lighting is spotty, Wi-Fi is slow, and furniture instructions can be extremely complicated, is the real challenge.
While I haven't dug deeply into the tech, I did give it a quick test by turning on "thinking" mode and sending Meta AI the below picture of a random assortment of audio gear:

It not only correctly identified everything in the picture, but also gave me a couple of different options for hooking it all together, and told me (correctly) which cords I needed. So I look forward to having it on my glasses. If you want to test it yourself, Muse Spark is already running on meta.ai and the Meta AI app, and smart glasses firmware and social media integrations are expected to follow shortly.
