Brilliant Labs integrates Liquid AI to boost smart glasses vision processing

Brilliant Labs has partnered with MIT-born Liquid AI to integrate advanced vision-language foundation models into its Halo smart glasses, significantly enhancing the device’s ability to interpret visual content. This collaboration aims to improve the glasses’ agentic memory capabilities, which create personalized knowledge bases by analyzing users’ daily experiences and providing contextual responses to future questions.

What you should know: The partnership will integrate Liquid AI’s LFM2-VL series models into Brilliant Labs’ products, starting with the Halo AI glasses launched in July.

  • Liquid AI’s vision-language foundation models can process text and images at various resolutions with “millisecond latency,” transforming camera sensor input into detailed scene descriptions.
  • Brilliant Labs will license both current and future multimodal Liquid foundation models to optimize how its AI glasses understand and interpret visual scenes.

Why this matters: Smart glasses represent what many consider the ideal form factor for AI assistance, as they can continuously feed AI systems everything users see for real-time help.

  • The effectiveness of this concept depends entirely on the glasses’ ability to accurately interpret visual content they capture.
  • Enhanced scene understanding is particularly crucial for the Halo glasses’ long-term agentic memory feature, which needs to accurately identify daily events to provide useful responses about past experiences.

The companies involved: Brilliant Labs was founded by ex-Apple employee Bobak Tavangar and specializes in AI-powered smart glasses.

  • MIT-born Liquid AI is a foundation model company that has developed Liquid Foundation Models (LFMs) specifically designed for multimodal applications.
  • The LFM2-VL series represents Liquid AI’s latest advancement in vision-language processing technology.

How the agentic experience works: The Halo AI glasses feature long-term agentic memory that creates a personalized knowledge base for each user.

  • This system analyzes life context and daily experiences to provide more relevant responses to future questions.
  • For the agentic experience to be truly helpful, it must accurately identify and remember the day’s events so users can ask questions about earlier moments and receive responses that match their lived experiences.

These Halo smart glasses just got a major memory boost, thanks to Liquid AI
