The Future of Smart Glasses: User-Created Apps

One of the issues I frequently come across when reviewing wearable technology is the range of available apps. Products like the Envision glasses and Meta Ray-Bans have a wide array of apps that assist the user, whether through AI or a sighted human, in exploring their environment. But what if you need something more specific to your needs, or the functionality provided just doesn’t cover all the bases?
I’m no programmer but am interested in what the concept of “vibe coding” can offer to fill this gap.
This week I came across Brilliant Labs’ Halo AI smart glasses. My first impression is that they are an interesting mix of Google’s and Meta’s approaches to smart glasses, with at least one difference that is significant to me: the ability for the wearer to create their own custom, fit-for-purpose apps. Here’s a description of this feature from designboom:
The glasses also include a function called Vibe Mode, which is a tool inside the system that lets users turn ideas into apps. The user types their idea into a terminal called Vibe Mode, and Noa reads the idea and starts creating it. The AI agent takes the input and generates a version of the application, and after that the user can then edit or adjust the design.
(designboom, 01/08/2025, https://www.designboom.com/technology/halo-open-source-glasses-private-ai-agent-brilliant-labs-08-01-2025/)
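
Reading between the lines of that description, the workflow sounds like a simple generate-and-refine loop: the user states an idea, the AI agent produces a first version of the app, and the user asks for adjustments until it does what they want. As a rough illustration only, here is a minimal Python sketch of that loop; every name in it (generate_app_from_idea, vibe_mode_session) is my own placeholder, not anything from Brilliant Labs’ actual software.

```python
# Hypothetical sketch of a "Vibe Mode"-style idea-to-app loop.
# None of these names come from Brilliant Labs; they are placeholders
# illustrating the workflow described in the designboom piece.

def generate_app_from_idea(idea, feedback=""):
    """Stand-in for the AI agent (Noa): returns source code for the idea.

    A real implementation would send the prompt to the model; here we
    return a stub so the loop can be run and tested anywhere.
    """
    note = f"  # revised per: {feedback}" if feedback else ""
    return f"# App generated for: {idea}{note}\nprint('Hello from your custom app')"


def vibe_mode_session():
    """Describe an idea, review the generated app, request changes, repeat."""
    idea = input("Describe the app you want: ")
    app_code = generate_app_from_idea(idea)
    while True:
        print("\n--- generated app ---\n" + app_code)
        feedback = input("\nChanges? (press Enter to accept): ").strip()
        if not feedback:
            return app_code  # the user is happy with this version
        app_code = generate_app_from_idea(idea, feedback)


if __name__ == "__main__":
    vibe_mode_session()
```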
Of course, the first thing that concerns me is the accessibility of this solution and how any barriers might be overcome. How exactly does the user “type their idea” into a terminal: will voice input (and spoken feedback) be supported? If this is implemented in an accessible way, it will be very interesting to see what it is capable of.
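
For what it’s worth, the voice layer itself should not be the hard part. As a hedged sketch only, using the off-the-shelf speech_recognition and pyttsx3 Python libraries (which have nothing to do with Halo or Noa), wrapping a text prompt in spoken input and spoken feedback takes only a few lines; the real question is whether Brilliant Labs builds something like this into Vibe Mode itself.

```python
# Hedged sketch: adding voice input and spoken feedback around a text prompt,
# using two common Python libraries. This is my own illustration, not part of
# the Halo/Noa software.

import speech_recognition as sr
import pyttsx3


def speak(text):
    """Read text aloud so the user gets non-visual feedback."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


def listen():
    """Capture one spoken utterance from the microphone and return it as text."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)  # any speech-to-text backend would do


if __name__ == "__main__":
    speak("Describe the app you would like to create.")
    idea = listen()
    speak(f"You said: {idea}. Sending this idea to the AI agent.")
```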