One of the hardest things in smart eyewear is adding intelligence without making the glasses too heavy, too power-hungry, too distracting, or too awkward to wear in daily life. That is why the Everysight™ Maverick AI smart glasses, shown with Alif Semiconductor® at CES 2026, are so interesting. Maverick is lightweight, all-day AR+AI eyewear with a forward-facing AI camera, a full-color in-lens OLED display, a 28° field of view, 8+ hours of display-on battery life, smartphone tethering, integrated audio, prescription support, and an open software development kit (SDK). The platform is powered by Alif's Ensemble E7® fusion processor and delivers intelligence directly at the edge.
That matters because the smart glasses market has long suffered from a familiar problem: impressive demos often come wrapped in a form factor that people do not actually want to wear. Everysight's answer is to focus just as much on optics and ergonomics as on AI. Its BEAM™ display system projects directly onto the visor, reducing the need for intermediate optics, while the company says the optical engine adds less than 50g per channel to typical eyewear weight. Everysight also highlights daylight visibility, compact packaging, and a high-contrast, wide field of view (FOV) display, which are exactly the kinds of features needed if smart glasses are to move from niche gadget to everyday device.
But form factor is only half the story. The other half is what happens under the hood. In CES posts, Everysight described Maverick AI as a live demonstration of on-device intelligence, with ultra-low-power AI processing, a bright color OLED display, and wireless connectivity to iOS and Android devices. The company also emphasized the value of designing around advanced AR experiences that do not rely on the cloud, opening the door to more scalable and privacy-preserving use cases. This is not just about putting a display on a face. It is about making the glasses responsive enough, efficient enough, and self-contained enough to be genuinely useful.
This is where Alif's architecture becomes important. The Ensemble E7 was built for demanding embedded workloads such as virtual and augmented reality, image processing, and AI/ML acceleration. The E7 combines two Arm Cortex-M55 CPUs, two Arm Ethos-U55 NPUs, and two Arm Cortex-A32 application processors, giving developers a mix of real-time processing, AI acceleration, and higher-level application capability in one platform. That combination makes the E7 a natural fit for smart glasses, which must handle sensor input, visual intelligence, user interaction, connectivity, and power management all at once. Inference, display handling, and system control all have to coexist in a battery-operated device, and that is exactly the sort of multi-domain workload a fusion processor is meant to address.
What makes the Maverick proposition compelling is that Everysight is already mapping this capability to visible, user-facing functions. The company’s materials point to navigation, translation, transcription, object recognition, scene understanding and activity-focused applications such as cycling, golf and running. The AI camera is described as turning the scene in front of the wearer into actionable visual intelligence, while the integrated speaker and microphone array support AI interaction, calls and media. So, the value proposition is not abstract. It is about giving the wearer information in context, in line of sight, at the moment it is needed.

Most intriguingly, Everysight's Maverick AI Pro configuration adds native eye-tracking through its GazeIntent™ capability. Everysight says this is intended to enable intuitive, hands-free, gaze-based interaction, and describes it as a way for the system to understand what the wearer is focusing on. That points to a more advanced model of edge AI in wearables. Instead of waiting for a user to stop, reach for a phone, open an app, and type a query, the device can begin to interpret visual context and user intent in real time. For smart glasses, that is a far more natural interface than forcing users to adapt to the limitations of the hardware.
For Alif, the significance of the Everysight design is clear. It is another proof point for the idea that edge AI is not won by a single headline spec. It is not just about display brightness, and it is not just about AI operations per second either. In a wearable product, what matters is the system-level balance between compute, memory, optics, battery life, weight, interaction and software openness. Everysight’s Maverick smart glasses show what happens when those pieces are brought together well: the result starts to look less like a technology demo, and more like something people could genuinely choose to wear every day.