Apple’s foray into augmented reality glasses remains a moving target, with the company’s most advanced iteration now slated for 2028—a full year after Meta’s expected entry. The delay isn’t just about timing; it’s about pushing display technology to unprecedented levels. Unlike earlier prototypes, Apple’s final design will incorporate dual 0.6-inch OLEDoS displays, a micro-OLED-on-silicon approach that promises sharper outdoor visibility, lower power draw, and pixel densities far exceeding today’s wearables.
The shift toward OLEDoS—where OLED pixels are deposited directly onto a silicon wafer rather than a glass substrate—marks a departure from traditional glass-based OLEDs. Building on silicon allows for ultra-compact, high-resolution screens while moving driving circuitry into the substrate itself, reducing energy consumption and heat. For context, Meta’s upcoming 2027 AR glasses are expected to pair dual OLEDoS panels with waveguide optics that relay the projected image to the wearer’s eye, but Apple’s dual-display approach suggests a focus on binocular clarity—critical for tasks like navigation or 3D spatial mapping.
Key specs and what they mean
- Display: 0.6-inch dual OLEDoS (2028 model); no AR display in 2026 AI glasses.
- Tech leap: OLEDoS vs. traditional OLED—silicon wafer substrate enables higher pixel density and lower power use.
- Competitors: Meta (2027, waveguide + OLEDoS), Asus/RayNeo (2026, OLEDoS).
- 2026 preview: AI assistant, cameras, speakers, but no built-in AR display.
- Meta’s Ray-Ban Display (2025): $799, 42 ppd resolution, 5,000-nit brightness, EMG gesture control via Neural Band.
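The "ppd" figure in the list above is the standard angular-resolution metric for near-eye displays: roughly horizontal pixel count divided by horizontal field of view. A minimal sketch of that arithmetic (the 600-pixel panel width and 14.3° field of view are illustrative assumptions, not reported specs):

```python
# Pixels per degree (ppd): angular resolution metric for near-eye displays.
# For narrow fields of view, ppd ~ horizontal pixels / horizontal FOV in degrees.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Approximate angular resolution of a near-eye display."""
    return horizontal_pixels / fov_degrees

# Hypothetical illustration: a 600-pixel-wide panel spread across a
# 14.3-degree field of view lands near the 42 ppd figure quoted above.
print(round(pixels_per_degree(600, 14.3), 1))  # ~42.0
```

The takeaway is that ppd rewards either more pixels or a narrower field of view, which is why compact, high-density OLEDoS panels are attractive for glasses-sized optics.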
Apple’s two-phase rollout reflects a calculated risk. The 2026 model—likely a software-heavy, AI-driven companion—will test hardware-software integration without the complexity of AR optics. Features like hands-free Siri, real-time translation, and gesture controls (similar to Meta’s EMG-based Neural Band) will set expectations, but the lack of an AR display suggests Apple is reserving its most ambitious tech for the 2028 launch. That model will need to compete not just with Meta’s waveguide tech but also with Asus and RayNeo, both targeting 2026 OLEDoS releases—though their designs may lack Apple’s ecosystem lock-in.
Why the delay?
OLEDoS is notoriously difficult to mass-produce. Traditional OLEDs require precise deposition of organic layers on glass; OLEDoS demands semiconductor-grade fabrication, where a single misalignment can ruin an entire wafer. Apple’s 2028 timeline aligns with industry reports that OLEDoS yields are improving, but scalability remains a hurdle. Meanwhile, Meta’s waveguide optics trade some resolution for outdoor usability—Apple may be betting that higher-density dual OLEDoS panels can preserve both sharpness and depth perception.
The $799 price tag for Meta’s Ray-Ban Display (including the Neural Band) hints at the premium positioning of early AR glasses. Apple’s final model will likely command a similar premium, but its dual-display strategy could justify higher costs for professionals (e.g., surgeons, field technicians) who demand stereo AR overlays. For consumers, however, the 2026 AI glasses may serve as a gateway—proving Apple’s voice and gesture controls before the 2028 AR leap.
The bigger picture
Apple’s AR glasses aren’t just a hardware play; they’re a platform play. The 2026 model will integrate with iOS, Siri, ARKit, and Apple Silicon, extending the company’s walled garden into spatial computing. The 2028 version, with its OLEDoS displays, will need to deliver on outdoor usability, a weakness of early AR glasses. If successful, it could redefine how people interact with digital information, from real-time navigation to remote collaboration. The stakes are high, but Apple’s patience suggests confidence that superior optics and seamless software integration will outpace competitors.
Availability for the 2028 AR glasses remains unconfirmed, but industry sources suggest a holiday 2028 launch window. The 2026 AI glasses may arrive as early as spring 2026, though exact pricing and features are still under wraps.
