Sensor technology at CES 2026 followed two distinct paths: pushing the limits of imaging in extreme environments through single-photon detection and eliminating motion distortion with large-format global shutters.
Canon’s next-generation SPAD (Single Photon Avalanche Diode) sensor is one of the most disruptive technologies at the show. Unlike traditional CMOS sensors that measure the volume of light accumulated as an analog signal, the SPAD sensor utilizes a digital photon-counting mechanism.
By recording the electron avalanche triggered by a single photon, this sensor inherently eliminates readout noise, maintaining an exceptional signal-to-noise ratio even in near-total darkness. Canon demonstrated the sensor's ability to clearly detect pedestrians 120 meters away under 0.1 lux illumination—essentially pitch-black conditions. Its 26-stop dynamic range (equivalent to 156 dB) allows it to capture details in the darkest shadows and brightest highlights simultaneously without clipping.
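The stop-to-decibel conversion behind the "26 stops ≈ 156 dB" figure is straightforward to verify: each stop doubles the signal ratio, and sensor dynamic range is quoted in the 20·log₁₀ convention, so one stop is roughly 6.02 dB. A minimal sketch:

```python
import math

def stops_to_db(stops: float) -> float:
    """Convert dynamic range in photographic stops (powers of two) to
    decibels using the 20*log10 amplitude-ratio convention, where one
    stop corresponds to ~6.02 dB."""
    return 20 * math.log10(2 ** stops)

print(round(stops_to_db(26), 1))  # 26 stops -> ~156.5 dB
```

This reproduces Canon's quoted figure to within rounding.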
For Physical AI, this technology significantly enhances the safety of Autonomous Driving (AD) systems in challenging lighting (e.g., tunnel exits or night-time glare). Combined with image-processing software from Ubicept, the SPAD sensor also eliminates motion blur, making it indispensable for high-speed industrial inspection and robotic vision.
While Canon broke the limits of light, Sony established a new standard for motion capture. Sony unveiled the IMX928, a Type 2.0 (31.9mm diagonal) large-format global shutter sensor featuring the Pregius S stacked architecture.
Traditional rolling shutter sensors suffer from "jello effects" or geometric distortion when capturing fast-moving objects. Sony’s Pregius S architecture stacks signal-processing circuitry beneath the photodiode layer, enabling simultaneous exposure of all pixels. At a high resolution of 68.16 megapixels, the sensor achieves an impressive readout speed of 138.9 fps (8-bit).
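Those headline numbers imply a substantial raw data rate. A back-of-the-envelope sketch (ignoring blanking intervals and interface protocol overhead, which the public specs do not break out):

```python
def readout_gbit_per_s(megapixels: float, fps: float, bits_per_pixel: int) -> float:
    """Raw pixel data rate in Gbit/s: pixels per frame times frames per
    second times bit depth, with no allowance for blanking or protocol
    overhead."""
    return megapixels * 1e6 * fps * bits_per_pixel / 1e9

# IMX928: 68.16 MP at 138.9 fps, 8-bit readout
print(f"{readout_gbit_per_s(68.16, 138.9, 8):.1f} Gbit/s")  # ~75.7 Gbit/s
```

At roughly 75 Gbit/s of raw pixel data, the stacked processing layer is doing real work just to move bits off the die.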
| Sensor Model | Resolution | Max Frame Rate | Optical Format | Core Technology | Application |
| --- | --- | --- | --- | --- | --- |
| Canon SPAD (Prototype) | N/A | High-Speed | N/A | Photon Counting / Digital | Auto-Driving / Industrial |
| Sony IMX927 | 105.51 MP | 100 fps | Type 2.5 (39.7mm) | Pregius S Global Shutter | FPD / Semiconductor Inspection |
| Sony IMX928 | 68.16 MP | 138.9 fps | Type 2.0 (31.9mm) | Large Format / Square Pixel | 3D Vision / Object Recognition |
| Sony IMX929 | 50.79 MP | 200 fps | Type 1.8 (28.1mm) | High-Speed Global Shutter | Sports Broadcast / Motion Analysis |
Sony's roadmap indicates that global shutter technology is no longer limited to small-scale industrial cameras. As formats approach Full-Frame size, this technology is penetrating cinema-grade systems and high-end AI vision to ensure perfect temporal and spatial data consistency.
CES 2026 witnessed the transition of metalenses from laboratory prototypes to mass commercialization. By utilizing nanostructures (metasurfaces) to manipulate light, this technology disrupts the traditional requirement for curved, thick lenses.
Singapore-based MetaOptics showcased glass-based metalenses manufactured using semiconductor processes. A standout exhibit was a 5G smartphone module that completely eliminates the "camera bump" common in modern devices.
MetaOptics utilizes a 12-inch Deep Ultraviolet (DUV) lithography process. Unlike traditional circular lenses, metalenses can be fabricated into any shape. MetaOptics demonstrated a rectangular metalens that perfectly matches the shape of CMOS sensors, enabling full-area capture with no edge loss and delivering higher resolution in a much thinner module.
This shift implies that the lens supply chain is moving from precision mechanical grinding toward a semiconductor foundry model. This allows for the monolithic integration of optics and sensors, paving the way for micro-medical robots, ultra-lightweight smart glasses, and "invisible" smart home sensors.
Kyocera extended metalens application to the display field. By precisely manipulating focal positions based on the wavelength of light, Kyocera developed a prototype "Wearable Aerial Display."
This system leverages the extremely thin profile of metalenses to create a compact optical system capable of reproducing images with natural depth perception. This addresses a major pain point in AR—Vergence-Accommodation Conflict (VAC)—by allowing the brain to perceive objects at different depths naturally, significantly reducing eye strain.
With an aging global population, smart lenses targeting presbyopia and vision correction were a highlight of CES 2026. These devices have evolved from recording tools into dynamic human sensory enhancers.
Finnish startup IXI introduced adaptive autofocus glasses designed to replace traditional progressive or bifocal lenses. The system combines liquid crystal lenses with an ultra-low-power eye-tracking system.
Unlike camera-based tracking, IXI uses an "inward-looking" system where infrared LEDs embedded in the frame project light onto the eye, and photodiode arrays capture the "eye fingerprint" reflection. The system monitors gaze direction at 60 fps. When a user shifts from distance vision to near reading, a microprocessor reorders the liquid crystal molecules within milliseconds to adjust the lens power.
Key performance metrics include:

- Power Consumption: Only 4 mW, allowing the 35 mAh temple batteries to last for 18 hours.
- Weight: Just 22 grams (excluding lenses), comparable to standard frames.
- Health Insights: The system can also estimate attentiveness and detect conditions like dry eye by monitoring blink rates and gaze patterns.
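A quick sanity check on the power budget (assuming a typical 3.7 V nominal lithium cell, a figure not from IXI) shows the numbers are internally consistent:

```python
def runtime_hours(capacity_mah: float, nominal_v: float, avg_draw_mw: float) -> float:
    """Idealized battery runtime: stored energy (mWh = mAh * V) divided by
    average power draw (mW). Ignores conversion losses and peak loads."""
    return capacity_mah * nominal_v / avg_draw_mw

# Eye-tracking subsystem alone: 35 mAh cell, assumed 3.7 V nominal, 4 mW draw
print(round(runtime_hours(35, 3.7, 4), 1))  # ~32.4 h
```

The 4 mW tracking subsystem alone could run for over 30 hours on a 35 mAh cell, so the quoted 18-hour figure plausibly includes the liquid crystal drive and supporting electronics.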
In the professional XR sector, Goeroptics showcased a liquid crystal variable-focus lens less than 1mm thick. By using an electronic drive to modulate liquid crystal alignment, it provides continuous diopter adjustment from -3.00D to +3.00D. This eliminates the need for prescription inserts in XR headsets and fundamentally solves motion sickness caused by VAC.
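Diopters are simply reciprocal focal lengths in meters, so the ±3.00D range maps directly to physical focal lengths. A minimal conversion sketch:

```python
def focal_length_m(diopters: float) -> float:
    """Focal length in meters for a lens of given optical power.
    Optical power in diopters is defined as 1 / focal_length_in_meters;
    zero power means a flat element with no focusing (infinite focal length)."""
    if diopters == 0:
        return float("inf")
    return 1.0 / diopters

for d in (-3.0, 1.5, 3.0):
    print(f"{d:+.2f} D -> {focal_length_m(d):+.3f} m")
```

So the -3.00D to +3.00D range corresponds to focal lengths from -0.33 m (diverging) through flat to +0.33 m (converging), covering common myopia and presbyopia prescriptions.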
As AR glasses are positioned to succeed smartphones as the next computing platform, CES 2026 saw companies tackling the "optical triangle" of field of view (FoV), brightness, and size.
Lumus demonstrated its ZOE geometric reflective waveguide, which pushes the FoV to 70 degrees—a massive leap from the current 50-degree industry standard. ZOE also eliminates "light leakage," ensuring privacy for the wearer while achieving high ambient transparency. This enables AR to move from simple notification overlays to immersive multi-window workspaces.
To achieve all-day wearability, weight is critical. Meta-Bounds showcased two CES Innovation Award-winning designs: 25g monochrome AR glasses and 38g full-color AI+AR glasses. These utilize proprietary resin (polymer) waveguides rather than glass. Goeroptics also displayed its F15Pi full-color resin waveguide module, which weighs only 4g but maintains over 92% grating transmittance without rainbow artifacts.
| AR/XR Device/Module | Optical Solution | Key Specs / Advantages | Key Partners |
| --- | --- | --- | --- |
| Lumus ZOE | Geometric Reflective Waveguide | 70° FoV / High Efficiency | Meta (Potential) |
| ASUS ROG Xreal R1 | Micro-OLED + Prism | 240Hz Refresh / 171" Virtual Screen | XREAL, ASUS |
| Even Realities G2 | Waveguide + Mono Green | Camera-less Privacy / Prescription Ready | Even Realities |
| Goertek Spinel (AI) | Diffractive Waveguide | 35g / 4K Photo / 1080p Video | Goeroptics |
| Vuzix/Himax Ref. | Waveguide + LCoS | 0.34c.c. Light Engine / Prescription Ready | Vuzix, Himax |
After years of "computational photography" dominance, 2026 marks a return to physical optical advantages through mechanical innovation.
Xiaomi’s 17 Ultra features a physical manual zoom/focus ring surrounding the rear camera module. This ring can detect displacements as small as 0.03mm, allowing photographers to achieve smooth, linear rack focusing or zooming with tactile feedback, addressing the imprecision of screen-tap focusing.
Honor showcased a "Robot Phone" prototype that integrates a motorized three-axis gimbal directly into the camera module. The camera can independently rotate, tilt, and autonomously track subjects, providing professional-grade stabilization and cinematic tracking for creators without external accessories.
Samsung Semiconductor’s ISOCELL HP5 sensor features the industry's smallest pixels at 0.5μm. To overcome light-gathering challenges at this scale, Samsung integrated High Refractive Index (HRI) micro-lenses directly into the sensor structure, ensuring 200MP purity while enabling thinner camera modules.
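The 0.5μm pitch explains how 200MP fits in a thin module. A sketch of the geometry, assuming a conventional 4:3 pixel grid (the aspect ratio is an assumption, not a published HP5 spec):

```python
import math

def sensor_dims_mm(megapixels: float, pixel_um: float, aspect=(4, 3)):
    """Active-area width, height, and diagonal in mm for a rectangular
    pixel grid with the given pixel count, pitch, and aspect ratio."""
    ax, ay = aspect
    px_w = math.sqrt(megapixels * 1e6 * ax / ay)  # horizontal pixel count
    px_h = px_w * ay / ax                          # vertical pixel count
    w = px_w * pixel_um / 1000                     # um -> mm
    h = px_h * pixel_um / 1000
    return w, h, math.hypot(w, h)

w, h, diag = sensor_dims_mm(200, 0.5)
print(f"{w:.2f} x {h:.2f} mm, diagonal {diag:.2f} mm")
```

Under that assumption, 200MP at 0.5μm fits in roughly a 10mm-diagonal active area, a fraction of the size that larger pixel pitches would require.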
For professional photographers, optical bokeh remains the ultimate "moat." Sigma continues to push these boundaries at CES 2026.
Sigma announced the 135mm f/1.4 DG DN Art, the world's first lens to achieve an f/1.4 aperture at this focal length. It delivers rendering and bokeh that exceeds even the legendary 105mm f/1.4 "Bokeh Master". Additionally, Sigma's 200mm f/2 DG DN Sports utilizes HLA (High-response Linear Actuator) motors for lightning-fast focus, bringing f/2 speed to the 200mm range for indoor sports and portraits.
Tamron was honored with EISA awards for its 28-300mm f/4-7.1 Di III VC VXD and 90mm f/2.8 Macro. Tamron’s strategy of porting its popular E-mount lenses (like the 70-180mm f/2.8 G2) to the Nikon Z-mount has continued to expand its market footprint against native manufacturer offerings.
Kyocera showcased an AI-based depth sensor using a unique triple-lens configuration. Unlike dual-lens systems, the triple-lens setup handles reflections and translucent materials more effectively, measuring objects as small as 0.30mm. This is designed for medical surgeries (identifying anatomy) and industrial wiring inspection.
In a surprising development, Lens Technology debuted aerospace-grade ultra-thin glass (UTG) for LEO satellite solar arrays. Utilizing chemical strengthening and laser-cutting techniques honed on foldable smartphones, this glass is as thin as a cicada’s wing and can be rolled up like a tape measure. It protects solar cells from atomic oxygen and UV radiation in space while allowing satellites to be stowed with "origami-like" efficiency during launch.
The logic of lens design has moved into TV backlighting. Samsung, LG, and Hisense showcased "Micro RGB" TVs. Every individual sub-pixel LED (under 100μm) is paired with a micro-lens array to precisely control light emission angles. This enables flagship models like the Hisense 116UXS to reach 10,000 nits and 100% of the BT.2020 color gamut.
The overarching theme of CES 2026 is that the optical lens has evolved from an "image capture card" into a "closed-loop sensor for physical interaction."
As Nvidia CEO Jensen Huang noted, "Physical AI" is the backdrop for all these breakthroughs. Whether it is Canon’s 26-stop dynamic range or Sony’s distortion-free global shutter, the goal is to provide Physical AI (autonomous vehicles, robots, humanoids) with accurate, real-world physical data that exceeds human sensory capabilities.
For the industry, three strategic directions have emerged:
- Invisible Integration: Driven by metalenses and resin waveguides to blend tech into daily life.
- Absolute Fidelity: Driven by SPAD and Global Shutter to eliminate artifacts in any lighting or speed.
- Computational Coupling: Lenses are no longer independent; they are tightly coupled with AI NPUs (like Goertek's tri-chip platform) to achieve semantic recognition the moment light enters the system.
By the end of 2026, we expect to see consumer smart glasses that look like ordinary eyewear, automatically adjust focus, and act as a proactive AI assistant. Lenses are becoming our new eyes, our satellites' new skin, and the robotic world's new brain.