Bright future beyond Bayer: new tech captures color without light loss
Diagram showing the structure of Eyeo's color-splitting waveguide technology. The image is focused at the front of the structure, with the waveguides channeling the light down to a conventional CMOS sensor underneath. The geometry and position of the upper, tapered waveguides define the wavelength at which light is split into parallel beams in the lower, rectangular guide. Image: Eyeo

A Belgo-Dutch startup is developing an alternative to color filters that would let sensors capture all the light shone on them. This has the potential to push past one of the limitations of existing Bayer sensors, giving up to a stop of improvement in tonal quality and noise, and potentially allowing higher resolution capture.
Its technology uses nano-scale waveguides to split the incoming light depending on its color, meaning the sensor receives all the light projected on it, rather than having to use filters that absorb some of the incoming light.
Eyeo, a spin-off from Imec (a Belgian research organization), has just received €15M in seed funding to develop the technology.
Its system doesn't block or absorb any light, instead redirecting it into neighboring pixels, based on its color.
The technology uses a tapered waveguide that channels the incoming light to a very fine point – with geometry on the scale of the wavelength of light – where it is split by wavelength. From there, each component is channeled separately down through a second, rectangular waveguide into a pair of photodiodes below.
The company has shown it can tune how the colors are split by adjusting the precise geometry and positioning of the waveguides. It has developed pairs of waveguides that split light at the same wavelengths the human eye does, with one separating red light from cyan (green and blue) and the other separating blue light from yellow (green and red).

Diagram showing the sensitivity of the cones of the human eye to different wavelengths (top) and the output spectra of the two waveguide designs, with one tuned to give a 480nm crossover (left) and a second with a crossover at 580nm (right). Image: Eyeo
This means you still need four photodiodes to capture full color, but you can measure the light intensity, irrespective of color, with only two, giving a significant resolution boost with minimal light loss.
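As a loose illustration of that point, the sketch below models the two complementary splitter outputs in idealized form (it is not Eyeo's actual pixel layout or signal processing) and shows how full color and a luminance value might be recovered from the four signals, and why either pair alone already sums to the total light:

```python
# Simplified model of one pair of Eyeo-style color splitters (an illustrative
# sketch, not the company's actual pixel layout or signal processing).
# Splitter 1 separates red from cyan:    (R, G + B)
# Splitter 2 separates blue from yellow: (B, R + G)

def split_scene(r, g, b):
    """Return the four photodiode signals produced by a patch of light (R, G, B)."""
    red, cyan = r, g + b          # splitter 1: red vs. cyan
    blue, yellow = b, r + g       # splitter 2: blue vs. yellow
    return red, cyan, blue, yellow

def reconstruct(red, cyan, blue, yellow):
    """Recover R, G, B and a luminance estimate from the four signals."""
    # Either pair alone sums to the full incoming light, which is why
    # intensity can be measured with just two photodiodes.
    intensity = red + cyan        # ideally equal to blue + yellow
    # Green can be estimated two ways; averaging the estimates reduces noise.
    green = 0.5 * ((cyan - blue) + (yellow - red))
    return red, green, blue, intensity

signals = split_scene(r=0.6, g=0.3, b=0.1)
print(reconstruct(*signals))      # roughly (0.6, 0.3, 0.1, 1.0)
```

In practice the real outputs would be broad, overlapping spectral responses rather than clean sums, so actual color reconstruction would still go through a calibrated color matrix, as with any sensor.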
Its work suggests these pairs of waveguide stacks, combined with conventional CMOS sensors, should be able to deliver color accuracy comparable with modern cameras, with scope to further improve the performance to at least match the very best examples.
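The company's own comparison (the 'Vora' values in the chart further down) uses a standard figure of merit for this kind of claim: how closely a sensor's spectral sensitivities can be matched, by a linear transform, to those of the human observer. Below is a minimal sketch of one common formulation, Vora and Trussell's projector-based measure; the spectral curves are random placeholders rather than real camera or Eyeo data.

```python
import numpy as np

# One common formulation of the Vora value: compare the subspace spanned by a
# sensor's spectral sensitivities with the subspace spanned by the CIE
# color-matching functions. A value of 1.0 means colors seen by the standard
# observer can be reproduced exactly after a linear transform; lower values
# imply unavoidable color error. The curves below are random placeholders.

def projector(m):
    """Orthogonal projector onto the column space of m (wavelengths x channels)."""
    return m @ np.linalg.pinv(m)

def vora_value(sensor, cmf):
    """trace(P_sensor @ P_cmf) divided by the number of target channels."""
    return np.trace(projector(sensor) @ projector(cmf)) / cmf.shape[1]

wavelengths = np.arange(400, 701, 10)       # 400-700 nm in 10 nm steps
rng = np.random.default_rng(0)
cmf = rng.random((len(wavelengths), 3))     # stand-in for the CIE 1931 observer
sensor = rng.random((len(wavelengths), 3))  # stand-in for camera sensitivities

print(f"Vora value: {vora_value(sensor, cmf):.3f}")
```

The projector form makes the measure independent of any particular color matrix, which is why it can be used to compare very different filter and splitter designs.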
In addition to avoiding the light loss of color filters, the design should be able to work with smaller pixels than previous attempts to split colors by diffraction, allowing higher-resolution capture.
The company's focus is, understandably, on the large and potentially lucrative smartphone market. Because its technology doesn't waste as much light and can work with smaller pixels, it allows the creation of smaller sensors that deliver quality comparable with existing Bayer ones, or higher resolution sensors that outperform Bayer sensors of the same size.
However, even in the comparatively huge sensors used in most standalone cameras, avoiding light loss to a color filter array would allow a ~1EV improvement in tonal quality and noise performance.

Current sensors have very high quantum efficiency (generating a signal from a very high percentage of the light that hits them) and very low levels of read noise, meaning there's a limit to how much further the CMOS itself can be improved. However, the silicon part of the sensor is held back by the need to filter out around one stop of the 'wrong' colored light before it hits each photodiode.

The 'Vora' values, measuring color filtering accuracy for a large set of cameras, calculated both for the CIE standard observer and a range of other ages and races. Results for Eyeo's technology using off-the-shelf CMOS sensors and a custom-made thin-film perovskite detector are shown on the right. Image: Eyeo
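To put that roughly one-stop figure in perspective, here is a back-of-the-envelope sketch assuming the color filter array absorbs about half the incoming photons and that shot noise dominates; the numbers are purely illustrative, not measured sensor data.

```python
import math

# Back-of-the-envelope check: if a Bayer color filter absorbs roughly half of
# the photons reaching a pixel (about one stop), removing it doubles the
# collected signal. Under shot-noise-limited conditions SNR = sqrt(signal),
# so the result looks like an exposure made with one stop more light.
# Illustrative numbers only.

photons_at_pixel = 10_000                  # photons arriving at one pixel
bayer_transmission = 0.5                   # ~1 stop lost to the color filter

for label, transmission in (("Bayer CFA", bayer_transmission),
                            ("color splitter", 1.0)):
    signal = photons_at_pixel * transmission
    snr = math.sqrt(signal)                # shot-noise-limited SNR
    print(f"{label:14s} signal = {signal:6.0f} photons, "
          f"SNR = {snr:5.1f} ({20 * math.log10(snr):.1f} dB)")

# Doubling the signal improves shot-noise SNR by sqrt(2), about 3 dB: the same
# gain you would get from giving the sensor one stop more exposure.
```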
Although the technology is still at a relatively early stage of development, the company tells us it is compatible with existing CMOS sensor manufacturing and that the fabrication technology for its waveguides is already in use at scale.
It says it has worked to ensure the pixels at the edge of the frame maintain high acceptance angles for incoming light, without the use of microlenses, and has patented a methodology for optimizing the design to match typical numerical apertures used in the latest cameras and smartphones.
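For a sense of what matching 'typical numerical apertures' involves, the sketch below converts a lens's working f-number into the half-angle of the light cone a pixel must accept, using the standard paraxial relation; the f-numbers are typical values, not figures quoted by Eyeo.

```python
import math

# Standard paraxial relation between a lens's f-number N and the cone of light
# arriving at the sensor: numerical aperture NA ≈ 1 / (2 * N), and the
# acceptance half-angle is asin(NA). Typical f-numbers only, not Eyeo figures.

def acceptance_half_angle_deg(f_number):
    na = 1.0 / (2.0 * f_number)            # image-side numerical aperture in air
    return math.degrees(math.asin(na))

for n in (1.6, 1.8, 2.8):                  # common smartphone and camera lenses
    print(f"f/{n}: NA ≈ {1 / (2 * n):.2f}, "
          f"half-angle ≈ {acceptance_half_angle_deg(n):.1f}°")
```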
The original idea dates to 2018, with patents and prototypes following over the next few years. The Eindhoven-headquartered company was established in 2024 and it says it hopes to engage with potential customers in the next year or so, with evaluation kits available next year.
However, when asked, the company didn't give a timescale of when it thought the technology could be ready to appear in a consumer product.