Category Ranking: 98%
Total Visits: 921
Avg Visit Duration: 2 minutes
Citations: 20

Article Abstract

Under-display camera (UDC) systems enable full-screen displays in smartphones by embedding the camera beneath the display panel, eliminating the need for notches or punch holes. However, the periodic pixel structures of display panels introduce significant optical diffraction effects, leading to imaging artifacts and degraded visual quality. Conventional approaches to mitigate these distortions, such as deep learning-based image reconstruction, are often computationally expensive and unsuitable for real-time applications in consumer electronics. This work introduces an inverse-designed metasurface for wavefront restoration, addressing diffraction-induced distortions without relying on external software processing. The proposed metasurface effectively suppresses higher-order diffraction modes caused by the metallic pixel structures, restores the optical wavefront, and enhances imaging quality across multiple wavelengths. By eliminating the need for software-based post-processing, our approach establishes a scalable, real-time optical solution for diffraction management in UDC systems. This advancement paves the way toward software-free, real-time image restoration frameworks for a wide range of industrial applications.
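To make the diffraction problem concrete, the following is a minimal scalar-diffraction sketch in Python (NumPy only). The display-pixel pitch and fill factor are illustrative assumptions, not the paper's geometry or its inverse-design method; the script only shows how a periodic metallic pixel grid pushes a substantial fraction of the transmitted light into higher diffraction orders.

```python
# Minimal scalar-diffraction sketch (NumPy only): a periodic display-pixel
# aperture spreads transmitted light into higher diffraction orders.
# The pitch and opening below are illustrative assumptions, not values
# from the paper.
import numpy as np

n = 4096                      # samples across the aperture
pitch = 64                    # samples per display-pixel period (assumed)
opening = 24                  # transparent samples per period (assumed fill factor)

# Binary transmission of the metallic pixel grid: 1 in the opening, 0 elsewhere.
x = np.arange(n)
aperture = ((x % pitch) < opening).astype(float)

# Far field of a uniform plane wave through the aperture (Fraunhofer ~ FFT).
far_field = np.fft.fftshift(np.fft.fft(aperture))
intensity = np.abs(far_field) ** 2
intensity /= intensity.sum()

# Energy in the zeroth order vs. the higher diffraction orders.
center = n // 2
zero_order = intensity[center - 2:center + 3].sum()
print(f"energy in 0th order: {zero_order:.3f}")
print(f"energy leaked into higher orders: {1 - zero_order:.3f}")
```

With the assumed 37.5% opening per pixel period, the script reports roughly 37% of the transmitted energy in the zeroth order and the remainder spread across higher orders, which is the light that appears as diffraction artifacts in the captured image and that the metasurface in this work is designed to manage optically.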


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC12397738
DOI: http://dx.doi.org/10.1515/nanoph-2025-0242

Publication Analysis

Top Keywords

wavefront restoration (8)
under-display camera (8)
udc systems (8)
pixel structures (8)
inverse-designed metasurfaces (4)
metasurfaces wavefront (4)
restoration under-display (4)
camera systems (4)
systems under-display (4)
camera udc (4)

Similar Publications


We propose an end-to-end model that estimates the exit pupil wavefront directly from phase diversity images using deep learning. The aim is to restore the exit pupil wavefront through zonal reconstruction to obtain more high-order modal aberrations, thereby improving the reconstruction quality of degraded images. Our simulated experimental results show that zonal reconstruction significantly outperforms modal reconstruction in restoring high-order aberrations.
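As a toy illustration of the zonal-versus-modal distinction (not the authors' network or data), the sketch below fits a wavefront containing a localized high-order feature with a small set of low-order polynomial modes and compares the residual with a zonal, per-pixel map. The basis and the test wavefront are assumptions chosen only to show why a truncated modal expansion misses high-order structure.

```python
# Toy illustration of modal vs. zonal wavefront representations: a truncated
# low-order basis misses a localized high-spatial-frequency aberration that
# a zonal (per-pixel) map captures. All values here are assumptions.
import numpy as np

n = 64
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
pupil = (x**2 + y**2) <= 1.0

# Ground-truth wavefront: smooth defocus plus a small localized bump (high order).
w_true = 0.8 * (2 * (x**2 + y**2) - 1) + 0.3 * np.exp(-((x - 0.4)**2 + y**2) / 0.02)

# Modal fit: least-squares projection onto low-order polynomials in x, y.
modes = [np.ones_like(x), x, y, x*y, x**2, y**2]   # assumed 6-mode basis
A = np.stack([m[pupil] for m in modes], axis=1)
coeffs, *_ = np.linalg.lstsq(A, w_true[pupil], rcond=None)
w_modal = np.zeros_like(w_true)
w_modal[pupil] = A @ coeffs

# Zonal "fit": the per-pixel phase map itself (here simply the sampled wavefront).
w_zonal = np.where(pupil, w_true, 0.0)

rms = lambda e: np.sqrt(np.mean(e[pupil] ** 2))
print(f"modal residual RMS: {rms(w_true - w_modal):.4f}")
print(f"zonal residual RMS: {rms(w_true - w_zonal):.4f}")
```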


In ultrasound imaging, propagation of an acoustic wavefront through heterogeneous media causes phase aberrations that degrade the coherence of the reflected wavefront, leading to reduced image resolution and contrast. Adaptive imaging techniques attempt to correct this phase aberration and restore coherence, leading to improved focusing of the image. We propose an autofocusing paradigm for aberration correction in ultrasound imaging by fitting an acoustic velocity field to pressure measurements, via optimization of the common midpoint phase error (CMPE), using a straight-ray wave propagation model for beamforming in diffusely scattering media.
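Below is a hedged sketch of the straight-ray forward model such an approach relies on; the array geometry, grid spacing, and the slow inclusion are assumptions, not the authors' setup. Integrating slowness along the ray from each element to the focus yields the element-to-element arrival-time differences that a CMPE-style objective would try to explain with a fitted velocity field.

```python
# Hedged sketch (not the paper's implementation): straight-ray travel time
# from an array element to a focal point through a 2D sound-speed map.
import numpy as np

def travel_time(speed_map, dx, src, dst, n_steps=200):
    """Integrate slowness (1/c) along the straight ray from src to dst.
    speed_map: 2D sound speeds [m/s]; dx: grid spacing [m];
    src, dst: (x, z) positions in metres."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    ts = np.linspace(0.0, 1.0, n_steps)
    points = src + ts[:, None] * (dst - src)            # samples along the ray
    ix = np.clip((points[:, 0] / dx).astype(int), 0, speed_map.shape[1] - 1)
    iz = np.clip((points[:, 1] / dx).astype(int), 0, speed_map.shape[0] - 1)
    slowness = 1.0 / speed_map[iz, ix]
    ray_length = np.linalg.norm(dst - src)
    return slowness.mean() * ray_length                  # integral of (1/c) dl

# Homogeneous 1540 m/s medium with a slower inclusion (assumed geometry).
dx = 1e-4
c = np.full((400, 400), 1540.0)
c[150:250, 150:250] = 1450.0
focus = (0.02, 0.03)
for x_elem in (0.005, 0.015, 0.025, 0.035):              # four aperture elements
    t = travel_time(c, dx, (x_elem, 0.0), focus)
    print(f"element at x={x_elem*1e3:4.1f} mm: t = {t*1e6:.3f} µs")
```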


3D tracking and localization of particles, typically fluorescently labeled biomolecules, provides a direct means of monitoring cellular transport and communication. However, sample-induced wavefront distortions of emitted fluorescent light as it passes through the sample and onto the detector often yield point spread function (PSF) aberrations, presenting an important challenge to 3D particle tracking using pre-calibrated PSFs. PSF calibration is typically performed outside cellular samples, ignoring sample-induced aberrations, which can result in localization errors on the order of tens to hundreds of nanometers, ultimately compromising sub-diffraction limited tracking.
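The pupil-function sketch below, using an assumed and arbitrary mix of astigmatism and spherical aberration rather than the cited experiment's conditions, illustrates why this matters: a modest pupil-phase error reshapes the PSF and lowers its peak, so a PSF calibrated without the sample no longer matches what the detector actually records.

```python
# Illustrative pupil-function sketch (assumptions, not the cited method):
# a sample-induced phase aberration on the pupil reshapes the detected PSF.
import numpy as np

n = 256
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)

def psf(pupil_phase):
    """Intensity PSF as the Fourier transform of the complex pupil function."""
    field = pupil * np.exp(1j * pupil_phase)
    return np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2

ideal = psf(np.zeros_like(x))
# Assumed aberration: a mix of astigmatism and spherical terms (order ~1 rad).
aberration = 1.5 * (x**2 - y**2) + 1.0 * (6*r2**2 - 6*r2 + 1)
aberrated = psf(aberration)

strehl = aberrated.max() / ideal.max()   # peak ratio as a simple quality metric
print(f"Strehl-like peak ratio of the aberrated PSF: {strehl:.3f}")
```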


Optical imaging systems are significantly affected by aerodynamic thermal effects under varying flight conditions, resulting in complex image blurring. To address this challenge, this study proposes a novel wavefront-coded image restoration method based on a multi-scale deep autoencoder neural network (MS-DAE). By modulating blur levels and incorporating a multi-scale loss function with residual attention mechanisms, the proposed method achieves a remarkable improvement in peak signal-to-noise ratio (PSNR) by 16.
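As a rough sketch of the ingredients named here, the snippet below computes a multi-scale reconstruction loss and the PSNR metric used to report restoration quality; the scales, weights, and synthetic images are assumptions for illustration, not the MS-DAE's actual loss or results.

```python
# Rough sketch (assumed form, not the MS-DAE's actual loss) of a multi-scale
# reconstruction loss plus the PSNR metric used to report image quality.
import numpy as np

def downsample(img, factor):
    """Average-pool a 2D image by an integer factor."""
    h, w = img.shape
    return img[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def multiscale_l1(pred, target, scales=(1, 2, 4), weights=(1.0, 0.5, 0.25)):
    """Weighted sum of mean-absolute errors over several resolutions."""
    return sum(w * np.mean(np.abs(downsample(pred, s) - downsample(target, s)))
               for s, w in zip(scales, weights))

def psnr(pred, target, peak=1.0):
    mse = np.mean((pred - target) ** 2)
    return 10 * np.log10(peak**2 / mse)

rng = np.random.default_rng(0)
target = rng.random((64, 64))
pred = np.clip(target + 0.05 * rng.standard_normal(target.shape), 0, 1)
print(f"multi-scale L1: {multiscale_l1(pred, target):.4f}")
print(f"PSNR: {psnr(pred, target):.2f} dB")
```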
