A Survey of Day-Night Illumination Domain Translation for Outdoor Vision Covering 30 Methods, 22 Datasets, and Evaluation Protocols
Alam, M. S., Singh, P., Bazilinskyy, P.
Submitted (2026)
ABSTRACT Day-night appearance shift degrades vision for driving and surveillance. Low illumination, mixed lighting, glare, and sensor noise weaken cues for detection, segmentation, localisation, and tracking. We survey illumination domain translation for images and video, focusing on day-to-night and night-to-day mapping that changes illumination while preserving geometry, semantics, and temporal coherence. We relate illumination modelling and colour transfer to learning-based methods, and introduce a constraint-centric taxonomy linking supervision, five domain-gap factors, and five families of constraints and priors to typical failure modes. Using this taxonomy, we organise 30 representative methods and summarise 22 datasets. We also report an artefact-availability audit of 34 published methods: 29 release code, 22 provide pretrained weights, 21 specify licences, and 19 provide reproducibility packages. Finally, we recommend evaluation that spans perceptual quality, semantic preservation, downstream utility, and temporal stability.