[Teaser figure: reference image, a deuteranope's perception, and the image recolored for deuteranopes]
Real-Time Temporal-Coherent Color Contrast Enhancement for Dichromats
Gustavo M. Machado
gmmachado@inf.ufrgs.br

Manuel M. Oliveira
oliveira@inf.ufrgs.br

Instituto de Informática, UFRGS


Computer Graphics Forum, Volume 29 (2010), Number 3, Proceedings of EuroVis 2010, pp. 933-942 (ISSN 0167-7055). [DOI]


Contents

Abstract | Downloads | Results | Tutorial | Reference | Acknowledgments

Abstract

We present an automatic image-recoloring technique for enhancing color contrast for dichromats whose computational cost varies linearly with the number of input pixels. Our approach can be efficiently implemented on GPUs, and we show that for typical image sizes it is up to two orders of magnitude faster than the current state-of-the-art technique. Unlike previous approaches, ours preserves temporal coherence and is therefore suitable for video recoloring. We demonstrate the effectiveness of our technique by integrating it into a visualization system and showing, for the first time, real-time high-quality recolored visualizations for dichromats.

Downloads

Eurographics Association Copyright Notice: Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than Eurographics must be honored.

Full paper
Shaders
Sample Code

Results

Video


Images

These images illustrate how our approach can enhance color contrast for dichromats. The first two columns show the images as perceived by normal trichromats and by dichromats, respectively. The third column shows the recolored images produced by our technique. These images cover examples from scientific, medical, and information visualization, as well as natural images.

[Image gallery: each row shows the reference image, the dichromat's perception, and the recolored result.]
Flame (deuteranope)
Brain (deuteranope)
Nebula (protanope)
Tornado (protanope)
Europe Ill (deuteranope)
Flower (deuteranope)

Reference Image Credits - From top to bottom: CCSE at LBNL (Flame), Francisco Pinto (Brain), Wikimedia.org (Nebula), Martin Falk and Daniel Weiskopf (Tornado), Wikimedia.org (Europe Ill), Karl Rasche (Flower).

Tutorial on How to Use the Provided Shaders

Affecting approximately 200 million people worldwide, color vision deficiency (CVD) compromises the ability of these individuals to effectively perform color- and visualization-related tasks. Recoloring techniques aim mainly to reduce ambiguity in colored visualization data. This tutorial explains how to use the shaders available in the Downloads section to integrate this technique into any visualization application.

First, the rotation matrix must be initialized according to the dichromacy type. Table 1 shows the rotation angle for each type of dichromacy. These angles represent rotations in the CIE L*a*b* color space around the L* axis (a sketch of how such a matrix can be built follows the table).

Table 1: Rotation angles
Dichromacy type    Rotation angle (degrees)
Protanopia         -11.47
Deuteranopia        -8.10
Tritanopia          46.37
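
For concreteness, here is a minimal sketch of building such a rotation matrix, assuming the rotmat uniform is a 3x3 matrix applied to (L*, a*, b*) with L* left unchanged and that the angles in Table 1 are in degrees; the helper name is illustrative, not part of the provided code.

```cpp
#include <array>
#include <cmath>

// Builds a column-major 3x3 matrix (suitable for glUniformMatrix3fv) that
// rotates the (a*, b*) plane about the L* axis by the given angle in degrees.
std::array<float, 9> makeLabRotation(float angleDegrees) {
    const float rad = angleDegrees * 3.14159265358979f / 180.0f;
    const float c = std::cos(rad);
    const float s = std::sin(rad);
    return {
        1.0f, 0.0f, 0.0f,   // first column: L* maps to itself
        0.0f,    c,    s,   // second column
        0.0f,   -s,    c    // third column
    };
}

// Example: the matrix for deuteranopia uses the -8.10 degree angle of Table 1.
const std::array<float, 9> rotmatDeuteranope = makeLabRotation(-8.10f);
```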

Then, create two textures, R and N. Initialize R with the reference image's RGB colors, and N as a noise texture containing, for each pixel, the x and y coordinates of a randomly chosen neighbor.
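
A minimal sketch of this setup in plain OpenGL is shown below; the texture formats, the random-number source, and whether the neighbor coordinates are stored as texel indices or normalized coordinates are assumptions that should be matched to the provided shaders.

```cpp
#include <cstdlib>
#include <vector>
#include <GL/glew.h>

// Creates R (reference image) and N (random-neighbor coordinates).
void createInputTextures(int width, int height, const float* referenceRGBA,
                         GLuint& texR, GLuint& texN) {
    // R: the reference image stored as floating-point RGBA.
    glGenTextures(1, &texR);
    glBindTexture(GL_TEXTURE_2D, texR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0,
                 GL_RGBA, GL_FLOAT, referenceRGBA);

    // N: for each pixel, the (x, y) coordinates of a randomly chosen neighbor
    // (stored here as texel indices; normalize if the shaders expect [0,1]).
    std::vector<float> noise(static_cast<size_t>(width) * height * 2);
    for (int i = 0; i < width * height; ++i) {
        noise[2 * i + 0] = static_cast<float>(std::rand() % width);
        noise[2 * i + 1] = static_cast<float>(std::rand() % height);
    }
    glGenTextures(1, &texN);
    glBindTexture(GL_TEXTURE_2D, texN);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RG32F, width, height, 0,
                 GL_RG, GL_FLOAT, noise.data());
}
```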

Run the shader "RGBToLab_Rotate.glsl" to convert colors from RGB to the CIE L*a*b* color space and rotate them according to the input rotation matrix. Set the input shader variables rgbtex and rotmat to the texture R and the rotation matrix, respectively. Save the output to a temporary texture T1.
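
All of the passes below follow the same render-to-texture pattern. Here is a minimal sketch of that pattern and of this first pass, again in plain OpenGL; the drawFullScreenQuad helper and the program/texture handle names are assumptions, while the uniform names (rgbtex, rotmat) follow the description above. It is also assumed that rotmat is declared as a mat3; adjust if the provided shader declares it differently.

```cpp
#include <GL/glew.h>

void drawFullScreenQuad();   // assumed helper: renders a screen-aligned quad

// Runs one shader pass into the given target texture via an FBO.
void runPass(GLuint program, GLuint fbo, GLuint targetTex, int w, int h) {
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, targetTex, 0);
    glViewport(0, 0, w, h);
    glUseProgram(program);
    drawFullScreenQuad();
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}

// Step 1: convert R from RGB to rotated CIE L*a*b*, writing the result to T1.
void rgbToLabRotate(GLuint prog, GLuint fbo, GLuint texR, GLuint texT1,
                    const float* rotmat, int w, int h) {
    glUseProgram(prog);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texR);
    glUniform1i(glGetUniformLocation(prog, "rgbtex"), 0);
    glUniformMatrix3fv(glGetUniformLocation(prog, "rotmat"), 1, GL_FALSE, rotmat);
    runPass(prog, fbo, texT1, w, h);
}
```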

Next, use the shader "PCA_step1.glsl" to perform the first PCA computations on the GPU. Bind T1 and N to the shader variables labtex and noisetex, respectively. Save the output to another temporary texture T2.

The shader "PCA_step2.glsl" performs a texture reduction, since it implements a summation. Bind the shader variable AtA initially to T2 and subsequently to the shader's own output as the reduction proceeds. The shader variable dimB holds the current width and height of the reduced texture area. Keep the final result in texture T2.
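
A minimal sketch of this reduction loop follows, reusing the runPass helper from the previous sketch and assuming ping-pong rendering between T2 and a scratch texture; the dimB uniform is assumed here to be an integer and the exact reduction geometry should be matched to the provided shader.

```cpp
#include <utility>

// Repeatedly halves the reduced region until the sum ends up in a single texel.
// 'dim' is the full width/height of the data being summed.
void reduceSum(GLuint prog, GLuint fbo, GLuint& texT2, GLuint& texScratch,
               const char* inputUniform, int dim) {
    glUseProgram(prog);
    int dimB = dim;
    while (dimB > 1) {
        dimB = (dimB + 1) / 2;                 // region still to be written this pass
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, texT2);   // current partial sums as input
        glUniform1i(glGetUniformLocation(prog, inputUniform), 0);
        glUniform1i(glGetUniformLocation(prog, "dimB"), dimB);
        runPass(prog, fbo, texScratch, dimB, dimB);
        std::swap(texT2, texScratch);          // the output becomes the next input
    }
    // After the loop, texT2 holds the final sum in its reduced (single-texel) region.
}

// For "PCA_step2.glsl" the input uniform is AtA:
//   reduceSum(progPCAStep2, fbo, texT2, texScratch, "AtA", imageDim);
```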

Shader "PCA_step2_5_EigenVector.glsl" takes as input parameter AtA the texture T2 that was output in the last shader. As the last shader was a somatory and the process reduces the texture and stores the result in only one texel, this shader must be called only for the texel coordinates (x=1,y=1). Output the result to a temporary texture T3.

Call the shader "PCA_step3.glsl" to project the colors onto the computed eigenvector. Bind the textures T3 and T1 to the input variables Eab and Lab, respectively. Write the result to T2.

Then, with another texture reduction, call the shader "PCA_step4.glsl" to reduce texture T2, bound to the shader variable ProjEffect. Use the variable dimB as in the previous reduction shader. Before reducing, save the original contents of T2 in a temporary texture T4.

Now, to compute the resulting colors, call the shader "PCA_step5.glsl" with the textures T2, T4, and T1 bound to the shader variables ProjEffectSum, ProjEffect, and Lab, respectively. Write the result to T3.

Finally, call the shader "Rotate_LabToRGB.glsl" to convert the final colors back to the RGB color space. Set the input parameters Lab and rotmat to the texture T3 and the inverse rotation matrix, respectively.
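
Because the matrix is a pure rotation about the L* axis, its inverse is simply the rotation by the negated angle (equivalently, the transpose of the matrix). Reusing the hypothetical helper from the Table 1 sketch:

```cpp
// Inverse rotation for the deuteranopia example (angle negated).
const std::array<float, 9> invRotmatDeuteranope = makeLabRotation(8.10f);
```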

Sample code is available in the Downloads section; it serves as an example and a guide for implementing this technique and for integrating it with visualization systems.

Reference

Citation

Gustavo M. Machado and Manuel M. Oliveira. "Real-Time Temporal-Coherent Color Contrast Enhancement for Dichromats". Computer Graphics Forum, Volume 29 (2010), Number 3, Proceedings of EuroVis 2010, pp. 933-942.

BibTeX

@article{Machado2010,
author = {Gustavo M. Machado and Manuel M. Oliveira},
title = {Real-Time Temporal-Coherent Color Contrast Enhancement for Dichromats},
journal = {Computer Graphics Forum},
volume = {29},
number = {3},
month = {June},
year = {2010},
pages = {933--942},
note = {Proceedings of EuroVis 2010}
}

Keywords

Color-Contrast Enhancement, Dichromats, Real Time, Temporal Coherence

Acknowledgments

CNPq-Brazil fellowships and grants # 200284/2009-6, 131327/2008-9, 476954/2008-8, and 305613/2007-3.