SVCam Preprocessing: Technical Details

This document covers the technical details of the preprocessing steps, not how to run the pipeline.

Approach to Semi-Automatic Preprocessing of SVCam Images

  1. Identify a representative image with a ColorChecker card in the frame (NC_1740166530)

  2. Manually extract the color checker RGB values from a demosaiced (debayered) image. Perform no other processing (e.g., gamma correction, clipping) in order to preserve the true color values.

  3. Use GIMP to get the average colorchecker square colors and save to file

    1. Load your image into GIMP

    2. Choose the color picker tool, enable “sample average”, adjust the radius, and open the “Change foreground color” window by clicking the foreground color square

    3. Copy the sampled values for each square along with the X-Rite (Macbeth) ColorChecker target values

    4. Convert the LcH samples to Lab

image-20250227-162021.png
colorchecker:
  - number: 1
    name: "dark skin"
    rgb_target: [115, 82, 68]
    Lab_target: [37.986, 13.555, 14.059]
    rgb_sample: [21.3, 23.8, 10.0]
    LcH_sample: [7.6, 7.8, 114.9]
  - number: 2
    name: "light skin"
    rgb_target: [194, 150, 130]
    Lab_target: [64.711, 18.13, 17.81]
    rgb_sample: [65.3, 78.8, 34.7]
    LcH_sample: [31.5, 26.5, 115.8]
  - number: 3
    name: "blue sky"
    rgb_target: [98, 122, 157]
    Lab_target: [49.497, -4.88, -21.925]
    rgb_sample: [25.9, 53.3, 39.9]
    LcH_sample: [19.8, 14.8, 159.7]
  - number: 4
    name: "foliage"
    rgb_target: [87, 108, 67]
    Lab_target: [43.139, -13.095, 21.905]
    rgb_sample: [15.6, 28.1, 10.8]
    LcH_sample: [8.7, 12.1, 136.2]
  - number: 5
    name: "blue flower"
    rgb_target: [133, 128, 177]
    Lab_target: [55.112, 8.844, -25.399]
    rgb_sample: [25.4, 57.2, 44.1]
    LcH_sample: [21.9, 12.8, 155.4]
  - number: 6
    name: "bluish green"
    rgb_target: [103, 189, 170]
    Lab_target: [70.719, -33.397, -0.199]
    rgb_sample: [32.7, 86.4, 47.7]
    LcH_sample: [32.4, 31.0, 146.8]
  - number: 7
    name: "orange"
    rgb_target: [214, 126, 44]
    Lab_target: [62.661, 36.067, 57.096]
    rgb_sample: [59.6, 52.1, 11.8]
    LcH_sample: [21.9, 25.4, 91.4]
  - number: 8
    name: "purplish blue"
    rgb_target: [80, 91, 166]
    Lab_target: [40.02, 10.41, -45.964]
    rgb_sample: [14.9, 31.3, 34.4]
    LcH_sample: [10.6, 7.8, 214.4]
  - number: 9
    name: "moderate red"
    rgb_target: [193, 90, 99]
    Lab_target: [51.124, 48.239, 16.248]
    rgb_sample: [52.6, 39.1, 17.6]
    LcH_sample: [16.9, 17.1, 75.6]
  - number: 10
    name: "purple"
    rgb_target: [94, 60, 108]
    Lab_target: [30.325, 22.976, -21.587]
    rgb_sample: [14.7, 15.8, 14.7]
    LcH_sample: [4.5, 0.7, 142.5]
  - number: 11
    name: "yellow green"
    rgb_target: [157, 188, 64]
    Lab_target: [72.532, -23.709, 57.255]
    rgb_sample: [38.8, 76.1, 19.1]
    LcH_sample: [28.6, 36.1, 128.9]
  - number: 12
    name: "orange yellow"
    rgb_target: [224, 163, 46]
    Lab_target: [71.941, 19.363, 67.857]
    rgb_sample: [64.6, 70.7, 14.2]
    LcH_sample: [28.6, 32.1, 105.9]
  - number: 13
    name: "blue"
    rgb_target: [56, 61, 150]
    Lab_target: [28.778, 14.179, -50.297]
    rgb_sample: [8.5, 19.0, 24.9]
    LcH_sample: [5.3, 5.7, 244.8]
  - number: 14
    name: "green"
    rgb_target: [70, 148, 73]
    Lab_target: [55.261, -38.342, 31.37]
    rgb_sample: [15.7, 42.8, 15.0]
    LcH_sample: [14.7, 21.7, 138.3]
  - number: 15
    name: "red"
    rgb_target: [175, 54, 60]
    Lab_target: [42.101, 53.378, 28.19]
    rgb_sample: [40.2, 23.9, 8.6]
    LcH_sample: [10.1, 13.7, 59.1]
  - number: 16
    name: "yellow"
    rgb_target: [231, 199, 31]
    Lab_target: [81.733, 4.039, 79.819]
    rgb_sample: [73.2, 97.3, 19.4]
    LcH_sample: [38.1, 42.3, 115.8]
  - number: 17
    name: "magenta"
    rgb_target: [187, 86, 149]
    Lab_target: [51.935, 49.986, -14.574]
    rgb_sample: [44.5, 38.3, 28.0]
    LcH_sample: [15.8, 8.2, 79.5]
  - number: 18
    name: "cyan"
    rgb_target: [8, 133, 161]
    Lab_target: [51.038, -28.631, -28.638]
    rgb_sample: [14.3, 50.5, 40.3]
    LcH_sample: [18.2, 16.0, 171.2]
  - number: 19
    name: "white (0.05*)"
    rgb_target: [243, 243, 242]
    Lab_target: [96.539, -0.425, 1.186]
    rgb_sample: [103.5, 174.8, 94.3]
    LcH_sample: [65.3, 49.0, 136.4]
  - number: 20
    name: "neutral 8 (0.23*)"
    rgb_target: [200, 200, 200]
    Lab_target: [81.257, -0.638, -0.335]
    rgb_sample: [70.0, 120.0, 66.2]
    LcH_sample: [45.8, 35.7, 137.8]
  - number: 21
    name: "neutral 6.5 (0.44*)"
    rgb_target: [160, 160, 160]
    Lab_target: [66.766, -0.734, -0.504]
    rgb_sample: [45.8, 78.7, 43.9]
    LcH_sample: [30.2, 25.0, 138.4]
  - number: 22
    name: "neutral 5 (0.70*)"
    rgb_target: [122, 122, 121]
    Lab_target: [50.867, 0.153, -0.27]
    rgb_sample: [25.3, 45.3, 24.4]
    LcH_sample: [15.7, 15.1, 138.9]
  - number: 23
    name: "neutral 3.5 (1.05*)"
    rgb_target: [85, 85, 85]
    Lab_target: [35.656, -0.421, -1.231]
    rgb_sample: [11.5, 19.1, 11.3]
    LcH_sample: [5.1, 4.8, 142.4]
  - number: 24
    name: "black (1.50*)"
    rgb_target: [52, 52, 52]
    Lab_target: [20.461, -0.079, -0.973]
    rgb_sample: [4.5, 6.7, 4.4]
    LcH_sample: [1.7, 1.1, 141.9]
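The LcH values above can be converted to Lab with a simple polar-to-Cartesian step (a = C·cos h, b = C·sin h, with h in degrees); a minimal sketch:

```python
import math

def lch_to_lab(l: float, c: float, h_deg: float) -> tuple[float, float, float]:
    """Convert an LCh(ab) color (hue in degrees, as reported by GIMP) to CIELAB."""
    h_rad = math.radians(h_deg)
    return l, c * math.cos(h_rad), c * math.sin(h_rad)

# Example: the "dark skin" sample above, LcH_sample [7.6, 7.8, 114.9]
l, a, b = lch_to_lab(7.6, 7.8, 114.9)
```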
  4. Get the standard reference values for an X-Rite ColorChecker (Macbeth) target

  5. Calculate and store a 9×9 color correction matrix

  6. Apply the color correction matrix along with the other preprocessing steps

image-20250227-160633.png
Debayered Image
image-20250227-213137.png
Color corrected image

Computing the Color Correction Matrix

  1. Extracting Reference and Measured Colors:

    • Two sets of RGB color values:

      • Reference colors (ideal or target values from a color chart).

      • Measured colors (actual values captured by the camera; see above).

    • These colors are normalized (divided by 255) and stored in matrices.

  2. Building the Design Matrix (matrix_a):

    • The measured colors (measured RGB values) are used to create a polynomial expansion, including linear (R, G, B), quadratic (R², G², B²), and cubic (R³, G³, B³) terms.

    • This accounts for nonlinear distortions in color mapping.

  3. Computing the Transformation Components (matrix_m and matrix_b):

    • matrix_m: The least-squares pseudo-inverse of matrix_a, computed as (AᵀA)⁻¹Aᵀ via np.linalg.solve, which determines the best-fit transformation.

    • matrix_b: The reference colors are also expanded into polynomial terms to match the structure of matrix_a.

  4. Computing the Transformation Matrix (transformation_matrix):

    • The transformation is applied by multiplying matrix_m with each polynomial term from the reference colors.

    • The resulting 9×9 transformation matrix defines how each color component (R, G, B) should be transformed.

  5. Saving the Matrix:

    • The final 9×9 transformation matrix is saved in a .npz file.

We compute the color correction matrix by solving a polynomial regression problem between the measured and reference ColorChecker values. The quadratic and cubic terms account for nonlinear distortions, and least squares determines the best mapping. The resulting 9×9 matrix is then applied to correct the colors.

import logging

import numpy as np
from omegaconf import DictConfig

log = logging.getLogger(__name__)


def get_transformation_components(target_matrix: np.ndarray, source_matrix: np.ndarray) -> tuple[np.ndarray, np.ndarray, np.ndarray]:
    """Calculate components required for generating a transformation matrix.

    Adapted from:
    https://github.com/danforthcenter/plantcv/blob/main/plantcv/plantcv/transform/color_correction.py

    The first column of each input matrix is assumed to be an identifier.

    Parameters:
        target_matrix (np.ndarray): Reference color matrix.
        source_matrix (np.ndarray): Measured color matrix.

    Returns:
        tuple: (matrix_a, matrix_m, matrix_b) used for calculating the transformation.
    """
    _, t_r, t_g, t_b = np.split(target_matrix, 4, axis=1)
    _, s_r, s_g, s_b = np.split(source_matrix, 4, axis=1)

    # Build the design matrix using powers of the source colors.
    matrix_a = np.hstack([s_r, s_g, s_b, s_r**2, s_g**2, s_b**2, s_r**3, s_g**3, s_b**3])

    # Compute the pseudo-inverse (least-squares solution) of the design matrix.
    matrix_m = np.linalg.solve(matrix_a.T @ matrix_a, matrix_a.T)

    # Build the target matrix with powers, grouped per channel.
    matrix_b = np.hstack([t_r, t_r**2, t_r**3, t_g, t_g**2, t_g**3, t_b, t_b**2, t_b**3])
    return matrix_a, matrix_m, matrix_b


def calc_transformation_matrix(matrix_m: np.ndarray, matrix_b: np.ndarray) -> tuple[float, np.ndarray]:
    """Calculate the transformation matrix and its deviance."""
    t_r, t_r2, t_r3, t_g, t_g2, t_g3, t_b, t_b2, t_b3 = np.split(matrix_b, 9, axis=1)

    # Multiply each target color column (one per polynomial term) by matrix_m.
    red = np.matmul(matrix_m, t_r)
    green = np.matmul(matrix_m, t_g)
    blue = np.matmul(matrix_m, t_b)
    red2 = np.matmul(matrix_m, t_r2)
    green2 = np.matmul(matrix_m, t_g2)
    blue2 = np.matmul(matrix_m, t_b2)
    red3 = np.matmul(matrix_m, t_r3)
    green3 = np.matmul(matrix_m, t_g3)
    blue3 = np.matmul(matrix_m, t_b3)

    # Concatenate the product columns into the 9x9 transformation matrix.
    transformation_matrix = np.concatenate(
        (red, green, blue, red2, green2, blue2, red3, green3, blue3), axis=1
    )

    # Deviance is the transformation matrix's deviation from determinant 1.
    t_det = np.linalg.det(transformation_matrix)
    return 1 - t_det, transformation_matrix


def compute_transformation_matrix(cfg: DictConfig) -> np.ndarray:
    """Compute and save the transformation matrix based on configuration values.

    Reads the reference and measured colors from the configuration, computes
    the transformation matrix, saves it to disk, and returns it.

    Parameters:
        cfg (DictConfig): Hydra configuration with color checker information.

    Returns:
        np.ndarray: The computed 9x9 transformation matrix.
    """
    reference_colors = np.array(
        [[ref["number"]] + [x / 255.0 for x in ref["rgb_target"]] for ref in cfg.ccm]
    )
    measured_colors = np.array(
        [[meas["number"]] + [x / 255.0 for x in meas["rgb_sample"]] for meas in cfg.ccm]
    )
    _, matrix_m, matrix_b = get_transformation_components(reference_colors, measured_colors)
    deviance, transformation_matrix = calc_transformation_matrix(matrix_m, matrix_b)
    log.info(f"Transformation matrix deviance: {deviance:.6f}")
    return transformation_matrix

Preprocessing Steps

Demosaicing

Demosaicing reconstructs a full-color image from Bayer-filtered sensor data by interpolating the missing color information at each pixel. This step converts the raw sensor data into an RGB image while reducing interpolation artifacts. We use OpenCV's cv2.cvtColor with the edge-aware (_EA) demosaicing variant:

demosaiced = cv2.cvtColor(raw_array, cv2.COLOR_BayerBG2RGB_EA)
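To illustrate what the raw Bayer data looks like before demosaicing, a mosaic can be simulated with plain NumPy. This sketch is an illustration only (not OpenCV's edge-aware algorithm), and the 2×2 BGGR tile layout shown is an assumed convention; the actual pattern depends on the sensor's alignment.

```python
import numpy as np

def mosaic_bggr(rgb: np.ndarray) -> np.ndarray:
    """Simulate a BGGR Bayer mosaic from a full RGB image (illustration only).

    Assumed pattern per 2x2 tile:  B G
                                   G R
    """
    h, w, _ = rgb.shape
    raw = np.zeros((h, w), dtype=rgb.dtype)
    raw[0::2, 0::2] = rgb[0::2, 0::2, 2]  # blue sites
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites (even rows)
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites (odd rows)
    raw[1::2, 1::2] = rgb[1::2, 1::2, 0]  # red sites

    return raw

# A uniform-color image: each pixel of the mosaic keeps exactly one channel.
rgb = np.full((4, 4, 3), fill_value=(200, 120, 60), dtype=np.uint16)
raw = mosaic_bggr(rgb)
```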

Applying Gamma Correction

Gamma correction is applied to adjust the brightness and contrast of the image. We use power-law gamma correction, where the pixel values are adjusted using the formula:

gamma = 1.1
image_normalized = raw_array / 65535.0
gamma_corrected = np.power(image_normalized, 1 / gamma)
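For illustration, applying the same power law to a few sample 16-bit values shows that black and white are preserved while mid-tones brighten (gamma > 1 gives an exponent 1/gamma < 1):

```python
import numpy as np

gamma = 1.1
raw_array = np.array([[0, 16384, 65535]], dtype=np.uint16)

# Normalize to [0, 1] and apply the power-law correction.
image_normalized = raw_array / 65535.0
gamma_corrected = np.power(image_normalized, 1 / gamma)

# 0.0 and 1.0 map to themselves; the mid-tone 0.25 maps to ~0.28 (brighter).
```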

Applying Color Correction

Color correction ensures that the image aligns with expected color profiles by adjusting each color channel. This is done using a transformation matrix, which corrects distortions introduced by the camera sensor and lighting conditions.

The color correction matrix is applied by transforming each pixel’s color values using a polynomial function which corrects color distortions nonlinearly, making sure the camera's captured colors match a known reference.

def apply_transformation_matrix(source_img: np.ndarray, transformation_matrix: np.ndarray) -> np.ndarray:
    """Apply a color transformation matrix to correct the color space of an RGB image.

    The transformation uses polynomial (up to cubic) terms on each color channel.

    Parameters:
        source_img (np.ndarray): Input RGB image.
        transformation_matrix (np.ndarray): A 9x9 matrix containing transformation coefficients.

    Returns:
        np.ndarray: The color-corrected image, or None if input validation fails.
    """
    if transformation_matrix.shape != (9, 9):
        log.error("Transformation matrix must be a 9x9 matrix.")
        return None
    if source_img.ndim != 3:
        log.error("Source image must be an RGB image.")
        return None

    # Split the transformation matrix into channel-specific coefficient columns.
    red, green, blue, *_ = np.split(transformation_matrix, 9, axis=1)
    source_r, source_g, source_b = cv2.split(source_img)

    # Compute the polynomial terms (up to third power) for each channel.
    source_r2, source_r3 = source_r ** 2, source_r ** 3
    source_g2, source_g3 = source_g ** 2, source_g ** 3
    source_b2, source_b3 = source_b ** 2, source_b ** 3

    # Apply the transformation for each output channel.
    b = (source_r * blue[0] + source_g * blue[1] + source_b * blue[2]
         + source_r2 * blue[3] + source_g2 * blue[4] + source_b2 * blue[5]
         + source_r3 * blue[6] + source_g3 * blue[7] + source_b3 * blue[8])
    g = (source_r * green[0] + source_g * green[1] + source_b * green[2]
         + source_r2 * green[3] + source_g2 * green[4] + source_b2 * green[5]
         + source_r3 * green[6] + source_g3 * green[7] + source_b3 * green[8])
    r = (source_r * red[0] + source_g * red[1] + source_b * red[2]
         + source_r2 * red[3] + source_g2 * red[4] + source_b2 * red[5]
         + source_r3 * red[6] + source_g3 * red[7] + source_b3 * red[8])

    corrected_img = cv2.merge([r, g, b])
    return corrected_img
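A quick sanity check of the coefficient ordering (a NumPy-only sketch, not the pipeline function): applying the 9×9 identity matrix must return the image unchanged, since the first three columns then select only the linear R, G, B terms.

```python
import numpy as np

def apply_ccm(img: np.ndarray, m: np.ndarray) -> np.ndarray:
    """Vectorized equivalent of the per-channel polynomial correction.

    img: float RGB image in [0, 1]; m: 9x9 coefficient matrix whose first
    three columns hold the R, G, and B output coefficients.
    """
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    # Polynomial terms in the same order as the design matrix:
    # [r, g, b, r^2, g^2, b^2, r^3, g^3, b^3].
    terms = np.stack([r, g, b, r**2, g**2, b**2, r**3, g**3, b**3], axis=-1)
    # Each output channel is the dot product of the 9 polynomial terms
    # with that channel's coefficient column.
    return terms @ m[:, :3]

identity = np.eye(9)
img = np.random.default_rng(1).uniform(0, 1, size=(4, 4, 3))
corrected = apply_ccm(img, identity)
```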

image-20250228-142132.png
Raw image
image-20250228-142047.png
Debayered
image-20250228-142028.png
Gamma corrected
image-20250228-142327.png
Color corrected


Sample of Preprocessing results for NC_2025-02-21

NC_17401632150_resized.jpg
JPEG quality reduced by half for viewing in Confluence
