
Grayscale Hbitmap With Python Ctypes

I have PIL images that I am trying to convert to a grayscale HBITMAP in ctypes. I have minimal knowledge of ctypes, C, or dealing with HBITMAPs. I cobbled together code from various sources.

Solution 1:

@OP: What broke your Python code was the line:

('bmiColors', ctypes.POINTER(RGBQUAD))

Use instead:

('bmiColors', RGBQUAD * 256)

GDI expects the 256-entry palette to sit inline in memory, directly after the BITMAPINFOHEADER; a ctypes POINTER field stores only an address at that spot, so the color table never reaches GDI.

initialize like this:

bmi = BITMAPINFO(BITMAPINFOHEADER(sizeof(BITMAPINFOHEADER), 0, 0, 1, 8, 0, 0, 0, 0, 0, 0),
                 (RGBQUAD * 256)(*[RGBQUAD(i, i, i, 0) for i in range(256)]))

and set bmi.bmiHeader.biWidth and bmi.bmiHeader.biHeight whenever necessary, as in the snippet just below.
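For example, with img_width and img_height standing in for your image dimensions (a positive biHeight means a bottom-up DIB; a negative value makes GDI treat the first buffer row as the top row):

bmi.bmiHeader.biWidth = img_width
bmi.bmiHeader.biHeight = -img_height  # negative: top-down row order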

Notes about using this in Python with ctypes:

  • Set .argtypes for every C function you import, wherever possible. Not doing so can throw exceptions, even when everything appears to be in order.
  • Use classes and initialize BITMAPINFO like below (note: the code is incomplete!):
import ctypes
from ctypes import c_ubyte, c_int, c_uint, c_void_p, POINTER, byref, sizeof
from ctypes.wintypes import WORD, DWORD, LONG, HDC, BOOL
import numpy as np

class RGBQUAD(ctypes.Structure):
    # field order matches wingdi.h: blue, green, red, reserved
    _fields_ = [
        ('rgbBlue', c_ubyte),
        ('rgbGreen', c_ubyte),
        ('rgbRed', c_ubyte),
        ('rgbReserved', c_ubyte)
    ]
class BITMAPINFOHEADER(ctypes.Structure):
    _fields_ = [
        ('biSize', DWORD),
        ('biWidth', LONG),
        ('biHeight', LONG),
        ('biPlanes', WORD), # 1
        ('biBitCount', WORD), # 8
        ('biCompression', DWORD), # BI_RGB = 0 for uncompressed format
        ('biSizeImage', DWORD), # 0
        ('biXPelsPerMeter', LONG), # 0
        ('biYPelsPerMeter', LONG), # 0
        ('biClrUsed', DWORD), # 0
        ('biClrImportant', DWORD) # 0
    ]
class BITMAPINFO(ctypes.Structure):
    _fields_ = [
        ('bmiHeader', BITMAPINFOHEADER),
        ('bmiColors', RGBQUAD * 256)
    ]

SetDIBitsToDevice = ctypes.windll.Gdi32.SetDIBitsToDevice
SetDIBitsToDevice.restype = BOOL # 0 if failed
SetDIBitsToDevice.argtypes = [HDC, c_int, c_int, DWORD, DWORD, c_int, c_int, c_uint, c_uint, c_void_p, POINTER(BITMAPINFO), c_uint]

bmi = BITMAPINFO(BITMAPINFOHEADER(sizeof(BITMAPINFOHEADER), 0, 0, 1, 8, 0, 0, 0, 0, 0, 0),
                 (RGBQUAD * 256)(*[RGBQUAD(i, i, i, 0) for i in range(256)]))

SLM_HDC = CreateDC(None, monitor.info.szDevice, None, None)
data = np.array(...).astype(np.uint8)
data_p = data.ctypes.data_as(c_void_p)
SetDIBitsToDevice(SLM_HDC,
                  0, 0,
                  monitor.width(), monitor.height(),
                  0, 0,
                  0, monitor.height(),
                  data_p, byref(bmi), 0)
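To connect this back to the question's PIL images, here is a minimal sketch of how CreateDC could be declared with full .argtypes (per the note above) and fed the pixel buffer of a PIL image. The file name and the "DISPLAY" device (a DC covering the primary display) are illustrative assumptions, and the sketch assumes the image width is a multiple of 4, since GDI requires each DIB row to be padded to a DWORD boundary:

from ctypes.wintypes import LPCWSTR
from PIL import Image

CreateDC = ctypes.windll.Gdi32.CreateDCW
CreateDC.restype = HDC
CreateDC.argtypes = [LPCWSTR, LPCWSTR, LPCWSTR, c_void_p]  # last arg: DEVMODEW*, passed as void*

img = Image.open("example.png").convert("L")  # hypothetical input; "L" = 8-bit grayscale
assert img.width % 4 == 0                     # DIB rows must be DWORD-aligned
bmi.bmiHeader.biWidth = img.width
bmi.bmiHeader.biHeight = -img.height          # negative: buffer rows run top-down
data = np.ascontiguousarray(img, dtype=np.uint8)
data_p = data.ctypes.data_as(c_void_p)
hdc = CreateDC("DISPLAY", None, None, None)   # DC for the primary display
SetDIBitsToDevice(hdc, 0, 0, img.width, img.height,
                  0, 0, 0, img.height,
                  data_p, byref(bmi), 0)      # 0 = DIB_RGB_COLORS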

As for a complete way to do it in C++, here is a code example that creates an 8-bit grayscale DIB and draws it on the primary monitor. Compile it into an .exe and run it, and you will see a diagonal grayscale pattern on your primary monitor. Explanations follow below.

#include <cstdlib>
#include <iostream>
#include <malloc.h>
#include <windows.h>

// tell linker where to resolve external dependencies
#pragma comment(lib, "User32.lib")
#pragma comment(lib, "Gdi32.lib")

BITMAPINFO* CreateGreyscaleBITMAPINFO_P(int width, int height) {
    BITMAPINFO* pbmi = (BITMAPINFO*) std::malloc(offsetof(BITMAPINFO, bmiColors[256]));
    pbmi->bmiHeader.biSize = sizeof(pbmi->bmiHeader);
    pbmi->bmiHeader.biWidth = width;
    pbmi->bmiHeader.biHeight = height;
    pbmi->bmiHeader.biPlanes = 1;
    pbmi->bmiHeader.biBitCount = 8;
    pbmi->bmiHeader.biCompression = BI_RGB;
    pbmi->bmiHeader.biSizeImage = 0;
    pbmi->bmiHeader.biXPelsPerMeter = 0;
    pbmi->bmiHeader.biYPelsPerMeter = 0;
    pbmi->bmiHeader.biClrUsed = 0;
    pbmi->bmiHeader.biClrImportant = 0;
    for(int i=0; i<256; i++) {
        pbmi->bmiColors[i].rgbRed = (BYTE)i;
        pbmi->bmiColors[i].rgbGreen = (BYTE)i;
        pbmi->bmiColors[i].rgbBlue = (BYTE)i;
        pbmi->bmiColors[i].rgbReserved = (BYTE)0;
    }
    return pbmi;
}

int main(int argc, char** argv) {
    // to identify screen resolution correctly
    SetProcessDPIAware();

    // get HWND of full primary monitor to retrieve screen resolution and get HDC for drawing
    HWND desktop_HWND = GetDesktopWindow();
    LPRECT desktop_RECT = new RECT();
    if(GetWindowRect(desktop_HWND, desktop_RECT) == 0) { return 0; }
    int width = std::abs(desktop_RECT -> right - desktop_RECT -> left);
    int height = std::abs(desktop_RECT -> bottom - desktop_RECT -> top);
    HDC desktop_DC = GetDC(desktop_HWND);

    // define array with linearly increasing pixel value along the diagonal x=y
    // pixels have 8-bit grayscale values from 0 (black) to 255 (white)
    BYTE* array = (BYTE*) std::malloc(sizeof(BYTE) * width * height);
    for(int i=0; i<height; i++) {
        for(int j=0; j<width; j++) {
            array[i*width + j] = ((j + i) % 256);
        }
    }

    // initialize a BITMAPINFO instance and draw on desktop with SetDIBitsToDevice()
    BITMAPINFO* bmip = CreateGreyscaleBITMAPINFO_P(width, height);
    int result = SetDIBitsToDevice(
        desktop_DC,
        0, 0, width, height,
        0, 0, 0, height,
        array, bmip, DIB_RGB_COLORS
        );

    // print out for debugging
    std::cout << "primary monitor resolution: " << width << " (width) x " << height << " (height)" << std::endl;
    std::cout << "naive BITMAPINFO length (BYTES): " << sizeof(BITMAPINFOHEADER) + sizeof(RGBQUAD)*256
    << " vs. with windows macro offsetof(): " << offsetof(BITMAPINFO, bmiColors[256]) << std::endl;
    std::cout << "bmiHeader.biSize: " << bmip->bmiHeader.biSize << std::endl;
    std::cout << "number of lines drawn on monitor: " << result << std::endl;
    // clean up (see the note on malloc()/free() below)
    std::free(array);
    std::free(bmip);
    ReleaseDC(desktop_HWND, desktop_DC);
    delete desktop_RECT;
    return 0;
}
  • Use malloc() to allocate the BITMAPINFO. Other examples have used alloca(), which allocates on the stack, so the BITMAPINFO is reclaimed as soon as the allocating function returns and before the DIB is drawn on the screen. If you are a physicist like me and don't give a hoot about programming details, always use malloc() and remember to manually free() the memory afterwards.
  • I have no idea what is happening with the HDC handles, here and in general.
  • When setting the bmiColors, casting the integer counter i to (BYTE) looks unclean, but should be safe for i < 256. I did this to prevent a compiler warning about information loss.
  • offsetof(BITMAPINFO, bmiColors[256]) and sizeof(BITMAPINFOHEADER) + sizeof(RGBQUAD)*256 give me the same result for 8 bits per pixel (see the Python check after this list).
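The same sanity check works on the Python side with ctypes.sizeof(), using the structures defined above:

assert ctypes.sizeof(BITMAPINFO) == ctypes.sizeof(BITMAPINFOHEADER) + 256 * ctypes.sizeof(RGBQUAD)  # 40 + 1024 bytes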

Application idea: This might be valuable to people drawing grayscale images on pixelated devices such as Liquid Crystal on Silicon spatial light modulators (LCOS SLMs). An SLM driver like this eliminates the need for an additional thread/process running a window on the SLM. A speed comparison with a PyQt5 window running in a separate process (multiprocessing) on the SLM showed a delay roughly 10 ms lower, probably because no cross-process communication was necessary. I am mentioning this for the search engines.
