I was wondering whether SDL_gpu supports floating point framebuffer objects.
I'm working on a remake of an older game in C/C++, and I'm using SDL_gpu so that I can use shaders. However, when I draw with the shader of my choice (ntsc-adaptive from libretro), everything has a pink tint (left - shader off, right - shader on):
I learned that this might be a side effect of not using a floating point FBO. Does SDL_gpu support this in some way? When calling GPU_CreateImage I can pick the format, but whichever SDL_gpu format I choose, the texture seems to end up with internal format GL_RGBA and type GL_UNSIGNED_BYTE under the hood (and I'm assuming I need GL_RGBA32F/GL_FLOAT).
That is correct. SDL_gpu assumes GL_UNSIGNED_BYTE. Floating point framebuffers would probably be a good feature to expose, as lacking it limits things like HDR shaders. I can't tell if that's your actual issue here, but it is an issue.
I see, thanks. SDL_gpu also doesn't seem to handle more than 4 bytes per pixel under the hood, so quickly dropping in GL_RGBA32F/GL_FLOAT isn't going to work.
I can confirm the issue was indeed the lack of floating point framebuffer object support in SDL_gpu. I implemented that functionality (GL_RGBA32F support) myself and it's all working now.
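For anyone attempting the same patch, the core of it is replacing the hard-coded GL_RGBA8/GL_UNSIGNED_BYTE pair with a per-format lookup. The sketch below shows that mapping in isolation; the enum and struct names are hypothetical, not SDL_gpu's actual internals, and the GL constants are the standard values from the OpenGL headers:

```c
/* GL enum values (stable, copied from the OpenGL headers). */
#define GL_UNSIGNED_BYTE 0x1401
#define GL_FLOAT         0x1406
#define GL_RGBA8         0x8058
#define GL_RGBA32F       0x8814

/* Hypothetical image-format enum standing in for SDL_gpu's format list. */
typedef enum { IMG_FORMAT_RGBA8, IMG_FORMAT_RGBA32F } ImageFormat;

typedef struct {
    unsigned internal_format; /* glTexImage2D internalformat argument */
    unsigned pixel_type;      /* glTexImage2D type argument */
} GLFormatInfo;

/* Pick the GL internal format and pixel type per image format instead of
 * assuming 8-bit channels everywhere. */
static GLFormatInfo gl_format_for(ImageFormat fmt)
{
    GLFormatInfo info = { GL_RGBA8, GL_UNSIGNED_BYTE };
    if (fmt == IMG_FORMAT_RGBA32F) {
        info.internal_format = GL_RGBA32F;
        info.pixel_type = GL_FLOAT; /* 16 bytes per pixel instead of 4 */
    }
    return info;
}
```

Note that a real patch also has to touch every place that assumes 4 bytes per pixel (row pitch, readback buffers, and so on), per the comment above.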