Ellie
I just tripped over this: when requesting 3 channels at 8 bits each, stb_image returns the pixels as 3 bytes per pixel with no alignment, so 4 pixels occupy 12 bytes with no padding (bytes 0...2, 3...5, 6...8, and 9...11). I would naively have expected this to be called RGB888 or BGR888, since there is no "dead" unused byte as I would expect for something called e.g. RGBX8888.
However, SDL2's SDL_PIXELFORMAT_BGR888 uses 4 bytes per pixel, the same as SDL_PIXELFORMAT_BGRX8888, even though the latter looks like the longer storage format - which it isn't, internally. It's just swapped, with byte order X, B, G, R (instead of B, G, R, X). So why isn't the macro name also swapped, to "XBGR8888" instead of just "BGR888"?
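For concreteness, a minimal sketch (the file name and error handling are illustrative) of wrapping the tightly packed stb_image buffer in an SDL surface; the format that matches this layout is the 24-bit SDL_PIXELFORMAT_RGB24, whereas the "888" names are 32-bit formats with an unused byte:

#include <SDL.h>
#include "stb_image.h"

/* Wrap a tightly packed 3-bytes-per-pixel buffer from stb_image in an SDL
   surface. The pitch is w * 3: no padding byte per pixel or per row.
   Note that the surface does not own the pixels; the caller must keep the
   buffer alive and free it with stbi_image_free() when done. */
SDL_Surface *LoadPacked24(const char *path)
{
    int w, h, n;
    unsigned char *pixels = stbi_load(path, &w, &h, &n, 3);
    if (!pixels) {
        return NULL;
    }
    return SDL_CreateRGBSurfaceWithFormatFrom(pixels, w, h, 24, w * 3,
                                              SDL_PIXELFORMAT_RGB24);
}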
I find the formats therefore named inconsistently, and unless there is a reason for this I suggest these changes:
1. deprecate SDL_PIXELFORMAT_BGR888 in favor of a new SDL_PIXELFORMAT_XBGR8888
and
2. deprecate SDL_PIXELFORMAT_RGB888 in favor of a new SDL_PIXELFORMAT_XRGB8888
JackBoosY
In src/video/winrt/SDL_winrtgamebar.cpp line 55:
virtual HRESULT STDMETHODCALLTYPE add_VisibilityChanged(
    __FIEventHandler_1_IInspectable *handler,
    Windows::Foundation::EventRegistrationToken *token) = 0;
The macro __FIEventHandler_1_IInspectable is defined in windows.foundation.h (Windows 10 SDK 10.0.17763.0) at line 3576:
#define __FIVector_1_Windows__CFoundation__CPoint ABI::Windows::Foundation::Collections::__FIVector_1_Windows__CFoundation__CPoint_t
but no longer exists in Windows 10 SDK 10.0.19041.0.
After searching for this macro in the SDK include path, I found that it is defined in many header files, but it should be replaced in windows.system.h.
On some systems, GetClipCursor() impacts performance when called frequently, so only call it every once in a while to make sure we haven't lost our capture.
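A minimal sketch of that throttling idea (the helper, the "expected" rect parameter, and the 3-second interval are illustrative, not the actual SDL source):

#include <windows.h>
#include <SDL.h>

/* Re-check the clip rectangle at most every few seconds instead of on every
   mouse update; "expected" is the rect we believe the cursor is clipped to. */
static void MaybeRecheckClipCursor(const RECT *expected)
{
    static Uint32 last_check;           /* timestamp of the last poll, in ms */
    const Uint32 interval_ms = 3000;    /* illustrative threshold */
    const Uint32 now = SDL_GetTicks();
    RECT current;

    if (now - last_check < interval_ms) {
        return;                         /* skip the expensive call */
    }
    last_check = now;

    if (GetClipCursor(&current) && !EqualRect(&current, expected)) {
        ClipCursor(expected);           /* capture was lost; restore it */
    }
}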
Manuel Alfayate Corchete
I'm trying to build SDL2 with threads support here on GNU/Linux, both x86 and ARM, and it does not seem to be possible at the moment:
/home/manuel/src/SDLLLL/src/core/linux/SDL_threadprio.c:233:26: error: 'rtkit_max_realtime_priority' undeclared (first use in this function)
Manuel Alfayate Corchete
This patch is needed so that programs doing the following work as expected:
1) Start in a different video mode than the mode used by the system and then...
2) Try to go fullscreen with the mode originally used by the system via SDL_SetWindowFullscreen() with the SDL_WINDOW_FULLSCREEN_DESKTOP flag.
An example would be pt2-clone in https://github.com/8bitbubsy/pt2-clone.
This program does this:
Starts with:
video.window = SDL_CreateWindow("", SDL_WINDOWPOS_CENTERED,
                                SDL_WINDOWPOS_CENTERED, screenW, screenH, windowFlags);
and then, *IF* the user has configured it in fullscreen mode in its .ini, it tries to go fullscreen with the desktop mode:
SDL_SetWindowFullscreen(video.window, SDL_WINDOW_FULLSCREEN_DESKTOP);
This sequence of operations currently fails: SDL_SetDisplayModeForDisplay() in SDL_video.c fails because display->desktop_mode is never initialized with its correct value, so SetDisplayMode() in SDL_kmsdrmvideo.c cannot set the mode; it detects that the mode has a driverdata of 0x0 ("if (!modedata)") and rightfully returns an error.
So, the included patch fixes this small problem, and programs that first change the video mode and then try to go fullscreen with the system video mode will now work.
The patch simply fixes a small omission, but it's really needed now that dynamic video mode changing has been implemented in the KMSDRM backend.
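For reference, a self-contained sketch of the sequence in question (window size, delay, and logging are arbitrary):

#include <SDL.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        return 1;
    }

    /* Create a window at a size different from the desktop mode... */
    SDL_Window *window = SDL_CreateWindow("repro", SDL_WINDOWPOS_CENTERED,
                                          SDL_WINDOWPOS_CENTERED, 640, 480, 0);

    /* ...then ask for desktop fullscreen. Before the patch this failed on
       KMSDRM because display->desktop_mode had no driverdata. */
    if (SDL_SetWindowFullscreen(window, SDL_WINDOW_FULLSCREEN_DESKTOP) != 0) {
        SDL_Log("SDL_SetWindowFullscreen failed: %s", SDL_GetError());
    }

    SDL_Delay(2000);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}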
Ryan C. Gordon
As discussed here:
https://discourse.libsdl.org/t/question-about-implementation-of-sdl-updatewindowsurfacerects/27561
"As you can see this function [WIN_UpdateWindowFramebuffer, in src/video/windows/SDL_windowsframebuffer.c] calls BitBlt on entire screen, even though it accepts the rects. Rects variable is not even used in this function at all. Now my question is why is that the case?"
Anthony Pesch
I was looking into my own input bug and noticed an issue while reading over the HIDAPI code. I don't have a controller that goes down this path to test and try to provoke the issue, but it looks pretty straightforward.
The memmove to shift the joystick id array on disconnect isn't scaling the size by sizeof(SDL_JoystickID), likely corrupting the ids on disconnect.
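An illustrative sketch of the fix (the array and counter names are assumptions, not the exact HIDAPI source):

#include <SDL.h>

/* Remove entry i from a packed array of joystick ids. The byte count passed
   to SDL_memmove must be scaled by sizeof(SDL_JoystickID), not left as a
   plain element count. */
static void RemoveJoystickAt(SDL_JoystickID *joysticks, int *num_joysticks, int i)
{
    SDL_memmove(&joysticks[i], &joysticks[i + 1],
                (*num_joysticks - i - 1) * sizeof(SDL_JoystickID));
    --*num_joysticks;
}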
Giovanni Bajo
The CMake build system supports several audio frameworks for Linux: one of them is sndio.
All frameworks can be built with "runtime linking" (that is, using dlopen to load the library at runtime). In sdlchecks.cmake, there's code to do the same with sndio:
=================================================================
# Requires:
# - n/a
# Optional:
# - SNDIO_SHARED opt
# - HAVE_DLOPEN opt
macro(CheckSNDIO)
  if(SNDIO)
    # TODO: set include paths properly, so the sndio headers are found
    check_include_file(sndio.h HAVE_SNDIO_H)
    find_library(D_SNDIO_LIB sndio)
    if(HAVE_SNDIO_H AND D_SNDIO_LIB)
      set(HAVE_SNDIO TRUE)
      file(GLOB SNDIO_SOURCES ${SDL2_SOURCE_DIR}/src/audio/sndio/*.c)
      set(SOURCE_FILES ${SOURCE_FILES} ${SNDIO_SOURCES})
      set(SDL_AUDIO_DRIVER_SNDIO 1)
      if(SNDIO_SHARED)
        if(NOT HAVE_DLOPEN)
          message_warn("You must have SDL_LoadObject() support for dynamic sndio loading")
        else()
          FindLibraryAndSONAME("sndio")
          set(SDL_AUDIO_DRIVER_SNDIO_DYNAMIC "\"${SNDIO_LIB_SONAME}\"")
          set(HAVE_SNDIO_SHARED TRUE)
        endif()
      else()
        list(APPEND EXTRA_LIBS ${D_SNDIO_LIB})
      endif()
      set(HAVE_SDL_AUDIO TRUE)
    endif()
  endif()
endmacro()
=================================================================
The feature is gated by an option called SNDIO_SHARED, and it is fully implemented in SDL_sndioaudio.c.
Unfortunately, it seems there is a missing line in CMakeLists.txt, so SNDIO_SHARED is not defined:
======================================================================
set_option(ALSA "Support the ALSA audio API" ${UNIX_SYS})
dep_option(ALSA_SHARED "Dynamically load ALSA audio support" ON "ALSA" OFF)
set_option(JACK "Support the JACK audio API" ${UNIX_SYS})
dep_option(JACK_SHARED "Dynamically load JACK audio support" ON "JACK" OFF)
set_option(ESD "Support the Enlightened Sound Daemon" ${UNIX_SYS})
dep_option(ESD_SHARED "Dynamically load ESD audio support" ON "ESD" OFF)
set_option(PULSEAUDIO "Use PulseAudio" ${UNIX_SYS})
dep_option(PULSEAUDIO_SHARED "Dynamically load PulseAudio support" ON "PULSEAUDIO" OFF)
set_option(ARTS "Support the Analog Real Time Synthesizer" ${UNIX_SYS})
dep_option(ARTS_SHARED "Dynamically load aRts audio support" ON "ARTS" OFF)
set_option(NAS "Support the NAS audio API" ${UNIX_SYS})
set_option(NAS_SHARED "Dynamically load NAS audio API" ${UNIX_SYS})
set_option(SNDIO "Support the sndio audio API" ${UNIX_SYS})
set_option(FUSIONSOUND "Use FusionSound audio driver" OFF)
dep_option(FUSIONSOUND_SHARED "Dynamically load fusionsound audio support" ON "FUSIONSOUND" OFF)
======================================================================
You can see that all frameworks define a "dep_option" NAME_SHARED, and SNDIO is the only one where the option is missing.
This means that runtime loading of sndio is never activated. If sndio is found at configuration time, it is always activated in "linked" mode, so the final binary has a load-time dependency on libsndio. This is unfortunate.
To fix the problem, it is sufficient to add this line:
dep_option(SNDIO_SHARED "Dynamically load the sndio audio API" ON "SNDIO" OFF)
I've verified that this fixes the bug, and sndio can now be dynamically loaded as expected.
It currently behaves like a locking key which is pressed
when Caps Lock is enabled and released when disabled. This
means that apps that trigger events on Caps Lock key down will
only fire these events every other time Caps Lock is pressed.
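As an illustration of the symptom (hypothetical application code, not part of SDL):

#include <SDL.h>

/* An app that reacts to Caps Lock key-down events. With the locking-key
   behavior described above, the SDL_KEYDOWN only arrives when Caps Lock is
   being enabled, i.e. on every other physical press. */
static void PollCapsLock(void)
{
    SDL_Event e;
    while (SDL_PollEvent(&e)) {
        if (e.type == SDL_KEYDOWN &&
            e.key.keysym.scancode == SDL_SCANCODE_CAPSLOCK) {
            SDL_Log("Caps Lock pressed");
        }
    }
}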
Lacky
It looks like a refactoring of the SDL2 internal API has broken SDL_RenderFillRect for DirectFB. In the new version, SDL_RenderFillRect returns 0, but the rectangle is not visible.
Replacing "count" with "len" in the argument list for SDL_memcpy in DirectFB_QueueFillRects fixes the problem.
Jan Bujak
I wrote a new driver for my gamepad on Linux. I'd like SDL to support it out of the box, as currently it just treats it as a generic joystick instead of a gamepad. From what I can see, the only way to do that is to either 1) pick one of the already supported controllers' PID, VID and button layouts and have my driver send that (effectively lying that it's something else), or 2) submit a preconfigured, hardcoded mapping to SDL.
Both of those, in my opinion, are silly when we already have the Linux Gamepad Specification which standardizes this:
https://www.kernel.org/doc/html/v4.15/input/gamepad.html
Unfortunately SDL doesn't make use of it currently, so I've taken it upon myself to add it; the patch is in the attachments.
Basically, what the patch does is: if SDL finds no built-in controller mapping for a given joystick, it asks the joystick backend to autodetect one, and that uses the relevant evdev bits to figure out which button/axis is which. (See the spec for more details.)
With this patch applied, my own driver for my controller works out of the box with SDL, with no extra configuration, and is correctly recognized as a gamepad; this will also be the case for any other driver which follows the Linux Gamepad Specification.
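To give a flavor of the probing the spec enables, here is a rough sketch (the helper is hypothetical and not the patch itself; fd is an open /dev/input/eventN descriptor):

#include <linux/input.h>
#include <sys/ioctl.h>
#include <string.h>

/* A device that reports BTN_GAMEPAD (== BTN_SOUTH) in its key bitmap can be
   treated as a gamepad, and its other BTN_ and ABS_ codes mapped according
   to their standardized meanings from the Linux Gamepad Specification. */
static int LooksLikeGamepad(int fd)
{
    unsigned long keybits[(KEY_MAX / (8 * sizeof(unsigned long))) + 1];

    memset(keybits, 0, sizeof(keybits));
    if (ioctl(fd, EVIOCGBIT(EV_KEY, sizeof(keybits)), keybits) < 0) {
        return 0;
    }
    /* Test the bit for BTN_GAMEPAD in the key bitmap. */
    return (keybits[BTN_GAMEPAD / (8 * sizeof(unsigned long))] >>
            (BTN_GAMEPAD % (8 * sizeof(unsigned long)))) & 1;
}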
Arithmetic operations promote Uint8 to signed int. If the top bit of a
Uint8 is set, and it is left shifted 24 places, then the result is not
representable in a signed 32 bit int. This would be undefined behaviour
on systems where int is 32 bits.
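A small illustration of the issue and the usual fix (the helper name is made up):

#include <SDL.h>

/* "b" is promoted to (signed) int before the shift, so when its top bit is
   set, b << 24 does not fit in a 32-bit int. Casting to Uint32 first keeps
   the whole shift in unsigned arithmetic, which is well-defined. */
static Uint32 PackTopByte(Uint8 b)
{
    return ((Uint32)b) << 24;   /* well-defined */
    /* return b << 24;             undefined when b >= 0x80 and int is 32 bits */
}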