Instead of creating an X11 connection to test that X11 is available,
closing the connection, and then reconnecting for real, use the same
connection to handle both cases.
The X11 connection retry delay mechanism in the case where X11 is
dynamically loaded has been removed. It was only necessary to avoid
authentication token reuse from the XOpenDisplay call that used to
exist in X11_Available. Now that this call is only made once, it
is no longer needed.
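A minimal sketch of the single-connection pattern, using plain Xlib rather than the actual SDL X11 driver code:

#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    /* Opening the display once doubles as the availability check; on
       success the same connection is kept for real use instead of being
       closed and reopened. */
    Display *display = XOpenDisplay(NULL);
    if (!display) {
        fprintf(stderr, "X11 is not available\n");
        return 1;
    }

    /* ... continue using 'display' for the actual video setup ... */

    XCloseDisplay(display);
    return 0;
}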
Also drop unused and inapplicable code from a comment.
***
The two are only ever called together, and combining them makes it possible
to eliminate redundant symbol loading and redundant attempts to connect
to a display server.
For systems without strlcpy and strlcat, just declare them as if they exist; the static analyzer may still know the semantics of these functions and can use that knowledge in its analysis.
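A minimal sketch of that idea, assuming a __clang_analyzer__ guard (the exact guard macros used in SDL may differ):

#include <stddef.h>

#if defined(__clang_analyzer__) && !defined(HAVE_STRLCPY)
/* Declare the BSD functions so the analyzer can apply its built-in
   knowledge of their bounds-checking semantics. */
size_t strlcpy(char *dst, const char *src, size_t size);
size_t strlcat(char *dst, const char *src, size_t size);
#endif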
Most of this patch was from meyraud705 at gmail and Martin Gerhardy. Thanks!
Fixes Bugzilla #5163.
Apparently the "-x objective-c" made it down to the linker, who then treats
the .o file as Objective-C source code. Apparently the -ObjC argument does
the same thing but gets ignored by the linker.
Fixes Bugzilla #4988.
Manuel Alfayate Corchete
This small patch fixes the KMSDRM_CreateSurfaces() call in KMSDRM_CreateWindow(), which was segfaulting deeper in SDL internals because the windata->viddata pointer wasn't set before KMSDRM_CreateSurfaces() was called; the patch simply sets that pointer first.
Now LÖVE2D works perfectly well on the Raspberry Pi 3, instead of just segfaulting.
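A self-contained illustration of the bug class and the fix; the struct and function names below are invented stand-ins, not the actual KMSDRM driver code:

#include <stdio.h>
#include <stdlib.h>

struct viddata { int fd; };
struct windata { struct viddata *viddata; };

/* Stands in for KMSDRM_CreateSurfaces(): it dereferences the back-pointer. */
static int create_surfaces(struct windata *w)
{
    return w->viddata->fd;   /* crashes if w->viddata is still NULL */
}

int main(void)
{
    struct viddata vd = { 3 };
    struct windata *w = calloc(1, sizeof(*w));
    if (!w) {
        return 1;
    }

    w->viddata = &vd;   /* the fix: set the back-pointer BEFORE the call */
    printf("fd = %d\n", create_surfaces(w));

    free(w);
    return 0;
}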
Ellie
I just tripped over this: stb_image when requesting 3 channels with 8-bit actually returns them as 3 bytes per pixel with no alignment, so basically 4 pixels are 12 bytes with no padding (0...2, 3...5, 6...8, and 9...11). This I would have naively expected to be called RGB888 or BGR888, since there is no "dead" unused byte as I would expect for something called e.g. RGBX8888.
However, SDL2's SDL_PIXELFORMAT_BGR888 uses 4 bytes, same as SDL_PIXELFORMAT_BGRX8888, even though the latter appears to be a longer storage format - which it isn't, internally. It's just swapped, in byte order X, B, G, R (instead of BGRX). So why isn't the macro name also swapped, as "XBGR888" instead of just "BGR888"?
I find the formats therefore named inconsistently, and unless there is a reason for this I suggest these changes:
1. deprecate SDL_PIXELFORMAT_BGR888 in favor of a new SDL_PIXELFORMAT_XBGR8888
and
2. deprecate SDL_PIXELFORMAT_RGB888 in favor of a new SDL_PIXELFORMAT_XRGB8888
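A small, illustrative check of the storage sizes involved, using the public SDL2 API (not part of the proposed change; assumes SDL2 headers are available):

#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    /* Despite the "888" in its name, BGR888 occupies 4 bytes per pixel
       with one unused byte, the same storage size as BGRX8888. */
    printf("%s: %d bytes per pixel\n",
           SDL_GetPixelFormatName(SDL_PIXELFORMAT_BGR888),
           SDL_BYTESPERPIXEL(SDL_PIXELFORMAT_BGR888));

    /* The tightly packed 3-bytes-per-pixel layout that stb_image returns
       for 3 channels corresponds to SDL_PIXELFORMAT_RGB24 / BGR24. */
    printf("%s: %d bytes per pixel\n",
           SDL_GetPixelFormatName(SDL_PIXELFORMAT_RGB24),
           SDL_BYTESPERPIXEL(SDL_PIXELFORMAT_RGB24));
    return 0;
}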
JackBoosY
In src/video/winrt/SDL_winrtgamebar.cpp line 55:
virtual HRESULT STDMETHODCALLTYPE add_VisibilityChanged(
__FIEventHandler_1_IInspectable *handler,
Windows::Foundation::EventRegistrationToken *token) = 0;
The macro __FIEventHandler_1_IInspectable is defined in windows.foundation.h (Windows 10 SDK 10.0.17763.0) at line 3576:
#define __FIVector_1_Windows__CFoundation__CPoint ABI::Windows::Foundation::Collections::__FIVector_1_Windows__CFoundation__CPoint_t
but it no longer exists in Windows 10 SDK 10.0.19041.0.
After searching for this macro in the SDK include path, I found that it is defined in many header files, but it should be replaced in windows.system.h.
On some systems, GetClipCursor() impacts performance when called frequently, so only call it every once in a while to make sure we haven't lost our capture.
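A hedged sketch of that rate-limiting idea; the function name, interval, and re-apply logic here are assumptions, not the actual SDL Windows backend:

#include <windows.h>

static DWORD last_clipcursor_check;   /* tick count of the last real check */

static void CheckClipCursorPeriodically(const RECT *expected)
{
    RECT current;
    const DWORD now = GetTickCount();

    /* Only hit the (potentially slow) GetClipCursor() every few seconds. */
    if (now - last_clipcursor_check < 3000) {
        return;
    }
    last_clipcursor_check = now;

    if (GetClipCursor(&current) && !EqualRect(&current, expected)) {
        ClipCursor(expected);   /* we lost the clip rectangle; re-apply it */
    }
}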
Manuel Alfayate Corchete
I'm trying to build SDL2 with threads support here on GNU/Linux, both x86 and ARM, and it does not seem to be possible ATM:
/home/manuel/src/SDLLLL/src/core/linux/SDL_threadprio.c:233:26: error: 'rtkit_max_realtime_priority' undeclared (first use in this function)