Discussion:
[SDL] SDL_RenderDrawLine endpoint inconsistency
rtrussell
2017-01-06 23:36:40 UTC
SDL_RenderDrawLine is documented as drawing the line "to include both end points", and that's exactly what I find it does when running on Windows (OpenGL renderer), Mac OS and Android. But on Linux (specifically Ubuntu, because I haven't had a chance to try it on anything else) it appears to draw the line exclusive of the end point; indeed, if you specify the same coordinates for both the start point and the end point, it draws nothing at all!
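
In other words (an illustrative fragment only, assuming renderer is a valid SDL_Renderer):

Code:
/* The documentation says both end points are included, so this
   degenerate "line" ought to plot the single pixel at (100, 100);
   on Ubuntu it plots nothing at all. */
SDL_RenderDrawLine(renderer, 100, 100, 100, 100);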

The same thing happens on two different PCs: one running 32-bit Ubuntu 16.04 and the other running 64-bit Ubuntu 16.04 in VirtualBox under Windows, so I'm assuming the graphics card is not to blame. The version of SDL is whatever is currently available from the Ubuntu repository.

I'm rather mystified by this inconsistency. Any thoughts as to the possible cause?

Richard.
rtrussell
2017-01-08 11:13:29 UTC
Here's a test case to demonstrate the issue. It should draw a white square on a blue background, with all four corner pixels drawn, and that's what happens on Windows and Mac OS. But on Linux (Ubuntu 16.04, SDL 2.0.4) it draws only the two horizontal lines. Can somebody confirm this? Or, if you find it working correctly, please report what version of Linux/OpenGL/SDL you are using:


Code:
// Demo of SDL_RenderDrawLine endpoint behavior, R.T.Russell http://www.rtrussell.co.uk/

#include <stdio.h>
#include "SDL.h"

#define WIDTH 640
#define HEIGHT 480

int main(int argc, char* argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO))
    {
        printf("SDL_Init Error: %s", SDL_GetError());
        return 1;
    }

    SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");

    SDL_Window *window = SDL_CreateWindow("SDL_RenderDrawLine test", 100, 100,
                                          WIDTH, HEIGHT, SDL_WINDOW_OPENGL);
    if (window == NULL)
    {
        printf("SDL_CreateWindow Error: %s", SDL_GetError());
        SDL_Quit();
        return 2;
    }

    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1,
                                SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
    if (renderer == NULL)
    {
        SDL_DestroyWindow(window);
        printf("SDL_CreateRenderer Error: %s", SDL_GetError());
        SDL_Quit();
        return 3;
    }

    SDL_Event e;
    int quit = 0;

    while (!quit)
    {
        int i;
        int w = WIDTH / 4;
        int h = w;
        int x = (WIDTH - w) / 2;
        int y = (HEIGHT - h) / 2;

        if (SDL_PollEvent(&e))
        {
            if (e.type == SDL_QUIT)
                quit = 1;
        }

        /* Blue background */
        SDL_SetRenderDrawColor(renderer, 0, 0, 0xFF, 0xFF);
        SDL_RenderClear(renderer);

        /* Top and bottom edges of the square as ordinary lines */
        SDL_SetRenderDrawColor(renderer, 0xFF, 0xFF, 0xFF, 0xFF);
        SDL_RenderDrawLine(renderer, x, y, x + w, y);
        SDL_RenderDrawLine(renderer, x, y + h, x + w, y + h);

        /* Left and right edges as single-pixel 'lines' with identical
           start and end points; on the affected Linux path these draw
           nothing, leaving only the two horizontal lines visible */
        for (i = y + 1; i < y + h; i++)
        {
            SDL_RenderDrawLine(renderer, x, i, x, i);
            SDL_RenderDrawLine(renderer, x + w, i, x + w, i);
        }

        SDL_RenderPresent(renderer);
        SDL_Delay(10);
    }

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
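
For anyone wanting to try it, a build command along the lines of gcc test.c `sdl2-config --cflags --libs` -o test should suffice (assuming the libsdl2-dev package, which provides sdl2-config, is installed).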


Richard.
rtrussell
2017-01-12 20:01:37 UTC
Since nobody has ventured an opinion, I've looked at the source, and there is code specifically intended to make Linux behave differently; the trouble is that, according to my tests, it makes Linux behave wrongly:

SDL_render_gl.c:
Code:
#if defined(__MACOSX__) || defined(__WIN32__)
    /* Mac OS X and Windows seem to always leave the last point open */
    data->glVertex2f(0.5f + points[count-1].x, 0.5f + points[count-1].y);
#else
    /* Linux seems to leave the right-most or bottom-most point open */
    x1 = points[0].x;
    y1 = points[0].y;
    x2 = points[count-1].x;
    y2 = points[count-1].y;

    if (x1 > x2) {
        data->glVertex2f(0.5f + x1, 0.5f + y1);
    } else if (x2 > x1) {
        data->glVertex2f(0.5f + x2, 0.5f + y2);
    }
    if (y1 > y2) {
        data->glVertex2f(0.5f + x1, 0.5f + y1);
    } else if (y2 > y1) {
        data->glVertex2f(0.5f + x2, 0.5f + y2);
    }
#endif


Notice that when the two end points are identical (x1 == x2 and y1 == y2) none of those four branches fires, so no closing vertex is emitted at all; the zero-length line rasterises to nothing, which would explain why a same-coordinate line draws no pixel. It seems unlikely that OpenGL works differently on Linux by design, so should this code still be in SDL?
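
In the meantime a workaround at the application level looks straightforward. This is only a sketch of the idea (assuming SDL_RenderDrawPoint is unaffected, which it should be since points go through a different code path, and that alpha blending is off, as the end pixel may otherwise be blended twice):

Code:
#include "SDL.h"

/* Hypothetical wrapper: draw the line, then plot the end point
   explicitly so that both end points are always included. A
   degenerate line (x1 == x2 and y1 == y2) then still produces
   its single pixel via SDL_RenderDrawPoint. */
static int DrawLineInclusive(SDL_Renderer *renderer, int x1, int y1, int x2, int y2)
{
    if (SDL_RenderDrawLine(renderer, x1, y1, x2, y2) != 0)
        return -1;
    return SDL_RenderDrawPoint(renderer, x2, y2);
}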

Richard.
