A slightly longer look at Lwjgl3

While I’m waiting for Lwjgl’s GLES binding, I thought I’d put together some code using shaders that could easily be ported to GLES later (as such, it uses a subset of what’s usually available).

I’ll quickly run through how the shader is implemented.  Let’s look at the main body of the vertex shader, which couldn’t be simpler…

    gl_Position = modelviewProjection * pos;
    v_color = color;
    v_frag_uv = uv_attrib;

We’ll see shortly that our vertices have the following attributes: position, colour, and UV (texture coordinate).  Once per frame, before rendering, we set the MVP matrix.  This is a combination of the model matrix (the position and orientation of the model), the view matrix (the position and orientation of the view, or “camera”) and the projection matrix (which, simply put, makes distant objects smaller).  The vertex position is multiplied by this matrix; it’s this magic black box of a matrix that makes our 2D screen look like a 3D view.  The colour and texture coordinate are passed on to the fragment shader (which does the actual drawing, now that it knows where everything is…)
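Just to illustrate the order those matrices are combined in, here’s a plain-Java sketch — this is not code from the demo (a real LWJGL app would more likely use a maths library such as JOML), just the bare arithmetic behind `modelviewProjection * pos`:

```java
// A minimal sketch of building the MVP matrix on the CPU before uploading it
// as the "modelviewProjection" uniform. Matrices are column-major float[16],
// as OpenGL expects. Names here are illustrative, not from the demo.
public class MvpSketch {
    // c = a * b for two column-major 4x4 matrices
    static float[] mul(float[] a, float[] b) {
        float[] c = new float[16];
        for (int col = 0; col < 4; col++)
            for (int row = 0; row < 4; row++)
                for (int k = 0; k < 4; k++)
                    c[col * 4 + row] += a[k * 4 + row] * b[col * 4 + k];
        return c;
    }

    static float[] identity() {
        float[] m = new float[16];
        m[0] = m[5] = m[10] = m[15] = 1f;
        return m;
    }

    // A translation matrix: moves the model to (x, y, z)
    static float[] translate(float x, float y, float z) {
        float[] m = identity();
        m[12] = x; m[13] = y; m[14] = z;
        return m;
    }

    // Transform a vec4 by m -- this is what "modelviewProjection * pos" does
    static float[] transform(float[] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++)
            for (int k = 0; k < 4; k++)
                r[row] += m[k * 4 + row] * v[k];
        return r;
    }

    public static void main(String[] args) {
        // MVP = projection * view * model -- note the order!
        float[] model = translate(0f, 0f, -3f); // push the model into the scene
        float[] view = identity();              // camera at the origin for this sketch
        float[] projection = identity();        // a real app would use a perspective matrix
        float[] mvp = mul(projection, mul(view, model));

        float[] pos = {0.5f, 0.5f, 0f, 1f};     // one vertex
        float[] clip = transform(mvp, pos);
        System.out.println(clip[0] + " " + clip[1] + " " + clip[2]);
        // prints "0.5 0.5 -3.0"
    }
}
```

With identity view and projection matrices the vertex simply ends up translated by the model matrix, which is exactly what you’d expect.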

So this means we need some data to feed the shader.  You can see a large array of data in the code; this is used to “load” the vertex array buffer.

float verts[] = {
   // vert                     // colour               // UV
   -0.5f, -0.5f, -0.5f,        1.0f, 0.0f, 0.0f,       0.0f, 0.0f,

As you can see there are three floats for the vertex position (x,y,z) followed by three floats for the colour (red, green, blue) and finally two floats for the texture coordinate.  This is repeated for each vertex of the model.
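Before OpenGL can see this array it has to go into a native-ordered direct buffer.  As a sketch of that step (plain `java.nio` here — LWJGL’s `BufferUtils.createFloatBuffer` does the same job):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Sketch of wrapping the interleaved vertex array in a direct, native-ordered
// FloatBuffer, ready to be handed to glBufferData. Not the demo's exact code.
public class VertexBufferSketch {
    public static FloatBuffer toBuffer(float[] verts) {
        FloatBuffer buf = ByteBuffer
                .allocateDirect(verts.length * Float.BYTES) // direct: OpenGL needs off-heap memory
                .order(ByteOrder.nativeOrder())             // match the platform's byte order
                .asFloatBuffer();
        buf.put(verts);
        buf.flip(); // rewind so OpenGL reads from the start
        return buf;
    }

    public static void main(String[] args) {
        float[] verts = {
            // vert                  // colour            // UV
            -0.5f, -0.5f, -0.5f,     1.0f, 0.0f, 0.0f,    0.0f, 0.0f,
             0.5f, -0.5f, -0.5f,     0.0f, 1.0f, 0.0f,    1.0f, 0.0f,
             0.5f,  0.5f, -0.5f,     0.0f, 0.0f, 1.0f,    1.0f, 1.0f,
        };
        FloatBuffer buf = toBuffer(verts);
        System.out.println(buf.remaining()); // 3 vertices * 8 floats = prints "24"
    }
}
```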

I should mention that including a colour component for each vertex isn’t all that useful; something rather more useful would be vertex normals, which are typically used for lighting (and potentially physics too).  Including lighting would make the shader more complex though, so I may well make a sample for that when Lwjgl gets a GLES binding.

One point that I’ve seen causing people no end of problems is the glVertexAttribPointer command.  Let’s look at how it’s used:

// attrib, number items, type, normalised, byte stride, byte offset
glVertexAttribPointer( attr_pos, 3, GL_FLOAT, false, 32, 0 );
glVertexAttribPointer( attr_color, 3, GL_FLOAT, false, 32, 12 );
glVertexAttribPointer( attr_uv, 2, GL_FLOAT, false, 32, 24 );

One point to remember is that, unlike the second parameter (size, or number of items), the last two parameters are not expressed as a number of items but as a number of bytes.

The stride is the number of bytes from any one attribute in a vertex to the same attribute in the next vertex, which in this case is 32 because…

3 pos (x,y,z) + 3 colour (rgb) + 2 UV = 8 floats × 4 bytes per float = 32 bytes

From this you should be able to see how the offsets work: position starts at byte 0, colour at byte 12 (3 floats in) and UV at byte 24.
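If it helps, the same arithmetic can be written out as a tiny plain-Java sanity check — a hypothetical helper, not part of the demo — that derives the stride and offsets from the component counts instead of hard-coding 32, 12 and 24:

```java
// Derives glVertexAttribPointer's stride and offset arguments from the number
// of float components per attribute. Illustrative helper, not the demo's code.
public class LayoutSketch {
    // Total bytes per vertex: the sum of every attribute's float count * 4
    static int strideBytes(int[] components) {
        int stride = 0;
        for (int c : components) stride += c * Float.BYTES;
        return stride;
    }

    // Byte offset of attribute i: the combined size of the attributes before it
    static int offsetBytes(int[] components, int i) {
        int offset = 0;
        for (int k = 0; k < i; k++) offset += components[k] * Float.BYTES;
        return offset;
    }

    public static void main(String[] args) {
        int[] components = {3, 3, 2}; // pos, colour, UV
        System.out.println("stride " + strideBytes(components)); // prints "stride 32"
        for (int i = 0; i < components.length; i++)
            // prints offsets 0, 12 and 24 -- the values passed to glVertexAttribPointer
            System.out.println("attr " + i + " offset " + offsetBytes(components, i));
    }
}
```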

Looking at the comments in the code and reading up on the individual OpenGL commands, you should be able to see how it all works.

The demo has some simple controls; they are normally inactive while the cube auto-rotates.  The controls are toggled with the space bar.  Here’s a rundown of what they do:

Q/W    Rotate round the model’s X axis
A/S    Rotate round the model’s Y axis
Z/X    Rotate round the model’s Z axis

R/T    Move along the world’s X axis
F/G    Move along the world’s Y axis
V/B    Move along the world’s Z axis

You can download the source here, enjoy!
