C and Physics - the preface

Added 12 Mar 2019, 8:41 p.m. edited 18 Jun 2023, 7:26 p.m.

Adding physics to your application really brings things alive: not only do you no longer have to worry about complex issues like collision, it also reduces the overall complexity across the board. While we could launch straight into using ODE (Open Dynamics Engine), the renderer it uses for its own demos is, to be honest, a little bit dated and in need of some love. I've targeted OpenGL version 3.3 on top of GLFW, so as to provide a breadth of compatibility alongside some reasonably modern features.
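To give an idea of what that target looks like in practice, here's a minimal sketch of creating a GLFW window with a 3.3 core profile context. It isn't the project's actual main.c, just the bare bones you'd expect to find at the start of it:

    #include <GLFW/glfw3.h>
    #include <stdio.h>

    int main(void) {
        if (!glfwInit()) {
            fprintf(stderr, "failed to initialise GLFW\n");
            return 1;
        }

        // ask GLFW for an OpenGL 3.3 core profile context
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

        GLFWwindow* window = glfwCreateWindow(800, 600, "physics demo", NULL, NULL);
        if (!window) {
            glfwTerminate();
            return 1;
        }
        glfwMakeContextCurrent(window);

        // ... load the GL function pointers (gl_core_3_3.c) and set up the scene ...

        while (!glfwWindowShouldClose(window)) {
            // ... render here ...
            glfwSwapBuffers(window);
            glfwPollEvents();
        }

        glfwTerminate();
        return 0;
    }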

By the end of this post I'll have shown you C code that's more than sufficient to render the graphics primitives we'll need for working with ODE. In addition there will be a fair bit of supporting code to handle all the less exciting stuff like loading files, setting up shaders and so on.
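As a taste of that supporting code, compiling a single shader stage generally boils down to something like the sketch below. The helper name is hypothetical and the real project may structure this differently, but the OpenGL calls are the standard ones (it assumes the GL loader header and <stdio.h> are already included):

    // illustrative helper - not necessarily how the project's shader code is laid out
    GLuint compileShader(GLenum type, const char* src) {
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &src, NULL);
        glCompileShader(shader);

        GLint ok = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
        if (ok != GL_TRUE) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), NULL, log);
            fprintf(stderr, "shader compile failed: %s\n", log);
        }
        return shader;
    }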

Looking inside the source code directory (download below) we see that the project is broken down into a number of units:

main.c         the application
maths.c        a small set of routines for 3d maths
camera.c       helps with rendering our scene from a specific view point
obj.c          loads and renders 3d shapes
util.c         miscellaneous helper functions
loadpng.c      loads png files for texture creation
gl_core_3_3.c  discovers OpenGL function pointers

obj.c contains various functions to load 3d graphical objects. In the resources directory there is a helper script to create ".gbo" files from wavefront ".obj" files, which are easily exported from 3d editors such as Blender. In addition to loading these binary files there are also functions for rendering:

    void drawObj(obj_t* obj, float sz[3], int texu, float mod[16], camera_t* cam);

This function takes a pointer (obj) to a previously loaded GBO file. The sz parameter scales the object, which when used with objects of unit size has the effect of creating an object of a specific size. For convenience, as we are using a small number of textures, we keep each texture bound to a specific texture unit, and the texu parameter specifies which texture unit to render the object with. A matrix comprised of 16 floats (the mod parameter) describes the orientation and translation for the rendering, and finally the cam parameter helps OpenGL render from the required view point.
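For example, a texture might be bound to texture unit 2 once at start-up and then referred to by that number whenever an object is drawn. This is only a sketch: textureId is a hypothetical handle, assumed to come from the texture creation code built on loadpng.c.

    // bind a texture to texture unit 2 once, up front
    glActiveTexture(GL_TEXTURE0 + 2);
    glBindTexture(GL_TEXTURE_2D, textureId);   // textureId: hypothetical GL texture handle

    // ... later, draw an object using the texture bound to unit 2
    drawObj(&obj, sz, 2, mod, &camera);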

To define our viewpoint we have a number of variables in the camera_t struct:

    float pEye[3];         // location of camera
    float pCentre[3];      // where it's looking
    float pUp[3];          // direction of up
    float viewDir[3];      // calculated from eye to centre (these two used in shader)
    float lightDir[3];     // direction light is shining
    float view[16];        // view matrix
    float projection[16];  // projection matrix
    float vp[16];          // combined view and projection matrix

We're free to change any of the first three, and once we have, the updateCamera function should be called to calculate the remaining variables.
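In practice that looks something like the snippet below; I'm assuming updateCamera simply takes a pointer to the camera_t (check camera.c for the exact signature):

    camera_t camera;

    // place the camera, aim it at the origin and pick an up direction
    camera.pEye[0] = 0;     camera.pEye[1] = 2;     camera.pEye[2] = 8;
    camera.pCentre[0] = 0;  camera.pCentre[1] = 0;  camera.pCentre[2] = 0;
    camera.pUp[0] = 0;      camera.pUp[1] = 1;      camera.pUp[2] = 0;

    // recalculate the derived vectors and matrices
    updateCamera(&camera);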

The matrix is a fantastic maths tool which allows us not only to describe the orientation and position of something, but also to calculate a wide range of transformations. Looking at a code snippet we use to render an object, we can see an example of its use:

    mat4RotateXYZ(r, za, xa, ya);           // build a rotation from three angles
    mat4Translation(t, 3, 0, 0);            // build a translation of (3, 0, 0)
    mat4Mul(r, t, r);                       // combine the two into r
    drawObj(&drumObj, sz, 2, r, &camera);   // draw using texture unit 2

There are a number of functions to describe rotations and also translations (position), and multiplying two matrices together allows us to combine them. However, the order in which you combine matrices is important: a rotation followed by a translation has a different effect to a translation followed by a rotation.
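To make that concrete, the sketch below builds the same rotation and translation as above and combines them both ways round. The comments describe the general principle only; the exact convention (which operand is applied first) lives in maths.c, so treat this as an illustration rather than a statement about this particular library.

    float r[16], t[16], rt[16], tr[16];
    float xa = 0.5f, ya = 0.0f, za = 0.0f;   // some example angles

    mat4RotateXYZ(r, za, xa, ya);
    mat4Translation(t, 3, 0, 0);

    mat4Mul(rt, t, r);   // same operand order as the earlier snippet
    mat4Mul(tr, r, t);   // operands swapped - a genuinely different matrix

    // one of these draws the object rotated in place and then offset,
    // the other rotates the offset itself, swinging the object round in an arc
    drawObj(&drumObj, sz, 2, rt, &camera);
    drawObj(&drumObj, sz, 2, tr, &camera);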

Armed with the above information, it should be fairly easy to work out what is happening in the main application. If you filter out the GLFW and OpenGL specific calls, it should be reasonably straightforward to see what's going on.

Sadly there isn't any actual physics going on just yet, however I think you should be able to see that there is a firm footing for our future experiments (more to come soon!). You can download the whole project here.