In this section we will have a short but powerful introduction to Blender. We will cover just enough modelling basics to let you create most simple projects.

No, we will not cover animation, shaders or modifiers here - just the bare minimum needed to create this ramp floor for our tutorial:

The desired result

You will find a lot of keyboard shortcuts here. This is one of the most awesome features of Blender - you can work without menus or panels! Everything you need can be done with the keyboard!

So let’s dive into Blender now!

Welcome to Blender

When you open Blender, you will see a pretty image made with Blender, version information, some useful links and recent files.

Blender

To close this window, simply click outside it. You will then see your workspace with the Default window layout (we will learn about layouts later). The workspace contains a few starter items:

  • camera
  • light
  • cube

Workspace

You may be confused by the last one, but you will shortly see how many cool things can be done by starting with a plain cube and modifying it. Oh, and speaking of modifying: let’s switch to Edit mode by hitting the Tab key:

Edit mode

You can exit it by hitting Tab again. In Edit mode you can manipulate the mesh’s edges, vertices or faces. To switch between these, use the three buttons at the bottom of the screen:

Switching between edges, faces and vertices in edit mode

Let’s choose the face editing mode. Unlike other 3D editors, in Blender selection is done with the Right mouse button. Select one face of the cube:

Selecting items in blender

You may have noticed that the axis arrows have moved to the selected face. These are used to manipulate selected elements, and they also show the element’s orientation. You can move the selected element by simply dragging one of the arrows; it will move along the chosen axis only:

Moving selected elements

The same operation, movement, can be performed by hitting the G key. You can move other elements too - this will change the shape of our cube:

Moving edges

Now let’s try something more complex. See the Tools panel on your left?

Tools panel

Select a face in face editing mode and click Extrude (or hit the E key). Your face will be extruded and you will be able to move it freely. But usually designers move elements along some axis - this makes models more accurate. To lock the movement to an axis, just hit its letter - X, Y or Z - while extruding:

Extruding faces

Interesting fact: you can extrude vertices and edges too.

Now, let’s use an even more advanced operation, one that is often described much later in 3D modelling tutorials. Choose the Loop Cut and Slide operation from the Tools panel - you will see nothing until you move your cursor over your model. Depending on which edge the cursor is closer to, you will see a purple rectangle looping around your model:

Loop cut

When you click the Left mouse button, you will move to the next part of this operation - sliding. Just place the new edges where you want:

Slicing the loop cut

Now let’s create walls for our “ramp”. Create a few loop cuts along the ramp and we will start extruding:

Extruding one wall

Or maybe just moving faces?..

Moving vs extruding

No, that’s definitely not what we want! We want walls, not a new ramp! Hmmm… But if we extrude the walls one by one, the result will be inaccurate… Hold the Shift key and right-click the two neighbouring walls:

Multiple selection

Now we can work with all three elements in the same way. Hit the E key, then Z, and extrude all three walls upwards at the same time:

Simultaneous extrusion

Now we need two more walls to prevent our hero (the ball, if you recall from the previous part) from falling off the sides. Select two edges at the corner of our ramp and hit the W key. You will see a context menu like this:

Editing context menu

Click the Subdivide item, and the selected edges will be connected right in the middle:

Subdivision for two edges

You can perform that operation on faces too - that is often handy. Now, if you undo your changes with the usual Ctrl+Z (or Command+Z on Mac) and try to perform the same operation on four opposite edges, you will see there is a redundant (in our case) edge:

4-th subdivision

You can remove it by selecting that edge, hitting X and selecting Dissolve edges. If you choose Delete edge instead, you will lose the neighbouring faces that were built from that edge.

Delete or dissolve?

So in the end we need to have two edges on the same line:

The needed edges

Now, switch to the Ortho view by choosing one from the View menu at the bottom of the screen, or by hitting the Num 5 key:

View menu

Your workspace should now look different:

Ortho view

Using the View menu, you may switch between different views, perpendicular to your model.

Top view

Right view

Switching between different views does not clear the selection. And this is awesome! So if you try to move the selected edges in the Right Ortho view, you will move both of them:

Selection persistence

Yeeks… They move along the Y axis only, not along the edge. But Blender easily handles that - you just need to switch the coordinate system using the corresponding menu at the bottom of your screen:

Coordinate system

Choose Normal and you will see that the arrows at the selected edges have changed:

Normal coordinate system

Now movement is done along the edge, just as we need:

Moving with Normal coordinate system

Try moving (yes, moving, not extruding) our edges up - they will move along the normal:

Moving edges in Right Ortho view

But if you click the mouse wheel and rotate the camera, or even just switch to the Top Ortho view, you will notice that our walls have different widths:

Whoops...

So we need to make one wall thinner. But we should not forget about the other edges - the ones that will form another wall for us. Undoing now is not an option… We need to move the edges. But if you move only those visible in the Top Ortho view, you will miss the ones at the bottom and ruin the model. And selecting all those edges one by one is not an option either…

Selecting many edges manually is a pain...

Moreover, we cannot even see those edges at the bottom! This is easy to fix, though: see the small button with rectangles near the vertex/edge/face switcher?

'Limit selection to visible' switcher

Click it and you will be able to select the bottom edges without rotating the camera. Now we will try the circle-selection tool, which comes in handy when you need to select many elements at a time. Hit the C key and you will see a circle in the workspace. Try dragging it (left-click and drag) over the edges we need:

Circle selection

Hmmm… That’s way too much… Now hold the Shift key and drag the circle over the neighbouring, redundant ones to deselect them:

Unselecting elements

Now we can switch back to the Top Ortho View and successfully move our edges:

Making walls thinner

Now that we have our walls precisely set up, we can extrude the last two walls. Select the Normal coordinate system and perform the extrusion along the Z axis:

Extruding last two walls

Now we will scale our model a few times. Staying in the Edit mode, select all the faces with the A key:

Selecting everything

Then hit the S key and start entering the scale factor. That’s right, just press, say, 5:

Entering factor while scaling

You can correct what you entered using the Backspace key. The same works while moving or rotating elements. This is useful when you need an operation to be really precise. But you can still use your mouse, of course.

Hint: if you scaled your model outside Edit mode, you may find your scale, translation or rotation different from the identity values (1, 1, 1 for scale or 0, 0, 0 for position/rotation). This may cause various bugs when exporting models. To fix this, select your model in Object mode, hit Ctrl+A and select Apply Scale (or whatever you need to fix) from the pop-up menu.

Applying scale

Texturing our model

Now we need to paint our model so we have something more beautiful in our application than just pitch-black… stuff…

Adding textures to a model in Blender is extremely easy - you just select your model, switch to the Texture tab on the right panel and click the New button:

Creating a texture

Then you pass in some parameters like the texture size, the background color and the image name - and you are done!

New texture params

But that will only add a blank texture, which you will then need to paint as you wish. Painting a texture requires your model’s vertices to be synchronized with the texture, so that each vertex knows where it lies both in 3D space and on the texture image. This assignment process is called texture unwrapping or UV mapping (texture coordinates are usually called u and v instead of x and y, since the latter are already used to describe the vertex position). And this process requires one thing from you: you need to specify where Blender should “cut” your model. This is quite a simple task, but it determines how the texture will look and how easy it will be to paint.

So, go to the Edit mode and select a few loops of edges:

Selecting seam edges

Selecting seam edges

Now, on the left panel, switch to the Shading/UVs tab and click the Mark Seam button:

Shading/UVs tab

This will mark the selected edges as seams to “cut” your model along. Have no fear - your model will not actually be cut; the seams are used for the maths only.

Then, on the same panel click the Unwrap button and select the first unwrapping method on the list:

Unwrapping method

Again, you will see no effect yet. To see something, switch the window layout in the top menu to UV Editing:

Layout switcher

Layouts available

You will see two windows - the UV/Image editor on your left and the 3D view on your right. And again, nothing interesting here… But I am not fooling you - that is just how Blender works… To see something marvelous, select everything in the 3D view:

UV-Mapped model

You will see some lines on your left. That is what you have selected, mapped onto the image plane. But there is no actual image in the UV/Image editor yet. To add one, just click the New button on the bottom menu of the UV/Image editor or select an existing image:

Selecting background image for UV mapping

This will not change the image itself - it will just become the background for our image editor window, nothing more. To start making miracles, switch to Texture Paint mode in the 3D view:

Texture Paint mode

And your model will change its look…

Pinky!

What is this pink monster?! Well, on the left panel of our 3D view there’s a message saying the texture slot is missing and proposing to create one… Let’s do this…

Texture slot creation

Now we are able to paint our model! See how awesome it is: you have a brush tool activated. The brush has three parameters:

  1. Color - this can be changed with the color circle below
  2. Radius
  3. Pressure, or Alpha

The radius can be changed by pressing the F key and moving the mouse cursor:

Brush radius changing

The pressure can be changed by pressing Shift+F and doing the same:

Brush pressure changing

And you can just paint, like in… Microsoft Paint!

Just paint!!!

But if you look into the UV/Image editor, you will see… nothing! Again! ‘the hell?!

WTF?!

That is just a misunderstanding - you were painting on a different image, not the selected one:

Choosing image for UV/Image editor

We created a new one when we created the texture slot…

To draw in the UV/Image editor instead of the 3D view, you just need to switch its mode to Paint in the bottom menu:

Painting in the UV/Image editor

Okay, so far so good. We are able to paint our model. But there’s one interesting thing: if you try to draw a straight line, you may face a situation where the line is straight on the image but curved on the model:

UV mapping mistakes

UV mapping mistakes

But that does not happen everywhere - only on certain faces/edges:

Mistakes are only on certain faces

Well, that’s because the UV mapping is not precise enough. If you switch to the View mode in the UV/Image editor and to Edit mode in the 3D view, and select the whole model, you will see draggable points in the image editor:

Control points in the image editor

Try selecting them with the Right mouse button and moving them with G:

Selecting control points

Moving control points

Yes, now the texture looks creepy, but the lines are almost straight:

Fixing UV mapping errors manually

Fixing UV mapping errors manually

Exporting our model

When you finish painting your texture, the last thing we need to do is export our model to a format that Irrlicht understands. Luckily, both Blender and Irrlicht support many different formats:

Blender exporting

Blender’s file dialogs look different, but have a very intuitive interface:

Blender file dialog

If you do not see the needed format in Blender, you just need to turn on the corresponding plugin:

Blender settings menu

Blender settings menu

After exporting our model to, say, the 3DS format, take a look at the directory you exported it to:

No textures!

Where are the textures? Relax - they are in the UV/Image editor, just not saved yet. You can save the modified image with the Image -> Save menu at the bottom of the UV/Image editor:

Saving image from UV/Image Editor

Now we have everything we need for our Newtonian sample!


Small intro

Have you ever heard about end-to-end testing? Or maybe about test automation? Those of you who have may now be imagining Selenium. That’s right: in most cases you will need to run Selenium Server and use Selenium WebDriver in your tests. These come in handy for running a standalone browser window - with no caches, filled-in fields or cookies - and performing some operations in it.

In this article I will tell you my story of writing E2E tests for an Angular webapp.

A brief history

In my case, we first tried to use Protractor with Chai.js. That time we ended up with an almost unmaintainable bunch of code, succeeding in 100% of runs.

Next time we eliminated Chai and reworked all our tests to use Protractor only. The code became clearer (I did not like the syntax, but it worked…), but after upgrading libraries (including Protractor), the ratio of successful test runs decreased to just 40%.

We spent two days trying to fix those tests. And that’s how webdriverio came to our project.

And here’s a short tutorial on how to implement E2E tests with webdriverio in a sample project.

Whaaaaat?

These days I was given a reeeeally interesting homework assignment at the university. I was given a set of MD5 hashes, calculated from single words (taken from LibreOffice’s dictionaries) with a given salt. And the task was to find all those words.

So, the first idea which came to my mind was using an internet service for MD5 cracking. But… aaarrrggghhh! There’s a salt, so a webservice that looks words up over a dictionary fails to find mine…

So the second idea was to take that dictionary from LibreOffice and iterate through it. In the end, it worked =) And worked reeeally fast. But that is not the interesting part.

I wondered if I could find those words using a dictionary generated by my own code.

Timbre?

At my job we recently started researching logging tools to make our RESTful API, written in Clojure, write logs in JSON format. We were using Log4j already, but decided to use another tool for this task to make it less painful. So we settled on timbre. It seemed so easy to use, but it is really poorly documented.

According to timbre’s API, we needed to define our own appender for writing to a custom JSON file. And we found the output-fn option to configure this behaviour. But it is not documented at all, so we started looking through repositories using timbre, examples and all that stuff. And finally, we ended up with our own solution.

Below you will find a description of our way of using timbre from scratch.

Prologue

How do we usually create a web application? We run a bootstrapping script, which provides us with a skeleton of our application and then we just extend it with the features we need.

That’s exactly what we did at the last hackathon we were attending - we started with rails new twf and spent half of the day integrating our blank app with Angular, Paperclip, creating API methods and so on. But the effort we needed to accomplish our goal (quite a simple web app) was really huge.

So I decided to find the combination of backend and frontend technologies that would cause the least pain.

On the project I was recently introduced to, the line between frontend and backend is drawn very clearly: we have an API written in Clojure and a thin frontend application made with Angular that works on a generated set of static assets - HTML, CSS and JS files (though under the hood we are using HAML and SCSS).

The application I will be implementing throughout this article has the same architecture: a RESTful API and an MVVM frontend made with Angular. I welcome you to this journey of research and new technologies!

At my work we’ve lately been having discussions on email validation. I recalled a post on habrahabr showing different options, including effective and psycho solutions.

At this point we have an application with

  • 1x Ninja, walking around
  • 1x sphere, hanging in the center of the screen
  • 1x cube, flying around the sphere

That’s our “game”? Doubtful… So let’s make things move like in the real world! Or something like that…

Requirements

First of all, go and get the Newton GD files. And unpack them… right into the source directory of our project! That’s right! I’m not insane, and I’m aware you are going to put a lot of files in your project. But have no fear - you can always add them to .gitignore and keep them from being tracked in your Git repo:

source/newton-dynamics-master/applications
source/newton-dynamics-master/packages/projects
source/newton-dynamics-master/packages/thirdParty
source/newton-dynamics-master/coreLibrary_300/projects

You are using Git, right?.. Now that you have placed the Newton GD sources in your project directory, change your CMakeLists.txt file to look like this:

cmake_minimum_required(VERSION 3.1)
project(irrlicht_newton_game1)

set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")

option("NEWTON_DEMOS_SANDBOX" "Build demos sandbox" OFF)

set(NEWTONGD_PATH source/newton-dynamics-master)
set(NEWTONGD_INCLUDE_DIRS
        ${NEWTONGD_PATH}/packages/dCustomJoints
        ${NEWTONGD_PATH}/packages/dContainers
        ${NEWTONGD_PATH}/packages/dMath
        )

set(NEWTON_LIBRARIES Newton dMath)

add_subdirectory(${NEWTONGD_PATH})

find_package(X11)
find_package(OpenGL)
find_package(ZLIB)

if (NOT IRRLICHT_LIBRARY_PATH)
    find_library(IRRLICHT_LIBRARY_PATH
            NAMES Irrlicht
            PATHS ${IRRLICHT_PATH}/lib/
            PATH_SUFFIXES Linux MacOSX Win32-gcc Win32-visualstudio Win64-visualstudio)

    message(STATUS "Found Irrlicht: ${IRRLICHT_LIBRARY_PATH}")
endif()

include_directories(${IRRLICHT_PATH}/include ${NEWTONGD_INCLUDE_DIRS})

set(SOURCE_FILES source/main.cpp)
set(EXECUTABLE_NAME irrlicht_newton_game1)

add_executable(${EXECUTABLE_NAME} ${SOURCE_FILES})

target_link_libraries(${EXECUTABLE_NAME}
        ${NEWTON_LIBRARIES}
        ${IRRLICHT_LIBRARY_PATH}
        ${X11_LIBRARIES}
        ${OPENGL_LIBRARIES}
        ${ZLIB_LIBRARIES}
        ${X11_Xxf86vm_LIB})

Try to compile your project - it should build just fine. Observe the power of CMake!

Gravity

Let’s start modifying our Irrlicht sample application. First of all, we will add some Newton headers:

#include "newton-dynamics-master/coreLibrary_300/source/newton/Newton.h"
#include "newton-dynamics-master/packages/dMath/dVector.h"
#include "newton-dynamics-master/packages/dMath/dMatrix.h"
#include "newton-dynamics-master/packages/dMath/dQuaternion.h"

The core concept of the whole Newton GD library is NewtonWorld. It is just what it sounds like - the world where all the physics happens. It is separate from the world where we place our 3D models. And that should be obvious: graphics are managed by Irrlicht and physics by Newton, two totally different libraries. So we need to tie the two together so that the graphics correspond to what happens in the physical world.

First of all, we need a variable for our NewtonWorld. And since physics is handled by scripts too, we need to keep that variable close to our other objects - in the ScriptManager class.

There are two functions we need to bind to our NewtonBody:

static void transformCallback(const NewtonBody* body, const dFloat* matrix, int threadIndex) {
    // update ISceneNode so that it is in the same position and rotation as the NewtonBody
}

static void applyForceAndTorqueCallback(const NewtonBody* body, dFloat timestep, int threadIndex) {
    // just add gravity to our body
    dFloat Ixx, Iyy, Izz;
    dFloat mass;

    NewtonBodyGetMassMatrix(body, &mass, &Ixx, &Iyy, &Izz);

    dVector gravityForce(0.0f, mass * -9.8f, 0.0f, 1.0f);
    NewtonBodySetForce(body, &gravityForce[0]);
}

The first one, transformCallback, is called whenever a body changes its transform - i.e., its position or rotation. This is a good place to synchronize our Irrlicht meshes’ positions with their Newton bodies.

The applyForceAndTorqueCallback function is called on each NewtonUpdate to set the final forces and torques for bodies. We will modify this one later, but for now its implementation is good enough.

But what is NewtonUpdate? It is a function that does just what it says: it updates NewtonWorld and all its bodies, taking into account the time since the last update. There is one great place for this call: handleFrame. But we need to modify that method to receive the time since the last frame was rendered, and we will use this time to update NewtonWorld too.

private:
    void updatePhysics(float dt) {
         NewtonUpdate(newtonWorld, dt);
    }

public:
    void handleFrame(float dt) {
        auto handler = luaState.GetGlobalEnvironment().Get<LuaFunction<void(void)>>("handleFrame");

        updatePhysics(dt);
        setKeyStates();

        handler.Invoke();
    }

Remember the architecture: everything that needs to be exposed to our scripts should be declared as public in our ScriptManager; everything else as protected or private. This is the basic principle of encapsulation, so let’s keep to it in our application.

And update the main application loop:

while (device->run()) {
    // Work out a frame delta time.
    const u32 now = device->getTimer()->getTime();
    const f32 frameDeltaTime = (f32) (now - then);
    then = now;

    // ...

    scriptMgr->handleFrame(frameDeltaTime);

    // ...
}

Hint: to make the simulation slower and watch the ball falling in detail, make the NewtonUpdate argument even smaller - a thousand times smaller, say.

Since we initialize our Newton stuff, we need to clean it up on exit to prevent memory leaks. Let’s declare a method for that:

private:
    void stopPhysics() {
        NewtonDestroyAllBodies(newtonWorld);
        NewtonDestroy(newtonWorld);
    }

And call it right before the program’s end:

device->drop();

scriptMgr->handleExit();
delete scriptMgr;

And now is the right moment to add key code definitions and an exit function to our ScriptManager, so that we can write clearer code and close our application correctly using, say, the Esc key.

To stop our application, we need to break out of the while (device->run()) loop. This can be achieved by simply closing the IrrlichtDevice with device->closeDevice(). But we do not have access to the device from the ScriptManager, so let’s add it as a constructor argument:

private:
    irr::IrrlichtDevice *device;

public:
    ScriptManager(irr::IrrlichtDevice *_device, scene::ISceneManager *_smgr, video::IVideoDriver *_driver) {
        driver = _driver;
        smgr = _smgr;
        device = _device;

        initPhysics();
    }

So now we can create a function, exposed to our scripts, that will stop our application:

void exit() {
    device->closeDevice();
}

And bind it to the Lua function:

auto exitFn = luaState.CreateFunction<void(void)>(
                [&]() -> void { exit(); });

global.Set("exit", exitFn);

Now we can use our exit function in Lua scripts. But we would need to use hexadecimal key codes, and that’s… ugly. So let’s define symbolic names for those codes:

void setGlobalVariables() {
    setKeyStates();
    setKeyCodeConstants();
}

void setKeyCodeConstants() {
    std::map<std::string, int> keyMapping = {
        { "KEY_LBUTTON", 0x01 }, // Left mouse button
        { "KEY_RBUTTON", 0x02 }, // Right mouse button
        { "KEY_CANCEL", 0x03 }, // Control-break processing
        { "KEY_MBUTTON", 0x04 }, // Middle mouse button (three-button mouse)
        { "KEY_XBUTTON1", 0x05 }, // Windows 2000/XP: X1 mouse button
        { "KEY_XBUTTON2", 0x06 }, // Windows 2000/XP: X2 mouse button
        { "KEY_BACK", 0x08 }, // BACKSPACE key
        { "KEY_TAB", 0x09 }, // TAB key
        { "KEY_CLEAR", 0x0C }, // CLEAR key
        { "KEY_RETURN", 0x0D }, // ENTER key
        { "KEY_SHIFT", 0x10 }, // SHIFT key
        { "KEY_CONTROL", 0x11 }, // CTRL key
        { "KEY_MENU", 0x12 }, // ALT key
        { "KEY_PAUSE", 0x13 }, // PAUSE key
        { "KEY_CAPITAL", 0x14 }, // CAPS LOCK key
        { "KEY_KANA", 0x15 }, // IME Kana mode
        { "KEY_HANGUEL", 0x15 }, // IME Hanguel mode (maintained for compatibility use KEY_HANGUL)
        { "KEY_HANGUL", 0x15 }, // IME Hangul mode
        { "KEY_JUNJA", 0x17 }, // IME Junja mode
        { "KEY_FINAL", 0x18 }, // IME final mode
        { "KEY_HANJA", 0x19 }, // IME Hanja mode
        { "KEY_KANJI", 0x19 }, // IME Kanji mode
        { "KEY_ESCAPE", 0x1B }, // ESC key
        { "KEY_CONVERT", 0x1C }, // IME convert
        { "KEY_NONCONVERT", 0x1D }, // IME nonconvert
        { "KEY_ACCEPT", 0x1E }, // IME accept
        { "KEY_MODECHANGE", 0x1F }, // IME mode change request
        { "KEY_SPACE", 0x20 }, // SPACEBAR
        { "KEY_PRIOR", 0x21 }, // PAGE UP key
        { "KEY_NEXT", 0x22 }, // PAGE DOWN key
        { "KEY_END", 0x23 }, // END key
        { "KEY_HOME", 0x24 }, // HOME key
        { "KEY_LEFT", 0x25 }, // LEFT ARROW key
        { "KEY_UP", 0x26 }, // UP ARROW key
        { "KEY_RIGHT", 0x27 }, // RIGHT ARROW key
        { "KEY_DOWN", 0x28 }, // DOWN ARROW key
        { "KEY_SELECT", 0x29 }, // SELECT key
        { "KEY_PRINT", 0x2A }, // PRINT key
        { "KEY_EXECUT", 0x2B }, // EXECUTE key
        { "KEY_SNAPSHOT", 0x2C }, // PRINT SCREEN key
        { "KEY_INSERT", 0x2D }, // INS key
        { "KEY_DELETE", 0x2E }, // DEL key
        { "KEY_HELP", 0x2F }, // HELP key
        { "KEY_KEY_0", 0x30 }, // 0 key
        { "KEY_KEY_1", 0x31 }, // 1 key
        { "KEY_KEY_2", 0x32 }, // 2 key
        { "KEY_KEY_3", 0x33 }, // 3 key
        { "KEY_KEY_4", 0x34 }, // 4 key
        { "KEY_KEY_5", 0x35 }, // 5 key
        { "KEY_KEY_6", 0x36 }, // 6 key
        { "KEY_KEY_7", 0x37 }, // 7 key
        { "KEY_KEY_8", 0x38 }, // 8 key
        { "KEY_KEY_9", 0x39 }, // 9 key
        { "KEY_KEY_A", 0x41 }, // A key
        { "KEY_KEY_B", 0x42 }, // B key
        { "KEY_KEY_C", 0x43 }, // C key
        { "KEY_KEY_D", 0x44 }, // D key
        { "KEY_KEY_E", 0x45 }, // E key
        { "KEY_KEY_F", 0x46 }, // F key
        { "KEY_KEY_G", 0x47 }, // G key
        { "KEY_KEY_H", 0x48 }, // H key
        { "KEY_KEY_I", 0x49 }, // I key
        { "KEY_KEY_J", 0x4A }, // J key
        { "KEY_KEY_K", 0x4B }, // K key
        { "KEY_KEY_L", 0x4C }, // L key
        { "KEY_KEY_M", 0x4D }, // M key
        { "KEY_KEY_N", 0x4E }, // N key
        { "KEY_KEY_O", 0x4F }, // O key
        { "KEY_KEY_P", 0x50 }, // P key
        { "KEY_KEY_Q", 0x51 }, // Q key
        { "KEY_KEY_R", 0x52 }, // R key
        { "KEY_KEY_S", 0x53 }, // S key
        { "KEY_KEY_T", 0x54 }, // T key
        { "KEY_KEY_U", 0x55 }, // U key
        { "KEY_KEY_V", 0x56 }, // V key
        { "KEY_KEY_W", 0x57 }, // W key
        { "KEY_KEY_X", 0x58 }, // X key
        { "KEY_KEY_Y", 0x59 }, // Y key
        { "KEY_KEY_Z", 0x5A }, // Z key
        { "KEY_LWIN", 0x5B }, // Left Windows key (Microsoft Natural keyboard)
        { "KEY_RWIN", 0x5C }, // Right Windows key (Natural keyboard)
        { "KEY_APPS", 0x5D }, // Applications key (Natural keyboard)
        { "KEY_SLEEP", 0x5F }, // Computer Sleep key
        { "KEY_NUMPAD0", 0x60 }, // Numeric keypad 0 key
        { "KEY_NUMPAD1", 0x61 }, // Numeric keypad 1 key
        { "KEY_NUMPAD2", 0x62 }, // Numeric keypad 2 key
        { "KEY_NUMPAD3", 0x63 }, // Numeric keypad 3 key
        { "KEY_NUMPAD4", 0x64 }, // Numeric keypad 4 key
        { "KEY_NUMPAD5", 0x65 }, // Numeric keypad 5 key
        { "KEY_NUMPAD6", 0x66 }, // Numeric keypad 6 key
        { "KEY_NUMPAD7", 0x67 }, // Numeric keypad 7 key
        { "KEY_NUMPAD8", 0x68 }, // Numeric keypad 8 key
        { "KEY_NUMPAD9", 0x69 }, // Numeric keypad 9 key
        { "KEY_MULTIPLY", 0x6A }, // Multiply key
        { "KEY_ADD", 0x6B }, // Add key
        { "KEY_SEPARATOR", 0x6C }, // Separator key
        { "KEY_SUBTRACT", 0x6D }, // Subtract key
        { "KEY_DECIMAL", 0x6E }, // Decimal key
        { "KEY_DIVIDE", 0x6F }, // Divide key
        { "KEY_F1", 0x70 }, // F1 key
        { "KEY_F2", 0x71 }, // F2 key
        { "KEY_F3", 0x72 }, // F3 key
        { "KEY_F4", 0x73 }, // F4 key
        { "KEY_F5", 0x74 }, // F5 key
        { "KEY_F6", 0x75 }, // F6 key
        { "KEY_F7", 0x76 }, // F7 key
        { "KEY_F8", 0x77 }, // F8 key
        { "KEY_F9", 0x78 }, // F9 key
        { "KEY_F10", 0x79 }, // F10 key
        { "KEY_F11", 0x7A }, // F11 key
        { "KEY_F12", 0x7B }, // F12 key
        { "KEY_F13", 0x7C }, // F13 key
        { "KEY_F14", 0x7D }, // F14 key
        { "KEY_F15", 0x7E }, // F15 key
        { "KEY_F16", 0x7F }, // F16 key
        { "KEY_F17", 0x80 }, // F17 key
        { "KEY_F18", 0x81 }, // F18 key
        { "KEY_F19", 0x82 }, // F19 key
        { "KEY_F20", 0x83 }, // F20 key
        { "KEY_F21", 0x84 }, // F21 key
        { "KEY_F22", 0x85 }, // F22 key
        { "KEY_F23", 0x86 }, // F23 key
        { "KEY_F24", 0x87 }, // F24 key
        { "KEY_NUMLOCK", 0x90 }, // NUM LOCK key
        { "KEY_SCROLL", 0x91 }, // SCROLL LOCK key
        { "KEY_LSHIFT", 0xA0 }, // Left SHIFT key
        { "KEY_RSHIFT", 0xA1 }, // Right SHIFT key
        { "KEY_LCONTROL", 0xA2 }, // Left CONTROL key
        { "KEY_RCONTROL", 0xA3 }, // Right CONTROL key
        { "KEY_LMENU", 0xA4 }, // Left MENU key
        { "KEY_RMENU", 0xA5 }, // Right MENU key
        { "KEY_OEM_1", 0xBA }, // for US    ";:"
        { "KEY_PLUS", 0xBB }, // Plus Key   "+"
        { "KEY_COMMA", 0xBC }, // Comma Key  ","
        { "KEY_MINUS", 0xBD }, // Minus Key  "-"
        { "KEY_PERIOD", 0xBE }, // Period Key "."
        { "KEY_OEM_2", 0xBF }, // for US    "/?"
        { "KEY_OEM_3", 0xC0 }, // for US    "`~"
        { "KEY_OEM_4", 0xDB }, // for US    "[{"
        { "KEY_OEM_5", 0xDC }, // for US    "\|"
        { "KEY_OEM_6", 0xDD }, // for US    "]}"
        { "KEY_OEM_7", 0xDE }, // for US    "'""
        { "KEY_OEM_8", 0xDF }, // None
        { "KEY_OEM_AX", 0xE1 }, // for Japan "AX"
        { "KEY_OEM_102", 0xE2 }, // "<>" or "\|"
        { "KEY_ATTN", 0xF6 }, // Attn key
        { "KEY_CRSEL", 0xF7 }, // CrSel key
        { "KEY_EXSEL", 0xF8 }, // ExSel key
        { "KEY_EREOF", 0xF9 }, // Erase EOF key
        { "KEY_PLAY", 0xFA }, // Play key
        { "KEY_ZOOM", 0xFB }, // Zoom key
        { "KEY_PA1", 0xFD }, // PA1 key
        { "KEY_OEM_CLEAR", 0xFE }, // Clear key
    };

    for (auto it = keyMapping.begin(); it != keyMapping.end(); ++it) {
        luaState.GetGlobalEnvironment().Set(it->first, it->second);
    }
}

Now we can create an Esc key handler in our script:

function handleFrame()
    -- Esc
    if KEY_STATE[KEY_ESCAPE] == true then
        exit()
    end
end

Now we are ready to create our first Newton bodies. A body is an invisible object that defines how an Irrlicht mesh behaves: where it is placed, how it moves, how it interacts with other bodies, and so on. Basically, there are two types of bodies:

  1. dynamic, whose movement is determined by the forces applied to them
  2. kinematic, which are controlled by setting their velocities

Those two kinds of bodies are totally different, so interactions between them are not pre-defined: when a dynamic body falls onto a kinematic one, it will simply fall through.

And each body has a shape, which determines how the body behaves when it collides with others - and, of course, how collisions with it are detected. Shapes can be convex or concave. Convex shapes are easier to work with (at the physics-simulation level), but in practice not all bodies are convex. Levels, for example, are often concave, so they need a special kind of shape called a triangle mesh.

Note: to keep the performance of your application high, try to minimize the use of triangle meshes and use shapes as simple as possible. Sometimes it is more efficient to combine a set of primitive shapes like spheres, cylinders and boxes into one compound shape than to use a trimesh.

Let’s create our first simple scene, empowered with physics! We will need only two things:

  1. a sphere
  2. the floor

Since the standard Irrlicht distribution does not ship a good mesh for the floor (there is a Quake-like level, but that is too much for our case), we will learn how to make that simple thing in Blender. The next part is a short break between coding sessions.

Next chapter

Aug 28, 2015

First script

As we discussed, we will describe the whole game in scripts and define the core functionality in the C++ core. In this chapter we will be adding Lua to our application. You do not need to download Lua itself - you'd better install it with your system's package manager (yum or apt or whatever your Linux uses, brew for OSX…).

Dependencies

The only thing you need to download from the Internet this time is a Lua wrapper called luacppinterface. So go and get it from Github.

And unpack it… right into the source directory of our project! That's right! It is a really small library, so it will not pollute your project with tons of files.

Now, I mentioned dependency managers earlier. This is how we will handle dependencies in our C++ application: we will simply put the sources of all the libraries we depend on, pinned to the versions we depend on, right into our project. Given that, you may put Irrlicht there as well - you are free to do anything with our project!
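With luacppinterface unpacked into source/, the project tree might look something like this (the luacppinterface-master name assumes you downloaded the master archive from Github):

```
irrlicht_newton_game1/
├── CMakeLists.txt
└── source/
    ├── main.cpp
    └── luacppinterface-master/
        ├── CMakeLists.txt
        └── include/
```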

Build instructions

To build our project we will need to change our CMakeLists.txt file to fetch our new dependency:

cmake_minimum_required(VERSION 3.1)
project(irrlicht_newton_game1)

set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")

set(SOURCE_FILES source/main.cpp)
set(EXECUTABLE_NAME irrlicht_newton_game1)

set(LUACPPINTERFACE_PATH source/luacppinterface-master)

add_subdirectory(${LUACPPINTERFACE_PATH})

find_package(X11)
find_package(OpenGL)
find_package(ZLIB)
find_package(Lua)

if (NOT IRRLICHT_LIBRARY_PATH)
    find_library(IRRLICHT_LIBRARY_PATH
            NAMES Irrlicht
            PATHS ${IRRLICHT_PATH}/lib/
            PATH_SUFFIXES Linux MacOSX Win32-gcc Win32-visualstudio Win64-visualstudio)

    message(STATUS "Found Irrlicht: ${IRRLICHT_LIBRARY_PATH}")
endif()

include_directories(${IRRLICHT_PATH}/include ${LUA_INCLUDE_DIR} ${LUACPPINTERFACE_PATH}/include)

add_executable(${EXECUTABLE_NAME} ${SOURCE_FILES})

target_link_libraries(${EXECUTABLE_NAME}
        luacppinterface
        ${IRRLICHT_LIBRARY_PATH}
        ${X11_LIBRARIES}
        ${OPENGL_LIBRARIES}
        ${ZLIB_LIBRARIES}
        ${X11_Xxf86vm_LIB}
        ${LUA_LIBRARIES})

And here’s the thing: if you try to compile our project on another machine, you will not need to install any other libraries than Lua on that machine! That supposed to sound like “sweet, huh?”, except that one little “but…“… Bittersweet…

Back to our business: luacppinterface needs to be tweaked a bit to fit our project - we will hack its CMakeLists.txt file to make it depend on the system Lua libraries. Just make it look like this:

cmake_minimum_required (VERSION 2.6)

project(luacppinterface)

include_directories("lua/src")

find_package(Lua)

include_directories(${LUA_INCLUDE_DIR})

add_library(luacppinterface STATIC
    include/luacoroutine.cpp
    include/luareference.cpp
    include/luacppinterface.cpp
    include/luatable.cpp
    include/luafunction.cpp
)

target_link_libraries(luacppinterface ${LUA_LIBRARIES})

It barely differs from the original file, but it makes compilation pleasant - you do not need to specify paths to the Lua libs anymore!

Injecting some Lua

Our application now uses C++ code to place 3D objects in a scene. Let's move, say, the sphere creation to a script.

First of all, add the luacppinterface header to our main.cpp file:

#include "luacppinterface-master/include/luacppinterface.h"

Now let's look at some of Irrlicht's conventions:

  • it uses irr::video::IVideoDriver for rendering operations
  • it uses irr::scene::ISceneManager for scene management

So why not define a ScriptManager to handle scripts? Our requirements for this class (for now) are:

  • it should load and evaluate scripts
  • it should provide simple API to our scripts

Let’s get coding!

class ScriptManager {
private:
    Lua luaState;
    std::map<std::string, scene::ISceneNode*> nodes;
    video::IVideoDriver *driver;
    scene::ISceneManager *smgr;

    void bindFunctions();

public:
    ScriptManager(scene::ISceneManager *_smgr, video::IVideoDriver *_driver) {
        driver = _driver;
        smgr = _smgr;
    }

    void createSphereNode(const std::string name, const std::string textureFile);

    void setNodePosition(const std::string name, LuaTable pos);

    LuaTable getNodePosition(const std::string name);

    void loadScript(const std::string filename) {
        std::ifstream inf(filename);
        std::string code((std::istreambuf_iterator<char>(inf)), std::istreambuf_iterator<char>());

        bindFunctions();

        luaState.RunScript(code);
    }
};

This is just a skeleton - we will fill it out in a minute. To recap:

  1. this class depends on IVideoDriver and ISceneManager to handle 3D objects and the scene
  2. it contains Lua luaState field to store the current state of our script running
  3. it stores all the nodes as a <string, ISceneNode*> map to allow access to our nodes from scripts
  4. it exposes three methods as an API to Lua scripts: createSphereNode, setNodePosition and getNodePosition so we will be able to make some manipulations in our scripts
  5. it provides really short and simple interface to our C++ core: ScriptManager(...) and loadScript

The main principle each and every programmer breaks every day is KISS (Keep It Simple, Stupid). That principle should guide us through this whole tutorial, so we do not overthink and overengineer - neither ourselves nor the project we are making. That is why our APIs are this simple.

But let's get back to our ScriptManager. It shows how things will look, but does not define how they will actually work. So here are the key points of our Lua API:

  1. LuaTable represents a Lua table - both an indexed array and a key-value dictionary. This type is the way to pass values between a Lua script and the C++ program. You may use both table.Get<value_type>(index) and table.Get<value_type>("key") to access its values.

  2. To bind our ScriptManager methods to Lua functions, we need pointers to those functions. And since that is not so simple in plain C++, we will use C++11 lambdas:

    auto createSphere = luaState.CreateFunction<void(std::string, std::string)>([&](std::string name, std::string tex) -> void { createSphereNode(name, tex); });
    
  3. All the functions and variables you want to expose to Lua scripts must be global. And since we have our pretty luaState member, we may set globals through its methods:
    LuaTable global = luaState.GetGlobalEnvironment();

    // ...

    global.Set("createSphere", createSphere);
    
  4. We will use a map from node names to Irrlicht scene nodes to pass those nodes between the scripts and the core:
    void createSphereNode(const std::string name, const std::string textureFile) {
        scene::ISceneNode *node = smgr->addSphereSceneNode();

        if (node) {
            node->setPosition(core::vector3df(0, 0, 30));
            node->setMaterialTexture(0, driver->getTexture(textureFile.c_str()));
            node->setMaterialFlag(video::EMF_LIGHTING, false);
        }

        nodes[name] = node;
    }

    void setNodePosition(const std::string name, LuaTable pos) {
        float x, y, z;

        x = pos.Get<float>("x");
        y = pos.Get<float>("y");
        z = pos.Get<float>("z");

        nodes[name]->setPosition(core::vector3df(x, y, z));
    }

    LuaTable getNodePosition(const std::string name) {
        LuaTable pos = luaState.CreateTable();

        core::vector3df v = nodes[name]->getPosition();

        pos.Set("x", v.X);
        pos.Set("y", v.Y);
        pos.Set("z", v.Z);

        return pos;
    }
    

Given those, we have our API and are able to create and run our first Lua script. Add one in the media/scripts/ directory:

createSphere("sphere1", "media/textures/wall.bmp")

Note: paths in the script are used by the C++ core, so they are resolved relative to the binary built from our C++ code. Hence all the paths in the scripts are exactly the same as they would be in the C++ core.

And add the ScriptManager initialization code:

ScriptManager *scriptMgr = new ScriptManager(smgr, driver);

scriptMgr->loadScript("media/scripts/test1.lua");

Now you may remove the code creating the sphere in the main() function and run the application. You should see exactly the same picture as before:

Homework

Your task: try to move all the other “factory” calls (creating the cube, the ninja, the circle animator for the cube and the fly animator for the ninja) into the Lua script, adding an API for them to ScriptManager.

More separation

We will now advance our script and add some conventions to it. These will be our tasks for the rest of this chapter:

  1. move keyboard events handling to script
  2. create two functions in the script, so we may call them by convention, not by configuration

I took the last phrase from the Ember.js introduction. It says “prefer convention over configuration”, meaning we'd better call functions with the same name in different scripts, instead of configuring somewhere which function to call.

That is, we will define a handleFrame() function in our script, which will be called on each frame by our C++ core, and a main() function, which will be called right after the script has been loaded.

auto handler = luaState.GetGlobalEnvironment().Get<LuaFunction<void(void)>>("handleFrame");

// ...

handler.Invoke();

Moreover, we will define a global keyboard state table for each script we load and will update it as the user presses keys. This variable will be shared with the script, but as a read-only one - changes to that table from the script will have no effect on the application itself.

class ScriptManager {
private:
    std::map<int, bool> keyStates;

public:
    void setGlobalVariables() {
        setKeyStates();
    }

    void setKeyStates() {
        LuaTable keysTable = luaState.CreateTable();

        for (auto &kv : keyStates) {
            keysTable.Set(kv.first, kv.second);
        }

        luaState.GetGlobalEnvironment().Set("KEY_STATE", keysTable);
    }

    void setKeyState(int key, bool state) {
        keyStates[key] = state;
    }

    void handleFrame() {
        auto handler = luaState.GetGlobalEnvironment().Get<LuaFunction<void(void)>>("handleFrame");

        setKeyStates();

        handler.Invoke();
    }

    void loadScript(const std::string filename) {
        std::ifstream inf(filename);
        std::string code((std::istreambuf_iterator<char>(inf)), std::istreambuf_iterator<char>());

        bindFunctions();
        setGlobalVariables();

        luaState.RunScript(code);

        auto scriptMainFn = luaState.GetGlobalEnvironment().Get<LuaFunction<void(void)>>("main");
        scriptMainFn.Invoke();
    }
};

class MyEventReceiver : public IEventReceiver {
public:
    MyEventReceiver(ScriptManager *scriptManager) {
        scriptMgr = scriptManager;

        for (u32 i = 0; i < KEY_KEY_CODES_COUNT; ++i)
            scriptMgr->setKeyState(i, false);
    }

    // This is the one method that we have to implement
    virtual bool OnEvent(const SEvent &event) {
        // Remember whether each key is down or up
        if (event.EventType == irr::EET_KEY_INPUT_EVENT)
            scriptMgr->setKeyState(event.KeyInput.Key, event.KeyInput.PressedDown);

        return false;
    }

private:
    ScriptManager *scriptMgr;
};

Variables are added to the GlobalEnvironment just as functions are:

luaState.GetGlobalEnvironment().Set("KEY_STATE", keysTable);

Lua-defined functions are looked up by name and called with the Invoke(args) method:

auto handler = luaState.GetGlobalEnvironment().Get<LuaFunction<void(void)>>("handleFrame");

handler.Invoke();

Let’s add some simple interaction to our script now. I’ll help you a bit:

void moveNode(const std::string name, LuaTable pos) {
    scene::ISceneNode *node = findNode(name);
    core::vector3df vec = tableToVector3df(pos);

    core::matrix4 m;

    core::vector3df rot = node->getRotation();
    m.setRotationDegrees(rot);

    m.transformVect(vec);
    node->setPosition(node->getPosition() + vec);
    node->updateAbsolutePosition();
}

This is how nodes can be moved relative to their current position and rotation in Irrlicht.

And here's how our Lua script may look now:

function handleFrame()
    -- w
    if KEY_STATE[0x57] == true then
        move("sphere1", { x = 0, y = 1, z = 0 })
    end

    -- s
    if KEY_STATE[0x53] == true then
        move("sphere1", { x = 0, y = -1, z = 0 })
    end
end

function main()
    createSphere("sphere1", "media/textures/wall.bmp")
    setPosition("sphere1", { x = 0, y = 0, z = 30 })

    createCube("cube1", "media/textures/t351sml.jpg")
    addCircleAnimator("cube1", { x = 0, y = 0, z = 30 }, 20.0)

    createAnimatedMesh("ninja", "media/models/ninja.b3d", "media/textures/nskinbl.jpg", 0, 13, 15)
    setRotation("ninja", { x = 0, y = -90, z = 0 })
    setScale("ninja", { x = 2, y = 2, z = 2 })
    addForwardAnimator("ninja", { x = 100, y = 0, z = 60 }, { x = -100, y = 0, z = 60 }, 3500, true)
end

If you run the application now, you should be able to control the sphere with the w and s keys:

Next chapter

Aug 27, 2015

First application

Install Irrlicht

First of all, you will definitely need the Irrlicht engine, so go get it.

Then you will need to compile it. The compilation process depends on the operating system you use, but it's really similar on each one.

Linux

Install these dependencies with your system's package manager: libenet-dev libxxf86vm-dev zlib-dev cmake.

Unzip Irrlicht, go to the unpacked directory in a terminal and run the following:

cd source/Irrlicht
make

Believe it or not, that's all!

Windows

Unzip Irrlicht, go to the directory you unpacked it to and open the VisualStudio project in source/Irrlicht (depending on your VisualStudio version, you might need to open a slightly different file):

Irrlicht10.0.sln
Irrlicht11.0.sln
Irrlicht8.0.sln
Irrlicht9.0.sln

Build it with VisualStudio - and you are done!

MacOS X

The steps are a bit more complicated, and they require you to install XCode and the Command-Line Tools - both can be found either in the AppStore or on the Apple website.

  • First of all, you need to install a bunch of dependencies (I use brew for this purpose):
    brew install tinyxml enet lua cmake
    
  • Get a list of all compilers available for your OSX version:
  xcodebuild -showBuildSettings | grep DEFAULT_COMPILER
  

I got something like this:

  $ xcodebuild -showBuildSettings | grep DEFAULT_COMPILER
    DEFAULT_COMPILER = com.apple.compilers.llvm.clang.1_0
  
  • Now the build process:
  cd source/Irrlicht/MacOSX
  xcodebuild -project MacOSX.xcodeproj GCC_VERSION=com.apple.compilers.llvm.clang.1_0
  
  • And the final step - copy the library to the lib/MacOSX directory:
  cp build/Release/libIrrlicht.a ../../../lib/MacOSX
  

Phew! That’s a damn bunch of commands, don’t you think?

Common

After performing the steps described above, you will end up with the compiled Irrlicht library within the lib/ directory, under a platform-specific subdirectory:

Linux/libIrrlicht.a
MacOSX/libIrrlicht.a
Win32-visualstudio/Irrlicht.lib
Win64-visualStudio/Irrlicht.lib

Now, create a blank project in your favorite IDE and proceed…

Application itself

Our first application will show you the basic Irrlicht features we will use later. They are:

  • mesh handling - loading, rendering, animating, etc.
  • user input handling - reacting to keyboard and mouse events
  • user interface (UI) - displaying some information within the application window

A good starting point is a standard example from the Irrlicht pack, 04 - Movement. Let's take a look at its code:

/** Example 004 Movement

This Tutorial shows how to move and animate SceneNodes. The
basic concept of SceneNodeAnimators is shown as well as manual
movement of nodes using the keyboard.  We'll demonstrate framerate
independent movement, which means moving by an amount dependent
on the duration of the last run of the Irrlicht loop.

Example 19.MouseAndJoystick shows how to handle those kinds of input.

As always, I include the header files, use the irr namespace,
and tell the linker to link with the .lib file.
*/
#ifdef _MSC_VER
// We'll also define this to stop MSVC complaining about sprintf().
#define _CRT_SECURE_NO_WARNINGS
#pragma comment(lib, "Irrlicht.lib")
#endif

#include <irrlicht.h>

using namespace irr;

/*
To receive events like mouse and keyboard input, or GUI events like "the OK
button has been clicked", we need an object which is derived from the
irr::IEventReceiver object. There is only one method to override:
irr::IEventReceiver::OnEvent(). This method will be called by the engine once
when an event happens. What we really want to know is whether a key is being
held down, and so we will remember the current state of each key.
*/
class MyEventReceiver : public IEventReceiver
{
public:
    // This is the one method that we have to implement
    virtual bool OnEvent(const SEvent& event)
    {
        // Remember whether each key is down or up
        if (event.EventType == irr::EET_KEY_INPUT_EVENT)
            KeyIsDown[event.KeyInput.Key] = event.KeyInput.PressedDown;

        return false;
    }

    // This is used to check whether a key is being held down
    virtual bool IsKeyDown(EKEY_CODE keyCode) const
    {
        return KeyIsDown[keyCode];
    }

    MyEventReceiver()
    {
        for (u32 i=0; i<KEY_KEY_CODES_COUNT; ++i)
            KeyIsDown[i] = false;
    }

private:
    // We use this array to store the current state of each key
    bool KeyIsDown[KEY_KEY_CODES_COUNT];
};

/*
The event receiver for keeping the pressed keys is ready, the actual responses
will be made inside the render loop, right before drawing the scene. So lets
just create an irr::IrrlichtDevice and the scene node we want to move. We also
create some other additional scene nodes, to show that there are also some
different possibilities to move and animate scene nodes.
*/
int main()
{
    // create device
    MyEventReceiver receiver;

    IrrlichtDevice* device = createDevice(video::EDT_OPENGL,
            core::dimension2d<u32>(640, 480), 16, false, false, false, &receiver);

    if (device == 0)
        return 1; // could not create selected driver.

    video::IVideoDriver* driver = device->getVideoDriver();
    scene::ISceneManager* smgr = device->getSceneManager();

    /*
    Create the node which will be moved with the WSAD keys. We create a
    sphere node, which is a built-in geometry primitive. We place the node
    at (0,0,30) and assign a texture to it to let it look a little bit more
    interesting. Because we have no dynamic lights in this scene we disable
    lighting for each model (otherwise the models would be black).
    */
    scene::ISceneNode * node = smgr->addSphereSceneNode();
    if (node)
    {
        node->setPosition(core::vector3df(0,0,30));
        node->setMaterialTexture(0, driver->getTexture("../../media/wall.bmp"));
        node->setMaterialFlag(video::EMF_LIGHTING, false);
    }

    /*
    Now we create another node, movable using a scene node animator. Scene
    node animators modify scene nodes and can be attached to any scene node
    like mesh scene nodes, billboards, lights and even camera scene nodes.
    Scene node animators are not only able to modify the position of a
    scene node, they can also animate the textures of an object for
    example. We create a cube scene node and attach a 'fly circle' scene
    node animator to it, letting this node fly around our sphere scene node.
    */
    scene::ISceneNode* n = smgr->addCubeSceneNode();

    if (n)
    {
        n->setMaterialTexture(0, driver->getTexture("../../media/t351sml.jpg"));
        n->setMaterialFlag(video::EMF_LIGHTING, false);
        scene::ISceneNodeAnimator* anim =
            smgr->createFlyCircleAnimator(core::vector3df(0,0,30), 20.0f);
        if (anim)
        {
            n->addAnimator(anim);
            anim->drop();
        }
    }

    /*
    The last scene node we add to show possibilities of scene node animators is
a b3d model, which uses a 'fly straight' animator to run between two points.
    */
    scene::IAnimatedMeshSceneNode* anms =
        smgr->addAnimatedMeshSceneNode(smgr->getMesh("../../media/ninja.b3d"));

    if (anms)
    {
        scene::ISceneNodeAnimator* anim =
            smgr->createFlyStraightAnimator(core::vector3df(100,0,60),
            core::vector3df(-100,0,60), 3500, true);
        if (anim)
        {
            anms->addAnimator(anim);
            anim->drop();
        }

        /*
        To make the model look right we disable lighting, set the
        frames between which the animation should loop, rotate the
        model around 180 degrees, and adjust the animation speed and
        the texture. To set the right animation (frames and speed), we
        would also be able to just call
        "anms->setMD2Animation(scene::EMAT_RUN)" for the 'run'
        animation instead of "setFrameLoop" and "setAnimationSpeed",
        but this only works with MD2 animations, and so you know how to
        start other animations. But a good advice is to not use
        hardcoded frame-numbers...
        */
        anms->setMaterialFlag(video::EMF_LIGHTING, false);

        anms->setFrameLoop(0, 13);
        anms->setAnimationSpeed(15);
//      anms->setMD2Animation(scene::EMAT_RUN);

        anms->setScale(core::vector3df(2.f,2.f,2.f));
        anms->setRotation(core::vector3df(0,-90,0));
//      anms->setMaterialTexture(0, driver->getTexture("../../media/sydney.bmp"));

    }

    /*
    To be able to look at and move around in this scene, we create a first
    person shooter style camera and make the mouse cursor invisible.
    */
    smgr->addCameraSceneNodeFPS();
    device->getCursorControl()->setVisible(false);

    /*
    Add a colorful irrlicht logo
    */
    device->getGUIEnvironment()->addImage(
        driver->getTexture("../../media/irrlichtlogoalpha2.tga"),
        core::position2d<s32>(10,20));

    gui::IGUIStaticText* diagnostics = device->getGUIEnvironment()->addStaticText(
        L"", core::rect<s32>(10, 10, 400, 20));
    diagnostics->setOverrideColor(video::SColor(255, 255, 255, 0));

    /*
    We have done everything, so lets draw it. We also write the current
    frames per second and the name of the driver to the caption of the
    window.
    */
    int lastFPS = -1;

    // In order to do framerate independent movement, we have to know
    // how long it was since the last frame
    u32 then = device->getTimer()->getTime();

    // This is the movement speed in units per second.
    const f32 MOVEMENT_SPEED = 5.f;

    while(device->run())
    {
        // Work out a frame delta time.
        const u32 now = device->getTimer()->getTime();
        const f32 frameDeltaTime = (f32)(now - then) / 1000.f; // Time in seconds
        then = now;

        /* Check if keys W, S, A or D are being held down, and move the
        sphere node around respectively. */
        core::vector3df nodePosition = node->getPosition();

        if(receiver.IsKeyDown(irr::KEY_KEY_W))
            nodePosition.Y += MOVEMENT_SPEED * frameDeltaTime;
        else if(receiver.IsKeyDown(irr::KEY_KEY_S))
            nodePosition.Y -= MOVEMENT_SPEED * frameDeltaTime;

        if(receiver.IsKeyDown(irr::KEY_KEY_A))
            nodePosition.X -= MOVEMENT_SPEED * frameDeltaTime;
        else if(receiver.IsKeyDown(irr::KEY_KEY_D))
            nodePosition.X += MOVEMENT_SPEED * frameDeltaTime;

        node->setPosition(nodePosition);

        driver->beginScene(true, true, video::SColor(255,113,113,133));

        smgr->drawAll(); // draw the 3d scene
        device->getGUIEnvironment()->drawAll(); // draw the gui environment (the logo)

        driver->endScene();

        int fps = driver->getFPS();

        if (lastFPS != fps)
        {
            core::stringw tmp(L"Movement Example - Irrlicht Engine [");
            tmp += driver->getName();
            tmp += L"] fps: ";
            tmp += fps;

            device->setWindowCaption(tmp.c_str());
            lastFPS = fps;
        }
    }

    /*
    In the end, delete the Irrlicht device.
    */
    device->drop();

    return 0;
}

/*
That's it. Compile and play around with the program.
**/

Building the project

Paste the code above into the source/main.cpp file of your blank project. The exact layout may differ, but that is not critical. Now add a CMakeLists.txt file to your project and fill it with these commands:

cmake_minimum_required(VERSION 3.1)
project(irrlicht_newton_game1)

set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")

find_package(X11)
find_package(OpenGL)
find_package(ZLIB)

if (NOT IRRLICHT_LIBRARY_PATH)
    find_library(IRRLICHT_LIBRARY_PATH
            NAMES Irrlicht
            PATHS ${IRRLICHT_PATH}/lib/
            PATH_SUFFIXES Linux MacOSX Win32-gcc Win32-visualstudio Win64-visualstudio)

    message(STATUS "Found Irrlicht: ${IRRLICHT_LIBRARY_PATH}")
endif()

include_directories(${IRRLICHT_PATH}/include)

set(SOURCE_FILES source/main.cpp)
set(EXECUTABLE_NAME irrlicht_newton_game1)

add_executable(${EXECUTABLE_NAME} ${SOURCE_FILES})

target_link_libraries(${EXECUTABLE_NAME}
        ${IRRLICHT_LIBRARY_PATH}
        ${X11_LIBRARIES}
        ${OPENGL_LIBRARIES}
        ${ZLIB_LIBRARIES}
        ${X11_Xxf86vm_LIB})

Note: for those of you running MacOS X, I prepared a slightly more elaborate CMakeLists.txt file - just to make our application run everywhere:

cmake_minimum_required(VERSION 3.1)
project(irrlicht_newton_game1)

set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")

option("NEWTON_DEMOS_SANDBOX" "Build demos sandbox" OFF)

set(LUACPPINTERFACE_PATH source/luacppinterface-master)
set(CPPFORMAT_PATH source/cppformat-master)
set(NEWTONGD_PATH source/newton-dynamics-master)
set(NEWTONGD_INCLUDE_DIRS
        ${NEWTONGD_PATH}/packages/dCustomJoints
        ${NEWTONGD_PATH}/packages/dContainers
        ${NEWTONGD_PATH}/packages/dMath)

set(NEWTON_LIBRARIES Newton dMath)

add_subdirectory(${LUACPPINTERFACE_PATH})
add_subdirectory(${CPPFORMAT_PATH})
add_subdirectory(${NEWTONGD_PATH})

find_package(X11)
find_package(OpenGL)
find_package(ZLIB)
find_package(Lua)

if (NOT IRRLICHT_LIBRARY_PATH)
    if (UNIX)
        set(IRRLICHT_PATH_SUFFIX Linux)
    endif()

    if (APPLE)
        set(IRRLICHT_PATH_SUFFIX MacOSX)
    endif()

    if (WIN32)
        if (MSVC)
            set(IRRLICHT_PATH_SUFFIX Win32-visualstudio Win64-visualstudio)
        endif()

        if (MINGW)
            set(IRRLICHT_PATH_SUFFIX Win32-gcc)
        endif()
    endif()

    find_library(IRRLICHT_LIBRARY_PATH
            NAMES Irrlicht
            PATHS ${IRRLICHT_PATH}/lib/
            PATH_SUFFIXES ${IRRLICHT_PATH_SUFFIX})

    message(STATUS "Found Irrlicht: ${IRRLICHT_LIBRARY_PATH}")
endif()

set(LIBRARIES luacppinterface
        cppformat
        ${NEWTON_LIBRARIES}
        ${IRRLICHT_LIBRARY_PATH}
        ${X11_LIBRARIES}
        ${OPENGL_LIBRARIES}
        ${ZLIB_LIBRARIES}
        ${LUA_LIBRARIES})

if (NOT APPLE)
    set(LIBRARIES ${LIBRARIES} ${X11_Xxf86vm_LIB})
endif()

include_directories(${IRRLICHT_PATH}/include
        ${LUA_INCLUDE_DIR}
        ${LUACPPINTERFACE_PATH}/include
        ${CPPFORMAT_PATH}
        ${NEWTONGD_INCLUDE_DIRS})

set(SOURCE_FILES source/main.cpp)
set(EXECUTABLE_NAME irrlicht_newton_game1)

add_executable(${EXECUTABLE_NAME} ${SOURCE_FILES})

if (APPLE)
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -framework Foundation -framework OpenGL -framework Cocoa -framework Carbon -framework AppKit -framework IOKit")
endif()

target_link_libraries(${EXECUTABLE_NAME}
        ${LIBRARIES})

CMake file

But what happens in all that code? The first two lines of our CMakeLists.txt file define the project:

cmake_minimum_required(VERSION 3.1)
project(irrlicht_newton_game1)

Then we modify the CMAKE_CXX_FLAGS variable, which is used to set compiler flags. This is how we append to lists or string variables in CMake: we set the variable to a new value consisting of the old one plus the new elements or parts:

set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")
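For example (the variable name below is arbitrary, for illustration only):

```cmake
# set() simply overwrites, so "appending" means re-setting the variable
# to its old value plus the new part:
set(MY_FLAGS "-Wall")
set(MY_FLAGS "${MY_FLAGS} -std=c++11")  # MY_FLAGS is now "-Wall -std=c++11"
```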

Then we tell CMake not to build the Newton demo sandbox subproject and set a few path variables - we will use them to point the compiler to the header and library files of our third-party libraries (Newton itself, Irrlicht and others).

Remember: these are only plain variables; by themselves they have no effect on the compiler.

set(LUACPPINTERFACE_PATH source/luacppinterface-master)
set(CPPFORMAT_PATH source/cppformat-master)
set(NEWTONGD_PATH source/newton-dynamics-master)
set(NEWTONGD_INCLUDE_DIRS
        ${NEWTONGD_PATH}/packages/dCustomJoints
        ${NEWTONGD_PATH}/packages/dContainers
        ${NEWTONGD_PATH}/packages/dMath)

set(NEWTON_LIBRARIES Newton dMath)

Next, we point CMake to our sub-projects, which are in fact our third-party libraries:

add_subdirectory(${LUACPPINTERFACE_PATH})
add_subdirectory(${CPPFORMAT_PATH})
add_subdirectory(${NEWTONGD_PATH})

These lines tell CMake to build the sub-projects before building our application. Because our sub-projects are nothing but libraries, we can then look for the built libraries required by our project in the sub-projects' output directories, like this:

find_package(Lua)

Same way we look for system libraries:

find_package(X11)
find_package(OpenGL)
find_package(ZLIB)

These commands set compile-ready variables like X11_LIBRARIES.
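A typical find_package pattern, sketched here with ZLIB (ZLIB_FOUND, ZLIB_INCLUDE_DIRS and ZLIB_LIBRARIES are the variables the standard CMake find module documents):

```cmake
# find_package() sets <PKG>_FOUND plus include/library variables,
# which we then feed to the compiler and linker.
find_package(ZLIB)
if (ZLIB_FOUND)
    include_directories(${ZLIB_INCLUDE_DIRS})
    target_link_libraries(${EXECUTABLE_NAME} ${ZLIB_LIBRARIES})
endif()
```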

Some sub-projects may set CMake variables too, providing us with paths to their include files or library files. If the Irrlicht library path was not set for us, we try to find it with CMake ourselves:

if (NOT IRRLICHT_LIBRARY_PATH)
    if (UNIX)
        set(IRRLICHT_PATH_SUFFIX Linux)
    endif()

    if (APPLE)
        set(IRRLICHT_PATH_SUFFIX MacOSX)
    endif()

    if (WIN32)
        if (MSVC)
            set(IRRLICHT_PATH_SUFFIX Win32-visualstudio Win64-visualstudio)
        endif()

        if (MINGW)
            set(IRRLICHT_PATH_SUFFIX Win32-gcc)
        endif()
    endif()

    find_library(IRRLICHT_LIBRARY_PATH
            NAMES Irrlicht
            PATHS ${IRRLICHT_PATH}/lib/
            PATH_SUFFIXES ${IRRLICHT_PATH_SUFFIX})

    message(STATUS "Found Irrlicht: ${IRRLICHT_LIBRARY_PATH}")
endif()

Note the built-in variables CMake provides us with: UNIX, APPLE, WIN32, MSVC and many others. They describe which operating system CMake is running under and which compiler it was told to use.

And the most important part of our CMakeLists.txt file:

include_directories(${IRRLICHT_PATH}/include
        ${LUA_INCLUDE_DIR}
        ${LUACPPINTERFACE_PATH}/include
        ${CPPFORMAT_PATH}
        ${NEWTONGD_INCLUDE_DIRS})

set(SOURCE_FILES source/main.cpp)
set(EXECUTABLE_NAME irrlicht_newton_game1)

add_executable(${EXECUTABLE_NAME} ${SOURCE_FILES})

This defines our executable target: the compiler will be run with the include directories, source files and output file specified.

After that, we run the linker to link the intermediate object files produced by the compiler, and end up with the application executable:

target_link_libraries(${EXECUTABLE_NAME}
        ${LIBRARIES})
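The LIBRARIES variable itself is not defined in the listing above; presumably it collects everything found in the previous steps into one list. A sketch of how it could be built (the exact contents are an assumption, not taken from the original project):

```cmake
# a sketch: gather everything the executable links against in one variable;
# assumes the package variables and sub-project targets from the steps above
set(LIBRARIES
        ${X11_LIBRARIES}
        ${OPENGL_LIBRARIES}
        ${ZLIB_LIBRARIES}
        ${LUA_LIBRARIES}
        luacppinterface
        cppformat
        ${NEWTON_LIBRARIES}
        ${IRRLICHT_LIBRARY_PATH})
```

Note that this must come after the find_package and find_library calls, since CMake evaluates variables in order.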

For OSX users there is a small hack needed to build the application:

if (APPLE)
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -framework Foundation -framework OpenGL -framework Cocoa -framework Carbon -framework AppKit -framework IOKit")
endif()

Note the order the commands are specified in: placing include-path variable definitions before the sub-project commands is harmless, but commands with actual effects, like building sub-projects (add_subdirectory), depend on the variables set by earlier commands, so be sure to keep the order sane and clean.

Running the build

Now that you are ready, run the following commands from your project directory (you will need cmake installed on your system):

mkdir build
cd build
cmake -DIRRLICHT_PATH=path_to_directory_where_you_unpacked_irrlicht ..
make

Warning: do not forget to replace path_to_directory_where_you_unpacked_irrlicht with the actual path to the directory where your Irrlicht files live!

This will build our first Irrlicht application. It may not be obvious how handy this is right now, but you will see the power of CMake in later sections.

Before you run the application, copy the whole media directory from the Irrlicht dir to the parent dir of your project. You should end up with a directory structure like this:

.
└── irrlicht_newton_tutorials
    ├── irrlicht_newton_game1
    │   ├── build
    │   ├── CMakeLists.txt
    │   └── source
    │       └── main.cpp
    └── media

Note: if you now just run the irrlicht_newton_game1 binary on OSX, you will see that your application does not react to keyboard events. This is tricky: you need to pack your application as an OSX application bundle. It is easy, though: just create the directory tree with mkdir -p irrlicht_newton_game1.app/Contents/MacOS/ and move your binary file there:

├── irrlicht_newton_game1.app
│   └── Contents
│       └── MacOS
│           └── irrlicht_newton_game1

Open Finder and run the application from there. On other operating systems run the executable file in your build directory.

Buuuuut, since we have CMake, we can simplify this task by making it a part of the build process. We need to create a usual binary file when running Linux or Windows, or a directory structure with the binary at its deepest level when running OSX. CMake allows us to do this in a really easy way:

if (APPLE)
    add_executable(${EXECUTABLE_NAME} MACOSX_BUNDLE ${SOURCE_FILES})
else()
    add_executable(${EXECUTABLE_NAME} ${SOURCE_FILES})
endif()

You should see something like this:

To end the process you may consider switching to a terminal and running

pkill irrlicht_newton_game1

Understanding the code

Here are a few simple things we can extract from the application’s code and understand right from scratch:

  • Each 3D model is a scene node
  • Primitive scene nodes (such as a cube or a sphere) can easily be created with built-in functions:
  scene::ISceneNode* node = smgr->addSphereSceneNode();
  scene::ISceneNode* node = smgr->addCubeSceneNode();
  
  • Animated 3D models (such as character models) can be loaded from a file:
  scene::IAnimatedMeshSceneNode* node = smgr->addAnimatedMeshSceneNode(smgr->getMesh("../../media/ninja.b3d"));
  

Hint: if the mesh is animated, the animation can be started with:

    node->setFrameLoop(0, 13);
    node->setAnimationSpeed(15);
  

Hint: the animation can be stopped by setting its speed to zero:

  node->setAnimationSpeed(0);
  
  • A node is described not only by its vertices and indices (forming a set of triangles, which are drawn in 3D to form a model, called a mesh), but also by its position, rotation and scale

    Those can be set with:

  node->setPosition(core::vector3df(x, y, z));
  node->setRotation(core::vector3df(x_angle, y_angle, z_angle));
  node->setScale(core::vector3df(width_factor, height_factor, depth_factor));
  

Hint: rotation is a set of angles relative to the corresponding axes the node will be rotated around. E.g., vector3df(45, 90, 0) sets a rotation of 45 deg around the X axis, 90 deg around the Y axis and no rotation around the Z axis. All those axes are relative to the node itself.

  • Graphical User Interface (GUI) widgets for information output are labels; they are created with the GUI Manager:
    gui::IGUIStaticText* label = device->getGUIEnvironment()->addStaticText(L"", core::rect<s32>(10, 10, 400, 20));
    

Hint: its text can be set with:

    label->setText(L"some text"); // note the L prefix: setText expects a wide string
    
  • User input is handled by an object of an external IEventReceiver subclass.

    Its method,

    virtual bool OnEvent(const SEvent& event)
    

defines the logic of handling events like mouse events, keyboard events, joystick events, GUI events, etc.

The type of the event is passed in the event.EventType field. The corresponding member (such as MouseInput) is filled with the event parameters.

For example:

  if (event.EventType == EET_MOUSE_INPUT_EVENT) {
      if (event.MouseInput.isLeftPressed()) {
          printf("%d, %d is cursor position\n", event.MouseInput.X, event.MouseInput.Y);
      }
  }
  

Hint: the EventReceiver object has nothing in common with our main game loop, yet the two are strongly related. So we will need some interface, some architectural trick, to link them together.

  • The main game loop should contain the rendering call, the GUI rendering call and other game logic processing calls.

    The simplest main loop could look like this:

    while (device->run()) {
        driver->beginScene(true, true, video::SColor(255, 113, 113, 133));

        smgr->drawAll(); // draw the 3d scene
        device->getGUIEnvironment()->drawAll(); // draw the gui

        driver->endScene();
    }
    
  • There is no simple (or at least built-in) way to get the delta time between two rendered frames. This is an important value! We’ll need it later, when we integrate the physics engine. And Newton GD is not the only engine requiring this value!

    But it can easily be done with this workaround:

    // before main loop
    u32 then = device->getTimer()->getTime();

    // ...

    // within the main game loop
    const u32 now = device->getTimer()->getTime();
    const f32 frameDeltaTime = (f32)(now - then) / 1000.f; // delta time in seconds
    then = now;
    

That was a short introduction to the Irrlicht engine. And it is basically everything we will use for the next few sections.

Next chapter

Basic terms

Let’s talk a bit about our application before we create it. In order to make the development process sequential and least painful, we need to design the application well. The design of an application, or the application architecture, is the hardest thing to change at later stages of development. Thus it must be well thought out at the very beginning to prevent suffering in the future.

Well, there are a number of application architecture levels:

The highest level defines which modules the whole application will consist of and what functionality each of those modules will have.

The next level is how the modules communicate with each other, how they work together.

The lower level is the structure of each module - what classes, entities, data structures and similar things the module will consist of.

One of the lowest, yet still very important architecture levels is how files are organized.

From the highest architecture layer's point of view, I can advise a very simple architecture: assume our game will have a stable, rarely changed core; a set of assets (models, textures, sounds - any content made by artists to be presented to the player); and a bunch of scripts defining all the logic of the game - how the character looks, how the menus are shown and how they react to the player's actions, how objects in the game world behave, how that world looks and all that stuff.

The main benefits of such an approach are:

  1. scripts and assets may be changed at any time
  2. scripts and assets define what we show to the user and how the application behaves, so no change to scripts or assets forces us to re-compile the core
  3. we can modify the core and thus change how the game works internally (mainly for optimization purposes) without changing the overall application functionality and behaviour

We can make the core so flexible that we may re-use it in future projects.

The tools

We will use the Irrlicht engine because of its simplicity. And it satisfies all our needs: it does not require much content preparation; it provides a GUI; and extending it with IrrKlang will give us a very simple interface to sound and music playback.

We will use the Newton Game Dynamics engine to simulate physics. It is easy to use and really powerful - you will be impressed!

Last but not least, we will use the Lua scripting language to write scripts. Lua is a lightweight programming language and perfectly suits that goal.

One of the most beautiful parts of this tutorial will be the part on making assets. We will use Blender 3D to create a couple of 3D models.

I also found CMake kind of user-friendly. It is not as handy as the dependency managers other languages have (npm for JavaScript, go get for Go, RubyGems for Ruby, leiningen for Clojure and many others). Yet it makes your project a little more portable, helps to handle your dependencies and totally eliminates the need for all those How to configure VisualStudio for OGRE tutorials. Just try it!

Conclusion

Remember all three rules for our architecture. And keeping them in mind, let’s get to some coding already!

Next chapter