This section discusses how to create and import models from Blender3D (2.62+, see bottom of page for Blender 2.49 and before) to jME3. Furthermore it explains how you can create various typical game-related assets like normal maps of high-poly models and baked lighting maps.

Asset Management

For the managing of assets in general, be sure to read the Asset Pipeline Documentation. It contains vital information on how to manage your asset files.

Creating Models

Game-compatible models are models that basically consist only of a mesh, UV-mapped textures and, in some cases, animations. All other material parameters or effects (like particles etc.) cannot be expected to transfer properly and probably would not translate well to real-time rendering anyway.

UV Mapped Textures

To successfully import a texture, the texture has to be UV-mapped to the model. Here’s how to assign diffuse, normal and specular maps:

  • blender-material-4.png blender-material-3.png

    • blender-material-2.png blender-material-1.png

It’s important to note that each texture used will create one separate geometry. So it’s best to either combine the UV maps or use a premade atlas containing the different texture types from the start, and then map the UV coordinates of the models to the atlas instead of painting on the texture. This works best for large models like cities and space ships.


Animations for jME3 have to be bone animations. Simple object movement is supported by the Blender importer; mesh deformation or other forms of animation are not supported.

To create an animation from scratch do the following:

  • Create the model.

    • Make sure your model’s location, rotation and scale are applied and zero / one (see “Model Checklist” below).

    • (Did you know? You can make any model from a box by only using extrusion, this creates very clean meshes.)

  • Create the armature bones, don’t forget to have one root bone!

    • Start by placing the cursor at zero.

    • Go to the Add ▸ Armature ▸ Single Bone menu and create the root bone.

      • blender-add-bone.png

    • Select the bone and go to edit mode (press Tab).

    • Select the root bone end and press E to extrude the bone, then start rigging your model this way.

    • Make sure your armature’s location, rotation and scale are applied (see “Model Checklist” below) before continuing.

  • Make the armature the parent of the model.

    • Make sure you are back in object mode (press Tab again).

    • First select the model object then select the armature object with Shift pressed, then press Ctrl + P.

    • When you do this, you can select how the bone groups will be mapped to the model vertices initially. Select With Automatic Weights.

      When you parent your mesh to the armature, Blender automatically adds the Armature modifier to the mesh.


  • Voila, your model should move when you move the bones in pose mode.

  • From the Info header, press the Choose Screen Layout button and select the Animation layout.

  • In the Dope Sheet Editor window, press the Context button and select Action Editor.

    • blender-action-editor.png

  • Add an action by pressing the + button.

  • Set the rotation mode of the bone to Quaternion (or switch from your current rotation mode to Quaternion later) and make a keyframe.

    • blender-switch-rotationmode.png

  • Create the keyframes (select the model armature and press I) along the timeline.

    • blender-add-keyframes.png

  • Each action will be an animation available via the animation control in jME after the import.

  • Press the F button next to the action so it will be saved even if there are no references to it.

    • Otherwise the animation would be deleted when the file is saved, if it is not the active animation on the armature.
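After the import, each Blender action can be played back by name through the animation control. Here is a minimal sketch; the model path and the action name "Walk" are assumptions, so substitute your own asset and action names:

```java
// Inside a SimpleApplication's simpleInitApp():
Spatial model = assetManager.loadModel("Models/MyModel/MyModel.j3o"); // assumed path

AnimControl control = model.getControl(AnimControl.class);

// Every saved Blender action appears here by name:
for (String name : control.getAnimationNames()) {
    System.out.println("Found animation: " + name);
}

AnimChannel channel = control.createChannel();
channel.setAnim("Walk");            // assumed action name
channel.setLoopMode(LoopMode.Loop); // repeat the animation
rootNode.attachChild(model);
```

The names printed in the loop are exactly the action names you created in Blender’s Action Editor.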

Model Checklist

Sometimes you do not create the model yourself, and models from the web are often not really made for OpenGL real-time rendering or not completely compatible with the bone system in jME.

To export an animated model in Blender make sure the following conditions are met:

  • The animation has to be a bone animation.

  • Apply Location, Rotation and Scale to the mesh in Blender: In the 3D Viewport in Blender, select the mesh in Object Mode, from the 3D View Editor header, click Object ▸ Apply ▸ Location / Rotation / Scale.

    • blender_apply_mesh.png

  • Apply Location, Rotation and Scale to the armature in Blender: In the 3D Viewport in Blender, select the armature in Object Mode, from the 3D View Editor header, click Object ▸ Apply ▸ Location / Rotation / Scale.

    • blender_apply_bones.png

  • Set the mesh’s origin point in the bottom of the mesh (see the image below).

  • Set the armature’s origin point in the bottom of the armature (see the image below).

  • The armature’s origin point and the mesh’s origin point must be in the same location (see the image below).

  • Use a root bone located at the armature’s origin. This root bone must be in a vertical position (see the image below); it is the root bone of the armature. If you rotate the root bone, the entire armature might be rotated when you import the model into jMonkeyEngine (this is only the observed result; it is unclear whether the cause is the jMonkeyEngine importer or Blender’s Ogre exporter plugin).

  • Uncheck the “Bone Envelopes” checkbox on the Armature modifier for the mesh (see the image below).

    • blender_envelopes.png

  • Under the armature data tab, make sure the bone type is Octahedral (see image below).

    • blender_rootbone2.png

You can use SkeletonDebugger to show the skeleton on your game in order to check if the mesh and the skeleton are loaded correctly:

    final Material soldier2Mat = assetManager.loadMaterial("Materials/soldier2/soldier2.j3m");
    final Spatial soldier2 = assetManager.loadModel("Models/soldier2/soldier2.j3o");
    soldier2.setMaterial(soldier2Mat);

    final Node soldier2Node = new Node("Soldier2 Node");
    soldier2Node.attachChild(soldier2);

    final AnimControl animControl = soldier2.getControl(AnimControl.class);
    final AnimChannel animChannel = animControl.createChannel();

    // Draw the skeleton as a green wireframe on top of the mesh:
    final SkeletonDebugger skeletonDebug =
                    new SkeletonDebugger("skeleton", animControl.getSkeleton());
    final Material mat = new Material(assetManager, "Common/MatDefs/Misc/Unshaded.j3md");
    mat.setColor("Color", ColorRGBA.Green);
    mat.getAdditionalRenderState().setDepthTest(false);
    skeletonDebug.setMaterial(mat);
    soldier2Node.attachChild(skeletonDebug);
    rootNode.attachChild(soldier2Node);
  • blender_finished.png


NormalMap baking

Models for real-time rendering should have a low polygon count. To increase the perceived detail of a model, normal maps are commonly used in games. This tutorial shows how to create a normal map from a high-poly version of your model that you can apply to a low-poly version of the model in your game.

Blender modeling lowPoly & highPoly

  • If you use the Multiresolution modifier you only need one object. Let’s look at this example:

    • 1.gif

  • Add a multiresolution modifier:

    • 3.1.gif

  • The modifier offers two subdivision types: Catmull-Clark and Simple.

    • Simple is better for things like walls or floors.

    • Catmull-Clark is better for objects like spheres.

  • When using Catmull-Clark with a higher “Subdivide” value (more than 3) it’s good to have the “Preview” value above 0 and less than the subdivide level. This is because Catmull-Clark smooths the vertices, so the normal map is not so precise.

  • Here is an example of Preview 1; it is smoother than the original mesh:

    • 2.gif

  • Enable “Sculpt Mode” in Blender and design the highPoly version of your model like here:

    • 3.gif

  • Now go into the Render tab and bake a normal map using the same configuration as here:

    • 4.gif

Remember! The current preview level affects the baking output and mesh export!

Be careful: The steps above can lead to terrible normal maps. Use this procedure instead:

  • uncheck “Bake from Multires”

  • switch to object mode

  • make a copy of your mesh (SHIFT+D)

  • remove the Multires modifier from the copied model

  • remove any materials from the copied model

  • remove the armature modifier from the copied model

  • select the original (highres) model

  • go into pose mode, clear any pose transformations

  • the highres and lowres models should be on top of each other now

  • select the original (highres) model

  • hold SHIFT and select the copied (lowres) model

  • in the properties menu go to render

  • use Bake > Normal

  • check “Selected to Active”

  • use a reasonably high value for “Margin” (at least 4 pixels for 1024x1024 maps)

  • don’t forget to save the normal map image

Be careful: in the Outliner the camera symbol (Restrict Render) must be on!

Fixing the normal colors in Blender

Blender has its own standard for normal map colors. We need to fix the colors to prepare the normal map for use with the JME Lighting Material.

To do this, go to the Blender Node Window

  • Here is Blender Node example. It fixes the normal colors:

    • 5.gif

  • Here is the colors configuration:

    • 6.gif 7.gif 8.gif

  • Sometimes it is necessary to change the R and G scale and add some blur for a better effect. Do it as in the image below:

    • exception2.gif

  • After rendering, save the file to a destination of your choice and use it with the JME Lighting Material and the low-poly version of the model.

    • ready_normal.gif
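Once the fixed normal map is saved, applying it to the low-poly model in jME3 looks roughly like the following sketch. The model and texture paths are assumptions; also note that the mesh needs tangent data for normal mapping to work:

```java
// Inside simpleInitApp(); paths below are placeholders for your own assets.
Spatial lowPoly = assetManager.loadModel("Models/myModel.j3o");

Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setTexture("DiffuseMap", assetManager.loadTexture("Textures/myModel_diffuse.png"));
mat.setTexture("NormalMap", assetManager.loadTexture("Textures/myModel_normal.png"));
lowPoly.setMaterial(mat);

// The Lighting material needs tangents on the mesh for the normal map:
TangentBinormalGenerator.generate(lowPoly);
rootNode.attachChild(lowPoly);
```

Without the tangent generation step the normal map is silently ignored or rendered wrong, so keep it even if the model looks fine at first glance.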

LightMap baking

The goal of this tutorial is to explain briefly how to bake a light map in Blender with a separate set of texture coordinates, and then export a model using this map in jME3.

Blender modeling + texturing

  • Create a mesh in Blender and unwrap it to create UVs.

    • 1.jpg

  • In the Mesh tab you can see the list of UV sets; unwrapping creates the first one.

    • You can assign whatever texture you want to it; I used Blender’s built-in checker texture for this example.

  • In this list, create a new UV set and click on the camera icon so that baking is done with this set. Name it LightUvMap.

  • In the 3D View, in Edit Mode, select all your mesh vertices and hit U ▸ Lightmap Pack, then confirm; it will unfold the mesh for the light map.

  • Create a new image, go to the Render tab and, at the very end, check the “Bake” section and select Shadows. Then click Bake.

  • If all went well it will create a light map like this:

    • 2.jpg

  • Go to the Material tab, create a new material for your model and go to the Texture tab.

  • Create two textures: one for the color map and one for the light map.

  • In the Mapping section be sure to select Coordinates: UV and select the correct set of coordinates.

    • 3.jpg

  • Then the light map

    • 4.jpg

Importing the model in the SDK and creating the appropriate material

Once this is done, export your model with the Ogre exporter (or import it directly via the blend importer), and turn it into a j3o with the SDK.

  • Create material for it using the lighting definition.

  • Add the colorMap in the diffuse map slot and the lightMap in the light map slot.

  • Make sure you check “SeparateTexCoord”

    • 5.jpg

  • It should roughly result in something like this:

    • 6.jpg
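If you prefer to set the material up in code instead of in the SDK, a sketch could look like this (the model and texture paths are assumptions):

```java
// Inside simpleInitApp(); paths are placeholders for your own assets.
Spatial model = assetManager.loadModel("Models/myScene.j3o");

Material mat = new Material(assetManager, "Common/MatDefs/Light/Lighting.j3md");
mat.setTexture("DiffuseMap", assetManager.loadTexture("Textures/colorMap.png"));
mat.setTexture("LightMap", assetManager.loadTexture("Textures/lightMap.png"));

// Read the light map from the second UV set instead of the first:
mat.setBoolean("SeparateTexCoord", true);

model.setMaterial(mat);
rootNode.attachChild(model);
```

The SeparateTexCoord flag is what maps to the “SeparateTexCoords” checkbox in the SDK material editor: without it, the light map would be sampled with the same UV set as the diffuse map.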

The blend file, the Ogre XML files and the textures can be found in the download section of the Google Code repo.

Modelling racing tracks and cars

Follow the link below to a PDF tutorial by rhymez that guides you through modelling a car, importing it into jMonkeyEngine correctly and editing it in the vehicle editor, plus how to model a simple racing track.

Optimizing Models for 3D games

Follow the link below to a PDF tutorial by rhymez that guides you through optimizing your models for faster rendering.

SkyBox baking

There are several ways to create static images to use for a sky in your game. This section describes the concepts used in Blender and creates an ugly sky. Check the links below for other ways and prettier skies.

A sky box is a texture-mapped cube; it can also, loosely, be called an EnvMap or a CubeMap. The camera is inside the cube, and the clever thing jME does is draw the sky so it is always behind whatever else is in your scene. Imagine the monkey is the camera in the picture.

  • skybox-concept.png

But a real sky is not a box around our heads; it is more like a sphere. So if we put any old image in the sky it will look strange and might even look like a box, which is not what we want. The trick is to distort the image so that it looks like a sphere even though it is in fact a picture pasted on a box. Luckily, Blender can do that tricky distortion for us.

The screenshots are from Blender 2.63 but the equivalent operations have been in blender for years so with minor tweaks should work for almost any version.

So let’s get started

  • Fire up blender and you’ll see something like this.

    • start-screen2.png

  • The cube in the start scene is perfect for us. What we’ll do is have Blender render the scene onto that cube. The resulting image is what we’ll use for our sky box. So our jME sky will look like we stood inside the blender box and looked out on the scene in blender.

  • Start by selecting the box and set its material to shadeless.

    • shadeless.png

  • Now we will create a texture for the box. Make sure the texture is an Environment Map, that the Viewpoint Object is set to the cube. The resolution is how large the resulting image will be. More pixels makes the sky look better but comes at the cost of texture memory. You’ll have to trim the resolution to what works in your application.

    • texture.png

  • Next up is the fun part, create the sky scene in blender. You can do whatever fits your application, include models for a city landscape, set up a texture mapped sphere in blender with a nice photographed sky, whatever you can think will make a good sky. I am not so creative so I created this scene:

    • scene.png

  • Now render the scene (press F12). It doesn’t actually matter where the camera is in blender but you might see something similar to this:

    • render.png

  • You can see that Blender has actually drawn the scene onto the cube. This is exactly what we want. Now to save the image.

  • Select the texture of the cube and select save environment map.

    • saveenvmap.png

  • That is it for Blender. Open the saved image in some image editor (I use GIMP).

The SDK also contains an image editor; right-click the image and select “Edit Image” to open it.

  • You will notice that Blender has taken the 6 sides of the cube and pasted them together into one image (3x2). So now we need to cut it up again into 6 separate images. In GIMP I usually set the guides to where I want to cut and then go into Filters→Web→Slice and let GIMP cut it up for me.

    • post-slice.png

  • Next up is to move the image files into your assets directory and create the sky in jME. You can do that in the Scene Composer by right clicking the scene node, select Add Spatial and then select Skybox.

If you want to do it from code, here is an example:

public void simpleInitApp() {

    Texture westTex = assetManager.loadTexture("Textures/west.png");
    Texture eastTex = assetManager.loadTexture("Textures/east.png");
    Texture northTex = assetManager.loadTexture("Textures/north.png");
    Texture southTex = assetManager.loadTexture("Textures/south.png");
    Texture upTex = assetManager.loadTexture("Textures/top.png");
    Texture downTex = assetManager.loadTexture("Textures/bottom.png");

    final Vector3f normalScale = new Vector3f(-1, 1, 1);
    Spatial skySpatial = SkyFactory.createSky(assetManager,
            westTex, eastTex, northTex, southTex, upTex, downTex, normalScale);
    rootNode.attachChild(skySpatial);
}

This example uses a strange normalScale; it flips the image on the X-axis and might not be needed in your case. Hint: the texture is applied to the outside of the cube, but we are inside, so what do we see?

Further reading