Piboy Advance Mockup

While I have no idea how to put hardware together, I’ve been putting more thought into standardized Pi hardware that could be used as a development platform. For the Piboy Advance, I’ve made the following drawing of the resolution and inputs. Specifically, the hardware uses a 3.5 inch 480×320 screen, with a d-pad, left and right shoulder buttons, four face buttons and two buttons for start / select. Though in practice, these would be better labeled as “Pause” and “Home”.

Not pictured are the power button (which would probably be under the left-hand side of the device) and the volume slider (which would probably be on the right-hand side of the device). Video out and power in would be placed on the top of the device, so it could be charged while being used and the output could be sent to a larger screen.

Ubuntu 18.04 OSM

After switching from Debian to Ubuntu, the results so far have been inconclusive. I was able to install Postgres, download and import the map data, and make and install Mapnik. But after attempting to install mod_tile, I simply end up with this error:

debug: init_storage_backend: initialising file storage backend at: /var/lib/mod_tile

In the Apache error log. However, no tiles are ever generated. Since I think most of the pieces are in place, I’m going to go ahead and fiddle with what’s there and try to get something working. I can try initializing the renderd process from the command line to be able to track the output. I can try generating tiles from the command line to see if Mapnik is working. And I can try to re-import the data, which is running at the moment. So I think it’s a matter of testing the variables and seeing what works.

sudo apt-get install postgresql postgresql-contrib postgis postgresql-10-postgis-2.4 \
postgresql-10-postgis-scripts osm2pgsql git autoconf libtool libmapnik-dev apache2-dev


Okay, I found the issue. Looking at ‘systemctl status renderd’ gives the following error.

Received request for map layer 'default' which failed to load

It looks like the style.xml file doesn’t exist, so the following step is failing.

carto project.mml > style.xml

So the way to fix this was to install carto and then call it directly by its full path.

sudo apt-get install npm
sudo npm install carto -g
/usr/local/lib/node_modules/carto/bin/carto project.mml > style.xml


Follow Up

So here is my index.html file:

<!DOCTYPE html>
<html>
<head>
	<title>Quick Start - Leaflet</title>

	<meta charset="utf-8" />
	<meta name="viewport" content="width=device-width, initial-scale=1.0">
	<link rel="shortcut icon" type="image/x-icon" href="docs/images/favicon.ico" />

	<link rel="stylesheet" href="https://unpkg.com/leaflet@1.4.0/dist/leaflet.css" integrity="sha512-puBpdR0798OZvTTbP4A8Ix/l+A4dHDD0DGqYW6RQ+9jxkRFclaxxQb/SJAWZfWAkuyeQUytO7+7N4QKrDh+drA==" crossorigin=""/>
	<script src="https://unpkg.com/leaflet@1.4.0/dist/leaflet.js" integrity="sha512-QVftwZFqvtRNi0ZyCtsznlKSWOStnDORoefr1enyq5mVL4tmKB3S/EnC3rRJcxCPavG10IcrVGSmPh6Qw5lwrg==" crossorigin=""></script>
</head>
<body>

<div id="mapid" style="width: 600px; height: 400px;"></div>

<script>
	var mymap = L.map('mapid').setView([35.145844, 138.681230], 2);

	L.tileLayer('/osm_tiles/{z}/{x}/{y}.png', {
		maxZoom: 18,
		attribution: 'Map data &copy; <a href="https://www.openstreetmap.org/">OpenStreetMap</a> contributors, ' +
			'<a href="https://creativecommons.org/licenses/by-sa/2.0/">CC-BY-SA</a>, ' +
			'Imagery © <a href="https://www.mapbox.com/">Mapbox</a>',
		id: 'mapbox.streets'
	}).addTo(mymap);
</script>

</body>
</html>

And here’s what the result looks like:

I had to increase the tile missing timeout from 30 seconds to 60 seconds. The result is painfully slow, so it’s not exactly viable, but it is possible. I think the SD card is likely a huge bottleneck for the database read-write speeds. So I’m going to go ahead and say that this is confirmed: you can make a tile server on an SD-card-based ARM SoC, but you probably wouldn’t want to. Something like the Rock Pi 4, which has an NVMe SSD connector, would likely make a better test candidate.

Libreboard OSM


With the popularity of the Raspberry Pi, combined with the Raspberry Pi Foundation’s unwillingness to distribute more powerful hardware, a lot of board makers are coming out with more and more powerful system-on-a-chip computers to try and fill the void, and market, left by the Raspberry Pi. One of these makers is Libre Computer, and the specific board I’m using is the ROC-RK3328-CC running Debian. Links for the hardware and Linux distribution can be found below.

Libre Computer: https://libre.computer/products/boards/roc-rk3328-cc/
Roc Documentation: https://roc-rk3328-cc.readthedocs.io/en/latest/intro.html
Debian: http://download.t-firefly.com/Source/RK3328/ROC-RK3328-CC/Firmware/Debian/ROC-RK3328-CC_Debian9-Arch64_20180525.img.xz

I personally got my hands on the 4GB version, and have the operating system installed on a 64GB SD card. I’d prefer to use an eMMC module, but can’t find a reasonably priced one from the stores available to me; booting from an external hard drive or SSD would also be an option, but I can’t find whether that’s supported. What I specifically wanted to test is: with more powerful ARM devices coming out, is it possible to install something like an OpenStreetMap tile server, to be able to serve regional maps from an SoC?

The source for this install is taken from https://itsolutiondesign.wordpress.com/2017/07/11/build-your-own-openstreetmap-tile-server-on-ubuntu-16-04/, and has been copied to “show my work” on what specific commands are being executed in what order.

Install Postgres

$ sudo apt install postgresql postgresql-contrib postgis postgresql-9.6-postgis-2.3 postgresql-9.6-postgis-scripts

Create Postgres Database

sudo -u postgres -i
createuser osm
createdb -E UTF8 -O osm gis
psql -c "CREATE EXTENSION hstore;" -d gis
psql -c "CREATE EXTENSION postgis;" -d gis

Create OSM user and download map data

sudo adduser osm
su - osm
wget https://github.com/gravitystorm/openstreetmap-carto/archive/v2.41.0.tar.gz
tar xvf v2.41.0.tar.gz
rm v2.41.0.tar.gz
wget -c http://download.geofabrik.de/asia/japan-latest.osm.pbf

Create swap file (to prepare for import)

sudo dd if=/dev/zero of=/swapfile bs=1GB count=2
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

Import map data

sudo apt-get install screen osm2pgsql
su - osm
osm2pgsql --slim -d gis -C 3600 --hstore -S openstreetmap-carto-2.41.0/openstreetmap-carto.style japan-latest.osm.pbf 

And this is where my experiment came to an end. This process takes a long time even on more powerful systems, so I let the import run during the day. When I got home at night, I found that the ROC-RK3328-CC was no longer available on the network. After rebooting the system, I tried to continue the process, only to find that for some reason the SD card was now mounted as read-only, and I didn’t have any luck remounting it as read-write. I reflashed the SD card with Debian and tried the process two more times, and ended up with the same result.

I’m not sure if this is something on my end, such as using a power supply that could not sustain the board for an extended period of time. I’m not sure if the board requires some manner of active cooling. I’m not sure if this is a bug in their compiled version of Debian. But at this point the process became frustrating enough that I decided to try their Ubuntu image and will only return to Debian if that proves successful.

Static Dash Format


DASH v1 IMG num ofs
TEX num ofs MAT num ofs
VERT num ofs FACE num ofs
BONE num ofs ANIM num ofs


IMG id | type offset length width | height


TEX id | IMG id wrap S wrap T flipY


MAT id | TEX id use blend blend src | dst opacity 0.0 – 1.0
transparent | visible vertex color bool skinning bool side const
diffuse red 0.0 – 1.0 diffuse green 0.0 – 1.0 diffuse blue 0.0 – 1.0 diffuse alpha 0.0 – 1.0


Vert Format

position x position y position z index 0 | index 1
index 2 | index 3 weight 0 weight 1 weight 2
weight 3


Face Format

MAT id | a index b index | c index a color b color
c color a normal x a normal y a normal z
b normal x b normal y b normal z c normal x
c normal y c normal z a texture u a texture v
b texture u b texture v c texture u c texture v


bone id | parent id position x position y position z
rotation x rotation y rotation z scale x
scale y scale z


Anim id | num frames offset length time

Anim Format

Frame id | Bone id time pos | rot | scale position x
position y position z rotation x rotation y
rotation z rotation w scale x scale y
scale z
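The frame layout above can be sketched as a packed struct, here in Python for illustration. The width of the pos | rot | scale flag field isn’t pinned down in the notes, so the uint32 flags field below is an assumption.

```python
import struct

# Hypothetical packed layout for one animation frame, little-endian.
# frame id, bone id, time, flags (pos | rot | scale bits), then
# position (3f), rotation as a quaternion (4f), and scale (3f).
FRAME = "<2H f I 3f 4f 3f"

def pack_frame(frame_id, bone_id, time, flags, pos, rot, scale):
    """Pack one animation frame into bytes."""
    return struct.pack(FRAME, frame_id, bone_id, time, flags, *pos, *rot, *scale)

frame = pack_frame(0, 3, 0.0, 0b111, (0, 1, 0), (0, 0, 0, 1), (1, 1, 1))
print(len(frame))  # 52 bytes: 2 + 2 + 4 + 4 + 12 + 16 + 12
```
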

Balancing the Model Layout

So one thing that I can’t get my brain around is how to balance the Dash Model format, specifically with respect to the file body versus the file header. I have a couple of options in front of me. By far the easiest option would be to implement the file type in blocks. In that situation you would create an image block, that image block would contain all of the information for images, and then you would move on to the texture block, which would do the same for the textures. In general, one of the designs that I’m working around is that I want the top of the file to act as a summary of the file. So if someone read the top of the file, they should get an idea of how many vertices, faces, animations and images are in the file.

But then you have to manage how much information to put in the header versus the body. You can create a super tiny header and include the rest of the information in the body, or you can put some of the information (like names) in the header and have the body be nothing but structs, or you can simply have nothing but blocks where all of the data is defined in a list one right after the other. Right now I think it might be easier to define the file format as blocks, include all of the information that I think is required to implement each block, and from there I can think about how to balance the information between the header and blocks.
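One way to picture the block approach: the header is a fixed-size summary with one tag, count, and offset per block type, and the body is just the blocks laid out one after another. A minimal Python sketch, where the tags and the 12-byte summary rows are placeholders rather than the final format:

```python
import struct

# Placeholder block tags; each header summary row is tag(4) + count(4) + ofs(4).
BLOCK_TAGS = [b"IMG ", b"TEX ", b"MAT ", b"VERT", b"FACE", b"BONE", b"ANIM"]

def build_file(blocks):
    """blocks: dict mapping tag -> list of fixed-size entry bytes."""
    header = bytearray(b"DASHv1.0")
    body = bytearray()
    body_start = len(header) + len(BLOCK_TAGS) * 12
    for tag in BLOCK_TAGS:
        entries = blocks.get(tag, [])
        # Each summary row records how many entries the block has and where it starts.
        header += struct.pack("<4sII", tag, len(entries), body_start + len(body))
        for entry in entries:
            body += entry
    return bytes(header + body)

data = build_file({b"VERT": [b"\x00" * 36] * 3})
print(data[:8])  # b'DASHv1.0'
```

Reading just the header summary is enough to answer “how many vertices, faces, animations and images are in the file”, which is the property described above.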

Quick Summary of the Properties:

Created on
Exported By

Image Id
Image Name
Image Data

Texture Id
Texture Name
Image Id
Wrap S/T

Material Id
Material Name
Texture Id (-1 for none)
Diffuse Color

Mostly Minified Version

AUTH            LEN
[                ]
COPY            LEN
[                ]
DATE            LEN
[                ]
TOOL            LEN
[                ]
[ id png ofs len ]
[ id png ofs len ]
[ id png ofs len ]
[ id time ofs len ]
[ id time ofs len ]
[ id time ofs len ]
[ id time ofs len ]
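The AUTH/COPY/DATE/TOOL rows above read like length-prefixed chunks. A sketch of reading them, assuming a 4-byte ASCII tag followed by a uint32 length and the payload (the field widths are my guess at the layout, not something the notes specify):

```python
import struct

def read_chunks(data):
    """Read tag/length-prefixed chunks: a 4-byte ASCII tag, a uint32
    length, then the payload. Field widths here are assumptions."""
    chunks = {}
    pos = 0
    while pos + 8 <= len(data):
        tag, length = struct.unpack_from("<4sI", data, pos)
        pos += 8
        chunks[tag] = data[pos:pos + length]
        pos += length
    return chunks

blob = struct.pack("<4sI", b"AUTH", 4) + b"kion" \
     + struct.pack("<4sI", b"TOOL", 3) + b"fbx"
print(read_chunks(blob))  # {b'AUTH': b'kion', b'TOOL': b'fbx'}
```
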

Fully Minified Version


Designing the Dash Model Format

Since we have the FBX SDK working, the next step is to start working on the next piece, which is designing the Dash Model Format. I’m aware that glTF is a thing, but it’s an overcomplicated format, and not something that I really want to work with myself. And in a lot of cases I’m finding that I’m going in the opposite direction of everyone else, who seem to view three.js as a deployment target at the end of production, whereas I’m using three.js for reading models and trying to get them back into a usable format.

So I’ll start with the pieces that I’m pretty confident about and then try to fill in between, or mark out areas that are unclear. The first part that’s easy is the magic number at the beginning of the file. To be able to check whether a file is a Dash Model file or not, the first four bytes of the file should be “DASH”, followed by the version number. In general, my plan for the Dash Model format is a very specific set of structures to accurately describe models that were created in the 1990s to 2000s, so there could be a lot of modern techniques that I’m missing out on. I don’t want to make anything that tries to support everything and gets over-bloated, but I do want to leave myself some flexibility to allow the format to be expanded or adjusted without breaking compatibility if possible. So following “DASH”, my plan is to put the version number, which can either be a uint32 value of “1, 2, 3, etc.”, or maybe even a string value of “v1.0”, “v2.1”, etc. I’m leaning toward the string version as it seems more descriptive.
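The magic-number check described above is only a few lines; here’s a sketch in Python, assuming the string-version option (“DASH” followed by a 4-byte version like “v1.0”):

```python
def read_magic(data):
    """Return the version string if the file starts with the 'DASH'
    magic number, otherwise None. A 4-byte version field is assumed."""
    if len(data) < 8 or data[:4] != b"DASH":
        return None
    return data[4:8].decode("ascii")

print(read_magic(b"DASHv1.0" + b"\x00" * 16))  # v1.0
print(read_magic(b"glTF...."))                 # None
```
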

The next step is to start working on the values that I want, or need to record. The general areas are pretty straightforward.

– Images
– Textures
– Materials

– Vertices (skin weights, and skin indexes)
– Faces (indices, texture coords, face vertex colors, material index, face normals)

– Bones
– Animations

So there’s really only one “easy” design decision in here, and that’s to have the face handle vertex colors and face normals, as opposed to storing them with the vertices; it’s easier to think of a unique vertex as a unique point with a unique weight. Though honestly, even that is debatable. From there I have a few general questions to contend with:

1) Do I separate between Meshes and Skinned meshes?
2) For weights do I allow for setting the number of weights, or do I assume a maximum of 4?
3) How do I set the format for faces?
4) Do I support shaders?
5) Do I support more than 1 uv channel?

And right now my thoughts are:

1) I don’t think I need to separate skinned and non-skinned meshes; I can just assume everything is technically a skinned mesh, except when it’s not. So models that don’t have bones will just have zero bones, zero animations, and zero weights.

2) For weights, one, the idea is to save space on file size by using binary, so I think I can afford to include weights. And second, most of what this file format is intended for is working with characters. But at the same time, it would be pretty stupid to export stages (which have a large number of vertices) with vertex weights. Then again, since stages won’t have animations, the file size should balance out. So a skinned-or-not flag could be beneficial, but if I want to make the format as static as possible, then I would just include weights in all cases.

3) For the face format, this raises several questions. Do I allow for different face types to be defined (i.e. by defining a byte with bit flags), or do I create a flag that is set for a group of faces? And if I do that, would I have to allow for multiple face groups, or would I force one type for the entire model? Again, this goes with vertices, but if I want to make the file format as static as possible, then inserting default values when these are not set would make things simple, as that’s generally what the editor or viewer is doing in the background anyway.

4) For now, I think no. Even if I define shaders, it’s not something that is supported by a lot of editors. So I think that sticking with a standard format makes more sense.

5) Again, no. I don’t have enough experience with multiple uv channels, and all of the models I’ve worked with, and that I’m targeting for this format, only have one uv channel, so I’m not going to try to support something I’m not familiar with. So for vertices and faces I have the following format:

[ x (4) float ] [ y (4) float ] [ z (4) float ]
[ index x (2) ushort ] [ index y (2) ushort ] [ index z (2) ushort ] [ index w (2) ushort ]
[ weight x (4) float ] [ weight y (4) float ] [ weight z (4) float ] [ weight w (4) float ]

[ material index (2) ushort ]
[ a (2) ushort ] [ b (2) ushort ] [ c (2) ushort ]
[ aColor (4) uchar[4] ] [ bColor (4) uchar[4] ] [ cColor (4) uchar[4] ]
[ aNormal (12) float[3] ] [ bNormal (12) float[3] ] [ cNormal (12) float[3] ]
[ aTexture (8) float[2] ] [ bTexture (8) float[2] ] [ cTexture (8) float[2] ]
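With Python’s struct module, the two layouts above work out to fixed record sizes (36 bytes per vertex, 80 per face), which is easy to sanity-check:

```python
import struct

# Little-endian layouts matching the vertex and face sketches above.
VERTEX = "<3f 4H 4f"        # position, bone indices, bone weights
FACE   = "<H 3H 12B 9f 6f"  # material, indices, vertex colors, normals, uvs

print(struct.calcsize(VERTEX))  # 36 bytes per vertex
print(struct.calcsize(FACE))    # 80 bytes per face

# A vertex without bones can still use the same record:
# bind everything to bone 0 with full weight on the first slot.
v = struct.pack(VERTEX, 1.0, 2.0, 3.0, 0, 0, 0, 0, 1.0, 0.0, 0.0, 0.0)
print(len(v))  # 36
```
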

The next structure to start working on is bones. For bones, there is a lot of information that we have the option of either including or not including. A quick sketch of what I have in mind currently is the following:

[ Bone Name (0x20) char[0x20] ]
[ Bone Id (2) ushort ] [ Parent Bone Id (2) ushort ]
[ Position (12) float[3] ]
[ Rotation (16) float[4] ]
[ Scale (12) float[3] ]
[ Matrix (0x40) float[16] ]
[ Inverse Matrix (0x40) float[16] ]

Though I have several issues with this. For starters, having the bone name floating at the top of the struct looks kind of dumb, but in this format we have images and animations that will have names as well. So I’m not sure if I want to have the names floating, or declare the id before the name. Declaring the id first is going to look better in general, even if it makes the line length 0x28 instead of 0x20, which shouldn’t really be an issue.

For rotation, we have the option of using either quaternions or Euler angles. I don’t think we have to worry about gimbal lock with bones, so normal Euler angles are probably going to be easier to work with.

From there we have the matrix and inverse matrix. Looking at it, the matrix looks pretty redundant. Originally I wanted just the matrix (and potentially the inverse matrix), but I know some importers want the individual position, rotation and scale, so I can probably take the matrix out.

And for the inverse matrix, I’m not sure if this is the best place to put it. The values provided could be wrong, so calculating them on import might be the easiest way, since I’ll probably have to figure out how to calculate them eventually. Basically, we can trust the transformations, and that someone could write those, but probably not the inverse matrix. So what we end up with is:

[ Bone Id (2) ushort ] [ Parent Bone Id (2) ushort ]
[ Bone Name (0x20) char[0x20] ]
[ Position (12) float[3] ]
[ Rotation (12) float[3] ]
[ Scale (12) float[3] ]

Which looks stupid, because it looks like I’ll have NOPs everywhere to pad the struct to increments of 0x10 to make it easier to work with. This is why I originally just wanted the matrix and inverse matrix, but generating matrices is easier than extracting them. Also, I should go ahead and check which editors generate (and require) inverse matrices, because if editors have the values when exporting, we might as well pop them in there.

Though I could reorganize the struct to

[ Bone Id (2) ushort ] [ Parent Bone Id (2) ushort ]
[ Position (12) float[3] ]
[ Rotation (12) float[3] ]
[ Scale (12) float[3] ]
[ Bone Name (0x28) char[0x28] ]

To align the struct to fit sizes of 0x10. Though I don’t think I did this with vertices or faces, so I can probably ignore this constraint and implement what’s easiest, but for bones, it seems like a good idea to make them easy to read (visually). Though I think a lot of that has to do with the string being declared inside the struct, if I declare the string in the header (like an archive), then I can just declare position, rotation, scale in the body of the file (though matrix and inverse matrix would be more elegant). Easiest option seems to be to throw bone names under the bus.
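With the name moved to the end, the fixed fields come to 0x28 bytes, so padding the name out to char[0x28] lands the whole record on 0x50, a multiple of 0x10. A quick Python sketch to check the arithmetic:

```python
import struct

# Bone record with the name moved to the end, as discussed above.
# Fixed fields: id + parent (4 bytes) + position/rotation/scale (36 bytes)
# = 0x28; a 0x28-byte (40-byte) name field brings the total to 0x50.
BONE = "<2H 3f 3f 3f 40s"

print(hex(struct.calcsize(BONE)))  # 0x50

def pack_bone(bone_id, parent_id, pos, rot, scale, name):
    """Pack one bone; struct null-pads the name out to 0x28 bytes."""
    return struct.pack(BONE, bone_id, parent_id, *pos, *rot, *scale,
                       name.encode("ascii"))

bone = pack_bone(0, 0xFFFF, (0, 0, 0), (0, 0, 0), (1, 1, 1), "root")
print(len(bone))  # 80
```
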

Which means right now my file header looks something like this:

I’m actually liking declaring the names in the header, so doing the same with materials and textures might make it look cleaner. And since the size of the header isn’t fixed, I should add an offset and length in the first line of the file. That makes the face list and vertex list the odd members here, since everything else has a name; I’m tempted to use the space to declare the format for the vertices and faces. Also, the extended letters for the labels look stupid; I should keep those to a magic number.

FBX SDK Linux Install (Centos 7)

I bought a LattePanda to serve as a standard device for testing with the FBX SDK. Since the board is an x86_64 processor with 2GB of memory and 32GB of onboard storage for about a hundred dollars, I think it serves as a representation of the minimum requirements for compiling the FBX SDK, and as hardware that theoretically anyone should be able to get their hands on to recreate these steps.

For an operating system I’m using Linux, and more specifically CentOS 7. The reasons for using Linux should hopefully be self-evident to the kind of person reading this (open source). The distribution used is CentOS 7 mostly because it works, and being in the Red Hat ecosystem isn’t likely a downside. I wasn’t able to get Debian working, and I’m not familiar enough with Arch or openSUSE to write instructions.

For prerequisites, we have CentOS 7 installed on a computer (the LattePanda in this case), we have access to the console, and we are logged in as a user with root (sudo) privileges. We will be installing the FBX SDK to the home directory.

Install Prerequisites

$ sudo yum install gcc make gcc-c++ glibc-devel.i686 glibc-devel libuuid-devel libuuid-devel.i686 libstdc++-devel.i686 libstdc++-devel wget

Download and Install SDK (to user’s home directory)

$ cd /home/$USER/
$ wget http://download.autodesk.com/us/fbx/2019/2019.0/fbx20190_fbxsdk_linux.tar.gz
$ tar xvzf fbx20190_fbxsdk_linux.tar.gz
$ rm fbx20190_fbxsdk_linux.tar.gz
$ ./fbx20190_fbxsdk_linux

To install the SDK, you will be asked for the installation directory; the default is the current (home) directory. You will then be asked to agree to the license agreement, and whether you want to read the readme or not, which you can answer “yes” or “no” accordingly.

The following files should now be inside your home directory:

-rwxr-xr-x.  1 kion kion 74396274 Aug 15  2017 fbx20190_fbxsdk_linux
-rw-rw-r--.  1 kion kion      105 Aug 15  2017 FBX_SDK_Online_Documentation.html
drwxrwxr-x.  3 kion kion       36 Aug 15  2017 include
-rw-rw-r--.  1 kion kion     1407 Aug 15  2017 Install_FbxSdk.txt
drwxrwxrwx.  3 kion kion       17 Mar  5 22:47 lib
-rw-rw-r--.  1 kion kion    96857 Aug 15  2017 License.txt
drwxrwxr-x.  3 kion kion       17 Mar  5 22:44 obj
drwxrwxrwx. 28 kion kion     4096 Aug 15  2017 samples

We need to make a few changes. First, we don’t need the installer, so we can remove that. Next, we need to rename the “gcc4” folder in the “lib” directory to “gcc”. And then, to get the samples to compile, we need to replace the “gcc4” references in the makefiles with “gcc”.

$ rm fbx20190_fbxsdk_linux
$ mv lib/gcc4 lib/gcc
$ find samples -type f -name 'Makefile' | xargs sed -i 's/gcc4/gcc/g'
$ chmod -R 771 lib
$ chmod -R 771 samples

From there the sample programs should be able to compile.

$ cd samples/Animation
$ make
mkdir -p ../../obj/x86/release/Animation
gcc  -m32  -DFBXSDK_SHARED -I../../include -c main.cxx -o main.o
mv main.o ../../obj/x86/release/Animation
mkdir -p ../../obj/x86/release/Animation
gcc  -m32  -DFBXSDK_SHARED -I../../include -c ../Common/Common.cxx -o ../../obj/x86/release/Animation/Common.o
mkdir -p ../../bin/x86/release/Animation
gcc  -m32  -o ../../bin/x86/release/Animation/Animation ../../obj/x86/release/Animation/main.o ../../obj/x86/release/Animation/Common.o -L../../lib/gcc/x86/release -lfbxsdk -lm -lrt -luuid -lstdc++ -lpthread -ldl -Wl,-rpath /home/kion/samples/Animation/../../lib/gcc/x86/release

From there you should be able to run the executable, which will be located in the bin folder of the directory in which the FBX SDK was installed (in this case the user’s home directory).

$ cd ../../bin/x86/release/Animation
$ ./Animation
Autodesk FBX SDK version 2019.0 Release (b92b15b23)
Program Success!

Joining the Footclan

When it comes to writing user applications for Linux, GTK seems like the obvious choice. It’s written in C and has bindings for a lot of languages. It’s generally a first-class citizen in the Linux desktop environment ecosystem, as it’s been used to write desktops and applications in Linux. And on top of that, it’s getting support for Wayland.

So the next question is how to actually get started with GTK. And it seems kind of weird that even though a lot of the more well-known applications are written in GTK, there don’t seem to be many resources for how to get started with it.

Getting Started with GTK:

Platform Demos (small example applications)

GTK 2 Examples

API Documentation:

The interesting thing is that while these are available from https://developer.gnome.org, I don’t think I ever actually found them by looking through their web page.

The Gnome Git repo is located at https://gitlab.gnome.org/
It looks like anyone can join and start making private repositories.

Linux – FBX SDK Install Debian

I’ve had enough people ask me about FBX support that I figured I should start looking into how to support this file format. Autodesk supplies an SDK for converting to and from their format here: https://www.autodesk.com/developer-network/platform-technologies/fbx-sdk-2019-0. Specifically, it seems like there’s a library for working with C/C++, and then Python bindings as well. I’m going to be honest and say that Python is probably the better choice to work with, but we’ll start with the C version.

So the first step is to download the SDK for Linux, run the installer and extract it to the local directory. From there we need to start by trying to build the examples.

$ sudo apt-get install make gcc

Okay and then when we build we get the following error:

kion@server1:~/Gitlab/fbx_c_sdk/samples/Animation$ make
mkdir -p ../../obj/x86/release/Animation
gcc4  -m32  -DFBXSDK_SHARED -I../../include -c main.cxx -o main.o
make: gcc4: Command not found
Makefile:67: recipe for target 'main.o' failed
make: *** [main.o] Error 127

I guess we could try to be lazy and add an alias from gcc4 to gcc in bashrc. Instead, let’s try editing the makefile to see if that makes it work.

We get another error:

kion@server1:~/Gitlab/fbx_c_sdk/samples/Animation$ make
mkdir -p ../../obj/x86/release/Animation
gcc  -m32  -DFBXSDK_SHARED -I../../include -c main.cxx -o main.o
gcc: error trying to exec 'cc1plus': execvp: No such file or directory
Makefile:67: recipe for target 'main.o' failed
make: *** [main.o] Error 1

Let’s try to apt-get our way out of this

$ sudo apt-get install --reinstall build-essential

Okay, error fixed, next error:

kion@server1:~/Gitlab/fbx_c_sdk/samples/Animation$ make
mkdir -p ../../obj/x86/release/Animation
gcc  -m32  -DFBXSDK_SHARED -I../../include -c main.cxx -o main.o
In file included from /usr/include/c++/6/stdlib.h:36:0,
                 from ../../include/fbxsdk/fbxsdk_def.h:23,
                 from ../../include/fbxsdk.h:43,
                 from main.cxx:29:
/usr/include/c++/6/cstdlib:41:28: fatal error: bits/c++config.h: No such file or directory

So the next apt-get attempt:

sudo apt-get install gcc-multilib g++-multilib

Another quick fix in the makefile: CC and LD need to be gcc; everything else is generally a path, so it can stay as gcc4 (or you can rename/copy the lib/gcc4 folder).

And now I’m getting the error

kion@server1:~/Gitlab/fbx_c_sdk/samples/Animation$ make
mkdir -p ../../obj/x86/release/Animation
gcc  -m32  -DFBXSDK_SHARED -I../../include -c main.cxx -o main.o
mv main.o ../../obj/x86/release/Animation
mkdir -p ../../obj/x86/release/Animation
gcc  -m32  -DFBXSDK_SHARED -I../../include -c ../Common/Common.cxx -o ../../obj/x86/release/Animation/Common.o
mkdir -p ../../bin/x86/release/Animation
gcc  -m32  -o ../../bin/x86/release/Animation/Animation ../../obj/x86/release/Animation/main.o ../../obj/x86/release/Animation/Common.o -L../../lib/gcc4/x86/release -lfbxsdk -lm -lrt -luuid -lstdc++ -lpthread -ldl -Wl,-rpath /home/kion/Gitlab/fbx_c_sdk/samples/Animation/../../lib/gcc4/x86/release
/usr/bin/ld: cannot find -luuid

So there’s some uuid library missing. So far I’ve tried util-linux, uuid-dev and libuuid1; none of them seem to make it work. I think there’s some kind of flag that needs to be passed in for cross-compiling uuid. Though I’m running this on an old(er) Debian server, so I might try a different distribution.


CentOS worked!! Here’s the magic yum stuff:

sudo yum install gcc make gcc-c++ glibc-devel.i686 glibc-devel libuuid-devel libuuid-devel.i686


Tried adding the 32-bit library to Debian. Still no luck, but at least it works on CentOS, which is okay for now.

sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install libuuid1:i386


Finally got it working. All it needs after this is

$ sudo apt-get install uuid-dev uuid-dev:i386

And the makefiles for the samples will run. I should probably create a clean version of the install instructions for Debian.