
EXPLORING PROCEDURAL MESHES

Posted by Nathan van Hulst on January 27, 2019 at 19:30


Procedural fanboy!

Procedural meshes have always been something I was particularly interested in. They give you the opportunity to create 3d geometry on the fly using just code. This can be really useful in situations where a custom 3d model created in 3d software just isn’t enough. A lot of games use procedural meshes, often for city generation, rivers, roads and much more. However, creating procedural meshes has always felt like some kind of wizardry to me, especially when sophisticated 3d models are generated. One of the most recent games that heavily relies on procedural meshes is ‘No Man’s Sky’. In that game the complete universe is procedurally generated from a single seed, from solar systems down to planets, animals, plants and spaceships.

Years ago, when I first became interested in programming languages, I dove into C++ and OpenGL. The reason was that I wanted to understand how 3d applications are created. At that time I had just discovered Blender 3D and I was intrigued by how that piece of software was built. Having no idea that Blender was written in C, I thought I had to learn C++ and OpenGL, so I gave it a try. Basically this was a first attempt at procedural mesh generation, since displaying 3d objects came down to writing vertex positions and polygon indices through OpenGL calls. It was a fun way to discover how 3d graphics work, how 3d models are rendered and how you display textures. But soon enough I found out that creating your own small game engine is actually really hard and needs a huge time investment to get something useful out of it. Nonetheless it was a fun way to learn more about 3d graphics, what OpenGL is and what can be built with C++.


A VR enthusiast I met

One day at VRDays Europe in Amsterdam I met Johan Hanegraaf, an architect at Mecanoo. In his spare time he works on a VR application for architects called ArchiSpace, which he built entirely on his own. He told me that developing an application was something he really enjoyed besides designing buildings at Mecanoo. The company he worked for saw that his VR application was the real deal and gave him room to speak at conferences all over the world. At VRDays Europe he gave one of his talks about ArchiSpace. One thing he had been struggling with, though, was creating 3d meshes within his application and exporting them. So we planned to meet again during VRDays and later at a VR meetup I was organising in Rotterdam. After the meetup we went for a drink and a meal while talking more about his ambitions and his VR application. I really admire his passion for combining architecture with VR.


R&D time!

Once I was home after a great meetup, the idea of creating procedural meshes was something I couldn’t let go of. I thought about helping Johan with his issues and seeing whether I could be of use to him. The following weekend I was home alone and dove into procedural mesh generation again. While reading about how to get started in my game engine of choice, I immediately saw the similarities with OpenGL. The experience from years ago gave me a quick start towards my first result. It wasn’t about clean code but about getting something working quickly. The first thing I made was a way to modify an existing mesh by moving little cubes that were actually handles for the positions of the vertices.
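
To give an idea of what that looks like, here is a minimal sketch of the vertex-handle idea in plain Python, using simple data structures instead of engine objects (the original prototype lived inside a game engine, so the names and structure here are my own assumptions):

```python
# A mesh is just a list of vertex positions plus triangles that index into it.
# Moving a "handle" means moving the vertex it points at and rebuilding the mesh.

class Mesh:
    def __init__(self, vertices, triangles):
        self.vertices = list(vertices)    # list of (x, y, z) tuples
        self.triangles = list(triangles)  # list of (i0, i1, i2) index triplets

def move_vertex(mesh, index, delta):
    """Translate a single vertex; in the prototype each handle cube maps to one index."""
    x, y, z = mesh.vertices[index]
    dx, dy, dz = delta
    mesh.vertices[index] = (x + dx, y + dy, z + dz)

# Example: drag one corner of a quad upwards
quad = Mesh([(0, 0, 0), (1, 0, 0), (1, 0, 1), (0, 0, 1)], [(0, 1, 2), (0, 2, 3)])
move_vertex(quad, 2, (0, 0.5, 0))
print(quad.vertices)
```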



Soon after, I realised that generating a cube by placing 8 vertices was a bit cumbersome and thought about another way to approach it. You can describe a box with just two vector positions, and that is what I did: placing two objects and calculating the volume of the box between them. After I got that working, I wondered how I could get raycasting against 3d geometry working. I searched the internet and came across complex raycasting techniques for finding the nearest vertices of a given piece of geometry. That technique is used in many 3d editors, but I stuck to a simple plan and used mesh colliders, which I generated at the same time as the mesh itself. Once that worked, I decided to use the same two-vector principle together with raycasting to generate more boxes. The fun was in finding solutions for things I had no real knowledge about and getting them to work by coming up with my own approach.
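
As a rough illustration of the two-vector idea, this is what building a box from two opposite corners can look like (a sketch in plain Python rather than the engine’s mesh API; the vertex order and winding are illustrative assumptions):

```python
# Build a box mesh from two opposite corner points: 8 vertices, 12 triangles.

def box_from_corners(a, b):
    """Return (vertices, triangles) for the axis-aligned box spanned by corners a and b."""
    x0, y0, z0 = (min(a[i], b[i]) for i in range(3))
    x1, y1, z1 = (max(a[i], b[i]) for i in range(3))

    # 8 corner vertices of the box
    vertices = [
        (x0, y0, z0), (x1, y0, z0), (x1, y1, z0), (x0, y1, z0),  # face at z0
        (x0, y0, z1), (x1, y0, z1), (x1, y1, z1), (x0, y1, z1),  # face at z1
    ]

    # 12 triangles (two per face), given as vertex index triplets
    triangles = [
        (0, 2, 1), (0, 3, 2),  # back
        (4, 5, 6), (4, 6, 7),  # front
        (0, 1, 5), (0, 5, 4),  # bottom
        (3, 6, 2), (3, 7, 6),  # top
        (0, 4, 7), (0, 7, 3),  # left
        (1, 2, 6), (1, 6, 5),  # right
    ]
    return vertices, triangles

verts, tris = box_from_corners((0, 0, 0), (1, 2, 3))
print(len(verts), "vertices,", len(tris), "triangles")
```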



While I could create a box on top of another one using a simple drag gesture, I wanted to see how I could create more of them based on the normals within a mesh. The idea is that you raycast against a mesh and get the normal direction of the polygon you hit. This worked pretty well and soon I was able to create boxes on all sides of another one.
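
In the prototype this normal comes from the engine’s raycast hit, but the underlying idea can be sketched with a standalone ray/triangle test that also returns the face normal. Below is a plain-Python Möller–Trumbore implementation; the helper names are my own:

```python
def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def raycast_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Möller-Trumbore: return (distance, face_normal) on a hit, else None."""
    edge1, edge2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, edge2)
    a = dot(edge1, h)
    if abs(a) < eps:
        return None                      # ray is parallel to the triangle
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, edge1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(edge2, q)
    if t <= eps:
        return None                      # intersection lies behind the ray origin
    n = cross(edge1, edge2)              # face normal, used to orient the next box
    length = dot(n, n) ** 0.5
    return t, (n[0] / length, n[1] / length, n[2] / length)

hit = raycast_triangle((0.2, 0.2, -1.0), (0.0, 0.0, 1.0), (0, 0, 0), (1, 0, 0), (0, 1, 0))
print(hit)   # (1.0, (0.0, 0.0, 1.0)) -> distance to the hit and the face normal
```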



A simple form of procedural mesh generation was working, but there was one thing I couldn’t stand: the texture stretching whenever a mesh was modified. So I went through the code I had written for UV mapping and made a simple fix so the textures stay mapped correctly even after you modify the mesh.
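
The post doesn’t show the exact fix, but a common way to stop textures stretching on a resized box is to derive the UVs from the vertex positions (a planar projection per face) instead of using fixed 0..1 corner values, so the texture tiles in world units. A rough sketch of that idea, as my own assumption rather than the prototype’s code:

```python
def planar_uvs(vertices, face_normal):
    """Project vertices onto the plane of the dominant normal axis to get UVs in world units."""
    nx, ny, nz = (abs(c) for c in face_normal)
    uvs = []
    for x, y, z in vertices:
        if nx >= ny and nx >= nz:
            uvs.append((y, z))    # X-facing face: project onto the YZ plane
        elif ny >= nz:
            uvs.append((x, z))    # Y-facing face: project onto the XZ plane
        else:
            uvs.append((x, y))    # Z-facing face: project onto the XY plane
    return uvs

# A 2x3 upward-facing quad gets UVs spanning 2x3, so the texture tiles instead of stretching
uvs = planar_uvs([(0, 0, 0), (2, 0, 0), (2, 0, 3), (0, 0, 3)], (0, 1, 0))
print(uvs)   # [(0, 0), (2, 0), (2, 3), (0, 3)]
```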



One last thing I wanted to figure out was how mesh exporters actually work. I went searching on the internet and found an article about writing your own exporter for FBX files. While FBX was the first format that came to mind, it isn’t a trivial one, as it is a proprietary format owned by Autodesk. Searching a bit further, I found out that even the Blender developers have trouble keeping up with the FBX format every release. Creating an FBX exporter was a bit too much, so I went with an OBJ exporter instead, which was actually fairly simple to make. First I managed to export the mesh by merging all the separate meshes into one. Using Blender I could then import the mesh I had procedurally created inside the game engine.
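
The OBJ part really is simple: it is a plain-text format with “v x y z” lines for vertices and “f a b c” lines for faces using 1-based indices. A minimal sketch of such an exporter for the (vertices, triangles) structure used above (the actual code in the prototype will differ):

```python
def export_obj(path, vertices, triangles):
    """Write a Wavefront OBJ file from vertex positions and triangle index triplets."""
    with open(path, "w") as f:
        f.write("# exported from the procedural mesh prototype\n")
        for x, y, z in vertices:
            f.write(f"v {x} {y} {z}\n")
        for a, b, c in triangles:
            f.write(f"f {a + 1} {b + 1} {c + 1}\n")   # OBJ indices start at 1, not 0

# Example: a single triangle, importable in Blender via File > Import > Wavefront (.obj)
export_obj("triangle.obj", [(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)])
```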



Recap

After all this coding, reading articles and trial and error, I managed to get a fully functional prototype working for procedural mesh generation and exporting an OBJ file. The code was nowhere near clean enough for production, nor was it really scalable. I went through several articles about writing mesh generation and editing code, but it is really hard to find good resources. The resources I did find covered the basics but never really went into depth on manipulating 3d geometry. This is one of the things I still want to figure out. My next challenge could be something like: “how do you write code to effectively manipulate 3d meshes, similar to what you find in 3d editing software?”



The next time I met Johan Hanegraaf I told him I was able to create meshes and export them as well. He had noticed my tweets and was interested in how I achieved it. At that moment he was also trying to figure it out himself, and if he had questions or needed my help he would reach out to me. So every now and then I was in contact with Johan about his progress, which was great; he told me he was using tools he had bought to get the job done.

Months later I was organising another VR meetup in Rotterdam and decided to call Johan to ask if he would give a talk, which was no problem. I also asked how his software was doing, and he told me that a company had reached out to him about co-developing it. For him this was the opportunity to greatly improve ArchiSpace together with other developers, as he wasn’t able to do it all by himself. I felt really happy for him that he found this opportunity and I’m really curious where it will go.

This whole research project into procedural mesh generation was an enjoyable road for me. I finally got something working that had been in my head for a while, and at the same time got another person enthusiastic about the topic. Enthusiasm is also one of the reasons why I organise VR meetups in Rotterdam, but that’s a whole other story to tell.


Resources

Gamasutra: Modelling by numbers
Board To Bits Games: Procedural Mesh Tutorial, part 8: Cube basics