A short recap of what I’ve learned about Babylon JS over the last four weeks.
When I set out on this project a month ago my main goal was to better understand how to solve problems with Babylon JS. While I’m far from being an expert in this type of development, I’m happy with what I’ve learned. Babylon JS is a powerful new tool that I can start to use to solve problems for my customers and myself.
Initially, I was interested in making WebXR projects, but now that I’ve seen what Babylon JS can do, I’m open to a much wider range of projects and ideas.
I’m not going to go into detail about everything that I learned, but I’ll highlight a few things here.
I spent the summer of 2018 learning how to create 3D models in Blender. While I really enjoyed this type of work, some RSI issues with my hands prevented me from moving forward. Blender (along with most GUI-based creation apps) was just too intensive on my hands because of the constant need to use a pointing device.
During the first week of this project, I learned the basics of mesh creation in Babylon JS. I was quickly impressed by the wide scope of features available for 3D modeling, and I recognized many of those features from my time in Blender. Having these capabilities available, while only needing to type a few lines of code, is incredibly valuable to me. I’ve even started to think about how I could use these APIs to build a 3D Modeling interface (in VR?) that I can use comfortably without the constant need of a mouse.
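To give a sense of how little code this takes, here is a rough sketch of a few of the parametric mesh builders. It assumes an existing Babylon JS `scene`, and the shape and path values are just placeholder geometry:

```javascript
// A few one-line mesh builders, assuming an existing Babylon JS `scene`.
const box = BABYLON.MeshBuilder.CreateBox("box", { size: 1 }, scene);

const sphere = BABYLON.MeshBuilder.CreateSphere(
  "sphere",
  { diameter: 1.5, segments: 32 },
  scene
);
sphere.position.x = 2; // place it next to the box

// Extrude a custom 2D profile along a path -- the kind of operation
// I previously opened Blender for.
const shape = [
  new BABYLON.Vector3(0, 0, 0),
  new BABYLON.Vector3(0.5, 0, 0),
  new BABYLON.Vector3(0.5, 0.5, 0),
];
const path = [new BABYLON.Vector3(0, 0, 0), new BABYLON.Vector3(0, 3, 0)];
const extruded = BABYLON.MeshBuilder.ExtrudeShape(
  "extruded",
  { shape, path, cap: BABYLON.Mesh.CAP_ALL },
  scene
);
```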
GUI and Working with Data
Babylon JS has some notable features for rendering text and data onto a texture attached to a mesh. These features are what attracted me to Babylon JS in the first place. Many of the WebXR scenes that I want to build involve substantial amounts of data that I want to visualize and work with in a spatial setting. While the GUI features are not perfect, and they can be finicky at times, I’m happy to have access to these tools.
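As a small illustration of what that looks like, here is a sketch of rendering a text value onto a texture attached to a plane. It assumes the Babylon GUI module is loaded alongside the core library and that a `scene` already exists; the temperature reading is a made-up placeholder data point:

```javascript
// Create a mesh to act as a data panel in the scene.
const plane = BABYLON.MeshBuilder.CreatePlane(
  "panel",
  { width: 2, height: 1 },
  scene
);

// Draw GUI controls onto a dynamic texture mapped onto the plane.
const adt = BABYLON.GUI.AdvancedDynamicTexture.CreateForMesh(plane, 1024, 512);

const text = new BABYLON.GUI.TextBlock("label");
text.text = "Temperature: 21.4 °C"; // placeholder data point
text.color = "white";
text.fontSize = 48;
adt.addControl(text);

// Changing the control later (e.g. when new data arrives) re-renders
// the texture automatically.
text.text = "Temperature: 22.1 °C";
```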
Babylon JS makes it easy to get started with WebXR (VR and AR) development.
- Controllers: There is an abstracted control scheme that works across a number of device types. I can also implement device-specific controls if needed.
- Hand tracking for Oculus Quest!
- The pointer event system is simple for users while being powerful and flexible for developers. This system works with VR controllers, hand tracking, and even traditional input devices.
- Teleportation is easy to use, although a bit harder to customize.
- Mode changing: The entire process for entering and exiting VR mode is taken care of.
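A sketch of how these pieces fit together, assuming an existing `scene` and a `ground` mesh to teleport onto (the observable callbacks here are illustrative, not a complete input scheme):

```javascript
// One call sets up VR mode changing, controllers, and teleportation.
scene
  .createDefaultXRExperienceAsync({
    floorMeshes: [ground], // meshes the user can teleport onto
  })
  .then((xr) => {
    // Device-specific input is still reachable when the abstracted
    // scheme isn't enough.
    xr.input.onControllerAddedObservable.add((controller) => {
      controller.onMotionControllerInitObservable.add((motionController) => {
        console.log("Controller profile:", motionController.profileId);
      });
    });
  });

// The same pointer events fire for mouse, VR controllers, and hands.
scene.onPointerObservable.add((pointerInfo) => {
  if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERDOWN) {
    const mesh = pointerInfo.pickInfo && pointerInfo.pickInfo.pickedMesh;
    if (mesh) {
      console.log("Selected:", mesh.name);
    }
  }
});
```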
Now that I’ve spent some time getting to know Babylon JS and what it has to offer, I can start to shift my attention away from learning and towards the projects that I want to build. That’s not to say that I don’t have a lot to learn, but to point out that I’m comfortable enough to move forward, learning as I go. Some areas that I want to improve include:
- WebXR controller input and interaction: I still have a lot to figure out, such as how to use controller buttons that are not already mapped in the basic scheme, and how to interact with meshes using controllers and hands.
- Lighting: Most of my scenes have been using default lighting. I want to learn much more about taking control of lighting, as this can have an enormous impact on a project or scene.
- Scene management: How can I load content from multiple scenes? Can I move from one scene to another?
- Structuring code: How should I structure code in complex projects? How can I make API calls to load data while a scene is already running?