As @troylyndon said, the source is indeed a good way to learn the gritty details.
First things first. There are some Lua “resource” files that implement a big grab bag of features, for instance init.lua, plus some others in that same directory. Platform-specific code has its own homes too, if you have any interest in that side of things.
The details you asked about are mostly in the Display and Renderer sections. I would roughly describe the former as what a Solar user sees, the latter as the under-the-hood bits.
A good way to get a first overview is to look at the Display class (captures and its Render method, in particular) and the Scene class’s Render method.
You could then look at some of the objects that show up there and drill down into their methods, e.g. those of GroupObject and other display objects (including the base class of them all). The Lua-accessible properties may be found here. Even so, it might take a while to see how it’s all wired up!
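To make the shape of that hierarchy concrete, here is a stripped-down sketch. The class and method names are my own stand-ins, not the actual Solar classes; the point is just the structure: a common base class exposing a virtual draw step, with groups recursing into their children.

```cpp
#include <memory>
#include <vector>

// Hypothetical base display object: every object knows how to draw itself.
// (Return value is only for the demo: the number of leaf objects drawn.)
struct Object {
    virtual ~Object() = default;
    virtual int Draw() { return 1; }
};

// Hypothetical group: owns children and recurses into the subtree on draw,
// roughly the role GroupObject plays among the display object classes.
struct GroupObject : Object {
    std::vector<std::unique_ptr<Object>> children;

    int Draw() override {
        int drawn = 0;
        for (auto &child : children)
            drawn += child->Draw(); // recurse into the subtree
        return drawn;
    }
};
```

Groups nest, so a scene is just a tree with one root group; drawing the root walks the whole thing.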
There are some other features like adapters, found elsewhere in that Display directory, that support the various display object types, paints, and so on. These also export properties and methods.
Some renderer-level details of the frame mechanics may be found here; the underlying geometry submission is implemented further down.
You will also see, throughout this file, operations such as “bind” issued by a “command buffer”, for instance this call. This goes back to what you asked in your post:
Basically, there’s a first phase where “Lua stuff” happens: input, positioning objects, handling events, and so on.
Solar then takes a freeze-frame of the end result and “draws” this: this boils down to inserting geometry, assembling related bits (same shader, texture, blending, etc.) into batches, and issuing commands describing them.
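The batching step can be sketched roughly like this. The `State` struct and its fields are invented for illustration (the real criteria and data layout are richer), but the idea is the one above: consecutive submissions that share render state get merged into a single batch.

```cpp
#include <vector>

// Invented stand-in for the render state a submission carries.
struct State {
    int shader, texture, blend;
    bool operator==(const State &other) const {
        return shader == other.shader && texture == other.texture &&
               blend == other.blend;
    }
};

struct Batch {
    State state;
    int vertexCount = 0;
};

// Merge consecutive same-state submissions into batches.
std::vector<Batch> BuildBatches(const std::vector<State> &submissions) {
    std::vector<Batch> batches;
    for (const State &s : submissions) {
        if (batches.empty() || !(batches.back().state == s))
            batches.push_back({s, 0}); // state changed: start a new batch
        batches.back().vertexCount += 6; // e.g. two triangles per quad
    }
    return batches;
}
```

Fewer batches means fewer state changes and draw calls downstream, which is the whole reason for grouping by shader, texture, blending, and so on.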
The command buffer then grinds through this commands-and-data blob, issuing actual calls to the rendering API. (This could take place on another thread, since we actually ping-pong between the back and front buffers, only updating one at a time.)
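In very rough terms, that blob looks something like the following sketch: commands and their payloads are serialized into one byte stream during the draw phase, then replayed later. The command names and encoding here are made up, and the real buffer is considerably more involved; in practice two such buffers alternate, so one can be recorded while the other executes.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Invented command tags; the real set is larger.
enum Command : uint8_t { kBindTexture, kDraw };

struct Buffer {
    std::vector<uint8_t> bytes;

    // Record phase: append a command tag plus a 4-byte payload.
    void Write(Command cmd, uint32_t arg) {
        bytes.push_back(cmd);
        uint8_t raw[4];
        std::memcpy(raw, &arg, 4);
        bytes.insert(bytes.end(), raw, raw + 4);
    }

    // Replay phase: walk the stream and dispatch each command.
    // (Returns the number of commands executed, for the demo.)
    int Execute() {
        int count = 0;
        for (size_t i = 0; i + 5 <= bytes.size(); i += 5) {
            uint32_t arg;
            std::memcpy(&arg, &bytes[i + 1], 4);
            (void)arg; // would be passed to the rendering API call
            switch (bytes[i]) {
            case kBindTexture: /* glBindTexture(..., arg) would go here */ break;
            case kDraw:        /* glDrawArrays(..., arg) would go here */ break;
            }
            ++count;
        }
        return count;
    }
};
```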
And that’s a very rough overview.
Note the GL prefix on that last command buffer link; the rendering API here is OpenGL or OpenGL ES. Various resources in the Renderer directory subclass some more generic type, CommandBuffer or Renderer for example, that exposes an interface we want to flesh out.
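The subclassing pattern is just the usual virtual-interface one. A minimal sketch, with illustrative method names rather than the actual interface:

```cpp
#include <string>

// Generic interface the engine programs against; each backend fleshes it out.
struct CommandBuffer {
    virtual ~CommandBuffer() = default;
    virtual std::string BackendName() const = 0;
    // ... Bind / Draw / etc. would be declared here, implemented per backend
};

// Backend-specific subclass, tagged "GL" as in the source tree.
struct GLCommandBuffer : CommandBuffer {
    std::string BackendName() const override { return "OpenGL"; }
};
```

Because the rest of the engine only sees the base type, swapping in another backend (as with the Vulkan work mentioned below) mostly means providing another set of subclasses.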
I can confirm this was structured quite well, having gotten a Vulkan backend in working order with relatively few changes.
On that same note, I figured that, since the source is now open and at this point most of these details are highly unlikely to change, we should be able to design around them, say to write low-level plugins. In this PR (and 3 follow-ups) I’ve implemented some APIs aimed at letting us hook into various stages of what I’ve described and get some fancy results.
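Purely as a hypothetical sketch of what “hooking into various stages” could look like (none of these names come from the actual PR): plugins register callbacks against named stages of the frame, and the engine fires them as it works through each stage.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// Invented hook registry: stage names map to lists of plugin callbacks.
struct Hooks {
    std::map<std::string, std::vector<std::function<void()>>> stages;

    void Register(const std::string &stage, std::function<void()> fn) {
        stages[stage].push_back(std::move(fn));
    }

    // The engine would call this as it enters each stage of the frame.
    void Fire(const std::string &stage) {
        for (auto &fn : stages[stage])
            fn();
    }
};
```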
(I don’t know how soon this could be integrated. In the meantime, I’m hoping to get some of it into shape soon so it can be tried out without needing to build the engine, say by just swapping out some binaries.)
Anyhow, maybe that was way more than you wanted, but I’ve been neck-deep in this stuff, especially over the last 18 months or so.