Okay, so! In theory, if you clone this repo and then run './gradlew build' (or the Windows equivalent) in a terminal, it should just build the project? :/ https://gitlab.com/AngularAngel/omnicraft Note, it will clone and then build like ~14 repos, so. XD
Nope, transform feedback wasn't what I wanted. In the end I just rendered my opaque fragments to textures via a Framebuffer, as normal, and rendered the transparent fragments to an SSBO using AMD's OIT linked-list idea. Works pretty well, I think, even if I need a separate pass for sorting the transparent fragments? Now I'm making my lighting shaders make use of all this. Going pretty well, even if I need to defer some of my deferred lighting to another deferred lighting shader. :/
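For anyone unfamiliar with the AMD per-pixel linked-list approach: every transparent fragment gets appended to one big node buffer (the SSBO), with an atomic counter handing out slots and a per-pixel head-pointer image, and a resolve pass then walks each pixel's list, sorts by depth, and blends back-to-front. A CPU-side Python sketch of the data structure (the real thing is GLSL; all names here are mine, not the engine's):

```python
class OITBuffer:
    """CPU sketch of a per-pixel linked list for order-independent transparency."""

    def __init__(self, width, height):
        # head pointer per pixel (-1 = empty list); on the GPU this is an image
        self.heads = [[-1] * width for _ in range(height)]
        # node buffer: each node is (rgba color, depth, next index); the SSBO
        self.nodes = []

    def insert(self, x, y, color, depth):
        # GPU equivalent: atomicCounterIncrement for the slot, then an
        # atomic exchange on the head pointer to splice the node in front
        idx = len(self.nodes)
        self.nodes.append((color, depth, self.heads[y][x]))
        self.heads[y][x] = idx

    def resolve(self, x, y, opaque_rgb):
        # walk this pixel's list, sort far-to-near, blend over the opaque color
        frags, i = [], self.heads[y][x]
        while i != -1:
            color, depth, nxt = self.nodes[i]
            frags.append((depth, color))
            i = nxt
        frags.sort(key=lambda f: -f[0])  # back to front (larger depth = farther)
        r, g, b = opaque_rgb
        for _, (cr, cg, cb, ca) in frags:
            r = cr * ca + r * (1 - ca)
            g = cg * ca + g * (1 - ca)
            b = cb * ca + b * (1 - ca)
        return (r, g, b)
```

The sort in `resolve` is the "separate pass" part: the insertion order is whatever order fragments happened to land, so depth order has to be recovered per pixel afterwards.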
Okay, so, I've been poking at this a ton. I tried setting up blocking between overlapping fragment invocations, and it was extremely laggy. Made a short digression to implement Frustum Culling, but it didn't help much. Thinking I might try using Transform Feedback and swapping my fragment shader for a compute shader? Not sure yet. :/
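For reference, the core of frustum culling is just six plane tests; a sphere-vs-frustum check in Python, with plane extraction from the projection-view matrix omitted (planes here are hand-made for the test):

```python
def sphere_in_frustum(planes, center, radius):
    """planes: list of (nx, ny, nz, d) with normals pointing *inward*,
    so a point p is inside a plane when dot(n, p) + d >= 0.
    The sphere is culled only if it's fully outside some plane."""
    for nx, ny, nz, d in planes:
        dist = nx * center[0] + ny * center[1] + nz * center[2] + d
        if dist < -radius:
            return False  # entirely outside this plane: cull
    return True  # inside or intersecting every plane: draw it
```

Chunks would use the same test with a bounding sphere (or an AABB variant) around each chunk, so whole chunks get skipped before any vertices are submitted.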
Alright, figured out how to use an SSBO for my GBuffers... Except, either I need to change my surface fragment shaders to compute shaders somehow, or there's no way to avoid race conditions between overlapping surfaces inside a chunk on the same instancing call, and I'll have to give up and go back to what I had before. :/
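One known workaround for that kind of overlapping-write race (not necessarily the route this project took) is to pack depth into the high bits of a single integer, with the payload in the low bits, and write it with an atomic min, so the closest fragment wins without any locking. In GLSL that's an `atomicMin` on the SSBO word; the packing idea itself, sketched in Python:

```python
def pack(depth_bits, payload):
    # depth in the high 32 bits so integer min orders by depth first;
    # payload (e.g. a material or color index) rides along in the low bits
    return (depth_bits << 32) | payload

def unpack(word):
    return word >> 32, word & 0xFFFFFFFF

# simulate two fragment invocations racing on the same pixel slot:
slot = pack(0xFFFFFFFF, 0)       # cleared pixel: max depth, dummy payload
slot = min(slot, pack(500, 7))   # far fragment, payload 7
slot = min(slot, pack(100, 3))   # near fragment, payload 3 -- this one wins
```

`min` is associative and commutative, which is exactly why the atomic version is safe regardless of which invocation lands first.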
Haha, no, not quite. No texture arrays as framebuffer bindings. So... I think I'm going to need to use an SSBO. I have a little experience with these, a very little. A friend wrote some code using them while I watched? But I have the code, and the wiki, and the internet. I will figure out what can be figured out. XD
Okay, added details for color patterns to my texture definitions. Right now I have solid, checkerboard, and random. Next I'm thinking about applying the same system to normals, materials, emissions, etc.
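The pattern lookup itself can be tiny; a sketch of solid/checkerboard/random as pure functions over texel coordinates (the hash is illustrative only, not the engine's actual noise):

```python
def solid(x, y, c0, c1):
    # every texel is the base color
    return c0

def checkerboard(x, y, c0, c1):
    # alternate colors on texel parity
    return c0 if (x + y) % 2 == 0 else c1

def random_pattern(x, y, c0, c1):
    # cheap deterministic integer hash of the texel coordinate;
    # same (x, y) always yields the same color, so it's stable frame to frame
    h = (x * 374761393 + y * 668265263) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return c0 if ((h ^ (h >> 16)) & 1) == 0 else c1
```

Since all three share one signature, a pattern index in the render data can just select which function (or shader branch) to run, and the same scheme extends to normals, materials, and emissions.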
Yup, did that. And redid my tonemapping a little, and also separated out ambient lighting into a deferred pass using lighting volumes, so it works on entities now. And made a lighting pass for emissions. (Things that are a little glowy, but not enough to illuminate anything else, like hot coals.)
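On tonemapping, for anyone following along: the job is squeezing unbounded HDR lighting into a displayable [0, 1) range. I'm not claiming this is the operator the project settled on, but Reinhard is the minimal example of the idea:

```python
def reinhard(color, exposure=1.0):
    # simple Reinhard tonemap: x / (1 + x), applied per channel;
    # 0 stays 0, and arbitrarily bright values approach but never reach 1
    return tuple((c * exposure) / (1.0 + c * exposure) for c in color)
```

Emission then fits naturally on top: emissive surfaces just add their own radiance into the HDR buffer before the tonemap, so "a little glowy" falls out of small emission values without needing a real light source.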
Well, I was gonna do the pattern indexes and the like... But then I got distracted making my point and directional lighting per-texel. So texels are now lit uniformly by lights, and shadows fall uniformly across a texel. Interesting effect, not entirely sure I want to keep it? Might need to implement anti-aliasing for it. :/
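The per-texel trick, as I understand it, is to snap the lighting sample point to the texel center so every fragment inside a texel shades identically; a hypothetical sketch in UV space:

```python
import math

def texel_center(u, v, tex_w, tex_h):
    # snap a UV coordinate to the center of its texel; lighting and shadow
    # tests are then evaluated once per texel instead of per fragment,
    # which is what makes light fall uniformly across each texel
    return ((math.floor(u * tex_w) + 0.5) / tex_w,
            (math.floor(v * tex_h) + 0.5) / tex_h)
```

Any two UVs inside the same texel collapse to the same shading point, which is also why the texel edges turn into hard stair-steps that would want anti-aliasing.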
Also thinking about making my renderdata more complicated - right now it's just a list of colors. I could add pattern indexes and the like, though. For example, I made my debug texture be black and pink... But, it's random, instead of a proper checkerboard. I could add that pattern index, and then allow describing it as a checkerboard. :/
Okay, did that. Not too difficult. Now...? Not sure what to do next. I could go back and take another crack at doing nice ambient lighting that takes material properties into account? Or, implement a new feature? Or just go looking for more refactoring to do? :/
Okay, lol, that was easy. Game no longer loads core modules via OSGi. The architecture still reflects a lot of when it did, but I want to put a lot more thought in before I start refactoring all of that. For the moment, I'm looking at abstracting block and side rendering data into some kind of 'render surface' object? So I could put all of these on one texture, and have multiple blocks and/or sides point to the same render surface, if they render their surfaces the same way... XD
Did a bunch of refactoring today. Nothing crazy, just pulling out lots of old odds and ends. Massively shrunk my GBuffer textures, removed some unnecessary data from them and from my vertices, so on and so forth. Thinking I might stop loading core packages with OSGi, and just save that for mods? Not sure yet. :/
Okay, just used three textures. Got block and side data moved into 3d textures instead of vertices. Not trivial, but I figured it out eventually. Means my greedy meshing can be even greedier, now that I can have multiple kinds of blocks and sides sharing a vertex.
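For context, greedy meshing merges adjacent same-looking faces into one big quad; with block/side ids moved into a 3D texture and sampled per-fragment, merging no longer has to stop at block-type boundaries, because the quad no longer carries the id in its vertices. A 2D sketch of the merge loop over an occupancy grid:

```python
def greedy_mesh(solid):
    """solid[y][x] is True where a face exists; returns quads (x, y, w, h).
    Since ids come from a texture at render time, faces of *different*
    block types can share one merged quad -- only occupancy matters here."""
    h, w = len(solid), len(solid[0])
    used = [[False] * w for _ in range(h)]
    quads = []
    for y in range(h):
        for x in range(w):
            if used[y][x] or not solid[y][x]:
                continue
            qw = 1  # grow the quad rightwards as far as possible
            while x + qw < w and solid[y][x + qw] and not used[y][x + qw]:
                qw += 1
            qh = 1  # then grow it downwards, one full row at a time
            while y + qh < h and all(solid[y + qh][x + i] and not used[y + qh][x + i]
                                     for i in range(qw)):
                qh += 1
            for dy in range(qh):        # mark the covered cells
                for dx in range(qw):
                    used[y + dy][x + dx] = True
            quads.append((x, y, qw, qh))
    return quads
```

The vertex-data version of this has to compare block/side ids in the merge condition too; dropping that comparison is exactly the "greedier" part.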
Hmm. Trying to figure out how to put blocks and sides in a texture - trouble is, I need an array of 3d textures to really do it the way I want to... :/
I could implement server-side light propagation? Or textures for blocks and sides, instead of putting that data into vertices? Or maybe make my directional lights change direction? Or some refactoring, or, or...
Okay, fixed players not being able to see each other - it was only sending entity updates to the first player, instead of all of them. Also fixed up my ambient lighting, so it should work better with directional lights that change position. Not sure what to do next. :/
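The shape of that fix, for the record: the update loop has to iterate every connected player rather than grabbing the first one. Hypothetical names, the real networking code obviously differs:

```python
class Player:
    """Stand-in for a connected client; collects what it was sent."""
    def __init__(self):
        self.inbox = []

    def send(self, update):
        self.inbox.append(update)

def broadcast_entity_updates(players, updates):
    # the bug: players[0].send(update) -- only the first player ever heard
    # about entity movement; the fix is just to loop over everyone
    for player in players:
        for update in updates:
            player.send(update)
```

This also explains the earlier asymmetry: whichever client happened to sit first in the list got everyone's updates, while later clients got nothing.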
Yup, got it working fairly well. Players being able to see each other in multiplayer is a little unreliable? Like, later players can see earlier ones, but not vice versa, except earlier players can sometimes see the later ones' shadows? Not sure what the deal is. Anyway, time for some refactoring now, I think. :/
Okay, I figured out I wasn't taking my model matrix into account for the view_pos variable, which is why the entity model was disappearing. Still have all my matrices in exactly the wrong order, but it works now, so I guess it's fine? :/
Okay, it seems my matrix multiplication is being done in the wrong order or something? I currently go 'projection * view * model * pos' which I think is exactly backwards. So, maybe that's the issue? When I try changing it though things break worse. So, clearly I got some other stuff backwards as well or something. :/
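For what it's worth, with OpenGL/GLSL's column-vector convention, 'projection * view * model * pos' is actually the standard order: the chain reads right to left, so the model matrix is applied first. (Row-vector/row-major code reverses it, which is where this always gets confusing.) A quick sketch showing the right-most matrix acts first, using plain 4x4 lists:

```python
def matmul(a, b):
    # 4x4 matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(m, v):
    # 4x4 matrix times a column vector
    return [sum(m[i][k] * v[k] for k in range(4)) for i in range(4)]

translate_x = [[1, 0, 0, 1], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
rotate_z90  = [[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
pos = [1, 0, 0, 1]

# written "rotate * translate * pos": the TRANSLATE happens first
a = apply(matmul(rotate_z90, translate_x), pos)  # (1,0,0) -> (2,0,0) -> (0,2,0)

# written "translate * rotate * pos": the ROTATE happens first
b = apply(matmul(translate_x, rotate_z90), pos)  # (1,0,0) -> (0,1,0) -> (1,1,0)
```

Translations alone commute, which can hide an ordering bug until a rotation or projection enters the chain; that may be why flipping the order "breaks things worse" only sometimes.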
Anyway, I added model matrices so I can translate my models around and the like. They work fine as long as I stick to using the identity matrix, but if I set it to so much as translate by 0.01, the models disappear as I move my camera away. :/