Creating procedural buildings with visually different windows

Hello!

We have a bit of a problem with creating variation for windows in our procedural buildings. I’ll try to explain it as best I can.

So we are building an open-world game set in a city. For buildings we’ve decided on a procedural approach that uses premade elements. Here is an example, consisting of a few of these elements put together to simulate a building facade (a really crappy one):

Now, while we have no real difficulty combining the meshes together, we’ve run into the problem of making the windows visually different. As you can see in the screenshot above, we’ve managed to sort this out through materials. What is done here is a pseudo-random UV offset based on object world position, using a texture atlas. Works well enough.
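To make the setup concrete, here is a rough Python mock-up of what the material does (the hash constants, atlas size, and function names are placeholders for illustration, not our actual material graph):

```python
import math

# Hypothetical atlas layout: window variants in a 4x4 grid
ATLAS_COLS = 4
ATLAS_ROWS = 4

def hash01(x, y, z):
    """Cheap pseudo-random hash of a world position into [0, 1)."""
    h = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
    return h - math.floor(h)

def atlas_uv_offset(world_pos):
    """Pick an atlas cell from the object's world position.

    In the material it's the same idea: hash the object position,
    then shift the mesh's 0-1 window UVs into one atlas cell."""
    r = hash01(*world_pos)
    cell = int(r * ATLAS_COLS * ATLAS_ROWS) % (ATLAS_COLS * ATLAS_ROWS)
    col, row = cell % ATLAS_COLS, cell // ATLAS_COLS
    return (col / ATLAS_COLS, row / ATLAS_ROWS)
```

Two separate actors at different positions land in (usually) different cells; after merging, every window shares one object position, so they all land in the same cell, which is exactly the problem below.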

But then, in our building creation pipeline, these elements have to be merged into a single static mesh to save draw calls. Here’s what we get:

I may not have made the perfect example, but you can see that, say, the single windows on the left are now identical. It happens because all of them now have only a single world position to work with, since they are all in the same actor. This case may not look that bad, but there are worse scenarios, believe me.

So now we are kinda stuck and not sure what to do here. I’ve tried to come up with several solutions. For example, I wanted to use vertex colors to randomly offset UVs. That would work great if we could paint vertex colors on separate elements (either manually or programmatically). But we can’t.
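For reference, baking one color per element before the merge would look roughly like this (a hypothetical sketch; `elements` and its structure are made up for illustration, and this per-element painting step is exactly what our pipeline can’t do):

```python
import random

def bake_element_colors(elements, seed=0):
    """Assign one random vertex color per element, so a material could
    read it back as a UV offset after the meshes are merged.

    `elements` is a list of vertex lists (hypothetical format)."""
    rng = random.Random(seed)
    baked = []
    for verts in elements:
        r = rng.random()  # one random value shared by the whole element
        # store it in the red channel of every vertex of this element
        baked.append([(v, (r, 0.0, 0.0, 1.0)) for v in verts])
    return baked
```

The point is that the random value survives the merge because it lives on the vertices, not on the actor.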

Another idea we have is to place sockets in these elements where the lower left and upper right corners of windows should be, and then generate the windows in code when creating buildings. But this approach is resource-intensive and won’t be ready soon, while we need a solution now (we’re starting on the demo project).

So my question is: can you recommend some way to do this? Maybe some HLSL code to get vertex IDs, or calculate UV shells, or paint vertex colors? Anything?

Thank you in advance,

Alexander

In your procedural construction phase, would it make more sense to use meshes w/ different UV coordinates for each window ‘segment’ that would then be merged into your final facade? This would allow you to selectively use an arbitrary set of window fronts (depending on the material and mesh layout) to build your structure, that would then be merged when finished.

One thing you could try: in the material, instead of relying on world position to affect the offsets directly, use the world position of the final building as a lookup into an offset texture that contains useful offsets into your texture atlas. Essentially a fixed number of possible window offsets, biased by the position of the full building in the world.

If you’re using an offset texture mapped to each window section, and that offset texture changes for each world position the building appears in, it should be possible to have each window section look up into the atlas of different window styles so each one looks different.
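A rough Python sketch of that idea (the table contents and the hashing are placeholders standing in for the offset texture and the world-position lookup; the per-section index is the part that has to come from somewhere in the mesh data):

```python
# A toy version of the offset-texture idea: the building's world position
# biases which row of a small offset table each window section samples.
# Table values and sizes are made up for illustration.

OFFSET_TABLE = [          # each row: (u_offset, v_offset) into the atlas
    (0.00, 0.00), (0.25, 0.00), (0.50, 0.00), (0.75, 0.00),
    (0.00, 0.25), (0.25, 0.25), (0.50, 0.25), (0.75, 0.25),
]

def window_offset(building_pos, section_index):
    """Look up an atlas offset for one window section.

    Without `section_index`, every window in the merged building gets
    the same row (the repetition problem); the per-section term is
    what makes windows differ within one building."""
    x, y, z = building_pos
    key = (int(x * 7 + y * 13 + z * 17) + section_index) % len(OFFSET_TABLE)
    return OFFSET_TABLE[key]
```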

I see where my confusion came from. I assumed you had all of your possible window variations on a single atlas, which would allow you to have a texture-based offset per-window to ‘choose’ unique results per window.
Would it be possible to assemble your building from different window ‘quads’ that each look up into a given section of your UVs, so when the merge happens, they’re already looking into the correct portion of the atlas’d texturing?
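A minimal sketch of that pre-merge approach, assuming a 2x2 atlas and a hypothetical list-of-UVs format for the window quads:

```python
import random

# Hypothetical 2x2 atlas: lower-left corner of each cell
ATLAS_CELLS = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]

def assign_cell_uvs(window_quads, seed=0):
    """Shift each quad's 0-1 window UVs into a randomly chosen atlas
    cell before the merge, so the baked mesh already samples different
    window variants per quad.

    `window_quads` is a list of UV lists in 0-1 space (made-up format)."""
    rng = random.Random(seed)
    out = []
    for uvs in window_quads:
        cu, cv = rng.choice(ATLAS_CELLS)
        out.append([(cu + u * 0.5, cv + v * 0.5) for (u, v) in uvs])
    return out
```

Since the variation is baked into the UVs at construction time, the material no longer needs any per-object input at all.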

Hi, Alan!

Thank you for answering.

I’m afraid that would not make sense. Construction elements can repeat often in a single building, so if we reuse the same window element, we will be stuck with the same UV coords for all of its copies. It does make sense for elements that have a few windows: they are laid out differently already (to create variation just by using a texture atlas with several window types). But again, when a couple of those elements are near each other, you can see the repetition right away.

I think I understand what you mean, sort of. Trying it now. My only issue here is: how do I then use the lookup result to offset UV shells differently?

Also, using object position just means I will have the same building with different windows, while my problem is that the windows are the same within a single building.

Sorry, Alan, but I’m having a hard time figuring out how each section can look up into the atlas and offset differently from the others.

I’ll provide a better example, so it would be easier for both of us:

Here’s my section with a single window and its UVs. Note that depending on how we do this, we can lay out the UVs for a window either into a single atlas cell or just into 0-1 space. This example uses atlas UVs.

And here’s the merged mesh made from the same segment:

Regardless of how we lay these out, all windows in a building made from the same element will have the same UV mapping. So this is what I’m stuck on: how can I offset each of these UV shells separately using a texture? I understand what you suggest; I’m just not sure how it can work, because there’s no way to discern UV shells from each other in a material.

If something is not yet clear, don’t hesitate to ask. And thank you for putting up with me)

Alexander

That’s a method we’ve been looking into as well. Sadly, it requires more time and effort than we have resources for at this time.

My idea for this was to create sockets in each segment where the lower left and upper right corners of a window would be, and then during generation create a procedural plane there (or simply stretch a premade quad mesh). It does mean that for the demo project we are starting now, where we won’t use the generation plugin and will merge buildings manually, windows won’t have any variance.
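As a sketch, building the plane corners from the two sockets could be as simple as this (assuming the window lies in a facade-aligned plane; the function name and tuple format are just for illustration):

```python
def window_quad_from_sockets(lower_left, upper_right):
    """Build the four corners of a window plane from two socket
    positions: the lower-left and upper-right corners of the window.

    Simplification: the facade is treated as a vertical plane, so the
    horizontal extent comes from (x, y) and the vertical extent from z."""
    lx, ly, lz = lower_left
    ux, uy, uz = upper_right
    return [
        (lx, ly, lz),  # lower left
        (ux, uy, lz),  # lower right
        (ux, uy, uz),  # upper right
        (lx, ly, uz),  # upper left
    ]
```

Each generated quad would then get its own UVs (or vertex colors) at creation time, which is what makes per-window variation possible after the merge.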

I guess we will have to settle for procedural windows, since there’s really no other option in sight. At least if there’s nothing else you can suggest.

Anyway, thank you for taking the time and trying to help!

Alexander