Hello Blender Wizards!

I am seeking your help in trying to solve a GIS problem using Blender. Any help, pointers or general discussion related to this will be highly appreciated.

I am a Blender n00b, but I am aware that Blender has a GIS plugin that helps in creating cityscapes by capturing terrain, buildings, etc. from GIS maps. Suppose a city with 3D buildings, parks and lakes has been created. Now I need to find all dwelling units from which a particular park/lake is visible.

GIS has something called viewshed analysis, which can be used to find the area visible from any given point. But that is also its limitation: it only gives the view from a single point, not from a whole area.

My idea is to model stacked dwelling units (apartments in high-rises) as white objects with unique Object IDs in Blender, and parks/lakes as colored light sources. Upon rendering, it is easy to see which dwelling units are lit up in which color. That is all good for visual analysis.
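
In bpy terms, I picture the setup going roughly like this (just a sketch; the collection names “Buildings” and “Parks” are placeholders for however the GIS import ends up organizing things):

    import bpy

    # give every dwelling-unit object a unique Object ID (pass_index)
    for i, obj in enumerate(bpy.data.collections["Buildings"].objects, start=1):
        obj.pass_index = i

    # turn every park/lake object into a colored light source via an emission material
    for obj in bpy.data.collections["Parks"].objects:
        mat = bpy.data.materials.new(name=f"Glow_{obj.name}")
        mat.use_nodes = True
        emit = mat.node_tree.nodes.new("ShaderNodeEmission")
        emit.inputs["Color"].default_value = (0.0, 1.0, 0.0, 1.0)  # a unique color per park/lake
        emit.inputs["Strength"].default_value = 10.0
        out = mat.node_tree.nodes["Material Output"]
        mat.node_tree.links.new(emit.outputs["Emission"], out.inputs["Surface"])
        obj.data.materials.clear()
        obj.data.materials.append(mat)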

My question is: is there any way in Blender to get the Object IDs of objects that have non-white colors on their faces? Or do I have to take the help of a game engine for this?

Looking forward to the responses. Cheers!

  • DontNoodles@discuss.tchncs.deOP · 9 months ago

    I totally get your point about how it may be faster. Let me read up on ambient occlusion and, since I’ve never worked with Houdini, see whether I can implement the same using any of the tools I’m conversant with. Thank you for making me aware of this.

    • Adalast@lemmy.world · 9 months ago

      Pretty sure Blender has a Python API. And AO research on its own probably won’t yield much juice for what I described; limiting the scope of AO calculations to compute other things is kinda not a thing as a research topic. I actually used it as the subject of my Master’s thesis: I was using it to calculate the exposure of scenes to the solar ecliptic throughout the year so I could calculate fading from direct sun exposure on textures. That is why I shared a step-by-step instead of a link. The principle for what you want to do is the same, though: measuring the exposure of some geometry against other geometry.

      From what I just read, you want to use the Scene.ray_cast() function. Usage should be straightforward.

      import bpy

      scene = bpy.context.scene
      depsgraph = bpy.context.evaluated_depsgraph_get()

      for point in buildingPointCloud:        # mathutils.Vector samples on the dwelling units
          total_clear = 0
          for destination in parkPointCloud:  # mathutils.Vector samples on the park/lake
              offset = destination - point
              # Scene.ray_cast returns (hit, location, normal, face_index, object, matrix)
              hit, *_ = scene.ray_cast(depsgraph, point, offset.normalized(), distance=offset.length)
              total_clear += 0 if hit else 1  # a hit along the segment means the sightline is blocked
          visibility = total_clear / len(parkPointCloud)
          FunctionToStoreVisibilityOnPointOrInList(visibility)


      That should be enough to get you started. I am 99% unfamiliar with the Blender Python API, but this should get you there if you are even remotely experienced with it. There are obviously optimizations that can be done, as this is very brute force; I was just trying to illustrate the basic loop.
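
      For instance, since your actual question is binary (can this dwelling unit see the park at all?), you could bail out on the first unobstructed ray instead of sampling the whole park point cloud. Same assumptions and API as the snippet above, untested:

          # binary variant: stop at the first clear sightline to the park
          def can_see_park(point, park_points, scene, depsgraph):
              for destination in park_points:
                  offset = destination - point
                  hit, *_ = scene.ray_cast(depsgraph, point, offset.normalized(), distance=offset.length)
                  if not hit:
                      return True   # at least one unobstructed line of sight
              return False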

          • DontNoodles@discuss.tchncs.deOP · 9 months ago

            Currently, I’m trying to find a lazy (for me) way out. I’m learning to bake the lighting on objects and figuring out ways to do it iteratively and automatically for all objects of choice (buildings) in a scene. Thereafter, I hope to do image processing on the unwrapped light-baking maps to detect the desired colors. It should be possible to crop these images to detect light on individual faces and/or find the percentage of area exposed, too.
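
            For the image-processing part, I’m picturing something like this (just a sketch; the image name is a placeholder for a baked lightmap and the 0.05 threshold is arbitrary):

                import bpy
                import numpy as np

                img = bpy.data.images["Bake_Building_001"]           # hypothetical baked lightmap
                px = np.array(img.pixels[:]).reshape(-1, 4)[:, :3]   # flat RGBA floats -> N x RGB

                # texels that picked up colored park/lake light have unequal RGB channels;
                # plain white or unlit texels stay roughly channel-balanced
                tinted = (px.max(axis=1) - px.min(axis=1)) > 0.05
                exposure = tinted.mean()                             # fraction of the unwrapped area that is lit
                print(img.name, f"{exposure:.1%} of the baked area sees a park/lake")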

            If this does not work out, I’ll move on to the progressively more involved approaches suggested in the thread as I learn and become comfortable with what you guys have kindly given me pointers for.