Perhaps it exists, but I haven’t found a class in Manta that does image-based lighting (IBL). I have some rough ideas.

1) The existing light models in Manta support area lights. In the implementation I see that random positions are generated on the light’s area primitive and connected to the input rays’ intersection points on the geometry, attenuated by the cosine of the angle with the normal at the intersection. For IBL, I would prefer that each ray sample many light directions in the hemisphere around the normal of the intersected geometry. Perhaps I can modify the area light model above to perform this hemispherical sampling? Since the area light rays in computeLighting() do not check whether the newly generated destRays are in shadow (I assume that is calculated elsewhere), do I also not have to check whether the generated hemispherical rays are in shadow? I think the destRays in any light model are shadow rays. What should the destRays’ directions be for IBL? And for the destRays’ colors, writing the irradiance color of the hemisphere seems inadequate, since the full intersection information for the hemisphere isn’t yet known.
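For reference, here is a minimal, self-contained sketch of cosine-weighted hemisphere sampling around a surface normal, the kind of direction generation idea 1) needs. This is not Manta’s API — the Vec3 type and every name below are hypothetical, for illustration only. The cosine weighting matches the N·L attenuation, so environment samples drawn this way can be averaged with equal weight:

```cpp
#include <cmath>

// Hypothetical minimal vector type; Manta has its own vector classes.
struct Vec3 {
    double x, y, z;
};

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static double dot(const Vec3& a, const Vec3& b) {
    return a.x*b.x + a.y*b.y + a.z*b.z;
}
static Vec3 normalize(const Vec3& v) {
    double len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// Cosine-weighted sample over the hemisphere around unit normal n.
// u1, u2 are uniform random numbers in [0,1).  Samples a point on the
// unit disk and projects it up onto the hemisphere (Malley's method).
Vec3 sampleHemisphereCosine(double u1, double u2, const Vec3& n) {
    double r   = std::sqrt(u1);         // radius on the unit disk
    double phi = 2.0 * M_PI * u2;       // azimuth on the disk
    double lx  = r * std::cos(phi);
    double ly  = r * std::sin(phi);
    double lz  = std::sqrt(1.0 - u1);   // lift onto the hemisphere

    // Build an orthonormal basis (t, b, n) around the normal.
    Vec3 a = (std::fabs(n.x) < 0.9) ? Vec3{1, 0, 0} : Vec3{0, 1, 0};
    Vec3 t = normalize(cross(a, n));
    Vec3 b = cross(n, t);
    return { lx*t.x + ly*b.x + lz*n.x,
             lx*t.y + ly*b.y + lz*n.y,
             lx*t.z + ly*b.z + lz*n.z };
}
```

In a light model, each such direction would become a destRay direction; its contribution would be whatever the environment map returns along that direction, provided the ray escapes the scene unoccluded.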
2) Alternatively, in the area-light setup, each intersected ray samples just one random position on the light primitive. The included area light demo looks very spotty under this scheme, but perhaps one-sample estimation would shine in a path-traced solution? If so, then for path-traced IBL, sampling one and only one random direction from the hemispherical background ‘light’ per bounce should be sufficient, as long as many rays are shot through each screen-space pixel.
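To see why one sample per ray can suffice, here is a small Monte Carlo sketch — hypothetical standalone code, not Manta’s — in which a single uniform hemisphere sample per “ray”, averaged over many rays, converges to the analytic irradiance pi * Le of a constant environment of radiance Le:

```cpp
#include <cmath>
#include <random>

// One-sample irradiance estimate under a constant environment of
// radiance Le.  A uniform hemisphere direction has pdf = 1 / (2 pi),
// so the unbiased estimator is Le * cosTheta / pdf.
double oneSampleIrradiance(double Le, std::mt19937& rng) {
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    // Uniform over the hemisphere: the z-component (cosTheta) is
    // itself uniformly distributed in [0, 1).
    double cosTheta = uni(rng);
    return Le * cosTheta * 2.0 * M_PI;
}

// Average n one-sample estimates; should approach pi * Le.
double averageIrradiance(double Le, int n, unsigned seed) {
    std::mt19937 rng(seed);
    double sum = 0.0;
    for (int i = 0; i < n; ++i) sum += oneSampleIrradiance(Le, rng);
    return sum / n;
}
```

Each individual estimate is very noisy (hence the spotty demo), but the mean over many per-pixel rays converges, which is exactly the path-tracing trade-off described above.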
ray direction to the sphere. A scheme similar to the one in 1) could be applied here, but again I am unsure about the shadowed portion of the hemisphere. Within the scope of the shade() function, just as within the scope of the computeLighting() function, we do not yet have any intersection information for the generated hemispherical rays, and cannot simply output the irradiances into destRays.
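For concreteness, here is a sketch of the generic latitude-longitude direction-to-UV mapping that an environment-map background lookup typically performs. I am only guessing that EnvMapBackground does something like this; Manta’s actual axis and wrap conventions may differ:

```cpp
#include <cmath>

// Map a unit direction (x, y, z) to texture coordinates (u, v) in
// [0,1]^2 using the common latitude-longitude parameterization,
// with z treated as "up".  This is only the generic scheme, not
// necessarily what Manta's EnvMapBackground implements.
void dirToLatLongUV(double x, double y, double z, double& u, double& v) {
    double phi   = std::atan2(y, x);      // azimuth in (-pi, pi]
    double theta = std::acos(z);          // polar angle in [0, pi]
    u = (phi + M_PI) / (2.0 * M_PI);      // wrap azimuth into [0, 1]
    v = theta / M_PI;                     // 0 at the +z pole, 1 at -z
}
```

With such a mapping, the color fetched at (u, v) plays the role of the environment radiance along a hemispherical sample direction, once that direction is known to be unoccluded.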
Finally, can any kind of coherency or similarity be assumed for the input RayPacket sourceRays? I don’t yet understand whether these are multiple rays from the same screen pixel for multi-sampling, children of those rays, or entirely incoherent, unrelated rays originating from different screen pixels at different traversal depths, with shadow rays possibly mixed in.

Thanks,
Bo

From: Abe Stephens [mailto:
I don't think it would be too difficult to put together an image-based lighting scene -- even hand-coded with SSE -- and it is certainly something that is missing. Are you looking for an example of a specific operation? The mailing list might be able to offer suggestions about the implementation.

RTSL isn't part of the open source repository. I don't think it was ever used outside of the original paper's authors and RayScale.

Abe

On Sep 8, 2008, at 3:18 PM, Bo Huang wrote:
Hi,

Other than the demos given in the Scene directory, are there others available for study? I am most interested in a simple scene that is lit by an HDR image and globally illuminated, with non-diffuse materials such as glossy metals. The focus would be realism and physical accuracy; scene complexity is not important at all. Also, I am curious about any tutorial on integrating RTSL with Manta.

Thank you.