The visual quality and performance of your tiled mesh display depend on several factors. LuciadRIA does its best to find a good balance between these factors. It also offers various options to optimize visual quality and performance for your dataset, hardware, and use case.

Adjusting mesh detail and amount of data loaded

LuciadRIA uses the scale information present in the source data to determine which tiles and which level of detail to load at any given moment.

Sometimes, though, the scale information in a dataset isn’t well-configured, which makes it difficult to choose the right level of detail.

You can adjust the amount of detail loaded with the qualityFactor setting on TileSet3DLayer. The default value is 1.0. Increasing the value loads more data and detail; decreasing it reduces the amount of data.
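
For example, this is a minimal sketch of setting the quality factor when you create a layer, and adjusting it again at runtime. It assumes an existing WebGLMap named map and a placeholder tileset URL; the module paths follow a typical LuciadRIA setup and may differ in your version.

    import { OGC3DTilesModel } from "@luciad/ria/model/tileset/OGC3DTilesModel.js";
    import { TileSet3DLayer } from "@luciad/ria/view/tileset/TileSet3DLayer.js";

    // Placeholder URL: point this at your own 3D Tiles dataset.
    const model = await OGC3DTilesModel.create("https://example.com/tileset/tileset.json");

    // Start with a higher quality factor to load more detail than the default 1.0.
    const layer = new TileSet3DLayer(model, {
      label: "City mesh",
      qualityFactor: 1.5
    });
    map.layerTree.addChild(layer);

    // If performance drops on your target hardware, lower the quality factor at runtime.
    layer.qualityFactor = 0.7;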

These are our recommendations for using the quality factor:

  • If you see low detail, or visible switches in level-of-detail when you are navigating, increase the qualityFactor.

  • If you experience bad performance, or even browser crashes due to memory overload, decrease the qualityFactor.

Adjusting your map’s memory budget

When you are working with large and detailed datasets, we recommend tuning the allowed GPU memory usage for your map. By adjusting the memory budget, you can get the best detail and performance for your dataset on your target hardware.

See Adjusting your WebGLMap’s memory budget to learn how to do this.
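
As a rough sketch, and assuming that the maxMemoryUsageHint option described in that article is available in your LuciadRIA version, you can set the budget when you create the map. The DOM node id, reference, and budget values below are illustrative only.

    import { WebGLMap } from "@luciad/ria/view/WebGLMap.js";
    import { getReference } from "@luciad/ria/reference/ReferenceProvider.js";

    // Assumption: maxMemoryUsageHint takes CPU and GPU budgets in megabytes.
    // Check the WebGLMap API reference for the exact option name and units in your release.
    const map = new WebGLMap("mapNode", {
      reference: getReference("EPSG:4978"),
      maxMemoryUsageHint: {
        cpuMB: 1024,
        gpuMB: 1024
      }
    });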

Setting graphics effects

Using one or more of our graphics effects can have a big impact on the visual result of your mesh.

See Configuring WebGL Map effects for more information.

You can enable lighting in your map, but still disable it for specific datasets using the MeshStyle.lighting setting. This is often useful for aerial mesh datasets that already have shading from the captured imagery.
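
As an illustration, the sketch below turns on a sun light for the whole map, but disables lighting for an aerial mesh layer whose imagery already contains shading. It assumes an existing WebGLMap named map and a model named meshModel; module paths may differ per version.

    import { createSunLight } from "@luciad/ria/view/LightEffect.js";
    import { TileSet3DLayer } from "@luciad/ria/view/tileset/TileSet3DLayer.js";

    // Light the whole 3D scene.
    map.effects.light = createSunLight();

    // The aerial mesh already has baked-in shading, so skip lighting for this layer only.
    const meshLayer = new TileSet3DLayer(meshModel, {
      label: "Aerial mesh",
      meshStyle: {
        lighting: false
      }
    });
    map.layerTree.addChild(meshLayer);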

Working with transparency

If your mesh data has transparent surfaces, or your colorExpression adds transparency, you must tell the layer to take this into account.
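
As a minimal sketch, and assuming the transparency constructor option of TileSet3DLayer (check the API reference for the exact name in your version), this is what that can look like with an existing model named meshModel:

    import { TileSet3DLayer } from "@luciad/ria/view/tileset/TileSet3DLayer.js";

    // Tell the layer that its surfaces, or its colorExpression, can be transparent,
    // so that they are rendered with the proper blending.
    const layer = new TileSet3DLayer(meshModel, {
      label: "Mesh with transparent surfaces",
      transparency: true
    });
    map.layerTree.addChild(layer);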

Using GPU texture compression

LuciadRIA can take advantage of GPU texture compression. This technique greatly reduces GPU memory usage.

GPU texture compression works on most supported platforms. LuciadRIA automatically detects which compression format to use, and converts textures on-the-fly.

If your environment doesn’t support GPU texture compression, this feature has no effect. Use the "Device Support" sample to check if your device supports texture compression.

Because the benefits of GPU texture compression are significant, it’s enabled by default. It can introduce very small visual artifacts in the textures, though. These are seldom noticeable, but here are our recommendations for using GPU texture compression:

  • Enable texture compression to display meshes with large textures, such as reality capture reconstructions.

  • Disable texture compression to display meshes with lookup textures, such as CAD models.

For more information, see the textureCompression constructor option of TileSet3DLayer.
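
As a hedged sketch, this is how you could disable texture compression for a CAD model layer, assuming the option accepts a boolean and an existing model named cadModel; check the TileSet3DLayer API reference for the exact accepted values.

    import { TileSet3DLayer } from "@luciad/ria/view/tileset/TileSet3DLayer.js";

    // CAD models often rely on small lookup textures, where compression artifacts are
    // most noticeable, so turn texture compression off for this layer only.
    const cadLayer = new TileSet3DLayer(cadModel, {
      label: "CAD model",
      textureCompression: false // assumption: boolean flag; see the API reference
    });
    map.layerTree.addChild(cadLayer);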