An Augmented Reality Setup from Fusionated Visualization Artifacts

  • Maik Mory
  • Martin Wiesner
  • Andreas Wünsch
  • Sandor Vajna
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8683)


Merging three-dimensional visualization artifacts interactively from arbitrary sources is a promising approach to supporting interoperability in the engineers' software landscape. Building on previous work, which yielded a framework for asynchronous processing of OpenGL, we present a component that combines three-dimensional visualizations from OpenGL streams into a single three-dimensional visualization space in real time. In our current setup, CAx software is integrated with point-cloud rendering from an RGB-D camera to resemble an orthoscopic virtual mirror, which combines the user's reality in front of the mirror with the CAx software's virtual reality inside it. We present results on how the tested augmented reality setup fosters cooperative decisions in product development and engineering.
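The fusion the abstract describes, merging two independently rendered three-dimensional scenes into one shared visualization space, reduces per pixel to depth-based compositing: keep whichever fragment is closest to the camera. The sketch below illustrates that rule in plain Python; it is not the authors' OpenGL implementation, and the function and variable names are hypothetical.

```python
def fuse_depth(color_a, depth_a, color_b, depth_b):
    """Per-pixel z-buffer compositing of two rendered images.

    color_*: lists of pixel payloads; depth_*: lists of floats where a
    smaller value means closer to the camera. The fused image keeps, per
    pixel, the fragment with the smaller depth -- the same rule OpenGL's
    depth test applies when two streams draw into one framebuffer.
    """
    fused_color, fused_depth = [], []
    for ca, da, cb, db in zip(color_a, depth_a, color_b, depth_b):
        if da <= db:  # stream A's fragment is nearer
            fused_color.append(ca)
            fused_depth.append(da)
        else:         # stream B's fragment is nearer
            fused_color.append(cb)
            fused_depth.append(db)
    return fused_color, fused_depth

# Example: a CAx rendering ("cad") and a camera point cloud ("cam")
# interleave depending on which surface is nearer at each pixel.
color, depth = fuse_depth(["cad"] * 3, [0.4, 0.9, 0.2],
                          ["cam"] * 3, [0.6, 0.3, 0.5])
# color == ["cad", "cam", "cad"], depth == [0.4, 0.3, 0.2]
```

In a real setup this comparison happens in the GPU's depth test rather than on the CPU, so the two OpenGL streams only need to render into a common framebuffer with consistent depth ranges.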


Keywords: cooperative decision · mixed reality · three-dimensional visualization · distributed system · interoperability




Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Maik Mory¹
  • Martin Wiesner¹
  • Andreas Wünsch¹
  • Sandor Vajna¹
  1. Otto-von-Guericke-Universität Magdeburg, Germany
