VAR4Good 2024: projects

We’ve had another successful round of students in the Visualisation Lab in January 2024, for the fourth iteration of the course “Virtual and Augmented Reality for Good” (VAR4Good). This year, nine groups with 41 students in total submitted a project.

Below are the videos made by each group to give you an impression of what they built. If you have an Oculus Quest (1, 2, 3 or Pro) and don’t mind sideloading you can even download an installation package and try the applications yourself! Click on the title below the video to download.

Watersnood by Amitai, Joep, Fabian and Yitjun.

EHBOSim Madelon by Andrea, Kai, Sven and Teun.

Search & Rescue by Sander, Fabian, Jarno, Luke and Tara.

VR Driving Simulator by Ruud, Chris, Petra, Mikky and Junis.

HouseHold Hazard Hunt by Ishana, Mathijs, Lukasz, Sara and Ben.

Visual Impairment Simulator by Justin, Kieran, Ivar and Amber.

ADHD360 by Luuk, Daniel, Thijs and Jochem.

Veilig Vuurwerk by Stan, Lucas, Frenklin and Jasper.

Campfire Companion by Thomas, Jens, Jason, Annes and Job.

Visualisation Lab won ‘Student Visualisation Competition’ at the CompBioMed 2023 conference in Munich!

Christiaan Spieker, PhD student at the Computational Science Lab, together with colleagues at the Visualisation Lab, won the first ‘Student Visualisation Competition’ held at the CompBioMed 2023 conference in Munich. The image, entitled ‘Cinematic punctured vessel blood flow simulation visualisation’, uses simulation data produced by the Computational Science Lab. The image was produced in a joint effort by Christiaan Spieker, Joey van der Kaaij, David de Kanter, Robert Belleman, Gábor Závodszky and Alfons Hoekstra.

The creation process of the image begins with raw blood cell simulation data in XMF and HDF5 format, which is converted into mesh data (in the X3D format) with the help of ParaView. The meshes are then imported into Blender. Meanwhile, a cylinder representing the blood vessel is modeled in ZBrush and further textured in Adobe Substance Painter to give the model a realistic-looking skin. This textured model is subsequently imported into Blender alongside the blood cell meshes.

Once the components are in Blender, the scene is rendered with the Cycles rendering engine to produce an image of the blood vessel and blood cells. In the final stage, the rendered image is imported into Adobe Photoshop, where an appropriate background environment for the blood vessel is created and some touch-ups are made to improve the overall visual appeal. This pipeline blends raw data conversion, 3D modeling, texturing, rendering and post-processing to generate a detailed, realistic image of blood cells in a blood vessel.

Advanced motion capture system installed in Visualisation Lab

The Institute for Logic, Language and Computation (ILLC) and the Informatics Institute (IvI) are thrilled to announce the acquisition of a state-of-the-art motion capture system, which includes a cutting-edge face capture system. This advanced technology will be a valuable addition to the facilities of both institutes, enabling researchers to capture high-quality motion data and facial expressions for a wide range of applications.

The system has been installed in the Visualisation Lab on the ground floor of Lab42. Rob Belleman, manager of the Visualisation Lab, states: “This motion capture system is a significant addition to the facilities in the lab and will allow us to take our research and education to the next level. With its advanced capabilities, we can capture even the subtlest of movements, which will be invaluable for a wide range of projects.”

One of the primary beneficiaries of this new system is SignLab Amsterdam, a research group that studies sign language and its impact on communication. SignLab Amsterdam is a collaboration between researchers from ILLC and IvI, as well as sign language linguists at the Faculty of Humanities. Floris Roelofsen, founding co-director of the lab, says: “With the motion capture system’s precision and accuracy, SignLab researchers will be able to better understand and analyze the intricacies of sign language, enhancing their work on sign language recognition and translation.”

The new motion capture system in the Visualisation Lab is a testament to our commitment to providing cutting-edge resources to researchers, faculty, and students at Lab42. We look forward to seeing the innovative work that will be produced using this technology.

For more information, please visit our website or contact us directly at the Visualisation Lab.

Visualisation Lab, Informatics Institute, Lab42, L0.03

Motion capture system in Visualisation Lab

VAR4Good 2023: projects

In January 2023, the Visualisation Lab hosted the third iteration of the course “Virtual and Augmented Reality for Good” (VAR4Good). This year, ten groups with 45 students in total submitted a project.

Below are the videos made by each group to give you an impression of what they built. If you have an Oculus Quest (1 or 2) and don’t mind sideloading you can even download an installation package and try the applications yourself! Click on the title below the video to download.

Group A: “Toxic Gas Cloud simulator” by Mart, Mariska, Roy, Devrim and Raoul.

Group B: “Energy Rush” by Jonathan, Tim, Sijf, Nora and Kim.

Group C: “FitVR Senior” by Maurits, Jurgen, Lleyton, Shuqi and Dennis.

Group D: “Snowscape” by Silvia, Hannah, Rozhano, Abolabadi and Lucas.

Group E: “CPR” by Mees, Tika, Jasper, Ramon and Matthijs.

Group F: “Stanley” by Thierry, Milan, Wouter and Devin.

Group G: “Office Survival Experience” by Piraveen, Yoshi, Christiaan and Rénan.

Group H: “PowerDown!” by Tygo, Faas, Lourens, Aaron and Nina.

Group I: “Watt Wise World” by Finn, Jeffrey and Chimène.

Group J: “Parachute simulator” by Arco, Dante, Isaac, Rens and Yasser.

SURF XR on Tour @ UvA

On October 26, the Visualisation Lab of the University of Amsterdam, together with SURF and the UvA 4D Research Lab, hosts an XR on Tour event.

Students, researchers and staff of the UvA interested in Virtual Reality, Augmented Reality and other forms of Extended Reality (XR) are welcome to join the walk-in demos and try out technologies for themselves. The UvA Visualisation Lab will demonstrate XR applications for education & research, while SURF will demonstrate the current capabilities of XR, AR & VR technologies.

We hope you will get inspired by the possibilities of virtual and augmented reality, experience the demos yourself and grow your network!

See this page for more information on XR on Tour.


October 26, 2022
13:00 – 17:00
Lecture hall L1.01, LAB42 building, University of Amsterdam, Science Park 900, Amsterdam

Master thesis defense Yazhou Zeng

Surface visualisation of the coral species Acropora millepora.

Computational Science student Yazhou Zeng defended his master thesis, “Quantitative morphometrics of scleractinian corals”, on August 3rd, 2020. Yazhou developed a data analysis pipeline for the quantitative analysis of CT scans of corals.

Master thesis defense Sandro Massa

Software Engineering student Sandro Massa successfully defended his master thesis on July 25th, 2022. Sandro worked on an “Interactive Visualization Pipeline Construction Environment for Virtual Reality Visualizations”, which included a software interface that allows one to use the functionality of the Visualization Toolkit (VTK) in the Unity development engine. Congratulations Sandro!

Master thesis defense Iris Reitsma

Iris Reitsma successfully defended her Computational Science master thesis today, June 30th. Her thesis, entitled “Simulation of Tree Root Growth in a 3D Model of Amsterdam”, reports on the work she did together with the Amsterdam municipality on modeling the growth of tree root systems and predicting their interaction with the infrastructure in Amsterdam. The result of her work is included in 3D Amsterdam.

New method developed for analysing 3D images of otoliths

Jaap Kaandorp and Rob Belleman of the Informatics Institute, Computational Science Lab (CSL), have developed a new method to analyze 3D images of otoliths and quantify the 3D morphology of these mm-sized biomineralized structures.

Surface reconstruction of a microCT of an otolith of the European hake (Merluccius merluccius)

Images of the growth and form of otoliths in fish were obtained with micro computed tomography (microCT) scanning. This project was carried out in collaboration with the University of Bologna (Quinzia Palazzo).

Otoliths are intricate, complex-shaped calcium carbonate structures in fish that play a role in sound, movement and gravity detection. Growth rings in otoliths provide information about the age of the fish, and the analysis of otoliths is relevant to the population biology of fishes.

In this project, master student Steven Raaijmakers developed a method for quantifying these complex-shaped morphologies as part of his master thesis work in Computational Science.

A paper about this work was recently accepted for publication in Royal Society Open Science.