Title: Using virtual or augmented reality for the time-based study of complex underwater archaeological excavations
Authors: Nawaf, Mohamad Motasem
Drap, Pierre
Ben Ellefi, M.
Nocerino, E.
Chemisky, Bertrand
Chassaing, T.
Colpani, A.
Noumossie, V.
Hyttinen, Kari
Wood, John
Gambin, Timmy
Sourisseau, Jean-Christophe
Keywords: Cultural property -- Protection
Virtual reality in archaeology
Excavations (Archaeology)
Underwater archaeology
Augmented reality
Issue Date: 2021
Publisher: CIPA
Citation: Nawaf, M., Drap, P., Ben-Ellefi, M., Nocerino, E., Chemisky, B., Chassaing, T., ... & Sourisseau, J. C. (2021). Using virtual or augmented reality for the time-based study of complex underwater archaeological excavations. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 8(M-1-2021), 117-124.
Abstract: Cultural Heritage (CH) resources are partial, heterogeneous, discontinuous, and subject to ongoing updates and revisions. The use of semantic web technologies combined with 3D graphical tools is proposed to improve access, exploration, exploitation and enrichment of these CH data in a standardized and more structured form. This article presents the monitoring work carried out for more than ten years on the excavation of the Xlendi site. Around an exceptional shipwreck, the oldest from the Archaic period in the Western Mediterranean, we have set up a unique excavation at a depth of 110 m, assisted by a rigorous and continuous photogrammetry campaign. All the collected results are modelled by an ontology and visualized with virtual and augmented reality tools that provide a bidirectional link between the proposed graphical representations and the non-graphical archaeological data. It is also important to highlight the development of an innovative 3D mobile app that lets users study and understand the site as well as experience sensations close to those of a diver visiting the site.
Appears in Collections:Scholarly Works - FacArtCA

Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.