Resolution and Reason: evaluating underwater 3D data capture for archaeology – ongoing
Fraser Sturt
The increasing use of three-dimensional (3D) and four-dimensional (3D plus time) datasets has revolutionised maritime archaeology. Over the last decade we have seen rapid technological development in systems for both gathering and processing data underwater. Together these advances are allowing us to answer longstanding questions about site identification, formation and preservation. On one hand, as some methods (such as photogrammetry) have become cheaper and easier to deploy, we have seen a dramatic increase in 3D data outputs and representations. On the other hand, we have seen dramatic increases in the resolution of data that we can acquire from more expensive sensors, such as mechanical sector scanning sonars and underwater laser scanners. However, while each method can produce a 3D point cloud, the processes through which those data are captured and derived differ, and each introduces its own potential errors in absolute precision and accuracy.
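As a concrete illustration of how such differences might be quantified, the minimal sketch below compares two point clouds using nearest-neighbour (cloud-to-cloud) distances. It assumes the clouds are already co-registered in a common coordinate frame and stored as plain xyz text files; the file names, and the choice of the laser scan as the reference surface, are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_deviation(reference, test):
    """Nearest-neighbour distance from each test point to the reference cloud.

    Both inputs are (N, 3) arrays of x, y, z coordinates, assumed to be
    co-registered in the same coordinate frame (e.g. via survey control
    or prior alignment).
    """
    tree = cKDTree(reference)
    distances, _ = tree.query(test, k=1)
    return distances

# Hypothetical comparison: a photogrammetric cloud evaluated against a
# laser-scan cloud treated as the reference surface.
reference_cloud = np.loadtxt("laser_scan.xyz")    # hypothetical file
test_cloud = np.loadtxt("photogrammetry.xyz")     # hypothetical file

d = cloud_to_cloud_deviation(reference_cloud, test_cloud)
print(f"mean deviation:  {d.mean():.4f} m")
print(f"RMS deviation:   {np.sqrt(np.mean(d**2)):.4f} m")
print(f"95th percentile: {np.percentile(d, 95):.4f} m")
```

Summary statistics of this kind (mean, RMS and upper-percentile deviation) are one simple way of expressing how closely two capture methods agree over the same surface, although they say nothing about absolute accuracy unless the reference cloud is itself tied to independent survey control.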
These differences in data capture methods, costs, outputs and absolute error can make it hard to select an appropriate technique for a particular project. In addition, the limited availability of some systems makes it difficult for archaeologists to access sample datasets and so to understand fully what might be achieved by adopting a different technique. This project therefore seeks to collect comparative data from the use of underwater laser scanning, 3D multibeam scanning and photogrammetric methods in controlled conditions. It will include a workshop where participants can take part in data capture and processing. The example datasets will be made publicly available on a wiki site where the broader community can discuss the use of cutting-edge techniques and share best practice. The aim of this project is thus to democratise knowledge of these technologies and to increase global capacity for 3D data capture in maritime archaeology.