Welcome to the ‘DISCOVER THE HUMANTECH WORK’ blogs, a series where we dive into the HumanTech work packages (WPs) with our WP Leaders!

Let’s start this series with WP4, wearable technologies for construction, led by Bruno Walter Mirbach, Senior Researcher at the Department of Augmented Vision at DFKI.

This WP is composed of four tasks:

  • T4.1: Inertial sensors – wearable camera integration, led by sci-Track
  • T4.2: Intention prediction and exoskeleton integration, led by Tecnalia
  • T4.3: Wearable camera digital twin localisation, led by DFKI and
  • T4.4: XR-glass integration and BIMxD visualisation framework, led by HoloLight

In this blog, Bruno presents the work carried out in the first half of the project, the benefits and the impact of the work done, its liaison with the other WPs and what is next!

WHAT WP4 ACHIEVED

Work package 4 focuses on the development of wearable technologies for construction. These technologies will support workers in physically demanding and stressful situations to enhance their efficiency, safety, and satisfaction. Moreover, extended reality (XR) glasses and deep-learning methods applied to body cameras shall ease the use and increase the effectiveness of digitalisation methods in construction.

A key achievement in the first half of the project has been the development of a visual-inertial body sensor network (BSN). Combining a network of Inertial Measurement Units (IMUs) with camera-based visual tracking allows for robust pose tracking of the worker. We will use this as input to an artificial intelligence (AI) algorithm that detects workers’ intentions and automatically activates an exoskeleton when needed.
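To give an intuition of how such a fusion works: the IMU network provides high-rate motion estimates that drift over time, while the camera provides absolute but lower-rate pose fixes. The minimal sketch below (Python, with illustrative names and values, not the project’s actual algorithm) shows a simple complementary-filter-style blend of the two estimates.

```python
import numpy as np

def fuse_position(imu_estimate, camera_fix, alpha=0.98):
    """Blend a drift-prone, high-rate IMU position estimate with a
    drift-free but lower-rate camera-based fix (complementary filter idea)."""
    imu_estimate = np.asarray(imu_estimate, dtype=float)
    camera_fix = np.asarray(camera_fix, dtype=float)
    return alpha * imu_estimate + (1.0 - alpha) * camera_fix

# Hypothetical values in metres: the camera fix gradually pulls the
# drifting IMU estimate back towards the true position.
print(fuse_position([1.02, 0.48, 1.31], [1.00, 0.50, 1.30]))
```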

The integration of the IMU network with the camera system has been achieved by the HumanTech partner sci-Track, with support from RICOH, which has developed an entirely new prototype of a dual-fisheye 360° camera for integration into the visual BSN. In parallel, Tecnalia has built a test bench with IMU and physiological sensors to determine the actions in which workers need support and to develop an intention recognition algorithm for exoskeleton activation.
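As a rough illustration of what such an intention recognition could look like, here is a toy rule applied to hypothetical trunk-inclination values; it is not Tecnalia’s actual algorithm, which is being developed from the test-bench data.

```python
import numpy as np

def lifting_intention(trunk_flexion_deg, window=10, threshold_deg=40.0):
    """Toy rule: signal a lifting intention when the trunk flexion angle,
    derived from the body sensor network, stays above a threshold for a
    full window of consecutive samples."""
    recent = np.asarray(trunk_flexion_deg[-window:], dtype=float)
    return recent.size == window and bool(np.all(recent > threshold_deg))

# Hypothetical sample stream: the worker bends forward to pick up a load.
angles = [5, 10, 20, 35] + [48] * 12
print("Activate exoskeleton support:", lifting_intention(angles))
```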

HoloLight has integrated the visualisation of a BIM model in the HoloLens into its software framework. In addition, HoloLight and DFKI, in collaboration with sci-Track, are working on a camera-based localisation of the worker with respect to a BIM model. We can use these functionalities for an augmented reality (AR) visualisation of progress monitoring or for issue investigation through a BIM model.
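One standard way to obtain such a camera pose relative to a model is to match points that are known in the BIM model with their detections in the camera image and solve a Perspective-n-Point (PnP) problem. The sketch below (Python with OpenCV; all coordinates and intrinsics are made up for illustration, and the project’s actual method may differ) shows the principle.

```python
import numpy as np
import cv2

# Hypothetical 3D points taken from a BIM model (e.g. room corners), in metres,
# and their detected 2D projections in the camera image, in pixels.
bim_points = np.array([[0, 0, 0], [4, 0, 0], [4, 0, 2.8], [0, 0, 2.8],
                       [0, 3, 0], [4, 3, 0]], dtype=np.float64)
image_points = np.array([[320, 420], [940, 430], [930, 120], [330, 110],
                         [400, 460], [860, 470]], dtype=np.float64)

# Assumed pinhole intrinsics (focal length and principal point) for a 1280x720 image.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])

# Solve the Perspective-n-Point problem: find the rotation and translation
# that map BIM coordinates into the camera frame.
ok, rvec, tvec = cv2.solvePnP(bim_points, image_points, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)               # rotation vector -> rotation matrix
    camera_position = (-R.T @ tvec).ravel()  # camera centre in BIM coordinates
    print("Camera position in BIM frame [m]:", camera_position)
```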

Hand menu, XR BIM visualisation of HoloLight’s office BIM model.

In HumanTech, we are not working in silos – let’s discover how this WP is linked to the others, as well as its impact and benefits.

We will demonstrate the exoskeleton with automatic activation function in a use case of Pilot 2: human-robot collaboration and wearables. Therefore, from the start of the project, there has been a close collaboration with the pilot owner and other involved partners — in particular, from WP5: construction robotics and human-robot collaboration — to define use cases for the exoskeleton.

WP4 is also very closely linked to WP2 and WP3. WP2 provides the BIMxD backend, the communication and processing platform from which the BSN and the XR glasses receive content from the BIM model generated within WP3. We can either visualise this content in the XR glasses or use it to determine the camera’s pose with respect to the BIM model. Therefore, we have organised multiple workshops between WP4, WP2, and WP3 to define extensions of BIM standards as well as interfaces and services of the BIMxD backend.

WP3 partners have strongly supported WP4 partners in scanning lab spaces to generate BIM models. In return, the development of camera localisation and the augmented visualisation of BIM content makes it possible to showcase the benefits of the methods developed in WP3 (dynamic semantic digital twin generation), thus providing added value.

Moreover, RICOH’s development of the novel 360° camera brought an unforeseen benefit to the project. Initially designed for integration into the BSN, the camera turned out to be an ideal environment perception sensor for the robotic platform developed in WP5, into which it will now also be integrated.

The exoskeleton with an automatic activation function is one of the project’s key exploitable results because it overcomes the downsides of existing exoskeletons, which either require manual activation or impede natural motion. With the exoskeleton’s so-called transparency, the range of applications will increase, as will the acceptance of exoskeletons in construction.

Exoskeleton for construction environment in Tecnalia’s lab.

Computer vision and deep-learning methods, such as the automatic localisation of a camera with respect to a BIM model and the XR visualisation, can strongly support the demonstration and exploitation of the methods developed in WP3.

WHAT IS NEXT

The final step in the development of the exoskeleton with automatic activation function is the hardware and software integration of the visual BSN with the exoskeleton, the optimisation of the pose tracking and intention recognition algorithms, and the validation of the system. The final system will be demonstrated within a use case of Pilot 2.

Ongoing tasks include research on methods to accurately localise the pose of a camera with respect to a BIM model and the implementation of the communication between the body sensors and the BIMxD backend.
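By way of illustration, such a communication could be as simple as pushing pose updates to a backend service over HTTP; the endpoint, payload fields, and transport below are purely hypothetical, since the real interfaces are defined by the BIMxD backend in WP2.

```python
import json
import requests

# Hypothetical pose update: the worker's camera pose expressed in the
# coordinate frame of the BIM model.
pose_update = {
    "worker_id": "worker-07",
    "timestamp": "2024-01-15T10:15:00Z",
    "position_m": [12.4, 3.1, 1.6],
    "orientation_quat": [0.0, 0.0, 0.707, 0.707],
}

# Illustrative endpoint only; the actual BIMxD backend services and routes
# are specified within the project.
response = requests.post(
    "https://bimxd-backend.example/api/worker-poses",
    data=json.dumps(pose_update),
    headers={"Content-Type": "application/json"},
    timeout=5,
)
response.raise_for_status()
```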


In the coming weeks, we will continue to share our progress on the different work packages. Stay tuned to our news and social media (LinkedIn and Twitter) to stay up to date!