
HumanTech Technologies: Visual-inertial tracking unit

We are developing cutting-edge wearable technologies to enhance safety and efficiency in construction environments. In this edition of our HumanTech technologies series, our colleague Markus Miezal, CEO and Co-Founder of Sci-track, explains one of our key innovations in this area: a magnetometer-free visual-inertial tracking system that offers precise motion tracking even in magnetically disturbed environments — a collaboration between our partners Sci-track and RICOH.

Body tracking and its problems

The term inertial measurement unit (IMU) usually refers to a sensor package containing an accelerometer, a gyroscope and a magnetometer. These measure 3D acceleration (including gravity), 3D rotational velocity and the 3D magnetic field (i.e. a compass), usually at a high frequency (e.g. 100 Hz).

Every smartphone has an IMU. When the display flips, it is because the accelerometer has detected a different "down" direction from gravity. A common application for these sensors is orientation estimation. By fusing the information from the accelerometer and the gyroscope, gravity and linear acceleration are separated so that a clean downward direction can be measured. The magnetometer adds yaw (heading) information.
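
To illustrate the idea of fusing the two sensors (a minimal sketch, not the actual HumanTech/Sci-track algorithm), a simple complementary filter tracks roll and pitch by trusting the gyroscope at high frequency and letting the accelerometer's gravity measurement correct long-term drift; the function name and parameter values below are illustrative assumptions:

```python
# Minimal complementary-filter sketch (illustrative only, not the Sci-track algorithm).
# The gyroscope gives smooth short-term orientation changes; the accelerometer anchors
# the long-term "down" direction. Yaw remains unobservable without a magnetometer.
import math

def complementary_filter(gyro, accel, dt=0.01, alpha=0.98):
    """gyro: iterable of (wx, wy) in rad/s; accel: iterable of (ax, ay, az) in m/s^2."""
    roll, pitch = 0.0, 0.0
    for (wx, wy), (ax, ay, az) in zip(gyro, accel):
        # Integrate rotational velocity (accurate short-term, drifts long-term).
        roll_gyro = roll + wx * dt
        pitch_gyro = pitch + wy * dt
        # Tilt angles from the measured gravity direction (noisy, but drift-free).
        roll_acc = math.atan2(ay, az)
        pitch_acc = math.atan2(-ax, math.hypot(ay, az))
        # Blend: mostly gyro at high frequency, accelerometer in the long run.
        roll = alpha * roll_gyro + (1 - alpha) * roll_acc
        pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
    return roll, pitch

# A static sensor measuring only gravity stays level (roll = pitch = 0).
print(complementary_filter([(0.0, 0.0)] * 100, [(0.0, 0.0, 9.81)] * 100))
```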

When these sensors are placed on the body, the orientations of every limb can be calculated. With additional information, such as a biomechanical model, accurate orientations across the body or — in the most common use case — the relative segment orientations, i.e., the joint angles, can also be obtained.

The dimensions of the body allow us to better predict the measured accelerations, and the mechanical limitations of certain joints can be exploited. In our case, the biomechanical foot model allows us to estimate positional information through ground contact estimation*.
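
As a rough sketch of why ground contact helps (a hypothetical zero-velocity update on a foot-mounted sensor, not the actual biomechanical foot model), resetting the integrated velocity whenever the foot is on the ground bounds the position drift to a single stride:

```python
# Illustrative zero-velocity-update (ZUPT) sketch for a hypothetical foot-mounted IMU.
# Whenever ground contact is detected, the drifting velocity estimate is reset,
# so position errors can only accumulate between steps, not over the whole recording.
import numpy as np

def integrate_with_zupt(free_accel, contact, dt=0.01):
    """free_accel: (N, 3) acceleration in m/s^2 with gravity removed;
    contact: (N,) boolean ground-contact flags from the foot model."""
    vel = np.zeros(3)
    pos = np.zeros(3)
    trajectory = []
    for a, on_ground in zip(free_accel, contact):
        vel = np.zeros(3) if on_ground else vel + a * dt  # reset velocity at each contact
        pos = pos + vel * dt
        trajectory.append(pos.copy())
    return np.array(trajectory)
```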

Using a model, however, requires that we know where the sensors are placed on the body. This is estimated in a small calibration step. In addition, the model dimensions, i.e., the segment lengths, must be known and may themselves be wrong.

Magnetometers are another error source. Especially indoors, any ferromagnetic metal or current-carrying electrical wires locally disturb the earth's magnetic field, so each sensor's measured "north" direction can be different. Therefore, we usually completely omit the use of the magnetometer.

Exploiting the relations between segments, we are able to maintain the yaw direction during motion, but not in static situations. A global yaw drift is also introduced.

Adding a camera

By integrating a camera into the body and, in particular, a dual fish-eye camera, which is capable of capturing almost a full sphere, information about the surroundings and the body with respect to the surroundings can be obtained.

Markus Miezal, CEO and Co-Founder of Sci-track.

By detecting the body inside the image, the segment length and intrinsic segment orientations can be corrected (e.g. during drift in static conditions).

By monitoring the surroundings, position information can be corrected and yaw drift can be eliminated. Furthermore, the camera enables localisation, and context can be added to the estimate through object detection.

*Why can’t we get positions out of accelerometers?

Let’s say an IMU is static on a table. Neither accelerations nor rotations occur, so the IMU will measure gravity and zero rotational velocity. If I place the IMU in my car and drive on the highway at 180 km/h at a constant speed and then start measuring, as long as I don’t brake or accelerate, the accelerometer will only show gravity, and as long as I don’t turn, the rotational velocity will also measure zero. We cannot distinguish a static IMU on a table from an IMU travelling at a constant speed. We have to integrate the acceleration into velocity, and to know the position, we have to integrate it again.

This will, however, always result in errors, since the measurements are digital and usually biased. Digitisation implies information loss since the measured value is quantized. For example, 16 bits correspond to 65,536 different values, which are mapped to the sensor range of ±60 m/s². The smallest measurable change is, therefore, about 1.83 mm/s². Biased means that the process of converting to a digital number comes with the problem of the zero point being shifted: instead of measuring zero, a small other value is measured, which we call bias. The high frequency of the IMUs adds to the problem, so the integrations quickly diverge, which we call drift.
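
The numbers above, and the effect of a small bias after double integration, can be reproduced with a few lines of Python (a toy calculation under the 16-bit, ±60 m/s² assumptions stated here; the 0.01 m/s² bias is an illustrative value):

```python
# Toy calculation: quantisation step of a 16-bit accelerometer with a ±60 m/s^2 range,
# and how a small constant bias grows into a large position error after double integration.
import numpy as np

bits, full_scale = 16, 120.0            # ±60 m/s^2 -> 120 m/s^2 total range
step = full_scale / 2**bits             # smallest representable change
print(f"quantisation step: {step * 1000:.2f} mm/s^2")   # ~1.83 mm/s^2

rate, duration = 100, 60.0              # 100 Hz IMU, one minute of data
bias = 0.01                             # illustrative zero-point offset in m/s^2
accel = np.full(int(rate * duration), bias)   # the sensor is static, so it should read zero
vel = np.cumsum(accel) / rate           # first integration: velocity
pos = np.cumsum(vel) / rate             # second integration: position
print(f"position error after {duration:.0f} s: {pos[-1]:.1f} m")   # roughly 18 m of drift
```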


Learn about another of our HumanTech technologies, which promises improved efficiency and accuracy for the architecture, engineering, and construction (AEC) industries: scan to BIM, explained by Mahdi Chamseddine, an M.Sc. researcher at the German Research Center for Artificial Intelligence (DFKI).

Stay tuned to our news, newsletter, and social media channels (LinkedIn, Twitter and YouTube) to follow our journey toward accelerating the digital transformation of construction!



HumanTech Technologies: Scan-to-BIM

In this edition of our HumanTech technologies series, our colleague Mahdi Chamseddine, an M.Sc. researcher at the German Research Center for Artificial Intelligence (DFKI), shares the keys to the scan-to-BIM technology we are developing.

What is scan-to-BIM and why is it important?

Building Information Modeling (BIM) is a process in architecture, engineering, and construction (AEC) that uses 3D models to centralize building project information. BIM integrates design, visualization, and collaboration, thus enhancing efficiency, accuracy, and communication throughout a building's lifecycle, from planning to construction and management.

Scan-to-BIM is the process of generating BIM models from 3D scans of existing structures, buildings, or construction sites.

"Scan-to-BIM is the process of generating BIM models from 3D scans of existing structures, buildings, or construction sites."

Scanning technologies provide accurate 3D point clouds of the target site. The scans reflect up-to-date information and real-world conditions, thus allowing for the generation of "as-built" models. The "as-built" building model can differ from the architectural plans, and thus an automated approach to BIM generation is desired. This invaluable tool streamlines projects, facilitates efficient facility management, and empowers informed decision-making throughout a building's lifecycle.

This image shows a transition from point cloud (scan), to segmentation, to BIM.

How does scan-to-BIM work?

Scan-to-BIM starts with data acquisition. There are different methods for 3D scanning of buildings, such as terrestrial scanners or photogrammetry. The scanning devices capture high-resolution and detailed point clouds of buildings or sites. Some scanning technologies can capture colour as well as 3D points, which allows for more information to be extracted.

The point clouds are then processed to remove noise and clutter. A typical scan will include data that is not relevant for the BIM generation, such as neighbouring trees or buildings and objects outside the designated site. The processed point clouds are then segmented using specialized machine-learning algorithms that are trained to recognize structural elements in scans. The AI model classifies points into relevant classes such as walls, floors, ceilings, doors, and columns. The segmented point cloud is used to create structured 3D models of the different objects depending on their type and dimensions. The collection of the 3D models, as well as their relationships and other information, constitutes the BIM model.
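
As a simplified illustration of the last step (the class names and axis-aligned bounding boxes below are stand-ins; the actual pipeline fits proper parametric wall, floor and column models), a segmented point cloud can be grouped by class and turned into per-element geometry:

```python
# Simplified sketch of turning a semantically segmented point cloud into per-class
# element geometry. Illustrative only: real scan-to-BIM fits parametric building
# elements rather than axis-aligned bounding boxes.
import numpy as np

CLASS_NAMES = {0: "floor", 1: "wall", 2: "ceiling", 3: "door", 4: "column"}

def extract_elements(points, labels):
    """points: (N, 3) xyz coordinates in metres; labels: (N,) class ids from segmentation."""
    elements = []
    for class_id, name in CLASS_NAMES.items():
        cls_pts = points[labels == class_id]
        if len(cls_pts) == 0:
            continue
        elements.append({
            "type": name,
            "min_corner": cls_pts.min(axis=0),   # crude placeholder for a fitted element
            "max_corner": cls_pts.max(axis=0),
            "n_points": len(cls_pts),
        })
    return elements

# Random points and labels stand in for a real segmented scan.
pts = np.random.rand(1000, 3) * [10.0, 8.0, 3.0]
lbl = np.random.randint(0, len(CLASS_NAMES), size=1000)
for element in extract_elements(pts, lbl):
    print(element["type"], element["n_points"])
```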

Scan-to-BIM applications and challenges

Generating a BIM model of existing buildings is labour-intensive, but automating the process can allow us to reap the benefits faster and more cost-effectively.

"Generating a BIM model of existing buildings is labour-intensive, but automating the process can allow us to reap the benefits faster and more cost-effectively."

It allows for automated progress monitoring of construction sites by generating models at regular intervals. Changes to the project are tracked and stored, and deviations from the plan can be detected and highlighted at an early stage, eliminating the need for costly corrections down the line.

Scan-to-BIM can be used, among other things, in renovation, historic preservation, and industrial projects to create precise digital replicas of existing structures. However, some challenges, including maintaining data quality, managing large datasets, scanning hazardous or hard-to-access areas, and adapting to unique and different structures, still exist.

Advancements in scan-to-BIM technology promise increased efficiency and accuracy for the AEC industry.


Learn about another of the technologies we are developing at HumanTech: the 360° ToF camera. Its speed, portability, accuracy, and efficiency in 3D data collection are unparalleled, and it is set to transform the way construction sites are monitored and managed.

Stay tuned to our news, newsletter, and social media channels (LinkedIn, Twitter and YouTube) to follow our journey toward accelerating the digital transformation of construction!


HumanTech Technologies: 360° ToF camera

At HumanTech, we are excited to introduce a revolutionary innovation in 3D sensing technology: A novel portable omnidirectional RGB-D device with a 5-meter range in all directions. This cutting-edge scanner stands out from currently available commercial 3D sensing equipment by its ability to quickly capture a 360-degree dense point cloud in a single shot with just one second of exposure time. Its handheld and portable design enables it to capture data in areas that are challenging for conventional scanners, such as cluttered rooms and small spaces, making it an invaluable tool for the comprehensive monitoring and management of construction sites.

We are committed to driving the digitalisation of the construction industry — making it safer, more efficient, greener, and more attractive to a new generation of highly skilled professionals. To this end, we are achieving major breakthroughs in cutting-edge technologies with a human-centred design.

One example is the 360° ToF camera we have developed. Its speed, portability, accuracy, and efficiency in 3D data collection are unparalleled, and it is set to transform the way construction sites are monitored and managed.

Enabling frequent digital twin updates

Hideaki Kanayama, engineer at RICOH, with the 360° ToF camera.

Our main goal with this innovative device is to enable highly frequent digital twin updates and progress monitoring at the construction site, ensuring that professionals have the most recent information at their disposal.

It can also be used to support the creation of a complete digital twin by complementing the extensive scans taken by conventional terrestrial laser scanning and photogrammetry. 3D data from our sensors can also be aligned and integrated with data from other sensors using machine-readable marker detection and additional re-localization based on feature point matching between images obtained from each sensor. This improves the estimated location's accuracy, even in scenarios where some markers are not detected.
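
The core of such an alignment step can be sketched as a standard rigid registration from matched 3D points (the Kabsch/Procrustes method; the matched markers and numeric values below are hypothetical, and this is not RICOH's actual implementation):

```python
# Sketch of rigid alignment between two sensors' coordinate frames from matched 3D points
# (e.g. detected markers or feature correspondences) using the Kabsch/Procrustes method.
import numpy as np

def rigid_align(src, dst):
    """Return rotation R and translation t such that R @ src_i + t ~= dst_i."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)           # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical matched marker positions seen by two different sensors.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_align(src, dst)
print(np.allclose(R, R_true), np.allclose(t, [0.5, -0.2, 1.0]))  # True True
```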

Verifying effectiveness through real-world applications

In order to introduce this device in the construction industry, we are verifying its effectiveness in practical scenarios through experiments at actual construction sites.

Collaboration and transparency are key in HumanTech. That's why our partners RICOH and DFKI have published the unique RGB-D dataset collected by this scanner in real buildings. This dataset, which consists of spherical RGB-D images with instance-level semantic and room layout annotations, is available for the research community to explore.

We expect our RGB-D scanner and this dataset to stimulate the development of novel algorithms that bridge the gap between research and practical applications in the workplace.

The ToF-360 dataset enables the development of scene understanding tasks based on single-shot reconstruction without the need for global alignment in indoor spaces.

The prototype of the second version of the device has been completed and has been fully operational since April 2024. This version significantly improves accuracy, extends the measurement range, and shortens the shooting time. Further development is underway to achieve continuous dynamic updating and progress monitoring of the digital twin model.

As we continue to develop our technologies and expand the opportunities they offer, we look forward to seeing their impact on the building sector!


Meet our colleague Hideaki Kanayama, an engineer at RICOH — whose team at HumanTech is dedicated to developing the 360° ToF camera, among other technologies — and who provided the information for this article.

For more details on our developments, stay tuned to our news updates, newsletter, and social media channels (LinkedIn, Twitter and YouTube) to follow our journey toward accelerating the digital transformation of construction.


Gabor Sziebig, HumanTech hackathon

HumanTech Hackathon: Gabor Sziebig on the benefits of our technologies in construction

How can our HumanTech technologies benefit the construction industry and its workers? Who are the partners behind the robots we tested in our first Robotic Integration Hackathon? Watch this interview with our colleague Gabor Sziebig, Research Manager in Robotics and Automation at SINTEF Manufacturing, to learn more!

In our first Robotic Integration Hackathon, which we held at our partner ACCIONA's facilities in Madrid, we took the chance to interview some of our colleagues involved, who provided insights into their work developing the technologies we tested — a mobile robot to help construction workers build walls by handing over bricks and a robotic arm to fill concrete joints with elastic material.

Below, watch our interview with Gabor Sziebig, who is leading HumanTech’s Work Package 5, focused on Construction Robotics and Human-Robot Collaboration, to find out more details about the groundbreaking technologies we are developing and get a glimpse of how we are advancing digitalisation, automation, safety and efficiency in construction.

https://www.youtube.com/watch?v=w_FMM4jCUVk

The HumanTech partners involved

This initiative is a great example of the collaborative work that allows us to advance towards our goals at HumanTech, as six different partner organisations were involved: Baubot, which developed the mobile robot; Tecnalia, which created the control software that acts as the robot's brain, with strong support from SINTEF, the University of Kaiserslautern-Landau (RPTU) and the German Research Center for Artificial Intelligence (DFKI); and ACCIONA, on whose construction site we carried out this hackathon.


In the following weeks, we will publish the rest of the interviews we conducted. Stay tuned to learn more about our innovative technologies! Follow our news and social channels (LinkedIn, Twitter and YouTube - where you can find the HumanTech Hackathon playlist) and subscribe to our newsletter.


HumanTech_WP1 – Overall Framework Definitions

Discover the HumanTech work: Overall Framework Definitions

Welcome to the fourth edition of our ‘DISCOVER THE HUMANTECH WORK’ blogs, a series in which we discuss the HumanTech work packages (WPs) with our WP Leaders!

On this occasion, we dive into our WP1 — Overall Framework Definitions, led by DFKI. It is dedicated to setting the overall project framework in terms of elicitation of concepts, processes and information flows, detailed definition of usage scenarios, design of the overall framework architecture, technical specifications and user requirements, and ethics approach.

We have spoken with Jason Rambach, HumanTech’s Project Coordinator and Senior Researcher and Team Leader in Spatial Sensing and Machine Perception at the German Research Center for Artificial Intelligence (DFKI). He has shared the main activities carried out in the first project period, how this WP is linked to others, and our plans for the remainder of the project.

WP1 had a critical role early in the project: to establish a common ground and a shared understanding of the overall project objectives among the partners coming from different backgrounds (AI, robotics, civil engineering).

Therefore, significant activities such as in-person workshops during meetings and several online meetings were carried out in the first months. This led to the conclusion of the first phase of the WP, with three important deliverables that defined research requirements, a component-level architecture, and user requirements.

This work gave a significant boost to the technical WPs, mainly WP3, WP4 and WP5. A very important contribution was the definition of use cases and flow diagrams for the pilots, which proved highly valuable in the following months.

What remains to be done for WP1 is finalizing the HumanTech architectures and describing them in deliverable D1.4. For this, the task leader Hypercliq has been monitoring all technical work packages closely over the last months and updating the architecture accordingly.

The final architecture will be presented to the partners at the upcoming Executive Board Meeting in Genoa.


Take a look at our progress on WP8, focused on dissemination and communication and presented by Giulia Pastor and Andrea Torres from AUSTRALO, and stay tuned to our news and social media (LinkedIn and Twitter) to stay up to date!


Australo in HumanTech's 1st hackathon

Discover the HumanTech work: Dissemination, communication and exploitation

Welcome to the third edition of our ‘DISCOVER THE HUMANTECH WORK’ blogs, a series in which we discuss the HumanTech work packages (WPs) with our WP Leaders!

Let’s continue this series with WP8 — focused on dissemination and communication. Giulia Pastor and Andrea Torres from AUSTRALO will present key insights on the work done and the most exciting achievements so far.

WP8 is the work package responsible for the project’s dissemination, communication, and exploitation activities. It is led by AUSTRALO, with DFKI in charge of exploitation and standardisation and Hypercliq handling IPR activities.

The whole HumanTech team has invested a lot of effort in the first two years of activities to ensure that the exciting work carried out by our technical partners is well-promoted and all the HumanTech stakeholders are aware of what is happening in the project.

In addition, a significant part of WP8’s job is to MAKE THE PROJECT RESULTS OPEN to everyone interested, in line with the open science principles.

To achieve this important objective, our technical partners have published six open-access scientific publications, which can be consulted and downloaded on our HumanTech website.

They also participated in 24 events, workshops, conferences, and media appearances and won 3 prestigious awards:

  • 2 first place awards in the Object Pose Estimation Challenge (BOP Challenge, ECCV 2022) | Partner: DFKI
  • 3 first place awards at the Object Pose Estimation Challenge (BOP Challenge, ICCV 2023) | Partner: DFKI
  • 3rd place in the CV4AEC workshop's Scan-to-BIM challenge at CVPR 2023 | Partners: DFKI + RPTU

In HumanTech, communication and dissemination are the keys to the project’s success.

HumanTech - dissemination, communication and exploitation

All the key scientific stakeholders must be aware of the project activities; however, we must keep in mind that HumanTech stands for Human-centered technologies for a safer and greener construction industry, and its main aim is to provide new technologies that make this sector safer for workers and greener.

With this in mind, we created different blog posts and social media campaigns, highlighting, on one side, the project's technical achievements and, on the other side, the benefits that the new technologies we are developing will bring to our end users, the workers, and the construction companies.

In the NEWS section of our website, you can find insightful articles on our technologies and their benefits, together with articles summarising the first-year achievements and progress updates before our first review meeting.

To explain different aspects of HumanTech — from our pilots to our learning materials to upskill construction workers — in a clear, didactic and engaging way, we have conducted several video interviews with different team members. We have also produced videos and animations to showcase different project updates, demos, learning pills and activities within our Tech4EUconstruction cluster.

In WP8, we believe that our colleagues, the multidisciplinary professionals from 21 organisations in 10 countries who make up the HumanTech team, are best placed to explain the work they are developing. This is why we have given them space to present themselves, their background, skills, vision of the project and the importance of their work for the construction industry in the series of interviews: ‘Meet the HumanTech team’.

In parallel, we launched a new series, 'Unlocking the Future of Research', in which researchers, PhD students and junior staff working on the project explain what it means for a young professional and/or researcher to work on an EU project and what this can do for their future careers.

In HumanTech, we value our community.

In June 2023, we joined forces with our sister projects BEEYONDERS and RoBétArmé to create the collaborative cluster Tech4EUconstruction. Funded by the European Commission under the call HORIZON-CL4-2021-TWIN-TRANSITION-01-12, the three projects aim to develop and demonstrate new technologies to further digitalise and automate the European building sector, increasing its safety and attractiveness for workers. Furthermore, the cluster seeks to stimulate the EU’s sovereignty in the industry, decreasing the need for technological imports.

The main objective of our Tech4EUconstruction cluster is to share knowledge between the projects on different aspects:

  • Mutual exchange of technical expertise and project innovations
  • Joint communication campaigns to raise awareness of the cluster
  • Sharing of knowledge and plans on exploitation actions
  • Mutual promotion of the projects’ principal activities and achievements
  • Co-organisation of events, workshops, panels, etc.

In less than one year, we organised two successful workshops at the European Robotics Forum 2023 and 2024. We also launched a new campaign called Words of Innovation, in which experts from our projects delved into essential and innovative aspects of their work by defining simple keywords. They briefly explained the technologies and strategies they are developing to address the challenges facing today’s European construction industry.

In addition, we invited other EU projects working in the same field to join our cluster: we can proudly say that now we have eight EU-funded projects onboard!

Looking ahead, the Tech4EUConstruction cluster will organise new social media campaigns (stay tuned: soon, we will launch an insightful campaign related to scientific publications), participate in joint workshops during major EU and international conferences and events (next step: Sustainable Places 2024), and invite recently funded EU projects working on the same topics.

And this is just the beginning!

Beyond dissemination and communication.

WP8 is not just the work package dedicated to dissemination and communication; it also covers the project's exploitation, standardisation, and IPR activities. Thanks to the expertise of our exploitation and IPR managers, Jason Rambach and George Kartsounis, and the great service offered by the Horizon Results Booster, we analysed three Key Exploitable Results in depth (KER1: Intention-controlled exoskeleton, KER2: Scan2BIM software and KER3: Spherical ToF camera) and prepared the first exploitation plan for the project. In addition, we established a shared understanding and agreement regarding IPR among the partners involved in developing the KERs.


Take a look at our progress on WP6, Human Factors – Training, Usability and Assessment, presented by Gloria Callinan, Project Support Officer at the Technological University of the Shannon, and stay tuned to our news and social media (LinkedIn and Twitter) to stay up to date!


HumanTech_Human factors – Training, usability and assessment

Discover the HumanTech work: Human factors – Training, usability and assessment

Welcome to the second episode of the 'DISCOVER THE HUMANTECH WORK' blogs, a series where we dive into the HumanTech work packages (WPs) with our WP Leaders!

Let’s discover WP6, Human Factors – Training, Usability and Assessment, presented by Gloria Callinan, Project Support Officer at the Technological University of the Shannon, Development Unit Thurles.

This WP is composed of four tasks led by a team of incredible experts:

  • T6.1 Micro-learning units development and coordination – led by TUS
  • T6.2 Subjective and objective assessment of worker's technology acceptance – led by TECNALIA
  • T6.3 Workflow capturing and extended reality (XR) training – led by DFKI and
  • T6.4 Wearables safety, gender and ethics considerations – led by BAUA

Let’s dive into the work behind this WP, its impact, its liaisons with the other WPs and what is next.

The importance of WP6 – what has been done in this WP.

The construction sector is among the least digitalised and thus offers significant potential to improve the efficiency of construction processes and building operations and enhance health and safety on construction sites. The digital transformation in the construction sector will require workforce upskilling and reskilling.

Training and assessments will focus on the following thematic areas:

  • Technologies supporting workers' safety and well-being in future digital construction
  • Human-robot collaboration in construction automation

Work Package 6 is where technology and human factors meet in the HumanTech project. In response to this challenge, WP6 has carried out two main activities supported by all WP6 task leaders and project partners.

1. Creation of new training materials for use by training centres, to be delivered to craft trades and apprentices, and by higher education institutes to upskill construction professionals. Chambers and registered bodies can also access the freely available materials for continuous professional development. Three modules of HumanTech training material have already been developed: HumanTech and Digitalisation (Module 1), Green Technology in Construction (Module 2), and BIM Fundamentals (Module 3). They are available on Zenodo.

Another 9 modules are under development with support from the HumanTech pilots, covering 360-degree cameras and their mounting on robots and drones, robotics in construction, UAVs and UGVs in construction, digital twins, exoskeletons, buildingSMART Data Dictionaries and other topics.

2. Subjective and objective assessment of worker's technology acceptance. The use of advanced technology such as exoskeletons, smart glasses and wearable sensors can have a huge impact on the behaviour of the worker. Similarly, the use of robots on construction sites with human interaction is a major challenge. Although technologies are designed to support workers, they can have the opposite effect in the work environment, especially when different technologies are combined. Workers may feel monitored, restricted in their movements or stressed by an information overload. In the context of a good work environment, special consideration should be given to the needs of workers.

In this task, led by partners Tecnalia and BAuA, workers and apprentices in Spain and Ireland participated in focus groups moderated by a HumanTech partner. Technologies were presented and outlined in a face-to-face environment, and surveys were then completed on workers’ perspectives.

One online workshop, moderated by the European Builders Confederation, was held with French female construction workers. Workshop participants were assured that data from these objective measures were anonymised and would be used only to evaluate the technological solutions, not any given worker’s performance.

Participants were recruited by Acciona at Alicante and Zubieta, in Spain, among the stakeholders working on the construction site (workers and supervisors). A total of 22 participants took part in this first physical workshop, organised in Acciona premises. Another 27 participants took part in the second physical workshop, organised in the Acciona offices at another construction site. The Irish workshop took place in Limerick, Ireland, and was delivered by TUS to apprentice electricians and carpenters at the Raheen Training Centre campus of the Limerick Clare Education and Training Board. It was attended by 26 construction apprentice participants and 4 tutors.

The broad results indicate higher reluctance towards interactive robots than towards exoskeletons and XR glasses, mainly due to perceived low manoeuvrability, physical rigidity and sedateness. The results are available in deliverable D6.3 – HT Worker Assessment Report.

In HumanTech, we are not working in a silo – let’s discover how this WP is linked to the others.

WP6 is highly dependent on progress in other work packages, on the evolution of technologies such as exoskeletons, smart glasses and wearable sensors, and on the progress of the pilots, whose work will be used for the final assessment due in 2025. Subjective assessment of the human-factor-related aspects was performed using scientifically validated questionnaires for lab and field research. Objective evaluation is due later in the project; it will assess user acceptance by measuring users' psychophysiological signals such as EEG (electroencephalogram), GSR (galvanic skin response) and BVP (blood volume pulse).

For example, the WP1 user requirements and architectural definitions are relevant and fundamental for human factors in both training and assessment. The BIMxD platform of WP2, the hyperspectral material scanning of WP3, the body sensor network of WP4, the demolition task planning and mobile platform of WP5, and the pilots of WP7 are all relevant and significant to WP6. Finally, the communications foundation from WP8 is vital for the dissemination of the WP6 training materials.

The feedback from workers and apprentices will help inform the work of the technology partners, and the provision of training from the HumanTech project will mean greater exploitation and sustainability of the HumanTech approach.

The impact and the benefits of WP6.

Two aspects of our work are likely to have the most impact.

  1. The delivery of 12 bespoke micro-learning units, which are the culmination of HumanTech's work and unique to its results, will upskill a range of construction sector actors across VET, higher education, and continuous professional development. The target is upskilling 200 trainees and 20 tutors.
  2. Worker assessment will have a significant impact on workers' attitudes toward technology. Open questions for each technology included, for instance, the participants’ beliefs about how their working tasks will change when using the technology, the expected benefits and problems (each in the short and long term), and the most important resources needed for a successful implementation. Participants stated that they see positive effects on the reduction of physical strain, musculoskeletal injuries, and disorders. For XR glasses, participants named the specific benefits of worker training, learning, and skill development. Furthermore, they see benefits in the specific application of XR glasses for prototyping as well as in the design and planning phase, also through visualisation of future on-site activities.

WHAT IS NEXT

The two main activities planned for WP6 (although there are many more) are the completion of the remaining 9 micro-learning units as modules to be delivered by training centres and higher education institutes, and the final assessment, in which technology developed in HumanTech will again be tested on workers in Spain and apprentices in Ireland. 200 participants will benefit as pilots from the HumanTech training, and 20 educators will be upskilled in its delivery.

Planning for the next assessments includes:

  • Organisation of additional workshops in other countries to go deeper into the subjective analysis (and thus complement what has been done in T6.2).
  • Performance of an objective assessment by evaluating physiological sensor data collected in dedicated training sessions. To do so, the technological developments of HumanTech wearables and interactive and collaborative robot systems developed in WP4 and WP5 will have to be mature enough to be tested in pilots in WP7.

Take a look at our progress on WP4, wearable technologies for construction, led by Bruno Walter Mirbach, Senior Researcher at the Department of Augmented Vision at DFKI, and stay tuned to our news and social media (LinkedIn and Twitter) to stay up to date!


HumanTech's 3rd Executive Board Meeting in Genova

We held our third Executive Board Meeting on 20 and 21 March, hosted by STAM in Genova. Over two days, we reviewed our progress over the last period, conducted workshops and demonstrations of our pilots and technologies, and established our next steps to continue advancing towards our goals.

The meeting kicked off with a warm welcome and an introduction by our coordinator, Jason Rambach (DFKI), providing updates on the project's status. Work Package presentations followed, covering various aspects such as BIMxD Formats and Standardization, Dynamic Semantic Twin Generation, Wearable Technologies for Construction, and Construction Robotics and Human-Robot Collaboration.

We conducted pilot workshops, focusing on Dynamic Semantic Digital Twin and Bridge Inspection and Monitoring, among others. The first day concluded with discussions on Outreach, Exploitation, and Collaboration and capped off with a fascinating XR BIM visualization demo by Holo-Light.

The second day began with a presentation on Human Factors: Training and Usability Assessment. Pilot sessions continued, featuring Human-Robot Collaboration and Wearables, Remote-Controlled Demolition, and Robotic Mastic Application. Jason Rambach concluded the day with closing remarks, wrapping up two days of productive discussions and workshops.

Overall, the meeting served as a valuable platform for collaboration, knowledge sharing, and strategic planning, driving us closer to our collective goals. We're grateful for the dedication and contributions of all participants, and we look forward to continuing our journey towards innovation and excellence in the construction industry.

Many thanks to our partners STAM, Francesca Canale and Stefano Ellero, for organising it!


Discover the HumanTech work: Wearable technologies for construction

Welcome to the 'DISCOVER THE HUMANTECH WORK' blogs, a series where we dive into the HumanTech work packages (WPs) with our WP Leaders!

Let’s start this series with WP4, wearable technologies for construction, led by Bruno Walter Mirbach, Senior Researcher at the Department of Augmented Vision at DFKI.

This WP is composed of four tasks:

  • T4.1: Inertial sensors - wearable camera integration, led by sci-Track
  • T4.2: Intention prediction and exoskeleton integration, led by Tecnalia
  • T4.3: Wearable camera digital twin localisation, led by DFKI and
  • T4.4: XR-glass integration and BIMxD visualisation framework, led by HoloLight

In this blog, Bruno presents the work carried out in the first half of the project, the benefits and the impact of the work done, its liaison with the other WPs and what is next!

WHAT WP4 ACHIEVED

Work Package 4 focuses on the development of wearable technologies for construction. These technologies will support workers in physical and stressful situations to enhance their efficiency, safety, and satisfaction. Moreover, extended reality (XR) glasses and deep-learning methods applied to body cameras shall increase the usability and effectiveness of digitalisation methods in construction.

A key achievement in the first half of the project has been the development of a visual-inertial body sensor network (BSN). Combining a network of Inertial Measurement Units (IMUs) with camera-based visual tracking allows for robust pose tracking of the worker. We will use this as input to an artificial intelligence (AI) algorithm that detects workers' intentions and automatically activates an exoskeleton when needed.

The integration of the IMU network with the camera system has been achieved by the HumanTech partner Sci-Track, with support from RICOH, which has developed an entirely new prototype of a dual-fisheye 360° camera for integration into the visual BSN. In parallel, Tecnalia has built a test bench with IMU and physiological sensors to determine the actions in which workers need support and to develop an intention recognition algorithm for exoskeleton activation.

HoloLight has integrated the visualisation of a BIM model in the HoloLens into its software framework. Both HoloLight and DFKI, in collaboration with sci-track, are moreover working on a camera-based localisation of a worker with respect to a BIM model. We can use these functionalities for an augmented reality (AR) visualisation of progress monitoring or issue investigation through a BIM model.

Hand menu, XR BIM visualisation of Hololight's office BIM model.

In HumanTech, we are not working in a silo – let’s discover how this WP is linked to the others, the impact and the benefits of this WP.

We will demonstrate the exoskeleton with automatic activation function in a use case of Pilot 2: human-robot collaboration and wearables. Therefore, from the start of the project, there has been a close collaboration with the pilot owner and other involved partners — in particular, from WP5: construction robotics and human-robot collaboration — to define use cases for the exoskeleton.

WP4 is also very closely linked to WP2 and WP3. WP2 provides the BIMxD backend, the communication and processing platform from which the BSN and the XR glasses receive content from the BIM model generated within WP3. We can either visualise this content in the XR glasses or use it to determine the camera's pose with respect to the BIM model. Therefore, we have organised multiple workshops between WP4, WP2, and WP3 to define extensions of BIM standards as well as interfaces and services of the BIMxD backend.

WP3 partners have strongly supported WP4 partners in scanning lab space for generating BIM models. On the other hand, the development of camera localisation and the augmented visualisation of BIM content make it possible to showcase the benefits of the methods developed in WP3 (dynamic semantic digital twin generation), thus providing added value.

Moreover, RICOH's development of the novel 360° camera brought an unforeseen benefit to the project. Initially designed for integration into the BSN, the camera turned out to be an ideal environment perception sensor for the robotic platform we developed in WP5, into which we will now integrate it.

The exoskeleton with an automatic activation function is one of the project's key exploitable results because it overcomes the downside of existing exoskeletons: the need for manual activation or the impediment of natural motion. With the exoskeleton's so-called transparency, the range of applications will increase, as will the acceptance of exoskeletons in construction.

Exoskeleton for construction environment in Tecnalia's lab.

Computer vision and deep-learning methods, such as the automatic localisation of a camera with respect to a BIM model and the XR visualisation, can strongly support the demonstration and exploitation of the methods developed in WP3.

WHAT IS NEXT

The final step for the development of the exoskeleton with automatic activation function is the hardware and software integration of the visual BSN with the exoskeleton, the optimization of the pose tracking and intention recognition algorithms, and the system validation. The final system will be demonstrated within a use case of Pilot 2.

An ongoing task is the research on methods to accurately localise the pose of a camera with respect to a BIM model and to implement the communication between the body sensors and the BIMxD backend.


In the coming weeks, we will continue to share our progress on our different work packages — stay tuned to our news and social media (LinkedIn and Twitter) to stay up to date!


Tech4EUconstruction cluster at the European Robotics Forum 2024

Our second workshop at the European Robotics Forum was a great success! It allowed us to share our knowledge on the benefits of AI and robotics in achieving automation in construction — based on our research and development activities — with more than 100 professionals in the AI and robotics area.

We organised a workshop on “AI and Robotics in Construction” in collaboration with our sister projects, the EU-funded BEEYONDERS and RoBétArmé, for the second year in a row, which attracted a huge audience.

Our project coordinator, Jason Rambach (DFKI), introduced HumanTech, highlighting the groundbreaking technologies we are developing to revolutionise the construction sector.

Antonio Alonso Cepeda (ACCIONA) and Dimitris Giakoumis (CERTH-ITI) also introduced our sister projects, with which we created the Tech4EUconstruction cluster.

Watch the interview we did with them, in which they reflect on the successful outcome of our workshop, the importance of the cluster for our projects and the opportunity it offers us to collaborate and advance AI and robotics in construction.

https://www.youtube.com/watch?v=SBN9XnfpyUg&list=PLLFSbOsrWIuaKCMQwGqsuvQ3zEEhhalnV

During the workshop, Gabor Sziebig (SINTEF), Renaud Detry (KU Leuven) and Maria Teresa Lázaro (ITAINNOVA) addressed key aspects of AI and robotics necessary for introducing robotic automation in construction sites, focusing on robot vision, navigation, control and HRI.

Patricia Helen Rosen (BAuA) shared insights from the first end-user evaluation we developed in HumanTech.

Our colleagues Gabor Sziebig and Sascha Wischniewski (BAuA), together with Herman Bruyninckx (KU Leuven), Jose Carlos Jimenez (TECNALIA) and Alberto Landini (STAM), held an insightful panel on challenges and lessons learned within our projects.

To conclude, we had an interactive feedback round with the attendees.

Thanks to all who joined us!

From left to right, Sara Sillaurren (TECNALIA), Patricia Helen Rosen (BAuA), Sascha Wischniewski (BAuA), Maria Teresa Lázaro (ITAINNOVA), Jose Carlos Jimenez (TECNALIA), Alberto Landini (STAM), Jason Rambach (DFKI), Dimitris Giakoumis (CERTH-ITI) and Antonio Alonso Cepeda (ACCIONA).

Want to keep updated with the latest news from HumanTech and our Tech4EUConstruction cluster? Subscribe to our newsletter.


Sign up to our newsletter

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement N° 101058236.

Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Union’s Horizon Europe research and innovation programme. Neither the European Union nor the granting authority can be held responsible for them.