
Meet the HumanTech team: Carina Pamminger, driven by infinite curiosity

Meet the inspiring Carina Pamminger, Chief Project Manager at Holo-Light, where she leads the team working on HumanTech. Holo-Light aims to unleash the potential of augmented and virtual reality in the enterprise market.

"HumanTech is as exciting as it is groundbreaking. We are honoured to be part of a group of experts who will push the bar in construction to make it safer and greener through cutting-edge technologies."

Carina Pamminger, Chief Project Manager at Holo-Light

Q: Carina, what motivates you most, both personally and professionally?

A: My biggest driver is curiosity. I always want to know more, walk an unfamiliar path, come up with new hypotheses and test their validity, and have engaging conversations with all kinds of people. Besides that, I enjoy sharing the knowledge I gain to keep information flowing.

Q: How great! And what about your organisation, Holo-Light? What does it specialise in, and what is its focus?

A: Holo-Light is a company specialising in immersive technologies for enterprises. In particular, we aim to unleash the full potential of augmented and virtual reality (extended reality) in the enterprise market.

By creating a business-ready industrial metaverse, we are closing the gap between the virtual and the real world — empowering people to seamlessly create, build and operate in the digital space.

Q: A fascinating world full of possibilities. Now, let's talk about your role at HumanTech. What do you do, and what is the most rewarding thing about it?

A: I have the pleasure of leading our team for HumanTech and managing the project. The most rewarding aspect is observing an idea come to fruition and seeing my team grow professionally and personally.

Q: Sounds great! Specifically, what actions are you working on that you are most excited about?

A: Defining the requirements for the applications we will develop for HumanTech. This part of the project is great, as it is a constant reminder that everything, big or small, once started in someone's mind before it was taken to others, improved, merged, reshaped and then materialised, tested, and deployed.

Secondly, after months of watching our team collaborate with the rest of the HumanTech partners, organising the pilot executions, where all assumptions are put to the test. From Holo-Light’s side, it is a great honour!

"One of the tasks I'm most excited about is defining the requirements for the applications we will develop for HumanTech. It is a constant reminder that everything, big or small, once started in someone's mind before it was taken to others, improved, merged, reshaped and then materialised, tested, and deployed."

Q: What positive impact do you hope to generate through this project?

A: We hope to push the boundaries of construction work, making it safer for workers and more efficient and sustainable. Augmented reality (AR) is a very powerful technology that can help solve long-standing problems in this sector. I want to use all our accumulated experience and knowledge to take this project as far as possible.

"We hope to push the boundaries of construction work, making it safer for workers and more efficient and sustainable."

Q: Let's hope so! Although much remains to be done, we are getting closer and closer to creating a better building industry for people and planet. What specific milestones do you hope to achieve?

A: We will create an easy-to-use AR application that can be streamed, so it is not subject to device hardware limitations. Further milestones in the project are the optimised tracking of QR codes and the clear visualisation of the relevant data set for a construction site.

Q: We have discussed the need to create a more sustainable construction industry. Do you think there is a net zero future for it?

A: I personally think that net zero in construction will be difficult to achieve. That said, optimising construction is essential. Streamlining processes and merging 2D planning with 3D visualisation on site to reduce waste, unnecessary transport and material, and related emissions are just examples of what new technologies allow us to achieve.

Q: What are the biggest challenges to achieving this optimisation? And the keys?

A: Even if a construction site runs entirely on green energy, it is almost impossible to guarantee that all the tools, materials, etc., that have been used have been produced similarly. Typically, materials have to be transported from all over the world, which increases the carbon footprint even before construction starts.

The key is awareness, collaboration, trust in new technologies, continuous optimisation paired with adaptability and, most importantly, accountability.

"The key to improving construction towards a safer and greener industry is awareness, collaboration, trust in new technologies, continuous optimisation paired with adaptability and, most importantly, accountability."

Q: To end, what other projects do you know of that have contributed to the industry's transformation?

A: IntellIoT has extended AR to agriculture, enabling augmented remote operation of an intelligent agricultural fleet. This could also be used in construction, for example, by securely controlling construction vehicles or machinery via a digital twin.

UCARe4Citizen is another European project in which simulation methods enhanced through AR visualisations help governments in their urban planning tasks, allowing better decision-making.


Learn more about our work at HumanTech and the team behind it.

Meet Gloria Callinan, Project Support Officer at the Technological University of the Shannon (TUS), Florendia Fourli, CEO and Managing Director of Hypercliq, Francesca Canale, Project Engineer at STAM, and Patricia Rosen, Researcher at the German Federal Institute for Occupational Safety and Health (BAuA).

Subscribe to our newsletter and follow us on LinkedIn and Twitter to stay updated on our advances!



Meet the HumanTech team: Patricia Rosen, studying how technology affects us to promote healthier workplaces

How do digitalisation and the interaction with new technologies at work affect us? How can we create healthy workplaces that keep pace with the latest technological developments? Our colleague Patricia Rosen, a psychologist fascinated by the human mind, is working on this at the German Federal Institute for Occupational Safety and Health (BAuA). In particular, she is a researcher within the "Human Factors, Ergonomics" group, where she is leading the "Physical worker assistance systems" team.

"I find it motivating to analyse and investigate the effects of new and emerging technologies on the individual and the overall workforce, and thus to be able to promote healthy workplaces."

Together with her interdisciplinary team at BAuA, she is focused on defining the user requirements of the technologies we are developing at HumanTech, relying on feedback from the construction workers for whom they are intended.

Q: Patricia, what are you passionate about, and what motivates you most about your work?

A: As a psychologist, learning and understanding how humans behave, think, feel, experience and interact with others as well as their environment has always fascinated me.

Patricia Rosen, researcher at the German Federal Institute for Occupational Safety and Health (BAuA)

In our modern world of work, digitalisation and interaction with new technologies play an important role. I find it motivating to analyse and investigate the effects of new and emerging technologies on the individual and the overall workforce, and thus to be able to promote healthy workplaces.

I greatly value being able to participate, professionally and personally, in the technological advances affecting how we work and live. I think it is important to accompany these effects early on, especially from a human-centred perspective.

"I greatly value being able to participate in the technological advances affecting how we work and live, and I think it is important to accompany these effects early on, especially from a human-centred perspective."

Q: It seems an exciting and necessary field! Even more so as we incorporate more technological advances into our work. What exactly does your organisation, BAuA, do?

A: We promote occupational safety and health (OSH) and human-centred work design. We aim to detect opportunities and risks for employees at an early stage. To this end, we develop approaches for appropriate, targeted OSH measures and ensure that safety and health concerns are considered from the very beginning when technological and organisational innovations are introduced.

Q: And, in particular, what is your role in HumanTech?

A: Within the HumanTech project, the role of my team at BAuA is to define user requirements for all of the HumanTech technologies. Furthermore, we aim to evaluate the different technologies within the specific use cases and pilots from a worker perspective.

For us, close collaboration with the use cases, the technology developers and the implementers within HumanTech is essential. Within our evaluation, we consider, for example, the interaction quality between a worker and a specific technology, also addressing the requirements in relation to construction work.

"Within HumanTech, the role of my team at BAuA is to define user requirements for all of the HumanTech technologies and to evaluate them within the specific use cases and pilots from a worker perspective."

Q: An essential part of our project! What do you like most about your work on it, and what are you most looking forward to?

A: I always enjoy working in interdisciplinary teams, which is the case within HumanTech. I am especially looking forward to having the specific set-ups in our pilots and seeing how our innovative technologies can support human workers. We also plan to get direct feedback from the workers. This is always invaluable information to me, as they are the real experts on their specific tasks or jobs and any associated changes.

Q: How do you think you can make a positive impact through HumanTech?

A: By emphasising the human factors perspective. Within this project, we focus not only on one emerging technology but on different worker assistance systems such as collaborative robotics, smart glasses and exoskeletons. The variety of technologies allows us, on the one hand, to combine existing knowledge on each technology and, on the other hand, to broaden the scope when using multiple systems in a challenging environment like the construction industry.

Q: To conclude, what is the most important milestone you hope to achieve?

A: With the help of the HumanTech team, we hope to be able to provide a comprehensive human factors perspective on the HumanTech technologies in their specific pilot applications.

"With the help of the HumanTech team, we hope to be able to provide a comprehensive human factors perspective on the HumanTech technologies in their specific pilot applications."


Learn more about our work at HumanTech and the team behind it.

Meet Gloria Callinan, Project Support Officer at the Technological University of the Shannon (TUS), Florendia Fourli, CEO and Managing Director of Hypercliq, and Francesca Canale, Project Engineer at STAM.

Stay updated on our advances by subscribing to our newsletter and following us on LinkedIn and Twitter.


Using motion capture technology to reduce workplace injuries in construction

At HumanTech, one of our main goals is to reduce workplace injuries by 30% in the construction industry. Our partners at SINTEF are building on the motion-capture technology developed by Sci-Track to improve construction workers' safety and well-being and provide services for human-robot collaboration.

This technology can be used to both obtain information about the workers on a construction site and provide them with helpful information for executing their work. For example, it can predict whether a person is at risk of injury and inform them where they should or should not go to avoid it.
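As a toy illustration of how captured posture data could feed such a warning, consider a rule that flags risky postures from trunk flexion and carried load. The function, thresholds and risk categories below are hypothetical and for illustration only; real ergonomic assessments use validated methods (e.g. RULA or REBA) over full-body kinematics:

```python
def posture_risk(trunk_flexion_deg, load_kg):
    """Toy ergonomic risk flag derived from motion-capture output.

    trunk_flexion_deg: forward bending of the trunk in degrees.
    load_kg: weight currently carried by the worker.
    Thresholds are illustrative, not taken from any validated standard.
    """
    # Deep bending, or moderate bending while carrying a heavy load,
    # is flagged as high risk.
    if trunk_flexion_deg > 60 or (trunk_flexion_deg > 20 and load_kg > 10):
        return "high"
    # Either moderate bending or a heavy load alone is medium risk.
    if trunk_flexion_deg > 20 or load_kg > 10:
        return "medium"
    return "low"
```

A system like the one described could evaluate such rules continuously on the tracked kinematics and warn the worker before an unsafe movement is completed.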

How does this technology work? How are we using it at HumanTech? We have spoken with our partner Markus Miezal, Researcher at Sci-Track, to better understand it.

 

Markus Miezal, Researcher at Sci-Track, is wearing a motion capture suit to show how this innovative technology works in real-time.

Q: Markus, can you explain to us what this technology is and how it works?

A: I'm wearing a suit from the Japanese company Xenoma, which includes 18 inertial measurement units. The sensors measure linear acceleration (including gravity), rotational velocity and the magnetic field (i.e. a 3D compass). We all have such sensors in our smartphones, where they detect screen rotation, for example.

Our technology uses statistical sensor fusion based on a human biomechanical model to estimate the user's kinematics from the measurements.
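Sci-Track's statistical, biomechanical-model-based fusion is not detailed here, but the core idea of combining a drift-prone gyroscope with a noisy absolute reference can be sketched with a minimal complementary filter, a much simpler cousin of such fusion schemes. The function and parameter values below are illustrative, not Sci-Track's implementation:

```python
import math

def complementary_filter(samples, dt=0.01, alpha=0.98):
    """Estimate a pitch angle (radians) by fusing gyro and accelerometer.

    samples: iterable of (gyro_rate, accel_x, accel_z) tuples, where
    gyro_rate is the angular velocity about the pitch axis (rad/s) and
    the accelerometer measures gravity in the sensor frame (m/s^2).
    """
    pitch = 0.0
    for gyro_rate, ax, az in samples:
        # Integrating the gyroscope gives a smooth but drift-prone estimate.
        gyro_pitch = pitch + gyro_rate * dt
        # The gravity direction gives an absolute but noisy pitch reference.
        accel_pitch = math.atan2(ax, az)
        # Blend: trust the gyro at high frequency, the accelerometer long-term.
        pitch = alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
    return pitch
```

With a stationary, tilted sensor the estimate converges to the pitch implied by gravity. Statistical filters of the kind described go further, weighing measurement uncertainties and enforcing the joint constraints of a biomechanical model across all 18 sensors.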

Q: How interesting! For what purpose are you developing it? And how will it contribute to HumanTech's objectives?

A: We will use it to track construction workers and provide the kinematics to the exoskeleton from our partners at Tecnalia so that they can identify the user's intention and control the exoskeleton accordingly.

Another option yet to be explored is to also use it in the delivery tasks of the bricklayer and the robot. We will integrate it with the Theta 360° camera from our partners at Ricoh to provide visual-inertial tracking of the human body.

"We will use this technology to track construction workers and provide the kinematics to the exoskeleton from our partners at Tecnalia so that they can identify the user's intention and control the exoskeleton accordingly."

Q: Who will be able to use it, and could you give us an example of a practical, real-life case where it could be used?

A: Due to the shortage of sensors on the market, we will only be able to create a few suits for HumanTech. Apart from the construction sector, our product has been used in rehabilitation and for the ergonomic analysis of factory workers. Motion capture is also a well-known application.

Q: What stage of development are you at, and what are your next steps?

A: We have a working product, but our extensions, in particular the integration of the camera, are planned for August. We are building on existing publications in this field and are currently working on the device calibration process.


Want to keep up to date with our next steps? Subscribe to our newsletter and follow us on LinkedIn and Twitter!


Next steps at HumanTech: Our plans for 2023

It is February 2023, and we are in the ninth month of our project to transform construction into a safer, more inclusive and greener industry.

In December, Jason Rambach, our Project Coordinator, looked back on our first six months of progress and ahead to what comes next:

"In the coming months, we will continue to work on our technical work packages for the generation of digital building twins, construction wearables and robotics. We look forward to making some of our newly acquired building data available to the scientific community", said Jason.

We are also very excited to organise our first workshop at the European Robotics Forum with two other Horizon Europe projects in the construction field, RoBétArmé and BEEYONDERS, on March 15th.

Where are we now in development? What are our plans for this new stage of the project? We spoke to the Work Package leaders that make up HumanTech, and this is what they told us.

Work Package 1: Overall Framework Definition

Jason Rambach, HumanTech Project Coordinator and Senior Researcher and Team Leader in Spatial Sensing and Machine Perception at the German Research Center for Artificial Intelligence (DFKI): "For Work Package 1, we will be happy to present our public deliverable on the "HumanTech Vision and research requirements" very soon. Planned for March 2023, it contains inputs based on the expertise of many of our project partners. It will paint an accurate picture of the current state of digital twins, wearables and robotics in construction, as well as our plans for advancing the state of the art in HumanTech. Additionally, after several long discussions, we have established a first version of our HumanTech architecture at the system and data level, which we will present in deliverable D1.2."

"The deliverable we are working on will paint an accurate picture of the current state of digital twins, wearables and robotics in construction, as well as our plans for advancing the state of the art in HumanTech."

Work Package 2: BIMxD Formats and Standardization

Andrea Giordano, Professor at the Department of Civil, Building and Environmental Engineering - ICEA, Università degli Studi di Padova: "The activities of Work Package 2 centre on standardising HumanTech's activities according to international standards. Our plans for the coming months concern the creation of the platform's main characteristics and the definition of the exchange requirements of the other Work Packages. The standardisation of the process and the data using the buildingSMART Data Dictionary (bSDD) relates to the kind of information that can be reused from the BIM models and the point cloud segmentation to optimise resources. The group members' collaboration will allow us to organise the dedicated platform."

Work Package 3: Dynamic Semantic Digital Twin Generation

Bharath Sankaran, CTO and Co-founder of Naska.AI: "On the one hand, our team at RICOH will continue improving and testing their RGB-D sensor prototype, while ZHAW and Naska.AI develop their Unmanned Aerial Vehicle (UAV) and Unmanned Ground Vehicle (UGV) data capture platforms and work together to integrate the RICOH Theta 360 camera. The platforms and sensors will be tested and evaluated throughout the year on real construction sites provided by our Work Package 3 partners, Acciona and Implenia.

On the other hand, led by DFKI, other partners will work on developing semantic segmentation algorithms, using the data collected with our sensors to train their machine-learning algorithms.

Finally, RPTU (formerly TUK, Technical University of Kaiserslautern) will continue developing algorithms to turn point clouds and semantic information into BIM models."

Work Package 4: Wearable Technologies for Construction

Bruno Mirbach, Senior Researcher at DFKI: "In Work Package 4, we are developing a wearable visual-inertial sensor system. A surround-view camera developed by RICOH will be integrated with the body-sensing system of Sci-Track. The combination of these sensors enables simultaneous monitoring of the environment and localisation of the worker within it, as well as precise and robust 3D tracking of the workers' postures and movements.

The next step in Work Package 4 will be to develop an intention recognition based on this information and to implement an action-dependent automatic activation of an exoskeleton, which supports the worker in specific physically stressful actions."

"The technology we are developing enables simultaneous monitoring of the environment and localisation of the worker within it, as well as precise and robust 3D tracking of the workers' postures and movements."

Work Package 5: Construction Robotics and Human-Robot Collaboration

Gabor Sziebig, Research Manager at SINTEF: "Work Package 5 started on 1st November 2022 and is progressing rapidly. Baubot has started to build the new generation of the Mobile Collaborative Robotic System (called MRS5-1920), while the rest of the partners involved are focusing on the research and development of the parts needed for the robotic platform and the pilots, building up the HumanTech vision.

The Work Package 5 team met on 1st February 2023 in Madrid, Spain, to get hands-on experience with two of our planned pilots (bricklaying and mastic application) and is arranging a Special Session at the 19th IEEE International Conference on Advanced Robotics and Its Social Impacts, which will take place in Berlin, Germany, on June 5th-7th 2023. Meet us there!"

Work Package 6: Human Factors - Training and Usability Assessment

Gloria Callinan, Project Support Officer at the Technological University of the Shannon (TUS): "At TUS, we continue to work with our project partners on curricula for micro-learning units, the first of which should be ready in Q1 2024.

Our partners at Tecnalia are preparing the subjective assessment of the technologies being developed in HumanTech through dedicated workshops with end users, measuring workers' degree of acceptance and how perceived usefulness can improve it. This assessment will be carried out by the workers themselves, designers and researchers.

Our main goal? To hold a workshop with 40 workers (20 male and 20 female). The material for the moderators of the focus groups has been prepared, including self-exploration questionnaires to measure users' objectives and their experience of the interaction. The results will be analysed afterwards and conclusions will be drawn."

Work Package 7: Pilots, Evaluation and Validation

Fabian Kaufmann, Researcher at RPTU: "In Work Package 7, we are leading the use case coordination with Work Packages 2, 3, 4 and 5. As the research tasks are now starting, a lot of coordination is necessary.

As Gabor has mentioned, we visited one of our pilot sites on 1st February in Madrid and were able to check the site conditions, challenges and opportunities for pilot implementation. We are also analysing manual tasks such as bricklaying and mastic application collaboratively with other Work Packages and will continue to schedule and plan our activities for the coming months!"

Work Package 8: Outreach, Exploitation and Collaboration

Giulia Pastor, Project Manager at AUSTRALO: "In Work Package 8, we are coordinating with our sister projects, BEEYONDERS and RoBétArmé, to promote our first workshop, "AI and Robotics in Construction", which will be held on March 15th at the European Robotics Forum (ERF) 2023.

"We are promoting our first workshop, "AI and Robotics in Construction", organised with our sister projects BEEYONDERS and RoBétArmé, which will take place on March 15th at the European Robotics Forum 2023."

Our first technical publication, in which our Project Coordinator, Jason Rambach, has participated, is available: "OPA-3D: Occlusion-Aware Pixel-Wise Aggregation for Monocular 3D Object Detection."

We are continuing our series of interviews with the HumanTech team to learn about their work and share their vision of the positive impact we will generate with this ambitious project. In addition, we are working on a campaign on gender equality in the digital and technological environment on the occasion of the upcoming International Women's Day 2023, whose theme is "DigitALL: Innovation and technology for gender equality".


Want to keep up to date with our next steps? Subscribe to our newsletter and follow us on LinkedIn and Twitter!

If you want to be part of our journey and explore collaboration opportunities with us, we’re all ears!



HumanTech Publication: OPA-3D - Occlusion-Aware Pixel-Wise Aggregation for Monocular 3D Object Detection

Dive into our first scientific publication, "OPA-3D: Occlusion-Aware Pixel-Wise Aggregation for Monocular 3D Object Detection", accepted for publication in IEEE Robotics and Automation Letters (RA-L).

Abstract

Monocular 3D object detection has recently made a significant leap forward thanks to the use of pre-trained depth estimators for pseudo-LiDAR recovery. Yet, such two-stage methods typically suffer from overfitting and are incapable of explicitly encapsulating the geometric relation between depth and object bounding box. To overcome this limitation, we instead propose to jointly estimate dense scene depth with depth-bounding box residuals and object bounding boxes, allowing a two-stream detection of 3D objects that harnesses both geometry and context information. Thereby, the geometry stream combines visible depth and depth-bounding box residuals to recover the object bounding box via explicit occlusion-aware optimization. In addition, a bounding-box-based geometry projection scheme is employed in an effort to enhance distance perception. The second stream, named the Context Stream, directly regresses 3D object location and size. This novel two-stream representation enables us to enforce cross-stream consistency terms, which align the outputs of both streams and further improve the overall performance. Extensive experiments on the public benchmark demonstrate that OPA-3D outperforms state-of-the-art methods on the main Car category, whilst maintaining real-time inference speed.
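To make the cross-stream consistency idea concrete, here is a deliberately simplified sketch: each stream predicts a 3D box, and a penalty on their disagreement is added to the training loss. This is a hypothetical simplification for intuition only, not the paper's actual loss formulation:

```python
def cross_stream_consistency(geometry_box, context_box):
    """L1 disagreement between the two streams' 3D box predictions.

    Each box is (x, y, z, w, h, l): center coordinates and dimensions.
    Hypothetical stand-in for OPA-3D's cross-stream consistency terms.
    """
    return sum(abs(g - c) for g, c in zip(geometry_box, context_box))


def training_loss(detection_loss, geometry_box, context_box, weight=0.1):
    # Penalising disagreement nudges both streams toward consistent outputs,
    # so each stream acts as a regulariser for the other during training.
    return detection_loss + weight * cross_stream_consistency(
        geometry_box, context_box
    )
```

In the paper, both streams are network heads trained end to end; the sketch above only illustrates why aligning their outputs can improve overall performance.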

Authors

Yongzhi Su and Didier Stricker - German Research Center for Artificial Intelligence (DFKI), RPTU Kaiserslautern

Yan Di, Guangyao Zhai, and Benjamin Busam - Technical University of Munich

Fabian Manhardt - Google

Jason Rambach - German Research Center for Artificial Intelligence (DFKI)

Federico Tombari - Technical University of Munich, Google

Keywords

Computer vision for transportation, deep learning for visual perception, object detection.


Subscribe to our newsletter and follow us on LinkedIn and Twitter to stay updated with our upcoming publications!