Meet Markus Miezal, CEO and Co-Founder of Sci-track, a passionate advocate of technology and its potential to improve lives. From a young age, Markus knew he wanted to be in research and development. His driving force is inventing technologies that benefit humanity and drive progress.
“I love technology. Since I was a kid, I knew I wanted to be in research and development and create all those magical devices of our everyday lives.”
At Sci-track, the team focuses on developing self-configurable, easy-to-use, accurate and affordable inertial motion capture solutions. Their goal is to provide a state-of-the-art motion interface that simplifies motion tracking and analysis for a wide range of applications, from patient rehabilitation monitoring to the ergonomic analysis of factory workers.
As part of HumanTech, Markus and his team are responsible for tracking workers on construction sites. By capturing and analysing data, they aim to understand workers’ intentions and improve their safety and efficiency.
Learn more in our interview with him.

Q: Tell us a little about yourself. What drives you? What is your purpose, professionally and personally?
A: I love technology. Since I was a kid, I knew I wanted to be in research and development and create all those magical devices of our everyday lives. My goal was to invent things that benefit humanity and help those who can’t help themselves. In reality, the steps I take are much smaller than I expected. Our motion tracking technology can help patients in rehabilitation to monitor their progress and ensure healthy postures, or we perform ergonomic analysis on factory workers to improve their posture during different tasks. On a greater scale, my technology makes motion (human or animal) available for machines to learn from without needing a big motion lab. People often see inertial tracking as a bridge technology for AI.
Personally, I want to provide for my family. I try to ignite curiosity in my kids and strongly hope they will be able to develop their own idea of altruism.
“Our motion tracking technology can help patients in rehabilitation to monitor their progress and ensure healthy postures, or we perform ergonomic analysis on factory workers to improve their posture during different tasks.”
Q: And about your organisation, Sci-track: what do you focus on, and what have you set out to achieve?
A: We are developing self-configuring inertial motion-capturing solutions that are set to be easy for anyone without exhaustive preparation, calibration or setup. Our solution is meant to be plug & play, requiring no prior technical knowledge. Just as it is easy to use, it is also meant to be accurate enough for clinical use. We are also not bound to any particular hardware and can already work with many hardware suppliers. Our goal is to provide the leading motion interface for the decades to come.
“We are developing self-configuring inertial motion-capturing solutions that are set to be easy for anyone without exhaustive preparation, calibration or setup.”
Q: Now, let us know about your role at HumanTech. What do you do and what is the most rewarding thing about it?
A: Our main task is to track the workers on the construction site. The data will be used to determine their intentions. One of our tasks involves integrating a spherical camera with our tracking system, which adds localisation capabilities and aims to improve self-calibration.
We are excited to see how users react to the system. User acceptance is a major obstacle in any application of this technology. Positive feedback would be really rewarding.
Q: What two activities are you working on in the project that you are most excited about?
A: Our main task involves providing an integrated system of inertial sensors and camera. On the way there, we will develop extensions to our algorithm, allowing us to incorporate intrinsic and extrinsic 3D information. This is not limited to visual sensors; any sensor that provides this kind of information can then be integrated.
Another exciting activity will be the integration with Tecnalia’s exoskeleton and the pilots, since this highlights the value of our data, and we will learn a lot from it.
“Our main task involves providing an integrated system of inertial sensors and camera. On the way there, we will develop extensions to our algorithm, allowing us to incorporate intrinsic and extrinsic 3D information.”
Q: What positive impact do you hope to generate through this project?
A: One goal is simply to become present in people’s minds. My experience is that in many fields, there are existing solutions to known problems, but they are simply unknown. For example, inertial sensors have a bad reputation in industry because they contain magnetometers (essentially a 3D compass), which are easily disturbed by other magnetic fields, e.g., current flowing through wires or nearby metal. In many cases, however, this problem has already been solved, which seems to be little known.
Q: Very ambitious projects are being developed to transform the industry and make it safer and more sustainable. Do you know of any that have inspired you?
A: My first European project was COGNITO, an FP7 project aimed at capturing complex workflows. Back then, we had a primitive version of inertial motion capturing, but we could provide joint angles. At some point, I shared a taxi with a motion scientist from the University of Compiègne, and he came up with the idea of using our data for real-time ergonomic analysis. The resulting paper and the feedback were amazing, and still today, this is one of our main applications. It helps workers maintain a healthy posture while working, improving their quality of life beyond work and into retirement. This was a key moment that demonstrated the possibilities of interdisciplinary work and showed how something very technical could have a positive and lasting effect on people’s lives.
Q: Any final words to end on?
A: Please play heroic epic music to improve the experience while reading this!
Subscribe to our newsletter and follow us on LinkedIn and Twitter to learn more about our work at HumanTech and the team behind it, and stay updated on our advances!
