Paul-Peter Arslan

About

HCI Researcher · Embodied Interaction · Neurorehabilitation

I don't believe in talent. I believe in conditions: the right environment, the right feedback, the right moment of trust. My work begins from the conviction that anyone can reach any goal if they are given the conditions to grow. That conviction is not abstract for me. It is what drives every system I design.

It started with education. My first project, Second Self, was an augmented mirror that turned the body into a learning interface. It offered modules for sign language, dance, and music, each designed to give learners immediate, embodied feedback. I wanted to show people something they didn't yet know about themselves: that they were capable.

When I decided to pursue a PhD, I built my own research topic around that same idea, using technology to make learning visible. But funding a PhD in France is not straightforward. After contacting over a hundred companies and institutions, I found my way in: not through education, but through rehabilitation. In French, the words tell the story themselves. Éducation and rééducation. One teaches you to move. The other helps you move again. For me, the mission is the same. If I can design an interaction that makes someone believe they can recover, that they can play again, tap again, feel agency in their own body, then that is the most meaningful work I can imagine.

My PhD at IFT lives at this junction. I built ReTouche, an augmented reality piano system where embodied representations guide learners without replacing their judgment. I created Rhythm Karaoke, a timing engine that measures fine motor precision through musical imitation and is now being extended toward stroke recovery. I developed deep learning pipelines that decode continuous finger forces from 224-channel HD-EMG arrays, turning invisible neural signals into something a clinician and a patient can read together.

At the MIT Media Lab, working in Prof. Hiroshi Ishii's Tangible Media Group, I explored how physical objects can restructure the way we think with AI. Tangible Co-Ideation moves prompting out of the chatbox and into the hands, making creative collaboration spatial, tactile, and deeply human.

Across all of this, I hold one belief: somewhere there exists a perfect interaction for each of us, one that meets us exactly where we are and gently shows us what we can become. That is what I want to build. It is why I am an HCI researcher at heart, despite a subject grounded in neuroscience and health. The body is not just data. It is where we experience possibility.

I see artificial intelligence as a partner in this search. Not AI that replaces human effort, but AI that amplifies human perception, models that reveal patterns we cannot see, that adapt in real time to the person in front of them, that make the feedback loop between intention and sensation fast enough to feel like intuition. The future I work toward is one where technology is so attuned to the learner, the patient, the creator, that growth feels less like effort and more like discovery.

Collaborations & Affiliations

MIT Media Lab
Sorbonne University
Institute for Future Technologies (IFT)
Institute of Psychiatry and Neuroscience of Paris | INSERM IPNP
Paris Brain Institute research teams
KTH Royal Institute of Technology, Sweden
University of Oxford