Wearable Ambient Intelligence · 2026

IPheromone

A wearable social matching system that uses voice interviews, agent-to-agent compatibility checks, scent release, and haptic feedback to help people find meaningful connections.

Hackathon · Wearables · AI Agents · Physical Computing

Winner of the Connect track at MIT Hard Mode: Hardware AI Hackathon

Hero image showing the IPheromone prototype in use

Overview

An AI companion that turns compatibility into scent, rhythm, and proximity.

  • Builds user profiles through a voice interview and maps personality to a scent signature (sketched after this list).
  • Runs agent-to-agent compatibility conversations instead of relying on swipe-based interaction.
  • Combines a web app, Raspberry Pi wearable, and BLE scent bridge into a single social computing system.
  • Won the Connect track at MIT Hard Mode in 2026.
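
The page doesn't spell out how personality becomes scent, but the idea reads as a projection from trait scores onto diffuser channels. A minimal sketch of that mapping, where the trait names, channels, and weights are all hypothetical rather than taken from the project:

```python
# Hypothetical sketch: project a trait vector onto diffuser channel
# intensities. Traits, channels, and weights are illustrative only.

CHANNEL_WEIGHTS = {
    "citrus": {"extraversion": 0.7, "openness": 0.3},
    "woody":  {"conscientiousness": 0.6, "neuroticism": -0.4},
    "floral": {"agreeableness": 0.8, "openness": 0.2},
    "musk":   {"extraversion": 0.4, "neuroticism": 0.3},
}

def scent_signature(profile: dict[str, float]) -> dict[str, float]:
    """Turn trait scores in [0, 1] into per-channel diffuser intensities."""
    signature = {}
    for channel, weights in CHANNEL_WEIGHTS.items():
        raw = sum(w * profile.get(trait, 0.0) for trait, w in weights.items())
        signature[channel] = min(max(raw, 0.0), 1.0)  # clamp to [0, 1]
    return signature

print(scent_signature({"openness": 0.9, "extraversion": 0.6, "agreeableness": 0.7}))
```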

Core interaction

IPheromone imagines a social interface that stays mostly out of sight. Rather than asking people to check a phone, the system interviews them by voice, builds a profile richer than a form could capture, and quietly scans for meaningful matches in the background.
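
The matching itself runs as conversations between agents, not swipes. The page doesn't document the protocol, but the structure is roughly: each user's agent speaks in character from that user's profile, and the exchange is scored. A minimal sketch, where `llm_reply` and `score_transcript` are placeholders for whatever model calls the real project makes:

```python
# Hypothetical sketch of an agent-to-agent compatibility check.

def llm_reply(persona: str, transcript: list[str]) -> str:
    # Placeholder: a real system would call an LLM conditioned on the
    # persona and the conversation so far.
    last = transcript[-1] if transcript else "(opening line)"
    return f"{persona} responds to: {last}"

def run_compatibility_chat(persona_a: str, persona_b: str, turns: int = 6) -> list[str]:
    """Alternate turns between the two agents and return the transcript."""
    transcript: list[str] = []
    for i in range(turns):
        speaker = persona_a if i % 2 == 0 else persona_b
        transcript.append(llm_reply(speaker, transcript))
    return transcript

def score_transcript(transcript: list[str]) -> float:
    # Placeholder heuristic; a real scorer would judge rapport,
    # shared interests, conversational balance, and so on.
    return min(1.0, len(transcript) / 10)

transcript = run_compatibility_chat("curious night owl", "methodical early riser")
print(score_transcript(transcript))
```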

When the system finds a strong match, it signals through scent and a heartbeat-like haptic pattern. The whole interaction is meant to feel ambient, embodied, and low-friction.
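
The haptic half of that signal can be simple hardware-wise. A minimal sketch of a "lub-dub" heartbeat pattern for a vibration motor on a Raspberry Pi GPIO pin, where the pin number and timings are illustrative, not taken from the project:

```python
# Hypothetical lub-dub haptic pattern; stronger matches beat faster.
import time
import RPi.GPIO as GPIO

MOTOR_PIN = 18  # hypothetical GPIO pin driving the motor transistor

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_PIN, GPIO.OUT)

def pulse(duration: float) -> None:
    GPIO.output(MOTOR_PIN, GPIO.HIGH)
    time.sleep(duration)
    GPIO.output(MOTOR_PIN, GPIO.LOW)

def heartbeat(strength: float) -> None:
    """One heartbeat cycle; the rest period shrinks as match strength rises."""
    pulse(0.10)                           # "lub"
    time.sleep(0.12)
    pulse(0.15)                           # "dub"
    time.sleep(max(0.2, 1.0 - strength))  # rest

try:
    for _ in range(5):
        heartbeat(strength=0.8)
finally:
    GPIO.cleanup()
```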

Close-up image of the IPheromone object

System architecture

According to the repository, the project has three main pieces: a Next.js web app for profiles and agent logic, a Raspberry Pi wearable client for voice and physical feedback, and a BLE scent bridge that translates web requests into diffuser commands.
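
The bridge's job is narrow: turn a web request into a BLE write. A minimal sketch using the `bleak` Python BLE client, where the device address, characteristic UUID, and two-byte payload format are all assumptions about the diffuser, not details from the repository:

```python
# Hypothetical BLE scent bridge: write a diffuser command over GATT.
import asyncio
from bleak import BleakClient

DIFFUSER_ADDRESS = "AA:BB:CC:DD:EE:FF"  # hypothetical device address
SCENT_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # hypothetical characteristic

async def release_scent(channel: int, intensity: int) -> None:
    """Translate a web request into a two-byte diffuser command."""
    async with BleakClient(DIFFUSER_ADDRESS) as client:
        await client.write_gatt_char(SCENT_CHAR_UUID, bytes([channel, intensity]))

asyncio.run(release_scent(channel=1, intensity=200))
```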

This split is one of the strongest parts of the project: the intelligence is distributed across software, hardware, and atmosphere rather than trapped inside a single screen.

Prototype hero shot for IPheromone
Winner image from MIT Hard Mode for IPheromone