Research Prototype Robotic Lamp Emulates Expressive Character Movement

Author: gaya ❤️ one

Researchers have developed a robotic lamp prototype, internally referred to as Ongo, that goes beyond simple automation to incorporate expressive, human-like gestures, drawing comparisons to Pixar's Luxo Jr. character. The project, from Apple's Machine Learning Research division, is detailed in a paper titled 'ELEGNT: Expressive and Functional Movement Design for Non-Anthropomorphic Robot,' signaling a research focus on emotional resonance alongside utility in robotics.

The lamp prototype operates in two modes: 'Functional,' which executes commands efficiently, and 'Expressive,' which imbues its movements with personality and social cues. When queried about the weather, for example, the expressive lamp physically shifts its orientation as if checking conditions outside before answering, a deliberate contrast to a static, purely functional response. This design philosophy appears connected to Apple's historical emphasis on approachable design, a principle also central to Pixar's storytelling.
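To make the two-mode distinction concrete, here is a minimal, hypothetical sketch of how such a controller could be structured. It is not code from the ELEGNT paper; the `Mode` enum, the `Pose` values, and the `answer_weather_query` routine are assumptions invented purely for illustration.

```python
# Hypothetical sketch only: the paper does not publish its controller code.
# Illustrates a lamp that answers a query either with a purely functional
# motion or with an added expressive "pre-gesture".

from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    FUNCTIONAL = auto()   # execute the task with minimal motion
    EXPRESSIVE = auto()   # add social/kinesic cues around the task


@dataclass
class Pose:
    pan: float    # lamp-head rotation, degrees
    tilt: float   # lamp-head elevation, degrees


class LampController:
    def __init__(self, mode: Mode = Mode.FUNCTIONAL):
        self.mode = mode

    def answer_weather_query(self) -> list[Pose]:
        """Return a motion plan for responding to a weather question."""
        speak_pose = Pose(pan=0.0, tilt=10.0)              # face the user and answer
        if self.mode is Mode.EXPRESSIVE:
            glance_outside = Pose(pan=70.0, tilt=25.0)     # "check" the window first
            return [glance_outside, speak_pose]
        return [speak_pose]


if __name__ == "__main__":
    for mode in (Mode.FUNCTIONAL, Mode.EXPRESSIVE):
        plan = LampController(mode).answer_weather_query()
        print(mode.name, [(p.pan, p.tilt) for p in plan])
```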

In comparative testing across six task scenarios, the expression-driven movements reportedly increased user engagement and the perceived quality of the interaction. The device exhibits behaviors such as swaying its lampshade in response to music, appearing to focus its 'gaze' on the user to signal attention, and simulating disappointment when excluded from a hypothetical activity. These subtle anthropomorphic elements, built into the lamp's head and arm structure to suggest a neck and head, aim to make interactions feel more natural than those with typical rigid robotic designs.

While the device remains a research prototype without an immediate commercial release date, it reflects a broader industry trend toward social robotics that prioritizes human-robot interaction. This development aligns with earlier reports suggesting Apple's ongoing work on a home robot featuring an articulating arm, potentially slated for a 2026 or 2027 launch. The underlying objective appears to be creating technology that elicits a feeling from the user, extending beyond mere task completion.

However, the integration of simulated emotion presents a design challenge. While younger users often rated the expressive robot higher for engagement, some participants found excessive expressiveness inefficient in strictly task-oriented situations. This suggests a need to balance utilitarian function with emotional feedback, possibly through customizable expressiveness levels. The research contributes to the expanding service-robot market by exploring how simple, stationary devices can achieve deeper social connection, potentially influencing future designs for general-purpose AI robots.
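As one way to picture such a customizable setting, the standalone sketch below assumes a single expressiveness scalar between 0 and 1 that scales how far the lamp's motion deviates from its most efficient path; the `blend_plan` function and the (pan, tilt) waypoints are hypothetical and not drawn from the research.

```python
# Hypothetical sketch: a user-adjustable expressiveness setting that scales
# how far the lamp deviates from the most efficient motion. Not from the paper.

def blend_plan(functional_plan, expressive_plan, expressiveness: float):
    """Interpolate between a minimal and an expressive motion plan.

    expressiveness = 0.0 -> purely functional motion
    expressiveness = 1.0 -> full expressive gestures
    """
    expressiveness = max(0.0, min(1.0, expressiveness))  # clamp user input
    blended = []
    for func_pose, expr_pose in zip(functional_plan, expressive_plan):
        blended.append(tuple(
            f + expressiveness * (e - f) for f, e in zip(func_pose, expr_pose)
        ))
    return blended


# Example: (pan, tilt) waypoints; a 0.5 setting halves the expressive detour.
functional = [(0.0, 10.0), (0.0, 10.0)]
expressive = [(70.0, 25.0), (0.0, 10.0)]
print(blend_plan(functional, expressive, 0.5))  # [(35.0, 17.5), (0.0, 10.0)]
```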
