Peer-Reviewed Research
The Science Behind HelperOne
That’s not a tagline — it’s what decades of peer-reviewed research from MIT, USC, and other leading universities, published in journals including The Lancet, consistently demonstrate.
Text can inform you.
Only a face can move you.
The difference between an AI tool you try once and an AI companion that actually changes your outcomes comes down to something deceptively simple: a face you can build a relationship with.
Here’s what the science says — and why we built HelperOne the way we did.
40% longer engagement
You stick with it longer — significantly longer.
The biggest problem with productivity apps, AI assistants, and digital tools isn’t capability. It’s abandonment. Most are forgotten within days. Research from MIT Media Lab found that people using an embodied AI coach stayed engaged 40% longer than those using the same content on a screen — and 89% longer than those going it alone with traditional tools.[1] That’s not a small edge. Over weeks and months, it’s the difference between a goal you talked about and a goal you hit.
3× more check-ins per week
You show up more often.
In a clinical trial with over 1,600 participants, people working with a relational avatar agent checked in nearly 3× more per week than those using a conventional app.[2] More check-ins means more momentum. More momentum means your job search doesn’t stall after week two, your study plan doesn’t die on day four, and the project you keep pushing to next Monday actually moves.
You’re more honest about what you actually need.
Research from USC’s Institute for Creative Technologies revealed something counterintuitive: people share more openly with a visual AI than with a human — or a text box.[3] They report what’s really going on. The stress they’re hiding at work. The interview anxiety they won’t admit to a friend. The fact that they have no idea where to start. That honesty is what lets HelperOne give you guidance that’s actually useful, not generic advice based on what you think you should say. Follow-up research with military service members confirmed the effect — participants disclosed more symptoms to a virtual human than on official written assessments.[4]
You form a real working bond — and that bond drives results.
This is the finding that matters most. Studies consistently show that users form genuine working alliances with avatar-based AI — rated on the same clinical scales used to measure the bond between a patient and therapist, a client and coach.[5] And across every domain studied, that bond is the single strongest predictor of follow-through. Not the quality of the advice. Not the number of features. The relationship. HelperOne is designed to build that relationship from your first conversation — and deepen it over time.
You get support that meets you where you are.
Avatar-based AI is especially effective for people who find text-heavy tools overwhelming, impersonal, or easy to ignore.[6] Not everyone processes best by reading. Not everyone thrives with bullet points and dashboards. HelperOne talks to you like a person — adapting to your pace, your style, and what you’re dealing with today. Research confirms this is particularly powerful for people navigating unfamiliar territory: a first job search, a career pivot, a new skill, a leadership role they don’t feel ready for.
The research spans every kind of goal.
The evidence isn’t limited to one domain. Avatar-based AI has been tested — and proven — across the full range of things people struggle to do alone.
Career and performance
The same rapport and accountability mechanisms that drive clinical outcomes apply directly to workplace goals. When you have a companion that remembers your priorities, checks in on your progress, and adapts to your situation, you follow through more consistently — on applications, on preparation, on the daily work that compounds.
Health and habits
Avatar-led programs achieved 3× higher quit rates for smoking than standard digital tools.[7] Sedentary older adults significantly increased daily activity with an AI exercise coach.[6] Multimodal AI agents reduced depression and anxiety symptoms with effects approaching in-person therapy.[8][9] Fully automated avatar-guided coaching matched human-delivered treatment for phobias in a study published in The Lancet.[10]
Why a face makes the difference
Researchers have identified five converging psychological principles that explain why embodied AI outperforms text and voice across every metric that matters.
1. Social presence
Avatars convey richer cues — face, gaze, gestures, expressions — creating a genuine sense of being with someone. A systematic review of 152 studies confirmed that text-based tools consistently produce lower social presence.[13] Lower presence means lower engagement. Lower engagement means lower results.
2. The relationship trigger
Humans are wired to apply social scripts — trust, reciprocity, accountability — to anything that behaves socially.[14] An avatar activates these instincts in ways that text cannot. You don’t just read HelperOne’s advice. You receive it from someone.
3. The uncanny valley — and how to avoid it
Research shows that hyper-realistic virtual humans can actually backfire, triggering unease instead of trust. Moderately stylized avatars that are expressive without pretending to be human hit the sweet spot.[15] Enough social presence to build rapport, without the eeriness that breaks it. HelperOne is designed in this zone intentionally.
4. The coaching effect
Social cues from an avatar prime deeper cognitive engagement. When you see and hear a coach walking you through a problem, you process the guidance differently than when you read it. Studies show this leads to better retention, higher motivation, and stronger follow-through — but only when paired with natural voice and responsive behavior, not a static face reading a script.[11] [12]
5. Perceived warmth and competence
A meta-analysis of over 11,000 people found that moderate anthropomorphism increases both perceived warmth and perceived competence of AI tools — the two dimensions that predict whether someone trusts a tool enough to keep using it.[16] Too robotic and you don’t trust it to understand you. Too human and you don’t trust it at all. The middle ground is where HelperOne lives.
Not just a chatbot with a face.
Most AI assistants are text on a screen. Some add a voice. A few slap on an avatar as an afterthought. HelperOne is fundamentally different — the avatar isn’t a feature, it’s the architecture. Every interaction is designed around the science of embodied companionship: responsive expression, adaptive conversation, natural voice, and a relationship that develops over time.
The research is unambiguous: a face without intelligence is decoration. Intelligence without a face is forgettable. HelperOne is both — and that’s why it works.
References
- [1] Kidd, C.D. (2008). Designing for Long-Term Human-Robot Interaction and Application to Weight Loss. MIT Media Lab, Doctoral Dissertation.
- [2] Velicer, W.F. et al. (2014). Using relational agents to increase engagement in computer-based interventions. European Health Psychologist, 16(S), 374.
- [3] Lucas, G.M. et al. (2014). It’s only a computer: Virtual humans increase willingness to disclose. Computers in Human Behavior, 37, 94–100.
- [4] Lucas, G.M. et al. (2017). Reporting mental health symptoms: Breaking down barriers to care with virtual human interviewers. Frontiers in Robotics and AI, 4, 51.
- [5] Bickmore, T.W. & Picard, R.W. (2005). Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction, 12(2), 293–327.
- [6] Bickmore, T.W. et al. (2013). A randomized controlled trial of an automated exercise coach for older adults. Journal of the American Geriatrics Society, 61(10), 1676–1683.
- [7] Karekla, M. et al. (2020). An avatar-led intervention promotes smoking cessation in young adults: A pilot randomized clinical trial. Annals of Behavioral Medicine, 54(10), 747–760.
- [8] He, Y. et al. (2023). Conversational agent interventions for mental health problems: Systematic review and meta-analysis of randomized controlled trials. Journal of Medical Internet Research, 25, e43862.
- [9] Chew, H.S.J. et al. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. npj Digital Medicine, 6, 236.
- [10] Freeman, D. et al. (2018). Automated psychological therapy using immersive virtual reality for treatment of fear of heights: A single-blind, parallel-group, randomised controlled trial. The Lancet Psychiatry, 5(8), 625–632.
- [11] Moreno, R. et al. (2001). The case for social agency in computer-based teaching: Do students learn more deeply when they interact with animated pedagogical agents? Cognition and Instruction, 19(2), 177–213.
- [12] Mayer, R.E. & DaPra, C.S. (2012). An embodiment effect in computer-based learning with animated pedagogical agents. Journal of Experimental Psychology: Applied, 18(3), 239–252.
- [13] Oh, C.S., Bailenson, J.N. & Welch, G.F. (2018). A systematic review of social presence: Definition, antecedents, and implications. Frontiers in Robotics and AI, 5, 114.
- [14] Nass, C. & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103.
- [15] Nowak, K.L. & Biocca, F. (2003). The effect of the agency and anthropomorphism on users’ sense of telepresence, copresence, and social presence in virtual environments. Presence: Teleoperators and Virtual Environments, 12(5), 481–494.
- [16] Blut, M. et al. (2021). Understanding anthropomorphism in service provision: A meta-analysis of physical robots, chatbots, and other AI. Journal of the Academy of Marketing Science, 49, 632–658.