Fitness and health tracking apps and devices are so trendy that fashion designers are set to make a killing on sparkly bracelet cases for them. We like the idea of gadgets telling us when we haven't exercised enough, or when we need to sleep more. But what about an app to track mental health?
Startup Ginger.io monitors smartphone behaviour for signs of depression, schizophrenia and other mental health issues. It uses your phone's accelerometer and GPS to measure where, how, and when you move, and it records how long you spend idling, talking, or texting. It sounds both potentially helpful and infinitely creepy.
MIT News interviewed the founder, Anmol Madan, about exactly how Ginger.io can tell when a user's smartphone behaviour signals mental distress. "If someone is depressed, for instance, they isolate themselves, have a hard time getting up to go to school or work, they're lethargic and don't like communicating with others the way they typically do," Madan told his alma mater. "Turns out you see those same features change in their mobile-phone sensor data, in their movement features and interactions with others."
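To make that idea concrete, here is a minimal sketch of how behavioural features like mobility and communication frequency might be derived from phone logs and compared against a person's own baseline. This is a hypothetical illustration, not Ginger.io's actual method; the function names, the simple z-score threshold, and the choice of features are all assumptions for the sake of the example.

```python
from math import radians, sin, cos, asin, sqrt
from statistics import mean, stdev

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def daily_mobility_km(gps_fixes):
    """Total distance covered across an ordered list of (lat, lon) fixes."""
    return sum(haversine_km(*a, *b) for a, b in zip(gps_fixes, gps_fixes[1:]))

def flag_low_days(daily_values, threshold=2.0):
    """Return indices of days whose value falls more than `threshold`
    standard deviations BELOW the person's own baseline. A sudden drop
    in mobility or communication is the hypothetical signal of interest."""
    if len(daily_values) < 2:
        return []
    mu, sigma = mean(daily_values), stdev(daily_values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(daily_values)
            if (v - mu) / sigma < -threshold]

# Nine ordinary days of ~5 km of movement, then one near-housebound day.
mobility = [5.1, 4.8, 5.3, 4.9, 5.2, 5.0, 4.7, 5.4, 5.0, 0.2]
print(flag_low_days(mobility))  # the final day stands out
```

The key design point, as Madan describes it, is that each user is compared against their *own* typical behaviour rather than a population average, which is why a quiet weekend for one person and a worrying downswing for another can look numerically identical in the raw data.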
The program is opt-in, and if it helps to prevent self-harming behaviour in people with mental health issues, that's obviously a good thing. But it's also an unsettling thing, because the service makes it obvious how much can be gleaned from monitoring smartphone habits. Measuring for lethargy and isolation, it will remind people to reach out for human contact when it notices they haven't talked to anyone all day. Researchers have used the app to monitor patients in studies on depression and diabetes, and as more doctors sign up for the program, you might have the option to keep yours very, very up to date on your health.
Would you want your doctor to know your every move, though?
The idea of anyone monitoring and analysing my behaviour so closely is discomfiting to me, even if it's for my health. Despite Ginger.io's assurances that it protects users' privacy, a hack or security failure could expose detailed records of users' health histories. There is a hefty privacy trade-off.
But I also do not have schizophrenia, clinical depression, or another serious mental health issue that has required hospitalisation or other types of monitoring, so I'm not really the target client. If I did have this type of struggle, my hesitance might be outweighed by the potential benefits. People in the middle of mental health crises may find that Ginger.io is a life-saving tool, if it can alert doctors and relatives to downswings when they are unable to make contact themselves. Giving up some privacy for that kind of reassurance sounds like a fair enough trade. [MIT News]