The global coronavirus pandemic has sparked a wave of health tech companies seeking innovative solutions to combat covid-19. Those include efforts like interactive maps that identify covid-19 hotspots. The one thing all these efforts require is your data.
The latest tech-y covid-19 “solution” to make headlines is a temperature-monitoring patch that’s being put together by yet another consortium of tech companies. According to Reuters, the patch would be about the size of a small bandage and is “intended to be worn on the skin,” where it will “connect wirelessly to a smartphone to monitor a person’s body temperature.” The companies involved include chip makers SkyWater Technology and Linear ASICs, as well as New York investment firm Asymmetric Return Capital. These three companies are also working with SensiML, an artificial intelligence software firm, and Upward Health, which provides in-home healthcare.
It’s not abundantly clear how useful temperature is as a metric for diagnosing covid-19, and relying on temperature screenings alone is flawed. But one eyebrow-arching detail in the Reuters report mentions that the companies are aiming to use artificial intelligence to “analyse signals such as the sounds of coughs” to try and identify covid-19 patterns. In fact, SensiML’s press release links to a crowdsourced dataset of cough patterns that it says it is building for researchers. It would appear that the temperature patch is separate from this crowdsourced cough dataset, though that’s not immediately clear from the company’s press release. The release also mentions that the project, which purportedly aims to “give businesses, governments, healthcare, and other public facilities access to multi-sensor, pre-diagnostic screening mechanisms,” is supported by universities and health organisations, but it fails to name a single one. (Gizmodo reached out to SensiML for those details but did not receive an immediate response.)
At a glance, this effort certainly sounds well-intended. But given how many of these data-hungry efforts have popped up, it bears repeating that before you volunteer anything, you ought to do your homework about who exactly you’re gifting your data to, and for what purpose.
I have never heard of any of these companies. And why would you have, unless you worked in this particular industry?
SensiML isn’t the only entity out there doing this sort of dataset-building. Carnegie Mellon University researchers are doing it, as is Fitbit, and as are a whole bunch of other wearables companies. But it’s usually a good idea to look for a reputable health institution attached to the project and explicitly stated end goals, preferably in the form of a published study in a peer-reviewed journal. Even something as simple as an FAQ or an about page that clearly states who the institution or company is, along with the credentials of everyone involved, goes a long way. Privacy policies can be a drag to read, but it’s worth taking the extra 5-10 minutes to see whether your data is aggregated and non-identifiable, whether it will be shared with third parties for advertising, and whether it’s covered by HIPAA. And if none of that is immediately clear, sending off a quick email for clarification takes almost no effort.
Health tech can do amazing things with your data, and it is an unfortunate truth that the more people who volunteer, the more accurate these algorithms and datasets become. It can also be grossly abused. The safest thing is to never volunteer your data, but for those who want to contribute, the next best thing is to do your due diligence. While I wish everything about this unnamed consortium’s efforts was clearer and better communicated, after some digging, there’s no smoking gun that something fishy is going on here. That said, I’ve looked at enough of these efforts to know they could have done a much better job. Personally, if I were to volunteer my data, I’d opt for studies like Scripps Research’s DETECT Study, Stanford Healthcare Innovation Lab’s Covid-19 Wearables Study, or even Carnegie Mellon University’s Covid Voice Detector app.