Editor’s note: The opinions expressed in this commentary are the author’s alone. This piece follows a limited series by Startland exploring parent advocates’ objections to 1:1 technology initiatives. Click here for more on that topic.
Inspired by the ongoing conversation here on Startland News about 1:1 technology initiatives in schools, I wanted to share my perspective, as someone working in population health, on the recent influx of Internet-of-Things (IoT) medical devices in schools and how they affect our children’s privacy.
The gravest example is Kinsa’s FLUency program, which has distributed the company’s smart thermometers to schools throughout Kansas and Missouri. A list of participating schools, along with a video testimonial from Cassie Lawhon, head nurse at Smithville Elementary School, explaining the benefits of the program, is available here: https://www.kinsahealth.com/missouri
While the benefits of these smart thermometers are clear, the cost is not. Kinsa would have you believe that these schools have somehow won the privilege of their free smart thermometers through a competitive process. But Kinsa revealed the true nature of the program when it announced in the New York Times (“This Thermometer Tells Your Temperature, Then Tells Firms Where to Advertise,” Oct. 23, 2018) that its fever data are for sale and currently in use by Clorox to target ad campaigns.
This raised the ire of many concerned citizens in the comments section of that article, and of journalists on Twitter. Kinsa’s CEO defended the move on his LinkedIn blog, but glossed over the fact that much of the data Kinsa collects comes from schoolchildren.
As adults, we expect that much of the data we generate online is sold for advertising, marketing, and merchandising purposes, but children’s data is the last bastion of true privacy — a line reinforced by Federal COPPA regulations and my wife (“Please, no photos of the kids online, Graham!”). It’s an entirely different circumstance when you are proactively “giving” IoT medical devices, complete with licensed Sesame Street characters, to children in elementary schools for the purpose of selling their data — aggregated, anonymized or otherwise.
This invites comparison to Joe Camel and Big Tobacco’s infamous strategy of marketing to kids: “Realistically, if our company is to survive and prosper, over the long term, we must get our share of the youth market” (1973 R.J. Reynolds document). To be sure, Kinsa is not selling cigarettes to kids, but it is egregiously marketing a medical device and app to kids, their schools, and their parents in order to sell insights derived from their health data.
Why is this a problem for privacy?
In order to protect the privacy interests of consumers, data companies will remove personal identifiers, such as name and Social Security number, from databases containing sensitive information. These de-identified data safeguard the privacy of consumers while still providing useful information to marketers. However, in recent years it has been revealed that “anonymized” data can often be re-identified — a process by which anonymized personal data is matched with its true owner.
Just ask the measurement company Nielsen. It protects its clients’ sales data by omitting POS (point-of-sale) trends for geographies where a single store location could be inferred. For example, if I purchase Nielsen’s POS data for all cold and allergy remedies at a specific geographic level, and there is only one pharmacy or grocery store in that entire geography, then I know exactly how many of those products that single store sold. Nielsen understands that risk to its clients’ privacy and omits the data whenever an individual store could be re-identified.
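To make that suppression rule concrete, here is a minimal sketch in Python, using invented store IDs and sales figures rather than Nielsen’s actual data or methodology, of how a data provider might withhold any aggregate that describes only a single reporting store:

```python
from collections import defaultdict

# Hypothetical point-of-sale records: (store_id, zip_code, units_sold)
sales = [
    ("store_a", "64089", 120),
    ("store_b", "64153", 85),
    ("store_c", "64153", 40),
]

stores_by_zip = defaultdict(set)
units_by_zip = defaultdict(int)
for store_id, zip_code, units in sales:
    stores_by_zip[zip_code].add(store_id)
    units_by_zip[zip_code] += units

# Suppress any geography whose aggregate reflects a single store,
# since publishing it would reveal that one store's sales.
report = {
    zip_code: units
    for zip_code, units in units_by_zip.items()
    if len(stores_by_zip[zip_code]) > 1
}

print(report)  # {'64153': 125} -- the single-store zip 64089 is withheld
```

The point of the rule is simple: an “aggregate” of one is not an aggregate at all.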
A case could easily be made that the data Kinsa is collecting could be re-identified, given the way its thermometers are distributed to schools via the FLUency program. The publicly listed schools where Kinsa has given away most of its thermometers are likely the only schools in their respective zip codes. An interested party could then cross-reference those zip code data with what is known to marketers as MAID (Mobile Advertising ID) data, purchased from another data seller, to infer “interest” data for those parents and families: where they shop, and where their children likely go to school. And now your child’s fever recorded by Kinsa not only triggers flu remedy ads served to you directly by an unrelated third party across all your other devices, but could also be used to re-identify you and the household associated with your MAID.
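To illustrate the kind of join involved, here is a simplified, hypothetical sketch in Python. The records, device IDs, and thresholds are invented for illustration; this is not Kinsa’s data, nor any particular broker’s feed, just the general pattern of linking zip-level health signals to advertising IDs:

```python
# Hypothetical example of the re-identification risk described above:
# joining zip-level fever readings with purchased MAID "interest" data.

fever_readings = [
    {"zip": "64089", "school": "Smithville Elementary", "fevers_today": 12},
]

# What a mobile-advertising data seller might offer: device IDs with
# an inferred home zip code and interest segments.
maid_records = [
    {"maid": "3f9c-...-01", "home_zip": "64089", "segments": ["parent", "pharmacy_shopper"]},
    {"maid": "8a21-...-7b", "home_zip": "64112", "segments": ["college_student"]},
]

# Because a participating school is often the only one in its zip code,
# the zip alone is enough to link a fever spike to specific households.
for reading in fever_readings:
    if reading["fevers_today"] > 10:
        targets = [r["maid"] for r in maid_records
                   if r["home_zip"] == reading["zip"] and "parent" in r["segments"]]
        print(f"Flu-remedy ads targeted near {reading['school']}: {targets}")
```

No names or Social Security numbers appear anywhere in that sketch, yet the output still points at particular households. That is what re-identification looks like in practice.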
Whether you’ve purchased a $20 smart thermometer from the pharmacy or received it for free from your children’s school, you expect that your data isn’t the product; the thermometer itself is. You expect that the money you paid for the thermometer or the tuition, or the public endorsement of your children’s school nurse, is what secures your privacy.
But that line has been crossed, and there may be no going back without the help of enforced federal regulations on medical devices, perhaps a warning label that says, “This Device Monetizes Your Data.” If data companies are genuinely transparent, neither disguising what they are nor marketing themselves to kids, then there can be a future where privacy is back in the hands of responsible consumers as a currency they can choose to spend, or not. Privacy is dead, long live privacy.
Graham Dodge is the CEO and president of Sickweather, a crowdsourcing application that uses opted-in user reports and publicly available social sentiment to track and predict illness, helping people make informed health decisions.