My iPhone guessed I was pregnant
I'm not of the generation that grew up with the Internet. My most high-tech play as a kid involved Game Boys and Sega consoles - and even then, those entered our household when I was well into primary school.
So late last year, when my sleek iPhone - all compact and shiny-edged, so unlike those blocky hunks of plastic from the 1990s - asked me: "Did you just add a member to your family?", I froze.
All I had done was open the App Store to explore some baby-tracking apps. I had, after all, just confirmed that I was pregnant - a secret I had thought was being savoured only by my husband and me.
Nope. My iPhone knew I was pregnant, thanks to all my prior Google searches about starting a family, which pregnancy test sticks were the best, and goodness knows what else. And so there was my App Store screen, recommending various mum-to-be apps before I even knew I wanted them.
It wasn't just my iPhone; Instagram and Facebook started pushing me pregnancy-related ads - everything from baby clothes to infant formula, and even government baby bonus factsheets and cord blood banking services.
It was surreal, realising that these tech apps were privy to such intimate bodily knowledge. At that point, we hadn't even told our parents that they were going to be grandparents - and yet the bots knew.
Some call it hyper-personalisation. I find it hyper-freaky.
Even now, in my third trimester, my App Store has helpfully curated a list of apps under two banners: "New mum survival kit" and "Apps for super parenting support". The 15-plus apps are sorted into various categories, such as sleep trackers, breastfeeding guides, and baby milestone markers.
Uh, my baby's not even here yet. But thanks, I guess?
Look, I'm no Luddite. I love AI and algorithms as much as the next person. I'm just saying I'm a little disconcerted.
That unease grew even greater this week, after I read a New York Times story titled "On YouTube's Digital Playground, an Open Gate for Pedophiles". The report details three Harvard researchers' findings that YouTube's automated recommendation system - which uses sophisticated algorithms to suggest what users should watch next - is unwittingly serving users with a sexual interest in children.
The alarming progression looks like this: A user first watches an erotic video. He then gets recommended videos of women who are noticeably younger, and following that, women who pose suggestively in young girls' clothes.
Said the NYT piece: "Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations… Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
"On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable."
That is some scary stuff.
In fact, I'd seen this sort of wayward progression myself years ago, while visiting friends who had a toddler. To distract the boy for a bit, his father propped him up in front of an iPad with Sesame Street clips auto-playing on YouTube.
The kid's screen time had started harmlessly enough: First he'd watch Elmo dispense high-pitched life advice, and when the video ended, he'd jab at the iPad screen with an adorably chubby finger to select one of the recommended videos that came next. While the suggested clips initially stayed kosher, at some point they veered off into inappropriate territory - including homemade videos of teens jumping off roofs on BMX bikes.
All of this certainly gives me pause, especially with this bub in my belly.
I know there's no going back to my pre-Internet life - when home videos of swimsuit-clad children weren't spread with such ease, and devices didn't (and couldn't) give a hoot about your reproductive system.
But it sure does make me yearn for simpler days. And let's face it - no recommended app is ever going to give me those back.
- The writer works in the finance industry.
She is contactable at firstname.lastname@example.org