In this browser, as I mentioned before, I'm apparently a man in the 65+ age range, and I keep getting ads for mobility aids and pet supplies, especially dog supplies...yeah.

On my phone, I'm a woman in the 25-34 range (a couple of years off, but close enough). The main difference in browsing habits, AFAICT? The phone is handy for looking up recipes and other cooking/food information in the kitchen--and three of the eleven interest categories on this one involve food. Oddly, no science- or tech-related categories show up for either device. I have, indeed, noticed ads in the phone browser that have less to do with even the interests they've decided I have than with gender stereotypes--including weight-loss ads I really do not want to look at.

On the whole, I have to agree with one commenter on the linked article.
I'm torn, though. I sort of like that Google doesn't know who I am. Do I really want to give them MORE information about me so that they can start targeting me with ads for cosmetics and baby clothes?

Especially given that I don't really identify in either category, hmm. I think I'll just keep the ads for dog beds and walking sticks, both of which I actually use--even if I would avoid clicking on an ad for those too, on principle, were I even in the market for more.

IMO, this reflects the attitudes of the marketing industry as a whole more than anything specific about Google, but wouldn't it make more sense to tailor ads to interest categories without factoring in gender at all? Or is an interest in, say, "Food & Drink - Cooking & Recipes - Baked Goods" supposed to vary wildly depending on which gender gets assigned? The whiff of prescriptivism--particularly when based on determinations from such apparently silly algorithms--does rather get on my nerves. Otherwise, I probably wouldn't care beyond getting a giggle.
Google also thinks I'm a man because my interests are in technology and law.

What, not enough cooking videos? No fashion magazines?

The biggest problem with Google is that it can't figure out a way to filter its own biases out of its algorithms.