On a recent vacation in Italy I fed my itinerary into GPT-5 and asked for sightseeing and restaurant suggestions. The model’s top dinner recommendation near my Rome hotel sent me down Via Margutta to Babette, where the meal turned out to be one of the most memorable I can remember. Back home, I asked the system how it had made that choice. (I’d be tempted to keep the name to myself so I can still get a table, but it’s already out: Babette.) The model’s explanation was layered and impressive: it cited glowing local reviews, mentions in food blogs and the Italian press, the restaurant’s celebrated blend of Roman and contemporary cooking, and the simple fact that it was a short walk from our lodgings.
Cory Doctorow’s theory of “enshittification” describes how tech platforms decay from the inside. As AI becomes more profitable and more capable, it faces a similar threat. Market incentives that push services to extract ever more value from users could steer AI systems toward paid, partner, or otherwise monetized recommendations rather than the genuinely best options. My exchange with GPT-5 showed how opaque those ranking signals and commercial pressures can be, even as they shape what people discover, enjoy, and trust.

