

I have predicted this for a while now. Once this takes effect, the airline no longer has responsibility for what sets the prices. The AI could, for instance, become very racist, driving prices through the roof for people of color if it somehow determines that well-paying racist customers will pay more to fly with only white people. Several scenarios like that could unfold, and since it's basically impossible to trace an LLM's decisions back to their source values, no one can be held responsible for such choices.
Oh, and I’m sure the data from 23andMe will be abused soon to ensure that only healthy people get good prices. The consequences of the personal data that “didn’t matter that we shared” are about to unfold.
Sure, I use AI and LLM somewhat interchangeably here, but I believe the point still stands. If they were asked to trace the source of a price difference, it likely sits within layers upon layers of training data aimed at maximizing profit, and in the long run it would probably be impossible to say which data produced the result.