6 Comments
Ronald Pavellas

"It's just policy..."

Barry Weinstein

Probability is calculated from data and is no more accurate than the data used!

The economic crisis caused by bank failures over mortgage securities stemmed from models that put the probability of home values falling at zero! That zero-probability input was grossly inaccurate because it was based on limited data! The use of probability must include knowledge of the data behind it, because the probability is no more accurate than the data!
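Barry's point can be sketched as a toy simulation (all numbers here are hypothetical, chosen only for illustration; `TRUE_P_DECLINE` is an assumption, not an estimate of real housing risk): a probability estimated from a short, unrepresentative history can come out exactly zero even when the true risk is real.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Assumed "true" annual probability that home values fall (illustrative only).
TRUE_P_DECLINE = 0.15

def sample_years(n):
    """Simulate n observed years; True means values fell that year."""
    return [random.random() < TRUE_P_DECLINE for _ in range(n)]

# A model calibrated on a short boom period may never observe a decline,
# so its empirical probability estimate is exactly zero.
short_history = sample_years(5)
estimate = sum(short_history) / len(short_history)

# With far more data, the estimate approaches the true risk.
long_history = sample_years(10_000)
better_estimate = sum(long_history) / len(long_history)
```

With this seed, the five-year history contains no declines, so `estimate` is 0.0; the long history recovers a value near 0.15. The math is fine in both cases; only the data differ.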

Exploring ChatGPT

Thanks for your input, Barry! This is a great point. Probability always inherits the blind spots of the data underneath it. The mortgage crisis is a perfect example of what happens when models confuse clean math with complete reality. That's really the warning here: AI doesn't eliminate that risk; it can scale it faster if we forget to question the inputs.

Exploring ChatGPT

Thanks, Ronald! Could you elaborate?

Ronald Pavellas

At some point in one's life one meets a person with more power than oneself, both enmeshed in a bureaucratic system, where the one with more power can easily justify any decision that (most often negatively) affects oneself by referring to the rule book, without having to evaluate anything oneself may have to say. I sense from your article that the same will eventually occur, if it has not already, by appealing to how an AI "assistant" supports the person with more power. Perhaps this is tangential, but that's what occurred to me.

Exploring ChatGPT

Thank you very much, Ronald! That's exactly the concern. Rules already get used as shields to avoid judgment, and AI just makes that easier. Once someone can say "the system says so," the conversation stops, and power no longer has to listen. That's the shift I'm worried about.