Black Box

We are building minds made of silicon. But because they are black boxes, we are like Zeus watching the forge of Hephaestus: we see the raw ore go in and the thunderbolt come out, but we have no idea how the fire works. And we cannot go back; the black box is too powerful. Self-driving cars see things humans miss. Medical AI spots tumors in MRIs that radiologists overlook.

Consider the case of a hospital in Tennessee. Doctors deployed a cutting-edge black box AI to identify patients at risk of pneumonia. The AI was remarkably accurate, except for one glitch: it consistently sent asthmatics home, labeling them "low risk."

Doctors were baffled. Asthma is a major risk factor for pneumonia complications. Why would the AI do this?

The old black box, the flight recorder, was built to survive a fire. Ironically, it isn't black at all; it's bright orange. It is the ultimate witness. It swallows a storm of inputs (airspeed, altitude, button presses, screams) and produces a perfectly linear story of cause and effect. It tells us exactly why we crashed.

In 2016, ProPublica investigated an algorithm called COMPAS, used in US courts to predict recidivism. The black box returned a "risk score." ProPublica found it was twice as likely to falsely label Black defendants as future criminals as it was white defendants. The company that made the algorithm denied the bias. Because the box was black, both sides could claim the math supported them.