History has valuable lessons on AI. Many of us are aware of the replication crisis in social science. But were you aware of the software crisis, first famously discussed at the NATO Conference on Software Engineering in Garmisch in October 1968? We have got used to software being buggy and to updates being required on a near-daily basis, often to fix security vulnerabilities – and, given the vast number of high-profile cyber attacks, often too late. People are now suggesting using large language models, trained on code people have dumped on the web, to write software. Software testing and static program analysis are going to be more important than ever, whether you’re evaluating internet-connected apps or statistical analysis code.
The original conference reports are available online, and it’s worth having a browse to see the issues raised. In 1968, hardware and software had a tiny fraction of the computational and political power they have now.