The industry is split between so-called doomers, who say the technology is moving too quickly, and others who say it can deliver lifesaving advances.
"More than 1,000 tech leaders signed a letter in March calling for a pause in the development of A.I.’s most advanced systems, saying the tools have 'profound risks to society and humanity.'"
Which brings me to the thought that perhaps AI should be added to our list of "human-caused" (formerly "man-made") disasters. Which direction the calamity might take is anyone's guess, but the possibilities are endless.