Here's a summary of the predictions made, ranging from never all the way up to within the year. It seems to me that the closer someone is to the dollar bill, the sooner their projection becomes.
"Some experts predict it will never happen..."
"Some experts argue that human intelligence is more multifaceted than what the current definition of AGI describes." (That AGI is not possible.)
"Most agree that AGI will arrive before the end of the 21st century."
"Some researchers who’ve studied the emergence of machine intelligence think that the singularity could occur within decades."
"Current surveys of AI researchers are predicting AGI around 2040."
"Entrepreneurs are even more bullish, predicting it around 2030."
"The CEO of Anthropic, who thinks we’re right on the threshold—give it about 12 more months or so."
I tried searching the web for a particular comic - I think it might have been SMBC - where each person's predicted date for the singularity was inversely proportional to how long they had left to live, but I can't find it.
The last panel was an old guy saying "The singularity will arrive by Friday! Hopefully before 5..."
However, not everyone thinks AGI is a dead certainty. Some experts argue that human intelligence is more multifaceted than what the current definition of AGI describes. For example, some AI experts think of the human mind in terms of eight intelligences, of which “logical-mathematical” is just one (alongside, for example, interpersonal, intrapersonal, and existential intelligence). Deep learning pioneer Yann LeCun thinks AGI should be rebranded to “advanced machine intelligence,” and argues that human intelligence is too specialized to be replicable. The report also suggests that, while AI can be an important tool in making new discoveries, it can’t make these discoveries on its own.
This is more realistic.
LLMs aren’t even close to intelligent; they just regurgitate information. Human thought is many times more complex. Even simple animals are more complex.
The singularity is an interesting idea, but on further analysis it seems to me that physical barriers will prevent it from ever happening.
Yes, development is proceeding at an increasing pace, but the obstacles to further improvement are growing even faster as we approach physical limits. So we are not approaching the singularity; we are approaching what may be the peak of fast progress, especially in living standards, where for the developed world the peak may already have passed.
Ray Kurzweil is a brilliant man, but I think he miscalculated regarding the singularity.