Questions tagged [singularity]

For questions about the concept of technological singularity.

The technological singularity (also known simply as the singularity) is the hypothesis that the invention of artificial superintelligence (ASI) will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.

Technological singularity - Wikipedia

21 questions
40 votes, 4 answers

What is the concept of the technological singularity?

I've heard of the idea of the technological singularity. What is it, and how does it relate to Artificial Intelligence? Is this the theoretical point where Artificial Intelligence machines have progressed far enough that they grow and learn on their…
asked by WilliamKF
15 votes, 5 answers

What is the idea called involving an AI that will eventually rule humanity?

It's an idea I heard a while back but couldn't remember the name of. It involves the existence and development of an AI that will eventually rule the world, and the notion that if you don't fund or advance the AI, it will see you as "hostile" and kill you.…
13 votes, 4 answers

Is the singularity something to be taken seriously?

The term Singularity is often used in mainstream media to describe visionary technology. It was popularized by Ray Kurzweil in his book The Singularity Is Near: When Humans Transcend Biology (2005). In his book, Kurzweil gives an outlook to…
11 votes, 3 answers

What is wrong with the idea that the AI will be capable of omniscience?

In the context of artificial intelligence, the singularity refers to the advent of an artificial general intelligence capable of recursive self-improvement, leading to the rapid emergence of artificial superintelligence (ASI), the limits of which…
10 votes, 6 answers

Why does Stephen Hawking say "Artificial Intelligence will kill us all"?

This quote by Stephen Hawking has been in the headlines for quite some time: "Artificial Intelligence could wipe out humanity when it gets too clever as humans will be like ants." Why does he say this? To put it simply: what are the possible threats…
asked by Soham
10 votes, 6 answers

When the AI singularity takes over, what will there be left for us to do?

Since the first Industrial Revolution, machines have been taking people's jobs, and automation has been part of human social evolution for the past three centuries. All in all, though, these machines have been replacing mechanical, high-risk and…
10 votes, 3 answers

Can a technological singularity only occur with superintelligence?

In Chapter 26 of Artificial Intelligence: A Modern Approach (3rd edition), the textbook discusses the "technological singularity". It quotes I.J. Good, who wrote in 1965: Let an ultra-intelligent machine be defined as a machine that can far…
6 votes, 1 answer

Is there a theoretical maximum for intelligence?

From Artificial Intelligence: A Modern Approach, Third Edition, Chapter 26: Note that the concept of ultraintelligent machines assumes that intelligence is an especially important attribute, and if you have enough of it, all problems can be solved.…
asked by Left SE On 10_6_19
4 votes, 3 answers

Is there a way to protect humanity against the impending singularity?

We are careening into a future which may hold unpredictable dangers in relation to AI. I haven't yet heard of Chappie- or RoboCop-style police robots, but militarized drone tech is replacing many conventional weapons platforms. I love the idea…
asked by GIA
4 votes, 2 answers

Which government agencies oversee development of new AI?

In his book Superintelligence, Nick Bostrom talks about the many dangers of AI. He considers it necessary that strong security mechanisms be put in place to ensure that a machine, once it gains general intelligence far beyond human capabilities,…
asked by Demento
2 votes, 4 answers

Is it possible to manage the evolution of super-intelligent AI?

Post-singularity, AI will surpass human intelligence. The evolution of AI could take many directions, some of which may not be preferable for humans. Is it possible to manage the evolution of super-intelligent AI? If yes, how? One way I can think of is…
asked by akm
2 votes, 4 answers

Can we define the AI singularity mathematically?

The "AI Singularity" or "Technological Singularity" is a vague term that roughly seems to refer to the idea of: Humans can design algorithms Humans can improve algorithms Eventually algorithms we design might end up being as good as humans at…
asked by Phylliida
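
One standard toy formalization, offered here only as a sketch for the question above (the symbols $k$, $\alpha$, $I_0$, $t^*$ are illustrative, not taken from the question): model intelligence as a quantity $I(t)$ and assume the rate of improvement grows superlinearly with the current level,

$$\frac{dI}{dt} = k I^{\alpha}, \qquad k > 0,\ \alpha > 1.$$

Separating variables gives

$$I(t) = \left( I_0^{\,1-\alpha} - k(\alpha - 1)\, t \right)^{\frac{1}{1-\alpha}},$$

which diverges as $t$ approaches the finite time $t^* = \frac{I_0^{\,1-\alpha}}{k(\alpha - 1)}$: a literal mathematical singularity. For $\alpha \le 1$ growth is merely exponential or polynomial and no finite-time blow-up occurs.
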
2 votes, 0 answers

Is a very powerful oracle sufficient to trigger the AI singularity?

Let's say we have an oracle $S$ that, given any function $F$ and desired output $y$, can find an input $x$ that causes $F$ to output $y$ if it exists, or otherwise returns nil. I.e.: $$S(F, y) = x \implies F(x) = y$$ $$S(F, y) = \text{nil} \implies \nexists…
asked by Phylliida
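
Such an oracle is uncomputable in general; as a concrete sketch of the interface only (the names `oracle`, `f`, `y`, `domain` are mine, and restricting the search to a finite domain is an assumption the question does not make), a brute-force stand-in in Python:

    def oracle(f, y, domain):
        # Brute-force stand-in for the hypothetical oracle S: return
        # some x in `domain` with f(x) == y, or None (the question's
        # "nil") if no such x exists in the domain.
        for x in domain:
            if f(x) == y:
                return x
        return None

    # Usage: invert x -> x * x over a small search space.
    print(oracle(lambda x: x * x, 9, range(10)))  # prints 3
    print(oracle(lambda x: x * x, 7, range(10)))  # prints None (no preimage)
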
1 vote, 1 answer

Why are AI Safety discussions almost always from the perspective of reinforcement learning?

I have been reading some articles on AI safety, and they almost always speak of AI safety from the reinforcement learning (RL) perspective, i.e. where we have some artificially intelligent agent acting in an environment so as to maximise some reward. Is…
asked by thesofakillers
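
For readers unfamiliar with the framing this question refers to, a minimal sketch of the RL agent-environment loop in Python (the toy environment and the random placeholder policy are illustrative assumptions, not taken from any article the question mentions):

    import random

    def step(state, action):
        # Toy environment: the agent moves left or right on a line;
        # reward is highest (zero) at position 0.
        state += action
        return state, -abs(state)

    state, total_reward = 5, 0
    for t in range(20):
        action = random.choice([-1, 1])       # placeholder policy; a real RL
        state, reward = step(state, action)   # agent would learn from reward
        total_reward += reward

    print("total reward:", total_reward)
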
1 vote, 2 answers

Can an animal-level artificial general intelligence kickstart the Singularity?

Most people seem to assume that we need a human-level AI as a starting point for the Singularity. Let's say someone invents a general intelligence that is not quite on the scale of a human brain, but comparable to a rat. This AI can think on its…
asked by Anonymous