Intelligence is the ability to accomplish complex goals. This definition leaves out bias and looks objectively at the many forms intelligence can take.
Computers already outperform humans in narrow tasks such as arithmetic, chess, and heavy lifting.
AI will only reach human-level intelligence when it becomes general enough to perform as well as humans across as many tasks as humans can. Hence the term artificial general intelligence.
We can view the domain of human abilities as a landscape gradually being flooded by the rising waters of AI. The valleys are already submerged, but mountains like art, science, and programming still remain well above the waterline.
Humans have used external sources of memory for centuries. Every writing system is a form of memory.
The simplest form of memory to build is binary memory, where every piece of information is either on or off.
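The idea can be sketched in a few lines: any message can be stored as nothing but on/off states and recovered later. The sketch below assumes 8 bits per character (standard ASCII); the helper names are mine, not from the text.

```python
def to_bits(text):
    """Store each character as its 8-bit on/off pattern."""
    return [format(ord(ch), "08b") for ch in text]

def from_bits(bits):
    """Recover the characters from their 8-bit patterns."""
    return "".join(chr(int(b, 2)) for b in bits)

stored = to_bits("AI")
print(stored)             # ['01000001', '01001001']
print(from_bits(stored))  # AI
```

Everything a computer stores, from text to video, ultimately reduces to patterns like these.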
Computer memory is roughly 10 trillion times cheaper now than when the first computers were invented. Most computers today can already store more data than the human brain, and the divide will only widen.
Functions are a building block of all computation: they take an input and produce an output.
A NAND gate is a simple function that takes two inputs. If both are 1 it outputs 0; otherwise it outputs 1. Enough NAND gates can represent any computation, which is exactly how modern processors are built.
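A small sketch of why NAND is enough: every basic logic gate can be wired up from NAND alone. The function names are mine for illustration.

```python
def nand(a, b):
    """The one primitive: 0 only when both inputs are 1."""
    return 0 if (a == 1 and b == 1) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    # The classic four-NAND construction of XOR.
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Verify each derived gate against its truth table.
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor_(a, b) == (a ^ b)
print("all gates match their truth tables")
```

Since AND, OR, and NOT suffice to express any Boolean function, and all three fall out of NAND, chaining enough NAND gates can compute anything a processor can.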
Much like waves don't care about the medium they travel through (as long as they have one), computation and intelligence don't care whether they arise from biological matter or silicon.
Moore's Law states that the number of transistors on a chip doubles about every two years. It has held so far, but we are approaching the physical limits of the current transistor design. That doesn't mean the growth has to stop: just as transistors replaced vacuum tubes, we will find another technology to replace transistors.
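The doubling claim is easy to make concrete with back-of-the-envelope arithmetic. The sketch below assumes a commonly cited starting point, the Intel 4004's roughly 2,300 transistors in 1971, purely for illustration.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project transistor count, assuming a doubling every `doubling_years`."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Twenty years is ten doublings: about a thousandfold growth.
for year in (1971, 1991, 2021):
    print(year, f"{transistors(year):,.0f}")
```

Fifty years of doublings projects tens of billions of transistors per chip, which is the right order of magnitude for modern processors, even if no single technology sustained the trend the whole way.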