It has been over 50 years since 1965, when Gordon Moore wrote a paper describing how the number of transistors on integrated circuit chips was doubling every year, and projecting that this growth would continue for decades. The trend graph shows the transistor count rising from about 1,000 to 20,000,000,000 over that period. Moore’s prediction held for several decades, and the semiconductor industry has used it for long-term planning and to set targets for research and development.
> “I see Moore’s law dying here in the next decade or so.” – Gordon Moore, 2015
The rate of progress in transistor counts is approaching saturation because of physical limits. The current process node is 14 nanometers, and it could hit a practical limit of about 5 nanometers by 2020. Does this mean that saturation in semiconductors will halt the development of the computer era? Not necessarily! Development will continue in these ways:
- Specialized chips: GPUs and TPUs tailored to the workload and device type, plus hyperscaling of chips to support increased demand.
- Better algorithms and software: artificial intelligence and other advances.
- Quantum computing: fundamentally different from classical computers.
Quantum chips and quantum computing are fundamentally different from everything we have done so far with classical computers. Quantum computers cannot be efficiently simulated on classical computers, so engineers are trying to harness the real quantum effects of particles to build new quantum machines. This requires creating a closed environment in which quantum effects and their behaviour can be controlled and measured. Why do engineers try to implement computations on smaller base elements?
- Energy efficiency: smaller elements consume less energy.
- Information capacity: we can pack more elements into our devices if they are smaller. The information capacity of a system can be found with Shannon’s formula.
- Speed: smaller elements have smaller inertia (inductance), so they can switch faster and deliver greater speed.
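The “Shannon’s formula” mentioned above presumably refers to the Shannon–Hartley channel capacity, C = B log2(1 + S/N), which bounds how much information a channel of bandwidth B can carry at a given signal-to-noise ratio. A minimal sketch (the channel parameters are made up for illustration):

```python
import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# A hypothetical 1 MHz channel with a signal-to-noise ratio of 1000 (30 dB)
capacity = shannon_capacity(1e6, 1000, 1)
print(capacity)  # roughly 9.97 million bits per second
```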
Relation to computational complexity theory
Current capability of classical computers
| Problem Type | Verifiable in P time | Solvable in P time |
|--------------|----------------------|--------------------|
| P            | Yes                  | Yes                |
| NP           | Yes                  | Yes or No          |
| NP-Complete  | Yes                  | Unknown            |
| NP-Hard      | Yes or No*           | Unknown            |
P is a complexity class that represents the set of all decision problems (Yes/No) that can be solved in polynomial time. That is, given an instance of the problem, the answer yes or no can be decided in polynomial time.
E.g. Given a connected graph G, can its vertices be coloured using two colours so that no edge is monochromatic?
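The two-colouring question above is exactly a bipartiteness check, which breadth-first search answers in polynomial (in fact linear) time. A minimal sketch, with hypothetical graphs given as adjacency lists:

```python
from collections import deque

def two_colourable(graph):
    """Return True if the connected graph (dict: vertex -> neighbours)
    can be coloured with two colours so that no edge is monochromatic."""
    start = next(iter(graph))
    colour = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in colour:
                colour[v] = 1 - colour[u]   # give the neighbour the other colour
                queue.append(v)
            elif colour[v] == colour[u]:
                return False                # odd cycle: not two-colourable
    return True

square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # even cycle
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}            # odd cycle
print(two_colourable(square), two_colourable(triangle))  # True False
```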
NP is a complexity class that represents the set of all decision problems for which the instances where the answer is “yes” have proofs that can be verified in polynomial time.
E.g. Given integers n and m, is there an integer f with 1 < f < m, such that f divides n (f is a factor of n smaller than m)?
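This factoring question illustrates the NP definition nicely: verifying a proposed factor f takes a single division, while finding one may require trying many candidates. A sketch of both sides:

```python
def has_small_factor(n, m):
    """Decision problem: is there an integer f with 1 < f < m dividing n?
    Solving it naively means trying every candidate."""
    return any(n % f == 0 for f in range(2, m))

def verify(n, f, m):
    """Verifying a claimed certificate f is one range check and one modulo."""
    return 1 < f < m and n % f == 0

print(has_small_factor(91, 10))  # True (7 divides 91)
print(verify(91, 7, 10))         # True
```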
NP-Complete is a complexity class which represents the set of all problems X in NP for which it is possible to reduce any other NP problem Y to X in polynomial time.
E.g. Travelling salesman problem: a salesman must travel between N cities. He does not care about the order in which he visits them, as long as he visits each city once during his trip and finishes where he started.
So far, classical computers have failed to solve NP-Complete problems in polynomial time; TSP becomes intractable as the number of cities increases, because the number of possible tours grows factorially. Quantum computers are hoped to make some of these problems tractable. Solving them efficiently could change the world as we know it, since the same approach would apply to similar optimization problems everywhere, e.g. gene analysis to find specific diseases and drugs. The class of decision problems that quantum computers can solve efficiently (in polynomial time, with bounded error) is called BQP.
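Brute-force TSP makes that factorial growth concrete: with N cities there are (N-1)! tours to check. A minimal sketch with a made-up distance matrix:

```python
from itertools import permutations

def shortest_tour(dist):
    """Brute-force TSP: try every ordering of the remaining cities,
    starting and ending at city 0. Checks (N-1)! tours for N cities."""
    n = len(dist)
    best = None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if best is None or length < best[0]:
            best = (length, tour)
    return best

# Hypothetical symmetric distances between 4 cities
dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(shortest_tour(dist))  # (18, (0, 1, 3, 2, 0))
```

Adding a fifth city multiplies the number of tours by four, a sixth by five, and so on, which is why this approach collapses quickly.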
Grover’s algorithm is a quantum algorithm used to search through an unstructured dataset. It can be applied to a specific class of NP-complete problems called 3-SAT problems. We will discuss such quantum algorithms in detail in the next blog post.
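As a small teaser, Grover’s “flip the marked sign, then invert about the mean” iteration can be simulated classically for tiny inputs (the simulation needs space proportional to the number of items, which is exactly why it does not scale). A sketch over 16 items, assuming a single marked item:

```python
import math

def grover_probability(n, marked, iterations):
    """Simulate Grover's iteration on n items with one marked index.
    All amplitudes stay real here, so plain floats suffice."""
    amp = [1 / math.sqrt(n)] * n           # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]         # oracle: flip the marked sign
        mean = sum(amp) / n
        amp = [2 * mean - a for a in amp]  # diffusion: invert about the mean
    return amp[marked] ** 2                # probability of measuring marked

n = 16
optimal = math.floor(math.pi / 4 * math.sqrt(n))  # about sqrt(n) queries
print(optimal, grover_probability(n, marked=3, iterations=optimal))
# 3 iterations already push the success probability above 0.96,
# versus 1/16 for a single classical random guess
```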