This is a daily column written by Lowell Heddings, the founder and owner of How-To Geek. If you prefer, you can read this column in a web browser instead.
Smaller, Faster, Cheaper, Over: The Future of Computer Chips

In recent years, however, the acceleration predicted by Moore's Law has slipped. Chip speeds stopped increasing almost a decade ago, the time between new generations is stretching out, and the cost of individual transistors has plateaued. Technologists now believe that new generations of chips will come more slowly, perhaps every two and a half to three years. And by the middle of the next decade, they fear, there could be a reckoning, when the laws of physics dictate that transistors, by then composed of just a handful of molecules, will not function reliably. Then Moore's Law will come to an end, unless a new technological breakthrough occurs.

We're quickly reaching the end of Moore's Law, the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit has doubled approximately every two years. Intel and others have already had trouble reliably shrinking their chips much further, which has introduced long delays in their desktop and server processor lines. They continue to reinvent their designs to increase speed, but we aren't getting the dramatic improvements we used to.

In the mobile sector, however, processors are still getting faster at an incredible rate. The iPhone 6s is nearly twice as fast as the previous model, and some independent tests have shown it matching a 2012-era laptop on certain JavaScript benchmarks. Some of that is better software, but a lot of it is hardware. We aren't very far from smartphones that are just as fast as the laptops we carry around.

One way processors can continue to improve even after the end of Moore's Law is by taking functions that used to be carried out in software and doing the processing in dedicated hardware instead. Bitcoin mining, for instance, is now dominated by custom ASIC (application-specific integrated circuit) machines built specifically for that arduous task. And one of the advantages Apple has with the iPhone is that it designs its own chips, so over time it has built dedicated circuits for things like image processing and motion sensing, which now happen nearly instantly in hardware instead of in software running on a general-purpose CPU. There's a lot of room for improvement in this area, and it will be interesting to see how things shake out.
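For a sense of what a mining ASIC actually hard-wires, here is a minimal Python sketch of Bitcoin's double SHA-256 proof-of-work loop. The mine helper and its leading-zero-bytes difficulty rule are simplifications invented for illustration; real mining compares the hash against a numeric target. A general-purpose CPU runs this loop one hash at a time, while an ASIC does nothing else and runs it billions of times per second in silicon.

    import hashlib

    def double_sha256(data: bytes) -> bytes:
        # Bitcoin's proof-of-work hash: SHA-256 applied twice.
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    def mine(header: bytes, zero_bytes: int = 2) -> int:
        # Try nonces until the hash starts with enough zero bytes
        # (a toy difficulty rule standing in for the real target check).
        nonce = 0
        target_prefix = b"\x00" * zero_bytes
        while True:
            candidate = header + nonce.to_bytes(8, "little")
            if double_sha256(candidate).startswith(target_prefix):
                return nonce
            nonce += 1

    print("found nonce:", mine(b"example block header"))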
Google and NASA are getting a new quantum computer

The famous Quantum Artificial Intelligence Lab is getting some powerful new hardware. A joint project between Google, NASA, and the Universities Space Research Association, the Quantum AI Lab today announced a multiyear agreement to install a D-Wave 2X, a state-of-the-art quantum processor released earlier this year. With over 1,000 qubits, the machine is the most powerful computer of its kind, and will be put to work tackling difficult optimization problems for both Google and NASA. The fragility of the qubits also means the computer's processor can only operate at extremely cold temperatures. The 2X's standard operating temperature is less than 15 millikelvin, far colder than outer space.

Normal computers work by processing everything as either on or off: ones and zeros. Quantum computers work with on, off, or "maybe." This in-between state, like Schrödinger's cat, could be either 1 or 0, but nobody knows until the value is checked, at which point it becomes definitely one or the other. Before the value is checked, it could be either.

The easier way to think about it is as an analog volume knob that you can turn anywhere between 0 and 1. If you're somewhere in the middle, you're at maybe 0.5 or 0.6. The catch is that when you read out the result of a calculation, you can only round up or round down, so the answer is either 0 or 1. But if you can design an algorithm that works on the in-between states without checking the values along the way (like adding two 0.3 values together to get 0.6, which rounds to 1), it could perform calculations that would be impossible with today's computers. Theoretically. The sketch below plays out this analogy in code.

Perhaps the most interesting thing so far is that even with hundreds of qubits (the individual on/off/maybe switches a quantum computer uses for processing), nobody has yet been able to prove that quantum computers actually work differently than a regular computer. It's also worth noting that quantum computers require temperatures near absolute zero, which means you're probably never going to have a smartphone powered by a quantum computer.
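To make the volume-knob analogy concrete, here is a toy Python sketch. It is purely an illustration of the analogy above, not real quantum mechanics: actual qubits are complex amplitudes, and checking one is probabilistic rather than a simple rounding. The MaybeBit class and its methods are hypothetical names invented for this example.

    class MaybeBit:
        # A dial somewhere between 0 and 1, per the analogy in the text.

        def __init__(self, level: float):
            # Clamp to the dial's range.
            self.level = min(max(level, 0.0), 1.0)

        def combine(self, other: "MaybeBit") -> "MaybeBit":
            # Operate on in-between values without checking them,
            # e.g. 0.3 + 0.3 = 0.6.
            return MaybeBit(self.level + other.level)

        def check(self) -> int:
            # Checking forces a definite answer: round up or down.
            return 1 if self.level >= 0.5 else 0

    a = MaybeBit(0.3)
    b = MaybeBit(0.3)
    c = a.combine(b)    # still "maybe" at 0.6; nothing checked yet
    print(c.check())    # the check rounds 0.6 up, so this prints 1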