

Friday, October 20, 2017

The Future of AI and Big Data with Quantum Computing

With the boom in digital technologies, the world now produces over 2.5 exabytes of data every day. To put that into perspective, it is roughly the combined storage of 5 million laptops or 150 million phones. This deluge of data is forecast to keep growing by the day, and with it grows the need for powerful hardware that can support it.
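As a rough sanity check on those equivalences, here is a minimal arithmetic sketch in Python; the per-device capacities it prints (about 500 GB per laptop and 17 GB per phone) are implied by the figures above, not stated in the article:

```python
# Back-of-the-envelope check: 2.5 exabytes per day vs. device storage.
daily_bytes = 2.5e18                 # 2.5 exabytes

laptops = 5e6                        # "5 million laptops"
phones = 150e6                       # "150 million phones"

print(daily_bytes / laptops / 1e9)   # ~500 GB implied per laptop
print(daily_bytes / phones / 1e9)    # ~16.7 GB implied per phone
```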

Hardware advancement here means faster processing and larger storage systems. Companies worldwide are investing in high-performance computing, with R&D teams in a constant race to build better processors. The current stream of data demands computers that can perform complex calculations within seconds.

Big data and machine learning have pushed current IT infrastructure to its limits when processing large datasets. This has led to the development of a new and exciting paradigm, quantum computing, which has the potential to increase processing speed dramatically. But before getting there, let us understand the current technology and why quantum technology is needed.

Current Computing Technology and Its Limitations
Processing technology has come a long way in the past few decades with the development of fingernail-sized microprocessors (single-chip computers packed with millions of transistors) called integrated circuits. Holding true to Moore's law, the number of transistors packed into a single chip has doubled roughly every 18 months to two years for the past 50 years. Today, it has reached about 2 billion transistors on one chip.
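To see what that doubling implies, here is a small sketch that works backwards from the 2-billion figure above; the roughly 2,300-transistor Intel 4004 of 1971 is an assumed baseline for illustration, not something the article cites:

```python
import math

# How many doublings take a chip from the Intel 4004's ~2,300 transistors
# (1971, assumed baseline) to the ~2 billion cited above, and what
# doubling cadence over the ~46 intervening years does that imply?
start, today = 2_300, 2e9
doublings = math.log2(today / start)
print(f"{doublings:.1f} doublings")                      # ~19.7
print(f"~{(2017 - 1971) / doublings:.1f} yrs/doubling")  # ~2.3 years
```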

Semiconductor technology is now producing chips with gates as small as 5 nanometers, below which it is said transistors will stop working reliably. The industry has therefore started increasing the number of processor "cores" so that performance can keep pace with Moore's law predictions. However, many software-level constraints limit how far this approach can go.

In 2016, two researchers at Lawrence Berkeley National Laboratory created the world's smallest transistor, with a gate size of one nanometer. This is a phenomenal feat for the computing industry, but building a chip with billions of such transistors will face many challenges. The industry has already prepared for transistors to stop shrinking, and Moore's law is likely to grind to a halt.

As the computations behind current applications like big data processing and intelligent systems grow more complex, they demand higher and faster computing capability than current processors can supply. This is one of the reasons people are looking to quantum computing.

What Is Quantum Computing?
Quantum computing merges two great scientific revolutions of the past century: computer science and quantum physics. It has counterparts to all the elements of conventional computing (bits, registers, gates, and so on), but at the machine level it does not rely on Boolean logic. Quantum bits are called qubits. A conventional bit stores either 0 or 1, whereas a qubit can exist in a superposition, a weighted combination of 0 and 1 at the same time. Because it can store these values simultaneously, it can also process them simultaneously, working on many possibilities in parallel, which for certain problems can make it dramatically faster than today's computers.
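To make the superposition idea concrete, here is a tiny classical simulation of a single qubit using a state vector and a Hadamard gate; this numpy toy is purely illustrative and nothing like real quantum hardware:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the definite |0> state

# Hadamard gate: sends |0> into an equal superposition of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0                           # state (|0> + |1>) / sqrt(2)
print(np.abs(psi) ** 2)                  # [0.5 0.5]: 0 and 1 equally likely
```

A register of n qubits is described by 2^n such amplitudes at once, which is the sense in which a quantum register "holds" many values simultaneously.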

The inner workings of these computers are complex, and the entire field of quantum computing is still largely abstract and theoretical. The essential point is that qubits are stored in atoms or other particles, such as ions, that can exist in different states and be switched between them.

Application in Big Data
Progress in these fields critically relies on processing power, and the computational requirements of big data analytics are placing considerable strain on computer systems. Since around 2005, the focus has shifted to parallelism, using multiple cores instead of a single fast processor. However, many big data problems cannot be solved simply by adding more and more cores: splitting work among multiple processors is complex to implement, and some problems must be solved sequentially, with each step depending on the one before it, as the sketch below illustrates.
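The contrast between work that parallelises and work that does not can be shown in a few lines; this toy example is illustrative and not from the article:

```python
from concurrent.futures import ProcessPoolExecutor

def square(x):
    return x * x

def parallel_squares(xs):
    # Each element is independent, so extra cores genuinely help.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(square, xs))

def running_hash(xs, state=0):
    # Each step needs the previous result, so extra cores do not help:
    # this is the inherently sequential class of problem described above.
    out = []
    for x in xs:
        state = state * 31 + x
        out.append(state)
    return out

if __name__ == "__main__":
    print(parallel_squares(range(5)))   # [0, 1, 4, 9, 16]
    print(running_hash(range(5)))       # [0, 1, 33, 1026, 31810]
```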

At the Large Hadron Collider (LHC) at CERN, Geneva, particles travel at almost the speed of light around a 27 km ring, producing some 600 million collisions per second, of which only about one in a million is ultimately kept. In the preselection stage, only 1 out of every 10,000 events is passed to a grid of processor cores, which then keeps roughly 1 out of 100 of those, with the surviving data processed at about 10 GB/s. The LHC captures 5 trillion bits of data every second, and even after discarding over 99% of it, it still analyses some 25 petabytes of data a year!
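The quoted filter stages compose neatly, as a quick check shows (all figures taken from the paragraph above):

```python
# Composing the LHC trigger stages quoted above.
collisions_per_s = 600e6
preselect_keep = 1 / 10_000      # hardware preselection
farm_keep = 1 / 100              # processor-core grid

overall = preselect_keep * farm_keep
print(overall)                           # 1e-06: one in a million overall
print(collisions_per_s * overall)        # ~600 events kept per second
```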

Data at this scale shows what future computing must handle, but current resources make the application of quantum computing to big data a thing of the future. Once practical, it would be useful for specific tasks: factoring the large numbers used in cryptography, weather forecasting, and searching through large unstructured datasets in a fraction of the time to identify patterns and anomalies. Indeed, developments in quantum computing could render today's encryption obsolete almost overnight.
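The unstructured-search claim refers to Grover's algorithm, which finds a marked item among N with roughly sqrt(N) queries instead of N. Below is a minimal classical simulation for N = 4 items, where one Grover iteration already suffices; the marked index is a made-up example:

```python
import numpy as np

N, marked = 4, 2                        # hypothetical: item 2 is the target
state = np.full(N, 1 / np.sqrt(N))      # uniform superposition over all items

oracle = np.eye(N)
oracle[marked, marked] = -1             # oracle: flip the marked item's phase

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)  # inversion about the mean

state = diffusion @ (oracle @ state)    # one Grover iteration
print(np.abs(state) ** 2)               # probability ~1.0 on index 2
```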
With such computing power, it could one day be possible to build datasets holding complete information, for example the genetics of every human who has ever lived, and machine learning algorithms could find patterns across those characteristics while still protecting individual identities. Clustering and classification of data would also become far faster tasks.

Looking Forward
The initial results and developments in quantum technologies are encouraging. In the last fifteen years, quantum computers have grown from 4 qubits to 128 qubits. Google's 5-qubit computer has demonstrated certain basic calculations that, if scaled up, could handle the far more complex calculations that would make the quantum computing dream come true. However, we are unlikely to see such computers for years, or even decades.

Looking ahead, quantum computers promise faster analysis and integration of our enormous datasets, which will improve and transform our machine learning and artificial intelligence capabilities.

View at the original source
