Everyone Talks About Quantum Computing: What About Optical Computing?



Electricity powers computers. It keeps them running and carries data between components through circuits, cables, and other conductive elements.

But a branch of computing is looking for an alternative: replacing electricity with light. Optical computing promises a future with machines that consume little electricity and offer greater processing capacity.

But while many talk about quantum computing, few mention optical computing, mainly because it is still at an early stage: more promises than facts, for the moment. Then again, every new technology started that way. So what does optical computing aspire to?

Electrons vs. photons

Traditional computing, the kind we use every day, relies on electrons to carry data, instructions, and information between the components of a computer. Optical computing, by contrast, is committed to light, that is, to photons as the carriers of information.

To get an idea of the scales involved, individual electrons drift through a copper wire at speeds on the order of one millimeter per second (it is the electrical signal they carry that propagates quickly). Light, for its part, travels at almost 300,000 kilometers per second in a vacuum.

The exact figure depends on the medium the light travels through, but in any common medium it remains enormously fast. And that is where the promise of optical computers lies: that speed translates into processing more data in less time.
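As a back-of-the-envelope illustration, the figures above can be put side by side. The constants are approximate, and the fiber refractive index of 1.468 is a typical value for silica glass, not a universal one:

```python
# Back-of-the-envelope propagation distances (illustrative constants).
C_VACUUM = 299_792_458           # speed of light in vacuum, m/s
C_FIBER = C_VACUUM / 1.468       # light in silica fiber (refractive index ~1.468)
ELECTRON_DRIFT = 1e-3            # typical electron drift speed in copper, ~1 mm/s

# Distance covered in one second, expressed in kilometers.
print(f"Light in vacuum: {C_VACUUM / 1000:,.0f} km")
print(f"Light in fiber:  {C_FIBER / 1000:,.0f} km")
print(f"Electron drift:  {ELECTRON_DRIFT / 1000} km (about one millimeter)")
```

Even slowed by glass, light covers roughly 200,000 km in the time an electron drifts a single millimeter.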

Making an optical computer possible involves elements and materials such as fiber-optic cables and holographic memories. A design can be purely optical or a hybrid of electronics and optics, in which electronic peripherals are combined with optical elements.

In a hybrid design, the binary code must be translated into pulses of light by lasers. In a purely optical computer, by contrast, information is carried at every stage by beams of light, in packets and waves.
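As a rough sketch of that translation step, here is a toy on-off-keying encoder in Python: each bit of the binary code becomes one pulse, 1 for laser on and 0 for laser off. The function names and the scheme itself are illustrative; real optical links use far more sophisticated modulation:

```python
# Toy on-off keying: each bit becomes a light pulse (1 = laser on, 0 = off).
# A sketch of the idea, not how real optical hardware is driven.

def to_pulses(text: str) -> list[int]:
    """Translate text into a pulse train, one pulse per bit, MSB first."""
    return [(byte >> i) & 1
            for byte in text.encode("utf-8")
            for i in range(7, -1, -1)]

def from_pulses(pulses: list[int]) -> str:
    """Recover the original text from the pulse train."""
    data = bytearray()
    for i in range(0, len(pulses), 8):
        byte = 0
        for bit in pulses[i:i + 8]:
            byte = (byte << 1) | bit
        data.append(byte)
    return data.decode("utf-8")

pulses = to_pulses("hi")
print(pulses)               # 16 pulses for the two bytes of "hi"
print(from_pulses(pulses))  # round-trips back to "hi"
```

The round trip shows the principle: whether the medium is copper or light, the underlying alphabet is still binary.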

Optical Computing and Moore’s Law

Optical computing, like quantum computing, is positioned as an alternative to current technology, which is governed by transistors. Today's microprocessors, CPUs and GPUs alike, are built from transistors that switch the electrical signals carrying binary code.

Transistors emerged at Bell Laboratories in late 1947. Since then, they have shrunk to nanometer scale. The objective: to pack the maximum number of transistors into a processor in the least possible space. Miniaturization, in other words, to get more processing power out of a smaller and smaller element.

From this process of shrinking transistors comes what is known as Moore's Law. Formulated in 1965 by Gordon Moore, co-founder of Intel, it originally observed that the number of transistors in a microprocessor doubled every year, a pace Moore revised in 1975 to every two years: more power in the same space.
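Moore's observation is simply exponential growth, which makes it easy to sketch. The starting figure below, Intel's 4004 with roughly 2,300 transistors in 1971, is an illustrative choice, not something from the article:

```python
# Moore's law as plain exponential growth: count doubles every `doubling_years`.
# Starting point is illustrative: Intel's 4004 (1971), roughly 2,300 transistors.

def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_years: float = 2.0) -> float:
    """Project a transistor count assuming one doubling every `doubling_years`."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Ten years on, five doublings: 2,300 * 2**5.
print(f"{projected_transistors(2_300, 1971, 1981):,.0f}")  # → 73,600
```

Five doublings in a decade already gives a thirty-two-fold increase, which is why the law held such weight for so long.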

But Moore's Law is now widely considered a thing of the past. On the one hand, transistor scaling no longer keeps to the law's pace; on the other, there is a physical limit below which transistors simply cannot shrink.

From communication to computing

Optical technology already exists among us. The main example is the kilometers upon kilometers of fiber-optic cable scattered throughout the world; today's communications largely rest on this technology, alongside satellites. Data storage also owes much to optics, from the CD and DVD to Blu-ray and other less popular variants. Information processing, then, is the next barrier to overcome.

However, there are limitations to making this possible. For now, the materials used to conduct light deform more readily than their electronic counterparts. In addition, the energy required to drive light with high-power lasers makes the resulting computer uneconomical.

Last year, the Wyss Institute at Harvard University announced the development of a new material that uses hydrogel and low-power lasers to change the refractive index of non-linear materials, those used to conduct light in these types of computers.

In practice, this means materials can be designed to respond to light by changing their optical, chemical, and physical properties, making it easier to guide photons. Also participating in the research were McMaster University and the University of Pittsburgh, as well as the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS).

For its part, a joint British-Russian study by the University of Cambridge and the Skolkovo Institute of Science and Technology, announced this year, promises to advance optical computing by optimizing how light is transmitted. The idea is to multiply the number of light waves sent: instead of simply converting the classic binary signal into light waves, it goes further, combining beams by multiplying their wave functions rather than adding them.

The future of optical computing is yet to be written. But although it does not receive as much attention as quantum computing, little by little, it is making its way and offering increasingly real improvements and results.

Sabarinath is the founder and chief-editor of Code and Hack. With an unwavering passion for all things futuristic tech, open source, and coding, he delves into the world of emerging technologies and shares his expertise through captivating articles and in-depth guides. Sabarinath's unique ability to simplify complex concepts makes his writing accessible and engaging for coding newbies, empowering them to embark on their coding journey with confidence. With a wealth of knowledge and experience, Sabarinath is dedicated to providing valuable insights, staying at the forefront of technological advancements, and inspiring readers to explore the limitless possibilities of the digital realm.
