Reversible Computing and Entropy

LaGuardia Community College Thermodynamics

Final Project

Reversible Computing and Entropy June 9, 2015

Authors: Kamal Ouanaim Instructor: Dr. Yves Ngabonziza

Reversible Computing and Entropy Kamal Ouanaim Electrical Engineering Department LaGuardia Community College [email protected]

Abstract

Reversible computing is a computational model based on the physical reversibility of the elementary logic gates and the logical reversibility of the operations of the computation. Together, physical and logical reversibility provide an environment for reversible computation in which information is conserved and the energy dissipated as entropy is kept as low as possible. Even if the entropy cannot be reduced to zero, it should at least be proportional to the number of inputs and outputs of the circuit, rather than to the number of logic gates in the circuit as in conventional designs. This paper intends to give a comprehensive picture of the concepts and theories developed around reversible computing to date. It presents basic definitions of the elements affecting the computational process and compares the model to conventional models of computation.

Keywords: Reversible Computing, Shannon Entropy, Thermodynamics of computation, physical reversibility, logical reversibility, Landauer’s principle.

1. Introduction

To understand the relationship between reversible computing and entropy, which is a thermodynamic quantity, we first need to define a few terms: thermodynamics itself, reversible versus irreversible processes, and entropy. On the computational side, we will list and define the underlying hardware logic gates and the basics of binary information flow.

Thermodynamics [1] combines two terms: thermo, meaning heat, and dynamics, meaning flow; thus thermodynamics is the science that studies the flow of heat and the work generated as a result. Thermodynamics is concerned only with the macroscopic transformation of matter from one state of equilibrium to another. Each equilibrium state is defined by properties such as temperature, pressure, volume, mass, and number of moles, which can be related in a so-called equation of state. For an ideal gas this is

PV = nRT [2],

where P is the pressure, V is the volume, n is the number of moles, R is the gas constant, and T is the temperature. The transformation or evolution of matter from one equilibrium state to another can occur in two different ways: reversibly or irreversibly. In a reversible process the system remains in equilibrium throughout, so the properties that define its state are well defined at every instant, and the process yields the maximum work output. Because the state is defined at every point of the process, the timeline can be reversed at any moment and the process run backwards to its original equilibrium state without additional energy. An irreversible process, on the other hand, happens quickly, the thermodynamic properties are not conserved along the way, and returning to the original equilibrium state requires additional energy.
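As a small numerical illustration of the equation of state, the sketch below evaluates PV = nRT for one mole of an ideal gas; the chosen values (room temperature, a volume of about 24.8 liters) are illustrative assumptions, not taken from the paper.

```python
# Ideal-gas equation of state PV = nRT, solved for pressure.
# Illustrative values only: 1 mol of gas at room temperature.

R = 8.314  # universal gas constant, J/(mol*K)

def pressure(n_moles: float, temp_k: float, volume_m3: float) -> float:
    """Pressure in pascals from the ideal-gas law: P = nRT / V."""
    return n_moles * R * temp_k / volume_m3

p = pressure(n_moles=1.0, temp_k=298.15, volume_m3=0.0248)
print(f"P = {p:.0f} Pa")  # roughly atmospheric pressure (~100 kPa)
```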

One may ask why things rust, decay, fall apart, and die. Anything and everything is made of atoms, and atoms obey the second law of thermodynamics. All these things follow spontaneous, irreversible processes that take them from an ordered state to a disordered one, and once the damage is done it is extremely difficult to reverse things and undo the process. The quantity that tracks this tendency is called entropy, and the statement that it never decreases in an isolated system is the second law of thermodynamics. In thermodynamics, entropy is defined as a measure of the disorder incurred while moving from one state to another. It can also be understood through the energy made unavailable to do work: heat dissipated to the surroundings increases the disorder of the surrounding microparticles, and the associated entropy change is measured in joules per kelvin (J/K).
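For heat exchanged at a constant temperature, the entropy change takes the simple form dS = Q/T. The snippet below evaluates it for an arbitrarily chosen example (1000 J dissipated into surroundings at 300 K); the numbers are assumptions for illustration only.

```python
# Entropy change for heat Q exchanged at constant temperature T:
#   dS = Q / T   (units: J/K, not joules)

def entropy_change(heat_j: float, temp_k: float) -> float:
    """Entropy change in J/K for heat_j joules exchanged at temp_k kelvin."""
    return heat_j / temp_k

# Example: 1000 J of heat dissipated into a room at 300 K.
print(entropy_change(1000.0, 300.0))  # ~3.33 J/K
```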

Reversible computing is a model in which the computational process is reversible, meaning that at any given point of the process the timeline can be reversed and the computation continued backwards [3]. In such an environment the computation can run forward and backward without needing the history of inputs, because at any state of the computation both the inputs and the outputs can be determined from the direction of the computation. This model is used in quantum circuits, where the transition from one step of the computation to the next is a unitary evolution; in other words, the transition function is a linear bijection. To understand the importance and advantages of reversible computing it is necessary to compare it to the traditional model, in which the computational process is irreversible. The classical computational model is irreversible and therefore loses data constantly. Most, if not all, of today's computational models used in computers are irreversible, which explains the heat dissipated by such computers. The amount of energy dissipated as heat in these classical models can be calculated from the increase in entropy between the beginning and the end of the computation. As the computation progresses from one logic gate to another, the entropy increases due to the irreversibility of these logic gates. The number of output pins on an irreversible logic gate is usually smaller than the number of input pins, which implies deletion of some of the input information, and thus an increase in entropy. Von Neumann (1966) and Brillouin (1962) came to a similar conclusion: a computer operating at temperature T must dissipate at least kT ln 2 (about 3 × 10^-21 J at room temperature) per bit {0, 1} of information (Charles H. Bennett, The Thermodynamics of Computation – a Review, 1981) [4], where the corresponding change in entropy is dS = k ln 2 and k is the Boltzmann constant (k = 1.3806488 × 10^-23 J/K). In this case the increase in entropy, ΔS = N k ln 2, is proportional to the number N of logic gates within the circuit.
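The per-bit dissipation bound discussed above (k T ln 2, often called the Landauer limit) is easy to evaluate numerically. The sketch below does so at an assumed room temperature of 300 K.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_k: float) -> float:
    """Minimum heat dissipated per erased bit at temperature temp_k: kT ln 2."""
    return K_B * temp_k * math.log(2)

e_bit = landauer_limit(300.0)
print(f"{e_bit:.3e} J per bit")  # ~2.9e-21 J at room temperature
```

The tiny value explains why this limit went unnoticed in practice: real gates dissipate many orders of magnitude more energy per operation than kT ln 2.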

2. Related Work

A. Shannon Entropy [5]

Claude Shannon created information theory to solve communication problems, more specifically how to transmit reliably and effectively over an unreliable channel. Shannon realized that any type of information could be represented as a signal and measured in bits per second, which he introduced in his paper "The Mathematical Theory of Communication" in the 1940s. In that paper Shannon defined the entropy H as follows:

H(X) = −Σ_{x∈Ω} P(x) log_b P(x) = E[−log_b P(X)] [6],

where X is a discrete random variable drawn from a distribution over Ω with mass function P(X). Shannon entropy can thus also be defined as the expected value E[−log_b P(X)] for a discrete random variable X.

In the case of a binary logic gate there are two possible outcomes, {0, 1}: 0 when no information is transmitted and 1 when information is received. Thus the base of the logarithm in the above equation is b = 2 and the set of possible outcomes is X = {0, 1}.
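Shannon's definition translates directly into a few lines of code. The sketch below computes H for a distribution given as a list of probabilities, using base 2 as in the binary case above.

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum over outcomes of p * log_b(p), skipping zero-probability terms."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Fair binary source {0, 1}: exactly one bit of entropy per symbol.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased source carries less information per symbol:
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```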

B. Signal Entropy and the Thermodynamics of Computation [7]

N. Gershenfeld, in his paper "Signal Entropy and the Thermodynamics of Computation" (IBM Systems Journal, Vol. 35, Nos. 3&4, 1996), presented the fundamental thermodynamic limitations at the hardware level that are intrinsic to the computational process. He starts from the first law of thermodynamics,

dU = dW + dQ,

where dU is the change in internal energy, dW is the work done on the system, and dQ is the heat exchanged, the energy unavailable to do work (dissipated heat). At constant pressure and temperature T,

dQ = T dS,

where dS is the entropy change (the heat irreversibly exchanged with the surroundings, divided by T). Gershenfeld integrated the first law, as presented above, and solved for the recoverable work to obtain the free energy

F = U − TS,

"which measures the fraction of the total internal energy that can reversibly be recovered to do work. Creating a bit by raising its potential energy U (as is done in charging a capacitor) stores work that remains available in the free energy of the bit and that can be recovered. Erasing a bit consumes this free energy (which charge recovery logic seeks to save) and also changes the logical entropy dS of the system; hence it is associated with dissipation (which is reduced in reversible logic)." [Quoting N. Gershenfeld, "Signal Entropy and the Thermodynamics of Computation," IBM Systems Journal, Vol. 35, Nos. 3&4, 1996] [7].
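The free-energy relation F = U − TS can be made concrete with a short numerical sketch; the values below are illustrative assumptions, not figures from Gershenfeld's paper.

```python
# Helmholtz free energy F = U - T*S: the part of the internal energy
# that can reversibly be recovered as work at temperature T.
# Illustrative numbers only.

def free_energy(u_j: float, temp_k: float, s_jk: float) -> float:
    """Free energy in joules: internal energy minus T times entropy."""
    return u_j - temp_k * s_jk

# A system with U = 500 J and S = 1.2 J/K at T = 300 K:
print(free_energy(u_j=500.0, temp_k=300.0, s_jk=1.2))  # 140.0 J recoverable
```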

C. Logically Reversible Computing [8]

Landauer demonstrated that the entropy corresponding to the energy dissipated as heat to the surrounding environment is due solely to the logical irreversibility of the program operating the physical computer. He argued that logically reversible operations can in principle be performed in an energy-dissipation-free manner. Logical reversibility requires the reversibility of the low-level commands of the language. At the most basic level, matter is governed by classical mechanics and quantum mechanics, which are reversible in principle; if the low-level commands of the logical operations are not reversible, then energy dissipation is unavoidable. Samson Abramsky, of the Oxford University Computing Laboratory, in his paper "A Structural Approach to Reversible Computation," suggested a more structural approach to logical reversibility, because it is almost impossible to write a logically reversible program directly in a low-level language. Abramsky proposed an additional layer of syntax that maps a high-level functional program into a simple kind of automaton that is immediately reversible.
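A standard example of a logically reversible operation is the Toffoli (controlled-controlled-NOT) gate: it has as many outputs as inputs, is a bijection on three-bit states, and is its own inverse. The classical simulation below is a minimal sketch, not code from any of the cited papers.

```python
def toffoli(a: int, b: int, c: int) -> tuple:
    """Toffoli (CCNOT) gate: flips the target c when both controls a and b are 1.
    Three inputs map to three outputs, so no information is erased."""
    return a, b, c ^ (a & b)

# Applying the gate twice recovers the original inputs: it is its own inverse,
# so the computation can be run backwards from any state.
for bits in [(0, 0, 0), (1, 1, 0), (1, 1, 1), (1, 0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# With the target initialized to 0, the gate computes AND reversibly:
print(toffoli(1, 1, 0))  # (1, 1, 1) -> third bit is a AND b
```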

3. Results & Discussion

From the above literature review, we can conclude that to attain a perfectly reversible computational model, we need to design a new computer architecture with both physical and logical reversibility in mind. We can only perform dissipation-free computation on a purely reversible medium, where the hardware allows the reversibility of the quantum particles and the software operations allow the reversibility of the information.

A. Physically reversible apparatus [9]

The building blocks of any apparatus are the electronic components, which can be compacted into integrated circuits. These building blocks are also called the logic gates of the integrated circuits. The operations performed by each logic gate are very basic and consist of additions, subtractions, and inversions. The new generation of logic gates is reversible and able to conserve information.

B. Logically reversible operations

The logical reversibility of an operation is mathematically defined as a bijective (invertible) function that maps each element of one set to one and only one element of another set. For a piece of code to be logically reversible, it must be possible, at any point of the computation, to determine the inputs as well as the outputs from the direction of the computation. Many loop updates are reversible (e.g., i++, i--, for i = x to i = x + Δx, ...), whereas passing arguments by reference or by value from a different function is mostly irreversible, because the passed argument is a function of another variable from a different computation.
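The reversible loop updates mentioned above can be sketched as a pair of mutually inverse functions: because i → i + dx is a bijection with inverse i → i − dx, running the loop backwards recovers every earlier state exactly. This is a minimal illustration, not code from the cited works.

```python
# A reversible update is a bijection: i -> i + dx has the inverse i -> i - dx,
# so no information about earlier states is erased.

def forward(i: int, dx: int = 1) -> int:
    return i + dx

def backward(i: int, dx: int = 1) -> int:
    return i - dx

state = 5
trace = [state]
for _ in range(3):           # run forward:  5 -> 6 -> 7 -> 8
    state = forward(state)
    trace.append(state)
for _ in range(3):           # run backward: 8 -> 7 -> 6 -> 5
    state = backward(state)
assert state == trace[0]     # original state recovered exactly
```

By contrast, an update such as i = 0 is irreversible: many prior states map to the same result, so the history cannot be recovered.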

4. Conclusion

This paper has sketched the basic features of a reversible computational model in which computation can be done in a dissipation-free manner. The next step will be to extend the analysis from these simple models to more complex systems, and to explicitly compute the fluctuations in energy dissipation due to physical versus logical reversibility factors.

References

[1] "Thermodynamics." Wikipedia. Wikimedia Foundation, n.d. Web. 10 June 2015.
[2] "Equation of State." Wikipedia. Wikimedia Foundation, n.d. Web. 10 June 2015.
[3] "Reversible Computing." Wikipedia. Wikimedia Foundation, n.d. Web. 10 June 2015.
[4] Charles H. Bennett. 1981. "The Thermodynamics of Computation – a Review."
[5] "Entropy (Information Theory)." Wikipedia. Wikimedia Foundation, n.d. Web. 10 June 2015.
[6] Shannon, Claude Elwood, and Warren Weaver. 1949. The Mathematical Theory of Communication. Urbana: University of Illinois Press.
[7] N. Gershenfeld. 1996. "Signal Entropy and the Thermodynamics of Computation." IBM Systems Journal, Vol. 35, Nos. 3&4.
[8] Samson Abramsky. "A Structural Approach to Reversible Computation." Oxford University Computing Laboratory.
[9] "What Is Logic Gate (AND, OR, XOR, NOT, NAND, NOR and XNOR)? - Definition from WhatIs.com." WhatIs.com. N.p., n.d. Web. 10 June 2015.