Lecture 18: Nonlinearity & Chaos

• Relevant reading: these notes and the python programs are your main resource. For further reading, see BG: parts of chapters 1-4. For a brief treatment, I recommend Chapter 12 of "Classical Mechanics" by Taylor. Finally, going through Tutorial 9 is recommended if you want some practice with the concepts discussed.

Maps as chaotic systems

• Even though the damped driven pendulum (DDP) is a physically "simple" system (i.e. the equations aren't that complicated), we can see chaotic behaviour in even simpler mathematical systems.

• To understand chaos at a more basic level, we turn to difference equations. Difference equations are not continuous equations: they tell you the value of a variable in a sequence based on the values of the variable earlier in the sequence.

• A famous difference equation is the logistic map: x_{n+1} = µ x_n (1 − x_n), where µ is some constant.

• If you know the initial value x_0, this equation gives you x_1, which you can then use to get x_2, and so on. So this equation gives you a sequence of x values. Note that for the logistic map, you restrict starting values of x to between 0 and 1. (The logistic map is more than just math; it is an equation used in population dynamics.)

• It's called a map because you essentially 'map' a point based on the value of the previous point.

• You've actually used difference equations already. Remember our formula for numerical integration in python: x[i+1] = x[i] + v[i]*dt. This has exactly the form of a difference equation. Basically, any differential equation can be represented by a difference equation, which is why they are relevant for physics.
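Iterating the logistic map takes only a few lines of Python. This is a minimal sketch (not one of the course programs); the function name and the choices µ = 2.5, x_0 = 0.5 are illustrative:

```python
def logistic_map(mu, x0, n):
    """Return the sequence x_0, x_1, ..., x_n of the logistic map
    x_{n+1} = mu * x_n * (1 - x_n)."""
    xs = [x0]
    for _ in range(n):
        xs.append(mu * xs[-1] * (1 - xs[-1]))
    return xs

seq = logistic_map(2.5, 0.5, 5)
print(seq)  # seq[1] = 2.5 * 0.5 * 0.5 = 0.625, and so on
```

Each output value feeds back in as the next input, which is exactly the "map" structure described above.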

• Another difference equation (or map) we've encountered (although not stated as such) is a Poincaré section. Remember that Poincaré sections plot the phase space (e.g. θ and θ̇ values) at specific times (multiples of the driving period). This can be represented as a 2D map:

  θ_{n+1} = G1(θ_n, θ̇_n),  θ̇_{n+1} = G2(θ_n, θ̇_n),

where G1 and G2 are functions that give you the new point's location from the old point's location.

• Back to the logistic map. This is one of the most standard maps you will see in any chaos book; it's relatively easy to understand, so it's even found in 'general audience' chaos books like James Gleick's. Here is a graph of it (made from the python program logisticmap_plot.py with µ = 2.5):
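The 2D map form above can be iterated just like the 1D logistic map. The sketch below uses placeholder G1 and G2 (a simple linear rotation of phase space, chosen purely for illustration; the pendulum's actual Poincaré map has no closed form and must be built numerically from the ODE solution):

```python
import math

A = 0.3  # hypothetical rotation angle per drive period (illustrative only)

def G1(theta, thetadot):
    return math.cos(A) * theta - math.sin(A) * thetadot

def G2(theta, thetadot):
    return math.sin(A) * theta + math.cos(A) * thetadot

def iterate_2d_map(theta0, thetadot0, n):
    """Apply (theta, thetadot) -> (G1, G2) n times, returning the orbit."""
    orbit = [(theta0, thetadot0)]
    for _ in range(n):
        t, td = orbit[-1]
        orbit.append((G1(t, td), G2(t, td)))
    return orbit

orbit = iterate_2d_map(1.0, 0.0, 10)
```

Plotting the points of `orbit` would give the successive strobe-frames of a Poincaré section.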

The parabolic curve is the graph of µx(1 − x), but now think about what the axes represent. The horizontal axis is the value of x_n and the vertical axis is the value of x_{n+1}.

• Just like in our ODEs for the damped driven pendulum, we are interested in the evolution of this system: what are the values of x_n for large n? Are there any "stable fixed points" (i.e. "attractors")? Are any of them "strange attractors"? To find the attractors we can apply a nice geometric method.

• The logistic map (and other maps) can be considered as a set of instructions:
  1. For value x, calculate y = µx(1 − x)
  2. Set x = y
  3. Repeat

• This process can be envisioned graphically if we plot both the logistic equation and the line y = x on the same graph. It looks like this:

• Now, let's start at a particular value of x_n and watch the evolution to see where the map takes us. This can be done with the vpython program logisticmap_webdiagram.py. Essentially I am going to use lines to represent the instructions listed above. I will pick a starting point on the x_n axis, then move vertically up to the logistic curve to find my x_{n+1} value (this is instruction 1). Then I will set this y value to my x value by moving horizontally from my point to the line y = x. Then I will repeat the process.

µ < 1:
• I will start by considering a µ value less than one, say µ = 0.9. Start with any initial x_0 value between 0 and 1 (say x_0 = 0.9), and run the program. You will find that x always heads to the origin. The origin is a "stable fixed point" (like the valleys in potential plots) in this case.

µ = 2.5:
• Let's increase µ to 2.5. Again, start with any x value between 0 and 1 and run the program. You will see that now x heads to the intersection of the y = x and y = µx(1 − x) curves. This is true for all initial x values EXCEPT x_0 = 0, where you stay at 0. In this case x = 0 is an "unstable fixed point" (like the hills in potential plots) and the other x value is a "stable fixed point". You can solve for the location of the intersection between y = x and y = µx(1 − x) by setting the right-hand sides equal to each other. You will find that the stable fixed point occurs at x = 1 − 1/µ.
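These two regimes are easy to verify numerically. A minimal sketch (the helper name and starting values are illustrative, not from the course programs):

```python
def iterate_logistic(mu, x0, n):
    """Apply the logistic map n times and return the final value."""
    x = x0
    for _ in range(n):
        x = mu * x * (1 - x)
    return x

# mu < 1: the orbit decays to the stable fixed point at x = 0
print(iterate_logistic(0.9, 0.9, 200))  # ~0

# mu = 2.5: the orbit converges to x = 1 - 1/mu = 0.6
print(iterate_logistic(2.5, 0.3, 200))  # ~0.6
```

Any starting value in (0, 1) gives the same endpoints, matching the web-diagram behaviour described above.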


• At this point we have found that for µ < 1 we have a single stable fixed point (i.e. a 1-cycle attractor). At µ = 1 the stable fixed point changes to x = 1 − 1/µ, but it is still a 1-cycle attractor. We call them "cycle attractors" because these are maps, and we are treating them like the Poincaré sections from the last lecture. What happens as we keep increasing µ?

µ = 3.3:
• As we increase µ further, some interesting things happen to our fixed point. Eventually this fixed point becomes unstable and we get a "period doubling" (the same as in ODEs). For example, at µ = 3.3 the evolution eventually bounces between 2 points on the logistic curve: I now have a 2-cycle attractor (shown nicely by plotting x_n as a function of n in the second plot below). (In logisticmap_webdiagram.py, set irate=50 to speed up the plotting and startplot=20 to get past the transient.)
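The 2-cycle can be checked numerically: after discarding the transient, the orbit should repeat with period 2. A minimal sketch (function names and the transient length are illustrative):

```python
def orbit_tail(mu, x0, transient, n):
    """Iterate the logistic map, discard `transient` steps,
    then return the next n values of the orbit."""
    x = x0
    for _ in range(transient):
        x = mu * x * (1 - x)
    tail = []
    for _ in range(n):
        x = mu * x * (1 - x)
        tail.append(x)
    return tail

# For mu = 3.3 the attractor is a 2-cycle:
# the tail values repeat with period 2 but alternate between two points.
tail = orbit_tail(3.3, 0.5, 1000, 4)
print(tail)
```

The same function with a shorter transient would show the approach to the cycle, which is what the web diagram displays graphically.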


• As we increase µ further, we go through a sequence of period doublings (a subharmonic cascade) and eventually we get chaos. E.g. µ = 3.5: 4-cycle attractor; µ = 3.55: 8-cycle attractor; µ = 3.9: chaotic:

• Just like with the DDP, we can plot a bifurcation diagram for the logistic map (i.e. plot the stable fixed points as a function of µ). Using the program logisticmap_bifurcation.py we get the following bifurcation diagram over different intervals:
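The idea behind a bifurcation diagram can be sketched in a few lines: for each µ, iterate past the transient and record the distinct values the orbit visits. This is a simplified sketch of what a program like logisticmap_bifurcation.py presumably does (the tolerance and iteration counts are illustrative choices):

```python
def attractor_values(mu, x0=0.5, transient=2000, keep=200, tol=1e-6):
    """Iterate past the transient, then collect the distinct values visited
    (to within tol): 1 value for a fixed point, 2 for a 2-cycle, many for chaos."""
    x = x0
    for _ in range(transient):
        x = mu * x * (1 - x)
    values = []
    for _ in range(keep):
        x = mu * x * (1 - x)
        if all(abs(x - v) > tol for v in values):
            values.append(x)
    return sorted(values)

print(len(attractor_values(2.5)))  # 1-cycle (fixed point at 0.6)
print(len(attractor_values(3.3)))  # 2-cycle
print(len(attractor_values(3.9)))  # many values: chaotic band
```

Plotting `attractor_values(mu)` against µ for a fine grid of µ values reproduces the bifurcation diagram.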

The 2nd and 3rd plots are just magnifications of the first plot. Notice the period doublings and eventual chaotic regions. Also notice the "windows" where the chaotic regions become non-chaotic again and then go through a subharmonic cascade again!

• The bifurcation diagram has a structure similar to that of the DDP. Both the DDP and the logistic map exhibit the "period doubling route to chaos". This route consists of a sequence of pitchfork bifurcations that get closer and closer together. Your cycles go from 2 to 4 to 8 to 16 to 32 ... eventually to infinity. An 'infinite-period' cycle is a chaotic cycle.

Sensitive dependence on initial conditions

• A final feature of chaotic systems we will discuss: the divergence of trajectories that start from nearly identical initial conditions.

• We are going to watch the evolution of a system over time from some initial condition. Then we are going to watch the same system's evolution for an initial condition that is not the same as our first one, but is really close to it. For non-chaotic systems, the difference between the trajectories will not grow (i.e. if the trajectories start close together, they stay close together). For chaotic systems, the error will grow. For example, let's consider our logistic map.

• Suppose we have two trajectories with initial values x1(0) and x2(0) = x1(0) + ε.

• Suppose that after n iterations (or time periods), we express their difference as

  |x2 − x1| ≈ |ε| e^{nλ}    (1)

where λ is called the Lyapunov exponent. Note: we are assuming that the growth or decay will be exponential. We will be able to test this when we get our values.

• If λ < 0, solutions converge exponentially over time. E.g. the stable case for the free damped pendulum.

• If λ = 0, the trajectories stay the same distance apart. E.g. the stable case for the undamped pendulum.

• If λ > 0, the trajectories diverge exponentially over time. So a slight error in the initial conditions can quickly lead to huge differences in the solutions.

• We can rewrite equation (1) as:

  |x2 − x1| = e^{nλ} e^{log|ε|} = e^{nλ + log|ε|}

If we take the log of both sides we get:

  log|x2 − x1| = nλ + log|ε|

So if we plot log|x2 − x1| vs n, the slope of our graph will give us λ, the Lyapunov exponent. So if these graphs have positive slopes, then the trajectories diverge. If the graph actually looks linear, then they diverge exponentially (it's saying our representation in equation (1) was reasonable).

• Let's check out the logistic map for a non-chaotic µ value, say µ = 2.0 (left plot below), and a chaotic value, say µ = 3.8 (right plot below). The plots were made using the program logisticmap_lyapunov.py. We get:
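The slope-fitting recipe above can be sketched directly (this is an illustrative sketch, not the course program logisticmap_lyapunov.py; the µ values used here, 2.5 converging and 3.8 chaotic, and the ε and step counts are illustrative choices):

```python
import math

def lyapunov_slope(mu, x0=0.4, eps=1e-9, steps=15):
    """Evolve two logistic-map trajectories separated by eps and return
    the least-squares slope of log|x2 - x1| vs n. By equation (1),
    this slope estimates the Lyapunov exponent lambda."""
    x1, x2 = x0, x0 + eps
    ns, logs = [], []
    for n in range(1, steps + 1):
        x1 = mu * x1 * (1 - x1)
        x2 = mu * x2 * (1 - x2)
        d = abs(x2 - x1)
        if d == 0.0:  # trajectories merged at machine precision; stop here
            break
        ns.append(n)
        logs.append(math.log(d))
    nbar = sum(ns) / len(ns)
    lbar = sum(logs) / len(logs)
    return (sum((n - nbar) * (l - lbar) for n, l in zip(ns, logs))
            / sum((n - nbar) ** 2 for n in ns))

print(lyapunov_slope(2.5, steps=12))  # negative: orbits converge to the fixed point
print(lyapunov_slope(3.8, steps=30))  # positive: chaotic divergence
```

The step counts are kept small on purpose: for the converging case the separation soon hits the floating-point floor, and for the chaotic case it saturates once it grows to the size of the attractor, so only the early, linear part of log|x2 − x1| vs n is fitted.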

• Notice that the slope in the left plot is 0. This means the Lyapunov exponent is 0 and our solutions do not diverge.

• Notice that the slope in the right plot is positive and the graph is linear. This means the Lyapunov exponent is > 0 and our solutions diverge exponentially. You could calculate the actual Lyapunov exponent by finding the slope of this graph.

• Positive Lyapunov exponents occur in many systems, e.g. turbulent fluids, weather, nonlinear circuits, etc. These are all systems that exhibit chaos.

• This divergence of trajectories is known as "sensitive dependence on initial conditions". This is why we can't predict systems that are chaotic. If we don't know the initial condition perfectly (which we can't), then we don't know what trajectory we are on, and hence we don't know where we will be on the strange attractor at some later time.


Summary: hallmarks of chaos

• For the system

  ẋ_i = F_i(x_1, ..., x_n),   i = 1, ..., n,

we need n ≥ 3 and at least one of the F_i to be nonlinear for chaos to be possible.

• A subharmonic cascade via period doubling is a route to chaos.

• Poincaré maps show strange attractors with fractal (non-integer) dimension and self-similar structure.

• Sensitive dependence on initial conditions: exponential divergence of solutions.

Please see Tutorial 9 for another example of a chaotic system, the Lorenz system. This tutorial will also be helpful for the computational assignment, where we discuss even more chaotic systems.
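The Lorenz system mentioned above meets the summary conditions exactly: n = 3 variables and nonlinear terms (xz and xy). A minimal Euler-integration sketch (the classic chaotic parameters σ = 10, ρ = 28, β = 8/3 are standard; the step size and initial condition are illustrative, and Tutorial 9's program may differ):

```python
def lorenz_trajectory(x=1.0, y=1.0, z=1.0, dt=1e-3, steps=10000,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Euler-integrate the Lorenz equations:
    dx/dt = sigma*(y - x), dy/dt = x*(rho - z) - y, dz/dt = x*y - beta*z."""
    traj = []
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        traj.append((x, y, z))
    return traj

traj = lorenz_trajectory()
# The orbit never settles down, yet stays on a bounded strange attractor.
```

The same two-trajectory Lyapunov test used for the logistic map applies here: two initial conditions differing by a tiny ε diverge exponentially on this attractor.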
