
.mpipks-transcript 06. Non-equilibrium Criticality | 阿掖山: a blog

source link: https://mountaye.github.io/blog/articles/mpipks-non-equilibrium-physics-transcript-06-criticality

slide 1

00:00 there we go
00:07 okay so now you can see the screen i
00:09 hope
00:10 with a little overview so what where are
00:13 we actually
00:14 uh at the moment yeah so we had uh
00:18 two lectures ago we started thinking
00:20 about how order can emerge
00:22 you know and we said that this somehow
00:25 um very often uh relies on a balance
00:29 between
00:30 fluctuations you know that favor
00:31 disorder and
00:33 interactions that favor order
00:37 and then we went on next week last week
00:40 to study how we can have transitions
00:44 between different non-equilibrium states
00:48 for example between a
00:49 disordered and an order state or between
00:52 different kinds of ordered states
00:54 and we started that in a situation where
00:56 we neglected
00:58 noise, we pretended that our
01:00 non-equilibrium system
01:01 or our general system, which could also be an
01:04 equilibrium system, is very very large
01:06 and then we were basically back in the
01:08 framework
01:09 of non-linear differential equations and
01:12 non-linear partial differential
01:14 equations
01:15 and to understand then what kind of
01:17 order we had
01:19 we looked at specific situations here
01:21 where this order
01:23 or these patterns arose continuously
01:26 and not abruptly, but they first were
01:28 small and then became larger and larger
01:31 and in these situations we were allowed
01:33 to linearize
01:34 and to ignore these difficult
01:38 non-linearities in these equations
01:42 so today we’re now in a state where we
01:44 want to
01:46 join these two lectures now we’re
01:49 looking at transitions between
01:51 non-equilibrium
01:53 steady states where
01:57 we have noise yeah and i wouldn’t tell
02:01 you that i wouldn’t have a lecture on
02:02 this if
02:03 the noise these fluctuations in these
02:06 transitions weren’t super important
02:09 and maybe yeah then
02:12 once we’ve understood that we are in a
02:15 position
02:16 uh just to understand how does actually
02:19 order look like
02:20 how can we identify order and after that
02:23 we’ll then
02:23 go to data and actually learn some tools
02:26 from data science
02:27 on how to extract such order
02:31 from complex and big data sets
02:34 and then at the end of this lecture of
02:37 this uh this term
02:38 uh we’ll have a specific example from
02:40 research where we bring that all
02:42 together and we see
02:43 how this works together uh in the
02:45 context of
02:46 current research

slide 2

02:50 okay so to start
02:54 let’s remind ourselves about
02:57 how we actually transition
03:00 from disorder to order in equilibrium
03:03 yeah and we'll continue we'll
03:06 continue looking at continuous phase
03:09 transitions and
03:10 in equilibrium these continuous phase
03:12 transitions are characterized
03:14 by a critical point and this is just the
03:16 point where this balance
03:18 between fluctuations and interactions
03:21 this ordering and these disordering
03:24 forces
03:25 are just equal now the system doesn’t
03:27 really know
03:28 exactly where to go, whether to create an
03:30 ordered state or a completely
03:32 disintegrated,
03:33 disordered state, and it's somewhere in
03:34 between and
03:36 uh so i i suppose
03:39 uh you’ve you you’ll have had that in
03:42 your
03:42 statistical physics lecture normally
03:44 yeah but at the end of the first
03:46 statistical physics lectures but i’ll
03:49 just give you
03:51 a little reminder of the most important
03:54 concepts
03:55 so here you see like this example from
03:58 equilibrium
03:59 this very powerful and intuitive model
04:02 system which is called the Ising model
04:04 which is essentially modeling a
04:05 ferromagnet
04:07 and on the left hand side you can see
04:10 simulations
04:12 for three different temperatures it’s
04:15 just
04:16 such an Ising model and what you see
04:18 here if the temperature is very low
04:21 you get an ordered state now everything
04:23 is black
04:24 all spins are pointing
04:26 in the same
04:27 direction, and if the
04:29 temperature is high
04:32 then you get a disordered state and this
04:35 disordered state is, as you see, a kind of
04:37 salt-and-pepper state; now the average
04:40 magnetization is zero
04:43 but locally uh you will always find
04:46 spins that go up and down,
04:48 go up and down, and then you have that
04:50 thing in between these states
04:52 yeah when the temperature is exactly
04:56 equal to a critical temperature and this
04:59 is where
04:59 there is a balance between energy and
05:02 entropy of
05:03 fluctuations and order and
05:06 what you see here is this state
05:09 where you have these domains here
05:11 domains of these black domains
05:14 and if you zoom in to such a system
05:18 what you’ll see at the critical point
05:19 you’ll see that it looks exactly the
05:21 same
05:21 now you zoom in and you wouldn't
05:24 be able to say whether this is a
05:25 snapshot a zoomed in version of the one
05:27 on the left hand side
05:29 or whether this is an entirely different
05:32 simulation
05:34 and then you can zoom in further and
05:35 further and
05:37 you again all the time get the same
05:39 impression that you can’t really make
05:41 out
05:42 uh what is now the typical length, how
05:44 large are these clusters
05:46 you have here these domains of all sizes
05:49 equally now because you have domains,
05:53 these white things or black things, of
05:55 all
05:56 sizes equally represented, you can't
05:59 make out a single, you can't make out a
06:01 typical
06:02 length scale here because you can't say
06:04 that
06:06 the typical size of such a cluster here
06:10 is for example that large
06:13 as you always find clusters that are
06:15 much smaller, you find clusters that are
06:17 much
06:17 larger here’s a cluster now that has the
06:20 size of the entire system that’s
06:22 infinitely large
06:23 and then you have all a spectrum of
06:26 sizes in between that
06:28 yeah well here in this example you kind
06:31 of get an idea
06:33 that these clusters are typically very
06:35 small
06:36 these domains whereas pin points up or
06:38 so are very small they have a
06:40 typical size but you can’t see it on the
06:42 left hand side
06:44 yeah same as if i would zoom in with
06:46 this camera you know so
06:49 if we do like this now you immediately
06:52 see
06:53 that i zoomed in now because i have a
06:56 characteristic size i’m
06:57 one meter 80 or something yeah and now
07:00 you
07:00 see that you that have zoomed in the
07:02 camera and the picture is not the same
07:04 as before
07:06 so i’m not in a critical state yeah but
07:09 the Ising system is in a critical state
07:12 and this critical state is characterized
07:15 by self-similarity
07:17 so they have structures of all sizes
07:19 represented
07:21 in this system now this is the
07:23 self-similarity
07:24 and the mathematical representation of
07:27 the self-similarity
07:28 the fact that you don’t have an average
07:30 cluster size
07:32 is that the correlation length diverges
07:35 now so the correlation length is
07:37 infinity
07:38 that means the correlation function you
07:41 know
07:41 so that represents how large
07:45 these clusters are, is scale invariant
07:48 that means
07:49 that really clusters of different sizes
07:51 are equally represented
07:53 and such if you ask what is the
07:55 probability that i
07:56 am now in a white cluster that i’m in a
07:59 black cluster a certain distance away
08:01 then the answer to this doesn’t depend
08:03 on any specific distance
08:05 you know it’s the same this probability
08:07 is the same this
08:08 correlation function here it’s the same
08:11 when we
08:12 calculated in the original version of
08:15 the simulation or a zoomed in
08:17 uh fraction of this this is the
08:19 self-similarity
08:21 the self-similarity at a critical point
08:23 goes along with power laws
08:26 you know so the power laws have the um
08:30 for example like this here the critical
08:31 length as you go the critical
08:34 correlation length as you go closer to
08:37 the
08:37 critical point diverges to infinity it
08:40 goes to infinity
08:42 and it does so with an exponent that’s
08:44 to be
08:45 called nu; when this goes to zero here
08:49 this term will be infinity and these
08:51 exponents
08:52 capture how fast you go to infinity
08:56 and the fact
08:58 that you have an exponent,
08:59 that you have such a power law, means
09:02 that you're self-similar: you have a power
09:04 law if you have
09:06 something like this you can zoom in and
09:08 you still have the same exponent here
09:10 and you can’t do that with an
09:11 exponential function or so
09:14 now and it also tells you that this is
09:17 here
09:18 uh if you have power laws, some
09:20 distribution
09:22 that has a power law, that has this long
09:25 tail with some exponent, you cannot typically
09:29 calculate averages or moments because
09:31 these integrals diverge
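In standard notation, the two statements above, the diverging correlation length and the scale invariance of power laws, read:

```latex
% Correlation length divergence at the critical point:
\xi \;\propto\; |T - T_c|^{-\nu}, \qquad \xi \to \infty \ \text{as}\ T \to T_c .

% Power laws are the only scale-invariant functions: rescaling r -> b r
% only changes the prefactor, not the exponent,
C(r) \propto r^{-a} \quad\Rightarrow\quad C(b\,r) = b^{-a}\,C(r),
% whereas an exponential decay e^{-r/\xi} singles out the scale \xi.
```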
09:34 now you have the power law of the
09:35 correlation function, and
09:38 now if you have a power law of the
09:39 correlation function you also have
09:41 power laws
09:42 for example in the density and the
09:44 magnetization
09:46 uh near the critical point and all kinds
09:48 of other thermodynamic
09:50 quantities and i just wanted to briefly

slide 3

09:53 show you why this is the case
09:55 now it’s actually the background of this
09:58 the background of this is actually an
09:59 assumption
10:01 that you say you have a free energy
10:05 and with this free energy uh
10:08 this free energy as you go to the
10:10 critical point uh
10:11 gets infinite, it has a singularity
10:15 and then you say that you assume
10:19 that the free energy here
10:23 this free energy has
10:26 one part that has all the physics and
10:29 all the details a regular part
10:32 but you say that the part of the free
10:33 energy that
10:35 diverges at a critical point
10:39 this one here; so now this
10:44 little t
10:46 is something like (T minus
10:50 T_c) over T, you know, and h
10:53 is the external field if you have
10:56 something like this as a free energy as
10:58 the function of these parameters
11:00 yeah then the singular part the one that
11:03 goes to infinity
11:05 is a homogeneous function
11:08 homogeneous function just uh tells you
11:12 that if you have f of
11:15 lambda x that this is equal to
11:18 lambda to the power of some alpha
11:22 times f of x yeah and this represents
11:25 just this zooming in again: you
11:28 have the same function, you zoom in where
11:29 you rescale your variable
11:31 you zoom in and you get the same
11:33 function back
11:34 now this is a homogeneous function and
11:37 you still assume that this free energy
11:39 density in this case is a homogeneous
11:42 function
11:43 and this homogeneous function can only
11:45 depend
11:46 on dimensionless quantities now for
11:49 example
11:50 you wouldn’t expect this divergence to
11:53 infinity
11:54 to depend on how you measure a length
11:58 now whether you measure the length in
12:01 units of centimeters or meters
12:04 whether you measure temperature in
12:07 kelvin or in units of one kelvin or two
12:10 kelvins or so
12:12 so you can see that these kinds of things,
12:15 these units,
12:15 these dimensions, should be irrelevant for how
12:18 this quantity goes to infinity
12:22 yeah and if we say that then we say okay
12:24 so we have
12:26 here a so-called scaling function that
12:29 depends on dimensionless,
12:31 on dimensionless
12:34 combinations of our parameters so the
12:37 external field
12:38 divided by the temperature and then we
12:40 have to
12:41 take the temperature to some power of
12:43 something
12:45 to make everything dimensionless
12:48 so that it has no units, and this power is
12:51 something that's,
12:52 something we don't know yeah and
12:56 the free energy has some units,
12:59 therefore, since the whole thing here doesn't
13:01 have the units, you need a pre-factor
13:04 that gives you the right units you know
13:06 and then
13:08 you have this alpha which we don’t know
13:11 yeah
13:11 this is some exponent that depends on
13:13 the specific model
13:14 for example for the Ising model it's zero
13:17 you know and then you have these uh this
13:20 is the
13:21 consequence of how you translate this
13:23 homogeneity
13:25 here of the free energy to something
13:28 that you give names
13:29 as you have here this part that diverges
13:32 yeah that has the units
13:34 and this part here is the so-called
13:37 scaling function
13:38 that only depends on dimensionless
13:40 parameters
13:41 and it turns out that these exponents
13:43 and this scaling function are universal
13:45 so if you know it for one model then you
13:47 know it,
13:49 you will know it, for a very large class of models
13:52 and we’ll see that once we do
13:53 renormalization later today
13:57 so uh so what does it mean yeah so if we
13:59 make this
14:00 assumption that’s really an assumption
14:02 about
14:04 homogeneity of the free energy then we
14:07 can calculate for example
14:09 the magnetization m of t, h
14:12 now this is in thermodynamics something
14:14 like
14:16 del f over del h
14:19 you know and then we just plug this in
14:22 and we get something like temperature,
14:25 this reduced temperature,
14:27 t to the power of 2 minus alpha
14:30 minus
14:30 delta, times some other function that we
14:34 don't know
14:35 that again depends on a dimensionless
14:40 parameter and then
14:44 this scales like some beta, that's the
14:47 definition
14:48 of this exponent beta of the
14:50 magnetization
14:52 yeah and uh so in thermodynamics this is
14:55 called
14:56 Widom scaling, basically in any textbook
14:59 on statistical physics
15:00 and it’s just just to show you how the
15:04 assumption
15:05 of homogeneity near the critical point
15:09 leads to power laws in other
15:11 thermodynamic quantities
15:13 i’ve shown you here the magnetization
15:17 this was the magnetization
15:23 but the same holds true for example for
15:26 susceptibility
15:27 and other thermodynamic quantities that
15:29 you can get by taking derivatives of
15:31 your
15:32 energy so
15:35 this homogeneity or this self-similarity
15:39 that i showed you here that is a
15:41 reflection that is one of the hallmarks
15:44 of uh critical behavior and that’s what
15:47 we’re looking for when we look for
15:49 critical behavior
15:50 and now the question is can we see
15:53 something like this
15:55 also in a non-equilibrium system
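In standard notation, the scaling argument on this slide can be written compactly as follows; g is the scaling function and Delta the gap exponent, which are the conventional names rather than necessarily the ones on the slide:

```latex
% Scaling (homogeneity) assumption for the singular part of the free energy density,
% with reduced temperature t (proportional to T - T_c) and external field h:
f_s(t,h) \;\simeq\; |t|^{\,2-\alpha}\, g\!\left(\frac{h}{|t|^{\Delta}}\right)

% The magnetization is a field derivative of the free energy:
M(t,h) \;=\; \frac{\partial f_s}{\partial h}
        \;\simeq\; |t|^{\,2-\alpha-\Delta}\, g'\!\left(\frac{h}{|t|^{\Delta}}\right)

% At zero field this defines the order-parameter exponent (Widom scaling):
M(t,0) \;\propto\; |t|^{\beta}, \qquad \beta = 2-\alpha-\Delta .
```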

slide 4

15:58 before before i start with that
16:01 uh let’s just have a look at one
16:02 specific how this scaling
16:04 behaves if you look at these equations
16:08 here
16:10 what does it mean it means that
16:13 actually the curves that you get you
16:16 know so if you just divide so once you
16:19 make a measurement for example with a
16:21 known temperature
16:23 and a known magnetic field you measure
16:26 this function here
16:28 then you know that it doesn’t depend
16:29 separately
16:31 on h and the temperature
16:35 so you can rescale your axes, so this is
16:37 your y
16:38 axis and your x axis, and you can use your x axis and
16:41 your y
16:42 axis to make all of these curves
16:46 collapse onto each other yeah and this
16:49 is this uh
16:50 scaling form that we see in equilibrium
16:54 physics, this is for the Ising model
16:56 so on the x-axis we have this scaled
16:58 temperature
16:59 that would be t on the previous slide
17:02 lowercase t
17:04 uh times something, so this is rescaled
17:08 and then these uh people in these
17:11 experiments
17:12 for uh for
17:15 a ferromagnet measured
17:18 the magnetization for different values
17:20 of the
17:22 of different experimental parameter
17:24 values
17:25 like magnetic field external magnetic
17:27 field temperature
17:30 and by making use of this formula here
17:33 you see that uh the scaling
17:37 behavior what is where is my
17:40 is it going down here yeah that’s the
17:42 scaling behavior here
17:45 yeah that the only thing
17:49 that you don’t know is this g of m that
17:52 you have to measure
17:53 yeah once you know the h and the
17:56 temperature
17:58 you can make all of these different g of
18:00 m’s the gms
18:02 this scaling function you can rescale
18:05 these axes
18:06 to make them collapse onto each other
18:09 yeah and this is this observation this
18:11 is how you observe
18:12 scaling an experiment so you manage to
18:15 collapse
18:16 your experimental curves by multiplying
18:19 this x-axis and the y-axis with certain
18:23 values
18:23 or factors that you have to guess, you can
18:27 collapse all of these curves onto the same
18:30 uh universal so-called scaling form
18:34 now this is the manifestation of scaling
18:36 and that’s of course
18:37 also something we’ll be looking at and
18:39 non-equilibrium systems
18:41 but also in data
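As a minimal sketch of this collapse procedure, assuming synthetic data: the snippet below generates magnetization curves from an assumed scaling form and then rescales both axes. The exponent values, the scaling function g, and the temperature and field grids are invented for illustration and are not the experimental values shown on the slide.

```python
import numpy as np

# Illustrative exponents and scaling function -- assumptions, not fitted values.
beta, Delta = 0.33, 1.56

def g(x):
    """Some smooth scaling function, assumed for the synthetic data."""
    return x / (1.0 + x) ** 0.8

# Synthetic "measurements" M(t, h) for several reduced temperatures t,
# generated directly from the scaling form M = |t|^beta * g(h / |t|^Delta).
t_values = [0.01, 0.02, 0.05, 0.1]
h = np.logspace(-4, -1, 50)
curves = {t: abs(t) ** beta * g(h / abs(t) ** Delta) for t in t_values}

# Data collapse: divide out the power laws on both axes.
# If the scaling hypothesis holds, every rescaled curve lies on top of g.
for t, M in curves.items():
    x = h / abs(t) ** Delta      # rescaled x-axis
    y = M / abs(t) ** beta       # rescaled y-axis
    assert np.allclose(y, g(x))  # here one would plot (x, y) instead
```

With real data the exponents are of course not known in advance; one tunes them until the curves collapse, which is exactly the "values you have to guess" mentioned above.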
18:44 now scaling is a hallmark of critical
18:48 behavior and today

slide 5

18:52 we want to see whether these concepts
18:56 of scaling and criticality yeah
18:59 where and these these continuous
19:02 phase transitions actually also extend
19:05 to non-equilibrium systems
19:07 yeah and it turns out so now we first we
19:10 need to find a non-equilibrium system
19:12 that is as intuitive
19:15 as the Ising model, and the Ising model
19:18 is very intuitive
19:19 you have that in your lectures when
19:21 you’re a student and
19:22 in your later life as a scientist you
19:24 always refer to that because it’s so
19:26 simple and intuitive that uh you can
19:28 explain a lot of things a lot of
19:30 things about continuous phase
19:32 transitions in equilibrium
19:33 just based on this very simple model
19:35 like i did in the beginning of this
19:37 lecture
19:38 and it turns out now that the uh
19:41 Ising model of non-equilibrium physics
19:44 of course is
19:46 that, well, some people would disagree, but
19:48 one of the simplest models in
19:49 non-equilibrium physics that shows
19:52 critical behavior
19:53 is an epidemic model and this epidemic
19:56 model we know already
19:57 from the previous lectures here we have
20:02 our good old SI model again
20:06 now so this epidemic model is this is
20:08 the simplest model
20:10 that you can think about so you have
20:11 infected individuals,
20:14 I, and susceptible individuals, or
20:17 healthy people, S
20:19 now if an I meets an S
20:22 then the S gets infected with a rate,
20:25 say, lambda over two
20:27 and turns into an infected individual and
20:30 in the end you have two of them
20:33 then you have the other process that we
20:34 recover
20:36 and we set this rate to one now we can
20:38 just set that to one
20:40 without any loss of generality and uh
20:43 so that infected we measure units
20:46 time in units of this recovery rate
20:50 you know service infected individuals
20:52 can also
20:54 then recover and become
20:57 healthy again we have these two kinds of
21:00 individuals
21:01 and now we put them in the real world so
21:04 last time we were only looking at some
21:06 well-mixed
21:07 average quantities but now we put them
21:10 into the real world like the city of
21:11 brisbane also
21:13 where they actually can where actually
21:16 space
21:17 matters yeah so i’m more likely to
21:19 infect somebody else working at the
21:22 MPI-PKS than somebody working at
21:25 another Max Planck institute for example
21:28 yeah so
21:29 so here uh these spatial structures,
21:32 these spatial degrees of freedom,
21:34 uh are taken into account and the
21:36 simplest way of you
21:38 how you can think about this is at the
21:40 bottom here
21:40 now that you look at a lattice
21:45 you have a lattice, each site
21:48 carries either an infected individual or
21:51 a healthy individual,
21:55 you know, an infected or a healthy
21:56 individual, and
21:58 when an infected individual
22:02 is next to a healthy one
22:05 then
22:05 the healthy one can turn into an
22:07 infected one
22:09 with a certain probability, with a
22:10 certain rate lambda over two
22:13 yeah and also there's another process
22:16 here: if the
22:18 individual on a certain position is
22:21 infected
22:21 it can turn into a healthy one at the
22:24 recovery rate, which we set to one
22:26 so this is this simple spatial version
22:29 that you can think about for this
22:32 simple epidemic model, and in the
22:35 literature it is often called the contact
22:38 process
22:40 so of course real epidemic models
22:43 have
22:44 typically one more component namely the
22:47 um
22:51 the recovered, uh wait,
22:54 is this also here, okay so so the third
22:57 component that you normally have
22:59 in these models are the recovered ones,
23:01 the immune people
23:02 now you have had the disease yeah and then
23:04 you are fine for the rest of your life
23:06 and you're immune to this disease
23:08 so you can only get infected once, then
23:11 you have a third
23:12 species here, a third kind of particle,
23:15 which would be the recovered ones
23:17 or the immune ones and they cannot be
23:21 infected again
23:22 but this slightly more complicated model
23:25 uh is shows very similar behavior to the
23:28 model that we’re studying here
23:31 for the things that we’re interested in
23:32 so here we’re interested in infinities
23:34 in singularities so once you
23:38 once you look at these kind of things
23:40 then these models will be qualitatively the
23:42 same
23:43 although the exponents will be
23:45 different
23:47 but once you look of course at
23:49 non-singular quantities, at more than critical
23:51 behavior,
23:52 then the messiness of how wide your
23:55 roads are,
23:57 how often the tram goes uh between the
24:00 Max Planck institutes and so on, these
24:02 things will matter
24:05 yeah but close to the critical point uh
24:07 we'll be fine
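Written as reactions, the model defined above (the contact process) is simply:

```latex
% Infection: an infected individual I infects a neighbouring susceptible S
I + S \;\xrightarrow{\;\lambda/2\;}\; I + I

% Recovery: time is measured in units of the recovery rate, so this rate is 1
I \;\xrightarrow{\;1\;}\; S
```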

slide 6

24:10 so this is a stochastic simulation of
24:13 such a system
24:14 and we can just see what happens on the
24:17 left hand side
24:18 you see a simulation of such a lattice
24:20 system
24:21 where you initially have a random,
24:25 random initialization so every site
24:28 is, with a probability of one
24:30 half,
24:32 or some certain probability, infected or
24:35 not infected and what you see here
24:39 now is in blue infected
24:42 individuals now if this lambda
24:46 now this infection rate is smaller
24:49 than a certain critical value
24:52 then what you will see is that this
24:54 infection,
24:55 this infection can spread for a while
24:58 but most of the time with a certain
24:59 probability
25:00 it will uh disappear
25:04 now so in this regime here in this phase
25:08 the recovery rate outweighs
25:11 the infection rate you know and
25:14 uh so that's where we're supposed to be,
25:17 in this regime we're supposed to be
25:19 in
25:19 starting next week and then on the right
25:23 hand side
25:24 that’s the regime that we’re currently
25:26 in now then the infection rate
25:28 is larger than the uh
25:31 than the recovery rate now so the
25:33 infection probability is higher
25:36 and what you will then end up is is a
25:39 state where most of the individuals will
25:42 carry the disease will be infected
25:45 so you will you will reach a steady
25:47 state
25:49 not everybody is all the time in fact
25:50 that you will reach some steady state
25:52 with a certain percentage of infected
25:55 people
25:57 and now we have this situation in
26:00 between
26:02 that’s this one here and this is
26:05 where the um where
26:08 the infection rate is more or less
26:11 balanced
26:12 with the recovery rate it’s not exactly
26:15 equal to one
26:16 so that’s these things are complicated
26:18 yeah you might think that
26:20 okay if this this lambda is equal to one
26:23 or one half or so
26:25 yeah then uh then that's the critical
26:27 point, but i'll show you these systems are
26:29 more complicated than you might think
26:32 because the noise is so important here
26:35 and here
26:36 what you see is that you have domains
26:39 that become larger and larger over time
26:41 so from
26:42 when we go from top to bottom we have
26:43 time now so we go we start here at the
26:46 top
26:47 and then this domain goes large and
26:49 larger you have merging
26:51 of domains with infected individuals
26:54 and then we have branches that they die
26:56 out
26:57 like this one here and uh
27:00 it looks a little bit like a
27:02 self-similar state
27:04 now we have domains of all sizes for
27:07 example
27:08 in the time domain or from top to bottom
27:11 you have some branches that die out
27:13 quite quickly here
27:15 but then you have other branches like
27:17 the big one in the middle
27:19 where that just go on for uh forever
27:23 without really occupying the whole
27:24 system
27:26 and then if you look take a slice in
27:28 this direction here
27:30 these simulations are very small, so
27:33 it's not like the Ising picture,
27:34 you can't see it that well; you'll also see
27:36 that here you have
27:37 structures of all different sizes
27:41 there are small ones like this one here
27:44 yeah and the larger ones like this one
27:46 and so you have these structures of all
27:48 different sizes
27:51 and this is again reminiscent of self-
27:53 similarity
27:54 and the critical point so it turns out
27:57 our little epidemic model has a critical
28:00 point
28:00 can i ask a question um i i think i
28:04 roughly understand the model
28:06 but this simulation is the simulation of
28:08 what
28:09 so is it a hamiltonian system where the
28:12 um just yes so what is this basically
28:15 nothing
28:16 yeah so uh you just take what is here,
28:19 or you take just this here, the way
28:21 that you write these simulations,
28:23 there are different ways to write them: you
28:25 have a lattice you know you have a
28:26 numerical simulation, you have a vector,
28:28 an array, and you either have like
28:31 one or zero and then you pick a site
28:35 randomly
28:36 and perform these reactions here
28:39 you know so so the one way to do that is
28:41 to pick a site randomly
28:44 and check your neighbors, so if
28:45 you pick this site
28:47 and if your neighbor is susceptible or,
28:49 that is, is not infected
28:52 then you infect the neighbor with the
28:53 probability
28:55 lambda over two so like a monte carlo
28:58 simulation it's a monte carlo simulation
29:00 the different way, you can also think of
29:02 a cellular automaton
29:04 yeah but uh the typical way to simulate
29:06 these things are monte carlo
29:08 simulations
29:09 okay but there’s no hamiltonian there’s
29:10 no deeper insight you can just take
29:12 these rules
29:13 and simulate them on a lattice and the
29:15 only thing you have to do
29:16 is to take into account that this is not
29:19 a deterministic process here
29:21 but it’s a random process with a
29:23 probability one-half
29:25 you turn this one here lambda over two
29:27 you turn this one here
29:29 into an infected blue one can i then
29:32 properly
29:33 um make a make a statement about the
29:36 time scale how
29:37 some how this thing spreads because it’s
29:40 it’s random right so i can
29:44 that’s what we’ll be trying to do today
29:47 okay uh but we’ll
29:48 uh only be managing to do that tomorrow
29:50 uh let me just go on
29:52 so here you get some kind of idea
29:55 already
29:56 in this slide here you can get us some
29:58 kind of idea here so that
30:00 that you have here a time scale yeah
30:04 where things uh where things uh
30:06 disappear
30:08 yeah so you say that for example here
30:10 the typically the
30:12 the number of infected individuals will
30:14 go down with an exponential function
30:17 yeah and then this has a typical time
30:18 and then you get rid of most of the
30:20 infected ones
30:22 this has some certain time at this
30:25 typical
30:25 this certain time where you say okay
30:29 at this time i’m i have lost most of my
30:31 infected
30:32 people now that they’re healthy again
30:35 this is then called
30:36 psi parallel
30:39 yeah so this is like a correlation
30:41 length so i’m actually actually getting
30:43 a hat a little bit too far so so this
30:45 you have here a correlation length in
30:47 time
30:48 but that tells you exactly that how f
30:51 how quickly
30:52 does this disease disappear
30:55 yes you have a correlation length in
30:57 space like this
30:59 so for example this one here
31:03 now, well, it depends on how
31:05 you define it, it can relate to that
31:07 uh this is typically called xi
31:10 perpendicular
31:12 you have a correlation length in space
31:13 now that tells you how large are your
31:15 clusters
31:16 but you also have a correlation length
31:18 and time how long
31:20 lived are your clusters how long does it
31:23 take for them to disappear
31:25 and it turns out that both of these
31:27 things at the critical point are
31:29 infinite
31:30 so the system is not only self-similar
31:32 in space but also in time
31:37 yeah but first before we before we do
31:39 that um
31:40 uh before we do that formally so what i
31:43 see here
31:44 is is the stochastic simulation uh we’ll
31:46 later
31:48 motivate some Langevin equation that we
31:50 actually will be studying
31:52 but for now now the system is as simple
31:54 as it gets now you have a neighbor
31:56 if this neighbor is not infected you
31:58 infect it with a certain probability
32:00 yeah it's the simplest, it's like five
32:03 lines of code or so
32:04 in matlab now there's nothing, there's
32:08 nothing
32:09 uh in terms of the simulation, the model
32:11 definition, there is
32:12 nothing deep in there but of
32:15 course the consequences as we see on the
32:16 slide
32:17 are rather non-trivial
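A minimal sketch of the kind of Monte Carlo simulation described here, assuming a one-dimensional lattice with random sequential updates; the lattice size, number of sweeps, value of lambda, and the choice of one random neighbour per infection attempt are illustrative assumptions, not the lecturer's five lines of MATLAB.

```python
import numpy as np

def contact_process(L=200, lam=3.3, steps=200, p_init=0.5, seed=0):
    """Asynchronous Monte Carlo sketch of the 1D contact process.

    Each elementary move picks a random site; if it is infected it either
    recovers (relative weight 1) or tries to infect one randomly chosen
    nearest neighbour (relative weight lam).  One "step" is L such moves.
    Returns the lattice after every step, i.e. a space-time plot.
    """
    rng = np.random.default_rng(seed)
    s = (rng.random(L) < p_init).astype(int)   # 1 = infected, 0 = susceptible
    history = [s.copy()]
    p_recover = 1.0 / (1.0 + lam)              # recovery vs. infection attempt
    for _ in range(steps):
        for _ in range(L):
            i = rng.integers(L)
            if s[i] == 0:
                continue                       # susceptible sites do nothing on their own
            if rng.random() < p_recover:
                s[i] = 0                       # I -> S
            else:
                j = (i + rng.choice((-1, 1))) % L
                s[j] = 1                       # I + S -> I + I (no effect if j is already infected)
        history.append(s.copy())
    return np.array(history)

if __name__ == "__main__":
    spacetime = contact_process()
    print("final density of infected sites:", spacetime[-1].mean())
```

Scanning lambda and measuring the steady-state density of infected sites, or the survival time of runs started from a single seed, is how the critical point and the correlation lengths discussed below are located numerically.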

slide 7

32:22 so now we want to go one step
32:26 ahead and try to formalize this
32:29 mathematically
32:32 and um to formalize this we first need
32:36 to
32:36 have something to put into our Langevin
32:39 equation
32:40 yeah and that something that we put into
32:42 our Langevin equation
32:44 is the density, this is our order
32:47 parameter,
32:49 the density of infected
32:52 individuals all right to get this we uh
32:58 um we we do a double average
33:02 so this average here is over the lattice
33:06 now we sum over the lattice and we count
33:10 now with this s_i variable, like a spin,
33:14 how many infected individuals we have
33:17 and divide it by the total number of
33:19 lattice sites
33:20 the system size and then we average
33:23 again
33:23 over the ensemble now this is our order
33:26 parameter and this parameter this order
33:28 parameter
33:29 tells us whether we have order or not
33:33 you know if this is one then everybody
33:35 is infected
33:36 yeah, ordered or not: if
33:38 this is one
33:39 everybody’s infected and if this is zero
33:42 then everybody is healthy
33:45 so this is our like our magnetization
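Written out, the double average described here is (with s_i = 1 for an infected site, s_i = 0 for a healthy one, N the number of lattice sites, and the outer brackets the ensemble average):

```latex
\rho(t) \;=\; \Big\langle \frac{1}{N} \sum_{i=1}^{N} s_i(t) \Big\rangle ,
\qquad s_i \in \{0, 1\}
```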
33:48 and now
33:49 we want to do the same thing as in
33:52 equilibrium
33:53 i also want to ask what we are actually
33:56 looking at yeah so
34:00 what we say is we don’t know
34:04 but we make the assumption
34:07 that in this non-equilibrium critical
34:10 point
34:11 we also have scaling behavior and we
34:14 also have self-similarity
34:16 now and of course you can test this
34:18 assumption if you do large enough
34:20 computer simulations
34:22 so one thing is that this
34:25 if our system obtains a steady state
34:28 with some density
34:30 you know so that’s a row
34:33 stationary density something like the
34:36 magnetization you know the process of 90
34:39 of the people
34:40 are infected uh
34:43 this goes with
34:48 lambda minus lambda c to the power
34:51 of beta it’s like the magnetization we
34:54 don’t know what beta is
34:56 but there is some beta that we want to
35:00 know
35:02 now as i’ve discussed already before we
35:04 have now not just
35:05 one correlation length but two so one is
35:08 the spatial
35:13 correlation length
35:18 and this is typically denoted by xi
35:21 perpendicular
35:22 because it’s perpendicular to time and
35:26 perpendicular so if you look at these
35:28 pictures here
35:30 you can kind of get an idea
35:33 why this is called perpendicular and
35:35 parallel
35:37 suppose that this is here, well, this,
35:39 actually an
35:40 equivalent model is the one of water
35:43 pouring
35:44 into soil you know so you have little
35:48 channels
35:48 it’s a rough thing yeah and then for
35:51 example here
35:52 the water flows down
35:56 but at some point now the density of the
35:59 soil is too large
36:00 and the water stops this is
36:03 this is an example where the soil is
36:05 like the soil is like
36:06 it’s very open it’s not very dense you
36:09 put water in it
36:10 and it flows all the way to the bottom
36:13 so that’s
36:14 what is what is sorry yes um you said
36:17 that
36:19 in case of critical systems the
36:20 correlation length in space can be
36:22 divergent
36:24 yes and also the correlation length in
36:27 time could be divergent
36:28 yes so if the correlation length in time
36:31 is divergent then in this specific
36:33 example
36:35 the it’ll the number of infected
36:38 clusters will always be present
36:40 right yes exactly you will not you will
36:43 never get rid of this
36:44 disease but of course this infinities
36:47 when i talk about infinity
36:49 these infinities are not defined really
36:52 in this small simulation where we maybe
36:54 have
36:54 100 individuals also now these
36:57 infinities are defined for
36:58 systems that don’t really have an
37:00 infinite infinitely large size
37:03 now this here what you see the
37:05 simulation in the middle
37:07 can just by chance disappear
37:11 and it will disappear and i can tell you
37:15 even that the disease in this case here
37:18 the right hand side will disappear with
37:21 a very small probability
37:23 right so it’s a very nice feature of
37:25 this model that will turn out to be very
37:28 important
37:29 what happens if all individuals
37:32 are healthy what happens if all
37:36 individuals are healthy
37:39 then there’s no process here
37:44 there are only s’s there’s no process
37:46 here that can give you the disease back
37:49 once the disease is extinct
37:52 it will never come back and because this
37:55 is a stochastic system
37:57 you just have to wait long enough and
37:59 just by chance
38:01 even this case here on the right hand
38:03 side will turn into that
38:05 case just because it's stochastic just
38:08 by chance
38:09 maybe you have to wait 100 billion years
38:11 or so for this to happen but you know
38:13 that at some point you will end up in
38:15 this state
38:17 where the disease went extinct by chance
38:20 you have to wait extremely long for that
38:22 but you know that it will happen
38:24 and these states here now like in this
38:27 system here
38:28 you go to zero and then there’s no way
38:30 it can come back
38:32 yeah in reality you will have to wait
38:34 for evolution
38:36 to create another virus that has the
38:39 same properties
38:40 now to come back so that takes
38:42 extremely long
38:43 yeah it's much, it's much longer than
38:45 the spreading of the disease itself, which
38:47 happens in one or two years
38:50 you know and so these are called
38:52 absorbing states you can go in there
38:54 but you can never go out again
38:58 so in other words this means that in
39:00 these absorbing states
39:02 they’re very important for not only for
39:03 virus spreading but in any ecological
39:05 model
39:06 now we have extinction and this
39:09 absorbing states
39:10 uh you can get in but you will never be
39:12 able to get out
39:14 now once you’re in there you’re trapped
39:16 and these absorbing states they don’t
39:18 have
39:19 fluctuations they don’t have any noise
39:21 and we'll see that in the Langevin
39:22 equation you know so these absorbing
39:25 states don’t have any noise
39:28 and by this you can already see that
39:30 this whole system
39:31 is a non-equilibrium system because if
39:34 you have a state that has no noise
39:36 this is not a thermal system where you
39:38 have a temperature
39:40 now so this here is a system where you
39:41 have noise now for example here you have
39:44 noise but once you reach the state
39:47 where there’s no disease no virus left
39:51 you don’t have any noise anymore you
39:54 know and that cannot happen in a
39:55 thermodynamic system that is an
39:57 equilibrium
39:58 that you always have your temperature
40:00 and this will always give you noise
40:01 regardless of how many
40:03 particles you have or whatever yeah so
40:06 this already tells you that this is a
40:07 non-equilibrium system
40:09 and it’s a very interesting system and
40:11 the system is actually one of the
40:13 universality classes non-equilibrium
40:15 physics
40:16 so that's, well, here i'm getting a little
40:19 bit ahead:
40:20 once you know that your system has an
40:22 absorbing state,
40:24 many ecological systems for example have an
40:26 absorbing state,
40:28 then it’s quite likely that what i’ll
40:29 show you in these
40:31 uh renormalization calculations today and
40:34 next week
40:35 will also apply to these systems; it is a
40:37 very powerful
40:38 you know universality class, a
40:40 universal description
40:42 for non-equilibrium systems
40:45 yeah but let’s let’s i was here talking
40:47 about did i actually answer your
40:49 question so i got a little bit
40:50 uh distracted uh i i distracted myself
40:55 a little bit yeah did i do that
40:58 okay okay i i forgot it at the end i
41:01 forgot this question but i hope i
41:02 answered it at some point
41:04 okay so you have the two correlations
41:06 just the spatial correlation length
41:07 you know that's uh
41:11 uh xi perpendicular and actually
41:14 what i want to say here is actually
41:16 that’s that’s what i would say
41:18 yeah so you have here you have soil and
41:20 you have water
41:21 flowing through this then the parallel
41:23 length here
41:26 this one it’s called parallel because
41:28 it’s parallel to the direction of
41:29 gravitation
41:31 and the other length here is
41:34 perpendicular because it’s perpendicular
41:36 to the
41:37 direction of gravitation now that’s
41:38 where these names come from
41:40 because these models called direct or
41:42 the directed percolation
41:44 is that you have something flowing
41:47 through a rough
41:48 medium like soil and then you have a
41:51 direct gravitation force that pulls the
41:54 fluid into one direction
41:55 but not in the other direction and
41:57 that’s where these parallel and
41:58 perpendicular
42:00 yeah so and then we give that some
42:02 exponent
42:04 lambda minus lambda c to the power of
42:08 minus
42:09 nu perpendicular
42:12 now that we have the temporal or dynamic
42:22 correlation length
42:25 xi parallel, and this
42:28 we call, very surprisingly, minus
42:32 nu parallel
42:35 and this is now as you said so
42:38 our temporal correlation can become
42:41 infinity
42:42 what does it mean yeah so i have a
42:45 perturbation
42:46 to the system, so if you have
42:49 the spatial correlation length at infinity,
42:52 like in an Ising magnet,
42:53 you make a perturbation and this
42:56 perturbation will in principle
42:58 affect all parts of the magnet
43:01 yeah you will have a very very long
43:03 range correlation you flip a spin
43:05 somewhere
43:05 and it has an effect somewhere
43:07 completely somewhere else
43:10 now we have an infinite correlation
43:13 length
43:13 in time what does that mean so that
43:16 means that if we perturb the system we
43:18 are at a critical point
43:19 we will perturb the system and the time
43:23 that it takes the system to go back to
43:26 forget this perturbation
43:27 is infinitely long yeah
43:30 so so you have again now processes on
43:33 all time scales in parallel,
43:35 very long, very slow processes and also
43:37 infinitely long processes
43:39 yeah it's like, like in space in the
43:41 Ising model you have clusters of all
43:43 different sizes, now you also have
43:44 processes
43:45 on all different time scales at the
43:48 same time
43:49 now and this is what this criticality
43:51 does to time
43:52 to the time domain: you make a perturbation
43:55 and it never just disappears again,
43:57 the effects of this particular
43:59 perturbation you will see in this system
44:01 infinitely long now so that's the cool
44:04 thing about critical systems,
44:06 uh
44:08 they do very strange things
44:11 and then now we define another uh
44:15 i’ve got a question yes sorry is are we
44:18 still dealing with a mean field model
44:21 uh i i didn't tell you yet uh but
44:24 we won't be dealing with a mean field
44:26 model
44:27 yeah so mean field is not very good for
44:30 these kind of things
44:32 yeah so we're not uh we're not dealing
44:34 with the mean field model
44:35 last last time last week we were dealing
44:37 with mean field models
44:39 but this time we have to take
44:41 fluctuations properly into account
44:44 and we will also have to take into
44:46 account fluctuations on all
44:48 different temporal and spatial scales
44:51 you know so that's that's what we will
44:53 have to do and that's what we will uh do
44:55 with the renormalization group
44:58 so mean field theory is typically pretty
45:00 bad for these things
45:03 even just for the getting what is this
45:05 lambda c i’m going to show you what this
45:06 lambda c is but this is
45:08 in the mean field version we would say
45:10 okay this is just one half or something
45:12 like this
45:12 yeah where you write down some
45:14 differential equation like you did last
45:16 time
45:16 you write you guess some differential
45:18 equation you motivate it
45:20 and then you get some lambda c but i’ll
45:22 show you today
45:23 now that this is actually not how it
45:26 works if you have these
45:27 strong fluctuations, you get different
45:29 results
45:31 okay so then we have a third exponent
45:35 the dynamic critical exponents
45:39 and that means that near lambda c near
45:42 the critical point
45:45 we say that uh these correlation lengths
45:49 now they they both follow power laws
45:52 you know and we say that they are
45:54 connected
45:55 by this exponent called z
45:59 and this is called
46:02 a dynamical
46:07 critical
46:11 exponent yeah so we what we did now
46:14 is okay we said okay these systems look
46:17 self-similar
46:18 uh like in equilibrium and we assume
46:21 that the same concepts of scaling
46:23 and power laws also apply to
46:26 non-equilibrium systems
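Collecting the definitions from this slide in one place, in the notation used above (the relation z = nu_parallel / nu_perp is the standard convention for the dynamical exponent):

```latex
% Stationary density of infected individuals (the order parameter):
\rho_{\mathrm{stat}} \;\propto\; (\lambda - \lambda_c)^{\beta}

% Spatial and temporal correlation lengths:
\xi_{\perp} \;\propto\; |\lambda - \lambda_c|^{-\nu_{\perp}}, \qquad
\xi_{\parallel} \;\propto\; |\lambda - \lambda_c|^{-\nu_{\parallel}}

% Dynamical critical exponent relating the two:
\xi_{\parallel} \;\propto\; \xi_{\perp}^{\,z}, \qquad z = \nu_{\parallel} / \nu_{\perp}
```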

slide 8

46:30 and now i have a slide here that we
46:31 already talked about
46:33 this is just what are these correlation
46:35 lengths now so what are these
46:37 correlation uh
46:40 what are what are these correlation
46:41 lengths these two and we discussed that
46:44 basically already
46:45 and so if you look here on the right
46:49 hand side for example
46:51 so on the left hand side i have two
46:53 simulations
46:55 that were started from an initial seed,
46:58 from just one
46:59 infected individual, and on the right
47:02 hand side
47:03 these figures uh they are from
47:06 a simulation
47:06 where maybe 50 percent of the lattice
47:10 was infected you know other
47:13 simulations i didn’t show you uh there’s
47:16 a nice review
47:17 article by Hinrichsen uh
47:24 about non-equilibrium criticality and
47:26 phase transitions
47:28 and that's where i took this figure from,
47:30 uh it’s a very nice review
47:32 uh about the kind of things that we’re
47:34 doing this week and next week
47:37 so here now we this is just an intuition
47:40 about what these correlation lengths are
47:42 uh i told you already you know that this
47:45 um that this temporal correlation length
47:49 xi parallel gives you, say, the time
47:52 for the disease
47:54 to die out, as you can see here
47:57 in this simulation here that the
47:59 parallel correlation length
48:01 that’s the time for such a droplet here
48:04 to go away again now we know that
48:08 if we’re below the critical point that
48:09 at some point it will disappear
48:12 and this has a typical time and this is
48:14 just the xi
48:16 parallel and then you have also a
48:18 typical size of such a droplet here
48:21 that's xi perpendicular, that's just how
48:24 large
48:25 do these domains get and you can also
48:28 see that here on the right hand side of
48:29 course
48:30 how large does the domain
48:33 of uninfected
48:34 or infected individuals get now there’s
48:37 this one here and how long does it
48:39 survive
48:41 now that’s how these correlation lengths
48:44 are to be
48:45 interpreted intuitively

slide 9

48:49 okay now
48:53 i’m going to do a little step
48:57 uh that where i’m trying to avoid a long
49:02 calculation
49:05 what we usually would do now is we would
49:08 take this lattice model and we would
49:11 try to derive a master equation
49:15 for the probability that the
49:17 lattice
49:18 has a certain configuration and then we
49:21 would be looking
49:22 at this master equation uh
49:25 this very complex master equation that
49:27 tells us the time evolution of this
49:29 vector and we try to get the rates
49:33 and then we would uh do approximations
49:36 uh like the system size expansion or the
49:38 Kramers-Moyal
49:39 expansion and so on and then we would
49:41 try to derive this Langevin equation
49:43 which now that’s a lengthy business and
49:46 it’s not the subject of our
49:48 lecture what i just want to show you is
49:52 why this Langevin equation that
49:55 i'll show you
49:56 later does not look like what you naively
49:59 would expect
50:01 to this end you can just show you that
50:03 such a lattice
50:04 system you can interpret in different
50:06 ways
50:08 one way is to say that it won’t give you
50:11 like
50:12 the mathematically rigorous form of the
50:14 Langevin equation but it shows you why
50:16 it does not look like you expect it to look
50:19 like
50:20 so the first way we can interpret
50:24 this lattice system is to say what
50:27 happens
50:28 at t and what is the state at some t
50:32 plus dt, some like a very short time
50:35 interval after that
50:38 and now i’m taking like in this master
50:40 equation i’m taking the perspective
50:43 of the state in a certain site
50:48 and then ask what are the previous
50:51 states
50:52 that give rise to me being infected
50:57 you know what gives rises yeah and
51:00 suppose that i was not infected before
51:03 yeah then my left neighbor could have
51:05 been infected
51:07 my right neighbor could have been
51:09 infected or both of them could have been
51:12 infected
51:12 and have infected me these are the three
51:15 processes that
51:16 lead to me being infected and then
51:19 if it was there’s another process
51:23 and i should use here different colors
51:25 it’s not the recovery
51:35 i switched the colors not red is
51:39 healthy sorry
51:43 red is healthy and blue is infected
51:48 okay yeah if i in this recovery process
51:53 it doesn’t matter what my neighbors were
51:54 i would just i just know that was
51:56 previously infected
51:57 if i’m now in the process of recovery
52:01 now this is this master equation picture
52:03 where we ask which state do i
52:04 come from
52:06 and now we can take an equivalent
52:09 description
52:10 and take more of a lattice picture now that
52:13 corresponds a little bit more to
52:14 the Langevin
52:15 picture yeah where i update, where i say
52:18 what is the state of the lattice now
52:20 and what is the state of the lattice the
52:21 next step
52:23 now that corresponds to this
52:24 differential equation picture
52:28 yeah and then how can i update it so
52:32 then i need at least two lattice points
52:35 at the same time to update to define
52:37 these updating rules
52:39 and then i have different possibilities
52:41 here
52:42 yeah if my two lattice points are
52:45 infected
52:46 that previously either the left one or
52:49 the right one
52:50 was infected
52:54 the recovery process is now more
52:56 complicated
52:57 yeah if i have, in this two-
53:00 site picture, one
53:03 infected and one, one infected and one
53:07 not infected, once this white one is down
53:09 here
53:10 this is
53:13 healthy this is
53:17 infected
53:21 now if i have one infected and one
53:24 healthy one
53:25 previously my system could have been
53:29 must have been in a state now if i’m
53:31 looking at the
53:32 recovery process where both of them were
53:35 infected
53:37 now so with the probability one half i
53:39 have either this
53:41 or this one here and then
53:44 if both of the states in the second time
53:47 step
53:48 are healthy then one of them was infected
53:52 before now so here i update two sites at
53:55 the same time
53:57 or i can also say i update the whole
53:59 lattice at the same time
54:00 in parallel and now
54:04 the thing is what is this here
54:09 what is this here these two processes
54:11 here
54:12 suddenly i have a process a process for
54:15 recovery
54:17 that involves two individuals
54:20 yeah formally that involves two
54:22 individuals so here i have two
54:25 infected individuals
54:26 and after that only one of them is
54:28 infected

slide 10

54:30 now suddenly i have two individuals and
54:32 if i ask how
54:33 such a term here if i take this
54:37 description as the basis for my Langevin
54:40 equation
54:41 how will this term pop up in my Langevin
54:44 equation
54:45 then it’s this term is proportional to
54:49 one-half
54:50 times
54:55 the probability that one of them is
54:57 infected and the probability that the
54:59 other one is
55:00 also infected so we will have something
55:03 like rho
55:04 of x t squared
55:08 and we’ll get a minus because we
55:10 decrease the number
55:11 of infected individuals
55:15 now this is just to show you yeah by
55:16 this uh
55:18 somewhat exotic lattice
55:20 representation
55:22 that you suddenly get a term: you write
55:25 down the Langevin equation here,
55:27 this is the Langevin equation
55:30 and this is here what you expect the
55:32 first thing is what you expect is an
55:34 infection term
55:35 you have the density of infected people
55:37 at a certain position x
55:40 and this depends on how many infected
55:43 people i already have
55:44 that is this typical exponential
55:47 increase of the infection rate
55:51 now we get another term here
55:55 this term that describes the recovery
55:58 process now this describes
56:02 the recovery process and
56:05 it suddenly has this quadratic term
56:08 although this recovery process like this
56:10 picture looked completely linear because
56:12 every individual was doing it
56:14 individually now we have now the second
56:18 degree term here and that looks like an
56:21 interaction
56:22 and this comes just because we’re
56:23 updating all the letters in parallel in
56:25 this
56:26 lingerie equation and that’s why we get
56:29 this
56:30 second degree term here we have a
56:33 diffusion term
56:34 this one here that was also not in our
56:36 model description
56:38 you know that we never said that these
56:41 particles are actually moving around
56:44 but there is some spread of spatial
56:46 information because you
56:47 interact with the nearest neighbor yeah
56:50 and this
56:51 models, if you zoom out and go to a
56:54 continuous picture,
56:56 a diffusive spread of
56:58 infection information
57:00 and that’s why you effectively get a
57:02 term here you need to have a term
57:05 that involves spatial derivatives now
57:07 where you actually spread something over
57:09 space you spread the infection of
57:11 space and that’s why you get this term
57:13 here
57:14 and you have of course again a noise
57:16 term
57:18 now and i’ll give these here parameters
57:20 and also new names the combinations of
57:22 the old parameters
57:24 and we want to keep this we want to have
57:26 different parameters at each of these
57:28 rates because we in the next step want
57:30 to renormalize these parameters that we
57:33 we need them and the noise here
57:36 on the right hand side is our good old
57:39 gaussian noise now that has
57:41 zero mean and
57:44 correlations in space and time
57:47 that are luckily delta-distributed so
57:50 they’re
57:50 memory less they don’t have memory in
57:52 space or in time
57:54 but they depend on this density here
58:00 they depend on this density and what
58:03 this means
58:04 is that this noise
58:08 the correlation
58:11 in the noise or the strength of this
58:13 noise that’s the strength of
58:15 noise
58:19 is zero
58:24 if the disease
58:28 is extinct that’s called
58:31 multiplicative noise because now the
58:33 density
58:35 rho of x and t is a pre-factor in the
58:39 noise term that
58:40 is contributing to,
58:43 or defining, the strength of the noise
58:45 and once we have
58:46 zero infected individuals left then the
58:49 noise disappears and we can never
58:51 get away out of this term out of this
58:53 point
58:54 where the infection is lost
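Putting the pieces of this slide together, the Langevin equation has the following structure; the coefficient names D, kappa, gamma, Gamma stand in for the combinations of the original rates mentioned in the lecture, and signs and factors of two follow the usual directed-percolation convention rather than necessarily the slide:

```latex
% Langevin equation for the density of infected individuals:
\partial_t \rho(x,t) \;=\; D\,\nabla^2 \rho \;+\; \kappa\,\rho \;-\; \gamma\,\rho^2 \;+\; \eta(x,t)

% Gaussian noise, delta-correlated in space and time, with multiplicative strength:
\langle \eta(x,t) \rangle = 0, \qquad
\langle \eta(x,t)\,\eta(x',t') \rangle \;=\; \Gamma\,\rho(x,t)\;\delta^d(x-x')\,\delta(t-t')

% In the absorbing state \rho = 0 the noise vanishes and the dynamics stops.
```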

slide 11

58:59 so maybe
59:02 quite late in time as a next step
59:10 um we get a little bit more formal so
59:12 now we have the Langevin
59:13 equation now and what you see here
59:16 already
59:17 is the Martin-Siggia-Rose functional
59:20 integral that we derived
59:22 and this Martin-Siggia-Rose
59:24 functional integral
59:25 or Martin-Siggia-Rose-Janssen-De Dominicis
59:27 functional integral
59:29 you can derive very easily; we remember
59:32 the equation that we had
59:33 a few lectures ago; the first
59:36 part of this functional integral,
59:42 what was it here,
59:45 it's just the Langevin equation
59:48 yeah that that makes sure that you
59:50 actually solve the Langevin equation,
59:52 here you have the Langevin equation, and
59:55 then you have these terms here
59:57 on the right hand side there are higher
60:00 orders,
60:01 here we have this phi squared,
60:04 that's just this one here, and here
60:08 you have a noise term that we also had
60:10 before now that was this
60:12 gaussian noise term that came from
60:14 integrating out the noise xi
60:16 in the Martin-Siggia-Rose formula; now so
60:19 we have this term here,
60:21 this is white noise, but in contrast to the
60:23 previous case
60:25 where the noise didn’t depend on the
60:27 density itself
60:28 we now here that’s the only difference
60:31 have
60:31 another phi here
60:35 now we have another phi here
60:38 that’s the only difference that we get
60:40 for multiplicative noise
60:42 and because we have multiplicative noise
60:46 you can see that somehow now the noise
60:50 term
60:51 here looks a little bit like another
60:55 term that is
60:55 actually as an interaction term where we
60:58 couple the two fields
61:00 one linearly to each other this term and
61:03 suddenly the noise term is not
61:04 simply a gaussian, it's not simply a
61:07 gaussian that we can integrate
61:08 over,
61:09 suddenly it couples to the other field
61:11 now the strength of this noise term
61:13 is proportional to phi that’s this
61:16 multiplicative noise that makes life
61:18 complicated
61:20 now we now do a simple step now we
61:23 re-scale we now
61:24 we now we take this Martin-Siggia-Rose
61:27 generating functional that i wrote down
61:29 here we take this
61:31 yeah and we just make our lives easier
61:33 life easier for later
61:35 yeah and by doing this to do this
61:39 we rescale some of these
61:42 fields and parameters so rescale
61:47 the fields so what we do
61:50 is we want to
61:53 get rid of
61:59 we have had two terms this one and this
62:02 one
62:03 they look kind of similar and the idea
62:06 is now
62:06 that if we have a proper transformation
62:09 of fields
62:10 that we can make them exactly equal up
62:13 to some pre-factors
62:15 yeah so that's what we want to do, we
62:16 want to simplify this action
62:19 and uh to symmetrize it now so that we
62:22 can treat these two terms
62:23 equally; now we
62:26 want to make the prefactor here,
62:28 this and this prefactor, equal
62:32 now we can symmetrize these two terms,
62:35 so we rescale the fields; i'll just
62:37 tell you how to do that: you have phi
62:40 goes over to 2 lambda
62:43 over gamma times phi,
62:46 phi tilde, the response field, goes over
62:50 to
62:52 gamma over 2 lambda times phi tilde,
62:56 and gamma goes over to
63:00 two gamma lambda
63:05 yeah so we rescale this we are allowed
63:07 to do that
63:08 and then we get our new generating
63:11 functional
63:13 and this generating functional is again
63:16 of this form
63:17 the integral of d phi d phi tilde
63:22 e to the minus some action, S naught, and we
63:26 find that
63:26 what this is: S naught of phi
63:30 and phi tilde
63:33 plus, now this S naught is
63:36 the non-interacting term,
63:38 and now we have a term that of course
63:40 that describes interactions
63:43 now that's where the fun is happening
63:46 let me just write that down
63:50 this S naught of phi
63:54 and phi tilde, this action, is the integral
63:57 over dx
63:59 dt
64:03 so here we should have
64:06 dt as well
64:10 yeah so dx dt
64:13 and now we um have
64:17 phi tilde of x t
64:21 times, tau, sorry, what we need, that is
64:25 tau del t minus D nabla
64:28 squared minus kappa,
64:32 acting on phi of x t
64:37 and then we have another part, the
64:40 interaction S int of phi, phi tilde:
64:45 gamma over two, integral dx
64:49 dt, phi tilde of
64:53 x t, minus, sorry that looks
64:56 not so nice,
65:01 all right, phi tilde of x and t
65:05 times phi of x
65:09 t minus phi tilde
65:13 of x and t
65:17 phi of x
65:20 t so this is this interaction term
65:24 and it’s called interaction term because
65:26 we have here
65:27 higher orders of the field coupling to
65:29 each other
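A cleaned-up version of the action dictated above, reconstructed from the spoken equation; the placement of the response field in the Gaussian part and the overall sign conventions are the standard ones for directed percolation and may differ slightly from the blackboard:

```latex
% Generating functional:
Z \;=\; \int \mathcal{D}\varphi\, \mathcal{D}\tilde{\varphi}\;
        e^{-S_0[\varphi,\tilde{\varphi}] - S_{\mathrm{int}}[\varphi,\tilde{\varphi}]}

% Gaussian (non-interacting) part:
S_0[\varphi,\tilde{\varphi}] \;=\; \int \mathrm{d}^d x\, \mathrm{d}t\;
    \tilde{\varphi}(x,t)\,\big(\tau\,\partial_t - D\,\nabla^2 - \kappa\big)\,\varphi(x,t)

% Interaction part (the two cubic terms made symmetric by the field rescaling):
S_{\mathrm{int}}[\varphi,\tilde{\varphi}] \;=\; \frac{\Gamma}{2} \int \mathrm{d}^d x\, \mathrm{d}t\;
    \tilde{\varphi}(x,t)\,\big[\varphi(x,t) - \tilde{\varphi}(x,t)\big]\,\varphi(x,t)
```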
65:31 yeah so
65:34 this is now our Martin-Siggia-Rose functional
65:37 integral
65:38 yeah and this is what we'll be dealing
65:41 with and this is what we'll define
65:43 the renormalization group on and
65:46 because i knew that we would be doing
65:48 renormalization
65:50 i have already introduced this little
65:54 tau here now because in
65:57 renormalization
65:58 we want to coarse-grain, we want to
66:00 transform this action
66:02 and we want to see how our action and
66:04 how the parameters
66:06 of our action change in this procedure
66:09 and that’s why all our terms here need
66:11 to have a prefactor
66:14 yeah and that’s why i introduced this
66:15 tau that’s why i have this d
66:18 you know and this kappa here i have them
66:20 all
66:21 giving them different names although
66:22 they’re not independent of each other
66:25 because our original model just had one
66:27 parameter that was the lambda
66:29 yeah so that's the uh that's the Martin-
66:32 Siggia-Rose
66:34 functional integral and um
66:38 next so we’re already quite late now
66:41 next time
66:42 we start right away with uh first with
66:47 introducing renormalization intuitively
66:51 and then as a second step you’ll then
66:53 apply that
66:54 to this epidemic model and re-normalize
66:57 this Martin-Siggia-Rose
66:58 functional integral; now apparently i was
67:01 a bit too
67:01 optimistic and was trying to
67:04 introduce already
67:05 the renormalization today but we can do
67:07 that just next week
67:08 now because it’s already quite late
67:11 today

