#renormalization — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #renormalization, aggregated by home.social.
-
#paperOfTheDay "Relations between short-range and long-range Ising models" from 2014.
The basic version of the Ising model in #statistical #physics is a lattice where every site carries a binary variable, a "spin" that can point up or down. A nearest-neighbour interaction energetically prefers neighbouring spins to point in the same direction. Then there are "long-range" versions, where the interaction also couples spins at larger distances, with a weighting factor that decays as a power law with exponent sigma (where sigma=2 reproduces the conventional short-range model). This class of models thus has two parameters: the dimension d, and the exponent sigma.
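As a toy illustration of the setup (my own sketch, not code from the paper; conventions for the exponent vary, here I use couplings J(r) = 1/r^(d+sigma) in d=1):

```python
import numpy as np

def long_range_ising_energy(spins, sigma):
    """Energy of a 1D Ising chain with power-law couplings:
    E = -sum_{i<j} s_i s_j / |i-j|^(1+sigma)  (one common convention for d=1).
    Large sigma suppresses distant pairs, approaching the short-range model."""
    n = len(spins)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            energy -= spins[i] * spins[j] / (j - i) ** (1 + sigma)
    return energy

spins = np.array([1, 1, 1, -1, -1])
print(long_range_ising_energy(spins, sigma=1.0))
```

A Monte Carlo study like the paper's would propose single-spin flips and accept them with the Metropolis probability computed from this energy difference.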
The present paper is mostly a numerical Monte Carlo study of such systems at various d and sigma. The guiding question is whether, instead of two, there is actually only one parameter. Phrased differently: Given some d and sigma, can I find some other D such that the short-range (sigma=2) model at dimension D is equivalent to the long-range one at d? It is intuitively plausible that this works close to the interface between long-range and short-range models: making the model slightly long-range is much the same as slightly changing the dimension. However, the authors demonstrate that such relations are in fact only approximate, and fail for generic values of sigma and d.
A second finding concerns the structure of the correlation functions of certain long-range models, which decay according to a power law (as expected from the #renormalization group), but where the leading correction is another power law (and not exponentially small).
https://journals.aps.org/pre/abstract/10.1103/PhysRevE.89.062120
https://arxiv.org/abs/1401.6805 -
#paperOfTheDay "Derivative expansion of the exact renormalization group" from 1994 is a follow-up on yesterday's paper, by the same author. Here, he uses the (then) new functional #renormalization group #FRG equations and introduces a certain expansion in momenta, which he then studies for the scalar flip-symmetric model (i.e. the avatar of phi^4 theory). This gives rise to two non-linear differential equations, which can be solved numerically, and produce numerical values of the critical exponents rather close to the correct ones.
It seems to me that this method is surprisingly simple -- solving a few differential equations instead of coupled integral equations -- and the author claims repeatedly that it can be systematically improved at will. It did not become entirely clear to me why he does not do that. After all, computing critical exponents for phi^4 theory is one of the globally accepted benchmarks for methods in field theory and statistical physics, and doing so would lend much credibility to this new method. Perhaps the concrete technical challenges were too big, even if a systematic expansion is conceptually possible.
I would be interested to know if now, 30 years later, this systematic momentum expansion has been continued to higher order, or if it has been replaced by another method.
doi.org/10.1016/0370-2693(94)90767-6 -
The #paperOfTheDay is "The exact renormalization group and approximate solutions" from 1994. This is one of the foundational papers deriving functional #renormalization group equations in #physics . These equations encode a scheme to solve a path integral step by step from high to low energy (i.e. small to large distance), and thereby gradually obtain a full solution of the theory. In practice, only approximations can be solved, and even these usually only numerically.
What I especially like about this particular paper is that it gives detailed discussions, interpretation, and historical remarks. For example an overview of the many previous papers where similar equations had been derived, seemingly without knowing of each other (this shows how useful it is to read a #paperOfTheDay !).
The author also explains that he first wanted to work with Dyson-Schwinger equations, another type of non-perturbative equations, but found them to be inconsistent and not renormalizable. From today's perspective, the Hopf algebra theory of renormalization is a precise construction and classification of consistently renormalizable Dyson-Schwinger equations, but that development only came around 2005.
Another topic in the present paper concerns the use of smooth versus sharp cutoff functions, where the author prefers sharp ones since they can be dealt with analytically.
The paper ends with an application of the newly found functional renormalization group equations to phi^4 theory, where again the discussion is very pedagogical and gives detailed comparison with common Feynman integral or Dyson-Schwinger calculations.
https://www.worldscientific.com/doi/abs/10.1142/S0217751X94000972 -
Yesterday's #paperOfTheDay was about Widom scaling: The observation that experimental results for critical points in statistical #physics can be described by an equation of state that is overall homogeneous in its arguments.
Today, we have "Scaling laws for Ising models near Tc" from 1966. This is the article that introduced block-spin transformations: One works with a (cubic) lattice where "spin" variables sit at each vertex. In the traditional Ising model, they interact only with nearest neighbours: They want to be aligned, but random fluctuations misalign them. Now, Kadanoff's crucial idea is to consider a "blocking" operation. For example, in a 2-dimensional square lattice, one could "merge" four spins at a time. This gives rise to a new lattice, twice as coarse. The original spins were just +1 or -1, but the new "effective" spins could have many states because they consist of 4 "internal" binary variables. However, one assumes that near the critical point, where most fluctuations are at large spatial scales, it is overwhelmingly likely that the 4 internal spins all point in the same direction, so that the newly created lattice variables effectively have only two states: 4 up or 4 down spins. Hence, in this approximation the block-spin transformation produces a new Ising model which has the same structure, but different numerical parameters. Working this out in more detail, one finds that the equation of state must be overall homogeneous: Widom's scaling relation.
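The blocking step itself can be sketched in a few lines (my own illustration, not from the paper; I use a majority rule with ties broken toward +1, whereas a careful scheme would break ties randomly):

```python
import numpy as np

def block_spin(spins):
    """Merge each 2x2 block of +-1 spins into one effective spin by
    majority rule. Input: an L x L array with even L; output: L/2 x L/2."""
    L = spins.shape[0]
    # reshape so axes 1 and 3 run over the 2x2 block, then sum the block
    block_sums = spins.reshape(L // 2, 2, L // 2, 2).sum(axis=(1, 3))
    # majority rule; ties (sum == 0) broken toward +1 for simplicity
    return np.where(block_sums >= 0, 1, -1)

rng = np.random.default_rng(1)
spins = rng.choice([-1, 1], size=(8, 8))  # toy configuration
coarse = block_spin(spins)
print(coarse.shape)  # (4, 4): the same kind of model on a lattice twice as coarse
```

Iterating this map and tracking how the effective coupling changes is precisely the renormalization-group flow.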
The block-spin transformation is still today the pedagogical prototype of the #renormalization group: A transformation that changes scales and parameters, but not structure.
https://journals.aps.org/ppf/abstract/10.1103/PhysicsPhysiqueFizika.2.263 -
#paperOfTheDay : "Fixed-point structure and effective fractional dimensionality for O(N) models with long-range interactions" from 2015.
This paper considers scalar lattice models in statistical #physics (which, in the appropriate limit, are equivalent to scalar Euclidean #quantumFieldTheory ). The "classic" such model, the Ising model, has interactions only between nearest neighbours, but many relevant physical systems allow for long-range interactions that decay with a power law. In that case, there is a "crossover": As one might guess, when the long-range interaction decays slowly, it dominates the behaviour of the system at large scales. However, if it decays fast enough, the system is effectively equivalent to one with only short-range interactions. In particular, there is a parameter range where the interaction Hamiltonian (or the action) contains a long-range term, but the resulting system is equivalent to one without any long-range term.
It has long been known that this crossover happens exactly at the critical dimension of the short-range theory: Basically, the local interactions give rise to an effective long-range behaviour, and when this one decays slower than the manually inserted long-range interaction, the short-range one dominates.
The present paper uses methods of the functional #renormalization group to confirm this picture. Concretely, they study a local potential approximation of the Wetterich equation, and find that indeed above the crossover, the presence of a quickly decaying long-range term does not alter the results, while below, when it decays less quickly, it gives rise to a different solution.
https://journals.aps.org/pre/abstract/10.1103/PhysRevE.92.052113 -
The #paperOfTheDay is "Phase Transition in Uniaxial Ferroelectrics" from 1969. This paper considers a problem in condensed matter #physics (at that time rather called solid state physics): Electrical dipoles in, e.g., a 3-dimensional lattice have an interaction that does not just affect adjacent dipoles, but decays like a power law over the entire lattice. This gives rise to a phase transition, and the paper computes the properties of that phase transition, such as the behaviour of the specific heat, using field-theory methods such as #FeynmanIntegral s.
From today's perspective, this is all pretty standard, but it should be seen in its historic context: The #renormalization group in its modern form was only discovered in the early 1970s, together with the understanding of universality: A system consisting of many interacting "objects" behaves, close to a critical point, in a "universal" way that depends only on a few parameters such as symmetries and dimension. By now, it is widely known that the critical behaviour of any such system can be computed with the methods of another, e.g. one can use perturbative field theory for lattices, or lattice simulations for field theory. The present paper already contains much of this insight; in particular, the appendix notes that a generalization to an O(N)-symmetric field theory would be straightforward.
https://jetp.ras.ru/cgi-bin/e/index/e/29/6/p1123?a=list -
#paperOfTheDay "There's plenty of room in the middle: The unsung revolution of the renormalization group" from 2023 is a meta-article commemorating 50 years of the #renormalization group, in memory of one of its pioneers, Michael E. Fisher.
First of all, this article is a masterly and dense historical overview; basically every single reference in it is a breakthrough article worth reading. This being said, the article builds upon "Fisher's least cited article". There, Fisher introduced the thesis, elaborated in the present article, that almost all interesting #physics happens in the "middle"; concretely, in the realm of collective phenomena and effective #fieldTheory governed by the renormalization group.
The present article discusses various examples, each with a provocative title, for example the BCS theory of superconductivity (famously a pure quantum effect) under the slogan that quantum mechanics is really not needed for condensed matter physics. The reasoning is: Such effects are described by effective field theories and "minimal models" with certain "asymptotic" properties (this is discussed in detail), in concrete cases these models can be derived from quantum mechanics, but this derivation does not really add anything to the practical understanding. At the same time, one can often obtain them from thermodynamic considerations alone, regardless of any more fundamental theory.
The article ends with the standard model of elementary particle physics, with the same thesis: One can view it as "just some EFT", valid as a very accurate model to describe observations, regardless of whether it is "fundamental".
https://arxiv.org/abs/2306.06020v1 -
Wednesday's #paperOfTheDay is "Lagrange Inversion: When and How" from 2006.
This paper is a detailed pedagogical discussion of the Lagrange inversion formula and its variants in #mathematics . The fundamental setting is: If you have a (not necessarily convergent) power series f(x), how can you compute the series coefficients of the inverse (under composition) g(x), such that f(g(x)) = x? The solution can be expressed in terms of Bell polynomials, but also in terms of complex analysis, where the extraction of a power series coefficient is a variant of the residue theorem.
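The inversion can also be carried out mechanically order by order; here is a naive sketch (my own code, not from the paper) that solves [x^n] f(g(x)) = 0 recursively instead of using the closed Bell-polynomial formula:

```python
from fractions import Fraction

def series_compose(f, g, N):
    """Coefficients of f(g(x)) up to x^N; f, g are coefficient lists
    (index = power of x) with f[0] = g[0] = 0 assumed."""
    out = [Fraction(0)] * (N + 1)
    gk = [Fraction(0)] * (N + 1)
    gk[0] = Fraction(1)  # g^0 = 1
    for k in range(1, N + 1):
        new = [Fraction(0)] * (N + 1)  # gk * g, truncated at x^N
        for i, a in enumerate(gk):
            if a:
                for j in range(N + 1 - i):
                    new[i + j] += a * g[j]
        gk = new
        if k < len(f):
            for i in range(N + 1):
                out[i] += f[k] * gk[i]
    return out

def series_revert(f, N):
    """Compositional inverse g with f(g(x)) = x + O(x^(N+1));
    requires f[0] = 0 and f[1] != 0."""
    g = [Fraction(0)] * (N + 1)
    g[1] = Fraction(1) / f[1]
    for n in range(2, N + 1):
        # with g[n] still zero, the term f[1]*g[n] must cancel [x^n] f(g(x))
        c = series_compose(f, g, n)[n]
        g[n] = -c / f[1]
    return g

f = [Fraction(0), Fraction(1), Fraction(1)]  # f(x) = x + x^2
g = series_revert(f, 5)
print([int(c) for c in g[1:]])  # [1, -1, 2, -5, 14]: reversion of x + x^2
```

The coefficients 1, 1, 2, 5, 14 (up to sign) are the Catalan numbers, a small example of the enumerative-combinatorics connection mentioned below.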
This question has many applications in enumerative combinatorics, where one wants to count all sorts of things, or establish relations between their generating functions.
For example, #renormalization in #quantumFieldTheory is of this form: You have a perturbation series in some bare coupling, and you want to invert it in order to express everything in terms of renormalized couplings. I find it surprising that these elementary and general formulas for series inversion are generally not taught in theoretical physics lectures; instead, one argues on a case-by-case basis that it would be possible to redefine couplings to include higher-order terms, etc. -
Monday's #paperOfTheDay is "Renormalization Group flows between Gaussian Fixed Points" from 2022. This preprint concerns scalar #quantumFieldTheory with different choices of the propagator. Conventionally, one has (in a massless theory) a propagator of the form 1/p^2, corresponding to a kinetic term with two derivatives. However, there can also be 2-point terms with more derivatives (indeed, they are generated by quantum fluctuations), in particular a fourth-derivative term. This raises the question whether one can equivalently use that term as the propagator, i.e. assign the value 1/p^4 to edges in #FeynmanIntegral s, and use the other term as an interaction vertex. In principle this works, but it leads to a number of technical issues, such as states with negative norm (ghosts).
The present preprint takes a different perspective: At low energies (consider e.g. plane waves with long wavelength), a fourth derivative will be numerically small, while it dominates at high energy. One can therefore view the transition from one choice of propagator to the other as a #renormalization group flow that starts in the UV with a fourth derivative, and arrives at a second derivative in the IR. An analogous argument has long been known for a mass term (i.e. 2-point term with zero derivatives): In the UV, the kinetic term p^2 determines the behaviour of the field (e.g. UV convergence of Feynman integrals), whereas at low energy, every propagator is essentially constant 1/m^2. Notice that all these transitions are taken at fixed spacetime dimension, whereas #tropicalFieldTheory is an analogous limit to zero derivatives in zero dimensions, which gives a different result.
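The crossover between the two power laws is easy to see numerically (a toy form of the 2-point function of my own choosing, not the preprint's conventions; M is a hypothetical scale separating the regimes):

```python
def propagator(p, M=1.0):
    """Toy propagator with both a p^2 and a p^4 kinetic term,
    G(p) = 1 / (p^2 + p^4 / M^2)."""
    return 1.0 / (p ** 2 + p ** 4 / M ** 2)

# IR (p << M): the p^2 term dominates, so G ~ 1/p^2
print(propagator(1e-3) * (1e-3) ** 2)  # close to 1
# UV (p >> M): the p^4 term dominates, so G ~ M^2/p^4
print(propagator(1e3) * (1e3) ** 4)    # close to 1
```

Reading the momentum scale p as the renormalization-group scale, this interpolation is exactly the UV 1/p^4 to IR 1/p^2 flow described above.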
https://arxiv.org/abs/2207.10596v1 -
Friday's #paperOfTheDay is "The functional f(R) approximation" from 2022. This is a review article about a certain approach to #quantum #gravity in #physics . Namely, the "asymptotic safety" scenario, which asserts that although the Einstein-Hilbert action is perturbatively not renormalizable, it will at high energies give rise to an interacting fixed point, so that the observables in fact stay finite.
In principle, the behaviour of high-energy quantum gravity can be computed with the methods of functional #renormalization group equations. In practice, a number of assumptions and approximations are required, for example choosing a suitable splitting of the full metric field into a background and quantum fluctuations, and choosing an IR cutoff functional that leads to analytic simplifications. Of particular importance is the choice of truncation for the effective action: The effective action is the generating functional of correlation functions; it contains all physical information. In gravity, these correlation functions can potentially depend on all possible tensor structures, and have an arbitrary dependence on momenta. The earliest truncation, used in the 1990s, was to assume that there are only two terms: one proportional to the cosmological constant, and one proportional to the curvature R. By now, many further terms have been included. The present review analyzes the case where an arbitrary function f(R) of the curvature is allowed. This includes arbitrary powers R^n, but also non-polynomial expressions such as exp(1/R).
https://arxiv.org/abs/2210.11356 -
#paperOfTheDay for Wednesday is "Form factors in quantum gravity: Contrasting non-local, ghost-free gravity and Asymptotic Safety" from 2022.
Unlike all other elementary forces, #gravity does not straightforwardly make sense as a perturbative #quantumFieldTheory . This has given rise to a number of alternative approaches over the decades, two of which are being compared in today's paper.
The first one is "asymptotic safety", which, roughly, asserts that the conventional Einstein-Hilbert action is indeed the correct low-energy description, but at higher energies, it does not simply blow up as one might expect from naive power counting. Instead, the strong gravitational interaction at high energy (or equivalently at short distances) produces a state that is essentially scale invariant: an interacting fixed point. To study this behaviour, one usually resorts to numerical integration of flow equations of the functional renormalization group.
The second approach is non-local ghost free gravity, where one assumes that, in perturbation theory, the propagator secretly has an exponentially decaying factor that only becomes relevant at high energies. This renders the theory renormalizable because it eliminates UV divergences.
The two approaches can also be interpreted in terms of two different, momentum-dependent wave-function #renormalization factors. They correspond to rather different high-energy behaviour, which, however, is far beyond the current range of experimental data.
https://www.sif.it/riviste/sif/ncc/econtents/2022/045/02/article/3 -
#paperOfTheDay for Wednesday is "Dimensional renormalization: The number of dimensions as a regularizing parameter" from 1972. As the title suggests, this is one of the articles that first introduced dimensional regularization.
In perturbative #QuantumFieldTheory (or statistical physics), one encounters #FeynmanIntegral s which are divergent. These divergences are eventually removed through #renormalization , but in order to even get to that point, one first needs to assign some value to these integrals. This is called regularization. Various methods of regularization are known, but the typical problem is that they destroy symmetries of the theory. Dimensional regularization was a breakthrough for practical computation of Feynman integrals because it respects many symmetries.
The basic idea is to define an integral for non-integer dimension of spacetime. This is done, essentially, by analytic continuation: We know what it means to take a first, second, third, etc., derivative of a function, and to integrate it once, twice, thrice, etc. If the function is spherically symmetric (i.e. depends only on the radius in spherical coordinates), then the "count" of the integrals or derivatives appears as an explicit number in intermediate steps. For example, the volume element in 3-dimensional spherical coordinates is r^2*dr*(angular part), where the exponent "2" represents the dimension D=2+1=3. Basically, you could insert any number in place of the "2" and declare this to be the D-dimensional integral. Of course, in reality this is more sophisticated, but the basic idea is very much in this spirit.
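The radial trick can be made concrete with the textbook Gaussian integral: in any dimension D, the integral of exp(-p^2) over D-dimensional space equals pi^(D/2), and the standard continuation (my own sketch of the well-known formula, not this paper's notation) just inserts a non-integer D into the sphere surface 2*pi^(D/2)/Gamma(D/2) and the radial exponent r^(D-1):

```python
from math import gamma, pi

def gaussian_integral(D):
    """D-dimensional integral of exp(-p^2), continued to non-integer D:
    (surface of the unit (D-1)-sphere) * integral_0^inf r^(D-1) exp(-r^2) dr."""
    surface = 2 * pi ** (D / 2) / gamma(D / 2)  # 2*pi^(D/2) / Gamma(D/2)
    radial = gamma(D / 2) / 2                   # Gamma(D/2) / 2
    return surface * radial                     # = pi^(D/2)

print(gaussian_integral(3.0))  # the familiar 3-dimensional result, pi^(3/2)
print(gaussian_integral(3.7))  # perfectly well-defined at "dimension" 3.7
```

In actual Feynman-integral computations the same Gamma functions appear, and UV divergences show up as poles of Gamma at non-positive arguments as D approaches the physical integer dimension.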
https://link.springer.com/article/10.1007/BF02895558 -
Today in the #dailyPaperChallenge I read (parts of) "critical \( \phi^4_{3,\epsilon} \)" from 2002. This paper deals with the #renormalization of a quartic interacting #fieldTheory in 3 dimensions, but with a non-standard (long-range) propagator \(1/p^{\frac 3 2} \). The methods they use are quite different from what I am accustomed to, but there are two points of contact with my work: Firstly, this theory is an example of a marginally coupled \(\phi^4 \) theory with non-integer propagator power. The #tropicalFieldTheory we are currently developing is also of that type. And secondly, the algebraic/combinatorial operations they use seem to fit nicely into a Hopf algebra description à la Connes-Kreimer (probably someone has already worked that out in the 20 years since). Besides that, this paper also includes one section that is just a sequence of 24 lemmas, which would be more typical of Wittgenstein's Tractatus than of a physics paper. What I also liked was that the paragraphs have individual titles, which makes the structure of the arguments very easy to follow. https://link.springer.com/article/10.1007/s00220-003-0895-4
-
Two years ago, I began writing my #doctoralThesis in theoretical #physics. Most effort went into giving a very detailed pedagogical account of what the #renormalization #HopfAlgebra in #QuantumFieldTheory does, and why it is natural and transparent from a physical perspective.
One year ago, my referees recommended in their reports to publish the thesis as a book, and today I received the printed copies!
It was exciting to go through all the steps of actually publishing a book, and I hope that it will help convince physicists that the Hopf algebra structure in #QFT is not a weird mathematical conundrum, but actually encodes the very way physicists have been thinking of renormalization since the 1950s: Parametrize a theory by quantities one can actually measure, instead of fictional expansion parameters.
https://link.springer.com/book/10.1007/978-3-031-54446-0