Uniqueness vs. non-uniqueness in complete connections with modified majority rules

J. C. A. Dias
Departamento de Matemática, Universidade Federal de Ouro Preto, Morro do Cruzeiro, CEP 35400-000, Ouro Preto, Brasil
Departamento de Matemática, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, C.P. 702, CEP 30123-970, Belo Horizonte, Brasil

S. Friedli
Departamento de Matemática, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, C.P. 702, CEP 30123-970, Belo Horizonte, Brasil
Abstract.
We take a closer look at a class of chains with complete connections
inspired by the one of Berger,
Hoffman and Sidoravicius [1]. Besides giving a sharper description of
the uniqueness and non-uniqueness regimes, we show that if the pure majority
rule used to fix the dependence on the past is replaced with a function that is
Lipschitz at the origin, then uniqueness always holds, even with arbitrarily slowly
decaying variation.
1. Introduction
We consider stochastic processes Z=(Zt)t∈Z, where each Zt, t∈Z, is a
symbol taking values in a finite alphabet A.
we consider are called chains with complete
connections (Doeblin and Fortet [3]),
due to a dependence on the past of the following form.
Assume some measurable map
g:A×AN→[0,1] is given a priori,
called g-function, and that
for all t and all zt∈A,
(1)
P(Zt=zt|Zt−1−∞)=g(zt|Zt−1−∞) almost surely,
where Zt−1−∞=(Zt−1,Zt−2,…).
A process Z=(Zt)t∈Z satisfying (1)
is said to be specified by g.
The role played by g for Z is therefore analogous to a transition
kernel for a discrete time Markov process,
except that it allows dependencies on the whole past of the
process.
We will always assume that g is regular,
which means that it satisfies the following two conditions.
It is uniformly bounded away from 0 and 1: there exists
η>0 such that
η≤g(z0|z)≤1−η for all z0∈A, z∈AN.
Define the variation of g of order j by
varj(g):=sup|g(z0|z)−g(z0|z′)|,
where the sup is over all z0∈A, and over all
z,z′∈AN for which zi=z′i for all 1≤i≤j.
Then g is continuous in the sense that varj(g)→0
when j→∞.
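To make the definition of varj(g) concrete, here is a toy example of ours (not a model from [1]): take A={+1,−1} and a g-function in which the influence of the i-th coordinate of the past decays like 2^{−i}. The following Python sketch estimates varj(g) by brute force over pasts truncated at a finite depth:

```python
import itertools

EPS = 0.25  # dependence strength; keeps g within [1/4, 3/4], so eta = 1/4 works

def g_plus(past):
    # probability of the symbol +1 given past = (z_1, z_2, ...), z_i in {+1, -1};
    # the influence of coordinate i decays geometrically, like 2**(-i)
    return 0.5 + EPS * sum(z * 2.0 ** (-i) for i, z in enumerate(past, start=1))

def var_j(j, depth):
    # brute-force sup of |g(+|z) - g(+|z')| over pasts z, z' agreeing on the
    # first j coordinates, with pasts truncated at length `depth`
    worst = 0.0
    for head in itertools.product((+1, -1), repeat=j):
        vals = [g_plus(head + tail)
                for tail in itertools.product((+1, -1), repeat=depth - j)]
        worst = max(worst, max(vals) - min(vals))
    return worst
```

For this toy g-function one checks directly that varj(g)=2·EPS·2^{−j}, up to the truncation error 2·EPS·2^{−depth}, so the variation is summable and even exponentially small.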
When g is regular, the existence of at least one stationary process
specified by g follows by a standard compactness
argument (see also the explicit construction given below).
Once existence is guaranteed, uniqueness can be shown
under additional assumptions on the speed at which
varj(g)→0. For instance, Doeblin and Fortet
[3] showed that if
∑jvarj(g)<∞,
then there exists a unique process specified by g.
More recently, Johansson and Öberg [9]
strengthened this result, showing that uniqueness holds as soon
as
(2)
∑j varj(g)² < ∞.
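To compare the two criteria on a concrete family (a standard example, not taken from [1] or [9]), suppose the variation decays like a power law:

```latex
\[
  \mathrm{var}_j(g) = C\,j^{-\alpha} \quad\Longrightarrow\quad
  \sum_j \mathrm{var}_j(g) < \infty \iff \alpha > 1, \qquad
  \sum_j \mathrm{var}_j(g)^2 < \infty \iff \alpha > \tfrac{1}{2}.
\]
```

Thus every exponent α∈(1/2,1] is covered by (2) but not by the earlier Doeblin-Fortet summability condition.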
An interesting and natural question is to determine if a given
regular g-function can lead to a phase transition, that is if it
specifies at least two distinct processes.
In a pioneering paper, Bramson and Kalikow [2]
gave the first example of a regular g-function exhibiting a phase
transition. More recently, Berger, Hoffman and Sidoravicius
[1], in a remarkable paper, introduced a
model whose g-function also exhibits a phase
transition, but whose variation has a summability that can be made
arbitrarily close to the ℓ2-summability of
the Johansson-Öberg criterion (see Remark 2 below).
The g-functions constructed in [2] and [1]
have common features. The main one is that they both rely on some
majority rule used in order to fix
the influence of the past on the probability
distribution of the present. That is,
Zt+1, given (Zs)s≤t, is
determined by the sign (and not the true value)
of the average of a subset of the variables
(Zs)s≤t over a large finite region.
This feature is essential in the mechanisms that lead to
non-uniqueness, since it allows (roughly speaking)
small local fluctuations to have dramatic effects in the remote
future, thus favoring the transmission of information from −∞ to
+∞.
For the Bramson-Kalikow model, it had already been observed in [5] that
arbitrarily small changes in the behavior of the majority rule,
turning it smooth at the origin,
can have important consequences on uniqueness/non-uniqueness of the
process.
In this paper, we take a closer look at a class of models
based on the one of
Berger, Hoffman, and Sidoravicius (which will be called simply
the BHS-model hereafter).
Beyond giving a sharper description of the original model of [1],
our results show that any smoothing of the majority rule leads, under
general assumptions, to uniqueness, even for very slowly decaying variations.
We will present these models from scratch,
and not assume any prior knowledge about [1].
Since their construction is not trivial and deserves some
explanations, we will state our
results precisely only at the end of Section 2.
Before proceeding, we mention other works related to non-uniqueness.
In [8], Hulse gave examples of non-uniqueness, based on the Bramson-Kalikow
approach.
In [4], Fernández and Maillard constructed an example, using
a long-range spin system of statistical mechanics,
although in a non-shift-invariant framework.
In [6], Gallesco, Gallo and
Takahashi discussed the Bramson-Kalikow model under a different perspective.
1.1. Models considered
Although the basic structure of our model is entirely imported from the one of
BHS, our notation and terminology differ substantially from
those of [1].
The process Z=(Zt)t∈Z
defined in [1] takes values in an
alphabet with four symbols, where each symbol is actually a pair, which we denote
Zt=(Xt,ωt),
with Xt∈{+,−} and ωt∈{0,1}. (Often, we will abbreviate +1 (resp. −1) by + (resp. −).) The process can be considered as
constructed in two steps. First, a doubly-infinite sequence of i.i.d.
random variables ω=(ωt)t∈Z is sampled,
representing the environment, with distribution Q:
Q(ωt=1)=1−Q(ωt=0)=1/2.
Then, for a given environment ω, a process
X=(Xt)t∈Z is considered, whose conditional distribution
given ω is denoted Pω and
called the quenched distribution. We will assume that Pω-almost surely,
(3)
Pω(Xt=±1|Xt−1−∞=xt−1−∞)=(1/2){1±ψωt(xt−1−∞)},
where xt−1−∞=(xt−1,xt−2,…)∈{±}N.
The perturbation ψωt:{±}N→[−1,1] describes
how the variables of the process X differ
from those of an i.i.d. symmetric sequence (which corresponds to
ψωt≡0).
The quenched model will always be attractive, in the sense that
ψωt(xt−1−∞) is non-decreasing in each of the
variables xs, s<t.
We assume that the functions ψωt satisfy the following conditions:
For all x∈{±}N,
ψωt(x) depends only
on the environment variables ωs, with s
lying at and before time t.
The functions are odd, ψωt(−x)=−ψωt(x) for all
x∈{±}N, and bounded
uniformly in all their arguments:
|ψωt(x)|≤ϵ for some ϵ∈(0,1).
The maps (x,ω)↦ψωt(x) are continuous,
uniformly in t.
If θ:{0,1}Z→{0,1}Z denotes the shift,
(θω)s:=ωs+1, then
ψωt=ψθtω0.
The probability distribution P of the
joint process Zt=(Xt,ωt) is defined as follows.
If A∈F:=σ(Xt,t∈Z), B∈G:=σ(ωt,t∈Z), then
(4)
P(A×B):=∫BPω(A)Q(dω).
We will sometimes denote P by Q⊗Pω.
It can then be verified that under P, Z=(Zt)t∈Z is a chain with
complete connections specified by the regular g-function
(5)
g((±,ωt)|(xt−1,ωt−1),(xt−2,ωt−2),…):=(1/4){1±ψωt(xt−1−∞)}.
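As a quick sanity check (a sketch of ours, not code from [1]), one can verify numerically that (5) defines a probability distribution over the four symbols (±,ωt∈{0,1}), and that g is regular with η=(1−ϵ)/4 whenever |ψ|≤ϵ:

```python
def g(sign, psi):
    # conditional probability of the symbol (sign, omega_t), sign in {+1, -1},
    # as in (5); each of the two values of omega_t carries the same weight
    return 0.25 * (1 + sign * psi)

EPS = 0.8  # uniform bound on |psi| from the assumptions above (arbitrary here)

for psi in (-EPS, -0.3, 0.0, 0.3, EPS):
    # the four conditional probabilities sum to 1 for any value of psi...
    total = sum(g(s, psi) for s in (+1, -1) for omega in (0, 1))
    assert abs(total - 1.0) < 1e-12
    # ...and stay within [eta, 1 - eta] for eta = (1 - EPS) / 4
    eta = (1 - EPS) / 4
    assert all(eta <= g(s, psi) <= 1 - eta for s in (+1, -1))
```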
Although the processes specified by g are of a dynamical nature (the distribution of the
pair (xt,ωt) at time t being fixed by the entire past),
we will mostly work with the quenched picture in mind, thinking only of the variables
xt as dynamical, evolving in a fixed environment (ωt)t∈Z.
The precise definition of the functions
ψωt will be given in
Section 2.1.
Before that we describe, in an informal way, the
main ingredients that will appear in their construction.
1.2. Sampling a random set in the past
A natural feature of the model is that
the distribution of the process X at time t
is determined by its values over a finite (albeit large) region in the
past of t. Therefore, for a given environment ω,
the starting point will be to associate to each
time t∈Z a random set St=Sωt living in the
past of t: St⊂(−∞,t).
We will say that St targets the time t.
Although each St is either empty or finite, we will always have, Q-almost surely,
supt|St|=∞ and supt dist(t,St)=∞.
In the environment ω,
the distribution of Xt conditioned on its past (Xs)s<t (see
(3))
is determined by the values of X on St.
As a matter of fact, the distribution of Xt will depend on
the average of X on the set St:
ψωt(xt−1−∞)= odd function of (1/|Sωt|)∑s∈Sωt xs.
The precise dependence will be fixed by some majority rule.
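To make the distinction at the heart of this paper concrete, here is a minimal Python sketch of the two kinds of rules: the pure majority rule, which returns ±ϵ according to the sign of the empirical average over St, and a smoothed alternative that is Lipschitz at the origin. The tanh profile below is our illustrative choice, not the rule defined in Section 2.1:

```python
import math

def pure_majority(avg, eps):
    # pure majority rule: only the sign of the average matters, so the rule
    # jumps discontinuously from -eps to +eps across the origin
    if avg > 0:
        return eps
    if avg < 0:
        return -eps
    return 0.0

def smoothed_rule(avg, eps, lam=3.0):
    # a hypothetical Lipschitz-at-the-origin alternative (slope eps*lam at 0);
    # odd and bounded by eps, as required of the perturbations psi
    return eps * math.tanh(lam * avg)

def empirical_average(x, S):
    # average of the configuration x over the sampled set S in the past
    return sum(x[s] for s in S) / len(S)
```

A single spin flip inside St changes the empirical average by 2/|St|; under the pure rule this can flip the whole drift from −ϵ to +ϵ, whereas under the smoothed rule it changes ψ by at most 2·ϵ·λ/|St|, which is the mechanism that restores uniqueness.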