CONFERENCE
Recording during the thematic meeting: « Combinatorics on words », on February 26, 2024, at the Centre International de Rencontres Mathématiques (Marseille, France)
Filmmaker : Luca Récanzone
Find this video and other talks given by worldwide mathematicians on CIRM’s Audiovisual Mathematics Library: http://library.cirm-math.fr. And discover all its functionalities: – Chapter markers and keywords to watch the parts of your choice in the video – Videos enriched with abstracts, bibliographies, Mathematics Subject Classification – Multi-criteria search by author, title, tags, mathematical area
Okay, thank you Julian. First, thanks to the organizers, and to everybody for being here at nine-something in the morning on the first day. As my title says, I will talk about the Thue choice number, which is a problem I find interesting and which is a good motivation for what I really want to talk about, namely this counting argument, a proof technique that I like. I think some of you might already have heard me talk about it at other events, but still, my goal today is really to talk about this counting argument, and the Thue choice number is going to be my motivation for it. Okay, so let me recall some definitions that not all of you might be familiar with. What I am going to call a square is simply a word of the form uu, where u is not empty, and the period of the square is the length of the part which is repeated. A word is square-free if none of its factors is a square. For instance, abcabc is a square; the following word is not square-free because it contains the square bcbc as a factor; and this one is square-free, you can check that none of its factors is a square. Okay, so that is the notion of square.
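As a concrete aside (this snippet is my own illustration, not part of the talk), checking whether a given finite word is square-free is straightforward; the example words are chosen to mirror the ones just described.

```python
def is_square_free(word):
    """Return True if no factor of `word` is a square uu with u non-empty."""
    n = len(word)
    for start in range(n):
        for period in range(1, (n - start) // 2 + 1):
            if word[start:start + period] == word[start + period:start + 2 * period]:
                return False
    return True

print(is_square_free("abcabc"))   # False: the whole word is a square
print(is_square_free("abcbcb"))   # False: it contains the square "bcbc"
print(is_square_free("abcacb"))   # True: no factor is a square
```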
So let me recall this theorem, a result of Thue from 1906, which says that by using only three different letters you can make an infinite sequence which does not contain any square: there are arbitrarily long square-free words over {0,1,2}. This result is often considered the starting point of combinatorics on words, one of its first results, so naturally it has been generalized in many directions, and many variations of it have been studied; I am just giving some of them now. A lot of them are in combinatorics on words, obviously, but there are some, which I am mentioning at the end, that are not, like this notion of non-repetitive colorings of graphs, where the idea is that you want to color a graph so that no square appears in the coloring when you look at a path. So this has been generalized in really many directions, and one more direction that I want to mention today is the one I am going to explain in the next slides; it is also a generalization of this old result of Thue. For this I am going to look at a slightly different setting than the one we are used to for this kind of question. I am not going to have one alphabet: usually we have one alphabet and we ask whether we can avoid something over words over this alphabet.
Now I have a slightly different setting: I have a sequence of alphabets, so at every position I have a different alphabet, and I want to build a word such that the i-th letter comes from the i-th alphabet. In particular, if my word w is infinite, I will just write that w belongs to the product of my alphabets, which makes sense. So for instance, I have a sequence of alphabets and I want to build a word that respects this sequence: I can do this, the 0 comes from the first alphabet, the 2 comes from the second alphabet, and so on. It is just a generalization of the setting we are used to in combinatorics on words. And again, what I want to do is avoid squares; this is what I want to generalize.
So the question for me would be: given a particular sequence of alphabets, can you build a word which avoids squares over this particular sequence of alphabets? The first thing you might want to look at is something related to the size of these alphabets. In particular, can you find an integer k such that if all the alphabets have size at least k, then you can build an infinite square-free word which respects the sequence of alphabets? If such a k exists, I am going to call the smallest one the Thue choice number. Okay, a little bit about this. The question might seem a bit strange; in particular, intuitively, when you think about squares it feels like if consecutive alphabets are different then it should be easier to avoid squares, because if I do not have the same letters available at consecutive positions then it is going to be easier to choose letters which are different. So there is this intuition that the worst case should be when all the alphabets are identical, and then you should be able to take the value three. We should not trust this intuition; maybe it is true, but we should not believe in it, and I just want to illustrate this in the next slide, so I will make a small detour via graph coloring,
which is one of the motivations of this question. There is this notion of proper coloring of a graph: a proper coloring of a graph just means I want to give colors to the vertices so that adjacent vertices do not receive the same color. In particular, if I look for instance at this graph, I can properly color it with two colors, like this; it is a bipartite graph. Now I want you to look at a generalization of the question called list coloring, which has been studied a lot by people in graph theory. I am giving two colors to each vertex, and now I want to find a proper coloring of this graph which respects these lists. And somehow the intuition should be even stronger there, because clearly, if adjacent vertices have different lists then it should be easier to color, since I only want adjacent vertices to receive different colors, so it should help me. However, if we look at this particular choice of lists, it is actually not possible to provide a proper
coloring. How can you see that? Well, first look at the vertices on the left: if they take the colors green and yellow, then I cannot color the vertex on the top right; if they take green and red, then I cannot color the second vertex on the right; if they take blue and yellow, I cannot color the third one; and if they take blue and red, I cannot color the last one. This example is really easy to generalize to lists of size k: you have k vertices on the left, k^k vertices on the right, you need only two colors to properly color the graph, but you cannot do it with lists of size k. So there is a really big gap between what you can do when the set of colors is the same everywhere and what you can do when you only know that the size of each set is at least something. So really the point of this detour was to say: do not trust your intuition on this kind of question.
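Since the k = 2 counterexample is small, it can be checked by brute force. The sketch below is my own illustration (the color names and the K_{2,4} encoding are assumptions, not from the slides): every choice of colors from the lists leaves some right vertex in conflict, even though the underlying bipartite graph is 2-colorable.

```python
from itertools import product

# Left vertices have disjoint lists of size 2; each right vertex carries one
# of the four "transversals" (one color from each left list).
left_lists  = [("green", "blue"), ("yellow", "red")]
right_lists = [(a, b) for a in left_lists[0] for b in left_lists[1]]

def has_proper_list_coloring():
    """K_{2,4}: every left vertex is adjacent to every right vertex, so a
    proper coloring needs the chosen left and right colors to be disjoint."""
    for left_choice in product(*left_lists):
        for right_choice in product(*right_lists):
            if not set(left_choice) & set(right_choice):
                return True
    return False

print(has_proper_list_coloring())  # False: no proper coloring respects the lists
# The graph itself is bipartite, so two colors suffice when we may choose them:
# color every left vertex "green" and every right vertex "red".
```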
My intuition, at least, is bad; I don't know about yours, but mine is. Okay, so let's not talk about graph coloring any more. Still, I can prove something, which is that if all of my lists have size at least four, then I can build a square-free word. So this is what I am going to prove: I have my sequence of alphabets, all of them of size at least four. By the way, I will use the words "lists" and "alphabets" interchangeably; they mean the same thing here, there is nothing
deep about this. I should first say a few things about the references I have indicated. There is the first proof of this result, by Grytczuk, Przybyło and Zhu in 2011, hopefully I am pronouncing the names right; it used something called the left-handed local lemma, and the main proof was something like seven pages. A bit later, Grytczuk and other coauthors improved it using entropy compression, and then the proof was something like five pages. A bit later still, I do not know by whom exactly, but using a technique called the power series method, it was again improved and the proof became much simpler. Finally, in 2020 I gave an even simpler proof, which takes something like half a page, and this is what I am going to present in the next slides; this is really the proof I want to present
today: the proof of this result using the technique which I am going to call the counting argument for the sake of this presentation. So how am I going to prove this? Well, the first idea, as the name suggests, is that instead of proving that there is at least one word of each length, I am going to prove that there are a lot of them; I am going to count them. For this I just need to introduce a bit of notation. I will fix my sequence of alphabets for the rest of these two slides, and I will write C_n for the set of square-free words of length n that respect my sequence of alphabets. What I want to prove is the following lemma: the size of C_{n+1} is always at least twice the size of C_n, which is way stronger than what we had before. In particular, this implies that the size of C_n is at least 2^n, because the empty word avoids squares, so there is at least one word of length zero, and clearly 2^n is at least one, so we have at least one word of
every length. But really, introducing this particular lemma is a crucial part, because I am going to provide a proof by induction of this lemma, and as often, it is really important for the induction to work to have the right hypothesis. Half of the idea of this proof is really to introduce the right lemma at some point. Now I need to prove this lemma by induction; this is the next slide. Hopefully I am not going to go too fast; in my
opinion this is the most interesting thing I have to say, the proof on the next slide, so if something is not clear please interrupt me. Okay, as I said, I want to prove by induction on n that the size of C_{n+1} is at least twice the size of C_n. Let me state the consequence of my induction hypothesis right now, so I do not have to think about it any more: if I compare C_n to a smaller C_{n-j}, then the size of C_{n-j} is at most the size of C_n divided by 2^j, because I am removing j letters; it is just applying my induction hypothesis a few times. We will use this at the end in some computation.
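Written out in symbols, the step just described is (my transcription of the slide's notation):

```latex
\[
  |C_{\,n-j}| \;\le\; \frac{|C_{\,n-j+1}|}{2} \;\le\; \cdots \;\le\; \frac{|C_n|}{2^{\,j}},
  \qquad 0 \le j \le n .
\]
```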
That I’m doing an indiction anymore so now what’s right the the idea of the proof the idea of the proof is I’m going to take words from CN so I want Square Words which respect my alphabet of lens n and I want to look at what happens
Whenever I’m trying to adding the next letter so I’m going to take a letter from my next alphabet and I’m trying to add one letter from this alphabet what happens well somehow to things can happen either this is a square forward in which case it goes into CN plus one
Or I will put it in b b is just what I’m going to call the the set of these bad extensions right so I’m starting from a word from CN I’m adding a new letter from my alphabet and for some reason something went wrong then I’m just going to put them in
B be like bad uh okay then by definition uh the size the size of cn+ one is exactly the size of CN multiplied by the number of extension minus the bad extension somehow my my hypothesis was that every alphabet is of size at least
Four so let’s use it right now so I know that the size of c one is at least four times the size of CN minus the bad extension so now my goal is going to be to Upper bound the number of bad extensions and for this uh I will just
partition my set into smaller sets which I am going to call B_i: B_i is the set of words from B that end with a square of period i. In particular, I know that all the words of B end with a square, because if something went wrong it means that a square appeared, and it has to appear at the end. So I partition this set into the B_i, which means that B is the union of all the B_i, and in particular the size of B is at most the sum of the sizes of the B_i. Great. Now let me just plug this into the equation at the top, to make some progress: the size of C_{n+1} is at least four times the size of C_n minus the sum of the sizes of the B_i. Good. Now let's bound the size of B, and this is going to be the second important part of the argument; we will use a simple idea for this. Let's take a word w that
belongs to one of the B_i. So w belongs to B_i; I know one thing about w, which is that it ends with a square of period i, so I have this square uu at the end, where the length of u is i. Now let's look at the beginning of this word, by which I mean I take everything before the second occurrence of u. In this word, which I am going to call v, I know there is no square: just by removing the last letter I know that I have killed all the squares, so if I remove more than the last letter I have also killed all the squares. So v is a square-free word and it respects my alphabets, so v belongs to C_{n+1-i}. Now, if I have one such v, can I get back the word w? Yes: I just need to take the last i letters of v and repeat them. So this gives me a unique way to get back my word w from a word of C_{n+1-i}, which means that the size of B_i is at most the
size of C_{n+1-i}. [Question from the audience] You said we should ask a question if we don't understand, so maybe it's the jet lag, but could you explain why v has to be square-free? It seems to me that you may have many squares in one word, so by simply removing or breaking up the square at the end, why is the remainder square-free? [Answer] My definition of B was that I take a word from C_n, which is square-free, and I add one more letter. So if there is a square, I know it is at the end and there is no other square; so if I remove at least the last letter, it is square-free. [Audience] It was the jet lag, thank you. Good that you asked anyway. Okay, so basically now I have
Done everything I’ve just need to use my three equation together and we will have the conclusion the desired conclusion uh so yeah so as I said the size of bi is at most the size of CN plus one let’s use my hypothesis up there since I’m
Removing I minus one letters I know that this is at most the size of CN ID 2 to the I minus one right uh now I’m going to use it with the other equation and this is done uh so yeah so let me just I’ve just copied the equation on the top
I will just replace size of the bi by the upper B that I have from the size of the bi I’m just going to factorize by CN to make it slightly more convenient and this is what I get and this is actually the like the most advanced part of the
Proof you need to know that the sum of the 2 to the is two and then you get that two to the minus I sorry and then you get that uh yes the size of CN plus one is at least twice the size of CN and that’s it that’s the proof um
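For reference, the whole computation of this slide written out with the notation above (my reconstruction):

```latex
\[
  |C_{n+1}| \;\ge\; 4\,|C_n| - \sum_{i\ge 1} |B_i|
            \;\ge\; 4\,|C_n| - \sum_{i\ge 1} |C_{\,n+1-i}|
            \;\ge\; 4\,|C_n| - \sum_{i\ge 1} \frac{|C_n|}{2^{\,i-1}}
            \;=\; |C_n|\,(4 - 2)
            \;=\; 2\,|C_n| .
\]
```

(Strictly speaking, i only runs over the periods that can actually occur, at most (n+1)/2, but extending the sum to all i ≥ 1 only weakens the bound; this is the point raised by a question a little later.)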
So, this proof: I think it is quite simple, but maybe if you have not looked at this kind of problem it is not apparent that you would otherwise need rather strong tools to do similar things. It also relies on really simple arguments, which I think is quite convenient. [Comment from the audience about an equality on the slide] Right, this is true as well, this is correct, I could be sharper; I will discuss the sharpness of this proof in a while and we can come back to this comment, you are right. Maybe just to say a few words: this proof is in fact much more general than this, you can apply it to many other problems, and usually at the end you get an inequality, not an equality; that is why, yes, I should have written an inequality, I agree with you. Okay, so let me just recall what we have just proven, which is in fact slightly stronger than this: if all of my alphabets have size at least four, then there exists a square-free word. [Question] What if the size of the A_i is three, does the conclusion also hold? The size is what, sorry? Three? I will get to it in five seconds. So this is what I have just proven, and this is the question I am going to
ask: what if the size of the alphabets is three? Actually, we do not know; I will say a bit more about this later. Really, this question, some seven years ago, was my main motivation to look at this kind of technique, which since then I have used to do a lot of other things, but I have not been able to solve this question with it yet. So I do not know what happens when the size is three. Okay, let's talk about your other question, the sharpness of the proof. There are some
parts of this proof which are not sharp, where we could hope to improve it. If we look at my induction hypothesis, or at the first equation I have at the top, I do not really have any other choice there. Again, when I define B, the second line of equations has to be what it is, because the size of A_{n+1} could be exactly four by my assumption. But some parts are not sharp. The first thing is that I write that B is the union of the B_i, so the size of B is at most the sum of the sizes of the B_i, which might not be tight if some of the B_i intersect each other. In this particular setting it happens that I could replace this inequality by an equality, because of the way squares can intersect each other: if a word were in two different B_i, there would be a third, smaller square which is not at the end of the word, which would contradict my hypothesis. So I could make this an equality, and there is no hope of improving this inequality. But then there is the one which is really interesting, this one: the argument is nice, but this inequality is far from being sharp. So if we want to improve this proof, for instance if we want to prove something better than two, say that the growth rate is at least, I don't know, 2.3 or something like that, then we need to work on this part.
This is what I want to discuss a little bit in the next few minutes. I have this bound on the size of B_i which is not sharp. One thing to note is that if i is large I do not really care, because of this geometric series: for i equal to, I don't know, 1,000, the corresponding term is going to be a really, really small quantity, so if I want a good approximation of my growth rate it does not really matter. The terms that matter are the ones where i is small, for i equal to 1, 2, 3, 4, 5; there, maybe, I am losing a lot in my inequalities. So how can we improve this? There are some techniques to improve it; for this you need to look at what precisely happens in the local behavior, and one way to do this is: let's look at an approximation of my language. I am going to look at words which avoid only short squares; I am going to forbid only the
short squares, because they are the ones which it seems I need to look at. Now I am forbidding only finitely many factors, so I can use automata theory: this is just a regular language, I can compute an automaton, compute the growth rate associated with this automaton, and somehow find the growth rate of this language. And I have this hope, which is this inequality: it is not correct, but it is what I wish were correct: the size of C_{n+1} should be at least the size of C_n multiplied by the growth rate of the words which avoid short squares, except that I also need to avoid the long squares, so I need to remove the bad extensions that are due to large squares. Obviously this is not correct as written, because for it to be correct the set here would need to be the set of words without short squares, so somewhere it breaks; but there are ways to fix this.
[Question from the audience about the range of the sum] Yes, it should be for i smaller than n, but again it does not really matter; you are correct, I could stop the sum at n, that's true, but I want to do it for any n, so it is the same. Okay, I don't want to go into details, but the idea is that using just basic automata theory and linear algebra you can find some weights: instead of just counting the words, I am going to give a different weight to different words, and say that the weight of a set of words is the sum of the weights of its words. And now I can actually prove such an inequality using these weights: I can prove that the weight of C_{n+1} is at least alpha times the weight of C_n minus the bad extensions, and all of this uses the spectral radius of the automaton; you find the associated eigenvectors. Everything is quite classical, we know how to do it efficiently with a computer, everything is fine. The only difficulty is that the map you need to bound the B_i might be more complicated.
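As an illustration of the "approximate the language by forbidding only short squares and measure its growth" step, here is a small sketch of mine (fixed alphabet, so it sidesteps the question asked next about varying alphabets; the parameters are arbitrary). It counts words avoiding squares of period at most p by grouping them according to their last 2p letters, and the ratio of consecutive counts converges to the growth rate, i.e. the spectral radius of the corresponding automaton.

```python
from collections import defaultdict

def growth_rate_short_square_free(alphabet_size=4, max_period=2, steps=30):
    """Estimate the growth rate of words over a fixed alphabet avoiding all
    squares of period <= max_period.  A square of period p ending at the last
    position only depends on the last 2p letters, so keeping the last
    2*max_period letters as the state is enough to extend words correctly."""
    m = 2 * max_period
    alphabet = range(alphabet_size)

    def creates_short_square(suffix):
        # Does a square of period <= max_period end at the last position?
        return any(len(suffix) >= 2 * p and suffix[-2*p:-p] == suffix[-p:]
                   for p in range(1, max_period + 1))

    counts = defaultdict(int)
    counts[()] = 1                      # the empty word
    ratio = None
    for _ in range(steps):
        new_counts = defaultdict(int)
        for suffix, c in counts.items():
            for a in alphabet:
                ext = suffix + (a,)
                if not creates_short_square(ext):
                    new_counts[ext[-m:]] += c
        ratio = sum(new_counts.values()) / sum(counts.values())
        counts = new_counts
    return ratio

print(growth_rate_short_square_free())  # estimate of the growth rate (about 2.73 here)
```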
[Question] If all of your alphabets are the same, it is very clear what alpha is, with the automaton; but if all of your alphabets are different, this is more complicated: how do you compute it with an automaton? [Answer] This is exactly the kind of detail I do not want to talk about, but yes, this is more complicated. The idea is the same, it is just that you have to carry the index n around and do technical stuff which is a bit annoying; you need some extra ideas to make it work, but it is quite similar. Anyway, the idea is: I compute an approximation of my language, I can find the growth rate of this approximation, and now, using this counting argument, I can prove that I do not need to remove too much, all these bad extensions are something which can be taken to be really small, and I get the growth rate of the language I was really interested in. You can do it by hand, computing small automata by hand, or you can do it using a big computer, computing really large automata for really large approximations of your language, and this is the kind of thing I have been doing; this is what I will be talking about. Oh yes, I should say a few words: a similar technique was
already used by Kolpakov, and then by Shur, to obtain bounds on the growth of power-free languages, so there are some ideas which are really similar in there. Okay, and so, using this idea and some things I do not want
to talk about because they are just really annoyingly technical, I can prove something like this, which is really weak in some sense, but I am really happy about it, so I am going to talk about it. Suppose all the alphabets have size three, which is really what I want, but now I need some extra condition, which is that all the alphabets are subsets of {0, 1, 2, 3}. So in total I only have four different letters, but all the alphabets have size three; then there are infinite words that respect this sequence of alphabets and which avoid squares. So again there is something going against the intuition I was mentioning: the only setting in which I can prove something is when the alphabets are close to each other. If all the alphabets look like each other then I can prove something, but if you make the alphabets more different I will have more difficulty proving that it works, which is really counter-intuitive, maybe because of the technique I am using, but still. I am not going to go into the details of this proof, but it uses some of the ideas I have mentioned before, plus some technicalities, because the alphabet changes at every position, so you cannot use linear algebra any more, but you do something that looks like the standard linear algebra on
automata. So what I do: I need to compute the approximation automaton that recognizes words which do not contain squares of period less than 22, and this automaton is quite big: it has one billion states. In fact it has more states than this, but I do some kind of renaming: I can always assume that the last letter of my word is 0 up to renaming, and that the previous letter is 1 up to renaming, and by doing this I can divide the number of states by 24. So in fact the automaton has 24 billion states, but you can use this trick to make it a bit smaller. It takes something like 7 GB of memory, with a lot of optimization, and two hours of computation, so it already requires a quite powerful machine. But something interesting about this technique is that you could replace {0,1,2,3} by {0,1,2,3,4} and it should still work; the only thing is that now this requires the kind of computer I do not have, so I
cannot do it. But it should work: if the conjecture is true, then this method should be able to prove it. More than this, if the conjecture is true in general, that is, if you can just get rid of the second condition, and if another conjecture is true, namely that the growth rate of the language is the same, which seems to be true as well, then actually I could even prove it with this proof; it would just require something like 10^7 GB of memory, which, I mean, is a lot, but it is not a lot a lot, you know, maybe in 20 years. So I don't know. Anyway, there is this proof technique that might help solve this problem; obviously a direction is to try to improve the technique first, but maybe we can just wait 20 years, I don't know. Okay, so this is what I wanted to say about the Thue choice number. There are other things I want to mention related to this proof technique, because, as I said, I think
for me the most interesting part of this talk is really this proof technique, which I like, and it can be applied to many different problems. So let me mention for instance this: suppose you have an alphabet and a set of forbidden patterns, so I am back to the more standard setting in combinatorics on words, I have one alphabet, I do not have a sequence of alphabets any more. Suppose that there exists a positive real x which is a solution of the following inequality. This inequality should be reminiscent of what we have just seen: in the previous proof, the size of the alphabet was four, the geometric sum looked like the sum of the 2^{-i}, and the thing on the right looked like two. So it is not exactly the same equation, but it is really the same idea, and the proof relies on the same idea. Anyway, if you have a solution to this inequality, then this language is not empty, and in fact the growth of this language is at least x. So how would you prove this? Well, again, you fix the alphabet
and the set of forbidden patterns, and you say: I want to count the valid words. So C_n is going to be the set of words of length n which avoid my forbidden set of factors, and what I want to prove is that whenever I increase n by one, I multiply the number of words by at least x. This means in particular that I have at least x^n words of length n, which means I have more than zero, which means I have at least one. So, great, this means that there is at least one word. In fact, if there is a solution to this inequality, the solution x is always strictly larger than one, except for a few degenerate cases, so in fact it gives you exponential growth, but I do not even need this.
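The condition itself is easy to explore numerically. The sketch below is my own toy illustration; in particular, the inequality |A| - sum over f in F of x^(-(|f|-1)) >= x is my reconstruction of the condition on the slide, based on how it is used in the proof that follows.

```python
def best_rate(alphabet_size, forbidden_lengths, samples=200000):
    """Largest x (on a grid) satisfying
       alphabet_size - sum(x**-(L-1) for L in forbidden_lengths) >= x,
    the growth-rate lower bound given by the counting lemma; None if no x works."""
    best = None
    for k in range(1, samples + 1):
        x = alphabet_size * k / samples   # scan (0, alphabet_size]
        if alphabet_size - sum(x ** -(L - 1) for L in forbidden_lengths) >= x:
            best = x
    return best

# Toy example: over 5 letters, forbid the five factors 00, 11, 22, 33, 44.
# The condition 5 - 5/x >= x holds up to x = (5 + sqrt(5))/2 ~ 3.618, so words
# with no immediately repeated letter grow at rate at least ~3.618 (the true
# growth rate is 4, so the bound is not tight, but it is nontrivial).
print(best_rate(5, [2, 2, 2, 2, 2]))   # ~3.618
```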
So how does the proof work? I am going to do it quickly, but it is the same as the one we have seen before, so you should be able to get some idea of what I am doing. I want to prove by induction that the size of C_{n+1} is at least x times the size of C_n. For this, I first use my induction hypothesis: I know that when I remove j letters, I divide by at least x^j. Again, I look at what happens whenever I do an extension: I take a valid word of length n and I add the next letter. It could go wrong, in which case I put it in B, or it could still be valid, in which case it is in C_{n+1}; so I have a similar equation. Now I have to compute a bound on B. For this I am going to split it again, and decompose it according to f: for every forbidden pattern f, B_f is the set of words from B that end with an occurrence of this particular forbidden factor. Since I know that every word in B ends with a forbidden factor, the union of the B_f is B, so the size of B is at most the sum of the sizes of the B_f. Now the idea is again exactly the same: suppose I have w inside B_f; this means that w is some word v followed by f. Now, if I know this word v, I can recover w uniquely; and moreover, since I have removed at least the last letter, I have removed all occurrences of bad factors, so v belongs
to C_{n+1-|f|}, and I have the following inequality: the size of B_f is at most the size of C_{n+1-|f|}. Now again I just have to put all the equations together and it is going to give me what I want. Let's use the induction hypothesis: the size of B_f is at most the size of C_n divided by x^{|f|-1}. Great, let's put everything together, I factorize by the size of C_n, and, you probably do not remember but I do, my hypothesis was exactly that this quantity is at least x, so I can conclude. Okay, so the idea of the proof is really simple, and you can actually apply this idea to many different settings.
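Putting the statement and the proof of this general lemma together, with A the alphabet, F the set of forbidden factors, C_n the words of length n avoiding F, and B_f as above (again my reconstruction of the slides):

```latex
\[
  \exists\, x > 0:\quad |A| \;-\; \sum_{f\in F} x^{-(|f|-1)} \;\ge\; x
  \;\;\Longrightarrow\;\; |C_{n+1}| \ge x\,|C_n| \text{ for all } n,
  \text{ hence } |C_n| \ge x^{\,n}.
\]
\[
  |C_{n+1}| \;\ge\; |A|\,|C_n| - \sum_{f\in F}|B_f|
            \;\ge\; |A|\,|C_n| - \sum_{f\in F}|C_{\,n+1-|f|}|
            \;\ge\; |C_n|\Bigl(|A| - \sum_{f\in F} x^{-(|f|-1)}\Bigr)
            \;\ge\; x\,|C_n| .
\]
```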
The other thing: when I showed the previous slide and you saw this condition, that the size of the alphabet minus this geometric sum, sorry, this series, is at least x, it might look like it comes from nowhere, but actually, at the beginning you do not know what the assumption is, you try to do the proof, and the assumption really comes naturally at the end; you really do not have to guess the correct assumption, it just comes out of the proof directly. Okay, so that is the proof. What I should say is that proofs of the same result were already known; there is one by Ochem using the power series method, in 2016. I claim that my proof is slightly simpler, but not by that much; it is just a good illustration of the technique, and the proof using the power series is not much more complicated, I think it is nice as well. It is also stronger than results used by Miller and by Pavlov recently to prove things about different properties of subshifts; in particular they wanted something about the size, sorry, the growth of the subshift, and the bounds that we obtain by doing this are much better than the ones they have. So it is nice, you can apply it to different problems; right now, if you look at it, it is not clear that this lemma gives you anything interesting, but you can apply it to actual sets of forbidden patterns and you might obtain interesting properties. Okay, so now, to finish, I want to mention a few other applications; maybe we can come back to this point afterwards. So I just want to mention
Quickly I’m not going to give the definition but we’ve seen that this uh this argument it right I’ve been using it mostly to say uh the set is not empty but but in the proof I always need to to say the gross of the set is at least x
To the end so can can we say something really interesting with this can we prove that the gross of some uh set of words is large and so for instance there is a following setting so I’m not going to give the definition but I’m interested in the growth rate of uh the
Language of x3 word over the alphabet 1 2K right okay that’s that’s a question and there is the following conjecture by sh uh from 2009 who gives like somehow a good uh yeah like precise estimate of the grow of different languages and the conjecture is that
This should be true so it was open for a while actually it’s still open but I was able with this technique I’ve mentioned to prove at least one side of the conjecture which is yeah the gross is indeed at least this and since we expect
It to be this this mean that you can use this this counting argument to get really sharp Bond on the gross of the languages you’re looking at uh and in in this particular case the proof is quite simple like again it’s alha page just like the one for squares it’s really not
That complicated so yeah this this can be used to to obtain this kind of result about the gross of the languages as well and it’s completely by hand there is no computer involved in this which might be of use but okay so let me conclude by mentioning one last
problem, which I think is nice. This is a game between Ann and Ben. They share a word, and at each turn each of them adds a new letter at the end of the word. Ben likes squares, so he wants to create squares, and Ann hates squares, so she wants to prevent them from being created. If a square of length at least four appears, then Ben wins; that is the rule. Let's look at this game over two letters, for instance; I am going to give a strategy for Ben. Suppose Ann starts by playing 0, which we may assume in full generality; then the strategy of Ben is to play 0 as well. Now Ann has two possibilities. She could play 0, but then Ben plays 0 and there is a square of period two, of length four, sorry. So maybe Ann tries to play 1 instead, in which case the strategy of Ben is to play 0, and now there are again two possibilities for Ann: she could play 0, but in this case Ben plays 1 and we have the square 001001; or she could play 1, but then she has lost immediately, because there is the square 0101. So over two letters Ben wins pretty
easily. If you look at it over three letters, I think you can do it by hand; it is more complicated, the tree is slightly larger, but you can still do it by hand, drawing this kind of tree and checking that Ben also has a simple strategy with three letters. Using the technique called entropy compression, Grytczuk and coauthors proved that in fact, if there are at least six letters, then Ann wins, which is not obvious, I think: there is Ben, who tries to create squares, and she has enough power to just forbid all the squares, which is pretty nice. But still, this is open: what is the correct value, is it four, five or six? And, using again the kind of technique I have mentioned, though now I need a big computation with my computer, I was able to prove that if there are at least four letters, then Ann has a winning strategy.
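The two-letter case described above is small enough to check mechanically. The sketch below is mine, not the speaker's program: a depth-limited game-tree search verifying that over {0, 1}, Ben can force a square of length at least 4 within the first 8 letters, whatever Ann does (the horizon of 8 is an assumption that suffices here, since the strategy above needs at most 6 letters).

```python
from functools import lru_cache

ALPHABET = "01"
MAX_LETTERS = 8  # Ben's strategy above needs at most 6 letters

def ends_with_long_square(w):
    """True if a square of length >= 4 (period >= 2) ends at the last position."""
    return any(w[-2*p:-p] == w[-p:] for p in range(2, len(w)//2 + 1))

@lru_cache(maxsize=None)
def ben_can_win(w):
    if ends_with_long_square(w):
        return True
    if len(w) >= MAX_LETTERS:
        return False                    # Ben has not won within the horizon
    if len(w) % 2 == 0:                 # Ann moves on even positions (she starts)
        return all(ben_can_win(w + a) for a in ALPHABET)
    else:                               # Ben moves on odd positions
        return any(ben_can_win(w + a) for a in ALPHABET)

print(ben_can_win(""))  # True: Ben wins the game on two letters
```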
Okay, so I think this is a nice question, I like this game. But let me conclude. This counting argument: a similar technique was already used by other people, under the name of the power series method. As the name suggests, there were power series in it, which did not really help; actually, removing them made the proof simpler and also allowed me to generalize it to other objects. So I have been using this proof for graph coloring, hypergraph coloring, tilings, subshifts and many other settings; this idea, which comes from combinatorics on words, can actually be used in many other areas, which I think is quite nice. For instance, here is a
recent result I have, a theorem which is quite similar to the one I had before, but this time I am looking at tilings of some groups. If I have this condition that relates the forbidden patterns, then I can make a tiling of my group. Do not look at the details, but somehow you can use the same technique, and the proof is not much more complicated, it is almost the same. Okay, so this is all I wanted to say. What I want to say, maybe, is that if you have some combinatorial problem with this kind of flavor, you should try to use this technique, because it is not so complicated to use and it gives some interesting results. So thank you for your attention. [Applause]
[Chair] Okay, so are there questions? Please wait for the microphone before asking your question. [Question] Thank you. The technique I would use to know whether there are arbitrarily long words avoiding a set of forbidden words would be based on the classical automaton, which is essentially the Aho-Corasick automaton built on the trie of the forbidden words. You can suppose that the forbidden words are minimal, without loss of generality, and when you build the trie of all these forbidden words and you build the automaton that avoids these words, either it has a cycle or it does not, and of course you have arbitrarily long words if and only if the resulting automaton has a cycle. So there should be some relation between this and the counting technique, but it is not clear to me how, given the set of forbidden words, to predict by counting whether the resulting automaton will have a cycle. Can you apply this technique to directly give a threshold depending on the set of words, or is it completely unrelated? [Answer] Let me just put the slide back.
One thing to notice there: I am not asking my set of forbidden patterns to be finite. To apply automata theory you need the set of forbidden patterns to be finite, so in particular I can apply my technique to squares, but you cannot apply automata theory to squares, and this makes a huge difference. I think that is the fair part of the answer: you cannot apply the two to the same kinds of problems. Now, as you suggest, and this is what I was somehow hinting at earlier, you can use the two things together in some sense: you can use this counting technique to forbid the infinitely many factors which are big and which do not play a big role, and then use automata theory to control the small forbidden factors. By doing this you can obtain something which might be precise. In particular, say you want to forbid squares plus other factors: then I am claiming that I have a decision algorithm for this. I mean, I have no written proof of it, but I know that I can prove it, and the idea is really that you compute this big automaton at the beginning, which includes the set of finite factors plus all the small squares, and then you use the counting technique to eliminate the large squares; at some point, because of the way it works, that part becomes negligible if you go far enough, so it will work. In complete generality it is not clear where it works and where it does not, because the power series might not be that well behaved, and things like this. But somehow the most powerful thing is to use these two things together; they are complementary, they are not doing the same thing. [Question] Right, so, in order to get this kind of estimate, apart from your technique there is also entropy compression and the Lovász Local Lemma. Can you say in some precise way that this technique is strictly stronger in all of those cases, or are there cases where you cannot improve on the existing results using it?
[Answer] So, the thing is, I think that in this kind of setting it should be stronger than the Lovász Local Lemma. I do not have any proof of this, it is a claim which is pretty bold, but for this kind of setting it should be stronger than the Local Lemma; at least it is the case for all the applications we have, which might not be a good proof. Now, for the entropy compression that you mentioned, there are actually two versions of entropy compression: one of them is equivalent to the Local Lemma, and one of them is somehow equivalent to this counting argument. There is no proof of this, but you really have the same power series appearing at some point and you need to compute the same things, and at least in some particular settings there are proofs of it. In general, since we do not have a real general statement of this counting argument, it is really hard to say that it is the same thing, and in the same way we do not have a general statement of the other one; but whenever you try to make a big framework, you get the exact same condition with this version of entropy compression and with the counting argument. Also, entropy compression does not tell you anything about the size, about the growth of the language: it tells you that the language is not empty, but it does not tell you that it is large. [Thank you.] [Chair] Okay, one last question, I think. [Question] Thank you. Is there a hope that this argument says anything in the non-amenable
case? Sorry, for example, in your theorem you have amenability, which plays a role; is there a hope to go beyond it? [Answer] Well, you know, I do not really need amenability in the theorem; well, let me be more precise. What I can prove is that if I look at the growth in terms of locally valid words, then I do not need amenability; amenability allows me to say that the locally valid words have the same growth as the globally valid ones. So it is really not about this proof, it is really about how the statement is written: if I wrote that the complexity of my language is at least beta to the n, then I could remove amenability. [Thank you.] [Chair] Okay, so thank you again. [Applause]