So sorry for starting late. This is the last lesson; I don't know if I will be able to finish in time, but in any case you have the slides.

We left the last lesson with the rise of the World Wide Web, which was based essentially on three things. The first was the idea of representing a document as a text file with a markup language. The second was a browser that could visualize the page, display images, and dialogue with a server. The third was the server itself, together with the development of a new kind of protocol, HTTP, which was essentially a very stripped-down version of something like FTP. In FTP you have to establish two channels, one for sending commands and the other for receiving data, and the connection is kept alive for the whole session. The idea of HTTP is much simpler: there is a handshake in which the client and the server establish the connection; the client sends a request, a GET or POST request, sending also cookies, as we shall see, which are small pieces of information; the server processes the request, which could be for a static page or a dynamic page, possibly requiring the execution of some program; it sends the response and closes the connection. So each connection is, in some sense, atomic: even as the client continues to navigate the chain of pages that make up a site, the connection is closed after each request (a minimal sketch of such an exchange appears at the end of this passage). This was the first version of the protocol; now there are ways of keeping the connection alive, for instance to populate a page with data coming from a database, but the spirit of the protocol is this one.

The explosion of the World Wide Web, which is a different thing from the Internet, which already existed, led to the disappearance of a lot of services like America Online and the Microsoft Network, which were based essentially on point-to-point connections from clients to servers, using modems. Each of them offered services like chat or email, but these were closed systems. I remember that DECnet, the network of the VAXes, was also a closed system, and so was BITNET, the network of IBM servers: in those cases you could send mail from one user to another inside the service, but not to users of other systems on the Internet. Bridges were then developed in order to send emails from BITNET or DECnet to the Internet and so on; but in any case, for services accessed through a browser this was a real revolution, because in principle one could do anything using a browser. This was also seen as a liberation: the Electronic Frontier Foundation, for instance, was founded just to defend Internet and WWW users.

Still, something was missing. This was the era in which the World Wide Web was seen essentially as a way of sharing information, not as a market. A payment system was completely missing, especially for small payments, and so was a way of searching for information: there was Gopher, but it was quite limited. The first answer was Yahoo. Yahoo had a nice interface, but it was populated by hand, maintained by a lot of people working on it, and it was essentially based on links to whole sites rather than to pages, clearly, because since it was done by hand they listed the sites dedicated to sports, to arts, to computers and so on.
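To make the HTTP request/response cycle described above concrete, here is a minimal sketch in Python (mine, not from the lecture), using only the standard library; the host "example.org" and the path are placeholder values:

```python
# A minimal sketch of one HTTP exchange: open a connection, send a GET
# request, read the response, close. The host and path are placeholders.
import http.client

conn = http.client.HTTPConnection("example.org", 80)
conn.request("GET", "/index.html")        # the client sends a GET request
response = conn.getresponse()             # status line + headers + body come back
print(response.status, response.reason)   # e.g. "200 OK"
body = response.read()                    # the page itself (HTML text)
conn.close()                              # early-HTTP style: one request, then close
```

Each exchange is self-contained, mirroring the "atomic" connection described above.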
The first revolution in this sense was AltaVista, again from DEC, the VAX house. AltaVista was the first engine to use robots, bots, to navigate the web, take information from pages, follow the links and so reconstruct the whole web. It was clearly a big revolution at the beginning, but after just one year it became essentially useless, because people trying to promote their sites, especially porn sites or sites selling something, exploited how it worked: the importance of a page, and so the rank at which it appeared, was based on the content of the page. If you searched for, I don't know, "path integral", it listed all pages containing the words "path integral", ranked by the number of occurrences of "path integral" inside the page. So people started to stuff pages with commonly searched terms, in transparent fonts, or very tiny fonts, or masked behind some image, so that the page was indexed at a high rank in AltaVista, since it seemed to contain a lot of relevant terms, while actually showing a completely different content, for instance using images, so that the robots could not read what was read by humans.

So in 1998 AltaVista was quickly replaced by Google Search, which developed some quite interesting pieces of software: first of all the database in which all the information was stored, and then PageRank, a way of assigning importance to a page. Pages were still classified using their contents, but their importance was given by how many links arrived at the page (a small sketch of the idea follows at the end of this passage). This was a way of exploiting human intelligence, because links are generally set up by humans and not by computers, so the idea was to exploit the information hidden inside the network of connections among pages and use it to build the service. Another interesting point, which is not covered here, is the algorithm for traversing the web: Google's robots essentially followed links with a certain probability, or jumped to another random page with a given probability, and this probability was adjusted in order to navigate quickly a web that was growing exponentially.

Then, clearly, the other thing Google invented was the revenue mechanism, pay-per-click, which we shall see in a moment: it was essentially getting money from advertising according to the number of clicks received by the links. After this, as you know, Google bought and adopted a lot of other services based on the web, Maps, Earth, Books, Google Docs, and then developed its own browser, Chrome, and then Android.

The other services that were missing were payments and markets, and you see that Amazon quickly appeared, then eBay, then PayPal; and clearly Google, which was getting a lot of money from pay-per-click, also offered it to other sites: you could, and still can, include Google advertisements in your page and get some money from it.
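As an illustration of the PageRank idea described above, here is a minimal power-iteration sketch of my own, not Google's actual implementation: a page's importance comes from the importance of the pages linking to it, plus a small probability of jumping to a random page. The tiny four-page web is invented; the damping value 0.85 is the commonly cited choice.

```python
# Minimal PageRank by power iteration on an invented four-page web.
links = {                       # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
d = 0.85                        # probability of following a link (vs. random jump)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):             # iterate until the ranks stabilize
    new = {p: (1 - d) / len(pages) for p in pages}
    for p, outgoing in links.items():
        share = d * rank[p] / len(outgoing)
        for q in outgoing:
            new[q] += share     # p passes a share of its rank to each linked page
    rank = new

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # "C" should rank highest
```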
Customers are also identified by cookies. Cookies compensate for the fact that HTTP is a stateless protocol: it intrinsically has no way of knowing where you come from. Actually, in the header of an HTTP request there is a field, the Referer, which tells the server which page you came from if you clicked a link; but if you typed the address directly, the server sees nothing, so you cannot easily track how users move through pages, while cookies can be used for tracking this, and also for keeping a session alive. Cookies are not always bad: if you are making a transaction on a site, thanks to a cookie the site can know that you already have a certain number of items in your shopping bag, so that you can add others, and so on.

This is also connected to the idea of dynamic web pages. The first pages were just static: you wrote something, and that was the page. Then the form extension to HTML was developed, by which the user could type something and pass this information in the URL, with a question mark and then the fields, inside a GET or POST query to the server. In this way you can serve data from a database, and then you have to build the page dynamically, because its content comes from the server. So a way was developed of launching a program on the server and passing the information from the form to this program; the program could, for instance, fetch information from a database, build the web page with the variable data, and return it to the client. This is called the Common Gateway Interface. The next step was to integrate this ability to process information directly into the server: PHP, for instance, could be integrated into the server, as could modules for Perl, Python, Ruby and so on; the most recent widely used one is Node.js, and Google Docs and the Google apps are, as far as I know, essentially JavaScript applications.
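Here is a minimal sketch of a CGI-style program, to make the mechanism just described concrete; the script and its "name" parameter are invented for illustration. The web server launches the script, hands it the form data from the URL (the part after the "?") in the QUERY_STRING environment variable, and returns to the client whatever the script prints.

```python
#!/usr/bin/env python3
# Minimal CGI-style dynamic page: read the query string, build HTML.
import os
from urllib.parse import parse_qs

query = parse_qs(os.environ.get("QUERY_STRING", ""))   # e.g. "name=Ada&lang=it"
name = query.get("name", ["world"])[0]

print("Content-Type: text/html")   # HTTP header, then a blank line,
print()                            # then the dynamically built page
print(f"<html><body><h1>Hello, {name}!</h1></body></html>")
```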
Since the browser was so important, it quickly started to be used everywhere. We are speaking essentially of graphical browsers; at the university, at that time, Lynx was also quite used, a textual browser, which still exists, to be used on terminals; but the general user clearly preferred Mosaic, which was then commercialized by Netscape. Microsoft at first completely ignored the Internet, relying on its own Microsoft Network, but then suddenly realized that the browser could replace a whole operating system, and reacted by developing Internet Explorer, or "Internet Exploder" as you like. This also gave rise to a court case, because Internet Explorer was given away for free with Windows and installed by default as the default browser, while Netscape was sold: in principle you had to pay to use Netscape. So, as you can see in the chart, Netscape (in green) quickly replaced Mosaic and was in turn quickly replaced by Internet Explorer, and only after some problems with Internet Explorer, including legal ones, did other browsers start to replace it. Now Internet Explorer has essentially disappeared.

There is an interesting path here, because what Netscape developed was the separation between the rendering engine and the browser itself, so that the rendering engine could be used for other applications: Thunderbird, the mail client, was essentially written as a kind of web page running on the local machine, using the same engine as Firefox. Essentially all the interface of Firefox and Thunderbird was developed using HTML, including the menus; not the menus inside the page, the menus of the application.

The other important contribution was in the Unix field, in the Linux field (we shall see Linux in a moment): the KDE project, born in Germany, developed the K Desktop Environment, which is still used on Linux, and also its own web engine, used to build the Konqueror browser. That engine was later adopted by Apple and also by Chrome: essentially Chrome and Apple's Safari use the core of this web engine developed in the Unix world, which is now called WebKit.

We said that Internet Explorer was bundled with Windows, and this led to a dispute, because Microsoft was accused of unfair competition: Bill Gates was essentially convinced to step back from Microsoft, and at first it was ruled that Microsoft should be split, separating Windows from Office and Internet Explorer, although the breakup was later overturned. Remember these dates, essentially 1998-2000, because they also correspond to the explosion of the Internet bubble. You can see how the rise and fall of the quotations made people millionaires in a few days, and poor again in a few more; there was a time in which one could get a lot of money just by selling an idea, without developing anything. Red Hat went public in those years, and that was also the moment in which Linus Torvalds became a millionaire, because Red Hat had given him some stock options; but after a while the bubble collapsed, and a lot of enterprises failed.

The parallel history to commercial software is that of GNU, the free-software movement that grew from the same "electronic frontier" spirit. It started with Richard Stallman, who left MIT, where he was working on Lisp (you see Lisp starting to appear), and decided that he had to write a complete operating system, clearly deeply inspired by Unix: GNU is a recursive acronym, "GNU's Not Unix". At first he planned to write the whole system himself; then he found that a lot of components were already available: the X graphical system, developed at MIT, was free; TeX, the scientific typesetting program developed by Don Knuth, was free. He wrote Emacs, which uses Lisp as its scripting language, the first free C compiler, GCC, and then the other utilities. But he was not able to finish the kernel, Hurd, which was planned to be a microkernel. I don't want to go deep into this question, but already in the seventies the idea was that the kernel, the core program of an operating system, should not be monolithic, just one program, because that was considered too dangerous: it should be a system made of different processes communicating among themselves, which is called a microkernel. The problem with microkernels is that it is quite difficult to make them reliable and efficient, because if messages arrive out of order there can be a big mess.

Linus Torvalds was a student who had bought one of the first 386 computers. He started out wanting to write just a program to connect, through a modem, to the server of his university; he was using Minix, a very simplified Unix, and he was not satisfied, so he started to write a complete operating system, a monolithic one, just a single program. He was actually able to write it in three months, and then he started adapting and porting all the utilities; he wrote his own version of the standard C library, and he distributed everything freely using newsgroups and public repositories.
Immediately a lot of other developers started to write device drivers and adapt them, and distributions started to appear. A distribution means that you package the kernel together with all the utilities, and also, possibly, develop a way of checking whether a piece of software is compatible with the others: you have, packed together, all the programs tested to work with a given version of the operating system, and a way of downloading new versions of the utilities. These are things that have since been copied by Microsoft, by Apple and so on with their various stores. So you see different distributions appearing: Slackware, which I think was the first one I used, in which you had to compile everything by hand; then Debian and Red Hat, which were a revolution because they used package systems, so you didn't have to keep track of everything yourself.

Another protagonist of this revolution was Eric Raymond. He was more on the philosophical side, but he wrote "The Cathedral and the Bazaar", a text which was in some sense the manifesto of this movement. For instance, he compared the development of Emacs and GCC, which had been developed by Richard Stallman using a standard development process, that is, a small team of experts working on a program and releasing only major improvements, with the Linux kernel, which is an example of the bazaar model, in which everyone could watch the development of the software live and participate. Clearly there have been ways of preventing people from destroying everything, but essentially anyone could fork his own version of the system and make any development he wanted, without asking permission to use the software. There are successes and problems in this approach: Wikipedia is one of the biggest examples of the bazaar model, but for large, complex programs, like developing a browser, the cathedral architecture is maybe still the more important one.

In the meanwhile, the wiki idea was developed by Ward Cunningham, essentially exploiting the fact that dynamic web pages could also be written by users: you could have a form in which you write text using some markup language, and this could be stored in a database and served to all the others. The idea was to let people contribute to a common website, instead of each developing his own personal web pages, as was done at that time. He developed the CamelCase way of making links: you just had to write a CamelCase word and it was converted into a link (a toy sketch of this mechanism appears below). And it was open to everybody. This gave origin to Wikipedia; "wiki wiki" means "quick" in Hawaiian, I think. At the beginning Wikipedia also used CamelCase words, but now it essentially uses another syntax, its own wiki markup, similar in spirit to what is now called Markdown, with special characters for bold, italics and so on. At the beginning it was text only; now you can edit Wikipedia using a visual rich-text editor, but this became possible only after a certain generation of browsers. You can see that Wikipedia has grown exponentially, and it was essentially also a revolution, because nobody could have forecast the success of an encyclopedia written by everyone, with a very democratic control.
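As a toy illustration of the CamelCase linking convention mentioned above (my own sketch; the URL scheme is invented, not Cunningham's original code):

```python
# Turn any CamelCase word into a wiki link, as early wikis did.
import re

def wikify(text):
    # two or more capitalized runs glued together, e.g. "FrontPage"
    pattern = r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b"
    return re.sub(pattern, r'<a href="/wiki/\1">\1</a>', text)

print(wikify("See the FrontPage and the RecentChanges list."))
# -> See the <a href="/wiki/FrontPage">FrontPage</a> and the ...
```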
By democratic control I mean that everyone can alter a page, but since there are so many people looking at the modifications, you can be essentially certain, unless it is a very remote page, that your revision will be checked and, if needed, corrected.

Parallel to the browser war, after a while there was the server war, because adopting a server meant having a whole operating system behind it. On one side there was Microsoft, selling Windows NT for servers together with its web server, which had some problems, I remember: a server generally runs continuously for months or even years, accepting requests, forking, serving, and then shutting down the spawned process. If you have any minimal memory leak, I mean some part of memory allocated by one of these forked processes and not released to the system, then after some weeks, months or years the whole memory fills up and you have to shut the machine down. I remember, for instance, that here in Florence there was a big web farm that developed a way of periodically power-cycling its servers, by dedicated electronics, just in order to clean the memory. In the Unix world, whose software was then ported to the other operating systems, the leading server was Apache, derived from the NCSA server (NCSA being the same institution that developed Mosaic); "Apache" is a pun on "a patchy server", because it was continuously patched. Finally, the database was MySQL, which started as a proprietary database, was then released as open source, and later passed to Oracle, which had also developed its own database. So essentially the suite for running a web server was the server itself, a scripting language like Perl or PHP, and a MySQL server.

Clearly you also had to find the computers to serve this system. At first very expensive servers were used, like mainframes, but in the end the winning solution was that of Google: very cheap, off-the-shelf computers. I think this was the first Google server: just motherboards with their hard disks, connected by some fast gigabit network connection, stored in racks. Since each unit had some failure rate and there were so many of them, at least one was failing every few minutes, so there was a rack of spare motherboards and people continuously passing by and replacing the failed ones. Clearly you also had to develop a way of managing all of this, each server running Unix, Linux essentially, and Google developed MapReduce, which is directly inspired by Lisp (it was later reimplemented as open source by the Apache project, as Hadoop). The idea is this: suppose you have a database with five rows and two columns, city and measured temperature, and you want to find the maximum temperature. You can distribute the work to five processors, each with the task of finding the temperature for just one city: this is the map operation. Then you reduce the results, combining them and producing the final answer. In practice you spawn more than five processes: the first one that answers is accepted, and the others, if they are late, are thrown away.
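Here is a minimal, single-machine sketch of the map-and-reduce idea, mirroring the temperature example above; the data are invented, and in a real MapReduce each map task would run on a different machine. Note the map and reduce functions that Python inherited from Lisp.

```python
# Toy MapReduce: find the hottest city from (city, temperature) readings.
from functools import reduce

readings = [
    ("Rome", 31), ("Rome", 28), ("Milan", 24),
    ("Milan", 27), ("Florence", 33),
]

# map step: each "worker" computes the maximum for one city
def max_for_city(city):
    return city, max(t for c, t in readings if c == city)

cities = {c for c, _ in readings}
partials = list(map(max_for_city, cities))

# reduce step: combine the partial results into the global maximum
hottest = reduce(lambda a, b: a if a[1] >= b[1] else b, partials)
print(partials, "->", hottest)   # ('Florence', 33)
```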
From the electronics point of view there was also a kind of war of processors. On one side, Intel was pushing the limits of CISC processors: as we already said, CISC processors mainly use microcoded instructions in order to mimic complex instructions with relatively simple electronics, and this meant having the processor running at a higher speed than the rest of the system. So they pushed the speed limit more and more. Since this also meant tolerating fewer defects on the CPU, a way was developed of selling defective or downgraded chips: an i3, an i5 and an i7 are essentially the same chip, sold at different prices according to the number of working units and so on. But this also meant a lot of heat dissipation: if you have seen a processor or a motherboard of the 2000s, you may have noticed very large fans, sometimes cooling systems using water, because the chip was dissipating a lot of power. The other approach was that of AMD, and then also of Nvidia and Apple: developing multicore systems with an essentially RISC approach, a reduced instruction set, and also GPUs, which are in some sense an extension of the idea of arrays of units, as in FPGAs. Essentially the idea now, also for complex systems like the M1 and the other M chips by Apple, is to have very different things on the same chip: memories, GPUs, CPUs, neural network accelerators, caches, all inside a single chip. Clearly you cannot expand it, but you can make things quite powerful and cheap, and also reduce the time needed to communicate among the different parts and the memory.

The other important revolution, in some sense, was pushing virtualization to the maximum. Virtualization is not a new thing: IBM developed it for the System/370, applying the technique to the machine itself, so that you had virtual machines that could run inside other virtual machines. For instance, you could test a whole system inside a virtual machine before replacing the operating system, so that when you update the operating system you can be sure that everything has already been tested inside the virtual machine. A virtual machine can be of different types. You can have pure emulation, in which you implement in software the processing of the instructions of one CPU on another CPU: if you are interested, you can now run a VAX/VMS system inside any PC just by installing an emulator and loading everything. But this is quite slow, because you have to translate the native instructions, either at compilation time or at run time, into those of a different architecture. Virtualization, instead, generally means running programs written for the same CPU on which they run: you just have to shadow the calls to the operating system or to the devices, intercept them, and have them served inside the virtual machine, while the direct calls to the processor are simply executed, so the speed is almost the same as on the native system. This is used, for instance, for running Linux on Windows, or Windows on Linux, or Linux on macOS, and so on.

Okay, let's come back to the social part. The increasing power of CPUs and memories in mass-market computers, and the increasing speed of Internet connections, finally made it possible to have systems serving, for instance, videos, and this made possible what is now a common thing: computer enterprises interested in buying or producing movies. This started in 1989, when Sony bought Columbia
Pictures from Coca-Cola. It had always been strange that Coca-Cola owned Columbia Pictures, essentially an archive of movies, but then Sony made the interest explicit. Another revolution was clearly YouTube, in 2005: consider that less than two years after it appeared, Google bought it for $1.65 billion. It really was a revolution. I remember that when it appeared I thought that storing all those videos would be so expensive that it would never be possible to make revenue from it; clearly I was wrong. A different approach was that of Netflix, which started by renting DVDs but then made the transition to streaming, something that Blockbuster was never able to do, and indeed it failed. Netflix, in the end, started to produce its own shows, as you know, and the same road was taken by Amazon Prime, which began as a way of offering faster delivery of goods and now serves any kind of service, by Disney+, and so on.

As I said, the possibility for users to upload or write web pages themselves was seen as another kind of revolution, called Web 2.0, a term coined by the publisher Tim O'Reilly. The idea is that the main part of the web is no longer produced by some service or some big enterprise, but by its users. Facebook is one example, though it was anticipated by other services; on the other side there are Wikipedia, and YouTube too is an example, like any other social medium. These are now the main source of information: when you look for some information, you essentially get the answer from things written by other users rather than by top-down enterprises. The history is the one you know: YouTube, Facebook, Instagram, TikTok and all the rest. I'm not going deep into this part, because it is a huge subject, but consider that in this way there is a strong influence of the recommendation systems of the platforms: when you look for information on Facebook, or on any other social medium, you receive responses based not, as in Google, on links or contents, but rather on profiles, on what other people have visited and on how near you are to other users, because you chat with them or visit the same sites. This opens the possibility of the echo chamber, in which you listen only to people who think like you, and of the knowledge bubble, in which you find only contents that have already been approved or selected by your community. It is a completely new way of seeing how information is processed on the Internet.

The web also influenced programming languages. Consider that in 1995 Java was developed by Sun as a language for robust applications, based on a virtual machine, which later compiled just in time. The idea was to have something that could be written once and then instantiated on different kinds of hardware, so that the same program did not have to be compiled for all the different platforms and versions: you just install the virtual machine, which receives the program in an intermediate form and can compile it on the fly into machine language, or something near machine language. The problem is that Java does not interact well with the operating system, so that it has its own look and feel, not complying with that of the host system.
You can recognize a Java program, for instance, from the file-browsing interface when you go to load a file. Java was also thought of as the way of embedding programs into web pages, but again it had a lot of problems communicating with the server. On the other side, Netscape developed JavaScript, which was embedded inside the browser and could manipulate the Document Object Model, that is, the content and the structure of the page, on the fly. The name JavaScript was chosen because Java was well known, but it has absolutely nothing to do with Java: it is not a scripting language for Java, it is a different language, and its real name is actually ECMAScript. In the end Java had to interact with JavaScript in order to manipulate the page, and finally the Java support for embedding in pages, the applets, was dropped; now Java is essentially used for embedded systems, while it is still taught in every computer course. In this figure you can see the time evolution over the last 20 years of several languages: database languages stay flat; HTML and CSS stay essentially flat; Java rises and falls; JavaScript essentially continues; and now there is Python, for instance, which is taking its place as a scripting language.

Then the increasing capabilities of browsers made it possible to have software as a service: instead of buying, say, an office suite, you can make a subscription, or have it for free, and use a web service. The first big offer was that of Google, for instance Google Docs; Google was able to offer essentially the browser as an entire operating system. The Chromebooks are personal computers, with Linux, that run only a single application, a browser, and on the browser you can have any kind of service; now, you know, there is a whole suite of applications. This was imitated by Microsoft, which now offers Office on the web. All this was made possible by the combination of a way of describing data using a syntax similar to HTML, called XML, with JavaScript code that could download just a piece of information, alter the web page, and insert it in the appropriate part: this is called Ajax, again a marketing name. And Chrome, the browser developed by Google, is by far the most used browser; consider that Edge, the browser by Microsoft, is now essentially Chrome, and that Safari, and Firefox when it has to run on an iPhone or iPad, use WebKit: all of them are the same.

Clearly the major transition in the second decade of the 2000s is the replacement of computers by smartphones: you can see that the majority of browser accesses now come from smartphones or tablets. This also means that the content of pages had to be adapted, and this is the idea of responsive web pages, in which the formatting of the page changes according to the browser that uses it. This is mainly done using the Cascading Style Sheets, not by redrawing the page with JavaScript: you just declare that if the screen has certain proportions or dimensions, the elements must be laid out in a different way.
Android was essentially Google's response to the iPhone, and actually Steve Jobs complained strongly with Google about this; there is competition, but also strong interaction, between Google and Apple. This also meant that the design of phones changed: we saw the disappearance of Nokia, which had its own operating system, Symbian, but it was not upgraded and could not sustain the competition of the modern operating systems. Nokia then tried to adopt Microsoft's portable operating system, which some people said was quite good; I tried it on a Nokia and in my opinion it was terrible. In the end, the look and feel everywhere was copied from the iPhone. I actually prefer Android to iOS, but it is a matter of taste, and you see that now Google also copies the iPhone, for instance by removing the audio jack.

Another revolution born at Apple was that of the iPad and of tablets. It started with the Newton, which was a failure, and then other types of tablets were developed; again you see the heredity of the Xerox Palo Alto ideas. But the big revolution was the iPad, which essentially introduced a different way of dealing with portable devices. Almost immediately Android tablets came out; Android was first designed for phones, so it had some problems, for instance with the rotated proportions of the screen. The Kindle by Amazon is essentially, again, an Android device with a custom program just for reading books.

Now you can find a computer, or at least a CPU, everywhere, because with mass production it is much more cost-effective to embed a processor, or some general-purpose chip like a field-programmable gate array, than to develop custom electronics: you put in much more power than you need, but more cheaply, in a car for instance, and then you program the electronics by software, and you have anything you want: cars, washing machines, phones and so on. The problem, clearly, is that since all of these have a CPU with the power of a good PC, they can be hacked: there are, for instance, attacks aimed at things like washing machines, which were not at first designed with security in mind. Essentially almost all of them use Linux, so you can see that Linux is now the most adopted operating system in the world, not for PCs but for all the rest; Android, clearly, is Linux. And still there are hacks and attacks aimed at taking control, for instance, of washing machines, just to run denial-of-service attacks.

Another big problem that we are now facing is the documentation of the evolution of all of this, because there is a huge number of websites and web pages which at a certain point disappear, with no standard way of recovering them. Google, for instance, stopped its cache system: up to two or three years ago you could choose to see the old version of some page, but now it is not possible. There is the Internet Archive, which stores versions of pages; for instance, I was able to recover some of my old contributions to the web just by looking in the Internet Archive. So there are ways of archiving things, but for platforms like Facebook or YouTube there is no such way of recovering things.
These platforms are also changing the landscape in another sense: the old sites are losing importance. You can read, for instance, this article in the Post on the end of the Internet site; this is just the home page of someone who built a web page that gave essentially no information, it was just a collection of animated GIFs. And another interesting article, again by the Post, is about the debate over this image: there is a nice history of how a single image and a single question, "do you see a gold-and-white or a blue-and-white dress here?", allowed BuzzFeed to involve millions of users. This was also an occasion on which Facebook, for instance, started to put filters on this kind of phenomenon, because they were afraid of losing control of the users, of not being able to control this self-organization: it is said here that Facebook changed its algorithm in order to keep the bubbles we already spoke about separated, and not let them percolate, because it did not want to lose control of what users do on the social network.

Arriving at the last years, we also have the blockchain technology, which in my opinion is an interesting way of contributing to the heat pollution of the Earth and of throwing away a lot of computer time. The idea of the blockchain database is to have a distributed database with no single storage: anybody can duplicate it and check the validity of a transaction recorded in the chain. But you need a way of avoiding too many transactions being added to the chain at once: since there can be several copies of the database, and you want to keep all of them synchronized, you must have a way of adding transactions only at a certain rate, not more. How can you prevent transactions from being added at any rate? It is essentially based, again, on having to solve a difficult problem; we have already seen something like this in the public-key ciphers, in which the problem was to find the two big prime numbers whose product is the one served to you. In this case it is based on the hash. The hash of a document is a number obtained by combining in some way the information in the text; a good hash is essentially a chaotic system, because you need the variation of a single bit in your document to be reflected in a complete variation of the hash. Clearly you cannot have a hash as large as the whole document, otherwise you lose the concept of a hash: it is a compact certificate of your document, which changes completely any time you change even a single bit.

The blockchain requires the hash of the block, seen as a number, to be below a certain threshold, and this is called the difficulty. So essentially you have to try: you add a small number, the nonce, to the block, you compute the hash, and you repeat this until the resulting number is below the threshold. Since the hash is wildly chaotic, there is no way of performing an optimization, like simulated annealing or gradient descent: it is just brute force, and you have to spend a lot of computer power trying different nonces until you find one that gives the right hash.
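A minimal sketch of this nonce search, in Python; the block content and the difficulty threshold are invented for illustration, and real systems use much harder thresholds:

```python
# Toy proof-of-work: try nonces until the hash, read as a number,
# falls below the difficulty threshold. No shortcut exists: brute force only.
import hashlib

block = b"previous-hash | transaction data"
difficulty = 2 ** 240            # hash must be below this (lower = harder)

nonce = 0
while True:
    digest = hashlib.sha256(block + str(nonce).encode()).digest()
    if int.from_bytes(digest, "big") < difficulty:
        break                    # found a valid nonce
    nonce += 1

print("nonce:", nonce, "hash:", digest.hex())
```

With this threshold roughly one hash in 2^16 succeeds, so the loop ends after a few tens of thousands of tries; lowering the threshold makes the search arbitrarily expensive, which is exactly the rate-limiting mechanism described above.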
At that point you can add the block to the blockchain. Since blockchains are also used to produce Bitcoin and other electronic money, this means that a lot of computers nowadays are just burning computing power trying to find a good nonce in order to add things to the chain, which is a kind of crazy thing, in my opinion.

Okay, the other part, which is the end of my talk, though I'm not sure I'll be able to present it well, because we still have some 40 slides, is the field of artificial intelligence. This is again quite old, because it essentially started with Turing. The ideas used now are based on something inspired by the working of real neurons, and this field of research started in the sixties with the perceptrons. The problem is that in the seventies and eighties there was the sense that this approach, the one that has led to the neural networks of today, was not the right one, and completely different things were developed: systems based on rules, and languages like Prolog. Then the explosion of computing power, for instance the GPUs, allowed a return to the original concept, or to something similar to it.

Everything starts with the Turing test, which is inspired by the imitation game, a real game in which you had to discriminate between a man and a woman. In this case the idea is to discriminate between a human and an artificial intelligence by asking questions and examining the answers. Actually, what is extremely used today is the inverse Turing test: doing the same thing to discriminate in favor of humans, in order to allow them, for instance, to perform a search or to download information. This is the CAPTCHA, the Completely Automated Public Turing test to tell Computers and Humans Apart.

Another approach, which from time to time comes back, is that of the hidden Markov models. This was developed essentially for word recognition, but it was hugely used, for instance, for classifying pieces of a genome: for telling apart which is the coding part of a genome and which is essentially garbage. The idea is to have different Markov chains, with very different transition probabilities, generating strings. For instance, these are two strings generated by two such chains: you can see that, since some probabilities are zero, there are forbidden patterns in the one and in the other (this is the easy way of seeing it), yet they emit the same symbols even though they are different processes. If you are now given a sequence, you have to find the most probable segmentation: where to assume that this piece was generated by one process and that piece by the other. Clearly you need to assume that the two systems are quite different, and that there is a small probability of passing from one to the other; then you can develop methods, based on several approximations, to estimate for instance where a garbage sequence ends and a coding sequence begins, or where a single word in a speech ends. This is called segmentation. You know that if you look at the plot of the sounds emitted when you pronounce a phrase, there are no breaks from one word to another, while breaks occur in the pauses inside the words.
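To make the two-chains picture concrete, here is a small sketch of my own (the transition probabilities are invented): two Markov chains that emit the same two symbols, each with a forbidden pattern, as in the slides.

```python
# Two Markov chains over the same alphabet with different forbidden patterns:
# chain A never produces "11", chain B never produces "00".
import random

chain_A = {"0": {"0": 0.7, "1": 0.3}, "1": {"0": 1.0, "1": 0.0}}
chain_B = {"0": {"0": 0.0, "1": 1.0}, "1": {"0": 0.3, "1": 0.7}}

def generate(chain, length, state="0"):
    out = [state]
    for _ in range(length - 1):
        state = random.choices(list(chain[state]),
                               weights=list(chain[state].values()))[0]
        out.append(state)
    return "".join(out)

print("A:", generate(chain_A, 30))   # never contains "11"
print("B:", generate(chain_B, 30))   # never contains "00"
```

Given a long string produced partly by one chain and partly by the other, the segmentation problem is to estimate where the switch happened, which is what the hidden-Markov-model machinery does.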
Now, you know that today we mainly use neural networks, which were inspired by the 1943 work of McCulloch and Pitts, who developed the idea of a formal neuron: a Boolean device taking a certain number of inputs, computing their products with the weights, comparing the sum with a threshold, and emitting 0 or 1. What McCulloch and Pitts proved was that any logic circuit, and so any computer, could be built using appropriate neural gates; but they developed no way of solving the inverse problem: given some inputs and outputs, build the circuit that produces those outputs, which is the learning problem for Boolean and neural networks.

A related field is that of Boolean networks, meaning gates, which could be McCulloch-Pitts neurons or standard gates, connected in a random way, studied by looking at what they do. The idea originated in the work of Kauffman, who said: inside the cell you can see genes as units that can be activated or deactivated, zero or one; each gene produces a product, say a protein, which then induces the activation or deactivation of other genes. So you can see a genetic network as a kind of Boolean network: a network of gates that controls bits which in their turn control other bits, and so on. Since all cells in a multicellular being have the same genome, you have to assume that the same arrangement of gates can have different activation patterns, and so what Kauffman and many other people studied was the number of patterns, meaning the number of limit cycles or fixed points of a Boolean network, according to different choices of the gates and, above all, to the number of genes. The idea is to find a correspondence between the number of states, which is the number of morphologies of a cell, and the number of genes, and this correlates in some way with what is observed in biology: cells like bacteria, which have only one morphology or very few, have a small number of genes, while those with many morphologies have more genes, in a roughly logarithmic relation.

Another famous model is the Ising model, in which each spin aligns to a local field, and possibly to an external field, and you have phase transitions. Based on this analogy between statistical mechanics and microscopic models, you can develop optimization techniques like Monte Carlo simulated annealing, in which you look for the minimum of a function which should have a kind of funnel shape at a certain resolution. The idea is to find the minimum exploiting local information, but avoiding being trapped in a local minimum, which is what happens if you just follow the standard gradient descent. So you introduce a temperature, which gives the possibility of jumping over barriers, so that you have a probability distribution related to the potential by the standard Boltzmann factor; then you reduce the temperature, so as to concentrate the probability distribution in the minimum. I said that you need this funnel shape because you are still exploiting local information about the gradient, even though in an averaged way; otherwise, if the function is essentially a random one, like the ones we have seen for the hashes of documents, you just have to follow a brute-force approach, and otherwise you find nothing.
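A minimal sketch of simulated annealing on an invented one-dimensional "funnel with bumps" potential; the cooling schedule and step size are arbitrary choices:

```python
# Simulated annealing: accept uphill moves with Boltzmann probability
# exp(-dE/T) and slowly lower T so the distribution concentrates in the minimum.
import math, random

def energy(x):                  # funnel with bumps: global minima near x = ±0.63
    return x * x + 2.0 * math.cos(5.0 * x)

x, T = 4.0, 2.0                 # starting point and initial temperature
while T > 1e-3:
    x_new = x + random.uniform(-0.5, 0.5)          # local move
    dE = energy(x_new) - energy(x)
    if dE < 0 or random.random() < math.exp(-dE / T):
        x = x_new               # downhill always, uphill with Boltzmann probability
    T *= 0.999                  # slow cooling schedule

print("found minimum near x =", round(x, 2))
```

At high temperature the walker jumps over the cosine bumps; as T decreases it settles into the bottom of the funnel, exactly the mechanism described above.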
Another optimization technique is the genetic algorithm, in which, instead of one copy of the system, you have several. You still have something analogous to temperature, which is now called mutation, and you use the information coming from your pool of solutions to select the best ones, those with the lowest value of some target function, and reproduce them with mutations. If, moreover, the problem is almost separable, I mean that although we work in a multidimensional system, the minimum of the whole function is essentially given by the minima of different sequential portions of your coding, then you can think of these portions as genes in a genome, and you can also apply crossover, exchanging pieces of information, by which you can jump directly to one of the best solutions without having to optimize all the steps. But clearly this depends crucially on how you code your system.

Coming back to networks, you may ask how to introduce learning into these systems. One way is the one used by simulated annealing and genetic algorithms: varying the system against a certain target function, which can be the discrepancy between the elaboration of your input and the desired output, and optimizing. This was done, for instance, by Stefano Patarnello and collaborators: they took a network of logic gates, connected in a feed-forward way, fed the system with examples, for instance how to add two 8-bit numbers, compared the output of the system with the training set, then varied some of the parameters of the game (connections, functions and so on) and accepted the changes or not according to a simulated-annealing method; one could use a genetic algorithm as well. In the end the system was able to build up the right circuit, producing the right answers, clearly with a lot of time needed to optimize. And there are interesting considerations: according to the number of gates, and this is the same as in neural networks, with few gates you get mere memorization of the patterns that you feed, without any generalization, while with enough gates and a large enough number of examples you finally get generalization, that is, learning of rules instead of learning of examples.

But the path to neural networks started with the Hopfield model, which was directly inspired by the working of neurons: the idea is to reinforce the connections among neurons that fire together. In this model the neurons do not actually fire, but are simply active or inactive, plus one and minus one. You can show that if the connections among neurons are built from the patterns that you want to store in this recurrent network, then each pattern is actually a minimum of an energy defined in the corresponding way. The problem is that when you try to store more and more patterns, at first you have correct memorization, the stored patterns are the minima of the network, but soon spurious, unwanted minima appear, due to the correlations among patterns, and at a certain point these false minima come to have a lower energy than the desired ones: when you perform retrieval you then find combinations of your inputs instead of the pure stored patterns.
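A minimal sketch of the Hopfield recipe just described, with two invented six-neuron patterns: Hebbian weights (neurons that fire together are connected more strongly), and retrieval by repeatedly aligning each neuron with its local field.

```python
# Toy Hopfield network: store two ±1 patterns with the Hebb rule,
# then recover one of them from a corrupted version.
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
n = patterns.shape[1]

W = np.zeros((n, n))
for p in patterns:              # Hebb rule: W_ij += p_i * p_j
    W += np.outer(p, p)
np.fill_diagonal(W, 0)          # no self-connections

state = np.array([-1, -1, 1, -1, 1, -1])   # pattern 0 with its first bit flipped
for _ in range(5):              # repeated alignment with the local field
    for i in range(n):
        state[i] = 1 if W[i] @ state >= 0 else -1

print(state)                    # converges back to [1, -1, 1, -1, 1, -1]
```

With only two patterns in six neurons retrieval works; as the lecture says, pushing the number of stored patterns toward a fraction of the number of neurons makes spurious mixed minima take over.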
For instance, if these are the stored patterns, at a certain point you start to retrieve combinations of patterns. And the problem is that the storage capacity of this network is a small fraction of the number of neurons, so to have a lot of stored patterns you need a huge number of neurons, which at the time was considered bad.

The other approach was to use feed-forward networks. With just one layer you have the perceptron, which is able to perform a linear separation of the examples: it can classify things into two different sets if the separation, with respect to the inputs, is linear (a sketch of the perceptron learning rule appears at the end of this passage). If you combine several layers of perceptrons, then you are able to separate more complicated things, and this is the origin of neural networks. What was essentially discovered after a while is that you can have deep networks, which have a lot of power, trained with supervised or unsupervised learning, and that a network with just one hidden layer is in principle able to approximate any function. But this is not actually what you want, because you want generalization: you don't want the network to learn just the examples. If you wanted only that, you could have, for each stored pattern, a single neuron that activates when the input is that pattern and forces the corresponding output, with all the others silent; in this sense such a network can approximate any function, because any function can be divided into slices, with one neuron recognizing the pattern that identifies each slice and giving the response assigned to it. But what you want is to furnish a certain number of examples and have the network generalize, and this was quickly recognized to be related to the architecture of the network: different architectures give different performances.

One simple example is the autoencoder: you give examples, you force the system to pass through a bottleneck, and you require the output to be as similar as possible to the input, so that it represents the identity function, but through the bottleneck. For this to be possible, the internal portion has to categorize the examples according to a small number of characteristics, and this is the essence of learning: abstracting from the details of the input in order to extract only a synthetic description of it, one able to reconstruct it. If the inputs are faces, for instance, you may find in the bottleneck something related to the color of the eyes, whether there is hair or not, whether the face is fat or thin, and so on. This is the idea of autoencoders. Then you can, for instance, impose local connections on your network, in order to reduce the number of connections and speed up the system: these are the convolutional networks. The problem with convolutional networks is that they are much more sensitive to small details than to the general framework: they are quite able to recognize an elephant by examining the skin in the image rather than the full picture, so if you draw an elephant using a tiger's skin, humans quickly identify an elephant, while convolutional networks identify a tiger. Clearly you can mix all these things together.
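Returning to the perceptron mentioned at the start of this passage, here is a minimal sketch of the classic learning rule on a linearly separable function, the logical AND; the learning rate 0.1 and the 20 epochs are arbitrary choices of mine.

```python
# Single-layer perceptron learning AND (linearly separable; XOR is not).
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
for _ in range(20):                     # perceptron learning rule
    for xi, target in zip(X, y):
        out = 1 if xi @ w + b > 0 else 0
        w += 0.1 * (target - out) * xi  # move the separating line on mistakes
        b += 0.1 * (target - out)

print([(1 if xi @ w + b > 0 else 0) for xi in X])   # -> [0, 0, 0, 1]
```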
Another important point is that of the generative adversarial networks, in which you actually have two networks: one trying to produce something, for instance an image corresponding to a description, and the other trying to discriminate between a constructed image and a real image. Suppose you have a database of images with descriptions: then you have something that uses the stored images, in some sense, to construct a new image given a description, and something that tries to discriminate between the generated images and the real ones. When the generating part is able to trick the adversarial part, it means it is generating good things. The problem is that you generally use convolutional networks in this phase, and, as we said, they pay much more attention to particulars than to the general frame: this is why in the early images generated by artificial intelligence you see, for instance, people with too many fingers. The network knows how to draw a finger very similar to a real one, but it is not counting how many fingers a human is supposed to have.

Then you can have unsupervised networks, for instance with competitive interactions among neurons, in which if one is excited the others are inhibited. And the final step is the attention networks, like ChatGPT, in which you have a mechanism that tries to raise the attention on the important parts of the input. This was developed in order to solve the problem of relating distant parts of, for instance, a phrase, because you need, for example, concordance of number and gender between words that can be quite far apart; the idea is to have a mechanism for putting attention on some aspects of the system, and not just on what comes immediately before. I am skipping the details because we are almost at the end, but an important point of attention, and of these sequence-to-sequence transformers, is that in this way you can easily parallelize the task of producing the output, so it can be implemented in a vectorized way, and this is essentially what makes them efficient. This is the famous scheme in which you take your input, extract the attention, and then feed it to the other network. It was essentially devised at first for automatic translation: the idea was to extract from one language the concordances between words, and then feed this attention information to the network that was translating word by word, using a kind of dictionary, in order to impose the concordance of the corresponding words in the two languages.
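A minimal sketch of the scaled dot-product attention at the core of transformers (my own illustration; the shapes and data are invented): every position attends to every other through plain matrix algebra, which is why the computation vectorizes and parallelizes so well.

```python
# Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
import numpy as np

def attention(Q, K, V):
    scores = Q @ K.T / np.sqrt(K.shape[1])          # query-key similarities
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # softmax over each row
    return weights @ V                              # weighted mix of the values

rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 8))     # 5 positions, 8-dimensional representations
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 8))
print(attention(Q, K, V).shape)  # (5, 8): one mixed vector per position
```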
Okay, just to finish, I want to show you the time needed by different services to reach 100 million users. Spotify, starting I think in 2008, needed 11 years; Netflix 10 years; Airbnb 8 years; X/Twitter five years; Facebook four years and a half; Dropbox four years; WhatsApp three years and a half; Instagram two years and a half; this one, I don't remember which, nine months; and finally ChatGPT just two months. You can imagine what will happen in the future.

And just to say goodbye, I would like to show you this old video clip, which was built using a kind of fake interface, just to appreciate what the public image of an interactive interface was some twenty years ago, around 1994: it was built up from pieces, not a really working system.

So I think this is enough for this course, but clearly I am available for questions; we still have a few minutes if you want. Let me know if there are comments. This slide shows a poem; I don't remember who set it here, but the text, the poetry, is by Oscar Wilde. And that's all. Do you have questions or comments? If not, I say you goodbye. Please sign the form, and have a nice day. Bye-bye.
