https://audio.dev/ — @audiodevcon

    Workshop: Dynamic Cast: Practical DSP & Audio Programming – Emma Fitzmaurice, Harriet Drury, Anna Wszeborowska and Alex Korach – ADC 2023

    Dynamic Cast: Practical DSP and Audio Programming

    We'll explore the concepts of Karplus-Strong synthesis, a physical modelling synthesis technique that aims to model a plucked-string instrument such as a guitar. The DSP involved includes noise generation, delay lines and filters. We'll also touch on MIDI messaging and polyphony.

    We’ll be using the Cmajor platform for the practical aspect of this workshop. Knowing Cmajor upfront is not a pre-requisite; we’ll guide the participants through the implementation gently.

    This will be a self-contained workshop aiming to be accessible to all levels of learning – all elements used in the practical part of the workshop will be thoroughly explained in the introduction.

    Dynamic Cast – Who Are We?

    Dynamic Cast is a peer-to-peer C++ study group, a safe space for underrepresented groups (women, LGBTQIA+, minority ethnic). The Dynamic Cast workshop at ADC is designed to create an entry point to the industry for newcomers; everyone is welcome.

    Link to Slides: https://data.audio.dev/talks/2023/practical-dsp-and-audio-programming/slides.pdf
    _

    Emma Fitzmaurice

    Emma Fitzmaurice is a QA engineer on the Novation team at Focusrite, sticking her fingers into as many parts of the hardware development pie as possible in an effort to make cool gear. She is charming, beautiful, wise and the proud author of her own bio.
    _

    Harriet Drury
    _

    Anna Wszeborowska

    Anna is a freelance software developer and a PhD student at the Creative Computing Institute, University of the Arts, London. She’s worked on music production and live performance tools for the last 8 years. During her time at Ableton she contributed to the integration of the company’s flagship product Live with Cycling ’74’s Max, worked on the second edition of Ableton’s hardware product Push and was part of the team responsible for the company’s instruments and effects. Anna will be happy to chat about the use of AI in live performance, learn about your favourite tools for rapid prototyping and see pictures of your pets.
    _

    Alex Korach

    Alex Korach works as a software engineer in the Max for Live team at Ableton, helping take care of the integration between Max/MSP and Live. Former dev at Native Instruments, where she was involved in the development of products such as Massive X, Guitar Rig, Maschine and Komplete Kontrol.
    _

    Streamed & Edited by Digital Medium Ltd: https://online.digital-medium.co.uk
    _

    Organized and produced by JUCE: https://juce.com/
    _

    Special thanks to the ADC23 Team:

    Sophie Carus
    Derek Heimlich
    Andrew Kirk
    Bobby Lombardi
    Tom Poole
    Ralph Richbourg
    Jim Roper
    Jonathan Roper
    Prashant Mishra

    #adc #dsp #audio #audioprogramming

    So, we are Dynamic Cast. Dynamic Cast is a peer-to-peer study group for people underrepresented in tech, and we host monthly meetups, in person here in London at the Focusrite offices, but it's also possible to join us online, so we always share a link so people can join us from wherever they like. At these meetups we usually share what we've learned, rehearse conference talks, show our projects, and sometimes we just watch and discuss talks together. So the format is mixed, and you're very welcome to suggest something else when you join us. If you're interested in joining us, please speak with us after the workshop; come talk to us, we're here. We also have stickers, so please help yourselves afterwards. Lovely logo, isn't it? Right, I think I didn't mention that it's a C++ study group, so that, you know, explains the logo. We also have a website, there in the bottom right corner. It's quite minimal, but you'll find our Twitter handle and an email address there in case you want to contact us later.

    Okay, so today's workshop is going to be delivered to you by Harriet, a software engineer at Sound Stacks; Alex, a software engineer at Ableton; Emma, a QA engineer at Focusrite; and myself. I'm Anna, I'm ex-Ableton, a freelancer now, and also a PhD student at the Creative Computing Institute at UAL.

    What else did I want to say? Right, the presentation slides are available online in case somebody wants to click on links or follow along as we go. They're on our GitHub: if you find the ADC23 repo (we'll be using it later, so you can go and find it now), there's a wiki section, and if you click on it the slides are there.

    Just briefly, let's talk about the goals of this session. Today we will give a brief introduction to the concepts and introduce some of the names used in audio programming, which we hope will be useful in the practical parts of the session. Then we're going to implement basic string synthesis in a new programming language called Cmajor; we'll give a gentle introduction to it, don't worry. I also wanted to say that we don't expect you to remember everything we say, but we hope it helps you navigate the rest of the conference, or some of the concepts you encounter in your audio development journey. We'll be taking questions online as well, so please feel free to post them on Zoom (for the online participants, just post them there) or on our Discord channel; there's a Discord channel called something like "Dynamic Cast Workshop", so please find it and post questions there. We can take questions in person when the speaker is ready to answer them, and if you need assistance, just indicate it and we'll come help you out. I think that's it, so let's move on to the first session. I'm going to hand over to Alex, who's going to take you through the way signals flow through audio systems.

    Hello! All right, let's start with the basics: what is it that we're talking about when we talk about digital signal processing as it applies to audio? First off, what is a signal? Put simply, a signal is a parameter that changes as a function of another parameter, usually time or space, but it can also be another observable variable. In the case of sound it's the amplitude of a sound wave changing as a function of time. We may be talking about a physical representation, which is a measure of sound pressure changes, or, if we capture the physical sound using a transducer such as a microphone, we obtain an electrical representation, a measure of voltage changes. Both of these signals are continuous, but computers can only operate on discrete signals, so in order to process sound on our computers we need to discretize it. This is done through a process called sampling, which is performed by an analog-to-digital converter (I always mix up the direction), and which turns voltage variations into a numeric stream, where each sample represents the value of the signal at a certain point in time. We won't be going into the details of that process; it's a practical workshop, so let's not be too heavy on theory. What's important is that we're left with a signal that is no longer continuous.

    We can also generate those samples artificially, that is, synthesize them, for instance using a digital oscillator. All right, so what can we do with a signal? First of all, we can scale a signal, which means multiplying it by a constant scaling factor, such as when we use a gain control to either attenuate or amplify the output sound. We can also do time scaling, also called dilation, which causes a signal to either compress or expand, and which could be used to change the duration and frequency of the sound. Another possible operation is a time shift: shifting a signal in time can be used to introduce a delay. We can also do time reversal, which is like playing something backwards. And when we're dealing with multiple signals, we can perform summation, taking two or more signals and adding them, which is equivalent to audio mixing, or we can multiply the signals, such as when we turn a constant signal into an envelope and apply it to an audio signal to produce a note.
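    To make those operations concrete, here is how they are commonly written down (my own summary, not taken from the slides), with $x$ an input signal and $y$ the output:

    $$
    \begin{aligned}
    \text{scaling:} \quad & y(t) = a\,x(t) \\
    \text{time scaling (dilation):} \quad & y(t) = x(\alpha t) \\
    \text{time shift (delay):} \quad & y(t) = x(t - \tau) \\
    \text{time reversal:} \quad & y(t) = x(-t) \\
    \text{summation (mixing):} \quad & y(t) = x_1(t) + x_2(t) \\
    \text{multiplication (enveloping):} \quad & y(t) = x_1(t)\,x_2(t)
    \end{aligned}
    $$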

    Of course, other more complex functions can be applied and combined to create arbitrarily complex DSP systems. A DSP system is any process that produces an output signal in response to an input signal, with some manipulation of the input signal occurring in between. An example of a continuous system is an analog synthesizer, while a software synthesizer works as a discrete system. The synthesizer might take a generated audio signal and an envelope signal as input, pass the audio signal through, say, a filter or an amplifier, and then combine the processed signal with the envelope, which gets triggered by a button or a key press, to produce output audio. The result of this process can then be sent to a loudspeaker and transformed back into an audible sound wave. Maybe as a side note, one thing worth noting about the mathematical notation we're using here is that for continuous signals we tend to use parentheses, as with x(t), and for discrete signals we use square brackets, as with x[n].
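    As a reminder of how the two notations relate (standard textbook material rather than something spelled out on the slides): a discrete signal is obtained by reading the continuous one at multiples of the sampling period $T_s = 1/f_s$:

    $$ x[n] = x(nT_s), \qquad T_s = \frac{1}{f_s} $$

    so at a sample rate of $f_s = 44.1\,\text{kHz}$, successive samples are about $22.7\,\mu\text{s}$ apart.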

    This kind of data and these transformations lend themselves particularly well to a modular processing architecture. A module encapsulates a certain signal transformation, performing any number of signal operations depending on what it's meant to achieve, and each module is connected to one or more modules coming before and after it. In some implementations of this architecture it's possible for the connections to be freely rearranged, which allows a lot of flexibility to test out different topologies quickly. An analog example that comes to mind is a guitar effects chain: each pedal takes an input, either the raw, unprocessed signal coming from the guitar or the result of the processing done by the preceding effect, then applies its own modifications and sends its output further through the chain until it finally reaches the amplifier. These processing systems can of course be as simple or as complex as necessary to produce the desired effect. So while guitar effect chains are usually linear and often consist of just a handful of modules, modular synthesizers, with their tangles of patch cords that split, multiply and sum the signals, present a much more elaborate structure. When you connect the modules with patch cords to produce a certain sound, the particular pattern of connections which produces that sound is known as a patch, and we also refer to the sounds themselves colloquially as patches.

    Now, as intuitive as this paradigm seems nowadays, this concept of modular processing wasn't translated into the digital world until the 1960s, at which point experiments with software synthesis were already well underway, and it was only then that Max Mathews, who is widely considered the father of computer music, came up with the idea of applying these modular building blocks to sound synthesis software. He devised a programming language called MUSIC III which allowed its users to create networks of so-called unit generators, or ugens, which modelled, for instance, virtual oscillators, envelope generators, filters, amplifiers, what have you, and which could be interconnected in arbitrary ways to create an instrument, or, again, a patch. The flexibility of this design meant that a variety of synthesis algorithms could be implemented relatively easily. MUSIC III was just one in what became known as the family of Music-N languages, all of which are based on this idea of unit generators. The legacy is preserved in modern implementations of this architecture, both visual, such as Max/MSP, Pd or AudioMulch, and textual, such as SuperCollider, ChucK, FAUST and Cmajor, which is what we're going to have a look at today. This modular graph architecture, maybe as a side note, is also mirrored by how many audio applications developed with general-purpose programming languages are organized internally. Again, patches can be as simple or as complex as desired. For instance, here we have an example of a Max/MSP patch that outputs a sine wave at different frequencies (I hope I can get my mouse where it should be, let's see): nothing fancy, as simple as it can be. And here (oh, hang on, can we stop that?) we have another patch, which is an implementation of an FM synthesizer, a vastly more elaborate one. Let's check that out... that's what it looks like. Right, moving on. I will now hand over to Anna, who will tell us a little more about the programming paradigm that all of these languages I mentioned are based on.

    Thank you. Just one more concept and we're off to coding, I promise, so bear with us. Right, all these languages that Alex mentioned (and we'll be focusing on one of them later) use the so-called dataflow paradigm, and now we're going to take a closer look at why it's called that and how it differs from the imperative paradigm, which you're probably more used to from your experience with other popular programming languages like C++ or Python and so on. So, what is dataflow programming? Its most important quality is that it models a program as a directed graph: we focus on the movement of data and model programs as a series of connections. And graphs, it turns out, are a great way to represent data dependencies, which is going to come in handy. Here on the slide you can see an equation that includes a summation, then a division, and then a multiplication of the result, expressed as a graph.
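    The slide's exact expression isn't reproduced in the transcript, but a hypothetical expression of the same shape shows the idea; each arithmetic operation becomes a node, and the arcs carry the intermediate results:

    $$ y \;=\; \Bigl(a + \frac{b}{c}\Bigr)\cdot d $$

    Here the division node needs only $b$ and $c$, the summation node needs $a$ and the division's result, and the multiplication node needs the sum and $d$: exactly the kind of dependency structure the graph makes explicit.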

    A more formal definition of a directed graph, according to graph theory, is that it's a collection of nodes connected by arcs, simple as that. Both "node" and "arc" can be substituted by other names you'll sometimes encounter, like vertex and edge, or point and connection, but in the context of directed graphs the most commonly used terms are node and arc, so those are the ones we'll use. A directed graph is a specialization of an undirected graph where the connections have an explicitly defined direction. So we have nodes, arcs and direction; that's what's most important. Right, so what does each of these elements mean? A node is basically a unit of computation, and it can be as small as a single machine-level instruction, like the multiplication seen in the example, or a function of arbitrary complexity. A node can take one or more inputs and produces one or more outputs, so you can see it's a good metaphor for what Alex was saying about the building blocks of a modular system. We also have arcs, which are basically pipes through which data passes, and they express the data dependency between nodes. Sometimes you'll encounter the term "token", which simply means a unit of data of any type you like, and "port", which, if you ever encounter it, means the boundary between a node and an arc. We have input ports and output ports, though often this term is dropped and we simply say inputs and outputs, or input and output arcs or connections. In pure dataflow, ports are the only way to get data in and out of a node, so global variables are not allowed, which also means that nodes in a dataflow graph don't modify any state outside of their local environment: they have no side effects, as we say in functional programming. That's going to be important in a second.

    Okay, you'll encounter different types of nodes, and these are terms you'll be seeing later in the workshop. Nodes without input ports are called inputs, or sources: think of audio stream generators, like oscillators producing signals such as sine waves, or audio file readers; those are inputs. Nodes without output ports are called outputs, or sinks, and these are nodes that can prepare audio samples in their final audible form, or save data into a file, for instance. The rest are considered processors, which are nodes that perform an operation on data, typically changing it as a result, so the data coming in is different from the data coming out. But there can also be so-called monitoring nodes, which don't alter the data; they just plot it or analyze it in some way. All right, so, as we said earlier, graphs are meant to represent data dependencies. It's a data-driven model, and therefore nodes only get processed when they have received all the data they depend on; only then are they ready to be processed.

    So let's see what the implications of that are and how this model differs from what we're used to when dealing with imperative languages, which, as I mentioned before, include C++, Python or JavaScript. As we said, in the dataflow model we follow data and then trigger actions: all the data has arrived, so we trigger an action. In imperative languages we follow a very explicitly defined sequence of instructions; we focus on a very explicitly defined control flow. In an imperative model, data may follow the control flow, but the main question is the order of execution; in the dataflow model, control follows the data, and computations are executed based on data availability. That's the main distinction; these are different ends of a spectrum. So imagine the scenario from the graph on the slide, and imagine input b being ready while we're still waiting for input a to be prepared (maybe it's a large vector being loaded and it's taking its time). In this situation the division node is already ready to activate, whereas we have to wait for a to be able to perform the summation. In the imperative scenario next to it, the pseudocode written on the side, it becomes clear that we have to wait until all the data is available before performing the first computation, if we defined the summation as the first one. So we can see that in this scenario the dataflow model clearly gets a head start. This also means that the dataflow model quite naturally reveals opportunities for parallelism: it reveals early when all data dependencies have been satisfied, and we know sooner when it's okay to launch a parallel operation. And again, as I mentioned before, all data is local, so we naturally tend to avoid the styles of memory access that would prevent parallelism. So that's a really good advantage of this kind of model.

    But there are also some drawbacks to this approach, and people who have experience making patches in Max/MSP or Pure Data probably know how tricky and inflexible it is to write control-flow operations like looping and so on. It's quite tricky, especially in visual languages. On this slide you can see a factorial function implemented in the imperative style, in Python, on the left, and in the dataflow style on the right-hand side. We can see that implementing an algorithm like computing the factorial of a number is a pretty good task for an imperative coding style; it's pretty straightforward, and you see two versions of it here, one using recursion and one without. So why would we want to force ourselves to do it in a dataflow style, which is actually quite cumbersome? Well, what if we could combine both? If we combine both, we reach a hybrid dataflow: we write our computation in the imperative style, stuff it into a node, and then use this node as a closed box in our dataflow graph.

    That's still a good analogy for the modular setup of audio systems. This hybrid dataflow is actually the approach we most commonly see in modern solutions to problems that can be solved on a directed graph. Again, directed graphs are good for representing dependencies and for scheduling the execution of parts of a computation, which is why we see them so often in audio engines, where audio graphs get processed, and in libraries for parallel computing on big data. The Web Audio API is modelled using the concepts we've described, so if you ever deal with it, it should be easier to understand. TensorFlow, a popular library for implementing and deploying large-scale machine learning models, also uses directed graphs internally to represent computations. And, as we said, Cmajor has a similar structure, so hopefully it'll be easier to understand what's going on. So now I'm going to hand over to Harriet, who's going to start the practical part, and I just hope this introduction makes it easier to follow.

    Thank you. All right, thanks guys, thanks for that introduction as well, it's lovely. If anyone needs a break at any point, please don't feel scared to ask; just let me know, I'm happy to do it. What I want to do now is give a bit of a sneak peek at this afternoon's workshop, if anyone's coming along to that. We're using Cmajor today because we found it a simple way to show the DSP we want to do, and also, I mean, I might be a little bit biased, obviously. So, Cmajor: what is it? Maybe you saw our announcement talk last year at ADC, or maybe you're aware of Cmajor already. It's basically a C-styled language that's designed for DSP and signal-processing code. We have some main aims and targets for this year: to match or beat the performance of C/C++, to be simple enough to learn (which is why we're doing it today) and entice people to use it, and also to make code portable across processor architectures, so we want to support CPUs, GPUs, DSPs, TPUs and so on. We also have some nice, extensive tooling, so we can compile to WASM, CLAP and JUCE projects, and this is an ever-growing list as we keep going. We use an LLVM JIT compiler, for all the nerds out there who are interested, to optimize, which also means we can hot-reload code: we can change the patch in some way, maybe add a new gain or something, press Ctrl/Cmd+S and see it reload in real time.

    If you want any more information, have a look at the website, which goes into a bit more depth on what we'll do today. The tool I'm going to use, and maybe the easiest way to get started with Cmajor, is downloading VS Code and using our Cmajor extension. We obviously don't only support this; we support whatever platform people want to write with, but the extension internally uses our command-line tool to create a VS Code panel with information about your Cmajor patch. I'd implore you to use it today for the workshop, simply because we're going to be working with synthesis and it has a keyboard on screen, so you can make noises from your laptop without needing a MIDI keyboard. I can show this if people need it, or maybe we should take a five- or ten-minute break after I've finished explaining and we can go through all of this as well. There's also the command-line tool, which you can install directly from the Cmajor repo; if you go to the latest releases tab you'll find it there.

    Please feel free to follow along with this, or go ahead and start playing with it now while I talk through some of the main concepts. So let's have a quick summary of the main concepts in Cmajor. We have this idea of a patch, which is a bundle (a folder) that contains all your metadata files, program files and other resources, such as GUI scripts and audio files if you're building a sample player or anything like that. And, as Anna was mentioning earlier, we have this idea of processors and graphs: a processor is the sort of node we can instantiate within a graph, and then we can connect them up, line by line, to build that kind of architecture. Brilliant.

    Processors declare a main function, which I'll go through in a bit, and that contains a loop which reads inputs, performs some processing, writes outputs and repeats. It's an infinite loop, so it will keep repeating for the remainder of your program. Here's an example graph, very quickly; I'll talk through every single bit in a minute. As you can see, we declare our inputs and outputs, and we're using a float type for them. We've got two gains (why not, we're doing two different gain things), and then we connect them at the bottom here. So we've got our connections: this will take whatever microphone you've got on your laptop, go through the two gains, and come back out.
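A minimal sketch of a graph like the one just described, paired with the MyGain processor sketched a little further down (the slide's actual code isn't in the transcript, so names like TwoGains and MyGain are mine):

```cmajor
// A sketch of the example graph: audio in -> two gains -> audio out.
graph TwoGains  [[ main ]]
{
    input  stream float in;     // audio coming from the microphone
    output stream float out;    // audio going back out

    // Two instances of the same gain processor
    node gain1 = MyGain;
    node gain2 = MyGain;

    connection
    {
        in -> gain1.in;
        gain1.out -> gain2.in;
        gain2.out -> out;
    }
}
```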

    (Ah, that's the alarm test. Nobody worry; I guess I'll wait. If it goes off again, please leave, but is anyone on fire right now? No? Okay, cool, we can keep going. "You're on fire!" I'm on fire, nice.) So this is the example processor, basically doing the same thing, but I've just used one gain this time instead of two. We've got our inputs and outputs, we've got this main, and we go into our infinite loop, where we're multiplying our input. (Is that a real fire or not? Probably not, probably not, right? I mean, that's what you want to say: we might burn, but we might not, so it's fine.) So we read in, we write out, and then we advance to the next frame.
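Again as a sketch rather than the slide's exact code, a processor version with a single, arbitrarily chosen gain value might look like this; the main() loop reads a frame, writes a frame, and calls advance() to move to the next one:

```cmajor
// A sketch of the example processor: one gain applied sample by sample.
processor MyGain
{
    input  stream float in;
    output stream float out;

    void main()
    {
        loop    // runs for the lifetime of the program
        {
            out <- in * 0.5f;   // read the input frame, scale it, write it out
            advance();          // move on to the next frame
        }
    }
}
```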

    Let me just go back to the previous slide quickly. You'll see here we've got the input stream and output stream, and I want to define that a little better. This is how processors and graphs communicate with the outside world: they communicate via these endpoints, and they must always declare at least one output. They don't have to take an input, but they do have to have an output. (Hang on, I'm just going to make this a bit bigger on my screen so I can actually see the slide.) The syntax is: a direction, so you're inputting or outputting; then the endpoint kind, which is stream, event or value; then we declare the data type, float, integer and so on; we give it a name; we can give it an array size if wanted; and then we have some optional annotations that can be used to make on-screen sliders and such.
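As a rough illustration of that declaration syntax (my own example, with made-up names and annotation values; check the Cmajor docs for the exact annotation keys):

```cmajor
processor EndpointExamples
{
    // direction  kind    type                 name       [[ optional annotation ]]
    input  stream float               audioIn;
    input  value  float               gainLevel  [[ name: "Gain", min: 0.0f, max: 1.0f, init: 0.5f ]];
    input  event  std::midi::Message  midiIn;
    output stream float               audioOut;

    void main()
    {
        loop { audioOut <- audioIn * gainLevel; advance(); }
    }
}
```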

    So far I've only shown stream endpoints. A stream transmits a continuous sequence of sample-accurate values, providing one value per frame. The type of a stream must be scalar: a float or an integer, or a vector of floats or ints, so that they can be summed, basically. Streams involve storing and updating values at every frame (we work frame by frame with this methodology), which makes them comparatively expensive, because it's something that happens all the time. That makes them a bad choice for values that rarely change: if you've got, say, a button on screen that's just selecting a mode, you can use a value or an event endpoint instead. A value endpoint does exactly that: it can hold any type of data and allows a fixed value to be sent or received in a way that's not sample-accurate. They have effectively zero overhead when the value is not being updated; it just listens for a change in the value, basically. Then we've got event endpoints, which act in a very similar way; in fact, I personally find that events and values are somewhat interchangeable and you can use them for similar things. When you declare an event handler, you have to use a name and type that match the endpoint; you have to write this special event-handler block, and you can do something with your input there. If your event can carry multiple types, like a string or a float, you can write multiple handlers for it.
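Here is a small sketch of the difference (my own example code, not the workshop's): a value input can be read directly inside main(), whereas an event input needs a handler whose name and type match the endpoint.

```cmajor
processor GainWithEvent
{
    input  stream float in;
    // An event endpoint: we only hear about it when the value actually changes.
    input  event  float gain  [[ name: "Gain", min: 0.0f, max: 1.0f, init: 0.5f ]];
    output stream float out;

    float currentGain = 0.5f;

    // The handler's name and type must match the endpoint declaration above.
    event gain (float newGain)
    {
        currentGain = newGain;
    }

    void main()
    {
        loop
        {
            out <- in * currentGain;
            advance();
        }
    }
}
```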

    So, I thought I'd show you some Cmajor patches before we do anything else, but would we like a quick break to get everyone up to speed with VS Code? What do you think: should we take a quick ten-minute break, or shall we start coding? Let's start coding, let's do it. Okay, so what I'm going to do now is switch the display so I can actually see what I'm doing. There's Wi-Fi here if anybody needs it. Good question: do I want to start with this one? Yeah. All right, has everyone got Cmajor installed, or are people going to do this while I talk? Any thoughts, vibes, questions? All good? Still installing? That's fine. What I'll do is talk through these examples, then we'll take a quick five-minute break and I'll come around and make sure everyone's set up and has everything they need. This is the first patch I wanted to show you. First of all, can everyone read this? I can make it a bit bigger if need be. Is everyone good? Maybe that's better. All good at the back? Brilliant, fantastic. So this is using the topology that Alex and Anna explained before.

    We've got this graph that takes in our input and our output, and then we go through this gain processor, which applies an arbitrary value to make a gain, basically. What I want to focus on here is the topology of the graph. First we declare our inputs, then our outputs, and then we've got these nodes; nodes are generally how you instantiate a processor within a graph. I could add to this example: I could write "node gain2" and make it equal to the gain processor below, like that, and then I've got another instantiation of it that I could connect in. There are some fun syntax things we can do with the connections, which I'll get to. What we've got going on here is: we go into the input stream of the gain processor, we do our DSP, our maths, and then we come through our output stream to go out. Nice and simple. I just wanted to show this as an example to highlight the structure. I'm going to mute my sound for a second so I don't get a feedback loop when I run this patch.

    So if I now run this patch in our VS Code panel (just give it a minute), okay, great, this is our VS Code panel; it looks weird at this font size. As you can see, we've got this gain knob, and I'm sure if people are playing with it right now, online or in person, you can hear a difference in your microphone. You can see our input here, so we know it's working; I can actually feed in a file here as well, so we don't have to listen to just me talking, and we can also monitor the outputs. What I wanted to show here is the graph topology view: you can see we're taking our input stream, which is me talking right now; we've got our gain value, which comes from this knob here, at 86 right now; we go into our gain processor, which you can see is named after the node, and it also tells you which processor that is; it does our DSP, our maths; we get our output; and we go straight through to the out.

    And that's maybe the simplest gain I could make while still showing something. So let's go into the second gain patch. What I've done here (actually, I'll explain this first) is use an input value. This is what I was talking about earlier: I've made two patches that do the same thing, but the first one uses an input value and the second one uses an input event. As you can see, I've annotated this input value; when it gets updated in the patch, when someone moves that knob, it will update, it's computationally inexpensive, and that's how we get our value. In gain processor two here we're doing the same thing: I've annotated the event, we've got an input event, but instead of just reading the value in the main loop you have to actually assign it using the event handler for this endpoint. The topology of the main graph is the same; the only thing that's changed is that we're using an input event. And we should see the same thing here as well, if it wants to load, of course... there we go, exactly the same. We've got a knob initialized to 0.5; we're obviously not going to go above 0.9, because if we go to one we'll get an infinite feedback loop, which would not be nice. And it's the exact same thing: we've got our in, we've got our gain, we go into our gain processor, and we output our audio stream. Amazing.

    So what happens if you want to generate something, rather than just make something a bit louder? This is our annoying-beep patch, and as you can see I've only used a gain here, because what we've got in the Cmajor standard library is a way to make sine waves, and you can define the frequency. So what I've done is make a node which refers to a processor, std::oscillators::Sine, and told it that I want a float type and a frequency of 440 Hz. I've added a smoothed gain as well, just because it sounds a bit nicer, and we'll play it for you now (bear in mind I don't want to damage people's eardrums). As you can see on the topology view, there we go, we've got a wave; I can change the volume so it's not so annoying, or I can make it more annoying, and I can change the frequency, like so. Now let's get rid of that, because it is an annoying beep. As you can see in this graph topology, we're not taking an input stream from the user: we're passing values in, to the gain node and to the frequency of the oscillator node, and that's just what makes those knobs on screen that we can edit. Then again here we've got the connection into the gain in, and the gain out. Nice and simple.
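A minimal sketch of a patch along those lines, using the standard-library oscillator (std::oscillators::Sine and std::levels::ConstantGain are my assumption of the relevant library processors; the workshop's actual patch uses a smoothed, controllable gain instead of a constant one):

```cmajor
// A sketch of the "annoying beep": a 440 Hz sine through a fixed gain.
graph Beep  [[ main ]]
{
    output stream float out;

    node sine = std::oscillators::Sine (float32, 440.0f);
    node gain = std::levels::ConstantGain (float32, 0.15f);

    // With single endpoints, connections can be chained in one line.
    connection sine -> gain -> out;
}
```

The chained `sine -> gain -> out` form is the same one-line connection style mentioned next for the two-gains-in-series example.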

    The final example I wanted to show you is two gains in series. It's basically the same thing, but the nice thing about it is showing that we can do all the connections in one line: if you're not declaring a specific endpoint, that is, if you've only got one endpoint, you can just go in, to gain one, to gain two, straight to out. Okay, so I'm just going to make a new folder quickly and call it ADC. When you're working in VS Code, what you want to do is press Cmd+Shift+P on Mac or Ctrl+Shift+P on Windows, type in "Cmajor", and create a new patch; create it without a GUI for now, obviously. I'm going to place it in my folder here and just call it "hello". The reason I'm getting you to do this is that you need this .cmajorpatch file to actually run the .cmajor file. The patch file is JSON-styled; it's a manifest file, basically, and (I'm very sorry, I completely forgot to introduce this) it tells you the location of the source code, which is here. To actually execute a patch you can have the source file open, but you need to have this patch manifest in the same location as the .cmajor file. So when I run this patch, it's loading the .cmajorpatch file, which is this file here.

    That's my bad, and we've got another annoying beep. Brilliant. Okay, continue playing, I guess, for a minute. So, I've picked quite a simple synthesis idea for this workshop: I thought it might be cool to show some MIDI, make a noise, and just make a big racket, basically. This is a diagram of the Karplus-Strong string-synthesis idea. The Karplus-Strong synthesis algorithm is a method for generating realistic plucked-string instrument sounds, something like a guitar, a banjo, or instruments similar to that. It was developed by Kevin Karplus and Alex Strong in the '80s, and what we're doing is emulating the physical properties of a plucked instrument, so this algorithm is basically an example of physical modelling synthesis. Before I explain that diagram, I just want to show you this: my really crude drawing of a guitar from the side. I made it in Miro; it took me ten minutes. What you can see here is a very exaggerated plucked-string position. If you think about it, you're holding the string up with your finger or with a plectrum (probably not to that angle, because your string won't stay in tune); you bring it into this plucked position, you give it an initial excitation, a burst of energy, and then from there you'll see the string start to vibrate and decay over time. That's what this simple diagram is showing: we've got this input noise source, which is what we're replacing the plucked string with, and then we're going to feed it back on itself and decay it slowly over time, to make the sound of a plucked guitar.
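    For reference (standard Karplus-Strong material, not spelled out on the slides): the basic algorithm fills a delay line of $N$ samples with a noise burst and then feeds the delayed output back through a gentle low-pass filter, for example

    $$ y[n] = x[n] + \tfrac{1}{2}\bigl(y[n-N] + y[n-N-1]\bigr), $$

    so each trip around the loop takes roughly $N$ samples and loses a little high-frequency energy, which is what produces the decaying, string-like tone. The perceived pitch is set by the loop length, approximately

    $$ f_0 \approx \frac{f_s}{N + \tfrac{1}{2}}, $$

    with $f_s$ the sample rate, which is why, later on, the length of the delay determines the note that's played.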

    This is where I was going to have a break, but we've moved on. There's just one more thing we want to talk about before we get into this, because we want to discuss MIDI-related things: control and interaction. Alex, are you the one leading this one? So I'm going to pass back to Alex, who's going to give us a quick overview of control, interaction and MIDI.

    Okay, hello again. There's one more topic I'd like to talk to you about before we dive into the code, and it's control and interaction, as Harriet said. What is an electronic instrument good for if we can't control it? We can have an instrument output a constant sound, or a processor repeat the same processing pattern, but that gets a little boring after a while, right? Musically interesting effects arise from variations occurring over time that surprise our brain by breaking expectations, so we want our real-time audio systems to be interactive. Let's have a look at how this interaction can be made possible. Apart from signal inputs, unit generators in a processing system can also accept control events as input. We can distinguish between events that provide a way to alter the parameters of a module, and events that provide timed information about the notes that are meant to sound, their duration and pitch. This note information can be supplied either by a user performing on a device (a controller), or by a sequencer or a DAW, in the form of a score. How this data is formatted is dictated by the universally adopted MIDI standard.

    And since we'll be implementing a MIDI gate later on, let's take a closer look at it. MIDI stands for Musical Instrument Digital Interface, and it's a communication protocol. It was invented to enable musical devices and computers to pass data in a standardized way, and while it was initially designed with hardware communication in mind, be it synthesizers, drum machines or controllers, it is also used to transmit information in purely software contexts. The data that is passed is not inherently musical, but represents a set of instructions for controlling a musical device in real time, whether that's information about the notes being played or controls for parameter manipulation. MIDI data is packaged into messages of various types, which encode different aspects of a musical performance, so let's have a closer look at what kinds of messages can be sent and how they are built. MIDI messages are sent as a time sequence of one or more bytes. They consist of a status byte, which always comes first and describes the type of the message being passed, and it may be followed by a number of data bytes that describe additional parameters; how many depends on the message type. MIDI messages can be divided into two categories, channel and system messages. Channel messages are destined for a particular channel, while system messages are broadcast to all connected devices on all channels. Channel messages include note-on and note-off messages, which communicate which notes should start or stop sounding (these are the messages we'll mostly be concerned with for the purposes of this workshop), and pitch-bend control, which allows more fine-grained control of pitch. Then we have control change messages, which transmit information about gestural changes to modulation wheels, volume pedals, encoders, faders and other controls, and finally there are program change messages, for communicating the selection of a different patch. The two types of system messages are system exclusive (SysEx) messages, whose purpose is defined by the device manufacturer, and system real-time messages, which are clock messages for synchronizing devices.

    Now let's take a closer look at a note-on message. It's a three-byte message, and like every MIDI message it starts with a status byte, with the code 1001 in this case, whose last four bits represent the MIDI channel, used for routing the information to specific devices. The status byte is followed by two data bytes, where the first encodes the pitch value and the second the velocity value, which controls the amplitude: how loud the note is going to sound. A note-off message is very similar, except that the code is 1000 and the meaning of velocity is a little different: since the note this message targets is meant to stop sounding, the velocity can no longer describe the loudness of the note; instead it refers to the speed with which the note is meant to be released. One thing that merits a comment here is what kinds of pitches are represented by these messages. We've got seven bits, and seven bits allow a range of 128 numbers to represent pitch, and the MIDI specification prescribes that these values map to equal-tempered pitches. We can use pitch bend to escape that limitation, in combination with MPE, or MIDI Polyphonic Expression, which is an extension of MIDI.
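    As a concrete reference (standard MIDI facts, added here rather than taken from the slides), a note-on for A4 at full velocity on channel 1 is the three bytes

    $$ \underbrace{1001\ 0000}_{\text{status: note-on, channel 1}}\qquad \underbrace{0100\ 0101}_{\text{data 1: pitch } 69}\qquad \underbrace{0111\ 1111}_{\text{data 2: velocity } 127} $$

    and the equal-tempered frequency a note number $n$ maps to is

    $$ f(n) = 440 \cdot 2^{(n-69)/12}\ \text{Hz}, $$

    so note 69 is A4 = 440 Hz, which is the conversion the note-selector processor relies on later.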

    Naturally, we don't want to deal with raw MIDI messages in our programs; that's not very fun. For general-purpose languages there exist libraries for parsing MIDI messages, and many music-specific languages, such as Cmajor, also offer libraries with abstractions that provide a friendlier interface to the information encoded in MIDI. They also offer converters that translate raw MIDI data into those and further abstractions, and vice versa. That's enough from me, so I'll hand over to Harriet again, who will walk us through the algorithm.

    Thank you, Alex. Hi, it's me again; you might remember me from five minutes ago. Okay, so I've split this section into four branches on our GitHub repo, basically. Do you want to see this thing making some noise first? Should we do that? Let me show you; it's only taken us an hour and ten minutes to make some noise. Okay, can everyone see that? Brilliant. Let me bring the font size up a little again, because I will be going through every single patch. This is the Cmajor patch that I want everyone to write with me today, and I will be coming around and helping. This is our implementation (oh, what happened there), it's our implementation of Karplus-Strong. This is the repo you'll see: I've done a little bit of a README that tells you where things are, and there's a little markdown file that explains the very basics of this patch, but don't worry, we'll go through all of it. Oh, you can't see my display, I just realized; technical issues, "keep changes", please. Okay, brilliant, now you can see it again. So, what I was saying: after this workshop, if you find yourself looking at the repo again, please refer back to that markdown file if you don't want to go through all the slides, but I will be explaining every single bit now. So this is the patch: as you can see, we take the MIDI in that Alex just explained to us all, and we've got a nice voice, a nice processor, a note selector and a delay. This is what we're going to be doing today. If I run this patch (it will take a million years to load), this is what we're going to be making today. We're not going to be making a GUI this morning, but you'll see this keyboard pop up because our tooling provides one. Oh hey, you've got MIDI coming in; wouldn't it be nice to be able to do that? So I press my keyboard and, ha, nothing comes out. Well, it works on screen, so let me check my MIDI devices... ah, it doesn't matter anyway. So if you run this patch now, you'll see this, and we get this sort of string-sounding thing.

    It's not perfect, because we've built a very basic thing here, but the idea today is that we make this basic patch and then you can go ahead and make changes: either make it a bit more accurate, or take it in a completely different direction, add a different noise source and give it a lo-fi vibe. The nice thing about writing a synth is the ability to do all this stuff, add additional DSP, and work with it as it goes. So this is our string sound. There's quite a lot going on in it: we've got that short burst of white noise, then we're decaying it as we feed it back around, and we're using a filter to take off the top end. Okay, this is going to get old real fast, doing this; extend display, there we go. You'll find all the branches here. If you're not set up with Git, that's something we're not going to cover today, so please feel free to download all of this as a zip file and go through each branch as needed. So this is our lovely diagram again, and I will be referring back to it throughout the process.

    The first part of this is something I've already alluded to: we've got this noise source, and this is what we're using to replace that plucked-string action. Generally, in signal processing, noise is an unwanted modification to a signal: say, between this microphone and this wire, there'll be noise if it's a bad wire or a bad connection; we'll have unwanted noise. This can occur during capture, storage, transmission, processing or conversion. But noise can also mean signals that are random, and those can actually be introduced intentionally, which is something a lot of synthesis does. Noise in signal processing can be classified by its statistical properties; I'm not going to go into the mathematical details today because we've only got two more hours. These are the examples I wanted to show you today, because I wanted to implement them in Cmajor: white noise, pink noise and brown noise, and I'm sure a lot of people are aware of how those sound. So how do we trigger this with MIDI? We need that burst of white noise to emulate the guitar string being plucked. The idea is that when a user presses a key, a little noise burst occurs. We don't want someone to keep the key held down and continue to get the noise burst; we want it to blip on, blip off, and we're going to implement this with a noise gate. We will set the gate to be open for a length of time, and in fact, with Karplus-Strong, we're going to use the frequency of the note to determine this.

    So if I check out branch one quickly (this is "part one: MIDI", for everyone following along), and I'll go back to duplicating my displays, "keep changes", okay, brilliant. I'm going to write this patch with you as well, don't worry, but let me just show you. Yes, it's on GitHub, on our Dynamic Cast repo; it's called ADC23. There should be some links online, and I'll post a link in the Discord channel, if you're on the Discord channel, or people can pass it around. While that's happening I'll just show you this quickly. Is everyone good, has everyone got that? Okay, cool. This is what we're going to be writing in a minute, our part one, and what you'll hear is this: you see we've got no knobs, we've got no annotations, so we can't edit what's happening here, and you'll just hear a blip of noise when I press a key, like that. Sounds great, right? That's pure white noise, and that is our source. As you can see, I'm using a standard-library tool for this.

    Okay, so I open this to the side, make a new folder, "karplus", and create a new patch; what I'm doing here is creating an empty patch so I can code it along with you. Hopefully by now everyone's used to seeing this: the file was auto-generated and so on, so what I want everyone to do is get rid of that, so we've got a blank screen like this. What I'm going to do, while you're doing that, is type in "graph Main" (this is something I've been explaining ad hoc while going around the room), and we add this main annotation to the first graph. It's a special annotation that the compiler looks for, and it basically gives the compiler an entry point, so it knows: aha, this is where we're starting, this is what I will execute. This is where we'll declare our endpoints for input and output, this is where we'll have our nodes, and this is where we'll connect everything together. Fantastic. Then we'll write, with the correct syntax, an input event, and instead of taking in a float value or anything like that, we're taking in this std::midi::Message, which is what encapsulates the MIDI message; so we type in std::midi::Message and call it midiIn. That's our keyboard press. Then what we need after that is an output, so we'll do an output stream. As I explained earlier (and I'll explain it again now), the input event means we're listening for a keyboard key being pressed, either on screen or off screen, and the output stream is the audio coming back out, in a continuous, frame-by-frame manner.

    And there's our first patch: we've got an input, we've got an output, and now we need to do something with them. So what I want to do is create our first processor, and this is where we're going to do some processing, some DSP of some form. I'm going to make this thing called a MIDI gate, and what I want to pass into this MIDI gate is some events (I'm just going to make sure I get the syntax correct here, the joy of a live demo). As Alex explained earlier, the MIDI format has these note-on and note-off messages, so that's what I'm passing into this processor: we want the note-on and note-off events, then a float, which is where I'm defining the gate length. As I explained earlier, this tells the gate how long it should stay open before we stop emitting white noise; we don't want it to go on forever, we want those blips you all just heard. Then, obviously, we want to output something from here, like this. Let's put that together, like that. Okay, brilliant. And as mentioned before, we've got these event handlers, so I'm going to retype them and talk through them as we go: the handler for the note-on event, where we're just calling the incoming note "e", as an arbitrary name.

    So what I've basically done here is take in this note-on event through an event-endpoint handler, and now I've got to tell it to do something. What we want the note-on to do is trigger a gate, like a boolean value, let's say: one for open and zero for closed. So what I've got here is a variable called activeNotes (I'll declare it down here, to stay in line with what I've already written), and we'll set it to closed for now, so it's a zero. It's very simple from there: on the note-on we want to make activeNotes equal to 1, and we do that with this simple bit of syntax, like that. Now, if you were to run this patch, nothing would happen, because we haven't actually connected the nodes up yet; if you ran this processor on its own it would take the MIDI in and just change this activeNotes value, and you wouldn't hear anything, obviously, since nothing is connected up yet. Then, now that we've told the gate to be open, we need to tell it how long to stay open before we shut it, and that's what we're going to do with the gate length, which comes from the value we pass in. So we set up a variable called gateDurationInSamples (let's do that quickly), and we do exactly the same thing as above: we basically set gateDurationInSamples from the gate length. And that's all the events handled.

    So: we've pressed the MIDI note, we've told the gate to be open, and we've also told it how long it should be open for, based on some maths we're about to do that gives us the frequency of the note we pressed from the MIDI message. Okay, and this is where our fun MIDI gate lives, so I'm just going to copy this in very quickly. If you remember from the earlier patches, we've got this void main(), and it runs in an infinite loop, and I've written this if-statement; this is our noise gate in action. If activeNotes is greater than zero, it's one, so the gate is open; so we output that and say it's open, and we stay open for the length of the gate duration in samples; then, once that counts down below zero, we set activeNotes back to false, so we close the noise gate. That's what's happening here, basically, and that is the processor for the MIDI gate.
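The actual processor lives in the workshop's ADC23 repo; the sketch below is my reconstruction from the description above (endpoint names, the extra samplesRemaining counter and the way the audio is gated are guesses), just to show the shape of it:

```cmajor
// A sketch of the MIDI gate: pass the incoming noise through while a note
// has recently been triggered, and silence it once the gate time runs out.
processor MidiGate
{
    input  stream float audioIn;      // the white noise fed in from the graph
    input  event (std::notes::NoteOn, std::notes::NoteOff) eventIn;
    input  event float gateLength;    // how long to stay open, in samples
    output stream float audioOut;

    float activeNotes = 0.0f;
    float gateDurationInSamples = 0.0f;
    float samplesRemaining = 0.0f;

    event eventIn (std::notes::NoteOn e)    { activeNotes = 1.0f; samplesRemaining = gateDurationInSamples; }
    event eventIn (std::notes::NoteOff e)   {}
    event gateLength (float length)         { gateDurationInSamples = length; }

    void main()
    {
        loop
        {
            if (activeNotes > 0.0f)
            {
                audioOut <- audioIn;            // gate open: let the noise through
                samplesRemaining -= 1.0f;

                if (samplesRemaining <= 0.0f)
                    activeNotes = 0.0f;         // time's up: close the gate
            }
            else
            {
                audioOut <- 0.0f;
            }

            advance();
        }
    }
}
```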

    So we've got everything we need to make this MIDI gate; the only thing we're missing is the frequency from the MIDI message. We don't have too much time, so I'm also going to cheat and do a cheeky copy-and-paste here, the idea being that it gives you a couple of minutes to catch up and I can help people and take questions. We've got this thing called a note selector, and we've done something very similar to the processor above: we're taking in our note-on and pitch-bend values, this time from the MIDI, and we're outputting a note frequency. Basically what happens is: we take in our note-on, we take our pitch bend, and we use the standard library to convert the note into a frequency. If you've got a MIDI instrument that can do a pitch bend, or has a slider on it, you can see this in action, and it will change the length of the note. One thing to mention here is the length of the... actually, I'll explain that later.
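Again reconstructed rather than copied from the repo: a note selector along these lines listens for note-on and pitch-bend events and emits the corresponding frequency. The std::notes event types, the noteToFrequency helper and the bendSemitones field are my assumptions about the standard-library names:

```cmajor
// A sketch of the note selector: turn note/pitch-bend events into a frequency.
processor NoteSelector
{
    input  event (std::notes::NoteOn, std::notes::PitchBend) eventIn;
    output event float noteFrequency;

    float currentNote;   // the last MIDI note number we saw

    event eventIn (std::notes::NoteOn e)
    {
        currentNote = e.pitch;
        noteFrequency <- std::notes::noteToFrequency (e.pitch);
    }

    event eventIn (std::notes::PitchBend e)
    {
        // Re-emit the note's frequency shifted by the bend (field name is a guess).
        noteFrequency <- std::notes::noteToFrequency (currentNote + e.bendSemitones);
    }

    void main()
    {
        loop { advance(); }
    }
}
```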

    We’ve got the patch again we’ve got a keyboard that pops up and we don’t have any sound exactly so nothing happens here yet but we’ve got our nice keyboard on screen so what we need to do is hook everything up I’ll keep that open for now so we’ve made our two processes that

    We need so we’ve got node gate and we’re going to make this equal for the midate we just made we do no so this noise source is coming from our standard Library which basically just means we don’t need to generate White Noise ourselves like that and now we’re going to do our

    Connections I can’t remember how I did this in here hang on yeah okay so this is the syntax I was talking about earlier so with connections you can either use the word connection and do things like this and then you have to retype every time you want to do a connection which

    Isn’t always the most useful if you want to set all your connections at the bottom maybe you do this and now that means that you’ve encapsulated everything within this uh these curly brackets to be a connection so we can do our Midian here we’re not going to put a space

    There we’re using this uh stood midi MP converter to make the message MP converter like so and then what we’re doing is we’re telling that to go into the gate cool and then we need to connect our noise source connect that to the gate and connect this up to the output

    Of the output stream H see I can’t even get my camel casing right there we go there we go okay that out the joys of a live demo okay and then this final section is for the note selector and this tells the gate how long long it’s got to be open

    For so I’ll just copy this in actually and I’ll move this off to the side okay great and that should mean that this works and we’ve got that burst of white noise again what you’ll also notice is what I’ve added to this note selector is this console logging and this is just a

    Method of actually looking at the values that you’ve created here so if I press an a note you’ll see that it’s 440 which is what we want so the gate length is going to be open for 440 samples basically um let me just leave this on
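
    To give a feel for the overall shape, the top-level graph at this point might look roughly like this; the names are illustrative, and I'm assuming the gate's event input can accept the note events the converter produces:

        graph NoiseBlip  [[ main ]]
        {
            input event std::midi::Message midiIn;
            output stream float out;

            node
            {
                noiseSource  = std::noise::White;
                gate         = MidiGate;
                noteSelector = NoteSelector;
            }

            connection
            {
                midiIn -> std::midi::MPEConverter -> noteSelector.noteOn;
                midiIn -> std::midi::MPEConverter -> gate.noteOn;
                noteSelector.noteFrequency -> gate.gateLength;

                noiseSource.out * gate.out -> out;    // white noise, gated by the 1/0 gate output
            }
        }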

    And that's it: that's our MIDI gate, all typed out. So let's move on to the delay. Actually, I won't flip back to the code just yet, because it involves changing the screen size.

    Thing that maybe you’ve noticed while playing with this thing is it doesn’t matter which note you press on the keyboard that blip is always the same like pitch right it’s we don’t have any pitch happening that’s because we’re only outputting white noise we’ve not actually done anything do to it yet to

    Actually synthesize anything so this is part of what a delay line will do so in the carpus strong um algorithm the length of the delay will determine the pitch that’s played and this um the idea of this delay is it mimics the string vibration and the Decay over

    Time so I just wanted to add in a little bit about a buffer because we’re going to be adding in a circular buffer to do this so a buffer is a region of memory used to store data temporarily while it’s being moved from one place to another like generally speaking uh and

    We need methods to write and read to the buffer and this is done with the read and write indexes which I’ve put in these fun colors here this is just a pointer to say we’re at say here we’re at index zero we’re reading from index zero and maybe we’re writing to index

    Zero so the only thing that’s going to change from this slide and the next slide is I’ve made it a circle and this is the circular buffer this is a data structure that uses a fixed size buffer as it were and connects end to end in a circle uh and

    As it’s fixed sized we’re going to take out the oldest item of The Bu buffer we’re going to push that out when more data is added uh the cue is that so this is like a first in first out buffer fiveo buffer um we use the same method

    To read and write to the buffer using pointers of the data um there’s a this sort of technique of memory allocation uh it does have benefits especially when you’re working with a lot of data in real time uh as more data is added to the structure the old data will be
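
    As a toy illustration of that idea, here is a fixed eight-sample delay written with Cmajor's wrap<> index type, which automatically wraps back to zero at the end of the buffer; this is a sketch of the concept rather than part of the workshop patch:

        processor EightSampleDelay
        {
            input stream float in;
            output stream float out;

            float[8] buffer;    // tiny fixed-size circular buffer, purely for illustration
            wrap<8> pos;        // index that wraps 0, 1, ... 7, 0, 1, ... automatically

            void main()
            {
                loop
                {
                    out <- buffer[pos];    // read the oldest sample (written 8 frames ago)
                    buffer[pos] = in;      // overwrite it with the newest sample
                    ++pos;                 // the wrap<8> type handles the wrap-around for us
                    advance();
                }
            }
        }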

    OK, so that's enough talking; let's go back to typing. Will I change the display type? If anything, this is a good advert for the settings panel on Windows 11.

    What I need to do now is get rid of my changes (just a couple of changes, yeah), and then I'm going to flip to the part two delay branch, open the correct VS Code panel, and zoom in a bit so I can read it on my screen. OK, brilliant. So this is what we're going to be typing out now. As you can see, the flow has changed a little bit: instead of going straight to the output, we're going into a delay, multiplying it by a feedback amount, putting one frame of delay in, and sending that to the output, and it sounds like this. We're also going to add some annotations; as you can see, we've got a feedback annotation. And now that sounds like a string, right? A little bit. You can hear that sort of decay as the string stops vibrating, and we've got pitch. A bit tinny, but we'll get to that.

    OK, cool. What I'm actually going to do is switch back to the original branch, copy what we've already done, go to this new branch, put that to the side and create another new patch (I'm sure there's a simpler way of doing this), so now we've picked up where we left off. I'm going to make one simple edit before we get going: I'm going to remove the console logging. You can keep it in if you want to watch the pitch changing and so on; I'm just doing this to make some space on my screen.

    So I think I'm going to start by writing this delay processor. We'll do the same thing again: processor Delay, with our curly braces.

    Let me just show you the finished code quickly... yeah, nice and simple. We're just working with floats now, basically: input stream float in, and then we're going to output some values as well. Obviously this is going to go through our circular buffer and come back with the delay. Oh, I should also mention the delay length here, which I didn't define as a float. Perfect. So there are the inputs and outputs for our delay processor: we take an input stream of data, output a stream of data, and then we take an event that tells us the delay length, and as I said earlier, that's based on the note frequency you get from the MIDI note you're playing.

    I'm actually just going to copy this in line by line; let's look at these four lines first. By the way, the reason I'm doing this is partly because I can't remember the whole patch off by heart, and partly so I can talk through each line as we go. So, as I mentioned earlier, we're handling our input event: we've got this event handler, we've called it delayLength, and we're taking in the delay samples. What I've done down here is use this wrap type, which is an integer that wraps at a set length, and I've set a max delay length of 1,000. If anyone's used to JavaScript land, you'll know the let keyword as a constant; we could use var instead if we needed the value to change, but in this context we want let, because we want it to stay put. I've set it to a fairly arbitrary length, and as I mentioned earlier, we need the circular buffer to be a specific size so we can loop back around and start rewriting. That's what's happening here: we've made a float array of that max delay length, and we've got these wrap types, which are our pointers to the read and write positions.

    As for what's happening with this event (I'm going to bring it down so it's easier to show you; that's just my syntax-tidying OCD), we're basically telling the read position to be a set distance away, based on the delay samples: this line takes the frequency of the processor, divides it by the number of samples, and that's where we're going to read from. And now we write our main loop as before: we're not passing any values in, so we've got empty brackets, and then we go into our infinite loop here.

    This is a fun thing about Cmajor. Let me just run this quickly (play some hold music while it loads). Anyway, as it's loading: processor.frequency is a keyword, and it looks up the frequency this processor is set to run at. If you come down to your audio device settings in the panel, you'll see that my sample rate is set to 48,000 Hz, so this is just a way of getting at that value. And yeah, apologies for the very brief explanations here; it's just because of time, I don't have time to go through everything, but please keep asking me questions as we go.

    OK, so we've written our infinite loop, and now we need to do the circular buffer itself. If you've ever written a circular buffer before, you'll hopefully notice that this syntax is very similar to what you've done previously. I'm just going to paste it in here and get rid of this. What we're doing is telling our output (our output to the world, since as I said earlier the processor goes through ins and outs) to be whatever is at our read position, so we're reading from the buffer and sending it to the output, and then we set whatever is at our write position to be the input.

    Then we just advance the read position, advance the write position, and keep going; again, this all happens in the infinite loop, and that's the delay processor written out.
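
    Pulling those pieces together, the delay processor ends up looking something like this; treat it as a sketch rather than the exact workshop code, and note that the wrap<> cast and the endpoint names are my assumptions:

        processor Delay
        {
            input stream float in;
            output stream float out;
            input event float32 delayLength;    // receives the note frequency from the note selector

            let maxDelayLength = 1000;

            float[maxDelayLength] buffer;
            wrap<maxDelayLength> readPos;
            wrap<maxDelayLength> writePos;

            event delayLength (float32 noteFrequency)
            {
                // Karplus-Strong: delay length in samples = sample rate / pitch
                // (assumes the result is always smaller than maxDelayLength)
                let delayInSamples = int (processor.frequency / noteFrequency);

                readPos = wrap<maxDelayLength> (writePos + maxDelayLength - delayInSamples);
            }

            void main()
            {
                loop
                {
                    out <- buffer[readPos];    // read the delayed sample out of the buffer
                    buffer[writePos] = in;     // write the incoming sample in

                    ++readPos;                 // both indexes wrap around automatically
                    ++writePos;
                    advance();
                }
            }
        }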

    Delay as it were so I’m going to add this in first so this is an input value and we’re going to use the special if I can uh spell the word feedback we’re going to use this special annotation and this tells C major that

    We need a knob on screen of some form uh I’m just going to copy this straight in again for time uh but please type this out as it comes there we go and that will now show on screen I actually not sure if it will don’t know why it’s taking so long

    To load today and there we go we’ve got a knob on screen it’s not attached to anything right now so it won’t do anything but you see I’ve just set an arbitary value here um you can change them in a Max and you can initialize to be from wherever you

    Want so that’s now that knob on screen okay so what we did earlier we’ve been defining all these nodes so now we want the delay node so we do node not noed node delay and it’s called delay CU I process called delay brilliant and now we need

    I'm actually just going to get rid of these and copy this in, and I'll talk you through what's happening here. Brilliant. So, as well as the gate needing its length set by the note selector, we're doing the same thing with the delay length; you'll see it's an arbitrary value, and to save space I've used this comma syntax here. This is MIDI in, going through our MPE converter to make the values that we need for the note selector, and then going straight into our delay length and gate length. After that we need to go into the gate's event in, as earlier, to say "hey, we've got a note on, we want you to open the gate". Then we're doing the same thing as we were doing here earlier: we've got our white noise and we're multiplying it by gate.out (sorry, I confused myself for a second there), which is our one-or-zero value. If it's zero, we're just multiplying the noise source by zero, so it outputs nothing; when it's one, it's just the noise source. That goes into delay.in, which as you can see is an input stream.

    Then what we're doing from there is multiplying by this feedback value, which is why I've set its max to 0.99, by the way, because otherwise we'd have an infinite feedback loop. And this annotation here tells Cmajor that we just want to wait for one frame before putting it back into the delay in, which again stops a feedback loop. Then we go from the output of the delay straight to the output. If that all works, and we wait for it to load, you'll see we've got the GUI with the feedback knob and we can make a noise. And now we've got, not an annoying blip, but a short blip with low feedback, or with more feedback something that sounds more like, I don't know, a short resonant body, a sort of string instrument, right? And that's part two: that's the delay line. Again, I'm going to leave this on screen and give everyone five minutes to catch up.
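
    For reference, the part-two connection block might look roughly like this; the endpoint names are illustrative, and the [1] is the one-frame delay that breaks the feedback cycle:

        connection
        {
            midiIn -> std::midi::MPEConverter -> noteSelector.noteOn;
            midiIn -> std::midi::MPEConverter -> gate.noteOn;
            noteSelector.noteFrequency -> delay.delayLength, gate.gateLength;

            noiseSource.out * gate.out -> delay.in;      // gated white-noise excitation
            delay.out * feedback -> [1] -> delay.in;     // feedback path, delayed by one frame
            delay.out -> out;
        }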

    Please refer to the branch code as well, and again I'll come around and answer questions. OK, so this is the patch we've just created; I hope that woke you up, and sorry, that was loud. We've got our delay, we've got our noise source, we've got our string excitation, and this is what we were talking about with the guitar: we're sort of seeing the string vibration, we're seeing this decay. But unfortunately, listen to the note: it's really quite tinny, and you don't really want all those higher frequencies. This is where part three comes in; this is where we talk about filters. I'm just going to go back to my slides. Can you see the filters? Yeah, brilliant.

    OK, filters. As we've just established, I think our instrument sounds great, but a million producers would probably tell you that it's not OK and the mix sounds awful, so please add some filters. This is going to be a very high-level overview of filters, by the way, simply because I've only got about fifteen minutes per section. A digital filter is a mathematical algorithm that operates on a digital data set (e.g. audio data) in order to extract information of interest and remove any unwanted information; in this case, that would be all those high frequencies we're hearing at the moment. The four common types, which I'm sure everyone here is aware of, are high-pass, low-pass, band-pass and all-pass, and we can divide digital filters into two categories: infinite impulse response (IIR) filters and finite impulse response (FIR) filters.

    As the name suggests, each type of filter is categorised by the length of its impulse response. I'm actually not going to show any mathematical equations today (maybe everyone's whooping with joy at that); I just don't have time, I'm afraid, so I just want to give a quick, high-level overview of IIR and FIR filters, so that when they come up in the talks over the next few days you have a sense of what people mean.

    IIR filters are generally chosen for applications where linear phase is not too important and memory is limited. They're easy to design, easy to implement, and they involve fewer parameters, so they use less memory and have lower computational complexity. The disadvantage is that, for mathematical reasons, they can become unstable, and they will accumulate rounding and noise error over time. Examples of IIR filters are the Butterworth, Chebyshev, Bessel, elliptic and so on.

    FIR filters are generally chosen for applications where linear phase is important and a decent amount of memory and computational performance is available. They're stable by design (again, maths), there's no phase distortion because they're linear phase, and they're easily customisable. The obvious disadvantage, depending on what you're running on, is that they have a larger storage requirement and they're computationally quite expensive.
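
    To make that concrete, here is about the smallest IIR filter there is, a one-pole low-pass with the difference equation y[n] = a * x[n] + (1 - a) * y[n - 1], written as a Cmajor processor; this is a sketch for illustration rather than the standard-library filter used later in the workshop:

        processor OnePole
        {
            input stream float in;
            output stream float out;
            input value float cutoff [[ name: "Cutoff", min: 20.0f, max: 20000.0f, init: 1000.0f ]];

            float lastOutput;

            void main()
            {
                loop
                {
                    // rough coefficient from the cutoff frequency and sample rate (6.2831853 is 2*pi)
                    let a = 1.0f - exp (-6.2831853f * cutoff / float32 (processor.frequency));

                    // y[n] = y[n-1] + a * (x[n] - y[n-1]), the one-pole recursion
                    lastOutput += a * (in - lastOutput);

                    out <- lastOutput;
                    advance();
                }
            }
        }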

    I apologise for that whistle-stop tour of filters; we're going to go back to programming now, and this is part three. Hang on a minute, I forgot that I need to change my display... and we're back to duplicating so I can see what I'm typing. What I want everyone to do here: well, you can keep your own patch, because we're just going to add to it; I'm just going to discard these changes and go into the part three filters branch.

    Filters what we’re not going to do is look at the wrong patch okay so this is what you’ll see if you’ve checked out this Branch or if you’ve got it open anywhere else you’ll see that we’ve added a load of filters basically and this is what we’re going to do with this example

    So I’m just going to run it so you guys can hear what it sounds like it’s very similar to what we’ve just heard unsurprisingly okay it’s not going to work what’s Happening Here three Filters there we go someone set the gain to be 0.1 it was me I did that apologies guys if you checked out the uh the repo that’s there please change that um so I don’t know if you guys can hear the difference from the one we played about 10 minutes

    Earlier we’ve got more of a sort of refined sound as it were we sort of low path filtered all those highend frequencies and was starting to sound a bit like a guitar now um cool let me make my patch again please without the Mo okay fantastic so this is the code that we

    So this is the code we were just looking at; this bit is just for my own use here, so let me close that patch. OK, so this is what we had previously, and I need to put it to the side. You can see that I've also got rid of the feedback knob. What we're doing here: we're not actually going to implement any filters ourselves today; I just want to add some filters into the sound, so that you can then change them, modify them, or add in different filters. Part of the Cmajor standard library is a set of one-pole and Butterworth filters, so that's what we're going to use today. I'm not going to go into too much detail here, but I will answer questions as they come.

    As you can see, at the bottom, below the delay, I've added these two nodes: we've got a filter, which I've just called filter (it could probably have a better name), and a node lowPass as well, and we've also added a gain, which is what tripped me up about a minute ago. This is all we're adding in this section: the two filters and the constant gain, and yes, it's 0.9f, not 0.1f. So we need to connect these up. In the previous example I set up that feedback annotation just to show you how to get started with annotations, but here I'm actually just going to set it to a fixed value. There's no real reason for this; I'm just doing it to save a bit of screen space. You can keep it as a feedback event or a value if you want, no problem there. I'm just going to copy and paste this all in.

    In and I’ll put this in here and just comment this out quickly the only reason I’m doing this is so I can immediately show you guys some some comparisons so you can see the Top Line stays the same noise Source start out gate out still going to the delay in that’s because

    We’re not filtering obviously our events happening there from the output of delay we’re going into filter dotin and this is um this is a closed thing we’re not actually going to see this because it’s happening in a standard Library processor we’re going out to a game

    Which is this one here and then we’re outputting oh sorry my uh throat’s killing me and then from the noise source so what’s happening here is we’ve actually added in the noise Source into our final value as well and this is because we want a bit more of an

    Excitation of that blip so we actually want that initial blip that we made in part one but we want to get rid of some of those higher frequencies so we’ve got noise Source going into the uh gate going into the low pass taking the output so this is just telling it again

    That’s the one Zer that’s telling if it’s off or on it’s going into our low pass to get rid of some as high frequency we’re going on from the output of that multiplying it by that same gain which is again9 in this case was it yeah9 f and going into our

    Output if I run this wherever it wants to Run there we go I’ve left the feedback annotation on here but it’s not actually connected to anything so we can go ahead and comment that that out and we’ll see that it’ll hot reload and disappear from our screen here just fore I’m also going to get rid of

    This and hopefully this time we’ll hear Something there we go okay great so if you guys want to go ahead and add these filters in it’s these two here um it might be worth actually linking to our standard Library so I can show you guys what these all mean actually so um again I’ll give you guys
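
    As a very rough sketch of that routing, using the little OnePole from earlier as a stand-in (I don't want to guess the exact standard-library filter names here), the part-three additions might look something like this:

        node
        {
            filter  = OnePole;    // stand-ins for the std library one-pole / Butterworth
            lowPass = OnePole;    // filters used in the workshop
        }

        connection
        {
            noiseSource.out * gate.out -> delay.in;      // excitation into the delay, as before
            delay.out * 0.9f -> [1] -> delay.in;         // fixed feedback instead of the knob

            delay.out -> filter.in;                      // filter the delayed signal...
            filter.out * 0.9f -> out;                    // ...and scale it by the constant gain

            noiseSource.out * gate.out -> lowPass.in;    // low-passed copy of the initial blip...
            lowPass.out * 0.9f -> out;                   // ...mixed into the output as well
        }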

    Again, I'll give you another five minutes or so; we'll come back at about quarter past and continue on to the next part, which is going to be polyphony and the final patch. [Inaudible audience question and a short discussion.] OK, let me just get this ready. Actually, I'm not even going to clone it this time; watch me live-push to the repo. That's me fixing my error from earlier. I don't even know when I would have set it to 0.1; that makes no sense to me, and I don't know where that 0.1 value in the constant gain came from. That would be me last night, panic-typing these branches out and not checking them later. Anyway. All right, guys, thanks for bearing with me while I went through some questions.

    So, that was filters; hopefully that gives you some groundwork for talks later in the conference. This is the final part now, and we're doing well on time: we've got forty minutes, which means we get to talk about testing, which I hope is everyone's favourite part of any workshop, writing a unit test, woohoo. We've got a lovely QA engineer, Emma, here to tell us why testing is important; by the way, she's doing a talk about it tomorrow, which I would highly recommend going to, because it's very funny and it makes me want to write a test.

    OK, so one thing you may have noticed while playing with the patch (maybe not, obviously: if you're using the VS Code panel you won't be able to press keys like I can), but if you try pressing two keys, you may have noticed that it's not working. We're monophonic right now, and monophonic basically means one note happening at a time: we press a note, wait for it to play, and can then trigger the next one. So polyphony is multiple notes; makes sense, right? It's the combination of two or more tones or melodic lines, and that's how we define it in music, so our synth needs to handle that, really. I mean, we could just leave the instrument there and have a guitar synth that only plays a solo to some sick Led Zeppelin, and that's great, but a lot of guitarists, like myself, actually need to play chords. For this we use a thing called a voice allocator.

    This graph here shows a simple voice allocation algorithm. A voice is like a voice circuit, and the number of voices is the number of notes that can be played at any one time in a polyphonic synth. This comes from hardware land: the number of voice circuits a synth had was the number of notes you could play at once, and that idea has carried over into software, where we still call them voices. So say a three-note chord is played (you play C, E, G or something on your keyboard); those notes are assigned to voices one, two and three respectively. In the digital world we've recreated this with the same idea: voices that are dynamically assigned to play notes. There are some algorithms that make this allocation quite effective; I'm going to use the standard library to do it, and we're going to use eight voices that are dynamically allocated.

    So what I'm going to do is flip back. This is branch four, and this is part four, polyphony.

    Everyone: I'm not going to go through this one copy-and-paste by copy-and-paste, because the only thing I do here is add a voice allocator, so this code is going to look a little bit different from what we've just made. As you can see, all our nodes have gone. What's happened here? Ah, we switched screens, thank you; I like how you all just played along with that as well. Does it look different now? Can you read that, or should I bring it up a size? So: all of our nodes have gone, and what we've done is create another graph. What you'll notice in this new graph is that all of the nodes we originally defined, and the processors we wrote, are now in here (that hasn't changed), and the connections are all the same as well. This graph is handling our polyphony, so let's go through it. What I'm doing here with this node is instantiating an array of eight of that graph, so there are effectively eight voices available all at once.

    It’ll allocate from one to eight um and then we’re using our voice allocator to do that so we’ve got this thing called voices and voice allocator and what you’ll see what I’ve done here is I’m taking the Midian and putting it through our EMP e converter again and we’re going straight into our

    Voice allocator and once that takes the event out and goes into our voice here notes in so the topology of this graph looks exactly the same uh again I’m skipping over this because we don’t have too much time to actually Implement a voice allocation algorithm in the time

    Given and then what we’ve done here as well is add these annotations so I’ve brought back that feedback annotation we had earlier you guys remember you probably already kept it in and what I’ve done here is add two more annotations it’s impulse length and filter and that basically means that

    We’ve just got a a little bit more control on the default guy which means that this thing’s a little bit more interactive because when I wrote this I realized I had nothing on the screen and you could only make one type of noise um so if I run

    This what I’m going to do now as well so this is our final patch this is where we leave RC major um what I’m going to do is show you it and then maybe show you some edits you could maybe make and maybe just to get you guys in spired

    Because I want you guys to have a little bit of a play in this make this instrument your your own that’s might being like changing the filters and things like that so if we have a look at our vs code panel we now got three knobs

    On screen which is our delay length our filter frequency and our feedback um I’ll just keep it here as well and once again is it also still 0.1 in here as well it is look at that well done me now it will make a noise what’s going

    On okay just for reason I’m going to switch back to main because this is what’s also on Main as well and we’ll run this again it’ll be a proper laugh if this doesn’t work maybe it’s the keyboard yeah I do I just want to yeah now we’re getting a

    Noise yeah it is I’m just uh editing some of the settings as well there we go yeah sorry uh my volume settings also changed as well in between T which means that the other branch is probably also working so this sounds like earlier we’ve not changed the filter or anything we still

    Got the same initial values but now I can play a C major chord brilliant do you get the pun do you get it guys come on work with [Laughter] me uh so give that a go check out that Branch or if you want to copy the code

    I can either put this connection on here, or put it back onto the voice graph (wherever that's gone... oh, that is the voice), whichever one's easier. One final thing I want to show you, because I've had a few questions about it today, is this graph topology view. I explained this earlier with a very simple example: we took a value in for the gain, a stream in for the gain, had a gain processor, and had the out. Obviously this is way bigger now, but you can actually follow everything through, just like you can in Max: you can follow the gate in, follow the gate out, you can see where we have our noise source and where the white noise is, and follow it all the way through. So if you're more of a visual learner (I am myself), I really enjoy looking at these graphs, and if you want to change a connection, it will hot-reload and you can see the new connection.

    The other nice thing about this is that if you want to change a specific thing, like this noise source, and you think "I want to go to this line", you can click on it and it will take you there. So, if you were listening earlier, you'll know that we have different types of noise source, so we can change this to brown noise, and it will sound like this; you can hear it's got a different timbre, a different style, now that the noise source has changed. We've also got pink, so if you want that lo-fi pad vibe, give that a go; this is the pink, and it's got a bit more high-end excitation, so you might need a higher-order low-pass filter or something like that. So give that a go.
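
    Swapping the excitation is a one-line change in the node list, something like this (assuming the standard library names the noise processors as below):

        node
        {
            // the standard library also provides brown and pink noise sources
            noiseSource = std::noise::Brown;    // or std::noise::Pink, or back to std::noise::White

            // ...the rest of the nodes stay exactly as before
        }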

    For this because we’ve got about half an hour left of this Workshop which means that we’ve got time to talk about testing again Okay let me put that on there right um everyone’s ready talk just yeah you got past the intro it’s gone to the good bit tests um tests tests very important if we’re going to send you out recklessly developing things you should at least have some idea how to test them as

    Well um yeah not not going to go too much into the c major syntax of it but just a couple of kind of Core Concepts of testing um is on on the main branch you’ll be able to find all this um have two tests so here we’ve got something

    Called test function and we’ve got a function that returns a bu and this is this is just all internal to itself itself so it’s adding um adding a vector to itself and checking it’s got the same thing so by itself this isn’t particularly useful if you had some kind

    Of logical processor and you could say right if I do this I expect to see these results and if that returns true so all all true here is saying that all of these values are right um then that will return true and say this is past past

    True becomes past false becomes fail and um if you run patch on one of the C test files it’ll open What’s happen no I don’t what are you doing then it’ll do a Windows thing and not work like it did for Mac Echo okay ha comparison failed yeah yeah okay so get
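
    As a rough idea of the shape of such a test, and assuming I've remembered the Cmajor test-file format correctly (a "## testFunction()" section runs each bool-returning function as a pass/fail check), it looks something like this:

        ## testFunction()

        bool delayLengthForA440IsCorrect()
        {
            // pure logic, fully self-contained: 48000 / 440 truncates to 109 samples
            let sampleRate = 48000.0f;
            let frequency  = 440.0f;

            return int (sampleRate / frequency) == 109;
        }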

    That in a minute so I’ll run test and I’ll show you that one of them has passed which will be this test Vector out one because we can follow that through logically see it works so test function good for logical processing the other is this run script

    One so we’ve all been using CJ from the um vs code extension there’s also um terminal ways to run it this is one such way which is essentially giving it some parameters so sample rate um block size length things and it’s talking about this test one

    Folder so this test one folder we’ve got a couple of bits of Json so here’s some kind of there’s a midi note there’s where it’s occurring there’s PR filter PR feedback then a WB file that is what we expect to get out so this is essentially doing a sample by

    Sample comparison that that only works because this is fully deterministic so if the it could not be depending on the noise generator we use um the the c major standard Library noise generator is deterministic so it’ll produce the same thing every time some other ones aren’t they produce

    Better Randomness but they don’t have this useful feature for testing um so yeah so this is going to entally trigger it as defining those Json compare it to this and say that it’s not working so yeah Max dbd exceeded all these thresholds probably to do with that 0.1 gain that um was

    Floating around let’s see what was that 99 maybe what no can’t remember what the settings were when this test was defined which is no I definitely don’t um yeah so essentially that’s kind of the other way of testing audio if you’ve got it fully software fully deterministic you can do a sample

    Sample by sample comparison it’s kind of the most stringent test so if you wanted you could kind of break it down a bit more maybe do one a third octave band comparison with some threshold and it’s kind of it will beish the right thing which is kind of a bit more lenient but

    Yeah and that would probably help us avoid this eror where can’t remember the settings that we Define the thing at um and yeah I’m I’m not going to go further because I’ll be here for 50 minutes tomorrow going way too far um so yeah that’s here what I wanted

    Thank you. OK, so that is the end of the content we have prepared for you. I'm just going to go back to the slides; we spent time on the slides, so I will show them. These are the unit-testing ones we just went through. So yeah, thank you all for coming. We've got about fifteen minutes left of this workshop before we go to lunch, which I assume is the next part of the day. I'm around for any questions, and you can ask anyone here, but just play with the patch and have some fun with it. And if there are any more questions online, please feel free to ask them, because I will be monitoring the laptops as well. So yeah, thank you very much, guys, and thanks for listening. [Applause]
