
Sound 1

30 November 2008

Last week - wrap-up!!!

At the beginning of the semester, you asked 3 questions. I answered them a few times during the session, but in the end I started to think more about the notes you were giving us in class and the texts we had to read. For the last entry of my blog, I will answer the questions with a summary of what I learned in this course.

- Today, I am able to tell how a sound is recorded and transformed from analog to digital and then from digital to analog. Also, I know a lot more about microphones than I did when I first entered the class, which is really good. I know how they work in general, I can tell which one will be of better use in this or that kind of project, and I know how to choose them according to their fragility, power, etc.

- It would be a complete failure for me if I had not changed my way of thinking about and seeing sound after Sound 1, because then I would probably have learned absolutely nothing. This class really gave me a good idea of how people working with sound need to think and what they need to do in their jobs. Now I know a little bit about the studios, and about how, when and/or where to record which sounds, etc. This is a really big step into the sound world.

- This question I won't answer, because the last posts on my blog tell about what I learned about sound use in media.

To sum up, I liked this class a lot (and this is true, not just to make you happy hahaha) and I learned a lot rapidly, which is cool, because now I can start to use sound in a more professional manner. You let us do a lot by ourselves with our sound projects.
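Since I mention analog-to-digital conversion, here is a minimal sketch in Python (my own illustration, not something from the course) of the two steps I learned about: sampling the waveform at fixed instants, then quantizing each sample to a fixed number of levels. The 440 Hz tone and the CD-style rate and bit depth are assumptions for the example.

    import math

    # Minimal sketch of analog-to-digital conversion: sample a "continuous"
    # signal at fixed instants, then quantize each sample to a fixed bit depth.

    SAMPLE_RATE = 44100   # samples per second (CD quality)
    BIT_DEPTH = 16        # bits per sample

    def analog_signal(t):
        """Stand-in for the analog waveform: a 440 Hz sine tone."""
        return math.sin(2 * math.pi * 440 * t)

    def quantize(x, bits):
        """Round a sample in [-1, 1] to the nearest representable level."""
        levels = 2 ** (bits - 1) - 1   # e.g. 32767 steps each side for 16-bit
        return round(x * levels) / levels

    # Sampling: measure the signal at discrete instants t = n / SAMPLE_RATE.
    samples = [analog_signal(n / SAMPLE_RATE) for n in range(10)]

    # Quantization: each measurement becomes one of finitely many values.
    digital = [quantize(s, BIT_DEPTH) for s in samples]

    print(digital)   # digital-to-analog conversion smooths these back out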
30 November 2008

Sound Design in Radio/TV/Intermedia

We spent a lot of time talking about sound design in film, but we also talked about sound in radio, television and interactive media. First of all, let's remember that they all take their origins from "live" contexts and performances such as circuses, music halls, theater, ceremonies, etc.

- RADIO. Radio started in the 1920s. Today, there are 2 models of service for radio: public (created by the state) and community (funded by donors, institutions, etc.). The main goal of radio, which did not really surprise me, is to create a continuous flow of sound in our everyday lives. It is also to deliver the audience to advertisers. Naive are the ones who think it is only to communicate information to us about everything. Broadcasters pump up and compress the volume of the audio signal to the highest permissible levels, so the result is a strong, attention-getting sound flow. The linear radio flow is modular, which means that it superimposes different blocks of material like sports, news, weather, commercials, radio drama, music, phone-ins, magazine sequences, discussions, etc. Also, 5 forms of emphasis (which we will see again in TV) are used to solidify the sense of the flow's continuity. These forms are: sound address (explicit all the time, except in music, radio drama and documentaries), internal sound audience (live audience, applause, laughter, interviews), sound advance (chart countdowns, drum rolls), sound underlining (dramatic music under a dramatic scene, etc.) and sound labeling (sounds played before and after the news, traffic, weather, etc.).

- TELEVISION. It also aims to create a continuous flow, but with sounds AND images this time. It is supposed to keep the attention of distracted viewers. As I said, it uses the same 5 forms of emphasis, but a sixth one is added: sound completion (voices, effects, live performances, ...). Broadcasters, in TV too, pump up and compress the volume, but MOSTLY (and it is even more frustrating) for commercials, so that when they start, they attract attention because they are louder. Music video can be related to television because it also mixes sounds and images. There are 4 different approaches used in music video: pure performance (continuing presence), pure narrative (images present the "story" of the lyrics), mixed performance-narrative (often used in hip hop) and abstract-experimental (images and sounds don't have an obvious relationship).

- INTERACTIVE MEDIA, HERE WITH THE EXAMPLE OF VIDEO GAMES. It is interesting to see how it draws on all the media. I find that we can really see post-modernism in a video game, because it uses everything that has been created before and mixes it all up. It has elements of viewing a film, it has the same continuing-flow goal, it uses the narrative approach of a film, and it relies, as in radio and TV, on constant variety and an ongoing series of alternating or rotating actions. But video game creators need to think about a LOT of things. Because games are all computer-based, they bring preoccupations with memory (RAM, live memory, memory budgets...), rates (sample rate, bit rate, loops), MIDI, saving on effects, and details (in creating sound effects).

This is a really brief summary of what we learned about sound in radio, TV and interactive media, but it really helps me to have a good idea of what sound designers need to deal with.
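To make the "pump up and compress" idea concrete, here is a toy Python sketch of a dynamic-range compressor with make-up gain. The threshold, ratio and gain numbers are arbitrary values for illustration, not broadcast standards.

    # Toy dynamic-range compressor, sketching how broadcasters squeeze quiet
    # and loud passages closer together, then raise the overall level.
    # Threshold and ratio values here are arbitrary, not broadcast standards.

    THRESHOLD = 0.5    # level above which gain reduction kicks in
    RATIO = 4.0        # 4:1 compression above the threshold
    MAKEUP_GAIN = 1.8  # boost applied after compression ("pumping up")

    def compress(sample):
        level = abs(sample)
        if level > THRESHOLD:
            # Reduce only the part of the level that exceeds the threshold.
            level = THRESHOLD + (level - THRESHOLD) / RATIO
        out = level * MAKEUP_GAIN * (1 if sample >= 0 else -1)
        return max(-1.0, min(1.0, out))  # hard-limit to the permissible range

    quiet_and_loud = [0.05, 0.1, 0.4, 0.9, -0.95, 0.2]
    print([round(compress(s), 3) for s in quiet_and_loud])
    # Quiet samples come out much louder, loud ones barely louder: a denser,
    # attention-getting flow with less dynamic range.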
20 November 2008

Acoustic Ecology/Ethics and Copyright

I learned a lot about acoustic ecology (or ecoacoustics): I did not even know it existed before I read about it and heard about it from David. It's interesting to hear, even if we know it without knowing it, that our culture has really become an EYE culture, a visual culture. Our ears become less and less accurate, so we hear less and less of the sounds surrounding us. The WSP (World Soundscape Project) is a good thing and I hope it will help us be more aware of the sounds of our environment. Acoustic ecology tries to make us understand that the subjective (our feelings, thoughts) is more important than the objective (the noise pollution). Today, we try to hide behind music (a virtual sonic environment) and noise instead of trying to hear all the sounds around us. Soundscape composition is a way researchers have found to show us how rich and full of surprises the environment is. Through soundwalks, oral history, sound counting, architecture and urban planning, anthropology, etc., they are trying to let the world know about sound.

In the soundscape, there are keynotes (background sounds), which form the fundamental tonality around which the rest modulates. Then there are the foreground sounds, the ones that attract attention, and the soundmarks. These are the sounds we associate with a particular culture or population and that we naturally recognize. Also, we call a soundscape where the signals are clearly heard hi-fi (high-fidelity soundscape) and one where the signals are masked or incomprehensible lo-fi (low-fidelity soundscape). Sometimes, experts talk about SCHIZOPHONIA, which means the separation of a sound from its original source, as in recordings, for example. The members of the WSP include R. Murray Schafer, a professor, composer and musician, Barry Truax, also a professor at Simon Fraser University, and Hildegard Westerkamp, a soundscape composer.

In the readings, they also talk about ethics. The question of copyright is a big and controversial one. Should the industry focus on musical creativity or on copyrights? That's not easy to answer. Some companies encourage cultural diversity and are stopped or unfairly restricted by the current copyright protections. What can we do about that? Schloss, for his part, looks at hip hop artists more precisely and tells us what is acceptable and what is not. Acceptable: flipping or creating variations, parodies, homages, coincidences, the use of the same single drum beats... With vinyl, it is also acceptable to take samples from a record. NOT acceptable: when the producer recognizes or is inspired by the wholeness of the record, or when the composer takes more than one sample from the same artist or the same record, etc. This is not an easy thing.
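Here is a toy way (my own, not from Schafer or Truax) to picture the hi-fi/lo-fi distinction numerically: compare the level of a foreground signal with the level of the background keynote. The 10 dB cut-off is an arbitrary assumption for the sketch.

    import math

    # Toy illustration of the hi-fi / lo-fi distinction: compare a foreground
    # signal's level to the background (keynote) level. The 10 dB cut-off is
    # my own arbitrary choice for this sketch, not a figure from the readings.

    def ratio_db(signal_rms, background_rms):
        return 20 * math.log10(signal_rms / background_rms)

    def classify(signal_rms, background_rms, cutoff_db=10.0):
        if ratio_db(signal_rms, background_rms) >= cutoff_db:
            return "hi-fi (signal heard clearly)"
        return "lo-fi (signal masked by the background)"

    print(classify(signal_rms=0.5, background_rms=0.05))  # quiet countryside
    print(classify(signal_rms=0.5, background_rms=0.4))   # noisy street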
28 October 2008

Sound Design in Film

The more I hear about sound, the more I become aware of its components, issues and meanings in, for example, the movies and TV shows I watch, even though I had already learned a lot about it in my cinema classes before.

- I will not say more than I already have about my understanding of sound recording and its digital transformation into recorded data, because I did not concentrate my readings on that subject these last days, and the classes have not covered it since my last post.

- Of course, I cannot NOT mention the way I now see sound, because I have read so much about it lately. First of all, I had never realized until today how complicated sound is. Now that I know that its amplitude, frequency, timbre, wavelength, pitch, and so on are all related, I see why sound designers, directors or music/studio recordists spend so much time thinking about room shapes, dimensions, materials, etc.: to be sure that everything works well and in harmony with the rest. Also, I changed my way of thinking about sound when I started to work with ProTools for the last two assignments. You have to see the sound in its every detail, because if a sound is louder, different, or has higher or lower frequencies than the others, you need to open a box, whatever it is, and modify the problematic segment. To do so, you absolutely need to know how to use the EQ, the compressor, ProTools' tools, etc., and, alongside that, to know what to change and which box to open to change what. Sorry if it's not clear, but that is my way of seeing it sometimes.

- Now, the important part this week. The principal subject for the last 2 weeks was "sound design in film", so there is a lot to say about that question. Last week, I talked about my readings from Alten and Chion on sound design, but right now I will focus on the notes taken in class. Starting from the beginning (which is kind of repetitive sometimes, considering I spent 2 years in a cinema program, but everyone thinks about sound differently, so it makes it interesting to see it from another point of view. Also, film is really a domain I am passionate about, so I can hear about it all day long without being bored), we talked about the 3 basic types of sound in film: narrative, abstract/experimental, and performance-oriented. Then we saw a little timeline and continued with the pre-production, production and post-production considerations. These considerations are very important, because every dialogue or music track (to mention only these two) has to be DECIDED, THOUGHT OUT, NOTED and so on. Movie directors (along with sound designers and all the staff) do not just decide to record sounds that are randomly played or produced while they are filming: everything in a movie is wanted. All of that links immediately to the types of microphones (boom, plant or wireless body types), with their descriptions, and to the soundtrack's relationship with the picture. This relationship is, in my opinion, an important thing to think about. In fact, sound MEANS something in your movie, it SAYS something about it. You have to be sure of what you want to mean and say to the viewer when you capture and record a sound. For voices it's the same thing, even if you are, if I can say that, a little more "restricted": dialogue or narration.
The part I also like, but that we talk about less, is the sound editing and mixing (the part where you can rework and decide everything, which is the feeling I get when I do sound editing, for example for my 2 assignments), in other words the post-production. There, everything can be related to sound. What I mean by that is that, by playing with the sound, you can give the viewer a lot of different perspectives on your movie (we saw examples in class). If you want the viewer to realize that, for example, a character is having subjective thoughts, you can play with the reverb and/or the delay to make the sound feel further away, more abstract, etc. It has unlimited (or almost!?) possibilities!!!
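To show the kind of delay effect I mean, here is a minimal Python sketch of a feedback delay line. ProTools does this with plug-ins, of course; the delay time, feedback and mix values below are just illustrative assumptions.

    # Minimal feedback delay line, sketching the kind of effect (delay/echo)
    # that can make a voice read as distant or "subjective". The parameters
    # below are illustrative choices, not settings from class.

    SAMPLE_RATE = 44100

    def add_delay(dry, delay_ms=250, feedback=0.4, mix=0.5):
        """Mix the dry signal with delayed, decaying copies of itself."""
        offset = int(SAMPLE_RATE * delay_ms / 1000)
        wet = list(dry)
        for i in range(offset, len(wet)):
            wet[i] += feedback * wet[i - offset]   # each echo feeds the next
        return [d * (1 - mix) + w * mix for d, w in zip(dry, wet)]

    # A short impulse ("one word") followed by silence: the output contains
    # the original plus progressively quieter repeats, trailing off.
    signal = [1.0] + [0.0] * (SAMPLE_RATE // 2)
    processed = add_delay(signal)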
12 October 2008

Week 7 - READINGS

- I read chapter 5 from Alten, about the consoles and control surfaces, but I found it difficult to understand for two reasons: first, I am a visual person, so I would have needed to physically BE around a console to really get the way it works. Second, there are a lot of technical terms all at once for a person who does not know everything about sound. But it was important for me to read it and to get a general image of how it works, what it looks like and what the things are that you have to deal with when you use this kind of console. The parts about the features of a production console, the meters, the patching and the plugs will probably be useful for us if we continue in this field. To sum up, week by week I understand a little more about how sound is transformed and/or modified to give the best result it can.

- I think I always say the same thing about this second question, but it is still true after the readings I did in the last two weeks: the more you learn about this medium, the more it changes your way of thinking and working with it. For example, just with the assignment that we did, I understood how many sounds there are in our lives and how little attention we pay to them anymore. When you have to completely construct an atmosphere with non-verbal sounds, you see that our ears miss a lot. Also, I read the parts in Chion and Alten where they talk about the music and the sounds in movies and other media, and it made me realize that sound is a big part of the creation. But I will talk about that in the next question...

- First of all, in Chion, I learned a lot of new terms. The author taught me that sound really enriches an image. A sound, given with a particular image, can make you feel something, learn something, see something differently, etc. It changes our perception of what is simply there. When he talks about empathetic and anempathetic effects (in music), he makes us realize that music is there for a reason: to participate in the feeling of the scene, which is empathetic, or to stay indifferent to it, progressing in a steady manner, which is anempathetic and which, by contrast, intensifies the emotion. He explains how, without the sounds, we would be lost in the frames (e.g. in a kung-fu scene, maybe if there were not any BANG BING PAF sounds, we would not understand that a fight is happening). Then he talks about music in horror movies, saying that it is really important for the ambiance, and he gives a lot of important terms. A few of them that I find particularly useful:

- offscreen sound: sound whose source is invisible
- onscreen sound: the opposite (e.g. we see someone walk, so we hear steps)
- non-diegetic sound: the sound source is external to the story (e.g. narration)
- ambient (or territory) sound: envelops the scene
- objective-internal sounds: sounds of breathing, moans, heartbeats
- subjective-internal sounds: mental voices, memories
- pit music: non-diegetic music
- screen music: music arising from a source in the space and time of the action
- active offscreen sound: acousmatic sound (sound we hear without knowing where it comes from) that raises questions like "what is this?", "what's happening?"
- passive offscreen sound: creates an atmosphere that envelops and stabilizes the image; it provides the ear a stable place

There are, of course, lots of terms like these. Let's finish with my readings from Alten about sound design. The author talks about similar things, but he uses a more technical language, I believe.
For Alten, sound design is the process of creating the overall sonic character of a production, which I agree with. He explains that speech, sound effects and music are all produced with the same elements: pitch, loudness, timbre, tempo, rhythm, attack, duration, decay. He says that the ear sees and the eye hears, a sentence that I like, to make us realize that when we watch a movie, we do not really separate the sound and the image. For us, it is a whole. Through the sound, we understand what we see, and when you mix both, it gives you one result. If you change the sound, it will give you another result. If you change the image but keep the sound, it will give you yet another result, and so on. What I really liked in this chapter of his book is the part where he gives all the functions of sound effects. I cannot summarize all of them because there are 14, but it is amazing to see what sound can do (it breaks the screen plane, defines the space, focuses attention, establishes locale, creates environment, emphasizes action, intensifies action, depicts identity, sets pace, provides counterpoint, creates humor, symbolizes meaning, creates metaphor, unifies transition).
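To picture how those shared elements combine, here is a little tone-generator sketch (my own illustration, not an example from Alten's book) where pitch, loudness, attack, duration and decay are literally the parameters.

    import math

    # Toy tone generator illustrating a few of Alten's shared elements:
    # pitch (frequency), loudness (peak amplitude), attack, duration, decay.
    # This mapping is my own illustration, not an example from the book.

    SAMPLE_RATE = 44100

    def tone(pitch_hz, loudness, attack_s, duration_s, decay_s):
        n = int(SAMPLE_RATE * duration_s)
        samples = []
        for i in range(n):
            t = i / SAMPLE_RATE
            # Envelope: ramp up during the attack, ramp down during the decay.
            env = 1.0
            if t < attack_s:
                env = t / attack_s
            elif t > duration_s - decay_s:
                env = (duration_s - t) / decay_s
            samples.append(loudness * env * math.sin(2 * math.pi * pitch_hz * t))
        return samples

    # Same formula, very different characters:
    soft_flute_like = tone(pitch_hz=880, loudness=0.3, attack_s=0.1,
                           duration_s=1.0, decay_s=0.3)
    sharp_hit_like = tone(pitch_hz=220, loudness=0.9, attack_s=0.005,
                          duration_s=0.4, decay_s=0.3)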
25 September 2008

Week 3 - READINGS

This week I read Alten, chapters 2 and 3. In general, it is well-explained, clear and concise, so it is easier to understand what the author is talking about. A few times I did not get what he was explaining, but that is because of all the very technical terms.

- The third chapter answers the first question more effectively than the second chapter, because it explains everything that we need to know: how the room needs to be shaped to produce a better recording quality; what room sizes are used to record sound; how to isolate the sound; how to prevent unwanted noise; etc. It really tells us how sound travels in a space, so we can know how to find a good room to record in without losing all the richness and quality of the vibrations and reflections that we need, depending on what type of recording we do. Now I know that microphones have different ways of "hearing" the sound and that they need to be positioned in a way that does not interfere with the shape and composition of the room. Also, when you record, you have to think about the early and the reverberant sound that the room will produce. Sometimes, for example for radio shows, having a little bit more noise in the studio produces a nicer ambiance for the listeners. But, on the other hand, that is not true for, say, the recording of a rock band. The room dimensions have to be designed in relation to the sound outside versus the sound inside, or vice versa.

- Before I started to read about it, I knew that sound was a complicated thing to deal with. Now I think it is worse! By this, I do not mean that it does not interest me anymore, which is really not true. But now I can imagine how difficult it is to think about the room dimensions, the isolation, the reflections/reverberation/echo/early sound, and the noise all at the same time, just to produce a sound that our ears will hear perfectly clearly. What is kind of funny is that when we hear even a tiny bad sound or transient in a song or a TV show, we think: "wow, the sound editor is not good". Now, I will think about it before I say so, because sound is a hard medium. The thing I find most difficult in the readings is all the technical terms and the abbreviations of sound measures. I would like to have a table with all their meanings, just to be able to take a look at it whenever I want. The Hz, dB, ms, NC and co. lose me sometimes.

- For the third question, I cannot really say more than I did last time: the readings are not based on this subject (sound in film, TV, intermedia, etc.), but on more basic and general subjects. I will add to it next time. The only thing I learned about it is that in radio studios, they leave more of what is considered "noise" in the studio to produce a better ambiance (as I said earlier in my text).
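One formula that makes the room question concrete is Sabine's classic reverberation equation, RT60 = 0.161 * V / A, where V is the room volume and A the total absorption. The little Python sketch below applies it to an invented room; the dimensions and absorption coefficients are assumptions for illustration, not values from Alten.

    # Sabine's reverberation formula: RT60 = 0.161 * V / A, with V the room
    # volume in cubic metres and A the total absorption in sabins (surface
    # area x absorption coefficient). Room and coefficients are invented.

    def rt60(volume_m3, surfaces):
        """surfaces: list of (area_m2, absorption_coefficient) pairs."""
        absorption = sum(area * coeff for area, coeff in surfaces)
        return 0.161 * volume_m3 / absorption

    # Same 5 x 4 x 3 m room (60 m3, 94 m2 of surface), bare vs. treated:
    bare = [(94, 0.05)]                  # hard walls, floor, ceiling
    treated = [(54, 0.05), (40, 0.60)]   # part of the surface gets panels

    print(round(rt60(60, bare), 2), "s reverb - a 'live', roomy sound")
    print(round(rt60(60, treated), 2), "s reverb - drier, better for a band")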
25 September 2008

Week 3

This week we saw the different kinds of microphones, which was very interesting. I always see types of microphones on TV, at shows, at concerts, etc., but I never know which one is for what type of work. Learning that microphones act in the same way as my ears do is nice, and it helps me to understand the process that happens when a sound is played. The transients are the sharp changes in the sound waves; they are the sounds that we sometimes hear at the beginning of a song (a mistake, to be clearer). The "ribbon type" and the "dynamic/moving coil type" are both magnetic. The dynamic/moving coil type is used at live shows, for singers, for the drums or for percussive sounds with a sharp attack. The ribbon type is used with rich instruments and voices; it is useful for studio recording. The "condenser/capacitor type" is electrostatic: the electrical charge between the diaphragm and a charged plate changes with the movement of the diaphragm, and this microphone needs to be powered and amplified. There are a lot of pickup patterns, but we talked about the omnidirectional, the bidirectional, the cardioid and the shotgun. (If I knew how, I would have inserted images of them...) We also saw the miking techniques: A-B miking (which is when you use two spaced microphones, both pointed at the same source); X-Y miking (which consists in putting two microphones at a 90-degree angle to each other); and mid-side miking (which combines a microphone aimed at the source with a bidirectional microphone picking up the sides). And to complete all of this, we talked about how to avoid cancellations or bad phase while we are recording sound.
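For the pickup patterns, there is a neat bit of textbook math (not from our class notes) that covers three of the four: first-order patterns all follow gain(theta) = A + (1 - A) * cos(theta). The shotgun is an interference-tube design and does not fit this simple formula, so the sketch leaves it out.

    import math

    # First-order microphone pickup patterns all follow one formula:
    #   gain(theta) = A + (1 - A) * cos(theta)
    # where theta is the angle of the incoming sound. Omni: A = 1,
    # cardioid: A = 0.5, bidirectional (figure-8): A = 0. This framing is
    # textbook acoustics, not our class notes; shotguns are left out because
    # their interference tube does not fit a first-order formula.

    PATTERNS = {"omnidirectional": 1.0, "cardioid": 0.5, "bidirectional": 0.0}

    def gain(pattern, theta_deg):
        a = PATTERNS[pattern]
        return a + (1 - a) * math.cos(math.radians(theta_deg))

    for name in PATTERNS:
        front, side, rear = (round(gain(name, d), 2) for d in (0, 90, 180))
        print(f"{name:>16}: front {front}, side {side}, rear {rear}")
    # The cardioid rejects the rear (gain 0), the bidirectional rejects the
    # sides: this is why mic choice and placement interact with the room.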
9 September 2008

Week 1

With only the first class of sound, I have already learned a lot. I just finished college in creative arts, cinema profile, so I have heard about sound a lot (sound on film, obviously), but this class showed us how the ears perceive sounds, how they work, and what the different ranges are in the sounds that we hear every day. I know a little bit about, for example, how we CAPTURE sounds with a boom pole, or how we ISOLATE sounds when we are outside and there is wind, things like that. But nobody ever talked to me about how sound is MADE, how it works, etc. I cannot wait to do the practical exercises in class. Also, I downloaded Sonic Visualiser and, since then, I listen to my music with that program open. I am still discovering it, but from what I know, I like it.

- At this time of the semester, I don't really know how sound is transformed from an acoustic phenomenon into recorded data.

- I have not read a lot about sound yet, but just with the concepts that we talked about in class, I know that I will not think about it in the same way as I did before. I cannot, because now I know how it travels and runs in my ears and how strong it is (I understood it when I saw the different shapes appear in the chalk!). I will probably work with it in a better way, now that I am more aware of every single sound that is present in our everyday life.

- In film, sound is really, really important. It is what keeps the spectator on track and what makes a movie work. Without sound, there is no timeline, nothing to keep everything together. Directors can take sounds directly from the outside or directly from the movie set, but normally they add a lot of sounds from outside the set (what we call "post-synchro"), and then they mix it all together in post-production. There are off-set voices, direct voices, sounds taken with a boom pole, and all of them are generally recorded with a "mixette" (a kind of portable mixer/recorder) that the boom is plugged into. On television, it is somewhat different. First of all, the technicians work most of the time on sets, so it is easier to capture great sound on the first take. They use a boom directly on the take, which is something we do not always do on movie sets. This is what I have learned so far, briefly. In the coming weeks, I will keep writing on my blog, talk about my readings (which should start soon!) and talk about what I learn.
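Two of those "ranges" can be put into numbers. Human hearing spans roughly 20 Hz to 20 kHz, and loudness is measured in decibels relative to a 20 micropascal reference pressure; the example pressures in this little sketch are illustrative values I chose, not class data.

    import math

    # Two of the "ranges" from the first class, sketched numerically. Human
    # hearing spans roughly 20 Hz to 20 kHz, and loudness is measured on a
    # logarithmic decibel scale against a 20 micropascal reference.

    AUDIBLE_HZ = (20, 20_000)
    P_REF = 20e-6  # reference sound pressure in pascals (hearing threshold)

    def spl_db(pressure_pa):
        """Sound pressure level in dB relative to the hearing threshold."""
        return 20 * math.log10(pressure_pa / P_REF)

    def audible_frequency(hz):
        return AUDIBLE_HZ[0] <= hz <= AUDIBLE_HZ[1]

    print(round(spl_db(0.02), 1), "dB - roughly conversational speech")
    print(round(spl_db(2.0), 1), "dB - roughly a loud concert")
    print(audible_frequency(440), audible_frequency(30_000))  # True False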