
VID102 DAY 1

Class Schedule
Course Overview
Syllabus Review
Course Site
Contact Sheets
Lecture
  Careers in Audio
  Principles of Sound
  Discussing Soundtracks
  What is Sound Design?
  Acoustics and Psychoacoustics
  Discussing Soundtracks
  Equipment Policies

Questions
What would you like to get out of this class?
What are you most interested in learning about?
Why are you taking this class?

Course Outline
Weeks 1-2: Principles of Audio
Weeks 3-4: Production Audio
Week 5: Midterm
Weeks 6-10: Post Production Audio
Week 11: Final

Requirements for Success
TEXTBOOK: Sound in Media, 9th Edition
NOTES / COURSE WIKI: Vid102.wikispaces.com, bbrownsound.com, Moodle (coming soon)
ASK QUESTIONS!!!!!

Questions
What does the term soundtrack mean?
If a tree falls in a forest, does it make a sound?
What is sound design?
What are stems?
Why is sound important to film?

Careers in Film Audio
Production Mixer
Boom Operator
Utility
Sound Supervisor
Librarian
Field Recordist
SFX, Foley, DX, ADR, MX – Recordist, Editor, Mixer
Foley Artist
Rerecording Mixer
Engineer
Transfer

Audio Related Fields
Aerospace, industrial, and military development and applications
Audio equipment design, manufacturing, and distribution
Audio in medicine, security, and law-enforcement applications
Audio/visual media for industry, government, and education
Broadcast station and network operations
Communications systems, such as telephone, satellite, and cable TV
Consumer audio equipment, sales, and services
Motion picture distribution and exhibition
Professional audio equipment, sales, and services
Record preparation and distribution
Sound equipment rentals and installations

Names to Know…
Walter Murch
Ben Burtt
Randy Thom
David Stone
Steve Flick
Gary Rydstrom
Greg Hedgepath
Ren Klyce

Films to Know…
Star Wars
The Godfather
WALL-E
The Conversation
Apocalypse Now
Dracula
Eraserhead
Terminator 2
Inception
The list goes on…

1. Emotion (51%)
2. Story (23%)
3. Rhythm (10%)
4. Eye-trace (7%)
5. Two-dimensional plane of screen (5%)
6. Three-dimensional space of action (4%)

A film's soundtrack is defined as the combining of all the stems into a finished mix.
Stems are the component parts of a film soundtrack.
The following chart outlines the stems of a film soundtrack.

The Soundtrack: Stems
Film Sound
  Dialogue: Production, ADR
  Music: Source, Underscore
  Effects: Foley, Hard, Design, BGs

Understanding Sound
Propagation: the way in which a sound radiates from a source and interacts with its environment
Capture: recording the sound to a medium
Editing
Mixing
Analog vs. Digital

Question
Before we can begin discussing creative ways to use sound in film, we must first understand:
How sound is created
How we can capture sound
The way sound guides our emotions

Hearing
The range of human hearing is roughly 20 Hz – 20 kHz.
Hearing and listening are two separate tasks.
Pitch is our perception of frequency, which is measured in Hz.
Frequency is the number of cycles per second.
Wavelength: lower frequencies have longer wavelengths than higher frequencies (see the sketch below).
Equal loudness curves: at louder levels we perceive frequencies more equally; at lower volumes the mids seem accentuated while the highs and lows are rolled off.
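Since wavelength is simply the speed of sound divided by frequency, a quick calculation shows how much longer low-frequency waves are than high-frequency ones. A minimal sketch, assuming the ~344 m/s speed of sound quoted later in these notes:

```python
# Wavelength = speed of sound / frequency.
SPEED_OF_SOUND = 344.0  # meters per second, room-temperature figure used later in these notes

for freq_hz in (20, 100, 1_000, 10_000, 20_000):
    wavelength_m = SPEED_OF_SOUND / freq_hz
    print(f"{freq_hz:>6} Hz -> wavelength {wavelength_m:8.3f} m")

# 20 Hz is about 17 m long, while 20 kHz is under 2 cm --
# one reason low frequencies bend around obstacles and feel less directional.
```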

Hearing Continued
Binaural: hearing with two functioning ears; localization relies on ILD (interaural level difference) and ITD (interaural time difference) — see the sketch below.
Hearing vs. listening: we are always hearing, but not always listening.
Cocktail party effect: the ability to focus on a single source of audio and filter out others.
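ITD is just the extra travel time a sound needs to reach the far ear. A rough back-of-the-envelope sketch, assuming roughly 0.18 m between the ears (an assumed, approximate figure) and the same 344 m/s speed of sound:

```python
# Rough upper bound on interaural time difference (ITD):
# extra path length to the far ear divided by the speed of sound.
EAR_SPACING_M = 0.18   # assumed approximate ear-to-ear spacing
SPEED_OF_SOUND = 344.0  # m/s

max_itd_s = EAR_SPACING_M / SPEED_OF_SOUND
print(f"Maximum ITD ~ {max_itd_s * 1000:.2f} ms")  # roughly half a millisecond

# The brain uses this sub-millisecond timing cue, plus the level
# difference (ILD), to localize sounds to the left or right.
```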

Lower frequency sounds are less directional and therefore more likely to scare us

Higher frequency sounds are more directional

Frequency

Loudness
Volume/loudness is our perception of amplitude.
Level is a measurement of the loudness of a sound in dB, typically expressed as dB SPL.
SPL = Sound Pressure Level.
Threshold of hearing: 0 dB SPL. Threshold of pain: about 130 dB SPL (see the sketch below).
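dB SPL compares a measured sound pressure to the reference pressure at the threshold of hearing (20 micropascals). A minimal sketch of that conversion:

```python
import math

# dB SPL = 20 * log10(p / p0), with p0 = 20 micropascals (threshold of hearing).
P_REF = 20e-6  # pascals

def db_spl(pressure_pa: float) -> float:
    """Convert a sound pressure in pascals to dB SPL."""
    return 20.0 * math.log10(pressure_pa / P_REF)

print(db_spl(20e-6))   # 0 dB    -> threshold of hearing
print(db_spl(1.0))     # ~94 dB  -> a common calibration level
print(db_spl(63.2))    # ~130 dB -> around the threshold of pain
```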

Wave Forms
Basic wave forms: Sine, Square, Triangle, Sawtooth (see the sketch below).
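For reference, all four basic wave shapes can be generated from a single phase value. A minimal sketch in plain Python, not tied to any particular course software:

```python
import math

def basic_waveforms(phase: float) -> dict:
    """Return one sample of each basic wave shape for a phase in [0, 1)."""
    sine = math.sin(2 * math.pi * phase)
    square = 1.0 if phase < 0.5 else -1.0
    triangle = 1.0 - 4.0 * abs(phase - 0.5)   # ramps -1 -> +1 -> -1 over the cycle
    sawtooth = 2.0 * phase - 1.0              # ramps -1 -> +1, then jumps back
    return {"sine": sine, "square": square, "triangle": triangle, "sawtooth": sawtooth}

# Print a coarse table over one cycle.
for i in range(8):
    phase = i / 8
    samples = basic_waveforms(phase)
    print(f"phase {phase:.3f}: " + ", ".join(f"{k}={v:+.2f}" for k, v in samples.items()))
```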

Sound Radiation
Sound comes from a source (direct) and interacts with the space.
Inverse Square Law: for each doubling of distance from the source, the intensity of a sound is cut to a quarter (a drop of about 6 dB; see the sketch below).
Compression
Rarefaction: the opposite of compression
Reflection
Absorption
Resonance
ADSR: Attack, Decay, Sustain, Release
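A minimal sketch of the inverse square law in decibel terms: each doubling of distance quarters the intensity, which works out to roughly a 6 dB drop.

```python
import math

def level_drop_db(distance_ratio: float) -> float:
    """dB change in sound pressure level when moving from distance d to d * distance_ratio,
    for an ideal point source in a free field (inverse square law)."""
    return -20.0 * math.log10(distance_ratio)

for ratio in (2, 4, 8):
    print(f"{ratio}x the distance -> {level_drop_db(ratio):.1f} dB")
# 2x -> -6.0 dB, 4x -> -12.0 dB, 8x -> -18.1 dB: intensity is quartered per doubling.
```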

Phase
Two sounds in phase will add; two sounds 180 degrees out of phase will cancel each other (see the sketch below).
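A minimal numeric sketch of phase: summing two identical sine waves in phase doubles the result, while a 180-degree offset cancels it.

```python
import math

def sine(phase_cycles: float, offset_degrees: float = 0.0) -> float:
    """One sample of a unit sine wave, with an optional phase offset."""
    return math.sin(2 * math.pi * phase_cycles + math.radians(offset_degrees))

for i in range(4):
    t = i / 8  # a few points along the cycle
    in_phase = sine(t) + sine(t, 0.0)        # identical waves add (doubles)
    out_of_phase = sine(t) + sine(t, 180.0)  # 180 degrees apart cancels (~0)
    print(f"t={t:.3f}  in phase: {in_phase:+.3f}   180 deg out: {out_of_phase:+.6f}")
```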

Dynamic Range
Dynamic range: the measurement from the quietest point to the loudest point of program material.
Peak vs. RMS metering: measuring the single highest point vs. a running root-mean-square average of the signal (see the sketch below).
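A minimal sketch of the peak vs. RMS distinction using a short made-up burst: a peak meter reports the single highest sample, while RMS averages the energy over the whole passage.

```python
import math

def peak(samples):
    """Highest instantaneous absolute value (what a peak meter tracks)."""
    return max(abs(s) for s in samples)

def rms(samples):
    """Root mean square: an energy average, closer to perceived loudness."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A mostly quiet signal with one brief spike.
signal = [0.1 * math.sin(2 * math.pi * i / 50) for i in range(500)]
signal[250] = 0.9  # single transient

print(f"peak = {peak(signal):.3f}")   # ~0.9  (dominated by the spike)
print(f"rms  = {rms(signal):.3f}")    # ~0.08 (the spike barely moves the average)
```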

Noise
Noise is unwanted sound.
Pink and white noise are used for testing.
Noise floor: the inherent noise of the space and of the equipment, which will always be present.
S:N (signal-to-noise ratio) — see the sketch below.
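Signal-to-noise ratio is usually quoted in dB by comparing the signal level to the noise floor. A minimal sketch using hypothetical example voltages:

```python
import math

def snr_db(signal_rms: float, noise_rms: float) -> float:
    """Signal-to-noise ratio in dB from RMS voltages (or any like-for-like amplitude)."""
    return 20.0 * math.log10(signal_rms / noise_rms)

# Hypothetical figures: 1.0 V of program material over a 1 mV noise floor.
print(f"S:N = {snr_db(1.0, 0.001):.0f} dB")   # 60 dB
# Lowering the noise floor by a factor of 10 buys another 20 dB.
print(f"S:N = {snr_db(1.0, 0.0001):.0f} dB")  # 80 dB
```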

Sound Characteristics

Compressions and rarefactions are molecular disturbances in the medium.

Basic Acoustic Comparisons
Pitch (perceptual) – Frequency (physical)
Loudness (perceptual) – Amplitude (physical)

Waveforms
Transverse
Longitudinal
Periodic
Complex Periodic
Random or Aperiodic

Waveform Characteristics
Frequency
Amplitude
Wavelength
Velocity
Envelope
Harmonics
Surface Effects and Propagation

Frequency Defined
Cycles
Hertz
Range of Human Hearing: 20 Hz – 20,000 Hz (20 kHz)

Amplitude Defined

Atmospheric pressure
Volume is our perception of amplitude

Root Mean Squared (RMS)

For a sine wave, RMS = 0.707 × peak value (see the sketch below).
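The 0.707 factor is just 1/√2, and it holds for a sine wave specifically. A minimal numeric check:

```python
import math

# RMS of a pure sine wave equals peak / sqrt(2) ~= 0.707 * peak.
peak = 1.0
samples = [peak * math.sin(2 * math.pi * i / 1000) for i in range(1000)]
rms = math.sqrt(sum(s * s for s in samples) / len(samples))

print(f"measured RMS = {rms:.4f}")          # ~0.7071
print(f"0.707 * peak = {0.707 * peak:.4f}") # 0.7070
```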

Decibels and Intensity
Bel
Intensity measured in watts per meter squared (W/m²)

SPL and SIL
The decibel scale (see the sketch below)
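Both scales are logarithmic comparisons against a reference value. A minimal sketch of the intensity version (SIL); the pressure version (SPL) was sketched earlier under Loudness:

```python
import math

# Sound intensity level (SIL): 10 * log10(I / I0), with I0 = 1e-12 W/m^2.
# (SPL uses 20 * log10 of pressure instead, as shown earlier.)
I_REF = 1e-12  # W/m^2, reference intensity at the threshold of hearing

def sil_db(intensity_w_m2: float) -> float:
    """Sound intensity level in dB relative to the threshold of hearing."""
    return 10.0 * math.log10(intensity_w_m2 / I_REF)

print(f"{sil_db(1e-12):.0f} dB")  # 0 dB: threshold of hearing
print(f"{sil_db(1e-6):.0f} dB")   # 60 dB: roughly conversational level
print(f"{sil_db(1.0):.0f} dB")    # 120 dB: painfully loud
```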

The Law of Conservation of Energy – Inverse Square Law


Speed of Sound
Standard value: approximately 344 m/s at room temperature.
Temperature dependence: v ≈ 331 m/s + (0.6 m/s per °C) × temperature in °C (see the sketch below).
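A minimal sketch of the temperature dependence above, using the common linear approximation:

```python
def speed_of_sound_m_s(temp_c: float) -> float:
    """Approximate speed of sound in air: ~331 m/s at 0 C, plus ~0.6 m/s per degree C."""
    return 331.4 + 0.6 * temp_c

for temp_c in (0, 20, 21, 30):
    print(f"{temp_c:>3} C -> {speed_of_sound_m_s(temp_c):.1f} m/s")
# 21 C gives ~344 m/s, matching the 'standard' figure quoted above.
```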

Envelope

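The envelope here is the ADSR shape defined earlier (Attack, Decay, Sustain, Release). A minimal sketch of a linear ADSR generator; all times and levels are made-up example values:

```python
def adsr(t: float, attack=0.01, decay=0.10, sustain=0.7, release=0.30, note_len=1.0) -> float:
    """Linear ADSR envelope value (0..1) at time t seconds.
    attack/decay/release are durations in seconds; sustain is a level; note_len is when the note ends."""
    if t < attack:                       # Attack: ramp 0 -> 1
        return t / attack
    if t < attack + decay:               # Decay: ramp 1 -> sustain level
        return 1.0 - (1.0 - sustain) * (t - attack) / decay
    if t < note_len:                     # Sustain: hold
        return sustain
    if t < note_len + release:           # Release: ramp sustain -> 0
        return sustain * (1.0 - (t - note_len) / release)
    return 0.0

for t in (0.005, 0.05, 0.5, 1.1, 1.4):
    print(f"t={t:5.3f}s  envelope={adsr(t):.2f}")
```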

Harmonics

Fundamental Frequency

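Harmonics sit at whole-number multiples of the fundamental frequency. A minimal sketch listing the harmonic series of a 110 Hz fundamental (A2), chosen only as an example:

```python
# Harmonics are whole-number multiples of the fundamental frequency.
fundamental_hz = 110.0  # A2, an arbitrary example fundamental

for n in range(1, 9):
    label = "fundamental" if n == 1 else f"harmonic {n}"
    print(f"{label:>12}: {n * fundamental_hz:7.1f} Hz")
# The relative strengths of these harmonics give each instrument its timbre.
```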

Surface Effects Reflection Diffusion/Scattering Absorption


Surface Effects Diffraction

Surface Effects Refraction

Check Out Procedures
All cage reservations must be made in advance of assignments.
Audio professor signature REQUIRED for use of the recording booth.
All equipment must be returned and any problems immediately reported.

For Next Class
Journal #1
Reading from Alten

VID102 DAY 2

Class Schedule
Turn in Journals
Discussing Soundtracks
What is Sound Design?
Acoustics and Psychoacoustics
Discussing Soundtracks

Models for Discussion
Chion's Modes of Listening
3 types: Causal, Semantic, Reduced

Stem Analysis

Sound Categorization (Chion)
Acousmatic (off-screen) sound
○ Sound one hears without seeing its originating cause – an invisible sound source, e.g., radio, phonograph, telephone.
○ Either we hear and then we see, or we see and then we hear.
The first case associates a sound with a precise image from the outset. This image can then reappear in the audience's mind each time the sound is heard off screen.
The second case, common to moody mystery films, keeps the sound's cause a secret before revealing all (de-acousmatization).
Visualized (on-screen) sound
○ Sound accompanied by the sight of its source or cause: an onscreen sound whose source appears in the image and belongs to the reality represented therein.

Sound Categorization (Chion)
Anempathetic sound: seems to exhibit conspicuous indifference to what is going on in the film's plot, creating a strong sense of the tragic.
○ A radio continues to play a happy tune even as the character who first turned it on has died.
○ In a very violent scene, after the death of a character, some sonic process continues, like the noise of a machine, the hum of a fan, or a shower running, as if nothing had happened (in Antonioni's The Passenger, the electric fan; in Hitchcock's Psycho, the running shower).
Empathetic sound: music or sound effects whose mood matches the mood of the action.
○ In Jonathan Demme's The Silence of the Lambs, when Jodie Foster visits Lecter in the dungeon, the ambiences are made of animal screams and noises. The room tone is a lunatic kind of screaming, processed, slowed down, and played in reverse.

Sonic Logic
Internal logic: continuous and progressive modifications in the sonic flow, making use of sudden breaks only when the narrative so requires.
External logic: editing that disrupts the continuity of an image or a sound.
○ Sudden changes of tempo
Synchresis: the forging of a bond between something one sees and something one hears – the mental fusion between a sound and a visual when these occur at exactly the same time. Synchresis is a word formed by telescoping together the two words synchronism and synthesis.
The possibility of reassociation of image and sound:
○ The sound of an ax chopping wood, played exactly in sync with a bat hitting a baseball, will "read" as a particularly forceful hit rather than a mistake by the filmmakers.

Chion's 3 Modes of Listening
Helpful in understanding relationships between recognizable sound sources and their visibility: Causal, Semantic, Reduced.

3 Listening Modes for Laymen
Causal: see a cow, hear a cow.
Semantic: interpreting meaning from a sound, e.g., Morse code or dialogue.
Reduced: putting a sound onto an object not associated with that sound, thereby creating a new meaning.

What It All Means
What should we focus on in the soundtrack in a given scene?
What stem does it belong to?
Does it match the edit? The visual?

Audio Principles
Audio vs. Sound: audio is the representation of sound; sound is the motion of particles in a medium.
Transduction: turning sound energy into electrical energy (microphone).
Recording vs. Mixing: recording is the capture of sound; mixing is the balancing and combining of sound.
Track vs. Channel: a track is space on a medium that contains audio; a channel is a specific path through which audio travels.

Principles Cont.
Signal Flow: the path that sound travels through a device.
Examples:

MAX/MSP

Mackie Mixer

ADC and DAC: Analog-to-Digital Converter, Digital-to-Analog Converter

Bit Depth: roughly 6.02 dB of dynamic range per bit; 16-bit vs. 24-bit.
○ About 96 dB vs. 144 dB of dynamic range (see the sketch below).
○ Dynamic range here is the number of steps between the quietest and the loudest portion of recorded material.

Sampling Rate: we need to take a snapshot at roughly twice the maximum frequency of what is being captured. Human hearing ranges from 20 Hz – 20 kHz; therefore we sample at 44.1 kHz or higher in order to capture and accurately reproduce the source.
Nyquist Theorem: you must sample at roughly 2x the maximum frequency of the source.

Digital Audio Basics
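A minimal sketch (with hypothetical helper names) pulling together the two rules of thumb above: dynamic range grows by about 6.02 dB per bit, and the sample rate must be at least twice the highest frequency you want to capture.

```python
def dynamic_range_db(bits: int) -> float:
    """Approximate dynamic range of linear PCM: ~6.02 dB per bit."""
    return 6.02 * bits

def min_sample_rate_hz(max_frequency_hz: float) -> float:
    """Nyquist: sample at least twice the highest frequency to be captured."""
    return 2.0 * max_frequency_hz

print(f"16-bit -> {dynamic_range_db(16):.1f} dB")   # ~96 dB
print(f"24-bit -> {dynamic_range_db(24):.1f} dB")   # ~144 dB
print(f"20 kHz audio needs >= {min_sample_rate_hz(20_000):.0f} Hz sampling")
# 44.1 kHz (CD) and 48 kHz (film/video) both clear the 40 kHz minimum with margin.
```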

Digital Audio Basics Cont.
Distortion – Analog vs. Digital: 0 dBFS – if you go above this, you will experience digital distortion.
Noise
Linear vs. Non-Linear Editing
Dynamic Range
PCM: Pulse Code Modulation
BWAV: uncompressed Broadcast WAV file
Dither: low-level noise added to help with quantization errors
Perceptual Coding
Artifacts

Methods of Transduction
Electromagnetic
Electrostatic/Capacitance
Ribbon

Signal Level Types
Microphone: weaker signal, requires a preamp; mic level is roughly 2 mV to 1.2 V.
Professional line: +4 dBu ≈ 1.23 V.
Consumer line: -10 dBV ≈ 0.316 V.
Speaker level: about 4 V.
(See the sketch below.)
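The voltages above follow from the dBu and dBV definitions (0 dBu = 0.775 V, 0 dBV = 1 V). A minimal sketch of the conversions:

```python
def dbu_to_volts(dbu: float) -> float:
    """dBu is referenced to 0.775 V RMS."""
    return 0.775 * 10 ** (dbu / 20)

def dbv_to_volts(dbv: float) -> float:
    """dBV is referenced to 1.0 V RMS."""
    return 1.0 * 10 ** (dbv / 20)

print(f"+4 dBu  = {dbu_to_volts(4):.3f} V")    # ~1.228 V, professional line level
print(f"-10 dBV = {dbv_to_volts(-10):.3f} V")  # ~0.316 V, consumer line level
```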

Connectors and Cables
XLR: balanced, 3 conductors.
¼ inch: 2 or 3 conductors; balanced or unbalanced.
Balanced: minimizes noise and RF interference, uses phase inversion, helps with longer cable runs.
Unbalanced: no such noise rejection; more susceptible to interference.

Headroom
Headroom is the amount of give between your average level and the point of distortion (see the sketch below).
Clipping is the point at which you exceed your headroom and the limitations of the medium, resulting in distortion.
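In digital terms, headroom is simply the distance in dB between your typical level and the clipping point (0 dBFS). A minimal sketch with made-up example levels:

```python
# Headroom = clipping point minus average operating level, in dB.
# In a digital system the clipping point is 0 dBFS.
CLIP_POINT_DBFS = 0.0

def headroom_db(average_level_dbfs: float) -> float:
    """How many dB of 'give' remain before clipping."""
    return CLIP_POINT_DBFS - average_level_dbfs

print(headroom_db(-20.0))  # 20 dB of headroom: transients have room to breathe
print(headroom_db(-3.0))   # only 3 dB: loud peaks are likely to clip and distort
```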

Signal to Noise Ratio
You want to have more signal than noise. Noise is inherent in the environment and in the equipment.

Way in which a microphone will pick up the sound

This is influenced by the type of transduction

Choosing the right polar pattern for the right situation

Polar Patterns

Sound Theory Types of Transduction (Microphones)

Describes the way in which the microphone converts a sound source into an electrical signal.

Speakers do the reverse: an electrical signal is turned into an amplified acoustic sound.
○ Electromagnetic
○ Electrostatic/Capacitance
○ Ribbon

Sound Theory Types of Transduction (Microphones)

○ Electromagnetic: a dynamic microphone uses electromagnetic induction. A small movable induction coil, positioned in the magnetic field of a permanent magnet, is attached to the diaphragm. When sound enters through the windscreen of the microphone, the sound wave moves the diaphragm. When the diaphragm vibrates, the coil moves in the magnetic field, producing a varying current in the coil through electromagnetic induction.

Commonly used to capture loud percussive sounds with very strong transients.

Does not require an external power source

Sound Theory Types of Transduction (Microphones)

○ Electrostatic/Capacitance: requires 48 V phantom power. The diaphragm acts as one plate of a capacitor, and the vibrations produce changes in the distance between the plates. Available in a variety of polar patterns. Good at capturing full-frequency sounds and the subtle dynamics of a performance; frequency response is better.

○ Ribbon: uses a thin, usually corrugated metal ribbon suspended in a magnetic field. The ribbon is electrically connected to the microphone's output, and its vibration within the magnetic field generates the electrical signal. Ribbon microphones are similar to moving-coil microphones in the sense that both produce a signal by means of magnetic induction.