Ok this is boggling my mind.
Frequency can be defined as how often something occurs.
If I create a 555 timer sound generator, for example, and use a potentiometer to modulate the time interval between pulses, then when those pulses start to run together, the pitch increases, due to what I assume is referred to as harmonic distortion.
The pulse itself is a discharge of energy, which I assume comes from the capacitor, and the pulse itself has its own pitch; it might be 500 Hz, for example.
An excerpt from an audio engineering book I am reading for school has this to say:
If one second is used as a reference time span, the number of fluctuations above and below the ambient condition per second is the frequency of the event, and is expressed in cycles per second, or Hertz. Humans can hear frequencies as low as 20 Hz and as high as 20,000 Hz (20 kHz).
How can they, in the same sentence, describe Hz as a frequency in time and then claim that we cannot hear anything below 20 Hz, when it is simply the time interval between pulses? 1 Hz is one cycle per second. A clock ticks, and I hear it tick. It ticks once per second, or at 1 Hz. Its pitch frequency may be 500 Hz, but I hear it one time per second (1 Hz).
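To put numbers on what I mean, here is a rough sketch of the clock-tick situation (assuming Python with numpy; the 500 Hz tone and the 10 ms tick length are just my guesses, not measurements):

import numpy as np

fs = 44100                          # sample rate, samples per second
t = np.arange(fs) / fs              # one second of sample times

# Each tick is a short 10 ms burst of a 500 Hz sine wave.
# 500 Hz is what I'm calling the pitch frequency of the tick itself.
tick = np.sin(2 * np.pi * 500.0 * t) * (t < 0.010)

# The array above is one second long and contains a single tick,
# so repeating it gives one tick per second: what I'm calling 1 Hz in time.
two_seconds = np.tile(tick, 2)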
This may not seem relevant, but I need to understand the difference between pitch frequency and time frequency.
Because I have a 555 signal generator putting out a square wave at 1 Hz, or 1 pulse per second, but its pitch is around 500 Hz, and I want to know how to make that tone 20 Hz or 1000 Hz in pitch.
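For reference, my rough understanding (which may well be the part I have wrong) is that in the standard astable wiring the output frequency is set by the two timing resistors and the capacitor, roughly f = 1.44 / ((R1 + 2*R2) * C). A quick sketch of that calculation in Python; the R1, R2, and C values below are made-up examples, not measurements from my circuit:

def astable_frequency(r1_ohms, r2_ohms, c_farads):
    # Common 555 astable approximation: f = 1.44 / ((R1 + 2*R2) * C)
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

# Made-up example values, not my actual circuit:
print(astable_frequency(1_000, 10_000, 10e-6))    # roughly 6.9 Hz with a 10 uF capacitor
print(astable_frequency(1_000, 10_000, 100e-9))   # roughly 686 Hz with a 100 nF capacitor

If that formula is right, then changing the timing capacitor or resistors is what would move the tone from around 500 Hz toward 20 Hz or 1000 Hz, but that is exactly the kind of thing I would like confirmed.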
What am I missing here? I am clearly lacking some piece of understanding. Unless pitch is determined by the size of the object discharging the energy.
And if that is so, then this book I am reading for school needed to add that piece of information to clarify it, as frequency in time and frequency in pitch clearly have two entirely different meanings.
Please help if you can, but if you are one of those people who are smart but don't actually have an engineering degree, please refrain. I want facts, not speculation. (No offense.)