Life, Streamed
How long does your brain take to buffer? And what’s that got to do with your big toe?
If you get onto a bus anywhere in Tamil Nadu, be prepared for music.
Private and state-run buses alike have speakers throughout their length, although the private speakers are generally more numerous and louder. You’ll see tiny ones anchored above the luggage rack, with an occasional big box taking the place of a bag. Under the seat is another convenient luggage spot — but some private buses have speakers there as well.
What this means is, you’ll have a guaranteed high-volume “surround sound” experience of cinema songs, every time you take a trip.
It’s not limited to songs. Sometimes, you get the cinema itself. Every bus has two TV screens behind the driver’s seat: there’s one on the right side for the right-hand row of passenger seats, and a similar one on the left. When the conductor’s in the mood, or if enough passengers request it, these screens will play back a full-length movie. With full sound through the army of speakers, of course.
Movies used to be something you’d go to the theatre to watch. People would plan in advance, and go in a group with friends or family, to take part in an experience shared with the rest of the audience.
Cinemas are still there, of course, but viewing is more often a private affair. The rise of the TV meant more families watched movies at home, though they still had to be seeing them at the same time as everyone else. VCRs and “Home Theatres” allowed viewers to drift apart in time as well as space. And of course, in the Netflix era, it’s often done alone on a laptop.
But just because personal viewing has gone up doesn’t mean other kinds of viewing have gone down. Perhaps people are simply watching more movies and TV shows than they used to.
A group of people once came to my friend Nritya’s house to watch a movie. But the group was large and the screens were small: the only ones available were laptops. It would be too uncomfortable for all half-a-dozen of them to crowd around one screen.
So, they decided to crowd around two.
The movie file of Pancharangi was loaded onto both laptops — which were then placed side-by-side, and played back bus-style.
A neat solution, but there’s still a problem: it’s difficult to start both laptops playing at exactly the same time. That’s because the setup is slightly different from buses: there, there aren’t two devices playing the movie. There’s just one device attached to two screens (not to mention dozens of speakers).
Making the two devices play at the same time as each other is, naturally, harder than making one device play at the same time as itself.
The human brain is very sensitive to timing. In experimental setups, it can distinguish sounds as close as five milliseconds apart.
In practical terms, that means the ‘dual-laptop’ setup will almost certainly give you an echo effect, what with their different response times and the sluggishness of their respective VLC Media Players.
On a subconscious level, things work even faster. When you hear the horn of a leaving bus, the sound usually reaches one ear slightly earlier than the other. Your brain instantly notices this sound-difference, and uses it to calculate exactly where the noise came from (and whether you might have just missed your bus).
Sound travels at 340 metres per second. So the difference between your ears can be as small as nine millionths of a second.
Light and sound travel at different rates. That means, for faraway events, they’ll be slightly desynchronised. Imagine someone washing clothes by beating them on a rock. If you’re watching them from about 100 metres away, the light will reach you before the sound — by 300 milliseconds.
You’ll first see the cloth slap against the rock, then notice a slight delay before the corresponding sound arrives.
It works the same way for thunder and lightning, which is how you can estimate the distance of a lightning-strike by counting the seconds before thunder.
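The counting trick is just arithmetic: sound covers about 340 metres every second, so multiply the counted seconds by 340 (or divide by three for a rough answer in kilometres). A minimal sketch of that calculation, using the 340 m/s figure from above:

```python
SPEED_OF_SOUND = 340  # metres per second in air, roughly

def lightning_distance(seconds_to_thunder):
    """Estimate how far away a lightning strike was, given the
    delay between seeing the flash and hearing the thunder.
    (The light's own travel time is negligible by comparison.)"""
    return SPEED_OF_SOUND * seconds_to_thunder  # distance in metres

# A three-second count puts the strike about a kilometre away:
print(lightning_distance(3))  # 1020 metres
```

The same sum, run backwards, gives the washing-on-the-rock delay: 100 metres divided by 340 m/s is roughly 300 milliseconds.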
When TV broadcasts first started out, people would spend a lot of effort making sure the sound and images matched up. This wasn’t a problem for live broadcasts, but the recorded ones had audio and video stored on two different tapes. So they had to line them up precisely — or so they thought.
Then, they realised it didn’t really matter.
If the sight-and-sound gap is less than a hundred milliseconds, you don’t notice it. The two events seem to happen at exactly the same time.
Sometimes, it’s good not to notice things. Then you don’t have to worry about them.
Expert musicians, with their perfect sense of pitch and tone, can make brilliant music — but the skill comes with a cost. These people are forced to endure the un-musicalness of everyday life: they have to suffer every off-key whistle, every flat-note car horn, and every diaphragm failure of a state-bus speaker system.
Other people, less sensitive to these subtleties, don’t even notice there’s a problem. (You can simulate this on a bus by wearing earplugs).
It’s because you don’t notice a hundred-millisecond gap that you can enjoy slightly-off TV. And it’s why Nritya’s ‘dual-laptop’ problem has a simple solution: put one laptop on mute, and let the other play at full volume.
Slightly-desynchronised sound may clash, but desynchronised visuals don’t echo.
Hold on a minute — that doesn’t make sense. If your brain can tell sounds five milliseconds apart, why doesn’t it notice a sound-visual gap that’s twenty times larger?
The answer is: your brain does notice the gap. But it decides not to show it to you.
Gaps aren’t the only thing your brain edits out. If you want to see your brain’s editor in action, neuroscientist David Eagleman has a simple experiment for you to try out.
“Look at your own eyes in a mirror,” he says, “and move your point of focus back and forth so that you’re looking at your right eye, then at your left eye, and back again. Your eyes take tens of milliseconds to move ballistically from one position to the other.” But here, he says, is the mystery —
You never see your own eyes move.
Your senses don’t work at the same pace. Signals from your ears have a hotline to your brain — that’s why races start with the bang of a pistol rather than, say, a flash of light. Eyes, nose, and tongue take more time to push their notifications to your brain. And touch is the slowest of all, because its signals have to travel all the way up from the tip of your big toe.
Now, it wouldn’t do to have the different senses coming in at different times, would it? Imagine if you first saw your finger touch your toe, and then got the contours of your toenail on your finger, and finally felt the finger on your toe several moments later. And imagine if these desynchronised trickles of feedback continued, one after another, for every moment of your life.
Of course, you may say that you’d eventually get used to it.
Then again, that’s exactly what your brain’s done: gotten used to it. That’s why it waits for all the sensations to come in, and passes them to you in one neat chunk. And while doing so, some desynchronised signals get put back in step as well.
Your conscious mind is always living in the past. You don’t experience things as they happen, but after a delay, like a livestream broadcast that plays back only after buffering.
How far back in time you live depends on the distance from your brain to your big toe — or wherever your slowest sensory signal comes from. For humans, it’s about half a second: more if you’re tall, less if you’re short. Fruit-flies have a shorter delay than dogs, while elephants live even further in the past than humans do.
Of course, some signals are too important to wait for: that’s why you have instinctive movements when you react “before you even know it”.
But a lot can happen in half a second. Is it a valuable trade-off? Evolution seems to think it is. And, when you think about it, you’d be living in the past anyway.
Light takes three microseconds to travel a kilometre. The distant horizon appears to you as it was fifteen microseconds earlier. Look at the Moon, and you see things from more than a second ago; gaze at the Andromeda galaxy and you gaze at a time when humanity was still in the Stone Age. Look far enough, and you’ll eventually get close to the beginning of the Universe.
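All these delays come from the same division: distance over the speed of light. A quick sketch checking the figures above (the horizon distance of five kilometres and the Moon distance of 384,000 kilometres are approximate assumptions):

```python
SPEED_OF_LIGHT = 300_000  # kilometres per second, roughly

def light_delay(distance_km):
    """Seconds that light takes to cover the given distance."""
    return distance_km / SPEED_OF_LIGHT

print(light_delay(1) * 1_000_000)    # one kilometre: ~3.3 microseconds
print(light_delay(5) * 1_000_000)    # horizon at ~5 km: ~17 microseconds
print(light_delay(384_000))          # the Moon: ~1.3 seconds
```

For Andromeda, the division is easier done in light years: at about 2.5 million light years away, its light is 2.5 million years old by the time it reaches you.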
Any distance you look is not just a distance in space, but also in time. The further you see, the further back you go. Of course, the picture will be smaller and more blurred as well.
If you want a clearer picture, try the TV.