One of a number of Galouye SF novels focusing on some form of perception, Dark Universe is told from the viewpoint of Jared Fenton, a young man and skilled hunter of "soubats", in a tribe called the Survivors living in the "Lower Level" - evidently the depths of a cave system. Light having been forgotten except in name, the Survivors find their way about using echoes: by a central "echo-caster" in their main dwelling area, and by rattling "clickstones" elsewhere. Their sense of smell is also well-developed. Jared has various problems - a forthcoming arranged marriage, his tribe's dwindling resources, and the incursion of foul-smelling wrinkly-skinned monsters that bring "screaming silent sound" - that lead him into an exploration, first meeting "zivvers" (humans who have evolved to see infrared), and finally the worldview-shaking reality that the monsters are clothed people from the outside world bringing light. There's a strong allusion to Plato's Allegory of the Cave.
From a perceptual standpoint, the end of Dark Universe, when Jared emerges from the caves and adapts to his newly-discovered sense of sight, is unfortunately nonsense. It's now known - Google Hubel and Wiesel if you must - that if eyes aren't exposed to visual input at a formative age, the neural equipment to process it won't develop. But the story is so well told and the details so well worked out, right down to the bureaucracy and ritual (among the Survivors, the "Misplacement of Bulky Objects" is a crime on a par with murder), that it's easy to suspend disbelief. I've mentioned other Galouye works in a previous post, "PK Dick, Ubik and conceptual breakthrough".
- Ray
There is an interesting post on The Philosopher's Magazine titled "The world of sounds" that delves into the differences between the seen and the heard. Here is one bit from it that seems relevant to your post:
" ... spatial hearing is not as accurate or as richly detailed as spatial vision. This relative impoverishment results from how information about space is extracted from signals at the two eardrums; vision, in contrast, extracts spatial information from two densely packed retinal arrays. Second, while sounds are experienced to be located in some direction, and perhaps at a distance, they are not themselves experienced to have a rich internal spatial structure. Compare this to the visual experience of seeing a face. The face seems to comprise different parts – a nose, eyes, lips – that stand in relatively precise and detailed spatial relationships. In part, what you see is determined by this internal spatial structure. In fact, visual objects are individuated and identified in virtue of their inherent spatial characteristics. In contrast, sounds and other things you hear do not auditorily appear to have densely detailed internal spatial structure."
I'm surprised that O'Callaghan doesn't mention the obvious reason for this. It's not "how information is extracted" - it's that the information is not there to extract. The spatial resolution achievable with any wave-transmitted energy is limited by its wavelength, and visible light has something like a millionth the wavelength of audible sound.
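A quick back-of-envelope check of that ratio (figures are standard physics, not from the post: speed of sound ~343 m/s in air, visible light ~550 nm):

```python
# Compare the wavelength of mid-audible sound to mid-visible light.
SPEED_OF_SOUND = 343.0  # m/s, in air at ~20 C

def sound_wavelength(freq_hz):
    """Wavelength in metres: lambda = v / f."""
    return SPEED_OF_SOUND / freq_hz

light = 550e-9                        # mid-visible green light, metres
sound = sound_wavelength(1000)        # 1 kHz tone -> ~0.343 m

print(f"1 kHz sound wavelength: {sound:.3f} m")
print(f"light/sound wavelength ratio: {light / sound:.1e}")
```

At 1 kHz the ratio comes out around 10^-6 - a millionth - and even at the 20 kHz top of human hearing it is only about 3x10^-5, so diffraction alone caps the spatial detail sound can carry.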