The atmosphere is still undergoing change. Alternate warm and cool periods lasting several centuries, as well as long spells of wetness and dryness, have resulted in stress to humans and their agricultural activities. During the past century, the petrochemical age, we have altered the constituents of the very air we breathe, and only in the past three decades have we come to realize the damage we have done. Now we are trying to restore the atmosphere to its former purity. We should not have left to unregulated industrial freedom the composition of our most valuable possession. (4)
This version of environmentalism is almost 40 years old. In some respects, the sentiment sounds dated. Ludlum writes in the wake of the U.S. Clean Air Acts of the 1960s and 1970s, which were concerned with specific sources of pollution like coal plants or combustion engines. From the current era’s perspective, the notion that we need to clean up the air sounds quite optimistic and doable. Today air is still a problem, but it’s a symptom. Mainstream attention has moved to the potential collapse of the entire planetary atmosphere that makes Earth conducive to life. “Dirty” is defined differently: not particular sites of pollution like factories, but a circulating background of atmospheric CO₂. Global CO₂ is far harder to address, because it is a dispersed marker of the entire planet’s polluting activities. The shift in the understanding of atmospheric pollution is a great example of how systems-level thinking can define a problem more completely, even as it makes the problem more daunting to address.
Sources
David Ludlum, The Vermont Weather Book (Vermont Historical Society, 1985).
I would love to read an account of creative blocks, of “being blocked” when you want to do something original, or engage in some sort of unstructured free inquiry. How have cultures and individuals understood that, what did they do about it, and why did it matter to them? “Writer’s block” is the most recognizable term in this area, although I wonder if it may be on the way to obscurity. I’ve never heard anyone speak of writer’s block with respect to the ephemeral text-based communication that makes up the bulk of writing today.
Then again, writer’s block (or any other condition of being blocked) never pointed to a stoppage of ordinary communication. No one, for example, gets blocked when talking to a friend or ordering dinner at a restaurant. This is because an essential part of being blocked is failing to make the leap from spontaneous and unremarkable speaking to an original expression. When a person is unblocked, he or she still communicates even when the cues from the environment and social world fall to a minimum.
But if someone simply can’t, for whatever reason, continue to speak from the other side of that transition, if he comes back with nothing to show for it, or if the attempt is so painful, interrupted, and shaming that it creates a negative feedback loop–producing ever-more dirt, gravel and sand when only the hint of something precious would make the effort continue–then I would call that person blocked.
I think a wider, cross-cultural account of this phenomenon would be worth studying, because I have the sense that it describes something much broader than a condition of artists and self-described “creative” types. The “many people are too distracted” argument gets a lot of attention; the condition of being blocked is really just a description of distraction from the other side.
“Watching the clouds” is an everyday shorthand for daydreaming, aimlessness and boredom. It’s as if the professionals who classify clouds have this prejudice in mind as they go about their work, and they defend themselves by building a maze of systematic classification on top of the clouds. Just look at the remarkably thorough, carefully designed website of the International Cloud Atlas. It has to be one of the best online references I’ve ever seen, on any subject–as intellectually satisfying as the casual viewing of clouds is for the senses and the imagination.
We’ve had a lot of storms and weather in the last few weeks here in Chicago, and I’ve had more reason than normal to look at the sky. For ordinary observers, it seems to me that there is really just one fundamental division in cloud typologies: between clouds that create depth in the sky, and clouds that obscure. The atmosphere is an amazing medium. At its clearest one can see indefinitely far: to the stars, into thousands of light years, distances so large they have no referent on earth. It can also close off vision, down to the few feet reachable with one’s hands, or less.
Even in a busy city sky, when I look up, I rarely see anything. But an open sky is not empty space. A view into the sky can be much more than a line of sight between “here,” the standpoint of the observer, and “there,” the furthest visible point. An opening in the sky is an invitation to see strata: layers, a space with undefined depth, like how any unit of time can become long or short according to events.
This picture taken a few days ago looks west, after a storm; the darker stratus clouds are just moving away, to reveal a large cumulus, a tower with puffs whose true size is impossible to gauge by looking. It could be the site of a floating city, an unoccupied landmass, the extension of a continent. This is what is distinctive about the cumulus tower: its heft hints at the depth contained in the sky.
Then, too, when the stratus clouds pull back, in the foreground, those almost-featureless specks are birds–I think they’re swifts. This sky is their playground, their home. It’s wide open to them, but they take only a sliver.
The last few years have produced many histories of the internet: histories
of the internet as a technology, as a product of social groups, as a
medium, and as a cultural force.
Here’s one history that I have seen less of, and that I would love to
see researched within a more conceptually unified scheme: all of the
technologies, societies, media and cultures that lost their viability
after the internet gained a mass foothold.
For example, the retreat from the ambition to be comprehensive, or a
certain ideal of comprehensiveness. Three examples:
A Metaphors Dictionary, a project that is pretty much what it sounds like. It was published by Elyse Sommer and Dorrie Weiss with Visible Ink Press. The final, reissued edition was published in 2001. Curiously, there was a companion guide to similes, which managed a second edition more than a decade later, in 2013.

The American Library Association Guide to Reference Books, a reference guide to all the other reference guides. Its purpose was to help librarians select from tens of thousands of specialized references for addition to their library collections. According to Wikipedia, the first one came out in 1902. The last one, the 11th edition, came out in 1996, after which the guide went online in 2009 before completely shutting down in 2016.

The long-running, celebrated, sometimes-contested Penguin Guide to Jazz Recordings, a massive compendium of jazz album reviews reaching back to the genre’s beginning. It was first published by Richard Cook and Brian Morton in 1992, and kept up through nine editions until 2008. Cook died that year, and while Morton put out another, smaller guide to the “1000 Best Albums” of Jazz in 2010, the project has seen no more updates since.
These works are so different that it is hard to make sense of them
within a single trend. What they share is the aspiration to catalog a
cultural object whose borders are so vague as to threaten the catalog
itself. A dictionary of metaphor? There are scholars who argue that all
of language is
metaphor. And a reference book for reference books? Can we first take a
few years to agree on the subject matter headings and the definition of
a “reference”? And a “guide to [insert musical genre here],” to say
nothing of two authors “reviewing” an entire genre?
I’d like to think that one could find a crowdsourced version of each
project, but this is likely not true. More probable is that one could
find most of the individual referents in each book somewhere online
(e.g., digital annotations for the metaphors in Shakespeare, a WorldCat
entry for a reference book, a Wikipedia entry for the musical albums),
sans the catalog.
The idea of “being comprehensive” about something, and the intellectual
power “to comprehend”: two words with the same root that express, it
seems to me, very different aspirations. The people who created the
above works probably had a different notion of what it meant to
comprehend something, and because of that they felt empowered to at
least entertain improbable ideas about comprehensiveness.
The last few days have seen a string of mostly clear weather in the
evening, just as the moon begins a new monthly cycle. One of the nice
things about looking at a very young moon (~2-5 days) is that it
appears low in the western sky around sunset (which occurs after 8pm
right now), at a time when the day is slowing down and it’s convenient to look
outside for a while. I started studying this cycle’s crescent moon with
binoculars when it was just over two days old.
A moon of that age is a beautiful sight: large enough to be easily seen,
but still very delicate, so thin at its sharp points (lunar observers
call them “horns”) that it is possible to imagine that the moon is made
up of a bunch of stars clustered near each other.
A feature on the horns caught my attention through the binoculars. At
the leading edge of the sunlight cast onto the moon, where the thinnest
possible segment of lighted lunar surface faces the earth, the moon’s
light becomes discontinuous. Through the binoculars it looks like four
or five bright lines or smudged dots, with short breaks, as black as the
rest of space, in between. Here is a rough diagram of the crescent moon
and where the effect appears—although it’s not visible to the eye
alone:
Through the binoculars, it’s not obvious what’s causing this. A little
digging gets an answer. The moon’s surface near its visible southern
pole is cratered and bumpy. The sun’s light, which strikes the surface
near the horn at a grazing angle, nearly 90 degrees from vertical, doesn’t
reach the bottom of every crater, canyon floor and plain. At the most extreme angles of
illumination, the sunlight only reflects off the highest points on the
lunar surface, and the shadows are so deep that huge patches of dark
occur right alongside the high points that are lit. This is what creates
the broken appearance at the southern horn.
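For a rough sense of scale, here is a minimal back-of-the-envelope sketch in Python (the crater-rim height and sun altitudes are illustrative assumptions, not measured values): near the horns the sun sits only a degree or so above the local horizon, and a rise of height h throws a shadow roughly h divided by the tangent of the sun’s altitude.

```python
import math

def shadow_length_km(feature_height_km, sun_altitude_deg):
    """Approximate length of the shadow cast by a rise of the given height
    when the sun stands at the given altitude above the local horizon."""
    return feature_height_km / math.tan(math.radians(sun_altitude_deg))

# Illustrative values only: near the horn of a two-day-old moon the sun is
# barely above the local horizon, and even a modest 1 km crater rim can
# black out tens of kilometers of terrain behind it.
for altitude_deg in (0.5, 1, 5, 30):
    print(f"sun {altitude_deg:>4}° up: a 1 km rim shadows "
          f"~{shadow_length_km(1, altitude_deg):,.0f} km of surface")
```

At grazing angles the shadows run dozens of times longer than the features casting them, which is why whole stretches of terrain near the horn go dark while the high points stay lit.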
If you were standing on one of these lit patches in the moon’s horn, the
sun would appear near the horizon, like it was just rising or setting.
Anyone who has seen dawn or dusk from a high point in the mountains
knows that the sun reaches the mountain peak long before the valleys.
This is true for the moon as well, but since a view from earth shows us
both the high and low points on the side of the moon that faces us, the
reflected sunlight makes the southern horn look a little bit broken,
like pieces of the moon have been punched out.
The edges of the moon’s surface, as seen from earth, are known as the
“limb,” and the patchy illumination that occurs around a crescent moon
is just one more reminder of how rugged the moon’s surface is. A
perfectly spherical body would show even lines
along its edges, but the moon is so craggy that you can make out its
spherical imperfections just by viewing it in profile from earth.
A young moon is also closest in position to a moon that blocks the sun:
aka, a solar eclipse. I went back to my notes from when I saw the 2017
North American eclipse from Kentucky, and it turns out I witnessed a
related phenomenon, known as “Baily’s Beads,” at that event. During the
eclipse, when the sun slides behind the moon, there are moments (a few
minutes at most) around the time of the total eclipse when the moon’s
broken terrain is unable to completely block the sun. This occurred in
the same region where I saw the areas of sliced-out sunlight over the last
few days: near the southern pole.
In places where spring seems to be a long time coming, there is a particular melancholy that sets in when one realizes that the season is, at least for aesthetic purposes, over. After the first round of new growth–the plants that cheer everyone up with their hardy blooms when it feels too cold for vegetative life–when the weeds and the trees look leafed-out enough to pass for mid-summer, when there are a few hot days that make one lose–even for a minute–one’s reflexive gratitude for the warm weather–that’s the end of psychological spring.
There’s another part to that feeling for me. I have a few small garden plots, and I also mark the moment when the seedlings I’ve started lose their compact, orderly form–beginning to stretch this way and that out of their own principles.
What is it that’s charming about seedlings? When they first emerge, they are pretty much all well-behaved. Next they show one, two, three of their true leaves, and I imagine that they will look straight and compact like this forever, only bigger. Of course this is not right. Soon, they will become unruly, stretch out of their pots, enter into tangled warfare with their neighbors, and nag at me that if I don’t do something with them soon, they will die or be stunted and I will have wasted the season.
In the spring, gardening is a rational task. Plans, maps, calendars–plants are at least potentially faithful to the winter vision.
Reasonable lines and grids contain early-season seedlings
But at some point around this time, in the transition to summer, I get a premonition of the chaos that is coming, and a sense that it is more powerful than I am. There ought to be a word for this phenomenon, the moment when a rational order gives way to organic spontaneity. There are probably gardeners for whom the early period is the best part of the season, when they feel most confident and fulfilled by their avocation. Once the plants are in control, it’s all downhill.
From Louis Menand’s The Free World: Art and Thought in the Cold War,
on the educational program of Black Mountain College:
Black Mountain College is famous for the number of artists and poets,
later prominent, who studied or taught there, but it was not an art
school. It was a college. It was launched during the Depression by a
renegade Classics professor named John Andrew Rice, who had been fired
from Rollins College in Florida, and for twenty-four years, it led a
hand-to-mouth existence in the foothills of the Blue Ridge Mountains
outside Asheville. In a good year, enrollment was sixty. The college
opened in the fall of 1933 with twenty-two students, fourteen of whom,
along with four of the faculty, had followed Rice from Rollins. To the
extent that finances permitted, and depending on who was available to
teach, it offered a full liberal education. Students could take
courses in science, mathematics, history, economics, psychology,
languages, and literature.
What made Black Mountain different from other colleges was that the
center of the curriculum was art-making. Students studied, and faculty
taught, whatever they liked, but every student was expected to take a
class in some kind of arts practice—painting, sculpture, pottery,
weaving, poetry, architecture, design, dance, music, photography. The
goal was not to produce painters, poets, and architects. It was to
produce citizens. “The democratic man,” as Rice explained his
philosophy, “… must be an artist.” Rice thought that people learn
best by doing, rather than by reading books or listening to lectures,
and he regarded art-making as a form of mental discipline. It instills
a habit of making independent choices, which is important in a
democracy. This was the pedagogy of progressivism, derived from the
educational theories of John Dewey, who visited the college frequently
and served on its advisory board.
Black Mountain College enrolled only a little more than a thousand
students in total, and closed in 1957 due to a lack of operating funds.
But I find so much that is imaginative, thrilling, and timely in this
capacious model of what the “liberal arts” could be. First, it makes art
useful–let us be very specific about why–because it creates people
who use their own imagination, who have confidence in their own
judgment, who form an independent point of view. This version of art,
both practical and empowering, is neither reductive (in the sense of
submitting art to a moral or political principle) nor damaging to the
pursuit of art as an end in itself. By committing to learn an art form,
to become–at least for a while–an artist, a person may be changed and
made stronger. This argument does not get
nearly enough airtime today.
Second, it’s an important reminder that the liberal arts needn’t be
limited to an agenda of academic study, as defined by the classical
tradition, and by the history of European and American university
systems. The liberal arts in their original form are not really what
most Western universities aspire to today, anyway: rhetoric and grammar,
for example, are rarely considered a core part of the educational
program. “The liberal arts” is really just another term for a generalist
education. And if they are hemmed in by traditional university curricula
today, that reveals a fixed, unresponsive view of what it means to be a
generalist.
There should be multiple liberal arts. If there can be one–as at Black
Mountain–that consists of making artists, why could there not be
another liberal arts for practical courses of study? Take the major
industrial products that make modern life possible, what Vaclav Smil
has recently called the “four pillars of modern civilization: cement,
steel, plastics and ammonia.” Wouldn’t it be worthwhile to spend a few
years studying how to make these materials–through hands-on
apprenticeship, experience in the practical steps, knowledge of the
infrastructure and logistical requirements, study of the geopolitical
and historical background, and a first-hand knowledge of what their
effect is on the world? Wouldn’t that study constitute a kind of
generalism? Or a scientific liberal arts, which might at the
undergraduate level seek to give students an intermediate-level
knowledge in all major scientific disciplines. Or an ecological liberal
arts. Or a vocationally-minded liberal arts, in which students occupy
themselves with a functional understanding and repair of the everyday
amenities (cars, houses, water treatment, power generation) familiar to
ordinary consumers. What would an “algorithmic” liberal arts, in which
students studied the major components of civilization which are subject
to computer modeling and control, look like?
If one lets go of the idea that “generalism” means “here is a little
bit of everything,” and replaces it with “here is a system with parts
that work together,” it would be possible to imagine many more liberal
arts than exist today. These would be liberal arts that are neither in
competition with nor exclusive of one another, but which aspire to
socially complementary versions of a generalist education.
The traditional liberal arts would still be a healthy community, one
among other options, but much social and ideological pressure would be
taken off of them. They would no longer have to be an everything
education, at once the best model for getting students a “good job” and a
recipe for making good citizens. The current version of the liberal arts
could be more honest about what it is–a philological, academic,
theory-based account of the general good–and healthier because it
does not have to invent unrealistic promises about itself. Other liberal
arts, other generalisms, could, for example, take over the work that the
“Great Books” do now.
This gets us away from the idea that the humanities–or any subject–is
necessarily at the center of the liberal arts. The liberal arts becomes
a bigger and more powerful idea because it becomes more flexible.
Instead of “these are subjects and courses of study that make up a
generalist education,” the idea becomes “we need generalists, because
generalists are fully realized human beings–let us figure out new ways
to produce them.”
One last lesson that I take from the Black Mountain College experiment
is that universities needn’t be set up as perpetual bodies to be
successful. Presumptive immortality is just a curious feature of the
current university model, where too many institutions model themselves
on the few oldest universities. This would seem to prize institutional
longevity more than almost anything else. The evidence can be found, for
example, in how far universities go to protect and grow their
endowments. But take other models: a commercial business, which operates
for a few years, supports its operators, and then is closed or
sold–this is not considered a failure. Neither is a church, whose
particular congregation might wane even as the umbrella faith (which
could be considered its own, exclusionist, “liberal art”) continues.
There is no reason that a university can’t have a much smaller
footprint, run while its ideas are fresh and its offerings attractive,
and then fold as a normal outcome of a generational lifecycle. Black
Mountain College existed for just 24 years, but it probably had more
influence on the
direction of higher education than many colleges that persist today.
Sources
Louis Menand, The Free World: Art and Thought in the Cold War (Farrar, Straus and Giroux, 2021).
Tom Crewe has a delightful review of Turgenev’s body of work in the April 21st issue of the London Review of Books. Two highlights whose combination struck me:
In reading Turgenev in English we are not departing from historical precedent. The vast majority of his 19th-century readers, in company with his most distinguished European and American admirers (James, Flaubert, Zola, George Eliot, Howells, the authorities in Oxford who gave him an honorary doctorate in 1879), read him largely in French or English. His importance for Western literature is unavoidably a mediated one, and it is through translation that we see what made those readers praise him so highly.
And:
Turgenev’s greatest strength as a writer was his talent for detail, which had several different applications. One of his most distinctive habits is his use of similes drawn from the natural world (the result of much time spent outside, first as a child frightened of his mother and then as a devoted huntsman).
Among the examples Crewe gives is this complex metaphor from Turgenev’s novella First Love:
Indistinct streaks of lightning flickered incessantly in the sky; they did not so much flash as flutter and twitch like the wing of a dying bird.
It takes a gifted writer to manage the handoff between these two images. I, at least, find it convincing; in my mind the lightning and the bird’s wings work on something like the same underlying principle of motion.
And for a writer who has largely made his reputation through translation, it is a risky, high accomplishment to mark naturalistic detail with so much vitality that his translators have what they need to keep it alive.
The moon, rising in the early evening, just above the rotunda at the Museum of Science and Industry. When I took this picture it was short of a full moon by a day or two:
There’s a long-standing puzzle about the moon when it is near the horizon: why does it look bigger? This is usually just called the “moon illusion.” The problem has so far not been definitively resolved by any modern scientific explanation, leaving it open to speculation by philosophers, amateurs, and polymaths. Also, not all people perceive the illusion in the same way. For example, I have seen a moon on the horizon that looked huge, but I didn’t find this to be true when it was next to the rotunda in this picture. Subjectively, it looked “normal-sized.” I believe this comes through in the photograph. But a quick image search for apparently large moons does show many near the horizon, or near a surface-level object, that do look huge (the fact that this illusion–or the lack of it–can be carried through into photographs is worth noting–not all illusions survive the camera).
Optical illusions involving forced perspective take one or more objects and place them near a reference object, which deceives the intuition for size and space. There is usually something deceptive about the presentation of the reference, making the original seem smaller or larger by comparison. Maybe the moon’s appearance is another example of forced perspective. This illusion has been noticed for so long that the competing paradigms to explain it are well-established:
These include both the “apparent distance” theory:
…the brain perceives the Moon when near the horizon to be farther away than an elevated Moon. Therefore, the brain calculates that the horizon Moon must have a larger angular or linear size (about 1.3 to 1.5 times larger) than when viewing the Moon when it is higher in the sky.
And the “apparent size” theory:
…when the Moon is low and close to familiar objects, such as houses, trees, and mountains, we already know or quickly estimate their apparent size and distance, then the brain incorrectly calculates the angular size of the Moon compared to the familiar objects on the horizon. When the Moon is elevated, there are no earthly objects to compare it to, so the brain perceives it as being more distant and therefore, smaller than the horizon Moon.
Both explanations are from Robert Garfinkle’s lavishly comprehensive recent book on the moon, Luna Cognita (section 6.11.4, “The Moon Illusion”).
It seems to me that these solutions are trying to account for the moon’s paradoxical aspect when viewed in the traditional way: with the naked eye. Although the moon is the largest object in the sky, and it moves across the sky each day or night, it never really changes size. The combination of motion and fixity of apparent size–this is not a normal property of most physical phenomena. Movement on the earthly plane is usually associated with some change in size. To my knowledge, the moon is also the only object in the sky that possesses both regular motion and any apparent size at all. We speak of the brightness of the stars and planets, but they all appear to be points of light, closer to mathematical locations on a plane than to three-dimensional objects. So the trouble arises with this object, the moon, that moves yet is not subject to growth or reduction, and which follows predictable, calculable cycles like the stars, yet retains the obvious imperfections of substance and matter (shape, texture, depth). It is not surprising that the mind/brain does not know how to treat it, and is tricked into applying standards of growth and change which the moon, in its own very strange class of objects, refuses.
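As a rough check on how little the moon’s true apparent size changes, here is a small Python sketch of its angular diameter as seen from the horizon versus overhead (the figures are round averages assumed for illustration, not values taken from Garfinkle):

```python
import math

def angular_diameter_deg(true_diameter_km, distance_km):
    """Angular size, in degrees, of a sphere of the given diameter
    seen from the given distance."""
    return math.degrees(2 * math.atan(true_diameter_km / (2 * distance_km)))

MOON_DIAMETER_KM = 3474      # mean lunar diameter (rounded)
MEAN_DISTANCE_KM = 384_400   # mean Earth-Moon distance, center to center
EARTH_RADIUS_KM = 6371

# On the horizon the moon is roughly a full Earth-Moon distance away;
# overhead, the observer stands about one Earth radius closer to it.
horizon = angular_diameter_deg(MOON_DIAMETER_KM, MEAN_DISTANCE_KM)
zenith = angular_diameter_deg(MOON_DIAMETER_KM, MEAN_DISTANCE_KM - EARTH_RADIUS_KM)
print(f"horizon: {horizon:.3f}°, overhead: {zenith:.3f}° "
      f"(about {100 * (zenith / horizon - 1):.1f}% difference)")
```

If anything, the geometry runs slightly against the illusion: an observer looking at an overhead moon stands roughly one Earth radius closer to it, so the zenith moon is the marginally larger one.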
Sources
Robert Garfinkle, Luna Cognita: A Comprehensive Observer’s Handbook of the Known Moon (Springer, 2020).