Years ago I had an argument with a biblically-minded friend who claimed that there could have been no tools of any kind, no technology whatsoever, in the Garden of Eden. So did Adam and Eve do all their “tending of the garden” (Gen. 2:15) with their hands? Pruning branches by breaking them off, sowing seeds only by kneeling and pawing the ground? No, he replied, there would have been none of that -- there could be no labor in Eden, so the garden would have taken care of itself and all the man and woman had to do was eat what it produced. I pointed out that what distinguishes a garden from a wilderness is that labor is put into the shaping of it and the caring for it, and that the Hebrew words used to describe what Adam did there clearly denote work, but he was immovable. Labor and technology are post-lapsarian phenomena, period. End of story.
Well, my friend was wrong, and the question we theologically-minded critics of technology need to ask goes something like this: Given that the use of tools is co-extensive with humanity itself, how might we distinguish between technology that is consistent with obedience to God and technology that manifestly isn’t? Something like this question is common to almost every serious thinker about technology, even when God isn’t involved. Perhaps the guiding concepts involve purely human flourishing.
Take Heidegger, for instance. Here’s a summary from George Steiner’s excellent overview of Heidegger’s thought:
Once, says Heidegger, nature was phusis, the archaic designation of natural reality which he reads as containing within itself the Greek sense for “coming into radiant being” (as it is still faintly discernible in our word “phenomenon”). Phusis proclaimed the same process of creation that generates a work of art. It was, in the best sense, poiesis – a making, a bringing forth. The blossom breaking from the bud and unfolding into its proper being (en eautō) is, at once, the realization of phusis and of poiesis, of organic drive – Dylan Thomas’s “green fuse” – and of the formal creative-conservative dynamism which we experience in art. Originally, technē had its pivotal place in this complex of meanings and perceptions. It also sprang from an understanding of the primacy of natural forms and from the cardinal Greek insight that all “shaping,” all construction of artifacts, is a focused knowing. A “technique” is a mode of knowledge which generates this or that object, it is a re-cognition towards truthful ends. (Something of the Heideggerian reticulation can be made out in the cognate range, in English, of “craft” and of “cunning,” with their respective derivation from Germanic roots for “knowing” and “forming.”) No less than art, technē signified a bringing into true being, a making palpable and luminous, of that which is already inherent in phusis. Heidegger’s word for authentic technology is entbergen.
Steiner goes on to point out that zu entbergen can mean either “to reveal” or “to guard in hiddenness.” So healthy technology both reveals something true and guards something valuable.
Consider all this a prequel to a post I wrote last year, which quotes Ruskin on this point. Ruskin’s distinction between the rough but richly human imperfections of Gothic stonework and the rigid regularity of modern industrial stonework is immensely relevant here. Ruskin is articulating in very specific and practical terms what Heidegger is articulating theoretically.
Note that in that post I also cite Ivan Illich’s notion of “tools for conviviality,” which is another way of pursuing the same general ideas. I could also cite Ursula Franklin’s distinction between holistic and prescriptive technologies. Franklin on the former category:
Holistic technologies are normally associated with the notion of craft. Artisans, be they potters, weavers, metal-smiths, or cooks, control the process of their own work from beginning to end. Their hands and minds make situational decisions as the work proceeds, be it on the thickness of the pot, or the shape of the knife edge, or the doneness of the roast. These are decisions that only they can make while they are working. And they draw on their own experience, each time applying it to a unique situation….Using holistic technologies does not mean that people do not work together, but the way in which they work together leaves the individual worker in control of a particular process of creating or doing something.
And the latter:
Today’s real world of technology is characterized by the dominance of prescriptive technologies. Prescriptive technologies are not restricted to materials production. They are used in administrative and economic activities and in many aspects of governance, and on them rests the real world of technology in which we live. While we should not forget that these prescriptive technologies are often exceedingly effective and efficient, they come with an enormous social mortgage. The mortgage means that we live in a culture of compliance, that we are ever more conditioned to accept orthodoxy as normal, and to accept that there is only one way of doing it.
So we see the same essential point being made over and over again, since the middle of the nineteenth century at least. Ruskin, Illich, and Franklin all see that there are technologies that liberate human creativity, that enable human power, and, by contrast, technologies that enslave us, that force our very being into conformity with their codes and structures.
Some version or another of this essential distinction has been central to this blog over the decade or so that I’ve been writing it, and I have come to believe that I have unpacked the relevant questions as thoroughly as I know how. At this point I am merely repeating myself, or playing tiny scraps of variations on the Great Theme. I still want to think about the matters that this blog has been concerned with — and to think more about John Ruskin! — but I need to find new ways to do it. So this will be the last Text Patterns post. Look for me later on in the pages of The New Atlantis, and at my own site, but I’m wrapping this blog up.
Gentle readers: Thanks for coming along for the ride. It’s been real.
Micro.blog: A couple of days ago I posted this: “I really do think this is a great service, and I’d like to be here regularly, but I wonder how much longer I’ll do this if no one I know (or almost no one I know) is here. I’m keeping fingers crossed that friends will show up!” And immediately I started getting a flood — well, honestly, it was just a trickle, but given how small the place is overall it felt like a flood — of dudes advising me how to find new people, how to get more followers, and I thought: Ah. Here we go again. I’m a pretty intense introvert: I don’t want to meet new people, in an ideal world I would have no followers I don’t already know and like, and nothing will ever convince me that giving unsolicited advice to strangers isn’t extremely rude. (Acknowledgment: I know those guys were “just trying to help.” I get that. Nevertheless.) It’s the old problem of intimacy gradients all over again, but on a platform that actually has fewer controls on what you’re open to than Twitter does.
In theory I’m totally supportive of the simplicity of micro.blog, but … what all this demonstrates to me is that with social media I have two choices: far more unsolicited human interaction than I’m comfortable with, or no social media at all. So I just need to make my call and live with the consequences.
AirPods: I went back and forth about these for several months after I wrote about them last year. I’m pretty sure I would be using them regularly if they worked regularly for me — but they don’t. Apple promises that if you flip open the AirPod case a sheet will slide up showing the charge percentage of the AirPods and the case; this happens for me maybe one-third of the time. When you put the AirPods in your ears they’re supposed to pair automatically with your iPhone; this happens for me maybe half the time. And one time in five the phone tells me the AirPods are connected, but sound is coming through the phone’s speaker instead. By contrast, my wired buds always work precisely as expected, so I rely on those. (Everyone else I know who uses the AirPods simply raves about them, so either they don’t have these problems — which wouldn’t surprise me, because I’m digitally cursed: no computing device ever does for me what it’s advertised to do — or they overlook them because of the convenience of going wireless.)
Notebooks: For several years now I’ve been using the Leuchtturm1917 A5 notebooks, which are just marvelous. But they are fairly narrowly ruled, and I find that writing a little smaller than comes naturally to me tends to make my hand cramp. So when I finished my last Leuchtturm I decided to try the slightly-more-widely-ruled Conceptum in the same size, and it’s great. The thicker paper is also very nice to write on. For around the same price it has fewer pages, and of course I’ll write fewer words per page, so I’ll go through it more quickly than I would a Leuchtturm, but that’s a relatively small price to pay for more comfort. Plus, it’s sort of fun coming to the end of a notebook and putting it on the shelf with its predecessors.
Pens: I have a few fountain pens (nothing fancy, mostly Pilots) that I like, but it seems that when I write my grip slides down the barrel of the pen in such a way that I always end up with ink on my fingers. I don’t mind being metaphorically an ink-stained wretch, but I’d rather not make that literal. I tried a Tombow rollerball but found it a scratchy experience. I don’t like using throwaway pens, but I have found that the smoothest, most enjoyable writing experience I can get for a reasonable price and no inky fingers is the Pentel Energel. Highly recommended.
In later posts I’ll strive for a substantive engagement with Ruskin, but I want to make a general preliminary comment here. Ruskin was one of those figures who lived through a massive social transition and who never forgot what the world was like before its change. “It has been my fate,” he wrote in a late work, “to live and work in direct antagonism to the instincts, and yet more to the interests, of the age; since I wrote that chapter [in the first volume of Modern Painters] on the pure traceries of the vault of morning, the fury of useless traffic has shut the sight, whether of morning or evening, from more than the third part of England; and the foulness of sensual fantasy has infected the bright beneficence of the life-giving sky with the dull horrors of disease, and the feeble falsehoods of insanity.”
That sounds like the ranting of an angry old man, and yet ... Ruskin not only watched the sky and noted the weather every day but also kept a detailed record of what he saw — for decades. And in a pair of lectures called “The Storm-Cloud of the Nineteenth Century” he described in great detail the alterations in the sky and the weather, not just of England but of Europe as a whole, that he had observed. It turns out that everything he said in this seemingly most crazily extreme of his writings was precisely correct: the emissions of factories, and other side-effects of the Industrial Revolution, really were changing European weather. As Wolfgang Kemp records in his excellent biography of Ruskin,
The 1870s and 1880s form a unique period in the history of environmental and weather study. The skies darkened, the air became thicker and unhealthier, the climate damper and colder. One result was a progressive increase in the numbers of people dying from respiratory ailments. Trees and animals died too, not only in foggy England but also on the Continent. For example, starting in the 1880s, there were widespread reports of damage to forests in Germany. Rosenberg tells us that “From 1869 through 1889 the temperature in London was below average for eighteen of the twenty-one years … reliable figures for sunshine are available only after 1879, but sixteen of the twenty autumns and winters from 1880 through 1889 were below average, and the total sunshine was below average for more than sixty per cent of the decade.”
So old men may complain about the condition of their times simply because they’re old men and (therefore?) grumpy; but they may also complain because things really have changed and they’ve seen it. And the people who live on both sides of a cultural or political (or even meteorological) divide may be very useful observers of their scene.
Jacques Ellul is another such figure, who lived through a massive transformation in French society; and I might also cite Lesslie Newbigin, who left England as a young man to serve as a pastor in South India and returned forty years later to find a very different culture, which he sought to address theologically and pastorally in a brilliant book called The Gospel in a Pluralist Society. People who have experienced massive social changes and do not merely react against them, but rather strive to comprehend them analytically, tend to be very valuable thinkers indeed. And their habits of critique can be enormously helpful to those of us who are living through our own period of change. That is why when I try to understand our current technopolic moment I find that the thinkers who help me the most are not the ones fully immersed in our own time, but those who remember an earlier time, or those from the past who underwent similar social transformations. It is very hard from within this technologically oversaturated moment of ours to discern its outlines clearly. I’m therefore drawn to thinkers whose vocabularies are tilted or skewed in relation to what I see and hear every day. This is one of the many uses of reading old books.
More details about Ruskin soonish.
I will put up with this state of things, passively, not an hour longer. I am not an unselfish person, nor an Evangelical one; I have no particular pleasure in doing good; neither do I dislike doing it so much as to expect to be rewarded for it in another world. But I simply cannot paint, nor read, nor look at minerals, nor do anything else that I like, and the very light of the morning sky, when there is any – which is seldom, now-a-days, near London – has become hateful to me, because of the misery that I know of, and see signs of, where I know it not, which no imagination can interpret too bitterly.
I must clear myself from all sense of responsibility for the material distress around me, by explaining to you, once for all, in the shortest English I can, what I know of its causes; by pointing out to you some of the methods by which it might be relieved; and by setting aside regularly some small percentage of my income, to assist, as one of yourselves, in what one and all we shall have to do; each of us laying by something, according to our means, for the common service; and having amongst us, at last, be it ever so small, a national Store instead of a National Debt. Store which, once securely founded, will fast increase, provided only you take the pains to understand, and have perseverance to maintain, the elementary principles of Human Economy, which have, of late, not only been lost sight of, but wilfully and formally entombed under pyramids of falsehood.
In old times, under the pure baronial power, things used, as I told you, to be differently managed by us. We were, all of us, in some sense barons; and paid ourselves for fighting. We had no pocket pistols, nor Woolwich Infants – nothing but bows and spears, good horses, (I hear after two-thirds of our existing barons have ruined their youth in horse-racing, and a good many of them their fortunes also, we are now in irremediable want of horses for our cavalry), and bright armour. Its brightness, observe, was an essential matter with us. Last autumn I saw, even in modern England, something bright; low sunshine at six o’clock of an October morning, glancing down a long bank of fern covered with hoar frost, in Yewdale, at the head of Coniston Water. I noted it as more beautiful than anything I had ever seen, to my remembrance, in gladness and infinitude of light. Now, Scott uses this very image to describe the look of the chain-mail of a soldier in one of these free companies; – Le Balafre, Quentin Durward’s uncle: – “The archer’s gorget, arm-pieces, and gauntlets were of the finest steel, curiously inlaid with silver, and his hauberk, or shirt of mail, was as clear and bright as the frost-work of a winter morning upon fern or briar.” And Sir John Hawkwood’s men, of whose proceedings in Italy I have now to give you some account, were named throughout Italy, as I told you in my first letter, the White Company of English, ‘Societas alba Anglicorum,’ or generally, the Great White Company, merely from the splendour of their arms. They crossed the Alps in 1361, and immediately caused a curious change in the Italian language.
Some things I like:
You can see that much of what I like about micro.blog involves the absence of “features.” You can post thoughts and images and links, and you can converse with people. That’s it.
I’ve already heard from several people telling me that micro.blog will fail just the way earlier Twitter alternatives failed. I think it’s got a better chance of surviving than any of the previous contenders — and I don’t think it’s best described as a “Twitter alternative” anyway — but let’s say these folks are right and micro.blog shuts its doors. In that case I’ll export my posts to my WordPress blog and keep on keeping on.
Finally, an endeavor like Ritual Design Lab has a paradox at its heart. If I contact the Ritual Design Hotline and the team solves my problem by creating a ritual for me, I am implicitly buying into the notion that I’m not capable of creating one myself. By outsourcing ritual design, I am, to use Steinlauf’s idiom, objectifying rather than subjectifying it; I’m reinscribing the old notion that we have to look to outside experts for such things. Only now, instead of turning to a rabbi or a priest or a guru, I’m turning to a designer.
Ozenc does not necessarily see this as a problem. In the Stanford classes he co-taught with Hagan, he ran two sessions. In the first, each student designed a ritual for herself. In the second, students paired up: One, the designer, was tasked with crafting a ritual for the other, the client. “The second version is more effective because you might not be seeing the opportunities in your life — maybe someone else can see better,” Ozenc told me. “There’s value in it if someone you trust comes in, and you give that other person permission to design a ritual for you.”
That, of course, is what religious people have been doing for millennia; it’s just that the “other person” might have lived in the year 218, not 2018.
A follow-up to this post.
In the Introduction to The Myth of Disenchantment, Jason Josephson-Storm writes:
For three years, starting in 1905, some of France’s most famous scientists had assembled in apartments and laboratories in Paris to study this particular Italian spirit medium — Eusapia Palladino. In addition to the Curies, others often in attendance were the celebrated physiologist Jacques-Arsène d’Arsonval, the eminent psychiatrist Gilbert Ballet, the aristocratic doctor Count Arnaud de Gramont, and three future Nobel Prize winners — the physicist Jean Baptiste Perrin, the physiologist Charles Richet, and the philosopher Henri Bergson. The French were not the only ones interested in Eusapia; from 1872 until her death in 1918, her powers were tested by teams of researchers in England, Italy, Poland, Germany, Russia, and the United States. The paranormal researchers who investigated Eusapia were not marginal eccentrics, but the cutting edge of the period’s academic establishment. Yet these researchers were exploring areas that were often marked out by their contemporaries as occult, if not downright magical. They did so not as a legacy of medieval “superstitions,” nor generally as a way to overturn science, but rather as a means to extend its borders….
In one of the last letters before his death (addressed to Louis Georges Gouy, April 14, 1906), Pierre Curie remarked, “We have had several more séances with the medium Eusapia Palladino (we already had sessions with her last summer). The result is that these phenomena really exist and it is no longer possible for me to doubt them. It is incredible but it is so; and it is impossible to deny it after the sessions, which we performed under perfectly controlled conditions.” He added: “In my opinion, there is here a whole domain of completely new facts and physical states of space about which we have had no conception.”
JJS, commenting on this exemplary scene, continues,
I challenge one conventional notion of modernity and suggest that we should be less surprised than we usually are to find scientists of all stripes keeping company with magicians; that reason does not eliminate “superstition” but piggybacks upon it; that mechanism often produces vitalism; and that often, in a single room, we can find both séance and science. The single most familiar story in the history of science is the tale of disenchantment — of magic’s exit from the henceforth law-governed world. I am here to tell you that as broad cultural history, this narrative is wrong.
Now, JJS knows far more about all this than I do, so you should take my quibbles with several large grains of salt, but the story of the Curies looks different to me than it does to JJS. It seems to me that the Curies — and many like them — sought to bring the manifestations of spiritualism within the disenchanted order, under the disciplinary control of what Bruno Latour calls the “modern constitution.” Thus Pierre Curie’s emphasis on observing Eusapia under “perfectly controlled conditions”: all supposedly paranormal phenomena must justify themselves at the bar of the scientific method, and if they do, then they are no longer paranormal — they’re just normal. And in that way we do see what JJS calls “extending the borders” of science, but not in a way that restores enchantment. Rather, the modern constitution, the methodological guarantee of disenchantment, remains in place.
PSA: I’ll be traveling for the next week, so there probably won’t be any posts.
Claire McNear, How Quote Tweets Helped Ruin Twitter:
When did things get so bad on Twitter? It’s hard to say, even as nostalgia for the network’s simpler, friendlier days permeates the site. Was it the runup to the 2016 election? The inevitable growth beyond the insidery club of Twitter’s earliest days? The mainstreaming of meme and troll culture? In reality, the shift in discourse from freewheeling internet lab to Room Where Everyone Screams At Everyone owes itself to many things: from the things above to the worldwide SAD plague and rickets hastened by a planet of laptop dens. But quote tweets—a nice, little feature on a nice, little site meant for us to do nice, little things—have had an outsize role in the nastiness….
The result is that the ugliest things on Twitter are frequently amplified. Unlike instances of ratio-ing, quote tweets beam the ugliness straight into followers’ feeds. The problem is especially insidious with Twitter’s true merchants of hate, who capitalize on just such a reaction: What more could an aspiring alt-right toad with 1,000 followers hope for than to be huffily fired into thousands upon thousands of additional feeds?
One of the ways I’ve tried to make Twitter more manageable and less frustrating is by disabling most people’s retweets. But there’s no way, on Twitter.com or on any client I know of, to disable quote-tweets. So when people want not just to spread the bad word but comment on it, to explain why it’s bad or to mock the person who originally tweeted it — and this is far more common than simply retweeting objectionable material — there’s no way for me to avoid that.
Click a link in a web browser, it should open a web page, not try to open an app which you may not have installed. This is what Apple does with podcasts and now news.

Facebook is taking the place of blogs, but doesn't permit linking, styles. Posts can't have titles or include podcasts. As a result these essential features are falling into disuse. We're returning to AOL. Linking, especially, is essential.
Google is forcing websites to change to support HTTPS. Sounds innocuous until you realize how many millions of historic domains won't make the switch. It's as if a library decided to burn all books written before 2000, say. The web has been used as an archival medium, it isn't up to a company to decide to change that, after the fact.
Medium, a blogging site, is gradually closing itself off to the world. People used it for years as the place-of-record. I objected when I saw them do this, because it was easy to foresee Medium pivoting, and they will pivot again. The final pivot will be when they go off the air entirely, as commercial blogging systems eventually do.
Every major tech company — or at least every major tech company that deals in the online world — seems to think it needs to marginalize and deprecate the open web.
The single most familiar story in the history of science is the tale of disenchantment — of magic’s exit from the henceforth law-governed world. I am here to tell you that as broad cultural history, this narrative is wrong. Attempts to suppress magic have historically failed more often than they’ve succeeded. It is unclear to me that science necessarily deanimates nature. In fact, I will argue à la Bruno Latour that we have never been disenchanted.
Up to the age of thirty, or beyond it, poetry of many kinds, such as the works of Milton, Gray, Byron, Wordsworth, Coleridge, and Shelley, gave me great pleasure, and even as a schoolboy I took intense delight in Shakespeare, especially in the historical plays. I have also said that formerly pictures gave me considerable, and music very great delight. But now for many years I cannot endure to read a line of poetry: I have tried lately to read Shakespeare, and found it so intolerably dull that it nauseated me. I have also almost lost my taste for pictures or music. Music generally sets me thinking too energetically on what I have been at work on, instead of giving me pleasure. I retain some taste for fine scenery, but it does not cause me the exquisite delight which it formerly did. On the other hand, novels which are works of the imagination, though not of a very high order, have been for years a wonderful relief and pleasure to me, and I often bless all novelists. A surprising number have been read aloud to me, and I like all if moderately good, and if they do not end unhappily—against which a law ought to be passed. A novel, according to my taste, does not come into the first class unless it contains some person whom one can thoroughly love, and if a pretty woman all the better.
There's something happening in fiction now, and to a degree in film and TV too: the time in which stories are set is scootching back, with writers fleeing to the safety of 1994 or 1987 or much earlier. Why? Because we didn't have smart phones then. We didn't have social media. The world didn't have this shimmering overlay of internet which is, in a very practical way, hard to write about. Writers of novels and teleplays have well-developed tools for the depiction of drama in real space. Drama that plays out through our little pocket-sized screens is just as rich – but how do we show it? We're now seeing film and TV figure this out in real-time. Novels have been (oddly?) less successful. Because digital action relies on so many Brands™, it feels risky and/or distasteful to send your narrative too deep into that realm. Who wants to be the person who called it wrong and wrote the Great MySpace Novel? (Actually, the Great MySpace Novel would be amazing. But see, that's not now anymore! MySpace has stabilized into historical artifact. We can look at it; describe it; maybe even understand it. That's not the case with the systems we're using right now. We're lost inside of them.)
Allison Miller tells The Story of the Multigraph Collective, an academic group project that eventuated in a book called Interacting with Print: Elements of Reading in the Era of Print Saturation. I very much want to read the book, but for those interested in the economics of labor in the academy and its effects on scholarship, this part of Miller’s account is especially interesting:
Being edited by so many other scholars, according to Paul Keen (Carleton Univ.), was unnerving but also “weirdly liberating. It gave us all a license to put our authorial sensitivities on hold and put our faith in this larger brainstorming process.”
Indeed, [Andrew] Piper too describes the endeavor as a “leap of faith,” since no one knew how the final work would be received by tenure and promotion committees or by UK Research Excellence Framework evaluators. One Multigraph Collective member, says Piper, was told that since there were 22 collaborators, the member’s work on Interacting with Print would count as 1/22 of a book—by word count, not even the equivalent of a journal article.
In the thick of it all, however, the process was thrilling. Hierarchies of academic rank and disciplinary territoriality dissolved in a shared commitment to the work. “This project fundamentally changed my ideas about what humanities scholarship could look like and what it could achieve,” says Porter.
The whole situation is a reminder of the absurdity of the current tenure system, with its crude quantitative pseudo-metrics for assessing “productivity” — but also of the power of tenure. Those of us who have it need to be engaged in projects like Interacting with Print — projects that reconfigure and extend the character of humanistic scholarship (sometimes by renewing older scholarly modes). I’m displeased with myself for not doing more along these lines.
I just came across this fascinating 2015 talk by Mike Caulfield and want to call attention to a couple of elements of it.
1) the garden/stream distinction:
The Garden is the web as topology. The web as space. It’s the integrative web, the iterative web, the web as an arrangement and rearrangement of things to one another.
Things in the Garden don’t collapse to a single set of relations or canonical sequence, and that’s part of what we mean when we say “the web as topology” or the “web as space”. Every walk through the garden creates new paths, new meanings, and when we add things to the garden we add them in a way that allows many future, unpredicted relationships….
In the stream metaphor you don’t experience the Stream by walking around it and looking at it, or following it to its end. You jump in and let it flow past. You feel the force of it hit you as things float by.
It’s not that you are passive in the Stream. You can be active. But your actions in there — your blog posts, @ mentions, forum comments — exist in a context that is collapsed down to a simple timeline of events that together form a narrative.
In other words, the Stream replaces topology with serialization. Rather than imagine a timeless world of connection and multiple paths, the Stream presents us with a single, time ordered path with our experience (and only our experience) at the center.
2) The difference between the Memex and the World Wide Web:
So most people say this is the original vision of the web. And certainly it was the inspiration of those pioneers of hypertext.
But in reality it doesn’t predict the web at all. Not at all. The web works very little like this. It’s weird, because in our minds the web still works like this, but it’s a fiction.
Let’s look at some of the attributes of the memex.
Your machine is a library not a publication device. You have copies of documents in there that you control directly, that you can annotate, change, add links to, summarize, and this is because the memex is a tool to think with, not a tool to publish with.
And this is crucial to our talk here, because these abilities – to link, annotate, change, summarize, copy, and share — these are the verbs of gardening.
Each memex library contains your original materials and the materials of others. There’s no read-only version of the memex, because that would be silly. Anything you read you can link and annotate. Not reply to, mind you. Change. This will be important later.
Links are associative. This is a huge deal. Links are there not only as a quick way to get to source material. They aren’t a way to say, hey here’s the interesting thing of the day. They remind you of the questions you need to ask, of the connections that aren’t immediately evident.
Links are made by readers as well as writers. A stunning thing that we forget, but the link here is not part of the author’s intent, but of the reader’s analysis. The majority of links in the memex are made by readers, not writers. On the world wide web of course, only an author gets to determine links. And links inside the document say that there can only be one set of associations for the document, at least going forward.
“A tool to think with, not a tool to publish with” — this seems to me essential. I feel that I spend a lot of time trying to think with tools meant for publishing.
Hey everybody, sorry for the radio silence — I’ve been traveling, and will be traveling again soon, so I can't promise regular posting for a while. But I’m hoping to get a few thoughts up here, starting with this:
I've read two recent posts about computer interface design that really have me thinking. The first is this reflection by Riccardo Mori about using a first-generation iPad. Mori discovers that that original Apple tablet, despite its significant limitations in processing power in comparison to today's machines, still works remarkably well. But he also, and this is the really interesting part, decides that some of the design choices made eight years ago (the first iPad came out in 2010) are actually superior to the ones being made today. This is true to a minor degree even with regard to the hardware — Mori finds the iPad 1 more pleasurable to hold than some later models, despite its greater weight and thickness — but he thinks that the design language of iOS 5, the last version of iOS that the original iPad can use, is in certain respects simply superior to the new language introduced in iOS 7 and largely persisting, though with some modifications, today.
when it comes to visuals it’s ultimately a matter of personal taste, but one thing iOS’s user interface possessed before iOS 7’s flattening treatment was consistency and more robust, coherent, stricter interface guidelines. Guidelines that were followed by third-party developers more closely, and the result was that under iOS 6 and earlier versions, third-party apps presented a user interface that was cleaner, more predictable, easier to navigate than what came afterwards, update after update. After iOS’s UI got flatter, when it came to designing apps, things got out of hand, in an ‘anything goes’ fashion.
There are apps today with poor discoverability, ambiguous controls, UI elements whose state or function isn’t immediately clear — i.e. you cannot tell whether they’re tappable or not simply by looking at them; whereas before iOS 7, a button looked like a button right away, and you didn’t have to explore an app by tapping blindly here and there. Spotify is the first example coming to mind: its early iOS and Mac clients were more usable and had a better interface.
Concluding this section of his post, Mori writes:
During my trip down Interface Memory Lane these days with the iPad 1, I’ve stumbled on many other cases, and the result was always more or less the same: I found the old version of an app to have a more usable interface and a clearer interface language than its current counterpart. Despite all the pre-iOS 7 skeuomorphism, for many app interfaces of that time design was truly ‘how it works’. Today, more and more often (and it’s not only with iOS) I see examples where design is simply ‘how it looks’; attractive apps, but with ambiguous interface controls, poorly-designed UI architecture, and sometimes even with little to no accessibility, disregarding users with disabilities.
The second post is by Mark Wilson, who found himself using version 7 of the original Macintosh OS, issued way back in 1991 — and loving it. "Using an old Mac is pure zen." Now, Wilson doesn't suggest that that old interface could simply be implemented today; we ask too much of our computing devices now, and too many kinds of "much."
But I do believe that the old Mac makes for a timely reminder that the digital age hasn’t always felt so frantic, or urgent, or overwhelming. And maybe, even if an old Mac interface isn’t the solution, we can view it as a subtle north star for its sensibilities, and how much it was able to accomplish with so little.
Few interface designers are indifferent to the needs of the user, but I can't imagine that there are many for whom that is the first consideration. One way designers keep their jobs is by producing new designs, and in a corporate setting (like that of Apple) novelty helps get customers to update their hardware and software alike. And what kind of designer wouldn't want the challenge of making the best use of increased processing power or display resolution?
So I don't expect the desires and needs of users to be at the top of any designer's priority list. But Lordy, how I wish it were a little higher than it is. Then perhaps the best elements of the work of earlier designers, people working under far greater constraints, could be recovered and redeployed. Because, as I never tire of saying, creativity arises from constraint and resistance. And it’s not clear to me that, from the user’s perspective, UI design for computing devices hasn’t been getting worse and worse for the past few years — with Apple leading the way in this sad category.
From a national perspective, the number of people earning communications degrees (which was negligible in the heyday of English majors 50-60 years ago) surpassed the number getting English degrees around 20 years ago. Since then Communications has held a fairly steady share of graduates as the college population grew, while English has lost its share and in recent years even shrunk in total number, as this NCES table records. In short, students voted with their feet and, for the most part, they aren’t interested in the curricular experience English has to offer (i.e. read books, talk about books, write essays about books).
Peterson is very conscious of his role as just another backwater stop on the railroad line of Western Culture. His favorite citations are Jung and Nietzsche, but he also likes name-dropping Dostoevsky, Plato, Solzhenitsyn, Milton, and Goethe. He interprets all of them as part of this grand project of determining how to live well, how to deal with the misery of existence and transmute it into something holy.
And on the one hand, of course they are. This is what every humanities scholar has been saying for centuries when asked to defend their intellectual turf. “The arts and humanities are there to teach you the meaning of life and how to live.” On the other hand, I’ve been in humanities classes. Dozens of them, really. They were never about that. They were about “explain how the depiction of whaling in Moby Dick sheds light on the economic transformations of the 19th century, giving three examples from the text. Ten pages, single spaced.”
So maybe — just maybe — it’s not “read books, talk about books, write essays about books” that’s the problem.