the double back theory

I have an old friend who swears by The Double Back Theory, which basically goes like this:  Any important revelation will immediately strike you as obvious and true, but because its significance lingers with you for years, you will have too much time to develop alternatives and corollaries that overcomplicate the picture. Nevertheless, if you keep on thinking about the central idea, you will inevitably double back to your original revelation as the most profound one.

Today’s example: The Internet is a new platform for consumer media. That’s a striking revelation . . . in maybe 1994 or perhaps as late as 1998. This may be hard to believe today, but there was a time when it was revelatory to describe the Internet as a new form of popular media, rather than as a niche technology.  Today most people would rank the Internet as the second most important form of media (behind TV).  It seems so obvious now that the Internet is a consumer media delivery system.  And yet, it’s easy to find ways to overcomplicate this simple picture.

Take for example the argument over whether The Web Is Dead. Putting aside the easiest objection – that many claims of death are exaggerated – the thesis basically says that “the Web” was supposed to be this great open playground that changed the world forever, but a variety of closed systems now threaten the promised paradise. We are supposed to get hysterical over the idea that content that was free on the Web will not be free forever, and that there will be special access channels that only some people will be able to afford.

But the Web isn’t dying, it’s just evolving the way that consumer media have always evolved.  The history of consumer media is littered with similar patterns of free and paid content, amateur and professional content, sponsored and bought content. There are many examples where a new medium was popularly established with free content, and evolved into a tiered system of both free and paid content. Look at television – once it was free (i.e. ad-supported), then cable TV came along with both an ad-subsidized paid model (basic cable) and an ad-free paid content model (e.g. HBO, PPV).

The same thing is happening with this wondrous new medium of the Internet, and the most wondrous thing of all is that anyone thought it would be any different.  The Internet is wonderful and has changed many things in the consumer content landscape, in terms of interactivity, variety, engagement, and low production and distribution costs. But one thing it hasn’t changed is that consumer media, as a whole industry, will always trend toward payment for quality content, and toward concentration of media power in the hands of a relatively small number of players.

I wish that weren’t true, but it is true today and will always be true for as long as we remain human beings.

We like to think that technology frees us from the scarcity-based economics of the past.  And it’s true that changes in scarcity can free up new business models.  But there is no kind or amount of technological advancement that can eliminate scarcity in two areas:

  • Quality. Quality content is by definition scarce: no matter how great the aggregate improvement in overall quality, there will always be some portion that is better than the rest. The development and application of new technology to content only heightens the divide rather than flattening it – because the quality of the content includes not just artistic merit but its presentation and convenience to the consumer.
  • Attention. Human attention is limited, both in the aggregate and for any individual. No matter what automatic aggregation, filtering, or curation tool is ever developed, we can’t radically increase the finite amount of real human attention for consuming media. Even if we develop technology that actually stops time, our biology dictates a finite attention span – there are only so many hours of media a brain can absorb in a day, no matter how long the day is.*

Since quality is scarce and attention is finite, there will always be an opportunity to charge money for the best content – and since this includes charging for the best quality presentation and delivery, it means that there will necessarily be a two (or more) tiered Internet. You can call it surrender, you can call it the death of the Web, you can call it whatever you want – but recognize that it’s progress, it’s evolution, it’s the future as well as the past.

——-

On a related (and more obscure) note, lately there’s been a lot of conversation about the evolution of certain parts of the venture capital business. I can’t do the whole conversation justice – but basically the narrative is that there is a new mode of investing in the consumer Internet sector, with smaller but smarter initial investments, giving rise to an expanding birthrate of web startups, and raising the specter of a seed investor bubble.  Again I’d ask, should we try to understand all this as a new phenomenon, or is this just a different variation of a familiar pattern?

Consider that a lot of “Consumer Internet” is no longer mostly about technology development; it is about media content development. From that perspective, a lot of the shifts in venture investing are about a certain class of savvy investors becoming media investors instead of technology investors. They’re not evolving to some kind of new model of investing, but cycling into the model of investing that you see in more mature content production businesses.

I think that consumer Internet investors will become more and more like television producers and financiers, and less like “hard” technology investors.  If that’s right, you’ll stop seeing conversation about equity vs convertible debt, and will instead see a move toward the revenue-sharing model that is common in the TV and movie industry.

Some people will regard this theory as idiotic, controversial and even demeaning (if you think being a TV producer is worse than being a VC), but for me it’s just doubling back to the basic insight that the Internet is a new platform for consumer media.  Now that the original mid-’90s revelation has come true, you can expect that the investment economics will repeat old patterns more than they create new ones.

——-

* I realize that there are people who believe that in the future, technology could enhance brain function as well as create endless renewable energy – making essentially limitless time and capacity to enjoy leisure activities, including consumption of media.  Without opining on the likelihood of that future, I’d just note that it’s a future in which we are no longer human, as we understand humanity today.

you gotta love yourself

The final lesson in the four-for-forty series is the hoariest, hippyest, horriblest of them all. “Love yourself” is the basic rule of all personal development, so there’s no shortage of Internet advice on how to love yourself. To me, the advice has always come across as self-indulgent babble that may be good for crackhead pop and comic treatment, but it’s succored a generation of wimps who can’t hold down a job.

The first hundred times or so I heard “You gotta love yourself,” I thought: “No I don’t.  You don’t tell me what I gotta do.” Then I began to ask “Why?” and I finally heard a reason that made some sense to me.

Loving yourself requires accepting your faults, and accepting your faults gives you more options for how to react in any situation. That’s a quantifiable rationale, testable both in theory and in practice – and as a bonus the measurement also gives guidance on whether you’ve taken self-love too far. Here’s a simplified example:

Let’s say you receive a bad outcome that is at least partially based on something you did. Here is a count of your options for how to react –

  • Self-hate: Since you will blame yourself to the exclusion of other factors, you only have two choices: (1) rigorously apply yourself to skills improvement, even though it’s likely that no amount of improvement would have given a different result, or (2) drink enough to obliterate your self-hating identity.
  • Self-love, of the over-indulgent kind: Certainly the outcome wasn’t your fault, so your choices are (1) smugly wait for the next chance for the world to properly join you in your love of you, or (1) ignore any possible evidence that your actions contributed to failure. Yes, those are numbered the same because they are the same.
  • Goldilocks self-love, the kind where you love yourself just right: You can be clear-eyed about what really happened. You can apply yourself to change, you can recognize the factors that were out of your control, you can put the outcome out of your mind in good humor and good health. You can do all of these things and you probably will.

Basically, loving yourself just right gives you all of the options of the other two conditions, with the additional optionality that comes from not being ideologically compelled to react in a way that is harmful or indulgent. You gotta love yourself just right, because the alternatives are suboptimal. Sure, that’s a particularly dry and uninspiring way to put it, but what can I tell ya, I love this way of putting it because it’s mine.

we are all authors of our own lives

I’m not against self-affirmation on principle.  Many people benefit from empowering messages that remind them of their intrinsic worth.  However, that isn’t the sort of bromide that works with my particular chemistry. I want to understand what to do, not how to feel.  Even though I might enjoy hearing that I’m good enough, smart enough, and doggone it, people like me, that news doesn’t give me tactical guidance on how to live my life.

So when I tell you that “We are all authors of our own lives” – I don’t mean to trumpet the primacy of your own role in shaping your destiny, even though that’s a useful bit of affirmation.  I mean for you to think about the process of authorship, the task of writing a story from both facts and fantasy over many years.

Whether you realize it or not, you carry around a story in your head about who you are.  You draft, write and rewrite your internal explanation of the kind of person you are, the character you have, the things you will and will not do.  This work of self-conception is the greatest novel ever written, or at least it should be for you.

Early on, very little of your story is constrained by actual events, since you’re too young to have been in all of the situations you anticipate that you’ll experience.  You have the freedom of your imagination, and you write your story based on what you’ve seen in your family, friends and others in life and fiction.  You’ll imagine, for example, that you’re just like your dad, or not at all like your mom, or a bit like Al Pacino in Scarface, or a lot like Lindsay Lohan on Twitter.  Then as you grow older, your story becomes a lot more personalized to you, based more on your experiences and less on your aspirations.

You have years, maybe decades, to write your beautiful story of who you are, and then something happens. It may be one traumatic event, or a series of little events that are only clearly related in retrospect – but something happens that doesn’t fit into the story you’ve been spending your whole life on up to that point. You thought you were a good guy, but then you did something that was undeniably bad.  You thought you were an honest woman, but then you’re confronted with your repeated pattern of little lies.

You race back to your story, flipping madly through the pages of the Book of You.  Who is this person in this story?  Who is this stranger living this life, holding this tattered book in shaky hands?  Can these possibly be the same person?  Faced with this disconnect between your life’s work as an author, and the actual facts of your life, you have two choices:  You can rewrite your story to fit the facts, or you can rewrite the facts to fit your story.

Perhaps this is the point where I’m supposed to say that the facts are sacrosanct, and your job as an author is to fit the story to the facts.  But no:  I said you were an author, I didn’t say you were a journalist, and I can’t presume to tell you what kind of story you’re writing. You have to make the choice that satisfies your art as the author of your own life.

Maybe you’ll just choose straightforward reporting, because you do want to match the story exactly to the facts.  Or you might be like Mark Twain, writing fiction truer than fact; or Jack Kerouac, making facts into truthful fiction.  I wouldn’t advise going full-on into fantasy, with complete disregard for any events from reality.  Not because it’s wrong, but because all of the best fantasies are rooted in something real.  As an author, you’re an artist, and art without truth is trivial, and you don’t want your life to be trivial.

Finally, be aware that we are all engaged in these acts of authorship.  You can get very far in understanding other people if you think about the story they’ve written in their own heads, and observe what they do with facts that don’t match the story.

many goods are incommensurable

There are many simple ways of saying things pretty similar to what I’m saying here, such as:

  • To each his own.
  • One man’s trash is another man’s treasure.
  • It’s apples and oranges.
  • It’s all good.

But I don’t like these easy sayings, because it’s not all good – what I’m trying to get across is hard to understand and hard to live, and has little relation to the soft-headed permissiveness implied in those easy clichés.

This happens to be the only life lesson that I actually learned in a classroom as the direct subject of a lecture, and this lecture justified a year of college tuition all on its own.  “Incommensurability” is a simple enough concept – it just means that there are things that do not share a common standard of measurement, like the proverbial apples and oranges.

Apples aren’t oranges, could anything be simpler?  But it struck me as a thunderbolt to understand how this affects the search for the good life.  I’d always thought that the task of living a good life was largely about understanding the difference between good and bad.  Maybe I’ve got a moral compass that doesn’t have a reliable fix on true north, but that difference hasn’t always been obvious to me.

As life goes on, it has become easier to tell the difference between good and bad – or rather, it’s become harder to delude myself into believing that there isn’t a difference or that I can’t see it.  Now I can see that choosing between good and bad was simply the entry-level exam for the good life.  The hard task of living a good life is to choose among things that are good that can’t be compared with one another.

Choosing among incommensurable goods is sad because you are by definition choosing not to do things that are good.  You know that the choices you make will sacrifice things that you would also like to have.  The good things you choose may be vastly outnumbered by the good things that you gave up.  And yet, your choices are a triumph that isn’t second-best to any other set of choices.

One of the great things about understanding this is that you won’t be limited, as many people are, to only having friends who have generally made the same moral choices that you have.  You’ll be able to see that others chose among the same set of incommensurable goods that you did, and even if they made different choices, they are still people who share a common sense of good with you.

Just to make sure that this isn’t interpreted with a mushy morality that I actually despise:  This doesn’t mean that everything and everyone is all good, it doesn’t mean that any set of choices is as good as any other, it doesn’t mean that you can be friends with anyone, it doesn’t mean that there’s no difference between good and bad.  It just means that many goods are incommensurable, and you should think carefully about what that means as you make your choices for a good life.

intelligence is a crutch

Being smart is a good thing, as any smart person will tell you more times than you care to hear. And being really smart is like some kind of weird superpower. If you’ve ever been at the head of your class, or the smartest person in the room, or even just the subject matter expert in conversation with the uninitiated, you know what it feels like to not only have every answer but anticipate every question – it almost seems like being able to bend space, time and reality to your will.

Now, maybe you’ve never had that superpower smartness – that’s also a good thing. Because that means you may have had a chance to observe really smart people at the height of their powers, glorying in their intelligence and in love with their knowledge of the world. And you may have achieved a striking insight that is beyond the understanding of many smart people, a special insight that seems to routinely escape the most massive intellect. This insight is painfully obvious to everyone else: Smart people suck.

Intelligence is a largely genetic trait that is also substantially influenced by environment and circumstance. In this way, it’s a lot like height. So before we talk more about smart people, let’s talk about tall people for a bit. Tall people get some pretty nice prizes from winning the genetic lottery. Tall people make more money and find more attractive mates. Height provides some advantage in many sports, and is a virtual requirement for success in some. So being tall is overall a good thing.

And here’s the point: Tall people know they’re lucky. They know that they have an advantage in life that others don’t have, and they know that they did very little to secure this advantage. They also know that to maximize their advantage, they have to add their own efforts – if they want to make the team, get the job, get the girl or guy – they have to eat right, work out, study hard, take care of their skin, hair and personality.

Not so with smart people. Even though smart people are generally aware of the genetic, environmental and circumstantial contributions to their intelligence, they rarely think of these as luck. Instead, smart people tend to think they’re better than other people because they’re smart, not because they’re lucky. And smart people often think that the world owes them something merely for being smart, as opposed to being diligent, sincere or personable. Smart people think that being smart should be enough, whereas tall people know that being tall is just a start.

The problem with intelligence is that it does, to some extent, make up for the absence of other admirable qualities. Smart people can get the same or better results as others even when they work less, care less and cooperate less. Intelligence is a crutch. And a smart person who leans on that crutch to the detriment of other important traits can become a monstrously malformed person. Intelligence is used worst when it’s used as a crutch to escape the hard work of being human.

four for forty

I’m four days from my fortieth birthday, and thinking hard about what I’ve learned over the past four decades. Over the next four days, I’m going to write about the four lessons that were hardest for me to learn – these are not necessarily the most important, or the most valuable, or the most insightful. They were just goddamn hard to learn, and in fact I’m still struggling to get them right.

People who give advice usually believe that some particular experience has given them an authority that others might want to regard seriously. That isn’t the case with me: although I’ve had many instructive experiences, I don’t think my historical record is what makes me qualified to give advice, and I don’t think everyone should take my advice seriously. Instead, what makes me qualified to give advice is that I am spectacularly bad at taking it.

I’ve had the great good fortune of having many wise people tell me many wise things, and my usual practice is to squander that good fortune by refusing to take even the best advice at face value. Instead, I question, I doubt, I criticize, I experiment, I delve down dark alleyways of impulse and instinct – and in the end I painfully find that I should have listened to the wisdom of my betters.

The problem with wise advice is that you have to have wisdom to appreciate it beforehand. And if you had the requisite wisdom in the first place, you wouldn’t need the advice so badly.  I never understand good advice until I’ve had the opportunity to fail to follow it. Only by living the bad consequences first-hand can I understand the underpinning that upholds solid wisdom.

Let’s hope that my misfortune is your bounty in these next four posts.

  1. Intelligence is a crutch.
  2. Many goods are incommensurable.
  3. We are all authors of our own lives.
  4. You gotta love yourself.

launch PR: New York Times vs TechCrunch

This post is inspired by a similar post by Udemy – I’m trying to add useful information for all the folks who are working hard and trying to get their products noticed.

We launched our beta product at Bynamite about a month ago, and were lucky to get covered in the New York Times.  I wish this post could be about “How To Get Covered in The New York Times,” because that would be some really valuable information for the startup community.  But we were simply very lucky – a friend introduced us to a potential business partner who was really interested in our story, who introduced us to the Times reporter, who had been thinking and writing about related issues for a long time.  Everyone in the chain was very thoughtful and patiently dedicated to understanding what, if anything, is interesting about what we’re doing.  Sometimes the pieces just fall into place, and that’s what happened here.

Before that series of fortunate events, we had been preparing a more traditional scrappy startup PR strategy, which I learned from the interwebs.  Balsamiq’s marketing advice and launch homework are invaluable; in particular I was focused on the 10 PR tips from Weebly.  We had identified about 45 blogs, big and small, that I intended to contact one by one, with the holy grail being coverage in one or more of the major tech blogs – TechCrunch, Mashable, ReadWriteWeb, GigaOM and VentureBeat.  Just as I was starting to reach out to the list, the Times reporter confirmed that his story was very likely to go forward in the Sunday business section.

At that point, we had a decision to make.  On the one hand, the TechCrunchosphere is the place to launch consumer tech products – the audience is intelligent, opinionated, and early adopting.  This is an audience that understands that startup companies launch “unfinished” product.  It’s not a good idea to get mainstream press before your company is really ready for it.  On the other hand, our product goes contrary to the tech orthodoxy that had largely proclaimed that no one cares about privacy.  Would TechCrunch readers be the wrong audience for our more mainstream message?

Although these are complicated concerns, we didn’t take long at all to decide, and we were swayed for one irresistible reason: it’s the New York freaking Times!  As much as I’m with the punditocracy that declares newspapers dead, I just couldn’t help myself – I grew up reading the Times, and I really wanted to see if we could get in the paper, the good ol’ physical, dead-tree paper.  So we saved the blog efforts for a later time – hopefully after we’ve learned our lessons from the beta and are ready to relaunch with a more complete product.  It’s sort of a topsy-turvy press strategy, and there’s probably a whole ‘nother post in whether or not it’s stupid, but that’s not the point here.  The cool thing today is that we get to compare results from different PR launch paths.

Here’s the Visits graph from Udemy’s launch:

[Udemy visits graph: screenshot, 2010-05-24]

Here’s a similar graph from Bynamite’s launch:

[Bynamite visits graph: screenshot, 2010-08-13]

Here’s the referral chart from Udemy:

[Udemy referral chart]

And the corresponding chart from Bynamite:

[Bynamite referral chart]

Now, the point here is NOT to say that Bynamite PR is any better or worse than Udemy PR!  That kind of comparison would draw all sorts of wrong conclusions, not least because I’ve cheated here by comparing 30 days of our data with Udemy’s 23 days.  Also, note that Bynamite is a browser extension that records a page view when the extension bar pops up (that’s why the Avg. Time on Site is absurdly high).  Different products are going to have lots and lots of reasons for different metrics.

But the conclusion I’m willing to draw is that getting covered in the Times is roughly equivalent to coverage in the major tech blogs.  Not an order of magnitude higher, and certainly not smaller.  So for anyone hoping to confirm the relevance of mainstream media, I suppose that’s a victory of sorts, though it’s just as accurate to be amazed that media sources that barely existed 5 years ago are now equivalent to the “paper of record” that’s been around for 150 years.

It’s also interesting to note that both Udemy and Bynamite got a secondary bump 5 or 6 days after the original coverage.  In Udemy’s case, that bump exceeded the initial spike, and was almost entirely driven by a mention in one source, Thrillist.  Bynamite’s secondary bump was smaller than the first, and was the result of pickup by many smaller sites that focus on covering downloadable apps.  Also like Udemy, our traffic has settled down to a much quieter pace, though significantly higher than the near-complete obscurity prior to the press coverage.

I’m still digging through the details – and by the way, I could use some help: if anyone reading this wants to drive through Google Analytics with me, let me know!
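
In the meantime, here’s a rough sketch of the kind of first pass I have in mind. It assumes you’ve exported each launch window from Google Analytics as a daily Visits CSV; the file names and the date and visits column names below are just placeholders for whatever your export actually looks like.

```python
# Rough first pass at comparing two launch windows exported from Google Analytics.
# Assumes each CSV has "date" and "visits" columns (placeholder names; adjust
# to match your actual export).
import pandas as pd

def load_visits(path):
    # Parse dates and put the window in chronological order.
    df = pd.read_csv(path, parse_dates=["date"])
    return df.sort_values("date").reset_index(drop=True)

def summarize(df, label):
    days = len(df)
    total = int(df["visits"].sum())
    per_day = total / days  # normalize for unequal windows (e.g. 30 days vs. 23 days)
    print(f"{label}: {days} days, {total} visits, {per_day:.0f} visits/day")
    # Look for the secondary bump: the biggest single day after the launch week.
    later = df.iloc[7:]
    if not later.empty:
        peak = later.loc[later["visits"].idxmax()]
        print(f"  secondary peak on {peak['date'].date()}: {int(peak['visits'])} visits")

summarize(load_visits("udemy_visits.csv"), "Udemy")        # hypothetical export
summarize(load_visits("bynamite_visits.csv"), "Bynamite")  # hypothetical export
```

It won’t replace the referral reports, but it at least keeps the 30-day versus 23-day comparison honest.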

career two by four

Lately I’ve had occasion to give advice to a few people who are early in their careers.  I always find myself amusingly inept at this activity – the more actual experience I have, the more young people think I have something useful to tell them, but the further I am from the time when I was actually making the decisions they face, so the less accurate my recollection is, and the more my advice is colored by soft nostalgia rather than rooted in hard facts.  The wisdom of experience turns into the banality of platitudes.

Of course, none of this stops me from spouting on and on about how to manage your early career.  One set piece I often relate is that there are only four personal characteristics that can advance your success: Intelligence, Diligence, Personality and Mentality.  Many people get very far early on with just one of these characteristics, and so they begin to believe that this characteristic is the most important or even the only important one.  When they begin to fail, they double down on the characteristic that they believe in, which only deepens their failure.

To understand why this is true, consider the other side of this same advice, which applies to people just learning how to manage teams.  There are few things as destructive to a team as the person who has one of the characteristics in spades, but lacks any useful amount of the others.  The brilliant genius who can’t get along with others, the guy who works terribly hard but always on the wrong things, the “people person” who plays politics rather than solves problems, the hard charger who plays to win at any cost – these are all different forms of the same cancer, and they must be excised from the team as soon as they are identified.

So development of the four characteristics rules both sides of the management divide.  And on either side, you have to have great strength in more than one of these characteristics, and you have to understand how all of them contribute to success.

never quite the same

p. 67:

They were never quite the same ones in physical person but they were so identical one with another that it inevitably seemed they had been there before.

This is Fitzgerald’s description of the rotating retinue of “four girls” who always accompanied one of the revelers at Gatsby’s parties.  The narrator admits “I have forgotten their names . . . the melodious names of flowers and months or the sterner ones of the great American capitalists whose cousins, if pressed, they would confess themselves to be.”  The girls are objects of art, objects of desire, signifiers of sex and wealth.  A feminist critique of Gatsby would deplore the nameless characters and the “girl” terminology.

But Fitzgerald’s gift is observation, not social commentary – and observation stands up better over time than commentary ever could.  The objectification he describes continues today, with different meaning and different dynamics.  These days a man can travel like Robert Palmer only as satire; making a habit of it just looks silly.  So we read this novel of the past with the feelings and morals of the present, which only enriches our understanding of how these crowded parties were filled with empty people.

the Internet is making us bad writers

Over the last several years, many people have engaged in discussion and debate about whether “the Internet makes us stupid.”  What is this debate really about?

The first volley in the debate may have encapsulated the entirety of its substance.  Doris Lessing, in accepting the 2007 Nobel Prize in Literature, asked:

How will our lives, our way of thinking, be changed by this Internet, which has seduced a whole generation with its inanities so that even quite reasonable people will confess that once they are hooked, it is hard to cut free . . .

As the vanguard and finest defender of the cutting edge, TechCrunch boiled down Lessing’s careful rumination into “the Internet makes us dumb,” and crafted the exquisitely reasoned rejoinder:  “Meh.”

The following year, Nicholas Carr kicked the debate into high gear by asking, “Is Google Making Us Stupid?”  Carr noticed that after years of using the Internet as his main source of information, he’d become less able to apply sustained concentration to reading lengthy articles and books.  He found anecdotes and early research that suggested that the constant browsing and skimming of information so typical of Internet reading exercised the brain in a different (arguably more shallow) way than the “deep” reading of books.

Carr himself noted people often feared that new technologies would limit human progress, without being able to imagine the ways those technologies would expand our knowledge and further progress:  Socrates complained that writing allowed people to cease exercising their memories; the Gutenberg press was once decried as a tool of intellectual laziness.

Nevertheless, now two years later, Carr has more firmly concluded that the Internet has rewired our brains to crave new and trivial information, at the expense of deep analysis and critical thinking.  From Carr’s original article through the recent publication of his book The Shallows, the question has become a matter of popular, academic and public concern. TechCrunch continued its proud tradition in this debate, dismissing Carr’s question as merely his “axe to grind.”

This is the kind of debate that can go on for a very long time, because the titular question is ironically stupid, though in a clever, link-baiting, book-selling way.  Knowing what “stupid” is requires defining “intelligence,” which is a concept so malleable that anyone who isn’t stupid (and many who are) can argue without end that the other side is being stupid (or at least, isn’t being smart about what stupid is).  Carr is not actually stupid, and I think his question isn’t designed to be answered.

However, there is one way that the Internet has broken a chain that began thousands of years ago:  for the first time since the invention of writing, good writing is no longer crucial to the transmission of knowledge.

When information is available everywhere from anyone at little cost, the power of good writing is diminished as a vehicle for knowledge.  Think of it this way:  Was Plato the smartest of Socrates’ students, or was he merely the best writer?  If all of the philosophers of Ancient Greece had blogs and Twitter, would we even know who Plato was?  Would we hold any single one of them in such high regard?  I think not.  And yet, I think we would still have the full breadth and depth of Greek philosophy in our human knowledge base.

The constraints of physical media, from stone tablets to wood pulp, meant that only the best writing could survive the culling of editors, libraries, wars and time.  So only good writers could pass their knowledge through the generations.  Now that anyone can publish and everything is stored forever and can be found easily, anyone can transmit knowledge so long as it is relevant, and regardless of whether it is the best-written statement of the concept.  If that were the case in Socrates’ time, we might have heard about the Cave from any one of his students – or maybe a dozen of them would have tweeted about it simultaneously.  So we would know the allegory of the cave without knowing or caring who the author was.

This thought must torture good writers everywhere, including Nick Carr, so maybe that’s what his question is really about.  The Internet isn’t making us stupid, and to be precise, it isn’t really making us bad writers.  But it does make good writing matter less.  Oh sure, you can argue that there’s an art to a good blog post or tweet or status update.  But this isn’t like defining “stupid” – there really is a meaningful standard of good writing that people of taste and discernment agree upon, and people who argue otherwise are stupid, for lack of a better word.

The highest challenge in writing – as an act and art separate from the communication of information – is a lengthy work that commands sustained interest and concentration from a reader who enters the writer’s world, rather than the other way around.  The Internet is a reader’s world, and that probably does make readers smarter.  But it makes good writing for writing’s sake matter less, so people who otherwise would have had to be good writers to communicate their ideas can now just get their ideas out in 140 characters.  Is that a bad thing?

I’ve been in this cave my whole life, but now I’m free. OMG, everything I thought was real was only shadows on the wall!! via @Socrates