All posts by Grant McCracken

A celebrity lab by celebrities for celebrities

I read somewhere recently that Judy Greer has a way to categorize her fans.  So when she sees them in public, she can tell who she’s dealing with.

Perfect, I thought.  It’s about time celebs turned the tables.  We spend a lot of time talking about them.  They dominate TV, many magazines, and much of the chatter on-line.  It’s about time they started studying us.

A celebrity lab by celebrities for celebrities would be a good idea for strategic, entirely self-interested reasons.  However much it may feel like their celebrity is inevitable, fame is something conferred by the fans.  And what the fans give, the fans can take away.    Ask Alec Baldwin, Gwyneth Paltrow or Matt Lauer.

Celebrities could start with a typology of fans.  And there is a lot to categorize.  Some fans are deeply scholarly.  They can recite the biographical details, film titles, dialogue.  Some merely prize a particular film character.  “I loved you in A Walk on the Moon!”  Others just happen to be in love with the Star Wars or Spider-Man franchise and you are a sudden, irresistible opportunity to make contact.  Still others are merely excited because they are in the presence of someone famous and people are screaming.  Still others are just screaming.  Distinguishing one from the other would be a good thing.  Having a strategy for each of them would be most helpful.

There is also, of course, a dark side.  There are those who make a great noise but are essentially harmless.  Others are stalkers in training.  And still others are so dangerous, the right thing to do, the only thing to do, is to take cover as quickly as possible.  Early warning here would be unbelievably valuable.

The celebrity could assume that everyone they meet in public is a nut job and armor themselves with beefy security guards.  But in fact public appearances, even impromptu ones, are part of the job, the way you renew your celebrity.  Really, the celeb has no choice but to expose himself or herself to interactions with the public.

So celebrities really need a way to tell who they are dealing with, on sight, in real time.  This person I see before me, the one grinning ear to ear and making a high-pitched sound, is this a goofball or a psychopath?  A typology would help.

I got to see celebrity at work when I was the chauffeur for Julie Christie during the filming of McCabe & Mrs. Miller.  She was so famous at this point that Life magazine had declared 1965 “the year of Julie Christie.”

One afternoon, Julie and I were in a candle store.  We happened to be standing beside some guy who couldn’t decide whether his gift candle should be lemon or lime.  Julie volunteered that the lemon might be quite nice.  The guy turned to thank her for this advice but as it dawned on him that the speaker was the most famous woman in the world, he found  he had no words.  He walked out of the store, both candles in hand.

This is the best case of celebrity.  Your charisma protects you from contact, smooths your path, charms your existence.

But there were other moments when people would come up to us and barge into Julie’s personal space.  I didn’t quite know what to do.  You couldn’t tell whether this was a friend she didn’t recognize or a hostile she ought to fear.  Scary as anything.  And there would be this unpleasant moment when we would have to wait for the person to throw off a little more information, so that we could figure out who they were and what they wanted.  And in those several seconds, we were vulnerable.

So some system for identifying strangers and a set of strategies for dealing with them would have been a very good thing.  Go, Judy, go.  And if you need a team of anthropologists to work on this problem, call me.  I could put together a set of teaching materials and conduct a lab.  It would be like teaching in the Harvard Business School classroom again except I wouldn’t have to memorize anyone’s name.

Orphan Black and cultural style

Readers of this blog know that I’m a fan of the show Orphan Black on BBC America (Saturdays at 9:00). It resonates with the transformational and multiplicity themes so active in our culture now.  See my post here.

I finally got to see Episode 1 of the new season (2) this morning and I was captivated by this scene.

Apologies for the quality of this clip.  I shot it with my phone.  Perhaps the show runners Graeme Manson and John Fawcett would consent to put the original up on YouTube.  (In fact, the last moment of this clip shows Manson and Fawcett in a Hitchcockian turn.  Manson is the camera man.  Fawcett is the man in the glasses.)  (See the whole of this episode on the BBC website here.)

I think this clip touches on a couple of recent posts, especially the one on Second Look TV and the one on “magic moments.”  You decide.

But the real opportunity here is to comment on a truth in anthropology.  My field is, among other things, a study of choice.  There are so many ways of being human, of acting in the world, that people must choose.  (There is a famous story about a Russian actor proving his virtuosity by delivering the word “mother” in 25 distinct ways.)  How will we say a word, make a greeting, or carry ourselves?  We have to choose.  There are, for instance, lots of ways to do a “high 5.”

We have to choose from all the choices and once we choose we are inclined to stabilize the choice and use it over and over again.  It may shift with the trend, and we are alert to these shifts, but for the moment, an invisible consensus says, this is how we do the high 5.

But this is not only a personal choice.  We make these stable choices as the defining choice of a nationality, ethnicity, gender, region, class, status, and so on.  Eventually, this choice becomes a style, a signature way we express ourselves.  It is a way we are identified by others.

Hey, presto.  Imagine an actress’s delight.  With styles, she has a device with which to tell us who her character is and what her character is doing in any given part of the narrative.

Tatiana Maslany, the Canadian actress who plays the clones, has the exceptional task of delivering the “truth” of each clone even as she must make them identifiably different.  But of course she is going to use style.

In this scene, she is giving us Alison, the suburban clone.  The minivan, pony tail and jump suit label that identity, but then comes the hard part.  To show Alison in all her Alisonness.  And still more demandingly to show Alison under duress.  (Sarah, the street toughened con-artist clone, can handle herself in a fight.  The trick is to show Alison making her response up as she goes along.)

There is lots to like in this scene and, reader, please exercise your “second look” privileges to go back and scout around.

I love the moment when we see Alison spraying and blowing.  She is after all a multitasking mom.

I love the ineffectual last tweet that comes when she gets pitched into the waiting van, expiration meets exasperation meets astonishment. Who is this man?

I love the small gesture with which Maslany gathers her composure before leaving the van, squaring the shoulders and fixing her pony tail.

And then the wonderful look of dismissal she gives her captor as she closes the door of the van.  Alison is back in possession of her suburban self possession.  What’s nice about this among other things is that it shows the Alison beneath the Alison.  Yes, her self possession has been shaken by this event but where most of us would be wordless and traumatized, Alison is back.

That last moment of the clip, the one in which we see a brief, Hitchcockian appearance from the show runners, I like as well.  There was a time when it would be ridiculous to talk about these showrunners and the movie-making master in the same breath.  But TV is getting so good these days, the comparison is not far off, and closing all the time.

It’s usual to talk about this Golden Age of TV, but that suggests that TV is now completing its glorious ascendancy.  And this just seems wrong.  With performances like Maslany’s and shows like Orphan Black, I think it’s more likely that TV is just getting started.

Thanks to the anonymous reader who discovered a naming error.  (Now corrected.)

 

Does capitalism have thermals? (aka the evolution of Paramecium, Inc.)

A couple of months ago, I had the good fortune to have lunch with Napier Collyns.  Mr. Collyns is one of the founders of the Global Business Network and a man with a deep feeling for the rhythms and complexities of capitalism.

I came home and banged out this little essay.  It’s an effort to think about the possibility that “value” goes from the material to the immaterial.  A company might begin by making hammers but sometimes it ends up making value that is less literal and broader.

Does capitalism have thermals?

A bigger picture may be called for when we think about capitalism.   In his famous essay, Marketing Myopia, Theodore Levitt encouraged people to ask, “What business are you in?”   The question had a strategic purpose: to rescue managers from their literalism.

In the early days of the railroads, managers were preoccupied with laying thousands of miles of track.  The next generation devoted itself to making a magnificent delivery system for industrial America.  With the rise of the automobile, the truck and the plane, things changed.  But the conceptual shoe didn’t drop for management until Levitt gave them a big picture.  “You’re not in railroads, you’re in transportation.”

There is perhaps an inevitable developmental pressure.  As the world becomes more complicated (and capitalism routinely makes the world more complicated), the ideas with which it is understood must become more sophisticated.  One minute we’re laying track.  The next, we’re wondering how to compete with things that fly.

The only way to grasp the intellectual challenge is to generalize.  This helps break the grip of literalism, the one that says, trains are trains and planes are planes.  No, says Professor Levitt, trains and planes are the same thing but only if we move to a higher vantage point.

A second thermal comes in the shape of commodity pressure.  In every market, incumbents eventually draw imitations (aka “knock offs”) into play.  The incumbent is faced with two choices.  It can engage in a “race to the bottom” that occurs as incumbent and imitator sacrifice margins until everyone finds themselves mere pennies above cost.  (Thus does the innovation become a commodity.)

Or, the innovator can climb the value hierarchy, moving from simple functional benefits that the imitators can imitate to “value adds” they cannot.  Thus did IBM find itself challenged by off-shore competitors who offered bundles of software and hardware at 40% of what IBM was charging.

Customers snapped up these cheaper alternatives, only to discover that the commodity player was not supplying the strategic advice and intelligence that came with the IBM version of the bundle.   Now IBM had to learn to talk about this value, and to make more of it.   They were obliged to cultivate a bigger picture.

Here’s another “thermal.”  Premium players traditionally defend themselves from commodity attack by creating higher order value that almost always comes in the form of idea and outlook.  Thus Herman Miller, the furniture maker, confronted by an off-shore competitor that was prepared to make chairs for much less, redoubled its effort to sell not just chairs but new ideas for what an office could be.  This thermal has intensified as new commodity players have emerged from China, India, and Brazil.

Paramecium, Inc.

We could argue that capitalism has had thermals almost from the very beginning.  In the beginning, enterprises were inclined to be structurally simple, a single cell mostly oblivious to the world outside itself.  Call this “Paramecium Inc.” or Level 1.  The enterprise makes hammers.  It assumes someone out there wants hammers but the focus of attention is on the hammer.

Eventually someone comes along and says, “actually, what the company makes matters less than what the consumer wants.”  Thus spoke Charles Coolidge Parlin in 1912 when he asserted that the “consumer was king.”    Closing the gap between company and consumer has been a work in progress.  New methods, theories, and resolve have come from the likes of Peter Drucker and A.G. Lafley, and somehow the gap persists.   But at least the Paramecium is evolving, reckoning with things outside itself.  This is Level 2.

In time someone says, “we need to think more systematically about our competitors.”  This is the long standing focus of economics, but in the late 1970s, Michael Porter offered a new approach, and his approach to strategy proved influential.  Here too the organization is sensing and responding to the world outside itself.  It is scaling not so much up as out.  We are now at Level 3.

With each new Level, we “dolly back” to see more of the world.  Our “paramecium” is increasingly aware of itself and the world outside itself.  This is a movement from the narrow to the broader view, from the local to the global, from the provincial to the cosmopolitan.

Level 4, collaboration, has several moments.  The enterprise, now less solipsistic, can entertain partnerships.  The organization that once insisted on a crisp, carefully monitored border now consents to something that looks more porous.  The Japanese influence helped here.  So did the “outsourcing” movement.  Most recently, with the advent of new media and digital connections, collaboration expands to include still more, and more diverse, parties.

In Level 5, we are encouraged to see that the enterprise must reckon with the meanings, stories, identities, subcultures, and trends with which people and groups construct their world.  Noisy and rich in its own right, culture supplies some of the “blue oceans” of external opportunity and the “black swans” of external threat.  A great profusion of consultancies and aggregators springs up to cover culture.

Level 6, context, was once merely a field or container for all the other levels.  But now the field has come alive, no mere ground but now a source of dynamism all its own.  In this bigger picture, the enterprise can feel itself a tiny cork in a veritable North Sea.  Disruptive change comes from all directions.  Strategy and planning become more difficult, and some enterprises descend into a simple adhocery. The world roils with deliberate change and its unintended consequences.

There is an intellectual challenge at Level 6.  Making sense of a world that is so turbulent, hard to read, and inclined to change is difficult.  Indeed identifying the unit of analysis is vexing.  Are we looking at “trends,” “stories,” “scenarios,” or “complex adaptive systems”?  Should the enterprise do this work by hiring x, y or z?

“Context” is a wind driven sea.  The horizon keeps disappearing, navigational equipment is dodgy, the world increasingly unfamiliar, inscrutable and new.  We are, to use the language of T.S. Kuhn, post-paradigmatic.

The movement of Levels 2 through 4 has been conducted under expert supervision.  But Levels 5 and 6 are vexing partly because there is no obvious intellectual leadership.  Even the “experts” are challenged.  The problems created by Levels 5 and 6 are simply unclear and we continue to disagree on even simple matters.

Reading this through, a couple of hours after publication, it occurs to me that there is for some corporations a Level 7.  This is where the corporation embraces its externalities and takes an interest in the larger social good that can come when the corporation thinks about what value it can create for creatures other than itself.

I was in a strategy session a couple of years ago when a guy from Pepsi, I believe he was actually the CMO (let me check my notes), actually said, “I am committing my organization to solving every environmental problem it has in its purview and can get its mitts on.” Wow, I thought, this is capitalism writ large.

A new name for this blog


My blog subtitle used to be “This blog sits at the Intersection of Anthropology and Economics.”  This was both too grand and untrue.  Fine for politicians but not websites.

So now it’s “How to make culture.”  For the moment.  Also thinking of “New Rules for Making Culture.”  Is that better?  I can’t tell.  Please let me know.

Yesterday, I was blogging about the new rules of TV.  And in the last couple of weeks I’ve been talking about advertising, education, late night TV, game shows, culture accelerators.  Less recently, I’ve been talking about marketing, comedy, language, branding, culturematics, story telling, hip hop, publishing, and design thinking.

All of this is culture made by someone.  And all of it is culture made in new ways, often, and according to new rules, increasingly.  Surely an anthropologist can make himself useful on something like this.  Anyhow, I’m going to try.

I have four convictions.  Open to discussion and disproof.

1) Our culture is changing.  Popular culture is becoming more like culture plain and simple.  Our culture is getting better.

I have believed in this contention for many years.  Certainly, since the 90s when I still lived in Toronto.  (It was my dear friend Hargurchet Bhabra who, over drinks and a long conversation, put his finger on it.  “It’s not popular culture anymore.  Forget the adjective.  It’s just culture.”)

This was not a popular position to take especially when so many academics and intellectuals insisted that popular culture was a debased and manipulative culture, and therefore not culture at all.  Celebrity culture, Reality TV, there were lots of ways to refurbish and renew the “popular culture is bad culture” argument.  And the voices were many.  (One of these days I am going to post a manuscript I banged out when living in Montreal.  I called it So Logo and took issue with all the intellectuals who were then pouring scorn on popular culture one way or another.)

My confidence in the “popular culture is now culture” notion grew substantially this fall when I did research for Netflix on the “binge viewing” phenomenon.  To sit down with a range of people and listen to them talk about what they were watching and how they were watching, this said very plainly that TV, once ridiculed as a “wasteland,” was maturing into story telling that was deeper, richer and more nuanced.  The wasteland was flowering.  The intellectuals were wrong.

2) This will change many of the rules by which we make culture.  So what are the new rules?

I mean to investigate these changes and see if I can come up with a new set of rules.  See yesterday’s post on how we have to rethink complexity and casting in TV if we hope to make narratives that have any hope of speaking to audiences and contributing to culture.  Think of me as a medieval theologian struggling to codify new varieties of religious experience.

3) The number of people who can now participate in the making of culture has expanded extraordinarily.  

This argument is I think much discussed and well understood.  We even know the etiology, chiefly the democratization (or simple diffusion) of new skills and new technology.  What happens to culture and the rules and conventions of making culture when so many other people are included, active, inspired and productive?  We are beginning to see.  Watch for codification here too.  (As always, I will take my lead from Leora Kornfeld who is doing such great work in the field of music.)

4)  We must build an economy that ensures that work is rewarded with value.

I have had quite enough of gurus telling us how great it is that the internet represents a gift economy, a place where people give and take freely.  Two things here.  1) The argument comes from people who are very well provided for thanks to academic or managerial appointments.  2) This argument is applied to people who are often obliged to hold one or more “day jobs” to “give freely on the internet.”  Guru, please.  Let’s put aside the ideological needlework, and apply ourselves to inventing an economy that honors work through the distribution of value.

I have made this sound like a solitary quest but of course there are many thousands of people working on the problem.  Every creative professional is trying to figure out what he or she can do that clients think they want.  I am beginning to think I can identify the ones who are rising to the occasion.  They have a certain light in their eyes when you talk to them and I believe this springs from two dueling motives I know from my own professional experience, terror and excitement.

Thanks

To Russell Duncan for taking the photograph.

Two New Rules for TV story-telling (aka things to learn from Being Human)


Let’s begin here:

My Netflix research this fall tells me that the rules for making popular culture and TV are changing.

The cause?  That popular culture is getting better and this means some of the old rules are now ineffectual and in some cases actually counter-productive.

Being Human is a great case study.

This is a study in fantasy and the supernatural.  A ghost, a vampire and a werewolf find themselves living together and look to one another for guidance and relief.

It is a show riddled with implausibilities.  Characters skip around in time and space.  They morph from one creature to another.  The plot lines can get really very complicated.

And the viewer doesn’t care.  (At least this viewer doesn’t.)  The acting is so good that we believe in these characters and we are prepared to follow them anywhere.   Even when the plot tests our credulity, we believe in the show.

The key is good acting.  Without this, Being Human is just another exercise in dubiety.  With it, the show holds as a story and more important it actually serves as an opportunity to ask big questions that attach to “being human.”

There is a second show on Syfy called Lost Girl.  This is billed as a supernatural crime drama.  It too is stuffed with implausibility.  Lots of fabled creatures and magical spells.  For me, it’s pretty much unwatchable.

And the difference is largely acting.  The actors on Lost Girl are not bad.  They are just not good enough to deliver the emotional truth on which narratives depend, but more to the point they are not good enough to help Lost Girl survive the weight of its own implausibility.

This condition is actually complicated by the creative decision to have the characters supply the “ancient lore” that explains spells and various supernatural beasties.  I found myself shouting at the TV,

“Oh, who the f*ck cares!  The back story is a) not interesting, b) it does not animate the front story, c) in short, the back story is your problem, not our problem.  Get on with it.  Spare us the pointless exposition.”

(Yes, it’s true.  I shout in point form.  It’s a Powerpoint problem.  I’m getting help.  It’s called Keynote.)

New Rule # 1

The more implausibility a narrative contains, the better your actors must be.

If this means spending more time casting, spend the time casting.  If this means paying your actors more, pay them more.  Actors are everything.  Well, after the writers.  And the show runners.  Um, and the audience.  But you see what I mean.

And this brings us to the second new rule for story telling on TV.  The old rule of TV was that actors should be ABAP (as beautiful as possible).  Given the choice between someone who is heartstoppingly attractive and someone who looks, say, like one of the actors on Being Human (as above), you must, the old rule says, choose the actor who is ABAP.  (The Being Human actors are attractive.  They just aren’t model perfect.)

This rule created a trade off.  Very beautiful actors were chosen even when they weren’t very talented as actors.  Indeed, show runners were routinely trading talent away for beauty.  As a result, a show began to look like a fashion runway.  Even good writing could be made to feel like something out of the day-time soaps.

Bad acting is of course the death of good narrative.  Wooden performances can kill great writing.  But real beauty exacts a second price.  There are moments when you are supposed to be paying attention to a plot point and you find yourself thinking, “Good lord, what a perfectly modeled chin!”

In a perfect world, every actress would be Nicole Kidman, perfectly beautiful, utterly talented.  In the old days, when TV makers had to choose they would go for beauty even when it cost them talent.  But here’s the new rule.

New Rule # 2

Do not choose beauty over talent.  Beauty used to be the glue that held your audience to your show.  Now that work is performed by talent.  It’s not that beauty doesn’t matter.  Seek attractive actors.  But beauty will never matter more than talent.  Make sure the talent is there, and then, and only then, can you cast for beauty.  Think of this as a kind of “attractive enough” principle.

Stated baldly, this rule seems indubitable.  What show-runner or casting agent would ever think otherwise?  On the other hand, I dropped in on The CW recently and everyone seemed model perfect with bad consequences for the quality of the work on the screen.

A change is taking place in our culture.  And over the longer term, it will provoke a changing of the guard, a veritable migration in the entertainment industry.  Actors who are merely beautiful will have a more difficult time finding work.  And, counterintuitively, actors who are blindingly attractive will have a more difficult time finding work.  What used to make them effective now makes them distracting.

As popular culture becomes culture, there will be many more changes.  Watch this space.

More thoughts on advertising’s “magic moment”


(With thanks to Rick Boyko, pictured, for the conversation from which the idea for this blog post sprang.)

Last week Bob Scarpelli and I offered some thoughts on the “magic moment” in advertising.  The magic moment is the small detail that helps bring an ad suddenly, unexpectedly to life.  Here’s the original post.

We can’t quite say how the magic moment works.  What’s worse, we can’t plan for the magic moment or even anticipate it.  It just happens.

It is this unpredictable quality that prompts some people in the ad biz to insist that the magic moment is off limits.  It cannot be part of the industry’s value proposition, or the way any particular agency sells its wares.  After all, if the magic moment is pure serendipity, it can’t be created, managed, predicted, or, least of all, promised.  It is a gift from the gods and the gods pretty much do what they want.

Even if a client hires the best agency, with the most robust planners, strategists and creatives, there is just no telling whether a magic moment will manifest itself.

I admire how scrupulous this is.  I admire an industry that will not promise what it cannot deliver.  But there is another way to make the argument.

Yes, magic moments are serendipitous, but that does not mean they are beyond our grasp.   We can increase our chances of summoning the magic moment.  We can call it out of the heavens.  There are no absolute assurances.  But we can increase the odds.

And this is precisely why those who hope for magic moments will spend the time and money to hire the right agency, director of photography, casting director, and actors.  These people cannot deliver magic moments but they will act like one of those “listening arrays” with which we scrutinize the heavens.


It turns out magic moments are not truly random.  They don’t happen to stupid, talentless hacks.  And this means talent does play a role.  And this means, at the very least, our chances of a magic moment go up when we are dealing with people with talent, imagination and intelligence.  (And that’s what we pay them for.)

There is some connection.  Somehow, talent plays nursery to genius.  Agencies and creatives matter.  We can summon magic, even when we cannot promise it.   In that famous phrase, the gods favor the well prepared.

We may have merely increased the chances of a magic moment by, say, 40%.  For the creative community, this looks meager and nothing like a sales pitch.  They can’t imagine ever selling anything this way.  But for the statistically gifted brand manager, 40% is an opportunity to assess the risk and justify the expenditure.  Believe me, what the brand manager does not want to hear is, “Oh, this is completely mysterious.  We have no idea how it happens.  Just pay us.”  But we are wrong to think that “our chances go up 40%” means little.  Forty percent is something to reckon with.

My conclusion: the ad agency should be selling itself with the magic moment.  This should be a way to distinguish agencies from non-agencies and good agencies from bad agencies.  And it should be the grounds on which agencies justify their fees and the fees attached to recruiting the best talent.  We are not guaranteeing magic moments.  But we are increasing their likelihood.

How would you help Bosco out of that meth lab?


I have a friend who lives in the Midwest and serves as a court-appointed advocate for kids.  One of his “wards,” an eight-year old, recently regaled him with a detailed and enthusiastic description of a meth lab.  This kid  described the cooker, the cooks, the chemicals, the masks, the precautions, the security, the works.

“Bosco,” (we will call him, not his real name) knows all this stuff not because he has ever watched Breaking Bad.  No, he knows this because his parents cook meth.  Or at least, they did until they were arrested, and Bosco ended up a ward of the state.

My friend and I were wondering what cultural creatives could do to help Bosco.  (There are simpler, more direct ways, to be sure.  The question here was what could we do in particular as cultural creatives.)  Most of us live in a cosmopolitan world, soaked in intellectual and cultural capital.  As the beneficiaries of a middle class existence and university educations, we know a lot about lots of things: design, economics, politics, current affairs, Brooklyn, Los Angeles, fly fishing, Route 66, Russian novels.  And if we run across something we don’t know, we know someone who does know.  A quick email and we are in the know.

Bosco doesn’t know much of this at all.  His world is small and, outside of his meth expertise, his knowledge of the world is limited.  If he is struck by a question for which he has no answer, chances are he’s on his own.  Most of the adults in Bosco’s world live in a world that is small, ill informed, and starved for stimulation.  They think cooking meth is worth the risk.

The question is this: how to pour intellectual and narrative stimulation into Bosco’s world.  PBS does a great job helping Bosco with his letters and his numbers.  Where could he go to expand the horizons of his world?  (Assuming this stimulation does not pour into his world while he is learning his numbers and letters from PBS.)

The trick here is to construct an intellectual, imaginative world for Bosco that makes cooking meth look like a dubious choice.  Naturally, enrichment will have other benefits.  It will increase the wisdom with which he makes all of his life choices.  It will increase the likelihood that he will finish high school and college.

But that’s our minimum.  What knowledge of the world, what intellectual and imaginative resources, could we give Bosco that would make cooking meth go from the biggest thing he knows to one of the smallest, and evidently, one of the most dubious things he knows?

I am thinking of making this a Minerva competition.  And that really is the first question.  Is this a good question?  Could a cultural creative answer it in a useful way?  It may not be.  Some of my Minerva questions turn out to be more arresting than others.  Your comments, please.

Second Look TV

For most of its existence, TV was designed to be “one look” entertainment.  We were supposed to grasp things the first time, and if it happened that some complexity or nuance escaped us, well, not to worry.  It can’t have been that important in any case.  TV was forgettable culture.  Tissue-thin and completely disposable.

But we are entering into the era of “second look” television.  Sometimes this happens because we were making a sandwich or playing with the cat.  Never mind, a simple push of the go-back button, and we are caught up.

But some TV is now created with the expectation that we will not and cannot get it the first time.  If it pleases the court, I offer the following Sprint ad into evidence.

Notice that it’s not just the dialogue and foreign language(s) that demand the replay.  This ad features Judy Greer, who is fast rising from “sidekick” standing to full-blown celebrity.  Plus there are parts that make no sense however many times we watch them.  (The final moment, when everyone looks suddenly at the hamster, is wonderful partly because it is inscrutable, and permanently so.)

Pam, my wife, and I spend a lot of time freezing the frame and going back.  “Wait, did she say what I think she said?”  Or “Hey, did you notice that guy in the background?”  Or “Get a load of this camera angle!”  This is what it is to live with Second Look TV and the technology that makes replay effortless.

Indeed, culture and technology do an attractive two-step here.  The technology makes this possible.  Culture (in the form of new complexity) makes it necessary.  And so continues our steady transition from a pop culture to a culture, plain and simple.

Sure it’s good for the game show, but how about the host?

I had a look in on Let’s Make A Deal this morning.

Wayne Brady is the host.  Drew Carey is the host of The Price Is Right.  Both are “graduates” of Whose Line Is It Anyway?  Improv has come to daytime television.

The use of improv comics is a great way to animate the game show genre, now decades old and in danger of becoming formulaic in spite of all the ingenuity and enthusiasm coming in waves off an extremely “amped” audience.

An improv comedian can turn a split second into something funny and fresh.  Hey presto: new blood for old shows.  On Whose Line Is It Anyway?, Brady was fearless.  Clearly, it didn’t bother him that he was called upon to work without a net.  No script.  No direction.  No advance warning.  He could handle anything the show threw at him.

But here’s the question.  Even as we acknowledge what Brady gives to the show, we have to ask what the show is taking from him.  What is it like for someone this good at novelty to be stuck in something that is rarely very novel at all?  I wonder if he feels like those World War II aces who were called upon to pilot space capsules in the early days of NASA.  Accustomed to maximum control, they were now, in their language, “spam in a can.”

This is a tension in the entertainment biz.  How do we deliver the soothing samenesses that come from genre and formula without creating something that ends up being stupefyingly dull? As it is, Let’s Make a Deal skews way too far in the direction of formula.  This doesn’t just test the patience of the TV audience.  It must also test the endurance of the host.  For someone who can turn .5 seconds into comedy riches, 60 minutes of predictability must feel like an eternity.  Five times a week.

Image courtesy of Creative Commons and Wikipedia.  Author attribution:  DaniDF1995

Stephen Colbert replaces David Letterman. Please help us figure out what this means!

This just in.  We learned moments ago that Stephen Colbert will replace David Letterman on late night television.

We can identify the cultural significance of David Letterman.  He came to prominence on the back of a cultural trend, the Preppie revolution.  Letterman was the guy who liked to stand in a window in Rockefeller Center and proclaim through a bullhorn, “I’m not wearing any pants.”  This was preppie humor, a frat boy prank.

Below is my cheat-sheet treatment of the Preppie revolution as it appeared in Chief Culture Officer.

I would love it if people would offer a brief account of the cultural movement that brought Stephen Colbert to prominence and the shift in culture his rise represents for us.  Don’t feel obliged to give a detailed account.  We can make this collaborative.  Just take a different piece of the puzzle and I will try to piece it all together when the “results” are in.

Here’s the passage from my Chief Culture Officer:

The preppie convergence began to form visibly and publicly around 1980, but if we were astoundingly well informed and gifted, we could have seen it coming ten years before.  Doug Kenney founded National Lampoon in 1970 with staff from the Harvard Lampoon.  And we could have tracked the success of this convergence as the publication began to scale up.  National Lampoon published parodies of Newsweek and Life, the 1964 High School Yearbook Parody (1974), and a well-received issue whose cover read, If You Don’t Buy This Magazine, We’ll Kill This Dog.  By the end of the 1970s, Lampoon circulation had reached nearly a million copies per month.  And by this time even the dimmest trend hunter had it on their radar.

Sales is one thing.  We should also be alert to the migration of talent.  In the case of the preppie convergence, we needed to be paying attention when the world started raiding the Lampoon for talent.  Kenney left to write movies.  Michael O’Donoghue left in 1975 to become head writer for Saturday Night Live.  P.J. O’Rourke left to write for Rolling Stone.  The National Lampoon spoke with the voice of the ruthless private school boy.  Apparently this was now in demand.

We should have noticed when the preppie convergence began to colonize the movies.  Kenney created Animal House in 1978 and Caddyshack in 1980.  The first featured Tim Matheson, the second Bill Murray.  The prep also appeared in Bachelor Party (1984), played by Tom Hanks.  Perhaps most famously, the prep turned up in the 1982 NBC series Family Ties in the character of Alex P. Keaton, played by Michael J. Fox.  He also appeared in late-night comedy in 1982 in the person of David Letterman, who gave voice to the prep form by standing in a window of Rockefeller Center and announcing with a bullhorn, “I’m not wearing any pants.”  (Preps loved to be vulgar and clever at the same time.  It’s a frat thing.)

Everyday language began to vibrate with new phrases: “go for it,” “get a life,” “get a grip,” “snap out of it.”  It was easy to see how these spoke for the new convergence.  People were impatient with the old pieties.  That was ’60s idealism, and people were done with that.

Convergences must shake the webs of the publishing world.  (Or they cannot be convergences.)  One of the best sellers of the period was Lisa Birnbach’s The Official Preppy Handbook (1980).  This was 200 pages of detailed instruction: what to wear, where to go to school, what sports to play, what sports to watch, what slang to speak, how to be rude to a salesperson, and how to mix a Bloody Mary.  If the National Lampoon had supplied the new character of the decade, here were instructions of a much more detailed kind.

The consensus was visible in public life.  Suddenly Harvard Yard, never especially presentable in its architecture, appointments, or personnel, filled with glossy teens in down vests, Norwegian sweaters, and Top-Siders, all newly minted by L.L. Bean.  Some of them were the children of Old Money following ancestral footsteps into the Ivy League.  But most were kids from Boston University who believed that the Yard was a better lifestyle accessory.

The convergence began to recruit ferociously.  A young woman remembers.

As a teenager [my mom] was pulling The Preppy Handbook out from under my [sleeping] cheek.  These were the mid-80’s, and I just lapped up all that puppy/yuppie/J. Crew catalogue/Land’s End stuff.  I didn’t want to live in Wisconsin; rather, I wished my parents played tennis and would send me away to Phillips Exeter.  In fact, I waged a two-year send-Ann-to-Exeter campaign (“or, hey Choate would be O.K. C’mon, at least consider the University School of Milwaukee!”).  I wished we summered on Martha’s Vineyard and wore penny loafers without socks.  I wanted to ski in Vermont during Christmas vacation like my copy of The Preppy Handbook recommended.  […] I wanted to live far away from Wisconsin and my family and come home only at Christmas.  As pathetic as it sounds, deep in my soul I wished I owned a navy-blue blazer with my school’s crest embroidered on the lapel and wore grosgrain ribbons in my hair.  I daydreamed about the day when I would go to East to college, and I believed I would.⁠1

The preppie convergence would sell a lot of cars for Chrysler (Jeeps) and, eventually, a lot of SUVs for everyone.  It would sell clothing for L.L. Bean, Lands’ End, J. Crew, Ralph Lauren, and eventually Tommy Hilfiger and the Gap.  It would sell a ton of furniture for Ethan Allen and eventually Sears.  Downstream, it sold a lot of watches for Rolex and a lot of cars for BMW.  Eventually, it would serve as the foundation for Martha Stewart and her brand of status.  It shaped, and still shapes, what boomers wear on the weekends.⁠2

The tide turned again.  Repudiation was coming.  The first sign, for me, was graffiti on a Tom Cruise movie poster that read, “die Yuppie scum.”  The second was Gordon Gekko in Wall Street (1987), a film Roger Ebert hailed as a “radical critique of the capitalist trading mentality.”  The prep hero was now tarnished.  (Life soon imitated art with the fall of Michael Milken, the junk bond trader indicted in 1989 for violations of federal securities and racketeering laws.)  The third was the movie Heathers (1989), in which teens excluded by snobbery take a terrible revenge against the preps.  The fourth was the publication of American Psycho in 1991.  This was, among other things, a vilification of the prep.  At this point, the big board should have been flashing with warning signals.  Something new had made it up out of the college campuses of the world, past all the little gates, and on to the big screen.  Pity us if this was our first warning.

I was doing research with teens in 1990 and, almost to a person, they were saying, “Well, I guess you could say I’m a Prep, but I don’t really think I am.”  Or, more forcefully, “The last thing I want to be called is a Prep.”  This was coming from kids who were still wearing button-down shirts and Top-Siders.  Teens were moving on, some to the emerging subculture of rap, some to a brief revival of the hippie regime, still others taking an “alternative” turn.  We do not have access to this data, but we can assume that sales figures for Ralph Lauren, Rolex, BMW, and the other “flagship” brands of the decade fell sharply.  Presumably, furniture and textile stores suddenly found it difficult to move their “duck” and “sailboat” motifs.  What convergences give, they take away.

1 Stroh, Ann. n.d. The Preppy Handbook and other myths.  This document may be found at http://www.sit.wisc.edu/%7Exanadu/preppy_handbook.html.

2 For the connection between the prep or yuppie movement and BMW, see Greyser, Stephen and Wendy Schille. 1993.  BMW: The Ultimate Driving Machine Seeks to De-Yuppify Itself.  Harvard Case Study 9-593-046, December 27, 1993.  Stephen Greyser is an Emeritus Professor at the Harvard Business School.  Wendy Schille was a research associate at HBS at the time of the writing of this case.