
Posts tagged with ‘anthropology’

Hundredth monkey effect

The hundredth monkey effect is a studied phenomenon in which a new behavior or idea is claimed to spread rapidly by unexplained means from one group to all related groups once a critical number of members of one group exhibit the new behavior or acknowledge the new idea. The theory behind this phenomenon originated with Lawrence Blair and Lyall Watson in the mid-to-late 1970s, who claimed that it was the observation of Japanese scientists. One of the primary factors in the promulgation of the story is that many authors quote secondary, tertiary or post-tertiary sources who have themselves misrepresented the original observations. (via Wikipedia)

(Source: inthenoosphere)

Culture is a thing of surfaces and secrets. The anthropologist is obliged to record the first and penetrate the second. Once we’ve figured out what people believe to be true about themselves, we can begin to figure out what’s really going on in this culture.

Grant McCracken, via Wired.com

In an airy studio on a high floor of the London College of Fashion, featuring a long conference table, white walls, and a view to an adjoining patio—where, a sign warns, bees are being kept—the hues you will see in two years are being divined by a pan-European group of colorists.

“What do we say about blue?” asks David Shah, a British-born, Amsterdam-based designer who heads the meeting on behalf of Pantone, the quietly ubiquitous American company that maintains color standards for publishers, designers, and the fashion world. “Blue took so long to come back. It came back last year in a watery story, it’s here this summer in an indigo story—what are we doing about blue?”

“A good navy,” says a French woman with short blonde hair, “is going to fulfill the role that black used to fill, because black is now launching into another dimension.”

“How do we see black now?” Shah interjects. “As a dynamic color?” There is excited chatter. Black has shed its cultural baggage as a negative color. The Italians “did a big statement” about black. The big Yohji Yamamoto retrospective down the road at the V&A. The noncolor that is all colors. Exciting new materials that help black transcend its blackness.

So the new black is … black? Leatrice Eiseman, a color consultant and the sole American at the meeting, (the sole “pragmatic American,” as she describes herself), speaks for the first time. “What I fear about making a general sweeping statement about black is that we know we’ve been there—who doesn’t know about black? What’s new about it?” Animated conversation ensues.

Twice a year, in some European capital, in a room purposely chosen to be drab and sparse—so as not to influence the color mood—Shah gathers a stable of colorists, each of whom works with his or her own country’s national color groups (who traditionally have worked with textile companies and others to set color standards), as well as consulting with companies ranging from Airbus to Zara to Union Carbide. Where the rest of us see black, these are people who talk about the “family of black.” Over two days, they will each pitch a palette concept, organized roughly around a theme that has been chosen in advance (this time, it’s “unity”), that they believe will be dominant in Spring/Summer 2013. The results are published in Pantone View, a $750 publication that is purchased by companies across a broad consumer landscape, from fashion designers to supermarket chains to the floral industry. (“Everybody’s into white flowers at the moment,” Shah tells me, “there are definitely movements, even in flowers.”)

While the Pantone meetings are traditionally secret, I was invited to the Summer 2013 meeting on the condition that I not reveal the colorists’ identities. (Shah and Eiseman’s names are real; I’ll refer to everyone else present at the meeting by their nationality.) And so as to avoid influencing the discussion, I have been asked not to reveal my own identity as a journalist. Instead, I am vaguely portrayed as a functionary of X-Rite, the corporate parent of Pantone.

The meeting is a high-concept show-and-tell fused with a cultural anthropology seminar, with Shah alternately playing the role of interlocutor and air traffic controller. Like novelist William Gibson’s trend-hunter Cayce Pollard, Shah can unleash a torrent of cultural memes on command. Expounding in one instance on the “unity” theme, he riffs: “We’re talking a lot about community, neighborliness, moving from macro to micro economy. The whole ‘rurban’ thing—local food, local chocolate. At the same time, the simplification of things, reducing complications. Don’t make any instruction manuals—things should be intuitive. Computers that will think for you, read your gestures, actually tell you when to go shopping. You go into Gap, it starts suggesting products for you, connecting your friend’s taste to your taste. It’s all about choosing together.” He pauses, a quick intake of breath, before firing: “How many people use Twitter here?” “Oh, God,” retorts the Frenchwoman.

- Tom Vanderbilt, Pantone color forecasts: Are they accurate? - Slate Magazine

Vanderbilt describes a hush-hush meeting of color mavens, convened twice a year by Pantone, to decide what the colors of a coming season will be.

The paragraph about David Shah’s stream-of-consciousness cultural dissection made my head hurt, but in a familiar way.

This is proof that going deep into any niche of culture means getting more connected to everything that niche touches.

Facebook Metaphors: New Spatialism

I often compare Facebook to a large and impersonal shopping mall, with a lot of noise and the cloying stink of Cinnabon and cheap candles getting deep into your sinuses. Others use other metaphors, and ex-Facebooker Dave Morin is probably influenced by the value of his stock portfolio when he recently compared Facebook to a town:

Jessica Roy, Former Facebook Social Design Evangelist Says Facebook Model Is ‘Self-Serving, Egocentric’, via Betabeat

The discussion started with this prompt:

[by Dave Morin, the founder of Path] Facebook has built the cities, they’ve built the town squares, and they’re more of a general social network, he said. Path, on the other hand, is more like the home, as if adding each friend is filling out your dinner table.

On Twitter, Mike Karnjanaprakorn, CEO and cofounder of Skillshare, added, “If Facebook built the cities and Path is building the houses, Skillshare is building the schools.”

Mr. Miller started a discussion about this approach to design on Branch, and Mr. Morin and Mr. Karnjanaprakorn both weighed in.

“The Internet is like the Wild Wild West. But over time, you can see certain infrastructures being put in place. AOL created the roads, Facebook created the Town Square, PayPal created the bank, Twitter created the newspaper, Path is creating the home, and Skillshare is creating the school,” said Mr. Karnjanaprakorn. “We still need a sheriff.”

Mr. Morin agreed. “I think Mike hit the metaphor on the head,” he said.

But then another voice chimed in. It was the voice of Mr. Fisher, whose former title at Facebook was “Social Design Evangelist.”

“On the contrary, Facebook is NOT a town square,” he wrote. “In fact there’s little sense of community at all. It’s centered on individuals and their friends which is a very self-serving, egocentric model that does little to help people actually work together, as would a town.”

Da-yum. That’s definitely a burn coming from someone a source says wrote Facebook’s social design guidelines.

It appears Mr. Fisher is building his own version of a town square, which is probably why he doesn’t like the idea of Facebook claiming that moniker very much. His LinkedIn lists him as the founder of Townsquared, Inc., a “social platform that helps organizations grow niche communities.”

I am interested to learn more about Townsquared, certainly, but this seems to be nothing more than entrepreneurs caught up in metaphorical Chinese checkers; all of them are trying to jump over Facebook to some better, warmer, more productive social matrix, at least conceptually.

The mall metaphor works for me because I dislike malls and spend as little time in them as possible. But malls somehow seem to be filled with people, looking in the windows, checking each other out, trying on cheap shoes, and eating bad pizza. Or maybe the suburban sprawl metaphor (from 2009):

Facebook Is The New Suburbia by Hugh MacLeod

New Spatialism:  Reclaiming Social Space In Web Media via stoweboyd.com

Using an analogy from city planning and architecture, we need a rethinking of the basics: something like the New Urbanism movement, which tried to reclaim shared urban space in a way that matches human needs, and moved away from the gigantic and dehumanizing cityscapes of the mid and late twentieth century, where garbage trucks seemed more at home than a teenage girl walking a dog.

So, we need a New Spatialism movement, to rethink web media and reclaim the social space that is supposed to be central to so-called social media. Some web media may just remain what it is, like an industrial district at the edge of town. But at least some parts of web media should be reconceptualized, and reconstructed to get back to human scale. Just as New Urbanism is about organizing streets, sidewalks, and plazas to support the growth of social capital, New Spatialism would help us channel interactions online to increase sociality, and thereby increase the growth of social capital.

New Spatialism is based on the idea that our primary motivations for being online are extra-market drivers: we are not online for money, principally. We have created the web to happen to ourselves: to shape a new culture and build a better, more resilient world. And we need better media tools than we have at present, to make that a reality.

As usual, when the techies start talking about online shared space, they lose their way, because they haven’t actually studied urbanism or anthropology.

Charlie O’Donnell On Why We Need More Anthropologists

Charlie O’Donnell makes a case for getting more anthropologists involved in software product design, and I have to agree.

via email

The General Assembly conference is a full-day summit of keynotes, panels, and networking at General Assembly, bringing together founders and investors for an intelligent dialogue about the state of venture capital, fundraising strategies for early-stage startups, honest stories from entrepreneurs on successful (and less than successful) capital raises, when/how/why to bootstrap, and more. Food and drink will be provided. Speakers include Anthony Casalena, founder and CEO of Squarespace, Hayley Barna & Katia Beauchamp, co-founders of Birchbox, and Spencer Fry, co-founder of Carbonmade. I’ll be on a panel about the State of the Venture Capital industry.

By the way, we’re in severe need of some anthropology classes. Back in 2005, Joshua Schachter told me and the guys at USV that he thought venture firms should employ anthropologists—people who were experts in human behavior, needs, and evolutionary tendencies. As the web was becoming more and more about people, and seemingly less about the technology, it would follow that those who could best predict where it was going would be experts in people more so than experts in writing code.

This is what I find most disappointing about many of the ideas I see day in and day out. They appear to be more about solving for what will get funded, or what’s a derivative of a product they saw on TechCrunch that they don’t even use themselves, versus understanding and improving the lives of the population at large. Entrepreneurs pitching are too easily tripped up by “Ok, so pretend I don’t know you and I’ve never heard of your product—how do I find it?” Their first answer is usually that someone would share it with me, because it went viral, but they don’t understand what incentive anyone would have to share their product—other than the fact that their own social circle of friends seem to share a random assortment of stuff and they do it very often. They too often extrapolate non-existent trends or false perceived needs from their very homogeneous social circles.

If you’re going to design a product, it’s incredibly important not only to diversify the kinds of people you interact with socially, but to spend a fair bit of analysis and contemplation to really understand trends in their behavior. Why does someone use or not use Service A? Entrepreneurs are, to their own disadvantage, punting on the understanding of human behavior they need to design products for the masses. Perhaps if you spent a little less time pitching and more time listening—figuring out why people use the products they do and observing how they interact with others—you’d have more users. Before you get confident that you’ve got something, get curious about why people are doing what they’re doing.

The General Assembly conference he mentions has a long list of entrepreneurs talking about getting funded, but not a single anthropologist. Hmmm. Sounds like a conference that will talk about what is getting funded, and not ‘improving the lives of the population at large’, but I could be wrong. In fact, it sounded so much like ‘the funded talking to the unfunded about becoming funded by convincing funders to fund’ that I decided not to attend. Unless they want an anthropologist to talk about social behavior on the web.

Is there a new geek anti-intellectualism? - Larry Sanger →

Larry Sanger:

This is enough to clarify what I mean by “geek anti-intellectualism.”  Let me step back and sum up the views mentioned above:

1. Experts do not deserve any special role in declaring what is known.  Knowledge is now democratically determined, as it should be.  (Cf. this essay of mine.)

2. Books are an outmoded medium because they involve a single person speaking from authority.  In the future, information will be developed and propagated collaboratively, something like what we already do with the combination of Twitter, Facebook, blogs, Wikipedia, and various other websites.

3. The classics, being books, are also outmoded.  They are outmoded because they are often long and hard to read, so those of us raised around the distractions of technology can’t be bothered to follow them; and besides, they concern foreign worlds, dominated by dead white guys with totally antiquated ideas and attitudes.  In short, they are boring and irrelevant.

4. The digitization of information means that we don’t have to memorize nearly as much.  We can upload our memories to our devices and to Internet communities.  We can answer most general questions with a quick search.

5. The paragon of success is a popular website or well-used software, and for that, you just have to be a bright, creative geek.  You don’t have to go to college, which is overpriced and so reserved to the elite anyway.

If you are the sort of geek who loves all things Internet uncritically, then you’re probably nodding your head to these.  If so, I submit this as a new epistemological manifesto that might well sum up your views:

You don’t really care about knowledge; it’s not a priority.  For you, the books containing knowledge, the classics and old-fashioned scholarship summing up the best of our knowledge, the people and institutions whose purpose is to pass on knowledge–all are hopelessly antiquated.  Even your own knowledge, the contents of your mind, can be outsourced to databases built by collaborative digital communities, and the more the better.  After all, academics are boring.  A new world is coming, and you are in the vanguard.  In this world, the people who have and who value individual knowledge, especially theoretical and factual knowledge, are objects of your derision.  You have contempt for the sort of people who read books and talk about them–especially classics, the long and difficult works that were created alone by people who, once upon a time, were hailed as brilliant.  You have no special respect for anyone who is supposed to be “brilliant” or even “knowledgeable.”  What you respect are those who have created stuff that many people find useful today.  Nobody cares about some Luddite scholar’s ability to write a book or get an article past review by one of his peers.  This is why no decent school requires reading many classics, or books generally, anymore–books are all tl;dr for today’s students.  In our new world, insofar as we individually need to know anything at all, our knowledge is practical, and best gained through projects and experience.  Practical knowledge does not come from books or hard study or any traditional school or college.  People who spend years of their lives filling up their individual minds with theoretical or factual knowledge are chumps who will probably end up working for those who skipped college to focus on more important things.

I really dislike the ‘drop out of college and spend time on making money’ theme that has been advocated by crackpots like Peter Thiel. However, I am not sure that Sanger is right here. There is a great deal of anti-intellectualism in America generally, so perhaps it isn’t too much of a surprise that some of the folks who don’t give a hoot about the Battle of Hastings or the newest additions to the Periodic Table are becoming involved with the web. What should we do? Use a literacy test before granting them access?

I would rather rant about the anti-intellectualism of television and pop culture as a whole.

The reality is that the web is absorbing pop culture, and being changed by that. I continue to hope that the web’s transformational character will change pop culture, too.

Claude Lévi-Strauss wrote, “The anthropology of the future is the study of ourselves”, and the dry and dispassionate character of intellectualism is a poor match for today’s web involvement, which is really a form of full-contact anthropology: an on-going study of ourselves and our selves. Sanger might have spent more time castigating the educational establishment and the intellectual class for their aloof indifference to the lives of those indifferent to intellectualism, even geeks. The well-educated should know better.

The anthropology of the future is the study of ourselves.

Claude Lévi-Strauss

(via underpaidgenius)

The Decade Of Publicy

I am aware that my recent inquiries into privacy and ‘publicy’ are a bit anthropological at their core, rather than technological or software-design based. Claude Lévi-Strauss sets the stage for this inquiry, perhaps, when he wrote “The anthropology of the future is the study of ourselves.”

The anthropology of the future is the study of ourselves.

Earlier, in the middle of the ’00s, I used to talk about social architecture as a set of design principles for social tools grounded in the way that we are wired, and how we can be bettered by the augmented sociality that social tools offer. But I feel that we have to go deeper than this architectural metaphor, just as city planners and architects need to step outside of the materials and styles of buildings and public spaces to understand what people are doing, living in proximity. They have to get back to what buildings and cities are used for, what they are good for, and to do that, they have to study people, not bricks. Reinhold Niebuhr was getting at this when he said city planners had to get past the self-deception he called ‘the doctrine of salvation by bricks.’

And in the case of the new social web, we need to get past the comforting metaphors of online public spaces, and take a hard look at what people are really up to, online.

Revisiting Privacy

Our cultural principles of privacy are derived from our existence in space. People share physical space, both natural and human made, and we require space to live, walk, and interact.

This will be the decade when publicy displaces privacy, online and off.

In everyday life we come in contact with other people all the time in public and private spaces, like streets, trains, offices, restaurants, stores, and homes. We have developed elaborate social codes about how we act in such places. We excuse ourselves when we bump into others or otherwise touch them unintentionally.

In every culture, social mores have arisen to allow people to interact without causing offense while sharing public spaces. It is considered rude to stare at people in most cultures, for example, and in many cultures people — both men and women — go to great lengths to conceal their bodies, faces, or even the contour of the body below the clothes. Some well-known examples include these:

  • Male Tuareg Berbers, for example, cover their mouths, even when eating, except when alone with family, while the women go unveiled. 

  • Orthodox Jewish women who follow the practices of tzniut must cover their hair when in public, so many wear wigs, scarves, or other head coverings.

In most cultures, these principles include some rights to personal space, and the right to conceal parts of the body from others’ view. This leads to problematic cultural conflicts, like the current moves in France to prohibit various sorts of Islamic face coverings, which are viewed by many as ‘un-French’.

Much of what we consider online privacy is treated as analogous to what goes on in public spaces. We start with the premise that individuals online have the right to reveal as little or as much of their personal information, backgrounds, or interests as they choose, in a way that parallels similar rights in face-to-face public interactions.

However, online interactions aren’t really based on sharing space. All the metaphorical mumbo jumbo about online ‘spaces’ is just that: metaphors. And not particularly apt ones. Because there is no ‘space’ in which we live online. There is no equivalent, really, of passing through a restaurant, in our online world. Hanging out on Twitter, for example, is really not like that, in public terms.

And there is nothing like a ‘private home’ online, because we don’t sleep there, and we don’t need toilets or showers there. So a lot of the privacy issues in the real world — like what the police can seize without a warrant, or whether you have to admit your identity when asked — don’t really play online. Online privacy is seldom about private property, but about access to information.

Consider that in a real-world restaurant, I cannot cloak my presence, because we don’t yet have invisibility, or the Mystique-like ability to appear to be someone else. If I am present, I can be seen. I might try to wear large sunglasses, or a hat, but otherwise I am there for people to see. But of course, if I pass into a VIP lounge in the back of the restaurant, I might drop my hat and sunglasses, since I am in a different public space, one with different norms.

And in real-world social interaction there are some facts about yourself that are impossible or socially unacceptable to conceal.

Gender is so fundamental to human society that pronouns are based on it in English, and other speech constructs in other languages. We can’t even talk about or with people if we don’t know whether they are male or female. 

In contemporary US society it is considered a given that everyone will reveal their marital state, or their dating situation, if asked. People who intentionally conceal being married, for example, are presumed to be immoral, not super private: it’s not a topic that is considered appropriate for privacy. (Note: this is why there is such a widespread furor about the Facebook ‘It’s complicated’ datum.)

We are still in a gray zone, culturally, with regard to sexual orientation or sexual availability. Many gay men and lesbian women are still living a clouded life, where ‘don’t ask, don’t tell’ is still the order of the day, and not just in the military. In other segments of society, however, being less than fully out if you are gay or lesbian is a social faux pas.

Many Publics

The idea that we live in a social sphere involving many potentially overlapping ‘publics’ has been discussed widely, and explored profitably by folks like Kevin Marks and danah boyd.  The real world examples — where I pass from the restaurant to a VIP lounge, discarding sunglasses — give a misleading sense of equivalence between the physical world and the online. But let’s examine the notion of publics, and see what we can carry over into the online, social web.

These streams of updates don’t have to add up to a picture that defines the individual, any more than we are defined by the stamps on our passports or the complete sequence of hats we have owned.

The notion that I can participate in different publics — different social milieus — is a well worn one. In the online sense, however, this has a distinctly different take.

Online interaction is not actually based on shared space. The metaphor fails because the actual actions and reactions tied to shared physical space don’t play online. There is no ‘walking into a restaurant’ online.

When I post my location at a specific restaurant on Gowalla, Brightkite, or Foursquare, only those who are sharing my thread are aware of me being there, and even then, only those online at that time. In the real world, everyone physically present at the restaurant can turn their head and see me, even if I would rather that they don’t.

Online, we share time, not space. We are not actually in a restaurant together: we are using Brightkite, and I am playing along with the premises of the social conventions of Brightkite by posting that I am in Momofuku, The Slanted Door, or Fatty Crab.

Online, only those who are part of the publics associated with the timelines I am posting to are invited to know I am in that restaurant.

And I am not defining the norms of that public. I am instead choosing to go along with its conventions, and by extension, endorsing them, by posting my status updates there.

In privacy-oriented (or real-world space-oriented) terms, I would be seen as revealing information about myself, in accordance with the makeup of that specific physical locale.

In publicy-oriented (or online time-oriented) terms, I am according with the conventions of this specific tool’s take on shared time. In the case of Brightkite, it is about geolocation, so to participate I post that I am sitting down to ramen at Momofuku.

So, in the actual Momofuku, I have to reveal enough about my identity to claim my reserved table, and pay for my dinner by credit card. And anyone passing by can see that I am dining with my friend Gregarious, or an ex-girlfriend, or Al Gore.

But, on Brightkite, there is no way that someone can see who I am eating with, or what I ordered. If I want to play along with the conventions of the specific tool, or the strictures of the public associated with that tool, I might upload a picture of Gregarious or my Quaking Beef. And, with different tools that involve different publics, I might share the music I heard there, the wine we drank, or the glorious sex I had after dinner.

From a privacy viewpoint, this fracturing of the totality of experience is viewed as selectively revealing potentially overlapping classes of information about my personal life to different subsets of my world. In the privacy take on the world, a person might be defined as the union of all the personalities they present to the world. People’s personalities in this worldview are thought of as atomic, but multifaceted. And of course, if the various facets don’t align, the person is seen as flawed, pathological, or evil.

From a publicy viewpoint, something very different is going on. In this zeitgeist a person has social contracts within various online publics, and these are based on norms of behavior, not of layers of privacy. In these online publics, different sorts of personal status — sexual preferences, food choices, geographic location — exist to be shared with those that inhabit the publics. So, in this worldview, people are the union of a collection of social contracts, each of which is self-defined, and self-referential. The norms and mores of a foodist service — eat everything and post everything you eat — may be completely distinct from those about sexual interests, or sports, or social technology on the web. These streams of updates don’t have to add up to a picture that defines the individual, any more than we are defined by the stamps on our passports or the complete sequence of hats we have owned.

In this worldview, a person is a network of identities, each defined in the context of the form factor of a specific social public. There is no atomic personality, per se, just the assumption that people shift from one public self to another as needed.

This is something like what happens to people who speak multiple languages fluently. In English, Luigi might be more reserved than when he speaks Italian, because the cultural milieus in which he learned and uses the two languages are very different. In such a case we wouldn’t say that Luigi is a fake, two-faced, or duplicitous because of these changes in his manner. And the only ones capable of seeing the two Luigis are those who are themselves fluent in English and Italian. Luigi’s monoglot friends might never know.

Passing Into Publicy: A New Decade

Here, at the start of 2010, a new decade, we should anticipate significant blowback from the transition to an online world based on publicy. It is not ‘the death of privacy’ per se, an idea that is  rumbling around in the commentariat. It is not that notions of privacy will disappear. Privacy is as deeply enculturated in our social wiring as pronouns.

This will be a fracturing of the premises of privacy, and a slow rejection of the metaphors of shared space.

What is happening is the superimposition of publicy on top of, and partly obscuring, privacy. Those raised in this brave new world are already living in a cultural context based on publicy, and therefore they are running afoul of social conventions based on privacy. That’s why young people find job offers rescinded when drunken or naked pictures of them are discovered on their Facebook pages. Their prospective employers are judging their actions from a privacy-based attitude, in which the facets of an online self are averaged, instead of being considered as a constellation of selves. Publicy says that each self exists in a particular social context, and all such contracts are independent.

This carries over into the nature of online relationships. A common refrain in the Sunday supplements is that online relationships aren’t as ‘real’ as offline. This may be a reaction to the demands that online social contracts imply, many of which are unlike those in the offline world.

It’s as if we are gaining the ability to see into the ultraviolet and infrared ends of the social spectrum when we are online, and in some contexts we are dropping out yellows or reds. To those tied to the visible color spectrum we are habituated to, this new sort of vision will be ‘irreal’. But ultraviolet has always existed: we just couldn’t see it before.

Some will dismiss my theorizing as a simple reprise of cultural relativism, making the case that all cultures can only be understood in their own cultural terms. I am making part of that case, in essence, by saying that the mores inherent in online social contracts are self-defined, and any individual’s participation in a specific online public does not have to be justified in a global way, any more than the cultural mores of the Berber Tuaregs need to be justified from the perspective of modern Western norms.

This will be a fracturing of the premises of privacy, and a slow rejection of the metaphors of shared space. The principles of publicy are derived from the intersection of infinite publics and our shared experience of time online, through media like Facebook, Twitter and Tumblr. The innate capability we have to shift in a heartbeat from a given public, and our corresponding persona, to another, is now being accelerated by streaming social tools. This will be the decade when publicy displaces privacy, online and off.

Moving To The Edge: The Hunter/Gatherer Future

Today’s New York Times includes a piece on female bosses being occasionally more tyrannical than males (see A Tyrant Boss, Even Without the Y Chromosome - New York Times), which precipitated a chain of thought about collaboration, the future of work, and, perhaps, a return to an earlier basis for culture:

[by Benedict Carey]

In an authoritative 2003 analysis of 45 studies in a wide range of organizations, from schools to hospitals to financial companies, Alice Eagly of Northwestern University and Marloes van Engen of Tilburg University in the Netherlands found that women managers tended to be — on average — more collaborative than men, more encouraging to subordinates, more likely to include them in decisions. Men were more likely to lead by top-down command, or to be strictly hands off, distant.

"The differences are small and of course individuals vary," Dr. Eagly said, "but women score higher on transformational leadership, modeling good behavior, working with people, letting people know when they are doing a good job."

I have long believed that the best collaboration arises in groups where hierarchical dominance is tempered by egalitarian ideals: where the social norms favor hearing the viewpoints of the largest number of those involved in the work or subject at hand, and where the responsibilities and rewards of work are distributed. Not too surprisingly, the places where I have worked with the fewest women have been the places with the lowest degree of collaboration, trust, and happiness.

Being a natural loner, although an alpha male type, I have in general withdrawn into the hard shell of individual achievement in such situations, or, when working as a manager in these contexts, have worked to create a private world with social norms at variance with the larger context. Lamentably, both of these strategies have had only limited success, or, at worst, have failed miserably.

I believe that the best setting for collaboration — whether in the workplace, or in society as a whole — is one in which these principles of inclusion and egalitarianism are afforded the greatest importance. This is the primary motivation for my evangelism for Web culture, since it lacks any hierarchies except those based on respect, influence, and the inexorable power laws of reputation and emergent individual authority.

It is worth saying that the nature of collaboration is such that it flourishes in situations where the so-called “feminine” — which could be translated as egalitarian — approach to social interaction is expected and applied. This means that we need to move actively away from authoritarian — “masculine” — and hierarchical approaches to sense making, organization, and decision making, that is, if we wish to live in a world based on collaboration and inclusion.

Of course, the alternative is right in front of us. We can instead aspire to hold onto a world based on exclusion, divisiveness, and caste, where unilateral power politics dominate the world stage, and the parochial interests of the powerful few control the destinies of all.

And in business, we will watch as those who best learn how to collaborate will come to dominate the land-rush into the 21st century economy, and those who tried to control their markets — instead of building deep collaboration into their business models, as John Seely Brown styles it — will fail. It may well be that this will be the fulcrum upon which human culture turns, pried away from industrial era models of command-and-control, in business, in politics, and in society as a whole.

I sense that the Web stands as a strange attractor, where the relatively stable state of our world system can be rapidly transformed into a radically different but equally stable state.

On a personal level, I think that the rise of a new world culture, colored strongly by the Web and its subversion of all the pillars of culture, namely arts, entertainment, media, politics, and religion, will change people’s aspirations and role models. Bill Gates will seem less an icon than Pierre Omidyar or Jason Fried in the coming years. Gates is the last great industrial tycoon, and the information era the final chapter in the industrial epoch.

And it occurred to me that this egalitarianism — including the near parity of men and women in social systems — is a major characteristic of hunter/gatherer societies. I wondered what other aspects of hunter/gatherer life might be worth reintroducing into today’s world, or which are likely to reemerge as we move into the new epoch.

Communitarianism — Early paleolithic hunter/gatherers have generally been depicted as living lives that were, in Hobbes’s phrase, “nasty, brutish, and short,” but current anthropology suggests that they may have lived much better than later pastoralists, agricultural serfs, or industrial factory workers. They certainly worked less, perhaps only a few hours a day, spending the rest of their time socializing, sleeping, and playing. Most interesting, it seems that in general, hunter/gatherers distribute their food in a communitarian manner, relying on a “gift economy” of obligation and favors to make sense of who should get how much of what is caught or gathered every day.

I am not advocating a grasshopper lifestyle, with no food storage or planning for the future, but only suggesting that the current social norms fail to include the poor and disenfranchised adequately in our rapid movement into a brave new world. We need to adopt an “all for all” attitude, and distribute the benefits of our collective resources in such a way that all have a place at the cookpot. Any other philosophy — especially one based on the arguments of centralized authority in the hands of ruling elites — will fail to find any resonance in the emerging Web culture.

Marshall Sahlins refers to a “Zen affluence” where

human material wants are finite and few, and technical means unchanging but on the whole adequate. Adopting the Zen strategy, a people can enjoy an unparalleled material plenty - with a low standard of living. That, I think, describes the hunters. And it helps explain some of their more curious economic behaviour: their “prodigality” for example - the inclination to consume at once all stocks on hand, as if they had it made. Free from market obsessions of scarcity, hunters’ economic propensities may be more consistently predicated on abundance than our own.

I am not actually advocating returning to hunting and gathering itself — spearing fish from streams, or eating bark from trees — only the reapplication of the social models of hunter/gatherers now that we have moved beyond a time in which most of us are working the land or laboring in factories. This Zen abundance is one example of how we might think, if we only wanted to.

Nomadism — A few thousand years ago, humanity stopped wandering around, and became tied to the land based on agriculture, and then moved from agricultural areas into the cities with the rise of industrialism. In a time when corporate agriculture employs only a few percent of the population, and a minority are involved in manufacturing, huge cities and sedentary populations are an artifact of outmoded economics. (As are, perhaps, all notions that derive from living in fixed places, which is too large a thought to pursue here, suggesting the inutility of things like nations.)

While New York seems the pinnacle of all that is metropolitan, parking 8 to 15 million people into a giant hive on some relatively low-lying islands and peninsulas sticking out in the increasingly restive Atlantic in a time of global warming is simply hubris, as the Katrina disaster should prove to anyone willing to connect the dots. A multi-trillion dollar disaster is just waiting to happen, and it really is not limited to just big catastrophes. Ditto Hong Kong and dozens of other locations.

The world is warming, and while we should do everything we can to decrease the rate of the rise, it is likely to rise for some time, no matter what we do. Our relationship to the earth has to change on many levels, and in one obvious one: we need to think about moving large populations away from the coastlines, where most people on earth live.

We could adopt the paleolithic ideal, which involves thinking of ourselves as nomads, and moving away from the “American Dream,” which involves a relatively large home, surrounded by other similar, disconnected homes, in neighborhoods unserved or underserved by public transportation, all of which requires the entire energy apparatus of Western civilization to support it: commuting, cars, large road systems, strip malls, gas stations, the petroleum economy, wars in West Asia, and the growing carbon crisis.

Instead, we need to envision a time when we rely principally on non-personal transportation, using renewable energy sources, and a gradual decommissioning of all the elements of the current “Dream” — including the fixed tie to the land that a home implies.

I am not self-universalizing, suggesting that others should become jetsetters just because I am bouncing from place to place like a shuttlecock. But moving our aspirations away from a house in the suburbs, a job with a short commute, and a summer place on the Cape is a good starting point.

Why can’t people in service jobs work nearly anywhere? You can, as the Web shows. And even folks in more prosaic jobs — the barista at the neighborhood Starbucks — could just as well work in Tucson as in Fairfax County, so long as people begin dispersing from the coasts and large cities into other locales.

This is a condemnation of the current milieu — politicians, governments, media, and other organized groups — who refuse to mobilize toward a sustainable future. We will have to — no surprise — take it on ourselves to accomplish this transformation of human life on earth, and most likely jettison those power structures along the way.

And my prediction is that people will begin moving into locales conducive to this sort of life style. [I better hurry to sell my suburban DC home before this trend surfaces.] But it is likely to require a permanent increase in energy costs and a few more coastal disasters before the notion becomes a movement.

Inclusiveness — Buckminster Fuller’s Spaceship Earth notions are basically forgotten, but the idea is simple: we are on Earth together, and if we think of it as a vessel carrying us into the future, with limited resources and a growing number of passengers, it will change our response to the problems that beset us. And good collaborative solutions will require us to adopt inclusionary and egalitarian approaches to work toward reasonable solutions: we will need to become more feminine in our style and society.

You can take this as the Sunday morning ramblings of a bleary-eyed, over-zealous student of technology and society, if you’d like. Some moony-headed geek hoping that technology will outpace the ills of our dysfunctional world. Maybe so.

I hope that instead I am simply advocating a return to our wiring, a resurgence of trust in ourselves and how we have come into being on this green, small, and fragile watering hole, this third rock from the Sun. We need to move past the fractious adolescence of a troubled humanity and take on the tasks of adulthood, which means caring for all of us, not just our friends and families, and putting aside childish cliques and games.

It’s time to learn to hunt, and forage, and memorize the old, old songs that tell the paths to the summer fishing grounds and where the nuts are sweetest. Fables that teach us that humor trumps evil, and that the wisest of us all might be an old crone full of tales. Stories that warn us against the short and hard promises of war, and argue against the cold logic of hatred.

We need to move to the edge, and leave the center behind. We have so much to do, so much to lose, so much to regain.

