

Social Categorization

Lecture Supplement


Find an overview of concepts and categories in the General Psychology lecture supplements on Thinking.


Social perception is concerned with the ways in which we use stimulus information -- in the form of trait terms or more physical features of the stimulus -- to form mental representations -- impressions -- of people and situations.  As we have already seen, person perception entails more than extracting information from a stimulus: the perceiver must combine information from the stimulus (including the background) with knowledge retrieved from memory.  Much of this pre-existing knowledge comes in the form of implicit personality theory, but more broadly the act of perception is not completed until the new percept is related to the perceiver's pre-existing knowledge.  Paraphrasing Jerome Bruner (1957), we can say that every act of perception is an act of categorization.

What Bruner actually said was: 

"Perception involves an act of categorization....  The use of cues in inferring the categorial [sic] identity of a perceived object... is as much a feature of perception as the sensory stuff from which percepts are made."

Perception connects knowledge of the stimulus with knowledge about the kind of object or event the stimulus is.  This conceptual knowledge exists as part of semantic memory.  In contrast with the autobiographical knowledge of specific events and experiences that comprises episodic memory, semantic memory holds abstract, context-free knowledge:

our "mental lexicon" of knowledge of the meanings of words;
other linguistic knowledge, e.g., of phonology, syntax, and pragmatics;
knowledge of objects; and last, but not least in this context,
categorical knowledge of concepts, subset-superset relations, similarities, and category-attribute relations.

Concepts and categories are critical to cognition because they enable us to organize the world -- to reduce the "blooming, buzzing confusion" (James's phrase) of experience to something we can understand and manage.  Categorization is critical to perception because it enables us to infer properties of an object that we cannot perceive directly.  Once we have categorized an object on the basis of the properties we can perceive, we can infer that it shares other, unseen properties with the other members of its class.
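
To make this inferential function of categorization concrete, here is a minimal sketch in Python.  The "knowledge base" and matching rule are invented for illustration; they are not a model anyone has actually proposed:

```python
# A toy knowledge base: each concept lists attributes that category
# members are assumed to share (all entries are invented examples).
CONCEPTS = {
    "bird": {"warm-blooded": True, "feathers": True, "lays_eggs": True},
    "fish": {"warm-blooded": False, "scales": True, "lays_eggs": True},
}

def categorize(observed):
    """Assign an object to the concept that best matches its
    observed features (a crude count of matching attributes)."""
    def overlap(concept):
        attrs = CONCEPTS[concept]
        return sum(1 for k, v in observed.items() if attrs.get(k) == v)
    return max(CONCEPTS, key=overlap)

def infer(observed):
    """Categorize, then read off the unseen attributes the object
    presumably shares with other members of its class."""
    category = categorize(observed)
    inferred = {k: v for k, v in CONCEPTS[category].items()
                if k not in observed}
    return category, inferred

# We observe feathers; we infer warm-bloodedness without ever seeing it.
print(infer({"feathers": True}))
# ('bird', {'warm-blooded': True, 'lays_eggs': True})
```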

In the social-intelligence view of personality (Cantor & Kihlstrom, 1987), social categorization sorts persons, situations, and behaviors into equivalence classes that are the basis for behavioral consistency.  People behave similarly in situations that they perceive to be similar; and categorization is the basis of perceptual similarity, because instances of a category are broadly similar to each other.

Concepts and Categories

Having used the terms concept and category interchangeably so far, we should now distinguish between them:

A category may be defined as a group of objects, events, or ideas that share certain attributes or features. Categories partition the world into equivalence classes.  Oak trees and elm trees belong in the category trees, while the Atlantic and the Pacific belong in the category oceans.

Some categories are natural, in that their members are part of the natural world.

Other categories are artificial, in that they have been contrived by experimenters who want to know more about how categorization works.

A concept is the mental representation of a category, usually abstracted from particular instances. Concepts serve important mental functions: they group related entities together into classes, and provide the basis for synonyms, antonyms, and implications.  Concepts summarize our beliefs about how the world is divided up into equivalence classes, and about what entire classes of individual members have in common.

Generally, we think of our mental concepts as being derived from the actual categorical structure of the real world, but there are also points of divergence:

Categories may exist in the real world, without being mentally represented as concepts.

Concepts may impose a structure on the world that does not exist there.

Technically, categories exist in the real world, while concepts exist in the mind. However, this technical distinction is difficult to uphold, and psychologists commonly use the two terms interchangeably. In fact, objective categories may not exist in the real world, independently of the mind that conceives them (a question related to the philosophical debate between realism and idealism).  Put another way, the question is whether the mind picks up on the categorical structure of the world, or whether the mind imposes this structure on the world.  

Some categories may be defined through enumeration: an exhaustive list of all instances of a category. A good example is the letters of the English alphabet, A through Z; these have nothing in common except their status as letters in the English alphabet.

A variant on enumeration is to define a category by a rule which will generate all instances of the category (these instances all have in common that they conform to the rule). An example is the concept of integer in mathematics, which is defined as the numbers 0, 1, and any number which can be obtained by adding or subtracting 1 from these numbers one or more times.
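
The difference between definition by enumeration and definition by rule is easy to see in code.  Here is a minimal sketch, using both examples from the text:

```python
# Definition by enumeration: an exhaustive list of all instances.
# The letters have nothing in common except appearing on the list.
LETTERS = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ")

def is_letter(x):
    return x in LETTERS

# Definition by rule: the instances are whatever the rule generates --
# here, 0 and anything reachable by repeatedly adding or subtracting 1.
def integers(up_to):
    yield 0
    for n in range(1, up_to + 1):
        yield n
        yield -n

print(is_letter("Q"))        # True
print(sorted(integers(2)))   # [-2, -1, 0, 1, 2]
```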

The most common definitions of categories are by attributes: properties or features which are shared by all members of a category. Thus, birds are warm-blooded vertebrates with feathers and wings, while fish are cold-blooded vertebrates with scales and fins. There are three broad types of attributes relevant to category definition: 
perceptual or stimulus features help define natural categories like birds and fish; 
functional attributes, including the operations performed with or by objects, or the uses to which they can be put, are used to define categories of artifacts like tools (instruments which are worked by hand) or vehicles (means of transporting things); 
relational features, which specify the relationship between an instance and something else, are used to define many social categories like aunt (the sister of a father or a mother) or stepson (the son of one's husband or wife by a former marriage). 

Of course, some categories are defined by mixtures of perceptual, functional, and relational features.

Still, most categories are defined by attributes, meaning that concepts are summary descriptions of an entire class of objects, events, and ideas. There are three principal ways in which such categories are organized: as proper sets, as fuzzy sets, and as sets of exemplars.
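
To preview those three modes of organization, here is a toy sketch of each as a classification rule for the category bird.  The features, weights, and thresholds are invented for illustration; Smith and Medin (1981, cited below) discuss the actual proposals:

```python
# Three toy classifiers for the category "bird".
# All features, weights, and thresholds below are invented.

DEFINING  = {"warm-blooded", "feathers", "wings"}          # proper set
PROTOTYPE = {"flies": 2, "sings": 1, "feathers": 3}        # fuzzy set
EXEMPLARS = [{"flies", "sings", "feathers"},               # stored instances
             {"flies", "feathers", "swims"}]

def proper_set(features):
    # Membership requires ALL singly necessary, jointly sufficient features.
    return DEFINING <= features

def fuzzy_set(features, threshold=4):
    # Membership is graded: enough characteristic features will do.
    return sum(w for f, w in PROTOTYPE.items() if f in features) >= threshold

def exemplar(features):
    # Membership by resemblance to the most similar stored instance.
    return max(len(features & e) for e in EXEMPLARS) >= 2

penguin = {"warm-blooded", "feathers", "wings", "swims"}
print(proper_set(penguin), fuzzy_set(penguin), exemplar(penguin))
# True False True -- the three views can disagree about a borderline case
```

The fact that the three classifiers can disagree about a borderline case like the penguin is exactly why the three views make different empirical predictions.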

Now, having defined the difference between the two terms, we are going to use them interchangeably again.  The reason is that it's boring to write concept all the time; moreover, the noun category has the cognate verb categorize and noun categorization, while concept does not (unless you count conceptualization, which is a mouthful that doesn't mean quite the same thing as categorization).

Still, the semantic difference between concepts and categories raises two particularly interesting issues for social categorization:

To what extent does the categorical structure of the social world exist in the real world outside the mind, to be discovered by the social perceiver, and to what extent is this structure imposed on the world by the social perceiver?

To what extent are social categories "natural", and to what extent are they "artificial"?

Concepts and categories are just about the most interesting topic in all of psychology and cognitive science, and two very good books have been written on the subject.  They are highly recommended:

Categories and Concepts by E.E. Smith and D.L. Medin (Harvard University Press, 1981).

The Big Book of Concepts by G.L. Murphy (MIT Press, 2002).

Here in Berkeley's Psychology Department, Prof. Eleanor Rosch -- who made fundamental contributions to the "prototype" view of conceptual structure -- gives a wonderful course on the subject.  Prof. George Lakoff, who has also made fundamental contributions to our understanding of concepts and categories, gives a similar course in the Linguistics Department.  

The study of social categorization encompasses a wide variety of social categories:

types of persons (nouns representing different kinds of people, including forms of mental illness);

types of social groups (including social stereotypes);

types of situations and interactions (mostly nouns again); and

types of actions (especially trait adjectives).

Mostly, social categorization has been studied in the domains of persons and social groups.

The Karass and the Granfalloon

In his novel Cat's Cradle (1963), Kurt Vonnegut makes a distinction between two types of social categories:

The Granfalloon, a recognized grouping of people that have no real relationship with each other.

The Karass, a group of people whose relationships with each other are profound but unknown.

Vonnegut's example of a granfalloon is the term Hoosiers, referring to residents of the state of Indiana.

In the novel, Vonnegut invents a religion, Bokononism, that celebrates people's karasses.

With social categories -- with any categories, really, but especially with social categories -- it's important to consider whether the category in question is a karass -- a category that really means something -- or a granfalloon.


Us and Them

Perhaps the most basic scheme for social categorization divides the world into two groups: Us and Them -- or, to use the technical terms of sociology and social psychology, the ingroup and the outgroup.  As Sumner put it (1906, p. 12):

The insiders in a we-group are in a relation of peace, order, law, government, and industry, to each other.  Their relation to all outsiders, or others-groups, is one of war and plunder....  Sentiments are produced to correspond.  Loyalty to the group, sacrifice for it, hatred and contempt for outsiders, brotherhood within, warlikeness without -- all grow together, common products of the same situation.


The Robbers Cave Experiment

The division of the social world into Us and Them is vividly illustrated by one of the earliest examples of experimental social psychology -- the "Robbers Cave" experiment conducted by Muzafer Sherif and his colleagues.  Through extensive pretesting, Sherif et al. identified a group of 22 5th-grade boys from Oklahoma City who were absolutely "average" in every imaginable way.  These children were then offered a vacation at a camp located at Robbers Cave State Park (hence the name).


In Stage 1 of the experiment, the boys were divided into two groups, unbeknownst to each other, and assigned to physically separate campsites.  For one week, each group engaged independently in activities designed to foster intragroup cohesion and the establishment of a leadership hierarchy.

One group of boys named themselves the "Rattlers".

The other group named themselves the "Eagles".

In Stage 2, the two groups were brought together for a series of tournaments.  There the researchers observed the development of considerable intergroup competitiveness and hostility; they also observed shifts in leadership within each group.

Remember that, by virtue of the way these boys were selected, the two groups were similar in every imaginable way.  There were no "natural" group distinctions by gender (of course), race or ethnicity, social class, or academic achievement.  But once the boys had been formed into groups, however arbitrary, group membership dominated both intra- and inter-group relations.

Aside from ordinary observation, Sherif and his colleagues conducted a number of experimental tests to document the competition and hostility between the two groups.  In one such study, they scattered beans on a playing field, and had the two groups compete to see who could pick up the most (and stuff them in a bag through a very small opening).  Before the actual count, the experimenters showed photographs, ostensibly of the contents of the bags collected by the two groups, and asked the boys to estimate the number of beans in each bag.  In fact, each of the displays contained exactly 35 beans.  Nevertheless, members of the Eagles estimated that they had collected more beans than the Rattlers, and the Rattlers estimated that they had collected more beans than the Eagles.

In Stage 3, Sherif et al. engaged the two groups in noncompetitive, cooperative activity for the good of all -- such as using a rope, previously used in a tug of war, to haul a delivery truck out of a ditch.  In fact, these staged crises were successful in reducing intergroup friction and inducing intergroup cooperation.


The Minimal Group Paradigm

In the Robbers Cave experiment, the two groups achieved a clear group identity before they were brought together, and initially encountered each other in an environment of competition for limited resources -- precisely the circumstances in which Sumner thought that a distinction between Us and Them would emerge.  But it turns out that competition for limited resources is unnecessary for the division into ingroup and outgroup to occur.

A series of classic experiments by Henri Tajfel and his colleagues (1971; Billig & Tajfel, 1973) employing the minimal group paradigm shows how powerful social categorization can be.  In these experiments, Tajfel assigned subjects to groups on an essentially arbitrary basis -- for example, based on their expressed preferences for the paintings of Wassily Kandinsky vs. Paul Klee -- or, in the most dramatic instance, based on the results of a coin-toss.  Members of the two groups did not know the other members of either group.  They had no experiential basis for the formation of ingroup and outgroup stereotypes.  And they had no history of group interaction that could lead to the formation of differential attitudes.  Nevertheless, when group members were given the opportunity to distribute rewards to other group members, the subjects consistently favored members of their own ingroup.

Based on this line of research, Tajfel and Turner (1979) formulated social identity theory, which argues that there are two sources of self-esteem: one's own personal status and accomplishments, and the status and accomplishments of the groups of which one is a member.  By boosting the status of their own ingroup, compared to outgroups, individuals indirectly increase their own status and self-esteem.  A related phenomenon is basking in reflected glory, by which individual group members receive boosts in self-esteem based on the achievements of their ingroups, even though they themselves had nothing to do with those achievements -- and even when their connection to the group is tenuous.

An interesting phenomenon of group membership is the outgroup homogeneity effect (Allen & Wilder, 1979).  In their experiment, Allen and Wilder took pre-experimental measures of attitudes toward various topics.  Subjects were then arbitrarily assigned to two groups, ostensibly on the basis of their preferences for paintings by Kandinsky or Klee, as in the original experiment by Tajfel et al.  Then they were asked to predict the responses of ingroup and outgroup members to various attitude statements.  Subjects ascribed attitudes to other group members in such a manner as to decrease the perceived attitudinal similarity between themselves and other ingroup members, increase the perceived attitudinal similarity among members of the outgroup, and also to increase the perceived difference between ingroup and outgroup.  This was true even for attitude statements that had nothing to do with abstract art.

The Outgroup Homogeneity Effect in Literature

Kurt Vonnegut must have read a lot of social psychology.  Another of his novels, Slapstick: Or, Lonesome No More (1976), uses the outgroup homogeneity effect as a kind of plot device.  In this novel, a computer randomly assigns every person a new middle name, such as Daffodil-11 or Raspberry-13.  Almost immediately, the Daffodil-11s and the Raspberry-13s organize themselves into interest groups.

So, the mere division of people into two groups, however arbitrary, seems to create two mental categories, Us and Them, with "people like us" deemed more similar to each other than we are to "people like them".

The Us-Them situation becomes even more complicated when you consider how many ingroups we are actually members of, each ingroup entailing a corresponding outgroup.

As an exercise, try to determine how many ingroups you're a member of, and see how many different outgroups those ingroup memberships entail.

The basic division of the social world into Us and Them, ingroups and outgroups, is the topic of Us and Them: Understanding Your Tribal Mind by David Berreby (Little, Brown, 2005).  In his book, Berreby analyzes what he sees as "a fundamental human urge to classify and identify with 'human kinds'" (from "Tricky, Turbulent, Tribal" by Henry Gee, Scientific American, 12/05).

Arguably, an even more fundamental categorical distinction is between Self and Other, about which more later.


Categories of Persons

What are the natural categories in the domain of persons?  Here's a list, inspired by the lexicographical work of Roger Brown (1980):

sex or gender;

kinship;

age;

occupation;

nationality;

race or ethnicity;

personality types; and

social stereotypes.


Gender Categories

At first blush, the gender categories look simple enough: people come in two sexes, male and female, depending on their endowment of sex chromosomes, XX or XY.  But it turns out that things are a little more complicated than this, so that gender categorization provides an interesting example of the intersection of natural and artificial, and biological and social, categories.

As it happens, chromosomal sex (XX or XY) is not determinative of phenotypic sex (whether one has male or female reproductive anatomy).  As in everything else, heredity interacts with environment, and in this case the hormonal environment of the fetus is particularly important in gender differentiation.  Sometimes due to accidents of genetics, as in Klinefelter's syndrome (XXY) and Turner's syndrome (XO), but mostly due to accidents of the endocrine system, individuals can be born with ambiguous external genitalia.  It is possible, for example, to be chromosomally male but phenotypically female (e.g., the androgen-insensitivity syndrome), or to be chromosomally female but phenotypically male (e.g., congenital adrenal hyperplasia).  

What to do with these cases of pseudohermaphroditism?  (There are no true hermaphrodites, who would have the complete reproductive anatomies of both males and females -- except in mythology.)  For a long time they were simply ignored.  Then, in an attempt to help people with these conditions lead better lives, they were often surgically "corrected" so that their external genitalia more closely corresponded to the male or (usually) female ideal -- see, for example, the cases described in Man and Woman, Boy and Girl by J. Money & A. Ehrhardt (1972).

More recently, however, some authorities have argued that such individuals constitute their own gender categories.  For example, Anne Fausto-Sterling (in Myths of Gender, 1985, 1992; and especially in Sexing the Body, 2000) has identified three "intersex" gender categories, whose members deviate from the "Platonic ideal":

chromosomally male, with female reproductive anatomy;

chromosomally female, with male reproductive anatomy;

chromosomally and anatomically half male and half female ("ovotestis", where the gonadal tissue has differentiated into one testis and one ovary).

Rather than force these individuals to conform to the Platonic ideal for males or females, Fausto-Sterling argues that they constitute separate gender categories, and should be acknowledged as such and considered to be normal, not pathological.  According to Fausto-Sterling's account, then, there are really five sexes, not two.  Put another way, the categorization of people into two sexes is a social construction, imposed on the individual by society.  

Fausto-Sterling's argument is provocative, but it is also controversial.  See, for example, "How common is intersex? A response to Anne Fausto-Sterling" by L. Sax, Journal of Sex Research, 2002.


Gender Identity

It is one thing to be male or female biologically, and another thing to identify oneself as such.  Most people, even most of those who fall into the "intersex" category, identify themselves as either male or female.  Even transgender individuals will identify themselves as "a man trapped in a woman's body" (meaning that they identify themselves as male), or the reverse (meaning that they identify themselves as female).  Gender identity usually corresponds to phenotypic sex, but this is not necessarily the case.  In any event, with respect to social cognition, we are mostly interested in gender identity -- how people identify and present themselves with respect to gender.  In Fausto-Sterling's world, there would be at least a third category for gender identity, intersex.

Transgendered and Transsexual

Sex researchers, endocrinologists, and feminists can debate whether there are five categories of gender, but there's no question that a third category of transgender individuals has begun to emerge.  The definition of "transgender" is a little ambiguous (no joke intended), but generally appears to refer to people who are for whatever reason uncomfortable with the gender of their birth (or, put in terms compatible with social constructivism, their assigned gender).  A transgender male may simply not identify himself as a male; alternatively, he may identify himself as a female, in which case we may speak of a transsexual individual.  Transsexuals may seek to have their bodies surgically altered to conform to their gender identities.  Transgender individuals may not go that far, because they do not necessarily identify themselves as either male or female.  

For an article on transgender and transsexual students on American college campuses, see "On Campus, Rethinking Biology 101" by Fred A. Bernstein, New York Times, 03/07/04.


Gender Role

Beyond subjective matters of gender identity, there is the matter of gender role -- the individual's public display of characteristics associated with masculinity or femininity.  It turns out that having a male or female gender identity does not necessarily mean that the person will adopt the "corresponding" masculine or feminine gender role.

While every culture has some standard for appropriately masculine and feminine behavior, these standards are not the same from one culture to another.  In American culture (and in European culture as well), masculinity is associated with agency and femininity is associated with communality.  But in other cultures, as Margaret Mead documented in her book Sex and Temperament, these can be reversed.

A person can be male (or female), and identify him- or herself as such, but still not adopt the corresponding gender role.  There are "masculine" women who nonetheless identify themselves as women, and "effeminate" men who nonetheless identify themselves as men.

Although masculinity and femininity would seem to be opposite ends of a single bipolar trait, work by Sandra Bem and Janet Taylor Spence, among others, has shown that masculinity and femininity are in fact independent of each other.  Masculinity does not contradict femininity, and it is possible to be high on both -- or low on both, for that matter.  According to this analysis, we have four categories of gender role (see the sketch after this list):

masculine

feminine

androgynous -- i.e., displaying both masculine and feminine characteristics; and

undifferentiated -- i.e., displaying neither masculine nor feminine characteristics.
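
Here is a minimal sketch of how such a fourfold classification is derived in practice, assuming (as in Bem-style inventories) that each person receives separate masculinity and femininity scores, each split at some cutoff such as the sample median.  The scores and cutoff below are invented placeholders:

```python
def gender_role(masc, fem, cutoff=4.9):
    """Cross two independent dimensions to yield four categories.
    The cutoff (e.g., a sample median) is an invented placeholder."""
    if masc >= cutoff and fem >= cutoff:
        return "androgynous"
    if masc >= cutoff:
        return "masculine"
    if fem >= cutoff:
        return "feminine"
    return "undifferentiated"

print(gender_role(5.6, 5.2))   # androgynous: high on both dimensions
print(gender_role(3.1, 3.4))   # undifferentiated: low on both
```

The point of the sketch is that the four categories fall out of crossing two independent dimensions, rather than from cutting a single masculine-feminine continuum.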


Sexual Orientation

Then there is the matter of erotic and sexual orientation (not "preference", as some would have it -- as if what turns people on sexually were a matter of choice, like choosing between corn flakes and shredded wheat for breakfast).  Most people are heterosexual, with men falling in love with, and having sex with, women, and women falling in love with, and having sex with, men.  But there are other varieties of sexual orientation:

homosexuality;

bisexuality; and

asexuality.

According to the conventional view of gender, everything is given by the genes: XYs become males who identify themselves as such, become masculine, and make love with women; XXs become females who identify themselves as such, become feminine, and make love with men.  In this view, there are only two gender categories, male and female; they are dichotomous; and everything else flows from this.

But it turns out that gender categories are more complicated than this.  If there are really

5 categories of gender (assuming that Fausto-Sterling is right),

3 categories of gender identity (counting intersex),

4 gender roles, and

4 sexual orientations,

and they really are to some extent orthogonal to each other, then crossing them yields 240 (5 x 3 x 4 x 4) possible gender-related categories -- a long way from 2!  Which categorization we choose depends on how we, individually and as a society, think about gender.  In other words, what looks like a natural biological category has some elements of a social construction.
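
The arithmetic is just the Cartesian product of the four classification schemes, assuming they are fully crossed.  A quick check, as a sketch:

```python
from itertools import product

sexes        = 5   # Fausto-Sterling's five sexes
identities   = 3   # male, female, intersex
roles        = 4   # masculine, feminine, androgynous, undifferentiated
orientations = 4   # hetero-, homo-, bi-, and asexual

# Every combination of the four (assumed orthogonal) classifications:
combinations = list(product(range(sexes), range(identities),
                            range(roles), range(orientations)))
print(len(combinations))   # 240 = 5 * 3 * 4 * 4
```

Of course, if the four schemes are not fully orthogonal -- if some combinations rarely or never occur -- the number of occupied categories would be smaller.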


Kinship Categories

People can be classified as male or female (etc.), but they can also be classified by their relationships to each other.

The nuclear family, celebrated by popular television shows of the 1950s such as Father Knows Best and The Adventures of Ozzie and Harriet, consists of four kinship categories:

mother;

father;

son (who is also a brother to any other siblings); and

daughter (who is also a sister to any other siblings).

Of course, there is also an extended family, consisting (depending on how far out it is extended) of additional kinship categories, both paternal (on the father's side) and maternal (on the mother's side):

great-grandparents;

grandparents;

uncles and aunts;

cousins, including first, second, third cousins and beyond (John Kerry and George Bush are 16th cousins);

nephews and nieces;

grandchildren; and

great-grandchildren -- at least!

None of this takes into account the kinship categories created by divorce and remarriage, such as:

stepfather and stepmother;

stepson and stepdaughter.

Never mind the difficulties created by "foster" families.  We're talking only about blood relations -- relations determined by consanguinity -- here.

Again, given that we are talking about relations that are determined by shared blood, it would seem that kinship categories are natural, and are biologically defined:

fathers and mothers are the biological parents of their sons and daughters;

grandparents are the biological parents of fathers and mothers;

great-grandparents are the biological parents of grandparents;

aunts and uncles are the brothers and sisters of fathers and mothers;

cousins are the biological sons and daughters of aunts and uncles;

stepchildren are the biological sons and daughters of their mothers or fathers, but not of their step-fathers or -mothers.

As such, it would seem that everyone would share the same set of "natural" kinship categories.  But it turns out that this isn't true.  Nerlove and Romney (1967) found wide variation in the kinship categories employed by various cultures.  For example:

"Type A" cultures have words for siblings only, and do not further differentiate them according to gender;

"Type B" cultures, such as English, have separate categories for brothers and sisters;

"Type C" cultures make distinctions between elder and younger siblings, but for brothers only;

"Type H" cultures distinguish between elder and younger sisters as well as brothers;

"Type G" cultures distinguish between parallel and cross-sex siblings, so that they have different terms for the sister of brothers and for the sister of other sisters.

"Type L" cultures are like "Type G" cultures, except that they make further distinctions between, e.g., the older brother of sisters and the younger brother of brothers.

To take a particularly interesting example, Hopi sibling terminology has specific terms for:

an elder brother;

an elder sister;

the younger sister of a male; and

a single term that refers both to the younger brother of a male and to the younger sibling (brother or sister) of a female.

This constellation of sibling terms makes sense in the context of Hopi culture (Eggan, 1950; Nerlove & Romney, 1967), and this reinforces the point that social categorization may be quite different from biological categorization, and that social categorization serves specifically social purposes.
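
One way to appreciate how relational these categories are is to write the Hopi scheme as a lookup table mapping (speaker's sex, sibling's sex, relative age) to a kin term.  This is only a sketch of the structure described above; the actual Hopi words are not given here, so TERM_1 through TERM_4 are placeholders:

```python
# Hopi sibling terminology as a mapping from the relation to a term.
# Keys: (speaker's sex, sibling's sex, relative age).
# TERM_1..TERM_4 are hypothetical placeholders, not the real Hopi words.
HOPI_SIBLING_TERMS = {
    ("male",   "male",   "elder"):   "TERM_1",  # an elder brother
    ("female", "male",   "elder"):   "TERM_1",
    ("male",   "female", "elder"):   "TERM_2",  # an elder sister
    ("female", "female", "elder"):   "TERM_2",
    ("male",   "female", "younger"): "TERM_3",  # the younger sister of a male
    ("male",   "male",   "younger"): "TERM_4",  # the younger brother of a male,
    ("female", "male",   "younger"): "TERM_4",  # or any younger sibling
    ("female", "female", "younger"): "TERM_4",  # of a female
}

# The same biological relation gets different labels depending on who speaks:
print(HOPI_SIBLING_TERMS[("male", "female", "younger")])    # TERM_3
print(HOPI_SIBLING_TERMS[("female", "female", "younger")])  # TERM_4
```

Note that the term depends not just on the sibling, but on who is speaking -- something English sibling terms (brother, sister) never do.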


Marital Status Categories

As a variant on kinship categories, we also classify people by their marital status.  

The big category here is married vs. single.

Within the "single" category, there are a number of subcategories, including:

bachelor, referring to an unmarried male (interestingly, it seems that English doesn't have a neutral cognate category for an unmarried female);

divorcee (typically applied to women who have been divorced);

spinster, a pejorative term applied to an unmarried female (interestingly, it seems that English doesn't have a cognate category for such an unmarried male); and

widow and widower.

In 2004, a controversy erupted over whether gays and lesbians should have the right to marry (in 2003 the Episcopal Church considered a proposal to solemnize unions between same-sex partners, and in 2004 Gavin Newsom, the Mayor of San Francisco, ordered the City Registrar to issue marriage licenses to same-sex couples), prompting President George W. Bush to call for an amendment to the US Constitution that would restrict marriage to a "union of a man with a woman" (in the language of one proposed amendment).  Other arrangements would be called civil unions or some such, but they would not necessarily have the same legal status as a marriage.  Setting aside discussion of the wisdom of this proposal, it seems to be an attempt to apply a classical, proper-set view of concepts to the concept of marriage.  That is, the union of "a man with a woman" would be the singly necessary and jointly sufficient feature defining the concept of marriage.  But as one lesbian participant in the San Francisco same-sex-marriage marathon noted, "I live in the suburbs, I have two kids, and I drive an SUV -- why shouldn't I be able to get married?" (or words to this effect).  Clearly, she was defining marriage in terms of nonnecessary features.  Perhaps she thinks of marriage as a fuzzy set, where the union of "one man with one woman" is the prototype, but other kinds of marriages are possible.


Age Categories

Age is another "natural", "biological" variable: we're born at age 0, and die some time later.  If we're lucky, we pass through infancy and childhood, puberty, adolescence, adulthood, and old age.  In strictly biological terms, more or less, infancy starts at birth, puberty marks the boundary between childhood and adolescence, and death marks the end of adulthood; where the boundary lies between adolescence and adulthood is uncertain, as even casual observation indicates.

But where are the boundaries between infancy and childhood, and between adolescence and adulthood?  Although Brown identified age as a "natural category" of persons, it is also clear that, at least to some extent, even age categories are social conventions.  

You might say that childhood begins with the onset of walking and talking, but some theorists have argued that childhood itself is a fairly recent cultural invention.  

Chief among these was Philippe Aries, whose 1960 book The Child and the Family in the Ancient Regime (translated as Centuries of Childhood and published in English in 1963) argued that "in medieval society, the idea of childhood did not exist" (p. 125), and that children began to work, like adults, as soon as they were physically able.  The idea of childhood as a separate period of life between infancy and adolescence came into existence only in the 17th century, and then only in the upper socioeconomic classes; acceptance of the idea of childhood by the lower and lower-middle classes, as indicated by the passage of child-labor laws, occurred only in the late 19th and early 20th centuries.

Aries's critics argued that he relied too much on unrepresentative and unreliable evidence, such as portraits in which children were dressed as adults.  But his essential idea, that society's view of childhood underwent considerable revision and evolution, has staying power.

And in the days when young people married very early, the boundary between adolescence and adulthood just didn't exist.

Consider Shakespeare's Romeo and Juliet: Juliet is only 13 (OK, she's almost 14), and her father would have been perfectly happy to marry her off to someone who wasn't named Montague.

Taking a leaf from Aries's book, Patricia Cohen has argued (in In Our Prime: The Invention of Middle Age, 2012) that "middle age" is likewise a social construction.  She notes that before the middle of the 19th century, people were identified by life markers such as marriage and parenthood, but not by age per se.  Even the US Census didn't ask people their exact birthdate until 1900.  

The concept of middle age, not to mention the midlife crisis, gained cultural momentum after 1950, with the publication of Erik Erikson's "Eight Ages of Man", discussed below, which described stages of adult development.

Moreover, even if certain age categories are biologically "natural", societies seem to invent subcategories within them.  

Child-rearing books commonly divide childhood into particular "stages" such as toddlerhood, the "terrible twos", and the "white-food" stage, that children move into and grow out of.  

The period between ages five and seven -- the so-called "five-to-seven shift" -- gets a great deal of attention from developmental psychologists, including Freud and Piaget (for quite different reasons).

Adolescence is now marked into subdivisions such as "pre-teen" and the "'tweens".

In adulthood, we commonly distinguish among young adulthood, middle age, and old age -- and, within old age, between the merely elderly and the "old old".

Even before birth, legal, moral, and religious debates about abortion divide prenatal development into stages based on three trimesters.


Freud's Stages of Psychosexual Development

Freud divided childhood into a succession of five stages of psychosexual development:


Oral

Anal

Phallic

The Latency Period

Genital

For Freud, all instincts have their origins in some somatic irritation -- almost literally, some itch that must be scratched. In contrast to hunger, thirst, and fatigue, however, Freud thought that the arousal and gratification of the sexual instincts could focus on different portions of the body at different times. More specifically, he argued that the locus of the sexual instincts changed systematically throughout childhood, and stabilized in adolescence. This systematic change in the locus of the sexual instincts comprised the various stages of psychosexual development. According to this view, the child's progress through these stages was decisive for the development of personality.

Properly speaking, the first stage of psychosexual development is birth, the transition from fetus to neonate. Freud himself did not focus on this aspect of development, but we may fill in the picture by discussing the ideas of one of his colleagues, Otto Rank (1884-1939).

Rank believed that birth trauma was the most important psychological development in the life history of the individual. He argued that the fetus, in utero, gets primary pleasure -- immediate gratification of its needs. Immediately upon leaving the womb, however, the newborn experiences tension for the first time. There is, first, the overstimulation of the environment. More important, there are the small deprivations that accompany waiting to be fed. In Rank's view, birth trauma created a reservoir of anxiety that was released throughout life. All later gratifications recapitulated those received during the nine months of gestation. By the same token, all later deprivations recapitulated the birth trauma.

Freud disagreed with the specifics of Rank's views, but he agreed that birth was important. At birth, the individual is thrust, unprotected, into a new world. Later psychological development was a function of the new demands placed on the infant and child by that world.

From birth until about one year of age, the child is in the oral stage of psychosexual development. The newborn child starts out as all id, and no ego. He or she experiences only pleasure and pain. With feeding, the infant must begin to respond to the demands of the external world -- what it provides, and the schedule on which it does so. Initially, Freud thought, instinct-gratification was centered on the mouth: the child's chief activity is sucking on breast or bottle. This activity has obvious nutritive value: it is the way the child copes with hunger and thirst. But, Freud held, it also has sexual value because the child takes pleasure in sucking; and it has destructive value because the child can express aggression by biting.

Freud pointed out that the very young child needs his or her mother (or some reasonable substitute) for gratification. Her absence leads to frustration of instinctual needs, and the development of anxiety. Accordingly, the legacy of the oral stage is separation anxiety and feelings of dependency.

After the first year, Freud held, the child moves into the anal stage of development. The central event of the anal stage is toilet training. Here the child has his or her first experience with the external regulation of impulses: the environment teaches him or her to delay urination or defecation until an appropriate time and place. Thus, the child must postpone the pleasure that comes from relieving tension in the bladder and rectum. Freud believed that the child in the anal stage acquired power by virtue of giving and retaining. Through this stage of development, the child also acquired a sense of loss, and also a sense of self-control.

The years from three to five, in Freud's view, were taken up with the phallic stage. In this case, there is a preoccupation with sexual pleasure derived from the genital areas. It is at about this time that the child begins to develop sexual curiosity, exhibits its genitalia to others, and begins to masturbate. There is also an intensification of interest in the parent of the opposite sex. The phallic stage revolves around the resolution of the Oedipus Complex, named for the legendary Greek king who killed his father and married his mother, and brought disaster to his country. In the Oedipus complex, there is a sexual cathexis toward the parent of the opposite sex, and an aggressive cathexis toward the parent of the same sex.

The beginnings of the Oedipus Complex are the same for boys and girls. Both initially love the mother, simply because she is the child's primary caretaker -- the one most frequently responsible for taking care of the child's needs. In the same way, both initially hate the father, because he competes with the child for the mother's attention and love. Thereafter, however, the progress and resolution of the Oedipus complex takes a different form in the two sexes.

The male shows the classic pattern known as the Oedipus Complex. The boy is already jealous of the father, for the reasons noted earlier. However, this emotion is coupled with castration anxiety: the child of this age is frequently engaged in autoerotic activities of various sorts, which are punished when noticed by the parents. A frequent threat on the part of parents is that the penis will be removed -- a threat reinforced by the boy's observation that the girls and women around him do not, in fact, have penises. As the boy's love for his mother intensifies into incestuous desire, the risk is correspondingly increased that he will be harmed by his father. However, the father appears overwhelmingly powerful to the child, and thus must be appeased. Accordingly, the child represses his hostility and fear, and through reaction formation turns them into expressions of love. Similarly, the mother must be given up, and the boy's sexual longing for her repressed. The final solution, Freud argued, is identification with the father. By making his father an ally instead of an enemy, the boy can obtain, through his father, vicarious satisfaction of his desire for his mother.

Girls show a rather different pattern, technically known as the Electra Complex after the Greek heroine who avenged her father's death. The Electra Complex is not, as some might think, the mirror-image of the Oedipus Complex in boys. The young girl has the usual feelings of love toward her mother as caretaker, Freud believed, but harbors no special feelings toward her father. Girls, Freud noted, were not typically punished for autoerotic activity -- perhaps because they did not engage in it as often, perhaps simply because it is less obvious. Eventually, Freud believed, the girl discovers that she lacks the external genitalia of the boy. This leads to feelings of disappointment and castration that are collectively known as penis envy. She blames her mother for her fate, and envies her father because he possesses what she does not have. Thus the sexual cathexis for the mother is weakened, while the one for the father is simultaneously strengthened. The result is that the girl loves her father, but feels hatred and jealousy for her mother. The girl seeks a penis from her father, and sees a baby as a symbolic substitute. In contrast to the situation in boys, girls do not have a clear-cut resolution to the Electra Complex. For them, castration is not a threat but a fact. Eventually, the girl identifies with her mother in order to obtain vicarious satisfaction of her love for her father.

It should now be clear why Freud named this the "phallic" stage, when only one of the sexes has a phallus. In different ways, he argued, children of both sexes were interested in the penis. The first legacy of the phallic stage, for both sexes, is the development of the superego. The child internalizes social prohibitions against certain sexual object-choices, and also internalizes his or her parents' system of rewards and punishments. (Because girls are immune to the threat of castration, Freud thought, women had inherently weaker consciences than men.) The second legacy, of course, is psychosexual identification. The boy identifies with his father, the girl with her mother. In either case, the child takes on the characteristic role and personality of the parent of the same sex.

The phallic stage is followed by the latency period, extending approximately from five to eleven years of age. In this interval, Freud thought that the sexual instincts temporarily subsided. In part, this was simply because there is a slowing of the rate of physical growth. A more important factor in this state of affairs, however, is the set of defenses brought to bear on the sexual instincts during and after the resolution of the Oedipus Complex. During this time, the child is not truly inactive. On the contrary, the child is actively learning about the world, society, and his or her peers.

Finally, with the onset of puberty at about age 12, the child enters the genital stage. This stage continues the focus on socialization begun in the latency period. The coming of sexual maturity reawakens the sexual instincts, which had been dormant throughout the latency period. However, the sexual instincts show a shift away from primary narcissism, in which the child takes pleasure in stimulating his or her own body, to secondary narcissism, in which the child takes pleasure in identifying with his or her ego-ideal. Thus, sexuality itself undergoes a shift from an orientation toward pleasure to one oriented toward reproduction, in which pleasure is secondary. The adolescent's attraction to the opposite sex is, for the first time, coupled with ideas about romance, marriage, and children. When the adolescent (or adult) becomes sexually active, events in the earlier stages will influence the nature of his or her genital sexuality -- for example, in those body parts which are sexually arousing, and in preferences for foreplay.


Erik Erikson's Eight Ages of Man

Erik Erikson was the most prominent disciple of Freud active after World War II (in fact, he was psychoanalyzed by Anna Freud) -- and, after Freud himself, perhaps the psychoanalyst who has had the most impact on popular culture. Erikson focused his attention on the issue of ego identity, which he defined as the person's awareness of him- or herself, and of his or her impact on other people. Interestingly, this was an issue for Erikson personally (for a definitive biography of Erikson, see Coles, 1970; for an autobiographical statement, see Erikson, 1970, reprinted 1975).

Erikson described himself as a "man of the border". He was a Dane living in Germany, the son of a Jewish mother and a Protestant father, both Danes. Later his mother remarried, giving Erikson a German Jewish stepfather. Blond, blue-eyed, and tall, he experienced the pervasive feeling that he did not belong to his family, and entertained the fantasy that his origins were quite different from what his mother and her husband led him to believe. A similar problem afflicted him outside his family: the adults in his parents' synagogue referred to him as a gentile, while his schoolmates called him a Jew. Erikson's adoptive name was Erik Homburger. Later he changed it to Erik Homburger Erikson, and still later just Erik Erikson -- assuming a name that, taken literally, meant that he had created himself.

Erikson agreed with the other neo-Freudians that the primary issues in personality are social rather than biological, and he de-emphasized the role of sexuality. His chief contribution was to expand the notion of psychological development, considering the possibility of further stages beyond the genital stage of adolescence. At the same time, he gave a social reinterpretation to the original Freudian stages, so that his theory is properly considered one of psychosocial rather than of psychosexual development.

Erikson's developmental theory is well captured in the phrase, "the eight ages of man". His is an epigenetic conception of development similar to Freud's, in which the individual must progress through a series of stages in order to achieve a fully developed personality. At each stage, the person must meet and resolve a particular crisis. In so doing, the individual develops particular ego qualities; these are outlined in Erikson's most important book, Childhood and Society (1950), and in Identity: Youth and Crisis (1968). In Insight and Responsibility (1964), he argued that each of these strengths was associated with a corresponding virtue or ego strength. Finally, in Toys and Reasons (1976), Erikson argued that a particular ritualization, or pattern of social interaction, develops alongside the qualities and virtues. Although Erikson's theory emphasizes the development of positive qualities, negative attributes can also be acquired. Thus, each of the eight positive ego qualities has its negative counterpart. Both must be incorporated into personality in order for the person to interact effectively with others -- although, in healthy development, the positive qualities will outweigh the negative ones. Similarly, each positive ritualization that enables us to get along with other people has its negative counterpart in the ritualisms that separate us from them. Development at each stage builds on the others, so that successful progress through the sequence provides a stable base for subsequent development. Personality development continues throughout life, and ends only at death.

Stage 1: Trust, mistrust, and hope. The oral-sensory stage of development covers the first year of life. In this stage the infant hungers for nourishment and stimulation, and develops the ability to recognize objects in the environment. He or she interacts with the world primarily by sucking, biting, and grasping. The developmental crisis is between trust and mistrust. The child must learn to trust that his or her needs will be satisfied frequently enough. Other people, for their part, must learn to trust that the child will cope with his or her impulses, and not make their lives as caregivers too difficult. By the same token, if others do not reliably satisfy the child's needs, or make promises that they do not keep, the child acquires a sense of mistrust. As noted earlier, both trust and mistrust develop in every individual -- though in healthy individuals, the former outweighs the latter.

Out of the strength of trust the child develops the virtue of hope: "the enduring belief in the attainability of fervent wishes, in spite of the dark urges and rages which mark the beginning of existence". The basis for hope lies in the infant's experience of an environment that has, more than not, provided for his or her needs in the past. As a result, the child comes to expect that the environment will continue to provide for these needs in the future. Occasional disappointments will not destroy hope, provided that the child has developed a sense of basic trust.

An important feature of social interaction during this period is the ritualization of greeting, providing, and parting. The child cries: the parents come into the room, call its name, nurse it or change it, make funny noises, say goodbye, and leave -- only to return in the same manner, more or less, the next time the situation warrants. Parent and child engage in a process of mutual recognition and affirmation. Erikson calls this ritualization numinous, meaning that children experience their parents as awesome and hallowed individuals. This can be distorted, however, into idolism in which the child constructs an illusory perception of his or her parents as perfect. In this case, reverence is transformed into adoration.

Stage 2: Autonomy, Shame, Doubt, and Will. The muscular-anal stage covers the second and third years of life. Here the child learns to walk, to talk, to dress and feed him- or herself, and to control the elimination of body wastes. The crisis at this stage is between autonomy and shame or doubt. The child must learn to rely on his or her own abilities, and deal with times when his or her efforts are ineffectual or criticized. There will of course be times, especially early in this period, when the child's attempts at self-control will fail -- he will wet his pants, or fall; she will spill her milk, or put on mismatched socks. If the parents ridicule the child, or take over these functions for him or her, then the child will develop feelings of shame concerning his or her efforts, and doubt that he or she can take care of him- or herself.

If things go well, the child develops the virtue of will: the unbroken determination to exercise free choice as well as self-restraint, in spite of the unavoidable experience of shame and doubt in infancy. As will develops, so does the ability to make choices and decisions. Occasional failures and misjudgments will not destroy will, so long as the child has acquired a basic sense of autonomy.

The ritualization that develops at this time is a sense of the judicious, as the child learns what is acceptable and what is not, and also gets a sense of the rules by which right and wrong are determined. The hazard, of course, is that the child will develop a sense of legalism, in which the letter of the law is celebrated over its spirit, and the law is used to justify the exploitation and manipulation of others.

Stage 3: Initiative, Guilt, and Purpose. The locomotor-genital stage covers the remaining years until about the sixth birthday. During this time the child begins to move about, to find his or her place in groups of peers and adults, and to approach desired objects. The crisis is between initiative and guilt. The child must approach what is desirable, at the same time that he or she must deal with the contradictions between personal desires and environmental restrictions.

The development of initiative leads to the virtue of purpose: the courage to envisage and pursue valued goals, uninhibited by the defeat of infantile fantasies, by guilt, and by the foiling fear of punishment.

Stage 4: Industry, Inferiority, and Competence. The latency stage begins with schooling and continues until puberty, or roughly 6 to 11 years of age. Here the child makes the transition to school life, and begins to learn about the world outside the home. The crisis is between industry and inferiority. The child must learn and practice adult roles, but in so doing he or she may learn that he or she cannot control the things of the real world. Industry permits the development of competence, the free exercise of manual dexterity and cognitive intelligence.

Stage 5: Identity, Role Confusion, and Fidelity. The stage of puberty-adolescence covers ages 11-18. Biologically, this stage is characterized by another spurt of physiological growth, as well as sexual maturity. Socially, the features of adolescence are involvement with cliques and crowds, and the experience of adolescent love. The crisis is between identity and role confusion. The successful adolescent understands that the past has prepared him or her for the future. If not, he or she will not be able to differentiate him- or herself from others, or find his or her place in the world. Identity, a clear sense of one's self and one's place in the world, forms the basis for fidelity, the ability to sustain loyalty to another person.

Stage 6: Intimacy, Isolation, and Love. Erikson marks the stage of young adulthood as encompassing the years from 18 to 30. During this time, the person leaves school for the outside world of work and marriage. The crisis is between intimacy and isolation. The person must be able to share him- or herself in an intense, long-term, committed relationship; but some individuals avoid this kind of sharing because of the threat of ego loss. Intimacy permits love, or mutuality of devotion.

Stage 7: Generativity, Stagnation, and Care. The next 20 years or so, approximately 30 to 50 years of age, are called the stage of adulthood. Here the individual invests in the future at work and at home. The crisis is between generativity and stagnation. The adult must establish and guide the next generation, whether this is represented in terms of children, students, or apprentices. But this cannot be done if the person is concerned only with his or her personal needs and comfort. Generativity leads to the virtue of care, the individual's widening concern for what has been generated by love, necessity, or accident.

Stage 8: Ego Integrity, Despair, and Wisdom. The final stage, beginning at about 50, is that of maturity. Here, for the first time, death enters the individual's thoughts on a daily basis. The crisis is between ego integrity and despair. Ideally, the person will approach death with a strong sense of self, and of the value of his or her past life. Feelings of dissatisfaction are especially destructive because it is too late to start over again. The resulting virtue is wisdom, a detached concern for life itself.

Shakespeare's Seven Ages of Man

Erikson's account of the Eight Ages of Man is a play on the Seven Ages of Man, described by Shakespeare in As You Like It:

All the world's a stage,
And all the men and women merely players,
They have their exits and their entrances,
And one man in his time plays many parts,
His acts being seven ages. At first the infant,
Mewling and puking in the nurse's arms.
Then, the whining schoolboy with his satchel
And shining morning face, creeping like snail
Unwillingly to school. And then the lover,
Sighing like furnace, with a woeful ballad
Made to his mistress' eyebrow. Then a soldier,
Full of strange oaths, and bearded like the pard,
Jealous in honour, sudden, and quick in quarrel,
Seeking the bubble reputation
Even in the cannon's mouth. And then the justice
In fair round belly, with good capon lin'd,
With eyes severe, and beard of formal cut,
Full of wise saws, and modern instances,
And so he plays his part. The sixth age shifts
Into the lean and slipper'd pantaloon,
With spectacles on nose, and pouch on side,
His youthful hose well sav'd, a world too wide,
For his shrunk shank, and his big manly voice,
Turning again towards childish treble, pipes
And whistles in his sound. Last scene of all,
That ends this strange eventful history,
Is second childishness and mere oblivion,
Sans teeth, sans eyes, sans taste, sans everything.

 

Life-Span Theory Since Erikson

Erikson's theory was extremely influential. By insisting that development is a continuous, ceaseless process, he fostered the new discipline of life-span developmental psychology, with its emphasis on personality and cognitive development in adulthood and beyond. Much life-span work has been concerned with cognitive changes in the elderly, but personality psychologists have been especially concerned with the years between childhood and old age.

Erikson's stages inspired a number of popular treatments of "life-span" personality development, including Roger Gould's Transformations, Daniel Levinson's The Seasons of a Man's Life, and Gail Sheehy's Passages.


These and other schemes are all, to a greater or lesser extent, social conventions superimposed on the biological reality that we're born, age, and die.  They are social categories that organize a continuum of age.

Piaget's Stages of Cognitive Development

The Swiss developmental psychologist Jean Piaget marked four stages of cognitive development:

Sensory-Motor Period

Preoperational Period

Period of Concrete Operations (corresponding, roughly, to arithmetic)

Period of Formal Operations (corresponding, roughly, to algebra)

Some "neo-Piagetian theorists, such as Michael Commons, have argued that there are even higher stages in the Pigetian scheme (presumably corresponding, roughly, to calculus and other higher mathematics)

The late C.N. Alexander even argued that the Science of Creative Intelligence announced by the Maharishi Mahesh Yogi as an offshoot of his Transcendental Meditation program promoted cognitive development beyond the Piagetian stages.

Piaget's theory was very influential among psychologists and educators (though it also proved controversial).  But Piaget's stages never entered popular parlance, the way Freud's and even Erikson's did, so it would not seem appropriate to include them as social categories.

 

"Generations"

Moving from the individual life cycle to social history: In 1951, Time magazine coined the term "Silent Generation" to describe those born from 1923 to 1933.  The term "generation", as a demographic category referring to people who were born, or lived, during a particular historical epoch, gained currency with the announcement of the Baby Boom (1946-1964) by the Census Bureau.  Following these examples, a number of different generations have been identified, most systematically by the historians William Strauss and Neil Howe (1991, 1997).  Other "generations" include Tom Brokaw's "Greatest Generation" (those American adults who lived through World War II), Douglas Coupland's "Generation X" (1960-1965), and Jonathan Pontell's "Generation Jones" (1954-1965 -- that is, younger than the typical Baby Boomer but older than the typical Gen-Xer).

As with most social categories, the boundaries between generations are somewhat fuzzy.  For example, the US Census Bureau classified Americans born between 1946 (the end of World War II) and 1964 as constituting the "Baby Boom", while Strauss and Howe argue that the Baby Boom actually began in 1943, and lasted only until 1960.

 

 

As with any other social categorization, generational categories can be a source of group conflict.  For example, the 2008 race for the presidency pitted John McCain, a member of the Silent Generation, against Barack Obama, a member of Generation X, who had won the Democratic nomination over Hillary Rodham Clinton, a member of the Baby Boom Generation.

 

 

These examples are drawn from American culture, but generational categories can be found in other cultures, too.  Consider the terms used to characterize various generations of the Japanese diaspora (Nikkei).

 

 

Issei immigrated to the United States before 1924, when a new immigration act virtually closed off Japanese and Chinese immigration.

Nisei are the American-born children of Issei -- essentially, members of the Silent Generation.

Sansei are the "Baby-Boom" children of Nisei.

Yonsei are the Generation X/Generation Y children of Sansei.

There is a similar classification for Chinese Americans:

The children of Chinese who emigrated to America before World War II are known as American-Born Chinese, or ABCs.

Chinese who came to America in the wake of the communist revolution in China, especially those who came in the aftermath of the Cultural Revolution of the 1960s, were known as Fresh Off the Boat, or FOBs.

Artist June Yee explores these stereotypes in her piece, Two Chinese Worlds in California, on display in the Gallery of California History at the Oakland Museum of California (2010).

"I was surprised at how much misunderstanding there was.  They called us FOBs, for fresh off the boat, and they were ABCs, American-born Chinese.  Ironically, we did not fit into each other's stereotype, even though we were all Chinese.  We weren't aware of the anti-Chinese sentiment they had endured for years.  And they didn't understand our feelings about Mao, who in the '60s was a hero for many ABCs who joined the student protests.  I remember being appalled by ABCs who embraced Mao's Little Red Book" (OMCA Inside Out, Spring 2010).

 

Occupation Categories

Sociologists (especially) have devoted a great deal of energy to measuring socioeconomic status.  In these schemes, information about occupation, education, and income is used to classify individuals or families into categories:

lower class;

middle class; and

upper class.

In addition, sociologists and other social scientists make use of other categorical distinctions based on occupation, such as:

white-collar vs. blue-collar (and, for that matter, pink-collar);

professional vs. managerial; and

skilled vs. unskilled labor.

All these terms have entered common parlance: they're not just technical terms used in formal social science.

In contrast to earlier classification schemes, there is nothing "biological" about these categories, which wouldn't exist at all except in societies at a certain level of economic development.  In feudal economies, for example, there was a distinction between serf and master that simply doesn't exist in industrial economies.  

As the serf-master distinction indicates, classification by socioeconomic status evolves as societies evolve. In England, for example, the traditional class distinction was a tripartite one: upper, middle, and working classes (in England, as viewers of Downton Abbey understand, the upper class prided themselves on the fact that they did not work for a living). In 2013, the British Broadcasting Corporation published the results of "The Great British Class Survey", which revealed that British society now included at least seven distinct social classes.

As described in the BBC press release:

  • Elite: This is the most privileged class in Great Britain who have high levels of all three capitals. Their high amount of economic capital sets them apart from everyone else.
  • Established Middle Class: Members of this class have high levels of all three capitals although not as high as the Elite. They are a gregarious and culturally engaged class.
  • Technical Middle Class: This is a new, small class with high economic capital but seem less culturally engaged. They have relatively few social contacts and so are less socially engaged.
  • New Affluent Workers: This class has medium levels of economic capital and higher levels of cultural and social capital. They are a young and active group.
  • Emergent Service Workers: This new class has low economic capital but has high levels of 'emerging' cultural capital and high social capital. This group are young and often found in urban areas.
  • Traditional Working Class: This class scores low on all forms of the three capitals although they are not the poorest group. The average age of this class is older than the others.
  • Precariat: This is the most deprived class of all with low levels of economic, cultural and social capital. The everyday lives of members of this class are precarious.

These are clearly social categories -- but are they any less natural than biological categories, just for being social rather than biological in nature?
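The three-capitals scheme lends itself to a simple computational illustration.  Below is a minimal sketch in Python of how a respondent might be assigned to one of the seven classes by comparing his or her scores on the three capitals to a set of class profiles.  The profile numbers are invented for illustration; the actual survey derived its classes statistically (by latent class analysis), not by nearest-profile matching of this kind.

    from math import dist

    # Hypothetical capital profiles, on a 0-10 scale:
    # (economic, cultural, social).  These numbers are made up;
    # they merely echo the qualitative descriptions above.
    CLASS_PROFILES = {
        "Elite":                     (10, 8, 8),
        "Established Middle Class":  (7, 7, 7),
        "Technical Middle Class":    (7, 3, 3),
        "New Affluent Workers":      (5, 7, 7),
        "Emergent Service Workers":  (2, 7, 7),
        "Traditional Working Class": (3, 3, 3),
        "Precariat":                 (1, 1, 1),
    }

    def classify(economic: float, cultural: float, social: float) -> str:
        """Assign a respondent to the class with the nearest capital profile."""
        point = (economic, cultural, social)
        return min(CLASS_PROFILES, key=lambda c: dist(point, CLASS_PROFILES[c]))

    # A respondent with little money but an active cultural and social
    # life lands in the "Emergent Service Workers" category:
    print(classify(economic=2, cultural=8, social=7))

The point of the sketch is not the particular numbers, but the general idea: a social category of this kind can be treated as a region in a multidimensional attribute space, with membership determined by resemblance rather than by any single defining feature.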

Caste in Hindu India...

A unique set of social categories is found in the caste system in Hindu India.  Although a product of the Vedic age (roughly 1500 to 600 BCE), the term "caste" (casta) itself was first used by 16th-century Portuguese explorers.

Traditionally, Indian society was divided into four varnas (Sanskrit for "class" or "color"):

Brahmins: priests and scholars;

Ksatriyas: rulers and warriors;

Vaisyas: merchants, traders, and farmers;

Sudras: artisans, laborers, servants, and slaves.

Below these four groups are the Panchamas ("fifth class"), popularly known as "untouchables".  Mahatma Gandhi labeled these individuals as Harijans, or "children of God".  Untouchability was outlawed in 1949, though -- as in the American "Jim Crow" South -- prejudice against them remained strong.  As an outgrowth of social protest in the 1970s, the untouchables began to view the Harijan label as patronizing, and to identify themselves as Dalit, or "oppressed ones". 

Membership in a caste is largely hereditary, based on ritual purity (the Panchamas are untouchable because they are considered to be polluting), and maintained by endogamy.  So long as one follows the rules and rituals (Dharma) of the caste into which one is born, a person will remain in his or her caste.  However, one can lose one's caste identity -- become an outcast, as it were -- by committing various offenses against ritual purity, such as violations of dietary taboos or rules of bodily hygiene; and one can move "up" in the caste system by adopting certain practices, such as vegetarianism -- a process known as "Sanskritization".  One can also regain one's original caste status by undergoing certain purification rites.  Movement "upwards" from untouchability is not possible, however -- though in recent years the Indian government has created "affirmative action" programs to benefit untouchables.

Caste is not exactly a matter of socioeconomic status: there can be poor Brahmins (especially among the scholars!).  Parallel to varna is a system of social groupings known as Jati, based on ethnicity and occupation.

Although the caste system has its origins in Hindu culture, Indian Muslims, Sikhs, and Christians also follow caste distinctions.  For example, Indian Muslims distinguish between ashraf (Arab Muslims) and non-ashraf (such as converts from Hinduism).

The caste system has been formally outlawed in India, but remnants of it persist, as for example in the identification of a broad class of largely rural "daily-wages people", which is more a matter of social identity than economics.  

In 1993, the Indian government instituted a sort of affirmative-action program, guaranteeing 27% of jobs in the central and state governments, and of college admissions, to members of an official list of "backward classes" -- of which there are more than 200, taking into account various subcastes and other caste-like groupings -- not just the Dalits.

Sometimes members of the "backward classes" take matters into their own hands.  An article in the Wall Street Journal about India's affirmative-action program told the story of Mohammad Rafiq Gazi, a Muslim from West Bengal, whose family name was Chowduli -- roughly, the Bengali equivalent of the "N-word".  He legally changed his last name to the higher-caste Gazi in order to escape the stereotypes and social stigma associated with his family name.  But when India initiated its affirmative-action program, individuals with the higher-caste surname of Gazi were not eligible, so he sought to change his name back to the low-caste Chowduli ("For India's Lowest Castes, Path Forward is 'Backward'" by Geeta Anand and Amol Sharma, 12/09/2011).  

 

...and in Japan

Feudal Japan had a class of outcasts, known as the eta or burakumin, who were indistinguishable, in ethnic terms, from any other native Japanese.  Like the "untouchables" of India, the buraku performed various tasks that were deemed impure by classical Buddhism, such as slaughtering animals and handling corpses.  They wore distinctive clothing and lived in segregated areas.  Although officially liberated in 1871, they lagged behind other Japanese in terms of education and socioeconomic status.  From 1969 to 2002, they were the subjects of affirmative-action programs designed to diminish historic inequalities.  But despite substantial gains, the descendants of the buraku still live largely in segregated areas, and are objects of continuing, if more subtle, discrimination.  For example, by 2001 Hiromu Nonaka, a burakumin politician, had achieved the #2 position in Japan's ruling Liberal Democratic Party, but was not able to make the leap to the #1 position, and the post of prime minister.  As Taro Aso -- himself later an LDP prime minister -- reportedly said at a private meeting, "Are we really going to let those people take over the leadership of Japan?"  [See "Japan's Outcasts Still Wait for Society's Embrace" by Norimitsu Onishi, New York Times, 01/16/2009.]

 

Political Categories

Similarly, political scientists (as well as other social scientists) slot people into categories based on their political affiliations.  In the United States, some commonly used political categories are:

Democrat;

Republican; and

Independent.

The category "Progressive" is still used in certain states in the Upper Midwest, but not anywhere else.  The category "Communist" used to be (somewhat) popular, but pretty much disappeared after the fall of the Soviet Union in 1991 (actually, it died long before that).  The "Green" party label is emerging in some places.

Some Germans used to be Nazis, and they killed or incarcerated many Germans who used to be Communists; but now Germans come in three new categories: Christian Democrats, Social Democrats, and Greens.

In addition, political science employs a number of alternative categorization schemes, which have also entered common parlance.  Some examples include:

liberal;

conservative;

neoconservative;

paleoconservative; and

libertarian.

Political categories of very recent vintage include "Soccer Mom" and "NASCAR dad".

 

Religious Categories

People are commonly classified by religion.  Indeed, religious classifications can be a major source of group conflict, as seen in the disputes among Muslims, Eastern Orthodox, and Catholics in the former Yugoslavia, or the disputes between Hindus and Muslims in India and Pakistan.

The obvious form of religious classification is by religion itself -- Jewish, Christian, Muslim, Buddhist, Hindu, etc. 

But at an even higher level than that is a classification based on the number of gods worshipped in a religion:

Monotheistic, as in Judaism, Christianity, and Islam

Polytheistic, as in Hinduism and Buddhism 

Within many religions, there is a hierarchy of subcategories.  

Islam has major divisions between Sunni, Shia, and Sufi.

Within Christianity, for example, there are major divisions between Catholics, Orthodox, and Protestants.

Within Protestantism, there are various "denominations" such as Presbyterian and Methodist.  Protestant "denominationism" emerged only in the 18th century in the United States and other countries where there was no state or majority religion.

Social change also produces sub-subcategories.  

For example, in 19th-century America, the Baptist church divided into two wings, the Southern Baptists and the Northern (now American) Baptists, over the issue of slavery.  

Similarly, the Lutheran church is organized into different "synods", largely based on national origins.  

Although the very term "Catholic" means universal, and the entire Roman Catholic Church is united under the Pope, after the Second Vatican Council (1962-1965) some Roman Catholics stepped (not broke) away from the Church by continuing to celebrate the traditional Latin mass; some of these Catholics do not acknowledge the authority of the Pope.

In the United States, stimulated by the consecration of Gene Robinson, the first openly homosexual bishop, some members of the Episcopal Church have broken away from their church over such social issues as divorce and homosexuality.

Traditionally, institutional Judaism has shunned the idea of denominationism, although it acknowledges "wings" or "streams" such as Orthodox, Conservative, Reform, and Reconstructionist.  However, in 2004 the American Jewish Year Book adopted the term denomination to characterize these movements, for the first time officially acknowledging the existence of major subcategories within the faith.

There are many different kinds of Buddhism, and even Zen Buddhism recognizes different traditions, such as Soto and Rinzai.

 

Nationality Categories

We also classify people by their national origin.  

Among Europeans, for example, we distinguish between the Anglo-Irish and those from the continent;

Among the Irish, there is a special category of Scots-Irish, meaning Protestants who colonized Ireland, particularly the North, for England;

Among those from the continent, we distinguish between Northern and Southern Europeans, and between Western and Eastern Europeans;

Not to mention Central Europeans, covering Hungary and the former Czechoslovakia.

We distinguish Europeans from Asians and Africans.

Among Africans, we distinguish between North Africans and Sub-Saharan Africans;

Among Asians we distinguish between East Asians, South Asians, and Southeast Asians.

In some sense, national origin is a matter of geography: the English Channel divides the British Isles from the Continent; the Alps divide Northern and Southern Europe; the Danube divides Western and Eastern Europe; the Mediterranean divides Europe from Africa; the Bosporus, the Black Sea, and the Caucasus Mountains divide Europe from Asia; and so on.  South Asia is even on a separate tectonic plate from Asia proper.  But again, we can see social concepts imposed on the map of the world.

Eastern Europe was much more a product of the Ottoman Empire, and later of the Soviet Empire, than it was of any geographical feature.  

Russia (at least Moscow and St. Petersburg) lies to the west of the Ural Mountains, in geographical Europe, but Czar Peter became "Peter the Great" by "Westernizing" Russia, to make the country more European.

Turkey lies south and east of the Bosporus, but it seeks admission to the European Union;

Many Arabs, or at least many Arab states, view Israel as an extension of Europe, rather than as another country in the Middle East; and

And isn't the "Middle East" actually farther away from Europe than the "Near East" (not to mention the "Far East")?

Nationality categories also change with historical and political developments.  For example, with the formation and consolidation of the European Union, many citizens of European countries have begun to identify themselves as "European" as well as Dutch, Italian, etc.  Based on the Eurobarometer survey, Lutz et al. (Science, 2006) reported that 58% of Europeans over 18 reported some degree of "multiple identity" (actually, a dual identity), as against 42% who identified themselves only in terms of their nationality.  The percentages were highest in Luxembourg, Italy, and France (despite the French rejection of the proposed European constitution in 2005), and lowest in Sweden, Finland, and the United Kingdom (which maintains its national currency instead of adopting the Euro).  Perhaps not surprisingly, younger respondents were more likely to report a multiple national identity than older respondents.

The Israeli-Palestinian conflict is an interesting case in point (see Side by Side: Parallel Narratives of Israel-Palestine by Sami Adwan, Dan Bar-On, and Eyal Naveh, 2012; see the review by Geoffrey Wheatcroft, "Can They Ever Make a Deal?", New York Review of Books, 04/05/2012).  Yasser Arafat, president of the Palestinian National Authority, and his successor, Mahmoud Abbas, agitated for a Palestinian state separate from both Israel and Jordan; on the other hand, Golda Meir (1969), the former Israeli prime minister, denied that there was such a thing as a Palestinian people, and Newt Gingrich (2012), the former US presidential candidate, called the Palestinians "an invented people".  Which raises a question: What does it mean to be a Palestinian -- or an Israeli, for that matter, though let's stick with the Palestinian case for illustration?  It turns out that national consciousness -- one's identity as a citizen of a particular nation -- is a relatively recent cultural invention.  Before the 1920s, Arabs in Palestine -- whether Muslim or Christian -- considered themselves part of the Ottoman Empire, or perhaps part of a greater Arab nation, but apparently not as Palestinians as such.  In fact, it has been argued that the Palestinian identity was created beginning in the 1920s in response to Zionism -- an identity which was itself an invention of the 1890s, before which Jewish tradition did not include either political Zionism or the idea of a Jewish state.  It's one thing to be Jewish (or Palestinian) as a people; it's quite another to be citizens of a Jewish or Palestinian (or greater Arab) nation.  And -- just so I'm not misunderstood here -- Israelis and Palestinians are by no means unique in this regard.

These two aspects of identity -- identity as a people and identity as a nation -- are not the same thing.  But at the Versailles Conference that followed World War I, Woodrow Wilson championed the idea that every people should get their own nation -- this is what is known as self-determination, as opposed to the imperial and colonial systems (including those of Britain, France, and Belgium) which had existed prior to that time.  On the other hand, Walter Lippmann argued that self-determination was not self-evidently a good thing, because it "rejects... the ideal of a state within which diverse peoples find justice and liberty under equal laws".  Lippmann predicted that the idea of self-determination would lead to mutual hatred -- the kind of thing that boiled up in the former Yugoslavia in the late 20th century.

The question of national identity became especially vexed as nation-states arose in the 18th and 19th centuries, and again in the 20th century with the breakup of the Austro-Hungarian and Ottoman empires.  In contrast to non-national states, where the state was identified with some sort of monarch (a king or a queen, an emperor, or a sultan) who ruled over a large and usually multi-ethnic political entity, nation-states are characterized by loyalty to a particular piece of territory, defined by natural borders or the settlement of a national group; common descent; a common language; a shared culture promulgated in state-supported public schools -- and, sometimes, the suppression of "non-national" elements.  Think of England, France, and Germany.

But immigration, globalization, and other trends can challenge this national identity, raising the question of exactly what it means to be a citizen of a nation-state.  It turns out that belonging to the group is not precisely a matter of citizenship.

As Henry Gee notes, 

"the abiding horror [of the July 7, 2005 suicide attacks on the London Underground] is that the bombers were not foreign insurgents -- Them -- but were British, born and raised; in Margaret Thatcher's defining phrase, One of Us" (in "Tricky, Turbulent, Tribal", Scientific American 12/05).

Traditionally, the United States has portrayed itself as a great melting pot, in which immigrants from a wide variety of nations, religions, and ethnicities blended in to become a homogeneous group of -- well, Americans.  But after World War II, with the rise of the Black civil rights movement, and increasing degrees of Hispanic and Asian immigration, the American self-image has shifted from that of a melting pot to that of a stew (Jesse Jackson's famous image), or a gumbo, in which the various ingredients combine and influence each other, making something delicious while each maintains its original character.

Other societies have not favored the melting-pot image, striving instead to maintain ethnic homogeneity and resisting immigration.  A case in point is Belgium, a country which includes both the Dutch-speaking Flemish (in Flanders, in the north) and the French-speaking Walloons (in Wallonia, in the south); the conflicts between the two have made for highly unstable governments, and increasing discussion of the possibility that the country will, in fact, break up -- much as happened in the former Czechoslovakia and the former Yugoslavia.  The irony is that Brussels, seat of the European Union, while nominally bilingual, is for all practical purposes Francophone -- and it's located within Dutch-speaking Flanders.  So breaking up isn't going to be easy.

Yet other societies have fostered immigration, but have held to the melting-pot image, despite the desire of new immigrants to retain their ethnic identities -- creating the conditions for cultural conflict.  Ironically, the potential for conflict has been exacerbated by those societies' failure to make good on the promise of integrating new immigrants.

A case in point is the rioting that broke out in some Arab immigrant communities in France in 2005, and the more recent dispute over the desire of some observant Muslim Frenchwomen to wear the headscarf, or hijab, as an expression of modesty, or of their religious heritage, or, perhaps, simply of their identity.

 

As part of the heritage of the French Revolution, which abolished the aristocratic system and made all Frenchmen simple "citizens", the French Constitution guarantees equality to all -- so much so that, until recently, anyone born in any territory ever held by France was eligible to become President -- including Bill and Hillary Rodham Clinton, born, respectively, in Arkansas and Illinois (in fact, the law was changed when the French realized that Bill Clinton was eligible).  Unlike the United States, where terms like "African-American", "Asian-American", and "Mexican-American" have become familiar, there are no such "hyphenated" categories in France, and the French census has no provision for identifying the race, ethnicity, national origin, or religion of those who respond to it.  So the government has no idea how many of its citizens are immigrants, or from where.  And, officially, it doesn't care.  Everybody's French, and all French are alike.  In theory, anyway, and in law.

Then again:

In 2004, France banned the wearing of head scarves in public schools -- and just to reassure everyone that this wasn't directed at Muslims, banned all religious paraphernalia (except small crosses).

In 2010, the Sarkozy government proposed a law that would ban the niqab, or full veil, as well as the burqa -- despite the fact that, at the time, it was estimated that fewer than 2,000 French women actually wore the full veil in public.  When the law was proposed, Sarkozy asserted that the full veil "hurts the dignity of women"; but, more to the point of these lectures, stated that it "is unacceptable in French society" ("Sarkozy Says He Supports Bill Banning Full Veils" by Steven Erlanger, New York Times 04/22/2010). 

But it has become painfully clear that (paraphrasing George Orwell in Animal Farm) some French are more equal than others.  Despite a large number of immigrants from Algeria and Morocco, there are few Arabs represented in government, or in the police force.  Many Arab immigrants feel that they have been left out of French society -- effectively denied education, employment, and other opportunities that were available to the "native" French.  As one immigrant put it, "The French don't think I'm French" (quoted in "France Faces a Colonial Legacy: What Makes Someone French?" by Craig S. Smith, New York Times, 11/11/05).  The situation has been worsened by the fact that, while there is full freedom of religious practice in France, the state virtually outlaws any public display of religious piety, such as the headscarf (hijab) worn by many Muslim women (as well as the Jewish yarmulke and oversize Christian crosses). Moreover, as part of a policy of secularization, the state owns and maintains all religious properties.  Just as it has not built any new churches or synagogues, so it hasn't built any mosques.  The problem is that while there are plenty of churches and synagogues to go around, there are lots of Muslims who are forced to worship in gymnasiums and abandoned warehouses.  

In part, the 2005 riots in France reflect a desire on the part of recent Arab immigrants to be classified as fully French, and treated accordingly, without discrimination; but also a desire to be recognized as different, reflecting their African origins and their Muslim religion.  Such are the contradictions of social categorization.

Despite the mythology of the melting pot, the United States itself is not immune from these issues.  Many of the earliest European settlers, especially in the original 13 colonies, came to the New World to escape ethnic and religious conflict, and quite quickly a view of a new American type, blending various categories, emerged.  In Letters from an American Farmer (1782, Letter III), Hector St. John de Crevecoeur, a French immigrant to America in the 18th century, noted the mix of "English, Scotch, Irish, French, Dutch, Germans, and Swedes" in the New World and characterized "the American" as a "new man", in which "individuals of all nations are melted into a new race of men".  In Democracy in America (1835), Alexis de Tocqueville (another Frenchman) predicted that America, as a country of immigrants, would be exempt from the conflicts between ethnicities, classes, and religions that had so often beset Europe -- initiating a view of American exceptionalism.  The image of America as a "melting pot" was fixed in Israel Zangwill's play of that title, first produced in 1908.

Beginning in the 1960s, this traditional view of what it means to be an American was challenged, first by a new wave of African-American civil rights leaders, and later by Mexican-Americans, Chinese-Americans, and others who wanted to keep their traditions at the same time as they became Americans.  This movement away from assimilationism toward multiculturalism is captured by the image of America as a "gorgeous mosaic", or "salad bowl", of cultures -- an image derived, in turn, from John Murray Gibbon's image of Canada.  It's what Jesse Jackson has in mind with his "Rainbow Coalition" -- a rainbow in which white light is decomposed into several different colors.

In 1963, Nathan Glazer and Daniel Patrick Moynihan noted in their book, Beyond the Melting Pot, that "the point about the melting pot... is that it did not happen."  By 1997, Glazer would title his new book on the subject We're All Multiculturalists Now.

It turns out that whether members of the majority culture (meaning whites) hold assimilationist or multiculturalist views has an impact on the quality of life of members of minority cultures (meaning persons of color, broadly defined).  Victoria Plaut and her colleagues conducted a "diversity climate survey" in 17 departments of a large corporation, and found that white employees' embrace of multiculturalism was strongly associated with both minority employees' "psychological engagement" with the company and their perceptions of bias.  But where the dominant ideology of the white employees tended toward "colorblindness" (a variant on assimilationism), minority employees were actually less psychologically engaged, and perceived their white co-workers as more biased against them.

Typically, we categorize other people in terms of their national identity, but national identity can also be part of one's own self-concept.  In The Red Prince: The Secret Lives of a Habsburg Archduke (2008), the historian Timothy Snyder tells the story of Archduke Wilhelm (1895-1948), son of Archduke Stefan (1860-1933), of the Austro-Hungarian Empire.  

In the late 19th century, as Central European nationalism began to rise, Stefan had decided to become king of a united Poland.  Poland was not even a free-standing country at the time, being divided among Russia, Prussia, and Austria-Hungary.  But there was a burgeoning nationalist movement in Poland, and Stefan could see its future as an independent nation.  Every nation needed a king, in his view, and so Stefan decided he would be it.  To that end, he learned Polish, bought an estate in Poland, and married his daughters to Polish nobility -- though he never actually moved his family to Poland.  Anyway, Poland successfully overthrew both Austro-Hungarian and Russian rule and became a republic, thus mooting the idea of Stefan or anyone else becoming king.

Meanwhile, Stefan's son, Wilhelm, had other ideas -- that the Ukraine, struggling to free itself from rule by Russia and Austria-Hungary, also needed a king.  He began studying Ukrainian, put on a Cossack hat, formed an alliance with the Metropolitan of the Greek Catholic Church, and led Ukrainian soldiers during World War I.  But then came the dissolution of the Austro-Hungarian Empire, not to mention the Russian Revolution.  Wilhelm joined the Nazis out of antipathy for the Soviet Union, and continued his anti-Soviet activities after the war.  

Anne Applebaum, reviewing The Red Prince ("Laughable and Tragic", New York Review of Books, 10/23/2008), writes:

Snyder is more convincing when he places Wilhelm's story not in the politics of contemporary Ukraine, but in the context of more general contemporary arguments about nations and nationalism. For the most striking thing about this story is indeed how flexible, in the end, the national identities of all the main characters turn out to be, and how admirable this flexibility comes to seem. Wilhelm is born Austrian, raised to be a Pole, chooses to be Ukrainian, serves in the Wehrmacht as a German, becomes Ukrainian again out of disgust for the Nazis—and loses his life for that decision. His brother Albrecht chooses to be a Pole, as does his wife, even when it means they suffer for it too. And it mattered: at that time, the choice of “Polishness” or “Ukrainianness” was not just a whim, but a form of resistance to totalitarianism.

These kinds of choices are almost impossible to imagine today, in a world in which “the state classifies us, as does the market, with tools and precision that were unthinkable in Wilhelm’s time,” as Snyder puts it. We have become too accustomed to the idea that national identity is innate, almost genetic. But not so very long ago it was possible to choose what one wanted to be, and maybe that wasn’t such a bad thing. In sacrificing that flexibility, something has been lost. Surely, writes Snyder,

the ability to make and remake identity is close to the heart of any idea of freedom, whether it be freedom from oppression by others or freedom to become oneself. In their best days, the Habsburgs had a kind of freedom that we do not, that of imaginative and purposeful self-creation.

And that is perhaps the best reason not to make fun of the Habsburgs, or at least not to make fun of them all the time. Their manners were stuffy, their habits were anachronistic, their reign endured too long, they outlived their relevance. But their mildness, their flexibility, their humanity, even their fundamental unseriousness are very appealing, in retrospect—especially by contrast with those who sought to conquer Central Europe in their wake.

 

Racial and Ethnic Categories

The complicated relationship between natural categories of people, based on biology or geography, and social categories of people, based on social convention, is nowhere better illustrated than with respect to race and ethnicity.  By virtue of reproductive isolation, the three "races" (Caucasoid, Mongoloid, and Negroid), and the various ethnicities (Arab vs. Persian, Chinese vs. Japanese), do represent somewhat different gene pools.  But members of different races and ethnicities have much more in common, genetically, than not: they hardly constitute different species or subspecies of humans.  Moreover, social conventions such as the "one drop rule", widespread in the American South (and, frankly, elsewhere) during the "Jim Crow" era, by which an individual with any Negro heritage, no matter how little, was classified as Negro (see "One Drop of Blood" by Lawrence Wright, New Yorker, 07/24/94), indicate that much more goes into racial and ethnic classifications than genes.

Consider, for example:

The labels historically applied to the basic White-Black racial division: Negro, Black, Afro-American, and African-American.  These category labels are all intended to apply to the same social group, but the label has shifted in accordance with changes in the social meaning of the grouping.

Americans of Hispanic descent (not a race, of course, but a prominent ethnic category) commonly divide themselves into Chicanos and Latinos.

The people who used to be called American Indians (a misnomer if ever there was one, stemming from Columbus' false belief that he had reached the shores of India rather than the New World) are now called Native Americans.  But the category Native American is distinguished from Aleuts and Eskimos, who are also Native Americans; and in Canada, the Eskimos are called the Inuit.

As recently as 1986, the US Supreme Court refused to review, and thereby essentially upheld, a lower-court ruling that, according to Louisiana law, a woman was "black" whose great-great-great-great-grandmother, but no other ancestors, was black.  That's only 1/64 black -- pretty close to "one drop of blood".  Lawrence Wright (in "One Drop of Blood", 1994) quotes G. Reginald Daniel of UCLA: "We are the only country in the world that applies the one-drop rule, and the only group that the one-drop rule applies to is people of African descent".

While the "one drop" rule would seem to suggest an all-or-none, classical view of racial categorization, there are reasons to think that racial categories are also a fuzzy set.  

In 1998, as President Bill Clinton was coping with the Monica Lewinsky scandal, Toni Morrison, the novelist and winner of the Nobel Prize for Literature, wrote an article in the New Yorker in which she characterized Clinton as "our first black President.  Blacker than any actual black person who could ever be elected in our children's lifetime.  After all, Clinton displays almost every trope of blackness: single-parent household, born poor, working-class, saxophone-playing, McDonald's-and-junk-food-loving boy from Arkansas."  Clinton, so far as we know, doesn't have even a drop of black blood -- but for Morrison (speaking ironically, to make a larger point: you should read the essay), he counts as "black" because he fits a certain stereotype -- in her words, he displays "almost every trope of blackness".
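The contrast between these two views of categorization can be made concrete with a toy sketch in Python.  The one-drop rule is an all-or-none function of a single feature, whereas prototype-style categorization -- the kind implicit in Morrison's "tropes" -- yields graded membership based on the proportion of characteristic features an instance displays.  The features here are deliberately abstract placeholders, invented for illustration.

    def classical_membership(ancestry_fraction: float) -> bool:
        # The "one-drop rule": any nonzero fraction counts -- all or none.
        return ancestry_fraction > 0

    def graded_membership(features: set, prototype: set) -> float:
        # Prototype view: membership is the proportion of the category's
        # characteristic features that the instance displays.
        return len(features & prototype) / len(prototype)

    # classical_membership(1/64) returns True, as in the Louisiana case;
    # graded_membership({"a", "b"}, {"a", "b", "c", "d"}) returns 0.5 --
    # membership as a matter of degree, not an all-or-none classification.

Note that only the graded function can represent Morrison's claim: on the classical rule, Clinton is not a member of the category at all, while on the prototype view his membership value is high.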

Little did Morrison imagine that, only a decade later, an actual black person -- Barack Obama, a first-term senator from Illinois -- would run for president and actually have a good chance of not only getting a major-party nomination, but also of winning the election (it doesn't matter whether he actually was nominated or won).  Barack Obama, being the child of a Kenyan father and a white mother (they met as students at the University of Hawaii), definitely counts as black according to the "one drop rule".  But during the primary season, some commentators worried about whether he was "black enough".  For example, Debra J. Dickerson asserted that Obama was not authentically African-American ("actually black") because his father was not descended from a family who had lived as slaves in America ("Colorblind", Salon, 01/22/2007; also an interview in February 2007 on Comedy Central's Colbert Report).  In this view, Obama may be a black American, but despite his American citizenship and his undeniable African heritage, he's not an African-American.  Apparently, at least in Dickerson's mind, people are categorized as African-American based on nondefining features.

But wait -- it gets even better.  Writing in Newsweek (02/26/08), Martin Linsky, a political scientist at Harvard, argued that Barack Obama might be the first woman president!  In his view, Obama approaches political questions in ways "that are usually thought of as qualities and values that women bring to organizational life: a commitment to inclusiveness in problem solving, deep optimism, modesty about knowing all the answers, the courage to deliver uncomfortable news, not taking on all the work alone, and a willingness to air dirty linen."  Hillary Clinton, on the other hand, was taking a more traditional (and male?) authoritarian approach.  In Linsky's view, Obama displays precisely those attributes -- such as "advocating conversation and collaboration" -- that are closely associated with the traditional stereotype of femininity.  (Then again, in the same essay, Linsky suggests that John McCain, the presumptive Republican nominee, displays an "androgynous" leadership style.)

But seriously, Obama appears to exemplify a new category of post-black, as described by Toure, a writer and critic of music and culture (reviewing Sag Harbor by Colson Whitehead in the New York Times Book Review, 05/03/2009, p. 1):

"Now that we've got a post-black president, all the rest of the post-blacks can be unapologetic as we reshape the iconography of blackness.  For so long, the definition of blackness was dominated by the '60s street-fighting militancy of the Jesses and the irreverent one-foot-out-of-the-ghetto angry brilliance of the Pryors and the nihilistic, unrepentant ghetto, new-age thuggishness of the 50 Cents.  A decade ago they called post-blacks Oreos because we didn't think blackness equaled ghetto, didn't mind having white influences, didn't seem full of anger about the past.  We were comfortable employing blackness as a grace note rather than as our primary sound.  Post-blackness sees blackness not as a dogmatic code worshipping at the alter of the hood and the struggle but as an open-source document, a trope with infinite uses."

In his own book, Who's Afraid of Post-Blackness? What It Means to Be Black Now (2011), Toure argued that:

The definitions and boundaries of Blackness are expanding in forty million directions -- or really, into infinity.  It does not mean that we are leaving Blackness behind, it means we're leaving behind the vision of Blackness as something narrowly definable and we're embracing every conception of Blackness as legitimate.  Let me be clear: Post-Black does not mean "post-racial".  Post-racial posits that race does not exist or that we're somehow beyond race and suggests color-blindness: It's a bankrupt concept that reflects a naive understanding of race in America.  Post-Black means we are like Obama: rooted in but not restricted by Blackness.

Or maybe not.  In 2010, Obama filled out his Census form identifying himself as "black" -- even though the Census permitted him to endorse as many racial categories as he wished.  Writing in the Wall Street Journal (April 9, 2010), Abigail Thernstrom criticized Obama for "disowning his white mother and, by extension, his maternal grandparents who acted as surrogate parents for much of his boyhood."  She compared him unfavorably with golfer Tiger Woods, who famously identified himself as "Cablinasian" -- Caucasian, black, Indian, and Asian.  Which I suppose was the first time in 2010 that anyone despised Obama enough to mark Woods as a paragon of virtue, but I digress.

Speaking of Tiger Woods, a little more on that "Cablinasian" business.  Woods's father was African-American, Chinese, and Native American, while his mother was Thai, Chinese, and Dutch.  As a student, he identified himself as both "African-American" and "Asian".  In a 1997 appearance on the Oprah Winfrey Show, Oprah asked Woods if he was bothered by being labeled "African-American", and he replied that "It does....  Growing up, I came up with this name: I'm Cablinasian [Caucasian, Black, Indian, and Asian].  I'm just who I am... whoever you see in front of you."

Commenting on the exchange in an editorial, the Chicago Sun-Times wrote that Woods "justly rejects attempts to pigeonhole him in the past.  Tiger Woods is the embodiment of our melting pot and our cultural diversity ideals and deserves to be called what he in fact is -- an American."

Writing of this whole incident, Gary Younge noted that "Racial identity is not less diverse than national identity.  But somehow to describe Woods as black or Asian traps him in a pigeonhole, while to define him by his nationality sets him free" ("Replacing History with Fiction in Arizona", the Nation, 02/27/2012).

So here's a question for empirical research: Are national stereotypes like "American" weaker, less informative, than racial or ethnic stereotypes?

This is a continuing issue, and not just for multiracial individuals, but also for governmental bookkeeping.  

The census, and common usage, suggests that African-Americans comprise a single group -- a good example of the outgroup homogeneity effect described earlier.  But things look different if you're in the outgroup, and they look different if you're an ingroup member who's looking closely.  For example, W.E.B. Du Bois famously distinguished between the "Talented Tenth" and other American Negroes (as they were then called).

Eugene Robinson (in Disintegration: The Splintering of Black America, 2010) argues that there is no longer any such thing as a "black community" in America -- that is, a single group with shared identity and experience.  Instead, Robinson argues that American blacks divide into four quite different subgroups:

the Transcendent -- an elite of highly educated, socially prominent, and very wealthy individuals;

the Mainstream -- middle-class, now comprising the majority of black Americans;

the Emergent -- mixed-race families, and black immigrants from Africa and the Caribbean;

the Abandoned -- an "underclass" populating the inner cities and the rural South.

Despite the fact that all of these groups are composed of black Americans, Robinson argues that they nonetheless have little in common -- that the divisions of economics and culture, interests and demands, overwhelm the commonality of race.  

 

When Tom Met Sally...

As an example of the power of the "one-drop rule" in American history, consider the case of Thomas Jefferson, principal drafter of the Declaration of Independence, third President of the United States, and founder of the University of Virginia, whom we now know to have fathered as many as six children by one of his Negro slaves, Sally Hemings.  In a letter written in 1815, Jefferson tried to work out the "mathematical problem" of determining how many "crossings" of black and white would be necessary before a mixed-race offspring could be considered "white" (see "President Tom's Cabin" by Jill Lepore, reviewing The Hemingses of Monticello: An American Family by Annette Gordon-Reed, New Yorker, 09/22/08, from which the following quotation is drawn).

Let us express the pure blood of the white in the capital letters of the printed alphabet... and any given mixture of either, by way of abridgment, in [small] letters.

Let the first crossing be of a, a pure negro [sic], with A, a pure white.  The unit of blood of the issue being composed of the half of that of each parent, will be a/2 + A/2.  Call it, for abbreviation, h (half blood)....

[Jefferson refers to b as the second crossing, and q as the resulting "quarteroon".]

Let the third crossing [denoted c] be of q and C, their offspring will be q/2 + C/2 = a/8 + A/8 + B/4 + C/2, call this e (eighth), who having less than 1/4 of a, or of pure negro blood, to wit 1/8 only, is no longer a mulatto, so that a third cross clears the blood.  
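The arithmetic behind Jefferson's letter is simply repeated halving: each "crossing" with a "pure white" partner halves the remaining fraction.  Restated in modern notation (a reconstruction for clarity, not Jefferson's own):

$$f_n = \left(\tfrac{1}{2}\right)^n, \qquad f_1 = \tfrac{1}{2}\ (h), \quad f_2 = \tfrac{1}{4}\ (q), \quad f_3 = \tfrac{1}{8}\ (e).$$

Since $f_3 = 1/8$ falls below Jefferson's one-quarter criterion, the third cross "clears the blood".  The same arithmetic yields the 1/64 figure in the Louisiana case described earlier: a single black ancestor six generations back contributes $(1/2)^6 = 1/64$.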

Given that Sally Hemings herself had both a white father and a white grandfather, Jefferson was apparently satisfied that his own children by her -- who, by the way, were each freed as they reached age 21 -- had "cleared the blood".  In any event, their daughter Harriet, and one of their sons, Beverly, did in fact live as whites in Washington, D.C. -- though another son, Madison, remained part of the community of free Negroes in Virginia.

In modern society, Blacks have often been subject to racial discrimination, but this was not always the case.   Frank M. Snowden, a historian of blacks in the ancient world, has argued that "color prejudice" was virtually unknown in the ancient world of Egypt, Assyria, Greece and Rome (see his books, Blacks in Antiquity: Ethiopians in the Greco-Roman Experience, 1970, and Before Color Prejudice: The Ancient View of Blacks, 1983).  In his view, blackness was not equated with inferiority and subordination because ancient Whites encountered Blacks as warriors and statesmen, rather than as slaves or colonial subjects.  Color prejudice, then, appears largely to be a social construction, arising relatively recently out of specific historical circumstances.

Nor is racial prejudice necessarily about color, per se.  We can see that this is the case when legally enforced racial categories are bent or suspended.

  • In one famous case, Arthur Ashe, the first African-American to be ranked the world's #1 professional tennis player, was designated an "honorary white person" when he visited South Africa during the apartheid regime (he rejected the "honor", but that is another matter).  Apparently, the South African government distinguished between African and non-African blacks. 
  • Closer to home, at the height of racial segregation in the "Jim Crow" South, Angela Davis, the African-American political philosopher, and her sister got themselves served in a whites-only department store in their hometown of Birmingham, Alabama, by the simple expedient of speaking French -- thereby allowing themselves to be perceived as "foreign", and thus acceptable customers, despite being black.  
  • In Saudi Arabia, the Muttawah, or religious police, enforce strict segregation of men and women. All women must cover themselves when they go out in public, and unrelated men and women are not allowed to mix -- to the extent that even coffee shops are segregated by gender -- not unlike the "bad old days" of the Jim Crow South. Except that Western women are treated as "honorary men", permitting them to interact with unrelated men, including Saudi men, as Saudi women cannot do (see K.E. House, On Saudi Arabia: Its People, Past, Religion, Fault Lines -- and Future, 2012).

 

Who is a Jew?

Social categorization can have important legal (and personal) ramifications.  For example, the question of Jewish identity, which mixes categorization on the basis of ethnicity and religion, took an interesting turn with the establishment of Israel as a Jewish state in 1948, and the enactment in 1950 of the "Law of Return", which gave any Jew the right to aliyah -- to immigrate to Israel and live in the country as a citizen.  Thus, the question, Who is a Jew?, is addressed by the Israeli Chief Rabbinate, whose court, or Beit Din, is dominated by Orthodox and Ultra-Orthodox rabbis who operate under Halakha, or Jewish rabbinical law.

Because Jewish culture is matrilineal (Deuteronomy 7:4), the easiest answer is that anyone born to a Jewish mother is Jewish; if not, then not.  It doesn't matter whether the child is raised Jewish, or whether the mother considers herself to be Jewish.   In this view, "Jew" is a proper set, with all instances sharing a single defining feature. But then things get complicated.

There are conversions to Judaism, which also must be sanctioned by a Beit Din, or religious court.  Thus, the set "Jew" is rendered disjunctive, with some Jews in the category by virtue of having a Jewish mother, and other Jews in the category by virtue of having undergone conversion.
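The logic can be made explicit with a minimal sketch in Python -- a drastic simplification, of course, with hypothetical field names, and ignoring all the complications described in the following paragraphs:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Person:
        mother: Optional["Person"] = None
        converted: bool = False          # conversion sanctioned by a Beit Din
        matriarch_jewish: bool = False   # stipulated base case for the sketch

    def is_jew(p: Person) -> bool:
        # Disjunctive rule: EITHER born to a Jewish mother OR converted.
        # Note what never enters the function: the father's status,
        # upbringing, or belief.
        if p.converted or p.matriarch_jewish:
            return True
        return p.mother is not None and is_jew(p.mother)

Notice that the Law of Return's one-Jewish-grandparent criterion, described just below, would be a different predicate altogether -- a nice illustration of how a legal category and a religious category can diverge over the very same people.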

Under Israeli law, a person can make aliyah even though he or she has only one Jewish grandparent.  But that person is not considered to be a Jew under Halakha.  Thus, a person can become an Israeli by virtue of a Jewish heritage, but still not be considered a Jew.

In Britain and America, the Liberal and Reform movements define as a Jew anyone who has at least one Jewish parent, and who is raised as a Jew.  This sets up a conflict for some Reform or Liberal Jews who want to immigrate to Israel, as the Beit Din generally does not recognize as Jewish individuals who were not born to Jewish mothers.  To make things even more difficult, the rabbinical court prefers that the individual be the child of an Orthodox mother.

Similarly, the Israeli Chief Rabbinate, which is dominated by Orthodox Jews, tends not to recognize non-Orthodox conversions -- which excludes most conversions performed in the US and Britain.  

Then there is the matter of individuals who were born Jewish but then convert to another religion.  In Orthodox Judaism, individuals born to Jewish mothers are considered to remain Jews even after their conversion.  But this is not the case for Reform and Liberal Judaism.  

Then there are members of so-called "lost tribes", including the Falasha of Ethiopia, and other "lost tribes" in Africa, the Caucasus, India, Siberia, Burma, and New Mexico.

The situation is made more acute by the fact that the Israeli Chief Rabbinate not only controls aliyah but also controls marriage: Jews are not permitted to marry non-Jews -- and, as just described, the criteria for "Who is a Jew?" are at best unclear, and at worst incredibly strict.  But the rules of categorization have real-life consequences (see "How do You Prove You're a Jew?" by Gershom Gorenberg, New York Times Magazine, 03/02/2008).

To make things even more interesting, "Jew" is an ethnic category as well as a religious one.  This dual status was brought to light in a lawsuit heard in Britain's Supreme Court in 2009, over the issue of admission to a Jewish high school in London.  Britain has some 7,000 publicly financed religious schools; although these are normally open to all applicants, when there are more applicants than openings these schools are permitted to select students based on their religion.  The plaintiff in the case, known as "M", is Jewish, but his mother converted to Judaism in a non-Orthodox synagogue.  Therefore, she does not meet the Orthodox criteria for being Jewish -- and, so, neither does "M".  "M" appealed, and the British Court of Appeals declared that the classic definition of Judaism, based on whether one's mother is Jewish, is inherently discriminatory.  The appeals court argued that the only test of religious faith should be one of religious belief, and that classification based on parentage turns a religious classification into an ethnic or racial one -- which is quite illegal under British law.  The Orthodox rabbinate, which controls these sorts of things, claims that this ruling violates 5,000 years of Jewish tradition, and represents an unlawful intrusion of the State into religious affairs.  (See "British Case Raises Issue of Identity for Jews" by Sarah Lyall, New York Times, 11/08/2009.)

Another perspective on "Jewish" as a social category is provided by a rabbinical debate concerning fertility treatments.  In one form of fertility treatment, eggs from a donor are implanted in an infertile woman, to be fertilized by her husband (or whomever).  Recall that, according to the Orthodox rule, a child is a Jew if his or her mother is a Jew.  But in the case of egg donation, who's the mother?  The woman who donated the egg, or the woman who gave birth to the child?  This issue was hotly debated at a conference in Jerusalem hosted by the Puah Institute.  Many Orthodox authorities argue that it's the egg donor who matters, and the birth mother is only an "incubator", whose womb is an "external tool".  Others argue that because Judaism accepts converts, it cannot be considered a "genetic" religion -- that there's no "Jewish blood".  But then again, that principle is compromised by the British school case described earlier -- though maybe not: in the school case, if the mother had undergone an Orthodox conversion, the issue of her son's Jewishness would never have arisen.  Then again, it's conceivable that, at an Orthodox wedding, the officiating rabbi could ask the couple how they were conceived, and require evidence of the egg donor's Jewishness.

This sidebar can provide only the briefest sketch of the question, which turns out to be incredibly complicated -- and also in flux, depending on the precise makeup (in terms of the balance between Modern Orthodox and Ultra-Orthodox members) of the Israeli Chief Rabbinate.  I claim no expertise in Halakha.  The point is that the question "Who is a Jew?" is not self-evident, and it's not just a matter of anti-Semitic stereotyping, but has real consequences, even among Jews, and even in Israel itself.  (See "Fertility Treatment Gets More Complicated" by Gabrielle Birkner, Wall Street Journal, 05/14/2010).

It's a good example of the issues that surround social categorization.  Social categories may exist in the mind(s) of the beholder(s), but -- as the Thomases would surely agree -- they are real in their consequences.

It should be said, in conclusion, that the racial category "white" -- the usual contrast for social categories such as Black and Asian -- is also problematic.  In the ancient world, ethnic distinctions were based on culture, not physical differences such as skin color -- as when the Greeks and Romans, not to mention the Chinese, distinguished between themselves and "barbarians".  In fact, the Greeks noted that the Scythians and Celts were lighter in skin tone than themselves.  So were the Circassians, from whom we derive the very term Caucasian -- but at that time the Caucasians were hardly a dominant ethnic group, and whiteness had no special cachet.

Apparently, the notion of "white" as an ethnic category began with German "racial science" in the 18th and 19th centuries, such as Johann Friedrich Blumenbach -- an early anthropologist who classified humans into five races based on skin color: Caucasions (white), Mongolians (yellow), Malays (brown), Negroids (black), and Americans (red).  Blumenbach's system was adopted in America by Thomas Jefferson and others.  However, while Bumenbach took Caucasians as the exemplars of whiteness, Jefferson and others focused on Anglo-Saxons (English and lowland Scots, but not the Irish) and Teutons (Germans).  Later, the boundaries of "white" were extended to Nordics and Aryans, and still later to Alpines (Eastern Europeans) and Mediterraneans (Italians and Greeks).  The Irish, who were considered to be only 30% Nordic, and 70% Mediterranean, were granted "white" status after the Civil War.  Thomas Carlyle considered the French to be an "ape-population" -- but then again, a 1995 episode of The Simpsons referrred to the French as "cheese-eating surrender monkeys", so maybe we haven't progressed so far after all.  [For details, see The History of White People (2010) by Nell Irvin Painter; the illustration, by Leigh Wells, is taken from the review of Painter's book, by Linda Gordon, New York Times 03/28/2010.]

Hardly anyone uses the term "Caucasian" anymore -- nor, for that matter, the other Blumenbachian terms, "Mongoloid" and "Negroid".  But it has come up in interesting contexts.  In Takao Ozawa v. United States (1922), the United States Supreme Court found a Japanese man ineligible for citizenship because he was not "Caucasian" -- even though he was light-skinned.  In United States v. Bhagat Singh Thind (1923), the Court denied citizenship to a man of Indian descent because, although he was technically "Caucasian", he was not light-skinned (the case is a notable example of the judicial theory of Original Intent).

Shaila Dewan, an American of East Indian and European descent, writes about "whiteness" in her essay, "Has 'Caucasian' Lost Its Meaning?" (New York Times, 07/07/2013).  She notes that in the American South, she was often asked about her ethnic origins.  When she answered that her father was from India, but her mother was white, she felt pressed for further clarification: "What kind of white?"  The answer was that her mother was a mix of Norwegian, Scottish, and German ancestry.  That experience, in turn, led her to think about sub-classifications within the category of "white".  The implication, as explained to her by Matthew Pratt Guterl, who wrote The Color of Race in America, 1900-1940, is that "all whitenesses are not created equal".

All of which seems to illustrate the outgroup homogeneity effect.  Whites care about whether someone is English or Irish, Swedish or Norwegian, Polish or Lithuanian; but they don't distinguish between Chicanos and other Hispanics, and they don't ask whether the ancestors of African-Americans came from East or West Africa.  However, Latinos may well distinguish between people of Mexican, South American, Cuban, or, for that matter, Iberian heritage; and African-Americans may well distinguish between those with a heritage of slavery (like Michelle Obama) and those without one (like Barack Obama).  There's a study here: hint, hint.

 

Social Categorization in the United States Census

Nowhere is the intertwining of the "natural" and "social" bases for racial and ethnic classification clearer than in the history of the United States census:


From 1790 to 1850, the only racial classification was between White and Black, with the Black population further classified (in accordance with the "three-fifths clause" of the United States Constitution) as "Free" or "Slave".

From 1850 to 1870, the category of Mulatto (of mixed white and black parentage) was added to Black and White.  

In 1860, reflecting the westward expansion of the American frontier, American Indians were added to the census.

At first, the census counted only those Indians who were subject to taxation, but in 1870 the census was expanded to all Indians.  

Beginning in 1890, the census distinguished between those Indians who were taxed, and those who lived on reservations or in the Indian Territory (until Indian Territory became the Territory, and then State, of Oklahoma).

Also in 1860, reflecting the importation of Chinese labor to help build the Transcontinental Railroad, Chinese were added to the census.

Japanese were added in 1870.

In 1890, categories of Quadroon (one black grandparent, thus 1/4 "black blood") and Octoroon (one black great-grandparent, thus 1/8 "black blood") were added to the census.  These categories were a holdover from the French territory of Louisiana, and anticipated the "one drop rule" of the Jim Crow period, but they were abandoned in 1910.  

In 1910, categories of Asian (meaning Asians other than Chinese and Japanese) and Pacific Islander were added -- probably reflecting the acquisition of Pacific territories such as the Philippines, Guam, and Samoa following the Spanish American War.

In 1930, the category of Mexican was added, reflecting the beginning of large-scale Mexican immigration into the American Southwest, often as "guest labor".  

In 1940, the category Mexican was dropped in favor of White Hispanic -- presumably to cover the immigration of Latinos other than Chicanos.

In 1970, this category was changed to Spanish Language/Heritage/Origin/Descent, reflecting the increasing reluctance of Hispanics to identify themselves as White.  Mexican, Cuban, and Puerto Rican respondents were allowed to classify themselves as "Other Race".

In 1950, Filipinos were added to the list.

Beginning in 1950, a category of Other Race (including Mixed Race) was included.

In the 2000 census, respondents were able to check all the racial and ethnic categories that applied to them.

In 1960, categories of Eskimo, Aleut, and Hawaiian were added, reflecting the admission of Alaska and Hawaii to the Union in 1959.

In 1970, Koreans were added to the list.

In 1980, a new category of Asian and Pacific Islander appeared, along with subcategories for Vietnamese, Asian Indian, Guamanian, and Samoan.

There were no changes in the race and ethnicity categories in the 1990 census.

In the 2000 census, individuals were allowed to identify themselves as members of "two or more races" or of "some other race".  As a result, 47.9% of Hispanics identified themselves as White, 2% as Black, 6.3% as "two or more races", and fully 42.2% as "some other race".  Hispanics were virtually the only ethnic group to use the "some other race" category.

In an attempt to address this anomaly, the Census Bureau considered eliminating the "Hispanic" category in the 2010 census, in order to force Hispanics to identify themselves as White, Black, Asian, American Indian, or Pacific Islander.  Hispanics, for their part, resisted this pressure -- some on the grounds that Hispanics themselves constitute a kind of racial grouping; others on the grounds that Hispanics have their own subcategories (jabao, indio, trigueño, moreno) based on skin color or birthplace.  On the 2000 census form, many Hispanics further identified themselves as Latino, Mexican, Puerto Rican, Dominican, etc.

In the final analysis, the procedure (and categories) of the 2000 census were retained in the 2010 census, except:

The 2000 census asked respondents to identify themselves as "Spanish, Hispanic, or Latino".

Those who did so were asked to further classify themselves as:

Mexican, Mexican-American, Chicano

Puerto Rican

Cuban

Other

But representatives of the Hispanic community objected that they didn't consider themselves "Spanish".

Accordingly, in the 2010 census, the order was changed to "Hispanic, Latino, or Spanish Origin".

The subcategories of Hispanics, etc., were retained as in the 2000 census.

And once again, respondents were permitted to endorse multiple racial categories.  As a result, the Census Bureau recognizes a total of 63 racial categories, based on the various possible combinations of the six basic race categories on the form (see the sketch below).
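
Where does the figure of 63 come from?  Simple combinatorics: it is the number of ways to check at least one of six boxes.  A minimal sketch (mine, not the Census Bureau's; the six category names are paraphrased from the 2010 form):

    # The census's 63 "racial categories" correspond to the non-empty
    # combinations of six basic race checkboxes: 2^6 - 1 = 63.
    from itertools import combinations

    RACES = ["White", "Black", "American Indian/Alaska Native", "Asian",
             "Native Hawaiian/Pacific Islander", "Some Other Race"]

    count = sum(1 for k in range(1, len(RACES) + 1)
                for _ in combinations(RACES, k))
    print(count)              # 63
    print(2**len(RACES) - 1)  # 63 -- the closed-form equivalent

Crossing these 63 race combinations with the separate Hispanic/non-Hispanic ethnicity question doubles the count to 126, which helps explain how different tallies can be reached from the same data.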

To make things even more complex, different government agencies, such as the Department of Education and the National Center for Health Statistics, tally multiracial individuals according to different schemes than the one used by the Census Bureau (see "In a Multiracial Nation, Many Ways to Tally" by Susan Saulny, New York Times, 02/10/2011, and other articles in the Times' ongoing series, "Race Remixed: The Pigeonhole Problem").

Beyond 2010, Kenneth Prewitt, who served as director of the Census Bureau from 1998 to 2000, has written that "the demographic revolution since the immigration overhaul of 1965 has pushed the outdated (and politically constructed) notion of race to the breaking point" ("Fix the Census' Archaic Racial Categories", New York Times, 08/22/2013), and has proposed reforms to the census's racial categories (see his book, cited below).


For more details, see "Historical Census Statistics on Population Totals by Race, 1790 to 1990, and by Hispanic Origin, 1970 to 1990), for the United States, Regions, Divisions, and States" by Campbell Gibson & Kay Jung (Working Paper Series No. 56, Population Division, U.S. Census Bureau (09/02), from which much of this this material is taken.  

See also:

  • "Hispanics Debate Racial Grouping by Census" by Rachel L. Swarns, New York Times, 10/24/04;
  • "Marrying Out" by Luna Shyr, National Geographic, 04/2011;
  • What Is Your Race? The Census and Our Flawed Effort to Classify Americans (2013) by Kenneth Prewitt.

 

Minorities and Diversity on Campus

The difficulties of social categorization are not confined to the census.

Consider the evolution of ethnic categories offered to undergraduate applicants to the University of California system.

For most of its recent history, the UC classified its applicants into a small set of broad categories:

American Indian

African American

Chicano/Latino

Asian/Filipino/Pacific Islander

White

Other

Unknown.

However, the application for the 2008-2009 academic year contained a more differentiated set of categories, particularly for "Asian" ethnicities:

African American/Black

American Indian/Alaska Native

Chinese/Chinese American

East Indian/Pakistani

Filipino/Filipino American

Japanese/Japanese American

Korean/Korean American

Mexican/Mexican American/Chicano

Pacific Islander (including Micronesian, Polynesian, and other Pacific Islanders)

Vietnamese/Vietnamese American

White/Caucasian (including Middle Eastern)

Still, out of a concern that certain Southeast Asian and Pacific Islander groups were disadvantaged in the admissions process, in part because their numbers were submerged in larger ethnic groups like Vietnamese and Filipinos, representatives of Pacific Rim students mounted a "Count Me In" campaign.  In response, the UC system greatly expanded its categories for Southeast Asians and Pacific Islanders.

There are now separate categories for a wider variety of Asian ethnic groups:

Chinese

Taiwanese

Asian Indian

Pakistani

Japanese

Korean

Filipino

Vietnamese

Hmong

Thai

Cambodian

Laotian

Malaysian

Sri Lankan

Other Asian

Similarly, there are now separate categories for a wider variety of Pacific Islander ethnicities:

Native Hawaiian

Guamanian/Chamorro

Samoan

Tongan

Fijian

Other Pacific Islander

Note that, aside from Mexican Americans, "Other Spanish Americans" are still lumped together -- never mind Middle Easterners, who are lumped together with Whites (and, for that matter, never mind ethnicities among Whites!).  As more and more ethnic groups mount "Count Me In" campaigns, we can expect official recognition of more and more ethnic categories.

And, as a smaller example, consider the racial and ethnic classifications used in the Research Participation Program (RPP) of the UCB Department of Psychology.  For purposes of prescreening, students in the RPP are asked to classify themselves with respect to gender identity, ethnic identity, and other characteristics.

In 2004, a relatively small number of such categories were employed -- pretty much along the lines of the 2000 census.


But in 2006, the RPP employed a much more diverse set of racial and ethnic categories -- with more than a dozen subcategories for Asians and Asian-Americans, for example.  Arguably, the ethnic composition of the Berkeley student body didn't change all that much in just two years!  Rather, the change was motivated by the fact that the Psychology Department has a number of researchers interested in cultural psychology, and especially in differences between people of Asian and European heritage.  In this research, it is important to make rather fine distinctions among Asians, with respect to their ancestral lands.  But note some anomalies:

"Asians" and "Asian-Americans" are grouped together, as if their ancestral homes mattered more than their American citizenship -- or, perhaps, how recently they and their families arrived in America   This classification seems to indicate that a Japanese-American student whose family has lived in Southern California for three generations has more in common with an exchange student from Kyoto than he has with a White student from Los Angeles.

European-Americans do not get classified into such fine slices -- and the investigators have chosen to distinguish between Western and Eastern Europeans (grouping together Germans, French, and English, as opposed to Russians, Czechs, and Poles), while ignoring the distinction between Northern (largely Protestant) and Southern (largely Catholic) Europeans.

Middle-Easterners are lumped together, ignoring the huge differences between (mostly Jewish) Israelis, (mostly Muslim) Arabs, and Persians (Iranians) -- who, while also (mostly) Muslim, have huge cultural differences with Arabs.

African-Americans can identify themselves as such, but there is no place to recognize Africans who are not Americans -- for example, exchange students from Nigeria -- never mind Algeria, or the Caribbean.

In fact, the term "African-American", as a proxy for 'Black", emphasizes ethnic heritage over race, but creates all sorts of problems of its own:

What about non-American Africans, such as students from Liberia, Nigeria, etc.?

What about non-Black Africans, such as students from Algeria and Egypt?

What about non-Black African-Americans?

If you think this last category is a contradiction in terms, think again:

A recent article on the transformation of "minority" to "diversity" programs in higher education noted that "of the three white participants [in a university diversity program], one was a young man who routinely identifies himself on applications as 'African-American' because his father was 'raised in Egypt'" (from "'Minority' to 'Diversity'" by Peter Schmidt, Chronicle of Higher Education, 02/03/06).

This was intended to illustrate the point that White students now have access to some programs formerly offered only to minority students -- for example, on grounds of economic or other disadvantage.

But a subsequent Letter to the Editor complained about the author's apparent discounting of the student's ethnic identity, based on his "white" physical features:  "On what basis did Mr. Schmidt determine that the student was white if he identifies himself as black? Why shouldn't he be African-American, if that is how he views himself?  This is a good example of how in the battle between self-identification and identification by others, identification by others seems to win out.  Irrespective of how you view yourself, how you are perceived and written about depends on the stereotyping eye of the beholder" (from a letter to the editor by Rochelle Parks-Yancy, CHE, 02/24/06).

Note, however, that the student didn't actually identify himself as Black -- only as African American.  

This point was reinforced by an editorial reply (CHE, 02/24/06): "The student in question told our reporter that he considers himself and both his parents white, and he would never describe himself as black -- only as African-American."

Never mind that the student claimed only that his father was raised in Egypt -- meaning that his father might have been Egyptian, or a Palestinian exiled in Egypt by the Arab-Israeli conflict, or a white kid born in Iowa whose father was on the faculty of the American University in Cairo.

The category of Hispanic has also been contested: should it apply to anyone with Spanish heritage, including immigrants from Spain as well as Latin America -- not to mention Spanish Morocco?  

I once knew someone who, as a Cuban of Jewish heritage, was denied membership in a Hispanic social organization on the grounds that, as a Cuban and a Jew, he wasn't really Hispanic -- by which, apparently, the organization meant Mexican-American.

More to the point, President Obama's nomination of Judge Sonia Sotomayor, who is of Puerto Rican heritage, as Associate Justice of the Supreme Court (replacing Justice David Souter) raised the question of whether, if confirmed, she would be the first Hispanic Supreme Court justice.  Most people thought so, including spokespeople for major Hispanic advocacy groups.  But then there is the fact that Justice Benjamin Cardozo, appointed to the Court by President Herbert Hoover, was of Portuguese heritage.  Cardozo, however, didn't identify himself as Hispanic (a term that was not in common use during his lifetime), or even as Portuguese.  Rather, he most likely would have identified himself as "a Sephardic Jew whose ancestors came from the Iberian Peninsula", in the words of his biographer, Andrew Kaufman ("Was Hispanic on the Court in the '30s?" by Neil A. Lewis, New York Times, 05/27/2009).

Of course, the Portuguese speak Portuguese, not Spanish.  

Not to mention that Portugal declared its independence from Spain in 1143, and the Treaty of Tordesillas (signed in 1494, only two years after Columbus) divided the non-Christian world, including the New World, between Spain and Portugal -- so even the Pope (who negotiated the treaty), as well as the Spanish and the Portuguese, considered the two countries to be different.

Then again, Philip II of Spain declared himself king of Portugal as well, initiating the "Spanish Captivity" of Portugal, which lasted from 1580-1640 -- when Portugal declared its independence once again.  

Most Hispanic organizations do not regard Portuguese as Hispanic.  For example, the National Association of Latino Elected and Appointed Officials restricts the label "Hispanic" to individuals who are descended from countries in the Americas with a Spanish-language heritage.

Nor does the United States Census.

But Reps. Tony Coelho, who is of Portuguese ancestry, and Dennis Cardoza, whose ancestors came from the Portuguese Azores, joined the Congressional Hispanic Caucus.

Then again, there are White people in the Congressional Black Caucus -- or, at least, there have been such members in the past.

 

Personality Types

Obviously, our language contains a large number of nouns which designate various types of people.  These types are categories of people, and the nouns are category labels.  Many of these classificatory labels have their origins in scientific research on personality, including the terms used to label various forms of mental illness, but they have also filtered into common parlance.  You don't have to be a psychologist or a psychiatrist to label someone an extravert or a psycho.  

The classification of people according to their personality type has a history that goes back almost 2,500 years.

 

Theophrastus and the Characterological Tradition in Literature

The chief preoccupation of Greek science was with classification.  Aristotle (384-322 B.C.), in his Historia Animalium, provided a taxonomy, or classificatory scheme, for biological phenomena.  Theophrastus (370-287 B.C.), his successor as head of the Peripatetic School in Athens (so named because the teachers strolled around the courtyard while lecturing), followed his example by developing a two-part classification of plants that heavily influenced the modern "genus-species" taxonomy introduced by Linnaeus.  Then he turned his attention to developing a taxonomy of people.  His work is embodied in Characters, a delightful book in which he described the various types of people encountered in Athenian society.  Unfortunately, that portion of the book which described socially desirable types has been lost to history: all that remains are his portraits of 30 thoroughly negative characters, most of whom are instantly recognizable even today, more than 2,000 years later.  All his descriptions follow the same expository format: a brief definition of the dominant feature of the personality under consideration, followed by a list of typical behaviors representative of that feature.

The Distrustful Man

It goes without saying that Distrustfulness is a presumption of dishonesty against all mankind; and the Distrustful man is he that will send one servant off to market and then another to learn what price he paid; and will carry his own money and sit down every furlong to count it over. When he is abed he will ask his wife if the coffer be locked and the cupboard sealed and the house-door bolted, and for all she may say Yes, he will himself rise naked and bare-foot from the blankets and light the candle and run round the house to see, and even so will hardly go to sleep. Those that owe him money find him demand the usury before witnesses, so that they shall never by any means deny that he has asked it. His cloak is put out to wash not where it will be fulled best, but where the fuller gives him good security. And when a neighbor comes a-borrowing drinking-cups he will refuse him if he can; should he perchance be a great friend or a kinsman, he will lend them, yet almost weigh them and assay them, if not take security for them, before he does so. When his servant attends him he is bidden go before and not behind, so that he may make sure he do not take himself off by the way. And to any man who has bought of him and says, 'Reckon it up and set it down; I cannot send for the money just yet,' he replies, 'Never mind; I will accompany you home' (Theophrastus, 319 B.C./1929, pp. 85-87).

Theophrastus initiated a literary tradition which became very popular during the 16th and 17th centuries, especially in England and France (for reviews see Aldington, 1925; Roback, 1928).  However, these later examples represent significant departures from their forerunner.  Theophrastus was interested in the objective description of broad types of people defined by some salient psychological characteristic.  In contrast, the later efforts show an increasing interest in types defined by social class or occupational status.  In other instances, the author presents word portraits of particular individuals, with little apparent concern for whether the subjects of the sketch are representative of any broader class at all.  Early examples of this tendency are to be found in the descriptions of the pilgrims in Chaucer's (c. 1387) Canterbury Tales.  Two examples that lie closer to Theophrastus' intentions are the Microcosmographie of John Earle (1628) and La Bruyère's Les Caractères (1688).  More recent examples of the form may be found in George Eliot's Impressions of Theophrastus Such (1879) and Earwitness: Fifty Characters (1982) by Elias Canetti, winner of the 1981 Nobel Prize for Literature.

The later character sketches also became increasingly opinionated in nature, including the author's personal evaluations of the class or individual, or serving as vehicles for making ethical or moral points.  Like Theophrastus, however, all of these authors attempted highly abstract character portraits, in which individuals were lifted out of the social and temporal context in which their lives ran their course.  Reading one of these sketches, we have little or no idea what forces impinged on these individuals to shape their thoughts and actions; what their motives, goals, and intentions were; or what their lives were like from day to day, year to year.  As authors became more and more interested in such matters they began to write "histories" or "biographies" of fictitious characters -- in short, novels.  In the 18th century the novel quickly rose to become the dominant literary form in Europe, and interest in the character-sketch waned.  Character portraits still occur in novels and short stories, but only as a minor part of the whole -- perhaps contributing to the backdrop against which the action of the plot takes place.  Again, insofar as they describe particular individuals, character sketches embedded in novels lack the quality of universality which Theophrastus sought to achieve.

 

Scientific and Pseudoscientific Typologies in the Ancient World

Characters is a classic of literature because -- despite the radical differences between ancient Athenian culture and our own -- Theophrastus' 30 character types are instantly recognizable by readers of any place and time.  As a scientific endeavor, however, it is not so satisfying.  In the first place, Theophrastus provides no evidence in support of his typological distinctions: were there really 30 negative types of Greeks, or were there 28 or 32; and if there were indeed 30 such types, were they these 30?  (Theophrastus didn't describe any positive characters, but suggested that he had described them in another manuscript that has been lost -- or perhaps Theophrastus was just kidding.)  Moreover, Theophrastus did not offer any scheme to organize these types, showing how they might be related to each other.  Perhaps more important -- assuming that Characters attained classic status precisely because Theophrastus' types were deemed to be universal -- is the question of the origin of the types.  Theophrastus raised this question at the very beginning of his book, but he did not offer any answer:

I have often marvelled, when I have given the matter my attention, and it may be I shall never cease to marvel, why it has come about that, albeit the whole of Greece lies in the same clime and all Greeks have a like upbringing, we have not the same constitution of character (319 B.C./1929, p. 37).

The ancients had solutions to all problems, both scientific and pseudoscientific.

 

Astrology

Some popular approaches to creating typologies of personality have their origins in ancient folklore, and from time to time they have been endowed with the appearance of science. For example, a tradition of physiognomy diagnosed personality on the basis of similarities in physical appearance between individual humans and species of infrahuman animals. Thus, a person possessing hawk-like eyes, or an eagle-like nose was presumed to share behavioral characteristics with that species as well.

By far the most prominent of these pseudoscientific approaches to personality was (and still is) astrology, which holds that the sun, moon, planets, and stars somehow influence events on earth.  The theory has its origins in the ancient idea that events in the heavens -- eclipses, conjunctions of stars, and the like -- were omens of things to come.  This interest in astral omens has been traced back almost 4,000 years to the First Dynasty of the kingdom of Babylon.  Astrology per se appears to have begun in the 3rd century B.C., when religious authorities began using the planets to predict events in an individual's life.  The various planets, and signs of the Zodiac, were thought to be associated with various attributes.  The astrologer prepared a horoscope, or map of the heavens at the moment of an individual's birth (or, sometimes, his or her conception), and predicted, on the basis of the relative positions of the heavenly bodies, what characteristics the person would possess.  Of course, because these relative positions varied constantly, somewhat different predictions could be derived for each individual.  To the extent that two individuals were born at the same time and in the same place, then, they would be similar in personality.

Later, this complicated system was considerably simplified, such that predictions were based on the zodiacal signs themselves.  Each sign was associated with a different portion of the calendar year, and individuals born during that interval were held to acquire the corresponding personality characteristics.  Thus, modern astrology establishes 12 personality types, one for each sign of the Zodiac (a schematic sketch of this date-based categorization follows the excerpts below).  In the passages which follow, taken from the Larousse Encyclopaedia of Astrology, note the stylistic similarity to the character portraits of Theophrastus.

Aries is essentially a sign of beginnings, of boundless creativity and pure energy. Arians ... must be up and doing for the sheer joy of it, especially if the activity involves adventure, the exploration of unknown territory, and even danger. The rulership of Mars bestows strength and courage, and a strong desire nature, which means both sexual desire and the drive to conquer and possess material things, power, and fame. The incredible Aries energy in initiating new projects is a blend of the enthusiasm and self-confidence of fire and the outgoing activity of cardinality. The polar opposite of indecisive Libra, Arians seldom have time to look before they leap; they simply rush forward headlong, for it is their business to lead and inspire.

The symbol for Scorpio is the scorpion, a creature that travels by night and is feared for its deadly sting. Though all Scorpio people are by no means venomous and cruel, the symbol conveys the qualities of secretiveness, penetration, and power that do characterize the natives of this sign .... Besides the strong sexuality for which they are famous, Scorpio people also have an awareness of death that is often not fearful .... Indeed, if they are afraid of anything at all, it is of being known as deeply as they wish to know. Their ability to penetrate and probe may be channeled constructively into research or healing or it may be used to manipulate people in personal relationships.
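
Stripped of the horoscope's machinery, sun-sign astrology is categorization by date interval, pure and simple: a birth date is sorted into one of twelve equivalence classes, and the attributes of the class are then imputed to the person.  A minimal sketch (my illustration, not anything astrologers actually compute), assuming the conventional tropical boundary dates, which vary by a day or so across published tables:

    # Sun-sign astrology as pure categorization by date interval.
    # Boundary dates are the conventional ones (e.g., Scorpio beginning
    # October 24), not the precession-corrected dates discussed below.

    SIGN_STARTS = [
        ((1, 20), "Aquarius"), ((2, 19), "Pisces"),  ((3, 21), "Aries"),
        ((4, 20), "Taurus"),   ((5, 21), "Gemini"),  ((6, 22), "Cancer"),
        ((7, 23), "Leo"),      ((8, 23), "Virgo"),   ((9, 23), "Libra"),
        ((10, 24), "Scorpio"), ((11, 23), "Sagittarius"), ((12, 22), "Capricorn"),
    ]

    def sun_sign(month: int, day: int) -> str:
        """Assign a birth date to one of the twelve zodiacal categories."""
        sign = "Capricorn"  # dates before January 20 fall in Capricorn
        for start, name in SIGN_STARTS:
            if (month, day) >= start:  # tuple comparison: month, then day
                sign = name
        return sign

    print(sun_sign(11, 5))  # Scorpio
    print(sun_sign(1, 5))   # Capricorn

The sketch makes only the structural point: one observable feature (birth date) assigns a person to a category, and the category then licenses inferences about a whole cluster of attributes that cannot be observed directly.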

Astrology was immensely powerful in the ancient world, and even in the 20th century various political leaders, such as Adolf Hitler in Germany and Lon Nol in Cambodia, have had horoscopes computed to help them in decision-making (Nancy Reagan famously consulted an astrologer about the scheduling of some White House events).  However, by the 17th century astrology had lost its theoretical underpinnings.  First, the new astronomy of Copernicus (1473-1543), Galileo (1564-1642), and Kepler (1571-1630) showed that the earth was not at the center of the universe, as astrological doctrine required.  Then, the new physics of Descartes (1596-1650) and Newton (1642-1727) proved that the stars could have no physical influence on the earth.  If that were not enough, the later discovery of Uranus, Neptune, and Pluto would have created enormous problems for a system predicated on the assumption that there were six, not nine, planets.  In any event, there is no credible evidence of any lawful relationship between horoscope and personality.

Never mind that there are actually thirteen signs of the zodiac.  The Babylonians noted that the sun also passes through the constellation Ophiuchus, the serpent-holder (November 29-December 17).  But the sun spends less time in Ophiuchus than it does in the other constellations, and the "pass" is really only a tangential nick in spatial terms.  So the Babylonians, who wanted there to be just twelve zodiacal signs, discarded it, leaving us with the twelve signs we know today.  And never mind that the boundaries between astrological signs are wrong.  Because of the astronomical phenomenon of precession, caused by the wobbling of the Earth on its axis, the actual dates are shifted by about a month from their conventional boundaries.  The true dates for Scorpio, which are usually given as October 24-November 22, are actually November 23-November 29.  If you want to mock either astrologers or horoscope-readers for not being faithful to their system, then you should knock Sir Isaac Newton as well.  After all, a prism really breaks white light up into only six primary colors (look for yourself); he added indigo because he thought that the number 7 had occult significance (he was also an alchemist, after all).

In 2011, Parke Kunkel, an astronomer and member of the Minnesota Planetarium Society, reminded astrologers of these inconvenient facts, which meant that a large number of people would have to adjust their signs.  According to a news story ("Did Your Horoscope Predict This?" by Jesse McKinley, New York Times, 01/15/2011), one astrology buff tweeted: "My zodiac sign changed.  Does that mean that I'm not anymore who I used to be?!?"  Another wrote, "First we were told that Pluto is not a planet, now there's a new zodiac sign, Ophiuchus.  My childhood was a bloody lie."  On the other hand, an astrologer told of "a woman who told me she'd always felt there were one or two traits about Sagittarius that didn't fit her personality, but that the new sign is spot on".  Other people, I'm sure, responded "I don't care: I'm still a Scorpio" or whatever -- which, I think, is eloquent testimony to the fact that the traditional zodiacal signs really do serve as social categories, and as elements of personal identity -- which is why so many people exchange their astrological signs on first dates.

 

The Humor Theory of Temperament

Greek science had another answer for these questions, in the form of a theory first proposed by Hippocrates (460?-377? B.C.), usually acknowledged as the founder of Western medicine, and elaborated by Galen (130-200? A.D.), his intellectual heir.  Greek physics asserted that the universe was composed of four cosmic elements: air, earth, fire, and water.  Human beings, as microcosms of nature, were composed of humors -- biological substances which paralleled the cosmic elements: blood corresponded to air, phlegm to water, yellow bile to fire, and black bile to earth.  The predominance of one humor over the others endowed each individual with a particular type of temperament -- sanguine, phlegmatic, choleric, or melancholic, respectively.

Humor theory was the first scientific theory of personality -- the first to base its descriptions on some foundation other than the personal predilections of the observer, and the first to provide a rational explanation of individual differences.  The theory was extremely powerful, and dominated both philosophical and medical discussions of personality well into the 19th century.  Immanuel Kant, the German philosopher, abandoned Greek humor theory but retained its fourfold classification of personality types in his Anthropology of 1798 (this book was the forerunner of the now-familiar introductory psychology textbook).  His descriptions of the four personality types have a flavor strongly reminiscent of Theophrastus' Characters.

The Sanguine Temperament. The sanguine person is carefree and full of hope; attributes great importance to whatever he may be dealing with at the moment, but may have forgotten all about it the next. He means to keep his promises but fails to do so because he never considered deeply enough beforehand whether he would be able to keep them. He is good-natured enough to help others, but is a bad debtor and constantly asks for time to pay. He is very sociable, given to pranks, contented, doesn't take anything very seriously and has many, many friends. He is not vicious, but difficult to convert from his sins; he may repent, but contrition (which never becomes a feeling of guilt) is soon forgotten. He is easily fatigued and bored by work, but is constantly engaged in mere games -- these carry with them constant change, and persistence is not his forte.

The Melancholic Temperament.  People tending toward melancholia attribute great importance to everything that concerns them.  They discover everywhere cause for anxiety, and notice first of all the difficulties in a situation, in contradistinction to the sanguine person.  They do not make promises easily, because they insist on keeping their word, and have to consider whether they will be able to do so.  All this is so not because of moral considerations, but because interaction with others makes them worried, suspicious, and thoughtful; it is for this reason that happiness escapes them.

The Choleric Temperament. He is said to be hot-headed, is quickly roused, but easily calmed down if his opponent gives in; he is annoyed without lasting hatred. Activity is quick, but not persistent. He is busy, but does not like to be in business, precisely because he is not persistent; he prefers to give orders, but does not want to be bothered with carrying them out. He loves open recognition, and wants to be publicly praised. He loves appearances, pomp, and formality; he is full of pride and self-love. He is miserly; polite, but with ceremony; he suffers most through the refusal of others to fall in with his pretensions. In one word, the choleric temperament is the least happy, because it is the most likely to call forth opposition to itself.

The Phlegmatic Temperament.  Phlegma means lack of emotion, not laziness; it implies the tendency to be moved, neither quickly nor easily, but persistently.  Such a person warms up slowly, but he retains the warmth longer.  He acts on principle, not by instinct; his happy temperament may supply the lack of sagacity and wisdom.  He is reasonable in his dealing with other people, and usually gets his way by persisting in objectives while appearing to give way to others.

In the end, Greek humor theory proved to be no more valid than astrology.  Nevertheless, it formed the basis for the study of the psychophysiological correlates of emotion -- the search for patterns of somatic activity uniquely corresponding to emotional experiences.  Moreover, the classic fourfold typology laid the basis for a major tradition in the scientific study of personality, emerging around the turn of the 20th century, which analyzed personality in terms of traits rather than types.  We shall examine each of these topics in detail later.  First, however, we should examine other typological schemes that are prominent today.

 

The Four Temperaments

The classic fourfold typology, derived from ancient Greek humour theory, is often referred to as The Four Temperaments.  Under that label, it has been the subject of a number of artworks.

In music, a humoresque is a light-hearted composition.  But Robert Schumann's Humoreske in B-flat, Op. 20 (1839), is a suite based on the four classical humours.

The German composer Paul Hindemith also wrote a suite for piano and strings -- actually, a theme with four variations -- entitled The Four Temperaments (1940), which was choreographed by George Balanchine (1946) for the Ballet Society, the forerunner of the New York City Ballet.

 

 

Modern Clinical Typologies

With the emergence of psychology as a scientific discipline separate from philosophy and physiology in the late 19th century, a number of other typological schemes were proposed. Most of these had their origins in astute clinical observation by psychiatrists and clinical psychologists rather than in rigorous empirical research. However, all of these were explicitly scientific in intent, in that their proponents attempted to develop a body of evidence that would confirm the existence of the types.

 

Intellectual Types

Beginning in the late 18th century, and especially in the late 19th century, as psychiatry began to emerge as a distinct branch of medicine, a great deal of attention was devoted to classification by intellectual ability, as measured by IQ (or something like it).

At the lower end of the scale, there were three subcategories of "mental defective" (what we now call mental retardation):

Moron

Imbecile

Idiot.

At the upper end of the scale, there was only a single category, genius, for those with extremely high IQs.  More recently, the term "genius" has been replaced by "gifted", and the upper end has also been divided into subcategories (see the sketch following the list):

Bright (IQ > 115)

Gifted (IQ > 130)

Highly Gifted (IQ > 145)
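
These labels amount to a purely threshold-based categorization.  A minimal sketch, assuming the cutoffs quoted above (actual gifted-education criteria vary by test and jurisdiction):

    # Threshold-based categorization of the upper tail of the IQ
    # distribution, using the cutoffs listed above.

    def iq_label(iq: int) -> str:
        """Map an IQ score to one of the upper-end category labels."""
        if iq > 145:
            return "Highly Gifted"
        if iq > 130:
            return "Gifted"
        if iq > 115:
            return "Bright"
        return "Unclassified"  # the scheme labels only the upper tail

    print(iq_label(120))  # Bright
    print(iq_label(150))  # Highly Gifted

Note that the cutoffs fall at roughly 1, 2, and 3 standard deviations above the conventional mean of 100 (SD = 15), which is presumably where they come from.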

 

Freudian Typologies

Sigmund Freud (1908), a Viennese psychiatrist whose theory of personality was enormously influential in the 20th century (despite being invalid in every respect), claimed that adults displayed constellations of attributes whose origins could be traced to early childhood experiences related to weaning, toilet training, and sexuality. Freud himself described only one type -- the anal character, which displays excessive frugality, parsimony, petulance, obstinacy, pedantry, and orderliness. His followers, working along the same lines, elaborated a wide variety of additional types such as the oral, urethral, phallic, and genital (Blum, 1953; Fenichel, 1945; Shapiro, 1965).

The passage through the five stages of development leaves its imprint on adult personality. If all goes well, the person emerges possessing what is known as the genital character. Such a person is capable of achieving full sexual satisfaction through orgasm, a fact which for the first time permits the effective regulation of sexual impulses. The individual no longer has any need to adopt primitive defenses, though the adaptive defenses of displacement, creative elaboration, and sublimation are still operative. The person's emotional life is no longer threatening, and he or she can express feelings openly. No longer ambivalent, the person is capable of loving another.

Unfortunately, according to Freud, things rarely if ever go so well.  People do not typically pass through the psychosexual stages unscathed, and thus they generally do not develop the genital character spontaneously.  Developmental crises occurring at earlier stages prevent growth, fulfillment, and the final achievement of genital sexuality.  These difficulties are resolved through the aid of additional defense mechanisms.  For example, the child can experience anxiety and frustration while he or she is in the process of moving from one stage to the next.  Fixation occurs when the developmental process is halted, such that the person remains at the earlier stage.  Alternatively, the child may experience anxiety and frustration after the advance has been completed.  In this case, the person may return to an earlier stage, one that is free of these sorts of conflicts.  This regression, of course, results in the loss of growth.  Because of fixation and regression, psychological development does not necessarily proceed at the same pace as physical development.

Nevertheless, the point at which fixation or regression occurs determines the person's character -- Freud's term for personality -- as an adult. Not all of the resulting character types were described by Freud, but they have become generally accepted by the psychoanalytic community (Blum, 1953).

The Oral Character "... is extremely dependent on others for the maintenance of his self-esteem.  External supplies are all-important to him, and he yearns for them passively....  When he feels depressed, he eats to overcome the emotion.  Oral preoccupations, in addition to food, frequently revolve around drinking, smoking, and kissing" (Blum, 1953, p. 160).  The oral character develops through the resolution of conflict over feeding and weaning.  The oral dependent type relies on others to enhance and maintain self-esteem, and to relieve anxiety.  Characteristically, the person engages in oral preoccupations such as smoking, eating, and drinking to overcome psychic pain.  By contrast, the oral aggressive type expresses hostility towards those perceived to be responsible for his or her frustrations.  This anger and hatred is not expressed by physical biting, as it might be in an infant, but rather by "biting" sarcasm in print or speech.

The Urethral Character:  "The outstanding personality features of the urethral character are ambition and competitiveness..." (Blum, 1953, p. 163).

The Anal Character develops through toilet training. The anal expulsive type retaliates against those deemed responsible for his or her suffering by being messy, irresponsible, disorderly, or wasteful. Or, through the mechanism of reaction formation, the person can appear neat, meticulous, frugal, and orderly. If so, however, the anal expulsive character underlying this surface behavior may be documented by the fact that somewhere, something is messy. The anal creative type, by contrast, produces things in order to please others, as well as oneself. As a result, such an individual develops attributes of generosity, charity, and philanthropy. Finally, the anal retentive type develops an interest in collecting and saving things -- as well as personality attributes of parsimony and frugality. On the other hand, through reaction formation he or she may spend and gamble recklessly, or make foolish investments.

The Phallic Character "behaves in a reckless, resolute, and self-assured fashion.... The overvaluation of the penis and its confusion with the whole body... are reflected by intense vanity, exhibitionism, and sensitiveness.... These individuals usually anticipate an expected assault by attacking first. They appear aggressive and provocative, not so much from what they say or do, but rather in their manner of speaking and acting. Wounded pride... often results in either cold reserve, deep depression, or lively aggression" (Blum, 1953, p. 163).  The phallic character, by virtue of his or her development, overvalues the penis. The male must demonstrate that he has not been castrated, and does so by engaging in reckless, vain, and exhibitionistic behaviors -- what is known in some Latin American cultures as machismo. The female resents having been castrated, and is sullen, provocative, and promiscuous -- as if to say, "look what has been done to me".

In the final analysis, Freud held that adult personality was shaped by a perpetual conflict between instinctual demands and environmental constraints. The instincts are primitive and unconscious. The defenses erected against them in order to mediate the conflict are also unconscious. These propositions give Freud's view of human nature its tragic flavor: conflict is inevitable, because it is rooted in our biological nature; and we do not know the ultimate reasons why we do the things that we do.

 

Jungian Typologies

C.G. Jung (1921), an early follower of