Tag Archives: Culture

Process Theory for Roboticists

Andrew Murphie’s take on Whitehead and McLuhan’s media theory is very succinct. McLuhan has become popular again but close examination of his work reveals sources including Whitehead and Innis, both of whom have more depth to their theories. Whitehead’s process philosophy seems increasingly relevant in understanding a world where the ‘original’ separations between animate and inanimate, human and non-human are shifting and “the relation is the smallest unit of being and of analysis”(Haraway 2008:156).

Btw. I don’t know why Andrew Murphie’s blog is called Adventures in Jutland. More reading is called for.

Whitehead’s Media Theory—a beginning

(Alfred North, for those not living in the 1930s) Whitehead presents a little-remarked-upon but comprehensive ‘media theory’ that resituates media in the world, not “bifurcated” from a large slice of it. This theory is similar to McLuhan’s, yet arguably more complete, and predates it. Indeed, McLuhan read Whitehead extensively (see Douglas Coupland, Marshall McLuhan: You Know Nothing of My Work!, 45, 59). In Whitehead’s theory of media there is no “bifurcation” between different types of signal (technical or natural, for example). Thus Whitehead’s philosophy becomes one in which the complexity of signal at the level of the world is paramount. Signals become “vectors of transmission” for the (“prehension” of) feeling that is central to his account of process. The world is a medium (Whitehead, Process and Reality, 286)—or a multiplicity of worlds (284) are media—for such vectors. For “the philosophy of organism the primary relationship of physical occasions is extensive connection” (288), not simple extension of previously existing “things” (such as “us”).

Whitehead also preempts the very basis of McLuhan’s thought—“the medium is the message.” He writes, “These extensive relations do not make determinate what is transmitted; but they do determine conditions to which all transmission must conform” (ibid.; see also Steven Shaviro, Without Criteria: Kant, Whitehead, Deleuze, and Aesthetics, 52). In a similar but again perhaps more comprehensive manner than McLuhan, Whitehead further understands “the human body” as a kind of signal transducer or modulator, “…as a complex ‘amplifier’–to use the language of the technology of electromagnetism” (119). Even more than this, “the predominant basis of perception is perception of the various bodily organs, as passing on their experiences by channels of transmission and of enhancement” (119).

There is more to say on this on another occasion. Here I will just point once again to the undoing of the bifurcation of nature within Whitehead’s philosophy with regard to signal.

Douglas Coupland, Marshall McLuhan: You Know Nothing of My Work! [New York: Atlas, 2010]

Steven Shaviro, Without Criteria: Kant, Whitehead, Deleuze, and Aesthetics [Cambridge, MA: MIT Press, 2009]

Alfred North Whitehead, Process and Reality [New York: The Free Press, 1978]

Exploring identity politics


Technology is able to infer relationships between objects that might otherwise require a curator or human interpreter to describe. Both the instruments pictured above were used to brand or define. The devil is in the details. (via Catherine Styles)
As part of his ‘Mining the Museum’ installation at the Maryland Historical Society in 1992–93, artist Fred Wilson placed a set of shackles in a display case with fine silverware and titled it Metalwork. Pow. United by the metal of their fabrication, the racially divided, hierarchical histories of these objects dramatically distance them:

Who served the silver? And who could have made the silver objects in apprenticeship situations? And […] whose labour could produce the wealth that produced the silver?

A general principle can be distilled from this. Perhaps: In the very moment we identify a similarity between two objects, we recognise their difference. In other words, the process of drawing two things together creates an equal opposite force that draws attention to their natural distance. So the act of seeking resemblance – consistency, or patterns – simultaneously renders visible the inconsistencies, the structures and textures of our social world. And the greater the conceptual distance between the two likened objects, the more interesting the likening – and the greater the understanding to be found.

This simultaneous pulling together and springing apart of the sociophysical world interests me, and I’ve been thinking about it in relation to Sembl, where the challenge of the game is to identify a way in which a given object is related – surprisingly or humorously or otherwise interestingly – to another object.

What constitutes ‘interesting’ is of course difficult to define and depends to a large degree on the particular players playing. But if the natural conceptual distance between the two related objects is great, the relationship is more likely to be interesting – perhaps because it enables you to think about something in a new way. That’s what made Wilson’s juxtaposition of shackles with silver tableware interesting, and powerful.
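The principle above—that a likening grows more interesting as the conceptual distance between the linked objects grows—can be sketched in code. Everything here is illustrative and hypothetical: the tiny hand-made category tree stands in for real semantic data, and the scoring rule is not Sembl’s actual mechanics, just one way to make the idea concrete.

```python
# Hypothetical parent-category map: each item points to its broader category.
# Shackles and silver tableware meet only at "metalwork", far up the tree.
PARENT = {
    "shackles": "restraints",
    "restraints": "metalwork",
    "silver teapot": "tableware",
    "tableware": "metalwork",
    "metalwork": "artifact",
}

def ancestors(item):
    """Chain from an item up to the root of the toy taxonomy."""
    chain = [item]
    while chain[-1] in PARENT:
        chain.append(PARENT[chain[-1]])
    return chain

def conceptual_distance(a, b):
    """Total steps from a and b to their nearest shared category.

    Under the principle in the text, a larger distance suggests a more
    interesting (and potentially more illuminating) likening.
    """
    chain_a, chain_b = ancestors(a), ancestors(b)
    shared = next(x for x in chain_a if x in chain_b)
    return chain_a.index(shared) + chain_b.index(shared)

print(conceptual_distance("shackles", "silver teapot"))  # prints 4
```

Wilson’s Metalwork juxtaposition scores high under this toy measure precisely because shackles and silverware share only the distant category of their fabrication.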

Bechdel’s Rule – Gender in Film/Movies


The test is so simple, and yet it is so difficult to find movies that meet the three criteria.

1. Must have at least two women in it (with names, as opposed to crowd background)
2. Who talk to each other (not to the men)
3. About something other than the men.

I think Bechdel’s Test could be applied to many situations outside of movies too. I mention this a lot to other people, half of whom have heard of ‘The Rule’. I have trouble remembering the Bechdel part, so I’m blogging it again for the record. The Rule appeared in a 1985 strip of Alison Bechdel’s comic ‘Dykes to Watch Out For’ and has since been referenced in feminist film theory and popular culture as the Bechdel Test.
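The three criteria are simple enough to express as a checklist in code. A minimal sketch, assuming a made-up representation of a film: a set of named women and a list of conversations, each a pair of participants plus a crude topic label ("men" standing in for real dialogue analysis).

```python
def passes_bechdel(named_women, conversations):
    """Apply the three Bechdel criteria to a toy film representation.

    named_women: set of named female characters.
    conversations: list of (participants, topic) pairs, where topic is
    the string "men" when the exchange is about a man.
    """
    if len(named_women) < 2:
        return False  # criterion 1: at least two named women
    for participants, topic in conversations:
        women_present = named_women.intersection(participants)
        if len(women_present) >= 2 and topic != "men":
            return True  # criteria 2 and 3: two women talk, not about men
    return False

# Alien (1979) is often cited as a pass:
print(passes_bechdel({"Ripley", "Lambert"},
                     [({"Ripley", "Lambert"}, "the alien")]))  # prints True
```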

Robotics I | ISEA2011 Istanbul | The Robot State

The papers/events at ISEA2011 present a smorgasbord of interesting reading from robotics to embodiment, augmentation, virtualization, sensory modes and cultural perceptions.

Oh to be in Istanbul now that ISEA 2011 is there!

Contact Summit 2011: The Evolution Will Be Social | Contact Un-conference


Is the future of everything internet social and glorious? If you followed Facebook’s announcement today of increased media-sharing capabilities and the growing trend toward platforming and gamification, then the Contact Summit is just the right froth for your low-fat mochaccino.

If, however, you wonder about the quantification of everything, Foucault’s biopolitics, the lack of socially just change in the post-computer world, and the difficulty of finding opinions neither fragmented into silos nor frothing over the surface, then walk away. Turn off, tune out and don’t paraphrase glibly. There’s more to a life than its digits.

Social sciences still young

  • Edward L. Glaeser

Social scientists seeking an ancient intellectual lineage can find antecedents for economics, sociology, and political science in the work of Plato and Aristotle, but truthfully, the social sciences are parvenu fields. The widespread application of scientific methods to the study of human society—rigorous formal theories, serious empirical testing—occurred only during the twentieth century, mostly since World War II. The youth of the social sciences is exciting: progress is still being made at a ferocious pace, and the contours of these fields are rapidly evolving.

Without data, even the boldest theory is only an untested hypothesis, and in 1900, social scientists had very little real data. Economists lacked basic figures on national earnings. Political scientists knew little about individual voting. Ethnographic research was in its infancy. Only a few pioneering sociologists, like Émile Durkheim and W.E.B. Du Bois, were writing heavily statistical treatises on topics like suicide and African-American life in Philadelphia.

During the twentieth century, great measurers transformed social science. Some, like the economist Simon Kuznets, created usable data series by gathering information from disparate sources. Others, like the sociologist Samuel Stouffer, pioneered the design of large-scale surveys, taking advantage of the opportunities created by mass mobilization during World War II. Anthropologists like Franz Boas and Margaret Mead, and the Chicago sociological school, acquired evidence by close observation of a community. Psychologists began studying human behavior in their laboratories.

Yet these pioneering steps seem slow relative to the current onrush of new data that is now transforming the social sciences. Kuznets’s heirs are doing amazing things by using vast amounts of official data. Harvard’s Raj Chetty and Berkeley’s Emmanuel Saez, for example, have been granted access to Internal Revenue Service data that have produced findings as disparate as documenting the evolution of income inequality across the last century and showing that better kindergarten teachers significantly increase their pupils’ adult earnings.

During the 1990s, Harvard’s John Kain labored long to acquire access to the Texas school system’s database on students, teachers, classes, and test scores. A flood of administrative data followed that has produced scores of pathbreaking papers on the determinants of student achievement. These papers have transformed public-policy debates about schooling.

Technological change has also made life easier to examine. More than a decade ago, Robert Sampson, then at the University of Chicago and now at Harvard, along with several co-authors, studied Chicago neighborhoods by combining census and survey data with visual information gleaned by vehicle-mounted movie cameras. Searchable text databases have helped measure media bias. Researchers using fMRI machines can observe the neural activity associated with ethical or economic activity. The research possibilities created by Google’s database are enormous.

As data quality has improved, social scientists have moved beyond facts and correlations to the deeper quest for causality. Children who grow up in poor neighborhoods typically have worse economic and education outcomes, but does this mean their neighborhoods cause these outcomes? Moving from measurement to experiments is the second great social-science trend.

Good social-science experimental research first proliferated in psychology labs. Economists followed the psychologists by creating labs that tested (and often rejected) the predictions that game theory made concerning behavior in markets and auctions. But there is only so much that laboratory experiments can teach us about the long-term impact of having good neighbors or the functioning of a large, real market.

To analyze these phenomena, social scientists had to take experimental methods to the real world. Many early approaches relied on “natural” experiments, which occur when some external event, like a public policy, more or less randomly affects some individuals and not others. For example, my Harvard colleagues Guido Imbens and Don Rubin, and Bruce Sacerdote of Dartmouth, looked at people who won the Massachusetts lottery to examine the impact of extra earnings on spending and savings. The seemingly random timing of abortion legalization across states enabled John Donohue and Steven Levitt to test whether more abortions meant less crime.

But “natural” experiments are often imperfect, because policy changes are rarely truly random and may not answer the most pressing research questions. So social scientists increasingly have tried to turn public or nonprofit programs into true experiments. In the 1990s, the Department of Housing and Urban Development allowed part of its housing voucher program to become the Moving-to-Opportunity (MTO) Experiment. MTO randomly allocated housing vouchers across a pool of applicants from high-poverty neighborhoods, which enabled Harvard’s Lawrence Katz and Jeffrey Liebman and their co-author, Jeffrey Kling, to test whether children’s outcomes improved when parents were allocated vouchers that enabled them to move to better areas. Parents who got the vouchers did choose to have less-poor neighbors, but many of their children’s outcomes didn’t improve. Academically, girls did better but boys did worse.

The pressing problems of the developing world, and the lower cost of running experiments there, have led to an explosion of experiments in low-income countries. My colleague Michael Kremer helped pioneer such work when he helped set up and analyzed an experiment where de-worming drugs were distributed in some Kenyan schools and not others. School attendance increased substantially in the treated schools. Karthik Muralidharan and Venkatesh Sundararaman helped design an experiment in which teachers in some Indian schools but not in others got extra pay for improving test scores. When they compared the results across schools, they found that scores increased significantly in those randomly chosen schools whose teachers received incentive pay.
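The logic these field experiments share—random assignment followed by a comparison of mean outcomes—can be sketched with synthetic data. The numbers below (200 schools, a +5 test-score effect, Gaussian noise) are invented for illustration; nothing here is drawn from the Kenyan or Indian studies.

```python
import random
import statistics

random.seed(0)

# Toy randomized experiment: assign 200 schools at random to treatment
# (say, teacher incentive pay) or control.
schools = list(range(200))
random.shuffle(schools)
treated, control = set(schools[:100]), set(schools[100:])

def outcome(school):
    # Synthetic test score: noisy baseline plus a +5 treatment effect.
    base = random.gauss(60, 10)
    return base + (5 if school in treated else 0)

scores = {s: outcome(s) for s in schools}

# Because assignment was random, the difference in group means is an
# unbiased estimate of the causal effect of treatment.
effect = (statistics.mean(scores[s] for s in treated)
          - statistics.mean(scores[s] for s in control))
print(round(effect, 1))
```

With 100 schools per arm and a noise standard deviation of 10, the estimate lands near the true effect of 5; random assignment is what licenses reading that difference causally rather than as a correlation.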

The adoption of experimental methods and improved data quality have, in turn, helped generate the third major social-science trend—the increasing irrelevance of traditional field boundaries. Empirical approaches are far more likely than theoretical edifices to be common across fields.

Moreover, the topic-based “silos” that once defined fields are far less binding, because we better understand the profound connections among economics, politics, and sociology. Economic outcomes often reflect sociological forces, and sociological outcomes respond to earnings. It is impossible to understand the wealth of nations without also knowing something about their politics, and Marx was at least right that economics has plenty of influence on politics as well. The connection between health and other outcomes means that the physical sciences are also being drawn in (as the work on de-worming suggests).

Social science is changing rapidly, as better data and real experiments replace the worldly philosophy of the past. Yet that change means that nineteenth-century field definitions feel increasingly obsolete. I hope that Harvard is at the vanguard in rethinking the shape of social science, just as it has been at the vanguard of working on better measurement and causal inference.

Edward L. Glaeser is Glimp professor of economics and director of the Harvard Kennedy School’s Taubman Center for State and Local Government.

Livestreaming robot competitions