Saturday, May 07, 2011

NYU platforms conference part 4

Platforms as Social Spaces

Moderator: Kathy Strandburg

Facilitators: Ryan Calo
Platforms mediate communications and the ways we parse interactions. How can this be manipulated? Studies the social psychology literature around design (really neat paper!): can we solve legal problems through design or reduce them? Example: FB designs its website so you’ll share as much as possible. Persuasive computing: at a granular level, incentivize sharing. When Twitter came around, “update” became more prominent—both questions and commands because some people respond better to one and some to another. The way they now prepopulate your comments, as if you’re already participating. This is no accident.

We get the benefits of disclosure immediately, but costs are less obvious/farther off. Disclosure arbitrage is taking place. We respond better to people who look like us—research suggests that if you morph someone’s face with that of the intended audience, the target likes the speaker better. Interface design has a huge impact on how conversation unfolds: do you feel like you’re talking to a person or just throwing it out? People will write less offensive comments on blogs depending on the font you use for their comments.

Increasing numbers of interactions aren’t with humans at all, though they seem to be. Bots interact online. Example: bot priced a book at $23 million on Amazon because it got into an interaction with another bot bidding up the price. That’s fine for a book, but we’ve seen stock crashes triggered by bot v. bot, and that is a problem.
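The Amazon incident can be sketched as a simple feedback loop: each seller's bot reprices relative to the other's current price, and because the combined multiplier exceeds 1, prices grow geometrically. This is an illustrative toy model, not the actual code; the two multipliers are the ones reported in published accounts of the incident.

```python
# Toy model of the bot-vs-bot pricing loop: seller A's bot slightly
# undercuts seller B, while seller B's bot prices well above A.
# Multipliers (0.9983 and 1.270589) are from reported accounts of the
# Amazon incident; everything else here is illustrative.

def run_pricing_bots(start_b=20.0, rounds=50):
    """Each round, bot A reprices off B, then bot B reprices off A."""
    price_a = price_b = start_b
    for _ in range(rounds):
        price_a = 0.9983 * price_b      # A undercuts B by 0.17%
        price_b = 1.270589 * price_a    # B marks up 27% over A
    return price_a, price_b

a, b = run_pricing_bots()
# Combined factor per round is 0.9983 * 1.270589 ≈ 1.268 > 1,
# so prices compound upward until a human intervenes.
```

After 50 rounds a $20 book is already in the millions; neither rule is irrational on its own, which is what makes the interaction (and the stock-market analogue) hard to police.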

Miriam Cherry
Distinctions between work and play online—not as neat as traditional categories might have been. Leisure space/work space have been divided; now the line between factory and home is blurring across many methods of production, combined with platforms.

Platforms can connect people looking for workers and people looking for work. Questions of labor/value arbitrage, what wages are being paid, and whether the platform has any responsibility for things like minimum wage.  Mechanical Turk and other crowdsourcing platforms: take big tasks and break them down into small tasks and then put the results together—can be used for tagging websites, gathering news. Other ways of working: contests, which can require midlevel or very sophisticated skills. People are inspired/incentivized to produce in response. Other types of work: gold farmers in virtual worlds. The value they produce is often tradeable outside or inside the game; has a commodifiable time value. Other forms of work in virtual worlds have more traditional analogues: accounting in Second Life, similar to accounting using a phone call.
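The split-and-recombine pattern Cherry describes can be sketched as: break a big labeling job into small units, collect redundant answers for each item from different workers, then aggregate by majority vote. All names and data here are hypothetical; real platforms like Mechanical Turk handle the distribution and payment.

```python
from collections import Counter

def split_task(items, chunk_size):
    """Break a large task into small units of work ('HITs')."""
    return [items[i:i + chunk_size] for i in range(0, len(items), chunk_size)]

def aggregate(answers):
    """Recombine redundant worker answers for one item by majority vote."""
    return Counter(answers).most_common(1)[0][0]

# Hypothetical website-tagging job, split into two small units:
urls = ["a.com", "b.com", "c.com", "d.com"]
hits = split_task(urls, chunk_size=2)

# Simulated labels from three workers per URL:
labels = {
    "a.com": ["news", "news", "blog"],
    "b.com": ["shop", "shop", "shop"],
}
final = {url: aggregate(votes) for url, votes in labels.items()}
# final == {"a.com": "news", "b.com": "shop"}
```

The redundancy is the point: no single worker's answer is trusted, which is also why each unit can be priced in pennies.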

Normative concerns arise as well as opportunities. Drawing people from across the world offers a chance to get the most efficient business and get the best from globalization/creativity. Questions: does the work feel like work? Huffington Post and SSRN—are these supposed to be commercialized? Is that wrong? What are users’ expectations, and can this be handled by a clickthrough license? Such licenses are often criticized and may not be helpful for navigating the work/play divide. Where does the possibility for exploitation seem higher?

Dan Hunter
Metaphors affect policy: the consequences of the narrative of place for the internet. Hard to get a purchase on platform as metaphor. To him, a platform doesn’t generate much in the way of implications. Some are private, others public.

Games: game designers are anxious/angry about the idea that people outside the magic circle come in and say “thou shalt not do this”—what if I want to create a game about Nazi concentration camps? Well, what about a child porn game? At what point is the barrier between the magic circle and the outside world breached? Same question as when should we say that the private regulatory regime of FB is something that leaches out into the rest of the world.

So the problem is the same: a consensually adopted ruleset has implications outside of that space, and we will always have issues of mediating between those within and those outside the community. Play and hanging out with friends are connected. The metaphor, thin though it might be, of platforms demonstrates the connections between WoW and FB.

Beth Noveck
Inverse: government as platform—a popular theme from, among others, Tim O’Reilly.  The state has always been a platform—an organizer of a set of consensual social norms. But institutions/regulators aren’t good at making decisions. They can co-convene meetings; how can they promote innovation in making public life? Platforms are another term for this idea of the hybrid between the institution and the network.

The notion of building sociotechnical platforms was an essential strategy for cultural transformation within government: Data.gov, usaspending.gov, and challenge.gov condition human behavior/effect change using tools that weren’t previously available. Open platforms go from law about transparency to transparency in action; transparency not for its own sake but to use raw data to foster co-creation. National Archives and GPO had printed the Federal Register in the same three-column format since Roosevelt, putting it online only as a 3-column PDF. Some guys downloaded 10 years of raw data and made it searchable, with pictures, making it more transparent and fostering cocreation between citizens and government to make government work better.

Platforms put questions of design front and center: how do we design better collaborative governance platforms? FOIA sets up an adversarial relationship between government and citizens; our legal regimes are not designed for gov’t to be a convening platform around ideas/expertise. Limited range of opportunities for only a few citizens. APA-based rulemaking involves very few people and is very poorly designed—notice and spam (“type here to comment”). Policymaking on Twitter—really good way to get information out, not so good at getting information in.

Strandburg: surveillance is ever more pervasive as we walk around—our footprints are easier to track. Makes the magic circle concept harder to sustain.

Zarsky: Design as tools to set defaults—reminds him of Sunstein and Thaler on Nudging/libertarian paternalism. For the government to manipulate us into making correct choices is really scary. Literature on setting defaults: should we set a default on what people want or on what is normatively correct, and if the latter how will we decide? Even in the supposedly easy case of privacy, if the FTC thinks a set of defaults is normatively preferable, should the government’s opinion determine that? “What people usually want” is also problematic.

Calo: scares him too. Sunstein and Thaler’s response is that if the choice architect would feel comfortable disclosing the manipulation to everyone, there’s no problem, though the principle doesn’t require actual disclosure. His point: he wants disclosures to be consistent with what actually happens on a website. The model we use to convey that information has been written text that no one ever reads or understands. His proposal: let’s put nonlinguistic contracting on the table. If you want people to behave like it’s a church, make it look like a church. That works better than saying “speak softly, no shorts.” Competency: FTC has economists and lawyers who don’t understand human-computer interactions. But it’s still promising, and isn’t the same as government manipulating us.

Zarsky: but if you tell websites “you have to do this,” there’s no such thing as neutral, just bias in a different direction. (Not clear to me why it’s good to leave this to private actors, then.)

Calo: study: a picture of eyes induces people to pay more for coffee than a sign asking them to pay. Power of design to realign expectations with actual practice.

Pasquale: concerned about astroturfing of regs. Virtual workers: proposes that workers should know the projects they’re working on—imagine being asked to identify a person from a photo and you’re actually working for Mubarak.

Hunter: there seems to be a trend about to hit concerning the coopting of our enjoyment in various contexts—we also need to be aware of the way in which we’re being influenced. One way of dealing with Mechanical Turk is being aware of what it’s being used for, but what if it’s still fun? Gamification—use of games in non-game settings. Marketers love this. Same thing will happen with gamification as with privacy, where people are theoretically interested but don’t change their behavior.

Strandburg: this volunteer stuff is great in one sense but then we start to get queasy. Users innovate and then companies find ways to get the users to give them the innovations. The assumed distinction between amateurs and professionals may not hold.

Nissenbaum: when people let you down, the excuses they give usually involve suffering—the reason I missed this deadline was that something unpleasant happened. Rarely do they say “I was having too much fun at the coffee shop to leave.” Fascinating question of what we call work. Leads into questions about cocreation and how technology opens up categories of work that are not suffering. To take UGC and start to call it labor, and to talk about exploitation, starts to bleed categories into one another. Is there a normative reason for counting classes of activities as work, even if people are tricked into having fun?

Cherry: we’ve never really paid people based on hedonic benefit and whether they’re enjoying themselves. There is typically a leisure/work distinction, but it’s not hard and fast. Some kinds of positions can be improved with gamification. Jobs that involve drudgery/data entry, where having it be a contest could be fun. In other instances, that would be stressful/terrible. In general, tricking people is bad, and maybe we need a disclosure norm. But there’s no general problem with work as fun or fun as work. The more we see work being deskilled or paid very badly, though (e.g., developing countries pulled into Mechanical Turk), the more we have to worry. But it’s very hard to tell who’s a volunteer, who’s doing this for fun, who’s doing this to survive.

Citron: Crowdsourcing can be problematic for policy reasons. Gov’t increasingly using third-party sites—you can friend the president on FB—but gov’t isn’t doing a good job with the privacy concerns raised thereby.

Noveck: use of third-party sites is phenomenally positive, given how hard it is to procure new tech in government. First federal government blog was in 2008, and now the Secret Service has a Twitter account (though it shows up as having 0 followers). Privacy protections are different in the contracts the government signs with FB and Twitter; modified cookie policy in place and some limits on the use of analytics. We need more tools: create an environment to encourage asking specific questions and providing specific answers voluntarily; that improves our ability to create/exchange expertise.

Frischmann: is there anything wrongful about free riding on others’ play? As long as there’s no deception. In copyright, we give enough incentive to create and then after that copying/free riding is fine. Maybe the objection is not consequential, but then what is it that makes “exploitation” bad?

Calo: distinguish normative point from technical point: these people are not making minimum wage. If they’re working, then that is a violation of the law.

Gillespie: that’s only if you think of it as work—if you’re walking across the park, you’re not violating the law.

Calo: but no one gives you a dollar an hour to walk across the park.

Cherry: blurs the line between commodification of labor and noncommodification. FB asked its users to translate the site into other languages; most sites would pay for translation. A lot of volunteers objected—this is the kind of thing you should pay for and that we don’t want to volunteer for. Minimum wage: we worry about wages being undercut. We know how to handle the factory, but not hybrid work/leisure.

Hunter: most recent criticism comes from Ian Bogost on Exploitationware. He’s not in agreement that gamification is bad, though he does have a negative reaction to FB’s privacy policies; in both cases individuals experience more of the gains than the costs.

Strandburg: we have a lot of norms about when monetization is ok (Viviana Zelizer!)—think SSRN. OK to advertise, but she didn’t like it when they decided to sell bound copies. We don’t have good theories of this because it was all handled by social norms before. No one feels it’s terrible that a nightclub profits because we enjoy going out, but it feels different on an online platform.

Lastowka: “if value, then property” concept—if you’re the one providing the value then you feel shut out.

Gillespie: if people are comfortable with the bargain, then the norm will settle. Different theories of why we pay: drudgery in the factory satisfies all the reasons we might pay someone—it’s hard, it produces value, people need to live. But other cases are different because they force disaggregation: do we pay because it’s the right thing to do, for their time, because it produces value, or some other reason/some combination?

(I feel weird about this conversation since no one asks very well-paid big firm lawyers and law professors, not to mention hedge fund managers and other Wall Street types, to give up money given how fun many parts of our jobs can be. Instead we consider our payments to reflect our value, which we often take to be inherent. But we as a society are apparently willing to tell teachers, along with the people we’ve been talking about here, that they should be content with the moral satisfaction/fun of their jobs. This seems to me to be part of the greater acceptance of inequality that is so pervasive now. The commodification forehand—if this were valuable, you’d get paid—and the commodification backhand—if you’re not getting paid, you must not be valuable—go together.)

Hunter: the market sets the price within a capitalist system. (Well, that and the capture of the tax system by people who succeed in getting their compensation taxed at the lower capital gains rate; and the people who succeed in getting government bailouts so that the big banks can record record profits and pay out record bonuses; and so on.) Here, the excess money goes to the shareholders if you save $1 million not hiring staff for call centers since people will provide customer service for Verizon for free. CAPTCHAs: ID yourself as a human being, but also process information for OCR.
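Hunter’s CAPTCHA point can be made concrete with a sketch in the style of reCAPTCHA: pair a control word whose answer is known with a word OCR couldn’t read. The control word verifies you’re human; your transcription of the unknown word is harvested as free digitization labor. The function and word choices below are hypothetical, purely to illustrate the dual use.

```python
# Sketch of a dual-use CAPTCHA (reCAPTCHA-style): grade the user on a
# word with a known answer, and if they pass, keep their transcription
# of a word OCR failed on. Names and data are illustrative.

def grade_captcha(control_answer, user_control, user_unknown, harvested):
    """Pass/fail on the known word; on a pass, record the unknown word."""
    if user_control.strip().lower() != control_answer.lower():
        return False                        # failed the human test
    harvested.append(user_unknown.strip())  # free OCR transcription
    return True

harvested = []
ok = grade_captcha("overlooks", "overlooks", "inquisition", harvested)
# ok is True and harvested == ["inquisition"]
```

The user pays with a few seconds of transcription work, mirroring the point above: identifying yourself as human doubles as unpaid information processing.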

Noveck: we need more microdemocracy: something for everyone to do. In 2008, you could engage by driving someone to the polls, blogging, etc. But after the election, opportunities to participate just don’t exist. Challenge.gov: intended to give people ways to participate other than responding to a rulemaking—reach out to wider skills/interests. A platform for agencies to list problems, ask questions.
