On a simulated day in July of a 2020 that didn't happen, 500 chatbots read the news - real news, our news, from the real July 1, 2020. The New York Times had a story about the baseball season being canceled because of the pandemic. On CNN, President Donald Trump called Black Lives Matter a "symbol of hate." ABC News reported that Alabama students were throwing "COVID parties." Then the 500 robots logged into something very much (but not totally) like Twitter, and discussed what they had read. Meanwhile, in our world, the not-simulated world, a bunch of scientists were watching.

The scientists had used ChatGPT 3.5 to build the bots for a very specific purpose: to study how to create a better social network - a less polarized, less caustic bath of assholery than our current platforms. They had created a model of a social network in a lab - a Twitter in a bottle, as it were - in the hopes of learning how to create a better Twitter in the real world. "Is there a way to promote interaction across the partisan divide without driving toxicity and incivility?" wondered Petter Törnberg, the computer scientist who led the experiment.

It's difficult to model something like Twitter - or to do any kind of science, really - using actual humans. People are hard to wrangle, and the setup costs for human experimentation are considerable. AI bots, on the other hand, will do whatever you tell them to, practically for free. And their whole deal is that they are designed to act like people. So researchers are starting to use chatbots as fake people from whom they can extract data about real people. "If you want to model public discourse or interaction, you need more sophisticated models of human behavior," says Törnberg, an assistant professor at the Institute for Logic, Language, and Computation at the University of Amsterdam. "And then large language models come along, and they're precisely that - a model of a person having a conversation."

By replacing people as the subjects in scientific experiments, AI could conceivably turbocharge our understanding of human behavior in a wide range of fields, from public health and epidemiology to economics and sociology. Artificial intelligence, it turns out, might offer us real intelligence about ourselves.

Törnberg isn't the first to build a social network in a lab. In 2006, in a pioneering work of what would come to be known as "computational social science," researchers at Columbia University built an entire social network to study how 14,000 human users shared and rated music.