Cognitive Systems Research xxx (2008) xxx–xxx www.elsevier.com/locate/cogsys
Book review
Artificial intelligence & religion
Action editor: Stefan Wermter
M. Afzal Upal
Cognitive Science, Occidental College, 1600 Campus Road, Los Angeles, CA 90042, United States
Received 27 January 2008; accepted 29 January 2008
E-mail address: [email protected]

In his book, "God from the Machine: Artificial Intelligence Models of Religious Cognition," William Sims Bainbridge invites his readers on a guided pilgrimage to Cyburg, population 44,100. He promises them that through this experience they will learn about "outreach strategies, religious conversion, ways that faith may limit deviant behavior, competition between denominations, and most importantly, religious belief" (Bainbridge, 2006, p. 6). In many ways, Bainbridge appears to be an ideal guide to have as one tries to navigate the mysterious maze of religious beliefs. Trained as a sociologist of religion, he is well known for his in-depth case studies of new religious movements documented in Satan's Power (1978) and The Endtime Family (2002), as well as for his theoretical work with Rodney Stark on formulating a version of the rational choice theory of religion laid out in A Theory of Religion (1987). You may think that the fact that Cyburg is not a real city makes the trip there less enlightening than a trip to Jerusalem, but Bainbridge argues otherwise. Unlike the real world, virtual worlds offer a number of possibilities that have not been explored by scientists interested in religion: the ability to run controlled experiments by systematically varying different factors and studying their impact on the society, the ability to see inside the heads of believers to find out what they really believe, the ability to go forward or backward in time, and the ability to speed up, slow down, or even stop the flow of time to better understand the creation and growth of religious beliefs. Even the population size of 44,100 is a good choice because it is "divisible by all integers 1 through 7, plus 9, 10, 12, 14, 15, 20, and 25. Also 44,100 is a perfect square." Divisibility by various numbers is important because it allows us to place a "number of different denominations of exactly equal size" on a square 210 × 210 board. Furthermore:

    The population is big enough that the results of a single run are not generally dominated by random factors, unless some interesting social processes give chance some leverage. Yet, it is small enough that today's computers can process many steps of each simulation in the blink of an eye. Cyburg is in the population range of several real cities that were the focus of classic social-scientific community studies. (Bainbridge, 2006, p. 13)

The relatively short book (179 pages) is divided into eight chapters of roughly equal length. Chapter 1 introduces the simulation setup and Chapters 2–8 populate Cyburg with inhabitants of different characteristics. Each chapter describes a number of simulation runs obtained by varying a number of parameters to address a particular theme. The themes addressed include segregation (Chapter 2), recruitment (Chapter 3), social networks (Chapter 4), trust (Chapter 5), co-operation (Chapter 6), belief formation (Chapter 7), and belief propagation (Chapter 8).

Chapter 2 recasts Schelling's (1969) segregation model by naming the inhabitants Protestants and Catholics instead of calling them red and green. As in Schelling's original formulation, the agents are randomly placed on the board and are allowed to move to another location if they are unhappy with their current location. Bainbridge walks his readers through various simulation runs. In the first run, Cyburg's population is initialized to two equally sized communities of 14,700 agents each. Agents are unhappy if a majority of their neighbors do not belong to the same sect as they do. At the start, roughly half of the near neighbors belong to the same denomination, for a segregation level of 50%. After 21 simulation steps, when all agents have become satisfied and stop moving, the segregation level increases to nearly 80%. In the second run, the happiness rule is changed so that agents are happy only if all of their neighbors are of the same sect as they are. This time the agents stop moving after only 12 steps, and the segregation level increases to only 54%. Bainbridge explains that the reason for this "ironic" result is "gridlock. At the end, there were still 14,700 available homes but each one had at least one Protestant and one Catholic living next door, so nobody was willing to move into it" (Bainbridge, 2006, p. 26). The third run changes the happiness rule after every 10 steps, which results in complete segregation after 75 steps. This, Bainbridge says, illustrates the notion of path dependence, "the phenomenon when the nature of the outcome depends very much on what route was taken to arrive at it" (p. 28). The fourth run of Chapter 2 increases the number of denominations from 2 to 3 (named Baptists, Catholics, and Methodists), then from 3 to 4, from 4 to 6, and finally from 6 to 10, while keeping the happiness rule the same as in the first run. Each simulation results in a significant increase in segregation from the starting random values. At the end of Chapter 2, Bainbridge sums up the main point as being that "religious intolerance may really be increased (if not actually caused) by processes of interaction in which the relatively moderate desires of individuals produce consequences that none of them might have chosen consciously" (p. 35). This point, like the simulation setup itself, seems to be a trivial restatement of Schelling's 40-year-old work in new religious garb, without any new insight.
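To make the happiness-and-move rule concrete, here is a minimal Python sketch of a Schelling-style run of the kind described above. It is my own illustration, not Bainbridge's code: the scaled-down board, the one-third vacancy rate, the Moore neighborhood, and all identifiers are assumptions made for the example.

```python
import random

SIZE = 21                     # a scaled-down board; Cyburg's is 210 x 210
SECTS = ("Protestant", "Catholic")
EMPTY = None

# Roughly a third of the cells are left empty so that unhappy agents have somewhere to go.
grid = [[random.choice(SECTS + (EMPTY,)) for _ in range(SIZE)] for _ in range(SIZE)]

def neighbors(r, c):
    """Occupants of the (up to eight) surrounding cells, clipped at the board edges."""
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) != (0, 0) and 0 <= r + dr < SIZE and 0 <= c + dc < SIZE:
                yield grid[r + dr][c + dc]

def happy(r, c):
    """First-run rule: happy unless a majority of occupied neighbors belong to the other sect."""
    sect = grid[r][c]
    occupied = [n for n in neighbors(r, c) if n is not EMPTY]
    return not occupied or 2 * sum(n == sect for n in occupied) >= len(occupied)

def step():
    """Move every unhappy agent to a random empty cell; return the number of moves made."""
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is EMPTY]
    moves = 0
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] is not EMPTY and not happy(r, c) and empties:
                er, ec = empties.pop(random.randrange(len(empties)))
                grid[er][ec], grid[r][c] = grid[r][c], EMPTY
                empties.append((r, c))
                moves += 1
    return moves

steps = 0
while step() and steps < 200:
    steps += 1
print(f"All agents settled (or step cap reached) after {steps} steps")
```

Tightening happy() so that every occupied neighbor must share the agent's sect reproduces the "gridlock" behaviour of the second run.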
Chapter 3 simulates Rogers's innovation diffusion framework (Rogers, 1962) by populating Cyburg with 22,050 agents (this time called Unitarians) and then adding one agent (called a Mormon). A Unitarian converts to Mormonism if a critical number of its neighbors are Mormons. The critical number is set to 1 in the first run of the simulation and varied to a majority in a subsequent run, which also contained four denominations (Baptists, Methodists, Lutherans, and Episcopalians). The summation section tells us that the point of these simulations was to show "that an extremely simple model of recruitment to a growing social movement can produce the normal curve of recruitment rates... The dynamic geometry of diffusion alone can explain much about the shape of movement growth" (p. 56).

Chapter 4 redresses graph/social network theory in religious garb, Chapter 5 does the same thing with multiagent models of trust (Ramchurn, Huynh, & Jennings, 2004), and Chapter 6 presents Axelrod's (1984) work on co-operation with religious labels. Chapters 7 and 8 finally appear to provide new content, which may come as welcome relief to readers enticed by promises of artificial intelligence models of religious cognition.

Arguments can be made for relabeling the classical agent-based social simulation work in religious garb and presenting it in one book. One of Bainbridge's stated purposes for this book is to introduce the agent-based social simulation methodology to sociologists of religion and religious studies researchers. But one wonders: would that purpose not be better served by introducing these researchers to a standard agent-based social simulation package such as NetLogo (Wilensky, 1999) or Repast (North, Tatara, & Ozik, 2007) that would allow them to more easily explore Bainbridge's models? Bainbridge also argues that much about religion can be explained in terms of ordinary social factors such as social networks, social bonds, and group dynamics. This partly explains why he devotes the vast majority of the book to social group processes not usually considered to be the central focus of religion. For most people, including, it appears, Stark and Bainbridge (1987), religious beliefs and their relationship with other cultural and common-sense beliefs form the core of religion. Perhaps wanting to save the best for last, Bainbridge leaves belief formation and communication for the final two chapters of the book.

Chapter 7 is a modified version of Bainbridge (1995), which describes an agent-based simulation with each agent powered by a simple neural net. Each agent desires five commodities (called energy, water, food, oxygen, and life). Agents are randomly assigned these resources at the start and lose 1 unit of each resource in every simulation round. Agents are divided into four groups depending on the kind of commodity they can produce. Water and food producers need 1 unit of energy to make 2 units of water or food. Oxygen producers need 1 unit each of water and energy to make 2 units of oxygen. Energy producers are the only ones who do not need anything to produce energy. No one can produce life. Agents decide what to seek in each round by seeing what they lack the most. Having decided what to seek, they also decide which group of agents to seek it from. The learning task is to figure out how many groups there are (num_groups) and to learn the associations between resources and their producer groups. However, even when some agents come to believe that there are fewer than four groups (i.e., num_groups = 1, 2, or 3), they are still programmed to try to trade with agents from the remaining 4 − num_groups groups. It turns out that since no actual agent can produce life, life production comes to be associated with these phantom groups, which Bainbridge labels supernatural.
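The mechanism can be illustrated with a deliberately crude sketch. This is not Bainbridge's neural-net model: the reinforcement-style update, the single phantom partner slot, and all names below are my own simplifications, intended only to show how an option that is never disconfirmed can win by default.

```python
import random

# Five desired resources; four real producer groups; one "phantom" partner slot that
# corresponds to no actual group but that the agent is still willing to ask.
RESOURCES = ["energy", "water", "food", "oxygen", "life"]
PRODUCER = {"energy": 0, "water": 1, "food": 2, "oxygen": 3}   # no group produces life
PARTNERS = [0, 1, 2, 3, "phantom"]

# weight[r][p]: the agent's learned confidence that partner p supplies resource r.
weight = {r: {p: 1.0 for p in PARTNERS} for r in RESOURCES}

for _ in range(5000):
    r = random.choice(RESOURCES)
    p = random.choice(PARTNERS)
    if p == "phantom":
        continue                              # nobody answers, so the belief is never disconfirmed
    delta = 0.1 if PRODUCER.get(r) == p else -0.1   # real partners either deliver or visibly fail
    weight[r][p] = max(0.0, weight[r][p] + delta)

for r in RESOURCES:
    best = max(PARTNERS, key=lambda p: weight[r][p])
    print(f"{r:>6} is believed to come from partner {best}")
```

Ordinary resources end up attributed to their real producers, while "life", which every real partner visibly fails to deliver, ends up attributed to the phantom partner, mirroring the result described above.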
This simulation, according to Bainbridge, explains how people come to form religious beliefs. Religious beliefs are special algorithms for obtaining rewards, such as eternal life, that cannot be obtained. What makes these algorithms special is that they cannot be confirmed or disconfirmed; e.g., one would have to die to confirm or disconfirm the belief in life after death. This theory, originally laid out in Stark and Bainbridge (1987), may or may not be correct. The problem is that the simulation does not seem to tell us anything we did not know before the simulation. Agent-based social simulations are useful tools for social theory development when they result in the emergence of results that could not have been foreseen by the researchers before spending considerable time and effort to design and run simulations and to analyze the results.
The problem with all the simulations presented here is that they do not tell us anything we did not know before. Agents in the simulations of Chapter 8 were specifically programmed to favor theories about the presence of phantom groups, and the fact that they exhibit this bias does not tell us anything new about Bainbridge's theory of religion. Similarly, relabeling red and green agents Catholics and Protestants and rerunning Schelling's simulations does not tell us anything that we have not known for 40 years.

There is, however, another, deeper problem with using traditional agent-based social simulation systems to simulate religious phenomena. Traditional agent-based social simulation systems are designed according to the keep-it-as-simple-as-possible principle. The idea is that if complex social patterns can emerge from a simulation employing agents with simple decision-making and agent-interaction rules and extremely limited memory (e.g., 1 or 2 bits), then it is easy to compute the causal links between the micro-level cognitive processes and macro-level social patterns. The problem is that what makes religious beliefs interesting and religious is the very fact that they are richly connected with other religious and non-religious beliefs. Such richly connected beliefs cannot emerge from a society of agents whose memory capacity is limited to one bit (Doran, 1998; Epstein, 2001). A reformulation of traditional agent-based social simulation approaches is needed to allow us to model complex cultural phenomena such as the formation and propagation of religious beliefs (Sun, 2006; Upal, 2007; Upal & Sun, 2006, 2007). In order for complex shared beliefs to emerge at the societal level, individual agents need to be able to represent such beliefs and to acquire and modify them. To design predictive computational models, we need to design agents that can model the cognitive processes of information comprehension, information integration/belief revision, and communication.

My students and I have designed one such multiagent society called CCI (Communicating, Comprehending, and Integrating agents) and embedded it into a multiagent version of Russell and Norvig's (1995) Wumpus World domain (MWW). As shown in Fig. 1, MWW is an N × N board game with a number of wumpuses and treasures randomly placed in various cells. Wumpuses emit stench and treasures glitter. Stench and glitter can be sensed in the horizontal and vertical neighbors of the cell containing a wumpus or a treasure. Once the world is created, its configuration remains unchanged, i.e., the wumpuses and treasures remain where they are throughout the duration of the game.

[Fig. 1. A 10 × 10 version of the Multiagent Wumpus World (MWW) domain. This version has 10 agents, 10 wumpuses, and 10 treasures randomly placed on the board.]

MWW is inhabited by a number of agents randomly placed in various cells at the start of the simulation. The MWW agents have a causal model of their environment. They know that stench is caused by the presence of a wumpus in a neighboring cell, while glitter is caused by the presence of a treasure in a neighboring cell. Agents sense their environment and explain each stimulus they observe. While causes (such as wumpuses and treasures) explain themselves, effects (such as stench and glitter) do not. The occurrence of effects can only be explained by the occurrence of causes that could have produced the observed effects; e.g., glitter can be explained by the presence of a treasure in a neighboring cell, while stench can be explained by the presence of a wumpus in a neighboring cell. An observed effect, however, could have been caused by many unobserved causes; e.g., the stench in cell (2, 2) could be explained by the presence of a wumpus in any of the four cells (1, 2), (3, 2), (2, 1), or (2, 3).
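A minimal sketch of this abductive step (my own illustration, not the actual CCI implementation; the function names are invented) simply enumerates the neighboring cells whose occupant could have caused the observed effect:

```python
from typing import List, Tuple

Cell = Tuple[int, int]
CAUSE_OF = {"stench": "wumpus", "glitter": "treasure"}   # effect -> kind of cause

def neighbors(cell: Cell, n: int) -> List[Cell]:
    """Horizontally and vertically adjacent cells on an n x n board (1-indexed)."""
    r, c = cell
    candidates = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return [(x, y) for x, y in candidates if 1 <= x <= n and 1 <= y <= n]

def explanations(effect: str, cell: Cell, n: int) -> List[Tuple[str, Cell]]:
    """One candidate explanation per neighboring cell that could hold the cause."""
    cause = CAUSE_OF[effect]
    return [(cause, nb) for nb in neighbors(cell, n)]

# The example from the text: stench in cell (2, 2) on a 10 x 10 board is explained by
# a wumpus in (1, 2), (3, 2), (2, 1), or (2, 3).
print(explanations("stench", (2, 2), 10))
```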
Agents store their world model (observations and past explanations) in their memory. In each simulation round, an agent has to decide whether to take an action or to stay motionless. Possible actions include the physical action of moving to a vertically or horizontally adjacent cell, and the communication actions of understanding a message or of sending a message to a nearby agent to request information that the current agent does not have. The MWW agents are goal-directed agents that aim to visit all treasure cells on the board while avoiding wumpuses. Agents create a plan to visit all the treasure cells they know about; the plan must not include any cells that contain wumpuses. Unlike Bainbridge's agents, CCI-MWW agents are not designed to favor beliefs in phantom agents. They are also capable of having richly connected beliefs: a belief about the presence of a wumpus in a cell is connected to beliefs about stench in the neighboring cells.
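How such a goal-directed agent might pick its next physical move can be sketched as a shortest-path search over its current beliefs. Again, this is a schematic stand-in under my own assumptions (a plain breadth-first search with invented names), not the CCI planner:

```python
from collections import deque
from typing import Dict, Optional, Set, Tuple

Cell = Tuple[int, int]

def next_move(pos: Cell, goals: Set[Cell], blocked: Set[Cell], n: int) -> Optional[Cell]:
    """First step of a shortest path from pos to any believed-treasure cell in goals,
    never entering a believed-wumpus cell in blocked; None means stay motionless."""
    if pos in goals:
        return None                           # already standing on a believed treasure cell
    frontier = deque([pos])
    parent: Dict[Cell, Cell] = {pos: pos}
    while frontier:
        cell = frontier.popleft()
        if cell in goals:
            while parent[cell] != pos:        # walk back to the cell adjacent to pos
                cell = parent[cell]
            return cell
        r, c = cell
        for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 1 <= nb[0] <= n and 1 <= nb[1] <= n and nb not in blocked and nb not in parent:
                parent[nb] = cell
                frontier.append(nb)
    return None

# An agent at (5, 5) that believes there is a treasure at (2, 2) and a wumpus at (4, 5)
# plans its way around the believed wumpus cell one step at a time.
print(next_move((5, 5), {(2, 2)}, {(4, 5)}, 10))
```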
Even so, our experiments with a version of the society in which we disabled communication revealed that the patterns of false beliefs that emerge in such a society have a particular structure: agents are more likely to have false beliefs about wumpuses than about treasures (Upal, 2007). The reason appears to be that hypotheses about the presence or absence of wumpuses are harder for the agents to confirm or disconfirm than hypotheses about the presence or absence of treasures, because agents seek out the cells where they believe treasures lie but avoid the cells where they believe wumpuses live. This is exactly what Stark and Bainbridge (1987) argued: hypotheses that are harder to confirm or disconfirm are more likely to be believed. Furthermore, our subsequent experiments have shown that this pattern continues to hold when agents are allowed to communicate with other agents. I believe that this provides some vindication not only for Stark and Bainbridge (1987) but also for Bainbridge (2006), who convincingly argues that artificial-intelligence-based approaches are useful, indeed needed, to develop artificial worlds that can serve as testbeds for theories of religion. I believe that artificial intelligence researchers, having had to deal with complex knowledge representation and reasoning issues, are uniquely qualified to address the challenges involved in modeling the information comprehension, integration, and communication processes so central to understanding how religious knowledge gets created, modified, and diffused. Creating such artificial intelligence models of religious cognition, however, will require a genuine long-term collaboration between artificial intelligence researchers (in particular those interested in multiagent systems) and social scientists, a collaboration that is still in its infancy (Sun, 2006). Without such an effort, the great potential of computer simulation for the social science of religion that Bainbridge so eloquently argues for will remain just that: unfulfilled potential.

References

Axelrod, R. (1984). The evolution of cooperation. New York: Basic Books.
Bainbridge, W. S. (1978). Satan's power: A deviant psychotherapy cult. Berkeley, CA: University of California Press.
Bainbridge, W. S. (1995). Neural network models of religious belief. Sociological Perspectives, 38, 483–495.
Bainbridge, W. S. (2002). The endtime family: Children of God. Albany, NY: State University of New York Press.
Bainbridge, W. S. (2006). God from the machine: Artificial intelligence models of religious cognition. Cognitive science of religion series. Lanham, MD: AltaMira Press.
Doran, J. (1998). Simulating collective misbelief. Journal of Artificial Societies and Social Simulation, 1(1).
Epstein, J. (2001). Learning to be thoughtless: Social norms and individual computation. Computational Economics, 18(1), 9–24.
North, M. J., Tatara, E., Collier, N., & Ozik, J. (2007). Visual agent-based model development with Repast Simphony. In Proceedings of the agent 2007 conference on complex interaction and social emergence. Argonne, IL: Argonne National Laboratory.
Ramchurn, S. D., Huynh, T. D., & Jennings, N. R. (2004). Trust in multiagent systems. The Knowledge Engineering Review, 19(1), 1–25.
Rogers, E. M. (1962). Diffusion of innovations. New York: Free Press of Glencoe.
Russell, S., & Norvig, P. (1995). Artificial intelligence: A modern approach. Englewood Cliffs, NJ: Prentice Hall.
Schelling, T. (1969). Models of segregation. RAND.
Stark, R., & Bainbridge, W. S. (1987). A theory of religion. New York: P. Lang.
Sun, R. (2006). Cognition and multi-agent interaction: From cognitive modeling to social simulation. New York: Cambridge University Press.
Upal, M. A. (2007). The structure of false social beliefs. In Proceedings of the first IEEE symposium on artificial life (pp. 282–286). IEEE Press.
Upal, M. A., & Sama, R. (2007). Effect of communication on the distribution of false social beliefs. In Proceedings of the international conference on cognitive modeling (pp. 151–156). Oxford, UK: Taylor & Francis.
Upal, M. A., & Sun, R. (2006). Cognitive modeling and agent-based social simulation: Notes of the AAAI workshop on cognitive modeling and agent-based social simulation. Menlo Park, CA: AAAI Press.
Wilensky, U. (1999). NetLogo. Center for Connected Learning and Computer-Based Modeling. Evanston, IL: Northwestern University.