Conference Paper Possible Lessons from a Recent Technology (Nuclear) for an Emerging (Ubiquitous Embedded Systems) Technology*

David J. LePoire Environmental Assessment Division Argonne National Laboratory ([email protected])

for submittal to Seventh Annual Ethics & Technology Conference June 25-26, 2004 Loyola University, Chicago www.ethicstechconference.org

The submitted manuscript has been created by the University of Chicago as Operator of Argonne National Laboratory (“Argonne”) under Contract No. W-31-109-ENG-38 with the U.S. Department of Energy. The U.S. Government retains for itself, and others acting on its behalf, a paid-up, nonexclusive, irrevocable worldwide license in said article to reproduce, prepare derivative works, distribute copies to the public, and perform publicly and display publicly, by or on behalf of the Government.

* Work supported by the U.S. Department of Energy under Contract No. W-31-109-ENG-38.


Abstract

Information Technology (IT) has ushered in not only large societal opportunities but also large uncertainties and risks. Society may soon face further developments, such as ubiquitous networked embedded systems. Such technologies offer even larger opportunities and uncertainties because their small, inexpensive, and ubiquitous character lets them distribute power widely. Many interpretations of how these technologies may develop have been postulated, ranging from the conservative Precautionary Principle to uncontrolled development leading to a "singularity." With so much uncertainty and so many predictions about the benefits and consequences of these technologies, it is important to raise ethical questions, determine potential scenarios, and try to identify appropriate decision points and stakeholders. Rather than proceeding along an unknown path, perhaps lessons could be learned from recently deployed technologies, such as nuclear technology, that were controversial but offered similarly large potential benefits and risks. The experience of nuclear technology development, with its various successes and failures, is recalled and compared with potential scenarios in the development of networked embedded systems.

Introduction

Information Technology (IT) has ushered in not only large societal opportunities but also large uncertainties and risks (Berlind, 2003). Society may soon face further developments, such as ubiquitous networked embedded systems. Such technologies offer even larger opportunities and uncertainties because their small, inexpensive, and ubiquitous character lets them distribute power widely. One way of characterizing the new technologies is Genetics, Robotics, Artificial Intelligence, and Nanotechnology (GRAIN) (Mulhall, 2002). Networked embedded technology might be considered an extension of information technology that deals with large networks of small computing devices. As the GRAIN technologies converge, embedded systems probably will utilize artificial intelligence techniques on a nanotechnology scale, with potential connections to human brains. Steps in the development of nano-computers are already underway; for example, in 2001 IBM created a self-assembled carbon nanotube that formed a simple logic circuit (Chang, 2001). Embedded computers are rapidly entering many commercial products. For example, cars have over 40 embedded computers that monitor information and act on that information


to maintain efficiency. Toll booth collection devices allow rapid wireless integration of the car's transponder, the toll booth ID system, and connections to the banks to refresh the current balance. Radio frequency identification tags are being placed on more products in stores (Want, 2004). Their use has already raised controversies that prompted some manufacturers to modify their deployment. Further applications are being explored for monitoring and decision support in the natural environment (Estrin, 2003), human environments (Kang, 2003), and in individual humans (Kurzweil, 2000 and 2001). Their expected ubiquity and low costs promise large potential benefits in improving safety and security and in making more efficient use of resources. For example, various detectors in a power plant can monitor conditions in the complex of piping, pumps, and turbines and give early indications of potential problems. If these devices are networked, the communications could be used to tweak local conditions to achieve higher efficiencies or reduce chances for accidents. Pervasive computing, the use of ubiquitous small networked and embedded computing devices in the human environment, leads to larger concerns as humans, their property, or their transactions are monitored and the data used in adjusting conditions or making decisions. Such systems are already embedded under roads to monitor traffic conditions at an intersection, allowing a handful of networked transponders to improve the traffic flow. However, the cars are not identified for tracking. Individuals already leave identification "tracks" for beneficial reasons when they use their credit cards, make phone calls, send email, or check out library books. Some Web sites and stores ask to use this information to better serve the individual. The government has been interested in collecting, classifying, and searching similar data.
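The power-plant example above, in which networked detectors tweak local conditions, is essentially a negative-feedback loop. A minimal sketch follows; the sensor names, readings, and the proportional adjustment rule are all hypothetical, chosen only to illustrate the idea:

```python
# Minimal sketch of networked sensors nudging local conditions toward a
# shared setpoint. All sensor names, readings, and the gain value are
# hypothetical, chosen only to illustrate the negative-feedback idea.

def adjust_toward_setpoint(readings, setpoint, gain=0.5):
    """Return a correction for each sensor, proportional to its
    deviation from the setpoint (classic proportional control)."""
    return {name: gain * (setpoint - value) for name, value in readings.items()}

readings = {"pump_1": 102.0, "pump_2": 97.0, "turbine": 100.0}  # e.g., pressures
corrections = adjust_toward_setpoint(readings, setpoint=100.0)
# pump_1 reads high, so its correction is negative; pump_2 reads low, so positive.
```

Real plant control involves far more state and safety interlocks; the point is only that cheap networked readings make such coordinated corrections possible.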
One recent high-profile project, however, the Total (or Terrorism) Information Awareness research, met heavy public criticism that led to Congressional action to stop funding for the project in fiscal 2004. Some ethical issues derive from the contrasting predicted consequences of potential development. Some foresee the technologies offering the potential to solve major problems for sustainability, welfare, and safety; others foresee the technologies' potential for irreversibly changing society for the worse. The first group would suggest that it is unethical not to develop these technologies as fast as possible to relieve problems. The second group claims it is unethical to proceed at a faster pace than solid scientific evidence concerning their safety permits. The large stakes involved are highlighted in the book "Our Final Hour" by Martin Rees (Rees, 2003), where he estimates that the human race has a 50-50 probability of surviving through the century during the deployment of these technologies. This paper suggests that there might be lessons from the introduction of previous controversial technologies, such as nuclear technologies, that might help in understanding these risks and problems. To develop these lessons, a general approach to identifying new technology development characteristics is discussed. The risks and consequences of the new technology depend on the development rate and control. The possible scenarios corresponding to fast, medium, and slow rates are explored, along with monitoring and control approaches that might encourage them. Next the course of


nuclear technology development is placed in this framework of characteristics, risks, and control. Finally, possible lessons are identified on the basis of a comparison of the nuclear technology development path with the development paths of potential new technologies.

Risks and Uncertainties

The introduction of a new technology into a social system raises many questions concerning its acceptance, potential development rate, and control: What characteristics of the new technology should be investigated before acceptance and integration? What are the possible development rates and consequences? What are the possible ways of monitoring, controlling, and handling the new technology? An analogy might be helpful in identifying characteristics a new technology should have before acceptance. A technology's introduction into a social system is somewhat analogous to a new invasive species entering an ecosystem. Both the social system and the ecosystem have developed complex relationships among a set of participants. In an ecosystem, these participants are the organisms forming symbiotic and predator-prey relationships. In societies, the participants are humans, their institutions, and technologies. After studying functioning ecosystems and comparing them with human institutions, Dryzek (1987) found criteria for success: efficiency (composed of negative feedback with coordination), robustness (or flexibility), and resilience. According to these criteria, the appropriate questions are: Are there techniques to determine what applications the technology is best at solving? Will the technology still be useful after adaptation to new cultural situations? Will the society become dependent upon the technology? Has the technology been tested on a social evolution timescale? Sometimes efficiency, resilience, and robustness can be conflicting goals. For example, too much efficiency might leave the system in a frozen state that is brittle like a solid. Robustness, or flexibility, is the ability to gracefully respond to moderate external changes, analogous to modeling clay. However, sometimes systems encounter large episodic events that temporarily take the system far from its normal dynamic equilibrium.
Resilience, the ability to spring back from these events, is not found in the clay analogy. What are the possible development rates and risks? Many interpretations of how these technologies may develop have been explored. Some have extrapolated the trends in IT progress and predict a future that continues to rapidly change, including the surpassing of human intellect by machines (Kurzweil, 2001; Vinge, 1993). Others argue that technology is not the main issue; instead, it is the social infrastructures that need to catch up (McKibben, 2003). Another interpretation of IT progress, which includes a combined analysis of technology, culture, and evolution, leads to a prediction that we are at an inflection point in a large unique learning curve. At this critical stage, change occurs at its fastest pace but soon slows down (Modis, 2002).
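Modis's learning-curve interpretation relies on a mathematical property of logistic growth: the rate of change is greatest at the curve's inflection point and slows symmetrically afterward. A small numerical check of that property, with an arbitrary midpoint and steepness:

```python
import math

def logistic(t, midpoint=0.0, steepness=1.0):
    """Standard logistic curve rising from 0 to 1; parameters are arbitrary."""
    return 1.0 / (1.0 + math.exp(-steepness * (t - midpoint)))

# Sample the growth rate by centered finite differences over t in [-5, 5].
ts = [i * 0.1 for i in range(-50, 51)]
rates = [(logistic(t + 0.05) - logistic(t - 0.05)) / 0.1 for t in ts]
peak_t = ts[rates.index(max(rates))]
# The fastest change occurs at the inflection point (t = 0 here); beyond
# it, the same curve that felt like runaway growth begins to level off.
```

To an observer positioned near the inflection point, the recent past looks like accelerating change even though the curve as a whole eventually saturates, which is the crux of Modis's argument.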


The most radical predictions of the future come from Ray Kurzweil (Kurzweil, 2001), who suggests that by 2030 advances in embedded systems will allow direct brain nanobot implants to integrate with the human brain's memory, intelligence, and communication. He also speculates that the nanobots might eventually perform brain scans: at first to understand the brain, but later to "download" it for possible replication. The way technological development paths are monitored and controlled might relate to the development rate. For example, uncontrolled development might encourage a path more like the scenario postulated by Kurzweil; a precautionary approach might encourage McKibben's scenario; and a moderate approach might be compatible with the learning curve path discussed by Modis. What are the possible approaches to monitoring and handling new technologies that carry so much risk and uncertainty? At one extreme, some advocate that research should proceed with great caution to ensure that the technologies are not released for general use until the risks are shown to be acceptable, with the burden of proof on the proponents of the new technology. This approach is known as the Precautionary Principle (Ag BioTech InfoNet, 2003; Appell, 2003), as articulated by the Science and Environmental Health Network (SEHN), a consortium of environmental groups. A more biosystems-oriented approach, in contrast to traditional risk assessment methods (Haimes, 1998), is being advocated as a way to proceed under the Precautionary Principle (Raffensberger, 1997). At the other end of the spectrum, the default practice in the United States is to allow technology development unless explicitly declared otherwise. The risks are borne by the technology developers through the legal system, via lawsuits, if aspects of the technology prove harmful, as was the case with asbestos.
In the middle are approaches that make conservative assumptions but still allow limited development. These approaches have been applied to a series of nuclear technology applications in power generation, medicine, nondestructive testing, and agriculture. A more recent approach considers many future scenarios constructed from combinations of identified characteristics. This leads to possible technological and social acceptance paths in which future decision points and options can be identified as the uncertainty is reduced (Interlaboratory Working Group, 2000). New approaches continue to be developed for considering new technologies, including analogies to ecosystems (Mulhall, 2003) and incorporation of social constructions of reality (Slaughter, 2003).

Development Path of Nuclear Technologies

Rather than going along a completely unknown development path for a new technology, perhaps lessons could be learned from recently deployed technologies that were controversial but offered similarly large potential benefits and risks. An example is nuclear technology, which has experienced a variety of successes and failures. Despite decades of deployment and continued debate, there are still large uncertainties about its future role in society.


The comparison might be tightened by considering some similar characteristics of nuclear technologies and potential nanotechnology networked embedded systems, including their transparency to human senses (radiation and nanobots cannot be felt); their unique identifiers (radiation and network); their lifetimes (difficulty in altering the decay of radionuclides and possibly nanobots); their hazards being independent of size; and the ability of special systems to exhibit uncontrolled positive feedback (nuclear weapons and meltdown; nanotechnology's "grey goo" scenario of uncontrolled replication). The review of the nuclear technology development path will start with a short history. This is followed by a discussion of the exhibited efficiency, robustness, and resilience characteristics. Finally, the uncertainties surrounding the technology are identified, along with the approaches taken to handle them. Nuclear radiation was first discovered in scientific laboratories in the mid-1890s; nuclear technology, however, was not practically considered until after the discovery of the neutron in the early 1930s. Research quickly led to the observation and identification of nuclear fission. The next few years, during the Second World War, saw tremendous resources being applied to take advantage of this million-fold increase in energy release per mass. The first large-scale use of nuclear technology was military in nature and left an indelible mark upon future applications. Post-war activities led to a variety of proposed nuclear applications, including power generation, rocket propulsion, and mining. Power generation came to fruition after the development of compact power sources for the Navy, although the hype surrounding the statement about electricity becoming "too cheap to meter" never materialized. Today, the economy is dependent on many nuclear applications, including the 10% of electricity generated by nuclear power plants.
Many smaller-scale applications can be found in medicine, manufacturing, resource extraction, and agriculture. Many of these applications utilize nuclear radiation's unique, small, ubiquitous, and relatively easy-to-detect identifier, similar to a feature of embedded devices. Examples of diagnostic applications using these identifiers include positron emission tomography (PET) in medicine and subsurface characterization with well logging tools for oil exploration. The life-hazard characteristic of nuclear technology is used in therapeutic medicine and food sterilization. Since efficiency, robustness, and resilience were identified as criteria for success in the invasive species analogy, it is important to reflect on these characteristics of the technology's development.

Efficiency

How were nuclear technologies explored and compared with previous technologies? When a technology is first developed, it is difficult to assess its potential in various situations. Examples of failed nuclear technology applications include the early research on nuclear rockets, planes, and mining activities, which were found to be too risky, engineering mismatches, or financially unattractive. However, the exploration for new


uses of the technology continues, such as the development of micro nuclear batteries for micro-electro-mechanical systems (MEMS) (Lai, 2002). International cooperation continues in evaluating and developing nuclear technologies for food preservation by irradiation, pest eradication, and better water supplies through nuclear desalination plants. Nor are the technologies so entrenched that newer technology cannot replace them, as observed in the replacement of nuclear radioisotopes with accelerator-based systems in medicine and irradiation.

Robustness

What robust applications of nuclear technologies were identified? Generally, small applications with low risk have been accepted. For example, almost every house contains a material that is arguably more dangerous than plutonium: americium (Am-241), used in smoke detectors to monitor the presence of smoke. Under the severely conservative assumption that all the material is deposited in the lungs, there might be a chance of subsequent lung cancer; however, this risk is negligible compared with the advantage the device provides in fire safety. Larger doses are acceptable when the risks are known and managed by an individual, as is the case with many medical applications. These include both diagnostic applications and therapeutic applications where the doses might be designed to kill specific tissues, such as cancer, or part of the thyroid in Graves' disease. What has not worked as well in the United States are large projects with fundamental uncertainties, large complex uncertainties, and competition with conventional technologies, as exemplified by the nuclear power industry. However, even though no nuclear power plant has been ordered in the United States since the Three Mile Island incident in 1979, nuclear power has continued to grow in the United States through more efficient use of existing electrical generating capacity.
Although some plants are opting to close because of the large costs of replacing non-nuclear components, such as steam generators, others have replaced these components and upgraded the plants for relicensing for about another 30 years (Interlaboratory Working Group, 2000).

Resiliency

Why did nuclear power growth not continue in the United States? Is it resilient? Conventional wisdom is that the Three Mile Island incident, combined with negative press coverage and Hollywood films like "The China Syndrome," caused deep concern in the U.S. public about nuclear power. Others, however, have argued that financial reasons curtailed the growth, from both the increased cost of escalating regulatory requirements and the growth of energy efficiency. This efficiency was motivated by the OPEC embargo in the early 1970s and the subsequent leveling of energy demand (Cohen, 1990). Recent administration activities have included exploring and facilitating the reemergence of the nuclear power option in the United States. Design standardization is a technique that other countries successfully applied, compared with the U.S. approach, which encourages further innovation opportunities through design variety.


Uncertainties during nuclear technology development

The large uncertainties in nuclear technologies stem from a combination of factors such as insufficient basic science understanding, engineering complexity, political interventions, and unintended consequences. These factors are related to the four major concerns that the public has about nuclear power generation: radiation effects from normal operations, the possibility of catastrophic accidents, the proper handling of waste products, and the prevention of terrorist activities (Cohen, 1990; Jaworowski, 1999; Brown, 1990). One factor is the uncertainty in basic scientific understanding of the effects of radiation on humans. Since direct experimentation on humans with radiation doses is unethical, the assumptions are based on studies of animals and cells and on case studies of exposed human populations. Various situations cause increased human exposure to radiation, from living at higher elevations; to spending time near naturally radioactive granite used in such buildings as the Capitol, the Vatican, and Grand Central Station; to miners' exposure to radon gases. However, one of the major sources for the assumptions concerning the effects of radiation on humans is the study of Japanese atomic bomb survivors. There is large uncertainty in the collection and interpretation of these data. The decision was made to interpret the data conservatively by assuming that the hazard is linearly proportional to the dose with no threshold, which is an unusual response function to a stressor. Further uncertainties derive from the complex engineering situations found in nuclear power plants. The problems during the major incident in the United States, the Three Mile Island incident, which resulted in minor releases of radioactivity, were accentuated by misunderstandings among the human operators at the site (USNRC, 2004). The study of this incident led to improved monitoring, communications, and public preparedness.
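The linear no-threshold assumption described above can be stated compactly: estimated excess risk is taken as proportional to dose, with no dose small enough to carry zero risk. The slope and threshold values in this sketch are purely illustrative, not regulatory figures:

```python
def lnt_excess_risk(dose_sv, slope_per_sv=0.05):
    """Linear no-threshold (LNT): excess risk proportional to dose.
    The slope is illustrative only, not an endorsed risk coefficient."""
    return slope_per_sv * dose_sv

def threshold_excess_risk(dose_sv, threshold_sv=0.1, slope_per_sv=0.05):
    """Contrast: a threshold model assigns zero risk below the threshold."""
    return max(0.0, slope_per_sv * (dose_sv - threshold_sv))

small_dose = 0.01  # sieverts, illustrative
# Under LNT even this tiny dose carries a nonzero estimated risk,
# while the threshold model assigns it no risk at all.
```

The choice between these two functional forms drives much of the regulatory conservatism discussed in the text, since LNT implies that every incremental exposure, however small, adds estimated population risk.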
Political interventions over the years have contributed to escalating regulatory requirements, major decisions changing the basic technology roadmap, and delays in supplying waste handling facilities. Despite technology development planning, political decisions in the midst of development will sometimes upset the plan. An example in U.S. development of nuclear power was the political decision by the Carter administration to stop the U.S. from reprocessing spent fuel, citing concerns about the potential for terrorists to gain access to the reprocessed plutonium and the proliferation of nuclear weapons. Many other major nuclear power generating countries did not follow the U.S. example and continue to reprocess their spent fuel. The benefits of reprocessing include the generation of more fuel than initially mined and reduction of long-term waste. Unintended consequences can be categorized as either malicious or accidental. The malicious uncertainties concern potential acquisition of radiological materials for nuclear weapons or radiological dispersion. Worldwide attention has been focused on securing,


tracking, and monitoring not only nuclear power stations but also a variety of locations such as hospitals, industrial facilities, and oil fields where medium- to high-scale radiological devices are deployed for peaceful purposes. The collapse of the former Soviet Union caused great concern about the possible compromise of its facilities' security. Uncertainties continue to exist in large complex applications but are being reduced through research and experience (Lake, 2002). Continual review of studies and incidents refreshes the debate concerning basic dose-effect relationship assumptions. The complex nature of the reactors is being dealt with through better engineering, computer monitoring systems, and investigation of the complexity with probabilistic risk assessment. The financial uncertainties are being addressed by simplifying the licensing process, progressing on opening a permanent waste repository, exploring standard designs, and exploring new engineering options.
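Probabilistic risk assessment, mentioned above as a tool for dealing with reactor complexity, propagates component failure probabilities through a fault tree. A toy sketch with entirely hypothetical components and probabilities: the system is assumed to fail if both the primary pump and its backup fail, or if an independent control fault occurs:

```python
import random

# Hypothetical, illustrative failure probabilities per demand.
P_PUMP, P_BACKUP, P_CONTROL = 0.01, 0.02, 0.001

def analytic_failure():
    """Fault tree: (pump AND backup) OR control, all independent."""
    p_cooling = P_PUMP * P_BACKUP
    return p_cooling + P_CONTROL - p_cooling * P_CONTROL

def monte_carlo_failure(trials=200_000, seed=42):
    """Estimate the same probability by simulating independent trials."""
    rng = random.Random(seed)
    fails = sum(
        1 for _ in range(trials)
        if (rng.random() < P_PUMP and rng.random() < P_BACKUP)
        or rng.random() < P_CONTROL
    )
    return fails / trials
```

In real assessments the trees are far larger and the component probabilities are themselves uncertain, which is part of why the results remain debated.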

Possible Lessons

How does the development path of nuclear technology compare with the possible development of ubiquitous embedded systems? This question will be explored following the approach of the nuclear technology discussion, that is, by exploring the topics of control, technological ecosystem criteria, and sources of uncertainty. The approach of conservatively assessing the dangers of a new technology might be appropriate if the technology issues can be isolated in the decision-making process, as were many aspects of nuclear technology. This conservative approach can lead to problems when decisions include comparison of risks from standard technologies (e.g., fossil fuels) that have either fewer uncertainties or unquestioned risks. For example, the dangers of many fossil-fuel-related waste products released to the environment (e.g., carbon dioxide, nitrogen and sulfur compounds, and the radioactivity found in natural coal) have not been treated with the same conservative assumptions. However, current information technologies are very different because they are closely integrated with major sectors of the economy. The approach to current IT development seems to allow development of products without prior safety and cost-effectiveness justification. This approach has led to unforeseen problems and costs, such as security and privacy violations, cost ineffectiveness through obsolete software, hardware, and training, and increased maintenance costs. Most of these issues are currently being addressed, for example through autonomic services that better control maintenance and reliability costs. The development path of a new technology probably will exhibit similar successes and failures. As applications are developed in an attempt to solve various problems, efficiency, robustness, and resiliency characteristics can be identified as key indicators of success in the technological ecosystem analogy.


Efficiency

In conventional IT, there was a long time between large investments and eventual changes in the economy's productivity measures. During this delay, the value of many aspects of IT was debated in what was dubbed the "productivity puzzle" (Gust, 2002). Eventually, in the late 1990s, productivity started rising, and much of the gain was attributed to the IT investments. However, although the cost of hardware performance decreased rapidly, the costs of other necessary activities, such as software and maintenance, still exhibit uncertainty in their future trends. This is partly due to the continued growth of unintended consequences of IT integration, such as exploitation by hackers. Currently, the computer user has to be savvy and spend increasing amounts of time and resources to combat viruses, spam, worms, spyware, pop-ups, and intrusions through firewalls. A further inefficiency was due to the hyped investment of the late 1990s, leading to the dot-com bubble. Ubiquitous embedded systems might also experience these three types of initial inefficiency as the social system takes time to adjust to their added value, the hackers and defenders reach equilibrium, and the markets determine the financial risk balance.

Robustness

At first glance, the robustness of current information technology seems obvious. The technology has been applied to a wide range of applications in almost every sector of society. The growth and innovation in the technology have been unprecedented. However, there are concerns not addressed in this view of robustness. Perhaps the motivation for widespread use is increased productivity and efficiency. Too much efficiency might lead to system inflexibility; for example, barriers might be too high to allow later transitions to competing technologies.
A conventional measure of economic growth, the Gross Domestic Product (GDP), indicates continued progress, although alternative measures, such as the Genuine Progress Indicator (GPI), that account for environmental externalities and exploitation costs seem to have stalled or peaked (Cobb, 2001).

Resilience

Will the current and future information systems infrastructure be reliable in demanding situations not yet experienced, such as financial emergencies and attempted foreign intrusions? In various emergency situations, such as the September 11, 2001 attacks, the New York network system showed relatively quick resilience in assisting with the situation. Questions continue as to whether the East Coast blackout was related to a virus attack that might have caused local problems in monitoring and understanding, leading to a sequence of actions that resulted in the blackout (Schreier, 2003). Perhaps there might be a large application of embedded technology that is analogous to nuclear power generation, that is, an application with similar resiliency issues. It has been argued that nanobots might eventually integrate with the environment, combining virtual and real environments. Mulhall's concern about the potential benefits and risks of


this scenario led him to advocate a forum on "Matching Nature's Complexity: Using Advanced Technologies to Adapt to Newly Found Natural Threats at the Macro and Nano Scale" (Mulhall, 2003). One concept to explore is the inclusion of resilience in technology design, monitoring, and control.

Handling Risks

The ETC Group (ETC, 2003) advocates applying the Precautionary Principle and developing a "broad social discussion and preparation" for nanotechnology so that a broad public is "informed and empowered to participate in decision-making." This public discourse is a laudable goal, and it has been attempted with nuclear technologies through the involvement of professional members, the creation of Web sites, and the holding of public discussions concerning environmental impacts. However, the results are limited. An example is found in a recent pamphlet on the hazards of food irradiation, which discusses proposed changes in international law. The argument mixes a real concern about radiation-induced chemical changes with a misleading claim when it states that the "food will be treated with the equivalent of 330 million chest x-rays – enough radiation to kill a person 2,000 times over" (Public Citizen, 2001). While the controlled irradiation might cause certain chemical reactions, it causes no increase in the later radioactivity of the food, and the two risks should not be confused. One possible approach to handling the uncertainties is to slow but not stop progress. The ETC Group has outlined nine laws for incorporating new technologies. The first three are reproduced here (ETC, 2003):

1. It takes a full human generation to comprehend the ramifications of a new technology. Therefore, decisions about whether or not or how to use a new technology will necessarily be ambiguous. Society must be guided by the Precautionary Principle.

2. In evaluating a new technology, the first questions must be: Who owns it? Who controls it? By whom has it been designed and for whose benefit? Who has a role in deciding its introduction (or not)? Are there alternatives? Is it the best way to achieve a particular goal? In the event of harm, with whom does the burden of liability rest, and how can the technology be recalled?

3. The extent to which a new technology may be beneficial to society will be in proportion to the participation of society in evaluating the technology – including and especially those people who are most vulnerable.

The first principle is consistent with Modis's interpretation of technological progress as possibly slowing down, indicating a limiting rate on technological introductions. The experience with nuclear power matches these conditions. The first phase of research and development lasted from about 1940 to 1970; the first generation of power reactors started roughly in the late 1960s. These reactors are now going through either decommissioning or upgrading for relicensing.


Another approach is to obtain international consensus and cooperation. The Nuclear Nonproliferation Treaty of 1970 currently has 187 participating countries. The countries that had already developed nuclear weapons agreed to assist other countries with the development of nonweapons nuclear technologies. The International Atomic Energy Agency was created under the United Nations to set up a safeguard system. This example might be copied for the monitoring of nanotechnology development. There might be selected research centers to handle the more potentially hazardous aspects, but with an outreach program for deploying the beneficial and safe applications. A difficulty with this approach is verifying that research is not being conducted clandestinely. With nuclear weapons, verification of research and testing has had problems as countries have secretly circumvented the treaty. Monitoring earth tremors provides some comfort; however, as with the original uranium-based bomb, device testing is not necessary. An additional tool that might be appropriate is the use of nongovernmental organizations as technology watchdogs. In the nuclear field, the precedent has been set that independent agencies can be funded by the government organizations that they criticize. For example, the U.S. Department of Energy funds part of the Rocky Flats Citizens Advisory Board, an independent group questioning and participating in discussions regarding the Rocky Flats former nuclear weapons component facility (Rocky Flats CAB, 2004). Independent groups participating in review of nanotechnology include the ETC Group, the Foresight Institute, the Institute for Science in Society, and the Science and Environmental Health Network. As with any analogy and comparison, there are many differences and exceptions. The development path of nuclear technology is far from over, but it already has provided interesting events and lessons.
The development path of the new technologies will also very likely be full of surprises. The hope is that there will be enough preparation and discussion so that the surprises are more positive than negative.

References

Ag BioTech InfoNet, "Precautionary Principle," available at http://www.biotechinfo.net/precautionary.html (January 2004).

Appell, D., "The New Uncertainty Principle," Scientific American, January 2001. Available at http://www.biotech-info.net/uncertainty.html (December 2003).

Berlind, D., "Ex-cybersecurity czar Clarke issues gloomy report card," ZDNet Tech Update, October 22, 2003. Available at http://techupdate.zdnet.com/Clarke_issues_gloomy_report_card_.html (January 2004).

Brown, D.A., "Integrating Environmental Ethics with Science and Law," The Environmental Professional, Vol. 2, pp. 344-350, 1990.


Chang, K., "I.B.M. Creates a Tiny Circuit Out of Carbon," New York Times, August 27, 2001. Available at www.research.ibm.com/nanoscience (January 2004).

Cobb, C., M. Glickman, and C. Cheslong, "The Genuine Progress Indicator," Redefining Progress Issue Brief, December 2001. Available at http://www.rprogress.org/publications/2000_gpi_update.pdf (January 2004).

Cohen, B.L., The Nuclear Energy Option, Plenum Press, New York, NY, 1990.

Dryzek, J.S., Rational Ecology: Environment and Political Economy, Basil Blackwell, New York, NY, 1987.

Estrin, D., "Environmental Cyberinfrastructure Need for Distributed Networks," Scripps Institute of Oceanography, August 2003. Available at http://www.lternet.edu/sensor_report/ (January 2004).

ETC Group, "The Big Down, Atomtech: Technologies Converging at the Nano Scale," January 2003. Available at www.etcgroup.org/documents/TheBigDown.pdf (January 2004).

Gust, C., and J. Marquez, "Information Technology, Regulatory Practices, and the International Productivity Puzzle," Federal Reserve Board, March 2002. Available at www.biz.uiowa.edu/econ/seminars/Spring02/gustmarqj.pdf (January 2004).

Haimes, Y., Risk Modeling, Assessment, and Management, John Wiley & Sons, New York, NY, 1998.

Interlaboratory Working Group, "Scenarios for a Clean Energy Future," Oak Ridge National Laboratory, Oak Ridge, TN, and Lawrence Berkeley National Laboratory, Berkeley, CA, ORNL/CON-476 and LBNL-44029, November 2000. Available at http://www.ornl.gov/sci/eere/cef/ (January 2004).

Jaworowski, Z., "Radiation Risk and Ethics," Physics Today, September 1999.

Kang, J., and D. Kuff, "Pervasive Computing: Embedding the Public Sphere," Institute of Pervasive Computing and Society, UCLA. Available at http://www.ipercs.ucla.edu/ (Fall 2003).

Kurzweil, R., The Age of Spiritual Machines: When Computers Exceed Human Intelligence, Penguin Putnam Inc., New York, NY, 2000.

Kurzweil, R., "The Law of Accelerating Returns," March 2001. Available at www.kurzweilai.net/articles/art0134.html (January 2004).


Lai, A., et al., "A Nuclear Microbattery for MEMS Devices," DOE Project Report, August 2002. Available at http://www.osti.gov/bridge/product.biblio.jsp?osti_id=799209 (January 2004).

Lake, J.A., et al., "Next-Generation Nuclear Power," Scientific American, January 2002.

McKibben, W., Enough: Staying Human in an Engineered Age, Times Books, 2003.

Modis, T., "Forecasting the Growth of Complexity and Change," Technological Forecasting and Social Change, Vol. 69, pp. 377-404, 2002.

Mulhall, D., Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World, Prometheus Books, July 2002.

Mulhall, D., "Reassessing Risk Assessment," in 21st Century Opportunities and Challenges, edited by H.F. Didsbury, World Future Society, 2003.

Public Citizen, "Food Irradiation Alert!," Vol. 2, No. 1, February/March 2001. Available at http://www.citizen.org/documents/febmarch.pdf (January 2004).

Raffensperger, C., and P. deFur, "A Paradigm Shift: Rethinking Environmental Decision Making and Risk Assessment," Risk Analysis Policy Association Meeting, Virginia, March 1997. Available at http://www.biotech-info.net/paradigm_shift.html.

Rees, M., Our Final Hour, Basic Books, New York, NY, 2003.

Rocky Flats Citizens Advisory Board, Web site: http://www.rfcab.org/index.html (January 2004).

Schneier, B., "Did Blaster Cause the Blackout?," ZDNet, December 9, 2003. Available at http://zdnet.com.com/2102-1107_2-5118123.html (January 2004).

Slaughter, R.A., "Changing Methods in Futures Studies," in 21st Century Opportunities and Challenges, edited by H.F. Didsbury, World Future Society, 2003.

U.S. Nuclear Regulatory Commission, "Fact Sheet on the Accident at Three Mile Island." Available at http://www.nrc.gov/reading-rm/doc-collections/fact-sheets/3mileisle.html (January 2004).

Vinge, V., "Technological Singularity," Whole Earth Review, No. 81, 1993. Available at http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html (January 2004).

Want, R., "RFID: A Key to Automating Everything," Scientific American, January 2004.

