By James A.M. McHugh and Fadi P. Deek

An Incentive System for Reducing Malware Attacks

Providing hackers an environment (other than the Net) to test and exhibit their malware talents has its rewards.

Illustration by Jason Schneider


June 2005/Vol. 48, No. 6 COMMUNICATIONS OF THE ACM

Viruses, worms, and other malware assaults on the integrity of the Internet seem like everyday occurrences. The consequences range from inconvenience, to organizational shutdown, to a compromised and unreliable Internet environment. Whitman ranked deliberate software attacks from viruses, worms, macros, and denial-of-service attacks as the greatest threats to information security, comparable in magnitude to the combined threats from technical failure and human error [12]. Furthermore, the latter two factors can themselves be exploited by malicious code. The CSI/FBI threat analysis review described by Power had similar findings [8]. Gordon et al. profile some of the profound economic liabilities posed by these attacks and describe how cyberrisk insurance has developed as a buffer against the resulting financial losses [5]. In the context of security threats, Davenport [3] even challenges an almost axiomatic characteristic of the Internet, arguing that Internet anonymity makes impossible the accountability a functional social order requires.

Here, we briefly summarize some of the research on what motivates hackers, and then describe a test environment and incentive structure that we propose can help channel these motivations in a more constructive manner.

The theoretical and empirical research on hacker motivations has been limited. On the basis of interviews conducted with hackers, Jordan and Taylor concluded that the most common motivating factors included a compulsive attraction to hacking, intellectual curiosity, an enhanced sense of control and power, and satisfaction from identifying with a group (other hackers) [6]. Rogers discusses types of hackers, classifying them into groups ranging from novice hackers, to disgruntled ex-employees (the group that commits most computer crime), to professional criminals, to cyber-terrorists [9, 10]. Van Beveren [11] has identified external enabling factors that encourage the development of hacker behavior, including a perceived “lack of negative consequences for those who have been caught hacking,” the mutual reinforcement that occurs in online hacker communities, and the impact of “community recognition from other hackers.” He emphasizes the seductive nature of the psychological experience of flow that occurs in computer and Web environments and the importance of “the thrill of illicit searches in online environments.” He also suggests that the way individuals perceive a virtual environment (as opposed to the physical world) is an important factor affecting criminal hacking and must be part of psychological explanations of hacking.


Rogers and Van Beveren identify an evolution in hacker mentality over time, suggesting the recent generation of hackers is driven more by greed, power, and revenge than by benign motivations like curiosity. An individual in the better-studied “cyber-punk” category is typically Caucasian, 12–30 years old, from a dysfunctional middle-income family, a loner, not career-oriented, escapist, and obsessed with computers. Hacking gives these people an enhanced sense of control over their lives, prestige, an outlet for hostility, and possibly recognition from the media. The loner self-characterization tends to conflict with the need for peer recognition and the desire to affiliate with other hackers [10]. Of course, researchers are well aware there is no reliable, general profile.

Van Beveren suggests making the transition from neophyte to more advanced hacker more difficult. System administrators could do this by fixing the more obvious software holes in their systems, making the beginning stages of hacking harder. This would help break the initial positive feedback loop that encourages novice hackers to progress to more advanced forms of hacking. For further discussion, see Parker [7] and Adamski [1]. Rogers’s Web site on Psychology and Computer Crime at CERIAS at Purdue (www.cerias.purdue.edu) has useful references to the behavioral sciences, cyber crime, and IT. Denning’s comprehensive study of information warfare [4] discusses types of computer crime and motivations for criminal behavior.

A PROPOSAL

Our proposed approach to reducing the hostile deployment of malware on the Internet is to develop a small-scale, isolated version of the Internet (a “Microcosm” of the Internet) to serve as a platform for malware developers to vicariously challenge the real Internet through a surrogate environment. The environment would be sponsored by a consortium of major universities and software companies.
The incentives for malware developers to test their wares in this environment would be economic, psychological, and social. Given that the cost of a single serious virus or worm attack on the Internet may be on the order of one billion dollars in direct and indirect costs to the U.S. economy, the economic benefit from avoiding even one attack would merit a substantial monetary reward. The sponsors of the environment would provide this reward to any challengers whose malware succeeded in seriously disrupting the Microcosm. Successful challengers would also receive broad publicity by demonstrating to the world and their peers that they had developed the malware. The prominence of the consortium would guarantee this publicity.

We recognize this proposal is provocative. It will be challenged as outrageous by some, dismissed outright by others, or considered to be promoting illegal behavior. Nonetheless, we believe it is worth discussing: to understand its implications, possible flaws, and unrecognized side effects, and to clarify empirically, through appropriate surveys, its likelihood of success.

Our approach is motivated by behavioral models from economics and psychology, the models most relevant to controlling malware development and deployment. While the underlying enabling causes are the (inevitable) security flaws in complex systems and the difficulty of enforcing sanctions, the activating causes are the human motivations that lead people to design and deploy malware. From this viewpoint, controlling this behavior becomes a question of what kind of incentive structure can satisfy the motivations but redirect the behavior in a benign fashion.

Malware is a potent product, but the current environment provides relatively unenforceable penalties and shaky defenses against it, and no mechanism for pricing its value (or the value of deterring its deployment). If malware could be prevented, there would be no need to price it; but since it cannot, the rational response is to price it in a way that deters its illicit deployment. Our proposal addresses the situation by monetizing the malware.

In the current Internet environment, the incentive structure for most malware developers is nil or primitive. They must remain cloaked in anonymity, they receive no financial gain for their products, and they garner neither public nor professional recognition for their clever but destructive work. Furthermore, the sanctions imposed if the malware developer is caught are severe, including criminal and civil prosecution and personal ignominy.

The rationale for the proposed system, which combines a test-bed intranet with an incentive mechanism, is based on recognizing and accepting that behavioral forces are the driving factors behind the human phenomenon of malware development. Under this proposal, work now unrewarded in any objective sense would instead be rewarded by economic, professional, and ego incentives.

Such a system would have both preventive and therapeutic benefits. Preventively, the system would divert or siphon off attacks from the real Internet by transforming the incentives for attackers, making it behaviorally more rational for them to publicly target the Microcosm than to covertly target the Internet.

Therapeutically, the system would sharpen existing software and hardware systems through a continual process of software refinement in response to repeated, extreme testing by motivated and inventive malware challengers. It could also attract new individuals, never engaged in illicit activity, to apply their creativity productively within the context of the Microcosm.

MOTIVATIONS AND EXPECTED BEHAVIORAL IMPACTS

Is the system we propose likely to attract the interest of hackers? To analyze this issue, we present a simple (largely a priori) classification of the motivations of malware developers as psychological, therapeutic, economic, or terroristic. After briefly discussing each category, we project the anticipated effect of the proposed system on these different motivations. We emphasize that our focus is not on generic hackers who wish to gain entry to particular computer systems for whatever purpose, but on individuals who develop malware deployed to disrupt the overall Internet.

Psychological motivations include ego satisfaction from the fame (or infamy, albeit anonymous) associated with having created a virus that the whole world, and particularly the developer’s peers, knows about. Malicious individuals may be additionally motivated by sadistic pleasure or thrill-seeking. Some may not grasp the gravity of their actions, like oblivious juvenile delinquents, or may view the malware as a prank, though even such immature individuals know how to calculate the cost of their acts. Cohen [2] quotes G. Jelatis of Secure Computing to the effect that adolescents involved in hacking often stop after the age at which they become criminally liable.
Therapeutic objectives motivate malware developers who are, or at least claim to be, driven by a kind of idealism. They may want (or claim to want) existing systems to be improved so as to reduce security risks in the future, but they believe that, because of institutional and social inertia, there is no other way to bring about the needed changes except through public demonstrations of the weaknesses of existing systems.

Economic motivations are possible where the detrimental side effects of malware can be exploited by developers for gain. While hackers of individual systems steal intellectual property or commercially valuable information, malware deployers may expect to profit indirectly from an attack.

Terroristic motivations, including engaging in information warfare, drive individuals or groups whose intention is to cause widespread real damage, with possible particular reference to the U.S. industrial, commercial, social, or governmental environment. Those involved may be antisocial solo terrorists, politically motivated terrorist cells engaged in asymmetric socioeconomic warfare, or agents of hostile nations who are probing the Internet for weaknesses on an ongoing basis or honing their skills for possible future use.

With these kinds of motivations in mind, what are the likely effects of the proposed incentive system on malware developer behavior? The behavior of psychologically motivated malware developers seems most amenable to redirection, because the proposed system addresses issues of social and peer/professional recognition and provides the prospect of economic gain for any developer clever enough to seriously disrupt the Microcosm. Even psychologically immature individuals with little grasp of the consequences of their actions can understand this kind of motivation: recognition and money can appeal even to a malicious individual who derives sadistic satisfaction from causing disruption. For example, consider how such an individual would feel if (after the proposed system was operational and widely known) he or she successfully and anonymously released a virus that wreaked havoc on the real Internet. After seeing how damaging the virus was, would it not confound him or her to realize how much he or she could have profited, and been recognized, by instead submitting the virus for testing on the Microcosm? We believe there is a non-negligible chance that such people would perform this utilitarian calculus beforehand and alter their behavior.

Therapeutically motivated malware developers may be attracted by the system’s incentives because the creation of such a forum would at least move the world in the direction such individuals are interested in: one where systems are more secure and more thoroughly tested, and design flaws are fixed.

The impact on economically motivated malware developers is less clear. It would depend on how much they expected to profit from releasing the malware on the Internet. They would again make a utilitarian calculation, comparing how much they might expect to profit from the direct or indirect effects of deploying their malware against how much they would profit under the system we propose, including the lower risks they would incur. There would at least be a chance of enticing them to work within the system, more so than at present.

Terroristic motivations would be the least affected by our approach. In particular, highly skilled individuals who exhibit criminal tendencies are an especially dangerous group [11] and are unlikely to be attracted to the Microcosm.
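The utilitarian calculus described above can be made concrete with a toy expected-value comparison. Every figure below is a hypothetical assumption chosen for illustration, not an estimate from the article or from any data:

```python
# Hypothetical expected-payoff comparison for a malware developer choosing
# between a covert Internet attack and a Microcosm submission.
# All numbers are illustrative assumptions.

p_caught = 0.05          # assumed probability of identification and prosecution
penalty = 2_000_000      # assumed monetized cost of prosecution and ignominy ($)
illicit_gain = 50_000    # assumed indirect profit from a covert attack ($)
reward = 250_000         # assumed Microcosm prize for a disruptive submission ($)
p_win = 0.10             # assumed chance the submission disrupts the Microcosm

# Expected payoff of covertly attacking the real Internet.
ev_attack = (1 - p_caught) * illicit_gain - p_caught * penalty

# Expected payoff of submitting the same malware to the Microcosm
# (no legal risk; recognition is not monetized in this sketch).
ev_submit = p_win * reward

print(f"covert attack: ${ev_attack:,.0f}  microcosm submission: ${ev_submit:,.0f}")
```

Under these particular assumptions the covert attack has negative expected value while the submission is positive; the point of the incentive structure is to make that inequality hold for as many developers as possible.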
Other factors that may limit the Microcosm’s appeal to inveterate hackers include its corporate sponsorship, the lack of actual victims, and the absence of the illegal thrills of hacking. Nonetheless, the motivations of at least some individuals who initiate serious virus or worm attacks will lend themselves to benign redirection. If only one in 10 cases were successfully redirected and the proposed system caused no net increase in misbehavior, it would pay for itself. Existing Internet systems would become freer from flaws over time because of the effort focused on challenging them, and so would gradually become less vulnerable to malware terrorism.
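The pay-for-itself claim is a simple break-even calculation. Only the per-attack cost below comes from the article’s rough figure; the attack frequency and redirection rate are illustrative assumptions:

```python
# Back-of-the-envelope break-even check for the Microcosm, using the
# article's rough figure of ~$1 billion per serious Internet attack.
# Attack frequency and redirection rate are illustrative assumptions.

cost_per_attack = 1_000_000_000   # direct + indirect cost of one serious attack ($)
attacks_per_year = 10             # assumed number of serious attacks per year
redirect_rate = 0.10              # "one in 10 cases" successfully redirected

expected_savings = attacks_per_year * redirect_rate * cost_per_attack
print(f"expected annual savings: ${expected_savings:,.0f}")
# Any annual operating budget below this figure would pay for itself.
```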

SYSTEM REQUIREMENTS AND OPERATION

The purpose of the proposed Microcosm system is to provide an environment close enough to the Internet in structure, software platforms, and end-host patterns of usage that the behavior of a virus randomly introduced into the system approximates the behavior of that virus on the real Internet. It must both reflect a broad spectrum of platforms and be designed to emulate the behavior and vulnerabilities of a broad spectrum of users.

It is not immediately obvious what the detailed requirements of such a system would be. A single small network might provide functionality adequate to accomplish most of the emulation objectives, or the system might have to be fairly large, requiring many servers, hosts, routers, and so on in order to capture a representative cross-section of operating systems, versions, firewalls, browsers, traffic patterns, software usage patterns, and software packages. Assuming the system were collocated, network delays might have to be artificially introduced into router connections for verisimilitude. Initially, constructing a close simulacrum of Internet hardware and software complexity would seem to be the best and most obvious way to ensure that results on the Microcosm reflected real Internet behavior, but even a small environment might be effective.

How would the system be run, and what operating problems might it experience? We have indicated it could be sponsored by a consortium of universities and software companies. Malware could be submitted with some type of identification, needed to verify the originator in case the submission turned out to be effective. To minimize worthless submissions, a fee could be charged, or a brief technical description of the malware could be required so the system’s administrators could determine whether the malware was a plausible challenger and whether the submitter was knowledgeable.

RISKS AND CONCERNS

Could the very existence of such an incentive system exacerbate the problem of malware development rather than alleviate it? Certainly, the proposed incentives would interest more individuals in malware development, thereby increasing the pool of expertise in this area and consequently the potential threat. However, the rational-behavior argument we have made contends that the Microcosm, rather than the Internet, would be the focus of this interest because of the rewards a participant could receive. Also, as we have observed, some researchers emphasize the importance of creating system environments that deny neophyte hackers positive feedback in their initial efforts, as a preventive measure; establishing a supportive environment for the behavior is therefore arguably problematic. On the other hand, the extended knowledge base resulting from this marketplace could itself be brought to bear on the problem of malware.

Another issue is the cost of establishing and operating the system. Liability is a key concern. For example, honeynets (systems that provide the important service of monitoring visiting hackers to gather statistics on their patterns of behavior; see www.project.honeynet.org) must be careful not to inadvertently propagate malware. A basic Microcosm that only accepted viruses for testing in a locked-in environment should limit direct liability risks. Ethical acceptability is also critical. We believe the proposal is ethical, since it aims to supplant malicious behavior with sanctioned testing under controlled conditions.
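The emulation question raised under System Requirements and Operation (whether a small, collocated network with artificially introduced link delays can approximate worm spread) can be explored with a toy event-driven simulation. The topology size, out-degree, and delay range below are all assumptions for illustration, not requirements:

```python
import heapq
import random

# Toy sketch of a Microcosm test bed: a small random topology of emulated
# hosts with artificial per-link delays (in ms), used to observe how quickly
# a naive worm saturates the network. All parameters are assumptions.
random.seed(7)

N = 200  # number of emulated hosts (assumed size)
neighbors = {i: random.sample([j for j in range(N) if j != i], 4)
             for i in range(N)}
# Artificial link delays, as the article suggests for a collocated test bed.
delay = {(i, j): random.uniform(10.0, 200.0)
         for i in neighbors for j in neighbors[i]}

def simulate():
    """Event-driven spread from host 0; returns (#infected, time in ms of
    the last infection) once no infection events remain."""
    infected = {0}
    t_last = 0.0
    events = [(delay[(0, j)], j) for j in neighbors[0]]
    heapq.heapify(events)
    while events:
        t, host = heapq.heappop(events)
        if host in infected:
            continue  # already compromised by an earlier event
        infected.add(host)
        t_last = t
        for peer in neighbors[host]:
            if peer not in infected:
                heapq.heappush(events, (t + delay[(host, peer)], peer))
    return len(infected), t_last

count, t_last = simulate()
print(f"{count} of {N} hosts infected; last infection at {t_last:.0f} ms")
```

Varying the topology and delay parameters in a sketch like this is one cheap way to probe whether a small environment reproduces the qualitative spread dynamics a larger simulacrum would show.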

SUMMARY

Viruses are not only a technical phenomenon. They do not spring up by spontaneous generation or emerge as side effects as systems age or deteriorate. They are invented by people for some reason. The originators are often difficult to identify or prosecute because of anonymity or because of their youth. It is also difficult to protect against the security flaws that viruses exploit in our increasingly complex and interconnected systems. Inevitably, an appropriate response to this human phenomenon must include understanding, altering, and redirecting the motivations that cause this activity, at least those motivations that can be addressed by feasible incentive structures.

Human behavior can be altered by training, by inculcating moral codes that make individuals appreciate the implications of their actions, by applying stronger penalties for misbehavior, or by instituting incentives and rewards (economic, psychological, or social) for good behavior. We propose the latter approach as having an important role to play in reducing the problem of Internet viruses and other malware. An Internet Microcosm would provide a venue within which these motivational forces could operate.

References
1. Adamski, A. Crimes related to the computer network. Threats and opportunities: A criminological perspective; www.infowar.com/new, 1999.
2. Cohen, R. Experts call hacker motivation key to prevention. Infosec Outlook 1, 2 (May 2000). Carnegie Mellon University, CERT Coordination Center, Pittsburgh, PA.
3. Davenport, D. Anonymity on the Internet: Why the price may be too high. Commun. ACM 45, 4 (Apr. 2002), 33–35.
4. Denning, D. Information Warfare and Security. ACM Press, Reading, MA, 1998.
5. Gordon, L.A., Loeb, M.P., and Tashfeen, S. A framework for using insurance for cyber-risk management. Commun. ACM 46, 3 (Mar. 2003), 81–85.
6. Jordan, T. and Taylor, P. A sociology of hackers. Sociol. Rev. 46, 4 (1998), 757–780.
7. Parker, D. Fighting Computer Crime: A New Framework for Protecting Information. John Wiley & Sons, New York, 1998.
8. Power, R. CSI/FBI computer crime and security survey. Comput. Secur. Issues Trends 8, 1 (Jan. 2002), 1–22.
9. Rogers, M. Psychology of hackers: Steps toward a new taxonomy; www.infowar.com, 1999.
10. Rogers, M. A social learning theory and moral disengagement analysis of criminal behavior: An exploratory study. Ph.D. Thesis, Dept. of Psychology, University of Manitoba, Winnipeg, 2001.
11. Van Beveren, J. A conceptual model of hacker development and motivation. J. E-Business 1, 2 (Dec. 2000), 1–9.
12. Whitman, M. Enemy at the gate: Threats to information security. Commun. ACM 46, 8 (Aug. 2003), 91–95.

James A. McHugh ([email protected]) is a professor of computer science and an affiliate of the information technology program in the College of Computing Sciences at the New Jersey Institute of Technology, Newark, NJ.

Fadi P. Deek ([email protected]) is a professor of information systems and an associate dean of the College of Computing Sciences at the New Jersey Institute of Technology, Newark, NJ.

This work was supported, in part, by the “NJ I-TOWER” Grant from the New Jersey Commission on Higher Education, award #01-801020-02.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

© 2005 ACM 0001-0782/05/0600 $5.00

