ISSA: Preeminent Trusted Global Information Security Community

The Psychology Behind Security
By Greg Sternberg, ISSA member, Denver, USA Chapter

Abstract
Information security often overlooks what motivates people. Everything people do is for a reason; we may not agree with the reason, or even fully realize it, but the reason exists. The fact is we have to contend with thousands of years of instinct and basic human nature. Psychology can help us understand how best to work with our users to improve security.

Quick question: How many of your users use good passwords? According to the analysis done by Imperva on the 32 million passwords [1] that were exposed in a recent database intrusion at RockYou Inc., the answer is probably "not enough." Imperva discovered that about 30% of the passwords were six characters or smaller, while 60% were passwords created from a limited set of alphanumeric characters. Nearly 50% were easily guessable: names, common slang words, adjacent keyboard keys, and consecutive digits. In fact, the most common password was "123456," followed by "12345" and "123456789." Rounding out the top five were the passwords "Password" and "iloveyou."

I propose that, for the most part, these users did not consider their passwords to be risky or unsafe. And even those users who might have had momentary qualms about their password selection were willing to accept the risks posed by their weak passwords. As Bruce Schneier put it, "Security is both a feeling and a reality. And they're not the same." [2]

So what makes us feel secure? Actually, a better question is: Where does convenience of use cross the line of risk we are willing to accept?

[1] Jaikumar Vijayan, "Users still make hacking easy with weak passwords," Computerworld, January 21.
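Imperva's categories are mechanical enough to check in code. The sketch below is a minimal illustration of that kind of triage; the length threshold mirrors the figures above, but the tiny common-word list and the character-set rule are simplifying assumptions of mine, not Imperva's actual methodology:

```python
# Rough triage of a password against the weakness categories described above.
# The common-word list and charset rule are illustrative assumptions only.
COMMON_WORDS = {"password", "iloveyou", "princess", "abc123"}

def weakness_flags(pw: str) -> set:
    flags = set()
    if len(pw) <= 6:                   # "six characters or smaller"
        flags.add("short")
    if pw.isdigit() or (pw.isalnum() and pw.islower()):
        flags.add("limited-charset")   # drawn from a small alphabet
    if pw.lower() in COMMON_WORDS or pw.isdigit():
        flags.add("guessable")         # dictionary word or digit run
    return flags

# The most common RockYou password trips every category:
print(sorted(weakness_flags("123456")))  # ['guessable', 'limited-charset', 'short']
```

Looped over a real password dump, tallying these flags would reproduce exactly the sort of percentages Imperva reported.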
The most secure computer system is one encased in five feet of concrete, powered off, disconnected, and at the bottom of the ocean, but that is not a very useful computer system (or a very convenient one).

How the brain assesses risk

Deciding where to draw the line between convenience and security is something we do constantly, whether deciding what route to take to the office or whether to allow our teenager to drive to the movies alone. In fact, our brains have two different systems which assess risk: the amygdala, which handles the processing of immediate risk, and the neocortex, which handles the processing of future risks.

The brain is a beautifully engineered get-out-of-the-way machine that constantly scans the environment for things out of whose way it should right now get. That's what brains did for several hundred million years, and then, just a few million years ago, the mammalian brain learned a new trick: to predict the timing and location of dangers before they actually happened.

[2] Bruce Schneier, "The Psychology of Security," schneier.com/essay-155.html.

All Web and email links can be clicked to visit the URL, retrieve a resource, view an online article, or send an email to the author.
Our ability to duck that which is not yet coming is one of the brain's most stunning innovations, and we wouldn't have dental floss or 401(k) plans without it. But this innovation is in the early stages of development. The application that allows us to respond to visible baseballs is ancient and reliable, but the add-on utility that allows us to respond to threats that loom in an unseen future is still in beta testing. (Daniel Gilbert [3])

Information and sensory input is sent to both the amygdala and the neocortex. The amygdala does a quick scan of a small portion of the input and rapidly comes to a conclusion based on instinct and previous history. It gathers any emotional context which might have been previously experienced and sends the entire data packet to the neocortex. It also prepares the body for a response if it feels the input represents a threat. The neocortex, on the other hand, looks everything over in a logical manner and takes a correspondingly longer time to draw a conclusion. This delay could result in the amygdala's conclusion overriding the neocortex's. [4] It is this override mechanism which makes us jump at loud sounds or feel uncomfortable walking down dark alleyways.

Both the amygdala and the neocortex need data before they can evaluate the situation. This data must be accurate and timely or our risk evaluation results may be flawed. Both the amygdala and the neocortex reference past events in coming to their conclusions, but it is the amygdala which references the emotional context and the neocortex which references abstract concepts like how reliable the data is or how reasonable the initial response is. [5] Unfortunately, we are flooded with so much information (some 3.6 zettabytes' worth in 2008 [6][7]) from a host of sources, such as TV, radio, the Internet, newspapers, books, blogs, Twitter, movies, Facebook, other people, etc., that we have trouble remembering where we heard a particular piece of information and how reputable the source was.
Fully and completely processing the information we receive is beyond the capabilities of either the amygdala or the neocortex, so we have developed mental short-cuts, called heuristics, to keep up. These heuristics are our way of handling the immensely complex world we live in; in fact, we use them without even realizing it. For example, count the number of Fs in the following text:

Finished files are the result of years of scientific study combined with the experience of years.

If you are like most people, you answered three, but there are actually six. Or to use the example in Don Norman's essay "Being Analog": How many animals of each type did Moses take on the ark? Most people would say two, but the actual answer is none: Moses did not take animals onto the ark; Noah did. This mental adjustment happens because our brains are wired to notice big differences, like day vs. night or bears vs. blueberries, since it is important we notice those; our lives might depend on it. However, if the difference is subtle (i.e., Moses and Noah are both biblical), we may miss the difference entirely.

When these heuristics and biases are accurate, they are incredibly useful. After all, it really does not matter how many Fs are in the sentence, since being close is usually good enough and much faster to determine. It also does not matter that Moses was used in the question above, since most of us mentally substitute Noah, and in most situations that is the question that was really being asked.

[3] Daniel Gilbert, "If Only Gay Sex Caused Global Warming," Los Angeles Times, July 2.
[4] Daniel Kahneman, "A Perspective on Judgment and Choice," American Psychologist, 2003.
[5] Amos Tversky and Daniel Kahneman, "Judgment under Uncertainty: Heuristics and Biases," Science.
[6] One zettabyte = 1 billion terabytes.
[7] Doug Ramsey, "How Much Information Americans Consume," UC San Diego News.
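The F-counting exercise also illustrates the gap between a heuristic and an exact answer: a machine, which reads without short-cuts, finds all six. A quick sketch:

```python
# Count every F the way a machine does: no "skip the little words" heuristic.
SENTENCE = ("Finished files are the result of years of scientific "
            "study combined with the experience of years.")

f_count = SENTENCE.lower().count("f")
print(f_count)  # 6 (most readers miss the three in "of")
```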
When these heuristics and biases are inaccurate or based on faulty information, we make dangerous or invalid risk assessments, such as hiding under a tree during a thunderstorm or thinking lightning does not strike the same place twice. [8] While we use many heuristics and biases in our daily lives, the following ones have a direct relation to how secure we feel.

Risking gains and accepting losses

When it comes to evaluating gains or losses, people have a built-in heuristic against risking gains or accepting losses. Called prospect theory, this is best demonstrated by an experiment [9] put together by Daniel Kahneman and Amos Tversky in which they gave one group the following gain-related alternatives:

Alternative A: A sure gain of $500
Alternative B: A 50% chance of gaining $1,000

and another group loss-related alternatives:

Alternative C: A sure loss of $500
Alternative D: A 50% chance of losing $1,000

The results of the experiment: 84% chose the sure gain of A over the risky gain of B, but when faced with loss only 30% chose the sure loss of C over the risky loss of D. This translates into our users being more likely to risk a larger security loss (i.e., a break-in) than accept the certainty of a small security loss (i.e., forgetting a password), especially when one considers that a break-in will likely have little direct effect on the user, but forgetting a password certainly will, and it carries the emotional context of embarrassment.

It won't happen to me

While this heuristic enables us to strive in the face of adversity or continue when others have failed, it is also the cause of much frustration for security personnel. The optimistic bias is best described as "...the demonstrated systematic tendency for people to be over-optimistic about the outcome of planned actions. This includes over-estimating the likelihood of positive events and under-estimating the likelihood of negative events." [10] And the more control we have over an event, the more optimistic we tend to be, i.e.,
"I would not let it happen that way." Research suggests this bias is largely due to people overestimating how skilled they are relative to other people. [11] The most interesting effects of this bias are that:

- Individuals who had the least experience rated themselves to be far more capable than they actually were.
- Individuals who had the most experience rated themselves to be slightly less capable than they actually were.

Or, as one researcher put it, "The more you know, the less you think you know." This heuristic goes a long way in explaining why our most naive users feel safer than those of us who have decades of experience on our side.

[8] wikiHow, "...Caught-Unawares."
[9] Daniel Kahneman and Amos Tversky, "Prospect Theory: An Analysis of Decision under Risk," Econometrica.
[10] Wikipedia.
[11] David Dunning and Justin Kruger, "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," Journal of Personality and Social Psychology.

The trust factor

People instinctively trust other people. [12] We trust people to follow the rules of the road when we drive; we trust people to take care of our children; we trust taxi drivers to take us to our destination in a strange city. Coleman [13] described some fundamental components of trust which have direct correlations to security:

1. Placement of trust allows actions which are otherwise not possible. The vast majority of people turn their computers on with the expectation that their computers will just work, and have very little idea, or care, how they work.

2. If the person being trusted is trustworthy, then the person doing the trusting is better off; conversely, if the person being trusted is untrustworthy, then the person doing the trusting is worse off. Our users trust the sites they visit and the programs they run.
When they work, we are better off; but when they do something unexpected or undesired, we are worse off, especially if our identity is compromised.

3. Trust is an action involving the voluntary placement of resources at the disposal of the person being trusted, with no real commitment from the trustee. As we know, allowing webpages to install programs requires a significant amount of trust, not only in the webpage but in the software being installed, the connection between computers, the writers of the webpage, etc. Our users, however, only see a popup that they click on so it will go away.

[12] Niklas Luhmann, "Trust: A Mechanism for the Reduction of Social Complexity," in Trust and Power: Two Works by Niklas Luhmann, John Wiley & Sons, New York.
[13] James Coleman, Foundations of Social Theory, Belknap Press, Cambridge, MA.
[14] Ronald Rensink, J. Kevin O'Regan, and James Clark, "To See or Not to See," Psychological Science.

Small change blindness

As long as the changes in our environment occur slowly, we adapt and are unlikely to detect the change. Eventually, if the change is cumulative, like the number of minutes the sun is up from day to day, we will notice, but it may be weeks or months before the accumulation of changes is great enough. [14] Sitting in front of a computer, we are blissfully unaware of what is happening behind the curtains. We notice when someone breaks into our house because the damage and/or missing property is visible. We are unlikely to notice when
someone breaks into our computer because the damage is invisible [15] (unless you know exactly how and where to look) and nothing is missing; our data is still there. In fact, unless something noticeable happens, usually outside the realm of the computer, like money being siphoned out of our bank account, our system could remain infected for years. [16]

What this means

In the general IT industry there is the misconception that end users are dumb and all the training in the world will not help them. This mindset is prevalent even in the information security field, hence the decals and T-shirts which say "Social Engineering Specialist: Because There Are No Patches For Human Stupidity." If, however, we view their decisions (right or wrong) in the light of psychology, we begin to realize why they made those decisions, and even better, how we might effect a change in those decisions. We know there is no magic potion to keep users from clicking on unknown attachments or smart pills for recognizing phishing scams, but it is our job to develop the tools of awareness, training, and education [17] which our users need, to make our jobs easier.

Awareness

To some, security is the use of a username and password and a vague impression of a group that makes it difficult to get things done. We need to raise the awareness of both the board and our users by evangelizing security. This evangelism needs to be in the spirit of "security can help you" and not in the spirit of "repent, sinner" for it to be successful. From NIST Special Publication 800-16:

Awareness is not training. Security awareness is a blended solution of activities that promote security, establish accountability, and inform the workforce of security news. Awareness seeks to focus an individual's attention on an issue or a set of issues. The purpose of awareness presentations is simply to focus attention on security. Awareness presentations are intended to allow individuals to recognize information security concerns and respond accordingly. [18]
NIST goes even further in SP 800, suggesting some activities to raise the awareness of security:

- Host an information security day
- Conduct briefings on current issues, like the dangers of social networking
- Distribute promotional items with motivational slogans (think coffee mugs and mouse pads)
- Provide login banners serving as security reminders
- Show awareness videos (e.g., the Computer Security Awareness Poster & Video Contest 2009)
- Distribute posters and flyers

Whatever approach is chosen, it is important to make it a regular event and to be repetitive not in form but in message. A bored audience will remember little other than boredom; an interested and engaged audience will remember the message.

Training

It is important for our users to develop the skills necessary to be secure. In most companies security has to place a particular level of trust in its users (i.e., security cannot be everywhere at once). You have to trust users to adhere to security policies, use secure coding practices, take appropriate precautions when around secret or classified hardware and software, etc. This requires an understanding of how secure the systems need to be, what materials and processes will be needed to bring the users up to the appropriate level of security, and what materials and processes will be needed to keep them there. Processes within security need to be implemented to ensure this training is appropriate, accurate, frequent, and up-to-date. Training which is not relevant or empowering to the user will be ignored at best and subverted at worst.

[15] Jeffrey Carr, "Under attack from invisible enemies," The Independent.
[16] Kelly Higgins, "The World's Biggest Botnets," DarkReading.
[17] I don't know what it is about triads in security, but there certainly are a lot of them.
[18] NIST Special Publication 800-16, "Information Security Training Requirements: A Role- and Performance-Based Model."
Education

Chris Christensen describes ten essential principles which can be used to teach adults in "How Do People Learn?" [19] Four of them are discussed below.

Education is only one part of a behavior change program. For education to be effective it must be reiterated on a regular basis and reinforced by other things, such as an awareness program, visual reminders, or even inclusion in quarterly performance goals.

Apply learning immediately. Learning is highly dependent on reinforcement, which can come in the form of repetition or practice. For example, we remember that two plus two equals four because we have heard it so many times. Reinforcement can also occur via an emotional context: people remember where they were when they heard about the World Trade Center because of the highly emotional context surrounding that event.

[19] Chris Christensen, "How Do People Learn?"

Education must improve the business. Sometimes in security we fail to understand the risks our users face. We see our users not following proper security procedures and assume the user does not understand the risks. To a user, the biggest risk is not a security breach; rather it is
not getting his job done. Almost all users, when faced with the risk of not doing their job or not following a security policy, will opt to not follow the security policy, especially if the penalties are not very severe. However, if the business sees security as a vital component, then users will be far more inclined to follow the security policy, since it is a natural component of their jobs.

Education must entertain. As previously mentioned, emotional context is a key to learning. While either positive or negative reinforcement can be used, researchers have found positive reinforcement results in lasting behavioral modification, whereas punishment only temporarily changes behavior. [20]

Conclusion

We have all commiserated with our colleagues about the seemingly unbelievable risks our users have taken and still take. We express disbelief when people hold open secured doors for their colleagues (or total strangers) and circumvent the million-dollar card-access system we fought so hard to have installed. Feeling frustrated by the very people we are trying to protect, we begin to rely on expensive technology, impossible-to-enforce policies, or over-zealous watch-dogging.

We need to step back and consider the why behind what people do or do not do. Everything people do is for a reason. We may not agree with the reason, or even fully realize it, but the reason exists. We need to understand that while our policies and procedures are intended to keep the company safe, they may be, and likely are, viewed as impediments by the very people we are trying to protect. [21] The fact is we have to contend with thousands of years of instinct and basic human nature. As technologists, we often overlook what motivates people; we focus only on the symptoms.

[20] B. F. Skinner, Walden Two, Macmillan, Toronto.
[21] Larry Greenemeier, "The Threat Within: Employees Pose the Biggest Security Risk," InformationWeek.
The research into human psychology and motivation has much to teach us about our users and their reactions. Improving security involves changing beliefs, attitudes, and behavior, both of individuals and of groups. Psychology can help us understand how best to work with our users to achieve the goal of improving security:

- Understanding how people think helps us understand how to craft training.
- Knowing why people think the way they do helps us overcome improper ingrained (or instinctual) reactions.
- Looking from the user's point of view and being aware of what motivates him shows us how to persuade and change attitudes.
- Understanding how groups think helps us enhance our security procedures by protecting users from social pressures which might encourage risky behavior.

In short, we in security must understand why our users make the decisions they do and how we need to influence our users so they buy into security. To paraphrase a cliché: If you make a user secure, you are safe for today. If you convince a user to be secure, you are safe for a lifetime.

About the Author
My name is Greg Sternberg, and at some point I have worked in every aspect of software engineering over almost three decades. My involvement in security started on the wrong side very early in my career, but I soon moved into the light. Since then I have worked on integrating security into the development life cycle, secure coding practices, compliance, and security architecture. I currently hold a CISSP and am TOGAF-certified. I can be reached at

Advanced Persistent Threat: Sacramento Valley February Meeting Now on Connect
Kevin Spease

In February, David Blackburn of the Sacramento Valley Chapter recorded a presentation by Greg Hoglund, which has been posted to ISSA Connect so that all ISSA members might share it.

Presentation overview: The term Advanced Persistent Threat (APT) has been used to describe high-profile incidents such as the one reported by Google earlier this year.
The primary means for data theft are malware programs that infect computers in your enterprise. Malware has always had the ability to steal data, and malware has always been operated by real humans. The true threat is not the malware itself but the human behind the malware. This is why existing security products cannot stop the attacks: the attacker is always evolving. By examining the malware attacks in your enterprise, you can gain insight into the intent of the attacker, as well as his methods and capabilities. Technical analysis of malware will reveal actionable intelligence that can be used immediately to detect additional infections, update perimeter security devices, and shut down data egress points. This information is critical for mitigating risk.