16

I would like to know how you get employees to report incidents. Incident reports are a key element of an ISMS: no reports = no discovery of the incident = a high chance that things get out of control.

We have a kind of game: people can give each other red cards for small incidents. The cards tell them to report the incident, but people don't want to. I imagine we have to use a reward system (focus on the positive) to get people to report and to do their best to limit incidents, but how?

The reporting system currently works like this: Person B.A. sees that Person A.B. has not locked his computer. He then (should) give a red card to that person, take a picture, and send it to the incident@company email address with the person's name in the mail. The mail goes to the InfoSec team (not only IT people, by the way), who then put it in the system.

Now, no one sends those emails. I check myself to collect enough reports for the audit, but that means people will not report because I do it for them. I stopped checking for a month and the number of reports dropped to a quarter of the previous month's. When I started checking again, it immediately rose...

What to do?

Edit (important note):

I am a student doing this as an internship. I am an engineering management student; I was new to this area when I started three months ago and have no IT background. The company is an IT company in Bulgaria, now 200 employees (last year it was 100), so everything is changing very fast. That is not the way it should be, but it is the way it is. Please take those things into consideration when you reply. Feedback is welcome, but please tell me how.

johan vd Pluijm
  • Is this the only kind of reporting that you have in place? Do you have guards or a receptionist? Do you have security cameras? What kind of behavior would you classify as an "incident"? – Tom K. Nov 24 '17 at 13:27
  • Side note: incident reports should be automatically generated and not reliant on users proactively reporting everything you need for reports. User-initiated reports should augment automated reporting. A reporting hotline is there to allow users to participate, not to make everyone a reporter. – schroeder Nov 24 '17 at 14:07
  • One issue you might have is that there is actually passive resistance to the security rules that are in place. If you have a rule that people must never use the same password for two different services, that passwords must contain unmemorable characters, and that they must never be written down, then people are going to silently break the rules, and they are not going to report their colleagues for breaking them. So your real challenge might not be to get people to report incidents, but to get them on your side. – Michael Kay Nov 24 '17 at 14:22
  • @MichaelKay I think you are right. The thing is, I got involved in this at a time when it was already somewhat too late. This means there is a certain time pressure, and there are some other really negative things pushing people which I cannot control – johan vd Pluijm Nov 24 '17 at 14:52
  • If the only problem is with locking a computer, some IT companies have an old custom. Whenever a workstation is found unlocked and unattended, use it and send a mass email to the whole department stating "Free pastries for everyone on day X". Usually, people are honest, recognize they made a mistake and comply with the "breakfast fine". That way, there's no need to denounce your colleague, it's quite fun, and it doesn't bring embarrassing statements such as "This reporting system sounds like collaboration" (see Amazon in Belgium and its new incident reporting 'game' and the issues it brought). – Kaël Nov 24 '17 at 16:36
  • @Kaël I don't think people will appreciate that method either. I should switch to a positive approach. I have thought about your idea before, but I am afraid that is the best way to make myself really hated. People would then see the risk, yes, and the financial consequences (paying for that food), but they would also see that I made them do it. I have to get rid of the negative thing; looking at the general ideas below – johan vd Pluijm Nov 24 '17 at 16:43
  • In other words... how do I overcome the stigma of turning other people in? And how do you avoid "[Snitches get stitches](https://smoothwavesonline.files.wordpress.com/2014/06/a54b74099c660134e6850651ebdadb41-1000x1000x1.jpg)" kind of reactions that would be... detrimental... to a productive work environment? – WernerCD Nov 24 '17 at 17:01
  • @johanvdPluijm let the question and answers stand. As a Q&A site, we need clear questions with clear, applicable answers. No need to summarise. You have made good comments on the different answers. – schroeder Nov 24 '17 at 17:22
  • @johanvdPluijm, what you are (helping) doing, even with the best intentions, is a terrible idea. You are trying to create a hostile workplace. Also, I work in Bulgaria and have never seen such a practice at any company I have worked for; so don't be surprised if people see it as strange at best, and ignore it. – JohnSomeone Nov 24 '17 at 22:09
  • You are not trying to get people to report incidents. You are asking them to snitch. That's a completely different animal than "report if you suspect a virus on your system" or "bring USB sticks you find on the parking lot to IT for checking". – Tom Nov 25 '17 at 12:55

8 Answers

45

Of course no one wants to report: they are "turning in" their peers. Also, the time and complexity it takes to go through the reporting process you described is yet another disincentive. You are only going to get low compliance if everything is a negative.

And ... YOU CANNOT FORCE PEOPLE TO DO ANYTHING!!

You are approaching the problem backwards. You need to:

  1. use technical controls so that people do not have to think (set an auto-lockout time on idle workstations; see the config sketch after this list)
  2. reward people for doing the right thing (and no, reporting their peers is not the right thing)
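
As a minimal sketch of what that technical control can look like on a single Windows machine: the screen-saver policy values below (ScreenSaveActive, ScreenSaverIsSecure, ScreenSaveTimeOut) are the standard Windows policy names, but verify them for your Windows version, and in a domain you would normally push the same settings centrally via Group Policy rather than script them per workstation.

```python
# Sketch: enforce an idle screen lock on one Windows workstation by writing
# the screen-saver policy values. In a real deployment, push the equivalent
# GPO from the domain controller instead of running a script per machine.
import winreg

POLICY_KEY = r"Software\Policies\Microsoft\Windows\Control Panel\Desktop"
TIMEOUT_SECONDS = "600"  # lock after 10 idle minutes

with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, POLICY_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "ScreenSaveActive", 0, winreg.REG_SZ, "1")     # enable the screen saver
    winreg.SetValueEx(key, "ScreenSaverIsSecure", 0, winreg.REG_SZ, "1")  # require a password on resume
    winreg.SetValueEx(key, "ScreenSaveTimeOut", 0, winreg.REG_SZ, TIMEOUT_SECONDS)

print("Idle-lock policy written; it takes effect after the next policy refresh or logon.")
```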

Instead of punishing non-locked stations, reward people who locked their stations! Praise them publicly, offer them a chocolate. Whatever works for that office/local culture.

Your focus, at the moment, is to collect metrics for your incident reports. I suggest that this is also backwards. Locking a station is a behaviour. Not locking a station is not an incident (it's an event, at best). You are never going to get accurate metrics, so I'm not sure why this would be a focus.

I know that it is a huge mental shift, but there is a big difference between an intentional act of omission or commission (not doing or doing something) to violate policy (an incident) and the inattention and inertia that result in non-compliance. You cannot confuse the two. Non-compliance is a behaviour issue, which needs to be handled (and tracked) differently.


To answer your question directly, in order to get people to do things, you need to address 3 factors:

  1. motivation
  2. ability
  3. trigger

They have to want to do it, it needs to be easy to do, and the trigger for when they are supposed to do it needs to be clear (the Fogg Model).

Scratching an itchy nose has high motivation, it's easy to do, and the itch is its own trigger. So, everyone does it reliably.

Reporting your peer for not locking a workstation has low motivation (even if you rewarded them for reporting), the process is complex, and the trigger is also not that clear. When does one deem that there is non-compliance? Does one have to be watching all the time? What if the other user stepped away and was within view of their workstation? What if the user is "looking out" for the workstation to ensure there is no unauthorised access?

You simply are on the wrong side of the Fogg Model. Address these 3 factors, and you can experience high compliance.

schroeder
  • I agree with you on the fact this is a negative approach and that is why I want to change it. It is, however, even more negative for the people here when we take their freedom away on their computers. If we do that, they will perceive this as not trusting them with the thing they are the best: computers. – johan vd Pluijm Nov 24 '17 at 14:09
  • Ok, but I'd review that policy, too. Are the risks that are introduced by non-compliance worth the costs of the user not having access to a computer? They might be, but it is rarely true in most organisations. – schroeder Nov 24 '17 at 14:11
  • And yes, the fact that reports need manual input is very disappointing, but that's the way it is. The trigger is clear here; I just did not put it in the question to keep things short. Ability is at 50%, and that is changeable. Motivation will not come, so reporting won't work if I look at that model you referred to. So you basically say I should convince people to set a timeout for a lock screen (and maybe check them once in a few weeks to make sure people know they can do this and help when needed). – johan vd Pluijm Nov 24 '17 at 14:17
  • There are risks in not enabling admin on computers, so that policy is here to stay. I am not working on that subject and it is just the way it is. I personally have my doubts, but I only know a bit of that policy's reasoning – johan vd Pluijm Nov 24 '17 at 14:20
  • I'm getting confused here. "taking freedom away" does not mean "taking local admin rights away". No one should have local admin rights because the risks are too high. Admin is for Admins. – schroeder Nov 24 '17 at 14:22
  • If there is no motivation and ability is at 50%, then you need to find another approach entirely. Hiring someone to do it (like an intern who is learning about security) is actually a viable solution if you need to get this done. The intern might be the only motivated person to do it. And do not *convince* people to set a timeout: set it via group policies. Don't make people think. – schroeder Nov 24 '17 at 14:24
  • I am that intern you mentioned, but only for 3 more months. I started from almost zero. This reporting policy is something I did not create. It was there as a fun game, but that is on a level too low. The admin rights policy is something they started when the company was small. If we changed it, people would leave; it would be the one change too many. I am limited in my movements because of other (badly approached) changes, so I have to be careful. I can't turn around 180 degrees very fast, nor continue a negative spiral. Maybe you can tell me more in a Skype or other verbal meeting? – johan vd Pluijm Nov 24 '17 at 14:28
  • Let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/69219/discussion-between-johan-vd-pluijm-and-schroeder). – johan vd Pluijm Nov 24 '17 at 14:29
10

Encouraging your employees to snitch on each other by sending documentation of minor misbehavior to a centralized email address is a terrible idea for the work climate. Nobody does it, because nobody wants their colleagues to hate them and nobody wants to build a work environment governed by a culture of denunciation. The resistance to your process is not just understandable, it is completely justified.

For this specific case of enforcing the locking of workstations, I would recommend an automated process. Configure all clients to go to a screensaver when unattended for a while and require the user to reauthenticate to dismiss the screensaver. That's a configuration supported by every operating system I can think of.

Locking, unlocking, and going to the screensaver are all events you should be able to log. If some people's workstations frequently go to the screensaver while they are logged in, and they don't unlock very soon afterwards, then they likely left their desk without locking their workstation. You could register that as a very (very! (VERY!!)) minor security incident. You should only act on these incidents when they happen very frequently for specific users, and keep in mind that there are other reasons for this to happen, for example when the user was involved in a longer discussion with someone while still sitting in front of their workstation.
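
To make that detection idea concrete, here is a minimal sketch, assuming the lock/unlock and screen-saver events have already been exported from the clients (on Windows these are Security-log event IDs 4800/4801 and 4802/4803) into simple (timestamp, event ID, user) tuples; the export format and thresholds are placeholders to adapt to your environment.

```python
# Sketch: flag users whose screen saver frequently kicks in without them
# unlocking or dismissing it shortly afterwards, i.e. they probably walked
# away from an unlocked workstation.
from collections import Counter
from datetime import datetime, timedelta

GRACE = timedelta(minutes=2)   # "was still at the desk" window
THRESHOLD = 5                  # only act on frequent occurrences

def walked_away_unlocked(events):
    """events: chronological list of (datetime, event_id, user) tuples."""
    counts = Counter()
    for i, (ts, event_id, user) in enumerate(events):
        if event_id != 4802:   # 4802 = screen saver invoked while logged on
            continue
        came_back = any(
            later_user == user
            and later_id in (4801, 4803)        # 4801 = unlocked, 4803 = saver dismissed
            and later_ts - ts <= GRACE
            for later_ts, later_id, later_user in events[i + 1:]
        )
        if not came_back:
            counts[user] += 1
    return {user: n for user, n in counts.items() if n >= THRESHOLD}

sample = [
    (datetime(2017, 11, 24, 12, 1), 4802, "a.b"),   # saver starts...
    (datetime(2017, 11, 24, 12, 40), 4801, "a.b"),  # ...unlock only 39 minutes later
]
print(walked_away_unlocked(sample))  # {} until someone exceeds the threshold
```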

For more serious security incidents (compromised passwords, lost security tokens, insecure systems being set up), you should encourage self-denunciation. "Confess your sins, and you shall receive absolution." Promise that anyone who causes such an incident and reports it in a proper and timely manner will be pardoned for its consequences, while those who try to hide their security blunders will not.

Philipp
  • I like the self-reporting angle! But that usually requires quite the healthy security culture to pull off. – schroeder Nov 24 '17 at 15:48
  • I indeed see unlocked screens as a minor thing. I agree it is a negative approach and that's why I want to change it. Do you really think people will report their own mistakes? I am not sure. For me everything is new and there is a lot coming towards me. I only have one chance, and it appears I did well "on the surface / from a management perspective", but in reality it is almost a complete failure. That is what I expected, but without anyone else experienced in this topic it was difficult. If I manage to get all the workstations with automatic screen lock, what's next? (I'll stop checking, that's clear) – johan vd Pluijm Nov 24 '17 at 16:25
4

It would be interesting to speak to whoever thought this was a good way to achieve results.

The stimulus is extremely negative. You ask people to snitch on their co-workers. They must do this in full view of other co-workers (taking photographs). You clearly distrust the offender to 'own up' and the snitch to report honestly, as you require hard evidence: a photograph. Where in this scenario does an individual ever get anything positive out of it?

Reverse it. Train all staff in security risks. Online courses are easy. Focus on good behaviour. Count locked workstations at lunch. And most important: reward teams. The team with the highest number of locked workstations gets a reward. Keep a public score. Reward the team weekly. Leave individuals out of it. You want the team to correct their team members.

Last: bonus round. Go around and ask the teams to count how many times they've told a co-worker to lock their PC. Reward the team with the highest count, and reward the team that says they lock without being told.

Public rewards, open scores, no individuals.

user24119
  • FYI: the choice between a group vs individual rewards is highly cultural. Not all cultures would respond well to team rewards. – schroeder Nov 24 '17 at 16:52
  • I guess my current situation (not my idea) is the way it shouldn't be. I do train people, offline, myself, and I just made a proposal for online training. The thing is: I want awareness and motivation; the company wants the certification at the moment. Conflicting interests, badly approached culture/company changes, and there you have a nice amount of things to worry about – johan vd Pluijm Nov 24 '17 at 16:55
2

It starts with leadership.

In my current contract, for an organization of nearly 30,000 employees and contractors, I am almost the only person who wears their photo-ID badge. Everyone else carries it, with lanyard attached, in their pocket or purse, and sidles up to doors awkwardly to gain access.

In my previous placement, everyone wore their photo-id card around their neck, at all times; compliance was in excess of 99%. And at the semi-annual presentations by management to employees, seven events total over 3.5 years, every single presenter, from the President and CEO down, wore their photo-id prominently around their necks while on stage.

As long as management and other privileged individuals are in non-compliance, your employees will be too.

1

@schroeder's answer is good, but it misses one of the huge problems with security. Instead of motivating people to do the right thing, change the thing so that the desired behavior is the default behavior, or at least the easiest possible action.

For the problem of getting people to lock their workstations when they leave, you could find an app that locks the machine when their phone's Bluetooth goes out of range. BtProx is an open source app that's been on SourceForge for over a decade (I remember playing with something like this in the early 2000s.)
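
As a rough illustration of the same idea (not how BtProx itself is implemented), here is a hedged sketch that locks a Windows session when a known phone stops being seen by a Bluetooth LE scan; it assumes the third-party `bleak` library, a phone that advertises over BLE, and the placeholder address shown.

```python
# Sketch: lock the workstation when the user's phone leaves Bluetooth range.
# Windows-only (uses the Win32 LockWorkStation call); the address is a placeholder.
import asyncio
import ctypes

from bleak import BleakScanner  # third-party BLE scanner library

PHONE_ADDRESS = "AA:BB:CC:DD:EE:FF"  # hypothetical address of the user's phone
MISSED_SCANS_BEFORE_LOCK = 3

async def watch_phone():
    missed = 0
    while True:
        devices = await BleakScanner.discover(timeout=5.0)
        if any(d.address.upper() == PHONE_ADDRESS for d in devices):
            missed = 0                      # phone is nearby, reset the counter
        else:
            missed += 1
        if missed >= MISSED_SCANS_BEFORE_LOCK:
            ctypes.windll.user32.LockWorkStation()  # lock the current session
            missed = 0
        await asyncio.sleep(10)

if __name__ == "__main__":
    asyncio.run(watch_phone())
```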

Similarly, 15 years ago I bought a little radio-dongle pair where I kept the transmitter on my keychain and plugged the receiver into the USB of my machine. When I got more than about 10 feet from the machine, the software agent locked it. (It was a great concept, but the software agent it came with was horrible, and it never caught on.)

If your office building's layout supports it, another way could be to have them log in by inserting a smart card into their workstation instead of requiring a password (smart cards are really convenient for that), and use the same smart card media (with embedded RFID) for door access. If they need their smart card to get back into the office area after using the rest room, people will train themselves quickly to keep their cards with them.

Or perhaps there is a face recognition system that would lock the machine instantly when you're not facing it, and unlock it instantly when you turn back. Apple's already offering this on phones; I'd be surprised if it didn't exist for workstations.

And there are no doubt other solutions that would make locking the workstation automatic.

If you have to train/punish/reward people to do security better, it means that your security system isn't optimal. Take that as a sign that you should look for a way to improve the usability of the system.

John Deters
  • Actually, that was my point #1 ... and locking stations was only the example given - the OP explains that the desire is for the wide range of things to report – schroeder Nov 24 '17 at 15:29
  • And programmers have been trying to design fool-proof systems since, well, they started programming, and this is where we are. There is no 'optimal' state that we can reach right now. People are still a vital part of the process. Yes, use technical controls more, but you will never reach 100% coverage, so you still need to plan for that. – schroeder Nov 24 '17 at 15:34
  • @schroeder, your answer was more about motivating users to comply, whereas mine is more about recognizing this as an indication to address the usability of the security system. – John Deters Nov 24 '17 at 16:08
  • I agree that the system is flawed. I only have limited influence on it. Do you think I should get rid of the red card system and propose the opposite: a green card system (giving green cards to good examples)? That means directly opposing my company counselor by taking a 180-degree turn. In terms of Change Management, both red and green cards are part of Red-thinking (the Colours of De Caluwé). – johan vd Pluijm Nov 24 '17 at 16:30
  • The Bluetooth thing would not work here because 1) people often leave their phone at the desk and 2) for security reasons everyone disables their Bluetooth. – PlasmaHH Nov 24 '17 at 16:59
1

I agree with all the other answers: you will never get compliance with your current system. There's zero incentive for people to participate in reporting their peers, and being reported isn't the best motivation to keep your screen locked anyway. You need to ditch this plan.

Some comments suggested you start sending out emails from unlocked computers offering to buy the whole office breakfast. Some others said this is a bad idea, because it will make people really upset. I agree that it's likely to backfire if you do it to an unsuspecting victim, but I think there's a way you can make it work.

Get a bigwig (like your CEO or someone like that) to play along with pretending they left their computer unlocked and somebody else sent out an "I'll buy everyone breakfast on Friday" email. Have them send it out themselves, then follow it up a bit later with something like, "Whoops! I forgot to lock my computer when I got up, and [chief of security or whoever] sent out that email to teach me a lesson. I guess I will have to bring donuts on Friday for everyone. It's lucky that's the worst that happened, when [some other real and serious threat, like "hackers would love to steal our credit card database"]. I know I've learned my lesson, and I hope you all have, too."

This serves multiple purposes:

  • It shows executives are being held accountable, and that they're serious about the policy

  • It demonstrates that there are real risks associated with leaving your computer unlocked, instead of some vague boogeyman

  • It gives people a reason to talk about the policy in a positive manner (they will probably think it's hilarious)

As a bonus, you can now periodically walk around the office and look for unlocked and unattended computers, and slap a sticky note on them that says "You must really want to buy the whole office donuts! =)" It's a consequence that's not too embarrassing, but it reminds people of the dangers. For repeat offenders, you can escalate by changing their wallpaper to a box of donuts and things like that. Hopefully people will band together and lock computers they see unattended and/or give each other gentle reminders.

Why do I think this will work? Shortly after I started my job, I was told a story of a co-worker who liked to send emails from unlocked computers offering to buy everyone lunch. I'm not convinced it's even true, but it got me into the habit of locking my laptop. I can also repeat the anecdote when I see other new hires being sloppy about locking their computers, and it doesn't come off as a lecture. I would never report my co-workers, but I have zero qualms about "protecting" them from a "threat" we both face.

In addition to that, I recommend you also configure all new computers to automatically lock after being idle for five minutes or so. Maybe also go around after the "free donuts" email and offer to set it up for people, so they "don't fall victim to any prankster-prone co-workers". It sounds like your employees are a bit protective of their computers, so make it voluntary, and let them know they can increase the time before it locks if they find it annoying; the configuration change is purely for their convenience.

You will never get 100% compliance, but these steps should set you on the path for improving your culture and making it more likely people will care about security. That's really all you can do without upsetting everyone.

Kat
0

Follow the principle of "sunlight is the best disinfectant" as opposed to the principle of "punish offenders into submission". Make it a game, not snitching.

First of all, let there be no repercussions for transgressions, so that "reporting" is not "snitching". Then prepare a reward for the person with the most reports. Perhaps the INFOSEC team will treat the employee with the highest number of reports to a pizza once per week.

dotancohen
-3

A security incident is a breach or an attempted breach. Someone failing to lock a workstation is neither, though it may in some cases allow a breach to happen.

Try a different approach to promoting security behaviors such as locking workstations: Start a game where employees send prank emails ("Hey, I'm washing cars for free today!") to their team from their teammates' unlocked workstations. Very quickly you'll gain compliance with the security policy, nobody has to spend time filing useless "incident" reports, nobody has to spend time counting and monitoring said reports.

  • Incidents are defined by the org. Incidents do not have to be defined so strictly, but I agree that unlocked stations are not typically defined as an 'incident'. Also, note that accessing another computer and system account (email) can be (and often is) in violation of policy, and actually *does* constitute an 'incident' in most definitions (and is illegal in some jurisdictions). – schroeder Nov 24 '17 at 18:06
  • I believe your idea is not very different from the status quo, which I want to change. I think people will not like it when someone else actually harms their self-confidence and privacy, and by accessing their computer and using it to send those messages, that is what you do. An incident is defined here as a (potentially) unsafe situation or a potential breach. – johan vd Pluijm Nov 24 '17 at 18:17
  • Sending prank emails as described here is standard procedure at my workplace. Of course it's not foolproof, but I believe it does help. – David Z Nov 24 '17 at 23:28
  • David, in some organisations doing that would be a disciplinary breach of policy... – Rory Alsop Nov 25 '17 at 08:14