
Lessons Learned From An IT Horror Story

I was going to title this blog, “The Money Is Gone.” You’ll see why as you read further down.

There are thousands of scary IT and security stories on the Internet. If you read them regularly, as I do, you begin to see a pattern. Users fall for a password scam, hackers get legitimate user IDs and passwords, the organization suffers, and sometimes the event is large enough to become newsworthy. While I have suffered a variety of security breaches, incidents, and other user-error events throughout my career, none have become newsworthy. That said, my scariest IT security day started with a phone call from the CFO, who immediately yelled, “The money is gone!”

Many hacking stories on the Internet cover the actual event, such as a breach of IDs and passwords, but they do not typically follow the event through to the “so what,” the real damage. Losing your ID or password is no more eventful than losing your house keys these days, unless someone uses it to hurt you. If you lost your house key and an armed robber showed up the first night to steal your valuables and hurt your family, the loss of that key would become a very big deal; it would become a “so what.” The “so what” of losing your ID and password can be just as bad if thieves use that information to steal your paycheck, your expense check, or otherwise take money from your checking account. That is exactly what happened on the scariest day of my IT security career.

The precursor events to that scariest day started more than two months before anyone realized a hack had occurred. In the first 30 days, a couple of expense checks did not show up on payday, but employees did not think anything of it because expense checks sometimes require more than one pay cycle to post. By the second cycle, however, all of the money due to employees was gone. This was not a clerical delay or a processing error: the checks had been posted, just not to employees’ checking accounts. That is when I got the call and the message, “The money is gone.”

As my security team engaged, they quickly determined that checking account numbers had been changed in the payments processing system. This discovery raised many red flags and triggered a system lockdown as part of activating the security incident response plan. Once systems were locked down, the next level of the investigation, including log reviews, archive searches, deep forensics using our internal resources, and other incident response actions, led us to unusual email activity and folders the employees did not recognize hiding inside their email accounts. The third level of the investigation revealed email rules that moved alerts from the payments processing vendor into these mystery folders, hiding the fact that hackers were changing deposit information in the payment processor’s system.
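For readers who want to hunt for this kind of tampering in their own environment, the short sketch below shows one possible way to sweep mailboxes for inbox rules that quietly move or delete a vendor’s alerts. It is an illustration, not the forensic process our teams followed: it assumes a Microsoft 365 tenant, a Microsoft Graph access token with permission to read mailbox rules, and a made-up vendor domain (payvendor.example) and user address.

import json
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<Microsoft Graph token with permission to read mailbox rules>"
VENDOR_DOMAIN = "payvendor.example"  # hypothetical payment-processor domain

def suspicious_inbox_rules(user: str) -> list:
    """List inbox rules for `user` that mention the vendor and move or delete mail."""
    url = f"{GRAPH}/users/{user}/mailFolders/inbox/messageRules"
    resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, timeout=30)
    resp.raise_for_status()
    flagged = []
    for rule in resp.json().get("value", []):
        # Serialize the rule's conditions so we can scan them for the vendor domain
        conditions = json.dumps(rule.get("conditions") or {}).lower()
        actions = rule.get("actions") or {}
        mentions_vendor = VENDOR_DOMAIN in conditions
        hides_mail = bool(actions.get("moveToFolder") or actions.get("delete"))
        if mentions_vendor and hides_mail:
            flagged.append(rule.get("displayName", "<unnamed rule>"))
    return flagged

if __name__ == "__main__":
    for name in suspicious_inbox_rules("employee@example.com"):
        print("Review this inbox rule:", name)

In our case the rules were found by hand during the forensic review; a sweep like this simply shows how little code it takes to surface the same pattern across many mailboxes.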

As the investigation deepened, we activated a higher-level security incident response to bring in our external security partners. With our internal and external security teams investigating, we found that employees had fallen prey to a password-harvesting email more than two months earlier and that they were using the same ID and password for multiple work accounts, including the external payment processor. This unfortunate decision to ignore the most basic password common sense, the first rule of passwords we have all learned, led to employees losing thousands of dollars over a two-month period. All of it leading to the CFO calling me to say, “The money is gone.”

Reflecting on this scariest day of my IT leadership career, I first feel sympathy for the employees who lost thousands of dollars. They were not ignorant, lazy, or silly. They were trusting people following the rules and trying to make their lives just a little bit easier by not having to remember yet another infernal password. Beyond sympathy, though, I felt a sense of responsibility. No organizational rule or process caused the employees to make poor password choices; however, I did feel that proper training, testing, and password remediation might have helped those employees make better password choices and prevented them from losing thousands of dollars.
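One concrete piece of that remediation is screening passwords against lists of credentials already exposed in breaches. The sketch below uses the public Pwned Passwords k-anonymity API to do that check without ever sending the full password, or even its full hash, off the machine. It is a generic illustration rather than the tooling we used, and the sample password is obviously made up.

import hashlib
import requests

def times_breached(password: str) -> int:
    """Return how many times a password appears in the Pwned Passwords corpus.

    Only the first five characters of the SHA-1 hash leave the machine
    (the API's k-anonymity model); the password itself is never transmitted.
    """
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=30)
    resp.raise_for_status()
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    hits = times_breached("Spring2024!")  # made-up example password
    print("Seen in breaches:" if hits else "Not found in breaches", hits or "")

Pairing a check like this with mandatory resets, unique passwords, and multi-factor authentication on systems like the payment processor is the kind of remediation that might have kept those deposit changes from ever happening.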

That internal questioning led me to ask how much security training is too much, a question that still does not have a clear answer today. As an experienced organization leader, I have discussed the “how much” question with many other leaders and heard of practices ranging from daily phishing tests with loss-of-job consequences to an annual slide show that simply checks the box for the auditors. I suspect the right answer lies somewhere in between those extremes for most organizations.

As we reflected on the event during our postmortem, several questions emerged, including how much testing is too much, and how do we protect people from themselves while also protecting the organization from the people who work there? As we struggled with those questions, we also had to acknowledge that no matter how much you prepare, test, and protect, there will always be one person, one scam, or one missing software patch that causes you to step into the unknown and leads to another scary day. However, the unknown does not have to be the unplanned.

Scary days happen in security. They always have, and they always will. Prevention and preparation are the building blocks you can deploy to reduce the number of scary days and the magnitude of their impact on your organization. Your efforts as a security leader directly influence the outcome of the next IT horror story, so that IDs and passwords harvested from a phishing attack do not lead to your CFO calling to say, “The money is gone!”

 

If you’re looking for IT solutions or help with your security, contact iT1 today to learn more about our Security Risk Assessment.

 

 

AUTHOR BIO
Dr. Mike Lewis serves as Chief Information Officer, EVP of Informatics, Security & Technology for Trillium Health Resources, a managed-care organization serving more than 350,000 members in North Carolina. He earned his Doctor of Management degree from George Fox University and is a former MBA adjunct professor at Marylhurst University. Mike has worked in the IT field for more than 25 years with stints at IBM, Merisel, and Dell.

 
