Friday, January 30, 2009

GAO: FinCEN InfoSec Program = Bad

Like most bad news in Washington, this GAO report was released on a Friday afternoon. This particular Friday afternoon happens to fall right before the Super Bowl, of all things. So this is a special Friday, where the report is almost certain to be overshadowed by the drama of the “Big Game”.

This report (that I actually got to read, believe it or not) tells the story of the security posture of the Financial Crimes Enforcement Network (FinCEN). FinCEN is responsible for some important things, not the least of which are keeping money laundering to a minimum, stopping terrorist financing, and investigating other financial crimes. You may be interested to know that it has ties to many financial institutions, casinos, and other places where big money may be. This is also the group that banks notify when you move $10,000 or more.

Ok, so now you know who they are and what they do. Here's the rub:

Although FinCEN, TCS, and IRS have taken important steps in implementing numerous controls to protect the information and systems that support FinCEN’s mission, significant weaknesses existed that impaired their ability to ensure the confidentiality, integrity, and availability of these information and systems. The organizations have implemented many security controls to protect the information and systems. For example, FinCEN employed controls to segregate areas of its network and restrict access to sensitive areas, and IRS controlled changes to a key application in its BSA processing environment. However, weaknesses existed that placed sensitive data at risk of unauthorized disclosure. The organizations did not always consistently apply or fully implement controls to prevent, limit, or detect unauthorized access to devices or systems. For example, the organizations had not consistently or fully (1) implemented user and password management controls for properly identifying and authenticating users, (2) restricted user access to data to permit only the access needed to perform job functions, (3) encrypted data, (4) protected external and internal boundaries, and (5) logged user activity on key systems. Shortcomings also existed in managing system configurations, patching systems, and planning for service continuity. As a result, increased risk exists that unauthorized individuals could read, copy, delete, add, and modify data and disrupt service on systems supporting FinCEN’s mission.

Holy F@%!, Batman!

I would say this falls in the category of jobs you don't want to have, or at least wouldn't have wanted. One thing I think I can infer from the report is that the system is not classified, meaning people didn't need clearances to work on it. There doesn't appear to be any discussion of that in the report, and I am not saying there needs to be. My point is that there is plenty of documentation and business process guidance out there to support doing this better.

Now, I am not going to keep writing and lamenting that our data isn't safe. This is a tough job, and information security is usually something that gets tacked on. The ST&E portions of the certifications on the system were probably rushed and people missed things. Some things get rushed out the door, some risks get accepted, whatever.


But seriously, there are some pretty basic things that the Continuous Monitoring efforts should have taken care of: excessive user rights, unused accounts, limited or missing encryption. Read the report for yourself; it reads like a manual on how not to do information security. And now for the moral of the story.

The answers here, and with most organizations, will not lie with new technology but with leadership, a plan, and processes. Some of it, like the mainframe, sounds like it needs an upgrade. The common theme, though, is operational: keeping an eye on user accounts, monitoring the logs, watching the IDS. Those are things that need humans with eyes and analytical skills.
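
To make that last point a little more concrete, here is a minimal sketch of the kind of continuous-monitoring check I mean: flag accounts that haven't logged in recently and list who is sitting in privileged groups. It assumes a Linux host with the standard lastlog command; the group names and the 90-day threshold are purely illustrative and not anything taken from the GAO report.

```python
#!/usr/bin/env python3
"""Toy continuous-monitoring check: flag accounts with no recent login and
list privileged group membership. A sketch, not a real CM program."""

import grp
import subprocess

PRIVILEGED_GROUPS = ("wheel", "sudo", "adm")  # illustrative; use your own baseline
STALE_DAYS = 90                               # illustrative threshold

def stale_accounts(days=STALE_DAYS):
    """Users whose last login is older than `days` (wraps `lastlog -b`)."""
    out = subprocess.run(["lastlog", "-b", str(days)],
                         capture_output=True, text=True, check=True).stdout
    return [line.split()[0] for line in out.splitlines()[1:] if line.strip()]

def privileged_members():
    """(group, members) pairs for each privileged group present on the host."""
    results = []
    for name in PRIVILEGED_GROUPS:
        try:
            results.append((name, sorted(grp.getgrnam(name).gr_mem)))
        except KeyError:
            continue  # group not defined on this host
    return results

if __name__ == "__main__":
    print("Possibly unused accounts (no login in %d days):" % STALE_DAYS)
    for user in stale_accounts():
        print("  " + user)
    print("Privileged group membership to review:")
    for group, members in privileged_members():
        print("  %s: %s" % (group, ", ".join(members) or "(none)"))
```

Nothing fancy; the point is that a human still has to look at the output and decide what belongs there.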

Friday, January 9, 2009

Predictions for 2009 (Because all the cool kids are doing it)

And the answer is:

Nothing.

That's right, I said it. Given the economy and the state of things, stuff like Policy Compliance and Risk Management is going to be sitting in the corner. Caveat: unless there is a dramatic change.

That change would be something from the White House or Congress or (dare I say) Al Qaeda. If the deciders decide to take regulations and compliance seriously and start adding requirements to things like the TARP or whatever, then we could see something new.

But the new FISMA does not provide for any changes to the current FIPS / 800-series documentation. It is the same ambiguous pain that we have all been suffering through.

Lastly, there isn't going to be any HOT new security technology coming out. It will be more of the “my web app just got hacked” / Facebook malware / Twitter worm stuff that has been emerging over the last six months.

We'll see. Improve your Process!

Saturday, November 22, 2008

Scan, Baby, Scan!

I recently experienced death by scanning (maybe death is too strong; call it extreme pain). The system I am supporting is eventually used by the government, which means FISMA and, by extension, certification agents. I still do some certification agent work, and I will say that it is still a foggy area. Some will take it to the Nth degree and dig into every facet and crevice to get the best assurance possible. Others will do a superficial scan and call it a day.

My current annoyance with trying to ascertain the security posture is this: the management of the little system wants as few vulnerabilities as possible, obviously. So naturally, their scanning policy is tailored to the system and to the accepted risks. The certification agent has their own process for assessment, good for them. However, that process does not include tailoring it to the environment.

The two processes are at odds with one another, so we are constantly chasing vulnerabilities. Different tool sets, different timelines, different policy baselines, plugin updates; the list goes on. When I try to convince them that less scanning needs to happen, we in fact get more.
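
As a sketch of what getting on the same page could look like, the sort of thing I have in mind is dumping both sides' findings into something common, like CVE IDs, and then arguing about one list instead of two. This assumes each scanner can export a CSV with a cve column; the file names and column name below are made up for illustration.

```python
"""Sketch: reconcile findings from two scanners by normalizing on CVE IDs.
Assumes each tool can export a CSV with a 'cve' column; the file names and
column name here are hypothetical."""

import csv

def load_cves(path, cve_field="cve"):
    """Return the set of CVE IDs reported in one scanner's CSV export."""
    with open(path, newline="") as handle:
        return {row[cve_field].strip().upper()
                for row in csv.DictReader(handle)
                if row.get(cve_field, "").strip()}

ours = load_cves("our_scan.csv")          # hypothetical export from our toolset
theirs = load_cves("certifier_scan.csv")  # hypothetical export from the agent's toolset

print("Findings both tools agree on:", len(ours & theirs))
print("Only our tool reports:", sorted(ours - theirs))
print("Only their tool reports:", sorted(theirs - ours))
```

It won't fix differing policy baselines or plugin lag, but at least the argument would be over one spreadsheet.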

Most people I talk to would agree that it is good to point as many tools as possible at your environment. Right now, though, I am frustrated by it.

There really wasn't any point to this story except that scanning with 19,000 tools is helpful only as long as everyone is on the same page and can adequately communicate that page. So far, I have apparently been an ineffective communicator, or my communication has been accepted and then set aside in an effort to portray a rosier picture.

A new plan will have to take shape now. Details to follow.

Tuesday, November 18, 2008

In which I am convinced that Cloud computing is evil

I have been to a couple of sessions at CSI over the last two days. The conference is good overall; it appears to be well organized and the speakers have been engaging. Today, I attended a session called The Fate of the Secure OS. There was discussion about many topics, including arcane, outdated, and poorly supported operating systems, as well as some discussion about maintaining configurations and keeping your users informed. But there was also a presentation on ... Cloud Computing and Virtualization.

Up until this afternoon, I didn't think cloud computing was more than a hassle that had to be dealt with. I knew the obvious drawbacks when it came to incident handling, or questions like “where is my data actually stored”. Then I saw a presentation by Dennis Murrow of ConfigureSoft, and things got really scary.

I wish I had the slide deck to make all the points; the short version is a series of questions posed to a fictional SOA/SaaS provider:

Where is my data and how are you managing it (backups, access controls, auditing, etc)?

If I choose to leave you as a customer, can I get my data back, and what condition will it be in?

How are the underlying hardware, hypervisor, operating systems, and applications maintained and operated?

What are your policy baselines and vulnerability remediation procedures?

The list went on. To many, this is most likely old news. Judging by the way the oxygen left the room, though, many people seemed to be just realizing these issues. The speaker was also able to present this information in a way that didn't come across as FUD. It just seemed like a logical progression of things to consider before ... you know ... sending your confidential, proprietary data into the ether.

After the session, many had sworn off the idea of putting their data in a cloud computing environment. There may have been a few management types that still clung to the idea that outsourced data processing and storage was a good idea.

My end takeaway is this: there is no risk here that anyone in their right mind can accept, and there is no assurance evidence that could make me believe that in 2008 (and probably into 2009) cloud computing is a good idea. I could almost see selling “auditor me” on virtualizing a couple of servers, but the jury is still out on that one. For now, I'm with Hoff. Cloud computing needs to come along further before I can get on board, and anyone considering it ... should wait until some improvements come along.

Wednesday, November 12, 2008

This is what I am talking about

I didn't go to the presentation, but this guy's synopsis is the root of my frustration:

http://www.leune.org/blog/kees/2008/11/verizon-business-presentation.html

The idea that we can do an adequate risk assessment ... $0.
Subjecting ourselves to a fruitless process with no significant progress ... Millions
Suddenly coming to the realization that there needs to be an overhaul ... Priceless.