I never intended to write a daily column on the intersection between technology and the law, but I suppose that as technology has an ever-greater impact on daily life, conflicts will abound, making careful and informed analysis of the issues that much more important. The New York Times just published an article about one such conflict, this time involving computer security.
Likely due to the events surrounding Wikileaks' disclosure of classified documents, followed shortly thereafter by Sony Corporation's negligence in handling millions of customers' credit card numbers, the Obama administration and Congress have decided that the government should finally do something about what is being called the "cyber threat." In short, the administration wants to make it clear that Bad Guys, e.g., hackers, will be punished.
To be abundantly clear, I have no problem with punishing Bad Guys. I've watched malicious hackers destroy far more than mere bits on a hard drive, and then laugh about it. Where I get worried, and I do mean very, very worried, is about that rare instance when I'm accused of being the Bad Guy, and no one knows enough to say otherwise.
It's hard to describe the level of stress that the threat of a prison sentence can induce in a person who hasn't committed a crime. Throughout my life I've encountered my fair share of stressful situations, and two types stand out. One of them involves handling a family member's mental illness. The other involves what happened when I attempted to register my company as a federal contractor through the General Services Administration (GSA).
In retrospect, what happened with the GSA could have been much worse. No one died, and no one went to jail. That being said, it was absolutely terrifying at the time, and unless they are careful, President Obama and Congress are on course to ensure that my story is repeated many, many times over.
Modern society is completely dependent upon relational database technology, developed by IBM in the 1970s and popularized by Oracle soon after. Today, programmers can download incredibly powerful database software for free and write programs with it—and many do. Some of these software applications are well-crafted. Many are not. Some are personal projects. Some are multi-million-dollar, enterprise-grade, mission-critical programs that run the country, whether for private enterprises or government agencies. One might think that all enterprise-grade systems fall into the "well-crafted" category, leaving "not" for the personal projects, but nothing could be further from the truth.
In 2004, shortly after graduating from college, I found a series of serious security flaws in one system after another, and even considered starting my career as a computer security consultant. I discovered that the username and password needed to control South Station's merchant wireless network were "south" and "station," respectively (see http://www.thinkcomputer.com/corporate/whitepapers/southstation.pdf). I then realized that my company's payroll vendor, a company called PayMaxx, was exposing my social security number, home address, and salary data, along with the data of many thousands of others (see http://www.thinkcomputer.com/corporate/whitepapers/identitycrisis.pdf). I even found security flaws in the then-emerging social networking web site, Facebook. Each time, the corporation responsible ignored my reports, rebuffed my advice, and refused to fix the problem until I involved the press. I was certainly never invited to fix it for them, let alone paid for my time. The ensuing fallout damaged PayMaxx so much that it threatened to sue my company, and then agreed to sell itself to CompuPay, taking on its corporate parent's name.
As a slew of other security breaches appeared in the press in 2005, the GSA contracted with Unisys, a large government IT contractor, to build a software application called eOffer. The goal of the eOffer initiative was to automate the incredibly complex paper workflow necessary to approve companies for selling goods and services to the federal government. It was an expensive project requiring many programmers and resources. To secure the contract, Unisys employed one Jack Abramoff, among others, as a lobbyist in Washington, D.C. via the law firm of Greenberg Traurig LLP.
When I signed up for eOffer I was informed that I needed a digital encryption certificate from AT&T, which I obtained and installed. It quickly became clear that the system had an enormous design flaw: anyone with such a certificate—and anyone could get one—could sign in as any company in the Department of Defense's Central Contractor Registration (CCR) database. At the time, approximately 400,000 companies nationwide were registered, including of course every major defense contractor, technology company, and almost all of the Fortune 1000. Once signed in, it was possible to add fake records, delete legitimate records, and edit records. It was also clear that the system did not actually delete data properly, which carried additional security risks. (See http://www.thinkcomputer.com/corporate/whitepapers/restassured.pdf.)
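In code terms, the flaw was a missing authorization check: the system verified that a caller held *a* valid certificate, but never verified that the certificate belonged to the particular company whose records were being viewed or edited. A minimal sketch of the pattern (all names hypothetical; this is not the actual eOffer code, which I never saw):

```python
# Hypothetical illustration of the eOffer-style flaw: authentication
# (do you hold a valid certificate?) without authorization (is that
# certificate bound to the company you are trying to modify?).

REGISTERED_COMPANIES = {
    "ACME-001": {"name": "Acme Corp", "records": []},
}
# Anyone could obtain a valid certificate, so this set effectively
# includes every member of the public willing to fill out a form.
VALID_CERT_SUBJECTS = {"Acme Corp", "Some Other Company"}

def edit_company_flawed(cert_subject, company_id, new_record):
    # Flawed check: authentication only. Any certificate holder may
    # edit ANY company's registration data.
    if cert_subject not in VALID_CERT_SUBJECTS:
        raise PermissionError("invalid certificate")
    REGISTERED_COMPANIES[company_id]["records"].append(new_record)

def edit_company_fixed(cert_subject, company_id, new_record):
    # Correct check: authentication plus authorization. The certificate
    # subject must match the specific company being modified.
    if cert_subject not in VALID_CERT_SUBJECTS:
        raise PermissionError("invalid certificate")
    if REGISTERED_COMPANIES[company_id]["name"] != cert_subject:
        raise PermissionError("certificate not authorized for this company")
    REGISTERED_COMPANIES[company_id]["records"].append(new_record)
```

The flawed version is the design I encountered: possession of any certificate granted write access to all 400,000 registrations.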
After the shock of the discovery wore off, I reported my findings to the GSA's Inspector General (IG) in mid-December 2005. After roughly a month during which the GSA IG ignored the problem completely, I contacted the press to pressure the government to actually fix the problem, which affected my business just as it did thousands of others. The article that John Markoff ultimately authored for The New York Times touched off a firestorm of activity in Washington. Two senators wrote letters to the agency's interim head asking, among other things, whether the GSA had violated the Federal Information Security Management Act. Within days, at the request of the GSA CIO responsible for eOffer, I became the only suspect in a criminal investigation. Congressional staff members and the GSA IG refused to talk to me, even as I tried to alert them to additional problems with the system and the link to Jack Abramoff, who was being investigated for fraud. (Abramoff's actions on behalf of another client eventually led to his incarceration and the ouster of the GSA's top executive.) Not long after, two armed federal agents appeared on my doorstep. One of them, who had no background in computer crime, handed me a copy of the Computer Fraud and Abuse Act of 1986 and told me that I should read it. A month later, at my own expense, I was on a plane to Washington to explain the nature of the flaws to a U.S. Attorney and five federal agents employed by the GSA IG. My lawyer, who sat silent for two hours as I answered the government's questions, charged several thousand dollars, which the GSA later refused to reimburse.
The reason I needed a lawyer at all was that the USA PATRIOT Act had given sharp teeth to 18 U.S.C. § 1030. Punishment was very specifically "a fine under this title or imprisonment for not more than ten [in another clause, twenty] years, or both." Even though the statute said that a criminal would be someone who "knowingly and with intent to defraud" accessed a computer system, it still did not clearly define whether the "intent to defraud" standard applied to all of the Act, or just part of it.
Lucky for me, the U.S. Attorney didn't take long to figure out that I had no intent to defraud anyone—otherwise it would have made no sense for me to report the problems to the agency in the first place. Yet it didn't change the fact that just like the South Station technology vendor, PayMaxx, and Facebook, the GSA had no interest in hearing that its software was flawed, and it didn't change the defensive legal posturing that followed in almost every case. The GSA IG did eventually release a report on eOffer, and it also responded to the requests of Congress. You can guess who they blamed. (Hint: it wasn't Unisys, nor was it the Bush-appointed agency CIO who had decided that it made sense to create a crucial government system linked to the Department of Defense with no user accounts or passwords. See http://www.aarongreenspan.com/filing/20060221.gsa.pdf and http://www.aarongreenspan.com/filing/20060228.gsa.pdf.)
Computer security is a complex issue, and part of that complexity is that it is a two-way street. Malicious hackers should clearly be held liable for malicious actions, but corporations and government organizations that have been clearly and definitively warned about flaws should be equally liable for willful negligence. Good Samaritans should be liable for neither. Right now, the amended Computer Fraud and Abuse Act allows—and even encourages—system operators to shoot the messenger. That's clearly not the way it should work, for it discourages legitimate reporting of flaws, which increases the likelihood that those same flaws will later be exploited maliciously, to society's detriment. If President Obama thinks that the law isn't tough enough already, then just wait. With more heavy-handed statutes and overzealous prosecution, the "cyber threat" will merely grow larger. For years the Government Accountability Office (GAO) has routinely given government agencies flunking grades on security (see http://www.gao.gov/new.items/d03303t.pdf, http://www.gao.gov/new.items/d01171.pdf, http://www.gao.gov/new.items/d02231t.pdf, http://www.gao.gov/archive/2000/d01600t.pdf, etc.), and as Sony has so aptly demonstrated, private industry isn't much better.
The one thing new legislation should do is require companies, especially banks, government agencies, and health care organizations, to have a defined channel for reporting security flaws anonymously and in detail. By reporting the GSA eOffer flaw to the agency's Inspector General, I followed proper procedure (and was punished for it), but most of the time, there is no proper procedure. I reported the PayMaxx flaw to the only people who would listen—my sales representative and customer service representatives—and unsurprisingly, the critical information went nowhere. I reported the Facebook flaw to Mark Zuckerberg, and unsurprisingly, he placed the blame on someone else, telling me that he hadn't written the code in question. Responses like these aren't good enough. This is why we've put a link to our security response form at the bottom of every single page of the FaceCash payment system web site. (If you run a web site, you should do the same.)
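For illustration, here is a minimal sketch of what the server side of such a reporting channel might record, and deliberately not record (all names are hypothetical; this is not FaceCash's actual code):

```python
# Hypothetical sketch of an anonymous security-report intake routine.
# The design point: store the report in detail, but store nothing that
# identifies the reporter (no IP address, no name, no email), so that a
# Good Samaritan can report a flaw without fear of becoming a suspect.
import hashlib
import time

REPORTS = []  # in a real system, a database table

def accept_security_report(description: str, affected_url: str) -> str:
    """Store an anonymous vulnerability report; return a tracking ID."""
    if not description.strip() or not affected_url.strip():
        raise ValueError("a report needs a description and an affected URL")
    tracking_id = hashlib.sha256(
        f"{time.time()}:{description}".encode()).hexdigest()[:12]
    REPORTS.append({
        "id": tracking_id,
        "received": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "url": affected_url,
        "description": description,
        # deliberately absent: reporter identity of any kind
    })
    return tracking_id
```

The tracking ID lets the reporter follow up voluntarily without the operator ever learning who they are.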
If President Obama were really thinking ahead, he'd broaden the scope of whistleblower protection laws. Right now they cover government employees who report fraud, but not private citizens who report fraud (or in this case, technical flaws) in the government. Unfortunately, that seems rather unlikely to happen. (See http://mobile.salon.com/news/opinion/glenn_greenwald/2011/05/16/whistleblowers and http://www.newyorker.com/reporting/2011/05/23/110523fa_fact_mayer?currentPage=all).
Wikileaks was indeed a serious security problem from the standpoint of the State Department, but Washington is dangerously sidetracked. At this rate, as the saying goes, you ain't seen nothing yet.