By many accounts, computer crime has shot rapidly to the top of many companies’ risk management lists. The possibility of a company’s most valuable intellectual property falling into a competitor’s hands, or of the data systems of its entire operation crashing at a hacker’s whim, is — or should be — striking fear into the hearts of many CFOs.
As a result, a counter-initiative is gathering steam in Corporate America. At its most robust level, Microsoft and other corporate giants, often in tandem with national and local law enforcement, are seeking out and dismantling the extensive networks of compromised computers, known as botnets, through which criminals operate.
But most companies don’t have the money for the technology needed to mount such attacks, or for the lawyers they might need to avoid the stiff liabilities likely to follow a misguided counter-attack. Yet there’s little doubt that many companies are taking a more aggressive approach to curbing cyber attacks than they once did.
Short of outright counter-attacks, there are a number of things employers can do to understand their cyber enemies and foil them — some intricately technological, some that simply involve common sense. On the more aggressive side is a technique that involves setting up a “honey pot” to lure hackers to an alternative site, where they can be studied and deterred.
Basically, a honey pot is a trap. For instance, it may be a site that seems to be part of a network that contains data of high value to the villains. In reality, it’s separate from the company’s existing site, or cordoned off within it. Because it’s isolated, hacker activity can be closely monitored.
One cost-free, open-source, collective effort, Project Honey Pot, purports to be “the first and only distributed system for identifying spammers and the spambots they use to scrape addresses from your website.”
The project is a distributed network of decoy web pages that companies’ web administrators can use on their sites to gather information about potential attackers. By using the Project Honey Pot system, corporations can install email addresses that are custom-tagged to the access times and Internet Protocol (IP) addresses of visitors to their site.
If one of those addresses starts getting email, Project Honey Pot technicians can tell that the messages are spam, as well as “the exact moment when the address was harvested and the IP address that gathered it,” according to the project’s website.
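The mechanism behind such tagged addresses can be sketched in a few lines. This is not Project Honey Pot’s actual implementation — the key, domain, and address format below are illustrative assumptions — but it shows the general idea: each decoy address encodes the visitor’s IP and access time, so any spam that later arrives at it identifies exactly who harvested it and when.

```python
import hashlib
import hmac
import time

# Hypothetical server-side secret; in practice this would never be exposed.
SECRET_KEY = b"server-side secret"

def tagged_address(visitor_ip, ts=None):
    """Generate a decoy email address bound to a visitor's IP and access time."""
    ts = int(time.time()) if ts is None else ts
    payload = f"{visitor_ip}|{ts}".encode()
    # The HMAC makes the tag tamper-evident: a harvester cannot forge a
    # valid-looking address that points at someone else's IP.
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()[:12]
    return f"hp-{ts:x}-{tag}@decoy.example.com"

def verify(address, visitor_ip):
    """Check whether a decoy address was issued for the given IP."""
    try:
        _, ts_hex, tag = address.split("@")[0].split("-")
        ts = int(ts_hex, 16)
    except ValueError:
        return False
    payload = f"{visitor_ip}|{ts}".encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()[:12]
    return hmac.compare_digest(tag, expected)
```

When spam shows up at one of these addresses, decoding the timestamp and re-checking the tag against server logs pinpoints the harvesting visit.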
Relatively simple to implement, the systems have the benefit of bearing relatively small legal risks, according to Daniel Garrie, executive managing partner for Law and Forensics, a boutique legal-strategy consulting firm.
Unlike the most aggressive forms of pre-emptive corporate cyber-defenses, “they never capture or attack other systems,” the attorney says. They would be illegal if they did.
However, he adds, “you could use a honey pot in a fashion that could certainly end up creating … morally and ethically and legally questionable results.” One such instance could occur if the attacker’s computer interacts with the honey pot in such a way that the corporation inadvertently ends up stealing from or attacking another computer — even the hacker’s, attorneys say.
For his part, Paul Paray, a partner in InfoLawGroup, agrees that a honey pot can help a company understand “what took place and how [the attackers] went through [its] network.” But beyond a slight deterrent effect if bad actors hear of the company’s use of the technique, there isn’t much benefit in it for most companies, he says.
Block and Tackle
Instead, “there are a number of approaches you could take for risk management purposes that are proactive and are aggressive that don’t entail … trying to disable potential hackers,” according to Paray.
He calls such approaches “block-and-tackle endeavors,” meaning that relatively small adjustments to corporate policies and procedures can yield big data-protection results.
For example, being aware of the data your company maintains or processes “is a very basic risk management question that needs to be answered before anyone goes and tries to do anything exotic or expensive.”
Included in such awareness is knowing where data is stored and how it’s disposed of. For instance, Paray asks: “When you provision laptops, do you generally know what your employees are storing on those laptops?”
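Paray’s question — do you know what employees are storing on their laptops? — can be answered, at a first pass, with a simple inventory scan. The sketch below (a hypothetical example, not any particular vendor’s tool) tallies files under a directory by extension, which is often enough to reveal whether spreadsheets, databases, or archives are accumulating where they shouldn’t.

```python
import os
from collections import Counter
from pathlib import Path

def inventory(root):
    """Tally files under `root` by extension: a first-pass answer to
    'what is actually stored on this machine?'"""
    counts = Counter()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            # Normalize case so ".TXT" and ".txt" count together.
            counts[Path(name).suffix.lower() or "(none)"] += 1
    return counts
```

Run across a fleet of provisioned laptops, even a crude count like this turns the “very basic risk management question” into data a CFO can act on.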
Employer databases should carefully differentiate between the data kept on the laptops of different kinds of employees — for example, those in sales versus those in marketing.
Employers should also pay special attention to whether or not the laptops are encrypted, according to Paray. Further, since laptop encryption can be a lengthy process, the company should have a policy in place stating which kinds of laptops should be encrypted.
Encrypting laptops can also protect employers from being sued for not providing notification after laptops have been lost.
The HITECH Act and many other notification laws provide a safe harbor for employers who encrypted lost equipment, according to the attorney. “If you have an encrypted laptop that’s been lost, you may be able to argue you do not have a duty to disclose,” Paray says.
Even attending to the very basic issue of the provisioning of laptops can yield significant results in terms of data security. For example, a company that gives employees new laptops only when their machines are lost or stolen is going to see an uptick in lost or stolen laptops. “It’s just human nature,” Paray says. “If that’s the case and the laptops aren’t encrypted, that can trigger a good number of notification laws.”
Another area in which simple changes can avoid big problems is BYOD — “bring your own device” policies that let employees use personal devices at work. For example, employers can bar employees from downloading zip files — “a known conduit for malicious code” — from smartphones onto company systems, he says.
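A filter of the kind Paray describes can be sketched simply. The list of blocked formats and the gateway setting below are assumptions for illustration; the important design choice is checking the file’s magic bytes as well as its name, so a zip archive renamed to look like a PDF is still caught.

```python
from pathlib import Path

# Illustrative list of zip-based formats a company might block.
BLOCKED_EXTENSIONS = {".zip", ".jar", ".apk"}
ZIP_MAGIC = b"PK\x03\x04"  # first four bytes of a standard zip archive

def is_blocked(filename, first_bytes):
    """Return True if an incoming file should be rejected as a zip archive.

    Checks both the extension and the magic number, so renaming the
    file does not evade the filter.
    """
    if Path(filename).suffix.lower() in BLOCKED_EXTENSIONS:
        return True
    return first_bytes.startswith(ZIP_MAGIC)
```

A gateway that applies this check to every file transferred from an employee-owned device enforces the policy without touching the device itself.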
“Now that smartphones are acting as de facto computers for an expanding number of employees, the bottom line is you need to be on top of that,” Paray adds.
Illustration by Pearson Scott Foresman, via Wikimedia Commons