Cybersecurity experts say an increasing number of private or supposedly secret documents are online in out-of-the-way corners of computers worldwide, leaving the government, individuals and companies vulnerable to security breaches.
Sitting at his laptop, Chris O'Ferrell types a few words into the Google search engine and up pops a link to what appears to be a military document listing suspected Taliban and al-Qaida members, dates of birth, places of birth, passport numbers and national identification numbers.
Another search yields a spreadsheet of names and credit-card numbers. "All search engines will get you this," O'Ferrell said, pointing to files of spoils he has found on the Internet: medical records, bank-account numbers, students' grades and the docking locations of 804 U.S. Navy submarines, destroyers and other ships.
And it's all legal, using the world's most powerful Internet search engine. At some Web sites and message groups, techno-hobbyists even offer instructions on how to find sensitive documents using a relatively simple search.
Although technically it is not trespassing, the practice is sometimes called "Google hacking."
"There's a whole subculture that's doing this," said O'Ferrell, a longtime hacking expert and chief technology officer for Netsec, a security consulting firm based in suburban Washington, D.C.
In the decade they have been around, search engines like Google have become more powerful. Also, the Web has become a richer source of information as more businesses and government agencies rely on the Internet to transmit and share information. All of it is stored on computers called servers, each one linked to the Internet.
For a variety of reasons — improperly configured servers, holes in security systems, human error — a wide assortment of material not intended for public viewing is, in fact, publicly available. Once Google or another search engine finds it, drawing it back into secrecy becomes nearly impossible.
That is giving rise to more activity by "Googledorks," who troll the Internet for confidential information, security engineers said.
"As far as the number of sites affected by this, it's in the tens of thousands," said Johnny Long, 32, a researcher and developer for Computer Sciences Corp.
Google gets singled out for these searches because of its effectiveness. "The reason Google's good is that they give you more information and they give you more tools to search," O'Ferrell said.
Google's computers "crawl" every Web page on the Internet at least every couple of weeks, visiting every public server on the globe and grabbing every page and every link attached to it. The results are then cataloged using complex mathematical systems.
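The crawl-then-catalog process described above can be sketched in a few lines. This is a toy illustration, not Google's actual system: the "Web" here is an in-memory dictionary of hypothetical pages, link-following is a simple breadth-first walk, and the catalog is a basic inverted index mapping each word to the pages that contain it.

```python
# Toy sketch of a crawler and index. PAGES stands in for the Web;
# all URLs and page contents here are made up for illustration.
import re
from collections import defaultdict

PAGES = {
    "http://example.org/": 'see <a href="http://example.org/a">docs</a> budget',
    "http://example.org/a": "quarterly budget spreadsheet totals",
}

def crawl(start, pages):
    """Follow links breadth-first, returning every reachable page URL."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in pages:
            continue
        seen.add(url)
        queue += re.findall(r'href="([^"]+)"', pages[url])
    return seen

def index(urls, pages):
    """Build an inverted index: word -> set of URLs containing it."""
    idx = defaultdict(set)
    for url in urls:
        text = re.sub(r"<[^>]+>", " ", pages[url])  # strip HTML tags
        for word in text.lower().split():
            idx[word].add(url)
    return idx

idx = index(crawl("http://example.org/", PAGES), PAGES)
print(sorted(idx["budget"]))
```

A search for "budget" then amounts to one dictionary lookup, which is why a crawled document, once indexed, surfaces instantly for anyone who types the right words.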
The most basic way to keep Google from reaching information on a Web server, security experts said, is to set up a digital gatekeeper in the form of an instruction sheet for the search engine's crawler. That file, called robots.txt, defines what is open to the crawler and what is not. But if the robots.txt file is improperly configured or missing, Google's crawler can get in. And because Google's crawling is legal, no alarms will go off.
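For illustration, a minimal robots.txt looks like the fragment below (the directory names are hypothetical). It sits at the server's root, and a "Disallow" line tells crawlers to skip a path. It is an honor-system convention: well-behaved crawlers such as Google's obey it, but it provides no actual access control.

```text
# Placed at http://example.org/robots.txt (hypothetical site)
# "User-agent: *" addresses all crawlers; each Disallow lists a path to skip.
User-agent: *
Disallow: /private/
Disallow: /reports/
```

If the file is absent, or a sensitive directory is left off the Disallow list, the crawler will index whatever it can reach.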
"The scariest thing is that this could be happening to the government and they may never know it was happening," Long said. "If there's a chink in the armor, (the hackers) will find it."
Google and other search-engine officials said they aren't in a position to control the problem.
With a vast array of more than 10,000 computer systems constantly collecting new information on more than 3 billion Web pages, Google cannot and does not want to police or censor what goes on the Web, said Craig Silverstein, the company's chief technology officer.
"I think Web masters have to be careful," he said. "The basic problem is that with 3 billion (Web pages), there's a lot of information out there." Google's own site offers "Webmaster guidelines" explaining how to remove pages from Google's system, including its vast store of cached pages that may no longer be available online, Silverstein said.
For hacking experts, Google-hacking has a kind of populist allure: Anyone with Internet access can do it, if they know the right way to search.
"It's the easiest point-and-click hacking — it's fun, it's new ... and yet you can achieve powerful results," said Edward Skoudis, a security consultant for INS Inc., which helps government and business clients monitor what is visible from the Web. "This concept of using a search engine for hacking has been around for a while, but it's taken off in the last few months," probably because of a newfound enthusiasm in the underground hacking community, he said.
Search strings including "xls," "cc" or "ssn" often bring up spreadsheets, credit-card numbers and Social Security numbers linked to a customer list. Adding the word "total" often pulls up financial spreadsheets totaling dollar figures. An experienced hacker can find an alarming amount of supposedly private information.
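The searches described above typically pair a file-type restriction with a sensitive keyword. The patterns below are illustrative shapes of such queries, not live examples; "filetype:" and "intitle:" are standard Google search operators.

```text
filetype:xls ssn                    (spreadsheets mentioning Social Security numbers)
filetype:xls "credit card" total    (financial spreadsheets with card data)
intitle:"index of" confidential     (open directory listings of files)
```

None of this requires special tools: the queries run in an ordinary browser, which is what gives the technique its point-and-click appeal.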
"On a (client's) bank site, I found an Excel spreadsheet with 10,000 Social Security and credit card numbers," Skoudis said. The bank's Web server had been properly configured to keep such documents private, but someone had mistakenly put the information on the wrong side of the fence, he said. "Google found the open door and crawled in."
Google and other search-engine operators cannot gauge how frequently private documents are accessed using their sites, or how many are removed for security reasons.
"The challenge is that as the search-engine tool evolved, people got more lax about what they put on a publicly available Web server," said Tom Wilde, vice president and general manager of Terra Lycos' 19 search engines. "It would be impossible to monitor" the tens of millions of searches that take place every day, Wilde said, adding that he has never been notified of a security breach on his sites.
Government officials said they were familiar with Google hacking and were working with government agencies and businesses to secure sensitive documents on Web servers.
"It's an issue we're aware of and tracking," said Amit Yoran, director of the cybersecurity division of the Homeland Security Department. By law, each agency is responsible for its own security, and although hacking or security breaches are reported to Homeland Security, the cybersecurity division does not monitor the content of the Web, he said.
It is unclear who is at fault when someone digs up a confidential document. "I don't know what law's been violated just for searching" on a publicly available search engine, said FBI spokesman Paul Bresson, noting that the bureau has not taken action against individuals who have found secure documents by using search engines. "If they use it for some sinister purpose, that's another issue."
The availability of private information contributes to a rising incidence of identity theft, which for the past four years has been the No. 1 consumer problem for the Federal Trade Commission. Last year, the FTC received nearly 215,000 complaints about identity theft, up from about 152,000 in 2002.
Even after a document has been pulled off a Web server, it often remains stored in search engines' computers, where it can still be accessed. That was the case when MTV removed from its Web site a pre-Super Bowl news release promising "shocking moments" at the halftime show.
"Once it is placed online, it's very hard to get the digital horse back in the electronic barn," said Marc Rotenberg, executive director of the Electronic Privacy Information Center. "It's close to impossible to get it back."
Source: Seattle Times