CPUs and GPUs Unite to Attack Passwords
Posted 22 October 2007 - 09:38 PM
The idea of using the GPU is not particularly new. It has been thrown around for several years, but to my knowledge this is the first widespread practical application that has been proposed. The science of cryptography has always been similar to the virus-antivirus arena: an arms race where each side tries to one-up the other. It will be interesting to see which algorithms are susceptible to this attack and how the crypto community will react.
Posted 23 October 2007 - 11:02 PM
I like the idea of using more hardware more efficiently, but this just seems odd: anything your own home computer could realistically crack shouldn't be something that needs to happen super fast. I mean, if you are cracking enough encryption at home that this is beneficial to you... you are probably doing something a little shady.
Posted 27 October 2007 - 05:23 AM
The Law of Unintended Consequences -- a graphic example: Turns out those advanced graphics chips that render fast-moving games in flowing detail have some other uses as well -- for one, they can dramatically reduce the workload of a password hacker. New Scientist reports that Moscow-based Elcomsoft is claiming to have filed for a U.S. patent on a technique that uses the massively parallel processing capabilities of the latest graphics processing units in its "password recovery" software. Using a high-end Nvidia GeForce 8800 Ultra, Elcomsoft increased the speed of its password cracking by a factor of 25, according to the company's CEO, Vladimir Katalov. Even a less powerful, $150 graphics card can plow through a complex crack in three to five days, as opposed to the months a central processing unit alone would take.
The speed comes from the way the graphics chip handles data. Nvidia spokesman Andrew Humber explains, comparing the process to searching for a word in a book: "A [normal computer processor] would read the book, starting at page 1 and finishing at page 500," he says. "A GPU would take the book, tear it into 100,000 pieces, and read all of those pieces at the same time."
Nvidia inadvertently helped the hacking advance along in February by releasing a software development kit that let programmers access the GPU's processing power directly, a boon for those working on complex science and engineering problems. But, as always, one man's tool is another man's weapon.
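To make the "tear the book into pieces" picture concrete, here is a minimal CUDA C++ sketch of the general technique: every GPU thread converts its own index into one candidate password, hashes it, and compares against the target. This is purely illustrative and is not Elcomsoft's code; the hash is a toy FNV-1a rather than a real password hash.

```cpp
// CUDA C++ sketch of the "tear the book into pieces" idea: thousands of GPU
// threads each test a different candidate password at the same time.
// Toy FNV-1a hash only -- not a real password hash, and not Elcomsoft's code.
#include <cstdio>
#include <cstdint>

__host__ __device__ uint32_t fnv1a(const char *s, int len) {
    uint32_t h = 2166136261u;
    for (int i = 0; i < len; ++i) {
        h ^= (unsigned char)s[i];
        h *= 16777619u;
    }
    return h;
}

// Each thread turns its global index into one 4-letter lowercase candidate
// ("aaaa", "aaab", ...), hashes it, and compares against the target hash.
__global__ void crack(uint32_t target, int *found_index) {
    unsigned int idx = blockIdx.x * blockDim.x + threadIdx.x;
    const unsigned int total = 26u * 26u * 26u * 26u;   // 456,976 candidates
    if (idx >= total) return;

    char pw[4];
    unsigned int n = idx;
    for (int i = 3; i >= 0; --i) { pw[i] = 'a' + (n % 26); n /= 26; }

    if (fnv1a(pw, 4) == target)
        atomicExch(found_index, (int)idx);   // remember which candidate matched
}

int main() {
    uint32_t target = fnv1a("gpuz", 4);      // stand-in for a captured hash

    int h_found = -1;
    int *d_found = nullptr;
    cudaMalloc((void **)&d_found, sizeof(int));
    cudaMemcpy(d_found, &h_found, sizeof(int), cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover the whole keyspace.
    crack<<<(456976 + 255) / 256, 256>>>(target, d_found);
    cudaMemcpy(&h_found, d_found, sizeof(int), cudaMemcpyDeviceToHost);
    cudaFree(d_found);

    printf("matching candidate index: %d\n", h_found);
    return 0;
}
```

With 256 threads per block, fewer than 1,800 blocks cover the whole 26^4 keyspace, and the card schedules them all with no coordination between candidates, which is exactly why this kind of job parallelizes so well.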
Posted 27 October 2007 - 10:45 AM
As for just cracking passwords... the speed increase this could provide is nothing compared to the speed you get from a precomputed-hash-style attack. Generating the hash tables in the first place might be faster on a GPU, though.
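For anyone unfamiliar with the precomputed-table idea, here is a rough host-side C++ sketch, using the same toy hash as the GPU example. Real tables of this kind target unsalted formats (the old LanManager hashes being the classic case), and rainbow tables compress the storage by chaining rather than storing every pair.

```cpp
// Host-side C++ sketch of a precomputed-table attack: hash every candidate
// once up front, then "cracking" a captured hash is just a map lookup.
// Same toy FNV-1a hash as the GPU sketch; real tables target unsalted formats.
#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

static uint32_t fnv1a(const std::string &s) {
    uint32_t h = 2166136261u;
    for (unsigned char c : s) { h ^= c; h *= 16777619u; }
    return h;
}

int main() {
    // One-time precomputation over all 26^4 four-letter passwords.
    std::unordered_map<uint32_t, std::string> table;
    std::string pw = "aaaa";
    for (char a = 'a'; a <= 'z'; ++a)
        for (char b = 'a'; b <= 'z'; ++b)
            for (char c = 'a'; c <= 'z'; ++c)
                for (char d = 'a'; d <= 'z'; ++d) {
                    pw[0] = a; pw[1] = b; pw[2] = c; pw[3] = d;
                    table.emplace(fnv1a(pw), pw);
                }

    // Looking up a captured hash is now effectively instant.
    uint32_t captured = fnv1a("gpuz");
    auto hit = table.find(captured);
    if (hit != table.end())
        std::cout << "recovered password: " << hit->second << "\n";
    return 0;
}
```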
Posted 01 November 2007 - 10:13 PM
I will also agree that this does little in the space of very strong ciphers. If you are using AES-256 then you have only reduced the time from well after our sun dies to just before our ultimate destruction. (For those who do not know, the sun is expected to burn out billions of years in the future. You shouldn't worry too much, though, because at current rates it will also take many billions of years to crack today's strong cryptography, so we will not be around to see the fruits of the cracking labor.)
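To put rough numbers behind that, here is a quick back-of-envelope program. The keys-per-second rate is purely an assumption, loosely scaled by the 25x figure quoted above.

```cpp
// Back-of-envelope numbers behind the "well after our sun dies" remark.
// The rate is an assumption: one billion keys per second, times the 25x
// GPU speed-up claimed in the article above.
#include <cmath>
#include <cstdio>

int main() {
    const double keys_per_sec  = 1e9 * 25.0;
    const double secs_per_year = 3600.0 * 24.0 * 365.0;

    for (int bits : {56, 128, 256}) {   // DES, AES-128, AES-256 key sizes
        double keyspace = std::pow(2.0, bits);
        // On average the key turns up after searching half the keyspace.
        double years = keyspace / 2.0 / keys_per_sec / secs_per_year;
        printf("%3d-bit key: roughly %.1e years on average\n", bits, years);
    }
    return 0;
}
```

Even with the GPU's help, the 256-bit line comes out around 10^59 years, while the 56-bit DES line comes out in days, which is exactly why dedicated parallel hardware was able to finish off DES.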
I also see the true power of this technology in parallel processing of some sort. If you can get a tenfold increase out of a single graphics card, then imagine if you created a computer with one CPU and 10,000 GPUs. This is exactly what happened with the greatest blows to DES. The EFF funded a project called "Deep Crack" (http://en.wikipedia....EFF_DES_cracker) that exploited specialized custom chips run in parallel. When given the task of brute-forcing DES keys, this hardware was much more efficient than any supercomputer. When you put thousands of these things together, all of a sudden you have a super cracking machine. This was one of the many projects that ultimately showed the US government that DES needed to be replaced. I would have to guess that even ten years before this it was inconceivable that such a thing could happen.
Posted 02 November 2007 - 11:20 PM
I would really like to know, though, exactly how they make use of the GPU.
I imagine they get it to perform calculations just like a CPU, but how do they access it directly, and manage it properly, so that you get the speed boost? I mean, the power of a GPU is generally nothing compared to a CPU, so how can it give such a boost?
This is assuming one decent CPU vs. an equally decent GPU.
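Nobody outside Elcomsoft knows their exact code, but the generic answer is that the win comes from thread count, not per-core strength: a 2007-era GPU keeps thousands of lightweight threads in flight at once, and hashing one candidate password is exactly the kind of small, independent job those threads are good at. Here is a hedged sketch of just the device-side kernel, using a standard grid-stride loop; the toy hash and the host launch would look like the sketch earlier in the thread.

```cpp
// Device-side sketch only (the host launch would look like the earlier post's
// example).  A grid-stride loop is the standard CUDA way to spread an
// arbitrarily large candidate space over however many threads the card runs.
#include <cstdint>

__device__ uint32_t toy_hash(const char *s, int len) {
    uint32_t h = 2166136261u;
    for (int i = 0; i < len; ++i) { h ^= (unsigned char)s[i]; h *= 16777619u; }
    return h;
}

__global__ void search(uint64_t total_candidates, uint32_t target,
                       unsigned long long *hit) {
    // Global index of this thread among all threads in the launch.
    uint64_t idx    = blockIdx.x * (uint64_t)blockDim.x + threadIdx.x;
    uint64_t stride = (uint64_t)gridDim.x * blockDim.x;

    // Each thread walks its own evenly spaced slice: idx, idx+stride, ...
    for (uint64_t c = idx; c < total_candidates; c += stride) {
        char pw[6];
        uint64_t n = c;
        for (int i = 5; i >= 0; --i) { pw[i] = 'a' + (n % 26); n /= 26; }
        if (toy_hash(pw, 6) == target)
            atomicExch(hit, (unsigned long long)c);   // record the match
    }
}
```

Each thread is slow on its own, but because every candidate is independent there is no coordination cost, so throughput scales almost linearly with how many threads the card can keep resident. That is where the claimed 25x over a lone CPU comes from.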