Posted on 2005-10-21 13:11:43
Translation of: http://grid.org/about/gc/seti.htm
The Proof That It Works
The seminal Internet distributed computing project, SETI@home, originated at the University of California at Berkeley. SETI stands for the "Search for Extraterrestrial Intelligence," and the project's focus is to search for radio signal fluctuations that may indicate a sign of intelligent life from space.
SETI@home is the largest, most successful Internet distributed computing project to date. Launched in May 1999 to search through signals collected by the Arecibo Radio Telescope in Puerto Rico (the world's largest radio telescope), the project originally received far more data every day, measured in terabytes, than its assigned computers could process. So the project directors turned to volunteers, inviting individuals to download the SETI@home software and donate the idle processing time on their computers to the project.
After dispatching a backlog of data, SETI@home volunteers began processing current segments of radio signals captured by the telescope. Currently, about 40 gigabytes of data is pulled down daily by the telescope and sent to computers all over the world to be analyzed. The results are sent back through the Internet, and the program then collects a new segment of radio signals for the PC to work on.
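The fetch-analyze-return cycle described above can be sketched in a few lines. This is a hypothetical illustration, not the actual SETI@home client or server code: the names `fetch_work_unit`, `analyze`, and `submit_result` are ours, and the "analysis" is a stand-in digest rather than the real signal processing (FFTs, pulse detection).

```python
import hashlib

# Stand-in radio-signal segments; the real project splits telescope
# recordings into small work units of a few hundred kilobytes each.
WORK_UNITS = [b"signal-segment-%d" % i for i in range(3)]

def fetch_work_unit(queue):
    """Server side (hypothetical): hand out the next unprocessed segment."""
    return queue.pop(0) if queue else None

def analyze(segment):
    """Client side (hypothetical): stand-in for the real analysis --
    here we just compute a digest of the segment's bytes."""
    return hashlib.sha256(segment).hexdigest()

def submit_result(results, segment, result):
    """Server side (hypothetical): record the returned result."""
    results[segment] = result

def run_client(queue, results):
    # The loop a volunteer's machine runs during idle time: keep
    # fetching segments until the server has none left to assign.
    while (segment := fetch_work_unit(queue)) is not None:
        submit_result(results, segment, analyze(segment))

results = {}
run_client(list(WORK_UNITS), results)
print(len(results))  # prints 3: every segment analyzed and returned
```

The key design point this illustrates is that the server never pushes work: each client pulls a unit when idle and returns the result when done, so machines of wildly different speeds can participate without coordination.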
Over two million people — the largest number of volunteers for any Internet distributed computing project to date — have installed the SETI@home software.
This global network of 3 million computers averages about 14 TeraFLOPS, or 14 trillion floating point operations per second, and has garnered over 500,000 years of processing time in the past year and a half. It would normally cost millions of dollars to achieve that kind of power on one or even two supercomputers.
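A quick back-of-the-envelope check of those figures: the aggregate throughput and machine count come from the text above, while the per-machine average is our own derivation, not a number the article states.

```python
# Figures from the article:
total_flops = 14e12        # 14 TeraFLOPS aggregate throughput
num_computers = 3_000_000  # ~3 million participating machines
daily_bytes = 40e9         # ~40 GB of telescope data per day

# Derived: average sustained throughput per volunteer machine.
per_machine_flops = total_flops / num_computers
print(f"{per_machine_flops / 1e6:.1f} MFLOPS per machine on average")
# prints "4.7 MFLOPS per machine on average" -- plausible for
# early-2000s PCs contributing only their idle cycles

# Derived: share of the daily data each machine would see on average.
per_machine_bytes = daily_bytes / num_computers
print(f"{per_machine_bytes / 1024:.1f} KiB of raw data per machine per day")
```

The small per-machine numbers are the point: no single participant needs exceptional hardware or bandwidth, because the scale comes entirely from the size of the volunteer pool.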
[ Last edited by Danny on 2005-10-21 at 13:37 ]