PrepTest 59, Section 4, Question 1
Passage A
Recent studies have shown that sophisticated computer models of the oceans and atmosphere are capable of simulating large-scale climate trends with remarkable accuracy. But these models make use of large numbers of variables, many of which have wide ranges of possible values. Because even small differences in those values can have a significant impact on what the simulations predict, it is important to determine how the predictions are affected when values differ even slightly.
Since the interactions between the many variables in climate simulations are highly complex, there is no alternative to a "brute force" exploration of all possible combinations of their values if predictions are to be reliable. This method requires very large numbers of calculations and simulation runs. For example, exhaustive examination of five values for each of only nine variables would require 2 million calculation-intensive simulation runs. Currently available individual computers are completely inadequate for such a task.
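The figure in that example is easy to verify: five values for each of nine variables means 5^9 combinations, each demanding its own simulation run. A minimal Python sketch (the counts come from the passage; the enumeration code is purely illustrative):

```python
# Count and enumerate the parameter combinations a brute-force sweep
# would require. The numbers mirror the passage's example; nothing
# here is drawn from a real climate model.
from itertools import product

values_per_variable = 5
num_variables = 9

# Every combination of values demands one calculation-intensive run.
total_runs = values_per_variable ** num_variables
print(total_runs)  # 1953125 -- the roughly 2 million runs the passage cites

# Each tuple yielded below stands for one simulation run's settings.
grid = product(range(values_per_variable), repeat=num_variables)
print(next(grid))  # (0, 0, 0, 0, 0, 0, 0, 0, 0)
```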
However, the continuing increase in computing capacity of the average desktop computer means that climate simulations can now be run on privately owned desktop machines connected to one another via the Internet. The calculations are divided among the individual desktop computers, which work simultaneously on their share of the overall problem. Some public resource computing projects of this kind have already been successful, although only when they captured the public's interest sufficiently to secure widespread participation.
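One way the division of labor described above might look is sketched below; the stride-based split, the worker count, and the run_simulation stub are all assumptions made for illustration, not a description of how any actual public resource computing project assigns work.

```python
# Hypothetical sketch of splitting the 5^9 parameter grid across many
# volunteer desktop machines. Real projects typically use a central
# server handing out work units; this striding scheme is only meant to
# show that the shares are disjoint and can be worked on simultaneously.
from itertools import islice, product

def worker_share(worker_id: int, num_workers: int):
    """Yield the parameter combinations assigned to one machine."""
    grid = product(range(5), repeat=9)  # the full combination space
    # Stride through the grid so each worker's slice is disjoint.
    yield from islice(grid, worker_id, None, num_workers)

def run_simulation(params: tuple) -> float:
    # Stand-in for one calculation-intensive climate simulation run.
    return float(sum(params))

# A single volunteer machine would process only its own share:
my_results = [run_simulation(p) for p in islice(worker_share(0, 10_000), 3)]
print(my_results)
```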
Passage B
Researchers are now learning that many problems in nature, human society, science, and engineering are naturally "parallel"; that is, that they can be effectively solved by using methods that work simultaneously in parallel. These problems share the common characteristic of involving a large number of similar elements such as molecules, animals, even people, whose individual actions are governed by simple rules but, taken collectively, function as a highly complex system.
An example is the method used by ants to forage for food. As Lewis Thomas observed, a solitary ant is little more than a few neurons strung together by fibers. Its behavior follows a few simple rules. But when one sees a dense mass of thousands of ants, crowded together around their anthill retrieving food or repelling an intruder, a more complex picture emerges; it is as if the whole is thinking, planning, calculating. It is an intelligence, a kind of live computer, with crawling bits for wits.
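The "simple rules, complex whole" idea can be made concrete with a toy simulation; the foraging rule below is invented purely for illustration and has nothing to do with real ant behavior or with any model in the passage.

```python
# Toy model: each "ant" obeys one simple rule -- step one cell toward
# the nearest remaining food -- yet collectively the colony clears the
# whole field. Entirely invented for illustration.
import random

random.seed(0)
FIELD = 30
food = {random.randrange(FIELD) for _ in range(6)}   # food locations
ants = [random.randrange(FIELD) for _ in range(10)]  # ant positions

steps = 0
while food and steps < 100:
    for i, pos in enumerate(ants):
        if not food:
            break
        target = min(food, key=lambda f: abs(f - pos))  # the one rule
        ants[i] = pos + (1 if target > pos else -1 if target < pos else 0)
        food.discard(ants[i])  # landing on food retrieves it
    steps += 1

print("steps:", steps, "food remaining:", len(food))
```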
We are now living through a great paradigm shift in the field of computing, a shift from sequential computing (performing one calculation at a time) to massively parallel computing, which employs thousands of computers working simultaneously to solve one computation-intensive problem. Since many computation-intensive problems are inherently parallel, it only makes sense to use a computing model that exploits that parallelism. A computing model that resembles the inherently parallel problem it is trying to solve will perform best. The old paradigm, in contrast, is subject to the speed limits imposed by purely sequential computing.
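The contrast between the two paradigms can be sketched in a few lines of Python. The cpu_bound_task below is a made-up stand-in for a computation-intensive problem; the point is only that independent calculations, run one at a time under the old paradigm, can instead run simultaneously.

```python
# Same embarrassingly parallel workload, run under both paradigms.
# cpu_bound_task is an invented stand-in for a heavy calculation.
from multiprocessing import Pool

def cpu_bound_task(n: int) -> int:
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [200_000] * 8

    # Old paradigm: sequential, one calculation at a time.
    sequential = [cpu_bound_task(n) for n in inputs]

    # New paradigm: the independent calculations run simultaneously
    # across worker processes (or, scaled up, across many computers).
    with Pool() as pool:
        parallel = pool.map(cpu_bound_task, inputs)

    assert sequential == parallel
```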
Which one of the following most accurately expresses the main point of passage B?
(A) Many difficult problems in computing are naturally parallel.
(B) Sequential computing is no longer useful because of the speed limits it imposes.
(C) There is currently a paradigm shift occurring in the field of computing toward parallel computing.
(D) Complex biological and social systems are the next frontier in the field of computer simulation.
(E) Inherently parallel computing problems are best solved by means of computers modeled on the human mind.