PrepTest 59, Section 4, Question 5
Passage A
Recent studies have shown that sophisticated computer models of the oceans and atmosphere are capable of simulating large-scale climate trends with remarkable accuracy. But these models make use of large numbers of variables, many of which have wide ranges of possible values. Because even small differences in those values can have a significant impact on what the simulations predict, it is important to determine the impact when values differ even slightly.
Since the interactions between the many variables in climate simulations are highly complex, there is no alternative to a "brute force" exploration of all possible combinations of their values if predictions are to be reliable. This method requires very large numbers of calculations and simulation runs. For example, exhaustive examination of five values for each of only nine variables would require 2 million calculation-intensive simulation runs. Currently available individual computers are completely inadequate for such a task.
However, the continuing increase in computing capacity of the average desktop computer means that climate simulations can now be run on privately owned desktop machines connected to one another via the Internet. The calculations are divided among the individual desktop computers, which work simultaneously on their share of the overall problem. Some public resource computing projects of this kind have already been successful, although only when they captured the public's interest sufficiently to secure widespread participation.
Passage B
Researchers are now learning that many problems in nature, human society, science, and engineering are naturally "parallel"; that is, that they can be effectively solved by using methods that work simultaneously in parallel. These problems share the common characteristic of involving a large number of similar elements such as molecules, animals, even people, whose individual actions are governed by simple rules but, taken collectively, function as a highly complex system.
An example is the method used by ants to forage for food. As Lewis Thomas observed, a solitary ant is little more than a few neurons strung together by fibers. Its behavior follows a few simple rules. But when one sees a dense mass of thousands of ants, crowded together around their anthill retrieving food or repelling an intruder, a more complex picture emerges; it is as if the whole is thinking, planning, calculating. It is an intelligence, a kind of live computer, with crawling bits for wits.
We are now living through a great paradigm shift in the field of computing, a shift from sequential computing (performing one calculation at a time) to massive parallel computing, which employs thousands of computers working simultaneously to solve one computation-intensive problem. Since many computation-intensive problems are inherently parallel, it only makes sense to use a computing model that exploits that parallelism. A computing model that resembles the inherently parallel problem it is trying to solve will perform best. The old paradigm, in contrast, is subject to the speed limits imposed by purely sequential computing.
Passage B relates to passage A in which one of the following ways?
The argument in passage B has little bearing on the issues discussed in passage A.
The explanation offered in passage B shows why the plan proposed in passage A is unlikely to be implemented.
The ideas advanced in passage B provide a rationale for the solution proposed in passage A.
The example given in passage B illustrates the need for the "brute force" exploration mentioned in passage A.
The discussion in passage B conflicts with the assumptions about individual computers made in passage A.
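A note on the arithmetic in passage A: five candidate values for each of nine variables yields 5^9 = 1,953,125 combinations, which is where the passage's figure of roughly 2 million simulation runs comes from. The short Python sketch below is purely illustrative and is not part of the test material; the worker count is an assumed number, not a figure from the passage. It enumerates the combinations and shows the kind of division of labor among volunteer desktops that passage A describes.

from itertools import product

# Five candidate values for each of nine variables, as in passage A.
VALUES_PER_VARIABLE = 5
NUM_VARIABLES = 9

# "Brute force" means enumerating every combination of values.
# product() yields them lazily, one assignment of all nine variables at a time.
grid = product(range(VALUES_PER_VARIABLE), repeat=NUM_VARIABLES)

total_runs = VALUES_PER_VARIABLE ** NUM_VARIABLES
print(total_runs)  # 1953125 -- the passage's "2 million" figure

# Passage A's remedy: divide the runs among volunteer desktop machines.
# N_WORKERS is a hypothetical participation level, chosen for illustration.
N_WORKERS = 10_000
runs_per_worker = -(-total_runs // N_WORKERS)  # ceiling division
print(runs_per_worker)  # at most 196 simulation runs per machine

Spread across even ten thousand participating desktops, each machine's share drops to a few hundred runs, which is exactly the sense in which passage B would call this problem "naturally parallel."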