DocumentCode :
1733146
Title :
Computing at the Crossroads (And What Does it Mean to Verification and Test?)
Author :
Rabaey, Jan N.
Author_Institution :
Univ. of California at Berkeley, Berkeley, CA
fYear :
2008
Firstpage :
13
Lastpage :
13
Abstract :
Today, we interpret computation as the sequential execution of complex algorithms that are bound to deliver deterministic answers. A number of factors are conspiring to fundamentally change that model. First, with the scaling of technology to nanoscale dimensions, it is quite certain that the underlying hardware platform will be anything but deterministic (given effects such as variability and error susceptibility). Second, the emergence of distributed computation affects the type of algorithms that are favored. Finally, many of the interesting problems to be tackled lie in the domain of perception and cognition, and most of these challenges tend to be statistical in nature. It is hence quite plausible that the nature of computation and the underlying hardware platforms will become statistical. These trends will have a profound effect on the way we verify and test designs. It is paramount that we start to explore what all of this means today if we want to be prepared for what tomorrow will bring.
Keywords :
program verification; complex algorithms; distributed computation; hardware platforms; nanoscale technology; verification; Cognition; Content addressable storage; Design automation; Distributed computing; Engineering management; Hardware; Intellectual property; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Test Conference, 2008. ITC 2008. IEEE International
Conference_Location :
Santa Clara, CA
ISSN :
1089-3539
Print_ISBN :
978-1-4244-2402-3
Electronic_ISBN :
1089-3539
Type :
conf
DOI :
10.1109/TEST.2008.4700541
Filename :
4700541