Abstract:
Twenty years ago, only a handful of visionaries could have predicted that powerful software born of supercomputing would butt its way into almost every desktop PC. Few foresaw the scale of the data that would be manipulated, or the complexity of the tasks that would be performed, by software tools costing a few hundred dollars. But now, all developers of technical software take it as given that users may need to process gigabytes of data drawn from a combination of sources: instrument output; archived data; and publicly available materials, such as census data downloaded from the Internet. In this paper, the author argues that, in a sophisticated marketplace, the success of those developers hinges on equipping users to gain ever swifter insight into many reams of data.
Keywords:
data visualisation; engineering computing; groupware; software engineering; technological forecasting; Technology 2000; data manipulation; desktop PCs; software developers; software tools; technical software; technology analysis; technology forecast