Title of article :
Using Information Theory to Study Efficiency and Capacity of Computers and Similar Devices
Author/Authors :
Boris Ryabko, Author
Issue Information :
Journal, serial year 2010
Pages :
10
From page :
3
To page :
12
Abstract :
We address the problem of estimating the efficiency and capacity of computers. The main goal of our approach is to give a method for comparing the capacity of different computers, which can have different sets of instructions, different kinds of memory, a different number of cores (or processors), etc. We define the efficiency and capacity of computers and suggest a method for their estimation, based on an analysis of processor instructions and their execution times. We show how the suggested method can be applied to estimate computer capacity. In particular, this consideration gives a new look at the organization of computer memory. The obtained results can be of interest for practical applications.
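The abstract does not spell out the capacity formula. As an illustrative sketch only, the code below assumes a Shannon-style definition of capacity for a noiseless channel, in which the capacity is log2 of the largest real root X0 of sum_i X^(-t_i) = 1, with t_i the execution times of the available instructions; the function name computer_capacity and the sample instruction times are hypothetical and not taken from the paper.

```python
import math

def computer_capacity(durations):
    """Capacity in bits per time unit as log2(X0), where X0 is the largest
    real root of sum_i X**(-t_i) = 1 and t_i are instruction execution
    times (a Shannon-style noiseless-channel definition, assumed here)."""
    def f(x):
        # f(x) = sum_i x^(-t_i) - 1 is strictly decreasing for x > 1,
        # so the root can be bracketed and found by bisection.
        return sum(x ** (-t) for t in durations) - 1.0

    lo, hi = 1.0 + 1e-12, 2.0
    while f(hi) > 0:        # expand the bracket until f changes sign
        hi *= 2.0
    for _ in range(200):    # bisect to high precision
        mid = (lo + hi) / 2.0
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.log2((lo + hi) / 2.0)

# Hypothetical instruction set: execution times in clock cycles.
print(computer_capacity([1, 1, 2, 3, 3, 4]))
```

Under this reading, a machine with more instructions or faster instructions yields a larger root X0 and hence a larger capacity, which is the kind of comparison across instruction sets and memory organizations the abstract describes.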
Keywords :
computer capacity , channel capacity , information theory , computer efficiency , Shannon entropy
Journal title :
Information
Serial Year :
2010
Record number :
668240