DocumentCode
3437337
Title
Software review for automatic test equipment
Author
Barela, Scott
Author_Institution
NSWC Corona Div., CA, USA
fYear
2005
fDate
26-29 Sept. 2005
Firstpage
30
Lastpage
35
Abstract
The nature of test set programming can be tedious and repetitive. A test engineer can often fall victim to putting blinders on, overlooking errors when reviewing his or her own work. To avoid this, it makes sense to treat software like a published work, where a reviewer independent of the original programming team checks the software for design, quality, and errors. This paper describes a disciplined and consistent process for reviewing Automatic Test Equipment (ATE) software. This independent review process comprises four major steps: Receiving, Processing, Reporting, and Following Up. It can be conducted and repeated throughout the development life cycle to improve the quality of the software. Early involvement can influence design changes that lead to simpler and more manageable software. Many errors can be detected prior to release by reviewing the software with tools such as PC-Lint™ or Understand for C++™. Having the discipline to follow this simple process can yield software that is more manageable for future modifications, easier to read, and contains fewer errors.
Keywords
automatic test equipment; error detection; software management; software quality; software reviews; quality improvement; software manageability; software review; Corona; Documentation; Software design; Software packages; Software testing; Software tools; Unified modeling language;
fLanguage
English
Publisher
ieee
Conference_Titel
Autotestcon, 2005. IEEE
Print_ISBN
0-7803-9101-2
Type
conf
DOI
10.1109/AUTEST.2005.1609096
Filename
1609096
Link To Document