Abstract:
© 2018 IEEE. The effect of faults on circuit operation is usually evaluated by simulation, using appropriate models of catastrophic and parametric faults. Functional and structural complexity, the tolerances on component parameters, the infinite set of parametric faults, etc., lead to the accumulation of large amounts of data describing the behavior of the fault-free and faulty states of the circuit. Machine learning methods are actively used to construct neuromorphic fault dictionaries (NFD), which enable fault diagnosis of analog and mixed-signal integrated circuits. Many of the problems of training a neural network on a large amount of data can be mitigated by reducing the size of the training sets and keeping only their significant characteristics. This paper proposes an entropy-based method for selecting the essential characteristics of a training set. An algorithm for reducing the dimensionality of a training set using entropy is presented. Experimental results for the Sallen-Key analog filter are reported. They show the high efficiency of the proposed method: at the cost of an insignificant amount of time spent on data preprocessing, the training time of the neural network is reduced by a factor of 192, and the resulting NFD diagnoses up to 95.0% of catastrophic and up to 84.81% of parametric faults.
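The abstract does not state the exact entropy criterion used by the authors, so the following is only a minimal sketch of one common variant of entropy-based characteristic selection: each candidate characteristic (column of the simulated response data) is discretized by histogram binning, its Shannon entropy is estimated, and the k highest-entropy characteristics are kept. The function names, the binning scheme, and the top-k rule are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def feature_entropy(values: np.ndarray, n_bins: int = 16) -> float:
    """Shannon entropy (bits) of one characteristic, estimated via histogram binning."""
    counts, _ = np.histogram(values, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def select_by_entropy(X: np.ndarray, k: int, n_bins: int = 16) -> np.ndarray:
    """Return indices of the k characteristics with the highest estimated entropy.

    X has shape (n_samples, n_features): rows are simulated circuit responses
    (fault-free and faulty), columns are candidate characteristics.
    """
    entropies = np.array(
        [feature_entropy(X[:, j], n_bins) for j in range(X.shape[1])]
    )
    return np.argsort(entropies)[::-1][:k]

# Hypothetical usage: keep the 10 most informative characteristics
# X_train = ...  # simulated responses of the circuit under test
# keep = select_by_entropy(X_train, k=10)
# X_reduced = X_train[:, keep]   # reduced training set for the NFD network
```

Reducing the training matrix to the selected columns before training is what shrinks the NFD training time; the choice of k trades off training speed against the fraction of faults the dictionary can still distinguish.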