TY - JOUR
T1 - NOW G-net: Learning classification programs on networks of workstations
AU - Anglano, Cosimo
AU - Botta, Marco
PY - 2002/10
Y1 - 2002/10
AB - The automatic construction of classifiers (programs able to correctly classify data collected from the real world) is one of the major problems in pattern recognition and in several areas related to artificial intelligence, including data mining. In this paper, we present G-Net, a distributed evolutionary algorithm that infers classifiers from precollected data. The main features of the system include robustness with respect to parameter settings, use of the minimum description length criterion coupled with a stochastic search bias, coevolution as a high-level control strategy, the ability to handle problems requiring structured representation languages, and suitability for parallel implementation on a network of workstations (NOW). Its parallel version, NOW G-Net, also described in this paper, profitably exploits the computing power delivered by these platforms by incorporating a set of dynamic load distribution techniques that allow it to adapt to the variations in computing power that typically arise in such systems. A proof-of-concept implementation is used to demonstrate the effectiveness of NOW G-Net on a variety of datasets.
KW - Classification
KW - Evolutionary computation
KW - Machine learning
KW - Networks of workstations
KW - Parallel computing
UR - http://www.scopus.com/inward/record.url?scp=0036808967&partnerID=8YFLogxK
DO - 10.1109/TEVC.2002.800882
M3 - Article
SN - 1089-778X
VL - 6
SP - 463
EP - 480
JO - IEEE Transactions on Evolutionary Computation
JF - IEEE Transactions on Evolutionary Computation
IS - 5
ER -