Conditional preference networks (CP-nets) are intuitive graphical tools for representing conditional preference statements over the values of a set of attributes. Using CP-nets to solve certain learning problems has attracted growing attention and become an active topic in artificial intelligence, and many methods have been proposed for these learning problems. However, existing approaches suffer from two main drawbacks: long running times and a lack of concrete learned structures. To overcome these limitations, in this paper we first provide theoretical support for using a conditional independence test to learn the structure of CP-nets. Second, we propose the dependent degree, a measure that quantifies the dependency relationships among attributes. Finally, we present an algorithm for obtaining the structures of CP-nets. In addition, the number of database samples is reduced by filtering out insignificant or noisy data, and a concrete structure of a learned CP-net with 18 attributes is given. Experiments show that our approach obtains a better CP-net structure without materially increasing the running time, and we compare it against previously proposed methods.
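The exact definition of the paper's dependent degree is not given in this abstract; as an illustrative sketch only, one common way to score the dependency between two attributes, optionally conditioned on a set of parent attributes, is empirical conditional mutual information (the function name, data layout, and any threshold are assumptions, not the authors' method):

```python
from collections import Counter
from math import log2

def cmi(samples, x, y, z=()):
    """Empirical conditional mutual information I(X; Y | Z) in bits.

    samples: list of tuples of attribute values (one tuple per record).
    x, y: attribute indices to test for dependency.
    z: tuple of attribute indices to condition on (the candidate parents).
    Returns ~0.0 when X and Y are conditionally independent given Z;
    larger values indicate stronger dependency.
    """
    n = len(samples)
    # Joint and marginal counts over the observed records.
    pxyz = Counter((s[x], s[y], tuple(s[k] for k in z)) for s in samples)
    pxz = Counter((s[x], tuple(s[k] for k in z)) for s in samples)
    pyz = Counter((s[y], tuple(s[k] for k in z)) for s in samples)
    pz = Counter(tuple(s[k] for k in z) for s in samples)
    total = 0.0
    for (xv, yv, zv), c in pxyz.items():
        # p(x,y,z) * log2( p(z) p(x,y,z) / (p(x,z) p(y,z)) )
        total += (c / n) * log2(c * pz[zv] / (pxz[(xv, zv)] * pyz[(yv, zv)]))
    return total
```

For example, on records where one attribute always copies another, `cmi` reports one full bit of dependency, while on records where the two attributes vary independently it reports approximately zero; a structure-learning loop could add an edge whenever the score exceeds a chosen threshold.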