Efficient approximate top-k mutual information based feature selection

Document Type

Article

Publication Date

8-1-2023

Abstract

Feature selection is an important step in the data science pipeline, and it is critical to develop efficient algorithms for this step. Mutual Information (MI) is one of the key measures used for feature selection: attributes are sorted in descending order of their MI score, and the top-k attributes are retained. The goal of this work is to develop a new measure, Attribute Average Conflict, that effectively approximates the top-k attributes without actually calculating MI. Our proposed method uses the database concept of approximate functional dependency to quantify the MI rank of attributes, which to our knowledge has not been studied before. We demonstrate the effectiveness of the proposed measure with a Monte Carlo simulation. We also perform extensive experiments using high-dimensional synthetic and real datasets with millions of records. Our results show that the proposed method achieves perfect accuracy in selecting the top-k attributes, yet is significantly more efficient than state-of-the-art baselines, including exact methods for MI-based feature selection as well as adaptive random-sampling approaches. We also investigate upper and lower bounds on the proposed measure and show that tighter bounds can be derived by using the marginal frequencies of attributes in specific arrangements. These bounds can be used to select the top-k attributes in a single pass, without a full scan of the dataset. We perform an experimental evaluation on real datasets to show the accuracy and effectiveness of this approach.
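
For reference, the baseline procedure the abstract describes, ranking attributes by their mutual information with the class label and keeping the top-k, can be sketched roughly as below. This is only an illustrative Python sketch of standard MI-based selection, not the paper's Attribute Average Conflict measure; the names (mutual_information, mi_top_k) and the toy data are assumptions made for illustration.

    # Illustrative sketch only: rank discrete attributes by mutual information
    # with the class label and keep the top-k. Not the paper's proposed method.
    import numpy as np
    from collections import Counter

    def mutual_information(x, y):
        """MI (in nats) between two equal-length discrete sequences."""
        n = len(x)
        joint = Counter(zip(x, y))
        px, py = Counter(x), Counter(y)
        # MI = sum over (x,y) of p(x,y) * log( p(x,y) / (p(x) p(y)) )
        return sum((c / n) * np.log(c * n / (px[xv] * py[yv]))
                   for (xv, yv), c in joint.items())

    def mi_top_k(data, target, k):
        """Return the k attribute names with the highest MI against the target."""
        scores = {col: mutual_information(vals, target) for col, vals in data.items()}
        return sorted(scores, key=scores.get, reverse=True)[:k]

    # Toy example: 'color' determines the label, 'noise' does not.
    data = {
        "color": ["r", "r", "b", "b", "r", "b"],
        "noise": [1, 0, 1, 0, 0, 1],
    }
    target = [1, 1, 0, 0, 1, 0]
    print(mi_top_k(data, target, k=1))  # -> ['color']

Exact selection of this kind requires a pass over the data per attribute to build the joint counts; the paper's contribution is an approximation of this ranking that avoids computing MI directly.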

Identifier

85140124590 (Scopus)

Publication Title

Journal of Intelligent Information Systems

External Full Text Location

https://doi.org/10.1007/s10844-022-00750-4

e-ISSN

1573-7675

ISSN

0925-9902

First Page

191

Last Page

223

Issue

1

Volume

61

Grant

1814595

Fund Ref

National Science Foundation
