Sequential Attention for Feature Selection
Sequential Attention is a feature selection algorithm for neural networks based on an efficient implementation of the classical greedy forward selection algorithm using the attention mechanism.
Paper
Please refer to the ICLR 2023 paper, also available on arXiv.
Experiments
To install the `sequential_attention` package, download the datasets, and test the experiments, run `sh run.sh`.
To run the full experimental suite for Sequential Attention, run `python -m sequential_attention.experiments.experiments` after running `sh run.sh` and `source sequential_attention/bin/activate`.
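Put together, a full session might look like the following sketch. It assumes `run.sh` creates a virtual environment at `sequential_attention/` (inferred from the `activate` path above); adjust paths to your setup.

```shell
# Install the package, download the datasets, and test the experiments.
sh run.sh

# Activate the virtual environment (assumed to be created by run.sh,
# based on the activate path given above).
source sequential_attention/bin/activate

# Run the full experimental suite.
python -m sequential_attention.experiments.experiments
```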
This is not an officially supported Google product.