• Type of project: Research Project/Hauptseminar/Thesis (literature review + coding)
  • Contact: han.wang@tu-ilmenau.de

Background

Categorical variables are fundamental in machine learning, representing discrete choices in tasks such as classification, structured prediction, and reinforcement learning. This project explores the use of softmax-based neural networks to optimize categorical sampling under predefined constraints. By investigating different softmax variants and selection strategies, students will gain insights into differentiable optimization, probabilistic modeling, and efficient decision-making in machine learning.
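For orientation, the softmax referred to above maps a vector of logits z = (z_1, ..., z_K) to a categorical distribution; a temperature parameter \tau (one of the variants to be explored) controls how peaked that distribution is:

    p_i = \frac{\exp(z_i / \tau)}{\sum_{j=1}^{K} \exp(z_j / \tau)}, \qquad i = 1, \dots, K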

Task

  • Investigate softmax-based selection techniques and their applications in machine learning.
  • Develop and optimize a softmax-based sampling model, exploring temperature scaling and regularization strategies (a small starting-point sketch follows this list).
  • Compare different selection methods and analyze their performance on synthetic and real-world datasets.
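As a possible starting point for the coding part, the sketch below (NumPy-based; the function names and toy logits are illustrative choices, not part of the project specification) contrasts hard categorical sampling from a temperature-scaled softmax with the Gumbel-Softmax relaxation of Reference 2:

    import numpy as np

    def softmax(logits, temperature=1.0):
        """Temperature-scaled softmax: lower temperature sharpens the distribution."""
        z = logits / temperature
        z = z - z.max()                      # shift for numerical stability
        e = np.exp(z)
        return e / e.sum()

    def sample_categorical(logits, temperature=1.0, rng=None):
        """Draw a hard (discrete) category index from the temperature-scaled softmax."""
        rng = np.random.default_rng() if rng is None else rng
        return rng.choice(len(logits), p=softmax(logits, temperature))

    def gumbel_softmax(logits, temperature=1.0, rng=None):
        """Continuous relaxation of categorical sampling (Jang et al., 2017):
        add Gumbel(0, 1) noise to the logits, then apply a temperature-scaled
        softmax to obtain a differentiable 'soft' one-hot vector."""
        rng = np.random.default_rng() if rng is None else rng
        gumbel_noise = -np.log(-np.log(rng.uniform(size=logits.shape)))
        return softmax(logits + gumbel_noise, temperature)

    if __name__ == "__main__":
        logits = np.array([2.0, 1.0, 0.1])               # toy example with 3 categories
        print(softmax(logits, temperature=0.5))           # sharper than temperature=1.0
        print(sample_categorical(logits))                 # hard sample: an integer index
        print(gumbel_softmax(logits, temperature=0.5))    # soft, differentiable sample

The temperature governs the trade-off in the relaxation: as \tau approaches 0, Gumbel-Softmax samples approach one-hot vectors but gradient estimates become noisier (Jang et al., 2017).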

Reference

  1. Goodfellow I, Bengio Y, Courville A. Deep Learning[M]. Cambridge, MA: MIT Press, 2016. (Section 6.2.2.3, Softmax Units for Multinoulli Output Distributions)
  2. Jang E, Gu S, Poole B. Categorical Reparameterization with Gumbel-Softmax[C]//International Conference on Learning Representations (ICLR 2017). OpenReview.net, 2017.