  • Reading List [09/07/27–09/08/28]

    Simultaneous dimension reduction and regression, or supervised dimension reduction (SDR), methods

    Outline:

    29 July 2009 
    Mao, K., Q. Wu, F. Liang, and S. Mukherjee. Two models for Bayesian supervised dimension reduction.
    Presenter: Xinwei Jiang. Presentation material will come soon.

    29 July 2009 
    Wu, Q., J. Guinney, M. Maggioni, and S. Mukherjee (2007). Learning gradients: Predictive models that infer geometry and dependence. Technical Report 07, Duke University.
    Presenter: Xinwei Jiang. Presentation material will come soon.

    Others:

    1. Vlassis, N., Y. Motomura, and B. Kröse (2001). Supervised dimension reduction of intrinsically low-dimensional data. Neural Computation, 191–215.
    2. Fukumizu, K., F. Bach, and M. Jordan (2003). Kernel dimensionality reduction for supervised learning. In Advances in Neural Information Processing Systems 16.
    3. Li, B., H. Zha, and F. Chiaromonte (2004). Linear contour learning: A method for supervised dimension reduction. In UAI, pp. 346–356.
    4. Goldberger, J., S. Roweis, G. Hinton, and R. Salakhutdinov (2005). Neighbourhood components analysis. In Advances in Neural Information Processing Systems 17, pp. 513–520.
    5. Fukumizu, K., F. Bach, and M. Jordan (2005). Dimensionality reduction in supervised learning with reproducing kernel Hilbert spaces. Journal of Machine Learning Research 5, 73–99.
    6. Globerson, A. and S. Roweis (2006). Metric learning by collapsing classes. In Advances in Neural Information Processing Systems 18, pp. 451–458.
    7. Martín-Merino, M. and J. Román (2006). A new semi-supervised dimension reduction technique for textual data analysis. In Intelligent Data Engineering and Automated Learning.
    8. Nilsson, J., F. Sha, and M. Jordan (2007). Regression on manifolds using kernel dimension reduction. In Proceedings of the 24th International Conference on Machine Learning.

    Methods based on gradients of the regression function (a short illustrative sketch follows the list):

    1. Xia, Y., H. Tong, W. Li, and L.-X. Zhu (2002). An adaptive estimation of dimension reduction space. J. Roy. Statist. Soc. Ser. B 64(3), 363–410.
    2. Mukherjee, S. and D. Zhou (2006). Learning coordinate covariances via gradients. J. Mach. Learn. Res. 7, 519–549.
    3. Mukherjee, S. and Q. Wu (2006). Estimation of gradients and coordinate covariation in classification. J. Mach. Learn. Res. 7, 2481–2514.
    4. Mukherjee, S., Q. Wu, and D.-X. Zhou (2009). Learning gradients and feature selection on manifolds.
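
    A common thread in these gradient-based papers is the gradient outer product matrix E[∇f(X) ∇f(X)^T], whose leading eigenvectors span the effective dimension-reduction (e.d.r.) space. Below is a minimal sketch of that idea, assuming gradients are estimated by kernel-weighted local linear fits; the Gaussian bandwidth and ridge term are illustrative choices, not values taken from the papers.

    import numpy as np

    def gradient_outer_product_directions(X, y, bandwidth=1.0, n_directions=2, ridge=1e-3):
        """Estimate e.d.r. directions from locally estimated gradients (illustrative)."""
        n, p = X.shape
        M = np.zeros((p, p))
        for i in range(n):
            diff = X - X[i]                                  # local coordinates around x_i
            w = np.exp(-np.sum(diff**2, axis=1) / (2.0 * bandwidth**2))
            # Weighted local linear fit y ~ a + diff @ b; b estimates the gradient at x_i.
            A = np.hstack([np.ones((n, 1)), diff])
            AtW = A.T * w
            coef = np.linalg.solve(AtW @ A + ridge * np.eye(p + 1), AtW @ y)
            grad = coef[1:]
            M += np.outer(grad, grad) / n                    # average gradient outer product
        # Leading eigenvectors of M span the estimated e.d.r. space.
        evals, evecs = np.linalg.eigh(M)
        return evecs[:, np.argsort(evals)[::-1][:n_directions]]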


    Methods based on inverse regression (a minimal sketch follows the references):

    1. Li, K. (1991). Sliced inverse regression for dimension reduction. J. Amer. Statist. Assoc. 86, 316–342.
    2. Cook, R. and S. Weisberg (1991). Discussion of "Sliced inverse regression for dimension reduction". J. Amer. Statist. Assoc. 86, 328–332.
    3. Sugiyama, M. (2007). Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis. J. Mach. Learn. Res. 8, 1027–1061.
    4. Cook, R. (2007). Fisher lecture: Dimension reduction in regression. Statistical Science 22(1), 1–26.
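
    Sliced inverse regression (Li 1991) is the prototype here: slice the response, average the whitened predictors within each slice, and take the leading eigenvectors of the covariance of those slice means. The sketch below assumes a continuous response sliced into equal-frequency bins (for classification one would use the class labels as slices) and a non-singular predictor covariance; it is illustrative only, not a reference implementation.

    import numpy as np

    def sir_directions(X, y, n_slices=10, n_directions=2):
        """Sliced inverse regression: estimate e.d.r. directions (illustrative)."""
        n, p = X.shape
        # Whiten the predictors.
        mu = X.mean(axis=0)
        w, V = np.linalg.eigh(np.cov(X, rowvar=False))
        cov_inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T   # assumes a non-singular covariance
        Z = (X - mu) @ cov_inv_sqrt
        # Slice the response into roughly equal-sized groups.
        slices = np.array_split(np.argsort(y), n_slices)
        # Weighted covariance of the slice means of Z, i.e. of the inverse regression curve E[Z | y].
        M = np.zeros((p, p))
        for idx in slices:
            m = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m, m)
        # Leading eigenvectors, mapped back to the original predictor scale.
        evals, evecs = np.linalg.eigh(M)
        top = evecs[:, np.argsort(evals)[::-1][:n_directions]]
        return cov_inv_sqrt @ top                            # columns span the estimated e.d.r. space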

    Methods based on forward regression (a sketch of a single projection pursuit term follows the list):

    1. Friedman, J. H. and W. Stuetzle (1981). Projection pursuit regression. J. Amer. Statist. Assoc., 817–823.
    2. Tokdar, S., Y. Zhu, and J. Ghosh (2008). A Bayesian implementation of sufficient dimension reduction in regression. Technical report, Purdue Univ.
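
    Projection pursuit regression builds the fit forward, one ridge function at a time: find a direction w and a univariate smooth g so that g(Xw) explains as much of the current residual as possible, then repeat on the new residuals. The sketch below fits a single term, using a cubic polynomial as the smoother and a generic optimizer for the direction search; both are simplifying assumptions rather than the original supersmoother-based procedure of Friedman and Stuetzle.

    import numpy as np
    from scipy.optimize import minimize

    def fit_ppr_term(X, y, degree=3, seed=0):
        """Fit one projection pursuit regression term, y ≈ g(X @ w) (illustrative)."""
        rng = np.random.default_rng(seed)
        p = X.shape[1]

        def residual_ss(w):
            w = w / np.linalg.norm(w)
            t = X @ w
            g = np.poly1d(np.polyfit(t, y, degree))          # univariate ridge function
            return np.sum((y - g(t)) ** 2)

        res = minimize(residual_ss, rng.standard_normal(p), method="Nelder-Mead")
        w = res.x / np.linalg.norm(res.x)
        g = np.poly1d(np.polyfit(X @ w, y, degree))
        return w, g

    Further terms would be fitted in the same way to the residuals y - g(X @ w).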

     

  • Original post: https://www.cnblogs.com/ysjxw/p/1532302.html