Semi-Supervised Metric Learning in Stratified Space by More Precise Exploitation of Prior Knowledge
Subject area: Electrical and Computer Engineering
Authors: Zohreh Karimi 1, Saeed Shiry Ghidary 2, Roohollah Ramezani 3
1 - Amirkabir University of Technology
2 - Amirkabir University of Technology
3 - Damghan University
Keywords: semi-supervised metric learning, stratified space, Laplacian, smoothness assumption
Abstract:
Manifold-based semi-supervised metric learning has attracted increasing interest in recent years. These methods enforce a regularizer based on the assumption that labels vary smoothly over the data manifold, but they face two challenges: (1) since the data of each class lies on its own manifold, similarity between classes causes the manifolds to intersect, and the label-smoothness assumption is violated in the intersection regions; (2) the 1NN classifier used to predict labels in metric learning methods does not achieve adequate accuracy when labeled data are scarce. This paper proposes a semi-supervised metric learning method that assumes the data lie in a stratified space and exploits the available prior knowledge, namely the smoothness assumption on each manifold, more precisely than existing methods. In the metric learning stage, the smoothness assumption is not applied in the intersection regions, and in the classification stage, the labeled data at interior points of the manifolds are extended according to the smoothness assumption. Intersection points are distinguished from interior points based on the distinct behavior of the Laplacian of a function that is smooth on each manifold at interior points compared with other points. Experiments show that the proposed method achieves better classification accuracy than comparable methods.
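To make the described procedure concrete, the following is a minimal sketch in Python of the two ideas in the abstract: flagging suspected intersection points from the behavior of the graph Laplacian applied to a function that is smooth on each manifold (here, simply the coordinate functions), and restricting the smoothness penalty of a metric-learning objective to pairs of interior points. This is not the authors' implementation; the k-NN graph construction, the z-score threshold `tau`, and the function names are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's implementation).
import numpy as np
from sklearn.neighbors import kneighbors_graph


def graph_laplacian(X, k=10):
    """Unnormalized graph Laplacian L = D - W of a symmetrized k-NN graph."""
    W = kneighbors_graph(X, n_neighbors=k, mode="connectivity").toarray()
    W = np.maximum(W, W.T)                      # symmetrize the adjacency matrix
    L = np.diag(W.sum(axis=1)) - W
    return L, W


def flag_intersections(X, k=10, tau=2.0):
    """Flag points whose Laplacian response is atypically large.

    At interior points of a single manifold, L applied to a function that is
    smooth on the manifold stays small; near intersections the behavior
    changes, which is the cue described in the abstract. The z-score
    threshold `tau` is an assumed heuristic."""
    L, _ = graph_laplacian(X, k)
    response = np.linalg.norm(L @ X, axis=1)    # |L f| per point, f = coordinates
    z = (response - response.mean()) / (response.std() + 1e-12)
    return z > tau                              # True = suspected intersection point


def masked_smoothness_penalty(X, M, W, interior):
    """Smoothness penalty sum_ij W_ij (x_i - x_j)^T M (x_i - x_j), restricted
    to pairs of interior points so the assumption is not enforced in
    intersection regions."""
    pair_ok = np.outer(interior, interior)              # both endpoints interior
    diffs = X[:, None, :] - X[None, :, :]                # pairwise differences
    d2 = np.einsum("ijk,kl,ijl->ij", diffs, M, diffs)    # squared Mahalanobis distances
    return np.sum(W * pair_ok * d2)


if __name__ == "__main__":
    # Toy stratified space: two noisy line segments crossing at the origin.
    rng = np.random.default_rng(0)
    t = np.linspace(-1, 1, 100)[:, None]
    X = np.vstack([np.hstack([t, t]), np.hstack([t, -t])])
    X += 0.01 * rng.normal(size=X.shape)
    interior = ~flag_intersections(X, k=10, tau=2.0)
    L, W = graph_laplacian(X, k=10)
    penalty = masked_smoothness_penalty(X, np.eye(2), W, interior)
    print(f"{interior.sum()} interior points, masked penalty = {penalty:.3f}")
```

The coordinate functions stand in for "a function that is smooth on each manifold" only for illustration; boundary points can also produce large Laplacian responses, so this heuristic is a stand-in for the paper's actual criterion, which builds on the Laplacian analysis of stratified spaces cited in the article.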