Unsupervised Domain Adaptation Using Exemplar-SVMs with Adaptation Regularization
Domain adaptation has recently attracted attention for visual recognition. It assumes that source and target domain data are drawn from the same feature space but follow different marginal distributions, and it aims to use source domain instances to help train a robust classifier for...
Saved in:
Main Authors: | Yiwei He, Yingjie Tian, Jingjing Tang, Yue Ma |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2018-01-01 |
Series: | Complexity |
Online Access: | http://dx.doi.org/10.1155/2018/8425821 |
_version_ | 1832562690170552320 |
---|---|
author | Yiwei He, Yingjie Tian, Jingjing Tang, Yue Ma |
author_facet | Yiwei He, Yingjie Tian, Jingjing Tang, Yue Ma |
author_sort | Yiwei He |
collection | DOAJ |
description | Domain adaptation has recently attracted attention for visual recognition. It assumes that source and target domain data are drawn from the same feature space but follow different marginal distributions, and it aims to use source domain instances to help train a robust classifier for target domain tasks. Previous studies have focused mainly on reducing the distribution mismatch across domains. However, many real-world applications also suffer from sample selection bias among the instances within a domain, which reduces the generalization performance of learners. To address this issue, we propose a novel model named Domain Adaptation Exemplar Support Vector Machines (DAESVMs), based on exemplar support vector machines (exemplar-SVMs). Our approach addresses the problems of sample selection bias and domain adaptation simultaneously. Compared with usual domain adaptation settings, we go a step further by relaxing the i.i.d. assumption. First, we formulate DAESVMs to train classifiers while reducing the Maximum Mean Discrepancy (MMD) across domains, mapping the data into a latent space that preserves the properties of the original data; then, we integrate the classifiers to make predictions for target domain instances. Our experiments, conducted on the Office and Caltech10 datasets, verify the effectiveness of the proposed model. |
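The abstract's key discrepancy measure, Maximum Mean Discrepancy (MMD), can be estimated empirically from samples. The following is a minimal NumPy sketch of the biased squared-MMD estimate with an RBF kernel — an illustration of the general technique only, not the authors' implementation; the function names, `gamma`, and the toy Gaussian data are all assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd2(Xs, Xt, gamma=1.0):
    """Biased empirical estimate of squared MMD between source and target samples."""
    Kss = rbf_kernel(Xs, Xs, gamma)
    Ktt = rbf_kernel(Xt, Xt, gamma)
    Kst = rbf_kernel(Xs, Xt, gamma)
    return Kss.mean() + Ktt.mean() - 2.0 * Kst.mean()

rng = np.random.default_rng(0)
# Same distribution vs. a mean-shifted "target domain".
same = mmd2(rng.normal(0, 1, (200, 5)), rng.normal(0, 1, (200, 5)))
shifted = mmd2(rng.normal(0, 1, (200, 5)), rng.normal(2, 1, (200, 5)))
print(same, shifted)  # shifted domains show a larger discrepancy
```

A domain-adaptation objective of the kind the abstract describes would minimize such a term over a learned mapping into a latent space, jointly with the classification loss.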
format | Article |
id | doaj-art-9fe4eda9aba048d8b49f302586ad6ddd |
institution | Kabale University |
issn | 1076-2787; 1099-0526 |
language | English |
publishDate | 2018-01-01 |
publisher | Wiley |
record_format | Article |
series | Complexity |
spelling | doaj-art-9fe4eda9aba048d8b49f302586ad6ddd; 2025-02-03T01:22:01Z; eng; Wiley; Complexity; 1076-2787, 1099-0526; 2018-01-01; 10.1155/2018/8425821; article 8425821; Unsupervised Domain Adaptation Using Exemplar-SVMs with Adaptation Regularization; Yiwei He (School of Computer and Control Engineering, University of Chinese Academy of Sciences, Beijing 100049, China); Yingjie Tian (Research Center on Fictitious Economy and Data Science, Chinese Academy of Sciences, Beijing 100190, China); Jingjing Tang (School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China); Yue Ma (School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China); http://dx.doi.org/10.1155/2018/8425821 |
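The exemplar-SVM idea underlying the model in the abstract — one linear SVM per positive exemplar against a shared negative pool, with predictions integrated across the resulting classifiers — can be sketched as follows. This is a generic NumPy illustration under assumed hyperparameters (`C_pos`, `C_neg`, learning rate, toy 2-D data), not the DAESVM formulation itself, which additionally incorporates the MMD regularizer.

```python
import numpy as np

def train_exemplar_svm(x_pos, X_neg, C_pos=10.0, C_neg=0.1, lr=0.01, epochs=300):
    """One linear SVM: a single positive exemplar vs. all negatives,
    trained by subgradient descent on a cost-weighted hinge loss."""
    X = np.vstack([x_pos[None, :], X_neg])
    y = np.r_[1.0, -np.ones(len(X_neg))]
    C = np.where(y > 0, C_pos, C_neg)   # upweight the lone positive
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        viol = y * (X @ w + b) < 1.0    # margin violators
        w -= lr * (w - X[viol].T @ (C[viol] * y[viol]))
        b += lr * np.sum(C[viol] * y[viol])
    return w, b

def ensemble_score(x, exemplar_svms):
    """Integrate the per-exemplar classifiers: take the maximum response."""
    return max(x @ w + b for w, b in exemplar_svms)

rng = np.random.default_rng(1)
X_neg = rng.normal(0.0, 0.3, (50, 2))        # shared negative pool
exemplars = rng.normal(2.0, 0.3, (5, 2))     # positive exemplars
svms = [train_exemplar_svm(x, X_neg) for x in exemplars]
near = ensemble_score(np.array([2.0, 2.0]), svms)
far = ensemble_score(np.array([0.0, 0.0]), svms)
```

Taking the maximum over exemplar classifiers is one common integration rule for exemplar-SVM ensembles; calibrated or averaged variants are also used in practice.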
spellingShingle | Yiwei He; Yingjie Tian; Jingjing Tang; Yue Ma; Unsupervised Domain Adaptation Using Exemplar-SVMs with Adaptation Regularization; Complexity |
title | Unsupervised Domain Adaptation Using Exemplar-SVMs with Adaptation Regularization |
title_sort | unsupervised domain adaptation using exemplar svms with adaptation regularization |
url | http://dx.doi.org/10.1155/2018/8425821 |