EW-CACTUs-MAML: A Robust Metalearning System for Rapid Classification on a Large Number of Tasks

Bibliographic Details
Main Authors: Wen-Feng Wang, Jingjing Zhang, Peng An
Format: Article
Language: English
Published: Wiley 2022-01-01
Series: Complexity
Online Access: http://dx.doi.org/10.1155/2022/7330823
Description
Summary: This study aims to develop a robust metalearning system for rapid classification on a large number of tasks. Model-agnostic metalearning (MAML) with the CACTUs method (clustering to automatically construct tasks for unsupervised metalearning) is improved to EW-CACTUs-MAML by integrating the entropy weight (EW) method. Few-shot mechanisms are introduced into the deep network for efficient learning of a large number of tasks. The implementation process is theoretically interpreted as "gene intelligence." Validation of EW-CACTUs-MAML on a typical dataset (Omniglot) yields an accuracy of 97.42%, outperforming CACTUs-MAML (validation accuracy = 97.22%). At the end of this paper, the applicability of our approach to improving another metalearning system (EW-CACTUs-ProtoNets) is also preliminarily discussed, based on cross-validation on another typical dataset (Miniimagenet).
ISSN: 1099-0526
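
The summary names two building blocks: the entropy weight (EW) method and CACTUs-style task construction, i.e. clustering unlabeled embeddings into pseudo-classes and sampling N-way K-shot tasks from them. The sketch below illustrates both ideas in Python; it is not the authors' code. The abstract does not state how EW is wired into CACTUs-MAML, so here the entropy weights simply rescale the embedding dimensions before k-means, and all function names and parameters (entropy_weights, cactus_tasks, n_way, k_shot, n_clusters) are illustrative assumptions.

```python
# Minimal sketch of EW-weighted, CACTUs-style task construction (not the paper's code).
import numpy as np
from sklearn.cluster import KMeans


def entropy_weights(X, eps=1e-12):
    """Standard entropy weight method over an m x n matrix (m samples, n criteria)."""
    X = np.asarray(X, dtype=float)
    m, _ = X.shape
    # Min-max normalize each column, then convert to column-wise proportions.
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + eps)
    P = Z / (Z.sum(axis=0) + eps)
    # Shannon entropy per column, scaled into [0, 1].
    e = -(P * np.log(P + eps)).sum(axis=0) / np.log(m)
    d = 1.0 - e                       # degree of diversification
    return d / d.sum()                # weights summing to 1


def cactus_tasks(embeddings, n_way=5, k_shot=1, k_query=15,
                 n_clusters=100, n_tasks=10, rng=None):
    """Cluster unlabeled embeddings into pseudo-classes and sample N-way K-shot tasks."""
    rng = np.random.default_rng(rng)
    w = entropy_weights(embeddings)              # EW over embedding dims (assumption)
    Z = embeddings * np.sqrt(w)                  # reweight dims before clustering
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(Z)

    need = k_shot + k_query
    usable = [c for c in range(n_clusters) if np.sum(labels == c) >= need]
    tasks = []
    for _ in range(n_tasks):
        classes = rng.choice(usable, size=n_way, replace=False)
        support, query = [], []
        for new_label, c in enumerate(classes):
            idx = rng.choice(np.where(labels == c)[0], size=need, replace=False)
            support += [(i, new_label) for i in idx[:k_shot]]
            query += [(i, new_label) for i in idx[k_shot:]]
        tasks.append((support, query))
    return tasks


if __name__ == "__main__":
    emb = np.random.rand(2000, 64)               # stand-in for unsupervised embeddings
    tasks = cactus_tasks(emb, n_way=5, k_shot=1, n_clusters=50, n_tasks=3, rng=0)
    print(len(tasks), "tasks; first support set size:", len(tasks[0][0]))
```

In a CACTUs-MAML pipeline, each sampled (support, query) pair would serve as one few-shot task: the support set drives the inner-loop adaptation and the query set the outer-loop meta-update.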