A Generic Framework for Mobile Crowdsensing: A Comprehensive Survey
Main Authors:
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10829933/
Summary: Mobile Crowdsensing (MCS) has emerged as a powerful paradigm for aggregating sensory data through the collaborative efforts of diverse mobile devices. While the paradigm enables innovative solutions, it also introduces new challenges, and despite the many solutions proposed in the MCS literature, open problems remain. Existing studies address different aspects and processes of MCS, each proposing solutions within its own specific framework. This diversity of frameworks complicates the integration and comparison of work in the field. In response, our work presents a structured framework for MCS that consolidates its operational processes into a cohesive system, integrating six key steps: registration, anterior data processing, incentivization, task allocation, task execution, and posterior data processing. By breaking MCS down into these subprocesses, the unified framework lets individual works fit in more easily, facilitating the comparison and integration of contributions in the field. It thus serves as a foundation for researchers and practitioners, encouraging progress and innovation in the ongoing development of MCS applications.
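The six subprocesses named in the summary form an ordered pipeline. The sketch below is purely illustrative: the stage names are taken from the abstract, while the enum, the `run_pipeline` helper, and the handler dictionary are assumptions introduced here to show how such a staged framework might be modeled, not part of the surveyed paper.

```python
from enum import Enum, auto

class MCSStage(Enum):
    """Subprocesses of the MCS framework, in the order given in the abstract."""
    REGISTRATION = auto()
    ANTERIOR_DATA_PROCESSING = auto()
    INCENTIVIZATION = auto()
    TASK_ALLOCATION = auto()
    TASK_EXECUTION = auto()
    POSTERIOR_DATA_PROCESSING = auto()

def run_pipeline(task, handlers):
    """Apply each stage's handler to the task in sequence.
    `handlers` is a hypothetical dict mapping MCSStage -> callable;
    stages without a handler pass the task through unchanged."""
    for stage in MCSStage:  # Enum iteration follows definition order
        task = handlers.get(stage, lambda t: t)(task)
    return task

# Usage: record which stages a task passes through.
trace = []
handlers = {stage: (lambda s: (lambda t: trace.append(s.name) or t))(stage)
            for stage in MCSStage}
run_pipeline({"id": 1}, handlers)
```

Modeling the framework as an explicit stage sequence is one way to make separate MCS contributions comparable: a given paper's technique can be slotted into the stage it addresses.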
ISSN: 2169-3536