CARP Mobile Sensing Framework in Flutter #
This library contains the core Flutter package for the CARP Mobile Sensing (CAMS) framework.
Supports cross-platform (iOS and Android) mobile sensing.
For an overview of all CAMS packages, see CARP Mobile Sensing in Flutter.
For documentation on how to use CAMS, see the CAMS wiki.
Usage #
To use this plugin, add carp_core and carp_mobile_sensing as dependencies in your pubspec.yaml file:
dependencies:
carp_core: ^latest
carp_mobile_sensing: ^latest
Configuration #
When you add CAMS to your app, there are a few things to configure.
First, CAMS relies on the flutter_local_notifications plugin. If you want to use App Tasks and notifications, you must configure your app for each platform it supports, i.e., both Android and iOS. There are a lot of details in configuring notifications, especially on Android, so read this carefully.
Android Integration #
Set the minimum Android SDK to 26 and the compile and target SDK versions to 34 by setting minSdkVersion, compileSdkVersion, and targetSdkVersion in the build.gradle file, located in the android/app/ folder:
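As a sketch, reconstructed from the SDK versions stated above (the surrounding blocks come from the standard Flutter Android template; adjust to your own Gradle setup), the relevant build.gradle entries could look like:

```gradle
android {
    // Compile against Android SDK 34.
    compileSdkVersion 34

    defaultConfig {
        // CAMS requires Android 8.0 (API level 26) or later.
        minSdkVersion 26
        targetSdkVersion 34
    }
}
```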
If you collect step counts or use notifications in your app, add the following to your app's AndroidManifest.xml file, located in android/app/src/main:
<!-- Used for activity recognition (step count) -->
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION"/>
<!-- Used for sending and scheduling notifications -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED"/>
<uses-permission android:name="android.permission.VIBRATE" />
<uses-permission android:name="android.permission.POST_NOTIFICATIONS"/>
<uses-permission android:name="android.permission.USE_EXACT_ALARM" />
<uses-permission android:name="android.permission.SCHEDULE_EXACT_ALARM"
android:maxSdkVersion="32" />
Also specify the following between the <application> tags so that the plugin can show the scheduled notifications:
<!-- Used for scheduling notifications -->
<receiver android:exported="false" android:name="com.dexterous.flutterlocalnotifications.ScheduledNotificationReceiver" />
<receiver android:exported="false" android:name="com.dexterous.flutterlocalnotifications.ScheduledNotificationBootReceiver">
<intent-filter>
<action android:name="android.intent.action.BOOT_COMPLETED"/>
<action android:name="android.intent.action.MY_PACKAGE_REPLACED"/>
<action android:name="android.intent.action.QUICKBOOT_POWERON" />
<action android:name="com.htc.intent.action.QUICKBOOT_POWERON"/>
</intent-filter>
</receiver>
iOS Integration #
The pedometer (step count) probe uses the Core Motion framework on iOS, and the NSMotionUsageDescription key needs to be specified in the app's Info.plist file located in ios/Runner:
<key>NSMotionUsageDescription</key>
<string>This application tracks your steps</string>
NOTE: Other CAMS sampling packages require additional permissions in the AndroidManifest.xml or Info.plist files.
See the documentation for each package.
Documentation #
The Dart API doc describes the different libraries and classes.
The CAMS wiki contains detailed documentation on the CARP Mobile Sensing Framework, including
the domain model,
how to use it by creating a study configuration,
how to extend it, and
an overview of the available measure types.
More scientific documentation of CAMS is available in the following papers:
Bardram, Jakob E. "The CARP Mobile Sensing Framework--A Cross-platform, Reactive, Programming Framework and Runtime Environment for Digital Phenotyping." arXiv preprint arXiv:2006.11904 (2020). [pdf]
Bardram, Jakob E. "Software Architecture Patterns for Extending Sensing Capabilities and Data Formatting in Mobile Sensing." Sensors 22.7 (2022). [pdf].
@article{bardram2020carp,
title={The CARP Mobile Sensing Framework--A Cross-platform, Reactive, Programming Framework and Runtime Environment for Digital Phenotyping},
author={Bardram, Jakob E},
journal={arXiv preprint arXiv:2006.11904},
year={2020}
}
@article{bardram2022software,
title={Software Architecture Patterns for Extending Sensing Capabilities and Data Formatting in Mobile Sensing},
author={Bardram, Jakob E},
journal={Sensors},
volume={22},
number={7},
year={2022},
publisher={MDPI}
}
Please use these references in any scientific papers using CAMS.
Examples of configuring and using CAMS #
There is a very simple example app which shows how a study can be created with different tasks and measures.
This app just prints the sensing data to a console screen on the phone.
There is also a range of different examples on how to create a study to take inspiration from.
However, the CARP Mobile Sensing App provides a much more complete example of how to use the framework in a Flutter BLoC architecture, including good documentation of how to do this.
Below is a small primer on the use of CAMS for a very simple sampling study running locally on the phone. This example is similar to the example app.
Following carp_core, a CAMS study is configured, deployed, executed, and used in the following steps:
Define a SmartphoneStudyProtocol.
Deploy this protocol to the SmartPhoneClientManager.
Use the generated data (called measurements) locally in the app or specify how and where to store or upload it using a DataEndPoint.
Control the execution of the study, like calling start.
Defining a SmartphoneStudyProtocol #
In CAMS, a sensing protocol is configured in a SmartphoneStudyProtocol.
Below is a simple example of how to set up a protocol that samples step counts, ambient light, screen events, and battery events.
final phone = Smartphone();
final protocol = SmartphoneStudyProtocol(
ownerId: '[email protected]',
name: 'Tracking steps, light, screen, and battery',
dataEndPoint: SQLiteDataEndPoint(),
)
..addPrimaryDevice(phone)
..addTaskControl(
DelayedTrigger(delay: const Duration(seconds: 10)),
BackgroundTask(measures: [
Measure(type: SensorSamplingPackage.STEP_COUNT),
Measure(type: SensorSamplingPackage.AMBIENT_LIGHT),
Measure(type: DeviceSamplingPackage.SCREEN_EVENT),
Measure(type: DeviceSamplingPackage.BATTERY_STATE),
]),
phone,
Control.Start,
);
The above example defines a simple SmartphoneStudyProtocol which will use a Smartphone as a primary device for data collection and store data in a SQLite database locally on the phone using a SQLiteDataEndPoint.
Sampling is configured by adding a TaskControl to the protocol using a DelayedTrigger which triggers a BackgroundTask containing four different Measures.
When this task control is triggered (after a delay of 10 seconds), the sampling will start.
Sampling can be configured in very sophisticated ways, by specifying different types of devices, task controls, triggers, tasks, measures, and sampling configurations.
See the CAMS wiki for an overview and more details.
Deploying and Running a Study on a SmartPhoneClientManager #
In CAMS, we talk about a study protocol being 'deployed' on a primary device, like a phone. CAMS has a fairly sophisticated software architecture for doing this.
However, if we just want to define and deploy a study locally on the phone, this can be done using the SmartPhoneClientManager singleton.
// Create and configure a client manager for this phone.
await SmartPhoneClientManager().configure();
// Create a study based on the protocol.
SmartPhoneClientManager().addStudyProtocol(protocol);
// Start sampling.
SmartPhoneClientManager().start();
In this example, the client manager is configured, the protocol is added, and sampling is started. This can actually be done in one line of code, like this:
SmartPhoneClientManager().configure().then((_) => SmartPhoneClientManager()
.addStudyProtocol(protocol)
.then((_) => SmartPhoneClientManager().start()));
This will start the sampling, as specified in the protocol, and data is stored in the database.
Using the generated data #
The generated data can be accessed and used in the app. Access to data is done by listening on the measurements stream from the client manager:
// Listening on the data stream and print them as json.
SmartPhoneClientManager()
.measurements
.listen((measurement) => print(toJsonString(measurement)));
Note that measurements is a Dart Stream and you can therefore apply all the usual stream operations to the collected measurements, including filtering, mapping, reducing, and transforming them.
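For example, using only standard Dart Stream operations (a sketch; toJsonString and the measurements stream are from the CAMS examples above):

```dart
// Sketch: convert each measurement to JSON and print only the
// first ten, to avoid flooding the console.
SmartPhoneClientManager()
    .measurements
    .map((measurement) => toJsonString(measurement))
    .take(10)
    .listen(print);
```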
Controlling the sampling of data #
The execution of sensing can be controlled at runtime by starting, stopping, and disposing of sampling.
For example, calling SmartPhoneClientManager().stop() would stop the study running on the client. Calling start() would (re)start it again.
Calling SmartPhoneClientManager().dispose() disposes of the client manager. Once dispose is called, you can no longer call start or stop. This method is typically called from a Flutter widget's dispose() method.
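As a sketch, wiring this into a Flutter StatefulWidget could look like the following (SensingPage and its state class are illustrative names, not part of CAMS):

```dart
class SensingPageState extends State<SensingPage> {
  @override
  void dispose() {
    // Stop sampling and release client resources when the widget goes away.
    SmartPhoneClientManager().stop();
    SmartPhoneClientManager().dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) => const SizedBox.shrink();
}
```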
Extending CAMS #
CAMS is designed to be extended in several ways:
adding new sampling capabilities by implementing a Sampling Package,
adding new data management and backend support by creating a Data Manager, and
creating data and privacy transformer schemas that can transform CARP data to other formats, including privacy-protecting them, by implementing a Transformer Schema.
For example, you can write your own DataEndPoint definitions and a corresponding DataManager class for uploading data to your own data endpoint. See the wiki on how to add a new data manager.
Please see the wiki on how to extend CAMS.
Features and bugs #
Please read about existing issues and file new feature requests and bug reports at the issue tracker.
License #
This software is copyright (c) the Technical University of Denmark (DTU) and is part of the Copenhagen Research Platform.
This software is available 'as-is' under an MIT license.