On-Device Features
Sentiance offers a variety of features whose processing happens strictly offline, on the device. This page lists the on-device features that are available via the Sentiance SDK.
The features listed on this page must first be enabled by Sentiance for your app before you can start using them. Please reach out to our support team to have a feature enabled.
Some features that are listed here are released as Early Access. Please check out our Production Readiness page to find out more.
Vehicle Crash Detection
This feature offers the ability to detect vehicle crashes (collisions) in real time during trips. Sentiance processes various sensor and location data in real time on the device to detect crashes and notify your app.
Crash data includes information such as location, magnitude of the detected peak, and confidence.
You can learn more about this feature in our Vehicle Crash Detection page.
Activation for Android
For Android apps, in order to activate vehicle crash detection and receive crash data, you must add the com.sentiance:sdk-crash-detection artifact as a dependency. You can then subscribe for vehicle crash events via the CrashDetectionApi class.
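For illustration, subscribing for crash events might look like the following Kotlin sketch. The listener method, event field names, and import paths shown here are assumptions based on the class names mentioned on this page; consult the CrashDetectionApi reference for the exact signatures.

```kotlin
import android.content.Context
// Import path is illustrative; check the SDK documentation for your version.
import com.sentiance.sdk.crashdetection.api.CrashDetectionApi

fun subscribeForCrashEvents(context: Context) {
    CrashDetectionApi.getInstance(context).setVehicleCrashListener { crashEvent ->
        // Crash data includes the location, the magnitude of the detected
        // peak, and a confidence value (field names assumed).
        val location = crashEvent.location
        val magnitude = crashEvent.magnitude
        val confidence = crashEvent.confidence
        // e.g. notify your backend or trigger an emergency flow here
        println("Crash at $location (magnitude=$magnitude, confidence=$confidence)")
    }
}
```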
Activation for iOS
For iOS apps, crash detection is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, to activate and access crash detection, install the @sentiance-react-native/crash-detection module.
Activation for Flutter
For Flutter apps, to activate and access crash detection, install the crash detection package.
Transport Classification
The Sentiance SDK processes various sensor and location data in real time, directly on the device, to segment a user's trip into various transport modes, such as walking, biking, and car. The SDK uses Sentiance's state-of-the-art deep learning model to achieve this.
The classification results are accessible via the event timeline and the user's context data.
Transport Mode Classification: Considerations and Recommendations
Please note that, currently, our on-device system utilizes a "sensor-only" model to classify transportation modes. This means that the model does not incorporate GIS-related features and is unaware of the surrounding environment. We have designed it this way to prioritize user privacy and minimize battery consumption. However, we are considering integrating these functionalities in the future.
Given the model's reliance solely on sensor data, certain misclassifications are more probable than others. The aggregated accuracy for all the transport modes stands at approximately 80%. Misclassifications are particularly prevalent between related categories such as:
Vehicles: including cars, buses, and motorcycles.
Rail: encompassing trains and trams.
For use cases that permit it, we recommend consolidating these classes, i.e. merging cars, buses, and motorcycles into a single vehicle class, and trains and trams into a single rail class. By simplifying the categories to six classes (idle, walking, running, biking, rail, and vehicle), the overall accuracy can be enhanced to around 90%. If you implement this recommendation, most of the remaining misclassifications are expected to come from the model's slight delay in recognizing changes in transport mode. A sketch of such a consolidation follows below.
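The Kotlin sketch below illustrates the recommended consolidation. The TransportMode enum here is a stand-in for the SDK's own transport mode type, whose exact name and values may differ.

```kotlin
// Stand-in for the SDK's transport mode type; the actual enum name and
// values in the SDK may differ from what is shown here.
enum class TransportMode { IDLE, WALKING, RUNNING, BICYCLE, CAR, BUS, MOTORCYCLE, TRAIN, TRAM }

// The six consolidated classes recommended above.
enum class ConsolidatedMode { IDLE, WALKING, RUNNING, BIKING, RAIL, VEHICLE }

fun consolidate(mode: TransportMode): ConsolidatedMode = when (mode) {
    // Cars, buses, and motorcycles merge into a single vehicle class.
    TransportMode.CAR, TransportMode.BUS, TransportMode.MOTORCYCLE -> ConsolidatedMode.VEHICLE
    // Trains and trams merge into a single rail class.
    TransportMode.TRAIN, TransportMode.TRAM -> ConsolidatedMode.RAIL
    TransportMode.IDLE -> ConsolidatedMode.IDLE
    TransportMode.WALKING -> ConsolidatedMode.WALKING
    TransportMode.RUNNING -> ConsolidatedMode.RUNNING
    TransportMode.BICYCLE -> ConsolidatedMode.BIKING
}
```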
Activation for Android
For Android apps, this feature is included after adding a dependency on one of the following packages:
com.sentiance:sdk-event-timeline
com.sentiance:sdk-user-context
com.sentiance:sdk-lifestyle
Activation for iOS
For iOS apps, transport classification is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/event-timeline
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.
Venue Type and Home/Work Detection
The Sentiance SDK can enrich a user's stationary data with information about the significance of the stationary moment (e.g. home, work, or a point of interest) and the type of venue (e.g. restaurant, fitness center, etc.). This is possible with Sentiance's on-device, fully offline, venue type mapping feature.
The SDK uses Sentiance's venue data source, by downloading and storing this data on the device on an as-needed basis. The data is divided into large geographic areas (town or city level), and is updated as the user navigates to different areas.
With the use of Sentiance's state-of-the-art deep learning model for venue type mapping and home/work detection, the user's stationary data is enriched and refined as more data is collected.
The venue information is accessible via the event timeline and the user's context data.
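For illustration, reading a stationary moment's venue information from the timeline might look like the sketch below. The event and venue type names (StationaryEvent, venue, significance, type) and the import paths are assumptions based on this page; consult the API reference for the exact names.

```kotlin
import android.content.Context
// Import paths are illustrative; the StationaryEvent import is omitted
// because its package depends on the SDK version.
import com.sentiance.sdk.eventtimeline.api.EventTimelineApi
import java.util.Date

fun logRecentVenues(context: Context, from: Date, to: Date) {
    val events = EventTimelineApi.getInstance(context).getTimelineEvents(from, to)
    for (event in events) {
        if (event is StationaryEvent) {
            val venue = event.venue
            // e.g. significance = home / work / point of interest,
            //      type = restaurant, fitness center, etc.
            println("Stationary at venue: significance=${venue.significance}, type=${venue.type}")
        }
    }
}
```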
Activation for Android
For Android apps, this feature is included after adding a dependency on one of the following packages:
com.sentiance:sdk-event-timeline
com.sentiance:sdk-user-context
com.sentiance:sdk-lifestyle
Activation for iOS
For iOS apps, this feature is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/event-timeline
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.
Event Timeline
The Sentiance SDK uses its detections to generate a timeline of the user's historic transports and stationary moments. This is known as the user's Event Timeline. With the help of the various on-device features, this data is enriched with additional information, such as the transport mode and distance, and a stationary moment's venue type and significance.
You can utilize the SDK's Event Timeline API methods to query for historic timeline events, and subscribe for real time timeline updates. See this how-to guide for code examples.
Activation for Android
For Android apps, this feature is included after adding a dependency on one of the following packages:
com.sentiance:sdk-event-timeline
com.sentiance:sdk-user-context
com.sentiance:sdk-lifestyle
You can then subscribe for timeline updates, or query for historic events, via the EventTimelineApi class.
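As an illustrative sketch, querying historic events and subscribing for updates might look like the Kotlin below. The method names are assumptions based on the class name above; consult the EventTimelineApi reference for the exact signatures.

```kotlin
import android.content.Context
// Import path is illustrative; check the SDK documentation for your version.
import com.sentiance.sdk.eventtimeline.api.EventTimelineApi
import java.util.Date

fun queryAndSubscribe(context: Context) {
    val api = EventTimelineApi.getInstance(context)

    // Query historic timeline events, e.g. for the past 7 days.
    val now = Date()
    val weekAgo = Date(now.time - 7L * 24 * 60 * 60 * 1000)
    val events = api.getTimelineEvents(weekAgo, now)
    println("Fetched ${events.size} timeline events")

    // Subscribe for real-time updates as events are created or refined.
    api.setTimelineUpdateListener { event ->
        println("Timeline event updated: $event")
    }
}
```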
Activation for iOS
For iOS apps, this feature is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/event-timeline
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.
User Current Context Information
A user's current context provides information about the current state of the user. This includes things like:
Recent transport and stationary events, with venue type information if available.
The user's last known location.
The user's home and work venues, if detected.
The user's current semantic time (e.g. morning, lunch time).
The user's segments, if detected.
The population of this context information happens offline, on the device. See this how-to guide for code examples.
Activation for Android
For Android apps, in order to activate and access the user's current context information, you must add the com.sentiance:sdk-user-context artifact as a dependency. By default, this artifact excludes segment detection. To activate segment detection, you can instead add a dependency on the com.sentiance:sdk-lifestyle artifact, which activates all lifestyle profiling features.
You can then subscribe for user context updates, or request the context directly, via the UserContextApi class.
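For illustration, requesting the current context might look like the sketch below. The request and completion-listener method names, and the UserContext fields shown here, are assumptions; consult the UserContextApi reference for the exact signatures.

```kotlin
import android.content.Context
// Import path is illustrative; check the SDK documentation for your version.
import com.sentiance.sdk.usercontext.api.UserContextApi

fun requestCurrentContext(context: Context) {
    val api = UserContextApi.getInstance(context)

    // Request the current context once; the operation completes asynchronously.
    api.requestUserContext().addOnCompleteListener { operation ->
        if (operation.isSuccessful) {
            val userContext = operation.result
            println("Last known location: ${userContext.lastKnownLocation}")
            println("Recent events: ${userContext.events.size}")
        }
    }
}
```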
Activation for iOS
For iOS apps, the user's current context information is available via the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/event-timeline
@sentiance-react-native/user-context
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.
Semantic Time
The Sentiance SDK can enrich a user's context by providing information about the user's current semantic time. Semantic time (e.g. morning, lunch time, etc.) is personalized based on a user's timeline data, and becomes more accurate over time.
The semantic time is accessible via the user's current context.
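For illustration, once a user context has been obtained (see the previous section), reading the semantic time might look like the sketch below; the property name and enum values are assumptions.

```kotlin
// Illustrative only: `semanticTime` and the enum values are assumed names;
// UserContext and SemanticTime refer to the SDK's user context types.
fun greetUser(userContext: UserContext) {
    when (userContext.semanticTime) {
        SemanticTime.MORNING -> println("Good morning!")
        SemanticTime.LUNCH -> println("Enjoy your lunch!")
        else -> println("Current semantic time: ${userContext.semanticTime}")
    }
}
```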
Activation for Android
For Android apps, this feature is included after adding a dependency on the com.sentiance:sdk-user-context artifact.
Activation for iOS
For iOS apps, this feature is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/event-timeline
@sentiance-react-native/user-context
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.
Segment Detection
The Sentiance SDK can enrich a user's context by assigning different segmentation profiles, e.g. heavy commuter, low/medium/high social activity, and more. This is possible with Sentiance's multi-platform decision engine, which runs directly on the device. It consumes a user's timeline data and produces segments that apply to the user.
The detected segments are accessible via the user's current context.
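For illustration, reading the detected segments from an already-obtained user context might look like the sketch below; the property and field names are assumptions.

```kotlin
// Illustrative only: `activeSegments` and the segment fields are assumed
// names; UserContext refers to the SDK's user context type.
fun logSegments(userContext: UserContext) {
    for (segment in userContext.activeSegments) {
        // e.g. "heavy commuter", "high social activity"
        println("Detected segment: ${segment.name}")
    }
}
```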
Activation for Android
For Android apps, this feature is included after adding a dependency on the com.sentiance:sdk-lifestyle artifact.
Activation for iOS
For iOS apps, this feature is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/event-timeline
@sentiance-react-native/user-context
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.
Driving Insights
The Sentiance SDK can provide insights into the driving behaviour exhibited during vehicular transports. For example, for a given transport, scores can be computed for various safe driving attributes, such as attention, smooth driving, and legal driving. These scores are based on a variety of detections that the SDK performs during transports, such as harsh driving, phone usage, and phone calls. These insights are made available via the Driving Insights API. See this how-to guide for code examples.
Activation for Android
For Android apps, this feature is included after adding a dependency on the com.sentiance:sdk-driving-insights artifact.
You can then subscribe for driving insights updates, or request them directly, via the DrivingInsightsApi class.
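For illustration, consuming driving insights might look like the sketch below. The listener method and score field names are assumptions; consult the DrivingInsightsApi reference for the exact signatures.

```kotlin
import android.content.Context
// Import path is illustrative; check the SDK documentation for your version.
import com.sentiance.sdk.drivinginsights.api.DrivingInsightsApi

fun subscribeForDrivingInsights(context: Context) {
    val api = DrivingInsightsApi.getInstance(context)

    // Invoked when insights for a completed transport become available.
    api.setDrivingInsightsReadyListener { insights ->
        val scores = insights.safetyScores
        println("smooth=${scores.smoothScore}, legal=${scores.legalScore}")
    }
}
```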
Activation for iOS
For iOS apps, this feature is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/event-timeline
@sentiance-react-native/driving-insights
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.
Smart Geofences
The Sentiance SDK can monitor points of interest and notify your app about entries into and exits from these monitored regions. Monitoring works in the background when the background location permission is granted, and in the foreground (i.e. while the app is open) when the while-in-use location permission is granted.
For the moment, creating geofences directly via the SDK's API is not supported. Instead, you can create geofences on the Sentiance backend. The SDK regularly refreshes the list of geofences it monitors. Alternatively, you can request an immediate refresh from the SDK.
You can use the Smart Geofences API to subscribe for entry and exit event notifications. See this how-to guide for code examples.
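For illustration, subscribing for entry and exit events and requesting an immediate refresh might look like the sketch below. The listener and refresh method names are assumptions; consult the Smart Geofences API reference for the exact signatures.

```kotlin
import android.content.Context
// Import path is illustrative; check the SDK documentation for your version.
import com.sentiance.sdk.smartgeofences.api.SmartGeofencesApi

fun monitorGeofences(context: Context) {
    val api = SmartGeofencesApi.getInstance(context)

    // Notifies your app on entries into and exits from monitored regions.
    api.setSmartGeofenceEventListener { event ->
        println("Geofence event: $event")
    }

    // Optionally ask the SDK to refresh its monitored geofences right away,
    // instead of waiting for the regular refresh.
    api.refreshGeofences()
}
```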
Activation for Android
For Android apps, this feature is included after adding a dependency on the com.sentiance:sdk-smart-geofences artifact.
Activation for iOS
For iOS apps, this feature is included in the main SDK framework. No additional dependencies are needed.
Activation for React Native
For React Native apps, this feature is included after installing the following modules:
@sentiance-react-native/core
@sentiance-react-native/smart-geofences
Activation for Flutter
For Flutter apps, this feature is included after installing the corresponding Sentiance Flutter packages.