
Sensitivity & Tolerance

The Sensitivity & Tolerance section allows you to evaluate and optimize sensor configurations by testing various parameters, including sample rate, bit depth, noise levels, and error tolerance. This helps determine the impact of different conditions on model accuracy and refine system performance.

Trained Tool Overview

In this section, you can view detailed information about the trained tool, including versioning, sample rates, and target ranges. The following fields are available:

  • Trained Tool Description: A brief overview of the trained tool, including its purpose.
  • Version: The version number of the trained tool.
  • Created Date & Time: When the tool was created.
  • Sample Rate: The rate at which data samples were recorded.
  • Target Range (Number of Classes): The number of output classifications.
  • Status: Whether the tool is active or inactive.

Data Sample Lists

The Data Sample Lists section displays the datasets available for testing. Each dataset includes the following metadata:

  • List Name: Name of the data sample list.
  • List Type: The type of dataset.
  • Data Shape: The structure of the data samples.
  • Sample Rate: The data collection frequency.
  • N Samples: The number of samples in the dataset.
  • Target Range: The range of expected values.
  • Created Date & Time: Timestamp of dataset creation.
  • Modified Date & Time: Timestamp of the last modification.
  • Comments: Notes or remarks related to the dataset.
  • Status: Whether the dataset is active or archived.
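For illustration, the metadata above can be modeled as a simple record. The dataclass below is a hypothetical mirror of the listed fields, not the product's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DataSampleList:
    """Illustrative record of a data sample list's metadata (hypothetical names)."""
    list_name: str
    list_type: str
    data_shape: tuple       # structure of the data samples, e.g. (n_samples, n_channels)
    sample_rate: float      # data collection frequency in Hz
    n_samples: int
    target_range: int       # range of expected values / number of classes
    created: datetime
    modified: datetime
    comments: str = ""
    status: str = "active"  # "active" or "archived"
```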

Running a Sensor Sensitivity Test

Step 1: Select a Trained Tool and Dataset

  1. Navigate to the Trained Tool section.
  2. Select the required Trained Tool and Data Sample List from the available options.
  3. Click New Sensor Sensitivity Test to initiate the test.

Step 2: Configure Test Parameters

  1. In the Sensor Sensitivity Test window, select one Specification Variable to vary while keeping the other parameters constant.
  2. Choose the Range Variable from the following options:
    • Sample Rate
    • Bit Depth
    • Noise
    • None (Fixed Settings)
  3. Configure the test settings under the Settings tab:
    • Sample Rate (Min & Max Values): Define the minimum and maximum sample rates.
    • Bandwidth: Automatically adjusted based on the sample rate.
    • Advanced Options (Optional):
      • Bit Depth: Set the bit depth (default is native bit depth).
      • Quantization Depth: Adjusted based on the actual data range of each channel.
      • Noise (dB Level): Select noise parameters for testing.
      • Level Adjustment: Choose between:
        • SNR by sample (Signal-to-Noise Ratio per sample)
        • Fixed Background Noise
  4. Click Hide Advanced Options to collapse these settings.
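As a rough illustration of what a single sensitivity test point involves, the sketch below degrades a signal by resampling to a test rate, quantizing to a given bit depth over the actual data range, and adding noise at a per-sample SNR (the "SNR by sample" option). All function and parameter names are hypothetical; the product's internal processing may differ.

```python
import numpy as np

def degrade_signal(x, native_rate, test_rate, bit_depth=None, snr_db=None):
    """Simulate one sensitivity test point (illustrative, not the product's code)."""
    # Decimate to the test sample rate (simple stride; a real tool would
    # low-pass filter first to avoid aliasing)
    step = max(1, int(round(native_rate / test_rate)))
    y = x[::step].astype(float)

    # Quantize to the requested bit depth over the actual data range,
    # mirroring the "Quantization Depth adjusted per channel" behavior
    if bit_depth is not None:
        lo, hi = y.min(), y.max()
        levels = 2 ** bit_depth - 1
        y = np.round((y - lo) / (hi - lo) * levels) / levels * (hi - lo) + lo

    # Add white noise scaled to a per-sample signal-to-noise ratio in dB
    if snr_db is not None:
        sig_power = np.mean(y ** 2)
        noise_power = sig_power / (10 ** (snr_db / 10))
        rng = np.random.default_rng(0)
        y = y + rng.normal(0.0, np.sqrt(noise_power), y.shape)
    return y
```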

Step 3: Define Step Count and Spacing

  1. Step Count: Enter the number of steps for testing.
  2. Spacing: Choose from:
    • x2
    • x10
    • Uniform
  3. The system will display Current Steps based on the selected spacing.
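The step values implied by each spacing option can be sketched as follows. This assumes x2 and x10 are geometric progressions from the minimum value and Uniform is linear between the minimum and maximum; the product's actual placement of Current Steps may differ.

```python
import numpy as np

def spacing_steps(min_val, max_val, step_count, spacing="Uniform"):
    """Generate test points for a given spacing option (illustrative sketch)."""
    if spacing == "Uniform":
        # Evenly spaced points between min and max, inclusive
        return np.linspace(min_val, max_val, step_count).tolist()
    # x2 / x10: multiply by a fixed factor at each step, capped at max_val
    factor = {"x2": 2, "x10": 10}[spacing]
    steps = [min_val * factor ** i for i in range(step_count)]
    return [s for s in steps if s <= max_val]
```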

Step 4: Select Test Type

  1. Try New Data As-Is: Tests the accuracy of the existing trained tool on a simulated data stream without retraining.
  2. Optimized k-Fold: Re-optimizes training for each test point using k-fold validation.
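The difference between the two test types can be sketched as below: scoring an already-trained model on new data versus retraining on each fold and averaging. `evaluate_as_is`, `evaluate_kfold`, and `make_model` are illustrative names, not the product's API.

```python
import numpy as np

def evaluate_as_is(model, X_test, y_test):
    """'Try New Data As-Is': score the existing trained tool, no retraining."""
    return float(np.mean(model.predict(X_test) == y_test))

def evaluate_kfold(make_model, X, y, k=5):
    """'Optimized k-Fold': retrain and score on each fold, then average."""
    folds = np.array_split(np.arange(len(X)), k)
    scores = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = make_model().fit(X[train], y[train])  # retrain per test point
        scores.append(np.mean(model.predict(X[test]) == y[test]))
    return float(np.mean(scores))
```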

Step 5: Start the Test

Click Start to begin the test.

Running an Error Tolerance Test

Step 1: Select a Trained Tool and Dataset

  1. Navigate to the Trained Tool section.
  2. Select the required Trained Tool and Data Sample List from the available options.
  3. Click New Error Tolerance Test to initiate the test.

Step 2: Configure Test Parameters

  1. Choose the Range Type:
    • % Error
    • Absolute Error
  2. Define the Maximum Error Delta as ± X %.
  3. In the Data Channels field, select the channels from the dropdown list.
  4. Enter the Step Count to determine test granularity.
  5. Select the Error Test Range. Accuracy will be tested in incremental steps from 0 to the maximum sensor error.
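A minimal sketch of how such a sweep might work, assuming uniform increments from 0 to the maximum error and a simple perturbation of one selected channel; all names and the exact perturbation are illustrative assumptions:

```python
import numpy as np

def error_sweep(max_error, step_count):
    """Error levels from 0 up to the maximum, in uniform increments."""
    return np.linspace(0.0, max_error, step_count).tolist()

def apply_channel_error(X, channel, delta, range_type="% Error"):
    """Perturb one data channel by +delta (percent or absolute) before scoring."""
    Y = X.astype(float).copy()
    if range_type == "% Error":
        Y[:, channel] *= 1.0 + delta / 100.0
    else:  # "Absolute Error"
        Y[:, channel] += delta
    return Y
```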

Step 3: Start the Test

Click Start to begin the error tolerance test.

Viewing Test Results

Once the tests are completed, results are displayed in the Test Results section.

Sensor Sensitivity Results

  • Displays Sample Rate, Bandwidth, Bit Depth, Noise Level, and Test Type.
  • Graphical Representation:
    The sensitivity results are plotted on a graph with:
    • X-Axis: Sample Rate (Hz) or Bandwidth
    • Y-Axis: Accuracy (%) or Precision
    You can switch the X- and Y-axis parameters using radio buttons.
  • Legend & Export Options:
    • The graph is color-coded based on accuracy or precision levels.
    • You can export data to the clipboard or delete results as needed.

Error Tolerance Results

  • This section displays:
    • Sample List Name: Name of the dataset used.
    • Range Type: % Error or Absolute Error.
    • Max Error Delta: The maximum permissible error.
  • Graphical Representation:
    • X-Axis: Absolute Error
    • Y-Axis: Accuracy (%)
    • The Data Columns are displayed on the right side for quick reference.

Together, these tests help you determine the sensor settings that yield the best accuracy, evaluate how different error levels affect model performance, and gain both visual and numerical insight into sensor reliability under varied conditions. By following this structured testing process, you can fine-tune your sensor configurations for optimal performance and reliability in your AI-driven applications.