Open Autonomous Intelligence Initiative

Advocates for Open AI Models

OAII Base Model — Sensor v0.2

Depends on: OAII Base Model — Common Types v0.2; OAII Base Model — Signal v0.2

1. Purpose

The Sensor object represents a source of observable data. Sensors produce Signals by observing some aspect of the environment, device state, or interaction surface. In the OAII Base Model, Sensors are responsible for observation, not interpretation.

Sensors enable edge‑primary autonomy by grounding Events in locally observable evidence while remaining hardware‑agnostic and privacy‑aware.


2. Definition

A Sensor is an abstract observational component that:

  • MUST produce one or more Signals;
  • MUST define what it observes and under what conditions;
  • MUST NOT directly declare Events or semantic meaning;
  • MAY maintain Sensor Knowledge to support stable observation and comparison.
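The constraints above can be sketched as a minimal abstract interface. This is illustrative only, assuming Python and a stand-in `Signal` shape (the real Signal object is defined in Signal v0.2); note the interface deliberately offers no way to declare Events or attach semantic meaning.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Any, List

@dataclass
class Signal:
    """Illustrative stand-in for the Signal object (see Signal v0.2)."""
    sensor_id: str
    value: Any
    confidence: float

class Sensor(ABC):
    """Abstract observational component: produces Signals, never Events."""

    @abstractmethod
    def observe(self) -> List[Signal]:
        """Produce one or more Signals describing what was observed.

        Interpretation (Event recognition, semantics) happens downstream;
        by design there is no method here for declaring Events.
        """
```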

3. Scope and Non‑Scope

In Scope:

  • Physical or virtual sensing components
  • Environmental, positional, acoustic, visual, or device‑state observation
  • Signal production and quality management
  • Sensor‑local knowledge needed for calibration and baseline stability

Out of Scope:

  • Event recognition and lifecycle control
  • Cross‑sensor semantic fusion

4. Core Responsibilities

The Sensor object:

  • Observes a defined phenomenon or state;
  • Produces Signals according to its capture model;
  • Maintains observation quality (health, calibration, drift awareness);
  • Applies privacy constraints at the source;
  • Supports Agent comparison by maintaining Sensor Knowledge.

5. Required Attributes

A conforming Sensor MUST expose the following attributes using shared types from Common Types v0.2.

5.1 Identity and Context

  • sensor_id : SensorId
  • device_ref : DeviceId

5.2 Sensor Classification

  • sensor_type : string
    Describes the sensing modality (e.g., motion, GPS, microphone, camera, contact).
  • observed_domain : ValueDomain
    Defines what values the Sensor can observe.

5.3 Capture Model

  • capture_mode : string
    Continuous, periodic, event‑triggered, or episodic.
  • capture_params : ParameterSet
    Sampling rate, resolution, window size, duty cycle, etc.

5.4 Privacy and Access

  • privacy_class : PrivacyClass
  • access_class : AccessClass
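Taken together, the required attributes in Sections 5.1 through 5.4 can be sketched as one record. The type aliases below are assumptions standing in for the shared types from Common Types v0.2, which this sketch does not define.

```python
from dataclasses import dataclass
from typing import Any, Dict

# Illustrative aliases for shared types from Common Types v0.2 (assumed shapes).
SensorId = str
DeviceId = str
ValueDomain = Dict[str, Any]
ParameterSet = Dict[str, Any]
PrivacyClass = str
AccessClass = str

@dataclass
class SensorAttributes:
    """Required attributes of a conforming Sensor (Section 5)."""
    sensor_id: SensorId
    device_ref: DeviceId
    sensor_type: str            # sensing modality, e.g. "motion", "GPS", "microphone"
    observed_domain: ValueDomain
    capture_mode: str           # "continuous" | "periodic" | "event-triggered" | "episodic"
    capture_params: ParameterSet  # sampling rate, resolution, window size, duty cycle, ...
    privacy_class: PrivacyClass
    access_class: AccessClass
```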

6. Sensor Knowledge

Each Sensor MAY maintain Sensor Knowledge to stabilize observation and enable comparison.

Sensor Knowledge includes:

  • calibration parameters
  • drift profiles
  • baselines and norms
  • learned routines or reference patterns
  • quality thresholds

6.1 Sensor Knowledge Bundle

When present, Sensor Knowledge MUST be represented using the mechanism bundle pattern:

  • sensor_knowledge_id : MechanismId
  • sensor_knowledge_params : ParameterSet
  • sensor_knowledge_result : ResultSet (optional)

Sensor Knowledge MUST be local, World‑scoped, and privacy‑constrained.
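A minimal sketch of the mechanism bundle pattern, assuming Python and the same illustrative type aliases for `MechanismId`, `ParameterSet`, and `ResultSet` as above. The parameter keys shown are hypothetical examples of the knowledge kinds listed in Section 6.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, Optional

# Illustrative aliases for shared types from Common Types v0.2 (assumed shapes).
MechanismId = str
ParameterSet = Dict[str, Any]
ResultSet = Dict[str, Any]

@dataclass
class SensorKnowledgeBundle:
    """Mechanism-bundle representation of Sensor Knowledge (Section 6.1)."""
    sensor_knowledge_id: MechanismId
    sensor_knowledge_params: ParameterSet = field(default_factory=dict)
    sensor_knowledge_result: Optional[ResultSet] = None  # optional per spec

# Hypothetical usage: calibration offsets and a baseline for one sensor.
bundle = SensorKnowledgeBundle(
    sensor_knowledge_id="mech-calibration-1",
    sensor_knowledge_params={"offset": 0.02, "baseline": 9.81},
)
```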


7. Relationships to Other Objects

  • Devices host Sensors
  • Sensors produce Signals
  • Signals carry observable evidence
  • Sensor Knowledge contextualizes Signals
  • Events consume Signals and Sensor Knowledge for recognition
  • Logs may record Sensor state and output subject to policy

Sensors MUST NOT bypass Signals to influence Events directly.


8. Methods

To standardize behavior, a conforming Sensor SHOULD support the following abstract methods:

8.1 capture_signal

  • method_params : ParameterSet (capture configuration or trigger context)
  • method_result : ResultSet (Signal reference or summary)

8.2 validate_sensor

Assesses sensor health and observation quality.

  • method_params : ParameterSet
  • method_result : ResultSet (health status, diagnostics)

8.3 update_sensor_knowledge

Updates calibration, baselines, or learned patterns.

  • method_params : ParameterSet (new observations or summaries)
  • method_result : ResultSet

Updates MUST preserve continuity of interpretation where possible.
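The three abstract methods can be sketched as follows, again assuming Python with dictionary stand-ins for `ParameterSet` and `ResultSet`. The `CountingSensor` is a hypothetical virtual Sensor (Section 11 permits these) included only to show the call shape.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict

ParameterSet = Dict[str, Any]  # assumed shape for Common Types v0.2
ResultSet = Dict[str, Any]     # assumed shape for Common Types v0.2

class AbstractSensor(ABC):
    """Abstract methods a conforming Sensor SHOULD support (Section 8)."""

    @abstractmethod
    def capture_signal(self, method_params: ParameterSet) -> ResultSet:
        """Capture configuration or trigger context in; Signal reference out."""

    @abstractmethod
    def validate_sensor(self, method_params: ParameterSet) -> ResultSet:
        """Assess sensor health and observation quality."""

    @abstractmethod
    def update_sensor_knowledge(self, method_params: ParameterSet) -> ResultSet:
        """Update calibration, baselines, or learned patterns."""

class CountingSensor(AbstractSensor):
    """Toy virtual Sensor observing an internal counter (illustrative only)."""

    def __init__(self) -> None:
        self.count = 0

    def capture_signal(self, method_params: ParameterSet) -> ResultSet:
        self.count += method_params.get("step", 1)
        return {"signal": {"value": self.count}}

    def validate_sensor(self, method_params: ParameterSet) -> ResultSet:
        return {"health": "ok", "diagnostics": {}}

    def update_sensor_knowledge(self, method_params: ParameterSet) -> ResultSet:
        return {"updated": True}
```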


9. Edge‑Primary Constraints

In edge‑primary systems, Sensors:

  • MUST function without continuous network access;
  • SHOULD support low‑power and duty‑cycled operation;
  • MAY degrade gracefully under partial failure;
  • MUST enforce privacy at the point of capture.
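Duty‑cycled operation can be sketched as a capture loop that samples once per period and idles for the remainder. This is a hypothetical helper, not part of the spec; in a real edge deployment the idle step would be a hardware low‑power state, and no network access is assumed or required.

```python
import time
from typing import Any, Callable, List

def duty_cycled_capture(capture_fn: Callable[[], Any],
                        period_s: float,
                        cycles: int) -> List[Any]:
    """Sample once per period, sleeping out the rest of each cycle.

    Runs entirely locally: the only dependency is the capture callable,
    so the loop keeps working without continuous network access.
    """
    samples = []
    for _ in range(cycles):
        t0 = time.monotonic()
        samples.append(capture_fn())                       # active window
        elapsed = time.monotonic() - t0
        time.sleep(max(0.0, period_s - elapsed))           # idle remainder
    return samples
```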

10. Failure and Degradation Modes

Sensor degradation may include:

  • hardware failure or dropout
  • environmental interference
  • drift or miscalibration
  • privacy‑induced suppression

Such conditions MUST be surfaced via Signal confidence or sensor diagnostics rather than hidden.
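One way to surface these conditions rather than hide them is to fold known degradation into the Signal's confidence at capture time. The scoring below is a hypothetical sketch (the weights are arbitrary illustrations, not normative): health loss and drift reduce confidence, and a privacy‑suppressed capture is reported with zero confidence instead of being silently dropped.

```python
from typing import Any, Dict

def attach_confidence(value: Any, drift: float,
                      healthy: bool, suppressed: bool) -> Dict[str, Any]:
    """Report degradation through Signal confidence (hypothetical scoring).

    - unhealthy hardware halves confidence (failure or dropout)
    - drift scales confidence down proportionally (miscalibration)
    - privacy-induced suppression yields an explicit zero-confidence record
    """
    confidence = 1.0
    if not healthy:
        confidence *= 0.5                         # hardware failure or dropout
    confidence *= max(0.0, 1.0 - abs(drift))      # drift or miscalibration
    if suppressed:
        confidence = 0.0                          # privacy-induced suppression
    return {"value": None if suppressed else value, "confidence": confidence}
```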


11. Notes for Implementers

  • A single physical device may host multiple Sensors.
  • Virtual Sensors (e.g., software‑derived counters or timers) are valid.
  • For audio and vision, Sensors should favor episodic capture in edge deployments.
  • Sensor Knowledge is intentionally defined as local and replaceable; it is not a global model.

Status: Draft v0.2