Open Autonomous Intelligence Initiative

Advocates for Open AI Models

OAII Base Model — Signal v0.1

Depends on: OAII Base Model — Common Types v0.1; OAII Base Model — World v0.1.1

1. Purpose

The Signal object represents a unit of observation produced by a Sensor within a World. Signals are the raw or minimally processed inputs from which Events may be recognized. In the OAII Base Model, Signals are intentionally simple, local, and non-semantic by default.

Signals enable edge‑primary autonomy by providing observable evidence without embedding interpretation, intent, or meaning.


2. Definition (Normative)

A Signal is an abstract, time‑referenced observation that:

  • MUST be produced by a Sensor operating within a World;
  • MUST be associated with a specific time or time interval;
  • MUST NOT imply semantic meaning, interpretation, or intent by itself;
  • MAY be preprocessed for noise reduction, normalization, or feature extraction;
  • MUST remain interpretable independently of any specific Event.

A Signal is not an Event, a State, or Knowledge; it is evidence.


3. Scope and Non‑Scope

In Scope:

  • Raw or lightly processed sensor observations
  • Time‑bounded measurements or samples
  • Feature vectors derived directly from sensor output
  • Episodic or continuous observation streams

Out of Scope:

  • Semantic interpretation or classification
  • Event declaration or lifecycle management
  • Cross‑sensor fusion decisions
  • Medical or diagnostic inference

4. Core Responsibilities

The Signal object:

  • Carries observable data from Sensors;
  • Preserves temporal reference for comparison and correlation;
  • Supports uncertainty representation at the observation level;
  • Provides evidence for Event recognition;
  • Maintains hardware independence via abstraction.

5. Required Attributes (Abstract)

A conforming Signal MUST expose the following attributes, using shared types and conventions from Common Types v0.1.

5.1 Identity and Context

  • signal_id : SignalId
  • world_ref : WorldId
  • sensor_ref : SensorId

5.2 Temporal Reference

  • temporal_extent : ParameterSet
    MUST specify timestamp, interval, or sampling window.

5.3 Observation Payload

  • value_domain : ValueDomain
    Defines the form of the observed values.
  • signal_value : ParameterSet
    Encodes the observed measurement, sample, or feature vector.

5.4 Quality and Uncertainty

  • confidence : Confidence
    Represents the reliability of the observation.
  • uncertainty_model : UncertaintyModel (optional)

5.5 Privacy Classification

  • privacy_class : PrivacyClass
    Indicates handling and export constraints for the Signal.
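As a non-normative illustration, the required attributes above might be sketched in Python. The type aliases here (SignalId, ParameterSet, and so on) are placeholders standing in for the shared definitions in Common Types v0.1, and the concrete field types are assumptions for the sketch only:

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional

# Placeholder aliases; Common Types v0.1 defines the real shared types.
SignalId = str
WorldId = str
SensorId = str
ParameterSet = Dict[str, Any]

@dataclass
class Signal:
    # 5.1 Identity and context
    signal_id: SignalId
    world_ref: WorldId
    sensor_ref: SensorId
    # 5.2 Temporal reference: timestamp, interval, or sampling window
    temporal_extent: ParameterSet
    # 5.3 Observation payload
    value_domain: str                # stand-in for ValueDomain
    signal_value: ParameterSet
    # 5.4 Quality and uncertainty
    confidence: float                # stand-in for Confidence, here 0.0-1.0
    uncertainty_model: Optional[str] = None
    # 5.5 Privacy classification
    privacy_class: str = "internal"  # stand-in for PrivacyClass
```

For example, a single accelerometer sample could be represented as `Signal(signal_id="sig-1", world_ref="w-1", sensor_ref="sen-1", temporal_extent={"timestamp": 1700000000.0}, value_domain="vector", signal_value={"xyz": [0.0, 0.0, 9.81]}, confidence=0.95)`. Note that nothing in the structure assigns meaning to the payload, in keeping with Section 2.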

6. Optional Attributes

A Signal MAY include:

  • sampling_rate or capture_mode
  • preprocessing_descriptor (e.g., filter id, feature extractor id)
  • compression_descriptor
  • quality_flags (e.g., saturation, dropout, occlusion)

Optional attributes MUST NOT introduce semantic interpretation.


7. Signal States (Normative)

Signals MAY carry a simple state indicator:

  • VALID: observation usable as evidence
  • DEGRADED: observation partially reliable
  • INVALID: observation corrupted or unusable

Signal states MUST NOT be conflated with Event states.
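A minimal sketch of the state indicator in Python (the name SignalState and the string values are illustrative choices, not part of this specification):

```python
from enum import Enum

class SignalState(Enum):
    VALID = "valid"        # observation usable as evidence
    DEGRADED = "degraded"  # observation partially reliable
    INVALID = "invalid"    # observation corrupted or unusable
```

Keeping this enum separate from any Event state type makes the non-conflation requirement above enforceable by the type system.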


8. Relationships to Other Objects

  • Sensors produce Signals
  • Devices host Sensors
  • World provides contextual frame for Signals
  • Sensor Knowledge provides baseline and reference for interpretation
  • Events consume Signals as evidence
  • Logs may record Signals subject to privacy and retention policies

Signals MUST be consumable independently of any specific Event.


9. Methods (Normative, Abstract)

Signals typically do not expose rich methods; however, to standardize behavior, a conforming Signal SHOULD support the following abstract methods:

9.1 validate_signal

  • method_params : ParameterSet
  • method_result : ResultSet

Postconditions:

  • Updates signal state (VALID / DEGRADED / INVALID).

9.2 summarize_signal

Produces a lightweight summary suitable for downstream comparison.

  • method_params : ParameterSet (e.g., window size, summary type)
  • method_result : ResultSet (e.g., statistics, embeddings, feature vectors)

Summaries MAY be used by Sensor Knowledge and Event recognition mechanisms.
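One possible (non-normative) shape for the two abstract methods, sketched as plain functions over dicts. The validity checks, the confidence threshold, and the particular statistics chosen for the summary are all assumptions made for the sketch:

```python
import statistics
from typing import Any, Dict

def validate_signal(signal: Dict[str, Any], min_confidence: float = 0.5) -> str:
    """Return a state string (VALID / DEGRADED / INVALID) from simple checks."""
    samples = signal.get("signal_value", {}).get("samples")
    if not samples:
        return "INVALID"    # no usable observation payload
    if signal.get("confidence", 0.0) < min_confidence:
        return "DEGRADED"   # usable, but only partially reliable
    return "VALID"

def summarize_signal(signal: Dict[str, Any]) -> Dict[str, float]:
    """Produce a lightweight statistical summary for downstream comparison."""
    samples = signal["signal_value"]["samples"]
    return {
        "count": float(len(samples)),
        "mean": statistics.fmean(samples),
        "stdev": statistics.pstdev(samples),
    }
```

A summary of this kind (counts, means, spread) is cheap to compute and store at the edge, which matters for the retention constraints in Section 10.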


10. Edge‑Primary Constraints

In edge‑primary systems, Signals:


  • MUST be capturable and representable locally;
  • SHOULD support bounded storage and retention;
  • MAY be ephemeral and discarded after summarization;
  • MUST respect privacy_class constraints at all times.

Signals are optimized for availability, not permanence.
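The bounded-retention and ephemerality points above can be sketched as a fixed-size local buffer that evicts the oldest Signals at capacity and discards raw data after summarization. The capacity value and the summarize-then-discard policy are illustrative assumptions:

```python
from collections import deque
from typing import Any, Deque, Dict, List

class SignalBuffer:
    """Bounded local store: the oldest Signal is dropped once capacity is reached."""

    def __init__(self, capacity: int = 64):
        self._buf: Deque[Dict[str, Any]] = deque(maxlen=capacity)

    def append(self, signal: Dict[str, Any]) -> None:
        self._buf.append(signal)  # deque evicts the oldest entry automatically

    def drain_summaries(self) -> List[float]:
        """Summarize (here: mean of samples) and discard the raw Signals."""
        out = [sum(s["samples"]) / len(s["samples"]) for s in self._buf]
        self._buf.clear()
        return out
```

This keeps storage strictly bounded regardless of input rate: raw observations never outlive one drain cycle, while their summaries remain available as evidence.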


11. Failure and Degradation Modes

Signal degradation may include:

  • Sensor dropout or drift
  • Environmental interference
  • Quantization or resolution loss

Systems MUST propagate uncertainty rather than masking degradation.
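As a sketch of "propagate uncertainty rather than masking degradation", a value derived from several Signals can carry the weakest contributing confidence forward instead of silently treating degraded inputs as fully reliable. The confidence-weighted mean and the min-combination rule are illustrative assumptions, not mandated by this specification:

```python
from typing import List, Tuple

def fuse_readings(readings: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Combine (value, confidence) pairs into a confidence-weighted mean.

    The result's confidence is the minimum input confidence, so degradation
    in any contributing Signal stays visible to downstream consumers.
    """
    total_weight = sum(conf for _, conf in readings)
    value = sum(v * conf for v, conf in readings) / total_weight
    confidence = min(conf for _, conf in readings)
    return value, confidence
```

A consumer that receives `confidence == 0.3` can then decide for itself how to treat the derived value, rather than having the degradation hidden upstream.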


12. Notes for Implementers (Non‑Normative)

  • Signals may represent raw samples (e.g., accelerometer readings), windows (e.g., audio frames), or derived features (e.g., motion energy).
  • Vision and audio Signals should typically be episodic, not continuous, in edge‑primary deployments.
  • Many Signals will never directly participate in Events; their role is to maintain situational awareness.

Status: Draft v0.1 (Normative Object Definition)
