TinyML for Concept Drift
…passive and active [7], [8], [9]. Passive solutions adapt the model at each incoming datum, disregarding whether a concept drift has occurred in the data-generating process or not. The gradual forgetting classifi…
…[16], [17], [18]. On the contrary, active solutions aim at detecting concept drift in the data generation process and, only in that case, they adapt their model to the new conditions. Change Detection Tests (CDTs) ar…
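The passive/active distinction above can be sketched as a toy loop (my own illustration, not the paper's code; all names here are made up): the passive learner updates on every sample, the active learner only when a change-detection test fires.

```python
# Sketch: passive adaptation updates on every incoming sample,
# active adaptation updates only after a detected drift.

class PassiveLearner:
    """Adapts the model at each incoming datum, drift or not."""
    def __init__(self):
        self.updates = 0

    def step(self, x, y):
        self.updates += 1          # stands in for model.partial_fit(x, y)


class ActiveLearner:
    """Adapts the model only when the detector reports a change."""
    def __init__(self, detector):
        self.detector = detector
        self.updates = 0

    def step(self, x, y):
        if self.detector(x):       # change-detection test on the stream
            self.updates += 1      # stands in for a retrain on recent data


# Toy detector: flag a change when a sample deviates strongly from 0.
detector = lambda x: abs(x) > 3.0

stream = [0.1, -0.2, 0.3, 4.1, 3.9, 0.0]   # two outlying samples
passive, active = PassiveLearner(), ActiveLearner(detector)
for x in stream:
    passive.step(x, None)
    active.step(x, None)

print(passive.updates, active.updates)   # passive: 6 updates, active: 2
```

The trade-off the paper exploits is visible even here: passive pays an update cost on every sample, active pays only when the detector fires (but inherits the detector's false/missed alarms).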
…suitably weighted) [30]. Finally, deep learning-based active approaches (integrating deep learning solutions with active adaptive solutions) can be found in [31], [32]. Tiny Machine Learning: TML techn…
…Decision Trees proposed in [15]. Examples of deep learning-based passive solutions can be found in [16], [17], [18]. On the contrary, active solution…
…architectures [42], [43]. In particular, [44] introduces a methodology to explore sparse (and pruned) CNN architectures able to be executed on microcontroller units, whereas [5] proposes a Tiny-CNN whose biases can be learned directly on the device. Finally, [45] investigates the impact of quantized networks in TinyML embedded systems. Although there are very few works proposing on-device learning, e.g., [5], [6], to the best of our knowledge, no work presents a TinyML solution able to adapt over time to concept drift.

3 PROBLEM FORMULATION: Let P be a…
[5]: TinyTL - Reduce Activations, Not Trainable Parameters for Efficient On-Device Learning
…for the first time in the literature, a Tiny Machine Learning algorithm for Concept Drift (TML-CD) that can learn directly on the IoT unit or embedded system and adapt the knowledge base in response to a concept drift (thus tracking the evolution of the data-generating process). To achieve this goal, we introdu…
…behavior. Among these mechanisms, we focus on the hybrid solution thanks to its ability to trade off adaptation with memory demand. The three proposed TML-CD adapt…
…the following two drawbacks. First, a kNN-based classifier requires storing all the data of the training set. Second, the larger the amount of training data, the higher the time to provide a classification in output. These drawbacks are more severe…
…different perspectives. First, condensing techniques [49], [50] aim at identifying the smallest subset of training data that can correctly classify all the training samples. Second, editing techniques [51], [52], [53] instead reduce the number of stored samples by removing the noisy ones, i.e., those not agreeing with their neighborhoods. Third, [54] proposed to train a supervised parametric classifier on available data and to remove all the samples having a classification probability below a hard threshold. In this work, we focus on the f…
…approach and, in particular, on the Condensed Nearest Neighbour algorithm. In more detail, D applies this…
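A toy sketch of Hart's Condensed Nearest Neighbour rule (my own illustration; the paper's exact routine may differ): grow a small store by adding only the training samples that the current store misclassifies with 1-NN, until a full pass adds nothing.

```python
# Sketch of condensing (Hart's CNN rule): the resulting store is a subset
# that still classifies every training sample correctly with 1-NN.

def condense(X, y):
    """Return sorted indices of the condensed subset of (X, y)."""
    store = [0]                              # seed with the first sample
    changed = True
    while changed:
        changed = False
        for i, xi in enumerate(X):
            # 1-NN prediction using only the current store (1-D toy metric)
            j = min(store, key=lambda s: abs(xi - X[s]))
            if y[j] != y[i]:                 # misclassified -> must be stored
                store.append(i)
                changed = True
    return sorted(store)

# 1-D toy set: two well-separated classes.
X = [0.0, 0.1, 0.2, 1.0, 1.1, 1.2]
y = [0,   0,   0,   1,   1,   1  ]
print(condense(X, y))   # small store: the seed plus one sample per new class
```

This shows why condensing suits TinyML memory budgets: the six training samples collapse to two stored ones while preserving the training-set classification.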
…Active Update: Active Tiny kNN. The Active Tiny kNN, whose pseudocode is shown in Algorithm 4, relies on a Change Detection Test (CDT) θ to detect changes in the data generation process P. The core of this algorithm is t…
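The overall shape of such an active update loop might look like this (a sketch with made-up names and a toy jump detector standing in for θ; the actual Algorithm 4 differs in its specifics): classify with kNN, feed the detector, and rebuild the knowledge base only when a drift is flagged.

```python
# Sketch of an active update loop: the knowledge base is rebuilt from a
# sliding window of recent samples only when the CDT flags a change in P.

class JumpDetector:
    """Toy stand-in for the CDT theta: fires on a large jump in the input."""
    def __init__(self, threshold):
        self.threshold, self.last = threshold, None

    def __call__(self, x):
        fired = self.last is not None and abs(x - self.last) > self.threshold
        self.last = x
        return fired

def active_knn_stream(stream, detect, knowledge_base, window=5):
    recent = []
    for x, y in stream:
        recent = (recent + [(x, y)])[-window:]   # sliding window of latest samples
        # ... here x would be classified by kNN over knowledge_base ...
        if detect(x):                            # CDT flags a change in P
            knowledge_base = list(recent)        # restart from post-change data
            recent = []
    return knowledge_base

# Abrupt drift: the input mean jumps from ~0 to ~100 mid-stream.
stream = [(i, 0) for i in range(10)] + [(100 + i, 1) for i in range(10)]
kb = active_knn_stream(stream, JumpDetector(50.0), knowledge_base=[(0, 0)])
print(kb)   # knowledge base rebuilt from the window spanning the change
```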
For this paper, "concept drift" feels more like data drift: the drift problems they are trying to resolve have examples "due to seasonality or periodicity effects, faults affecting sensors or actuators, changes in the user's behavior, or aging consequences", which implies no change in what we are trying to predict (concept drift) but rather that the input itself has changed (data drift).
θ is the well-known and theoretically grounded CUSUM algorithm [24] in its generalized version [55]…
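For reference, the basic one-sided CUSUM recursion (textbook form, not the generalized version [55] the paper uses; μ0, k, h and the data are made up): accumulate deviations of the observations from a reference mean μ0 minus a drift allowance k, and signal a change when the cumulative sum exceeds a threshold h.

```python
# Sketch of one-sided CUSUM for detecting an increase in the mean.

def cusum(samples, mu0, k, h):
    """Return the index at which a mean increase is detected, or None."""
    g = 0.0
    for t, x in enumerate(samples):
        g = max(0.0, g + (x - mu0 - k))   # cumulative sum, clipped at zero
        if g > h:
            return t                      # change declared at time t
    return None

data = [0.1, -0.2, 0.0, 0.1, 2.1, 2.0, 1.9, 2.2]   # mean jumps at index 4
print(cusum(data, mu0=0.0, k=0.5, h=2.0))   # change declared at index 5
```

The allowance k suppresses false alarms from small fluctuations, while h trades detection delay against false-positive rate (here the change is declared one sample after the jump).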