Inductive Bias or Inductive Prior
Created: =dateformat(this.file.ctime,"dd MMM yyyy, hh:mm a") | Modified: =dateformat(this.file.mtime,"dd MMM yyyy, hh:mm a")
Tags: knowledge
Overview
We bias the model towards the kinds of solutions we expect to be useful for the problem.
- in principle, a sufficiently flexible model could learn everything from data alone.
- but we want to make learning easier, since we rarely have enough data for that.
- so we design the model architecture around the biases we already know hold for the domain.
Examples:
- CNNs ⇒ images
- CNNs have a good inductive bias / prior for images
- since CNNs use convolutions built around the concept of a receptive field
- where the receptive field is similar to how we interpret images
- probably what one pixel cares about is its immediate neighbourhood, and what that neighbourhood cares about is its own immediate neighbourhood (see the sketch after this list)
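A minimal sketch (PyTorch, hypothetical layer sizes) of what this locality bias buys you: a convolution only looks at a small neighbourhood and reuses the same weights everywhere, while a fully connected layer connects every pixel to every output and has to learn locality from data.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 1, 28, 28)  # one 28x28 grayscale image (toy size)

# Convolution: each output pixel sees only a 3x3 neighbourhood of the input,
# and the same 3x3 weights are shared across all positions.
conv = nn.Conv2d(in_channels=1, out_channels=8, kernel_size=3, padding=1)

# Fully connected: every output unit sees every input pixel; no locality or
# translation bias is built in.
fc = nn.Linear(28 * 28, 8 * 28 * 28)

print(sum(p.numel() for p in conv.parameters()))  # 80 parameters
print(sum(p.numel() for p in fc.parameters()))    # ~4.9 million parameters
```

Same input, same output shape, but the convolution encodes the "neighbourhood" prior directly in its structure and needs orders of magnitude fewer parameters to learn from.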
- LSTMs ⇒ sequences
- LSTMs have a good inductive bias / prior for sequences of text
- it mirrors how we read: one word at a time, carrying some memory / knowledge of the previous words (see the sketch below)
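A minimal sketch (PyTorch, hypothetical sizes) of that sequential bias: tokens are consumed one at a time, and a hidden state carries the memory of everything read so far.

```python
import torch
import torch.nn as nn

embed_dim, hidden_dim, seq_len = 32, 64, 10
cell = nn.LSTMCell(embed_dim, hidden_dim)

tokens = torch.randn(seq_len, embed_dim)  # a toy sequence of 10 "word" embeddings
h = torch.zeros(1, hidden_dim)            # hidden state: memory of previous words
c = torch.zeros(1, hidden_dim)            # cell state

for t in range(seq_len):
    # read one word at a time; (h, c) summarise everything seen before it
    h, c = cell(tokens[t].unsqueeze(0), (h, c))

print(h.shape)  # torch.Size([1, 64]) -- a summary of the whole sequence
```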