**Experiment 2: Hidden Markov Model**

# Experimental Requirement

In this assignment, we will consider an indoor positioning system using hidden Markov models (HMMs). The aim here is to locate a mobile sensor receiving signals from beacons placed in a building or a room. The strength of a message fluctuates not only because of the distance between the transmitter and the receiver but also because of reflection and scattering in buildings. One way to overcome this problem is to collect the signal strengths for each position and construct a probability distribution as our observation model, according to which we can estimate the position of the sensor.

# Configuration

I coded up the algorithm and ran it in VS Code (Visual Studio Code) with Python 3.9.6. Additionally, I store my code on Gitee in case of need.

# Implementation

## Preparation

- install jupyter and enable the widgets extension to have interactive controls
- install dependency packages

- install `jupyter`:

```shell
pip install jupyter
```

- install the imported packages `numpy` and `matplotlib`:

```shell
pip install numpy matplotlib
```

## Conditions

- The total number of states `n_states` = (width * length), and a probability distribution over the location of the target sensor is a vector of length `n_states`.
- `self.transprobs` stores the transition probabilities P(S_{t+1}|S_{t}) in a 2-dimensional NumPy array, which can be indexed by a pair of states.
- `n_beacons` represents the length of the observation vector that the sensor receives in each time step.
- The RSSI values received from the beacons are stored in an array `O`.
- P(O^{(b)} = o_b|S = s) stands for the probability of observing o[b] in a particular state s, which can be obtained via `self.obsprobs[b, o[b], s]`.
- Provided that the RSSI values of different messages are conditionally independent, we can decompose the probability of observing the vector o into a product of individual likelihood functions: P(O = o|S = s) = ∏_b P(O^{(b)} = o_b|S = s).
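As a concrete illustration of this decomposition, here is a small NumPy sketch (the array layout follows the `obsprobs[b, o[b], s]` convention above; the function name is my own):

```python
import numpy as np

def observation_likelihood(obs_probs, o):
    """Compute P(O = o | S = s) for every state s at once.

    obs_probs[b, v, s] = P(O^(b) = v | S = s), as in the conditions above.
    Integer indexing picks obs_probs[b, o[b], :] for every beacon b,
    giving an (n_beacons, n_states) array; the product over the beacon
    axis implements the conditional-independence decomposition.
    """
    return np.prod(obs_probs[np.arange(len(o)), o, :], axis=0)
```

Usage: with two beacons, two RSSI levels, and two states, `observation_likelihood(obs_probs, np.array([0, 1]))` multiplies `obs_probs[0, 0, :]` by `obs_probs[1, 1, :]` elementwise.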

## Step 1: Predict

This method predicts the next state distribution by computing P(S_{t+1}|O_{1:t}) from probs (i.e. P(S_{t}|O_{1:t})) and the transition probabilities P(S_{t+1}|S_{t}). The product of the probability vector and the transition matrix is computed with NumPy's matmul.

**The code is as follows**

```python
def predict(self, probs):
```
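Only the signature survives in this report; a minimal sketch of the body, written as a free function for self-containedness (in the class version, `trans_probs` would be `self.transprobs`), might look like this, assuming rows of the transition matrix index the current state:

```python
import numpy as np

def predict(probs, trans_probs):
    """Prediction step: P(S_{t+1} | O_{1:t}) from probs = P(S_t | O_{1:t}).

    Assumes trans_probs[i, j] = P(S_{t+1}=j | S_t=i), so the prediction
    is the vector-matrix product probs @ trans_probs, i.e.
    sum_s P(S_{t+1} | S_t = s) * P(S_t = s | O_{1:t}).
    """
    return np.matmul(probs, trans_probs)
```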

## Step 2: Update

This method updates the probability vector using the observations, computing the elementwise product probs · P(O_{t}|S_{t}). What the result represents depends on the caller. If it is called from the method monitor, probs represents P(S_{t}|O_{1:t-1}) and the resulting vector is proportional to P(S_{t}|O_{1:t}). Similarly, if it is called from the method backwards, probs represents P(O_{t+1:T}|S_{t}) and the result is proportional to P(O_{t:T}|S_{t}).

**The code is as follows**

```python
def update(self, probs, o):
```
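Again only the signature is shown; a sketch of the body as a free function (`obs_probs` standing in for `self.obsprobs`), using the conditional-independence decomposition from the conditions above and returning an unnormalized vector, since the text only requires proportionality:

```python
import numpy as np

def update(probs, o, obs_probs):
    """Update step: multiply probs elementwise by P(O_t = o | S_t).

    Assumes obs_probs[b, v, s] = P(O^(b) = v | S = s); the per-beacon
    likelihoods are multiplied across the beacon axis, then applied to
    probs elementwise. The result is proportional to the posterior.
    """
    likelihood = np.prod(obs_probs[np.arange(len(o)), o, :], axis=0)
    return probs * likelihood
```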

## Step 3: Monitor

This method returns the monitoring (filtering) probabilities over T time steps from the given sequence of observations, computing P(S_{t}|O_{1:t}) for each t.

**The code is as follows**

```python
def monitor(self, T, observations):
```
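Combining the two previous steps, a self-contained sketch of monitoring might look as follows. The uniform initial distribution and the predict-then-update order within each step are my own assumptions; the transition and observation arrays are passed in explicitly instead of living on `self`:

```python
import numpy as np

def monitor(T, observations, trans_probs, obs_probs):
    """Filtering: P(S_t | O_{1:t}) for t = 1..T.

    observations[t] is the beacon-reading vector at step t. Each step
    predicts with the transition matrix, updates with the observation
    likelihood, and normalizes so the rows of the result sum to 1.
    """
    n_states = trans_probs.shape[0]
    probs = np.full(n_states, 1.0 / n_states)   # assumed uniform prior
    history = []
    for t in range(T):
        probs = np.matmul(probs, trans_probs)                                # predict
        o = observations[t]
        probs = probs * np.prod(obs_probs[np.arange(len(o)), o, :], axis=0)  # update
        probs = probs / probs.sum()                                          # normalize
        history.append(probs)
    return np.array(history)
```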

# Result Analysis

**Test code using `hmm_test.py`**

```shell
python -m unittest hmm_test.py
```

**The result is as follows**

# Conclusion

In this experiment, I learnt how to design and implement Hidden Markov models. Hidden Markov models are statistical models that are used to describe a Markov process with implicit unknown parameters. The difficulty is to determine the implicit parameters of the process from the observable parameters. These parameters are then used for further analysis.

I ran the experiment a few times to see how the running time changes with the environment and configuration.

Additionally, when coding up the algorithm, I needed to search for plenty of information on the Internet to make sure the code was correct. It looks like I've got a lot to learn. Plus, hidden Markov models are exceedingly useful for forward prediction and backward extrapolation from observations and probabilities.