Assume you have a sensor that outputs a constant signal S corrupted by time-varying noise of the form N*sin(w*t), where N is a constant amplitude, w is the angular frequency, and t is time. The measured output is therefore Y(t) = S + N*sin(w*t). How would you design an algorithm that, given readings of Y, reduces the effect of the noise and recovers an accurate estimate of S?
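
One possible approach (a minimal sketch, assuming w is known and the sensor can be sampled at a fixed rate) is to average Y over an integer number of noise periods: since sin(w*t) integrates to zero over each full period T = 2*pi/w, the mean of the samples converges to S. The parameter values and function name below are illustrative, not part of the original question.

```python
import numpy as np

def estimate_signal(samples: np.ndarray) -> float:
    """Estimate the constant signal S by averaging samples taken over
    an integer number of noise periods (the sinusoid averages to zero)."""
    return float(np.mean(samples))

# Hypothetical demonstration: simulate the sensor model Y(t) = S + N*sin(w*t)
S = 5.0                  # true constant signal (assumed for the demo)
N = 1.2                  # noise amplitude (assumed)
w = 2 * np.pi * 50.0     # angular frequency, e.g. 50 Hz interference (assumed)

T = 2 * np.pi / w        # one noise period
fs = 10_000.0            # sampling rate, chosen much higher than w / (2*pi)
num_periods = 20         # average over an integer number of full periods
t = np.arange(0, num_periods * T, 1 / fs)

Y = S + N * np.sin(w * t)    # simulated sensor readings

print(estimate_signal(Y))    # ~= 5.0; the sinusoid cancels over full periods
```

If w is not known, a running (moving-average) low-pass filter with a window long compared with the noise period attenuates the sinusoid while passing the constant component; alternatively, S, N, and the phase can be fit jointly by least squares.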