Define analog and digital signal.
Analog Signal
An analog signal is a continuous wave, commonly represented by a sine wave (pictured below), that can vary in amplitude (signal strength) or frequency. The amplitude corresponds to the height of the wave's peaks and troughs, while the frequency is the number of complete cycles the wave makes per unit of time; visually, a higher frequency means the cycles sit closer together from left to right.
Digital Signal
A digital signal - a must for computer processing - takes on only discrete values, conventionally represented in binary (0s and 1s). Unlike an analog signal, it cannot vary continuously or take on intermediate values between its defined levels.
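The relationship between the two can be sketched in code: an analog signal is a value defined at every instant, while digitizing it means sampling at discrete times and rounding each sample to one of a fixed set of levels. This is a minimal illustration, not from the text; the parameters (sample rate, number of levels) are arbitrary assumptions.

```python
import math

# Illustrative assumptions: 2 Hz sine wave, 16 samples/second,
# 4 quantization levels (i.e., 2 bits per sample).
AMPLITUDE = 1.0      # peak signal strength
FREQUENCY = 2.0      # cycles per second (Hz)
SAMPLE_RATE = 16     # samples taken per second
LEVELS = 4           # discrete amplitude levels

def analog(t):
    """Continuous sine wave: has a value at ANY time t."""
    return AMPLITUDE * math.sin(2 * math.pi * FREQUENCY * t)

def digitize(duration=1.0):
    """Sample the wave at discrete times and round each sample
    to the nearest of LEVELS evenly spaced amplitude levels."""
    step = 2 * AMPLITUDE / (LEVELS - 1)   # spacing between levels
    samples = []
    for n in range(int(SAMPLE_RATE * duration)):
        t = n / SAMPLE_RATE
        # Shift range [-A, A] to [0, 2A], then snap to a level 0..LEVELS-1
        samples.append(round((analog(t) + AMPLITUDE) / step))
    return samples

print(digitize())  # each value is one of only LEVELS possible integers
```

The output is a list of small integers: the continuous wave has been reduced to a sequence of discrete values that a computer can store and process.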