Digitization is the process of converting analog signals or information of any form into a digital format that computer systems and electronic devices can understand. The term is used when converting information, such as text, images, voices and sounds, into binary code.
By Cisco Eng. Shingie Lev Muringi
Digitized information is easier to store, access and transmit, and digitization is used by a number of consumer electronic devices. The impact of digitizing entire industries can be seen in the recent regulations issued by the International Telecommunication Union (ITU) requiring every television broadcaster to migrate from traditional broadcasting methods to digital multi-broadcasting.
Our own ZBCtv has since failed to meet those benchmarks in time, and the quality of its visuals says it all. Other pacesetters such as DSTv continue to upgrade their digital broadcasting prowess by adopting resilient satellite and fibre-optic broadcasting channels.
Digitization involves capturing analog signals and storing the results in digital form. This is usually done via sensors, which sense analog signals like light and sound, and transform them to their equivalent digital forms via an analog-to-digital converter chip or a whole circuit dedicated to converting a specific analog signal.
This works by sampling the continuous analog signal at regular intervals and converting each sample into a discrete value; the resulting sequence of discrete values is the digitized output.
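The sample-then-quantize process described above can be sketched in a few lines of Python. This is a minimal illustration, not real ADC hardware: the `sample_and_quantize` function and its parameters are hypothetical names chosen for this example, and the input is assumed to be a mathematical function standing in for an analog signal.

```python
import math

def sample_and_quantize(signal, duration_s, rate_hz, bits=8):
    """Sample a continuous signal at regular intervals and quantize
    each sample to a discrete integer level, loosely mimicking what
    an analog-to-digital converter does."""
    levels = 2 ** bits                     # number of discrete output values
    n_samples = int(duration_s * rate_hz)  # samples taken over the duration
    out = []
    for n in range(n_samples):
        t = n / rate_hz                    # time of the n-th sample
        x = signal(t)                      # "analog" value, assumed in [-1, 1]
        # Map the range [-1, 1] onto integer levels 0 .. levels-1.
        q = min(levels - 1, int((x + 1) / 2 * levels))
        out.append(q)
    return out

# A 1 kHz tone sampled for one millisecond at 48 kHz yields 48 samples.
tone = lambda t: math.sin(2 * math.pi * 1000 * t)
samples = sample_and_quantize(tone, 0.001, 48_000)
```

Raising `rate_hz` takes more samples per second (finer time resolution), while raising `bits` allows more discrete levels per sample (finer amplitude resolution); both move the digital copy closer to the original analog signal.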
For example, audio is generally sampled at rates ranging from 44.1 kHz to 192 kHz. If an audio file is sampled at 48 kHz, it is sampled 48,000 times per second. The digitization process captures more detail and produces a higher-quality result when performed at higher sampling rates.
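The cost of those higher sampling rates shows up directly in storage. As a rough sketch (the helper name and figures below are illustrative, assuming uncompressed PCM audio), the raw size of one second of audio is the sampling rate times the bytes per sample times the number of channels:

```python
def bytes_per_second(rate_hz, bit_depth, channels):
    """Raw size of one second of uncompressed PCM audio:
    samples/second * bytes/sample * channels."""
    return rate_hz * (bit_depth // 8) * channels

# One second of 16-bit stereo at CD quality (44.1 kHz):
cd = bytes_per_second(44_100, 16, 2)       # 176,400 bytes

# The same second sampled at 192 kHz more than quadruples the size.
hires = bytes_per_second(192_000, 16, 2)   # 768,000 bytes
```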