1 Dec 2018

Understanding Bandwidth in Communications and Computing

This article discusses how to interpret and specify bandwidth in systems that involve data transmission and digital processing.

My previous article on the definition of bandwidth explored the complexities of this term in the context of amplifiers, filters, and RF systems. Nowadays, though, many electrical engineers devote a large portion of their labors to digital systems and, consequently, we must confront additional meanings and details associated with the term “bandwidth.”

Bandwidth as Throughput

The concept of bandwidth is closely linked to a system's ability to transmit information. To transfer data, a signal must change in some way, and the rate at which these changes occur limits the rate at which information can be transferred.

If a signal has more bandwidth—in this case meaning that it includes or is compatible with higher frequencies—it can change more rapidly. Thus, more bandwidth corresponds to a higher maximum rate of data transfer.
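This relationship isn't just qualitative. The Shannon-Hartley theorem tells us that the capacity of a channel, in bits per second, grows in proportion to its bandwidth in hertz (and logarithmically with the signal-to-noise ratio). Here is a minimal Python sketch of the calculation; the 3 kHz bandwidth and 30 dB SNR are illustrative values chosen to resemble a telephone channel, not numbers from any specific system.

    import math

    def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
        """Shannon-Hartley channel capacity in bits per second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Example: a 3 kHz telephone-style channel with a 30 dB SNR.
    bandwidth_hz = 3000.0
    snr_db = 30.0
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio

    print(f"Capacity: {shannon_capacity(bandwidth_hz, snr_linear) / 1000:.1f} kbit/s")
    # Roughly 29.9 kbit/s -- close to what dial-up modems achieved.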


The carrier signal (blue, showing frequency modulation) must have more bandwidth than the baseband signal (red). Otherwise, the carrier’s capacity (in terms of speed) for data transfer would be lower than that of the original signal.

It’s not surprising, then, that the word “bandwidth” is now used in a way that places emphasis on data transfer rather than frequency response or spectral characteristics. This type of bandwidth seems to be more or less equivalent to “throughput,” i.e., the maximum rate at which a system can produce or process data; it is typically expressed in bits per second rather than hertz.
You may hear the term “bandwidth” in any of the following contexts:
  • Communication bandwidth
  • Digital bandwidth
  • Computing bandwidth (including memory bandwidth)

Communication Bandwidth: Bandwidth as a Measure of Data Transfer

Given the actual definition of bandwidth—i.e., the width of a band of frequencies—why is this term used to describe concepts like data transfer rates? The answer is related to how language evolves.
The traditional meaning of the word “peruse” is “to read carefully or thoroughly.” For some reason, people started using it to mean “read quickly, take a look at”—perhaps because the word sounds more relaxed, like “cruise” or “snooze.” This meaning became so common that it was included in some dictionaries, despite the fact that it is the opposite of the original meaning. Nowadays, if someone hands you a document and tells you to peruse it, you really have no idea what you’re supposed to do.
A similar difficulty arises when we separate the concept of bandwidth from its etymological reality. For example, there are websites that will measure the “bandwidth” of your Internet connection. The results of these tests will not be identical from one day to the next or even from one measurement cycle to the next.

Is the frequency response of your networking hardware really that variable? Does your modem randomly vary the frequency of its communication signals? Does the speed capability of your connection constantly change? I think not.

In this context, “bandwidth” is simply the rate at which your computer is successfully sending and receiving data, and this rate fluctuates because it is influenced by overall network conditions.
So, what exactly is the bandwidth of a digital communication system? Is it the maximum rate at which individual bits can be transferred? Is it the maximum rate at which data can be transferred, where “data” refers only to the payload of a packet? Does it account for the rate of bit or packet errors that typically occur for a given set of operating conditions, meaning that it is the rate at which data can be successfully transferred?

I fear that all of these are possibilities and, consequently, my recommendation is the following: when precision is important—for example, if your boss says “the system must have a bandwidth of one hundred megabits per second or you’re fired”—ask for clarification.


The overhead associated with packet-based communication causes the amount of real data that is transferred in a given time period to be lower than the number of ones and zeros that are transferred.
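To see how these competing definitions diverge, consider a back-of-the-envelope estimate in Python. The link speed, header sizes, and loss rate below are hypothetical values chosen for illustration, and the simple model ignores retransmission, which a real protocol such as TCP would add.

    def effective_throughput(link_bps: float, payload_bytes: int,
                             overhead_bytes: int, packet_loss: float) -> float:
        """Estimate goodput: payload bits successfully delivered per second.

        Assumes lost packets are simply discarded (no retransmission),
        which is a simplification of real protocols such as TCP.
        """
        total_bytes = payload_bytes + overhead_bytes
        payload_fraction = payload_bytes / total_bytes
        return link_bps * payload_fraction * (1 - packet_loss)

    # Hypothetical 100 Mbit/s link, 1460-byte payloads, 58 bytes of
    # Ethernet/IP/TCP overhead per packet, and 1% packet loss.
    goodput = effective_throughput(100e6, 1460, 58, 0.01)
    print(f"Goodput: {goodput / 1e6:.1f} Mbit/s")  # about 95.2 Mbit/s

The raw bit rate is 100 Mbit/s, the payload rate is about 96 Mbit/s, and the successfully delivered payload rate is about 95 Mbit/s: three different numbers, all of which someone might call “bandwidth.”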

Digital Bandwidth: Defining a Range of Frequencies

If people are accustomed to thinking of bandwidth as throughput, they may be inclined to apply the term “bandwidth” to the frequency of a digital signal—e.g., “I’m running my SPI data line at a bandwidth of 100 kHz.” I don’t know if this usage is common, but it is seriously incorrect and should be discouraged.

If we go back to the core meaning of the word, bandwidth is a range of frequencies. The spectral content of a square wave extends far beyond the fundamental frequency, and thus a digital waveform that completes 100,000 cycles per second has a bandwidth that is much wider than 100 kHz.


The Fourier transform of a square wave reminds us that digital waveforms have significant amounts of energy at frequencies that extend far beyond the signal frequency (measured in cycles per second) or the bit rate (measured in bits per second).
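A worked example makes this concrete. An ideal 50% duty-cycle square wave contains only odd harmonics, with amplitudes that fall off as 1/n; the short Python sketch below lists the first few harmonics of a 100 kHz square wave. (The ideal square wave is an assumption here; real digital signals have finite rise times, which attenuate the highest harmonics.)

    import math

    f0 = 100e3  # fundamental frequency of the "100 kHz" digital signal, in Hz

    # An ideal 50% duty-cycle square wave contains only odd harmonics,
    # with amplitudes (for a unit-amplitude wave) of 4/(pi*n).
    for n in range(1, 12, 2):
        amplitude = 4 / (math.pi * n)
        print(f"harmonic {n:2d} at {n * f0 / 1e3:6.0f} kHz: "
              f"amplitude {amplitude:.3f}")

    # Even the 9th harmonic, at 900 kHz, still carries about 11% of the
    # fundamental's amplitude -- the bandwidth extends far past 100 kHz.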

Computing Bandwidth, Memory Bandwidth: Data Processing

The term “bandwidth” is used to describe not only the rate at which data is transferred but also the rate at which it is processed. This is essentially the concept of throughput applied to a processing system rather than a communication system.

The bandwidth of the CPU itself is determined by the clock frequency and by architectural details (such as the number of cores) that govern how instructions are executed. It turns out, though, that memory bandwidth can be the limiting factor in systems built around top-of-the-line CPUs.
Memory bandwidth refers to the speed at which the memory system can move data to and from the CPU, and technological developments have historically favored CPU throughput over memory performance.

Perhaps the most important thing to remember about computing bandwidth—measured in, say, bytes per second or instructions per second—is that it is by no means equal to the processor’s clock frequency (in cycles per second). Instructions often require more than one clock cycle for execution, even in the relatively simple processors found in microcontrollers. In fact, the original 8051 architecture required at least 12 clock cycles to execute one instruction.
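The arithmetic is worth spelling out. Here is a simple sketch that uses the 8051 figure mentioned above and, for contrast, a hypothetical modern multiple-issue core; the 12 MHz and 3 GHz clock rates are illustrative assumptions.

    def instruction_throughput(clock_hz: float, cycles_per_instruction: float) -> float:
        """Instructions per second, given the average cycles per instruction (CPI)."""
        return clock_hz / cycles_per_instruction

    # The original 8051 at a typical 12 MHz clock, needing 12 clock
    # cycles per instruction, executes at most one instruction per
    # microsecond.
    print(f"8051:   {instruction_throughput(12e6, 12) / 1e6:.1f} MIPS")

    # A hypothetical modern core that retires 3 instructions per cycle
    # (CPI of 1/3) at 3 GHz.
    print(f"Modern: {instruction_throughput(3e9, 1/3) / 1e6:.0f} MIPS")

A 12 MHz 8051 thus manages about 1 MIPS: three orders of magnitude below its clock frequency in hertz, and a vivid reminder that clock rate and computing bandwidth are not the same thing.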

Combating Confusion: The Many Faces of Bandwidth


I hope that these two articles have helped you to understand the diverse, and sometimes inconsistent, ways in which the term “bandwidth” can be used to characterize analog, digital, and radio-frequency systems. Engineering is a field that thrives on precision, so we all need to make an effort to clarify our meaning when terminology alone doesn't suffice.
