CS 422 Park

Digital vs. Analog Data

Digital data: bits.

−→ discrete signal
−→ both in time and amplitude

Analog “data”: audio/voice, video/image

−→ continuous signal
−→ both in time and amplitude

Both forms used in today’s network environment.

−→ burning CDs
−→ audio/video playback

In broadband networks:

−→ use analog signals to carry digital data


Important task: analog data is often digitized

−→ useful: why?
−→ it’s convenient
−→ use full power of digital computers
−→ simple form: digital signal processing
−→ analog computers are not as versatile/programmable
−→ cf. “The Computer and the Brain,” von Neumann (1958)

How to digitize such that digital representation is faithful?

−→ sampling
−→ interface between analog & digital world

Sampling criterion for guaranteed faithfulness:

Sampling Theorem (Nyquist): Given continuous bandlimited signal s(t) with S(ω) = 0 for |ω| > W, s(t) can be reconstructed from its samples if

ν > 2W

where ν is the sampling rate.

−→ ν: samples per second

Remember simple rule: sample twice the bandwidth
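The sampling rule can be checked numerically. The sketch below (the specific frequencies are illustrative assumptions, not from the notes) samples two tones at ν = 10 kHz: the 7 kHz tone violates ν > 2W and becomes indistinguishable from a 3 kHz tone.

```python
import numpy as np

# Aliasing sketch: a 7 kHz tone sampled at 10 kHz is indistinguishable
# from a 3 kHz tone, since 7 kHz > nu/2 violates the Nyquist criterion.
fs = 10_000            # sampling rate nu (samples/sec)
n = np.arange(100)     # sample indices
t = n / fs             # sample instants

low  = np.sin(2 * np.pi * 3_000 * t)   # 3 kHz: below nu/2, safe
high = np.sin(2 * np.pi * 7_000 * t)   # 7 kHz: above nu/2, aliases

# sin(2*pi*7000*n/fs) = sin(1.4*pi*n) = sin(1.4*pi*n - 2*pi*n)
#                     = -sin(0.6*pi*n) = -sin(2*pi*3000*n/fs):
# the two tones produce identical samples up to sign.
print(np.allclose(high, -low))  # True
```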

Issue of digitizing amplitude/magnitude ignored

−→ problem of quantization
−→ possible source of information loss
−→ exploit limitations of human perception
−→ logarithmic scale
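The logarithmic-scale idea appears in practice as μ-law companding in telephony. The sketch below (μ = 255, the 8-bit uniform quantizer, and the quiet test signal are illustrative assumptions) shows that compressing amplitudes logarithmically before quantizing reduces the error on low-amplitude signals.

```python
import numpy as np

mu = 255.0      # mu-law parameter used in North American telephony

def mu_law_compress(x):
    # map amplitude in [-1, 1] through a logarithmic curve
    return np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)

def mu_law_expand(y):
    # inverse of the compression curve
    return np.sign(y) * ((1 + mu) ** np.abs(y) - 1) / mu

levels = 256    # assumed 8-bit quantizer

def quantize(x):
    # uniform quantization of [-1, 1] into `levels` steps
    q = np.round((x + 1) / 2 * (levels - 1))
    return q / (levels - 1) * 2 - 1

# quiet signal: uniform steps are coarse relative to its amplitude
x = 0.01 * np.sin(np.linspace(0, 2 * np.pi, 1000))

err_uniform = np.abs(quantize(x) - x).max()
err_mulaw = np.abs(mu_law_expand(quantize(mu_law_compress(x))) - x).max()
print(err_mulaw < err_uniform)  # True: log scale has finer steps near zero
```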

Compression

Information transmission over noiseless medium

−→ medium or “channel”
−→ fancy name for copper wire, fiber, air/space

Sender wants to communicate information to receiver over noiseless channel.

−→ can receive exactly what is sent
−→ idealized scenario

Part II. Compression machinery:

  • code book F assigns code word wa = F(a) for each symbol a ∈ Σ → wa is a binary string of length |wa| → F could be just a table
  • F is invertible → receiver can recover a from wa → F^−1 is the same table, different look-up

Ex.: Σ = {A, C, G, T }; need at least two bits

  • F 1 : wA = 00, wC = 01, wG = 10, wT = 11
  • F 2 : wA = 0, wC = 10, wG = 110, wT = 1110

−→ pros & cons?
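A minimal sketch of the two code books above, assuming greedy table look-up for decoding (both books happen to be prefix-free, so no code word is a prefix of another and greedy decoding works):

```python
# Code books F1 (fixed-length) and F2 (variable-length) from the example.
F1 = {"A": "00", "C": "01", "G": "10", "T": "11"}
F2 = {"A": "0", "C": "10", "G": "110", "T": "1110"}

def encode(msg, book):
    return "".join(book[sym] for sym in msg)

def decode(bits, book):
    # invert the table: F^-1 is "the same table, different look-up"
    inv = {w: sym for sym, w in book.items()}
    out, word = [], ""
    for b in bits:
        word += b
        if word in inv:      # safe because both books are prefix-free
            out.append(inv[word])
            word = ""
    return "".join(out)

msg = "AAAACGAT"                 # A-heavy message (illustrative)
print(len(encode(msg, F1)))      # 16 bits: always 2 bits/symbol
print(len(encode(msg, F2)))      # 14 bits: frequent A costs only 1 bit
print(decode(encode(msg, F2), F2) == msg)  # True
```

This is the trade-off behind "pros & cons": F2 wins when symbol frequencies are skewed toward A, and loses (e.g., on T-heavy input) when they are not.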

Note: code book F is not unique

−→ find a “good” code book
−→ when is a code book good?

A fundamental result on how small the average code length L can be made.

−→ kind of like speed-of-light

First, define entropy H of source 〈Σ, p〉:

H = Σ_{a∈Σ} pa log(1/pa)

Ex.: Σ = {A, C, G, T}; H is maximum if pA = pC = pG = pT = 1/4.

−→ when is it minimum?

Source Coding Theorem (Shannon): For all code books F , H ≤ LF

where LF is the average code length under F.

Furthermore, LF can be made to approach H by selecting better and better F.
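As a numerical check of H ≤ LF, the sketch below uses the DNA alphabet with a skewed distribution (the probabilities are illustrative assumptions, not from the notes) and the code lengths of F1 and F2 from the earlier example:

```python
import math

# Skewed source: A is frequent, so F2's short code word for A pays off.
p = {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1}
F2_len = {"A": 1, "C": 2, "G": 3, "T": 4}   # |wa| under F2

# entropy H = sum_a pa * log2(1/pa)
H = sum(pa * math.log2(1 / pa) for pa in p.values())

L_F1 = 2.0                                  # fixed 2 bits per symbol
L_F2 = sum(p[a] * F2_len[a] for a in p)     # average code length under F2

print(round(H, 3))            # ~1.357 bits/symbol
print(round(L_F2, 3))         # 0.7*1 + 0.1*(2+3+4) = 1.6 bits/symbol
print(H <= L_F2 <= L_F1)      # True: F2 beats F1, but cannot beat H
```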

Remark:

  • to approach minimum H use blocks of k symbols → e.g., treat “THE” as one unit (not 3 separate letters) → called extension code
  • entropy is an innate property of the data source
  • limitation of ensemble viewpoint → e.g., sending number π = 3.1415927... → better way?

Would like: if received code word w = wc for some symbol c ∈ Σ, then probability that actual symbol sent is indeed c is high

−→ Pr{actual symbol sent = c | w = wc} ≈ 1
−→ noiseless channel: special case (prob = 1)

In practice, w may not match any legal code word:

−→ for all c ∈ Σ, w ≠ wc
−→ good or bad?
−→ what’s next?

Shannon showed that there is a fundamental limitation to reliable data transmission.

→ the noisier the channel, the smaller the reliable throughput → overhead spent dealing with bit flips

Definition of channel capacity C: maximum achievable reliable data transmission rate (bps) over a noisy channel with bandwidth W (Hz).

Channel Coding Theorem (Shannon): Given bandwidth W, signal power PS, noise power PN, channel subject to white noise,

C = W log2(1 + PS/PN) bps.

PS/PN: signal-to-noise ratio (SNR)

−→ upper bound achieved by using longer codes −→ detailed set-up/conditions omitted

Signal-to-noise ratio (SNR) is expressed in decibels as

dB = 10 log10(PS/PN).

Example: Assuming a decibel level of 30, what is the channel capacity of a telephone line?

Answer: First, W = 3000 Hz and PS/PN = 1000 (since 30 dB = 10 log10 1000). Using the Channel Coding Theorem,

C = 3000 log2(1001) ≈ 30 kbps.

−→ compare against 28.8 kbps modems −→ what about 56 kbps modems? −→ DSL lines?
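The worked example can be reproduced directly; W = 3000 Hz is the voice-grade bandwidth assumed in the answer above:

```python
import math

def capacity(W_hz, snr_db):
    # channel capacity per the Channel Coding Theorem,
    # with SNR given on the decibel scale
    snr = 10 ** (snr_db / 10)          # undo dB = 10 log10(PS/PN)
    return W_hz * math.log2(1 + snr)   # C = W log2(1 + PS/PN)

C = capacity(3000, 30)                 # 3000 * log2(1001)
print(round(C))  # 29902 bps, i.e. ~30 kbps
```

This is why 28.8 kbps modems sit just under the limit for a 30 dB analog voice line.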

Digital vs. Analog Transmission

Two forms of transmission:

  • digital transmission: data transmission using square waves
  • analog transmission: data transmission using all other waves

Four possibilities to consider:

  • analog data via analog transmission → “as is” (e.g., radio)
  • analog data via digital transmission → sampling (e.g., voice, audio, video)
  • digital data via analog transmission → broadband & wireless (“high-speed networks”)
  • digital data via digital transmission → baseband (e.g., Ethernet)

Delay distortion: different frequency components travel at different speeds.

Most problematic: effect of noise

−→ thermal, interference,...

  • Analog: Amplification also amplifies noise—filtering out just noise, in general, is a complex problem.
  • Digital: Repeater just generates a new square wave; more resilient against ambiguity.

Analog Transmission of Digital Data

Three pieces of information to manipulate: amplitude, frequency, phase.

  • Amplitude modulation (AM): encode bits using amplitude levels.
  • Frequency modulation (FM): encode bits using frequency differences.
  • Phase modulation (PM): encode bits using phase shifts.

[Figure: carrier waveforms encoding the bit sequence 0 1 1 0 0 1 under AM, FM, and PM]
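A sketch of the digital forms of these three schemes (ASK/FSK/PSK), mapping the bit sequence 0 1 1 0 0 1 onto one carrier burst per bit; the carrier frequency and sample counts are illustrative assumptions:

```python
import numpy as np

bits = [0, 1, 1, 0, 0, 1]
fc, samples_per_bit = 4, 64           # 4 carrier cycles per bit period
t = np.arange(samples_per_bit) / samples_per_bit

def modulate(bits, scheme):
    out = []
    for b in bits:
        if scheme == "ASK":           # amplitude: 0 -> low, 1 -> high
            out.append((0.25 if b == 0 else 1.0) * np.sin(2*np.pi*fc*t))
        elif scheme == "FSK":         # frequency: 0 -> fc, 1 -> 2*fc
            out.append(np.sin(2 * np.pi * (fc if b == 0 else 2*fc) * t))
        else:                         # PSK: phase: 0 -> 0, 1 -> pi shift
            out.append(np.sin(2*np.pi*fc*t + (0 if b == 0 else np.pi)))
    return np.concatenate(out)

ask, fsk, psk = (modulate(bits, s) for s in ("ASK", "FSK", "PSK"))
print(len(ask) == len(bits) * samples_per_bit)  # True
# PSK: bit 1 is the exact negation of bit 0's waveform (pi phase shift)
print(np.allclose(psk[64:128], -psk[:64]))      # True
```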