Carrier-less radio transmission

How to use noise for information transmission

This article is a brief summary of work done while preparing the MIPT university team for the International Physicists’ Tournament in 2021.
Special thanks to German Karnup for showing interest in the problem.

Prerequisites

Knowing the following concepts will help the reader get the idea better:

  • Modulation

  • Spectral (Fourier) analysis

  • Coding theory

  • Information theory. All of it, mwa-ha-ha

Introduction

The problem statement is the following:

Noise FM

Amplitude, frequency and phase modulations are old robust methods to transfer information via electromagnetic waves. Propose a method to encode information in such a way that the signal will be indistinguishable from the background noise for an uninformed outsider. Propose and implement a setup giving maximum transfer rate and signal-to-noise ratio at some transmitter-receiver distance.


In this article I will talk about a theoretical approach to carrier-less data transfer and try to estimate ways to achieve optimal spectrum utilisation efficiency. We will define a criterion for distinguishing a signal from noise and also speculate on what steps an uninformed observer can take to actually detect the transmission. Finally, certain considerations regarding physical implementation, limitations and implications will be discussed.

Why bother

At the moment most wireless information is transferred by modulating a carrier wave with a fixed frequency. This approach has been chosen due to its benefits in terms of robustness, the ability to densely utilise spectral bands, a highly developed theory of optimal data encoding, and for historical reasons. The historical reasons are that it was next to impossible to disturb and detect the electromagnetic field in a controlled manner without a strong oscillator. Hence the development of amplitude and frequency modulation of a strong carrier signal for radio transmission in the “analog times”.

One drawback of such a transmission technique is that the very fact of transmission becomes obvious to anyone with a spectrum analyser.

Waterfall

Image taken from wikiless.org under Creative Commons License

or a more familiar example:

wifi

Creative Commons License

So when we switch to “carrier-less” data transmission the visibility aspect morphs into another form. The ability to trigger detection by a carrier fingerprint is replaced with the need to distinguish between “silent noise” and “modulated noise”.

At least one approach is known that makes a step in this direction: coupling unstable electronic circuits that move along a chaotic attractor in phase space. In other words, in this approach the harmonic carrier wave is replaced with a non-linear chaotic oscillator that is capable of coupling to an identical oscillator in the detecting circuit. “Noise” in this case means a signal that does not correlate with harmonic carriers. The topic was further developed for the non-synchronised case.

The approach in this method is closer, however, to the “prehistoric” radio transmission, that is, Hertz’s spark transmitter. The idea is to generate white noise across a broad spectral range that is barely distinguishable from the background noise. (Note: yes, Hertz’s setup had tuning and a certain fundamental frequency. In a way it is more relevant to make an analogy to the coherer detection.)

What adds value beyond pure academic interest is the fact that with modern digital signal processors and fast digital-to-analog converters one can perform many kinds of operations on the electromagnetic spectrum that were simply impossible 10-20 years ago.

To be honest, the idea came when the author started receiving unwanted radio signals from the Russian Railroads with the following content: “The train is trying to catch up with the schedule”. The signal was received by a USB modem and resulted in shutting down internet access for a minute or so. The physical reason for that “transmission” was high speed and aggressive braking on the railroad track some 300 meters from the modem.

Simplifications

In this article certain shortcuts are taken:

  • Background noise is considered white. This is generally not true, but it will work within certain limits once we pick a spectral range of operation

  • We will not discuss the data transfer rate but will instead be talking about the information transfer rate of the channel

  • It is assumed that we can generate any radio signal and send it to any point of space

  • The receiver is considered perfect in its ability not to distort the radio signal

We will discuss the sanity of these simplifications in the last chapter of the article.

Basic principles

The basic idea of the transmission method is to turn white noise on and off, thus generating artificial ‘high’ and ’low’ states of the spectrum that may serve as a foundation for any data encoding scheme, like on-off keying, Manchester coding or non-return-to-zero in the simplest form. With this type of operation it is already possible to transmit data at a certain rate (a minimal sketch of such a signal follows the figure below).

ook

On-off keying
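To make the idea concrete, here is a minimal Python sketch of such a keyed-noise baseband signal. Everything in it is illustrative: the function name, the samples-per-bit value and the `extra` fraction (how much the ‘on’ bursts raise the noise level) are assumptions, not parameters of the original setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def ook_noise_signal(bits, samples_per_bit=4096, noise_floor=1.0, extra=0.1):
    """Sketch of on-off keyed noise: the channel always carries background
    noise with RMS `noise_floor`; for a '1' bit an extra, independent
    white-noise component with RMS `extra * noise_floor` is added."""
    chunks = []
    for b in bits:
        chunk = noise_floor * rng.standard_normal(samples_per_bit)
        if b:
            chunk = chunk + extra * noise_floor * rng.standard_normal(samples_per_bit)
        chunks.append(chunk)
    return np.concatenate(chunks)

signal = ook_noise_signal([1, 0, 1, 1, 0, 0, 1, 0])
print(signal.size)  # 8 bits * 4096 samples per bit
```

Note that with `extra = 0.1` the ‘on’ bursts raise the total noise power by only about 1%, which is what keeps the keying hidden under a reasonable $A_{max}$.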

With this kind of “background noise modulation” an uninformed listener might become suspicious, since the background noise behaves unexpectedly. On the waterfall spectrogram it will show up as horizontal borders between wide areas of higher and lower intensity.

Hence the model requires a numerical description. The first parameter of data transmission invisibility is the amplitude of background disturbances that could still be considered to be of natural origin.

amp safe

$ A_{max} \, [\mathrm{dB}] $ - maximum amplitude of background noise fluctuation considered natural

With this parameter alone we cannot, however, get any limitation on the data transmission rate, since an infinite band implies infinite precision of the noise amplitude averaging operation. Therefore we have to switch to a finite spectral band.

Within a given band we should be able to distinguish between two noise levels if we satisfy the N-sigma rule. To be more specific, observing the unmodulated background noise results in estimating two parameters: the average amplitude and the variance. A bigger data set results in a narrower Gaussian bell, hence higher precision of the average value estimate.

variance over time

On the upper chart we can see a blue line representing the average value of the noise squared. It is constant for an unmodulated signal (for this problem there is a huge mess with terminology, since we call the noise a signal and our signal is noise). The cyan line shows the spread of the averaged amplitude squared. It obeys the $ \frac{1}{\sqrt{N}} $ law, so the estimate asymptotically approaches the average.

The lower chart shows our ability to distinguish the two levels. With a short averaging time (yellow) we see a big overlap of the levels, while longer averaging (green) results in a smaller number of errors.
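A minimal simulation of this picture: the decision is made from the mean power of $N$ samples, and the error rate drops as the averaging length grows. The `extra` level, the number of trials and the threshold choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def bit_error_rate(n_samples, extra=0.05, trials=2000):
    """Probability of confusing 'noise off' and 'noise on' when deciding
    from the mean power of n_samples samples, with the threshold halfway
    between the two true mean power levels."""
    off = rng.standard_normal((trials, n_samples)) ** 2
    on = ((1.0 + extra) * rng.standard_normal((trials, n_samples))) ** 2
    threshold = 0.5 * (1.0 + (1.0 + extra) ** 2)
    errors = np.count_nonzero(off.mean(axis=1) > threshold) \
           + np.count_nonzero(on.mean(axis=1) < threshold)
    return errors / (2 * trials)

for n in (100, 1000, 4000):
    print(f"N = {n:5d}  error rate ~ {bit_error_rate(n):.3f}")
```

The error rate shrinks as the averaging length grows, exactly as the green versus yellow curves suggest.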

Which brings us to the next milestone: the acceptable error rate. Since we are dealing with information transfer, we do not discuss protocols and error correction. At this abstraction level we can only define an acceptable error probability and derive our averaging time based on it.

$$ T_{avg}=T_{avg}(A_{max},R_{EPB}) $$

where $ R_{EPB} $ is the maximum acceptable rate of errors per bit.

Since this article is written in a lazy way, with a 24-hour zero-to-complete time frame, all calculations are omitted as a simple engineering exercise. (wink)
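Nevertheless, here is a rough sketch of that exercise under purely illustrative assumptions: the ‘on’ level uses the full allowed $A_{max}$, the averaged power estimates are Gaussian with standard deviation $P\sqrt{2/N}$, and a band of width $\Delta f$ yields $2\Delta f$ independent samples per second. None of these choices come from the omitted derivation.

```python
import numpy as np
from scipy.stats import norm

def averaging_time(a_max_db, error_rate, bandwidth_hz):
    """Rough per-bit averaging time estimate (see assumptions above)."""
    delta = 10 ** (a_max_db / 10) - 1        # allowed relative power step
    z = norm.isf(error_rate)                 # required number of sigmas
    n_samples = 8 * (z / delta) ** 2         # samples needed per bit
    return n_samples / (2 * bandwidth_hz)    # seconds per bit

# e.g. a 0.2 dB allowed bump, 1e-3 errors per bit, a 1 MHz band
print(averaging_time(a_max_db=0.2, error_rate=1e-3, bandwidth_hz=1e6))
```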

It might also be worth adding one minor clarification to the picture: the “natural maximal amplitude”.

with max amp

At this point we still have one more thing to consider: the bandwidth. Since we don’t have a defined spectral band, our averaging procedure can gather a lot of data (spectrum-wise) for averaging, which results in a shorter time to reach the required N-sigma distance between the levels.

Another reason to pay attention to the frequency bands is our ability to use the same encoding method for multiple bands in parallel. The algorithm for picking the optimal bandwidth relies on the same considerations as conventional carrier-based data transmission: our channels should not overlap.

Overlapping can be estimated using the uncertainty principle:

$$ \Delta \nu \approx \frac {\pi} {\Delta t} $$

where $\Delta \nu$ is the frequency uncertainty of a signal that was measured over a period of time $ \Delta t $.

This gives us an estimate of the frequency band that may be used if the measurement process takes $ \Delta t $: $ \Delta f_{band} \gg \frac{\pi}{\Delta t} $
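For example (the numbers are purely illustrative), if each bit requires $\Delta t = 10\,\mathrm{ms}$ of averaging, then $\Delta \nu \approx \frac{\pi}{0.01\,\mathrm{s}} \approx 314\,\mathrm{Hz}$, so neighbouring sub-bands should be spaced by much more than a few hundred hertz.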

Optimisation section

With the restrictions above we may define the optimisation problem as the following:

For a given maximum noise amplitude, acceptable information loss rate and total bandwidth, find the number of bandwidth divisions and the minimal switching period that maximise the total information transfer rate.

The information transfer rate is defined as follows:

$$ I \propto \frac { n_{channels} } { \Delta t } $$

Considering time averaging to be equivalent to ensemble averaging, we expect to have an invariant:

$$ \Delta f_{band} \cdot \Delta t = const $$

where $ \Delta t $ is the time it takes to reach the given variance split, while the variance split itself scales as $$ \mathrm{var}[A] \propto \frac{1}{\sqrt{\Delta f_{band} \cdot \Delta t}} $$

and it is fixed by the acceptable information loss condition.

This means that by dividing the given bandwidth into two parts we get $ \Delta f_{band,new} = \frac{\Delta f_{band}}{2} $, $ \Delta t_{new} = 2\Delta t $ and $ n_{channels} = 2 $.

Applying this substitution to the information transfer formula, we see that the total transfer rate stays the same.
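As a small numeric check of this argument (the total band and the invariant value below are arbitrary):

```python
# With delta_f * delta_t fixed by the error-rate requirement, the total
# rate n_channels / delta_t does not change when the band is subdivided.
total_band = 1.0e6    # Hz, arbitrary
invariant = 2.0e4     # delta_f * delta_t, fixed by the variance-split condition

for n_channels in (1, 2, 4, 8):
    delta_f = total_band / n_channels
    delta_t = invariant / delta_f
    print(f"{n_channels} channel(s): {n_channels / delta_t:.0f} bits/s total")
```

Every row prints the same total rate, which is exactly the statement above.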

However, the above calculation does not consider the “edge effects” related to band overlaps. There are two principal cases:

  1. The initial frequency uncertainty is much smaller than the given bandwidth: $ \Delta \nu \ll \Delta f $ (shown in the picture)
  2. The initial frequency uncertainty is of the same order as, or greater than, the given bandwidth

 freq uncertainties

In the first case the above calculation is applicable, and the band can be split into independent sub-bands while leaving the total information transfer rate intact, since the small overlap will scale down as the time needed to reach the given variance split grows.

In the second case a bandwidth split would decrease the total information transfer rate, since the band overlaps are required to be much smaller than the band widths themselves. Hence an increase of $ \Delta t $ is required, which degrades channel performance.

Uninformed observer, or how to detect the communication

By now we have come to the conclusion that the transmission is performed by slightly raising the level of certain bands, in a white-noise manner, over periods of time long enough to provide the predefined information loss of the channel.

The uninformed listener is unable to detect the signal since its level is below the “natural” one. If the listener has read this article, then he or she might perform a correlation analysis over the broad spectrum and its sub-bands. It will require storing a lot of data and a lot of computational power to split the spectrum into different bands and analyse different level-switching times. This method would not only detect the transfer but also retrieve the information passed. However, it becomes even harder if the various band channels use various switching times and various bandwidths.
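As an illustration of the kind of analysis such a listener could run, here is a sketch: compute the power of a recorded trace in one candidate band over consecutive windows, then look for structure (here, the lag-1 autocorrelation of the band power) that steady white noise would not show. The band edges, window length and statistic are all assumptions made for the sake of the example.

```python
import numpy as np

def band_power_series(samples, fs, f_lo, f_hi, window):
    """Power inside [f_lo, f_hi) Hz over consecutive windows of `window`
    samples -- essentially one vertical strip of a waterfall plot."""
    powers = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        spectrum = np.fft.rfft(chunk)
        freqs = np.fft.rfftfreq(window, d=1.0 / fs)
        mask = (freqs >= f_lo) & (freqs < f_hi)
        powers.append(np.mean(np.abs(spectrum[mask]) ** 2))
    return np.array(powers)

def switching_score(powers):
    """Lag-1 autocorrelation of the band power: near zero for steady white
    noise, noticeably positive if the band is keyed slower than the window."""
    p = powers - powers.mean()
    return float(np.dot(p[:-1], p[1:]) / np.dot(p, p))
```

An observer would have to sweep this over many candidate bands and window lengths, which is precisely the storage and computation burden mentioned above.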

Another approach however would be to exploit some practical limitations.

Practical limitations

One strong assumption for the method to work is that the transmitter is able to set the electromagnetic field at every point of space independently. In physical reality this is impossible to achieve, since the field intensity obeys the $ \frac{1}{r^2} $ law. This means that the given level of $ A_{max} $ can only be maintained in a limited area of radio wave propagation. Hence, if the observer moves closer to the transmitter, he will be able to track an unnatural increase of the background noise.

Another assumption concerns wave polarisation. Since antennas cannot produce white noise with an independent polarisation distribution (Note: it is possible to use two polarisations at the transmitting side, but the propagation of these waves would differ, so the trick only works within direct visibility), it might be worth paying attention to the noise statistics of independently measured horizontal and vertical polarisations.

One more way to detect this transmission is to use a directional antenna at the receiver. This would uncover the source of the white noise. However, it does not guarantee information retrieval.

Also, a strong drawback of the method is the need to know the noise statistics at the observer’s position.

The small amplitude of the “signal modulation” guarantees that the transmission distance will strongly affect the data transfer rate.

Anyone trying to experiment with the method should consider that in most places it is illegal to broadcast white noise over the radio waves. It might also be illegal to transmit anything at all in certain bands. So this method is recommended only for those who have these legal issues sorted out.

Implementation

Implementation of the method would bump into a number of technological challenges that can be solved with certain effort:

  • Antennas. Each antenna has its own frequency response and radiation pattern. For actual broadband noise generation it might be necessary to use multiple antennas for different frequency ranges

  • Writing software for custom type of data processing for encoding and decoding

  • Thorough control of actual background noise. If the natural noise level is increased during the transmission it may disturb the detection

  • Transferring broadband white noise may require a lot of power input depending on the chosen frequency band width

For those interested in trying out the method without the pain of getting into (a)mateur radio circles, it might be worth taking a shortcut and implementing the concept with acoustic waves. I have witnessed a second-year student do this in a day (ahem, night) before an exam as her semester project.
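For the curious, here is a minimal sketch of such an acoustic version: band-limited noise bursts keyed on top of a quieter noise floor, written to a WAV file. The sample rate, bit length, band and levels are arbitrary choices, and this is not the student's actual project code.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, lfilter

FS = 44100            # audio sample rate, Hz
BIT = 0.2             # seconds per bit (slow, easy to decode with a laptop mic)
BAND = (4000, 6000)   # Hz band carrying the 'extra' noise

def band_noise(duration, band, level, rng):
    """White noise band-pass filtered to `band`, scaled to RMS `level`."""
    raw = rng.standard_normal(int(duration * FS))
    b, a = butter(4, [band[0] / (FS / 2), band[1] / (FS / 2)], btype="bandpass")
    shaped = lfilter(b, a, raw)
    return level * shaped / (np.std(shaped) + 1e-12)

def encode(bits, floor=0.05, extra=0.02, seed=0):
    """Constant audible noise floor; '1' bits add a slightly louder band."""
    rng = np.random.default_rng(seed)
    chunks = []
    for bit in bits:
        chunk = floor * rng.standard_normal(int(BIT * FS))
        if bit:
            chunk = chunk + band_noise(BIT, BAND, extra, rng)
        chunks.append(chunk)
    return np.concatenate(chunks).astype(np.float32)

wavfile.write("noise_ook.wav", FS, encode([1, 0, 1, 1, 0, 0, 1, 0]))
```

Decoding is then the same band-power averaging described earlier, applied to a microphone recording of the playback.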