BladeRF 2.0 frequency stability

Discussions related to schematic capture, PCB layout, signal integrity, and RF development

Moderator: robert.ghilduta

esterhui
Posts: 2
Joined: Tue Aug 28, 2018 12:36 am

BladeRF 2.0 frequency stability

Post by esterhui » Sat Sep 22, 2018 7:09 am

Hi all,

I was trying to use my new BladeRF 2.0 to transmit GPS signals to a receiver, but realized that something was very wrong (phase breaks in the data). I then tested the BladeRF 2.0 by generating a tone at 1575.42 MHz with a signal generator and using RX1 to digitize it. That revealed the problem: the frequency stability is horrible. I see RMS changes of about 2 Hz when tracking the tone with 1 ms integrations.
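For reference, this is roughly how I compute the per-millisecond frequency estimates (sketched here on synthetic data with an artificial phase wobble standing in for reference jitter, rather than the actual capture; the numbers are illustrative):

```python
import numpy as np

def block_frequencies(iq, fs, block_len):
    """Estimate the tone frequency in each block from the mean phase
    increment between consecutive samples (angle of the lag-1 autocorrelation)."""
    n_blocks = len(iq) // block_len
    freqs = []
    for k in range(n_blocks):
        block = iq[k * block_len:(k + 1) * block_len]
        dphi = np.angle(np.sum(block[1:] * np.conj(block[:-1])))
        freqs.append(dphi * fs / (2 * np.pi))  # rad/sample -> Hz
    return np.array(freqs)

# Synthetic stand-in for the capture: a 1 kHz tone at 4 Msps with a
# small random-walk phase wobble playing the role of clock instability.
fs = 4_000_000
t = np.arange(fs // 10) / fs                       # 100 ms of samples
rng = np.random.default_rng(0)
wobble = np.cumsum(rng.normal(0.0, 2e-4, t.size))  # radians
iq = np.exp(1j * (2 * np.pi * 1000.0 * t + wobble))

f_est = block_frequencies(iq, fs, block_len=fs // 1000)  # 1 ms blocks
print(f"mean = {f_est.mean():.1f} Hz, RMS deviation = {f_est.std():.2f} Hz")
```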

The plot below shows the Doppler frequency of the tone as received by the BladeRF. I have confirmed the TX instability as well: looking at the instantaneous frequency on a spectrum analyzer with a narrow RBW (3 Hz), I can see the signal clearly walking around by 10-50 Hz.

I don't have the U.FL to SMA adapter to try an external clock, so currently I'm only using the internal clock. Is this frequency change of 30-50 Hz over 1 second expected? I'm guessing either the clock has an issue, the VTUNE voltage is noisy, or the AD9361 VCO isn't properly locking to the reference clock. Any ideas on how to fix this would be greatly appreciated.

Plot below shows frequency offset from 1575.42 MHz of tone as received by BladeRF RX1, the signal source is a high quality signal generator with excellent frequency stability.
Figure_1-14.png
bladeRF> info

Board: Nuand bladeRF 2.0 (bladerf2)
Serial #: e4639764725043ac801b852b283ccf9c
VCTCXO DAC calibration: 0x1e9b
FPGA size: 49 KLE
FPGA loaded: yes
Flash size: 32 Mbit (assumed)
USB bus: 4
USB address: 5
USB speed: SuperSpeed
Backend: libusb
Instance: 0

bladeRF> print clock_sel

Clock input: Onboard VCTCXO

bladeRF> version

bladeRF-cli version: 1.6.1-git-436c78b
libbladeRF version: 2.0.2-git-436c78b

Firmware version: 2.2.0-git-3d38fac2
FPGA version: 0.7.3

bladeRF> print hardware

Hardware status:
RFIC status:
Temperature: 33.3 degrees C
CTRL_OUT: 0xf8 (0x035=0x00, 0x036=0xff)
RX FIR: normal
TX FIR: bypass
Power source: USB Bus
Power monitor: 4.928 V, 0.55 A, 2.72 W
RF routing:
TX1: RFIC 0x0 (TXA ) => SW 0x0 (OPEN )
TX2: RFIC 0x0 (TXA ) => SW 0x0 (OPEN )
RX1: RFIC 0x0 (A_BAL ) <= SW 0x0 (OPEN )
RX2: RFIC 0x0 (A_BAL ) <= SW 0x0 (OPEN )

esterhui
Posts: 2
Joined: Tue Aug 28, 2018 12:36 am

Re: BladeRF 2.0 frequency stability

Post by esterhui » Sun Sep 23, 2018 2:40 am

Here is another test: the BladeRF 2.0 generating a carrier at 1575.42 MHz (using `osmocom_siggen -s 4M -f 1575.42M --sine`). For reference, a signal generator was used to generate a second tone. In the spectrum/waterfall plot below you can see the BladeRF tone (left) and the signal generator tone (right). Notice the frequency deviations of the BladeRF tone compared to the signal generator's.
spectrum.png

odrisci
Posts: 4
Joined: Wed Oct 10, 2018 1:47 am

Re: BladeRF 2.0 frequency stability

Post by odrisci » Wed Oct 10, 2018 2:33 am

I have exactly the same issue. In fact, for me the bladeRF 2.0 micro is completely unusable because of it. The frequency stability (and hence phase stability) is appalling, particularly given the price of the device. I'm not sure whether this is due to noise at the output of the trim DAC or to poor performance of the new MEMS VCTCXO, but the phase noise performance I'm seeing from the new bladeRF 2.0 micro (A9) is significantly worse than what I see with the RTL-SDR dongle. Given that I paid nearly 25 times as much for the bladeRF, this is disappointing to say the least.

For context: I'm using this device to test out concepts in GPS/GNSS receiver signal processing - particularly in relation to carrier phase based processing. With the RTL-SDR (and many other devices) phase tracking is not an issue, and indeed it is possible to use this low cost device for very high precision GPS signal processing (cm level relative positioning using carrier phase double difference ambiguity resolution). With the bladeRF I am not even able to achieve symbol synchronization.

Here's an example: the following plot shows the PLL phase discriminator output for 3 satellites that are tracked using the open source gnss-sdr (https://github.com/gnss-sdr/gnss-sdr).
PhaseDiscriminator_BladeRF2.png
The output is in cycles and has a half-cycle ambiguity (+/- 0.25 cycles) due to the BPSK modulation on GPS signals. The PLL is unable to pull in due to the high phase error dynamics.
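For anyone unfamiliar, this is a minimal sketch of the kind of Costas (arctan) discriminator behind these plots (gnss-sdr's actual tracking code is more involved):

```python
import numpy as np

def costas_discriminator(prompt):
    """Costas (arctan) phase discriminator, output in cycles.

    Using atan(Q/I) rather than atan2(Q, I) makes the output invariant
    to the 180-degree data-bit flips of BPSK, at the cost of confining
    it to +/- 0.25 cycles -- the half-cycle ambiguity."""
    return np.arctan(prompt.imag / prompt.real) / (2 * np.pi)

err = 0.1                                   # true phase error, cycles
p = np.exp(2j * np.pi * err)                # prompt correlator output
print(costas_discriminator(p))              # ~0.1 cycles
print(costas_discriminator(-p))             # data-bit flip: still ~0.1
print(costas_discriminator(np.exp(2j * np.pi * 0.3)))  # wraps to ~-0.2
```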

Consider the following, which shows similar results using the $30 RTL-SDR dongle:
PhaseDiscriminator_RtlSdr.png
While this is not perfect, the error dynamics are much lower, and are easily tracked. In both cases the antenna was static, and the fact that the noise is consistent across all satellites shows that it comes from the receiver frequency reference.

Please tell me that this is something that can be addressed, otherwise I will have to return the board.

robert.ghilduta
Posts: 32
Joined: Thu Feb 28, 2013 11:14 pm

Re: BladeRF 2.0 frequency stability

Post by robert.ghilduta » Wed Oct 10, 2018 8:27 am

To reduce EMI, the on-board MEMS oscillator employs a limited amount of spread-spectrum clocking. Furthermore, the oscillator was selected for its high-frequency phase noise performance. To counteract this, the bladeRF 2.0 micro offers a number of ways to improve this performance metric. The on-board clocking architecture allows an external clock to drive the fundamental 38.4 MHz clock; this option fully replaces the phase noise characteristics of the entire clock distribution on the board. Alternatively, the on-board ADF4002 can be used to tame the on-board oscillator. Using the ADF4002 will have a large (and desired) impact on the phase noise at the clock's lower offset frequencies (< 1 MHz).

odrisci
Posts: 4
Joined: Wed Oct 10, 2018 1:47 am

Re: BladeRF 2.0 frequency stability

Post by odrisci » Thu Oct 11, 2018 1:46 am

Thanks Robert, that sounds promising. So, does this require connecting the UFL CLK_OUT connector (J92) to the ADF REF_IN (J95)? Then issuing something like:

Code:

set clock_ref enable
set refin_freq 38.4M
Creating a feedback loop in a PLL where the VCO is also the REF is not something I've done before - is this stable?

robert.ghilduta
Posts: 32
Joined: Thu Feb 28, 2013 11:14 pm

Re: BladeRF 2.0 frequency stability

Post by robert.ghilduta » Fri Oct 12, 2018 4:18 pm

The reference should be supplied from an alternate source that matches the phase noise requirements of your application; for a long-integration application like this, a source with strong low-frequency performance is required. If you provide a 38.4 MHz clock, you should use `set clock_sel external` to have it drive the fundamental clock of the board. If, however, you want to keep the on-board clock but have the ADF4002 (reference connector J95) tame it, use `set clock_ref enable`; the default reference frequency is 10 MHz. If you wish to use a 38.4 MHz reference instead, you can use `set refin_freq 38.4M` as you described. The advantage of using J95 is that the ADF4002's loop filter favors the low-frequency phase noise of the new reference and the high-frequency phase noise of the on-board clock, which together yield a very accurate, low-jitter clock.
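To make the divider relationship concrete, here is a quick sketch of how integer R (reference) and N (feedback) counters bring a 10 MHz reference and the 38.4 MHz VCTCXO to a common phase-detector frequency (illustrative only; these are not necessarily the values libbladeRF programs into the ADF4002):

```python
from math import gcd

def pfd_dividers(f_ref_hz, f_vco_hz):
    """Smallest integer reference (R) and feedback (N) dividers with
    f_ref/R == f_vco/N, i.e. the highest common phase-detector rate."""
    g = gcd(f_ref_hz, f_vco_hz)
    return f_ref_hz // g, f_vco_hz // g

R, N = pfd_dividers(10_000_000, 38_400_000)
print(f"R = {R}, N = {N}, PFD = {10_000_000 // R} Hz")  # R = 25, N = 96, PFD = 400000 Hz
```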

odrisci
Posts: 4
Joined: Wed Oct 10, 2018 1:47 am

Re: BladeRF 2.0 frequency stability

Post by odrisci » Sat Oct 13, 2018 5:20 am

Thanks again for the detailed response Robert, I think I misinterpreted your statement:
Alternatively, another option is to use the on-board ADF4002 to tame the on-board oscillator.
I interpreted this to mean that we could use the ADF4002 to reduce the phase noise in the VCTCXO, but in reality this means that you need to use an external reference. For me the accuracy (ppm accuracy) of the clock is not so relevant - this can always be accounted for in signal processing. The big issue is the short term phase stability, which cannot be accounted for in signal processing. If we have phase noise of the nature I showed in my previous post, then there's just no way to process the signals. My interpretation of "taming" was a way to reduce this phase noise.

Secondly, the statement
the oscillator was selected for its high frequency phase noise performance
bothered me for a bit. Since the signal I'm tracking has a frequency of 1575.42 MHz, I would have thought that this is sufficiently high frequency to not be a problem. However, thinking about this a bit more the only conclusion that I can draw is that it is the baseband frequency that matters. Now given that the AD9361 is a zero-IF architecture, we would surely expect that the low frequency phase noise should be the most important. Am I missing something here?

To test this interpretation I plan on collecting a dataset with a low IF (> 2 MHz) to see if the phase performance improves. That would be a reasonable solution for me; the only problem is that the usable bandwidth for any signal is reduced. That is, if we have a bandwidth of say 20 MHz, in a zero-IF configuration this spans [-10, 10] MHz, but the VCTCXO has very poor phase noise performance in the range [-1, 1] MHz, so I effectively have two usable, but non-contiguous, frequency bands: [-10, -1] and [1, 10] MHz. This is disappointing, as there are many interesting things that could be done in the GNSS context with a true 56 MHz wide front end; instead I have two 27 MHz bands (still pretty wide, but actually a little too narrow for the application I had in mind!)
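To make that arithmetic concrete, a quick sketch (taking my rough +/- 1 MHz exclusion region near DC as given; that width is an eyeballed figure, not a measurement):

```python
def usable_bands(bw_hz, excluded_hz):
    """Split the zero-IF span [-bw/2, +bw/2] around an excluded region
    of +/- excluded_hz near DC; returns the two surviving bands (Hz)."""
    half = bw_hz / 2
    return [(-half, -excluded_hz), (excluded_hz, half)]

for bw in (20e6, 56e6):
    lo_band, hi_band = usable_bands(bw, 1e6)
    width = (hi_band[1] - hi_band[0]) / 1e6
    print(f"{bw/1e6:.0f} MHz span -> two {width:.0f} MHz bands: "
          f"[{lo_band[0]/1e6:.0f}, {lo_band[1]/1e6:.0f}] and "
          f"[{hi_band[0]/1e6:.0f}, {hi_band[1]/1e6:.0f}] MHz")
```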

Again, it's possible that I'm misinterpreting something here. But it seems like to get the performance that I thought I was going to get based on the specs, I will have to fork out extra cash to buy an external oscillator with good low frequency phase noise characteristics. Once again, I appreciate your feedback, it is possible that I'm getting this wrong, my background is more on the DSP than RF side. So please don't hesitate to call out any glaring inaccuracies in the above.
