I attempted to use the BladeRF SDR for signal power measurement. My setup connects the signal generator directly to the BladeRF's RX1 port, feeding it a 400 MHz sinusoid at -35 dBm. MATLAB was used to control the SDR, and the signal power was calculated as rms(x)^2. After 1 hour of measurement, a significant deviation was observed between the final and the initial results (see Figure 1; one iteration corresponds to 2 s). Figure 2 shows the readings from the SDR's internal temperature sensor.
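For reference, here is a minimal sketch of the power computation, assuming x is a vector of complex baseband samples already captured from the device (the variable names are mine):

```matlab
% x: complex baseband capture from the BladeRF RX1,
%    samples normalized to full scale.

% Linear power estimate, as used in my measurement:
P_lin = rms(x)^2;            % equivalent to mean(abs(x).^2)

% Expressed in dB relative to full scale, which makes the
% drift between runs easier to compare:
P_dBFS = 10 * log10(P_lin);
```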
After approximately 1 hour of measurement, I increased the signal generator's output power to -25 dBm and then to -20 dBm. The deviation grew roughly linearly with the input power.
My question is: is this deviation normal, and what could explain such a substantial drift? Can it really be attributed to a heating effect?
Please note: the AGC was set to manual mode, and the RX1 gain was fixed at 0 dB throughout.
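For completeness, this is roughly how the receiver was configured. The sketch below uses the property and method names of Nuand's libbladeRF MATLAB bindings as I understand them, so the exact names, the sample rate, and the capture length are assumptions that may need adjusting for your driver version:

```matlab
dev = bladeRF();                 % open the first available BladeRF

dev.rx.frequency  = 400e6;       % match the signal generator
dev.rx.samplerate = 2e6;         % example value; not the point of the test
dev.rx.agc        = 'manual';    % disable automatic gain control
dev.rx.gain       = 0;           % fixed 0 dB gain on RX1

dev.rx.start();
x = dev.receive(1e6);            % capture one block of complex samples
dev.rx.stop();

P_dBFS = 10 * log10(rms(x)^2);   % power of this capture
```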
Thank you.
Figure 1: Measured signal power vs. iteration number (1 iteration = 2 s).

Figure 2: Readings from the SDR's internal temperature sensor.