EttusResearch / uhd

The USRP™ Hardware Driver Repository
http://uhd.ettus.com

Issue when streaming SC16 with C API #638

Open · marcospanghero opened this issue 1 year ago

marcospanghero commented 1 year ago

Issue Description

I am using the C API to stream a binary data file. The file is saved as interleaved int16_t [I][Q] samples. I am using an N210r4 with the latest UHD and FPGA image. The MTU on the NIC is configured at 3000 and the buffer size is 10000 samples. The same file, when streamed with the included tx_samples_from_file example, works fine and the baseband is received correctly. When the equivalent code is written in C, the baseband is not correct.
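
To make the layout explicit, here is the file format the numbers above assume (my own sketch, not taken from either example):

/* On-disk layout: interleaved 16-bit I/Q values.
 *
 *   byte offset:  0     2     4     6     8     10   ...
 *   value:        I[0]  Q[0]  I[1]  Q[1]  I[2]  Q[2] ...
 *
 * One complex sc16 sample therefore occupies 2 * sizeof(int16_t) = 4 bytes.
 */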

Setup Details

Implemented C code streaming loop:

while (1) {
    if (stop_signal_called)
        break;

    uhd_tx_metadata_make(&md, false, 0, 0.1, false, false);

    /* read counts int16_t elements read from the file */
    size_t read = fread(buff, sizeof(int16_t), samps_per_buff, file);
    for (size_t i = 0; i < read; i++) {
        printf("%d\n", buff[i]);
    }

    if (read > 0) {
        uhd_tx_streamer_send(tx_streamer, buffs_ptr, read, &md, 0.1, &num_samps_sent);
        total_num_samps += num_samps_sent;
    } else {
        break;
    }

    if (verbose)
        printf("\nSent %zu - from file %zu\n", total_num_samps, read);
}

buff contains the data block to stream and is defined as buff = malloc(samps_per_buff * sizeof(int16_t));
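
For comparison, a buffer sized in complex sc16 samples, assuming samps_per_buff counts complex samples as it does in the C++ example, would be allocated like this (my own sketch, including a buffs_ptr definition since it is not shown above):

/* Sketch: one complex sc16 sample = one int16_t I plus one int16_t Q,
 * i.e. 2 * sizeof(int16_t) = 4 bytes, matching sizeof(std::complex<short>)
 * in the C++ example. */
int16_t* buff = malloc(samps_per_buff * 2 * sizeof(int16_t));
const void** buffs_ptr = (const void**)&buff;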

C stream arguments

uhd_stream_args_t stream_args = {
    .cpu_format   = "sc16",
    .otw_format   = "sc16",
    .args         = "",
    .channel_list = 0,
    .n_channels   = 1
};
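
If I read UHD's shipped C example (host/examples/tx_samples_c.c) correctly, channel_list should point at an actual size_t array rather than being 0; a minimal sketch of that pattern:

/* channel_list is a size_t*, so it points at a real array; this mirrors
 * the pattern used in UHD's tx_samples_c.c example. */
size_t channel = 0;
uhd_stream_args_t stream_args = {
    .cpu_format   = "sc16",
    .otw_format   = "sc16",
    .args         = "",
    .channel_list = &channel,
    .n_channels   = 1
};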

Reference C++ streaming loop:

template <typename samp_type> // samp_type is std::complex<short> in this case
void send_from_file(
    uhd::tx_streamer::sptr tx_stream, const std::string& file, size_t samps_per_buff)
{
    uhd::tx_metadata_t md;
    md.start_of_burst = false;
    md.end_of_burst   = false;
    std::vector<samp_type> buff(samps_per_buff);
    std::ifstream infile(file.c_str(), std::ifstream::binary); 

    // loop until the entire file has been read

    while (not md.end_of_burst and not stop_signal_called) {
        infile.read((char*)&buff.front(), buff.size() * sizeof(samp_type));
        size_t num_tx_samps = size_t(infile.gcount() / sizeof(samp_type));

        md.end_of_burst = infile.eof();

        const size_t samples_sent = tx_stream->send(&buff.front(), num_tx_samps, md);
        if (samples_sent != num_tx_samps) {
            UHD_LOG_ERROR("TX-STREAM",
                "The tx_stream timed out sending " << num_tx_samps << " samples ("
                                                   << samples_sent << " sent).");
            return;
        }
    }

    infile.close();
}

Reference C++ stream arguments

uhd::stream_args_t stream_args("sc16", "sc16");
std::vector<size_t> channel_nums;
channel_nums.push_back(boost::lexical_cast<size_t>(channel));
stream_args.channels             = channel_nums;
uhd::tx_streamer::sptr tx_stream = usrp->get_tx_stream(stream_args);

Expected Behavior

I would expect the two programs to behave exactly the same: the transmitted baseband should be identical.

Actual Behavior

On a spectrum analyzer, the C example shows a much larger gain and the baseband appears fragmented. I don't understand how the C API handles the buffer streaming; looking at the source, the C function appears to wrap the exact same send call.

Steps to reproduce the problem

I can provide the source code for the two examples and the binary file. Both examples run at 2.5 Msps.

Question

What am I missing here to stream the baseband correctly? My understanding is that once the data type is fixed in the stream arguments, we call uhd_tx_streamer_send with the number of samples that we want to stream (which is what the C++ example does, using std::complex<short> as the sample type). In the C case, how do we achieve the same behavior? For example, would something like the sketch below be the intended usage?
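
This is only my guess, and it assumes uhd_tx_streamer_send counts complex sc16 samples of 2 * sizeof(int16_t) bytes each, the same way the C++ send call counts std::complex<short> elements:

/* Sketch only: fread() uses one I/Q pair as the element size, so samps_read
 * and the count passed to uhd_tx_streamer_send() both count complex samples. */
int16_t* buff = malloc(samps_per_buff * 2 * sizeof(int16_t));
const void** buffs_ptr = (const void**)&buff;
size_t num_samps_sent = 0;

uhd_tx_metadata_make(&md, false, 0, 0.1, false, false);

while (!stop_signal_called) {
    size_t samps_read = fread(buff, 2 * sizeof(int16_t), samps_per_buff, file);
    if (samps_read == 0)
        break;

    uhd_tx_streamer_send(tx_streamer, buffs_ptr, samps_read, &md, 0.1, &num_samps_sent);
    total_num_samps += num_samps_sent;
}

free(buff);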