
I am trying to implement decimation on an FPGA and in MATLAB.

For this task I chose the following design parameters:

Filter type = FIR, Hamming window method

Filter order = 30

Decimation factor = 10

Input sample rate = 2 MS/s

Output sample rate = 200 kS/s

Normalized cutoff = 1/decimation factor
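For reference, a filter with these parameters can be sketched as follows. I am using Python/SciPy here rather than MATLAB; `firwin` plays the role of MATLAB's `fir1`, and the `fir1(30, 0.1)` equivalence is my assumption, not something stated in the post:

```python
import numpy as np
from scipy.signal import firwin

# Parameters from the post: order 30 -> 31 taps, Hamming window,
# normalized cutoff = 1/decimation factor.
M = 10                    # decimation factor
order = 30
cutoff = 1.0 / M          # normalized to Nyquist (1 MHz at 2 MS/s)

# Should match MATLAB's fir1(30, 0.1) up to floating-point rounding.
h = firwin(order + 1, cutoff, window='hamming')

print(len(h))             # -> 31 taps, symmetric (linear phase)
```

With a 2 MS/s input, this cutoff sits at 100 kHz, i.e. the Nyquist frequency of the 200 kS/s output, which is the usual choice for a decimate-by-10 anti-aliasing filter.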

First I implemented it in MATLAB using this sequence:

Filter

Handle filter delay

Downsample

Using the above sequence I successfully implemented it in MATLAB.
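The MATLAB sequence above can be sketched like this (again in Python/SciPy as a stand-in for the MATLAB calls; the impulse input is my own choice, because it makes the delay easy to see):

```python
import numpy as np
from scipy.signal import firwin, lfilter

M, order = 10, 30
h = firwin(order + 1, 1.0 / M, window='hamming')  # ~ fir1(30, 0.1)

# Hypothetical test input: an impulse, so the output is just h delayed.
x = np.zeros(200)
x[0] = 1.0

# 1) Filter
y = lfilter(h, 1.0, x)

# 2) Handle filter delay: a linear-phase FIR of order N delays the
#    signal by N/2 samples, here 30/2 = 15 samples at the input rate.
delay = order // 2
y_aligned = y[delay:]

# 3) Downsample by M
y_dec = y_aligned[::M]

print(np.argmax(y))       # -> 15, the peak sits at the group delay
```

After step 2 the peak of the impulse response lands at index 0, which is exactly what "handling the filter delay" buys you before downsampling.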

For the FPGA I use the FIR Compiler core, into which I paste the same coefficients generated by MATLAB.

But the problem I face is that the FIR Compiler core gives you the decimated output directly, without handling the filter delay.

So what should I do if I want to handle this filter delay in the Xilinx flow?

FIR Compiler core sequence:

Filter → downsample
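One way to see what delay handling means after the FIR Compiler's filter-then-downsample order (a numerical sketch of the arithmetic, not anything from the Xilinx documentation): with order 30 the group delay is 15 input samples, which is not a multiple of M = 10, so it corresponds to 1.5 output samples and cannot be removed after decimation by discarding whole samples. If the filter is re-designed so the delay is a multiple of M (order 40 is assumed here purely for illustration), compensation reduces to dropping delay/M decimated samples:

```python
import numpy as np
from scipy.signal import firwin, lfilter

M = 10
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)       # hypothetical input at 2 MS/s

# Original design: order 30 -> group delay 15 input samples.
# 15 % 10 != 0: the delay is 1.5 *output* samples, so no whole number
# of decimated samples can be discarded to compensate for it.
order = 30
assert (order // 2) % M != 0

# Assumed alternative: order 40 -> delay 20, an exact multiple of M.
order2 = 40
h2 = firwin(order2 + 1, 1.0 / M, window='hamming')
y2 = lfilter(h2, 1.0, x)            # full-rate filtered stream

matlab_style = y2[order2 // 2::M]           # filter -> handle delay -> downsample
fpga_style = y2[::M][(order2 // 2) // M:]   # filter -> downsample, drop 2 samples

print(np.allclose(matlab_style, fpga_style))  # -> True
```

Otherwise the usual options are to simply account for the constant group delay downstream of the core (it is just a fixed latency), or to compensate at the full input rate before the rate change.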
