
Question

Is there a way to measure the minimum and maximum time between two digital inputs over time?  For example, two digital inputs:

(1) digital input, rising edge: the trigger that starts the time measurement
(2) digital input, rising edge: stops the timer
- The time value is compared against a running minimum and maximum time value.

The goal is to have a continuous measurement of the minimum and maximum time difference between the rising edges of digital inputs 1 and 2.

i.e., measure the jitter between two digital inputs over time.
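For reference, the running min/max comparison described above can be sketched in plain JavaScript (the same language the WaveForms Script tool uses). The function and state names here are illustrative, not part of any WaveForms API; timestamps are assumed to be in seconds:

```javascript
// Track the running min/max of the interval between a start edge
// (input 1 rising) and the following stop edge (input 2 rising).
function makeJitterTracker() {
    return { min: Infinity, max: -Infinity, startTime: null };
}

// Call on each rising edge of input 1
function onStartEdge(state, t) {
    state.startTime = t;
}

// Call on each rising edge of input 2; returns the latest interval,
// or null if no start edge is pending
function onStopEdge(state, t) {
    if (state.startTime === null) return null;
    var dt = t - state.startTime;
    state.startTime = null;
    if (dt < state.min) state.min = dt;
    if (dt > state.max) state.max = dt;
    return dt;
}

// Example: three start/stop pairs with 10 ms, 12 ms, and 9 ms gaps
var s = makeJitterTracker();
onStartEdge(s, 0.0); onStopEdge(s, 0.010);
onStartEdge(s, 1.0); onStopEdge(s, 1.012);
onStartEdge(s, 2.0); onStopEdge(s, 2.009);
// s.min is about 0.009 s, s.max about 0.012 s
```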

Recommended Posts


You could use a custom decoder in the Logic Analyzer to measure the timing, and the Script tool to find the min/max.
The repetitive captures will have gaps between them, but you could use Record mode to measure over a longer continuous sequence.   Decoder:

var c = rgData.length
var p0 = 1
var p1 = 1
var v = 0
var i0 = 0
for(var i = 0; i < c; i++){
    var d0 = rgData[i] & 0x01        // DIO 0
    var d1 = (rgData[i] >> 1) & 0x01 // DIO 1
    if(p0 == 0 && d0 == 1){ // DIO 0 rise: reset 'v'
        v = 0
        i0 = i
    }
    if(p1 == 0 && d1 == 1){ // DIO 1 rise: store 'v' over the interval
        for(var j = i0; j < i; j++){
            rgValue[j] = v
            rgFlag[j] = hzRate
        }
    }
    v++
    p0 = d0
    p1 = d1
}

Value to Text:

function Value2Text(flag, value){
    switch(flag){
    case 0: return "X";
    default: return 1e6*value/flag; // us, microseconds
    }
}
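The Script-tool side of this (finding the min/max) could look something like the sketch below. It runs on a plain array of packed samples like the decoder's rgData with a known sample rate hzRate; the function name minMaxDelta is illustrative, not a WaveForms API:

```javascript
// Scan packed samples (bit 0 = DIO 0, bit 1 = DIO 1) and return the
// minimum/maximum time from a DIO 0 rise to the next DIO 1 rise.
function minMaxDelta(rgData, hzRate) {
    var p0 = 1, p1 = 1;          // previous states (1 avoids a false edge at start)
    var i0 = -1;                 // sample index of the last DIO 0 rise
    var min = Infinity, max = -Infinity;
    for (var i = 0; i < rgData.length; i++) {
        var d0 = rgData[i] & 0x01;
        var d1 = (rgData[i] >> 1) & 0x01;
        if (p0 == 0 && d0 == 1) i0 = i;      // start: DIO 0 rise
        if (p1 == 0 && d1 == 1 && i0 >= 0) { // stop: DIO 1 rise
            var dt = (i - i0) / hzRate;      // seconds
            if (dt < min) min = dt;
            if (dt > max) max = dt;
            i0 = -1;                         // wait for the next start edge
        }
        p0 = d0;
        p1 = d1;
    }
    return { min: min, max: max };
}

// Example at a 1 MHz sample rate: DIO 0 rises at samples 2 and 20,
// DIO 1 rises at samples 12 and 28, giving deltas of 10 us and 8 us.
var data = [];
for (var k = 0; k < 40; k++) data.push(0);
for (var k = 2; k <= 5; k++) data[k] |= 1;
for (var k = 12; k <= 15; k++) data[k] |= 2;
for (var k = 20; k <= 23; k++) data[k] |= 1;
for (var k = 28; k <= 31; k++) data[k] |= 2;
var r = minMaxDelta(data, 1e6);
```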


I was using Record mode and a Python script to parse the files. I will try the JavaScript method as well.

Thank you,

Tim
