Is there a way to measure the minimum and maximum time between two digital inputs over time? For example, two digital inputs:
(1) digital input, rising edge: triggers the start of the time measurement
(2) digital input, rising edge: stops the timer
- The measured time value is compared against a running minimum and maximum value.
The goal is to have a continuous measurement of the minimum and maximum time difference between the rising edges of digital inputs 1 and 2, i.e. to measure the jitter between the two digital inputs over time.
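The bookkeeping described above can be sketched in software. This is a minimal Python sketch, assuming edge timestamps arrive as callbacks (e.g. from a DAQ driver or interrupt handler); the `JitterTracker` class and method names are hypothetical, not from any particular library, and a real setup would need hardware-timestamped edges for accurate jitter numbers.

```python
class JitterTracker:
    """Track the running min/max delay between a start edge and a stop edge."""

    def __init__(self):
        self.t_start = None          # timestamp of the last rising edge on input 1
        self.min_dt = float("inf")   # running minimum delay seen so far
        self.max_dt = float("-inf")  # running maximum delay seen so far

    def rising_edge_1(self, timestamp):
        # Rising edge on input 1 arms (starts) the measurement.
        self.t_start = timestamp

    def rising_edge_2(self, timestamp):
        # Rising edge on input 2 stops the timer and updates min/max.
        if self.t_start is None:
            return  # stop edge with no preceding start edge; ignore it
        dt = timestamp - self.t_start
        self.min_dt = min(self.min_dt, dt)
        self.max_dt = max(self.max_dt, dt)
        self.t_start = None  # re-arm on the next input-1 edge

# Feed in simulated edge-timestamp pairs (seconds):
tracker = JitterTracker()
for start, stop in [(0.000, 0.010), (1.000, 1.012), (2.000, 2.009)]:
    tracker.rising_edge_1(start)
    tracker.rising_edge_2(stop)

print(tracker.min_dt, tracker.max_dt)
```

The jitter over the observation window is then `tracker.max_dt - tracker.min_dt`. The same pattern maps directly onto a two-counter or timestamping approach in hardware: input 1 latches a start time, input 2 latches a stop time, and only the min/max comparison runs in software.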