i2c - Dealing with non-standard custom chip timing
Posted: Tue Apr 04, 2023 1:35 pm
Hello,
So I've got an audio DSP here that relies on somewhat custom I2C timing. According to the documentation, varying delays need to be inserted when reading from or writing to the chip:
- After the Start command, a delay of 20 us
- Between its internal address low & high bytes, a delay of 24 us
- Before reading the returned data bytes, a delay of 16 us
The other delays, such as the one before reading back a stored register value, are a millisecond or more. Those are fine, since the command selecting which value to read goes out in one command link, and the result is fetched later with a separate I2C command link.
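For reference, that ms-scale readback is easy to handle at the task level with the stock driver. A rough sketch of what I mean, assuming the ESP-IDF v4.4+ convenience calls; DSP_ADDR, the command bytes, and the 2 ms wait are placeholders of mine, not real datasheet values:

```c
#include "driver/i2c.h"
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"

#define DSP_ADDR 0x3A   // placeholder 7-bit address

// Phase 1: tell the DSP which register value to return.
uint8_t cmd_buf[] = { 0x01, 0x23 };   // placeholder "read register X" command
ESP_ERROR_CHECK(i2c_master_write_to_device(I2C_NUM_0, DSP_ADDR,
                                           cmd_buf, sizeof(cmd_buf),
                                           pdMS_TO_TICKS(100)));

// Phase 2: wait the documented ms-scale delay, then fetch the result
// with a completely separate command link.
vTaskDelay(pdMS_TO_TICKS(2));         // placeholder delay
uint8_t result[2];
ESP_ERROR_CHECK(i2c_master_read_from_device(I2C_NUM_0, DSP_ADDR,
                                            result, sizeof(result),
                                            pdMS_TO_TICKS(100)));
```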
Since one bit takes 10 us at 100 kHz, there doesn't seem to be an easy way for me to add custom delays (especially at the microsecond scale) to an i2c_cmd_handle_t.
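The fallback I'm considering is bit-banging the bus in software, so I can drop esp_rom_delay_us() calls anywhere in the transaction. A minimal sketch under my own assumptions (placeholder pin numbers, open-drain GPIOs with external pull-ups, the delays from the list above, and a guess at low-before-high byte order; not tested against the real chip):

```c
#include <stdbool.h>
#include <stdint.h>
#include "driver/gpio.h"
#include "esp_rom_sys.h"

#define SDA_PIN     21   // placeholder wiring
#define SCL_PIN     22   // placeholder wiring
#define HALF_BIT_US 5    // ~100 kHz: 10 us per bit

static void sw_i2c_init(void)
{
    // Open-drain so the slave can pull the lines; external pull-ups assumed.
    gpio_set_direction(SDA_PIN, GPIO_MODE_INPUT_OUTPUT_OD);
    gpio_set_direction(SCL_PIN, GPIO_MODE_INPUT_OUTPUT_OD);
    gpio_set_level(SDA_PIN, 1);
    gpio_set_level(SCL_PIN, 1);
}

static void sw_i2c_start(void)
{
    gpio_set_level(SDA_PIN, 1);
    gpio_set_level(SCL_PIN, 1);
    esp_rom_delay_us(HALF_BIT_US);
    gpio_set_level(SDA_PIN, 0);           // SDA falls while SCL high = START
    esp_rom_delay_us(HALF_BIT_US);
    gpio_set_level(SCL_PIN, 0);
}

static bool sw_i2c_write_byte(uint8_t b)  // returns true on slave ACK
{
    for (int i = 7; i >= 0; i--) {
        gpio_set_level(SDA_PIN, (b >> i) & 1);
        esp_rom_delay_us(HALF_BIT_US);
        gpio_set_level(SCL_PIN, 1);
        esp_rom_delay_us(HALF_BIT_US);
        gpio_set_level(SCL_PIN, 0);
    }
    gpio_set_level(SDA_PIN, 1);           // release SDA for the ACK bit
    esp_rom_delay_us(HALF_BIT_US);
    gpio_set_level(SCL_PIN, 1);
    bool ack = (gpio_get_level(SDA_PIN) == 0);
    esp_rom_delay_us(HALF_BIT_US);
    gpio_set_level(SCL_PIN, 0);
    return ack;
}

// The write phase with the chip's delays dropped in between; dsp_addr,
// reg_lo, and reg_hi are placeholders.
static void dsp_write_address(uint8_t dsp_addr, uint8_t reg_lo, uint8_t reg_hi)
{
    sw_i2c_start();
    esp_rom_delay_us(20);                 // 20 us after START
    sw_i2c_write_byte(dsp_addr << 1);     // 7-bit address + write bit
    sw_i2c_write_byte(reg_lo);
    esp_rom_delay_us(24);                 // 24 us between address low & high
    sw_i2c_write_byte(reg_hi);
    // ... data bytes, the 16 us delay before any readback, then STOP
}
```

The obvious trade-off is that the CPU busy-waits through the whole transaction, though at 100 kHz a few bytes per command is still well under a millisecond, which I could live with.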
Does anyone have experience with this, or a direction to point me in, before I dig into the lower-level code for days?