Hello,
So I've got an audio DSP here that relies on somewhat custom I2C timing. According to the documentation, varying delays need to be accounted for when reading from or writing to the chip:
- After the Start command, a delay of 20 us
- Between its internal address low & high bytes, a delay of 24 us
- Before reading the returned data bytes, a delay of 16 us
Other delays, such as those for reading back stored register values, are specified as minimums in ms. That part is fine, since the command selecting which value to read is sent as a single i2c command link, and the result is read later with another one.
Since one bit at 100 kHz takes 10 us, there doesn't seem to be an easy way for me to add custom delays (especially in us) to an i2c_cmd_handle_t.
Does anyone have any experience / direction to take with this before I dig into the lower level code for days?
i2c - Dealing with non-standard custom chip timing
- Posts: 24
- Joined: Fri May 28, 2021 1:58 pm
Last edited by expresspotato on Tue Apr 04, 2023 9:52 pm, edited 1 time in total.
- Posts: 1724
- Joined: Mon Oct 17, 2022 7:38 pm
- Location: Europe, Germany
Re: i2c - Dealing with non-standard custom chip timing
Not 100% sure about the I2C driver implementation right now, but see if you can split your communication into multiple "transactions": start with a "transaction" that contains only a single START, then do basically one write (or read) transaction per byte, so you have room to insert the delays in between.
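A rough sketch of that idea, assuming the legacy `driver/i2c.h` API: one command link per bus event, with `esp_rom_delay_us()` between the `i2c_master_cmd_begin()` calls. `DEV_ADDR` and the address bytes are placeholders, and whether the controller actually tolerates links without their own START/STOP framing is exactly what would need testing on a logic analyzer.

```c
#include "driver/i2c.h"
#include "esp_rom_sys.h"          // esp_rom_delay_us()

#define I2C_PORT  I2C_NUM_0
#define DEV_ADDR  0x34            // placeholder 7-bit device address

// Build and execute one minimal command link: optional START,
// optional single data byte (pass -1 for none), optional STOP.
static esp_err_t run_link(bool start, int byte, bool stop)
{
    i2c_cmd_handle_t cmd = i2c_cmd_link_create();
    if (start)     i2c_master_start(cmd);
    if (byte >= 0) i2c_master_write_byte(cmd, (uint8_t)byte, true /* ACK */);
    if (stop)      i2c_master_stop(cmd);
    esp_err_t err = i2c_master_cmd_begin(I2C_PORT, cmd, pdMS_TO_TICKS(10));
    i2c_cmd_link_delete(cmd);
    return err;
}

// Write a 16-bit internal address with the delays quoted above.
static void write_internal_addr(uint8_t addr_lo, uint8_t addr_hi)
{
    run_link(true, -1, false);              // bare START
    esp_rom_delay_us(20);                   // 20 us after START (datasheet)
    run_link(false, DEV_ADDR << 1, false);  // device address, write bit
    run_link(false, addr_lo, false);
    esp_rom_delay_us(24);                   // 24 us between address bytes
    run_link(false, addr_hi, false);
    run_link(false, -1, true);              // STOP
}
```

Note the delays happen on the CPU between transactions, so they come out as minimums, not exact values.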
That chip has one pretty wild "I2C" interface someone cobbled together in software
- Posts: 24
- Joined: Fri May 28, 2021 1:58 pm
Re: i2c - Dealing with non-standard custom chip timing
Hello @MicroController,
Thanks kindly for the reply and your suggestion. The more I think about it, the more I think it's going to be a real problem, because the chip requires 24 us, which isn't a whole number of bit periods (10 us each) but falls between 2 and 3 bits for some reason.
I will try just the start command followed by some arbitrary delay, but this may be hard to achieve accurately since the delay happens back on the CPU in user code rather than in the lower-level i2c code / hardware.
I suspect this could be done by generating an output clock and bit banging, but then I lose all the i2c library functionality...
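On the accuracy point: at this scale a busy-wait is more repeatable than anything tick-based. One sketch (assuming ESP-IDF, where `esp_rom_delay_us()` spins on the CPU; the spinlock name here is made up) is to mask interrupts around the spin so an ISR can't stretch the gap. Since these delays are minimums anyway, jitter only in the "too long" direction may also be acceptable without this.

```c
#include <stdint.h>
#include "esp_rom_sys.h"         // esp_rom_delay_us()
#include "freertos/FreeRTOS.h"   // portENTER_CRITICAL / portEXIT_CRITICAL

static portMUX_TYPE s_timing_mux = portMUX_INITIALIZER_UNLOCKED;

// Busy-wait for `us` microseconds with interrupts masked on this core,
// so an ISR can't inflate the delay mid-transaction.
static void precise_delay_us(uint32_t us)
{
    portENTER_CRITICAL(&s_timing_mux);
    esp_rom_delay_us(us);
    portEXIT_CRITICAL(&s_timing_mux);
}
```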
- Posts: 1724
- Joined: Mon Oct 17, 2022 7:38 pm
- Location: Europe, Germany
Re: i2c - Dealing with non-standard custom chip timing
To me it seems that the specs are only giving minimums for the timings...
- Posts: 24
- Joined: Fri May 28, 2021 1:58 pm
Re: i2c - Dealing with non-standard custom chip timing
After hours with the logic analyzer, I found the timing doesn't even follow the spec sheet when commands are sent from the chip's Windows configuration utility. I ended up using the following library and adding the needed delays with esp_rom_delay_us:
https://github.com/tuupola/esp_software_i2c
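For anyone landing here later, the general shape of this fix looks like the following. This is not the esp_software_i2c API itself, just an illustration of a bit-banged byte write where you own the clocking and can drop `esp_rom_delay_us()` calls wherever the datasheet demands. Pin numbers and the half-bit time are placeholders, and the pins are assumed to be configured open-drain with external pull-ups.

```c
#include "driver/gpio.h"
#include "esp_rom_sys.h"   // esp_rom_delay_us()

#define PIN_SDA  GPIO_NUM_21   // placeholder pins, open-drain + pull-ups
#define PIN_SCL  GPIO_NUM_22
#define HALF_US  5             // 5 + 5 us per bit, roughly 100 kHz

// Clock out one byte MSB-first, then release SDA and clock the ACK bit
// (the slave's ACK value is ignored in this sketch).
static void sw_i2c_write_byte(uint8_t b)
{
    for (int i = 7; i >= 0; i--) {
        gpio_set_level(PIN_SDA, (b >> i) & 1);
        esp_rom_delay_us(HALF_US);
        gpio_set_level(PIN_SCL, 1);
        esp_rom_delay_us(HALF_US);
        gpio_set_level(PIN_SCL, 0);
    }
    gpio_set_level(PIN_SDA, 1);        // release SDA for ACK
    esp_rom_delay_us(HALF_US);
    gpio_set_level(PIN_SCL, 1);
    esp_rom_delay_us(HALF_US);
    gpio_set_level(PIN_SCL, 0);
}
```

With the clocking in software, honoring the chip's quirks is just a matter of calling e.g. `esp_rom_delay_us(24)` between the internal address low and high bytes.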