i2c_master_write_byte(m_i2c_cmd, (uint8_t) ((devAddr << 1) | I2C_MASTER_READ), true)

mzimmers wrote: Well, my HW guys have checked out the board and say it looks OK, so I'm inclined to think I'm doing something wrong.
Here's a code snippet of my attempt to implement the protocol described in the drawing in my above post. Does anything jump out at you as wrong here?
Code: Select all
m_i2c_cmd = i2c_cmd_link_create();
ESP_ERROR_CHECK(i2c_master_start(m_i2c_cmd));
ESP_ERROR_CHECK(i2c_master_write_byte(m_i2c_cmd, (uint8_t) ((devAddr) | I2C_MASTER_READ), true));
ESP_ERROR_CHECK(i2c_master_write_byte(m_i2c_cmd, (uint8_t) regAddr, true));
ESP_ERROR_CHECK(i2c_master_read(m_i2c_cmd, data, 2, I2C_MASTER_LAST_NACK));
ESP_ERROR_CHECK(i2c_master_stop(m_i2c_cmd));
err = (i2c_master_cmd_begin(I2C_PORT_NBR, m_i2c_cmd, 100));
Thanks.
(solved) error when trying to use I2C
Re: error when trying to use I2C
Hi Fly -
It took me some time to figure out the Maxim datasheet, but I believe that their addresses are "pre-shifted."
Slave Addresses
Charger: 0xD2/0xD3
Clogic, GTEST and Safeout LDOs: 0xCC/0xCD
Fuel Gauge: 0x6C/0x6D. See the Fuel Gauge I2C Protocol for details in the Fuel Gauge section.
I believe that, in the example of the fuel gauge, the address is really 0x6c >> 1. When you want to write, you use 0x6c; when you want to read, you use 0x6d (I hope that made sense).
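To put numbers on that (just a sketch of how I read the datasheet; the macro names below are my own placeholders, not names from the Maxim docs):
Code: Select all
// My reading of the "pre-shifted" addresses:
#define FUEL_GAUGE_ADDR_8BIT   0x6C          // write address as printed in the datasheet
#define FUEL_GAUGE_ADDR_7BIT   (0x6C >> 1)   // = 0x36, the bare 7-bit slave address

// What actually goes on the wire is (7-bit address << 1) | R/W bit:
uint8_t addrWrite = (FUEL_GAUGE_ADDR_7BIT << 1) | I2C_MASTER_WRITE;  // 0x6C
uint8_t addrRead  = (FUEL_GAUGE_ADDR_7BIT << 1) | I2C_MASTER_READ;   // 0x6D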
I've managed to get this much working (and by "working" I mean not producing a timeout error):
If I uncomment the i2c_master_read line, then I get a time out. I'm sure it's my doing; I'm still trying to understand the protocol of the Maxim device. I am curious, though, how I'm supposed to introduce the read address into my code.
Code: Select all
esp_err_t PowerMgr::i2cRead(uint8_t devAddr, uint8_t regAddr, uint8_t *data)
{
    esp_err_t err;
    uint8_t addrRead, addrWrite;

    // note about data (address) field: the ESP32 docs say to shift addresses one bit to the left,
    // but it appears that the addresses supplied in the MAX77818 data sheet already have been shifted.
    // keep this in mind when setting addresses from the datasheet.
    addrRead = static_cast<uint8_t>(((devAddr) | I2C_MASTER_READ));
    addrWrite = static_cast<uint8_t>(((devAddr) | I2C_MASTER_WRITE));

    m_i2c_cmd = i2c_cmd_link_create();
    ESP_ERROR_CHECK(i2c_master_start(m_i2c_cmd));
    ESP_ERROR_CHECK(i2c_master_write_byte(m_i2c_cmd, addrWrite, true));
    ESP_ERROR_CHECK(i2c_master_write_byte(m_i2c_cmd, regAddr, true));
    //ESP_ERROR_CHECK(i2c_master_read(m_i2c_cmd, data, 1, I2C_MASTER_LAST_NACK));
    ESP_ERROR_CHECK(i2c_master_stop(m_i2c_cmd));
    err = i2c_master_cmd_begin(I2C_PORT_NBR, m_i2c_cmd, 100);
    if (err == ESP_OK)
    {
        ESP_LOGI(TAG, "i2cRead(): i2c_master_cmd_begin() successful.");
    }
    else
    {
        ESP_LOGE(TAG, "i2cRead(): error %x on i2c_master_cmd_begin.", err);
    }
    i2c_cmd_link_delete(m_i2c_cmd);
    return err;
}
Re: error when trying to use I2C
I think I'm slowly getting to the bottom of this...this code works:
I think this is observing the protocol in the diagram...notice the extra i2c_master_start() call in the middle of the sequence. Seems weird, but at least it returns without error.
Code: Select all
addrRead = static_cast<uint8_t>(((devAddr) | I2C_MASTER_READ));
addrWrite = static_cast<uint8_t>(((devAddr) | I2C_MASTER_WRITE));
m_i2c_cmd = i2c_cmd_link_create();
ESP_ERROR_CHECK(i2c_master_start(m_i2c_cmd));
ESP_ERROR_CHECK(i2c_master_write_byte(m_i2c_cmd, addrWrite, true));
ESP_ERROR_CHECK(i2c_master_write_byte(m_i2c_cmd, regAddr, true));
ESP_ERROR_CHECK(i2c_master_start(m_i2c_cmd));
ESP_ERROR_CHECK(i2c_master_write_byte(m_i2c_cmd, addrRead, true));
ESP_ERROR_CHECK(i2c_master_read(m_i2c_cmd, data, 1, I2C_MASTER_LAST_NACK));
ESP_ERROR_CHECK(i2c_master_stop(m_i2c_cmd));
err = (i2c_master_cmd_begin(I2C_PORT_NBR, m_i2c_cmd, 100));
Now...to see if I can actually read the registers...
Re: error when trying to use I2C
Here is how I read a byte, and later how I read a word.
Read Byte
Code: Select all
esp_err_t readByte(uint8_t icAddress, uint8_t registerAddress, uint8_t *data)
{
    esp_err_t espRc;
    i2c_cmd_handle_t cmd = i2c_cmd_link_create();

    i2c_master_start(cmd);
    i2c_master_write_byte(cmd, icAddress << 1 | WRITE_BIT, ACK_CHECK_EN);
    i2c_master_write_byte(cmd, registerAddress, ACK_CHECK_EN);
    // Setup the read
    i2c_master_start(cmd);
    i2c_master_write_byte(cmd, icAddress << 1 | READ_BIT, ACK_CHECK_EN);
    i2c_master_read_byte(cmd, data, NACK_VAL);
    i2c_master_stop(cmd);
    // Shoot it out
    espRc = i2c_master_cmd_begin(I2C_NUM_0, cmd, 1000 / portTICK_RATE_MS);
    i2c_cmd_link_delete(cmd);
    return espRc;
}
Read Word
Code: Select all
esp_err_t readWord(uint8_t icAddress, uint8_t registerAddress, uint16_t *data)
{
    esp_err_t espRc;
    uint8_t msb, lsb;
    i2c_cmd_handle_t cmd = i2c_cmd_link_create();

    i2c_master_start(cmd);
    i2c_master_write_byte(cmd, icAddress << 1 | WRITE_BIT, ACK_CHECK_EN);
    i2c_master_write_byte(cmd, registerAddress, ACK_CHECK_EN);
    // Setup the read
    i2c_master_start(cmd);
    i2c_master_write_byte(cmd, icAddress << 1 | READ_BIT, ACK_CHECK_EN);
    i2c_master_read_byte(cmd, &msb, ACK_VAL);
    i2c_master_read_byte(cmd, &lsb, NACK_VAL);
    i2c_master_stop(cmd);
    // Shoot it out
    espRc = i2c_master_cmd_begin(I2C_NUM_0, cmd, 1000 / portTICK_RATE_MS);
    i2c_cmd_link_delete(cmd);

    *data = (uint16_t)( (msb << 8) | lsb );
    return espRc;
}
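In case it helps, here's roughly how I'd expect to call readWord() against the Maxim fuel gauge. Note the 7-bit address (0x6C >> 1 = 0x36); the register constant below is just a placeholder, not something from the datasheet:
Code: Select all
#define FUEL_GAUGE_7BIT_ADDR  0x36    // 0x6C >> 1
#define SOME_FG_REGISTER      0x00    // placeholder -- substitute the register you actually want

uint16_t value;
esp_err_t rc = readWord(FUEL_GAUGE_7BIT_ADDR, SOME_FG_REGISTER, &value);
if (rc == ESP_OK) {
    ESP_LOGI(TAG, "register 0x%02x = 0x%04x", SOME_FG_REGISTER, value);
} else {
    ESP_LOGE(TAG, "readWord failed: %d", rc);
}
One thing worth double-checking is byte order: if the part sends the low byte first (many Maxim fuel gauges do), the msb/lsb assembly in readWord() would need to be swapped.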
Re: error when trying to use I2C
Interesting...your read protocol includes that master_start in the middle of the message, too. Maybe that's not as unusual as I thought. Of course, my memory is probably somewhat faded; the last time I did any I2C stuff, Bush was president...
I'm still not sure I trust the readings I'm getting, but more experimentation will tell.
Re: error when trying to use I2C
Ya... when it comes to reading, you first write to the chip, basically telling it "Hey, I want to read from this register next." Then you issue the restart, because if you issued a stop you would have effectively told the chip we're done, which we are not yet. After the restart, you issue the read (i.e. the READ_BIT). Then, when you're done, you issue the stop to complete the process.
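Just to restate that in command-link terms (a sketch only; addr7 and regAddr below are placeholders, not values from the Maxim datasheet):
Code: Select all
uint8_t addr7 = 0x36;      // placeholder 7-bit slave address (e.g. 0x6C >> 1 for the fuel gauge)
uint8_t regAddr = 0x00;    // placeholder register
uint8_t value;

i2c_cmd_handle_t cmd = i2c_cmd_link_create();
i2c_master_start(cmd);                                              // START
i2c_master_write_byte(cmd, (addr7 << 1) | I2C_MASTER_WRITE, true);  // address the chip in write mode...
i2c_master_write_byte(cmd, regAddr, true);                          // ..."I want to read from this register next"
i2c_master_start(cmd);                                              // restart (no STOP -- we're not done yet)
i2c_master_write_byte(cmd, (addr7 << 1) | I2C_MASTER_READ, true);   // same chip, now in read mode
i2c_master_read_byte(cmd, &value, I2C_MASTER_NACK);                 // NACK the last (only) byte
i2c_master_stop(cmd);                                               // STOP to complete the process
i2c_master_cmd_begin(I2C_NUM_0, cmd, pdMS_TO_TICKS(1000));
i2c_cmd_link_delete(cmd);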
Re: error when trying to use I2C
I had to look into this some more. I'm using some wrapper code that another consultant wrote for the company before I signed on. I also looked at the IDF docs. I noticed that the IDF docs example for a read doesn't set the sub-address, and doesn't have the two starts.

mzimmers wrote: Interesting...your read protocol includes that master_start in the middle of the message, too. Maybe that's not as unusual as I thought. Of course, my memory is probably somewhat faded; the last time I did any I2C stuff, Bush was president...
I'm still not sure I trust the readings I'm getting, but more experimentation will tell.
https://docs.espressif.com/projects/esp ... e4ce9f.png
You have the two starts, and my wrapper code has them as well, because you are programming the sub-address before the read.
John
Re: error when trying to use I2C
Hi John - yeah, that made it a bit of a challenge to map the ESP calls to the Maxim device protocol. Fortunately, staring at the Maxim drawing long enough eventually got it to sink in. What you're referring to as sub-addresses are called "registers" in the Maxim data sheet. Not all I2C devices have multiple registers, I guess, and the example in the ESP docs is a rather simple one.
Re: error when trying to use I2C
Seems like every I2C chip I've used had registers. The docs probably would be more helpful if the example had them as well.

mzimmers wrote: Hi John - yeah, that made it a bit of a challenge to map the ESP calls to the Maxim device protocol. Fortunately, staring at the Maxim drawing long enough eventually got it to sink in. What you're referring to as sub-addresses are called "registers" in the Maxim data sheet. Not all I2C devices have multiple registers, I guess, and the example in the ESP docs is a rather simple one.
John A
Re: (solved) error when trying to use I2C
Well, I'm seeing problems with my I2C again. This might be due to hardware changes, but I thought I'd post here to see if anyone had an idea on what might be going on.
My call to i2c_master_cmd_begin() is returning ESP_ERR_TIMEOUT. There are 3 instances of this return code in the routine, but I traced my occurrence to the excerpt below (beginning at line 1271 in 3.1.1):
I should point out that my calls return (almost) immediately, even though I specify a relatively long wait time. Again, it's very possible that this is due to a hardware error of some kind, but based on what I've posted, does anyone have an idea what might be happening?
Code: Select all
portBASE_TYPE evt_res = xQueueReceive(p_i2c->cmd_evt_queue, &evt, wait_time);
if (evt_res == pdTRUE) {
    if (evt.type == I2C_CMD_EVT_DONE) {
        if (p_i2c->status == I2C_STATUS_TIMEOUT) {
            // If the I2C slave are powered off or the SDA/SCL are connected to ground, for example,
            // I2C hw FSM would get stuck in wrong state, we have to reset the I2C module in this case.
            i2c_hw_fsm_reset(i2c_num);
            clear_bus_cnt = 0;
            ret = ESP_ERR_TIMEOUT;
Thanks...
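If it helps anyone hitting the same thing: as far as I can tell, the ESP_ERR_TIMEOUT in that branch comes from the controller's hardware timeout (SCL held or stretched too long, slave unpowered, lines shorted, etc.), not from the wait time you pass to i2c_master_cmd_begin(), which would explain the near-immediate return. Also note that the last argument to i2c_master_cmd_begin() is in RTOS ticks, so 100 means 100 ticks, not 100 ms. The hardware timeout can be adjusted with i2c_set_timeout(); here's a sketch, with the units and limits as I understand them (please verify against your IDF version):
Code: Select all
#include "driver/i2c.h"
#include "esp_log.h"

static const char *TAG = "i2c_tmo";

// Sketch only: after i2c_driver_install(), raise the hardware timeout.
// On the classic ESP32 driver the value is in APB (80 MHz) clock cycles and the
// underlying register is 20 bits wide, so ~0xFFFFF (~13 ms) is about the ceiling.
void bump_i2c_timeout(void)
{
    int cur = 0;
    i2c_get_timeout(I2C_NUM_0, &cur);
    ESP_LOGI(TAG, "default I2C timeout: %d APB cycles", cur);
    ESP_ERROR_CHECK(i2c_set_timeout(I2C_NUM_0, 0xFFFFF));
}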