Driving an 8-bit parallel 8080 bus using I2S
Re: Driving an 8-bit parallel 8080 bus using I2S
It is interesting that I can't find an example of driving a TFT in parallel with I2S; however, this is probably quite easy.
There are examples of streaming even 24-bit data in parallel. I must try it with one of my larger LCDs, like the ILI9488.
So far I have used SPI, but it is a bit slow for larger amounts of data. An 8-bit LCD should be quite fast using I2S; it is similar to driving LED panels.
Re: Driving an 8-bit parallel 8080 bus using I2S
I actually found the speed to be lacking. The clock tops out at roughly 18 MHz for parallel I2S, and that is definitely not enough for a large display at 8 bits per channel. And I need the extra pins, so going 16-bit is a no-go. QSPI and a small FPGA are my next bet, and I'm experimenting with the iCE40.
Re: Driving an 8-bit parallel 8080 bus using I2S
The clock limit for I2S should in theory be 80 MHz. The camera interface itself uses 20 MHz, so maybe you are doing something incorrectly.
It shouldn't be a problem even for 800x600, but it depends on what refresh rates you need.
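As a back-of-the-envelope sanity check of the bandwidth involved (my own quick numbers, assuming RGB565 at 2 bytes per pixel and ignoring command overhead), a tiny host-side C program:

    #include <stdio.h>

    /* Rough throughput estimates: SPI moves 1 bit per clock,
     * the 8080-style parallel bus moves 1 byte per clock. */
    int main(void)
    {
        printf("SPI @ 40 MHz        : %.1f MB/s\n", 40e6 / 8 / 1e6);   /* ~5  MB/s */
        printf("8-bit bus @ 20 MHz  : %.1f MB/s\n", 20e6 / 1e6);       /* 20  MB/s */

        printf("320x240 @ 60 fps needs %.1f MB/s\n", 320 * 240 * 2 * 60 / 1e6); /* ~9.2  */
        printf("480x320 @ 60 fps needs %.1f MB/s\n", 480 * 320 * 2 * 60 / 1e6); /* ~18.4 */
        printf("800x600 @ 30 fps needs %.1f MB/s\n", 800 * 600 * 2 * 30 / 1e6); /* ~28.8 */
        return 0;
    }

So a 20 MHz byte clock already covers a 480x320 panel at 60 fps, while 800x600 at 30 fps would need roughly 30 MB/s plus overhead.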
Re: Driving an 8-bit parallel 8080 bus using I2S
Output and input behave differently. It's not even that the output is garbled: the CPU simply hangs if you try to clock the I2S peripheral too high (in parallel mode, I mean).
Re: Driving an 8-bit parallel 8080 bus using I2S
Well, I had no problems with I2S at 40 MHz, which was the maximum clock for LEDC. However, you can source-clock it from the APLL, which according to the specs ranges from 16 to 128 MHz.
I noticed that at high frequencies you must supply a very stable 5V/3.3V rail with plenty of current (2 A) for the clock source to be accurate.
It also depends a lot on the quality of the ESP board.
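For reference, the APLL output frequency follows the formula documented in the ESP32 TRM and used by rtc_clk_apll_enable(); a minimal helper that just transcribes it (my own sketch, nothing ESP-specific beyond the formula itself):

    #include <stdint.h>

    /* APLL output frequency per the ESP32 TRM:
     *   f_apll = f_xtal * (4 + sdm2 + sdm1/256 + sdm0/65536) / (2 * (o_div + 2))
     * sdm0/sdm1/sdm2/o_div are the same arguments that
     * rtc_clk_apll_enable(enable, sdm0, sdm1, sdm2, o_div) takes. */
    static double apll_freq_hz(double f_xtal_hz,
                               uint32_t sdm0, uint32_t sdm1, uint32_t sdm2,
                               uint32_t o_div)
    {
        double num = 4.0 + sdm2 + sdm1 / 256.0 + sdm0 / 65536.0;
        return f_xtal_hz * num / (2.0 * (o_div + 2));
    }

    /* Example: apll_freq_hz(40e6, 0, 0, 6, 2) = 40e6 * 10 / 8 = 50 MHz. */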
Re: Driving an 8-bit parallel 8080 bus using I2S
Deouss wrote: ↑Thu Apr 04, 2019 8:51 pm
Well, I had no problems with I2S at 40 MHz, which was the maximum clock for LEDC. However, you can source-clock it from the APLL, which according to the specs ranges from 16 to 128 MHz.

Would you be so kind as to share your I2S setup, including values for the divisors and the clock source? I shared pretty much everything I managed to test in the GitHub issue. I'm aiming for a 60 FPS refresh, but a solid 30 Hz would be okay too.
Re: Driving an 8-bit parallel 8080 bus using I2S
I was using modified code from this example:
https://github.com/bitluni/ESP32Lib/blo ... 2S/I2S.cpp
But I must check again with 16- and 24-bit LCDs.
I would also check the reference manual for how to set up I2S properly.
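For anyone else going down this path, the core of what that kind of code does, as far as I can tell, is direct register setup of the second I2S peripheral. A very rough sketch (field names taken from soc/i2s_struct.h in ESP-IDF and may differ between versions; this is the idea only, not tested driver code, and GPIO matrix routing plus DMA descriptors are still needed on top):

    #include "soc/i2s_struct.h"   /* raw register structs for I2S0/I2S1 */

    static void i2s1_lcd_mode_sketch(void)
    {
        I2S1.conf2.val = 0;
        I2S1.conf2.lcd_en = 1;                    /* parallel ("LCD") output mode */

        I2S1.sample_rate_conf.val = 0;
        I2S1.sample_rate_conf.tx_bits_mod = 8;    /* 8-bit bus width */
        I2S1.sample_rate_conf.tx_bck_div_num = 2; /* bit-clock divider */

        I2S1.clkm_conf.val = 0;
        I2S1.clkm_conf.clka_en = 0;               /* 1 = clock from the APLL */
        I2S1.clkm_conf.clkm_div_num = 4;          /* integer part of the divider */
        I2S1.clkm_conf.clkm_div_a = 1;            /* fractional part: + b/a */
        I2S1.clkm_conf.clkm_div_b = 0;

        I2S1.fifo_conf.val = 0;
        I2S1.fifo_conf.tx_fifo_mod_force_en = 1;
        I2S1.fifo_conf.tx_fifo_mod = 1;           /* FIFO mode, see the TRM */
        I2S1.fifo_conf.dscr_en = 1;               /* feed the FIFO from DMA */

        /* ...then build lldesc_t DMA descriptors, point I2S1.out_link.addr
         * at them, set out_link.start and finally conf.tx_start. */
    }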
Re: Driving an 8-bit parallel 8080 bus using I2S
Deouss wrote: ↑Fri Apr 05, 2019 12:04 pm
I was using modified code from this example:
https://github.com/bitluni/ESP32Lib/blo ... 2S/I2S.cpp

I read the TRM extensively. As I mentioned here (https://github.com/espressif/esp-iot-so ... -445348857), "rtc_clk_apll_enable(true, 0, 0, _10_, 0) breaks, while rtc_clk_apll_enable(true, 0, 0, _9_, 0) works..." and "clkm_div_num set to 3 or 4 works, as long as the [DMA] linked list isn't used."
But I'm totally going to try Bitluni's parameters. Legendary hacker.
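For what it's worth, plugging those arguments into the helper sketched above (assuming the usual 40 MHz crystal):

    apll_freq_hz(40e6, 0, 0,  9, 0);   /* = 40e6 * (4 +  9) / 4 = 130 MHz (works)  */
    apll_freq_hz(40e6, 0, 0, 10, 0);   /* = 40e6 * (4 + 10) / 4 = 140 MHz (breaks) */

If the usable APLL output really tops out around the 128 MHz mentioned earlier, that would at least be consistent with 10 breaking while 9 sits right on the edge.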
Re: Driving an 8-bit parallel 8080 bus using I2S
I am no expert, but if you look at the rtc_clk_apll_enable() code (https://github.com/espressif/esp-idf/bl ... /rtc_clk.c, line 238), it seems that function only sets some I2C registers.
Setting the proper clock for I2S may also require manipulating I2S_CLKM_CONF_REG or similar.
When I have time I will investigate. I know 20 MHz worked fine for me.
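My understanding of how that register feeds the bus clock, going by the TRM (treat this as a sketch of the relation, not verified driver code): the module clock is the source clock divided by a fractional divider in I2S_CLKM_CONF_REG, and the bit clock is that divided again by tx_bck_div_num.

    /* f_i2s = f_source / (clkm_div_num + clkm_div_b / clkm_div_a)
     * f_bck = f_i2s / tx_bck_div_num
     * f_source is 160 MHz (PLL_D2) by default, or the APLL when clka_en is set. */
    static double i2s_bck_hz(double f_source_hz,
                             int clkm_div_num, int clkm_div_b, int clkm_div_a,
                             int tx_bck_div_num)
    {
        double f_i2s = f_source_hz /
                       (clkm_div_num + (double)clkm_div_b / clkm_div_a);
        return f_i2s / tx_bck_div_num;
    }

    /* Example: 160 MHz source, clkm_div_num = 4, b = 0, a = 1, bck_div = 2
     * -> 160 / 4 / 2 = 20 MHz, which matches the 20 MHz that worked for me. */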
Re: Driving an 8-bit parallel 8080 bus using I2S
Alright. So I saw there is a library written by bitluni for VGA, where he achieved 580 MHz of speed! Unfortunately for me, unlike all of his other code, this one is licensed under a ShareAlike license!
I wonder if it's somehow possible to do a clean-room implementation of the I2S+DMA config code (to avoid the license, since that's the only part I want) that can copy a whole framebuffer, or go scanline by scanline via a scanline buffer, onto an 8-bit parallel ILI9341 display at a solid 60 FPS, so that one core does only the game logic while the second core does the graphics rendering. That would be epic for something like an NES emulator. In fact, there is an NES emulator, but it runs on the slow SPI code and it's under the GPL license.
Anybody have any idea?
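To make the idea concrete, here's roughly the structure I have in mind for the two-core split. Everything below is my own placeholder pseudocode: render_scanline() and push_scanline_dma() stand in for whatever the emulator and the clean-room I2S+DMA routine would actually do.

    #include <stdint.h>
    #include "freertos/FreeRTOS.h"
    #include "freertos/task.h"
    #include "freertos/queue.h"

    #define LCD_WIDTH   320
    #define LCD_HEIGHT  240

    typedef struct {
        int      y;
        uint16_t pixels[LCD_WIDTH];   /* one RGB565 scanline */
    } scanline_t;

    static QueueHandle_t scanline_queue;

    /* Placeholders, not real APIs. */
    static void render_scanline(scanline_t *sl)         { /* emulator fills sl->pixels */ }
    static void push_scanline_dma(const scanline_t *sl) { /* hand sl->pixels to I2S+DMA */ }

    static void game_task(void *arg)      /* pinned to core 0 */
    {
        static scanline_t sl;
        for (;;) {
            for (int y = 0; y < LCD_HEIGHT; y++) {
                sl.y = y;
                render_scanline(&sl);
                xQueueSend(scanline_queue, &sl, portMAX_DELAY);
            }
        }
    }

    static void display_task(void *arg)   /* pinned to core 1 */
    {
        scanline_t sl;
        for (;;) {
            xQueueReceive(scanline_queue, &sl, portMAX_DELAY);
            push_scanline_dma(&sl);
        }
    }

    void app_main(void)
    {
        scanline_queue = xQueueCreate(4, sizeof(scanline_t));
        xTaskCreatePinnedToCore(game_task,    "game", 4096, NULL, 5, NULL, 0);
        xTaskCreatePinnedToCore(display_task, "lcd",  4096, NULL, 5, NULL, 1);
    }

Copying whole scanlines through a queue is obviously not free; with a real driver you would probably hand over buffer pointers or whole DMA descriptor chains instead, but the core split is the point.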
I mean, really, it's again one of those ramblings and raves: the Atmel AVR XMEGA ATXMEGA128A1U doesn't have a well-documented no-latch 1MB SRAM configuration, doesn't have a free compiler with 24-bit (or 32-bit) pointers, has two address spaces instead of just one, and cannot output fast 60 FPS video the way Uzebox (GPL licensed!) did with an even weaker microcontroller. Now the ESP32 has the framerate issue. And yet again, look at the Game Boy Advance! How come that console doesn't have any of these problems, yet it runs on 1995-ish hardware built around microprocessors rather than microcontrollers? Today we have the ESP32, which is far more powerful, and yet we cannot make a decent game console that matches at least SNES/GBA graphics. And if something does exist, it's always some GPL-licensed Linux stuff.

Now, I've grown out of the ESP32 phase of my life, but I'm using the ESP32 for a college IoT project, so I need a GUI; not a high-framerate one, but LittleVGL at 30 FPS on the second core with partial updates looks promising. I'm now more into making an SoC-based game console running on BSD and supporting all Godot Engine games, including the Godot Engine itself, all under a permissive license. Who knows what I'm going to have trouble with over there? Segmentation faults? OpenGL ES 3 compatibility problems? Driver licensing?

It's just an endless battle over what to use and what to make, while the ideas keep piling up for eternity and nothing gets implemented. Unless I get in touch with some Mandarin Chinese people who developed the Sony Ericsson K610i SoC to give me some in a (barely) legal way. Or if I get an FPGA-based SNES emulator/synthesizer to modify into a self-sustaining C/ASM toolchain with all the gamedev tools. Please tell me I'm not the only one.