Fairly large (>64k) const arrays

adrian650
Posts: 5
Joined: Tue May 28, 2024 8:28 am

Fairly large (>64k) const arrays

Postby adrian650 » Tue May 28, 2024 8:55 am

Hi, first time posting after a fair bit of reading as a guest. I've searched fairly thoroughly here and in other resources but haven't been able to find a solution or a test to try.

I've written an open-world 3D renderer in Arduino C (I appreciate Arduino is really C++, but the code is struct-based rather than using true objects) and, after a lot of effort, it works. However, I've run into a problem with 'large' worlds of around 10k triangles.

I build the world in a workflow from Blender via GoogleScript into const arrays of indices and vertex floats (these may be thousands of elements each, and my 'Vec3f' type is 3 floats, hence 12 bytes) in a .h file that is compiled into the code. It works as it should up to nearly 9k triangles. With a larger world I got the 'dangerous relocation' error, so I added the '-mtext-section-literals' flag to the c, cpp and S extra flags in the Arduino platform file. That changed the error to the L32R operand being out of range. The longcalls flag is set by default in the Arduino options.

I largely understand the assembly-language theory here, but I don't see why a 32-bit processor and toolchain can't deal with this. It is frustrating to have a compile-time error that doesn't seem to be due to poor coding, especially as the error message doesn't indicate which out-of-range literal is causing the problem!

My code allows world maps to be broken up, which reduces the size of each one, but the same problem occurs.

Any ideas would be gratefully received.

I have thought about quitting Arduino, but ultimately whatever IDE I use has ESP-IDF under the hood, so I can't see an instant win.

Thanks!

ESP_Sprite
Posts: 9708
Joined: Thu Nov 26, 2015 4:08 am

Re: Fairly large (>64k) const arrays

Postby ESP_Sprite » Wed May 29, 2024 1:12 am

That is odd. How do you declare those const arrays? More generally, can you share your project?

MicroController
Posts: 1688
Joined: Mon Oct 17, 2022 7:38 pm
Location: Europe, Germany

Re: Fairly large (>64k) const arrays

Postby MicroController » Wed May 29, 2024 10:00 pm

I don't have an explanation of what's going wrong in your case, but, as an alternative, you may want to consider storing the world data in a dedicated data partition. This way, it cannot in any way interfere with building your application, and you can change/update the world data on the chip without having to rebuild the application.
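For reference, a dedicated data partition is just an extra row in a custom partitions.csv. The "world" label, the subtype value and the sizes below are illustrative, not from the original project; any spare numeric subtype can be used for raw data:

```
# Name,   Type, SubType, Offset,  Size
nvs,      data, nvs,     0x9000,  0x6000
factory,  app,  factory, 0x10000, 0x180000
world,    data, 0x40,    ,        0x100000
```

The application can then locate the partition at run time by its label.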

lbernstone
Posts: 826
Joined: Mon Jul 22, 2019 3:20 pm

Re: Fairly large (>64k) const arrays

Postby lbernstone » Wed May 29, 2024 10:47 pm

Sorry to sound like a broken record, but we really need to see how you are mapping the disk data into your structures.
Flash sectors are 4096 bytes, 32-bit aligned. It will be best if you access them as such if you read the disk raw.
C-style const arrays (e.g., from xxd) are uint8_t arrays. I'm not entirely sure how the mapper is aligning read-only data (.rodata), but if it is translating your 8-bit array into 32-bit aligned memory, that might be taking a lot more space than you expect. I use const arrays all the time to deliver web assets (.js libraries) up to 100KB, so it isn't a problem inherent to a large array.
Post very minimal code that shows how you are using/casting the data for your model and, if possible, that generates the error, so we can get a better idea of what is going on.

adrian650
Posts: 5
Joined: Tue May 28, 2024 8:28 am

Re: Fairly large (>64k) const arrays

Postby adrian650 » Sun Jun 09, 2024 7:24 pm

Thanks, I've not yet managed to produce a minimal failing example.

The world is set with arrays like this for the vertex coordinates; this is the largest array:

  const Vec3f cvertices[NVERTS + 1] = { {0.f,0.f,0.f}, // Dummy vertex to allow OBJ standard indexing

  {40.000000f, 0.000000f, 90.000000f}, {50.000000f, 0.000000f, 90.000000f}, {40.000000f, 0.000000f, 80.000000f},
  {50.000000f, 0.000000f, 80.000000f}, {42.000000f, 0.000000f, 88.000000f}, {44.801487f, 10.000000f, 85.198532f},
  {42.000000f, 0.000000f, 82.000000f}, ...
with similar arrays for faces and colour attributes:
  const uint16_t cnvertices[1170] = {
  2, 3, 1, 2, 4, 3, 8, 10, 12,
  20, 13, 16, 19, 16, 15, 14, 19, 15,
  17, 14, 13, 43, 26, 41, 21, 27, 22,
  27, 23, 22, 35, 33, 28, 29, 26, 25, ...
As I have a system for overlaying worlds, these are referenced from a structure. I'd have thought that, since I'm not loading a literal directly, the full 32-bit address could be pulled from the WorldLayout without the range limitations of the L32R instruction:
  struct WorldLayout
  {
    // The constants of number of triangles etc aren't needed as they are fixed via chunks
    const uint32_t type;            // what type of layout is this?
    const uint32_t frames;          // how many frames are in a flip-book style animation
    const uint32_t nvertices_num;   // number of vertices per flip - must be constant
    const uint32_t frame_time;      // the number of ms between each frame in a flip book
    Vec3f const * const vertices;         // Pointer to an array of Vec3f vertices - these are all const ptrs to const values
    uint16_t const * const nvertices;     // Pointer to an array of indices to the vertex coordinates
    uint16_t const * const texel_verts;   // Pointer to an array of indices to texture UV coordinates
    Vec2f const * const vts;              // Pointer to an array of UV coordinates
    faceMaterials const * const palette;  // Pointer to an array of faceMaterials that's a palette
    uint16_t const * const attributes;    // Pointer to an array that selects from the palette per face
    ChunkFaces const * const TheChunks;   // Pointer to the array of lists of faces per chunk
    const ChunkArr ChAr;            // The arrangement of chunks used in this layout
  };

  const WorldLayout world[NUMBER_OF_LAYOUTS] =
  {
  {
    LO_PLAIN,
    0, // No value needed
    0,
    0,
    cvertices,
    cnvertices,
    ctexel_verts,
    cvts,
    cpalette,
    cattributes,
    cTheChunks,
    20,70,4,3,10
  }
  };
The data is then passed to the projector and rasteriser as a pointer:
          // set the pointer to the layout in use
          this_world_ptr = &world[worlds]; // ERROR SITE?
and utilised in expressions such as:
  const Vec3f& v0 = layo_ptr->vertices[layo_ptr->nvertices[idx * 3]];
The code works spot on until the vertex array approaches 64k bytes; as mentioned, that is equivalent to around 8,500 triangles. I'd understand it better if it failed ABOVE 64k!

I appreciate that I could work from a filesystem, but at present it seemed simpler to stay in DROM rather than parse from a file-like arrangement into PSRAM arrays. I did try to copy the arrays into PSRAM, but got the same assembler error.

MicroController
Posts: 1688
Joined: Mon Oct 17, 2022 7:38 pm
Location: Europe, Germany

Re: Fairly large (>64k) const arrays

Postby MicroController » Fri Jun 14, 2024 8:33 pm

You don't have to use files or a filesystem; you can flash any data (structure) you want into a data partition and then mmap it into the CPU's address space...
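The mmap route suggested above might look roughly like this in ESP-IDF. This is a sketch for the target only (it depends on the ESP-IDF SDK), and "world" is an assumed partition label, not from the original project:

```c
// Sketch: map a raw data partition into the CPU's address space (ESP-IDF v5 names).
// "world" is an assumed partition label; error handling is trimmed for brevity.
#include "esp_partition.h"

const void *map_world_partition(void)
{
    const esp_partition_t *part = esp_partition_find_first(
        ESP_PARTITION_TYPE_DATA, ESP_PARTITION_SUBTYPE_ANY, "world");
    if (part == NULL)
        return NULL;

    const void *data = NULL;
    esp_partition_mmap_handle_t handle;
    if (esp_partition_mmap(part, 0, part->size,
                           ESP_PARTITION_MMAP_DATA, &data, &handle) != ESP_OK)
        return NULL;
    return data; // readable like any const data in flash from here on
}
```

After this, the returned pointer can be cast to whatever structure the partition image was written with.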

adrian650
Posts: 5
Joined: Tue May 28, 2024 8:28 am

Re: Fairly large (>64k) const arrays

Postby adrian650 » Tue Jun 18, 2024 3:47 pm

Thanks, I'll take a look at that; it might be simpler than FatFs, although FatFs has the merit that an image can be built on the host from 'normal' files.

I have managed to reproduce the error from a fairly minimal program. I took the arrays and then made repeated small routines to memcpy them to malloc'd PSRAM. As I approached the full set of arrays, the L32R error appeared. It isn't due to a specific array, as I could swap them around or leave some out and still get the same error. The error relates to reading the const arrays, not the malloc'd space, as I kept that the same.
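The shape of that repro is roughly the following. The array contents are illustrative stand-ins; on the ESP32-S3 the allocation would be heap_caps_malloc(size, MALLOC_CAP_SPIRAM), but plain malloc is used here so the sketch compiles anywhere:

```c
/* Minimal shape of the repro: copy a const (DROM) array into allocated RAM.
   Array contents are illustrative, not the original world data. */
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

static const uint16_t cnvertices[] = { 2, 3, 1, 2, 4, 3, 8, 10, 12 };

static uint16_t *copy_to_ram(const uint16_t *src, size_t count)
{
    /* on target: heap_caps_malloc(count * sizeof *dst, MALLOC_CAP_SPIRAM) */
    uint16_t *dst = malloc(count * sizeof *dst);
    if (dst)
        memcpy(dst, src, count * sizeof *dst); /* reads the const array - the access the L32R error points at */
    return dst;
}
```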

BTW, I moved the #include into app_main() and got even more L32R errors.

MicroController
Posts: 1688
Joined: Mon Oct 17, 2022 7:38 pm
Location: Europe, Germany

Re: Fairly large (>64k) const arrays

Postby MicroController » Tue Jun 18, 2024 9:06 pm

The partition API also has convenience functions for mmapping, e.g. esp_partition_mmap(...). But yes, you'll have to write the (binary) data in the format you want yourself. You may be able to modify the scripts you're using to generate the C arrays to output binary; or you could compile the generated arrays into a little C program to run on the PC which dumps the binary data from the arrays into a file. If you're into Python, you can also do it in Python, somewhat like e.g. the IDF's gen_crt_bundle.py does.
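The "little C program on the PC" idea might be sketched like this: compile the generated arrays together with a dumper and run it on the host to emit a raw binary for the partition. The header layout, function name and stand-in data below are assumptions for illustration, not the thread author's format:

```c
/* Host-side dumper sketch: writes element counts followed by the raw
   arrays to a binary file. Vec3f/cvertices/cnvertices are the names
   from this thread; tiny stand-in data keeps the sketch self-contained. */
#include <stdio.h>
#include <stdint.h>

typedef struct { float x, y, z; } Vec3f;

static const Vec3f cvertices[] = {
    {0.f, 0.f, 0.f},                      /* dummy vertex for OBJ-style indexing */
    {40.f, 0.f, 90.f}, {50.f, 0.f, 90.f},
};
static const uint16_t cnvertices[] = { 2, 3, 1, 2, 4, 3 };

/* returns the number of bytes written, or -1 on failure */
long write_world(const char *path)
{
    FILE *f = fopen(path, "wb");
    if (!f) return -1;
    uint32_t nverts = sizeof cvertices / sizeof cvertices[0];
    uint32_t nidx   = sizeof cnvertices / sizeof cnvertices[0];
    fwrite(&nverts, sizeof nverts, 1, f);          /* simple header: counts */
    fwrite(&nidx,   sizeof nidx,   1, f);
    fwrite(cvertices,  sizeof cvertices,  1, f);   /* then the raw arrays */
    fwrite(cnvertices, sizeof cnvertices, 1, f);
    fclose(f);
    return (long)(sizeof nverts + sizeof nidx + sizeof cvertices + sizeof cnvertices);
}
```

The resulting file can be flashed straight into the data partition with esptool or the IDF build system.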

adrian650
Posts: 5
Joined: Tue May 28, 2024 8:28 am

Re: Fairly large (>64k) const arrays

Postby adrian650 » Thu Oct 24, 2024 7:33 am

A few months later....

I took the plunge and set up partitions for my 3D world and textures independent of the code. I rewrote my world 'compiler' in Node.js to take a .obj from Blender and make a structured binary that is self-contained. As you say, 'finding' the partitions is easy from IDF. Of course, I needed some initialisation to copy the pointers out of the partition and, more importantly, correct them for their mapped destinations.
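The fix-up step described above can be sketched as follows. This assumes (the thread doesn't show the actual format) that the compiler writes byte offsets relative to the start of the partition image, which are turned into real pointers once the mapped base address is known; all names here are illustrative:

```c
/* Sketch: convert offsets stored in the flashed image into pointers
   valid at the partition's mapped address. Names are illustrative. */
#include <stdint.h>

typedef struct { float x, y, z; } Vec3f;

/* as stored in flash: offsets instead of pointers */
typedef struct {
    uint32_t vertices_off;   /* byte offset of the Vec3f array */
    uint32_t nvertices_off;  /* byte offset of the index array */
} StoredLayout;

/* as used at run time */
typedef struct {
    const Vec3f    *vertices;
    const uint16_t *nvertices;
} WorldLayout;

/* base = address returned by esp_partition_mmap() */
static void fixup_layout(const void *base, const StoredLayout *in, WorldLayout *out)
{
    out->vertices  = (const Vec3f *)((const uint8_t *)base + in->vertices_off);
    out->nvertices = (const uint16_t *)((const uint8_t *)base + in->nvertices_off);
}
```

Storing offsets rather than absolute addresses keeps the image position-independent, so the partition can move without rebuilding it.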

I now have a world > 1MB in its partition and can update the world, textures or code independently except when I've altered the binary structure.

Definitely the right way to go and it also shows that ESP32S3 can handle long range pointers as you'd expect. It seems that the compiler is perhaps not so good at doing that (or I've misunderstood the various flags).
