Resizing Arrays


#1

So I am having some issues when trying to resize arrays, such as creating an 8x8 dense array. I get an error stating that the subarray is out of bounds when I use subarray = {1,8,1,8}, even though the actual array is {{1,8},8} for columns and rows. Has anyone else hit this issue using the C++ version?


#2

Can you please share your array schema (e.g., number of dimensions and dimension domains)?


#3

This is similar to what I am trying to achieve: a dense array output that outputs 36 numbers.
Domain domain(ctx);
domain.add_dimension(Dimension::create<int>(ctx, "rows", {{1, 6}}, 6))
      .add_dimension(Dimension::create<int>(ctx, "cols", {{1, 6}}, 6));

const std::vector<int> subarray = {1, 6, 1, 6};

std::vector<int> data(36);

Also, I noticed that the default size dimension is 1,000,000. Although I have tracked down this number, I cannot see where it is actually set in order to increase it. I am very new to TileDB and to databases in general.


#4

You are creating a 6x6 array but you request an 8x8 slice (larger than the domain). This is why you get the out-of-bounds error. Either create an 8x8 array or shrink your subarray to {1,6,1,6}.

Regarding the default dimension size, I am not sure I understand what you are referring to. There are no defaults for dimension domains; you can choose any type and value for the dimension domain. Also, array resizing is not supported in TileDB yet (once you create the array domain, you cannot change it).

I strongly recommend spending some time on our docs. No databases background is needed at all. Our quickstart (https://docs.tiledb.io/en/latest/quickstart.html) should get you up to speed with the very basics. We have plenty of examples and tutorials, from Beginner to Intermediate to Advanced.

I hope this helps.


#5

Sorry, I meant a 6 x 6. I did in fact make sure everything was changed. When I did a 5 x 5 it worked fine, but not when I tried a 6 x 6. Also, we are required to create an array that takes 20GB of array space, so any time I try to use anything larger than 1,000,000 I get a long int error.


#6

I also spent a lot of time going through the docs.


#7

Can you please share a minimal piece of code (that compiles and runs) that recreates the problem you are experiencing, along with the exact error message? Thanks.


#8


So what I have noticed is that if I start the whole database over again, I can use a larger number, but if I try to rerun with a new number I get the error shown in the picture. Can the database only handle one array schema at a time, so that it must be dumped by restarting in order to use a different one?


#9

The other issue is that it throws a conversion error.


#10

Nevermind figured out the conversion problem. Thank you for your help.


#11

In the function write_array above, you also need to set query.set_subarray({1,7,1,7}). I feel, though, that this example should not have returned an error, since the default subarray should have been the entire domain (which is what you are writing here). We will look into that. Thanks!


#12

Yeah, it seems that the default subarray is the array domain. So I am assuming that your problem was in conversion. Please let us know if there is another issue.


#13

Hello again,
I am having a few issues with reading from the array.
I have to use a very large array, twice the size of my memory, which makes the array huge. The read function throws an error; I tried using long int and plain int, but it still throws errors and I am not sure how to fix this. I need to be able to fill the array 100% non-empty, 50% non-empty, etc. I included screenshots of the problem.


#14

Can you please use explicit bitwidths for your types to avoid any confusion, instead of int and long int? For instance, explicitly set the dimension domains and the attribute to int64_t or int32_t. I see in your code that your attribute is set to int, but your data vector uses long int. On your system, long int is probably 64 bits.