Huge array size problem
Hello,
I am trying to create a matrix of size 110,000 x 110,000 (91.1 GB) that should contain small double values.
I will be using the values in this huge matrix as inputs to a specific algorithm. I am aware that this exceeds the maximum array size and cannot fit into memory, so I am looking for a fast way to access this data (the algorithm reads it an enormous number of times) while overcoming the size problem.
Could you propose a solution for this problem?
I am using MATLAB R2016b.
This is the code I used to generate the matrix:
clc
clear
load('Gw8_matrices.mat','Gw8');
size_group = 110592;
distance_table_gw8 = zeros(size_group, size_group);
% now fill the table
for i = 1:size_group
    i   % display progress
    for j = 1:size_group
        temp1 = Gw8(:,:,i) - Gw8(:,:,j);
        temp2 = norm(temp1, 'fro');       % Frobenius norm of the difference
        distance = round(temp2, 4);       % most important: keep 4 decimals
        distance_table_gw8(i,j) = distance;
    end
end
save('distance_table_gw8.mat','distance_table_gw8','-v7.3');
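(For reference, the 91.1 GB figure follows directly from the array size, assuming 8 bytes per double element:)
bytes_needed = size_group^2 * 8;   % 110592^2 elements, 8 bytes each
bytes_needed / 2^30                % = 91.125, i.e. the ~91.1 GB above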
Thank you.
1 Comment
rough93
on 25 Sep 2019
Could you not just have several sub-matrices and access the one you need? You could set up two for loops to create a square grid of sub-matrices and insert your matrix creation in the middle (with smaller parameters, of course).
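A minimal sketch of that blockwise idea, reusing the variable names from the question and an arbitrary block size: each block is computed in memory and written straight into a -v7.3 MAT-file with matfile, so the full 110592-by-110592 table never has to exist in RAM. It still ends up as ~91 GB on disk, and block reads/writes are much slower than in-memory access.
load('Gw8_matrices.mat','Gw8');
size_group = 110592;
blk = 4096;                                   % block size; tune to the available memory
G = reshape(Gw8, [], size_group);             % each column is one flattened Gw8 slice

m = matfile('distance_table_gw8.mat', 'Writable', true);
m.distance_table_gw8(size_group, size_group) = 0;   % allocate the full variable on disk

for i0 = 1:blk:size_group
    ri = i0:min(i0+blk-1, size_group);
    for j0 = 1:blk:size_group
        rj = j0:min(j0+blk-1, size_group);
        block = zeros(numel(ri), numel(rj));
        for a = 1:numel(ri)
            % Frobenius distance between slice ri(a) and every slice in rj
            block(a,:) = sqrt(sum((G(:,rj) - G(:,ri(a))).^2, 1));
        end
        m.distance_table_gw8(ri, rj) = round(block, 4);   % same rounding as the original code
    end
end
Whether this beats recomputing entries on the fly (as in the accepted answer below) depends on how often, and in what pattern, the algorithm reads the table.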
Accepted Answer
Matt J
on 25 Sep 2019
Edited: Matt J
on 25 Sep 2019
You probably should not pre-store this matrix; instead, just (re)compute chunks of it as you need them. Note, for example, that the complete j-th column of your matrix can be obtained quite efficiently as follows:
G = reshape(Gw8, [], size_group);
jthColumn = round( vecnorm(G - G(:,j), 2, 1), 4 ).';
and will be even faster on the GPU if you have the Parallel Computing Toolbox.
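A usage sketch (not part of the original answer): vecnorm only arrived in R2017b, so on R2016b the equivalent sqrt(sum(...)) form below can be used instead; getColumn is just an illustrative name, and the gpuArray line assumes the Parallel Computing Toolbox with a supported GPU.
G = reshape(Gw8, [], size_group);       % do the reshape once, outside the algorithm's main loop
% G = gpuArray(G);                      % optional: keep G on the GPU (Parallel Computing Toolbox)

% column j of the distance table: Frobenius distances from every slice to slice j,
% rounded to 4 decimals like the original code
getColumn = @(j) round( sqrt(sum((G - G(:,j)).^2, 1)), 4 ).';

col = getColumn(17);                    % 110592x1, equal to distance_table_gw8(:,17)
If G is moved to the GPU, the result comes back as a gpuArray; wrap the expression in gather() when a plain double vector is needed.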
More Answers (0)