Index exceeds the number of array elements (2).

4 views (last 30 days)
I am facing an error in this code; kindly guide me.
The issue is at:
P1 = P2(1:L/2+1);
Kindly help me solve the error.
Please find the attachment for the full code and datasets.
The error I am getting is at the arrowed line shown in the picture below:
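For context, a helper named computeFFT usually follows the one-sided amplitude-spectrum pattern from the MATLAB fft documentation. The body below is an assumption (the poster's actual function is in the attachment), but it reproduces the failing line: P2(1:L/2+1) throws "Index exceeds the number of array elements" whenever the L passed in is larger than the number of samples actually in the signal.

```matlab
function P1 = computeFFT(signal, L)
    % Hypothetical reconstruction (assumption, not the poster's verified code):
    % the standard one-sided amplitude spectrum. If L exceeds numel(signal),
    % the line P2(1:L/2+1) indexes past the end of P2 and errors.
    Y  = fft(signal);              % spectrum has numel(signal) bins
    P2 = abs(Y/L);                 % two-sided amplitude spectrum
    P1 = P2(1:L/2+1);              % keep the non-negative frequencies
    P1(2:end-1) = 2*P1(2:end-1);   % double interior bins to conserve power
end
```

With a 2-element signal and L = 100, P2 has 2 elements while the code asks for elements 1 through 51, which matches the reported "Index exceeds the number of array elements (2)".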
  3 Comments
Aniket Manjare on 20 Dec 2020
Can you please guide me on what changes I have to make in the above code? Below are data snapshots of the variables:
NoOfFailureModes = 2
NoOfDataPoints = 86
Here I am using the computeFFT function:
for i = 1:NoOfFailureModes
    for j = 1:NoOfDataPoints
        data{i}.X_FFT(j,:) = computeFFT(data{i}.X(j,:)-mean(data{i}.X(j,:)),L);
    end
end
This is the view of data{i}; data{1} has data snapshots 1 through 86.
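If L is hard-coded to 100 while each row of data{i}.X holds fewer samples, one mechanical way to avoid the out-of-range index is to derive L from the data itself (a sketch using the variable names from the snippet above; as the accepted answer explains, the result is still not meaningful when each row holds only one reading):

```matlab
% Sketch (assumption: data, NoOfFailureModes, NoOfDataPoints as above).
% Deriving L from the actual row width removes the indexing error,
% though an fft over a single sample carries no frequency information.
for i = 1:NoOfFailureModes
    L = size(data{i}.X, 2);   % actual samples per row, not a hard-coded 100
    for j = 1:NoOfDataPoints
        row = data{i}.X(j,:) - mean(data{i}.X(j,:));   % remove DC offset
        data{i}.X_FFT(j,:) = computeFFT(row, L);
    end
end
```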

Sign in to comment.

Accepted Answer

Walter Roberson on 20 Dec 2020
The code you are adapting, which you posted in another question, was designed for a different file structure -- one in which each field of the input file has many different points separated by commas. The original code put together those multiple points into table entries with two sections of X values and two sections of Y values and two sections of Z values, for each input row. The code you posted here assumes that there are 100 columns for each of those entries -- so that each input row of the file encoded 100 values in each of the six fields.
However, your files are not like that at all. Instead you have exactly one entry for each of the fields. But your code has not been changed appropriately.
The code takes the fft, discards the second half (corresponding to the negative frequencies), and doubles the values in the first half (preserving the total power). But now, with only one data entry per row for each of the six fields, and with the mean being subtracted out before the fft, if you were to adjust L to take the first half of the data that is actually there, the result would just be 0 (because the fft of data with mean 0 starts with 0 in the results).
The fft section of the code is therefore useless, and the indexing error could be "fixed" by having computeFFT return 0 all the time.
The original data expected 100 sensor readings at a time. You don't have that: you have one sensor reading at a time. And there is no point in fft'ing one sensor reading.
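The last point can be checked directly in MATLAB: subtracting the mean from a single reading leaves exactly zero, so its fft is zero as well.

```matlab
x = 0.04;              % one sensor reading (value taken from the example row)
y = fft(x - mean(x));  % the mean of a single sample is the sample itself
% y is exactly 0: one point has no frequency content
```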
  3 Comments
Walter Roberson on 22 Dec 2020
That program is unsuited for that data file.
The original program you were using expected a data file that looked like
2020-03-23 13:02:50 UTC,1,"0.04,0.38,-0.06,<Another97entries>","0.015,0.018,-0.01,<Another97entries>",<fourMoreQuotedFields>
It is expecting 100 sensor readings for each of field1, field2, up to field6. Each of the fields is treated as a time course: the mean is subtracted, the fft is done, the first half is taken, and so on. Then it moves on to the next row in the file, which would also be expected to have those six quoted fields of 100 entries each. The code expects at least 86 such lines in the file.
But your actual file has only 1 sensor reading for each of field1, field2, up to field6, and when you try to process the data the same way as before, your calculation fails.
Is it possible that you now have files with 100 data lines (plus one header line) and that you have 86 such files? Or that you have 86 data lines (plus one header line) but you have 100 such files and that each file has exactly the same times?
Aniket Manjare on 24 Dec 2020
No sir. When I upload my sensor data from the NodeMCU to ThingSpeak, I get the kind of CSV file shown below. It is a 3-axis accelerometer, i.e. field1 = X, field2 = Y, field3 = Z. My project is predictive maintenance with vibration analysis.
Would you please guide me through this step of preprocessing the data, identifying condition indicators, and training a model with the data shown below?
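One possible starting point for the preprocessing step, given the file layout described above, is to treat each ThingSpeak field column as the time series, so the fft runs across the 86 readings instead of across a single reading per row. This is a sketch of an assumed workflow, not code from the thread; the file name 'feeds.csv' and the column names field1..field3 are assumptions about the exported CSV.

```matlab
% Sketch, assuming a ThingSpeak CSV export with columns field1..field3.
T  = readtable('feeds.csv');       % hypothetical export file name
x  = T.field1;                     % X-axis accelerometer over time
L  = numel(x);                     % 86 readings -> an actual time series
X1 = computeFFT(x - mean(x), L);   % spectrum across readings, not per row
```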

Sign in to comment.

More Answers (0)

Release: R2019a
