Bank_inhouse returns empty files


About my metabolomic data: I usually run the "bank_inhouse" script with my variableMetadata file (with M/Z and RT columns) and my in-house library to match data.
This time the script returns an empty file. When I check the parameters used to run the script and my variableMetadata / in-house library, I don't see anything different from usual.
Any suggestions to help me find the issue?

Thanks in advance

Hi Maelle,
Sorry for this issue.
Could you share your history? My email to share via Galaxy is franck.giacomoni [AT]
Thank you


Hi, thank you so much, I just shared it; let me know if you can see it.
It is probably an easy mistake I made, but I cannot find it.

Thank you in advance,
best regards,


I found an error message in the job information: 'The 689th value: '164125' is greater than the maximum!'
I checked your variable_metadata file (dataset 422) and found it at line 690: 'M164T77 M164.125T77 164125 ...'
Just edit this value and enter a valid mass... Hopefully there is only one error in your file... Did you edit your file manually (with Excel, for example)?
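For anyone hitting the same error, a quick way to scan a variableMetadata-style file for such values is to check every mass against a plausible ceiling. This is a hypothetical sketch: the column name `mass` and the 1000 Da limit are assumptions, not the tool's actual parameters.

```python
import csv
import io

MAX_MZ = 1000.0  # assumed instrument upper limit; adjust to your acquisition range

def find_bad_masses(text, column="mass"):
    """Return (line_number, raw_value) pairs whose mass is not a
    parsable number or exceeds MAX_MZ in a tab-separated file."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    bad = []
    for lineno, row in enumerate(reader, start=2):  # line 1 is the header
        raw = row[column]
        try:
            value = float(raw)
        except ValueError:
            bad.append((lineno, raw))
            continue
        if value > MAX_MZ:
            bad.append((lineno, raw))
    return bad

# A mass that lost its decimal point stands out immediately:
sample = "name\tmass\trt\nM164T77\t164.125\t77\nM164T77b\t164125\t77\n"
print(find_bad_masses(sample))  # → [(3, '164125')]
```

Running this before uploading an edited file would have flagged line 690 right away.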

Wow, thank you so much for your quick help. As the job stayed green, I did not see the error message.
It works now.
I only used Excel at the end to convert my retention times to minutes. I checked, and that mass was wrong from the beginning (from my first variableMetadata file from FillPeak). I don't see any error in the parameters I set in W4M.
To perform the bank_inhouse identification I just changed the mass in Excel and uploaded my variableMetadata again.
Should I be worried about the validity of my workflow and run it again from the beginning?

Thank you very much for your help,
best regards,

Could you share with me the history where fillPeak was run? I can have a look to check why the decimal point is missing.
Best regards

Thank you for your answer, I will share it now. :slight_smile:
Actually it is quite a mess, because I ran the processing again up to fill peaks to see whether I had made a mistake or whether it would be repeated.
It happened on three analyses I ran on the same day, but not on the analysis I ran the day before (in case that helps you).

According to your history, there is no problem with the fillPeak outputs, nor with any of the variableMetadata files used in Galaxy before the one you edited outside Galaxy.

Please note that, depending on which software you use to open this kind of file (even if you make no changes yourself), the way the information is displayed (and thus saved afterwards if you modify the file) may be an interpretation rather than the original data.

For example, Excel rounds a number with many decimal places (e.g. "278.046188354492") for display (e.g. "278.04619"), although the original value is preserved as long as you do not save the file from Excel. The problem is that in some cases the interpretation changes the data with no way to recover the original value. For example, with Excel in English, a cell containing "278,125" may be converted automatically to "278125", because it "looks like" one of the ways numbers are written in English. Since Excel makes the change automatically when opening the file, it appears as if the value was "278125" from the start, when in fact it was not. This may also be why, when checking your data, you got the impression that the "bug" was there from the start, when in fact it was not.

In conclusion, please keep in mind that what you see in this kind of software can be misleading, and be very cautious when using it.
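One practical way to catch such silent spreadsheet changes is to diff the file before and after the round-trip, cell by cell. A minimal sketch (the file layout and column positions are illustrative assumptions):

```python
def diff_tsv(original_text, edited_text):
    """Yield (line, column, before, after) for every cell that differs
    between two versions of a tab-separated file."""
    orig_lines = original_text.splitlines()
    edit_lines = edited_text.splitlines()
    for lineno, (a, b) in enumerate(zip(orig_lines, edit_lines), start=1):
        for col, (x, y) in enumerate(zip(a.split("\t"), b.split("\t")), start=1):
            if x != y:
                yield (lineno, col, x, y)

# Simulated Excel round-trip: "278,125" silently became "278125".
before = "name\tmass\nM278T50\t278,125\n"
after = "name\tmass\nM278T50\t278125\n"
print(list(diff_tsv(before, after)))  # → [(2, 2, '278,125', '278125')]
```

Diffing the exported file against the one you uploaded makes any automatic conversion visible before it reaches your workflow.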
Best regards