I was wondering what file sizes a typical proteomics experiment outputs in each phase (identification and quantification).
Also, how does the type of quantification (relative vs. absolute) affect the size of the output files?
This can vary a lot depending on the length of run, the sample itself, type of method, and maybe even the instrument manufacturer.
There are also various approaches to relative and absolute quantification, but I would not expect these to have the largest impact on file size. Method parameters such as profile/centroid acquisition and run length are likely the biggest factors.
I have proteomics files for TMT-labeled, quantified samples from 2-hour runs, collected in centroid mode for MS2 and MS3. These are 800-900 MB. A smaller sample load with the same method produces a file of about 600 MB.
I also have files from the same setup, but collected entirely in profile mode, that are 3-4 GB.
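If you want to compare sizes across your own runs the way I did above, a small script is handy. This is just a sketch: it walks a directory, groups files by extension, and totals their sizes; the directory path and extensions are whatever your own data layout uses, not anything specific to a vendor format.

```python
# Sketch: summarize file sizes in a data directory, grouped by extension.
# The directory path is an assumption -- point it at your own raw-data folder.
from pathlib import Path
from collections import defaultdict

def summarize_file_sizes(directory):
    """Return {extension: (file_count, total_bytes)} for files under `directory`."""
    totals = defaultdict(lambda: [0, 0])
    for path in Path(directory).rglob("*"):
        if path.is_file():
            ext = path.suffix.lower() or "(none)"
            totals[ext][0] += 1
            totals[ext][1] += path.stat().st_size
    return {ext: (count, size) for ext, (count, size) in totals.items()}

if __name__ == "__main__":
    # Print a per-extension summary in GB, sorted by extension.
    for ext, (count, size) in sorted(summarize_file_sizes(".").items()):
        print(f"{ext}: {count} file(s), {size / 1e9:.2f} GB")
```

Running it over a folder of runs makes it easy to spot the profile-vs-centroid size difference at a glance.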
Maybe you should define "typical proteomics experiment" a bit more. In general @Nathan is right with his comment about centroid / profile mode.
From my point of view, a typical experiment would be a data-dependent acquisition (DDA, aka "shotgun proteomics"), using a 90 min gradient on a common ThermoFisher Orbitrap instrument. This would typically lead to file sizes of about 400-500 MB.
Spiking in peptide or protein standards to perform absolute quantification will not make the files considerably larger.