Here it is: if we are always going for our most intense ions (Top10 or Top20 or whatever), would it even matter what we set our trigger threshold at? Or would we never get down into that junk?
So here is my crude bumpy-airplane-ride attempt at an answer.
1) Start with my target dataset: a 1 µg HeLa lysate run on an Orbitrap Elite with a Top15 method in "high:high" mode with HCD (MS/MS at 15,000 resolution).
2) Set the bottom screen to only show the peak list.
3) Go through all of the analyses below. And return to Xcalibur and change the settings to "Display all" first, or you'll do a bunch of Excel work for nothing....erk....exactly why they serve alcohol on bumpy plane rides....
4) Export the peak lists. For the sake of brevity, I exported one at each of these time points (in minutes): 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110.
5) Find out how many peaks are there and what the average intensities are.
So, at the MS1 level in this extremely complex digest, each MS1 spectrum contained, on average, 2,413 peaks that Xcalibur could detect (±200 or so). Of those peaks, the average recorded intensity was 1.2E5!
Okay, more maths: how does this compare to our MS1/MS2 ratios?
Considering the length of the gradient, we ended up getting a full scan every 2.41 seconds in this experiment.
Let's assume we have a 30 s peak width (should be close) and that it's uniform, so each compound is detectable in the system for 30 s and that is it. This breaks our gradient into 240 measurable time windows, of which I sampled 11.
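The window count can be sketched in a couple of lines of Python. The ~120-minute run length is my assumption, inferred from the peak lists being exported out to the 110-minute mark; the post only states the 240 result:

```python
# Napkin math: how many 30 s "measurable time windows" fit in the run?
run_length_s = 120 * 60      # assumed ~120-minute run, in seconds
peak_width_s = 30            # assumed uniform chromatographic peak width
windows = run_length_s // peak_width_s
print(windows)               # 240
```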
Now, if we assume that the average number of ions around is a good estimate, 2,413 x 240 gives us roughly 579,000 ions that the instrument was able to detect and assign an intensity. This is a tryptic run, but I'm going to ignore the fact that a lot of these are singly charged and unlikely to be sequenceable (which isn't a real word, I guess).
So there are roughly 579,000 ions and we looked at the most intense 26,494. That's about the top 4.6%. Compare that to just the average intensity, and that means that, at the absolute minimum, every ion we fragmented (given all of these assumptions are true) was at least 1.2E5.
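Putting the whole napkin calculation together (the input numbers are from the post; the total and the percentage are recomputed here):

```python
# Napkin math: what fraction of detectable ions did a Top15 run sample?
avg_peaks_per_ms1 = 2413          # average detectable peaks per MS1 scan
windows = 240                     # 30 s windows across the gradient
detectable_ions = avg_peaks_per_ms1 * windows
ms2_events = 26494                # MS/MS events triggered in the run

top_fraction = ms2_events / detectable_ions
print(detectable_ions)            # 579120, i.e. roughly 579,000
print(f"{top_fraction:.1%}")      # 4.6%
```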
This is on paper (partially on a napkin, to be perfectly honest), but according to the math: no, there is no reason in a complex mixture to spend time fretting over whether you set your MS/MS triggering threshold to 5,000 or 2,000 or 500 or even 50. If you're always going for the most intense ion, you'll never be digging into junk that low in intensity.
However, there are beginnings and ends to each gradient, and those shouldn't have peptides in them (in an ideal world they wouldn't have anything in them at all, but we all know this isn't how it works). If your threshold is too low, you will be triggering on noise there and increasing your file size, but that would be the only drawback. From a statistical, FDR-type standpoint, this would actually be good for you if the premise that "the more bad MS/MS events we have for FDR, the better it works" is true (I've written about that somewhere in one of my previous, long FDR rants).
But, wait a minute! Didn't I just write about the importance of thresholds in dynamic exclusion settings? Yup! I promise I'm going somewhere with this, but I'm out of time. More later, maybe.
TL;DR: On paper (or on a napkin), there doesn't seem to be a good reason to worry about your minimum MS/MS intensity threshold cutoff in a TopN experiment on a complex mixture at fairly high load.