This paper from Cao et al. came to my attention while digging through references for a paper I am putting together with my soon-to-be ex-lab. I've read it through four times, because I am absolutely amazed by how different the protocols described here are from the way I learned to do things. These are primarily small differences, but it is the sheer number of these alterations that astounds me.
For example:
-Dimethylacrylamide (DMA) was used to alkylate proteins
-Peptides were desalted with ultra-MicroSpin columns from the Nest Group
-The Orbitrap XL performed MS1 at 60,000 resolution and accumulated MS/MS data from the top 6 ions
-The data was processed with Bioworks against the UniRef100 database
-MS1 tolerance was set at 100 ppm and the MS/MS tolerance was 1 Da
-A variable modification was set for the deamidation of asparagine
-MySQL was used to compile the data
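Since the search tolerances and that MySQL step are the parts most people would end up scripting themselves, here is a minimal sketch of how they might fit together. To be clear, none of this is the authors' code: I'm using Python's built-in sqlite3 as a stand-in for MySQL so the snippet runs on its own, and the table layout, function name (within_ms1_tolerance), score values, and example peptides are all invented for illustration.

```python
import sqlite3

# Tolerances taken from the paper's search settings.
MS1_TOL_PPM = 100.0  # MS1 (precursor) tolerance
MSMS_TOL_DA = 1.0    # MS/MS (fragment) tolerance, in Da

def within_ms1_tolerance(observed_mz: float, theoretical_mz: float) -> bool:
    """True if a precursor match falls inside the 100 ppm MS1 window.
    At m/z 1000, 100 ppm works out to a +/- 0.1 Da window."""
    return abs(observed_mz - theoretical_mz) / theoretical_mz * 1e6 <= MS1_TOL_PPM

print(within_ms1_tolerance(1000.05, 1000.0))  # True: only 50 ppm off

# Compile search results into a database, roughly as one might with MySQL.
# (Schema and rows below are hypothetical, purely for illustration.)
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE peptides (
        method   TEXT,    -- sample prep / fractionation method
        fraction INTEGER, -- fraction number within that method
        sequence TEXT,    -- identified peptide sequence
        score    REAL     -- search engine score
    )
""")

rows = [
    ("1D-gel",     1, "LVNEVTEFAK", 3.2),
    ("1D-gel",     2, "LVNEVTEFAK", 2.9),
    ("OFFGEL",     1, "AEFAEVSK",   3.8),
    ("high-pH RP", 4, "DLGEENFK",   4.1),
]
conn.executemany("INSERT INTO peptides VALUES (?, ?, ?, ?)", rows)

# Count unique peptides per method -- the kind of summary the paper reports.
for method, n in conn.execute(
    "SELECT method, COUNT(DISTINCT sequence) FROM peptides GROUP BY method"
):
    print(f"{method}: {n} unique peptides")
```

The 1 Da MS/MS tolerance would gate fragment matching the same way the ppm check gates precursors; I left it as an unused constant above since the search engine handles that part.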
Not one of the steps listed above is how I would have done this experiment, but that obviously doesn't matter, because their data is beautiful. They show thousands of unique, high-scoring peptides for each sample preparation method they evaluate.
That brings me back to the actual paper. The goal, as stated in the title, is to show which methods produce the best coverage of the plasma proteome. They compare 1-dimensional gels, OFFGEL fractionation, and high-pH RP-HPLC, as well as the pros and cons of collecting more fractions within each method. Overall, it is an extremely meticulous piece of work. If you are interested in plasma proteomics, you need to read this paper.
The real take-away for me, however, is how many changes can be made in MS/MS downstream processing that will still result in good peptide coverage. This also suggests how difficult it is going to be to get all of us to transition to the unified protocols we all know are necessary to really move proteomics into the promised land of cross-laboratory reproducibility.