Filter-aided sample prep (FASP) has always been a bit of an enigma. There are people who use it for everything (me) and people who tried it a few times and have never done it again.
There was a visiting fellow in my old department who spent months prepping samples and bringing them to me for LC-MS, and I never identified a darned thing in his samples except my normal keratin background. When we finally reviewed the whole method with him, we realized he was using the wrong spin filters. No protein was ever retained. It was a total bummer because his samples took foooorrreeevvveeer for him to get, and he lost a lot of his research time because of a wrong catalog number or something.
Never ever underestimate the value of a good biological QC for your sample prep methodology -- especially if it is really complicated! This also impressed upon me the need to match the filter to the method paper (or to just buy the pre-assembled kits!)
Turns out it's even more complicated than that! This team looks at different filter shapes and finds remarkably different numbers of peptides are retained depending on how the filter is constructed!
They use interesting mouse and human cell samples and find what I've always seen -- FASP outperforms in-solution digestion markedly (they use ethanol crashes -- I'm going to go for acetone -- and not just because I love the smell). They also experiment with some buffers for their protocol, but the interesting thing is the difference in coverage between the flat-bottom filters and the conical filters. Since they don't give away the answer in the abstract, I'm not going to give it away here either. Let's just say that the kits I've always bought in the past are on the right track.
The samples were run on a Q Exactive Plus, and the data was processed both in PD 1.4 using Mascot and in MaxQuant, which is where all the LFQ measurements came from.
I really wonder how much these observations play into whether one researcher thinks FASP is a great idea -- or a dumb one.