Why did my SBIR/STTR submission get “Not Selected”?
Are you frustrated that your SBIR/STTR rejection letter offered so little detailed feedback? Maybe you had multiple similar submissions with differing evaluation results. Taken as a whole, it can all seem a bit confusing, especially if it’s your first time playing the SBIR/STTR game. Allow me to shed some light on these otherwise murky waters.
It’s important to understand that most open-topic program submissions are evaluated by volunteers drawn from within that service. With a limited pool of volunteers, you can’t realistically expect every evaluator to be a technical expert in every submitted field or technology. And because these volunteers are, on average, new to evaluating, the program also needs as simple and automated a process as possible to get through all submissions within its planned time constraints.
The point is that two different factors are at play that can lead to this type of negative result. First, different people evaluated each of the similar submissions, which is one way to get totally different grades on essentially similar or identical packages. Second, because of the simplified evaluation process, if a particular section of your proposal isn’t screaming obvious high value, it gets marked lower. Add one or two lower scores to an aggregate of fewer than ten graded sections, factor in the large number of submissions competing with each other, and the resulting cut line for approval is quite high. This is also why the feedback is so limited and so lacking in detail (a common complaint from submitters for many years now). The evaluator is usually just recording a simple overall impression of value for each section (an equivalent score of 1-5), not writing up additional detail.
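To make the arithmetic concrete, here is a minimal sketch of that dynamic. All numbers are hypothetical (the section count, the 1-5 scale mapping, and the 92% cut line are assumptions for illustration, not any program’s actual rubric): the point is simply that when the cut line sits high, two middling section scores are enough to sink an otherwise perfect package.

```python
# Hypothetical scoring model -- NOT an actual SBIR/STTR rubric.
SECTIONS = 8      # assumed number of graded sections
MAX_SCORE = 5     # assumed 1-5 scale per section
CUT_LINE = 0.92   # assumed cut line: 92% of the maximum possible aggregate

def passes(scores):
    """Return True if the aggregate score clears the assumed cut line."""
    return sum(scores) / (SECTIONS * MAX_SCORE) >= CUT_LINE

strong = [5] * SECTIONS            # every section screams high value
mostly_strong = [5] * 6 + [3, 3]   # same proposal, two sections marked lower

print(passes(strong))         # True  (40/40 = 100%)
print(passes(mostly_strong))  # False (36/40 = 90%, below the 92% cut)
```

Under this toy model, a proposal scoring 90% still misses the cut, which is consistent with the observation above that a couple of lower section marks can be fatal in a crowded competition.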