Perhaps for very low-cost devices such as phototransistors or even many photodiodes. "Good" PIN and APD detectors have very little noise. While PMTs are routinely used for photon counting (I was detecting individual photons [albeit with low quantum efficiency] in 1970, limited only by cosmic ray events), today even APDs are able to count photons.
Photon counting cameras are also common [liquid nitrogen went out decades ago], though not cheap. While lower noise in a standard camera will always help, the major factors in establishing noise levels are pixel area, geometry, quantum efficiency, detector material properties, and temperature. Still, the cumulative noise is Gaussian (for "cheap" detectors) or Poisson (for "good" detectors), so time integration reduces these statistical problems (the signal-to-noise ratio improves with the square root of the integration time), which is why I wrote what I wrote.
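The square-root scaling is easy to demonstrate with Poisson (shot-noise-limited) statistics: for a mean count N, the standard deviation is sqrt(N), so SNR = sqrt(N) and quadrupling the integration time doubles the SNR. A minimal sketch (the photon rate and trial count are made-up illustrative values):

```python
import numpy as np

rng = np.random.default_rng(0)
photon_rate = 1000.0  # photons per second -- hypothetical detector


def snr(t, trials=20000):
    # Simulate many integrations of duration t; for a Poisson process
    # the mean count is rate*t and the noise is its square root.
    counts = rng.poisson(photon_rate * t, size=trials)
    return counts.mean() / counts.std()


for t in (0.1, 0.4, 1.6):
    print(f"t = {t:4.1f} s  SNR ~ {snr(t):6.1f}  sqrt(N) = {np.sqrt(photon_rate * t):6.1f}")
```

Each 4x increase in integration time yields roughly a 2x improvement in SNR, matching the square-root rule above.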
Claiming a better camera is fine, but not that it's "1000x" more sensitive.
All cameras are necessarily photodetectors, but the reverse is not true.
The factors you list are certainly correct, but once all of those have been selected, the controlling parameter in the final analysis is the noise level in the individual sensing element. Yes, some advantage can be gained by longer integration time, but even there, the noise is the final determinant of what is practical. Note that NASA PUT INTO ORBIT a satellite whose detector was LIQUID-HELIUM-COOLED for this very reason. The increased sensitivity in the spectral region of interest was sufficient for them to make this very radical design choice.