Did the operating system make it too hot or did deep space? I thought deep space, or all space, was a constant temperature somewhere in the 70s.
Maybe because it is built to sense the infrared spectrum, it picks up heat easily whenever it momentarily faces the sun, and then requires a cool-down before it becomes functional again.
When it left Earth the telescope was too warm to do its job, and the only way to get rid of heat in space is to radiate it away. There's no convection, no cool breezes (ha!), etc., and radiating heat away is comparatively slow. As for the ambient temperature of deep interstellar space, I always thought it was just a few degrees above absolute zero. But this close to the sun, it's another story.
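To put a rough number on how slow radiating is, here's a minimal back-of-the-envelope sketch using the Stefan–Boltzmann law; the area, emissivity, and temperatures are made-up example values, not Webb's actual specs.

```python
# Back-of-the-envelope: power radiated into cold space by a passive surface.
# Stefan-Boltzmann law: P = emissivity * sigma * area * T^4
# All parameter values below are illustrative guesses, not spacecraft specs.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiated_power_watts(temp_k, area_m2=1.0, emissivity=0.9):
    """Power a surface at temp_k radiates away (ignoring anything shining back on it)."""
    return emissivity * SIGMA * area_m2 * temp_k**4

# A 1 m^2 panel near room temperature vs. one already cooled way down:
print(radiated_power_watts(290))  # ~361 W
print(radiated_power_watts(50))   # ~0.32 W -- the colder it gets, the slower it sheds heat
```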
That’s just what I think. I could definitely be wrong about all this. Someone correct me if I am, thx!
I'm guessing a bit here based on space radio reception knowledge. Whether it's RF, IR, or any other sensor, I think it comes down to getting an adequate signal-to-noise ratio to be able to 'detect' anything at all.
Cosmic background noise, which can be measured as a temperature or in dB (there's a conversion), averages about 2.7 kelvin, just above absolute zero. That's empty space. Add stars and other stuff into the background and the average bumps up to about 30 or 40 kelvin. If you were looking at the Earth, the re-radiated heat is about 270 kelvin. There is a dependency on wavelength as well. The sun works out to roughly 10,000 kelvin at our distance from it here.
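The temperature-to-dB conversion mentioned above comes from the thermal noise power formula P = kTB; here's a minimal sketch of it, with an arbitrary 1 MHz bandwidth as the example.

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

def noise_power_dbm(temp_k, bandwidth_hz):
    """Thermal noise power P = k*T*B, converted to dBm (dB relative to 1 mW)."""
    p_watts = K_BOLTZMANN * temp_k * bandwidth_hz
    return 10 * math.log10(p_watts / 1e-3)

# Example: the temperatures from the post, in a 1 MHz bandwidth.
for t in (2.7, 40, 270):
    print(f"{t} K -> {noise_power_dbm(t, 1e6):.1f} dBm")
# 2.7 K -> -134.3 dBm, 40 K -> -122.6 dBm, 270 K -> -114.3 dBm
```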
So to detect, for example, some stardust 13 billion light-years from here that appears from our vantage point at about 40 kelvin, and is therefore lost in the average background, the sensor needs to be cold enough, ideally several degrees colder than that, to sense the difference. It also needs to be focused tightly enough that nearby hotter stuff doesn't saturate the picture and drown out what you're trying to observe.
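One way to put a number on "cold enough," borrowing the ideal radiometer equation from RF practice (ΔT_min ≈ T_sys / sqrt(B·τ)): the colder the system noise temperature, the smaller the temperature difference you can pull out of the noise. The bandwidth and integration time below are arbitrary example values, and a real IR instrument's noise budget is more involved than this sketch.

```python
import math

def min_detectable_delta_t(t_sys_k, bandwidth_hz, integration_s):
    """Ideal radiometer equation: smallest temperature difference resolvable
    above the noise, for a given system noise temperature, bandwidth, and
    integration time."""
    return t_sys_k / math.sqrt(bandwidth_hz * integration_s)

# Same observation (1 GHz bandwidth, 60 s integration), warm vs. cold detector:
for t_sys in (40.0, 7.0):
    dt = min_detectable_delta_t(t_sys, 1e9, 60)
    print(f"T_sys = {t_sys} K -> can resolve ~{dt*1000:.2f} mK")
```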
That's an explanation from an RF engineer's perspective.