fL, fI, and L are the truly unknowable terms of the Drake equation. For all we know, fL could be 1×10^-100,000,000,000. The circular reasoning propping up the assumption that fL is some non-infinitesimal number is incredible, and the misreporting just as much.
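To make that sensitivity concrete, here is a minimal sketch of the Drake equation, N = R* · fp · ne · fL · fI · fc · L, worked in log space so an absurdly small fL doesn’t simply underflow to zero. Every value plugged in for the other terms is an illustrative assumption, not a measurement.

```python
import math

def drake_log10_N(R_star, f_p, n_e, log10_fL, f_i, f_c, L):
    """Return log10(N) for the Drake equation, taking fL in log10 form
    so values like 1e-100_000_000_000 don't underflow a float."""
    return (math.log10(R_star) + math.log10(f_p) + math.log10(n_e)
            + log10_fL + math.log10(f_i) + math.log10(f_c) + math.log10(L))

# Illustrative (assumed) values for every term except fL.
common = dict(R_star=1.0, f_p=0.5, n_e=2.0, f_i=0.1, f_c=0.1, L=10_000)

for log10_fL in (0, -10, -100_000_000_000):   # fL = 1, 1e-10, 1e-100,000,000,000
    print(log10_fL, "->", drake_log10_N(log10_fL=log10_fL, **common))
```

The point of the exercise is only that all the other terms combined contribute a handful of orders of magnitude, while fL alone can erase them all.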
Do you realize that when they report discovering conditions under which an amino acid could form, it’s not even the right amino acid? And so what? Suggesting that this makes life inevitable, and therefore common, is like finding a mark on a stone that vaguely resembles a letter and concluding that, given an infinite number of marks over an infinite amount of time (the “monkeys typing” argument), the complete, unabridged works of Shakespeare will inevitably and spontaneously emerge. Oh, except the letter it looks a little bit like is epsilon, which isn’t even in the English alphabet.
Oh, and that “monkey argument”? It says that if you have an infinite number of monkeys hitting random keys for an infinite amount of time, they’ll eventually type the entire works of Shakespeare, verbatim and in order, so anything, no matter how complex, could come to exist by chance. It’s based on false premises.
There’s not an infinite amount of time, not an infinite amount of space, and not an infinite number of universes, as the argument presumes. There’s no evidence for a multiverse (which is not to say there’s no evidence for multiple dimensions; we just don’t know whether those dimensions are sparse or dense). There’s strong evidence against the notion of our universe pulsing. There’s a finite minimum meaningful length (the Planck length) and a finite size of the observable universe. The scale of those numbers is literally unimaginable, yet nothing compared to 26^5,000,000, roughly the number of possible letter sequences the length of Shakespeare’s complete works, only one of which is the works themselves. (Actually more, because a keyboard has more than 26 keys.)
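A rough back-of-the-envelope sketch makes the gap concrete; the text length, atom count, age, and typing rate below are deliberately generous round numbers assumed for illustration only.

```python
import math

TEXT_LENGTH = 5_000_000          # assumed length of Shakespeare's complete works, in characters
ALPHABET = 26                    # letters only; a real keyboard has more keys

# Number of equally likely strings of that length, expressed as log10.
log10_strings = TEXT_LENGTH * math.log10(ALPHABET)       # about 7 million digits

# A wildly generous budget of total keystrokes available in the universe:
atoms_in_universe = 1e80         # rough standard estimate
seconds_since_big_bang = 4.4e17  # about 13.8 billion years
keystrokes_per_second = 1e3      # every atom "typing" 1,000 keys per second
log10_keystrokes = math.log10(atoms_in_universe * seconds_since_big_bang * keystrokes_per_second)

print(f"log10(possible texts)   ~ {log10_strings:,.0f}")    # ~7,074,866
print(f"log10(keystroke budget) ~ {log10_keystrokes:,.0f}")  # ~101
```

Even granting every atom in the observable universe a keyboard for the entire age of the universe, the budget falls short by roughly seven million orders of magnitude.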
In fact, infinity does not exist, except possibly in the forward direction of time. Our notion of infinity stems from its use in calculus, but that use is allegorical. We substitute phrases like “negligibly small” or “infinitely small” for rigorous limits, and we get Alice in Wonderland. (Literally: Lewis Carroll, an ordained deacon and Oxford mathematician, recognized the theological danger posed by the methods used to teach calculus and wrote Alice in Wonderland to warn against it. There are, by the way, ways of teaching calculus without resorting to the absurdities of infinity.)
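For what it’s worth, the rigorous footing does exactly that: a statement usually read aloud as “1/x goes to 0 as x goes to infinity” can be written entirely in terms of finite numbers. A minimal sketch in standard textbook notation, with no infinite quantity appearing anywhere:

```latex
% "1/x -> 0 as x -> infinity", stated with finite quantities only:
% for every tolerance eps > 0 there is a finite threshold M such that
% beyond M, 1/x stays within eps of 0.
\[
  \forall \varepsilon > 0 \;\; \exists M > 0 \;\; \forall x > M :
  \left| \frac{1}{x} - 0 \right| < \varepsilon .
\]
% No "1/infinity = 0" is asserted; the infinity symbol is shorthand for
% this finite quantifier statement, not a number you can divide by.
```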
See, 1 divided by infinity doesn’t equal zero. Treating infinity as a number creates tons of mathematical absurdities, such as manipulations that “prove” an infinite sum of positive numbers equals -1. The fact that this is “obviously” not true doesn’t disprove the algebra used to reach it; it disproves the way infinity is being used in the arithmetic.
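One classic manipulation of this kind (offered as an illustration, not necessarily the exact example intended above) treats a divergent sum as if it were an ordinary finite number:

```latex
% Pretend the divergent sum S = 1 + 2 + 4 + 8 + ... is a finite number.
\begin{align*}
  S  &= 1 + 2 + 4 + 8 + \cdots \\
  2S &= 2 + 4 + 8 + \cdots = S - 1 \\
  \Rightarrow\; S &= -1 .
\end{align*}
% The algebra is fine; the illegitimate step is assuming a divergent
% series has a finite value S to do algebra on in the first place.
```

The absurd conclusion indicts the assumption that the infinite sum behaves like a number, not the arithmetic that follows from it.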
Divide by zero, and we're all dead!