“...we cannot compute the one-way value of c, except by dividing d+d1 by 2. ... c is defined by convention...” [Manly Warrior, post 24]
If you’re asserting that we cannot compute the speed of light *exactly*, I concede the point.
But that degree of exactitude can only be found in the realms of theoretical physics, mathematics, and philosophy. All three fields display a common flaw: practitioners seem afraid of reality - even as they insist their fields of expertise are somehow superior to, or at least prior to, reality.
I realize that the speed of light in a vacuum is not really 299,792,458-point-something meters per second (since 1983 it has been *defined* as exactly 299,792,458 m/s), but nine significant digits might be enough to unscramble a whole bunch of everyday problems.
I wasn’t smart enough to attain an engineering degree, but in professional life I was required to work closely with engineers, and theoretical physicists, and mathematicians (but darned few philosophers). We were always buried in practical problems urgently in need of solution, and it was essential to get the various working-group members oriented, motivated, and toiling. A struggle that never ended.
Eventually, it dawned on me that the preoccupations of the “experts” were not that relevant to the grubby real-world situations at hand. I gave up kidding myself that we needed to compute the value of c to yet another dozen significant digits, or discover a Higher Truth, to do our jobs.
c = (d1 + d2)/(t1 + t2), but we know that we cannot use two clocks to time each leg independently without violating basic relativity. So a single clock is used and the elapsed time is measured for the round trip, which means the computed speed is an average. An average doesn’t require the two legs to be equal. We cannot prove that light moves at a uniform speed over both components of d, because that would require measuring the two time segments independently, and current technology and physics say otherwise.
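To make that concrete, here is a toy sketch (my own illustration, not anything from the thread) using Reichenbach’s epsilon convention: the outbound leg is assigned speed c/(2ε) and the return leg c/(2(1−ε)), with ε = 1/2 recovering Einstein’s equal-speeds convention. Every choice of ε in (0, 1) produces the same measurable round-trip time, which is exactly why the one-way value is a convention rather than a measurement:

```python
# Toy illustration: different one-way speed conventions, identical round trip.
C = 299_792_458.0  # two-way speed of light in m/s (exact by definition)
d = 1000.0         # one-way distance in meters (d1 = d2 = d)

def round_trip_time(eps):
    """Round-trip time under a given epsilon synchronization convention."""
    v_out = C / (2 * eps)          # assumed one-way speed, outbound leg
    v_back = C / (2 * (1 - eps))   # assumed one-way speed, return leg
    return d / v_out + d / v_back

# Any eps in (0, 1) yields the same round trip, so 2d/t is always C:
for eps in (0.3, 0.5, 0.7):
    t = round_trip_time(eps)
    print(f"eps={eps}: round trip {t:.3e} s, two-way speed {2 * d / t:.0f} m/s")
```

The algebra collapses: d/(C/2ε) + d/(C/2(1−ε)) = 2d/C regardless of ε, so the single-clock measurement can never distinguish one convention from another.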
If declaring c as a convention satisfied Dr. Einstein, I’m okay with it, and for practically all of our needs, it works just fine.