Posted on 11/12/2019 3:41:01 AM PST by ShadowAce
Programmers have pride with good reason. No one else has the power to reach into a database and change reality. The more the world relies on computers to define how the world works, the more powerful programmers become.
Alas, pride goeth before the fall. The power we share is very real, but it's far from absolute and it's often hollow. In fact, it may always be hollow because there is no perfect piece of code. Sometimes we cross our fingers and set limits because computers make mistakes. Computers, too, can be fallible, which we all know from too much firsthand experience.
Of course, many problems stem from assumptions we programmers make that simply aren't correct. They're usually sort of true some of the time, but that's not the same as being true all of the time. As Mark Twain supposedly said, "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."
Here are a number of false beliefs that we programmers often pretend are quite true.
We pound the tables in the bars after work. We write long manifestos. We promise the boss that this time, this new language will change everything and wonderful software will flow from the keyboards in such copious amounts that every project will be done a month before the deadline. In the end, though, we stick data in variables and write some if-then logic to test them.
Programmers see structure in their code and dream of squeezing every last inefficiency from it. So they imagine elaborate castles in the air called frameworks, scaffolding, platforms, or architectures and fiddle with them until they offer just the right support to the current problem so that everything can be written in a few elegant lines. Alas, the next assignment has a different structure.
In the end, all of this is artifice and syntactic frosting. Structural liquor that numbs the pain of coding life until it wears off. Computers are built out of transistors and no amount of clever punctuation and type theory can hide the fact that all of our clever code boils down to one bit of doped-up silicon choosing to go left or right down the fork in the code and there is no middle path.
Perhaps you built your last web application in React because you were unhappy with the pages constructed in Vue? Or maybe you wrapped together a headless Ruby with some static pages built from a templating engine because the WordPress interface was clumsy and dated? Or maybe you rewrote everything in something smaller, newer, or cooler like Marko or Glimmer or Ghost? The programmer is always searching for the perfect framework but that framework, like the end of the rainbow, never appears.
Ralph Waldo Emerson anticipated the programmer's life when he wrote Self-Reliance in 1841. "Society never advances," he noted, speaking of course of programming frameworks. "It recedes as fast on one side as it gains on the other. Its progress is only apparent like the workers of a treadmill... For every thing that is given something is taken."
And so we see again and again as developers create new frameworks to patch the problems of the old frameworks, introducing new problems along the way. If a framework adds server-side rendering, it bogs down the server. But if everything is left to the clients, they start slowing down. Each new feature is a tradeoff between time, code, and bandwidth.
Figuring out how to handle null pointers is a big problem for modern language design. Sometimes I think that half of the Java code I write is checking to see whether a pointer is null.
The clever way some languages use a question mark to check for nullity helps, but it doesn't get rid of the issue. A number of modern languages have tried to eliminate the null testing problem by eliminating null altogether. If every variable must be initialized, there can never be a null. No more null testing. Problem solved. Time for lunch.
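Java has no question-mark operator, but its Optional type illustrates the same bargain: wrap the value so the caller is forced to confront the missing case. A minimal sketch, with made-up names, is below.

```java
import java.util.Optional;

public class NullDemo {
    // A lookup that can fail the old-fashioned way: by returning null.
    static String findNickname(String user) {
        return user.equals("alice") ? "ace" : null;
    }

    // The same lookup wrapped so the caller cannot forget the empty case.
    static Optional<String> findNicknameSafe(String user) {
        return Optional.ofNullable(findNickname(user));
    }

    public static void main(String[] args) {
        // The defensive check we all keep writing.
        String nick = findNickname("bob");
        if (nick != null) {
            System.out.println(nick.toUpperCase());
        }

        // Optional makes "maybe missing" part of the type instead.
        findNicknameSafe("bob").map(String::toUpperCase).ifPresent(System.out::println);
    }
}
```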
The joy of this discovery fades within several lines of new code because data structures often have holes without information. People leave lines on a form blank. Sometimes the data isn't available yet. Then you need some predicate to decide whether an element is empty.
If the element is a string, you can test whether the length is zero. If you work long and hard enough with the type definitions, you can usually come up with something logically sound for the particular problem, at least until someone amends the specs. After doing this a few times, you start wishing for one simple word that means an empty variable.
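One way to scratch that itch is to write the predicate yourself. The helper below is purely hypothetical, assumes a recent JDK, and only covers the handful of types it names:

```java
import java.util.Collection;
import java.util.Map;

public final class Empties {
    // A made-up, one-word test for "nothing here," whatever the type.
    static boolean isEmpty(Object value) {
        if (value == null) return true;
        if (value instanceof String s) return s.isBlank();
        if (value instanceof Collection<?> c) return c.isEmpty();
        if (value instanceof Map<?, ?> m) return m.isEmpty();
        return false; // anything else counts as filled in
    }

    public static void main(String[] args) {
        System.out.println(isEmpty(null));       // true
        System.out.println(isEmpty("   "));      // true (blank form field)
        System.out.println(isEmpty("St. John")); // false
    }
}
```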
The problem of codifying gender and choices of possible pronouns is a big minefield for programmers. Computers deal in fixed lists and well-defined menus, and humans keep changing the rules. One very progressive school licensed an off-the-shelf application only to discover that the forms gave only two choices for gender.
Computer scientists never really solve problems; they just add another layer of indirection, in this case a pointer to an empty string field where the person can fill in their choice. Then some joker comes along and chooses "his majesty" as a pronoun, which makes some kids laugh and others feel offended. But going back to a fixed list means excluding some choices.
This design failure mode appears again and again. If you force everyone to have a first name and a family name, some will have only one name. Then there's someone who doesn't want to be known by a string of Unicode characters. And what if someone chooses a new emoji for their name string and the emoji doesn't make the final list of acceptable ones? No matter how much you try to teach the computer to be flexible and accepting of human whims and follies, the humans come up with new logic bombs that trash the code.
There's an earnest committee that meets frequently trying to decide which emojis should be included in the definitive list of glyphs that define human communication. They also toss aside certain emoji, effectively denying someone's feelings.
The explosion in memes shows how futile this process can be. If the world finds emojis so limiting that it turns to mixing text with photos of cultural icons, how can any list of emojis be adequate?
Then theres the problem of emoji fonts. What looks cute and cuddly in one font can look dastardly and suspect in another. You can choose the cute emoji, and your phone will dutifully send the Unicode bytes to your friend with a different brand phone and a different font that will render the bytes with the dastardly version of the emoji. Oops.
One of the ways that developers punt is to put in a text field and let humans fill it with whatever they want. The open-ended comment sections are made for humans and rarely interpreted by algorithms, so they're not part of the problem.
The real problem resides in structured fields with text. When my GPS wants me to choose a road named after a saint, it tells me to turn onto Street Johns Road. Road names with apostrophes also throw it for a loop. It's common to see St. John's Road spelled as Saint John's, St. Johns, Saint Johns, and even the plural form, Saint Johns. The U.S. Post Office has a canonical list of addresses without extra characters, and it maintains an elaborate algorithm for converting any random address into the canonical form.
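The real postal rules run to hundreds of pages; the toy sketch below only hints at the flavor of that normalization, using a tiny, invented alias table rather than anything official:

```java
import java.util.Map;

public class Addresses {
    // A small, made-up alias table; the real canonical rules are far larger.
    private static final Map<String, String> ALIASES = Map.of(
            "SAINT", "ST",
            "ROAD", "RD");

    static String canonicalize(String raw) {
        // Uppercase, drop punctuation, then map each word to its short form.
        String cleaned = raw.toUpperCase().replaceAll("[^A-Z0-9 ]", "");
        StringBuilder out = new StringBuilder();
        for (String word : cleaned.split("\\s+")) {
            out.append(ALIASES.getOrDefault(word, word)).append(' ');
        }
        return out.toString().trim();
    }

    public static void main(String[] args) {
        System.out.println(canonicalize("St. John's Road")); // ST JOHNS RD
        System.out.println(canonicalize("Saint Johns Rd"));  // ST JOHNS RD
    }
}
```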
It may feel like time keeps flowing at a constant rate (and it does), but that's not the problem for computers. It's the humans that mess up the rules and make a programmer's life nasty. You may think there are 24 hours in every day, but you better not write your code assuming that will always be true. If someone takes off on the East Coast of the United States and lands on the West Coast, that day lasts 27 hours.
Time zones are only the beginning. Daylight saving time adds and subtracts hours, but does so on weekends that change from year to year. In 2000 in the United States, the shift occurred in April. This year, the country changed clocks on the second Sunday in March. In the meantime, Europe moves to summer time on the last Sunday in March.
If you think that's the end of it, you might be a programmer tired of writing code. Arizona doesn't go on daylight saving time at all. The Navajo Nation, however, is a big part of Arizona, and it does change its clocks because it's independent and able to decide these things for itself. So it does.
That's not the end. The Hopi Nation lies inside the Navajo Nation, and perhaps to assert its independence from the Navajo, it does not change its clocks.
But wait, there's more. The Navajo have a block of land inside the Hopi Nation, making it much harder to use geographic coordinates to accurately track the time in Arizona alone. Please don't ask about Indiana.
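The java.time library already encodes most of these rules, which is a good argument for never writing them yourself. A small sketch, assuming the 2019 U.S. rules shipped in the JDK's time zone data:

```java
import java.time.Duration;
import java.time.LocalDate;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class DayLength {
    public static void main(String[] args) {
        // The second Sunday in March 2019: US clocks spring forward.
        ZoneId eastern = ZoneId.of("America/New_York");
        ZonedDateTime start = LocalDate.of(2019, 3, 10).atStartOfDay(eastern);
        ZonedDateTime end = LocalDate.of(2019, 3, 11).atStartOfDay(eastern);
        System.out.println(Duration.between(start, end).toHours()); // 23, not 24

        // Most of Arizona skips daylight saving time; the Navajo Nation observes it.
        System.out.println(ZoneId.of("America/Phoenix").getRules()
                .isDaylightSavings(start.toInstant())); // false
        System.out.println(ZoneId.of("America/Denver").getRules()
                .isDaylightSavings(end.toInstant()));   // true
    }
}
```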
It seems that merely remembering the data should be something a computer can do. We should be able to recover the bits even if the bits are filled with many logical, stylistic, orthographic, numerical, or other inconsistencies. Alas, we can't even do that.
Whenever I ask my Mac to check the file system and fix mistakes, it invariably tells me about a long list of permissions errors that it dutifully repairs for me. How did the software get permission to change the permissions for access to my files if I didn't give permission to do it? Don't ask me.
This is only one example of how file systems don't honor the compact between user (the person supplying the electricity) and the machine (desperate needer of electricity). Any programmer will tell you there are hundreds of other situations where files don't contain what we expect them to contain. Database companies are paid big bucks to make sure the data can be written in a consistent way. Even then, something goes wrong and the consultants get paid even more money to fix the tables that have gone south.
We like to believe that our instructions are telling the computer what to do, and that arrogant pride is generally warranted, except when it's not.
What? Certainly that may be true for the average nonprogramming saps, unwashed in the liniment of coding power, but not us, wizards of logic and arithmetic, right? Wrong. We're all powerless beggars who are stuck taking whatever the machines give us. The operating system is in charge, and it may or may not let our code compute what it wants.
OK, what if we compile the Linux kernel from scratch and install only the code that we've vetted? Certainly we're in control then.
Nope. The BIOS has first dibs over the computer, and it can surreptitiously make subtle and not-so-subtle changes in your code. If you're running in the cloud, the hypervisor has even more power.
OK, what if we replace the BIOS with our own custom boot loader? You're getting closer, but there's still plenty of firmware buried inside your machine. Your disk drive, network card, and video card can all think for themselves, and they listen to their firmware first.
Not only that, but your CPU might have a hidden God mode that lets someone else take command. Don't bother looking at the documentation for an explanation because it's not there. And those are just the problems with the official chips that are supposed to be in your box. Someone might have added an extra chip with a hidden agenda.
Even that little thumb drive has a built-in processor with its own code making its own decisions. All of these embedded processors have been caught harboring malware. The sad fact is that none of the transistors in that box under your desk report to you.
I haven’t seen a Hollerith card in decades!
I started programming in the early ‘60s on an IBM 1620 in college. The FORTRAN compiler was a punched card deck of several hundred cards. A fellow student by the last name of Humburg became known as Humbug for his propensity to drop the compiler. Took eight passes through the card sequencer to rebuild.
Doesn't matter. It's still 3.
Yes I know, it was a failed attempt at humor.
Sorry. :(
TI DX10 on a TI990/12 was the best.
DNOS on a DX10 offered almost bulletproof, although slower, execution. This was MULTI-USER not just multitask stuff!
I was thinking like in encryption where you might XOR some plaintext with some keytext, a second XOR returns the original plaintext. Hadn’t seen any applications for three consecutive XORs! ;)
Learned that one in IBM 370 assembly language class 101 in 1977.
Pretty slick, I gotta admit!
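For anyone who wants to see both tricks run, here's a quick Java toy (the values are made up, and this is obviously not real crypto):

```java
public class XorTricks {
    public static void main(String[] args) {
        // Two XORs with the same key undo each other (the encryption case).
        int plain = 0b1010_1100;
        int key   = 0b0110_0101;
        int cipher = plain ^ key;
        System.out.println((cipher ^ key) == plain); // true

        // Three consecutive XORs swap two values without a temporary.
        int a = 3, b = 7;
        a ^= b;
        b ^= a; // b now holds the original a
        a ^= b; // a now holds the original b
        System.out.println(a + " " + b); // 7 3
    }
}
```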
On cold damp winter days I remember walking across campus with a box of cards. The cards would soak up the moisture in the air and then not fit into the card reader. You then had to let the cards sit for an hour to dry out before they could be read.
And I don't expect that to be the end of the list. Most of these things I have dropped by the wayside.
I have JAVA books, but never took the time to learn the language because I didn't have a need for it. Read the ADA Language Standard, but never had a machine with a compiler to play with.
I keep hearing about other "new" languages that have enhancements for security; someday I'll have a reason to try them. I don't know their names yet.
Quite the parade, isn't it?
Universal truths:
Constants aren’t.
Variables won’t.
bttt
Your resume looks a lot like mine.
My code doesn’t work. I don’t know why.
My code works. I don’t know why.
WYGIWYG.
Bob Newhart having Herman Hollerith explain the punch card concept. I haven't seen this since I was in computer school in 1967. It elevated my esteem of Newhart considerably.
Nine and a half minutes long - as an old punch carder, 4:24 brought back memories.
Absolutely priceless and produced by IBM. Newhart said you should probably bring the card to a piano roll company. Cracked me up. Newhart was brilliant and the master of impromptu. Thanx for sharing, brought a lot of memories back.