Posted on 10/19/2020 8:49:54 PM PDT by SeekAndFind
In 1930, John Maynard Keynes predicted that we'd be having 15-hour workweeks by the end of the century. But by 2013, it was clear that the great economist had gotten something wrong.
Welcome to the era of bullshit jobs, a term coined by anthropologist David Graeber. Since the 1930s, whole new industries have sprung up that don't necessarily add much value to our lives. Graeber would probably call most jobs in software development bullshit.
I don't share Graeber's opinion, especially when it comes to software. But he does touch on an interesting point: as more and more processes are automated, many jobs become obsolete at some point. According to one estimate, 45 percent of all jobs could be automated using current technology. And over time, they probably will be.
In software development, where things move pretty fast anyway, you can see this happen in real time: as soon as software testing became a hot topic, automation tools started springing up. And this is just one of the many areas where the bullshit parts of software, the parts that are repetitive and time-consuming, have been automated away.
This raises the question, though, of whether developers are making themselves obsolete by building automation tools. If more and more machines can write code themselves, what do we need humans for?
Software developers are builders at heart. They build logical links, algorithms, programs, projects, and more. The point is: they build logical stuff.
With the rise of artificial intelligence, we're seeing a paradigm shift, though. Developers aren't designing logical links anymore. Instead, they're training models on the heuristics behind these logical links.
Many developers have gone from building logic to building minds. To put it differently, more and more software developers are taking on the activities of data scientists.
If you've ever used an IDE, then you know how amazing assisted software development can be. Once you've gotten used to features like autocomplete or semantic code search, you don't want to go without them again.
This is the first area of automation in software development. As machines understand what you're trying to implement, they can help you through the process.
The second area is that of closed systems. Consider a social media app: it consists of many different pages that are linked among each other. However, it's closed insofar as it isn't designed to communicate directly with another service.
Although the technology for building such an app is getting easier and easier to use, we can't speak of real automation yet. As of now, you need to be able to code if you want to create dynamic pages, use variables, apply security rules, or integrate databases.
The third and last area is that of integrated systems. The API of a bank, for example, is such a system, since it is built to communicate with other services. At this point in time, however, it's practically impossible to automate ATM integrations, communications, world models, deep security, and complex troubleshooting issues.
The three areas of automation. Image by the author, adapted from Emil Wallner's talk at InfoQ. Software development is a bumpy road, and we don't really know when the future will arrive.
When asked whether they'll be replaced by a robot in the future, human workers often don't think so. This applies to software development as well as many other areas.
Their reason is clear: qualities like creativity, empathy, collaboration, or critical thinking are not what computers are good at.
But often, that's not what matters to get a job done. Even the most complex projects consist of many small parts that can be automated. DeepMind scientist Richard S. Sutton puts it like this:
Researchers seek to leverage their human knowledge of the domain, but the only thing that matters in the long run is the leveraging of computation.
Don't get me wrong; human qualities are amazing. But we've been overestimating the importance of these qualities when it comes to routine tasks. For a long time, for example, even researchers believed that machines would never be able to recognize a cat in a photo.
Nowadays, a single machine can categorize billions of photos at a time, and with greater accuracy than a human. While a machine might be unable to marvel at the cuteness of a little cat, it's excellent at working with undefined states. That's what a photo of a kitten is through a machine's eyes: an undefined state.
In addition to working with undefined states, there are two other things that computers can do more efficiently than humans: first, doing things at scale; second, working on novel manifolds.
We've all experienced how well computers work at scale. For example, if you ask a computer to run `print("I am so stupid")` two hundred times, it will do so without complaining and complete the task in a fraction of a second. Ask a human, and you'll need to wait for hours to get the job done.
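A minimal sketch of that scale difference, using the article's own example message (the timing threshold in the comment is an assumption, not a benchmark):

```python
import time

# Repeat the task two hundred times and measure how long it takes.
start = time.perf_counter()
for _ in range(200):
    print("I am so stupid")
elapsed = time.perf_counter() - start

# On any modern machine this finishes in a small fraction of a second,
# with no complaints and no fatigue.
print(f"200 prints took {elapsed:.4f} seconds")
```

The machine's advantage isn't cleverness here; it's the ability to repeat a dull task indefinitely at near-zero marginal cost.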
Manifolds are basically a fancy, or mathematical, way of referring to subsets of space that share particular properties. For example, if you take a piece of paper, that's a two-dimensional manifold in three-dimensional space. If you scrunch up the piece of paper or fold it into a plane, it's still a manifold.
It turns out that computers are really good at working in manifolds that humans find hard to visualize, for example because they extend into twenty dimensions or have lots of complicated kinks and edges. Since many everyday problems, like human language or computer code, can be expressed as a mathematical manifold, there is a lot of potential to deploy really efficient products in the future.
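To make the paper analogy concrete, here is a small, hypothetical sketch (the function names are mine, not from any library): a flat sheet and a folded sheet are both described by the same two parameters, so both remain 2D manifolds even though they sit in 3D space.

```python
import math

def flat_sheet(u, v):
    # The unfolded piece of paper: every point needs only two
    # coordinates (u, v); z is constant.
    return (u, v, 0.0)

def folded_sheet(u, v, angle=math.pi / 4):
    # Fold the sheet upward along the line u = 0. The surface is bent,
    # but it is still parametrized by the same (u, v) chart, so its
    # intrinsic dimensionality is unchanged: it is still a 2D manifold.
    if u < 0:
        return (u, v, 0.0)
    return (u * math.cos(angle), v, u * math.sin(angle))

print(flat_sheet(0.5, 0.5))    # stays in the z = 0 plane
print(folded_sheet(0.5, 0.5))  # lifted off the plane, same parameters
```

The fold changes the embedding, not the manifold itself; computers handle the same trick in twenty dimensions just as easily.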
Where we are in terms of computer scalability and the exploration of novel manifolds. We're working on areas one and two, but have barely touched area number three. Image by the author, adapted from Emil Wallner's talk at InfoQ.
It might seem like developers are already using a lot of automation. But we're only at the cusp of software automation. Automating integrated systems is almost impossible as of today. But other areas are already being automated.
For one, code reviews and debugging might soon be a thing of the past. Swiss company DeepCode is working on a tool for automatic bug identification. Google's DeepMind can already recommend more elegant solutions for existing code. And Facebook's Aroma can autocomplete small programs on its own.
What's more, the Machine Inferred Code Similarity system, MISIM for short, claims to be able to understand computer code in the same way that Alexa or Siri can understand human language. This is exciting because such a system could allow developers to automate common and time-consuming tasks, such as pushing code to the cloud or implementing compliance processes.
So far, all these automations work great on small projects but are quite useless on more complex ones. For example, bug identification software still returns many false positives, and autocompletion doesn't work if the project has a very novel goal.
Since MISIM hasn't been around for long, the jury is still out on this automation. However, keep in mind that these are the very beginnings, and these tools are expected to become a lot more powerful in the future.
Some early applications of these new automations could include tracking human activity. This isn't meant as spy software, of course; rather, things like scheduling a worker's hours or individualizing a student's lessons could be optimized this way.
This, in itself, presents huge economic opportunities, because students could learn the important material faster, and workers could work during the hours in which they happen to be most productive.
If MISIM is as good as it promises, it could also be used to rewrite legacy code. For example, lots of banking and government software is written in COBOL, which is hardly taught today. Translating this code into a newer language would make it easier to maintain.
Being a software developer will remain exciting for a long time to come. Photo by Brooke Cagle on Unsplash
All these new applications are exciting. But above them looms a sword of Damocles: what if the competition makes use of those automations before you catch on? What if they make developers totally obsolete?
Automated testing and continuous delivery are certainly buzzwords in the world of automation. But they're important nevertheless.
If you don't test your software before releases, you might be compromising the user experience or encounter security issues down the road. And experience shows that automated testing covers cases that human testers didn't even think of, although they might have been crucial.
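As a toy illustration of why automated tests catch cases humans skip, consider a hypothetical discount function and a sweep over its boundary values (all names here are invented for the example, not from any real codebase):

```python
def apply_discount(price, percent):
    # A hypothetical pricing function with edge cases at 0 and 100.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# An automated sweep over boundary values -- the kind of exhaustive,
# repetitive check a machine performs tirelessly and a human rarely does.
for percent in (0, 1, 50, 99, 100):
    discounted = apply_discount(10.0, percent)
    assert 0 <= discounted <= 10.0, (percent, discounted)

# Invalid inputs must be rejected, not silently accepted.
try:
    apply_discount(10.0, 101)
except ValueError:
    pass
else:
    raise AssertionError("expected a ValueError for percent > 100")

print("all automated checks passed")
```

A human tester might try 10 and 50 percent and call it a day; the loop checks every boundary on every run, for free.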
Continuous delivery is a practice that more and more teams are picking up, and for good reason. When you bundle lots and lots of features and only release an update, say, once every three months, you often spend the next few months fixing everything that got broken in the process. Not only is this way of working a big hindrance to speedy development, it also compromises the user experience.
There's plenty of automation software for testing, and there's version control (and many other frameworks) for continuous delivery. In most cases, it seems better to pay for these automations than to build them yourself. After all, your developers were hired to build new projects, not to automate boring tasks.
If you're a manager, consider these purchases an investment. By making them, you're supporting your developers the best you can, because you're capitalizing on what they're really good at.
Oftentimes, projects get created somewhere in upper management or close to the R&D team, and then get passed down until they reach the development team, which then has the task of making the idea real.
However, since not every project manager is also a seasoned software engineer, some parts of the project might be straightforward for the development team to implement, while others would be costly or pretty much impossible.
That approach may have been legitimate in the past. But as lots of the monotonous parts of software development (yes, those parts exist!) are being automated, developers are getting a chance to be more and more creative.
This is an excellent chance to shift developers left, i.e., involve them in the planning stages of a project. Not only do they know what can be implemented and what can't; with their creativity, they might add value in ways that aren't imaginable a priori.
It's been a brief five years since Microsoft's Satya Nadella proclaimed that every business will be a software business. He was right.
Not only should developers shift left in management. Software should shift up in priorities.
If the current pandemic has taught you anything, it's that much of life, and value creation, happens online these days.
Software is king. Paradoxically, this becomes more apparent the more of it gets automated.
When I was at school, people who liked computers were deemed unsociable kids, nerds, geeks, unlikeable creatures, and zombie-like beings devoid of human feelings and passions. I really wish I were exaggerating.
As time goes on, however, more and more people are seeing the other side of developers. People who code are no longer regarded as nerds, but rather as smart folks who can build cool stuff.
Software is gaining more power the more it's being automated. In that sense, your fear of losing your developer job due to automation is largely unfounded.
Sure, in a decade, or even in a few months, you'll probably be doing things that you can't even imagine right now. But that doesn't mean that your job will go away. Rather, it will be upgraded.
The fear that you really need to conquer is not that you might lose your job. What you need to shake off is the fear of the unknown.
Developers, you won't be obsolete. You just won't be nerds that much longer. Rather, you'll become leaders.
Siri, build me an app ...
If our department needs an application (that isn't available off the shelf) then we have to give them too much money to spend too much time developing an app that doesn't really do what we want. That's because they are all programmers who don't understand our business requirements. They may know how to handle abstract computer concepts such as stacks, queues, semaphores, etc. but they don't know our business processes and don't seem to care except what we put in the requirement spec. So basically it's our fault if we didn't get the requirement spec exactly right.
So that's where I come in. I write apps in low-level code such as VBA. I understand the requirements, I can develop and deliver code quickly, and it's in a format they are comfortable with, e.g. Excel spreadsheet.
I don't see how someone like me can be replaced by a program. The kinds of customizations that I need to add to the software in order to keep individual users, and the user group in general, happy are hard to predict and sometimes require driver-level adjustments in the code.
I lucked out that I didn't get forced into one of the programming departments, but it is a bit lonely being the only programmer around.
I gave this article careful consideration and then decided IT was bullshit.
Guessing a vast majority of the programming jobs in your organization are taken by unqualified H1B scum from India.
The trick will be getting management to know what it is they want. Often they have no idea and keep changing their requirements.
That’s it!
In the 1980s when Lotus 123 introduced their macro language in v2 I was fascinated by the possibilities.
My work-study job at the time was “computer lab assistant”. I assisted students with their assignments, but I also filled in for the professors when they were unable to make it in for some reason. My favorite class was the Lotus 123 Macro Class, and because the professor relied on me I received a lot of extra attention and insight. It was amazing how much less effort was required to create a useful 123 macro than to write a full-fledged program in a computer language.
What are you driving at? Come on, spit it out!
Regards,
Don't you understand? They no longer have an accent... YOU do.
Regards,
2020 - telework, distance learning, zoom, vpn
Unit testing began in 1993, almost 30 years ago. Software development tools don't improve nearly as much or as fast as the author thinks. And while there can be improvements, the basic problem is in interpreting requirements. This is a human issue, and very often the person in charge of those requirements doesn't know exactly what they want. By the time you get explicit enough requirements for a code generator, you've pretty much got the code already.
If they ever develop an AI that can understand the always craptastic functional specs let me know because I will start a new religion and worship that phantasmagoric unicorn.
Trust me, if they ever build an AI that can do that, it will essentially be Skynet and, after a couple of months working with the a55clowns that infest management, it will take out the world with nukes.
In a previous lifetime, when I worked in IT, I remember one day my IT Director asked me what I thought of the latest fad which was at that time being promoted by the manufacturers and software companies: it was called Rapid Prototyping and/or Rapid Development.
“That’s cool,” I replied, “it enables them to write rubbish quicker.”
He nodded.
This is not that new.
I was working on code generators in the 80’s for C.
These things were out even earlier, for Cobol for Pete’s sake, in the 70’s.
You could lay out a relational database system, data entry and edit screens, reporting, etc., and it would churn out the code. Full code to implement an RDBMS, in 2nd-generation languages, no SQL.
Same nonsense was said in the 1980s too. There isn’t that much automation, and software has become far too fragile to automate.
“To write the code for the machines that write the code.”
I was writing code that wrote code that wrote code in the ‘80s. And much of the time since then. I’ve been automating myself “out of a job” for decades.
Ain’t it the truth, but it never seems to happen. The other truth that so few seem to grasp is “requirements”. Requirements lead to specifications, which lead to design. Here again it depends on which level you are working at. As for that 1980s B.S., I go back 20 years before that. When I was 18 I was working with IBM 1401s and RAMAC 305s, Univac, and NCR B series with CRAM in NEAT3. Way before the 360/370 appeared. Used to make extra money wiring 407, 188, and 514 panels back when people had forgotten how.
That is when you learned the basics by programming in assembler. I’ve forgotten so much that I sometimes don’t remember how much I knew. And that was just in the beginning.
Started in 6502 Assembler (mid-1980s). Took Computer Science when it was really an Applied Mathematics Degree in the mid-1980s. Bought a VIC-20 to play with in Basic Training/ AIT. Wrote and published my first published commercial game in 1984 while at my 1st Duty Station. Programmed an old Radio Shack Pocket PC to calculate firing ranges for Artillery.