Posted on 02/28/2024 1:41:30 PM PST by yesthatjallen
The government would prefer it if you stopped programming tools in C or C++. In a new report, the White House Office of the National Cyber Director (ONCD) has called on developers to use "memory-safe programming languages," a category which excludes the popular languages. The advice is part of U.S. President Biden's Cybersecurity strategy and is a move to "secure the building blocks of cyberspace."
Memory safety refers to protection from bugs and vulnerabilities which deal with memory access. Buffer overflows and dangling pointers are examples of this. Java is considered a memory-safe language due to its runtime error detection checks. However, C and C++ both allow arbitrary pointer arithmetic with direct memory addresses and no bounds checking.
In 2019, Microsoft security engineers reported that around 70% of security vulnerabilities were caused by memory safety issues. Google reported the same figure in 2020, this time for bugs found in the Chromium browser.
"Experts have identified a few programming languages that both lack traits associated with memory safety and also have high proliferation across critical systems, such as C and C++," the report reads. "Choosing to use memory safe programming languages at the outset, as recommended by the Cybersecurity and Infrastructure Security Agency’s (CISA) ..."
SNIP
(Excerpt) Read more at tomshardware.com ...
“Don’t blame the paintbrush.”
DEEP thoughts ...... are you sure you're in the right place?
8-)
I have had some interest in Rust, D and Clojure as "safe" languages. The problem is that none of the customer code is written in those languages. The real-world stuff is C, C++, Java, FORTRAN and Ada. The Ada stuff comes from an attempt by DoD to push for Ada development. The FORTRAN stuff remains due to stable implementations of compute-intensive algorithms that just "work" and can leverage a vector processor like a Cray.
Thanks for the info.
I've been writing in C# since 2000. It has a great type system and leverages lessons learned from Java and C++. When I could not lay hands on the Windows memory-checking software, I opted to write my application in "managed" C#. The "managed" variant takes care of memory with a garbage collector, similar to Java. It was fast enough to do voice recognition, voice synthesis, call processing and a geospatial database. When I have to do digital signal processing, I insist on C++ and the FFTW ("Fastest Fourier Transform in the West") library. I start with a MATLAB script that performs the operations correctly but too slowly for the application, then re-code the MATLAB algorithm in C++ using best practices. The last time I needed to do that was 2005. MATLAB may be faster now.
Indeed. That's one of the reasons I never learned coding with decimal dollars and cents. Never wanted to do that kind of work. The Navy did well with Grace Hopper's invention. They dreamed that someday we all would be able to code in English.
Years later they realized that most programmers don’t know English. Getting worse these days, too.
Few people leverage the multi-core, multi-threaded capability of modern processors. I've done it for very specialized cases in both C++ and Java. You can do it nicely in Scala as well.
Chromium is an open-source browser that Google uses as the basis for its Chrome browser, and which it further developed into Chrome OS for its Chromebooks.
I have Chromium because I run Ubuntu, a type of Linux. My everyday browser is Waterfox, a privacy-focused version of Firefox.
Microsoft gave up on its own original Edge browser and now builds Edge on Chromium.
Chromium is a good open-source basic browser, and your complaints should be aimed at Chrome/Edge, not Chromium.
Just checksum the push pulls.
I favor Python for data science and machine learning tasks. Last year I did a project with ML written in Python running on an Azure cloud. Input/output was via REST methods to hide the interior details of the ML application.
I was on a technical challenge team that was given a database of all satellite orbits and all ship voyages over a 20 year time span. The task was to identify every ship that managed any segment of its voyage outside the view foot print of any satellites. It was a very intensive bit of processing. We successfully delivered the answer and won the proposal.
There was only one really great programming language: ALGOL.
Most of my years as a software engineer we used C++. Developed some military applications with it.
Yeah right. When I see it on Stack Overflow I'll believe it. Until then I don't trust a thing this government says.
Agree on ALGOL. Used that in my first Comp Sci course in 1970.
Those were the days! :)
Actually, 'Purify' has probably been bought and sold a thousand times since I used it.
I used to do microcode floating point algorithms for IBM 360 machines for well log analysis. The rest of the coding was FORTRAN using punch-cards.
For small embedded systems, what can you use except assembly and some sort of C/C++?
FORTRAN forever!
+1 for FORTRAN and Cray.
RIP Seymour.
I, also, programmed in Fortran (mostly) on a Univac at Johnson Space Center — related to retrospective interactive simulation of certain Skylab-related data and some other projects.
“Those were the days” for sure. (Being young helps!)
I loved eating lunch there in the big cafeteria. No telling who you would meet and get to talk to. Or more often, mainly listen to.
Half the time I would even understand half of what they told me!
That was around 1980-1982. I learned a lot. Really interesting nerd city!
I still use FORTRAN on a regular basis.