Free Republic
Browse · Search
News/Activism
Topics · Post Article

To: Myrddin
Current high-performance processors depend upon a pipelined execution unit. Optimizing compilers can create better "pipelined" code than most "skilled" assembly programmers. A pipeline stall created by a careless choice of operations will trash execution performance on that kind of CPU.

Back in the old days it was easy to beat a compiler. Nowadays, it's much more difficult. Plus, CPUs are so fast that it makes no practical difference in most cases. However, there are a few applications where it still makes sense to use asm. For example, crunching large bitmaps. Compilers seem to have trouble with the nasty bit fiddling you do when crunching bitmaps.

32 posted on 04/02/2004 10:32:36 PM PST by mikegi
[ Post Reply | Private Reply | To 27 | View Replies ]


To: mikegi
Today's CPUs can be absolute dogs with a bad compiler. The blazing fast execution is as much a consequence of an optimizing compiler as it is a "fast" CPU. It takes both in concert to achieve the speed.

I agree with respect to bit-twiddling a complex bitmap. The "simple" stuff that can be accomplished with mapped bitfields inside a C struct is fine for most modern compilers. Unfortunately, you get compiler-dependent implementations that don't port nicely.

33 posted on 04/02/2004 11:00:19 PM PST by Myrddin
[ Post Reply | Private Reply | To 32 | View Replies ]

To: mikegi
Did Richard Clarke Work on the F22?

Let's assume the source of the original article is trustworthy, and not, like a certain former counter-terrorism czar, an engineer responsible for the very problems he's complaining about. Among the rules in place for coding on several platforms under development, the Joint Strike Fighter included, is the constraint that source code may not be modified. Yes, I said SOURCE code. DOD and the Air Force are at the leading edge of platform-independent Model Driven Architecture. The code is generated automatically.

As Microsoft has shown with its Common Language Infrastructure (CLI) standard, procedural languages are largely equivalent. Fortran = C++ = J# = Python = Basic = Cobol ... They all compile to an intermediate language. Last I heard, there are some 60 languages which .NET will translate to the Common Intermediate Language (IL, akin to Java's byte code), where it is dynamically compiled to machine code. Further, I can say from personal experience that compilers will almost always get better performance out of a system than hand coding, because processors are so much faster than memory. Many, if not most, processors are designed as pipelines: a set of parallel stages comprising the computation engine, with critical instruction timing required to achieve the claimed performance. Those timing requirements are embodied in the compiler's instruction scheduler. In fact, compiler designers are key to the design of modern processors.

There have been modeling languages for decades, created to help address the complexity of system design. The current contender is the UML, or Unified Modeling Language. The core idea is not so hard: identify closely related activities, package them, name them, and use them as building blocks - again and again. The blocks can be represented graphically. What matters is how functional blocks relate to one another. Those relationships are represented graphically as lines with a family of arrows and labels. I won't wax on, but suggest a look at www.omg.org, the Object Management Group, to learn more.

It is possible to design formally validated (provably correct) systems by using a subset of the UML. One could also use Petri nets or other equivalent techniques. Semiconductor designers had to learn about modeling because the cost of failure was too great. Software designers, most of whom in the US are not degreed engineers - quick and smart as many are - have not learned the disciplines of engineering: finite state machines, time-dependency analysis.

Modeling is the future of software. Much of the design process is graphical. Ada, apart from the dearth of experienced programmers, isn't the issue. Procedural languages deal with one function at a time. In the real world there are lots of things happening in parallel. Some of those things (the word "object" is usually used, and is vague, but I'll avoid it) need to know the state of other things - does the door lock control need to know when the car is moving? The UML and its relatives are implicitly parallel descriptive languages. They all provide rich mechanisms for defining state - door open/closed, engine running/stopped. For activities to follow one after another, the designer must express the functional blocks as related in a sequential order. More important, the design must characterize the required relationships between the blocks that are needed to describe the system.

Who uses modeling besides the DOD? IBM, Microsoft, Nokia, Lockheed, Borland, Motorola, GE, Grumman, Tellabs, Cisco, Nortel, Mentor, Cadence... I see semiconductor-vendor job reqs for MS-degreed engineers with UML experience using Rose or Rhapsody in Hyderabad, India. I'm glad the Indians are getting educated, because our companies can't compete without well-educated engineers, and our schools are handicapping our brightest students. (Students in math and engineering from abroad require about two years less coursework preparation for a Ph.D. at U.C. Berkeley.)

When the F22 project began, there were tools, but they were used primarily by the telecom industry. Telecom developed a language, an ITU standard called the Specification and Description Language (SDL), which is quite similar to the UML. Ptolemy, Esterel, Simulink, Colored Petri Nets, and lots of other languages exist. SDL is still in use, but is being displaced by the UML. I thought the UML was being used for the F22 project, but don't know first hand.

Ideally, a fully executable model separates the design from the deployment. You can run the model, and decide later what hardware you need: how many signal processors or general-purpose CPUs, memory, message transports, displays, etc. The intellectual property - how to control the elevators, how to evaluate targets, how to monitor engine condition - is all preserved in models. If no one touches source code, fixes are made by fixing the model. The integrity of the design is preserved, ready for translation to new hardware without touching the model, though work will have to be done on the translation module.

The model is also the specification - the document - of the design. Models, because their allowed states are well defined, provide sufficient information for generating tests automatically. Such modeling tools are commercially available from hundreds of vendors.

The DOD appears to be addressing software complexity. NASA isn't as disciplined, as the Rover programming problems attest. (The Rover wasn't modeled.) How about Boeing, which built an airplane, the 767, whose software cost more than the airframe? I hear that Boeing "end-of-lifed" the 767 before the first plane flew. I'll bet they model the software for the next airplane!
45 posted on 04/03/2004 3:19:05 AM PST by Spaulding (Wagdadbythebay)
[ Post Reply | Private Reply | To 32 | View Replies ]



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson