Posted on 07/06/2009 12:40:46 PM PDT by ROTB
I'm writing a desktop application that I'd like to deploy onto Windows, Macintosh, and Linux. I'm trying to pick the language that will give me the most conveniences, without setting timebombs that will go off down the road.
Here's what I think I know so far:
C++ Pros:
1) resulting code runs fastest, provided I am not a bonehead
2) most flexibility in memory management
3) Maximum difficulty in reverse engineering my object code, though there is nothing revolutionary or complex in what I plan to write.
4) Tools are rock-solid.
C++ Cons:
1) memory management is the biggest hassle, though garbage collection, usage of stack, pools, and smart pointers can mitigate this.
2) cross-platform programming requires much work for me to manage
Java Pros:
1) Cross platform programming handled
2) Memory management easier relative to C++
3) Lots of Tools, fairly mature
4) many years of advancements over C++
5) "Hot-Spot" compilation can make code run nearly as fast as C++
Java Cons:
1) 10 Megs of runtime need to be packed up with my app, since users might not be technical enough to install a JRE on their own
2) Code is more easily reverse engineered than C++, but how good are the obfuscators?
C# Pros:
1) Through Mono, I can deploy in lots of places
2) I get to use Developer Studio initially, and then use less capable IDEs for porting
3) Handles garbage collection, and does more for the programmer than C++.
4) Lots of libraries.
C# Cons:
1) I might have issues with Mono working exactly like Microsoft's runtime.
2) What did I forget?
C# Questions:
1) Does it compile to machine code, or bytecode? If bytecode, does that compile to machine code?
Python Pros:
1) Clean expressive syntax
2) Cross platform programming handled
3) Memory management easier relative to C++
4) many years of advancements over C++
5) More rapid iteration, since the project need not be compiled.
Python Cons:
1) Bytecode can be reverse engineered. "Shed Skin" would make the resulting code as fast and as difficult to reverse engineer as C++, but Shed Skin is a work in progress; if the lead programmer dies, I would have to pick up the slack, and I might also need to find bugs myself.
Thanks for the link, Dan1123; it is much appreciated. :)
I will look at Python, VB, and C++ to start out with. Once again thanks for the advice.
First, get Visual Basic: Visual Basic 2008 Express
Then try: VB Tutor
One of my favorites for web development, including VBScript and JavaScript: W3Schools
Otherwise... Google: VB Tutorials
That’s not strong typing, it’s duck typing. You didn’t have to declare a, b and c since they took on an assumed type of whatever you assigned to them. It then got caught during runtime, not compile time.
When I program I would wonder: is a an int? Maybe I want it to be a byte, a decimal, a floating point, a small (2-byte) int, a large (8-byte) int, signed or unsigned.
Duck typing is not necessarily inferior, it’s just a style of programming I don’t prefer. But I do think a programmer should become proficient in strongly-typed design before he goes off with something like Python and produces a horrible, unmaintainable mess.
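A minimal Python sketch of the style being described (the class and function names here are my own, purely for illustration): nothing is declared, any object with the right method works, and a mismatch surfaces only at runtime.

```python
class Duck:
    def quack(self):
        return "quack"

class Person:
    # Not a Duck, but it has the same method, so it "quacks like a duck".
    def quack(self):
        return "quack"

def make_it_quack(thing):
    # No declared interface or type: anything with a .quack() method works.
    return thing.quack()

print(make_it_quack(Duck()))
print(make_it_quack(Person()))

# An object without .quack() fails only when this line actually runs --
# at runtime, not compile time:
try:
    make_it_quack(42)
except AttributeError as e:
    print("caught at runtime:", e)
```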
There's the problem with the young'uns these days. They want to sit down and learn a programming language before they actually learn how to program.
Check out JIT compilation. It happens on the fly. As far as making a standalone executable, that seems to miss the point of Java in the first place.
Ditto. Java is the pits.
I’d use C# since I’m using it right now. C# and VB are about the same, since both mean .NET. But desktop generally means a lot of UI, and C# and VB do that easily.
If it’s a massive and complex app with performance issues I’ll use C++. I haven’t used Python, so I can’t say.
My primary language was COMPASS, the assembly language on Control Data mainframes. I supported the COBOL and Fortran compilers as part of my job as an Operating Systems programmer. Now I use C and Java mostly, with a little bit of Fortran.
The teaching part came in when my company brought in a bunch of university students as interns (co-ops). The ones that got sent to the MIS department had to learn COBOL. The math and science interns used Fortran, and my buddy taught that. Back in those days you could look at a dump and debug a program rather quickly. I swear my boss could look at a dump and reconstruct the Fortran code.
I have found that using VB increases the possibility that the application will see the light of day. Java is a close second because there is a lot of Java code out there.
I've written little BASIC programs without an interactive debugger--I mean, it was an Apple II after all--but I can't imagine having to write any serious software today without a debugger. I have tremendous respect for those who did. It feels like I'm cheating somehow. :-)
Try machine language without an assembler. Welcome to hell.
The first part is duck typing. The error comes up because of strong typing. I think you're confusing strong typing with static typing. You can have a static, strongly typed system (Java); a static, weakly typed system (C); or a dynamic (duck-typed), strongly typed system (Python).
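The Python side of that distinction can be shown in a few lines: names are rebindable to any type (dynamic), yet unrelated types are never silently coerced (strong). This is a sketch assuming Python 3 semantics.

```python
# Dynamic: a name's type is only known at runtime, and the same name
# can refer to different types over its lifetime -- no compile-time complaint.
x = 3        # int
x = "three"  # now a str

# Strong: values are not silently coerced across unrelated types the way
# a weakly typed language like C would allow.
try:
    result = "2" + 2
except TypeError:
    print("strong typing: str + int is rejected at runtime")

# An explicit conversion is required instead:
print(int("2") + 2)  # -> 4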
A major problem with the C type system is implicit casts. You can define your type as a 2-byte integer, but you can add a 1-byte char to it and the compiler won't even issue a warning. Sometimes that may be what you want, but the compiler should at least point out that you're operating across different types. Also, I would like a C compiler to tell me if I'm doing pointer arithmetic when I probably don't want to (like when an integer is added to a char* when I probably wanted concatenation).
I can't even fathom that today.
I love some of the geek-fights over low-level vs. high-level languages:
"Well, Sonny... Back in MY day we had to code in BINARY. None of this namby-pamby HEX fluff, no sir! If you can't make your machine do what it should with a bank of toggle switches, you're no programmer!"
C# for a desktop app, C# and PHP if it is a web app.
You could also build the app to use components in all three languages (C#, VB.NET and C++ either managed or unmanaged).
Right. And that’s good in many ways to do. Or even Perl, or anything. However a shop should go hard one way or the other. One tool on nearly everything. Or any tool, anytime. Either way is good, but to be in the middle means things will get lost.
If all of our servers were Microsoft we would be .NET, but if you have a bunch of legacy systems running on different platforms and OS’s then Java is the only thing that makes sense.
Not completely. I was blending bits of strong, static and safe. But for example:
a = 20
b = 1
c = a + b
What will the result be? An int? A byte? A float? Python doesn't care, it's all an int, or a float if there's a decimal point. It will only catch egregious errors as you mentioned. I'm not a Python expert, so correct me if I'm wrong.
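A quick interpreter check of the snippet above bears that reading out (Python 3 assumed): integer arithmetic yields an int, a float operand promotes the result to float, and there is no byte/short/long distinction to choose from at all.

```python
a = 20
b = 1
c = a + b

print(type(c).__name__)        # -> int
print(type(a + 1.0).__name__)  # -> float (a float operand promotes the result)

# Python ints are arbitrary precision, so there is no 2-byte vs. 8-byte
# choice to make -- this does not overflow:
print(2 ** 100)
```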
As far as dynamic typing, it's also bad for beginners. It lets them play too fast and loose. Thus we get a lot of code jockeys who don't really know how to program.
This looseness is also what I hate about Visual Basic and most other scriptable languages. Sure, it lets you prototype faster, but there's a lot more to programming than the prototype. This is my main complaint in my recent foray into Objective-C. I like the strong, static, safe, nominal typing of C#. But then I'm a control freak when it comes to programming.
Yeah, well, I still don’t like Java. I want the power to destroy worlds. Java is too playpen-ish.
But still, it is just a tool. Any tool can be used well.