#197 - Wizard | Posts: 1,692 | Karma: 16307824 | Join Date: Sep 2022 | Device: Kobo Libra 2
I don't think that's the right way to look at it. Python is designed to be a language that makes it easy to accomplish tasks. The price for that ease of use is decreased efficiency. A language like C is much more efficient, but it's also more difficult to use. It is perhaps an unavoidable tradeoff. Python provides a higher level of abstraction so that the programmer can tackle problems from a human perspective, but the code required to create that abstraction reduces efficiency. C forces you to think more like a computer in order to accomplish anything, and having little abstraction is efficient, but it can also be obnoxious. You have to pick your poison depending on what task you're trying to accomplish.
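To make that tradeoff concrete, here is a minimal sketch (an illustration added here, not code from the post): summing a million numbers is roughly `sum(float(i) for i in range(1000000))` in Python, while the C version below chooses the types, allocates the storage, checks the allocation and frees it explicitly.

```c
#include <stdio.h>
#include <stdlib.h>

/* Summing a million numbers. In Python this is roughly:
 *     total = sum(float(i) for i in range(1000000))
 * In C the programmer picks the types, allocates the storage,
 * checks the allocation and frees it - nothing is done for you. */
int main(void)
{
    size_t n = 1000000;
    double *values = malloc(n * sizeof *values);
    if (values == NULL)
        return EXIT_FAILURE;              /* allocation can fail; handle it */

    for (size_t i = 0; i < n; i++)
        values[i] = (double)i;

    double total = 0.0;
    for (size_t i = 0; i < n; i++)
        total += values[i];

    printf("sum = %.0f\n", total);
    free(values);                         /* no garbage collector will do this */
    return EXIT_SUCCESS;
}
```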
#198 - Still reading | Posts: 14,214 | Karma: 105212135 | Join Date: Jun 2017 | Location: Ireland | Device: All 4 Kinds: epub eink, Kindle, android eink, NxtPaper
ALL programming languages are designed to make it easy to do tasks. Some are aimed at particular tasks; Perl, for example, is almost just I/O and regex scripts.

C started as a kind of machine-independent macro assembler with more English, so as to make it easy to port UNIX. BASIC (Beginner's All-purpose Symbolic Instruction Code) was a cut-down interpreted version of Fortran for Dartmouth College, ported to the 8080 in barely more than a weekend by Bill Gates's friend. Cobol was the first language for business applications, Fortran for mathematics and science. QUBAL (Queen's University of Belfast Algorithmic Language) was experimental, for teaching programming, based on Algol. Wirth and Hoare did it. Wirth developed Pascal (solely for teaching) with Jensen. Hoare did Occam for the Transputer because all existing languages were rubbish for parallelism. Wirth then did Modula, Modula-2 (which allows device drivers and parallelism) and then Oberon. Modula-2 is fundamentally different from Pascal, though it looks similar. The US DOD had Ada: a bit like Modula-2, but deliberately wordy to be readable.

Poorly written C is terrible to debug; it's much easier to debug Modula-2 or Ada. Then in 1987 I learnt C++. Unfortunately it was crippled by the ease of use of C libraries (many still have bugs) and backwards compatibility. People ignored the safe C++ strings and we still have buffer overflows in C and C++ programs.

Java was Sun's version of C++. J++ was MS's version of Java; Sun sued, so it became C#. JavaScript and its relatives have not much to do with Java: they're client-side scripting, because HTML isn't a programming language. Then came a rash of hard-to-debug scripting languages for the server side, some of which are now used for standalone applications.

Python, C, Basic and similar are terrible languages to learn about programming. Universities in Ireland and the UK used to teach programming; now they teach a programming language. No wonder Windows 10 and 11 are worse than Windows NT 3.51 or NT 4.0. The GUI is now driven by fashion and clueless so-called graphic designers, and has ditched loads of decent research and principles.

Good programmers that learned to program can use any language in a week or two. The system APIs, libraries and frameworks take longer. The efficiency of C was maybe true porting UNIX in 1976; it's been nonsense since the late 1980s if not earlier. C should now be banned totally for any new project. It was obsolete nearly 40 years ago.
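On the buffer-overflow point, a minimal sketch (an added illustration, not from the post): the classic unbounded strcpy() next to a bounded snprintf() that cannot overrun its destination.

```c
#include <stdio.h>
#include <string.h>

/* The classic trap: strcpy() copies until it finds a NUL terminator,
 * with no idea how big the destination buffer is. */
void risky(const char *input)
{
    char buf[16];
    strcpy(buf, input);                      /* overflows buf if input is >= 16 bytes */
    printf("risky: %s\n", buf);
}

/* Bounded alternative: snprintf() never writes past sizeof buf and always
 * NUL-terminates, truncating long input instead of corrupting the stack. */
void safer(const char *input)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", input);
    printf("safer: %s\n", buf);
}

int main(void)
{
    risky("short and safe");                               /* fits, so no harm done */
    safer("far too long to fit in a sixteen byte buffer"); /* silently truncated */
    return 0;
}
```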
#199 - Grand Sorcerer | Posts: 5,800 | Karma: 103362673 | Join Date: Apr 2011 | Device: pb360
"C is quirky, flawed, and an enormous success.", Dennis Ritchie, The Development of the C Language, Second History of Programming Languages conference, Cambridge, Mass., April, 1993.
#200 - Connoisseur | Posts: 55 | Karma: 2600000 | Join Date: Sep 2018 | Location: New Jersey | Device: kindle, nook
MUMPS anyone? On a more serious note, reading this thread prompted me to google Lahey Fortran to see if it were still around, and sadly it won't be after the end of the year.
Last edited by Pierre Lawrence; 12-27-2022 at 03:40 AM. Reason: add a comment
#201 - Somewhat clueless | Posts: 772 | Karma: 9999999 | Join Date: Nov 2008 | Location: UK | Device: Kindle Oasis
I'm not saying that Python is a bad language - it certainly has many valid use cases and indeed I use it a lot. My beef isn't with Python itself, it's with programmers who only know Python, or who have spent too much time with Python before using other languages. There's a style of programming that Python encourages that leads to inefficiency (particularly in terms of memory use) over and above the inefficiencies in the language itself, and that earlier exposure to other languages can teach programmers to avoid in their Python programs. Learning C and/or C++ will make a better Python programmer (particularly if done before the Pythonic styles become too firmly embedded) - that's my point. It's also true that learning Python will make a better C++ programmer - it works both ways.

I guess my real point is that a good programmer should know multiple languages, not just to be able to write those languages, but also because the learning and mental toolkit developed from each language improves programming in all the others. Too many graduates emerge from Computer Science degrees only really knowing one language. These days that's mostly Python (a few years ago it would often be Java), which is why Python gets the sharp end of my irritation - it's not the fault of the language, my concern is more with the state of CS teaching in universities.
#202 - Somewhat clueless | Posts: 772 | Karma: 9999999 | Join Date: Nov 2008 | Location: UK | Device: Kindle Oasis
Then don't produce poorly written C!

It's possible to write bad, unmaintainable code in any language. It is, you're right, particularly easy to do in C, but that's no reason to avoid it - just a reason to be careful about who you allow to use it!

In particular, the standardisation of the memory model introduced in C11 (and C++11) is crucial when building safe multithreaded, multicore and multiprocessor systems, allowing programmers to be explicit about memory fences and acquire/release semantics, with fine-grained control over allowed reordering.

C is still (for many) the language of choice for systems programming and for embedded systems (particularly those with limited hardware resources and memory), whether RTOS-based or unhosted/bare-metal. The Linux kernel is written (mostly) in C. The macOS kernel is written (mostly) in C. Many RTOSes are written in C. It's entirely likely that the kernel on your phone is written in C. Many of the things you rely on every day, from your microwave, to the ABS system in your car, to the GPS system you use for navigation (both the code on the satellites and the code for the GPS chips in your phone), are very likely to be written in C. It's almost certain that the network packets sent when posting this message will pass through switches and routers which are programmed in C (Cisco's IOS is mostly written in C). The list goes on...

C is also very useful as a backend language for higher-level domain-specific languages; e.g. much flight code for aerospace systems is written in MATLAB/Simulink and transpiled to C.

Not bad for a language that's been obsolete for 40 years!
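For anyone who hasn't met the C11 facilities mentioned above, here is a minimal sketch (an added illustration, not code from the post) of release/acquire publication with <stdatomic.h> and the optional C11 <threads.h>: the producer writes the payload and then sets a flag with release ordering; once the consumer's acquire load sees the flag, it is guaranteed to see the payload too.

```c
#include <stdatomic.h>
#include <stdbool.h>
#include <stdio.h>
#include <threads.h>   /* optional C11 threads; not every toolchain ships it */

static int payload;                 /* ordinary data being handed over */
static atomic_bool ready = false;   /* publication flag */

static int producer(void *arg)
{
    (void)arg;
    payload = 42;                                               /* write the data first   */
    atomic_store_explicit(&ready, true, memory_order_release);  /* then publish the flag  */
    return 0;
}

static int consumer(void *arg)
{
    (void)arg;
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        ;                                       /* spin until the flag is published */
    printf("payload = %d\n", payload);          /* guaranteed to observe 42 */
    return 0;
}

int main(void)
{
    thrd_t p, c;
    thrd_create(&c, consumer, NULL);
    thrd_create(&p, producer, NULL);
    thrd_join(p, NULL);
    thrd_join(c, NULL);
    return 0;
}
```

On toolchains without <threads.h>, the same atomics work with pthreads; only the thread-creation calls change.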
#203 - Still reading | Posts: 14,214 | Karma: 105212135 | Join Date: Jun 2017 | Location: Ireland | Device: All 4 Kinds: epub eink, Kindle, android eink, NxtPaper
The problem is that people should have stopped using C for new projects, because it's so obsolete and terrible. It's an indictment of how unprofessional the industry is.

C generated by pre-processors or cross-compilers etc. is fine; that's just using it as a sort of machine-independent assembler. It's not the same problem as human-written C, which has no advantages at all and only disadvantages now.

C++ was better than C in 1987 and is now much better. It would have been better still if AT&T hadn't insisted on C compatibility. I've debugged so many stupid programs that are allegedly C++, but are really more like C compiled with a C++ compiler.

Last edited by Quoth; 12-27-2022 at 05:04 AM.
#204 - Somewhat clueless | Posts: 772 | Karma: 9999999 | Join Date: Nov 2008 | Location: UK | Device: Kindle Oasis
'Terrible' is a matter of opinion. Clearly you think so, but many would disagree.

In practice, you often end up being restricted to a subset of C++ to avoid those traps, and that subset starts to look very much like C. In those cases it's often cleaner just to start with C and take advantage of its explicitness - what you write is what you get. Clearly, C can be misused (so can every language), but well-crafted C, written by competent programmers, is still a perfectly sensible approach for many problems. When I'm writing a large application, I won't choose C, but if I'm, e.g., hand-crafting a threading system for an embedded microcontroller, I might well use C.
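As one small sketch of that explicitness (the register address and LED bit below are invented for illustration, not taken from any real part), hardware access in embedded C is spelled out directly in the source, with nothing hidden behind an abstraction:

```c
#include <stdint.h>

/* Hypothetical memory-mapped GPIO output register; the address and the
 * LED bit are made up for this example. volatile tells the compiler that
 * every read and write of this location must actually happen, exactly as
 * written in the source. */
#define GPIO_OUT (*(volatile uint32_t *)0x40020014u)
#define LED_PIN  (1u << 5)

void led_on(void)
{
    GPIO_OUT |= LED_PIN;     /* read-modify-write of the output register */
}

void led_off(void)
{
    GPIO_OUT &= ~LED_PIN;
}
```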
#205 - Bibliophagist | Posts: 46,534 | Karma: 169115146 | Join Date: Jul 2010 | Location: Vancouver | Device: Kobo Sage, Libra Colour, Lenovo M8 FHD, Paperwhite 4, Tolino epos
I tend to agree that too many programmers are not taught the basics of generalized programming since the courses they take focus on learning to program in a single language. You end up with programmers who see every problem as a nail since their only tool is a hammer.
#206 - Wizard | Posts: 1,692 | Karma: 16307824 | Join Date: Sep 2022 | Device: Kobo Libra 2
It sounds like you're approaching programming from an artistic sense rather than a utilitarian sense, and I suspect many programmers see programming solely as a means to an end, rather than a form of artistic expression. Of course, those programmers might wish that they had focused more on writing beautiful code when the time for debugging comes around...
#207 - Still reading | Posts: 14,214 | Karma: 105212135 | Join Date: Jun 2017 | Location: Ireland | Device: All 4 Kinds: epub eink, Kindle, android eink, NxtPaper
There isn't any one language for everything, and learning to be a programmer rather than learning a programming language is not about learning multiple languages. It's about concepts and principles that might be illustrated in two or three languages. About dangers too.

Microcontrollers that can't run an off-the-shelf OS: PIC family micros have only enough stack for a call and return, so JAL is the best language. If the microcontroller has a decent stack then Forth may be a good choice. BASIC and C have been bad choices for years.

If you are customising a CMS you need skills in PHP and JavaScript (also CSS and HTML, but those are description, not programming) and also SQL (which isn't really a programming language, but can have stored procedures). Some server-side code may use Java too. Less popular CMSes may use other languages.

If it's an ARM microcontroller that can run a cut-down Linux, then C++ may be best.

In the very unlikely situation that I was writing a program for DOS or an 8088 to 80486 without Windows or a real OS, I'd use Modula-2, because I have the tested libraries and even device drivers that bypass DOS for timers, graphics, MIDI, serial, parallel, SoundBlaster, PC speaker 5-bit audio, disk I/O and parallelism up to animated sprites. It only uses DOS to load into RAM at the start.

Most programming isn't from scratch, but fixing security, fixing bugs or adding features to existing programs. Perl or Python might be best to translate a file from one format to another (WordStar 6.0 to text, or ebook formats in calibre).

You basically need Java to write an Android app. Unlike a desktop, server or embedded system, there is no choice.

You'll need to be able to read C and C++ to use existing Windows and Linux APIs reliably.

VB6 is gone, so for rapid MS Windows development of a GUI program with SQL, for in-house or own use, C# is best.

You might need C, Pascal, Turbo Pascal, Fortran, Cobol, VB6 or various assemblers to re-write or translate existing code or libraries for a new platform (e.g. an RDS decoder in assembler to JAL on a PIC micro, or to C++ on something with a real OS).

One language for the desktop only, if you have a clean sheet and no legacy APIs or libraries to use, might be possible. In the real world you need to be a programmer, not someone who has learnt a programming language. A real programmer can understand an unseen language in less than a day and be competent in a week. It's the libraries, frameworks and APIs that take time with any new system or language.

Learning one popular language only, rather than programming, is a massive handicap. It's like learning all the menus and settings of a massive desktop program without learning the principles: see MS Project without learning project management, or MS Word or Scrivener or InDesign without knowing things from two domains, like how to write (ideas, outlines, dialogue etc.) and also the concept that everything needs separate styles.

In the early 1990s I was supposed to teach WordPerfect to a class learning to be secretaries. At work experience not one of them was using WordPerfect, but they didn't panic, because the teaching concentrated on what you needed to understand about documents and what to look up, not WordPerfect commands and menus.

Last edited by Quoth; 12-28-2022 at 05:06 AM.
#208 - Somewhat clueless | Posts: 772 | Karma: 9999999 | Join Date: Nov 2008 | Location: UK | Device: Kindle Oasis
Part of the skill of a good programmer is in choosing the right tool for the job - if you only know one language, your options are limited. As DSNB said above, if your only tool is a hammer, all problems look like nails.

But let's assume that it is indeed the case that one language can be used to accomplish all your tasks - why would you want to spend time learning to be a "good" programmer? Because becoming a good programmer will allow you to accomplish those tasks more quickly, with higher quality code: code that runs more quickly, uses less memory and other resources, is less buggy, has fewer security holes, is more amenable to code reuse, is easier to parallelise, and is easier to maintain (etc. etc.).

There's much, much more to being a good programmer than simply knowing a programming language. A good programmer should understand the underlying model of computation (e.g. the Church-Turing thesis, lambda calculus, Universal Turing Machines), complexity analysis (big-O etc.) and so on. They should also have a good understanding of how the machine they're programming works (both hardware and OS, if there is one, and VM if there is one) so they can write efficiently for it - e.g. an understanding of the memory hierarchy, so that code can be written which, for example, has good locality of reference and limits its working set to minimise cache misses and page faults. They should understand the processor architecture so that multiple cores and/or processors can be used effectively. Learning a low-level language like C helps with all this.

A good programmer should also have an extensive toolkit of problem-solving approaches at their disposal - learning different languages with different paradigms (OOP, functional programming, logic programming etc.), and even different languages with different takes on the same paradigm, expands that toolkit. Even if you don't use those other languages for a particular task, an understanding of the way they work can allow you to see the problem in different ways and come up with better solutions in whatever language you actually are using.
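To make the memory-hierarchy point concrete, here is a small added sketch (not from the post): both functions compute the same sum, but the traversal order decides how well the cache lines are used.

```c
#include <stddef.h>

#define N 1024

/* Two ways of summing the same N x N array. C stores it row-major, so
 * sum_row_major() walks memory sequentially and uses every element of a
 * fetched cache line, while sum_column_major() strides N doubles (8 KiB
 * here) between consecutive accesses and takes far more cache misses. */
double sum_row_major(double a[N][N])
{
    double total = 0.0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            total += a[i][j];
    return total;
}

double sum_column_major(double a[N][N])
{
    double total = 0.0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            total += a[i][j];
    return total;
}
```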
#209 - Somewhat clueless | Posts: 772 | Karma: 9999999 | Join Date: Nov 2008 | Location: UK | Device: Kindle Oasis
Java isn't the only option any more, though: Kotlin has been the preferred language for new Android development for a few years now. There have also been several other alternatives for Android apps for a while - e.g. Scala and Flutter/Dart. Even C and C++ can be used for parts of Android apps using the Android NDK.
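As a rough illustration of the NDK route (a minimal sketch; the package, class and function names are invented for this example), a Kotlin or Java class declares a native method and a small C function implements it, with the exported symbol name encoding the package, class and method as JNI requires:

```c
#include <jni.h>

/*
 * Implements a hypothetical Kotlin/Java declaration in class
 * com.example.demo.NativeLib:   external fun add(a: Int, b: Int): Int
 * The symbol name follows the JNI naming convention so the Android
 * runtime can find it in the shared library built by the NDK.
 */
JNIEXPORT jint JNICALL
Java_com_example_demo_NativeLib_add(JNIEnv *env, jobject thiz, jint a, jint b)
{
    (void)env;   /* unused here, but every JNI call receives the env pointer */
    (void)thiz;  /* the NativeLib instance the method was called on */
    return a + b;
}
```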
#210 - Still reading | Posts: 14,214 | Karma: 105212135 | Join Date: Jun 2017 | Location: Ireland | Device: All 4 Kinds: epub eink, Kindle, android eink, NxtPaper
Preferred by whom? Edit: I see it's Google, so likely politics, as they never had a Java licence for the flavour Android needs. Preferred by new programmers, or by ones that started doing phone apps with the crippled Java allowed on Symbian? (Sun only licensed full Java for the desktop, and Oracle continued this; Google bought in Android, ignored the licence violation, and ended up in court.) Or is it, like Dalvik, an attempt to backtrack from the licence violation?

The actual preferred language for Android is a mere detail that doesn't matter to real programmers; the point is that it's different. Even Java for Android and Java for Windows/Mac/Linux are so different in APIs, in what's possible, and in how stuff works. I confess it's years since I wrote either an app for Android or a desktop Java app using a framework so it would look like the user's native desktop theme on Mac / Windows / Linux. But the point is that writing an app for Android isn't going to use the same language as your washing machine controller.

Last edited by Quoth; 12-28-2022 at 11:30 AM.