What programming language would you recommend to a beginner?


It were a 'ard life then. Writing naughty words on a 7 segment LED display using only the letters A to F took some doing, especially when you're coding with hex. No fancy, la-de-da assemblers in those days...

Eeeee, those were the days.
Aye, but were fun, though. I think I might have played with a KIM-1 somewhere along the line. What I did play with, though, was the SC/MP. My 3rd year project involved looking at these new-fangled microprocessory things to see if they showed any potential for satellite or telescope control systems (I was studying astrophysics and the dept was big into sats and scopes). The project was simply to integrate the SC/MP kit (similar set up to the above) with a couple of stepper motors / shaft encoders for precision control. I think it took me all of 2 days, so I connected up the '.' lines on the LEDs, designed a memory expansion, wrote some calculator software for it and a cross-assembler on the mainframe (because I was too lazy to hand-translate to hex).

Whilst modern machines are considerably more powerful, I don't have the same sense of 'connection' to them as I did to those earlier devices that demanded a far greater intimacy with their workings.
Sure. Start your own thread. There's potentially enough mileage in the current subject for a bit of straight and level before manoeuvring into that big a meander - the first half of such a thread would be devoted to determining what you mean by 'think'.

Been done. Skynet is an early example.

Again, separate thread stuff. Ask an adult to show you how if you can't work it out.

The question itself is an old one and there is much literature on it. The Bekenstein Bound is one place to start thinking.

A Consultant? Ah. That explains much ... not everything, but much. I refer to an earlier message in this thread.

... 'Procurement Consultant'? Well, I suppose it looks better than 'Pimp' on your passport. ^^

(... and there's me thinking you liked to call a spade a spade)

But you're happy enough to insult us on the equipment we provide! :evil:

Your anti-intellectual slip is showing again.
It makes me weep when I think of the millions wasted on sophisticated software systems and dicky hardware that are rushed into businesses with insufficient training by overpaid IT contractors. Terms and conditions that pin down commissioning and warranties are essential before a penny is paid... nothing anti-intellectual, just prudence.

Machines that replace humans are something I look forward to.... I can get back to proper pimping then...:)
I spent the summers of my time at Uni working for Ferranti (they were sponsoring me).

Navigation Systems Division had this range of programmable ATE kit called "FIST". Proper computers - it took up most of a 19" rack, it had ferrite core memory, and it had a front panel. With thumbwheel-and-lighted switch inputs (silly octal, none of your fancy hexadecimal).

Ar, them were't days. Anyone who worked with GR.3 might have seen its replacement; single container, lots of connectors...
A sensible (ish) answer (on ARRSE? Yeah right...) - C++, get your head round OOP. Once you understand OOP then the chosen language is really pretty immaterial. A working knowledge of VB helps with writing VBA in MS Office, similarly Javascript in Open Office. C is good for writing fast code, but lousy at writing good code; it allows programmers to write some spectacularly messy code.

Some background in database queries invariably means one of the many variants of SQL.

Web-based apps - HTML, XML, PHP - any further than that is beyond me.

If you're off into the realms of scientific computing and want to earn bucketloads - LabVIEW, MATLAB.

If your talents lie more with the black-hand gang then PLC programming - Siemens STEP 7 - if you can code in that, then you can code on any other PLC.



I also could do with a cheap/free language and/or programming course to get back up to speed, as I haven't done any serious programming for years, but need to enhance my employability.

I used Sinclair ZX81/Spectrum BASIC as a kid, used Z80 machine code and assembly language as a BTEC student, then used C (Microsoft QuickC to be precise) and 68000 assembly language at University, also using QBasic to write a program to calculate the look angles for a satellite communications link (and do link budget calculations), but since then not too much. I would like to be a confident programmer - able to write decent programs for calculations and engineering programs that interact with hardware inputs and outputs.

Any suggestions?

Also, can one do a short course on PLC programming? Since my background is more Communications (instead of Power/Control) Engineering, how helpful do you think it would be regarding employability?

See also my post here: http://www.arrse.co.uk/intelligence-cell/161079-poor-uk-education-standards-41.html#post4303264


Any ideas with respect to learning and developing marketable programming skills?


Any ideas with respect to learning and developing marketable programming skills?
Look at the job adverts for the big software companies and see what their requirements are. The division I work for in my particular company is strong on C++ and Java. But I'd question whether you could make up the ground on a career programmer. They not only know the language but have often picked up a tool kit of specialist skills - for example one or two of the developers I work with are good at developing code involving the efficient 3D manipulation of large numbers of objects.

A word of warning (and I don't know how typical this is) but my company sets interviewees a practical coding test. They assume that candidates will have basic knowledge of coding - so are interested in a person's problem solving skills. So anyone coming in for interview gets 1-2 hours to code a solution for a simple problem. A senior developer then gets them to talk through their code and the logic behind it. The pass rate is not high...

The other avenue you might look at is if you could develop niche programming skills - for example you can call pretty much all of AutoCAD's functions via code, allowing you to build specialist functionality on top of it. If you were already skilled in particular commercial software, this might be worth pursuing.



Why programming necessarily? Where I work they are rapidly being regarded as offshoreable roles with little value being given them. I don't agree with it, but it's how it is. Analysts are more highly valued (and paid) and seem to have better job security. That and Oracle DBAs. An OO background in development is very useful, but my advice would be to get into a role further up the chain that does not rely on one or two languages.
Any ideas with respect to learning and developing marketable programming skills?
Microsoft Visual Studio is quite good. The Express version provides you with a free development environment for building C++, C# and Visual Basic applications.

If you have Microsoft Excel or Word, then you can use the Macro editor as a learning environment - it provides an object browser, variable browser and an Immediate window to output expressions to. It also has the added advantage of allowing you to gain skills in programming Excel or Word.


Time to reconsider this.
A side question to programming languages:

Is there still a requirement for the teaching of binary?
The usual explanation is that computers 'think' in binary. This relates to the on/off state, but a computer 'thinks' in binary no more than any electrical circuit does. (More 'I think, therefore I am on'.)

You don’t have to learn about binary to become an electrician, but you do have to learn a little about logic circuits to wire a light and switches for stairs.
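As it happens, the stairs example is itself boolean logic: a pair of two-way switches implements an exclusive-or, so the light is on exactly when the two switches disagree. A quick sketch (the function name is made up for illustration):

```python
# Two-way (staircase) switching is an XOR: the light is on
# exactly when the two switches are in different positions.
def stair_light(switch_a: bool, switch_b: bool) -> bool:
    return switch_a != switch_b

# Flipping either switch, from either end of the stairs, toggles the light.
assert stair_light(False, False) is False
assert stair_light(True, False) is True
assert stair_light(False, True) is True
assert stair_light(True, True) is False
```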

Back in the 80s at school the lessons went from binary, CESIL*, BASIC, then touching hexadecimal and onto the variety of programming languages, machine code and assembler.

* CESIL was a simple language to teach us the very basics of writing a program. Usually to input some numbers and add or multiply them.
We had to write out the program then wait a week until it came back with results or errors. The language only had simple commands such as IN, OUT, PRINT (for text), we could use variables but all functions acted on a register so we had to move things around with LOAD and STORE.
What nobody told us was that we were really programming in assembler.

They made such a big thing about binary, and though we wrote in CESIL and sent our programming sheets off week after week they made little of it. When it came to machine code & assembler these were rushed through as complicated and not really required unless we were designing electronic circuits. Just 5 minutes of thought and referring us back to CESIL would have told us we could easily use assembler and a new generation of tech wizards could have been born.

It was only when I was at college and started a project to write a CESIL interpreter that I quickly realised it was just assembly language and scrapped the project.
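For anyone curious what that scrapped interpreter would have amounted to, here is a toy sketch of a CESIL-style accumulator machine in Python. It is a hypothetical minimal subset - only the instructions mentioned above plus ADD and HALT; the real language also had SUBTRACT, MULTIPLY, DIVIDE and jump instructions:

```python
# A toy interpreter for a CESIL-like accumulator machine.
def run(program, inputs):
    acc = 0            # the single register everything acts on
    memory = {}        # named variables, reached via LOAD/STORE
    inputs = iter(inputs)
    output = []
    for op, arg in program:
        if op == "IN":         # read next input into the accumulator
            acc = next(inputs)
        elif op == "LOAD":     # variable -> accumulator
            acc = memory[arg]
        elif op == "STORE":    # accumulator -> variable
            memory[arg] = acc
        elif op == "ADD":      # add a variable to the accumulator
            acc += memory[arg]
        elif op == "OUT":      # output the accumulator
            output.append(acc)
        elif op == "PRINT":    # output literal text
            output.append(arg)
        elif op == "HALT":
            break
    return output

# Read two numbers and print their sum - the classic exercise.
program = [
    ("IN", None), ("STORE", "A"),
    ("IN", None), ("ADD", "A"),
    ("PRINT", "SUM IS"), ("OUT", None),
    ("HALT", None),
]
# run(program, [3, 4]) returns ["SUM IS", 7]
```

Everything funnels through the accumulator, which is exactly why it was really assembler in disguise.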

A friend is now at university, recently learned binary and is moving onto languages. He was given the same reason – computers think in binary.
If a computer ‘thinks’ in anything, it doesn’t do its ‘thinking’ in on/off but within the processor, which will be 32-bit or more; most likely really working in 16 bits, with a command and a value to work on.

What is the point in binary to modern computing other than something simple that if people can’t get a grasp then they are going to struggle with the rest, or for a geek exercise?
What is the point in binary to modern computing other than something simple that if people can’t get a grasp then they are going to struggle with the rest, or for a geek exercise?
It's because of boolean logic. If you can't cope with the basic concepts involved in AND, OR, NOT, XOR then you're not going to write control code.

If you can't do simple boolean maths, you'll never understand that NOT(A AND B) is the same as (NOT A) OR (NOT B). You'll write "NOT A AND NOT B" in the wrong place, possibly by accident, and then wonder why your code behaves strangely.

It gets fun when you find out that some spoken languages view the word "or" as being a logical "exclusive-or" (one, or the other, but not both), whereas other spoken languages view it as meaning "inclusive-or" (one, the other, or both together) - e.g. English and Japanese respectively.
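Both points can be checked exhaustively over the four rows of the truth table - a quick sketch:

```python
# De Morgan's laws and the two flavours of "or", checked over
# every combination of inputs.
for a in (False, True):
    for b in (False, True):
        # De Morgan: the negation distributes, flipping AND/OR.
        assert (not (a and b)) == ((not a) or (not b))
        assert (not (a or b)) == ((not a) and (not b))
        # Inclusive-or is "one, the other, or both"; exclusive-or
        # ("one or the other, but not both") differs only when
        # both inputs are true.
        assert (a ^ b) == ((a or b) and not (a and b))
```

Four lines of loop, and the identity at the heart of half the world's misbehaving if-statements is verified.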


We were taught Pascal at college. I'd probably recommend C++, but it would depend on what their ultimate goals in programming were.
As a software engineer since 1985, I generally agree with this. My own opinion is not really relevant. A previous Industrial Trainee whinged that at Uni they were taught Java, which was easy to learn, then C++, which taught them how to do properly the things that Java had let them be slack about. Better had they learned C++ first.

I learned Pascal (and Cobol - spit) at college in the 80s to go with PL/1 on the IBM mainframes that drove Army Pay and Manning Services. I went back to the college a couple of years later and they had dropped Pascal for C++.

Everything I learned has been undermined by upstart languages that are great for everything except the mainframes that drive (for example) 100% of a certain bank's mission critical business. Not that lot that were in trouble recently: they let their software get out of date then employed some monkey to update it badly. I am about to start work on a one-touch install for mainframe software, something that's decades past due, but there are no longer the system programmers to do the job properly, so it becomes ever more necessary as time passes. Sooner I retire, the better.

I'll be writing it in REXX, the Restructured EXtended eXecutor language, or System Product Interpreter if you prefer. Sooner I retire, the better.
Not wanting to derail this with the binary comments, but there are varying degrees of 'understanding binary' which are more or less useful depending on what you do. One overarching principle holds true, however:

"Sometimes you need to step down a level from the abstraction which helps you. Often this is when it doesn't do what you want or need it to or it goes wrong. It is much harder to learn down the levels coming from the top down than it is to climb the abstractions on the way up."

For example, learning two's complement opens the door to understanding how bytes relate to bases; bases open the door to encoding; and encoding is important when you ship data between systems, literally the values in your bytes. Whatever you think about needing to understand set, read, write, move, compare, add and other basic instructions, understanding binary is helpful even for those working at very high levels of abstraction.
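As a small illustration of that first rung - two's complement in 8 bits (a sketch; the helper names are my own):

```python
# 8-bit two's complement: the bit pattern of -x is 2**8 - x,
# which is why the same ADD circuit handles signed and unsigned.
BITS = 8

def to_pattern(value, bits=BITS):
    """Signed value -> unsigned bit pattern (mask off high bits)."""
    return value & ((1 << bits) - 1)

def from_pattern(pattern, bits=BITS):
    """Unsigned bit pattern -> signed value (weight the sign bit)."""
    if pattern & (1 << (bits - 1)):   # sign bit set?
        return pattern - (1 << bits)
    return pattern

assert to_pattern(-1) == 0b11111111        # -1 is all ones
assert from_pattern(0b10000000) == -128    # most negative value
# Subtraction really is addition of the complement:
assert from_pattern((to_pattern(5) + to_pattern(-3)) & 0xFF) == 2
```

Once that clicks, byte order, masking and the rest of the climb down the abstractions get a lot less mysterious.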
"Sometimes you need to step down a level from the abstraction which helps you. Often this is when it doesn't do what you want or need it to or it goes wrong. It is much harder to learn down the levels coming from the top down than it is to climb the abstractions on the way up."
Agreed. However, my major PITA of the moment is software written in the "what's the minimum change I can do" mode, rather than stepping back and trying to see the wood instead of the trees; and software that assumes that everything will work wonderfully, and that no-one will ever pass them a crappy input. At least I can point them at the alarms raised by the static analysis tools, and ask them to fix them.

While it's often very helpful to be able to cope without the abstractions that make life much easier, I find that I'd rather people tried to write abstracted, encapsulated, loosely-coupled code with minimal APIs rather than just hacking in another public function because it's easier, or adding a boolean flag to a parameter list to make the function behave differently.

I'm about to refactor a library that's been written and rewritten and hacked and patched by multiple engineers, where no-one has taken the half-hour needed to ask "actually, why do we do it this way?". If I get it right, it will be simpler, cleaner, faster, and safer.
