> Why do you say programming has gone nowhere because there are people who don't use the newer advancements?
Speaking for myself, it's because Visual Studio and its ilk don't feel like advancements. They feel like band-aids. They do their damnedest to reduce the pain of programming by automatically producing boilerplate, auto-completing words, providing documentation, and providing dozens of ways to jump around text files.
Personally, I don't feel that the bandaid does enough to justify using it (granted, my main language doesn't work well with Visual Studio, so there's that too).
The main source of the pain, to me, is that we're still working strictly with textual representations of non-textual concepts and logic, no matter how those concepts might better be rendered. We're still writing `if` and `while` and `for` and `int` and `char` while discussing pointers and garbage collection and optimizing heap allocation... Instead of solving the problem, we're stuck describing actions and formulas to the machinery. No IDE does anything to actually address that problem.
Sorry, rant, but this problem certainly resonates with me.
>The main source of the pain, to me, is that we're still working strictly with textual representations of non-textual concepts and logic, no matter how those concepts might better be rendered.
I can't see any issue with representing logic abstractly with symbols. It's the same for calculus. Of course the ideas we're representing aren't actually the things we use to represent them, the same as written communication.
Non-textual programming has been explored to some degree, such as Scratch, but it's not seen as much of a useful thing.
>Instead of solving the problem, we're stuck describing actions and formulas to the machinery. No IDE does anything to actually address that problem.
Describing actions and formulas to a machine in order to make it do something useful is pretty much the definition of programming. IDEs make it a more convenient process.
Unless you want to directly transplant the ideas out of your neural paths into the computer, maybe some AI computer in the future based on a human brain, this is how it's going to be.
> I can't see any issue with representing logic abstractly with symbols.
That's the problem: text isn't abstract enough. So we put some of the text into little blobs that have names (other methods), and use those names instead, and we call that "abstraction," but black-box abstraction doesn't help us see. The symbols in calculus, by contrast, are symbols that help you see. The OP is calling for abstractions over operating a computer that help us see.
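A minimal sketch of what I mean by a black-box "blob" (the function and its name are mine, purely for illustration): the call site tells you *that* something happens, not *how*, so the name hides the text without making the logic more visible.

```python
def normalize(values):
    # The "blob": callers see only the name `normalize`,
    # not this body. The name is the whole abstraction.
    total = sum(values)
    return [v / total for v in values]

# At the call site, the logic is opaque; you must open the box to see it.
print(normalize([1, 3]))  # [0.25, 0.75]
```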
Agreed. There must be a more abstract way to present ideas than text. That way, programs would be easier to understand and modify, and would have fewer errors and bugs.
I am suspicious. I think it would certainly be easier in some ways for rank beginners - it would make spelling errors and certain classes of syntax errors impossible - but those aren't really the bugs that cause experienced programmers grief. It's generally subtly bad logic, which is more about how people are fallible. Plus, we already know how to create computer languages that largely avoid those problems.
Written language is wonderful in many respects, and I sometimes think people discount these things out of familiarity. Keyboards too - you can do things very quickly and very precisely with keyboards. Those things matter for your sense of productivity and satisfaction.
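To put a concrete face on the spelling-error point (example and names are mine): a misspelled identifier is exactly the class of error existing tooling already surfaces. In Python it shows up as a NameError the moment the line runs, and statically checked languages reject it before running at all.

```python
def greet(name):
    # Deliberate typo: 'nmae' instead of 'name'.
    return "hello, " + nmae

try:
    greet("world")
except NameError as e:
    # The runtime reports the misspelled name; no blocks-based
    # editor is needed to catch this class of bug.
    print(e)
```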
> ...while discussing pointers and garbage collection and optimizing heap allocation
People still do that?
We still use 'if' and 'then', but higher level languages like Ruby and Python have eliminated pointers and their ilk from day-to-day discussion, relegating heap allocation discussion to the halls of specialized conferences, while many programmers go about their day-to-day activities skipping over the pain of garbage collection.
They may not win language shootout speed tests, but they remove a lot of programmer pain and shorten the time it takes to write code.
Of course, but I don't see how that "problem" will ever be solved. Wherever a process has been abstracted, there will always be someone who has to look after that abstraction.
Somebody has to design and engineer the hardware and every level of abstraction between that hardware and whatever abstraction layer the "average" developer uses.