Which programming languages don't you use, and for what reason?

That does work? The function declaration must contain the type too, right?
Yes that works. Indeed, a typedef just introduces an alias name, not a new type. Of course, the function declaration must declare the types, but still Apple and Orange are just different names for the same type here.
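To make that concrete, here is a minimal sketch (the Apple/Orange/MakeApplesauce names follow the example being discussed; the function body is just a placeholder):

typedef int Apple;   /* alias names only -- both are plain int */
typedef int Orange;

void MakeApplesauce(Apple a) { (void)a; }   /* nominally wants an Apple */

int main(void)
{
    Apple  Macoun   = 1;
    Orange Valencia = 2;

    MakeApplesauce(Macoun);     /* fine, as intended */
    MakeApplesauce(Valencia);   /* also compiles: Orange is just another name for int */
    return 0;
}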

C even has the concept of compatible types. A declared struct is a new type, but still two structs with identical "bodies" are compatible as soon as they are declared in different compilation units ... 🤯 (I mean, it makes sense because there's no type information in the ABI, but it sounds quite weird at first).
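A rough sketch of that situation (two hypothetical files, names invented for illustration):

/* file1.c */
struct Point { int x; int y; };    /* declared here ...               */
void consume(struct Point p);      /* ... defined over in file2.c     */
int main(void) { struct Point p = {1, 2}; consume(p); return 0; }

/* file2.c */
struct Point { int x; int y; };              /* separate declaration, same "body"   */
void consume(struct Point p) { (void)p; }    /* the two Point types are compatible  */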

You can ignore all types in C by just referring to bytes all the time. That makes it not strongly typed?
Actually no, you can't. You can always access any byte of any object in C because the language allows a char * pointer to alias any other pointer type. But that won't get you far, because the actual representation of each type is implementation-defined. An int on your typical amd64 looks very different in memory than it does on, say, an MC68000.
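For instance, this little sketch peeks at the bytes of an int through an unsigned char pointer; what it prints depends entirely on the implementation, which is the point:

#include <stdio.h>

int main(void)
{
    int n = 0x00010203;
    unsigned char *p = (unsigned char *)&n;   /* char pointers may alias anything */

    for (size_t i = 0; i < sizeof n; i++)
        printf("%02x ", p[i]);                /* byte order and width are implementation-defined */
    printf("\n");
    return 0;
}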
 
I use none of them -- no interest. There's a long way to go from "Hello World" to anything useful, and I just don't have the time or inclination.
 
That does work? The function declaration must contain the type too, right?
The compiler sees this as:

int Macoun;
int Valencia;

MakeApplesauce(Valencia);

where the function prototype would be MakeApplesauce(int) -- because MakeApplesauce(Apple) resolves to that.
 
I prefer C++ over C because it makes things easier.
You likely work in a resource-rich environment -- where execution time and available memory are "without cost".

When your boss comes to you and says:
"I need a gizmo that does X, Y and Z. And, has these general performance requirements. How much will the hardware cost? And, how long will it take to develop, given that we can't make changes to it after it leaves the factory?"​
then, you want a language that gives you a good idea of how to estimate these things.

If you look at things like Lifecycle Controllers (Dell servers), you can see that the folks who planned and designed the hardware and software obviously were disappointed with the performance they got from their efforts (they are invariably sluggish). But, the cost of refactoring the hardware and software designs has obviously led those Powers-That-Be to conclude (rationalize!): "It's good enough."

Should it really take more than a minute for a *server* (not some schlock bit of consumer hardware) to actually get to the point where it LOOKS at the boot device? How many billion opcode fetches have transpired? What the hell were they all doing??!
 
Yes that works. Indeed, a typedef just introduces an alias name, not a new type. Of course, the function declaration must declare the types, but still Apple and Orange are just different names for the same type here.

C even has the concept of compatible types. A declared struct is a new type, but still two structs with identical "bodies" are compatible as soon as they are declared in different compilation units ... 🤯 (I mean, it makes sense because there's no type information in the ABI, but it sounds quite weird at first).


Actually no, you can't. You can always access any byte of any object in C because the language allows a char * pointer to alias any other pointer type. But that won't get you far, because the actual representation of each type is implementation-defined. An int on your typical amd64 looks very different in memory than it does on, say, an MC68000.
I think a way to bypass type checking doesn't make a compiler "weakly typed".
Then any direct write access to memory space would do so. A block of bytes has no type either.
That's why I believe the C/C++ vs. Rust debate has a corporate origin. It's not about security but end-user control of the machine. "Ecosystems", locking the user inside a software construct. The biggest scam in computing history.
 
I think a way to bypass type checking doesn't make a compiler "weakly typed".
C's notion of types is just an illusion, for the most part. One wants to be able to create new types so you can leverage the compiler's ability to ENFORCE restrictions on the sorts of things a coder could do -- like making applesauce out of oranges.

C++, OTOH, would prevent me from making "applesauce" out of oranges and "orange juice" out of apples. But, getting this level of protection comes at too high of a cost!
 
C's notion of types is just an illusion, for the most part. One wants to be able to create new types so you can leverage the compiler's ability to ENFORCE restrictions on the sorts of things a coder could do -- like making applesauce out of oranges.

C++, OTOH, would prevent me from making "applesauce" out of oranges and "orange juice" out of apples. But, getting this level of protection comes at too high of a cost!
Isn't everything that abstracts away from the hardware architecture an illusion? I mean, in the end your whole program structure is made of AND, OR and XOR with an instruction pointer to remember where you're at.
No reason to enforce limitations to this with prescribed programming methods and languages. They might like it at Google, MS and Oracle, though.
 
So, I should write my OS in the same language that I use to write the device drivers that tie into it, the applications that run atop it, the interface to the RDBMS and the scripts that users want to be able to write? That would lead to either an overly complex language or a crippled one.
This is kind of the approach that the "post-UNIX" industry took at first. "C for everything"

Though I believe a decent compromise was quickly made with (close) supersets of C. For example, whilst the kernel and drivers are written in C, the DBMS could be written in C++. Importantly, the DBMS in C++ could directly interface with the operating system's C libraries. This part is key.

Yes, C++ is a bit grim in places (and Obj-C has gone into hibernation). But projects like cppfront, carbon, circle, etc. are trying to come up with safe languages that can also leverage C code directly (i.e. without bindings). I strongly suspect that this will be the future. Code rarely gets rewritten; a future language that can correctly consume the "old stuff" is valuable. Possibly the continued success of C++ has demonstrated that this feature is more valuable than safety and ease of use.
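For what it's worth, "directly interface" really is as plain as it sounds -- a small sketch of a C++ translation unit consuming the C standard library with no bindings in between (the file path is just an arbitrary example):

#include <cstdio>    // the C library's I/O, consumed directly from C++

int main()
{
    // No wrappers, no marshalling: C++ calls straight into the C ABI.
    std::FILE *f = std::fopen("/etc/hostname", "r");
    if (f) {
        char line[128];
        if (std::fgets(line, sizeof line, f))
            std::printf("%s", line);
        std::fclose(f);
    }
    return 0;
}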
 
Isn't everything that abstracts away from the hardware architecture an illusion? I mean, in the end your whole program structure is made of AND, OR and XOR with an instruction pointer to remember where you're at.
No reason to enforce limitations to this with prescribed programming methods and languages. They might like it at Google, MS and Oracle, though.

There is a difference between illusion and abstraction.

Illusion is saying Apples are not the same thing as Oranges -- when the implementation says otherwise!

Abstraction is saying that an int can represent an Apple. Or, an Orange.

In no case does MakeApplesauce(42) make sense (which is how the compiler sees this example).

Though I believe a decent compromise was quickly made with (close) supersets of C. For example, whilst the kernel and drivers are written in C, the DBMS could be written in C++. Importantly, the DBMS in C++ could directly interface with the operating system's C libraries. This part is key.

You misread my post:
So, I should write ... the interface to the RDBMS ....

I.e., do away with the domain specific language called "SQL" in favor of some sort of C++ syntax that implements similar functionality? Just so all of the code is C++ instead of C, C++, SQL, etc.? (SQL being the language most appropriate for interfacing to an RDBMS)

[Edit doesn't allow quoting so I resort to copy-and-paste...]

"Yes, C++ is a bit grim in places"

I like what I can do in C++. It is nice to be able to use a Q10.6 format and still write arithmetic equations -- overloading arithmetic operators (sketched below). The alternative messes up your code with lots of function call syntax that makes it harder to see what is happening.

I like try-catch to make error handling "cleaner" (to the reader).

I like being able to have a Class Apple and a Class Orange -- even if their underlying types and methods are strikingly similar.

etc. Everything that lets me make my code more visually concise (i.e., taking up less paper) is an enhancer for comprehension; the reader can see the whole program instead of just one page of it.
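As promised above, a minimal sketch of the Q10.6 idea (the type name and scaling are illustrative, not anyone's actual gesture-recognizer code) -- a thin wrapper over a 16-bit integer with 6 fractional bits, with the arithmetic operators overloaded so equations still read as equations:

#include <cstdint>
#include <cstdio>

// Q10.6: 16 bits total, 10 integer bits, 6 fractional bits (illustrative layout).
struct Q10_6 {
    int16_t raw;
    static Q10_6 fromDouble(double d) { return Q10_6{ (int16_t)(d * 64.0) }; }
    double toDouble() const           { return raw / 64.0; }
};

inline Q10_6 operator+(Q10_6 a, Q10_6 b) { return Q10_6{ (int16_t)(a.raw + b.raw) }; }
inline Q10_6 operator-(Q10_6 a, Q10_6 b) { return Q10_6{ (int16_t)(a.raw - b.raw) }; }
inline Q10_6 operator*(Q10_6 a, Q10_6 b) {
    // Widen before multiplying, then shift away the extra 6 fractional bits.
    return Q10_6{ (int16_t)(((int32_t)a.raw * b.raw) >> 6) };
}

int main()
{
    Q10_6 x = Q10_6::fromDouble(3.25);
    Q10_6 y = Q10_6::fromDouble(1.5);
    Q10_6 z = x * y + x;                  // reads like ordinary arithmetic
    std::printf("%g\n", z.toDouble());    // 3.25 * 1.5 + 3.25 = 8.125
}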

But, the rest of the crap that the *implementation* drags into play is a killer. IMO, one needs to be very much more expert in understanding the details of the language to be able to "know" what a line of code is doing "on the CPU's address and data busses". As one is typically highly engrossed in the problem THEY are trying to solve when writing code, I suspect too many folks would "miss" some detail that they should have recognized and, later, wonder why the code is so lethargic.

[I wrote to Stroustrup about a typo in _The D & E of C++_. His reply: "You're right! Everything on that PAGE is backwards!" I.e., he was likely more focused on WRITING than in thinking about what the code he was writing was actually doing! But, I should be more aware than he?? o_O ]

This is a common example I give during interviews: Which is the better implementation:

for (row = 0; row < MAX_ROW; row++) {
    for (col = 0; col < MAX_COL; col++) {
        mat[row][col] = f(row, col);
    }
}

or:

for (col = 0; col < MAX_COL; col++) {
    for (row = 0; row < MAX_ROW; row++) {
        mat[row][col] = f(row, col);
    }
}

The point being, if you saw one -- or the other -- in a piece of code, would you think about whether it was the "right" way to do this, or the "wrong"? (In C, mat[row][col] is stored row by row, so the row-outer version walks memory sequentially and is typically the cache-friendlier of the two.) Chances are, you'd just gloss over it as you are preoccupied with other things... Understanding what the code is intended to do and what it ACTUALLY is going to do should be the goal of any good language design. Because people are lazy.

An interesting challenge:
take three bottles and three chopsticks (in my mind, I always imagine "Coke" bottles from the 1960's)
arrange the chopsticks so that none touches the ground/table/worksurface

(Easy peasy)

Now, remove one bottle and repeat the exercise.

Now, remove another bottle...

The point: all three problems are trivially solvable. And, each successive (more constrained) problem's SOLUTION is technically a valid solution for the problem preceding it.

SO WHY DIDN'T YOU COME UP WITH THE THIRD SOLUTION IN THE BEGINNING??
 
Abstraction is saying that an int can represent an Apple. Or, an Orange.
That's not an abstraction, that's a simple translation/representation. An abstraction would be defining them both as "fruit", and using it in a generalized function MakeJuice(fruit), which can create apple juice or orange juice depending on the type of fruit you shoved in it.
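In C++ terms, that generalization might look something like this (a toy sketch -- Fruit/MakeJuice are invented for the discussion, not a real API):

#include <cstdio>
#include <string>

// The abstraction: every Fruit can be squashed; MakeJuice relies only on that.
struct Fruit {
    virtual std::string name() const = 0;
    virtual ~Fruit() = default;
};

struct Apple  : Fruit { std::string name() const override { return "apple"; } };
struct Orange : Fruit { std::string name() const override { return "orange"; } };

// One generalized function: squash it, filter out the solids, label the result.
void MakeJuice(const Fruit &f)
{
    std::printf("made %s juice\n", f.name().c_str());
}

int main()
{
    MakeJuice(Apple{});
    MakeJuice(Orange{});
}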
 
That's not an abstraction, that's a simple translation/representation. An abstraction would be defining them both as "fruit", and using it in a generalized function MakeJuice(fruit), which can create apple juice or orange juice depending on the type of fruit you shoved in it.
The abstraction is that they are both objects -- with ints being a chosen way of referencing them. One could have used floats, strings, some complex data type, etc. to the same result. The example could just as easily have been:

typedef int Apple;
typedef int Automobile;

Apple Macoun;
Automobile Mercedes;

MakeApplesauce(Mercedes);

[Clearly Apples and Automobiles have no underlying commonality -- beyond being objects/things]
 
The abstraction is that they are both objects -- with ints being a chosen way of referencing them. One could have used floats, strings, some complex data type, etc. to the same result.
Those are all just different representations; a computer doesn't know (or care) what an apple or orange is, so we represent them in a way the computer can deal with. You're not abstracting here, especially if you stick it in a specific function designed for one specific value (or type) of object. The abstraction happens when you use a generalized function that works for any type of fruit, or even any object. The 'function' of making juice is the same for all fruits: you squash it, then filter out the solids.

Automobile Mercedes

MakeApplesauce(Mercedes)

[Clearly Apples and Automobiles have no underlying commonality -- beyond being objects/things]
A generalized function MakeJuice would have no problem with this. The result obviously won't be pretty, but the function itself would still work the same without having to be modified to accommodate an entirely different kind of object. Heck, somebody might find getting "car juice" useful in the future.
 
Those are all just different representations; a computer doesn't know (or care) what an apple or orange is, so we represent them in a way the computer can deal with. You're not abstracting here, especially if you stick it in a specific function designed for one specific value (or type) of object. The abstraction happens when you use a generalized function that works for any type of fruit, or even any object. The 'function' of making juice is the same for all fruits: you squash it, then filter out the solids.
A computer knows nothing of "integers", floating point numbers, characters, strings, etc. These are all abstractions that we (as humans) map onto "collections of bits" which, themselves, are abstractions of "states of charge" in an electrical circuit.

The types of operations that you can perform on an Automobile are different than those you can perform on an Apple -- regardless of how much you want to bastardize the analogy. What does MakeApplesauce *mean* in the context of an electronic circuit? Applesauce itself is an abstraction.
A generalized function MakeJuice would have no problem with this.
Just because the code doesn't CRASH doesn't mean the code is correct or not "problematic". What does MakeApplesauce(23) *mean* -- to a human? How do you map that onto a meaningful concept (i.e., abstraction)?
 
You misread my post:
Good point but in some ways that makes it even easier as a decision. Almost any interface (such as libmysql, liboci8, etc) is (and should be) written in C or exposed as a C API. The simple reason is that almost any language has some sort of interop with C.

I.e., getting Java <-> Python to share libraries is a pain in the butt.

In terms of an interface for communication between the DBMS, as you were referring to, then SQL is more like a protocol. Yes, I absolutely would not like to use C or C++ for that.

Things like SQL or HTML, these tend to not quite come under the same rules as typical languages. They aren't really "programming" languages as such.
 
A computer knows nothing of "integers", floating point numbers
Common general purpose CPUs actually do know of these. That's how they work, by manipulating numbers; that's the only thing they can do. Although manipulating floating point numbers was once done on an entirely separate co-processor, it has since been integrated into the CPU itself. And you may find certain architectures take a more modular approach (RISC-V for example), but the one thing they are all capable of is manipulating discrete numbers.
characters, strings, etc.
These are typically represented by discrete numbers using some form of encoding standard: ASCII, EBCDIC or Unicode, for example.

Just because the code doesn't CRASH doesn't mean the code is correct or not "problematic".
Computers are stupid machines: they do exactly what you tell them to do, correct to a fault. If the result isn't what you expected, or the computer does something "wrong", it simply means you, as its programmer, made a mistake (a logic error, for example).

What does MakeApplesauce(23) *mean* -- to a human?
Make applesauce from 23 apples? When I see a function defined this way I'm going to assume the argument is the amount of apples. Why would you provide a number representing an 'apple' as an argument to a function if the only thing it can do is create applesauce from apples?
 
Good point but in some ways that makes it even easier as a decision. Almost any interface (such as libmysql, liboci8, etc) is (and should be) written in C or exposed as a C API. The simple reason is that almost any language has some sort of interop with C.

I.e., getting Java <-> Python to share libraries is a pain in the butt.

In terms of an interface for communication between the DBMS, as you were referring to, then SQL is more like a protocol. Yes, I absolutely would not like to use C or C++ for that.

Things like SQL or HTML, these tend to not quite come under the same rules as typical languages. They aren't really "programming" languages as such.
The Crystal language can easily create HTML & SQL queries in a few lines.
 
The Crystal language can easily create HTML & SQL queries in a few lines.
Code generators have always been popular for that kind of stuff.

For HTML when I begrudgingly have to make simple web pages, I tend to use M4. Not quite as encompassing as Crystal, but part of base (and SUS I believe).
 
Common general purpose CPUs actually do know of these. That's how they work, by manipulating numbers; that's the only thing they can do. Although manipulating floating point numbers was once done on an entirely separate co-processor, it has since been integrated into the CPU itself. And you may find certain architectures take a more modular approach (RISC-V for example), but the one thing they are all capable of is manipulating discrete numbers.
Are they *integers*? That's a concept that we have mapped onto the contents of certain registers. E.g., if I am manipulating an ADDRESS register, then I'm likely manipulating an ADDRESS. In my gesture recognizer, all of the quantities manipulated are fixed point (Q10.6) "values". Because that's how I interpret the results of the calculations that I perform on them (fitting cubic beziers). Not floating point, not integers.

Has the CPU suddenly changed because my code is now addressing these fixed point values? The charges stored are still there but the abstractions they represent have been changed.

Everything in the machine is just collections of charges. How we interpret them is a matter of abstraction. Make *juice* from fruit? What's a "fruit" to a computer? You can define a "fruit" type but not an Apple type?

Make applesauce from 23 apples? When I see a function defined this way I'm going to assume the argument is the amount of apples. Why would you provide a number representing an 'apple' as an argument to a function if the only thing it can do is create applesauce from apples?
The programmer didn't provide a "number". The (C) compiler interpreted it as a "number" because the C compiler has no way of enforcing the distinction between ints and Apples. To it, they are the same! If it were more strongly typed, then it would have thrown an error as an int would not have been an acceptable argument to the function -- which was the point of my post (neither would an Orange have been permitted).
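For contrast, a sketch of what "more strongly typed" can look like even today in C++ (the types are invented for the discussion): make Apple and Orange distinct wrapper types instead of typedefs, and the mismatches stop compiling.

#include <cstdio>

// Distinct types, even though both are "just an int" underneath.
struct Apple  { int id; };
struct Orange { int id; };

void MakeApplesauce(Apple a) { std::printf("applesauce from apple %d\n", a.id); }

int main()
{
    Apple  Macoun{1};
    Orange Valencia{2};

    MakeApplesauce(Macoun);       // fine
    // MakeApplesauce(Valencia);  // error: no conversion from Orange to Apple
    // MakeApplesauce(23);        // error: no conversion from int to Apple
}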
 
In terms of an interface for communication between the DBMS, as you were referring to, then SQL is more like a protocol. Yes, I absolutely would not like to use C or C++ for that.
Of course SQL is a language. It has a grammar. It has an abstract machine on which it operates. It is Turing complete. The same is true of PostScript (though folks rarely WRITE PS code).

The point was that these languages address specific application domains -- effectively. Eliminating them in favor of some "more common" language would be a disservice to the folks tasked with these problems.
 
Of course SQL is a language. It has a grammar. It has an abstract machine on which it operates. It is Turing complete. The same is true of PostScript
Ultimately there isn't too much difference between it and what we send, for instance over the IRC protocol, when requesting data. This uses grammar and could be over-engineered enough to utilise an abstract machine.

The topic title mentioned "programming languages" which, general purpose or not, does tend to exclude SQL or HTML.

The point was that these languages address specific application domains -- effectively. Eliminating them in favor of some "more common" language would be a disservice to the folks tasked with these problems.
I kind of see what you are saying. But they are really just configuration files, in much the same way that .ini or .json are. That is a completely different application domain to programming languages. Thus they are not really a good argument against "just write everything in C".

PostScript is actually quite substantial as a programming language. It is way more than just static markup, for example. I don't know what it classes as these days. It's all a grey area.
 
The topic title mentioned "programming languages" which, general purpose or not, does tend to exclude SQL or HTML.
But you can "write code" in SQL. You can manipulate the state of the database and the data within it in much the same way that you manipulate bytes in memory (using a conventional language).

Ladder logic is a programming language -- albeit one that is typically presented more graphically. Again, more suitable to the problems in *its* domain.

I kind of see what you are saying. But in the same way, they are really just configuration files in the same way that .ini or .json are. Programming everything in C doesn't exclude them in the same way we don't program image files in C.
You underestimate them.

I track all of the "names" (objects) in my "disk collection" (think: offline disk array) using a schema that creates tuples of container identifier (i.e. a reference to the tuple defining the parent folder), object name (file, folder, etc. as a string) and a unique identifier for this tuple. So, I can reconstruct a path to a particular object (file) by chasing the "container identifiers" back up the file hierarchy:
(ID,name,containerID)
1,D,Cid -- i.e., (1,D,48)
...
8,A,VOLid -- i.e., (8,A,Volume)
...
48,C,Bid -- i.e., (48,C,190)
...
190,B,Aid -- i.e., (190,B,8)
So, object "1" would be "Volume/A/B/C/D"

To do this, I have to "run code" that patches together each of these tuples -- an arbitrary number of which may exist -- to create a human readable path name. This code is written in SQL -- no "regular programming language" necessary.

[You can see how you would do this with C structs (Delphi records, etc.). And SQL can do similarly.]
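Since the bracketed note above mentions the C-structs version, here is roughly what that walk looks like (a toy sketch with the thread's example data hard-coded; a real catalog would of course come out of the database):

#include <cstdio>
#include <map>
#include <string>

// One tuple per object: its name and the ID of its container (0 stands in for "Volume").
struct Entry { std::string name; int containerID; };

int main()
{
    // The example tuples from above: object 1 ("D") lives in 48 ("C"), and so on.
    std::map<int, Entry> catalog = {
        {1,   {"D", 48}},
        {8,   {"A", 0}},
        {48,  {"C", 190}},
        {190, {"B", 8}},
    };

    // Reconstruct the path of object 1 by chasing containerIDs back to the root.
    std::string path;
    for (int id = 1; id != 0; id = catalog[id].containerID)
        path = "/" + catalog[id].name + path;

    std::printf("Volume%s\n", path.c_str());   // prints Volume/A/B/C/D
}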

The same is true of, e.g., PostScript. It may not be a convenient way to write a payroll program, but that doesn't mean you can't! (Turing complete)
 