an OS API is an OS API and should be relatively language (and runtime
system) independent. However, current trends are otherwise (with .NET,
Obj-C, and Java (Sun)). Avoiding Python on Linux is nearly impossible
(though I try).
Until not that long ago OS APIs were assembly based: you had to stick
parameters into certain CPU registers and initiate a software
interrupt. At some point higher level language APIs were added while
the assembly APIs were still kept around. Eventually, the assembly
APIs were dropped. Somebody probably complained about that in the
same manner you are now complaining about the dropping of higher
level assembly APIs (C is, after all, an assembly notation for the
PDP-11, augmented with some higher level language control flow
constructs). Somebody else might just call this progress.
Let's be honest, OS APIs have never been language independent; they
were always centered around whatever language flavour was chosen by
the OS implementors as a base. Initially it was all assembly based,
then IBM mainframes started to have IBM language based APIs. DEC had
an API which was in a way language independent, but all languages on
the VAX had to get language extensions to support all the calling
conventions. In fact this was not unlike what I am doing with
Objective Modula-2, just that the calling conventions on VMS aren't
Smalltalkish.
The only reason why the C calling conventions (and very similar ones
like Pascal's) look "normal" is because in recent decades most
vendors have copied the Unix model and used C or Pascal (or
derivatives thereof) as implementation languages. In other words, the
way APIs have come to look is more or less accidental and clearly not
language independent.
To claim otherwise would be like saying that communication in English
is language independent because English is very widespread.
I think the changes in OS APIs are part of a regular cycle by which
new concepts are introduced, some concept becomes the de facto
standard as everybody adopts it, only to be challenged and replaced
by another new concept when the cycle repeats itself. It's called
progress ;-)
Interesting. Till now I had the feeling it was more about syntax/semantics
than hybrid properties. Btw, I'm in machine vision, so I like a bit of speed
too :)
To a language purist (isn't there a bit of a language purist in all
of us, perhaps?), the idea of a "hybrid" language probably seems
appalling at first. But think about it: we have had inline assembly
capabilities within C and Pascal compilers for ages. Also, interfaces
to SQL are often realised as a kind of inline SQL statement sequence.
The Smalltalk derived message syntax in Objective-C can be viewed as
inline Smalltalk. This sort of thing is fairly common and rather
practical. When you look at it this way, the notion of a hybrid
language may no longer seem so outlandish.
When you work with Objective-C over time you realise that separating
control flow from data structures, handling one of them in a
procedural imperative language and the other in a Smalltalk-like
language, is actually quite practical and useful. Imperative
languages are, simply put, better at control flow and certainly
faster at it. On the other hand, higher level languages like
Smalltalk generally make it easier to handle complex dynamic data
structures. Serving these two sub-domains by mixing the respective
feature sets in a single language can really make a difference.
Before I started with Objective-C I often had this dilemma that I
would rather use an imperative language for a project because of one
constraint, but rather use a higher level language because of another.
With a hybrid like Objective-C I no longer have to make this choice, I
get both in one.
It may seem odd from a language purist's point of view, but it is
really practical when you use it, so over time you simply stop caring
about the oddity of it all. Brad Cox, the creator of Objective-C,
once told me that when they designed the language, that was precisely
their aim: they didn't think of it as a new language at all, they
simply wanted some contraption that did something they wanted to do.
The outcome was good enough to catch on and become a language in its
own right, but it wasn't actually intended to be a language.
But then I don't get the compilation to obj-c part. That is a long way ahead
then. Good luck :-)
Well, maybe I should clarify this a bit. Anything in the Objective
Modula-2 source code which is conventional Modula-2 will come out as
vanilla C, but anything that uses the language extensions will come
out as its respective Objective-C equivalent.
For example

  DEFINITION MODULE MyClass : SuperClass;
  IMPORT SuperClass;
  ...
  END MyClass.

will come out as

  #import "SuperClass.h"
  @interface MyClass : SuperClass {
    ...
  }
  ...
  @end
But

  WHILE x > 0 DO
  ...
  END;

will come out as

  while (x > 0) {
    ...
  }
This is a little easier to do than generating assembly level code or
even virtual machine code, because you don't have to keep track of
registers, you don't have any pipelining and scheduling issues, and
you don't have to manage the stack frames. Most of the Modula-2
subset features map more or less 1:1 into C, and all of the
Objective Modula-2 extensions map 1:1 to Objective-C extensions. The
only things which require some special attention are M2 name
mangling and local procedures.
In my experimental compiler (used for testing syntax during the
design phase) I used a text template engine, which I had originally
written for generating configuration files, to generate the output.
I wrote a bunch of C and ObjC templates, one for each production in
the target C grammar, where each non-terminal is a placeholder, then
I used the template engine to replace the placeholders recursively
until all constructs are C. This has two benefits: it's less work
than writing a traditional code generator, and it is useful as a
verification tool to check if the output is equivalent to the input,
because the mappings between M2 and C are fairly straightforward
(other than for local procedures). This will later come in very
handy when writing the LLVM generator and trying to track down bugs
(for example: do we have a silly bug like forgetting to pop
something from the stack, or do we have a conceptual error in the
translation scheme?).
I'm FPC's FreeBSD maintainer,
cool, do you have any coroutine support (or library) for FPC that
works on FBSD?
more go with the flow (not bad enough to resist) than love.
I understand the feeling.
I don't really like any of the currently popular general purpose
scripting languages, since to my mind they don't add much. I usually
jump right from Object Pascal/C++ to domain specific languages (that
really incorporate modelling from the problem domain).
It seems to me that many if not most of those scripting languages are
meant for web development. I don't do web development, so I was never
surprised that those languages didn't seem to suit my needs much.
I never did Ada, but saw a lot of it over the years by looking sideways from
Pascal. Seems to suffer from the C++ problem of extreme language size.
I agree. One of the main reasons that made Modula-2 attractive was its
simplicity, but ISO Modula-2 is almost as bad as Ada and C++, not
quite, but definitely no longer simple enough to be appealing to me.
Oberon on the other hand seems just a tad too simplistic. At the very
least I would want to keep unsigned integers, enumerations and
separation of definition and implementation modules, if only for
reasons of readability and documentation.
And while I support nearly all choices they made individually, I disagree
with the sum of them
Haha, well said, that's the trouble with language design, right there.
There is always a good reason to add one more bit, but you can never
seem to figure out where you should have stopped adding more stuff.
That could be. I'm more the engineering type and less the computer
scientist type.
With respect to Objective-C (and by extension Objective Modula-2) it
looks like the exact opposite: as I said, the Objective-C way of
hybridisation is purely practical and rather unscientific.
The hybrid Modula-2 - Smalltalk is what I find surprising. Modula-2
is as strongly typed and static as it gets. Smalltalk, euh, well,
isn't.
That's precisely why a hybrid makes sense. It wouldn't be of much
added utility if you combine two toolkits which have identical tools
in them. It will however increase utility if you combine two toolkits
which have different tools in them.
As for static and dynamic typing, Objective-C has both. If you use
the type "id" (the equivalent in Objective Modula-2 is called
OBJECT), then it is dynamically typed, but if you use the actual
type identifier of a class, then it is statically typed.
I use dynamic typing in Objective-C only where one would use generics
in Pascal, M2 or C++. If you have the choice of dynamic typing when
you want it, then you don't need generics, but you still get static
typing for all other usage scenarios.
Like I said before, the hybrid is not about just mixing features,
stirring them and ending up with a melange. Each feature set has its
well defined problem domain: Modula-2 for all the control flow and
simple scalar operations, like, say, calculating the index for a
bucket in a hash table; the Smalltalk side for creating and managing
non-scalar data structures. With this separation it makes perfect
sense that the Modula-2 part should always be strictly typed and
that the Smalltalk part gives you the option to choose either static
or dynamic typing.
In fact this is not limited to static versus dynamic typing.
Objective-C (and by derivation Objective Modula-2 as well) gives you
choices on a case by case basis where in most other languages the
choice has been made for you by the language designer.
In fact, Objective-C gives you the choice of which garbage
collection scheme you want to use on a per object basis. You simply
send a message to the object and, depending on the message, it is
either reference counted or automatically collected. In all other
languages I know of, either everything is garbage collected or
nothing is. Again, I found the ability to choose on a per object
basis to be very helpful in practice.
This level of choice on a per item basis in Objective-C is something I
do not want to miss now and it is at least in part a result of the
language being a hybrid of two unlikely marriage partners. So, yes, it
does make perfect sense to marry Modula-2 and Smalltalk, maybe not
from a language purist's point of view but definitely from a
practicality and usability point of view.
But back to the point: while I am generally fond of M2 and Pascal
(including OO variants), IMHO the importance of language, while not
negligible, is overrated. I use Pascal, not Modula-2, because I
liked the compiler best (though admittedly, conceptually they are
nearly the same). I used TopSpeed Modula-2 before, which is why I
still monitor this group.
I can agree to that only for situations where the languages are at
least broadly similar, that is when they use comparable paradigms.
When you cross paradigm boundaries, then the different paradigm of the
other language may make a very big difference. But, in principle,
yes, it's not the language as such, it is the paradigm or set of
paradigms that the language is based on. In the case of Pascal,
Modula-2, Oberon, and Ada you have a very similar set of paradigms;
even from there to C, C++ or Java, the paradigms are relatively
similar. But there are languages with fundamentally different
paradigms, Smalltalk being one of them, and if you know how to make
use of that, then it will make a significant difference.
But that is not necessarily to say that one paradigm is superior to
another. Sometimes it also has to do with how a given paradigm matches
your way of thinking and working. Sometimes a new paradigm can lead
you to change the way you think and work and the outcome may be
positive, but sometimes a paradigm just isn't compatible with the way
you like to do your stuff.
The most programmed language in the world is Excel macro language/VBScript.
Still I don't have an immediate urge to venture into that realm.
What I meant to say was that your perception of what Objective-C is
may have to change because it seemed to be based entirely on what
the majority of people who use it use it for. Consequently, if the
majority of people who use it use it for something different, then
your perception would have to shift. Ergo, Objective-C might soon
come to be perceived as "the iPhone development language" because it
may turn out that most folks using it will use it for that. You may
then say that it doesn't make sense to base one's perception of a
language on what others are using it for, but that was actually the
point I was trying to make ;-)
That is a different thing. I called you on simple facts about
GNUstep's usability in the wider world. And facts remain facts.
Objective-C only plays a small role outside the Mac world. It's like
saying that Pascal is great because you can use UCSD on the Apple
II.
This is precisely where you are mistaken: you have an idea based on
what you think the majority of folks who use it are doing, what
those people are like and how they fit into some box (e.g. Apple
fanboy), and you dismiss anything that doesn't fit that perception.
GNUstep was created when SUN Microsystems and NeXT worked together
on OpenStep. Back then Apple was a pure Pascal shop and the iPhone
was not even a remote possibility. Objective-C and OpenStep were all
the rage at SUN before they decided they wanted to exercise more
control and created Java. OpenStep was the number one development
platform in the financial industry, too. WebObjects was very
widespread. Heck, DELL's ordering system ran entirely on
Objective-C. Michael Dell dropped it with sorrow, not for technical
reasons but for political reasons after Apple acquired NeXT.
Yes, the acquisition and adoption of NextStep by Apple may have been
both a blessing and a curse. On the one hand it has added many more
users; on the other hand it has also pushed into the background the
other areas in which the environment was previously very strong and
widespread. But that doesn't mean that anybody who picks up
Objective-C or its object system or the class libraries has to
follow this trend. It is perfectly reasonable to use this
environment for non-Mac, non-GUI stuff and for its own merits. And
it is equally reasonable for somebody to write a clone for Windows
(e.g. Cocotron) as it is for somebody to write a .NET clone for *nix
systems (e.g. Mono).
I'm a bit scared. I do like LLVM, but I don't like the possible
ramifications of Apple adopting it. I'm a bit afraid that Apple will use it
to exert even more control over 3rd party developers. A bit like Java phones
that don't allow apps that are not signed, under the guise of "security".
Luckily there is a significant difference: the Java license doesn't
allow you to fork, while the LLVM project's BSD-style license does.
So, if Apple gets too bossy about the whole thing and upsets
developers in the process, there'll be a fork.