I like to think that all three languages benefited from the competition.
Bottom line: it demonstrated an answer to a problem that a lot of people were having, and it promised to deliver that answer in an 'open source' kind of way.
In Self we have two ways of specialization:
1. Prototypes
2. Parents
Prototypes are objects that are "cloned" to create instances, a very simple and direct notion of inheritance. We create an object (a "prototype") with properties that apply to a larger set of objects, and for each instance we copy (clone) it, adding or modifying properties as necessary. Self had optimizations to deal with this, so instances did not end up being very fat.

Parents are objects that other objects "inherit from". At run-time, property lookups are delegated to the parent. The difference between a parent and a prototype is that changes to a prototype do NOT affect the derived instances, while changes to a parent DO affect the derived instances.
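In JavaScript terms (a sketch, not Self syntax), the two mechanisms look like this:

```javascript
// Parent-style: the child delegates lookups to the parent at run-time,
// so later changes to the parent are visible through the child.
const parent = { greet: () => "hello" };
const child = Object.create(parent);      // child delegates to parent
parent.greet = () => "hi";
console.log(child.greet());               // "hi" -- the parent's change shows through

// Prototype-style (cloning): the new object gets its own copy of the slots,
// so later changes to the original do NOT affect the clone.
const proto = { greet: () => "hello" };
const clone = Object.assign({}, proto);   // shallow copy of proto's slots
proto.greet = () => "hi";
console.log(clone.greet());               // "hello" -- the clone is independent
```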
So, when I read about "class-based" versus "prototype-based" languages, I cringe. It is really "class-based" versus "parent-based". How did cloning get confused with run-time delegation?
Self introduced the notion of self-describing instances. That is the essential coolness. The simplifying notion.
http://www.selflanguage.org/_static/published/self-power.pdf
We say "prototype" to indicate an important feature of Self which its children (Python, JavaScript, etc.) inherited: Unlike classes, prototypes are open for modification. The precise mechanisms differ between the different languages in Self's family, but they all have that feature in common.
Compare and contrast with other Smalltalk children, like Java or E, where this isn't possible because classes are closed. (E doesn't have classes, but it has a similar property, in that object literal scripts are closed for modification.)
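In JavaScript, for example, that openness looks like this (a minimal sketch):

```javascript
// A constructor whose prototype is an ordinary, mutable object.
function Point(x, y) { this.x = x; this.y = y; }

const p = new Point(3, 4);

// The prototype stays open: a method added later is immediately
// visible to instances that already exist, via run-time delegation.
Point.prototype.norm = function () { return Math.hypot(this.x, this.y); };

console.log(p.norm());   // 5
```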
In the original Lieberman paper[1] you created new objects that were empty and inherited from the prototype. Then you added local slots (name/value pairs) to the object for anything that was different from the prototype. Languages based on this concept often have two kinds of assignments: one that adds a new local slot and another that just replaces the value of the old slot (either local or inherited).
In Self, on the other hand, a prototype is a template for the object you want, and you clone it to get a new object. Once it has been cloned, there is no longer any relation between the prototype and the new object, and they evolve separately. These kinds of languages tend to have only one way to assign values to slots.
Many languages that claimed to follow the Self model (like NewtonScript and Io) actually use the Lieberman model instead. In either case the prototypes are fully functioning examples of the object you want to create new instances of, so, unfortunately, it is natural that one word is used for both. But this leads to very confusing discussions when one person is talking about a language built on one model while the other person is thinking of the other model.
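JavaScript's own semantics are essentially the Lieberman model, which can be sketched like this (`assignThrough` is a hypothetical helper, not a built-in, standing in for the second kind of assignment):

```javascript
// Lieberman-style: start from an empty object that delegates to the
// prototype, then add local slots only for what differs.
const point = { x: 0, y: 0, describe() { return `(${this.x}, ${this.y})`; } };
const p = Object.create(point);   // empty object, delegates everything to point
p.x = 3;                          // plain assignment adds a LOCAL slot shadowing point.x
console.log(p.describe());        // "(3, 0)" -- y is still delegated
console.log(Object.keys(p));      // ["x"]    -- only the local slot

// The other kind of assignment (replace the slot wherever it lives, even in
// a parent) has no built-in JS operator; a hypothetical version might be:
function assignThrough(obj, name, value) {
  let o = obj;
  while (o && !Object.hasOwn(o, name)) o = Object.getPrototypeOf(o);
  (o ?? obj)[name] = value;       // write into the owner, or locally if none found
}
assignThrough(p, "y", 5);         // writes into point.y, not into p
console.log(point.y);             // 5
```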
[1] http://web.media.mit.edu/~lieber/Lieberary/OOP/Delegation/De...
Edit: Oh I see. Cloning is the alternative to instantiation. Parent slots are the alternative to classical inheritance.
But it didn't have to be that way. And now the sentiment is that OOP is bad, and inheritance is evil, and classes are the worst, forcing one to predefine a taxonomy that's likely to need refactoring.
But prototypal languages can be easily changed. Just change the parent slot(s), or modify the object itself, etc.
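In JavaScript, for instance, reparenting an object is a one-liner (a sketch; Self uses assignable parent slots instead):

```javascript
// Two possible "parents" with different behaviour.
const walker = { move() { return "walking"; } };
const flyer  = { move() { return "flying"; } };

const creature = Object.create(walker);
console.log(creature.move());           // "walking"

// Reparent at run-time: no class hierarchy to refactor.
Object.setPrototypeOf(creature, flyer);
console.log(creature.move());           // "flying"

// Or just modify the object itself.
creature.move = () => "swimming";
console.log(creature.move());           // "swimming"
```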
Are there any codebases around 100K to 1M lines of code written in a prototypal style, which are actually in production use?
You can claim thousands of such systems for classes. In that sense, classes are a success. That people write horrible class-based code isn't a knock against them. People also write horrible procedural code. Most code is bad. But there is some code with classes that is very good.
There was sentiment in the 90's and early 2000's that OOP is bad. I think the world has learned how to use classes since then -- e.g. no more large inheritance chains and fragile base classes. Not everything is an object -- some things are just functions, and some things are just data with no behavior.
As far as I can see, prototypes are worse along all dimensions than classes.
Just as his misunderstanding of the meaning of equality [1] [2] and truthiness [3] manifested itself as quirks in JavaScript's design.
[1] https://developer.mozilla.org/en-US/docs/Web/JavaScript/Equa...
[2] https://www.theguardian.com/commentisfree/2014/apr/07/brenda...
Reminds me of the container-based virtualization renaissance that's happening right now.
Hmm, and I've always thought Chrome's extra-thick titlebar was liberated from the Xerox Star UI (http://imgur.com/6h13MlP).
That's really doubtful. Self's delegation-based inheritance was expedient for quickly implementing an object model instead of having to build a class system; the influence never seems to have gone any deeper.
The one big influence on JavaScript was Scheme; that was the original idea. When Netscape's execs asked for a more Java-style language for marketing reasons, special-casing a single parent slot was an easy way to bolt an object system onto the thing.
From what little I've looked at it, delegation (through parent slots) is everywhere in Self: it's used for inheritance, but also for mixins, scope chaining, and more. It's not just an object model; it's a core semantic principle and tool.
I’m not proud, but I’m happy that I chose Scheme-ish first-class functions and Self-ish (albeit singular) prototypes as the main ingredients. The Java influences, especially y2k Date bugs but also the primitive vs. object distinction (e.g., string vs. String), were unfortunate.
For me, though, it's not about the Self language but about the combination of the language and environment. I did a screencast attempting to show some of how Self development is done: https://bluishcoder.co.nz/2015/11/18/demo-of-programming-in-...