Couldn't agree more
Java is the extreme case of this. Patterns like abstract visitor factories are hacks to express situations that cannot be expressed in an obvious way.
Inheritance is just one of several facets of safe code reuse in OOP. Aggregation, composition, and encapsulation are just as fundamental to OOP as inheritance. So I think reducing OOP in general, and Java in particular, to "inheritance-based OOP" is a mischaracterization.
> are that it fits very few problems well, that it usually causes lots of problems and that many programming languages only have inheritance based OOP in their toolbox.
Do you have any objective way to measure that?
> Patterns like abstract visitor factories are hacks to express situations that cannot be expressed in an obvious way.
But isn't that the reason to have a pattern? An easy way to express a non-obvious recurring situation?
What's "inheritance based OOP"? The kind of OOP that models everything using subclassing? Partly due to structural static typing so compatibility/polymorphism can only be achieved by having a common ancestor class?
Sure, but that's missing the point of OOP almost completely.
Anyway, subclassing is an extremely useful and by now somewhat underrated tool: it allows for unanticipated extension and programming-by-difference. Meaning you already have something that's close to but not quite what you need.
And they were both wrong, as millions of programmers use Java and C# and C++ just fine, and have created much more impressive software than what Dart or Golang programmers have. Plus, people used to the power of C++ would never switch to Golang (which is also what the Golang team observed: they mostly got people from Python/Ruby and that kind of services).
>Haskell and OCaml are a joy to program in, in part because they (mostly) eschew subtyping.
Sorry, but did you just say that "subtyping and parametric polymorphism (generics in Java) were deemed too hard for Google programmers to understand" (an argument based on complexity), and then go on to argue in favor of Haskell, which is notoriously difficult to grasp and has so many foreign concepts that it makes generics look like BASIC-level material?
"Kotlin is very reference semantics, itʼs a thin layer on top of Java, and so it perpetuates through a lot of the Javaisms in its model.
If we had done an analog to that for Objective-C it would be like, everything is an NSObject and itʼs objc_msgSend everywhere, just with parentheses instead of square brackets. .."
I think Swift has a real chance to reach Java-level popularity. It is already at #11 in the RedMonk ranking, and every language above it is at least 15 years older than Swift. And once its server-side features like concurrency land, it can become much more general-purpose.
http://rosslebeau.com/2016/swift-copy-write-psa-mutating-dic...
I thought Objective-C had already solved this problem quite nicely with the explicit NSFoo/NSMutableFoo class pairs. I don't see why this needed to be fixed again, in a less explicit way.
Not always. The C++ standard has allowed copy elision for some time [1]. Guaranteed copy elision for certain forms of copy has been proposed for C++17 [2].
[1] http://en.cppreference.com/w/cpp/language/copy_elision
[2] http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/p013...
I am still waiting for first class support on Windows on the download page.
Right now Rust has much better OS support than Swift.
No, you seem to confuse "replace Java" with "TOTALLY AND ABSOLUTELY replace Java everywhere".
Swift just needs to be usable on the platforms that matter -- and since it's by default on OS X, that leaves Windows and Linux.
Nobody cares if it runs on some mainframe architecture where 0.001% of Java use happens, or some other obscure environment.
One of the main points of Kotlin is that it integrates tightly with IntelliJ. So Kotlin is a layer (not so thin) between a visual IDE (IntelliJ or Android Studio) and Java-decompilable bytecodes on the JVM.
You don't get that with other JVM languages, e.g. Apache Groovy only gives correct type hints in Eclipse 80% of the time, and JAD stopped working on Groovy-generated bytecodes in Groovy 1.7.
Its very wide use of lambdas, with unique syntactic support for them, makes it feel very different from Java.
We have a schizophrenic attitude towards C++.
On one hand, we love the language, the expressive power it gives us, and the type safety taken from Simula and Algol, thanks to C++'s type system.
On the other hand like Chris puts it "has its own class of problems because itʼs built on the unsafety of C".
So some of us tend to work around it by using safer languages and only dropping down to C++ for those tasks where the better languages cannot properly fulfil them.
But as the CVE database proves, that only works if everyone on the team cares about safety; otherwise it is a lost game, only fixable by preventing everyone on the team from writing C-style unsafe code in the first place.
Sure, nowadays there are plenty of analysers to ensure code safety, but they work mostly on source code and, like any tool, depend on people actually caring to use them.
[1] shameless plug: https://github.com/duneroadrunner/SaferCPlusPlus-AutoTransla...
Writing code is fun and interesting. But most software development is not writing code. It's a little bit of build management, even more testing, but mostly it's debugging. Debugging is not as fun as writing code. Every language feature that makes debugging more necessary, harder to do and more time intensive sucks. Dangling pointers are the absolute worst.
I can easily give up multiple inheritance for a more functional language that's far easier to write correct code in.
Maybe that's the problem: if you see C++ as something "built on C", then it's logical that you see a lot of the same problems. C++ evolved from C specifically to address a lot of the weaknesses in C.
> Every language feature that makes debugging more necessary, harder to do and more time intensive sucks. Dangling pointers are the absolute worst.
Language design is an exercise in compromise, and there is space for multiple compromise points on the spectrum. C++ decided (for better or for worse) to favor performance over a nice debugging experience.
> I can easily give up multiple inheritance for a more functional language that's far easier to write correct code in.
Am I the only one getting tired of this kind of blanket statement?
Well, he has written not just LLVM in C++, but also a C++ compiler in it (Clang), besides having written Swift in C++.
So that's as far as knowing C++ one can go I'd say.
Emphasis mine. Not that I disagree completely...
In a way, many iOS and macOS applications are front-end software. It makes much more sense to make Swift available for other kinds of front-end development than for server-side coding.
Systems programming seems well catered for by Java, Go and Rust, while high-level application programming is left at the mercy of JavaScript (I like TypeScript, but it's mostly improvements borrowed from C# that are bolted on). I think there would be a lot to gain there, first and foremost by compiling Swift to WebAssembly.
I hope this never happens. Swift is great, it's universal, and it saves you a lot of time while coding, BUT it also has a very large syntax and a high number of features: the documentation is huge! Most Swift programmers probably don't know the complete syntax and all the features, which is a problem in a world where we code in teams and work with open source (both cases mean that you work with code you didn't write).
We just need a new, simple way for billions of people to explain to computers what to do, and conversely to understand what a computer was told to do, and I'm sure that it's not Swift, Java or C++.
It's interesting to note that Google has taken the deliberately opposite approach with Go: small syntax, learn 90% of the language's ins-and-outs in a few weeks, so that the average fresh college grad Googler (average tenure being less than 2 years, IIRC) spends as little time as possible ramping up and has relatively predictable output.
I love Objective-C, but I don't want to inherit its baggage (via Swift) when I write backend code.
What realistic alternative playing in the same space doesn't have this problem?
I'm storing popcorn for the day all the people who've been burned working for Musk finally come together and speak out about his insanity as a manager.
I suspect the thing holding them back is that Musk's goals are laudable and everyone still wants them to succeed.
But be glad you're a (potential?) customer of Musk's, not an employee.
"When I joined Tesla, it was in the midst of a hardware transition from "Hardware 1" Autopilot (based primarily on MobileEye for vision processing) to "Hardware 2", which uses an in-house designed TeslaVision stack. The team was facing many tough challenges given the nature of the transition. My primary contributions over these fast five months were:
- We evolved Autopilot for HW2 from its first early release (which had few capabilities and was limited to 45mph on highways) to effectively parity with HW1, and surpassing it in some ways (e.g. silky smooth control). This required building and shipping numerous features for HW2, including: support for local roads, Parallel Autopark, High Speed Autosteer, Summon, Lane Departure Warning, Automatic Lane Change, Low Speed AEB, Full Speed Autosteer, Pedal Misapplication Mitigation, Auto High Beams, Side Collision Avoidance, Full Speed AEB, Perpendicular Autopark, and 'silky smooth' performance. This was done by shipping a total of 7 major feature releases, as well as numerous minor releases to support factory, service, and other narrow markets.
- One of Tesla's huge advantages in the autonomous driving space is that it has tens of thousands of cars already on the road. We built infrastructure to take advantage of this, allowing the collection of image and video data from this fleet, as well as building big data infrastructure in the cloud to process and use it.
- I defined and drove the feature roadmap, drove the technical architecture for future features, and managed the implementation for the next exciting features to come.
- I advocated for and drove a major rewrite of the deep net architecture in the vision stack, leading to significantly better precision, recall, and inference performance.
- I ended up growing the Autopilot Software team by over 50%. I personally interviewed most of the accepted candidates.
- I improved internal infrastructure and processes that I cannot go into detail about.
- I was closely involved with others in the broader Autopilot program, including future hardware support, legal, homologation, regulatory, marketing, etc.

Overall I learned a lot, worked hard, met a lot of great people, and had a lot of fun. I'm still a firm believer in Tesla, its mission, and the exceptional Autopilot team: I wish them well."
"letʼs start hacking, letʼs start building something, letʼs see where it goes pulling on the string" feels scarily accurate, and it's unclear where the language will be in 5 years.
Among other things, there's no way to disable Objective-C interop, even though it complicates the language and feels like someone merged Smalltalk, C++, and ML—not a pretty combination. But literally the only reason you'd enable it is to work with Cocoa/UIKit.
The jury's still out for me on ARC—it was much less of a problem than I expected on my last project, but it never feels like an optimal solution, and you can never just "forget about it for the first draft" the way you can with a VM's GC.
So, apparently you didn't even read the article, as it is explicitly stated that this was not the intention or direction of Swift.
> Among other things, there's no way to disable objective-c interop, even though it complicates the language and feels like someone merged smalltalk, C++, and ML—not a pretty combination. But—literally the only reason you'd enable that would be to work with Cocoa/UIKit.
Swift on Linux does not use any of the ObjC runtime features that are used on Apple platforms.
It might actually help that there is now a real commitment in that direction. The issue was that it was IBM that mostly pushed for changes in Foundation, and without their initial BlueSocket support, even the most basic tasks did not succeed.
Let alone the non-existent Windows support. It may not have been Chris's intention, but one now-ex-employee's intention does not mean a lot when the company determines the direction after his departure.
Believe it or not, this compiler option is named `-disable-objc-interop`.
> Could someone explain why I should build a language developed entirely by and for writing Apple ecosystem products?
Possibly because you have an affinity for value types, performance, or safety. A language is a lot more than just a checkbox of platforms it supports, although iOS is a pretty large checkbox right now.
> the long list of benefits suddenly looks much, much smaller compared to e.g. JVM, .NET, Go, etc etc.
Swift isn't trying to compete with any of those. I mean, sure, in the "world domination 10-year plan" sense, but for the foreseeable future the bullets that make Java attractive to enterprises (lots of developers, lots of libraries, lots of platforms) are not on anyone's todo list in the Swift community.
Rather, the short-term goal is to compete with C/C++/Rust. So: you are writing a web server (e.g. an nginx alternative, not a webapp) or a TLS stack or an h264 decoder and buffer overflows on the internet sound scary; you are doing pro audio plugins where a 10ms playback buffer is the difference between "works" and "the audio pops"; you need to write an array out to the network in a single pass to place your buy order before the trader across the street from you, but still have a reasonably productive way to iterate on your trading algorithm because Trump was elected; etc.
As far as JVM/.NET, a cardinal rule of software is that it bloats over time. So JVM/.NET/Go can never "scale down" to the kinds of things C/C++ developers do, but it is less known whether a low-level language can "bloat up" to do what .NET developers do. In fact, C++ kinda does "bloat up", because C++ on .NET exists. But that is basically an accident, because C++ was not designed in the 80s with .NET developers in mind, and perhaps for that reason it is not the most popular .NET language. To the extent we have a plan, the plan with Swift is to try that "on purpose this time" and see if it works better when we're designing it to do that rather than grabbing a round peg off the shelf and hammering it into our square hole. It probably won't ever be as good at .NET problems as .NET, but perhaps it can get close, for useful values of close.
> you can never just "forget about it for the first draft" the way you can a VM's GC.
Similarly, ARC does not exist to compete with your VM on ease-of-use, it competes with malloc/free on ease of use (and your VM on performance). If your VM is performant enough (or you can afford the hardware to make it so), great, but that just isn't the case for many programming domains, and that's the issue we're addressing.
There is also a quasi-non-performance aspect to ARC that is often overlooked: deterministic deallocation. Most VM memory models are unbounded in that resources never have to be deallocated, but in a system like ARC we have fairly tight guarantees on when deallocation will take place. So if your objects have handles to finite resources in some way (think like open file handles, sockets, something to clean up when they blow away) the natural Swift solution will be much more conservative with the resource use relative to the natural JVM solution. Because of that it may be more useful to think of ARC as a general-purpose resource minimization scheme (where memory is merely one kind of resource) rather than as a memory model or GC alternative itself.
Assuming there are no pauses due to deletion of deeply nested data structures, or worse, stack overflows.
Herb Sutter has a very interesting presentation at CppCon 2016 about these issues, where he then presents a kind of C++ library based GC to work around them.
Also, ARC has a performance impact, because the increments/decrements need to be atomic to stay correct in threaded code.