I've been a professional designer since 2006, and I got over that thinking pretty quickly. A designer trying to be strikingly original is rarely acting in service of the design. If you want to be strikingly original, you probably want to be an artist instead of a designer. What a designer fundamentally does is communicate the best solution to a problem, given the requirements, goals, and constraints of that problem. Originality is subordinate to that at best.
I was a UI/UX guy for about 5 years and worked for a company that pumped out thousands of sites a year. A bunch of their designs won awards, and I saw their model and thought I could do that; it seemed easy.
The hitch was that I was going to design really cool sites, with all kinds of animations, huge text, have really cool navigation menus, etc. In short, I had a very romantic idea that I would dictate some incredible design to my clients. I thought I was like the Frank Lloyd Wright of design and whatever I showed people they would swoon and then go with whatever uber cool thing I showed them.
Reality set in with my first client. Same thing, they didn't want cool shit, they just wanted their potential clients to find information about their work and contact them to hire them. After another 4-5 clients, I suddenly realized that web designers aren't some artist creating ultra cool, ultra rare stuff that your clients must absolutely have like a Banksy piece, they have more fundamental problems they're trying to solve and want you to solve them for them.
I got my ego checked in a hurry, but it was a good lesson to learn. You're not selling art, you're selling a solution to their problems.
The same goes for software devs who think it must be "framework-like code, extensible, reusable, something that will be there for 20 years." Well, no: if it's a CRUD app, it will most likely be trashed in 2 years. Stop overthinking and just do it :)
hmmm... That approach is anathema to every other UI designer or UX person I encountered in that field. The core of UI design is 100% about clarity-- letting the user focus on exactly what they need to solve their problem. The core guiding principle of UX work is designing based on empirical research, and then iterating based on user testing... even if it doesn't work out like that in practice, it's still laser-focused on helping the user achieve what they need.
Did you transfer into the field from a non-web-design background? The people I've seen approach web design with the intent of making some sexy website that's flashy for its own sake were a) front-end developers that thought the technical know-how was the hard part, b) branding and identity designers, or maybe print designers that never had to consider designs that people actually had to do stuff with, and c) small-org IT people that were sick of IT and were charged with maintaining the organization's website so they figured it would be an easy switch.
Apple stopped bundling an iPhone charger on recent models. Samsung did the same, but realized the backlash was enormous, and offered the charger for free (instead of being an additional purchase) if you bought a recent model.
Same with the headphone jack, although that was received much more negatively, and I'm pretty sure Samsung didn't give a damn about most of its users complaining they now had to buy new headphones to listen to music on their devices (they mitigated this a bit by bundling USB-C headphones with their flagship devices for a while).
It's an outdated line of thought that you need your designs to feel familiar to the user even when the competitors' patterns are dark or annoying rather than convenient. The average user is no longer a tech-illiterate person. We should stop assuming that common things like opting out of marketing/AI data training should be left for advanced users only, and make them available for everyone, with ease.
Originality certainly has a role to play in there - many (most?) iconic products were strikingly original. Would the iPod have been a better designed product with a D-pad (or other standard button arrangement) over its scrollwheel? Or the Wii with a standard gamepad?
Originality and novelty (particularly when it comes to visual aesthetics) are forces people respond to, and great designers know how to channel those forces in constructive ways for their work.
Or the 3DS without the 3D?[0]
What they do is move them around every few months, change the colours, and design our application with a mobile layout, despite 99% of our users being on desktop computers…
Designers that change up the UI just to have something to do drive me nuts! And now it's life-threatening, because cars get updates every ~12 months where some designer has decided to change how you use the car. So things you got used to suddenly change, and you have to figure them out WHILE YOU'RE DRIVING!
I hope someone manages to sue over these changes when someone inevitably dies, so there'll be some pressure not to make them.
Poor fuckers probably don't even know that there is a standard -- CUA -- for what order those are supposed to be in: File, Edit, View ... Tools, Window(s), Help.
And without a mobile layout before, I wonder why?
It's bonkers that not everyone is on board with responsive design being table stakes in 2024.
I recently found this really excellently designed grain bin from Masuda Kiribako:
https://kirihaco.shop-pro.jp/?pid=181616902
It looks nice but is fairly simple; if you haven't spent a bunch of time looking at available alternatives it might not look like anything special. Keeping grain away from insects and humidity and oxygen (and sometimes rodents, though I'm not sure how well this one would do in that case) while still being able to access it easily is not trivial. Plastic buckets work well and are cheap but don't look as nice and most lids are annoying (I suspect the lid on this one might possibly be a bit annoying as well but likely not as bad). Glass jars are nice to use but fragile and best for smaller amounts. Wood is particularly challenging due to the dimensional instability and they use a particular type of wood with something like eight years of preparation to make durable boxes. (I suspect the magnet on the scoop is pure marketing though, you can't even use it when refilling if you hook the lid on the edge which is the one time it would be really handy).
I think low latency is one of the things that makes software and websites feel really nice to use and is often overlooked.
Perhaps a project is 50% similar to existing project A, 45% similar to existing project B, and 5% novel. Finding this correct balance of copies of A and B, and finding a good solution to the novel part - this process feels "original" in many ways.
Copying is the way art works as well (at least for those who are not doing super-edgy-fine-art).
Typical journey of a digital painter:
1. Refuse to copy. Refuse to even look at references.
2. Hoard references. Over reference.
3. Copy in the right way.
But there is still a difference between a designer who blatantly slaps an existing aesthetic onto your project and a designer who tries to come up with a suitable look from first principles.
Design isn't styling, it is the visual organization of information with styling. So unless your information is the same the outcome will differ anyways.
> In the middle of Apple’s case against Microsoft, Xerox sued Apple, hoping to establish its rights as the inventor of the desktop interface. The court threw out this case, too, and questioned why Xerox took so long to raise the issue. Bill Gates later reflected on these cases: “we both had this rich neighbor named Xerox ... I broke into his house to steal the TV set and found out that [Jobs] had already stolen it.”
Apple engineers got to see the Alto, not the Star (the screenshot in the article is wrong, the chronology is wrong). The visit was so fast that Apple engineers thought they saw realtime overlapping windows when they didn’t. [0] So it’s possible Xerox was inspired by Apple with the Star, not the other way around.
Meanwhile, Bill Gates totally outs himself as someone who would steal shamelessly.
[0]: https://folklore.org/On_Xerox%2C_Apple_and_Progress.html
I didn't know that touring somewhere meant you could copy all their designs. Was that explicitly stated?
What the Apple engineers did was take obvious inspiration from what they saw at PARC but then ended up going in a different direction when they actually had to both implement it and make it workable as an OS. The overlapping windows is the most oft-cited innovation they came up with but there were many other perhaps more subtle ones.
The impression I have though is then that Gates basically copied Apple engineering, not PARC.
> Jobs's company stood on the precipice of a public offering guaranteed to make him and any investors wealthy, and the tech guru's impending good fortune enticed the suits at Xerox to make him an offer he couldn't refuse: Let us buy shares in your company, and we'll give you a peek inside the greatest minds in your field.
[0] https://www.newsweek.com/silicon-valley-apple-steve-jobs-xer...
"....Here is Mike Boich's recollection of the 'Xerox' story, which goes a little differently, and is likely to be more faithful than mine: The meeting was one of the quarterly meetings, where Steve, Mike Murray, Belleville, you and I all got together with the Microsoft crew, which at the time was usually Bill, Jeff Harbers, Jon Shirley, and sometimes Neil and/or Charles Simonyi. I don't recall whether Windows had been announced, or we were just concerned about it, but Steve was trying to convince Bill that having a "Chinese wall" between the Windows implementers and the Mac implementers wasn't sufficient for us to work well together. He was trying to get them to forget about the OS business, since the applications business would be much bigger total dollars. He said, "It's not that I don't trust you, but my team doesn't trust you. It's kind of like if your brother was beating up on my brother, people wouldn't say it was just your brother against my brother, they would say the Gates are fighting with the Jobs." Bill responded that "No Steve, I think it's more like we both had this rich neighbor named Xerox, and you went in to steal the TV, and found that somebody else had stolen it. So you say, "hey, that's not fair. I wanted to steal the TV"."
Not fun at all. Microsoft is like Disney, they steal from others and trounce others for stealing from them.
Absurd people.
I do recall Disney (a main reason copyright laws last so long, and who didn't want Steamboat Willie to enter public domain).
I also think of Amazon (which the creator of the Elm programming language describes as having "the Jeff problem" because they steal smaller people's/team's ideas), although that's a different problem.
I can't say anything comes to mind right now about MS, though, which is most likely a failure of my memory/knowledge. So I'd appreciate some examples.
And on the forum which should most know it to be true!
We create new things by collecting, regurgitating and mutating stuff we experience, just like LLMs. In a vacuum man has no ideas outside of base impulses.
Hence why originality is a novice belief. The closer you get to any field, the more you realize the stories around who made all the breakthroughs are BS media narratives. Most if not all steps forward in any field have hundreds of people clawing at similar ideas concurrently.
Designing things has two goals:
- Make old things seem new
- Make new things seem old and familiar
Both need a lot of knowledge about how humans work and how we have made sense of the world up until now. Design can't be made in a vacuum and without input.
Edit: To expand: An LLM would never have come up with touch input. It would have regurgitated the existing ideas of using a pen or a mouse to point at things on a screen. Coming up with touch input was a huge feat of human engineering that was a combination of design (making touch interaction obvious for any human, old or young) and engineering (making that interaction actually work).
They might not have had experience 2 years ago, but in the meantime they have assisted hundreds of millions of people on billions of tasks. Many of those are experiences you can't find in a book. They contain on-topic feedback and even real-world outcomes to LLM ideas. Deployed LLMs create experiences, get exposed to things outside their training distribution, search solution space, and discover things. Like AlphaZero, I think search and real-world interaction are the key ingredients. For AZ the world was a game board with an opponent, but rich enough to discover novel strategies.
> Philosopher René Girard, scholar Robert Hamerton-Kelly, and Thiel co-founded IMITATIO in 2007 to support the “development and discussion of René Girard’s ‘mimetic theory’ of human behavior and culture.” Mimetic theory, the concept that humans are fundamentally imitative, has had a profound effect on Thiel, who calls Girard “the one writer who has influenced me the most.”
https://www.theverge.com/2016/12/21/14025760/peter-thiel-het...
Pull down to refresh is a great example of this. Not visible or discoverable at all, but was all the hype when Tweetie first released it. On paper it's an anti-pattern, but now it's so ingrained as a trend and pattern that it became expected, and is now muscle memory for many users.
The same goes with flat buttons - I used to be quite opposed to them since there was no visual elevation off the page designating it as a button. Now if you create a button with a bevel, users will think it's an ad, not part of the page itself.
Copying leads to harmony in the wider ecosystem, and it creates a defined agreement on what things are and how they work. It's an important part of the user experience.
Pull to refresh is useful and optional.
Flat buttons save precious space on tiny mobile devices.
You're probably mistaking a flat button for a link / undecorated button. Apple's HIG refers to these as plain buttons for iOS[1]. I'm referring to flat vs bevelled[2], which take up the same space.
[1] https://developer.apple.com/design/human-interface-guideline...
[2] https://image.non.io/428397dc-93ae-4158-8b71-323bd11182a0.we...
> That just your own bad taste.
Please be civil
[link to hn guidelines here]
Many (bad) designers confuse what I would call styling with design. Design is a lot about functionality and how information is organized visually. These two core design points can only be copied if the underlying project is exactly the same in terms of underlying information. But even for two blogs about different topics the question which information needs to be presented how would be different — even if both blogs were using the browser's default CSS. This is the core of design.
Styling is finding colors, shapes proportions etc. All of this of course overlaps with the functional question and the question of organization of information — bigger buttons get more attention and all that — but ultimately you can slap more or less any style on any content. Whether it makes sense is a different question.
A lot of people think of aesthetics when they hear "design". But design is about how things work. Everything we use was at some point designed by someone.
In our SaaS company we changed the role of Designers to Product Designers to help people understand it a little better.
There is only so much information a person can process at once; there are certain expectations about where they will find things, which can be met or subverted; colors and font choice are all important for how information gets processed on the functional level.
So in the end, design cannot exist without some degree of aesthetic choice, and the better designers are, the better they are at choosing aesthetics that serve the functional choices (if that is the goal of their designs).
If you make it completely utilitarian it might become boring, and the function of a design in a SaaS company could also be to sell the product..
The video (if I recall correctly) goes a bit further, attacking patents/IP law as anti-creative.
I find this happens in UI/UX design too. When you're trying to come up with the best interface for a problem, there are only so many directions that make sense once you've explored the design space and understood all the constraints.
With desktop and mobile interfaces for example, all operating systems and devices have converged on a lot of similar patterns and visuals. I don't think this is because people are unoriginal, but given the constraints, there's only so many decent options to pick from so many designers will inevitably converge on the same solution.
> I’m a designer. As a designer, I feel the need to be original.
I'll often come up with a solution on my own after immersing myself in a problem for a while, then after looking at existing work more later, find it's already been done. I'll then sometimes even consider changing my solution so it doesn't look like I copied, but usually there's no obvious other direction you can go in that is close to as good.
Listen dude, go ahead and buy the $145 Modway chair. It's so bad, it is $118 nowadays. It will literally fall apart under your ass. Read the reviews.
Followed by pictures of two different-looking chairs. IMO the Modway looks notably worse.
Whether you believe that it’s worthwhile or worthless to copy, whether you think that copies are a valuable part of the design community or a scourge, you are using software, hardware, websites and apps that all owe their existence to copying.
As long as there is design, there will be copying.
Want an online menu for your restaurant? Well, you can't just go copying someone else's design, so you must create your own from scratch. Will yours look and behave practically identically to the other? Yes. Will both websites be overall worse quality than if everyone just collaborated on a standard design? Yes. Would it save the world an incredible amount of redundant work to just allow people to copy each others' work? Yes. Who wins in this arrangement? Only those who have already won.
Keep looking at this pattern, and you will enter a deep cavernous rabbit-hole. At the bottom, you will find yourself at the very core of design itself: the goals, philosophies, and systemic failures of every design we use today can be traced back to this point: collaboration must be avoided at all costs. Compatibility is the cardinal sin, and it must be punished.
So we go on, building silos upon silos. When will we ever learn?
---
There is a lot of talk lately for change. They say, "AI will be the end of copyright. It's too important to hold back the potential of AI over a petty argument for intellectual property." I don't believe for a minute that LLMs will ever reach the lofty goal of "General Intelligence". I don't believe for a minute that megacorps like OpenAI, Google, and Meta deserve a free pass to siphon data for profit. So why is it that these words ring true? AI has nothing to do with it: it's design itself that has incredible potential, and we should absolutely stop holding it back. Intellectual Property is nothing more than a demand against progress.
Went to art school and a significant part of my art history class dealt in remembering the name of art "movements" which is a veiled way of saying a period when everyone was copying each other. Then of course you learn about the influential artists who heavily borrowed from xyz. Another funny one is "revival" which just means "straight up copy"
This is why I have limited sympathy for the uproar about AI art. It's just cutting through the boring part.
I think we still haven't found a proper economy for the digital world. The fact that pirating game of thrones was a better option than waiting for it to be premiered in your region goes to show there is still a lot of work to be done in this area. If there wasn't piracy, free software, open source and american VC (the first few waves, not the last few), this industry wouldn't have grown at this pace.
I'd like to offer a more moderate option--or perhaps just radical in a different direction.
Artists would like to make a living, and the "deception" comes from how that slogan is used to falsely present the powers-that-be as able, willing, and actively delivering on that goal.
It would have been natural, but also depressing.
Paying per-copy and agreeing not to copy for some fixed period is more consumer friendly than, say, everyone pooling their money into a giant one-and-done Kickstarter and just trusting that the end result will be good. If your work can be published serially, then something like Patreon might work, but that's impractical for a lot of larger projects. The consumer unfriendliness manifests in the form of risk: who is out the money if something turns out to suck or, worse, doesn't even get made? The traditional "sell copies with a monopoly" model means that if I don't like a work, I just don't buy it. We have reviews to inform people if a thing is good or not, but you can't review a finished work based on the Kickstarter campaign. This results in a market dominated by scams of varying degrees, customers who are hesitant to put money into campaigns that might not produce, and artists that can only really make the business model work if they have a lot of social capital and reputation to stake.
I mentioned fancy capitalist words like "risk" and "market", so let's talk about the capitalist side of the business: the publishers. Or "managerial types", as it were. They do not make their money from selling the service of creating art, they make money from selling art that has already been made, which is capital. When Napster was telling people to stop paying for music and just steal it, the publishers shat their pants. An embarrassingly large part of the music business at the time was reissuing old acts on CD[0][1], and even new acts had to sell albums, which is why 90s listeners had to deal with a flood of albums with one good song and 10 terrible ones.
It's specifically the capitalist side of the business that got screwed over the hardest by Napster. What screwed over artists was Spotify, which made music profitable again for the capitalists by turning it into a subscription. A music Boomer[2] accurately summed this up as a faucet pouring water straight into a drain. This is the best way to devalue artists, because it doesn't matter what songs the artists make - just that the publishers control the flow of the songs.
The Spotify mentality has percolated into basically every other form of media over the last decade. It's why you will own nothing and be 'happy', and why every publisher CEO has a boner for generative AI, even as their artists are screaming their heads off about being scraped. Publishers have nominally been stolen from as well, but they don't care, because the theft is in their benefit[3]. It's the exact opposite of the Napster situation. What matters is not what will benefit the artists, nor what the law says. What matters is what will make them richer.
[0] This is also why the SPARS code was a thing for a few years - to distinguish between new recordings made for CD and reissues riding the hype of digital music.
[1] Metallica also found themselves caught on the back foot, mainly because they found out Napster users were trading pre-release soundtracks they'd made. Their reaction made them look like suits for a while, because Metallica had gotten popular through unlicensed copying, though I don't think this read was entirely fair.
[2] https://youtu.be/1bZ0OSEViyo?t=485
[3] I don't think generative AI will replace real artists, but it doesn't matter so long as publishers believe it can.
As far as I know, a raider doesn't share or enable others.
Nobody outside of Gen X PC gamers knows what Commander Keen is. Everyone knows what Mario is. While copying may be the way design works, copying only gets you so far.
Keen was/is great, but Mario 3 and Mario World are on the shortlist for best game ever.
If you’re small time and have a great idea, you’re better off going stealth and this is its own mitigation against destructive copying.
"Individualism would have it that the work of a genuine artist is altogether ‘original’, that is to say, purely his own work and not in any way that of other artists. The emotions expressed must be simply and solely his own, and so must his way of expressing them.
It is a shock to persons labouring under this prejudice when they find that Shakespeare’s plays, and notably Hamlet, that happy hunting-ground of self-expressionists, are merely adaptations of plays by other writers, scraps of Holinshed, Lives by Plutarch, or excerpts from the Gesta Romanorum; that Handel copied out into his own works whole movements by Arne; that the Scherzo of Beethoven’s C minor Symphony begins by reproducing the Finale of Mozart’s G minor, differently barred; or that Turner was in the habit of lifting his composition from the works of Claude Lorrain. Shakespeare or Handel or Beethoven or Turner would have thought it odd that anybody should be shocked."
I do understand the desire to protect one's work too and find it hard to take a single side.
If you model ideas mathematically, you will see that societies plagued with IPDD (https://breckyunits.com/ipdd.html) will become extinct, because they prolong the lifespan of bad ideas, and those with intellectual freedom, where bad ideas rapidly evolve into good ideas, will rise to the top of the food chain. The equation is simple: ETA! (https://breckyunits.com/eta.html)
Question whether we should even have a concept of "licenses" (hint: we shouldn't). Look up "freedom licenses", which "freed" African Americans used to have to carry around in the 1800's. Think about how future generations will look at us for having a concept of "licenses on ideas". Think about the natural progression of automatic licenses on ideas (copyright act of 1976), to breathing: there is no reason not to require "licenses" to breathe, given that you exhale carbon dioxide molecules just as you exhale "copyrighted" information.
I ask, because these intellectual property protections are intended to incentivize creation. If that incentive overwhelms these models of information sharing and testing frictions then the model is incomplete.
Judge something not by what people say it does, but by what it actually does.
> If that incentive overwhelms these models of information sharing and testing frictions then the model is incomplete.
Agreed. But try as I might, I can't find any way theoretically or empirically to model copyrights and patents that show a positive impact on innovation.
Nature's survival of the fittest already provides near infinite incentive to innovate.
Now, I think patents and copyrights had a positive side effect in the early days of the United States, because they created a centralized library in the District of Columbia containing all of the latest information across the fledgling nation. But with the Internet, we don't even need that anymore. All the other parts of those laws are harmful and a drain on innovation.
Look at what happened with Windows/CrowdStrike: ultimately another harm caused by closed-source, under-evolved, "IP protected" ideas. Ironically, Microsoft calls Windows their "Intellectual Property" when collecting money, but when that IP harms people, suddenly it's not their property.
> Is there any evidence that the equations in the blog post model the real world?
Depends on where you live. If you live in America, evidence is all around you. :)
But here is some hard data, thousands of programming languages ranked by languages most used to build other languages (which gives an objective measure of idea quality):
https://pldb.io/lists/explorer.html#columns=rank~name~id~app...
Utterly dominated by open source langs. Closed source, IP ones are headed for extinction.
I'm deeply suspicious of this conflation. I think it's done on purpose, in bad faith, for nefarious reasons.
The article isn’t explicitly dated (afaict). Using an inflation calculator leads me to believe it was written in 2019 [0]. The same calculator indicates a material deviation from the quoted number: “$145 in 2024 equals $10.16 in 1947.”
Amazingly, the chair is listed on Amazon now at $118.53 [1] (at least for my login/cookies/tracking; price includes shipping estimated at 6 days), the equivalent of $8.31 in 1947, a 60% off sale.
The cost probably has some externality tradeoffs however. Was the wood clear cut by children from thousand year old forests? Was the chair manufactured by prisoners using chemicals known by the state of California to cause cancer?
0. https://www.saving.org/inflation/inflation.php?amount=145&ye...
1. https://www.amazon.com/Modway-EEI-510-WEN-Fathom-Mid-Century...
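The conversion above can be sketched in a few lines of Python. This assumes the 2024-to-1947 price ratio implied by the calculator output quoted earlier ($145 → $10.16); the actual calculator may use slightly different CPI figures.

```python
# Price ratio implied by the quoted calculator result:
# $145 in 2024 equals $10.16 in 1947, so the ratio is ~14.27x.
RATIO_2024_TO_1947 = 145.00 / 10.16

def to_1947_dollars(price_2024: float) -> float:
    """Convert a 2024 price to 1947 dollars using the implied ratio."""
    return round(price_2024 / RATIO_2024_TO_1947, 2)

print(to_1947_dollars(145.00))   # 10.16 (the list price)
print(to_1947_dollars(118.53))   # 8.31  (the current Amazon price)
```

Plugging the current $118.53 price into the same ratio reproduces the $8.31 figure cited above.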
https://www.everythingisaremix.info/
It's been submitted to HN many times but has never spawned any discussion:
It could be because, at least for me personally, I found the first 15 minutes to be a little too boring. Perhaps people just gave up before then.
Originality is overrated in art; painting restoration usually entails repainting large sections of the original. The image and the ideas far transcend the "original," which is usually reserved for bragging rights for uber-rich collectors. The best art is the art you get to enjoy every day.
Maybe interesting to point out from what year it is. It looks like 2020.
Wed Oct 28 2020 00:00:00 GMT+0000 (Coordinated Universal Time)
Article has been submitted twice, but never gained any traction (no comments, very few votes):
As for why they copy the shape and size: that is the part where you can be more artistic, and it seems they have no taste.
(Affordance meaning building on what people are already familiar with so they don't have to relearn an interface)
They will learn a lot from doing so.
(Probably stolen)
- Pablo Picasso
> We’re not designers, or programmers, or information architects, or copywriters, or customer experience consultants, or whatever else people want to call themselves these days… Bottom line: We’re risk managers.
Somewhat unrelated, but it's a shame that manifestos have such a bad rap, most often associated with terrorists and such. There is something sorta nice about sitting down and clearly declaring your thoughts on a subject. It makes sense that people pushed to the edge want to let us know why they are behaving the way they are, but it's a shame that normal people aren't encouraged to reflect upon their thoughts and write them down. Being able to think about a topic and put on paper that these are my thoughts and feelings about $x brings a certain amount of clarity to your thinking, and it can help other people understand your thinking in a way that has a lot of power. Consider historical documents like the Declaration of Independence: the points are laid out in a way that, even if you disagree with them, there is no denying what they are declaring.