Now, this is a problem for bigger reasons:
An older picture of Denzel Washington gets an "OK": http://imgur.com/Li0gZqH
A recent picture of Howard Stern gets a "Hot": http://imgur.com/L8hxoVK
Obviously this is just a toy and your algorithm is pretty inexact, but... you need to fix it, or at least note in giant letters that it only works for white people right now and that you're working on making your algorithm more universal. Because it only (kinda, sorta?) works for white people right now. If you claim something is universal in your headline and only note its specificity in the fine print, you're lying. If you build an algorithm that calls most people who aren't white ugly, you need to think about the buzz-to-backlash factor of demoing it.
It's really not a good look, and you've got a week at most before you get called a "Nazi Dating App" on Twitter and your potential VCs get spooked and pull out. I don't think it's intentional on your part, but literally no one cares what your intentions are when there's an opportunity to create moral-indignation clickbait. Just a friendly word of warning!
Secondly, what makes you think that every race is universally equally attractive? Studies show people find people of their own race more attractive. If whites are rated as more attractive (whether by the training set or by user-base ratings), doesn't that merely reflect the composition of their user base?
PS: Here's some solid advice for public-facing startups looking for funding. Definitely take the autistic high ground on every issue. Investors don't care about bad PR; they only care about abstract principles of truth as you understand them.
Well, there goes my last ounce of self esteem.
If anyone needs me I'll be sitting in the corner with one of those criminal hacker ski masks while I work on open source stuff.
https://polybox.ethz.ch/index.php/s/tI9QTAblYVanuA9
:-)
(XPpPpP)
I didn't get above the "Nice" level.
The first picture of my wife immediately got the "Godlike" level. Unfair advantage!!
"Before performing any experiments we removed underage people, anyone over 37 and bi- and homosexual users as these comprise only a small minority of the dataset."
That's already biasing it of course!
"Interestingly, 44.81% of the male ratings are positive, while only 8.58% of the female ratings are positive. Due to this strong bias of ratings by women, we only predict the ratings of men."
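The quoted filtering step amounts to something like this toy sketch (the record layout and field names are my own invention, not from the paper):

```python
# Toy sketch of the paper's filtering step: keep only ratings made
# by men, since female ratings were overwhelmingly negative.
# Data and field names here are invented for illustration.
ratings = [
    {"rater_gender": "m", "positive": True},
    {"rater_gender": "m", "positive": False},
    {"rater_gender": "f", "positive": False},
    {"rater_gender": "f", "positive": False},
]

# Drop all ratings by women, mirroring "we only predict the ratings of men".
male_ratings = [r for r in ratings if r["rater_gender"] == "m"]
pos_rate = sum(r["positive"] for r in male_ratings) / len(male_ratings)
print(f"positive rate among male raters: {pos_rate:.0%}")
```

Whatever the exact mechanics, the model's notion of "hot" is then the average taste of male Blinq raters.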
So, the algorithm learns to rate me as a heterosexual man?
Now, I look even worse and don't own a working webcam.
On a more serious note, the training set probably didn't include data from Asia.
Edit: Just scrolled down and saw your question (and the answer you received). :C
Cool work! I'm curious regarding the training dataset. What is the distribution of faces by race/age? Also regarding the raters, what is their distribution? (Race/age/cultural background)?
It's widely known that attractiveness is heavily dependent on cultural upbringing.
Signed, Butthurt dev whose best pic only rated an "OK".
EDIT: You also rated Yoona (a Korean pop star) as just "Nice": http://imgur.com/uJVnQ9S. I guess that makes me feel better about my "OK". I'd stay out of Korea if I were you---I hear their fans aren't very forgiving.
I am Radu, one of the authors.
As mentioned in our article (http://arxiv.org/abs/1510.07867v1), the training dataset for attractiveness is from Blinq. Face images of underage people and of those over 37 were discarded. Everyone in our training dataset is heterosexual and mainly from the Zurich area of Switzerland. Therefore, our model of attractiveness reflects the cultural bias of Zurich, Switzerland.
We consider faces.ethz.ch a fun little tool. I hope that the fans of Yoona will understand :) With more training data from outside Switzerland, our algorithms will better fit their expectations.
For age and gender we used much more diverse data, so those predictions are more reliable for the majority of ethnic backgrounds.
Edit: Sorry, missed it! http://arxiv.org/pdf/1510.07867v1.pdf
Picture or Extracted Face?
If you're good-looking you can pretty much use whatever.
But for someone like myself who is ethnic and not visually attractive, my success rate is really low on certain sites and acceptable on certain apps.
For example, my performance on Match (Graphic I made: http://i.imgur.com/UZuSzD9.png) was pretty woeful in December. But I started using another app in the same week and had much higher success in getting responses relative to effort level.
https://www.crunchbase.com/organization/opinionaided#/entity
http://techcrunch.com/2013/06/26/thumb-social-polling-app-me...
I mean, is it different from Project Oxford, the Microsoft API that's been around for a while and is still quite amazing?
https://www.projectoxford.ai/demo/vision#Analysis
I actually tried it out early this morning, to compare it with a stock install of OpenCV 3. It got the faces correct, and the ages very well too.
Here are its guesses for the Star Wars TFA poster: http://imgur.com/XT7RmX6
Of course, perhaps users have trained it... particularly ones sympathetic to Carrie Fisher. Though I'd argue that they would've also corrected Boyega's face.
Edit - tried a different pic, it guessed my fiance was 51.
Lighting obviously has an effect, but I was pretty surprised at how it got TFA's Han Solo down decently well. In the poster, he looks more like he's in his 50s.
Lighting obviously plays a part. I wonder if race does as well? To use the common stereotype: does the algorithm guess that you're Asian, and then adjust its age guess downwards?
- Be female.
- Face should occupy about 1/3 of the image.
- Cut off your forehead.
- Show your long hair.
- Oversaturate the face.
- Put a filter on it.
- Add a border.
I hope this helps explain the aforementioned result.
The real question is how old and hot is that popcorn?
More than anything, I'm curious to know what features it was that registered me as female. Was it as simple as the long hair, or some complicated subtle mix of many small details?
What direction was it off on your age? I'm 31 (30 in the older pictures I sent), and it said I was 19-22 in all the pictures I tried.
This made me laugh really hard. What a positive twist on the fact that the algorithm is clearly a WIP.
I am Radu, one of the authors.
We thought about similar experiments; however, psychology is not our expertise. If you check our paper on hotness/attractiveness (http://www.vision.ee.ethz.ch/~timofter/publications/Rothe-ar...), I am sure that you'll find some interesting results on how people in different age groups rate, a paradox, and more. And yes, there are many interesting experiments to do and questions to answer.
Also, for some reason, in the pictures it rates least attractive it tends to identify me as much younger (across both genders), more than a decade younger than I was in the picture, and it'll rate it "Hmm..."
As for the highest-rated pictures... I can't figure out what it's doing, though one where someone else did my makeup, and it was perfect, was among the two it rated "Stunning". I was surprised that the ones I tried to feed it where my phone's "Beauty face" kicked in (which removes most wrinkles and skin flaws) didn't seem to rate any higher... though makeup did make a difference.
A fun little toy.
Edit: Oh, and other than occasionally docking me a decade as mentioned, it was pretty accurate on age (±3 years, generally). Which I find interesting, as I'm frequently told I look younger than I am.
Uploaded a picture from about a month ago... same, except it said I was 19.
Well, I'm flattered it thinks I can't drink, and I'm glad I pass. Too bad it doesn't think I'm hot, but I've always preferred to go for cute over hot anyway.
I'm gonna dig through my photos and see how consistent this is... (edit: a couple more, 21-22, female, and still ice cold)
I'm actually a year younger in the second picture than the first. Maybe it's the glasses?
Yeah this site is bogus. So many inconsistent ratings.
Has anyone tried a picture with large amounts of cash in the background?
A lot of the issues our estimator (just an age estimator) ran into were the standard face recognition problems: occlusion, lighting, and (the obvious) bogus images.
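Some of those pre-checks can be very simple. Here's a toy lighting sanity check of the kind an estimator might run before prediction (the thresholds are invented for illustration, not from any real system):

```python
# Toy pre-check: reject badly lit uploads by mean grayscale intensity.
# Thresholds lo/hi are invented for illustration.
def lighting_ok(pixels, lo=40, hi=215):
    """pixels: flat list of grayscale values in [0, 255]."""
    mean = sum(pixels) / len(pixels)
    return lo <= mean <= hi

print(lighting_ok([10] * 100))   # nearly black frame: False
print(lighting_ok([128] * 100))  # mid-gray frame: True
```

Occlusion and bogus (non-face) images are harder; those usually get handled by whether the face detector fires at all.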
To anyone involved: what dataset was the attractiveness scale built from? (The Labeled Faces in the Wild dataset? http://vis-www.cs.umass.edu/lfw/)
I am Radu, one of the authors.
After 2 years we face almost the same issues, but we probably cope with them differently. Note that our solutions are fully described in the two papers mentioned on our faces.ethz.ch page. For attractiveness we used data from Blinq.
Our apparent age estimation solution is the winner of the latest LAP challenge, ICCV 2015.
- body type
- piercing
- tattoos
- eyeglasses
- colored hair
- etc.
Check out how SenseTime implemented a similar feature.
Anyway I only got "connection error".
Minus: I got the lowest rating possible. Haha, terribly depressing feedback before a date.
Then again, that bears out in reality. People who smile are perceived as more attractive.
I put in pictures of really attractive Asian guys (specifically, men who honestly have a lot of diehard female fans who are interested in them), and at best they got "Nice".
http://pasteall.org/pic/show.php?id=97312 Off by almost a decade on age.
Check this album out: http://imgur.com/a/1a1tn
Seems racist and sexist towards men.
Our data consists only of normal (natural-looking) face images in the wild (from IMDB, Wiki, and/or BLINQ user profiles). On such data we get very good apparent age prediction (better than the human reference) and also very good gender and attractiveness prediction.
The attractiveness is highly subjective and its perception varies from one culture/region to another. We used data from Switzerland.
Our solutions are far from being perfect and the guessed results should not be taken too seriously.
We are considering updating our models to explicitly deal with distortions and non-human face content.