How I'm fighting bias in algorithms | Joy Buolamwini

Comments

  1. Lan Astaslem · 12 days ago

    I remember when TED talks were not leftist propaganda

  2. Lan Astaslem · 3 days ago

@Zacny Łoś Instead of criticizing another person's accomplishments, maybe she should do a talk on how we can get black people to stop committing a hugely disproportionate amount of crime

  3. Zacny Łoś · 3 days ago

What is leftist about non-working code or a webcam?

  4. Paul's Treehouse · a month ago

    This video is one of the best examples of the victim complex so pervasive these days. I'm a graphic artist and know better than most the issues that emerge with video compression and low contrast values. The solution to her problem isn't activism, it's 16 bit per channel imagery.

  5. Paul's Treehouse · a month ago

    ...and using "cheap" face recognition software as a base doesn't help.

  6. Ethan Wright · 2 months ago

    Hahahahahahahahahahaha. Black people fighting against the machines... can't be you must be the machines. Machines are racist bwahahahahahahaha.

  7. Tom Atkinson · 4 months ago

You should add her link www.ajlunited.org/ The Algorithmic Justice League

  8. Alex K · 4 months ago

Are robots racist for refusing to recognize black people?

  9. embraced chimera · 5 months ago

    they worshipped work of their hands that had not ears to hear or eyes to see (but could "speak") the work of their own hands/cannot buy or sell without it. A.I. all the way. robots have not ears or eyes..but they say they do...but the bible verse meant made with own hands=robot without REAL/human biological ears and eyes yet was "speaking" yes..so yes it responds to you...responding means it 'heard' you/saw you. just without eyes and ears. speaking yes,.sound yes.....only YOUR ears/your real ones caused you to be able to hear an A.I. mark of beast/666/wisdom/man....what man is associated with wisdom? Solomon....black man....beast...Solomon ASKED for wisdom and RECEIVED it. which means as a black man HE DIDNT ALREADY HAVE IT. he asked and "received" it as a GROWN MAN. he wasn't born naturally with it. it pleased the Lord that he asked for it though..but then he failed with lower nature in combination of higher wisdom not even belonging to him! (women and riches) and kingdom was removed from him. still to this day.........same ole.........lower nature money and women........losing kingdom but why keep letting him try? that's a sin to keep trying.or giving to him.if you already KNOW he will heap chariots/riches and women to himself...its sin to let him.

  10. John Smith · 5 months ago

Her main talent seems to be being black. It would be refreshing to see a black programmer whose claim to fame was something other than his or her skin color.

  11. Lauren Cross · 5 months ago

    This is so interesting! Joy Buolamwini rocks.

  12. Brian B · 5 months ago

    I lost interest when she tried to turn this into a social justice cause. If we are going to bring social justice warrior nonsense into this then it opens Pandora's Box to a whole range of offensive arguments, like how one could argue that even using facial recognition software while not being of the same culture as the writers of the code would be something along the lines of cultural appropriation. Just keep it straight technology and leave out the lame attempts to make it a social justice cause and we can all get along better.

  13. Ayon · 6 months ago

    I agree with the premise of making services equitable in access and fair, but I think "unlocking equality" through digital technology is a vague and concerning mission. Equality of what? Between which groups/sub-groups? And who decides?

  14. Ayesha Ahmed · 7 months ago

    I don't get why this has so many dislikes? I liked her talk.

  15. Ian · 9 months ago

I appreciate the content of the video, but I wish she had included more statistical examples. It's one thing to claim your face wasn't recognized. It's another thing to display data on many people whose faces were scanned and were or were not recognized.

  16. Jyoti Swamy · a year ago

Thank you Joy! This world needs you, because programmers (as evidenced by the comments) have no understanding of the SOCIAL IMPACT of their work. Of course there may be other solutions, but it is the SOCIAL structure of your field that matters. This would not be a TECHNICAL issue if every programmer was black, but because RACIAL MINORITIES are highly disadvantaged due to unethical practices and historical processes (that have kept them from learning about such software relative to others, depending on race and gender), it cannot be due to a "glitch" in the system. WAKE UP PEOPLE!!! Racial paradigmatic bias exists in computer science as well. Also, this is a BLACK WOMAN talking about facial analysis software, which is changing the SOCIAL STRUCTURE of the field, and is needed to prevent issues like this in the future. You can most definitely argue that there are other ways to fix this, but you can't argue that the minority elite does not look like Joy. I swear this world needs to be more reflexive........UGH. JOY YOU ARE A QUEEN! Thank you so much for speaking up and being a voice for the voiceless in a very underrepresented field. (Simply look at the representation of the audience). WAKE UP YALL.

  17. Gabriel Butler · a year ago

    very cool kanye

  18. Daniel Matthews · a year ago

This talk is a bit of a fraud, as your choice of camera type matters as much as or more than the training of the software. Have a look at the image of the two women in this article: www.photonics.com/Article.aspx?AID=51523

  19. S Brown · a year ago

Website mentioned: www.ajlunited.org/the-coded-gaze

  20. VitaSineLibertatenih Hithere · a year ago

    This chick is reaching new levels of her inferiority complex. So fun to watch.

  21. Retro Gamer · a year ago

    I've been saying this for several years.. LOLs..... Oh wells. I'm still invisible.

  22. dhbceeyjnnv gcseyhnvv · a year ago

    only white people matter, stay mad fucking muds

  23. Clay Robert · a year ago

    Brilliant. But I don't think anyone should want to have their face recognized by a software. Doesn't that seem a bit intrusive?

  24. Colorfully · 2 years ago

The people programming/coding the algorithms that impact us all have implicit (internal) biases that become embedded in their code.

  25. Mario Viti · 2 years ago

Machine learning algorithms for classification minimize a loss, which is the average distance from the classifier's output to the ground truth. For practical reasons it is averaged assuming a uniform distribution; this hypothesis is well founded because if the population of your examples is sampled at random, then the arithmetic mean approximates the expected value of the underlying distribution. Bias occurs when examples are not sampled randomly and the distribution is not adjusted accordingly. The stupidity of the model is just a reflection of the stupidity of the engineer using it.
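
    [Editor's note] The point above — that a loss averaged over a skewed sample can look fine while one group fares much worse — can be sketched with a toy one-dimensional classifier. This is pure NumPy; the groups, shifts, and sample sizes are all invented for illustration and have nothing to do with any real face-recognition system:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample(group_shift, n_per_class):
        # Class 0 and class 1 differ by a fixed offset; group_shift models a
        # systematic difference between groups in how the feature is measured.
        x0 = rng.normal(0.0 + group_shift, 1.0, n_per_class)
        x1 = rng.normal(2.0 + group_shift, 1.0, n_per_class)
        X = np.concatenate([x0, x1])
        y = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)])
        return X, y

    # Training set: 95% group A (shift 0), 5% group B (shift 1) -- a skewed sample.
    Xa, ya = sample(0.0, 950)
    Xb, yb = sample(1.0, 50)
    X_train = np.concatenate([Xa, Xb])
    y_train = np.concatenate([ya, yb])

    # "Training": pick the threshold that minimizes the average 0-1 loss
    # over the pooled data, exactly the uniform averaging described above.
    candidates = np.linspace(-2, 5, 701)
    errors = [np.mean((X_train > t) != y_train) for t in candidates]
    threshold = candidates[int(np.argmin(errors))]

    # Evaluating per group shows the gap the pooled average hid.
    Xa_te, ya_te = sample(0.0, 2000)
    Xb_te, yb_te = sample(1.0, 2000)
    acc_a = np.mean((Xa_te > threshold) == ya_te)
    acc_b = np.mean((Xb_te > threshold) == yb_te)
    print(f"threshold={threshold:.2f}  group A acc={acc_a:.2f}  group B acc={acc_b:.2f}")
    ```

    Because group A dominates the training sample, the fitted threshold sits near that group's optimum and the under-represented group B pays the accuracy cost.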

  26. Mr. Mystiks · 2 years ago

Pretty pointless talk considering how face recognition was resolved way before this. This isn't a biased algorithm, it's called a bug, sweetheart. Companies rectify these errors on their own since it's in their best interest.

  27. kablamo9999 · 2 years ago

    Why are there so many idiots who are making comments here? Do they honestly believe that their idiotic and uninformed opinions are needed or wanted?

  28. Vince Ferdinand · 2 years ago

3:15 This is where it gets misleading. 3:43 There comes the blasphemous lie. She above all should know how things work. Although machine learning techniques based on analyzing photos play a part in facial recognition, this is not the cause of this specific problem. Computer software is, compared to humans, very bad at identifying visual patterns such as faces. Dark tones absorb light while lighter tones reflect it, and this reflection is needed for the machine to identify a face. Basically, the eye of the machine is, at this point, not advanced enough to identify people with a dark skin tone. As we continue to improve this technology it is only a matter of time, a few years perhaps, before this is solved. Her solution to employ more black people and train for racial awareness will not solve this problem in any way and seems politically motivated. What's scary to me is that the US government has indexed the faces of half their population and how they replace human decision making with artificial intelligence in various critical parts of society. Not a word of criticism is mentioned about that. She's hunting the racist algorithms. TED is starting to look like a madhouse.

  29. Kosm _ · 2 years ago

    how interesting that a video on algorithmic bias has equal parts likes to dislikes.

  30. The Health Body Fitness · 2 years ago

She's amazing

  31. BLACK CAT · 2 years ago

wtf.. What's with all those dislikes? This is a genuinely good TED talk.

  32. Noxious Jellyfish · 2 years ago

    so... physics is biased?

  33. Emmanuel Ezenwere · 2 years ago

    Nice one Joy, I'll join your fight!

  34. ME Soto · 2 years ago

    Better Off Ted vimeo.com/188033313

  35. MedEighty · 2 years ago

The process by which the majority of viewers of this video decided to rate the video: 1. Notice that the person presenting is black. 2. Notice that the person presenting is a woman. 3. Notice that the title of the video has "fighting" and "bias" in it. 4. Switch on racist and sexist brain circuitry. 5. Click thumbs down, before the video begins. 6. Move on.

  36. James Jeude · a year ago

As of May 2018 the like/dislike ratio is about 51/49, so "majority" might be an overstatement ... but ... the problem of inadequate training data is worth discussing, and it is the obligation of everyone in the AI field to discuss it. Look at the Google search for something as basic as 'grandpa' and you'll see almost entirely whites in the top few dozen results. This is not a result of bias at Google Image Search, but of the preponderance of examples that have the word 'grandpa' on (EXIF) or near the photograph, and the link and click history as processed by an algorithm that is proprietary to Google or Bing or Yahoo or whatever. The softer side of the question, the less technical side, is whether a company like Google has an obligation to undo the bias it picks up from the actual click, link, and web design behaviors of its billions of websites and users. So to the point of her video - does an image-recognition engineer have an obligation to look beyond the mass of evidence and check for bias in the less common cases? It's analogous to a statistician designing a survey to 'oversample' certain segments with low representation. ("Oversampling" doesn't mean "over-representing", contrary to the misunderstanding some politicians had during the 2016 election. The numbers are normalized before the survey is published.)
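
    [Editor's note] The survey analogy above maps directly onto sample weighting in machine learning: give each example from an under-represented group a larger weight, then renormalize, so the averaged loss treats groups equally without over-representing anyone in the published numbers. A minimal sketch; the group labels and counts here are hypothetical:

    ```python
    import numpy as np

    # Hypothetical group labels for a training set: 950 examples from group
    # "A" and 50 from group "B".
    groups = np.array(["A"] * 950 + ["B"] * 50)

    # Inverse-frequency weights: each group contributes equally to a
    # weighted-average loss regardless of head count.
    counts = {g: int(np.sum(groups == g)) for g in np.unique(groups)}
    weights = np.array([1.0 / counts[g] for g in groups])

    # Renormalize so weights average to 1, analogous to normalizing an
    # oversampled survey before publication.
    weights *= len(groups) / weights.sum()

    # Each group now carries half the total weight despite the 19:1 ratio.
    print(weights[groups == "A"].sum(), weights[groups == "B"].sum())
    ```

    Any loss of the form `np.average(per_example_loss, weights=weights)` then balances the groups; libraries that accept a `sample_weight` argument serve the same purpose.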

  37. billsbat capincap · 2 years ago

    now robots are racist

  38. Morph Verse · 2 years ago

Well, I once had to put my face in front of a cam and the cam didn't recognize my face either. She actually does something about it, so why the dislikes for this video? I support her actions, as long as they don't result in consequences that leave other groups out for the sake of the main group.

  39. T Clark · 3 months ago

Because she's displacing science with ideology.

  40. P3ncil L3ad · 2 years ago

    I would be happy if a computer could not recognize me.

  41. TheMrmoc7 · 2 years ago

    I'm as anti-feminist as they come but this was an interesting talk that did not warrant the like/dislike ratio. This is exactly the kind of talk from a minority member that we have been waiting for. A lot of people are overly sensitive about the word choice "bias" in the title but given how horrible similar talks have been in the past, this should be held up as an example of how to give a TED talk when you happen to be a minority member.

  42. Robbie H · 2 years ago

    You can tell most of the people who disliked the video don't have a science background. It IS a bias if the algorithm only recognizes a certain type of face. The word bias has no negative connotation by itself, it simply means preference or "works better with". She isn't saying the algorithms are "racist".

  43. The D-Rex · 2 years ago

    She didn't racially complain, she's actually trying to solve the problem. Do not know why so many retards disliked this. Way to show bias. I always see a lot of dislikes when it comes to colored speakers.

  44. john carter · 2 years ago

    hey pretty soon we will detect the reptilian shapeshifters

  45. procrasti86 · 2 years ago

This is the Kodak film problem all over again. At 7:11 she forgot to mention - Black Lives Matter. As an MIT graduate why don't you feed this machine learning code some dark-colored faces yourself instead of whinging about it, you lazy cow. Hurr durr blame the white people, their code is racist.

  46. Sbongiseni Mazeka · 2 years ago

    Black people aren't "people"... they are Gods.

  47. Top Nep · 2 years ago

    Racist science! Haha. lol

  48. Timo jissink · 2 years ago

I've actually studied everything to do with 3D printing and so also 3D scanning, and I've learned that there are three things that are difficult to scan with a 3D scanner: the first is shiny objects, the second is translucent objects and the last is black objects... "Black objects" - it's true, light gets absorbed by the color.

  49. cazador1022 · 2 years ago

    So when the robots take over you will be invisible to them...

  50. BunnyFett · 2 years ago

    Intro sucked, but the rest of the video was great.

  51. Off-Grid Optimist · 2 years ago

People watching TED are much less likely to discriminate; we enjoy unbiased factual information. I did watch the whole video before looking down to see the overwhelming number of dislikes, and the only reason I noticed was because I went to click dislike myself. The reason being that there are many reasons, from a technological stance, that the camera may not pick up her face, starting with lighting being the MOST likely. Most budget tech projects don't use low-lux cameras with a high dynamic range because they are more expensive, so dark objects get underexposed; ergo, no detail equals no recognition. There is no bias; she is clearly seeing what she wants to see, an excuse to stand up and talk about color. Seems to be all the rage these days. Apparently it's more important to dwell on the past than to face our current struggles like human trafficking. We are all different colors: black, brown, yellow, red, olive, or lacking pigmentation altogether. We need to stop focusing on liberating those who were oppressed in the past and move on to uniting as one species to protect our humanity. The evil forces in this world love that we are blinded by our struggles over color so that they can stay out of the spotlight and profit from our ignorance. Wake up America.

  52. Global Saturation · 2 years ago

    That feeling when even math is oppressing her

  53. Richard Bull · 2 years ago

    Thanks

  54. Stephen Clement · 2 years ago

    Cool initiative! I am sure coders would love you helping them identify their bugs and provisioning them with free data. Just make sure you remember they probably didn't do it intentionally and approach them kindly. Otherwise you will end up as crusaders fighting someone who isn't really your enemy.

  55. David Devine · 2 years ago

    I agree that this is a problem, but she seems to really have a shotgun approach to trying to create new or take over existing phrases / memes / acronyms ... feels to me like she's hoping one or more of them will gain traction for her own self aggrandisement ... self promotion is one thing, but it feels like this speaker took it to another level, beyond that which TED is normally known for.

  56. Rahn127 · 2 years ago

    If the facial recognition software isn't working properly, then you fix it. What you don't do is keep working with something that is broken and then whine about it in a video.

  57. Joshua Latham · 2 years ago

ffs, the computer learns what faces look like. If the program is fed white faces it's not going to recognise black faces. Rather than telling people they're doing a bad job, build your own face recognition software.

  58. Goldmeteora · 2 years ago

    Some ppl in the comment section are really better off getting a proper life

  59. Daniel Aparicio · 2 years ago

This woman's accent is very annoying and distracts from her talk.

  60. Trevor Miranda · 2 years ago

    So many downvotes, when she's a highly educated CS postgrad trying to create data sets that only make facial recognition better. People are dumb.

  61. Leonidas GGG · 2 years ago

    Black woman refers to "coded gays" as a problem... Beautiful way to start a talk. (I leave this to you internet)

  62. Petr Skokan · 2 years ago

    I am pretty sure that first US humanoid robot will have to be black.

  63. Petr Skokan · 2 years ago

    OMG ... black studies ?

  64. Michael Hartman · 2 years ago

Ensure that the error is not a matter of insufficient contrast. Increase your dataset with local data. Include what is not valid as well as what is valid. A computer is not a bigot any more than a wheelbarrow is prejudiced. Your amount of data is too small. Get over it.

  65. morbionicle · 2 years ago

Ahh, this is TEDx... that explains the banal topic