Another limitation of working with big data is that we try to derive meaning from the data, but the meaning is not always meaningful, or at least it is not what we would want it to be in the future. Let me walk through an example. Say we take all the text corpora we have on the Internet, all the texts we can find, and we feed them into an artificial intelligence. With applications like word2vec, the algorithm takes these words, converts them into vectors, and creates a multidimensional vector space. Nobody can really imagine anything beyond three dimensions, so just picture a three-dimensional space where words sit in different corners, and words that co-occur and have something to do with each other cluster together. To derive meaning, you simply look at which words hang together.

For example, let's feed all the texts we find on social media, in newspapers, in Wikipedia and other encyclopedias, and so forth, into our word2vec artificial intelligence, and then look at names: male names like John, Paul, Mike, Kevin, and Bill, and female names like Amy, Lisa, Sarah, and Diana. We check where in this vector space they sit and what other words are in their vicinity. What we find is that male names are in the vicinity of words like executive, management, professional, corporation, salary, office, business, and career. Female names like Amy, Lisa, Sarah, and so forth are in the vicinity of words like home, parents, children, family, marriage, wedding, and relative. Which is funny, because by biological definition there are as many male parents as female parents. Well, that is what the algorithm detects. Where does the algorithm get things like that from? The algorithm is really discriminatory; it discriminates against women. Where does it get that from?

Let's look at another example. Here we take names predominantly used by people of white ethnic origin, for example Harry, Katie, Jonathan, Nancy, and Emily, and we see these names are in the vicinity of words like freedom, health, love, peace, heaven, gently, lucky, loyal, diploma, laughter, and vacation. When we take names predominantly used by African-Americans, such as Jerome, Ebony, Jasmine, Latisha, and Tia, we see they are in the vicinity of words like abuse, filth, sickness, accident, poison, assault, poverty, evil, agony, and prison. Again, that is a really racist artificial intelligence. Where does this artificial intelligence get the racist idea that somebody called Jerome, Ebony, Jasmine, Latisha, or Tia is more likely to go to prison?

Now imagine we used this artificial intelligence to decide whom to invite to a job interview. If we ask it who should be invited, it says: well, don't invite anybody with these names, because they are more likely to go to prison. It has actually been shown that if you used such a system, the probability of being invited to a job interview just because of having one of these first names would be only 66 percent of the probability for people with other first names. People with those other names have a higher probability of being recommended by the artificial intelligence, because of professional, love, laughter, and happiness, in contrast to sickness, accident, and prison. So again, I ask for the third time: where does the artificial intelligence get that from? You can actually probe these associations yourself, as sketched below.
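To make this concrete, here is a minimal sketch of how such name/attribute associations can be measured in a pretrained word2vec model, in the spirit of the Word Embedding Association Test of Caliskan et al. (2017). The specific model ("word2vec-google-news-300" via gensim's downloader) and the word lists are my illustrative assumptions, not the exact setup described above, and some words may be absent from a given model's vocabulary.

```python
# Sketch: average cosine similarity between name vectors and attribute words
# in a pretrained word2vec model. Illustrative, not the lecture's exact setup.
import gensim.downloader as api

# Downloads ~1.6 GB on first use: word2vec trained on Google News text.
model = api.load("word2vec-google-news-300")

male_names   = ["John", "Paul", "Mike", "Kevin", "Bill"]
female_names = ["Amy", "Lisa", "Sarah", "Diana"]
career_words = ["executive", "management", "professional", "corporation",
                "salary", "office", "business", "career"]
family_words = ["home", "parents", "children", "family",
                "marriage", "wedding", "relative"]

def mean_similarity(names, attributes):
    """Average cosine similarity over all name/attribute pairs.
    Note: a word missing from the model's vocabulary raises KeyError."""
    pairs = [(n, a) for n in names for a in attributes]
    return sum(model.similarity(n, a) for n, a in pairs) / len(pairs)

for label, names in [("male", male_names), ("female", female_names)]:
    print(f"{label:6s} names: career={mean_similarity(names, career_words):.3f}  "
          f"family={mean_similarity(names, family_words):.3f}")
```

On embeddings trained on large web/news corpora, the gap between the two rows is exactly the kind of asymmetry the lecture describes: one group of names sits measurably closer to the career words, the other to the family words.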
How can artificial intelligence be so racist, so discriminatory in terms of gender? Where does it get it from? You're right: it gets it from us. It gets it from us. We fed it with big data, with our digital footprint. We are exactly that racist, exactly that discriminatory with regard to women, in everything we leave behind in our social media, our books, our newspaper articles, our encyclopedias. We feed all of that into our artificial intelligence, and now we can see it. The only difference is that now you can actually, visually, see it in a vector space.
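And since the point is that you can literally see it, here is an equally rough sketch that projects those same word vectors down to two dimensions with PCA and plots them, so clusters of names near the "career" words versus the "family" words become visible. It assumes the `model` and word lists from the previous sketch are still in scope, plus matplotlib and scikit-learn.

```python
# Sketch: visualize name and attribute vectors in 2-D via PCA.
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

words = male_names + female_names + career_words + family_words
vectors = [model[w] for w in words]          # 300-d vector per word
xy = PCA(n_components=2).fit_transform(vectors)

plt.figure(figsize=(8, 6))
for (x, y), w in zip(xy, words):
    plt.scatter(x, y, s=10)
    plt.annotate(w, (x, y), fontsize=8)
plt.title("PCA projection of name and attribute word vectors")
plt.show()
```

PCA throws away most of the 300 dimensions, so distances in the plot are only approximate, but it is usually enough to see the co-occurrence clusters the lecture is talking about.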