• (made with Electron, Node.js, HTML, SCSS, p5.js, ml5)

  • If machines are run on biased data, what does that mean for the future of humanity, especially for those who are targeted? The goal of this project is to highlight the human bias embedded in datasets and stress what that could mean for the future of technology.

    Concept & Domains

    What does the future look and feel like when government systems run on biased data? SAL demonstrates the dangers of biased data in technology. It watches and examines the user throughout the project and judges them based on corrupt datasets.

    • Human-Computer Interaction
    • Biased Data
    • Machine Learning

    I used ml5 and Face++ to help build the user's profile. ml5 (PoseNet) was used to watch the user's movements for certain poses and behavior. Face++ was used to set the user's score and estimate their age, race, and gender.
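Since ml5 runs in the browser, the pose-watching part can be sketched as a pure function over PoseNet-style keypoints. This is a hypothetical illustration, not SAL's actual code: the `handOverChest` helper, the chest offset, and the distance threshold are my assumptions; only the keypoint shape (`{ part, position, score }`) follows PoseNet's conventions.

```javascript
// Hypothetical sketch: detect a "hand over chest" pose from a
// PoseNet-style result ({ keypoints: [{ part, position, score }] }).
// Part names follow PoseNet; the thresholds are invented for illustration.
function handOverChest(pose, maxDistance = 60, minConfidence = 0.5) {
  const find = (part) =>
    pose.keypoints.find((k) => k.part === part && k.score >= minConfidence);

  const wrist = find('rightWrist') || find('leftWrist');
  const leftShoulder = find('leftShoulder');
  const rightShoulder = find('rightShoulder');
  if (!wrist || !leftShoulder || !rightShoulder) return false;

  // Approximate the chest as a point just below the shoulder midpoint.
  const chest = {
    x: (leftShoulder.position.x + rightShoulder.position.x) / 2,
    y: (leftShoulder.position.y + rightShoulder.position.y) / 2 + 30,
  };
  const dx = wrist.position.x - chest.x;
  const dy = wrist.position.y - chest.y;
  return Math.hypot(dx, dy) <= maxDistance;
}
```

In the browser this would be wired to ml5's pose events, e.g. `poseNet.on('pose', (results) => handOverChest(results[0].pose))`, and the boolean fed into the user's profile.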

    Form Focus

    Initially I wanted a one-page design with all the information displayed at once, but the format became very cluttered during prototyping. After some revision I settled on a single page with tabs that display different pieces of information relative to the user. This is what I've settled on for now, but it's subject to change and I will update this page if it does.

    To make sure the user experiences some bias, I added a single if/else statement to the code that determines the outcome from the beginning. When the user takes their picture, SAL analyzes them and determines their race; if they're identified as white, they automatically pass. Also, during the Pledge of Allegiance, SAL takes a photo of the user, and if they have their hand over their chest they gain extra points. This was just for the sake of the prototype; in future iterations the final outcome will be based on various other factors as well.
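The deliberately biased branch described above could look something like this. This is a sketch under stated assumptions: `judgeUser`, the point values, and the pass threshold are hypothetical; the `ethnicity.value` field mimics the shape of a Face++ Detect response, but the exact values Face++ returns may differ.

```javascript
// Hypothetical sketch of the single biased if/else described above.
// Field names mimic a Face++ Detect response; the scores, threshold,
// and point amounts are illustrative assumptions, not the real values.
function judgeUser(faceAttributes, handOverChestDuringPledge) {
  let score = 50; // assumed baseline score for every user

  const ethnicity = (faceAttributes.ethnicity?.value || '').toLowerCase();
  if (ethnicity === 'white') {
    // The rigged branch: an automatic pass, regardless of anything else.
    return { score: 100, passed: true };
  }

  if (handOverChestDuringPledge) {
    score += 25; // extra points for the pledge pose
  }

  return { score, passed: score >= 75 };
}
```

Keeping the rigging in one small function like this also makes it easy to swap in the subtler, multi-factor scoring planned for later iterations.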

    User Testing

    During the Thesis Popup show I was able to get a lot of feedback from various testers. The main problem was that the instructions weren't clear enough, which often caused the message to get lost along the way. I was also frequently told to push it further and make the bias more subtle rather than so in-your-face. The National Anthem and the Pledge made the experience seem like too much of a joke to some users, which took away from the purpose. I have to find the right balance of satire for the project.

    By making the process extremely biased I was able to let users feel what it's like to be ruled out by a machine, but the approach didn't let them feel it the right way: the directness of the bias made it less impactful. Moving forward I plan to implement this in a far more subtle and realistic manner.