In the name of science?
According to Mimi Launder, designers have a whole toolkit for steering users' behavior without their awareness. Knowing these tools matters, because it lets us avoid treating our customers unethically.
Such unethical patterns are called dark patterns, and designers deploy them so that users never notice them, or misrepresent information just subtly enough to avoid legal trouble.
Some designers run experiments built on these tools, which may include things like hidden checkboxes, confusing or deliberately abstract terms and conditions, and so on.
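To make this concrete, here is a minimal Python sketch of what a "hidden checkbox" dark pattern amounts to: consent fields that default to on and are easy to miss, contrasted with an honest form. All field names here are invented for illustration.

```python
# Hypothetical signup defaults; every field name is made up for illustration.
DARK_SIGNUP_DEFAULTS = {
    "subscribe_newsletter": True,       # pre-ticked, easy to overlook
    "share_data_with_partners": True,   # buried under a collapsed section
}

HONEST_SIGNUP_DEFAULTS = {
    "subscribe_newsletter": False,      # user must actively opt in
    "share_data_with_partners": False,
}

def silent_opt_ins(defaults: dict) -> list:
    """Consents the user never explicitly gave (i.e. pre-ticked boxes)."""
    return [field for field, ticked in defaults.items() if ticked]
```

With the dark defaults, `silent_opt_ins` returns both consent fields; with the honest defaults it returns an empty list, so the user ends up opted in only to what they actually chose.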
It is important to understand that experiments like these are everywhere; if you do not want to end up an unwitting participant in them, your only real option is to give up the internet altogether.
The more of our interactions move onto social networks, the easier it becomes for companies to slip in unethical practices that would be easy to spot face to face. Additionally, the internet has peculiarities of its own that cannot be reproduced outside it, largely because companies simply lack sufficient grounding in ethics and morality there.
When Andrew Ledvina, a former Facebook data scientist and software engineer, told a Wall Street Journal correspondent that Facebook at the time had no checklist or review system, he probably did not expect that, out of his 45-minute interview, this was what the journal would focus on.
Andrew tried to rectify the situation with a satirical article titled "10 Reasons Why Facebook Is the Devil," which met with a hostile reaction: readers (correctly) felt he was not taking the issue seriously enough. Andrew even said he did not understand what all the fuss was about; according to him, every Facebook user was involved in one experiment or another, in marketing as well as in design and in the algorithmic structure of their pages.
Andrew says that each user's experience matters greatly to Facebook's data scientists, which in his view justifies offering worse conditions to a handful of users in order to create a better situation for more than a billion.
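Silent enrollment of the kind Andrew describes is typically implemented with deterministic hash-based bucketing: every user ID maps to a stable bucket, and a slice of the buckets gets the experimental variant without anyone being told. The sketch below illustrates the general technique only; the function names and the 100-bucket split are assumptions, not Facebook's actual code.

```python
import hashlib

def bucket(user_id: str, experiment: str, buckets: int = 100) -> int:
    """Deterministically map a user to a bucket in [0, buckets) for one experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % buckets

def in_treatment(user_id: str, experiment: str, percent: int) -> bool:
    """Roughly `percent`% of users silently receive the experimental variant."""
    return bucket(user_id, experiment) < percent
```

Because the assignment is a pure function of the user ID, the same user always sees the same variant and no record of consent is needed anywhere, which is exactly the property that makes this kind of experiment invisible to its subjects.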
That same week, another manipulative experiment by the social media giant came to light. Over the course of a week, the news feeds of half a million users were skewed toward either sad or happy stories, and the users in the experiment mirrored that mood in their own statuses.
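Mechanically, an experiment like this can be as simple as filtering a treatment group's feed by a precomputed sentiment label. The sketch below is a generic illustration under that assumption; the field names are invented.

```python
def filtered_feed(posts: list, treatment: bool, suppress: str = "negative") -> list:
    """Control users see the full feed; treatment users lose posts of one sentiment."""
    if not treatment:
        return posts
    return [post for post in posts if post["sentiment"] != suppress]
```

The asymmetry is the whole point: two users ask for the same feed, and one of them quietly receives an emotionally skewed version of it.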
Although this involved only 0.07% of users, the results of the experiment damaged people's general perception of the company and raised the suspicion that the social media giant is not really acting in its users' interests.
Facebook obviously did not want anyone to hear about it. Keeping a practice hidden, however, does not make it ethical.
The problem here is not minor changes to a design or an algorithm. The issue is the power over users that is the real object of the research.
With experiments like these, Facebook wanted to measure the emotional frames into which it could push its own users. When the process is hidden, giving consent to the terms of the study becomes impossible. In professional research of this kind, subjects must generally be informed about what they will be interacting with.
Facebook could not have known in advance that experiments like this would have only a negligible impact on users. What is clear is that the company did not meet the academic standards associated with such practices.
Nor are such experiments necessary. Facebook itself eventually realized the risk was not worth taking and began rolling out changes, such as emoticon-based reactions, in the open.
Personal life and manipulation
Cases like this are not limited to design changes. For example, the company OkCupid (a Tinder-like app that matches people) lied to its own users about compatibility: couples who clearly did not suit each other were told they were "perfectly matched."
Like Facebook, OkCupid responded flippantly, proudly stating that they were "experimenting on people," which prompted a response from the Washington Post.
Although Facebook deliberately fed sad information to its users, it did not attempt to directly and actively worsen people's relationships (in this particular case).
How do designers solve similar problems?
Not surprisingly, people are divided on the issue. A relatively primitive ethical position is that committing a certain amount of "evil" is acceptable if it improves the situation for most of us.
Happily for us, a much more interesting and solid counter-position was outlined 24 centuries ago; unpacking it fully would take too long here. Its principles, however, suggest that it is better to drop an experiment altogether than to deny users an accurate description of the situation they will find themselves in.
In general, more customers always bring more responsibility, and the tech giants, in our experience, have not managed to live up to it.
Improving people's situation to their detriment is a contradiction in terms. Ends and means are interconnected, not separate, so evil cannot be justified by an idea of "goodness."
Sadly, compromising on truth, principles, and values is an everyday part of too many people's lives, today just as 24 centuries ago.
It is your responsibility, as a designer, not to make such decisions unintentionally.