Plagiarms is an online teaching platform. It is wildly controversial because it delivers instruction online and relies on non-expert teachers.
Why Is Plagiarms in Online Learning an Ethical Concern?
Last week, Slate published a piece about Plagiarms (also known as Project Plagiarm), an online AI teaching resource for educators.
Like many digital learning resources, Plagiarms is wildly popular and useful for teachers and students. At its core, the teacher-created platform makes it easy to get the latest course content into students' hands with minimal setup for resources, scheduling, and other logistics. In just four years, the platform has produced some 68 learning modules spanning 24 languages and more than 1,000 content topics.
Plagiarms is built on the idea that a skilled teacher should make all of their own teaching materials. An ideal classroom is stocked with textbooks, practice exercises, materials, and supplements, but that is rarely how courses actually come together: teachers have to work with whatever resources they can access to get their students ready to learn. Plagiarms offers a working textbook platform that has been thoroughly vetted by teachers, industry experts, and each course's principal. In some cases, the platform is the textbook.
But there's a catch with Plagiarms' algorithm, and Slate warned teachers to be wary of the platform's potential ethical pitfalls.
Co-founder and COO Andrew Gelman told Slate's Vicky Ward that Plagiarms runs teaching content through an extended review by an artificial intelligence system to determine which tools students need and therefore which "are the right resources to use." The process takes up to three days, meaning any critical material is carefully vetted by teachers before it makes it into the platform. A section on ethics at the end of the Plagiarms opt-in consent page states that any material deemed "inappropriate" or "potentially unethical" is removed.
Gelman warns that if Plagiarms finds that a student or teacher has inadvertently taught a lesson that isn't appropriate, "the educator is asked to revise the lesson." But if an instructor neither revises the lesson nor keeps it up to date, the content can continue to be taught. Gelman told Ward that there is no way for anyone to find out what students were actually taught.
What's particularly concerning to me is that Plagiarms allows educators to blacklist materials they find inappropriate, keeping those materials from students. Why should we allow schools this form of censorship? The only purpose such tools serve is to guarantee that favored materials remain in circulation. Recall that in the '90s, K.T. McFarland, the director of Central Intelligence for the USA, was put in charge of the CIA's network of unclassified informers; the New York Times described it as "classified work" because of how much secrecy surrounded it. In government, these same methods are used to hide unethical practices from the public, and thus from the media.
Gelman told Slate that the platform does this to keep "workers" from learning of government wrongdoing, but that it "does not censor any kind of lesson plan or student learning." The irony is obvious: power of this kind carries the risk of abuse, and these tools can easily be turned against the public by teachers who would use the platform to censor it, or to censor their own students.
In recent years, we've seen a series of ethical issues arise around technology. The Federal Trade Commission's report cited 2017 budget cuts to the Education Department (including the possibility of teacher layoffs) as possible reasons for the increased use of technology in schools, where the line between personal use and public use has blurred. According to the FTC, non-law-enforcement uses of the Internet and the Internet of Things decreased by almost 90% from 2012 to 2016. After it was revealed that computer science teacher Stacy A. Chou had used her school's IP address to access companies' websites, her conduct was called "indefensible."
If schools are already facing the threat of layoffs and massive budget cuts, this is not the time to let students and teachers be used as guinea pigs for a platform's own purposes. We should be wary of products like Plagiarms and ask not only whether they are ethical, but whether they are necessary for our safety as a society.