In my “Writing in Science” class we’re currently reading the autobiography of James Watson, one half of the duo credited with discovering the structure of DNA, a feat that won them the Nobel Prize. As a humanities student, my last encounter with hard science was ninth-grade biology, so Watson’s autobiography, filled with detailed genetic explanations and breakdowns of complex chemical processes, is over my head. Scientific jargon aside, I’m fascinated by the way Watson describes the level of competition and deceit among the labs in which he and his colleagues worked. For Watson, the fact that two countries (America and the UK) raced against each other to discover the structure of DNA seems to overshadow the amount of manipulation and espionage that went on behind the scenes. I mean, this background competition doesn’t surprise me. But I find it disturbing to learn how seemingly respected scholars could steal another person’s work and then be valorized for it. Like, what does this say about our ethical responsibilities to each other? Do those even exist anymore? What does it mean to collaborate with other people when we don’t know if they’re trustworthy?
I find questions like these super interesting because I don’t know the answers, especially when viewed from the frontier of new technology. In a podcast I recently listened to, a former member of MIT’s Media Lab talked about her experience watching the world of technology transform from something visible into something increasingly less so. Basically, she was concerned that the tools we use on a daily basis are becoming less noticeable and more discreet, and for good reason. The tools we use – be they smartphones, laptops, tablets, wearables, or other gadgets – have the capacity to store vast amounts of information about their users. It’s no wonder, then, that some people – like the woman featured in the podcast – feel uncomfortable about where the future of technology is heading. Is there a line between normalizing the technology we use and not being aware of it? If so, where is this line drawn, and who determines its location?
It’s kind of exciting to think about, in a Mel-Gibson-post-apocalyptic-Hollywood-blockbuster kind of way. But at the same time it’s really unsettling, just like it’s unsettling to find out that the process of peer review can be controversial. When it comes to establishing credibility among members of a field, how do we regulate the flow of information to ensure credit is given where credit is due? If the increasing availability of advanced technology makes it easier for human beings to become better liars, is that acceptable? I don’t mean to say that we’re all walking around pretending to be something we’re not. But then again, some of us are. And my question is, is that a bad thing or a good thing?
Are you for or against the decreasing visibility of new technology? Email me your thoughts at firstname.lastname@example.org.
Graphic by Sandra Bae.