Friday, May 1, 2020

Book Review: Rebecca Skloot's "The Immortal Life of Henrietta Lacks"

This is the second of my posts written during the COVID-19 quarantine, during which I've been trying to catch up on reading I had been neglecting. This week I'll be reviewing freelance science writer Rebecca Skloot's The Immortal Life of Henrietta Lacks (2010), an engaging, thought-provoking, and unflinching work of nonfiction at the intersection of medical ethics, race, and biotechnology.

The year was 1951. Henrietta Lacks, a 31-year-old Virginian mother and tobacco farmer, found a hard "knot" on her womb that she could not explain (Skloot, 2010, p. 56). It wasn't until a subsequent appointment that she was diagnosed with cervical cancer. While Henrietta was anesthetized for a radium treatment, samples of her tumor were taken without her knowledge or consent by Johns Hopkins tissue-culture researcher George Gey, who was on a quest to culture a line of human cells that would be useful in advancing biomedical research. Up to that point, most cell lines did not last long, because a sample's original cells divide only a finite number of times before dying, roughly once they have "doubled fifty times," making research resource- and time-intensive (Skloot, 2010, p. 390). This limit on cell division is known as the "Hayflick Limit" (Skloot, 2010, p. 390).

Unfortunately, Lacks would succumb to her unusually aggressive cancer not long after her diagnosis and gruesome radium treatments; but her cells, known as "HeLa" for the first two letters of her first and last names, would live on (and live on still) in labs around the world. Her cells, infected by a particularly virulent strain of human papillomavirus (HPV), were cancerous and therefore not constrained by the Hayflick Limit like normal cells. It was those properties of hardiness and rapid reproduction that allowed researchers to make numerous advances beneficial to humankind, developing treatments and vaccines for a wide variety of ailments and diseases without risking human test subjects.

Not coincidentally, biomedical and pharmaceutical companies would largely reap the benefits of HeLa's rise. The Lacks family, initially unaware that the cells had been commercialized, would learn of HeLa's fame and accompanying profitability only twenty years after Henrietta's death, and felt an understandable sense of betrayal and exploitation. Skloot contextualizes this episode within a long history of gruesome experiments conducted on African-Americans, such as the infamous Tuskegee syphilis study, to help readers understand the complex ethical issues surrounding patient consent, tissue ownership rights, and scientific advancement. The latter is especially important considering that our tissues are collected all the time, for example, when we attend doctor's appointments or send our DNA to companies like Ancestry and 23andMe.

Of course, there are now laws that seek to protect patient consent, privacy, and rights, such as those prohibiting genetic discrimination or the unauthorized release of patient medical records. However, the reach of these laws is limited; modern consent requirements, for example, largely apply only to federally funded research. Should these laws be expanded to cover situations federal research law does not reach? Would doing so chill scientific advancement? These are but a few of the questions raised. Skloot provides no definitive answers, leaving it to the reader to make an informed judgment on these important ethical questions.

Overall, Skloot's interweaving of the Lacks family's story with HeLa's scientific journey makes The Immortal Life of Henrietta Lacks an informative and engaging must-read for everyone, scientist or not.


Works Cited:

Skloot, R. (2010). The immortal life of Henrietta Lacks. New York: Broadway Books.

Wednesday, April 8, 2020

"The Danger of a Single Story"

Like many of you during this pandemic, I too am largely confined to my dwelling. As an introvert used to staying in, I don't find this much of a problem! However, I have had to find more creative ways to structure my time now that I am on leave from my library position and have extra time on my hands. One way I am keeping busy is by taking advantage of remote staff development opportunities like webinars.

Today, I viewed the Wisconsin Library Association's presentation from this past October titled "Putting Equity, Diversity, and Inclusion into Action" and learned about how public libraries are making themselves more welcoming to diverse patron communities. During this webinar, a 2009 TED Talk by Nigerian author Chimamanda Ngozi Adichie, well known for works like We Should All Be Feminists (2014), was referenced among a list of resources libraries are using for diversity training. Curious, I decided to check it out after I finished with the webinar. And I'm glad I did!

Titled, "The Danger of a Single Story,"  Adichie's talk highlights the dangers of stereotypes that crystallize in people's minds after being exposed to media that presents a singular narrative about a culture, people, and/or place. For example, the overrepresentation of stories from or about Africa that speak to disease, poverty, and war that tend to overshadow other stories from the same continent that speak to its cultural, economic, ethnic, lingual, political, and social diversity and vibrancy. In other words, people and places are not monolithic; there are nuances to their histories and stories that are hidden or minimized when singular narratives repeated over and over overpower these shades of gray.

As an avid consumer of literature and history alike, I find these shades of gray all the more interesting and enlightening in the ways they challenge those powerful narratives and reveal something about those who perpetuate stereotypical messages. Adichie presents all of this in an accessible way relevant for anyone, bibliophile or not.

Check it out! See the link in the references section below.

Stay safe and stay well!

Works Cited:

Adichie, C. N. (2009). The danger of a single story [Video file]. Retrieved from https://www.ted.com/talks/chimamanda_ngozi_adichie_the_danger_of_a_single_story.

Monday, January 6, 2020

Questions of Hegemony, Meritocracy, and Technological Redlining

[Happy New Year, everyone! This essay is a piece I wrote for one of my library school courses about algorithmic bias, and I thought I'd share it with you. While it isn't written optimistically, I have since come to realize that, as an aspiring librarian, I have a responsibility to try to help correct some of the flaws inherent in our knowledge organization systems.]


There is a growing body of multidisciplinary research within critical information and technology studies suggesting that technology and technological progress are not isolated from the socio-historical contexts in which they are situated (Allen, 2019; Noble, 2018; Wajcman, 2010). The “big data” algorithms that have come to govern much of our lives, from those underlying search engines to those used in housing selection, are no exception (Allen, 2019, p. 219; Noble, 2018, p. 27). Multiple scholars have found that algorithms are complicit in “technological redlining,” a carry-over of gendered and racialized discrimination onto the “cybertopia” of the Internet, a project meant to be liberating (Allen, 2019; Noble, 2018, pp. 1, 47; Wajcman, 2010).

In this post, I will further explore the concept of technological redlining as it relates to hegemonic and meritocratic ideas through an examination of the introduction and first chapter of information studies scholar Safiya Noble’s book Algorithms of Oppression: How Search Engines Reinforce Racism (2018) and of legal scholar James Allen’s journal article “The Color of Algorithms: An Analysis and Proposed Research Agenda for Deterring Algorithmic Redlining” (2019). Both scholars have contributed greatly to our understanding of the harms resulting from the unregulated proliferation of algorithms, and both have called for greater awareness among the public and policymakers, who consume and produce information largely mediated by these algorithms.

Beginning with Algorithms of Oppression, Noble makes it clear that algorithms are technical products reflecting the privilege and biases of their designers, a dominant group in tech that believes in the meritocratic idea that popular technologies acquired their status through an impartial yet democratic process of technological development involving the best ideas and practices (Noble, 2018). To demonstrate this, she conducts a case study of commercial search engine algorithms (focusing on those employed by Google) by searching various terms related to group and individual identity and then contextualizing the value judgments and ideas represented in the results within the wider intersectional critical literature. The results were shocking, often bringing up derogatory, pornographic, racist, and sexist content about women and people of color (Noble, 2018).

For example, she highlights the differences in the search suggestions that are brought up by Google’s auto-suggestion algorithm in searches beginning with the phrases “why are black women” and “why are white women.” For the former, suggestions to complete that sentence are “angry,” “loud,” “mean,” “attractive,” “annoying,” and “insecure.” As for the latter, suggested terms were “pretty,” “beautiful,” “mean,” “easy,” “insecure,” and “fake” (Noble, 2018, p. 21). Note the primary positioning of both white and black women as sexual objects. Conversely, a search for “professor style,” a professional identity term, brought up mostly pictures of white men in very similar suit-and-tie and khaki outfits (Noble, 2018, p. 23).
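Anyone curious about this method can probe an autocomplete system themselves. Below is a minimal Python sketch of such a probe; it assumes Google’s unofficial suggestqueries.google.com endpoint, which is undocumented and subject to change, and today’s suggestions will likely differ from the results Noble reported (this is an illustration of the technique, not her actual procedure, which was simply typing queries into the search box).

    # A hedged sketch: querying an (unofficial, undocumented) autocomplete
    # endpoint to inspect suggested completions for a given prefix. The
    # endpoint and response shape are assumptions and may change or break.
    import json
    import urllib.parse
    import urllib.request

    def suggestions(prefix):
        url = ("https://suggestqueries.google.com/complete/search"
               "?client=firefox&q=" + urllib.parse.quote(prefix))
        with urllib.request.urlopen(url, timeout=10) as resp:
            payload = json.load(resp)  # assumed shape: [prefix, [suggestion, ...]]
        return payload[1]

    for prefix in ["why are black women ", "why are white women "]:
        print(prefix.strip(), "->", suggestions(prefix))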

Noble argues that there is nothing new in these racist and sexist interpretations, which manifested in most of the image results retrieved for the terms above: women are associated with reproduction, sex, and the domestic sphere, and men with the public and professional spheres. For Noble, this is further evidence of a larger pattern of “technological redlining” and “algorithmic oppression,” in that harmful stereotypes are merely being reproduced and transmitted within newer modes of communication such as the Internet (Noble, 2018, pp. 1, 4). Not coincidentally, historically marginalized groups like women and people of color often do not have the economic, political, and social clout to stave off the damage that results from being associated, at an individual or group level, with these popularized portrayals. “Code is a language full of meaning” that imbues technology with the subjective understandings of its creators, rendering normalized narratives about its impartial nature inaccurate (Noble, 2018, p. 26).

While Noble’s methods of exposing the otherwise invisible power relationships evident in Google’s search algorithms through a black feminist lens are largely qualitative, her insights are nevertheless erudite and evocative, highlighting the power a dominant minority with specialized skill sets holds over a dependent and less technologically literate majority.

James Allen’s treatment of housing algorithms complements Noble’s argument that search engines are symptomatic of tech’s larger complicity in preserving and promoting the very oppressive system of social relations its “neutral” algorithms purportedly avoid reifying. While Noble conducts her own case study to demonstrate techno-feminist critiques of the harms stemming from normalized gendered and racialized stereotypes, Allen conducts an extensive literature review before concluding with a substantive section on possible solutions to the problems treated in his analysis. Moreover, Allen examines Noble’s “algorithmic oppression” more specifically as “algorithmic redlining,” defined as “sets of instructions” that “carry out procedures that prohibit or limit people of color from procuring housing or housing finance, particularly in non-minority neighborhoods” and that threaten the “autogenerating” of discrimination (Allen, 2019, pp. 222-223, 230; Noble, 2018, p. 1).

Echoing Noble’s assertion that people must critically examine how their data is used in machine-learning algorithms, Allen finds that most of these algorithms, meant to provide an efficient means of assuring equitable lending and affordable housing for as many people as possible, draw upon a wealth of discriminatory historical data generated during (and reflecting) past periods of segregation, combined with present-day data mining of additional factors such as ethnicity and residential location. For example, in the determination of creditworthiness, a critical first step in securing a loan, many eligible applicants are either denied or given disadvantageous interest rates because of a high concentration of risk factors in their data profiles, such as having lived in a majority-minority neighborhood.
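To make that mechanism concrete, here is a toy sketch of my own (entirely synthetic data, not any real lender’s model, and not code from Allen’s article): a classifier trained on historically biased approval decisions learns a neighborhood flag as an apparently neutral “risk factor,” reproducing the old redlining pattern.

    # Toy sketch with synthetic data: a model trained on historically biased
    # lending decisions learns a neighborhood flag as a proxy for risk.
    # Illustrative only; not any real lender's scoring model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Features: an applicant's actual repayment ability, and whether they
    # live in a historically redlined (majority-minority) neighborhood.
    ability = rng.normal(size=n)
    redlined = rng.integers(0, 2, size=n)

    # Biased historical decisions: equally able applicants from redlined
    # neighborhoods were denied about half the time regardless of ability.
    approved = (ability + rng.normal(scale=0.5, size=n) > 0) & (
        (redlined == 0) | (rng.random(n) > 0.5)
    )

    X = np.column_stack([ability, redlined])
    model = LogisticRegression().fit(X, approved)

    # The fitted model assigns a large negative weight to the neighborhood
    # flag, carrying the historical bias into "objective" future decisions.
    print("coefficients [ability, redlined]:", model.coef_[0])

Nothing in the training step looks discriminatory; the bias rides in on the data, which is precisely the self-reinforcing dynamic both scholars describe.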

Both Allen and Noble advocate algorithmic literacy among the public and policymakers, along with more transparency in algorithmic design. Allen pushes more specifically for updating past reforms, such as the Community Reinvestment Act (CRA) and the Fair Credit Reporting Act (FCRA), and their equivalents in Internet and intellectual property law (Allen, 2019). Essentially, there must be a higher bar for disclosing and auditing both the data algorithms use and the conclusions they reach, along with strictly enforced human oversight throughout the decision-making process (Allen, 2019).

Overall, both scholars provide valuable insights into how power is negotiated between the hegemons who know how to manipulate code and those who do not. Neither, however, goes much beyond establishing a research agenda: they do not elucidate the exact technical mechanisms needed to minimize algorithmic bias, nor do they specifically address how women and people of color in tech can negotiate a culture of meritocracy that undervalues their work and achieve positions of influence in the development process. No doubt this is due to the self-reinforcing nature of algorithms, which draw upon the fallibility of their creators, the data gathered about people by governments and private entities, and their own past decisions, all paired with the shield that proprietary and intellectual property law grants an algorithm’s code (Allen, 2019; Noble, 2018). It is difficult to advocate specific fixes for code one cannot access. All in all, it is apparent that reforms are needed at multiple levels of society, but the growing prevalence and profitability of algorithms, combined with the need to address the larger structural systems that fostered their creation, make the prospect of meaningful reform highly unlikely.

Works Cited:

Allen, J. A. (2019). The color of algorithms: An analysis and proposed research agenda for deterring algorithmic redlining. Fordham Urban Law Journal, 46(2), 219-270. Retrieved from http://search.ebscohost.com.ezproxy.library.wisc.edu/login.aspx?direct=true&AuthType=ip,uid&db=lft&AN=136193121&site=ehost-live&scope=site.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. Retrieved from https://ebookcentral.proquest.com.

Wajcman, J. (2010). Feminist theories of technology. Cambridge Journal of Economics, 34(1), 143–152. Retrieved from https://academic.oup.com/cje/article-abstract/34/1/143/1689542?redirectedFrom=fulltext.
