Digital Citizenship

I have never felt at ease existing in the digital world. I’ve created and then abandoned or deleted probably a dozen Facebook, Twitter, and Instagram accounts collectively. I’ve never had an Amazon account despite sharing a home city, Seattle, with the company for the majority of my post-university life. There is something off-putting about the need to always be engaged, to consistently participate in the conversation. My discomfort with digital life has not been limited to the web. I didn’t grow up with video games, let alone cable TV or a VCR. Despite being a somewhat competitive endurance athlete, I’ve never logged my runs or bike rides with GPS, nor paid much attention to my power output or heart rate. Technology has always felt too complicated to mix well with me. Yet somehow, I find myself existing professionally in the digital space. And I’m still not sure how I feel about it.

I will concede that the advent of the Internet has, on the whole, been positive for the development of society. The ease of information sharing has improved research and the delivery of health and human services. It has connected families and reconnected old friends. However, even while doing good, the web has also been a marketplace for more sinister intentions.

The “invisible algorithmic editing of the web,” as Eli Pariser puts it in his 2011 TED Talk “Beware Online ‘Filter Bubbles,’” is an unsettling reality. That tech companies can so easily dictate what we see and what we do not see is arguably as manipulative a force as society has ever experienced.

The argument made on behalf of these arbiters of online content is that the filter bubble is decided by the user: that the user decides what gets let in, and that the experience is therefore more tailored and more efficient. I don’t take issue with tailoring digital experiences per se, but when users cannot see what is left out, as is the case with the filter bubble, they are being steered down a path not entirely of their choosing.

Pariser points out that algorithms are as much editorial gatekeepers as humans can be. To this point, algorithms may operate on their own terms, but they are not created on their own terms. People like you and me decide how to build these calculating systems that determine what others do and do not see. The companies that own these systems must be more thoughtful about how they are applied.

I’m unnerved by what is known about me. According to Pariser, at the time of his talk Google looked at fifty-seven data points about an individual user in order to customize their search results. That practice continues, possibly at a greater scale ten years later, even when the user is not logged into a Google account. Moreover, the writing was on the wall for 2016, both in the U.K. and here in the U.S. The ability of tech companies to gather such specific data points on individuals, and of partisan entities to access that data to spread misinformation, has proven dangerous.

In her 2019 TED Talk, Carole Cadwalladr laid out the circumstances behind the U.K. public’s vote to leave the European Union. I remember wondering how that vote could have ended the way it did, not fully aware that the same situation would soon play out in my own country to similar effect. Her presentation makes clear that the Brexit vote was the canary in the coal mine when it comes to free and fair elections. So long as companies like Facebook allow their platforms to be used for targeted misinformation, we are in for increasingly combative elections. The contentious 2020 U.S. presidential election and the disturbing circumstances surrounding the January 6 insurrection are unlikely to be isolated incidents unless stricter regulation of tech companies is put into place.

In my professional experience, any content we publish is tracked for performance and logged in an archive. That advertisements on Facebook are documented only through individuals’ screen captures is unacceptable. Facebook should not be the “black box” of ad content that Cadwalladr describes.

Contract for the Web

The Contract for the Web is, in theory, a positive effort to make the Internet an equitable space: a digital resource through which humanity can unite and thrive, and where users’ privacy and data rights are protected.

The contract’s position that “everyone has a role to play in safeguarding the future of the web” is agreeable. The three principles for each of governments, companies, and citizens clearly outline what those parties can do to support the contract. However, when viewing who has signed on in support of the initiative, Google, Facebook, and Amazon are among the first companies listed. Knowing how these three enterprises operate when it comes to using user data for profit, I find it difficult to see legitimacy in the contract. What does it mean for the contract’s worth when an actor as compromised as Facebook has signed on? Rebranding is not enough to regain consumer trust and appreciation.

I still hold hope that the next generation of digital technicians and leaders will choose to do right by their fellow global citizens. The web can create a world of good, but it can also create damage and lasting trauma in the physical world. That harm is preventable, but it requires that tech companies be thoughtful about their tactics, governments be intentional about their policies, and private citizens be careful with our clicks.

Bibliography

Cadwalladr, C. (2019). Facebook’s role in Brexit—and the threat to democracy [Video]. TED. https://www.ted.com/talks/carole_cadwalladr_facebook_s_role_in_brexit_and_the_threat_to_democracy

Contract for the Web. (n.d.). Retrieved December 12, 2021, from https://contractfortheweb.org/

Pariser, E. (2011). Beware online “filter bubbles” [Video]. TED. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles
