Algorithmic Self

In today’s digital landscape, our identities are increasingly shaped by algorithms. These complex sets of rules and calculations determine the content we see on social media, the advertisements we encounter, and even the news we consume. This phenomenon, often referred to as the ‘algorithmic self,’ highlights the interplay between technology and personal identity. Algorithmic mechanisms on digital media are themselves powered by social behaviour, creating a feedback loop that entangles algorithms with existing social structures.

At the core of the algorithmic self is the idea that our online behaviours and interactions feed into algorithms that, in turn, influence our future actions. Are we becoming the people our feeds want us to be? Scroll long enough on platforms like Instagram, YouTube, or Facebook and you’ll notice that the content feels uncannily tailored to you. Your feed seems to know what you crave before you do, an oddly perfect mix of travel destinations, recipes, memes, news, workouts, and political takes. This can lead to a more personalised online experience, but it also raises questions about the extent to which our choices are truly our own. What began as a convenience has evolved into something far more consequential. We are not merely using algorithms anymore; we are slowly becoming the selves they design for us.

Algorithms are built to predict and keep us engaged. Every click, pause, like, or scroll is recorded and analysed. In return, the system feeds us more of what we have already consumed. This sounds harmless. After all, who wouldn’t want relevant recommendations? But personalisation is never neutral. When a platform rewards the content that hooks us, it amplifies our biases and shrinks our curiosity. Over time, the feedback loop begins to define our worldview, narrowing the range of opinions, art, music, or even relationships we encounter.
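The narrowing effect of this feedback loop is easy to sketch. The toy Python simulation below is illustrative only: the topic names, click probabilities, and explore rate are made-up assumptions, not any real platform’s logic. It serves whichever topics were engaged with before, and the served mix drifts toward whatever gets clicked most.

```python
import random
from collections import Counter

random.seed(42)

TOPICS = ["travel", "food", "memes", "news", "fitness", "politics"]

def recommend(history, explore=0.05):
    """Engagement-driven recommender: mostly re-serve what was clicked before."""
    if not history or random.random() < explore:
        # Cold start, or a rare exploratory pick.
        return random.choice(TOPICS)
    # Weight each topic by its past engagement count (the feedback loop).
    topics, counts = zip(*Counter(history).items())
    return random.choices(topics, weights=counts, k=1)[0]

def simulate(rounds=500):
    history = []  # record of every topic the user engaged with
    for _ in range(rounds):
        topic = recommend(history)
        # Hypothetical user: always clicks memes, clicks anything else 30% of the time.
        if topic == "memes" or random.random() < 0.3:
            history.append(topic)
    return Counter(history)

engagement = simulate()
print(engagement.most_common())
```

The dynamics are rich-get-richer: each click raises a topic’s weight, which raises its chance of being served, which earns it more clicks. After a few hundred rounds the engagement counts are far from uniform, even though the simulated user’s underlying tastes never changed.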

The unsettling part is that the algorithm’s goal is not truth, diversity, or personal growth. It is engagement. If desire makes you scroll, it will serve you love. If envy fuels your clicks, it will curate envy-inducing lifestyles. What feels like a reflection of your taste is often a reflection of what keeps you online.

Human behaviour is always shaped by culture, but algorithmic influence is different in speed and precision. Traditional media might set trends, but it never recalibrated itself in real time for every individual. Today, AI systems track micro-reactions (how long your eyes linger on a video frame, how quickly you swipe away) and adjust instantly.

This raises a disturbing question. When you decide to buy a product, support a social cause, or adopt a new hobby, how much of that decision is you, and how much is a carefully engineered nudge? We still feel autonomous because the algorithm rarely forces choices. Instead, it quietly limits what enters the realm of possibility. You can’t choose what you don’t see. Is this the erosion of free will?

Living in an algorithmic world also reshapes identity. Our “digital selves” are rewarded for consistency. The more we like certain posts, the more similar content we receive, and the more we feel pressure to maintain that version of ourselves, whether it’s the fitness enthusiast, the foodie, the activist, or the minimalist. The feed trains us to be predictable because unpredictability breaks the machine’s efficiency.

The rise of the algorithmic self also brings about ethical considerations. There are concerns about privacy, as the data collected to fuel these algorithms often includes personal and sensitive information. Additionally, there is the issue of transparency. Many algorithms operate as ‘black boxes,’ with their inner workings hidden from users. This lack of transparency can make it difficult to understand how decisions are being made and to hold platforms accountable for their actions.

Many people feel a subtle dissonance: their offline preferences drift, but their online persona stays fixed. We perform for the algorithm, optimising captions, hashtags, even our emotions, to remain visible. Our feeds don’t just reflect who we are; they encourage us to stay who we were yesterday.

But how, then, do we break the loop? The answer is not to reject technology altogether. Algorithms are not inherently evil; they can help us discover music, connect with communities, find a job we want, or learn skills we might never find on our own. The challenge is to reclaim agency within the system.

Practical acts of resistance can be quite simple: disrupt the feed by clicking on unfamiliar topics or following people outside your cultural bubble; time-box social media use or schedule ‘algorithm-free’ days; read newsletters or listen to podcasts where engagement isn’t the primary metric. There are many other ways to disrupt the loop and reintroduce randomness. The most important step, however, is awareness. Algorithms will always evolve faster than regulations or ethical guidelines. The only lasting defence is a conscious user, someone who understands that every scroll is a form of training data.

The algorithmic self represents a significant shift in how we navigate our identities in the digital age. As we continue to integrate technology into our daily lives, it is essential to remain mindful of the ways in which algorithms shape our identities and to advocate for greater transparency and ethical consideration in their design and implementation. The question is not whether technology shapes us; it always has. The real question is whether we allow a handful of opaque systems to quietly define what we desire, believe, and become. If we don’t actively resist, our algorithmic selves may thrive while our authentic selves quietly disappear into the feed.

Code Dependent: Living in the Shadow of AI

by Madhumita Murgia | 320 Pages | Genre: Non-Fiction | Publisher: Pan Macmillan | Year: 2024 | My Rating: 5/10

“My life—and yours—is being converted into such a data package that is then sold on. Ultimately, we are the products.”
― Madhumita Murgia, Code Dependent

Code Dependent is a collection of case studies about people affected by technology, without the rigour and analysis I was expecting. But then it is not an academic or research-oriented book; it sits in the popular non-fiction genre. Several of the case studies reflect on the dark side of technology and on social media’s manipulation of individuals and communities, and on what that means for their rights, privacy, freedom and future.

The book is an account of how the algorithms embedded in everyday, user-friendly apps like Google Maps, Uber, Instagram and Facebook are changing us and the way we see the world. Our data, and we ourselves as data, are continuously used for targeted advertisements that make businesses grow fatter.

Murgia defines AI as “a complex statistical software applied to finding patterns in large sets of real-world data.” I believe AI is much more than statistical pattern recognition, and this viewpoint of the author is quite narrow.

I agree with Murgia’s take on the emergence of a new data colonialism around the world, especially in under-developed and poor economies, where sub-contracting creates numerous jobs for data workers but the wealth created is not shared equitably. ‘Informed consent’, however, seemed misinterpreted in the book, and its treatment subjective.

There was less of ‘AI’ and more of ominous ‘shadows’ in the book. While it talks about algorithmic bias against people, it certainly carries flavours of the author’s own bias against technology. The book reads more as a case for data transparency than a demystification of the positives and negatives of AI and technology, and a pessimistic view of technological advancement is pronounced throughout.

The book is still a fascinating read, with glimpses of ‘how AI is altering the very experience of being human’.

Data is Divine

“In God we trust. All others must bring data.” This quote, attributed to W. Edwards Deming, holds true (and for some, data may even supersede God as the divine).

I have been in love with data, and the mysteries of the world it holds, right from my school years. I have tried to build data-driven models of human relationships and the movement of animals, to find patterns in the ways of the world, and later to design social-impact programmes challenging poverty and informing policy development. In the end, we all are data, from the moment we are an idea until long after we pass away.

“Data is divine” highlights the growing understanding of data’s vital significance in modern society, in much the same way that religious or spiritual values have directed civilizations throughout history. In today’s digital age, data powers innovation, decision-making, and advancement in all fields, including governance, research, business, healthcare, and lifestyle.

1. Data as a source of truth: Data is frequently regarded as an impartial depiction of reality, providing information on trends and occurrences that may be imperceptible to anecdotal experience or intuition. In this way, data has a unique position as the basis for making well-informed decisions and uncovering hidden facts.

2. The power of data in innovation: Data is driving advancements in domains like healthcare, finance, and climate science and is revolutionizing industries as it powers AI/ML and sophisticated analytics. This emphasizes how data has the “divine” ability to spark significant change. Data has long been used to enhance human welfare, from preventing pandemics through data-driven epidemiology to lowering inequality by studying societal trends. When applied sensibly and morally, it can aid in resolving some of the most pressing issues facing society.

3. Data as omnipresent: From the apps we use daily to the systems that manage our cities, data is present everywhere in the modern world. Its pervasiveness is comparable to a certain “divine” quality in that it affects almost every facet of contemporary life, whether we are conscious of it or not.

4. Data and ethics: Data carries a great deal of responsibility along with its power. Similar to supernatural knowledge, there are significant ethical ramifications to the way we collect, use, and safeguard data. Data misuse can result in inequality, manipulation, and privacy violations. As a result, it is crucial to handle data with dignity, openness, and ethics.

“Data is divine” also implies that we must treat it with deference and accountability while appreciating its immense importance in shaping our future. We need to balance the power of data with ethical considerations as our world grows ever more data-driven. The following are some crucial strategies to preserve this equilibrium:

1. Data privacy and informed consent: People ought to be in charge of how their information is gathered, kept, and utilized. It is not appropriate to force them to divulge information. Companies must be open and honest about their data practices so that users know what information is being gathered and why. Clear and informed consent should not be buried in complicated terms and conditions. Data literacy is essential among the general population so that people are aware of the consequences of disclosing personal information and the dangers of data misuse.

2. Data minimization: Only gather information that is absolutely required for the task at hand. This reduces the possibility of abuse and shields people from needless exposure. I’ve seen in recent years how social development initiatives gather and store vast amounts of data, with donors coercing their nonprofit partners to obtain it, yet this doesn’t address any societal issue. It is crucial to have a conscious grasp of what is actually needed.

3. Data bias and fairness: AI/ML systems may reinforce or increase biases found in the training data. Therefore, diversifying datasets, employing inclusive development techniques, and reviewing algorithms for bias are all necessary to ensure fairness.

4. Equitable data access: One way to lessen inequality is to make sure that data access and its advantages are shared equitably among all communities. This entails preventing the reinforcement of systemic disadvantages while ensuring that marginalized groups have access to data-driven insights.

5. Data governance and accountability: To ensure that data is utilized properly, organizations and governments must establish robust data governance policies and ethical frameworks. To stay up with the latest developments in technology, these policies must be revised regularly. It is imperative to establish unambiguous lines of accountability for the handling and utilization of data. Data practices can be kept moral and in line with social standards with the support of independent oversight organizations or ethics boards.

6. Regulation and legal safeguards: Strong data protection laws that impose restrictions on how businesses and organizations can gather, keep, and handle personal data must be enforced by governments. Laws that address issues like accountability for algorithmic judgments, eliminating discrimination, and safeguarding human rights in AI-driven systems are crucial for the ethical application of automation and artificial intelligence. Because technology is changing so quickly, regulatory models must be adaptable and flexible to support innovation and enable quick responses to emerging ethical dilemmas.

7. Data for social good: Data can and should also be used for positive social impact, including lowering inequality and poverty, combating climate change, and improving public health. Governments, corporations, and civil society organizations working together can help guarantee that data is used ethically and for the good of society. These collaborations may result in common frameworks for the ethical use of data.
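Point 3 above (bias and fairness) can be made concrete with a minimal audit. The Python sketch below uses hypothetical loan-approval data (the group names and decisions are invented for illustration) to compute the demographic parity difference, one deliberately simple fairness metric: the gap between groups’ positive-decision rates.

```python
# A minimal bias audit: demographic parity difference between groups.
# All data below is hypothetical, for illustration only.

def positive_rate(outcomes):
    """Share of positive (1) decisions in a list of 0/1 outcomes."""
    return sum(outcomes) / len(outcomes)

def demographic_parity_difference(decisions_by_group):
    """Largest gap in positive-decision rates across groups.

    0.0 means perfectly equal rates; larger values flag potential bias
    worth investigating (this metric alone does not prove unfairness).
    """
    rates = [positive_rate(d) for d in decisions_by_group.values()]
    return max(rates) - min(rates)

# Hypothetical loan-approval decisions (1 = approved) for two groups.
decisions = {
    "group_a": [1, 1, 1, 0, 1, 1, 0, 1],  # 6/8 = 75% approved
    "group_b": [1, 0, 0, 1, 0, 0, 1, 0],  # 3/8 = 37.5% approved
}

gap = demographic_parity_difference(decisions)
print(f"demographic parity difference: {gap:.3f}")  # 0.750 - 0.375 = 0.375
```

A real review would go further (statistical significance, other metrics such as equalized odds, and scrutiny of the training data itself), but even this small check makes a disparity visible and auditable rather than hidden inside a model.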

A multifaceted strategy including legislation, transparency, public education, and proactive governance is needed to strike a balance between the power of data and ethical issues. Prioritizing the defence of individual rights, maintaining equity, and advancing the common good while fostering innovation should be the main goals of ethical data use. Through cultivating a culture of accountability and responsibility, we can leverage data’s promise (and divinity) without sacrificing moral principles.

Disclaimer: The opinions expressed are those of the author and do not purport to reflect the views or opinions of any organization, foundation, CSR, non-profit or others

Cover Photo: This is an AI-generated image.