In Support of Serendipity

by Chris
January 21, 2026
in Insead Knowledge

AI algorithms are powerful tools, but we shouldn’t ignore the value of chance encounters.
Amazon, Netflix, Spotify and TikTok – four global platforms that harness recommendation algorithms to keep giving consumers more of what they want. Their success is a testament to the incredible potential of this technology to enhance the consumer experience. The value of being recommended a new film, your next book, a great song or even a funny cat video, selected to perfectly match your mood and tastes, seems unquestionable.
However, in a recent paper, my co-authors and I caution consumers against an over-reliance on AI algorithms when making such picks. Specifically, we highlight how AI has the potential to constrain the human experience and ultimately limit individuals’ choices.

A limit on choice
The effectiveness of AI-driven recommendation algorithms lies in their ability to leverage past consumer behaviour. These models review a user’s previous choices to selectively curate future content or products that match those past preferences. From this perspective, the system shapes what we see and what we can select.
While this is often very useful, it may ultimately constrain our self-determination. With an AI algorithm, consumers are much less likely to be shown content or products that don’t match their past choices. Demonstrate an interest in a particular topic or genre and chances are you will be fed more of the same, taking you down a rabbit hole of related content at the expense of exploring other more unlikely but no less valid options.
Compare this to a visit to a physical bookstore, where we are greeted by a central display of books. Attracted by the cover and blurb of one particular title, we buy it even though it has no link to our previous reading history. It turns out to be an amazing read and introduces us to a previously unknown author.
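To make the contrast concrete, here is a minimal sketch in Python of the kind of past-preference filtering described above. The catalogue, genre tags and scoring rule are invented for illustration and are not any platform’s actual algorithm; the point is simply that a title with no overlap with our history (the bookstore-style surprise) scores zero and never surfaces.

```python
from collections import Counter

# Hypothetical catalogue mapping each title to genre tags (illustrative only).
CATALOGUE = {
    "Crime Thriller #7": {"crime", "thriller"},
    "Crime Thriller #8": {"crime", "thriller"},
    "Nordic Noir #3": {"crime", "noir"},
    "Debut Literary Novel": {"literary"},
    "1970s Funk Anthology": {"music", "funk"},
}

def recommend(history, k=3):
    """Rank unseen titles by how much they overlap with genres already consumed."""
    seen_tags = Counter(tag for title in history for tag in CATALOGUE[title])
    candidates = [t for t in CATALOGUE if t not in history]
    # Score = how often the user has already engaged with each of the title's tags.
    return sorted(candidates,
                  key=lambda t: sum(seen_tags[tag] for tag in CATALOGUE[t]),
                  reverse=True)[:k]

# A reader with a crime-only history keeps being offered more crime fiction;
# the literary debut and the funk anthology score zero and sink to the bottom.
print(recommend(["Crime Thriller #7", "Nordic Noir #3"]))
```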

A cost to consumers
Such overreliance on past consumer preferences can also impact the range of content shown to consumers with similar tastes. For example, a music-selecting algorithm sees that lots of users have previously listened to Taylor Swift, so it pushes her songs to more people. This popularity bias means that the market share for already popular products increases. Marginal (less popular) choices are overlooked by the algorithm, and the range of recommendations we are given becomes ever narrower.
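The feedback loop behind this popularity bias can be sketched in a few lines. The play counts below are made up purely for illustration: the toy recommender always surfaces whoever is already most played, exposure converts into further plays, and the popular artist’s share grows each round while the marginal artist stands still.

```python
# Toy simulation of popularity bias; play counts are illustrative, not real data.
plays = {"Megastar": 1_000, "Indie Artist": 100}

for round_number in range(1, 6):
    # A naive recommender surfaces whoever is already the most played...
    recommended = max(plays, key=plays.get)
    # ...and that extra exposure converts into more plays for that artist only.
    plays[recommended] += 500
    total = sum(plays.values())
    shares = {artist: f"{count / total:.0%}" for artist, count in plays.items()}
    print(f"round {round_number}: {shares}")

# The already-popular artist's market share climbs every round, while the
# marginal artist, never recommended, never gains a single listen.
```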
Besides limiting choice, there is a financial implication for consumers. A lack of exposure means marginal products struggle to survive. This can lead to monopolies for the more popular recommendations, and a lack of competition means prices for these can rise, with the consumer left to pick up the bill.

The danger of objectification
Existing AI-driven recommendation models tend to exhibit a bias towards objectification, reducing individual and community characteristics to a limited set of features or data points. Often, the model fails to adequately capture, or under-represents, an individual’s actual preferences and instead delivers outcomes based on the larger group or category the individual belongs to.
This can lead to unfair or inefficient treatment in areas like loan approvals, hiring policies or pricing due to biased algorithms. Such simplification can also lead to a mischaracterisation of individual preferences. Anyone who has tried to get a service bot to adapt to their personal circumstances will have been frustrated by its inability to respond appropriately.
By reducing individuals to mere functions and scores, AI systems can limit our experiences and perpetuate subtle dehumanisation. This oversimplification can misrepresent our true preferences, leading to poor decisions or limited outcomes. What’s more, it can erode trust in AI, especially in sensitive areas like healthcare where understanding human uniqueness is crucial.

Increasing transparency
Such oversimplification underlines perhaps the biggest issue consumers face when being delivered algorithm-driven decisions: we don’t know what objective or goal an algorithm is designed to pursue. Is it primed to discount environmentally unsound firms when making investment suggestions, or is it simply recommending the firms that deliver the greatest returns?
Even if we cannot comprehend all the parameters that algorithms are based on – a problem computer scientists refer to as unexplainable AI – understanding the goal the algorithm pursues could help individuals put greater trust in its predictions. It would certainly allow consumers to make a more informed choice on whether to accept those decisions or continue to search for something more suitable.
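One hypothetical remedy, sketched below with invented firms and figures, is for a service to disclose its objective alongside each suggestion, so the user can see whether raw returns or an environmental screen produced the pick.

```python
from dataclasses import dataclass

@dataclass
class Suggestion:
    firm: str
    expected_return: float
    passes_esg_screen: bool
    objective: str  # the goal disclosed to the user alongside the pick

# Invented universe of firms: (name, expected annual return, passes ESG screen).
FIRMS = [("GreenGrid", 0.06, True), ("FossilCo", 0.09, False), ("CleanRail", 0.05, True)]

def recommend(apply_esg_screen: bool) -> Suggestion:
    pool = [f for f in FIRMS if f[2]] if apply_esg_screen else FIRMS
    name, ret, esg = max(pool, key=lambda f: f[1])
    goal = ("highest return among environmentally screened firms"
            if apply_esg_screen else "highest return, no screen applied")
    return Suggestion(name, ret, esg, objective=goal)

# The same engine gives different picks, and the stated objective tells the
# user which goal produced the suggestion in front of them.
print(recommend(apply_esg_screen=False))
print(recommend(apply_esg_screen=True))
```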

Greater personalisation
Allowing consumers to personalise those parameters would go a long way towards giving them a greater sense of control and trust. For example, a user could have the ability to select whether they want the quickest, shortest or most scenic route between two destinations. Or perhaps, if they want to get fit, they could tweak their TikTok algorithm to show fewer videos of trending dances and more workout routines.
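As a rough illustration of what such user-set parameters could look like for the routing example, the sketch below (with invented routes and weightings) ranks the same candidates differently depending on whether the user dials up speed, distance or scenery.

```python
# Candidate routes with invented attributes: (minutes, kilometres, scenery score 0-10).
ROUTES = {
    "Motorway": (35, 52, 2),
    "Ring road": (45, 40, 4),
    "Coastal road": (70, 48, 9),
}

def best_route(w_time, w_distance, w_scenery):
    """Lower cost wins; scenery is subtracted because more of it is desirable."""
    def cost(attrs):
        minutes, km, scenery = attrs
        return w_time * minutes + w_distance * km - w_scenery * scenery
    return min(ROUTES, key=lambda name: cost(ROUTES[name]))

# The user, not the platform, decides which objective the algorithm optimises.
print(best_route(w_time=1.0, w_distance=0.0, w_scenery=0.0))  # quickest: Motorway
print(best_route(w_time=0.0, w_distance=1.0, w_scenery=0.0))  # shortest: Ring road
print(best_route(w_time=0.1, w_distance=0.0, w_scenery=5.0))  # most scenic: Coastal road
```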
Building such personalisation into the interface presents development challenges, but it would give the consumer a greater sense of control and, in turn, greater trust in the suggested choices. It might also benefit firms as much as consumers. For instance, a study found that consumers who were fed content tailored to their ideal preferences didn’t just find it more helpful but were much more likely to reuse the service and more willing to pay for it.

A more balanced perspective
Not only does the homogenisation of outcomes ignore the nuance of individuals, it also restricts access to alternative views, voices and perspectives.
As we’ve seen in recent election campaigns around the world, this can be particularly problematic when delivering contentious or political content. By amplifying more extreme views at the expense of more reasonable (but less popular) arguments, recommendation algorithms can lead to the creation of echo chambers. An individual’s existing viewpoints or opinions are simply reinforced or amplified at the expense of all others.
To combat such polarisation and foster greater empathy, AI systems should be designed to expose users to diverse stimuli, perspectives and opinions. Allowing consumers to see both sides of an argument could help develop a broader understanding of an issue. Not only does this give individuals an opportunity to change their minds, it also helps foster greater compassion and respect for the apparent “enemy”.

Introducing serendipity
Building on this is the idea of developing AI systems that are more flexible and less tied to past preferences. Just because consumers listen to country music more often than jazz does not mean they only want recommendations for more sad ballads and honky tonk tunes.
Analysing past preferences from a longer consumption time period could be one way to deliver more balanced recommendations. Another solution is to allow consumers to select the degree to which the algorithm recommends previously consumed categories and how much it delivers serendipitous or unrelated content.
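A minimal sketch of that second idea, using an invented catalogue: a user-controlled serendipity fraction reserves part of each recommendation batch for categories the listener has never chosen, with the remainder filled from their usual fare.

```python
import random

random.seed(0)  # reproducible picks for the illustration

# Invented catalogue grouped by category; not any streaming service's data.
CATALOGUE = {
    "country": ["Sad Ballad #1", "Honky-Tonk Tune #2", "Country Hit #3"],
    "jazz": ["Late-Night Jazz Set"],
    "1970s nigerian funk": ["Lagos Funk Classic"],
    "chamber music": ["18th-Century String Quartet"],
}

def recommend(past_categories, batch_size, serendipity):
    """Reserve a `serendipity` share of the batch for categories never chosen before."""
    n_new = round(batch_size * serendipity)
    familiar = [t for c in past_categories for t in CATALOGUE[c]]
    unfamiliar = [t for c, tracks in CATALOGUE.items()
                  if c not in past_categories for t in tracks]
    picks = random.sample(unfamiliar, min(n_new, len(unfamiliar)))
    picks += random.sample(familiar, min(batch_size - len(picks), len(familiar)))
    return picks

# serendipity=0.0 reproduces today's behaviour; 0.5 guarantees the listener
# the chance to skip, or fall for, something they never knew they liked.
print(recommend({"country"}, batch_size=4, serendipity=0.0))
print(recommend({"country"}, batch_size=4, serendipity=0.5))
```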
As individual consumers, we should have the ability to make our own choices. We can always choose to skip something if we don’t like it, but only if given the opportunity. By being given more unpredictable options, we could end up discovering a love for 1970s Nigerian funk or 18th century chamber music, which we never previously knew we had.

Broadening horizons
It is important that governments are aware of these challenges. Current regulation tends to focus on clear, measurable factors such as bias or restrictions on price competition. However, it should also consider how the technology is used in real-world settings and how its features can interact with human behaviour to create results that limit users’ development and expression of their preferences.
If we want to retain freedom and serendipity in consumer choice, we need to adopt a user-centred approach to developing new AI systems – one that enhances transparency, exploration and personalisation and doesn’t restrict users to reliving the past but instead enriches the gamut of human experience.
Without opportunities to make new discoveries, those chance encounters that can make life so rich will be closed off to us.
