Humanising AI
Why We Give Technology a Face
In Papua New Guinea, understanding often begins with story, relationship, and personhood.
We don’t grasp ideas purely as systems or abstractions.
We understand them through people.
That’s why:
- faith is expressed through 'human' figures,
- leadership is personal, not distant,
- and knowledge is passed through voices, not manuals.
So when people give AI a name, a voice, or even a personality, it isn't strange; it's completely normal.
We Have Always Humanised Complex Ideas
Across cultures and history, humans have done this instinctively.
Religion is one example:
- Christians understand God through Jesus, the Son of God,
- Buddhists understand enlightenment through the life and teachings of the Buddha,
- Indigenous belief systems often express knowledge through ancestors, spirits, and named forces.
These figures don’t limit understanding — they make it accessible.
Humanising something does not mean we believe it is human. It means we are creating a bridge between complexity and comprehension.
AI is no different.
Why AI Feels Easier When It Has a Name
When people interact with AI, many naturally:
- speak to it politely,
- ask follow-up questions,
- test ideas conversationally,
- and treat it as a thinking partner.
This doesn’t mean people are confused.
It means they are engaging with it in the most intuitive way possible.
Conversation is humanity’s oldest interface.
Giving AI a persona — even a light one — helps people:
- ask better questions,
- explore ideas more openly,
- reduce fear and intimidation,
- and learn faster.
This is especially important for beginners.
The Line We Must Keep Clear
Humanising AI helps us use it — but it must not lead us to misplace trust.
AI:
- does not have beliefs,
- does not have values,
- does not have wisdom,
- and does not replace human judgement.
The danger is not in naming or humanising AI. The danger is forgetting who is responsible for decisions.
People remain responsible. Always.
Why This Matters for Papua New Guinea
In PNG, relationships matter more than systems.
If AI is presented as:
- cold,
- foreign,
- technical,
- or elite,
it will be resisted.
If AI is presented as:
- a helper,
- a guide,
- a tool that listens,
- something that can be questioned,
it becomes approachable.
Humanising AI is not cultural weakness. It is cultural intelligence.
A Final Thought
We don’t learn by worshipping tools. And we don’t learn by fearing them either.
We learn by engaging, questioning, and relating.
If giving AI a voice helps people in Papua New Guinea understand how it works, then that voice becomes a pathway, not a problem.
The key is remembering this simple truth:
AI can assist thinking — but meaning, values, and responsibility remain human.

