Designing and Building Technology for High-Context Culture

Some cultures build trust through content: clear words, direct instructions, and speed. Others rely on context: tone, rhythm, timing, and what’s left unsaid. Most global tech is built for the former, then superficially adapted through “localization” or RTL design tweaks. But context is where trust lives for much of the world, and it's often the first thing lost in translation. AI agents might finally offer us the chance for a reset.

Waleed Abu Nada

I first came across Edward T. Hall one night deep into a Wikipedia rabbit hole. He was an American anthropologist who spent his life trying to decode the invisible languages that shape how cultures think, feel, and communicate. In his work, Hall introduced the distinction between "high-context"1 and "low-context"2 cultures, a framework that continues to serve as a powerful lens for understanding how people communicate, form relationships, and build trust. He also explored how different cultures perceive space, time, and rhythm, not as philosophical abstractions, but as tangible rules of interaction that influence everything from the physical distance people keep in a conversation to the pace at which decisions are made. As someone fascinated by how cultures interact, I found his work insightful, but what really stood out was how much of it applies to the way technology is designed today.

In low-context cultures like the United States, Germany, or Scandinavia, communication tends to be explicit, linear, and focused on getting tasks done. Meaning is packed directly into the words themselves, and technology is often designed with the assumption that users prefer autonomy, self-service, and minimal ambiguity. These are cultures that reward clarity and speed. They also tend to be monochronic, which means they treat time as a finite resource: structured, measurable, and not to be wasted. Schedules are sacred. Meetings start on the dot. Productivity is a metric, and interruptions are seen as inefficiencies. High-context cultures, by contrast, take a more fluid approach. Across the Arab world (as well as much of Asia, Latin America, and parts of Africa), communication is layered and trust unfolds gradually through context, ritual, and relationship. These cultures are generally polychronic, meaning time isn’t something to control but rather something to move with. Tasks can happen in parallel. Formal meetings tend to warm up with side talks before getting to the point. It’s even quite normal for a meeting to be interrupted by someone3. That isn’t seen as poor planning or mismanagement, but simply as part of the social rhythm. It’s a different kind of logic, one where presence is valued over precision, and where how you spend time with someone can say more than how efficiently you use it.

Hall also emphasized proxemics, or how different cultures perceive personal space. In high-context cultures, closeness, both physical and relational, is normal. Sitting close, speaking warmly, and showing hospitality are not “nice-to-haves”; they’re signals of trust and social alignment. But what happens when digital platforms are built in cultures where efficiency and distance are favored over warmth and proximity? As technology has become the primary way people engage with services, institutions, and each other, the cultural logic behind how it’s designed has taken on enormous importance. Too often, the systems we all use are shaped by low-context assumptions and then exported globally without adjusting for the way people in other cultures make meaning or build trust. The result is a digital world that may work logically but not emotionally, leaving many users feeling disconnected or even unseen.

This isn’t just theory. I’ve seen it firsthand. In high-context societies, what builds trust isn’t always a smooth interface or clear instructions; it’s feeling understood. In Japan, I’ve been in meetings that were almost completely silent. At first, it felt awkward, but over time I realized it was a quiet signal of respect and reflection. In Brazil, I’ve captured more insights on some topics over a well-cut picanha than in meetings with structured agendas. And after years of working across different sectors in the GCC, I’ve seen how decisions aren’t always purely data-driven but are often influenced by timing, relationships, and unspoken alignment. But most tech platforms don’t make space for that. Think about Google. Its homepage is minimal and direct, based on the idea that users want instant access to information. ChatGPT is a natural extension of that logic, delivering fast, clear answers, but often without the relational nuance high-context users expect. Amazon automates everything, from reviews to delivery, to build trust through speed and predictability. These platforms have achieved global scale, but their core design logic still reflects the cultural norms they were born from, norms that don’t always align with how people in high-context cultures signal trust or connection.

Now take the contrast between Uber and Careem. Uber entered the Middle East with the same model it used everywhere: tap a button, get a ride, rate the driver. It removed the need for conversation or small talk. But in cities like Cairo, Amman, or Riyadh, many riders wanted to talk to the driver first, not just to clarify logistics, but to connect and get a feel for who was picking them up. Careem understood this. They added the ability to call the driver before the trip, offered Arabic-language support, and even let people book rides on behalf of their relatives4. These weren’t just added features; they were signals of cultural understanding.

This kind of cultural fluency in design isn’t about throwing in local features to check a box. It’s not simple localization. It’s designing for culture. In high-context societies, trust doesn’t appear in the first click. It grows over time, through familiarity, tone, and rhythm. That’s why onboarding can’t just be a sequence of popups and tooltips. Sometimes, it should feel like a message from someone you know. And communication shouldn’t be confined to static chatbot forms. It might need to come through a WhatsApp message, a voice note5, or a well-timed call. Designing for culture means understanding those preferences and building systems that speak in the ways people already do. Take my father as an example. He refuses to order food or groceries through an app. He prefers to call the store directly so he can crack a joke, ask the staff about their last names, and trace their origins all the way back to where their great-grandparents might have lived during the days of the Ottoman Empire6. Only then does he place the order, with a warm reminder to prepare everything with extra care and attention. For him, that little conversation isn’t just a courtesy; it is the whole point and the heart of his CX.

Now, we’re entering the era of AI agents. These are systems that aren’t just reactive; they’re designed to interpret, learn, and adapt based on how people behave. That opens up a new opportunity, because unlike a static user interface, an agent can remember context. It can recall not only what someone asked, but how they asked it. This is especially powerful in high-context settings.
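To make the idea concrete, here is a minimal sketch of what “remembering how someone asked” could look like. Everything here is a hypothetical illustration, not any real agent framework: the class names (`Turn`, `ContextMemory`) and the formality and channel labels are invented. The point is simply that the agent stores interaction metadata alongside content, and lets the accumulated pattern, not just the latest message, drive its register.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Turn:
    """One user message: what was said, plus how it was said."""
    text: str
    formality: str  # e.g. "formal" or "casual" (illustrative labels)
    channel: str    # e.g. "text" or "voice_note" (illustrative labels)

@dataclass
class ContextMemory:
    turns: list = field(default_factory=list)

    def record(self, turn: Turn) -> None:
        self.turns.append(turn)

    def preferred_style(self) -> dict:
        # With no history, default to the cautious, formal register.
        if not self.turns:
            return {"formality": "formal", "channel": "text"}
        # Otherwise infer the user's habitual register from past turns.
        formality = Counter(t.formality for t in self.turns).most_common(1)[0][0]
        channel = Counter(t.channel for t in self.turns).most_common(1)[0][0]
        return {"formality": formality, "channel": channel}
```

In this sketch, a user who keeps sending casual voice notes would, over time, be answered in kind, while a first-time user gets the safer formal default.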

We’re already seeing this shift take shape in the MENA region. Maqsam is redefining agent-customer interactions by embedding AI into voice systems that pick up on tone, rhythm, and formality - elements often overlooked by Western-centric models. This allows agents to sound more local, more authentic, and more emotionally attuned to the person on the other end of the line. It mirrors the spontaneity and empathy of real conversations, enabling responses that feel less transactional and more genuinely supportive. This is critical in high-context cultures where how something is said carries as much weight as what is said. Similarly, Tarjama shows what it means to build AI for such cultures. With its Arabic-first, human-in-the-loop approach, it doesn’t just swap words from one language to another. Instead, it interprets and safeguards the sentiment, hidden nuances, and cultural undertones that can completely shift the meaning of a message. Both platforms reflect a deeper kind of design, one that understands the emotional texture of communication in the Arab world.

These are not just tools. They are signs of where AI is headed when it learns to listen first. Imagine a digital concierge at a Saudi healthcare provider. It doesn’t just send out generic reminders. It knows when to be formal, when to use a softer tone, and when to time its messages thoughtfully. Instead of pinging you with sales notifications, it gently reminds you to book a check-up for your mother, phrased in a way that feels caring rather than clinical. Or take a real estate agent in Jordan. It doesn’t just blast listings at random. It remembers that your cousin already visited a property with them last year and holds off on texting you during Eid break. Better yet, it tailors suggestions based on where your extended family lives. It might offer options nearby if you're on good terms, or suggest homes on the opposite side of Amman... if you're not exactly thrilled to see your mother-in-law every day. It’s smart enough to know that in a high-context culture, location isn’t just geography but rather a social strategy.
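The timing and tone decisions in these examples can be expressed as plain data plus a couple of checks. The sketch below is purely illustrative: the quiet-period dates are placeholders (Eid dates follow the lunar calendar and would come from a maintained holiday source, not hard-coded constants), and the relationship labels are invented. What it shows is how “hold off during Eid” and “soften the tone for an elder” become explicit, testable rules rather than afterthoughts.

```python
from datetime import date

# Placeholder outreach-blackout windows; real dates shift each year
# and would be loaded from a proper holiday calendar.
QUIET_PERIODS = [
    (date(2025, 3, 30), date(2025, 4, 1)),  # Eid al-Fitr (illustrative dates)
    (date(2025, 6, 6), date(2025, 6, 9)),   # Eid al-Adha (illustrative dates)
]

def may_contact(today: date) -> bool:
    """Suppress non-urgent outreach during culturally sensitive windows."""
    return not any(start <= today <= end for start, end in QUIET_PERIODS)

def choose_tone(relationship: str) -> str:
    """Pick a register from the relationship, not a one-size-fits-all script."""
    registers = {"elder": "formal_warm", "peer": "casual", "new_contact": "formal"}
    return registers.get(relationship, "formal")  # default to the safest register
```

A message scheduler would consult `may_contact` before sending anything promotional, and pass the chosen tone label to whatever generates the actual text.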

The promise becomes even more compelling when we think of agents as mediators rather than tools, as entities that understand the unspoken needs between a user and a system, and that advocate for rhythm, empathy, and nuance in the digital environment. An agent working with a high-context user should ideally operate less like a search engine and more like a trusted go-between; a digital cousin rather than a digital clerk. Unfortunately, most AI agents today are still shaped by low-context paradigms. They may sound friendly, but they follow scripts written with a Western communication style in mind, one that is typically direct, task-focused, and emotionally neutral. Without deliberate effort, these agents will simply reinforce the cultural blind spots of the systems that came before them, only this time at scale. This raises the question: who gets to teach the agents? Whose tone, priorities, and rhythms become the default in a world increasingly mediated by virtual intermediaries? If we aren’t careful, we risk building global systems that unintentionally alienate the very people they’re meant to serve. Only when trust becomes the product, not just a metric, will we have crossed a threshold. At that point, we’re no longer flattening the social map in service of speed; we’re drawing it, in all its complexity, into the interface.

But there’s one more question we can’t ignore: What if high-context cultures don't just use low-context systems, but over time transform into them?

This is already in motion. Even in the most traditionally high-context societies, we’re seeing a quiet shift toward more transactional behaviors. As digital tools reward speed, clarity, and automation, the cultural rhythms that once shaped communication in many contexts are being compressed or sidelined.

The result is a subtle, yet growing tension between how people were raised to connect, and how they’re now expected to operate.

Cultural traits are not fixed. When every place you communicate with values ease over nuance, you adapt, at the risk of creating tension not only between people and organizations but also within societies themselves. The gap between cultural memory and digital practice widens, and with it, something essential starts to feel out of tune.

Tech should not take us toward homogenization in the name of convenience. It ought to preserve the nuance that makes us human. We must create systems that not only meet people where they are, but enable them to remain there, if they wish to remain.

________

1 High-context cultures rely heavily on implicit communication, shared values, and unspoken cues. In these settings, context often carries more meaning than content—what’s left unsaid can be just as important as what is.  

2 Low-context cultures prioritize directness, clarity, and explicit language to convey meaning. Here, content outweighs context—meaning is embedded in the words themselves rather than inferred through shared background or setting.

3 I remember once speaking to a mentor whom I often turn to for perspective and advice. I was venting to him about the pressure to move fast in my work, the need to hit deadlines, and the push to chase outcomes that didn’t match the effort required to reach them. He told me, “We treat time as both endless and urgent; it’s the most important thing we have today, and somehow the one we mismanage the most.” What he said stuck with me because it captured the crossroads I find myself at in the region: where cultural rhythm meets modern urgency, and neither side quite knows how to speed up or slow down enough.

4 Careem knew that in many Arab households, the person booking a ride wasn’t always the one taking it. Parents, helpers, and siblings often needed to be transported. The ability to book rides for others wasn’t just a convenient feature; it reflected how mobility actually worked in our societies. In cases like young girls traveling alone, it also added a crucial sense of safety and reassurance for families.

5 I’d be genuinely curious to see research on how often Arabs choose voice notes over text on WhatsApp. Does the emotional nature of certain people affect whether they text, send voice notes, or use both?

6 Somehow my father always magically manages to make everyone’s ancestry Palestinian.