How AI Forces Us to Reflect on the Purpose of Work
- Brainz Magazine

Written by Egbert Schram, Group CEO
Egbert Schram is a global authority on cultural analytics. As Group CEO of The Culture Factor Group, he helps executives navigate the paradoxes of cultural data while looking deeper at what drives people. Author of "Navigating Foreignerness" and a new book on lessons from natural resource management, he is seen as a delicate provocateur.
In my first book, Navigating Foreignerness, I explored the idea that we are all "foreigners" in some capacity, whether by nationality, generation, or function, and that success depends on testing our assumptions before we act. Today, as we stand at the precipice of an AI-driven revolution, we face a new kind of "foreignness": the integration of artificial intelligence into the very "soil" of our organizational systems.

Many leaders view AI simply as a tool for efficiency. However, through the lens of a forester, I see it differently. You cannot force growth; you can only design the conditions, the environment, the soil, the light, that make it possible. AI is not just another piece of technology; it is a fundamental shift in the environment that forces us to re-evaluate the "operating system" of our organizations.
The gap between strategy and purpose
In my work with the Culture Factor Group, I often encounter "Strategic Sabotage," where a brilliant strategy is undermined by an underlying culture that isn't "fit for purpose". AI amplifies this risk. If your strategy involves rapid AI adoption but your actual culture remains risk-averse and means-oriented, the technology will stall.
AI can also be useful in furthering your organization's sense of purpose. That sense of purpose comes from making work life a consistent experience. We use AI to build a picture of the desired culture an organization claims to have, and the interesting thing is that these claims are much the same across industries. A recent study we conducted on the public cultural profiles of 94 Finnish stock-listed companies found, for instance, that in highly regulated sectors like banking (Nordea) or energy (Fortum), we frequently observe a "Strict" control dimension necessitated by compliance and safety. Nordea Bank reveals a culture classified as "Means-oriented," where employees describe the pace of work as bureaucratic and report that fewer than half of meetings are effective. In such an environment, the integration of AI may stall because the practical experience of employees remains focused on processes and rules rather than outcomes.
Conversely, creative and tech sectors like gaming (Rovio or Remedy) often foster an "Easy-going" control environment and "Goal-orientation" to drive innovation. Rovio Entertainment (known for Angry Birds) exemplifies this with a culture that values independence and responsibility, providing gaming zones and lounges to maintain a relaxed atmosphere that encourages initiative.
The cultural blind spot: Why "custom agents" aren't enough
If AI does not necessarily impact the “desired” culture, as regulations tend to have a stronger impact, then where does AI impact culture?
A common misconception is that AI can simply "learn" culture from data. However, recent tests conducted by our team using custom-trained AI Agents against validated cultural benchmarks proved otherwise. These agents, even when equipped with the Six Dimensions of National Culture, struggled significantly with cultural nuance.
Average error: AI agents like "Indian" or "American" personas remained far off the mark with an average Mean Absolute Error (MAE) of 27.25.
The Finnish exception: While the "Finnish Agent" showed modest improvement, AI still failed to capture the lived phenomenon of culture.
Neutrality bias: AI often leans toward certain cultural biases rather than opting for neutrality, shedding light on the inherent cultural influences embedded within its programming.
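For readers curious what a Mean Absolute Error of 27.25 means in practice, here is a minimal Python sketch of how MAE is computed over cultural dimension scores on a 0–100 scale. The dimension names and numbers below are invented for illustration; they are not the study's data.

```python
# Hypothetical sketch of Mean Absolute Error (MAE): the average
# absolute gap between an AI agent's cultural dimension scores and
# a validated benchmark. All names and values here are illustrative.

def mean_absolute_error(predicted, benchmark):
    """Average absolute difference across the dimensions present in both."""
    keys = predicted.keys() & benchmark.keys()
    return sum(abs(predicted[k] - benchmark[k]) for k in keys) / len(keys)

# Illustrative scores on four dimensions (0-100 scale).
agent_scores = {"power_distance": 60, "individualism": 75,
                "uncertainty_avoidance": 40, "long_term_orientation": 55}
benchmark_scores = {"power_distance": 33, "individualism": 63,
                    "uncertainty_avoidance": 59, "long_term_orientation": 38}

print(mean_absolute_error(agent_scores, benchmark_scores))  # → 18.75
```

An MAE of 27.25 on this scale means the agents' answers were, on average, more than a quarter of the full scoring range away from the validated benchmark on each dimension.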
This data shows that culture is not just a collection of facts or a dataset to be decoded; it is a context-rich, emotionally layered, and lived phenomenon. AI forces a confrontation with the "Actual Culture" of a workplace. When machines take over the routine, what remains is the deeply human, and this forces us to think much harder about the purpose of work:
If AI handles the "what" and the "how," humans must reclaim the "why".
Practices over values: We must move beyond aspirational values on a wall and focus on measurable practices. How does AI change our rituals, our decision-making speed, and our collaboration?
Emotional preferences: Technology is global, but people remain local. Despite the belief that LinkedIn or TikTok has made us all the same, the emotional preferences that define us are deeply ingrained. What motivates a team in Sweden to use AI will differ from what motivates a team in Delhi. And this difference has widened, beginning with how we raise our children, for example around money, and extending to how societies teach children (the future workforce) to deal with new technologies such as AI.
From intuition to infrastructure
The most dangerous thing a leader can do in the age of AI is to "guess". Leaders often rely on intuition, but those perceptions are filtered through power and position. To successfully integrate AI, we must treat culture as a measurable infrastructure, a way of working that generates a "wow" effect, not a "help" reaction.
As I prepare for my second book, I am focused on how leaders can ensure Cultural Executive Ownership (CEO) in this new era. We must diagnose the current state, define the "Optimal Culture" for an AI-augmented workforce, and close the gap through deliberate behavioral design.
Leaders must:
Diagnose the current state of culture.
Define the "Optimal Culture" for an AI-augmented workforce.
Design deliberate behavioral changes to close the gap between current and desired states.
The augmentation paradox: Designing collaboration
The introduction of AI into knowledge work and creative processes is a profound shift. However, human-AI collaboration does not naturally lead to improved creativity; without deliberate structure, it typically stagnates.
Research from IMD identifies three distinct activities for effective human-machine partnership:
Responsive refinement: Humans generate ideas, and AI provides feedback on practical constraints.
Generative expansion: AI generates new ideas based on very specific human prompts.
Bidirectional development: A symbiotic relationship where humans critique AI suggestions while AI analyzes human concepts.
Conclusion: Strategy should assist, humans must lead
AI is not here to replace the purpose of work; it is here to force us to define it. If we do not deliberately understand how our culture enables collective learning, we will find ourselves in a state of chaos rather than innovation.
As we design the optimal culture for an AI-augmented workforce, we must remain conscious of why we work in the first place. In many contexts, work shapes our identity; if the nature of work is "thinned out" by automation, we must reflect on what happens to that identity. Ultimately, while strategy should assist in navigation, the understanding and design of culture must remain firmly human-led.
Read more from Egbert Schram
Egbert Schram is a global authority on cultural analytics and the Group CEO of The Culture Factor Group. Originally set to join the Dutch Marines, he pivoted to forest management at Wageningen University, specializing in environmental psychology and stakeholder management. After entering the market research software industry, he was tapped to internationalize a Nordic organization before eventually taking the helm as CEO. Having scaled the business to over 60 countries, Egbert now focuses on partnering with clients who view culture as the "operating system" of every organization and society. As a delicate provocateur and author of Navigating Foreignerness, he bridges the gap between human complexity and data-driven strategy.