Engineering in the age of AI: Insights from PG Niero

Piergiorgio “PG” Niero has led engineering across games, digital advertising, and broader tech – including senior roles at SuperAwesome and Epic Games. We asked him how AI is reshaping software engineering and how tech organisations are evolving.

What do you think AI might replace today, and what might it replace in the future?

I think the question needs reframing. When you ask, "What will AI replace?", you are already assuming that AI is here to take tasks away from humans. That is a loaded starting point.

Step back for a second: AI is just another layer in the technology stack - like cloud, like mobile, like automation before it. It is a tool a business can choose to adopt or ignore, depending on whether it offers real competitive advantage.

So the real question is not "what will AI replace?", but "where can AI create meaningful advantage?" And that answer is multi-lens:

The business lens: Customers will not buy your product because it is powered by AI. They will buy it if it solves a pain point that matters enough for them to put money on the table. AI is only valuable if it strengthens that outcome.

The people lens: Your ability to benefit from AI depends on your organisation's maturity. How many juniors vs. seniors you have. How adaptive your culture is to new tools and processes. And yes, even geography plays a role - because attitudes to technology adoption are deeply cultural.

The ethical lens: If the very people meant to wield AI feel threatened by it - copywriters, designers, junior developers - you risk introducing a tool that looks powerful on paper but gets rejected in practice.

AI does not just "replace tasks." It reshapes how organisations create value. Whether it accelerates you or stalls you depends on how thoughtfully you evaluate it across those lenses.

"AI is just another layer in the technology stack - the question is where it creates real advantage."

If you are in a creative role and AI is generating a lot of content, can you change someone's mindset if they feel threatened?

I believe you can - but start by understanding where the tension comes from. In engineering, AI already takes on repetitive, low-value tasks - boilerplate code, meeting summaries, scaffolding. That work historically fell to juniors or assistants, so it can feel like encroachment.

AI does not replace the engineer; it reshapes the role. In agile organisations, face-to-face communication is central and AI cannot replicate it. Engineers must listen to stakeholders, extract what matters, and translate it into clear prompts AI can execute. Accountability stays with the engineer.

For creatives, the pattern is similar: AI handles grunt work and first drafts; the craft - judgment, originality, brand alignment - remains human. The mindset shift is: AI is not here to erase your role; it is here to raise the level of the game.

"AI does not replace the engineer - it reshapes the role."

Do young engineers risk not going deep enough by relying too much on AI?

It is a valid concern - so leadership must put guardrails in place.

What makes a good engineer has not changed. A coder solves assigned tasks; an engineer understands the foundations - space, time, complexity, systems. They explain trade-offs and guide others. Yesterday the "coder" was human; today, much of that work can be done by AI.

AI strips surface-level tasks that never required deep understanding, while elevating engineers to think end-to-end: scalable design, risk balancing, compliance, connecting tech to business outcomes. This is not replacement - it is raising the bar.

"AI is raising the bar for engineers by pushing them to think end-to-end."

Do software engineers now need stronger language skills for things like prompt engineering?

Yes. We have seen shifts like this before: from assembly to high-level languages with compilers. Now, engineers increasingly write specifications in plain language. Prompting looks simple, but it is not trivial.

LLMs are probabilistic, not deterministic. Same prompt, different output. Good prompting means guardrails that steer models toward desired outcomes. Frameworks are emerging to systematise this - from guided context capture to generating prompts from acceptance criteria in tools like Jira. Language is becoming as fundamental as code.
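The idea of turning acceptance criteria into a guarded prompt can be sketched in a few lines. This is a minimal illustration, not any specific tool's API: the function name, parameters, and example story are all hypothetical, and in practice the criteria would come from a tracker like Jira rather than hard-coded lists.

```python
def build_prompt(story: str, acceptance_criteria: list[str], guardrails: list[str]) -> str:
    """Turn a user story and its acceptance criteria into a structured prompt.

    Guardrails are explicit constraints meant to steer a probabilistic model
    toward the desired outcome - same intent, less output variance.
    """
    lines = [f"Implement the following user story:\n{story}", ""]
    lines.append("The work is complete only when every criterion holds:")
    lines.extend(f"- {c}" for c in acceptance_criteria)
    lines.append("")
    lines.append("Constraints (do not violate):")
    lines.extend(f"- {g}" for g in guardrails)
    return "\n".join(lines)

# Illustrative inputs only.
prompt = build_prompt(
    story="As a user, I can reset my password via email.",
    acceptance_criteria=[
        "A reset link is emailed within 60 seconds.",
        "Links expire after 30 minutes.",
    ],
    guardrails=[
        "Never log or echo the reset token.",
        "Follow the existing error-handling conventions.",
    ],
)
print(prompt)
```

The point is the structure, not the code: separating the goal, the acceptance criteria, and the constraints makes the same prompt reproducible across engineers and easier to review, much like a specification.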

"Language is becoming as fundamental as code."

Given AI's impact, does it change the kind of engineers you hire?

Absolutely. Traditional knowledge-quiz interviews are obsolete. People do not "Google it" - they "GPT it." I want engineers who embrace AI. One of my first questions: "How did you start using AI?"

If someone says, "I do not use AI; I prefer writing code myself," that is a hard no. It is 2025 - most companies are embracing AI. You hire humans over agents when they are faster because they wield AI well, or when they bring deep domain expertise your team lacks.

"Hire humans when they are faster with AI - or when they bring deep domain expertise you lack."

Where is AI regulation heading?

GDPR became a global benchmark for privacy; AI lacks an equivalent standard today. Approaches diverge: the US prioritises building fast and fixing later, while Europe aims to protect broadly before scaling - part of why the US moves faster.

The key question: What is the minimum global standard we accept, regardless of where AI is built or used? AI can be harmful - especially to vulnerable people - and content generation has accelerated. We should not wait.