Yes, You Can Use AI in Our Interviews. In Fact, We Insist
How We Redesigned Technical Interviews for the AI Era

The nature of software engineering is changing rapidly. At Canva, we believe our hiring process should evolve alongside the tools and practices our engineers use every day. That's why we're excited to share that we now expect Backend, Machine Learning, and Frontend engineering candidates to use AI tools like Copilot, Cursor, and Claude during our technical interviews.
The Reality of Modern Engineering
At Canva, almost half of our frontend and backend engineers are daily active users of an AI-assisted coding tool. Our engineers use AI to prototype ideas, understand our large codebase, and generate code. This allows them to focus on what matters most: Empowering the World to Design.
We not only encourage but expect our engineers to use AI tools as part of their daily workflow; we believe these tools are essential for staying productive and competitive in modern software development.
Yet until recently, our interview process asked candidates to solve coding problems without the very tools they'd use on the job. Our process included a Computer Science Fundamentals interview focused on algorithms and data structures. That format predated the rise of AI tools, and candidates were asked to write all the code themselves.
Excluding AI tools from the interview meant we weren't truly evaluating how candidates would perform in their actual role.
Our Philosophy: Transparency Over Detection
The rise of AI interviewing tools has changed the landscape entirely. Candidates are increasingly using AI assistance during technical interviews, sometimes covertly through tools specifically designed to avoid detection.
Rather than fighting this shift and trying to police AI usage (which is increasingly difficult!), we decided to embrace transparency and work with this new reality. We want to see how well candidates collaborate with AI to solve problems. This approach gives us a clearer signal about how they'll actually perform once they join our team.
Our initial experiments confirmed what we suspected: AI assistants can trivially solve traditional coding interview questions. When we tested our Computer Science Fundamentals questions with AI tools, they produced correct, well-documented solutions in seconds, often without requiring any follow-up prompts or clarification.
To get a meaningful signal about a candidate's problem-solving and coding abilities, we needed to rethink our approach to technical interviewing: allow and encourage the use of AI tools, and modify our existing format to take them into account.
We also understand that engineering work involves far more code reading and comprehension than greenfield coding. Engineers spend most of their time understanding existing codebases, reviewing pull requests, and iterating on solutions rather than implementing algorithms from scratch. With AI tools generating initial code, the ability to read, understand, and improve that code becomes a critical skill. Our traditional interviews gave us no signal about these essential capabilities.
Embracing Complexity and Ambiguity
Recognizing the need to change, we redesigned our questions to be more complex, ambiguous, and realistic: the kind of challenges that require genuine engineering judgment even with AI assistance. These problems can't be solved with a single prompt; they demand iterative thinking, requirement clarification, and sound decision-making.
For example, instead of asking a candidate to implement Conway's Game of Life, we might instead present them with a challenge like: "Build a control system for managing aircraft takeoffs and landings at a busy airport."
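The contrast is easy to see in code. One full generation step of Conway's Game of Life, here sketched on an unbounded grid represented as a set of live cells, fits in about a dozen lines, which is exactly the kind of self-contained answer an AI assistant produces in seconds:

```python
from collections import Counter


def life_step(live: set[tuple[int, int]]) -> set[tuple[int, int]]:
    """Advance Conway's Game of Life by one generation.

    A cell survives with 2 or 3 live neighbours; a dead cell
    with exactly 3 live neighbours is born.
    """
    # Count how many live neighbours each candidate cell has.
    neighbours = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, count in neighbours.items()
        if count == 3 or (count == 2 and cell in live)
    }
```

The airport problem has no equivalently compact answer: it forces decisions about priorities, fairness, failure modes, and interfaces before any code is worth writing.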
Introducing AI-Assisted Coding Interviews
We've piloted a new competency we call "AI-Assisted Coding" that replaces our traditional Computer Science Fundamentals screening for backend and frontend engineering roles. In this interview, candidates are expected to use their preferred AI tools to solve realistic product challenges.
This isn't about letting AI do all the work. We're evaluating candidates on skills that matter more than ever:
- Do they understand when and how to leverage AI effectively?
- How well do they break down complex, ambiguous requirements?
- Can they make sound technical decisions while using AI as a productivity multiplier?
- Can they identify and fix issues in AI-generated code?
- Can they ensure AI-generated solutions meet production standards?
Addressing Internal Concerns
When we first proposed this change, some of our engineers were understandably concerned. The initial reaction was worry that we were simply replacing rigorous computer science fundamentals with what one engineer called "vibe-coding sessions."
This concern was completely valid. Maintaining technical rigor is crucial for building world-class products. However, once we explained that code fluency and technical depth remained absolute requirements (just evaluated in a different context), the sentiment shifted to positive support. We're still assessing computer science fundamentals through the new process, and we expect engineers to take full ownership of any code they produce, whether they wrote it themselves or with AI assistance.
What We've Learned
Our pilot revealed clear patterns. The most successful candidates didn't just prompt AI and accept whatever it generated. Instead, they:
- Asked thoughtful clarifying questions about product requirements
- Used AI strategically for well-defined subtasks while maintaining control of the overall solution
- Critically reviewed and improved AI-generated code
- Demonstrated strong debugging skills when AI solutions had issues
Interestingly, candidates with minimal AI experience often struggled, not because they couldn't code, but because they lacked the judgment to guide AI effectively or identify when its suggestions were suboptimal.
Looking Forward
We're mindful that this shift requires candidates to adapt. That's why we now inform candidates ahead of time that they'll be expected to use AI tools, and we highly recommend they practice with these tools before their interview.
This change reflects our broader "AI Everywhere" philosophy at Canva. We're not just building AI features for our users; we're reimagining how we work, create, and solve problems across the entire organization.
We believe the future belongs to engineers who can seamlessly blend human creativity and judgment with AI capabilities. Proficiency with AI tools isn't just helpful in our interviews; it's essential for thriving in day-to-day engineering work at Canva. By evolving our interview process, we're ensuring we hire people who will thrive in this new paradigm while maintaining the high technical standards that make Canva's engineering team exceptional.
The early results are promising. Our AI-assisted interviews feel more engaging for both candidates and interviewers, and they're providing strong predictive signals about candidate performance. Most importantly, they're helping us identify engineers who can leverage AI thoughtfully and effectively: exactly the kind of people we want building the future of visual communication.
The AI landscape continues to evolve at an unprecedented pace, and our approach must evolve with it. What works today may need refinement in six months as AI capabilities advance and new tools emerge. We're committed to continuously evaluating and adapting our interview process to ensure we're always assessing the skills that matter most for engineering success.
Interested in joining Canva's engineering team? Check out our current openings and experience our AI-assisted interview process for yourself.