Generative AI is no longer a future issue in design education—it’s already in the studio, in workflows, and in the tools students are experimenting with on their own. The real question schools are facing now isn’t whether AI should be part of the conversation, but how to bring it into the classroom without losing what design education is actually trying to build: authorship, skill, critical thinking, and a strong individual voice. Policies like ArtCenter College of Design’s generative AI guidelines offer a useful example of what a balanced, workable approach can look like.
What grounds ArtCenter’s approach is simple but meaningful: AI is framed as a tool, not a creator. Generative systems can produce images, text, and other media, but the responsibility for the work stays with the person using them. Students and faculty are accountable for the accuracy, quality, and integrity of what they make, even when AI is part of the process. That keeps authorship grounded in human decision-making, which is still central to how design is taught and evaluated.
Students don’t just need to learn tools; they need to learn how to design with intelligence, judgment, and ethics in an AI-shaped world. Our responsibility is to teach creative leadership, not software.
Gerardo Herrera, Associate Chair, MDes Brand Design & Strategy
At ArtCenter, Herrera’s team collaborated with Google to develop Synchro, an AI platform built around sustainability, shared systems, and personalization. But at its core, this project wasn’t about automation. It was about human connection, using technology to create more thoughtful, responsive, and emotionally aware brand experiences.
Stephanie Player, Graphic Design Student | Clayton Lin, Interaction Design Student | Haile Wu, Industrial/Product Design Student | ArtCenter College of Design
Transparency is another key piece. Undisclosed AI use is treated as an integrity issue, similar to plagiarism. The point isn’t to scare people away from using new tools, but to make the creative process visible. When AI contributes to a project, that role is acknowledged and cited. This turns AI from a hidden shortcut into something that can be discussed, critiqued, and learned from—which is exactly what education should be doing.
The policy also brings in ethical and legal awareness, especially around intellectual property. Because AI systems are trained on large datasets, their outputs can resemble existing work or carry embedded bias. By putting the responsibility on users to think critically about what they’re producing and whether it could infringe on someone else’s rights, the policy connects AI use to the same ethical standards that already shape professional creative practice.
Photo Credit: Juan Posada, ArtCenter Photographer, Google AI Packaging Sponsored Project.
There’s also an important reminder about privacy. AI tools aren’t just creative assistants — they’re data platforms. Encouraging students and faculty not to share personal or sensitive information helps build a more realistic understanding of the systems these tools are part of. It’s a small point, but an increasingly necessary one.
Educators, even within the same institution, are seldom in full agreement about process and practice. Because any school is made up of individuals, one of the most practical strengths of ArtCenter’s approach is its flexibility in the classroom. Instead of one blanket rule, instructors can decide how AI fits their course: not allowed in some skill-focused settings, limited in others, or actively explored in classes that examine emerging tools. That makes sense in design education, where a foundations studio and an experimental media seminar are trying to do very different things.
Within the Graphic Design department, we are actively working towards engaging AI in the classroom, but we are not just teaching tools because we recognize that tools will change. We are looking into the design process and ways to embed it into the structure of our core classes. This integration may look very different from class to class and subject to subject. There is no one-size-fits-all model.
Elaine Alderette, Assistant Chair, Screen-Based Media
Altogether, this kind of policy works less like a restriction and more like a framework. It sets expectations, protects creative integrity, and gives students a way to engage AI critically instead of passively. ArtCenter’s approach shows that it’s possible to make room for these tools without putting them at the center — to experiment with new technologies while keeping human judgment, voice, and responsibility where they belong. For design schools trying to figure out their own path, this model offers a clear takeaway: AI can be part of the studio, but the values of design education still lead.
Generative AI creates artifacts. Agentic AI creates systems. The next generation of designers must understand both and know when human judgment leads. The value of the designer is no longer just what we make, but how we frame problems, guide systems, and translate AI output into meaningful human experiences.
Gerardo Herrera
The post What Does a Thoughtful AI Policy Look Like in Design Education? appeared first on PRINT Magazine.

