Generative Pre-Trained Transformer for Design Concept Generation: An Exploration

DS 116: Proceedings of the DESIGN2022 17th International Design Conference

Year: 2022
Editors: Mario Štorga, Stanko Škec, Tomislav Martinec, Dorian Marjanović
Authors: Qihao Zhu, Jianxi Luo
Series: DESIGN
Institution: Singapore University of Technology and Design, Singapore
Section: Artificial Intelligence and Data-Driven Design
Page(s): 1825-1834
DOI number: https://doi.org/10.1017/pds.2022.185
ISSN: 2732-527X (Online)

Abstract

Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers. However, current generative design algorithms focus on diagrammatic or spatial concepts that are either too abstract to understand or too detailed for early-phase design exploration. This paper explores the use of generative pre-trained transformers (GPT) for natural language design concept generation. Our experiments involve the use of GPT-2 and GPT-3 for different modes of creative reasoning in design tasks. Both show reasonably good performance for verbal design concept generation.

Keywords: early design phase, idea generation, generative design, natural language generation, generative pre-trained transformer
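For illustration only (this is not the authors' published setup), a minimal sketch of sampling verbal design concepts from GPT-2 with the Hugging Face transformers library might look as follows. The model name, seed prompt, and sampling parameters are assumptions chosen for the example, not values taken from the paper:

from transformers import pipeline, set_seed

# Load a pre-trained GPT-2 model for open-ended text generation.
generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampled concepts reproducible

# A hypothetical seed prompt describing the design problem.
prompt = "A novel design concept for a portable water purifier:"

# Sample several short completions; nucleus sampling with a moderate
# temperature encourages diverse yet coherent verbal concepts,
# which suits early-phase design exploration.
outputs = generator(
    prompt,
    max_new_tokens=50,
    num_return_sequences=3,
    do_sample=True,
    top_p=0.95,
    temperature=0.9,
)

for i, out in enumerate(outputs, start=1):
    print(f"Concept {i}: {out['generated_text']}")

GPT-3, unlike GPT-2, is typically accessed through a hosted API rather than local weights, but the prompting pattern is analogous.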
