From art generators to chatbots, AI seems to be having its zeitgeist moment in popular culture. But for those of us who work in design, the near-term and future applications of AI have been lively discussion points in strategic planning meetings for quite some time. There is no doubt that AI will be an instrumental part of our world’s future. It will allow us to rapidly synthesize all the data being collected via our phones, cameras, computers, smart devices, and much more, giving us the ability to decipher and understand that data in illuminating, meaningful, and likely world-changing ways.
What does this mean for the design industry? Though it may be a long time before AI is able to design a product from the ground up, the potential is clearly there. In fact, we believe AI is a tool that designers should be adding to their arsenal sooner rather than later.
Putting AI to Work
To put our money where our industry-informed opinions are, the Kaleidoscope Innovation team recently embarked on a studio project to design a high-end lighting fixture that could mimic lighting patterns found in nature. The project would enable our team to flex our aesthetic skills while using the full range of our design toolbox. One of those tools is Midjourney, a proprietary artificial intelligence program produced by an independent research lab of the same name. Though still in the open beta phase, Midjourney proved to be a useful partner in our mission. The collaboration between AI and the guiding hand of our expert design team delivered intriguing results.
One important distinction about the AI portion of the project: We were not setting out to produce real-world functionality, and in fact, we had no expectation or need for the AI to produce fleshed-out ideas or even design sketches. This experiment was about exploring new territories in aesthetics and applying them to materials and manufacturability considerations.
Our first step was to gather a team to collaborate on the search terms that would help visually articulate the aesthetic aspirations for our new fixture. Midjourney takes text-based prompts as input, which its algorithm uses to generate new images from vast databases of existing images. The terms we fed the algorithm included chandelier, lighting, brilliant, elegant light, airy, crystalline patterns of light, dancing, photorealistic detailed plants, greenery, daytime, bright, modern, beautiful, natural colors, and garden. The team also used technical inputs alongside these qualitative descriptors to set the aspect ratio and resolution while also guiding the algorithm to reference certain lighting styles and rendering approaches.
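For readers unfamiliar with the mechanics, a Midjourney prompt combines descriptive language with trailing parameters. The sketch below is illustrative only; it mixes some of the terms listed above with Midjourney's `--ar` (aspect ratio) and `--stylize` parameters, and the exact wording and parameter values our team used varied from iteration to iteration.

```
/imagine prompt: elegant chandelier, airy, crystalline patterns of light,
photorealistic detailed plants, greenery, daytime, bright, modern,
natural colors, garden --ar 16:9 --stylize 500
```

Small changes to either the descriptors or the parameters can shift the output noticeably, which is why the iterative loop described below matters.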
Digesting these descriptive words, Midjourney drew on the vast image data its model was trained on to create original—albeit amalgamated—artwork. The images it produced reflected the algorithm’s interpretation of the inputs the team provided. From there, we tweaked specific inputs to alter the color, lighting, tone, and subject matter, continuing to iterate until we had collected a series of AI-generated lighting fixtures that could inspire the team.
How Did AI Do?
Based on the text inputs the team provided, Midjourney was able to identify design elements that could produce the effect of light shining through leaves. The images it produced looked organic, almost surreal in the way they were able to capture the kind of nature-made glow and transparency that is elusive in real-world lighting solutions. The various iterations of artwork then became mood boards that set up our team to brainstorm ways in which the effect could conceivably be produced.
The algorithm’s interesting use of materials, colors, lighting effects, and overall mood inspired us to apply those attributes to a holistic design. In other words, instead of our team scratching their heads visualizing how the light should transmit, AI provided us with ideas that enabled us to focus on materials, manufacturability, technical requirements, and more. Rather than spending hours scouring the internet for inspirational imagery, the team was able to craft that imagery ourselves through AI in a fraction of the time—imagery that exactly aligned with our design vision.
Without question, Midjourney served as a highly effective springboard that sparked ideas our team would probably not have come up with starting from a blank sheet of paper and pen. In this sense, AI provides an upfront efficiency that can move a project farther down the road faster than it might otherwise have gone. Perhaps more than that, a significant strength of AI in this application is that it can cast a wide net in terms of inspiration and exploration. It’s an open mind, and designers should be willing—and eager—to go down the rabbit holes, teasing out new possibilities. Once an intriguing direction is established, the designer can take over to turn the AI-generated inspiration into an actual product.
The key to a successful AI collaboration is plugging in the right words or phrases to best draw out the AI. And so, crafting prompts could be viewed more as art than science. Further, with a program like Midjourney, there is an element of unpredictability: You don’t have much control over what you’re going to get out of it. There is a lot of trial and error and shooting in the dark. Therefore, if you already have a set idea in mind, using AI to design it will probably be more frustrating than productive.
The inherent aspect of exploration and discovery is a factor to consider as well. Our team felt excited about experimenting with this technology specifically because the lighting fixture was an internal project. Had we been designing for a client, we would have been more hesitant to use AI while balancing product requirements, timeline, budget, and resources.
Lastly, because this was a purely aesthetic exercise, we weren’t trying to solve any mechanical problems through AI—that skill is not in its wheelhouse at this point. This limitation provides a real barrier to the widespread adoption of AI, but as the algorithms improve over time, AI may be able to help us solve even our stickiest mechanical problems.
Beyond leveraging AI for creative exploration, Kaleidoscope has also put it to use in some of our research work. As part of our insights and user experience programs, we often do ethnography or time-and-motion studies in which we observe individuals interacting with a tool or experience. Typically, one of our team members is responsible for reviewing videos to log data, tracking everything from how often someone does something to the amount of time it takes them to do it. It’s a time-consuming process that has led us to start dabbling with programming AI to analyze video recordings for certain elements and then export the data quickly and effectively. Using AI to track the frequency and duration of actions for time-and-motion studies shows tremendous potential to save time and reduce costs while freeing our team members to focus on more creative assignments.
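To make the idea concrete, the kind of automated logging described above can be approximated with simple frame differencing: flag frames where the image changes beyond a threshold, then report each contiguous run of "moving" frames as an event with a start time and duration. This is a minimal sketch, not Kaleidoscope's actual tooling; it assumes grayscale frames have already been extracted (for example, with OpenCV), and a production system would use a far more robust activity-recognition model.

```python
import numpy as np

def motion_events(frames, diff_threshold=10.0, fps=30):
    """Detect motion events via frame differencing.

    frames: sequence of grayscale frames (2-D NumPy arrays).
    Returns a list of (start_seconds, duration_seconds) tuples,
    one per contiguous run of frames whose mean absolute
    difference from the previous frame exceeds diff_threshold.
    """
    events = []
    start = None  # index of the first "moving" frame in the current run
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float)).mean()
        moving = diff > diff_threshold
        if moving and start is None:
            start = i
        elif not moving and start is not None:
            events.append((start / fps, (i - start) / fps))
            start = None
    if start is not None:  # motion continued to the final frame
        events.append((start / fps, (len(frames) - start) / fps))
    return events
```

From the resulting event list, frequency is simply `len(events)` and total active time is the sum of the durations, which is exactly the sort of tally a researcher would otherwise log by hand.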
The Kaleidoscope team came away with an appreciation for where AI can support our design efforts today, particularly as a powerful aid in producing aesthetic inspiration and as a tool to sort and output raw data. Both help the design process in productive ways and serve as a small window to what may someday be an AI-driven design future.
This article was written for IDSA. To see the INNOVATION Magazine version, please visit idsa.org/news-publications/innovation-magazine/spring-2023/