Technology’s New Helping Hand

In today's fast-paced world, it's become almost second nature for us to order products online and have them delivered right to our doorstep. From the newest gadgets to the clothes on your back, the efficiency of the process is truly remarkable. But have you ever stopped to wonder what goes on behind the scenes? How do companies plan, implement, and control how your goods are shipped with such precision and speed? Let's delve into the fascinating world of supply chain management and discover the secrets behind its efficiency.

At the heart of this remarkable process lies a well-orchestrated network of manufacturers, distributors, logistics providers, and retailers: a talented group of people, each uniquely specialized in their field, working to get finished goods to your doorstep. Together, through meticulous planning and coordination, they optimize their operations to meet customer needs in a timely, cost-effective manner.

The journey begins here at Kaleidoscope, where we embark on the process of product development, collaborating closely with companies like P&G. Take Tide Pods, for example. We design, engineer, and test the product to meet consumer demands. From ideation to final production, extensive research and development are essential to creating innovative, high-quality products. This journey requires a strong partnership and collaborative effort to bring ideas to life.

Once the products are ready to go, they are transported to distribution centers or warehouses strategically located to facilitate efficient distribution. These centers act as hubs where products are received, sorted, and prepared for further transportation. Advanced technologies such as automation and robotics help ensure that each shipment is assigned the most appropriate route, minimizing travel time and maximizing efficiency.

Once an order is placed, logistics comes into play. Sophisticated systems manage inventory, track shipments, and optimize routes to ensure timely delivery. The products are carefully packaged and labeled, ready to embark on their journey to the customer's doorstep. They can travel by various modes, including trucks, trains, ships, airplanes, and, depending on the urgency of the delivery, even drones.

One of the key drivers of efficiency in this process is advanced technology. Logistics is a rapidly growing industry powered by artificial intelligence, machine learning, and big data analytics. These technologies constantly analyze vast amounts of data so that companies can gain valuable insights into consumer behavior and demand patterns. Real-time visibility also supports shipment tracking, demand forecasting, route optimization, and inventory management. Together, these capabilities improve overall supply chain performance.

Another crucial aspect of efficient product delivery is what’s called “last-mile logistics”. This final step of the journey from the distribution center to the customer's doorstep can often be the most challenging. To overcome this, companies are using innovative strategies to enhance efficiency and customer satisfaction. Delivery through drones, autonomous vehicles, and even crowdsourced delivery services are being tested and implemented to reduce delivery times and overcome the challenges of urban congestion.  

So, the next time you receive a package at your doorstep, take a moment to appreciate the incredible logistics infrastructure and the efforts that go into making it happen. Behind these technological advancements is a dedicated workforce of logistics professionals who work tirelessly to ensure the smooth flow of goods. The evolving world of logistics continues to push boundaries and find innovative solutions to meet the increasing demands of e-commerce and consumer expectations. It's an exciting time to witness the transformation of how products reach us with such efficiency and convenience.

 What do you think the future of logistics will look like as technology advances? Do you have any experiences or insights to share about the logistics behind product deliveries? We'd love to hear your thoughts in the comments below. 

Back to Insights + News

Authors

  • Taylor Schmitt

    Marketing Co-op | [email protected]

    Taylor Schmitt is currently a student at The Ohio State University, where she studies marketing. She loves exploring new opportunities and facing new challenges. While working at Kaleidoscope she has been able to work closely with the sales team to support business growth and brand visibility.

  • Matt Suits

    Head of Sales | [email protected]

    Matt has always loved interacting with clients to find solutions for their challenges. He was drawn to business development at Kaleidoscope Innovation because of the great potential he saw. After graduating from the Lindner College of Business at the University of Cincinnati, he worked with two startups, a marketing consultancy, a financial services company and the non-profit 3CDC. He believes that listening is the most important part of sales. In his free time, Matt enjoys movies, trying new foods, traveling and the great outdoors.

The Future of AI-powered Healthcare

What is artificial intelligence (AI)? Is it the self-aware computer from sci-fi, determined to destroy humanity? Is it a robot that does our job for us while we kick our feet up? Right now, it is perhaps neither. It can be defined as “a system that mimics human intelligence to perform complex tasks using advanced learning algorithms that capture underlying patterns and relationships from the data they collect.” The tasks and benefits of such a system can be many, but they generally fall into three main use categories: accuracy improvement, automation of tasks, and recommendation engines.

In developing a SaMD (Software as a Medical Device) product, consider both regulatory guidelines and best practices. The FDA is partnering with industry to develop regulations in this emerging field; it recently released guidance on Clinical Decision Support Software describing the criteria by which software is considered a medical device by the agency. For software life-cycle development, IEC 62304 outlines the processes of risk management, maintenance, configuration management, and problem resolution.

Developers should build in systems on the front end for data mining, whether in the form of document-capture tools, video data collection, speech recognition, or otherwise. They should also build comprehensive cybersecurity around these data sources as well as the access, analysis, and output systems.

Lastly, algorithms should take bias into account. Bias is already present in the diagnostic process today: clinicians can jump to conclusions based on early information and stick to their guns even as new information becomes available (premature closure / anchoring). The algorithms themselves can also carry bias in how data is fitted when machine learning is automated.

  • Automation Bias: The tendency of people to defer to automated output, perhaps due to a lack of confidence or experience, or the assumption that the automation is designed to make the correct determination.
  • Fitting Bias: Overfitting – the model relies too heavily on its training data and does not respond correctly to new information. Underfitting – the model is undertrained and does not correctly identify the relationships between variables.
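To make the fitting-bias distinction concrete, here is a minimal, illustrative sketch (not drawn from any medical product): fitting noisy samples of a sine curve with a too-simple and a too-flexible polynomial. The data, polynomial degrees, and error metric are all arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training samples of an underlying sine relationship.
x_train = np.linspace(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 20)

# Clean test points drawn from the same range.
x_test = np.linspace(0.02, 0.98, 50)
y_test = np.sin(2 * np.pi * x_test)

def poly_errors(degree):
    """Fit a polynomial of the given degree; return (train, test) RMS error."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.sqrt(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    test_err = np.sqrt(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    return train_err, test_err

under_train, under_test = poly_errors(1)   # underfit: too simple for the data
over_train, over_test = poly_errors(12)    # overfit: memorizes the noise

# The overfit model looks better on training data but generalizes worse.
print(f"underfit  train={under_train:.3f}  test={under_test:.3f}")
print(f"overfit   train={over_train:.3f}  test={over_test:.3f}")
```

The underfit line cannot capture the sine shape at all, while the overfit polynomial tracks the training noise almost exactly yet strays from the true curve on new points, which is the automated analogue of the clinical biases described above.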

Widespread AI use is in its infancy, but it is already being leveraged across several surgical products on the market, including surgery planners, guidance systems, AR, blood-loss monitoring, and predictive analytics. The future holds many opportunities for AI to break down existing healthcare challenges.

Accuracy Improvement:

  • Comprehensive Patient Medical Information
  • Summarization and Highlighting of Patient Case History
  • Accurate encoding of procedures and diagnoses for insurance
  • Accurate diagnosis from medical images
  • Risk-aware decision making –using predictive analysis of surgical outcome, implant choice, length of hospital stay, risk of re-hospitalization
  • Post op x-ray, feedback loop, feedback to surgeon on trending accuracy stats, predictive risks
  • Physician burnout – fewer errors during diagnosis
  • Physician shortage – making fewer surgeons more efficient

Automation Enabled Improvements:

  • Improved surgical planning / operation
  • AI-assisted surgical robotics
  • Supply chain automation
  • Reduced non-conformances, out-of-commission instrument sets
  • Reduced waste, reprocessing costs
  • Smart intra-op assistant / training

Recommendations Engine:

  • Patient/procedure/surgeon customized device on demand
  • Fair surgeon success ratings based on predictive risk/outcomes
  • Informing consumers on surgeon/facility for their condition to maximize outcomes

It probably won’t be too far into the future before some of these AI-enabled improvements become mainstream practice in the healthcare domain. The recent advances in ChatGPT have shown how complex, knowledge-intensive tasks such as text summarization, essay generation, and intelligent Q&A (Question and Answer) can be accomplished by current language models. Convolutional Neural Network (CNN)-based deep learning models are showing promise for automatic detection and classification of tumors in medical imaging. Advanced Machine Learning (ML), rule-based modeling, and embedded AI can help address other opportunities such as risk prediction, improved surgical planning, AI-assisted robotic devices, supply chain automation, and customized recommendations.

AI will help bring consistency to the process, improve overall efficiency, and reduce the cost of operations while maintaining and improving regulatory compliance.

Interested in implementing AI/ML technology into your business?

Kaleidoscope uses advanced learning algorithms to capture patterns and relationships within your data to help you better understand the data collected and provide both exploratory and predictive analytics based on findings. Contact Matt Suits: [email protected]


Authors

  • Eric Kennedy

    Principal Engineer | [email protected]

    Eric Kennedy is an engineer at Kaleidoscope Innovation based in Cincinnati, Ohio, and has over 15 years of global medical device experience leading large- and medium-scale concept-to-launch orthopedic, micro-surgical, cardiovascular and ophthalmic devices.

  • Dr. Ravi Nandigam

    Principal Consultant

    Dr. Ravi Nandigam is a Principal Consultant in the Advanced Engineering Group at Infosys. He has 15 years of experience applying Artificial Intelligence, Machine Learning, and software-based solutions in diverse engineering domains. Dr. Nandigam holds a patent and has authored many technical articles in peer-reviewed international journals on AI/ML-based applications in engineering.

  • Dr. Ravi Kumar G. V. V.

    Vice President and Head Advanced Engineering Group (AEG)

    Dr. Ravi Kumar is Vice President and Head Advanced Engineering Group (AEG) of Engineering Services, Infosys. He led numerous innovations and applied research projects for more than 26 years. His areas of expertise include mechanical structures and

AI as Intelligent Design? Not Yet, But It’s Coming.

From art generators to chatbots, AI seems to be having its zeitgeist moment in popular culture. But for those of us who work in design, the near-term and future applications of AI have been lively discussion points in strategic planning meetings for quite some time. There is no doubt that AI will be an instrumental part of our world’s future. It will allow us to rapidly synthesize all the data being collected via our phones, cameras, computers, smart devices, and much more, giving us the ability to decipher and understand that data in illuminating, meaningful, and likely, world-changing ways.  

What does this mean for the design industry? Though it may be a long time before AI is able to design a product from the ground up, the potential is clearly there. In fact, we believe AI is a tool that designers should be adding to their arsenal sooner rather than later. 

 

Putting AI to Work 

To put our money where our industry-informed opinions are, the Kaleidoscope Innovation team recently embarked on a studio project to design a high-end lighting fixture that could mimic lighting patterns found in nature. The project would enable our team to flex our aesthetic skills while using the full range of our design toolbox. One of those tools is Midjourney, a proprietary artificial intelligence program produced by an independent research lab by the same name. Though still in the open beta phase, Midjourney proved to be a useful partner in our mission. The collaboration between AI and the guiding hand of our expert design team delivered intriguing results. 

One important distinction about the AI portion of the project: We were not setting out to produce real-world functionality, and in fact, we had no expectation or need for the AI to produce fleshed-out ideas or even design sketches. This experiment was about exploring new territories in aesthetics and applying them to materials and manufacturability considerations. 

Our first step was to gather a team to collaborate on the search terms that would help visually articulate the aesthetic aspirations for our new fixture. Midjourney works by inputting text-based prompts, which the AI algorithm uses to generate new images using vast databases of existing images. The terms we fed the algorithm included chandelier, lighting, brilliant, elegant light, airy, crystalline patterns of light, dancing, photorealistic detailed plants, greenery, daytime, bright, modern, beautiful, natural colors, garden, and greenery. The team also used technical inputs alongside these qualitative descriptors to determine the aspect ratio and resolution while also guiding the algorithm to reference certain lighting styles and rendering approaches.  

Digesting these descriptive words, Midjourney searched vast amounts of data across the internet to create original—albeit amalgamated—artwork. The images it produced reflected the algorithm’s interpretation of the inputs the team provided. From there, we tweaked specific inputs to alter the color, lighting, tone, and subject matter, continuing to iterate until we had collected a series of AI-generated lighting fixtures that could inspire the team.

How Did AI Do?  

Based on the text inputs the team provided, Midjourney was able to identify design elements that could produce the effect of light shining through leaves. The images it produced looked organic, almost surreal in the way they were able to capture the kind of nature-made glow and transparency that is elusive in real-world lighting solutions. The various iterations of artwork then became mood boards that set up our team to brainstorm ways in which the effect could conceivably be produced.  

The algorithm’s interesting use of materials, colors, lighting effects, and overall mood inspired us to apply those attributes to a holistic design. In other words, instead of our team scratching their heads visualizing how the light should transmit, AI provided us with ideas that enabled us to focus on materials, manufacturability, technical requirements, and more. Rather than spending hours scouring the internet for inspirational imagery, the team was able to craft that inspiration imagery ourselves through AI in a fraction of the time—imagery that exactly aligned with our design vision. 

concept board

Without question, Midjourney served as a highly effective springboard that sparked ideas our team would probably not have come up with starting from a blank sheet of paper and pen. In this sense, AI provides an upfront efficiency that can move a project farther down the road faster than it might otherwise have gone. Perhaps more than that, a significant strength of AI in this application is that it can cast a wide net in terms of inspiration and exploration. It’s an open mind, and designers should be willing—and eager—to go down the rabbit holes, teasing out new possibilities. Once an intriguing direction is established, the designer can take over to turn the AI-generated inspiration into an actual product.  

The key to a successful AI collaboration is plugging in the right words or phrases to best draw out the AI. And so, crafting prompts could be viewed more as art than science. Further, with a program like Midjourney, there is an element of unpredictability: You don’t have much control over what you’re going to get out of it. There is a lot of trial and error and shooting in the dark. Therefore, if you already have a set idea in mind, using AI to design it will probably be more frustrating than productive.  

The inherent aspect of exploration and discovery is a factor to consider as well. Our team felt excited about experimenting with this technology specifically because the lighting fixture was an internal project. Had we been designing for a client, we would have been more hesitant to use AI while balancing product requirements, timeline, budget, and resources.  

Lastly, because this was a purely aesthetic exercise, we weren’t trying to solve any mechanical problems through AI; that skill is not in its wheelhouse at this point. This limitation is a real barrier to the widespread adoption of AI, but as the algorithms improve over time, AI may be able to help us solve even our stickiest mechanical problems.

Beyond leveraging AI for creative exploration, Kaleidoscope has also put it to use in some of our research work. As part of our insights and user experience programs, we often do ethnography or time-and-motion studies in which we observe individuals interacting with a tool or experience. Typically, one of our team members is responsible for reviewing videos to log data, tracking everything from how often someone does something to the amount of time it takes them to do it. It’s a time-consuming process that has led us to start dabbling with programming AI to analyze video recordings for certain elements and then export the data quickly and effectively. Using AI to track the frequency and duration of actions for time-and-motion studies shows tremendous potential to save time and reduce costs while freeing our team members to focus on more creative assignments. 
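As a rough illustration of the downstream step in such a workflow, here is a small stdlib-only sketch that aggregates frequency and duration per action from an event log. The action names and timestamps are invented for demonstration; in practice, an automated video-analysis pass would produce this kind of log.

```python
from collections import defaultdict

# Hypothetical event log: (action, start_seconds, end_seconds) tuples, the
# sort of output an automated video-analysis pass might emit for a
# time-and-motion study.
events = [
    ("reach_for_tool", 3.0, 4.5),
    ("adjust_setting", 5.0, 9.0),
    ("reach_for_tool", 12.0, 13.0),
    ("adjust_setting", 15.0, 18.5),
    ("reach_for_tool", 21.0, 22.5),
]

def summarize(events):
    """Aggregate count, total duration, and mean duration per action."""
    stats = defaultdict(lambda: {"count": 0, "total_s": 0.0})
    for action, start, end in events:
        stats[action]["count"] += 1
        stats[action]["total_s"] += end - start
    for s in stats.values():
        s["mean_s"] = s["total_s"] / s["count"]
    return dict(stats)

summary = summarize(events)
for action, s in summary.items():
    print(f"{action}: {s['count']}x, {s['total_s']:.1f}s total, {s['mean_s']:.2f}s avg")
```

The interesting (and hard) part is upstream, where the video is turned into events; once the log exists, the frequency-and-duration reporting our researchers currently do by hand reduces to a few lines of aggregation like this.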

The Verdict 

The Kaleidoscope team came away with an appreciation for where AI can support our design efforts today, particularly as a powerful aid in producing aesthetic inspiration and as a tool to sort and output raw data. Both help the design process in productive ways and serve as a small window to what may someday be an AI-driven design future.

This was written for IDSA, if you'd like to see the INNOVATION Magazine article, please check out idsa.org/news-publications/innovation-magazine/spring-2023/


Authors

  • Tony Siebel

    Director of Design | [email protected]

    Tony Siebel is director of design at Kaleidoscope Innovation, delivering a user-centered mindset to products and experiences.

  • Tom Gernetzke

    Senior Industrial Designer | [email protected]

    Tom Gernetzke is a senior lead industrial designer at Kaleidoscope Innovation and has spent the last 12 years creatively bringing new product ideas to life.

  • Caterina Rizzoni

    Lead Industrial Designer | [email protected]

    Caterina Rizzoni is a lead industrial designer at Kaleidoscope Innovation and is the Director-at-Large of Conferences for IDSA.

Infosys Medical Devices and Engineering Services x Kaleidoscope Innovation

The Healthcare and Medical devices industry is undergoing a revolutionary transformation in the way solutions and devices are being formulated and developed. Medical devices are becoming more connected than ever and remote patient monitoring with data analytics is becoming a norm.

It is imperative for medical device companies to adopt a strategic approach to stay ahead of the innovation curve. By leveraging technology advancements in areas such as mobility, wireless, cloud, and analytics, they can drive innovation that addresses market needs and the challenges of long device development cycles, unoptimized development processes, and high production costs.

At Infosys, we help our clients in designing customized devices, end-to-end product development, maintenance, manufacturing support, regulatory documentation, and product compliance and certifications. We also help optimize R&D cost and improve supply chain efficiencies by leveraging new technologies and partner ecosystems. This is to bring innovative medical devices and Software as a Medical Device applications into the market with the objective of improving patient care while reducing the cost of care.

Our ISO 13485-certified processes and Quality Management System ensure high-quality product development, enabling our clients to meet their regulatory needs and objectives. With our recent acquisition of the product design and development firm Kaleidoscope Innovation, we plan to redefine patient treatment and consumer health across the globe.

Full article can be found on Infosys.com

The Design of Everyday Things

Inspired by Don Norman’s classic work, the Design of Everyday Things, we’ve been thinking about mundane, everyday items that can have annoying usability flaws. While we have a particular focus on the human factors of healthcare and medical products here at Kaleidoscope, we can apply that same rigorous, analytical human factors approach to these everyday things.

So, here we have the seemingly benign 2.5 gallon jug of drinking water, a household staple used by a variety of brands across the country.

Problem 1: As water is dispensed from the jug, additional air is required to replace the dispensed water to ensure consistent water flow and prevent the jug from collapsing due to the pressure of the surrounding air. To add air flow into the jug, a small hole must be punctured into the top with a sharp knife. The use of a sharp knife poses a potential safety hazard when considering the orientation and motion in which the knife must be used and the force necessary for the knife to puncture the slick plastic material of the jug. In addition, the most obvious place to puncture this hole is the top side facing the front of the jug, which has a slight slant toward the user. The angle of the stabbing motion must be just right; if the angle is too shallow, the knife blade can skid across the surface of the plastic with the blade pointing in toward the user’s body.

A potential mitigation for this problem is to provide an adhesive pull tab that can be removed to reveal a pre-punctured vent hole.

Problem 2: The spigot contains a small strip of plastic that extends from the spigot base to the dispenser handle. The plastic strip is intended to prevent the dispenser handle from being pulled open until the user intentionally breaks the strip, pulls the dispenser handle, and begins dispensing the water. However, the plastic strip can be easily broken unintentionally, and the dispenser handle then opens with very little resistance. This can lead to the dispenser handle opening inadvertently when force is applied to the spigot during loading, or the spigot catches on a surface while unloading, potentially emptying water into a shopping cart or the trunk of a car.

A potential mitigation for this problem is to provide a screw cap over the spigot, similar to the caps on water bottles.

What’s an aspect of an everyday item that you would change to improve the user experience?


Author

Improving Vision and Quality of Life with Samsara’s Implantable Miniature Telescope

Tom Ruggia, Samsara’s CEO, talks about the innovative technology that improves the vision of people with untreatable retinal disorders. The discussion also covers the importance of partnerships, tele-medicine, and the human-centric approach.

Hosted by Jeff Kavanaugh, Chief Learner and Sharer of the Infosys Knowledge Institute.

“I like innovators, of course, but I like people who really get into the task at hand and the objective. I love to see masters at their craft. So I was intrigued and got to know Kaleidoscope well… we were working in a laboratory, working on a design of a drug-delivery technology that was making its way to market… What they did with that, it wasn't just, "Let's create a catheter that gets there." It was, "Let's create a catheter that gets there. Let's make sure the surgeon experience is perfect.”

“The telescope will focus on tissue just around the section which has lost vision. So we go around the lesion with the light that we magnify, and then the brain can take that image and use it as a central image thanks to the magnified light.”

“The doctors are very aware that wet AMD can be treated, and that's thanks to the treatment innovations. What doctors are not as aware is the other treatments for late-stage disease.”

“There are currently 4 million patients that are appropriate for our technology who have not had a previous cataract surgery and have concomitant late-stage AMD. So for those 4 million patients, we want to get this SING IMT out there, and we feel we can get it to the masses and train the surgeons appropriately.”


INSIGHTS

  • About 11% of elderly patients will have age-related macular degeneration, and many of them progress to the latter stages of the disease. We see this disease in populations around the world. Macular degeneration affects the tissue in the center of the back of the eye.
  • Patients lose central vision beginning, say, somewhere in their 50s, and the loss can progress through life to where they are almost centrally blind: they can't see faces, can't read, can't drive. That decline can happen rather quickly.
  • There are treatments for some parts of the disease. Inside the back of the eye, new blood vessels may form; neovascularization is common in this disease state. When the eye creates new blood vessels in that very tight tissue plane, it's a recipe for disaster, so to speak. Generally, those patients lose vision very fast. This is commonly referred to as wet AMD or neovascular AMD, and there are treatments for wet AMD, but they treat the neovascularization, not the underlying condition.
  • Injections like Lucentis or Eylea are very common today. They can rescue the fast onset of vision loss in wet-AMD patients, but the patients still progress along the disease cascade. Patients who don't experience neovascularization are referred to as dry-AMD patients. Those patients will also progress and eventually lose their central vision, as will wet-AMD patients over time, owing to the underlying disease state. It's a very sad condition that leads to debilitation and reliance on caregivers.
  • What doctors are not as aware of are the other treatments for late-stage disease; today those treatments are largely limited to external devices that magnify light and change its central focus. Many retina specialists are not aware of the technology, and some are not favorable to it. Patients also experience a kind of fatigue: so many of those with dry AMD are told there is no pharmaceutical or surgical intervention that can help them that they become reluctant to come back and see the doctor.
  • If you lose the ability to drive at 20/80, 20/160 is far worse than that, with most central vision gone. Most of our patients come in able to see only the big E at the very top of the chart.
  • We're approved in Europe and have been commercializing this device there for a year, and we're seeing an average of three to four lines of improvement. That means a patient who starts out able to see just the big E at the top of the chart can then read four lines deeper, getting very close to 20/80, or possibly even better, after surgery.
  • So we re-engineered the technology with human factors in mind. Surgeon factors, of course: the delivery can now be done in 28 minutes on average, where it was 70 minutes prior. But also patient-centric design: the six-and-a-half-millimeter incision in the eye is about half the size of the previous incision, leading to faster recovery and less chance of issues with the cornea.


ABOUT TOM RUGGIA

President and CEO – Samsara Vision

Thomas Ruggia joined Samsara Vision as the Chief Executive Officer in July 2020. With nearly 20 years of ophthalmology business experience, he has a comprehensive and nuanced understanding of the healthcare environment in the United States and abroad, as well as significant experience in the development and commercialization of vision products with differing regulatory and pricing structures.

Before joining Samsara Vision, Mr. Ruggia spent five years at Johnson & Johnson, working at Johnson & Johnson Vision and The Janssen Pharma Co, respectively. Most recently at Johnson & Johnson Vision, he was the Vice President WW Customer Experience and Ocular Surface Disease, responsible for two global commercial teams working in customer strategy, customer service, and field technical service. At Janssen, he was the commercial strategy leader in ophthalmology assigned to an asset in development for AMD. Previously, Mr. Ruggia spent fourteen years at Alcon, a division of Novartis, working in a variety of ophthalmology sales and marketing roles with escalating responsibility. He graduated with a Bachelor of Science from The College of New Jersey in 1998.

Connect with Tom Ruggia on LinkedIn

Mentioned in the podcast

About the Infosys Knowledge Institute

Samsara Vision, Implantable Telescope Technology

Breaking the Mold: Alternative Materials for Sustainable Product Development

Over the past year, Kaleidoscope has begun to identify areas of opportunity for sustainability improvements within our culture and the products we design. One area we have been exploring is the potential of alternative materials within our product development process. Through a vendor partnership with 3D Color and RyPax, the team has been able to perform a preliminary evaluation of a molded fiber tray through small batch prototyping.

What is Molded Fiber?

Molded fiber or molded pulp is a material used for making molded forms from a variety of natural fibrous materials and/or recycled paper pulp. It is commonly used for a variety of packaging applications, from clamshell containers to consumer electronics packaging. The Kaleidoscope team was eager to evaluate potential packaging and product designs utilizing this technology.  To do so, we teamed up with 3D Color to produce a small batch of 100 trays, which were originally designed for thermoformed plastic sheet.

Process and Environmental Impact Considerations

The mixed material used to make these trays contained bamboo fibers and bagasse, a by-product of sugarcane processing. There are a variety of other types of natural materials that can make up the base mixture, resulting in different physical properties.

The team also learned that there are two different methods that can be used in the molding process: dry press and wet press. The trays that were evaluated were generated via the dry-press method, where the material is formed in a tool, dried, and cured. These trays have a 1.2mm wall thickness and ~5° draft angle. The maximum wall thickness that can be achieved with this method is ~2.0mm. The minimum draft angle that can be achieved is ~5°.

The wet-press, or “heated-press,” method differs in that the part is formed using a heated tool. With this method, a maximum wall thickness of ~1.2mm and a minimum draft angle of ~1° can be achieved. Wet press can produce a higher-fidelity finish, though the heated tool requires more energy than the dry-press method.
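As a rough design-rule check, the approximate process limits quoted above can be encoded and compared against a candidate tray geometry. This is a minimal sketch, not a real tool: the limit values come from the figures in this post, and the dictionary and function names are our own illustration.

```python
# Approximate molded-fiber process limits described above (illustrative only;
# confirm actual capabilities with your molder before committing to a design).
PROCESS_LIMITS = {
    "dry_press": {"max_wall_mm": 2.0, "min_draft_deg": 5.0},
    "wet_press": {"max_wall_mm": 1.2, "min_draft_deg": 1.0},
}

def feasible_processes(wall_mm: float, draft_deg: float) -> list[str]:
    """Return the press methods that can produce a given wall thickness and draft."""
    return [
        name
        for name, limits in PROCESS_LIMITS.items()
        if wall_mm <= limits["max_wall_mm"] and draft_deg >= limits["min_draft_deg"]
    ]

# The evaluated trays (1.2 mm wall, ~5 degree draft) fit either method:
print(feasible_processes(1.2, 5.0))  # ['dry_press', 'wet_press']
# A 1 degree draft rules out dry press:
print(feasible_processes(1.2, 1.0))  # ['wet_press']
```

A check like this is only a first-pass filter; real feasibility also depends on geometry complexity, fiber mix, and tooling.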

Preliminary Performance Evaluation

Upon receiving the batch of dry-press trays, the team was able to make some qualitative observations:

  • Features and edges were surprisingly sharp and visually on par with what’s expected from plastic thermoforming.
  • Structural strength was similar to an egg carton, even with a significantly thinner wall thickness.
  • A uniform off-white color was achieved. Fibers that make up the tray were visible upon close inspection.
  • The material released particulate when stressed, with the particulate being very small pieces of the fiber.
  • The underside of the trays had a slightly rougher surface due to the screen used in the molding process for moisture evacuation.

To understand the trays’ performance over time, we also performed simulated accelerated aging. We stored the trays at 60 °C for 30 days, which is approximately equivalent to one year of real-time aging, per the Arrhenius equation. The team noted several observations after removing the trays from the aging chamber:

  • Slight warping was present in some samples.
  • Accompanying creases seemed to occur in areas that were not stiffened by curved/bent geometry.
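The one-year equivalence behind that 30-day protocol can be sketched with the Q10 form of the Arrhenius relationship commonly used in ASTM F1980-style shelf-life studies. The Q10 = 2 aging factor and 25 °C ambient reference below are common defaults, not values stated in this post:

```python
def accelerated_aging_factor(t_accel_c: float, t_ambient_c: float = 25.0,
                             q10: float = 2.0) -> float:
    """Q10 form of the Arrhenius relationship: each 10 degC rise above
    ambient multiplies the effective aging rate by q10."""
    return q10 ** ((t_accel_c - t_ambient_c) / 10.0)

aaf = accelerated_aging_factor(60.0)   # ~11.3x real time at 60 degC
simulated_days = 30 * aaf              # ~339 days, i.e. roughly one year
print(f"AAF = {aaf:.1f}; 30 days at 60 degC ~= {simulated_days / 365:.2f} years")
```

With these assumptions, 30 days in the chamber works out to a little over 11 months, which is where the "approximately one year" figure comes from.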

Design Considerations

This exercise just scratched the surface of the testing needed to ensure a product made using this technology meets its requirements. However, it allowed for several basic discoveries that will inform how we might use this manufacturing process in the future. Moving forward, the team has determined that the following should be considered, depending on the application and its requirements:

  • Material
    • Do different natural fiber material sources yield significantly different properties? (e.g., Do corn husks have the same properties as bagasse?)
  • Thickness
    • What is the right thickness for the application?
  • Simulated Environmental/Distribution Testing
    • Whether the recycled paper pulp and/or natural fiber-based material is the product or the packaging, does it survive expected conditions? Shipping, temperature and humidity fluctuations, and shelf life are all variables that need to be considered.
    • For sealed packaging, how well does the material keep water out? How well does it “breathe”?
  • Coatings, including thin-sheet PLA
    • How well does the coating remain adhered to the tray?
    • What properties does the coating give to the tray (e.g., water resistance, decreased particulate shedding)?
  • Printing/Marking/Adhesives
    • Can the material receive printing/stamping?
    • Can dyes be added to the material?
    • Do stickers or adhesives remain adhered?
  • Cost
    • What is the order quantity and required fidelity?
    • In general, our partners’ experience suggests that in a production run, the cost of molded fiber parts is approximately 2-3 times that of a plastic vacuum-formed part. What does this mean for trays that are more complex?
  • Sustainability
    • Are the perceived benefits of using the molded fiber material outweighed by the accumulated energy, coatings, markings, etc. that may go into making this product?

These tray prototypes were high fidelity and can prevent the need for expensive and time-consuming production-quality tooling early in the development process. Fibrous particulate is present on and in this material. If your product is sensitive to foreign material, a coated molded fiber tray is likely a better choice than an uncoated tray. Heat/aging seems to influence the integrity of the part, though this is only a preliminary finding. Due to relatively low structural strength compared to plastic, we recommend designing for strength, and reducing stress concentrations where possible. We are still evaluating this technology, and fortunately, partners like 3D Color make the iterative design and test cycle faster and less costly.

What’s Next?

Leveraging diverse design and manufacturing experience, evaluating new technologies and materials, and cultivating strong vendor relationships is nothing new for Kaleidoscope. Though these are among our strongest skillsets, we recognize that there is significant potential to better drive sustainability within the products we design and provide meaningful material and technology alternatives for the clients we serve.

About the Authors/Companies:

Kaleidoscope Innovation

  • Headquartered in Cincinnati, Ohio, Kaleidoscope Innovation provides medical, consumer, and industrial clients with full-service insights, design, human factors, and product development. For more than 30 years we have been helping our clients grow their capabilities, gain usable knowledge, and get worthwhile results. As a full-spectrum product design and development firm, we are an expert extension of your product vision. Our teams collaborate across disciplines, providing specialized input to produce the ideal intersection between function and form. To ensure the soundness of our work, Kaleidoscope houses a full range of test labs, and we employ an award-winning team that embraces every challenge, applying their experience, ingenuity, and passion.

3D Color:

  • We help our clients shape first impressions into lasting ones.  At 3D Color, we help the world’s best designers, marketers, brand builders and researchers bring their ideas to life. With industry-leading capabilities in advanced prototyping, exceptional comp production, custom color development, efficient sales sample programs and an innovation incubator, we’re a full-service strategic partner to the visionaries who are shaping the future of packaged goods.

Grant Cothrel

  • Sr Design Engineer
  • Grant enjoys being faced with new challenges and recognizes that well-designed products lead to better experiences and outcomes for users. He loves the design process and has operated primarily in medical device and industrial instrumentation. His passion to understand, innovate, and simplify has been supported and strengthened by the Kaleidoscope team and their talented partners. At home, he always has a fun project in the works (think: wooden bicycle, handheld Theremin, one-string electric guitar)!

Sophie Fain

  • Industrial Designer
  • As a part of the Kaleidoscope team, Sophie has had the opportunity to collaborate with diverse and talented individuals to understand complex problems and provide meaningful human-centered solutions. Sophie is driven by the chance to create a positive emotional connection with users through the experiences she crafts. Prior to joining Kaleidoscope, she worked for companies like DePuy Synthes of Johnson & Johnson, The Clorox Company, and a physical therapy startup, BAND Connect. When she is not captivated by a design problem, Sophie enjoys getting her creative energy out through activities like pottery and cooking.

Mike Corbett

  • Director of Model Services
  • Mike leads the model shop team at 3D Color and utilizes a wide range of techniques to meet clients’ challenges. With 28 years of experience and a passion for model making, Mike leads by example and teaches his craft to his team and to the company’s clients.  Mike enjoys the “hands-on” nature of our work and the problem-solving thinking needed to complete projects.  In his free time, Mike enjoys his family, travel, and the wicked game of golf.
Virtual Tools for Innovative Product Design

Co-authored article with Infosys

Design influences a product’s lifecycle performance and cost, starting from its development. Product development costs rise significantly if a defect is identified at a later stage. Virtual tools for new product introduction simulate possible scenarios up front for comprehensive testing, getting products to market quickly and saving money for a successful launch.

Insights

  • Design influences a product’s lifecycle performance and cost, starting from its development.
  • Conceptualization and design stages determine more than 70% of a product’s lifecycle decisions and cost.
  • Virtual tools are an effective way to design new products that serve specific customer needs.
  • Virtual models of new products accelerate their evaluations to shrink the development cycle time.
  • Organizations should create virtual replicas of workplaces for human-machine interaction studies from multiple perspectives.

Lifecycle cost is the total cost (direct and indirect) a product incurs in its life span. Conceptualization and design stages determine more than 70% of a product’s lifecycle decisions and cost.1 The earlier an issue is identified, specifically in the design stage, the easier it is to fix and avoid costly rework. Virtual replicas (or digital twins) of products, processes, and environments streamline design and new product development to reduce costs and time to market.

A common assertion is that between 80% and 90% of new products fail. However, realistic failure rates vary by industry, from 36% in healthcare to 45% in consumer goods.2 Professor Clayton Christensen, best known for his theory of disruptive innovation, believes the success mantra is to design products that serve their intended customers. Manufacturers should focus on the job that customers buy a product to do.3

To enable that, virtual representations of the product under development, in orchestration with humans and other entities in the ecosystem, are an effective approach. The approach encourages innovation. Designers visualize the product’s operating condition, create digital prototypes for trial runs, and carry out tests on a global scale. Virtual tools like 3D computer models and digital twins support informed decisions in early product design stages. This mitigates the risk of a wrong product release or a poor customer experience.

Virtual tools are an effective way to design new products that serve specific customer needs.

When end users receive virtual training of a complicated product’s operation (like an aircraft engine), memory retention happens in the background. Any number of such instances can be created at a negligible marginal cost for repetitive usage. A central digital setup saves the cost of setting up multiple physical arrangements at different locations.

Parameters of Successful New Products

Product failures are more often commercial than technical. More than 25% of revenue and profits across industries come from new products, according to a study by McKinsey. Successful products relate to a set of core capabilities, the most important of which are as follows:4

  • Collaboration to execute tasks as a team.
  • Investment to mine market insights and their inclusion in the product.
  • Plans for new product launches, comprising target customer segments, key messages to communicate, and objectives to achieve.
  • Talent development for new product launches with defined career paths and incentives.

At the same time, the primary reasons for product failure, and their mitigations, are the following:5

  • Gap in meeting product expectations; delay launch until product completion.
  • Inability to support rapid growth if a product is successful; set ramp-up plans to avoid this.
  • Low demand for a new product; perform due diligence for customer requirement before planning a product. Launch products in suitable markets.
  • Difficulty in new product usage; provide proper customer orientation and training.

Virtual tools for product design address the above reasons for failure and increase the chances of successful product launches.

Design Thinking with Virtual Tools

Design thinking is a popular, technology-agnostic approach for new systems design and problem solving. It balances the technical feasibility of products, financial viability, and desirability from a customer’s perspective (see Figure 1). It is even more impactful when implemented along with virtual product design tools.

Figure 1. Design thinking at the sweet spot of desirability, viability, and feasibility


Source: Infosys

The design thinking cycle starts from empathy to understand a customer’s needs from their perspective, followed by defining, ideating, prototyping, and validating, in iterative loops. New product development and customer participation encourage collaboration in a virtual environment to practice design thinking. Immersive environments using mixed reality (combinations of augmented reality or AR and virtual reality or VR) create a working environment close to the real world, to identify and correct issues much ahead (see Figure 2).

Figure 2. Virtual tools used across design thinking stages


Source: Infosys

Virtual models of new products accelerate their evaluations to shrink the development cycle time.

 

Design firm IDEO, for example, wanted to perform ethnographic research to capture customer requirements for new products. However, it was difficult to identify key observations from many data points and recreate them later, even with expensive videos or photos. It addressed the challenge through a VR camera.6

Kaleidoscope Innovation, a design and development unit within Infosys, designed a large freezer project using virtual tools. Such projects usually undergo several time-consuming team reviews. The team created a 3D model in a VR environment that helped designers walk around the product early in the design phase, evaluate its usability from multiple perspectives, and tackle proposed changes to design.

This virtual model did not change the overall project plan, but accelerated evaluation and decisions around it, shrinking the product development cycle time. The team selected the best design without spending time and money on physical prototypes.

Automation in Warehouses

Humans work with machines in warehouses. Material handlers carry out order fulfillment along with pick-and-place robots. Workers’ safety in all situations is important.

A leading e-commerce player wanted to validate design decisions for robots working in its order fulfillment warehouses and gain insight into how they could work safely alongside humans. Kaleidoscope Innovation created a virtual environment where employees interacted with robots in different situations. The team created a digital twin to simulate several configurations of robots and their working environment. The company recorded the results and interviewed employees about the pros and cons of each situation.

The VR-based solution provided a cost-effective and safe way for the e-commerce firm to test new concepts in human-robot interaction and capture data and feedback before implementation. It helped managers zoom out and look at the big picture, rather than at one robot or piece of equipment at a time.

Training for Product Usage

Operators need training to work on machines with complex functionality and procedures, to stay safe and productive. VR-based training prepares humans before hands-on operation on a machine. For instance, Rolls-Royce has rolled out a VR-based training kit for its airline customers to manage aircraft engine maintenance and repair.

Infosys’s VR-based program provides step-by-step instructions to train employees in a hospital environment. The program uses physical gestures to simulate actual tasks involved in a job. Gamification with scores and points keeps employees engaged and motivated. Scores reflect an individual’s strengths and weaknesses. Training data is integrated with the central learning management system for records.

A multinational industrial and consumer goods manufacturer wanted to create an e-training platform for its new operators. It had a few integrated assembly lines for its finished items. The Kaleidoscope Innovation team created a virtual training module along the assembly line, one workstation at a time. The team used front-end user interface elements to guide users through equipment operations. It tracked performance metrics in the backend to provide corrective feedback. Best practices from creating the virtual replica of the first workstation were then reused at later stations.

Futuristic Workplaces

While collaborative, remote, and hybrid working have surged since the pandemic, the future lies in three-dimensional virtual and mixed reality workspaces. Organizations benefit from virtual 3D replicas of their workspaces, equipment, products, avatars, or personas. Employee collaboration leads to faster new product development through effective interactions. Teams share ideas, explore, and invent new concepts. Early collaboration among team members in multiple locations enables them to make more informed decisions in the product development process.

Organizations should create virtual replicas of workplaces for human-machine interaction studies from multiple perspectives.

The future of work in healthcare, retail, engineering, and manufacturing is where humans and human-like machines work together. Organizations should proactively create such workspaces virtually and study human-machine interaction from safety, productivity, and employee morale perspectives before any physical implementation.

Resources

  1. Product life cycle cost analysis: State of the art review, Y. Asiedu & P. Gu, 2010, International Journal of Production Research.
  2. Myths About New Product Failure Rates, George Castellion and Stephen K. Markham, 2013, Journal of Product Innovation Management, 30, pp. 976-979.
  3. What Customers Want from Your Products, Clayton M. Christensen, Scott Cook and Taddy Hall, January 16, 2006, Harvard Business School.
  4. How to make sure your next product or service launch drives growth, Alessandro Buffoni, Alice de Angelis, Volker Grüntges, and Alex Krieg, October 13, 2017, McKinsey.
  5. Why Most Product Launches Fail, Joan Schneider and Julie Hall, April 2011, Harvard Business Review.
  6. IDEO: Getting closer to the customer through virtual reality, Lauren, April 27, 2017, Harvard Business School.

 

Designer Centered Design: Humane Design

While “User Experience Design” is often used interchangeably with “User Interface Design,” UX goes far beyond mere interface design to encompass a user’s complete experience of a product, system, or service. For Don Norman, the usability engineer and researcher who coined the term “User Experience,” all aspects of the product experience, “from initial intention to final reflections,” ought to support the user’s needs and desires. Years before Norman came onto the scene, this same concept inspired Jef Raskin, a human-computer interface expert, to define the ideal computing system. Though his vision of a computer, which was little more than a glorified word processor, was uninspired even in its own time, Raskin developed a set of UX design principles, including UI consistency and encouraging users to develop productive habits, that are still relevant today.

“The Canon Cat and the Mac that Steve Jobs Killed,” an article by Matthew Guay, describes Raskin’s desire to create a computer with a humane interface. “An interface (i.e. ‘The way that you accomplish tasks with a product’) is humane if it is responsive to human needs and considerate of human frailties,” wrote Raskin. His goal was to liberate computer users through increased productivity—getting more done in less time. Inspired by Isaac Asimov’s laws of robots, Raskin defined his own laws of computing to achieve this goal:

“A computer shall not harm your work or, through inaction, allow your work to come to harm.

“A computer shall not waste your time or require you to do more work than is strictly necessary.”

Raskin’s second law is applicable far beyond word processing and highlights a common struggle faced by UX and UI designers alike. PowerPoint is a notable example of a poorly designed interface that results in decreased productivity: its predictive toolbar feature attempts to anticipate the user’s needs based on what has been selected. While this feature can be helpful when it correctly predicts the user’s needs, it can be very inconvenient when it guesses incorrectly, adding multiple mouse clicks to the user’s workflow.

Another violation of Raskin’s second law is inconsistency between user interface elements. Consider Apple’s latest iOS update. Previously, incoming text messages appeared at the top of the lock screen. Following the 16.1.1 update, incoming text messages now appear at the bottom of the lock screen. Neither location is objectively right or wrong, except for the user’s previous experiences of seeing new messages at the top. Now users must unlearn a previous habit to relearn a new interaction. Does the new feature add sufficient value to be worth the friction it introduces into the user’s experience?

The quintessential mnemonic “righty tighty lefty loosey” illustrates the socially ingrained understanding of how to lock or unlock a rotating mechanism. This convention becomes apparent when a user encounters an experience that is counter to what they expect. Because a user intuitively expects to turn the mechanism a certain way, requiring the opposite is a source of confusion and frustration.

When designing products, consistency is one of many usability principles, known as heuristics, that act as general guidelines for creating intuitive user interactions. Usability expert Jakob Nielsen, who cofounded the Nielsen Norman Group with our good friend Don Norman, created the most well-known and widely used set of usability heuristics. These heuristics are used by product designers across the globe to design more intuitive and user-friendly products and experiences.

Another key heuristic that Nielsen defined is the match between the design of the system and the user’s understanding of the real world. Imagine a stovetop with four burners arranged in a square and knobs arranged in a line. This creates confusion and tension because the user does not know which knob controls which burner. However, if the knobs are arranged in the same square pattern as the burners, and each knob activates its corresponding burner, users quickly understand which knob to turn to ignite the intended burner.

The ultimate goal of user-centered design is to increase productivity and create an experience that is “responsive to human needs and considerate of human frailties.”  No product is experienced in a vacuum—each user encounters that product within the context of a lifetime of other experiences. Understanding the needs and frailties of the end user empowers designers to create more intuitive, efficient, and enjoyable experiences for users. While Jef Raskin’s Canon Cat was a commercial failure, in a world inundated with widgets, tools and systems—both physical and digital—his concept of a humane interface is perhaps more relevant now than ever.

About:
Headquartered in Cincinnati, Ohio, Kaleidoscope Innovation provides medical, consumer, and industrial clients with full-service insights, design, human factors, and product development. For more than 30 years we have been helping our clients grow their capabilities, gain usable knowledge, and get worthwhile results.

As a full-spectrum product design and development firm, we are an expert extension of your product vision. Our teams collaborate across disciplines, providing specialized input to produce the ideal intersection between function and form. To ensure the soundness of our work, Kaleidoscope houses a full range of test labs, and we employ an award-winning team that embraces every challenge, applying their experience, ingenuity, and passion.


Author

  • Tom Gernetzke

    Senior Industrial Designer | [email protected]

    Tom Gernetzke is a senior lead industrial designer at Kaleidoscope Innovation and has spent the last 12 years creatively bringing new product ideas to life.

Designer Centered Design: Using VR for User Research and Testing

What do you do when you are in the early concept development phase of the design process and want to get user feedback to inform future development? Maybe you would 3D print or hand-prototype your design. Putting an early mockup of your design in the hands of the user for them to assess is an important part of any user-centered design process. But what if your design concept involves autonomy, a UI, or a complex series of physical interactions with the user? Without additional functionality, a physical low-fidelity mockup in this context loses its effectiveness in garnering insight.

“Product design” as a whole has shifted. Increasingly, the objects that we design and use in our daily lives have a component of digital interaction and/or are part of a larger virtual ecosystem. This challenge of gaining early insights from low-fidelity mockups is epitomized when designing something like an autonomous robot. This design process sometimes involves years of hardware and software development for even basic functionality. So how can designers run ahead of this development to put a concept in front of users early enough to inform how such a complex product should be designed to instill trust, engagement, and even enjoyment in these interactions?

Let’s say that we are designing a new autonomous robot to deliver room service orders to guests at a hotel. The first issue to address is how people react to an autonomous device sharing their space. How close is too close? Is there a violation of a social contract by placing this robot in what was otherwise a dedicated space for people? How do they expect the robot to behave? Most importantly, how do you begin to probe those expectations of the customer when hardware and software development are not mature enough to represent the final design concept? You cannot put an engineering prototype in close proximity with the user without creating a potential safety risk. If you were to make a remote-controlled mockup of the robot, how can you truly test user comfort with autonomy when the test subject knows that there is a human in control? And how do those reactions to autonomy change with multiple robots? This is where VR stands out as a remarkably effective tool for gaining insight.

Utilizing VR in complex product interactions allows designers not only to save on the resource cost of hardware prototyping and manufacturing, but also to iterate much more rapidly and push boundaries of comfort with users without ever putting the user at risk. By conducting user testing in VR, you can not only present a complex and interactive product experience to the user, but also transport them to specific environments and scenarios with the push of a button. This enables the development of a guiding model for the design as well as for software development, as VR interactions can inform what does and does not work in interactions between humans and autonomous systems. However, VR still has its shortcomings and is not the definitive means of user testing in product development.

Virtual reality for user testing and concept evaluation is simply another tool in our toolbox as designers and design researchers. While it offers new capabilities for testing and evaluation, there is a major tradeoff between a VR mockup and a physical one… namely the nature of “virtual” reality itself. There is no physical feedback, and while there is a strong sense of depth perception, it is not the same as an actual physical interaction. While augmented reality may better incorporate both the physical and virtual, the virtual assets can stand out as even more artificial than a full virtual immersive experience because of the difference in fidelity of virtual vs real world objects. Does this eliminate the need for physical prototyping and low fidelity physical mockups? No. But VR enables designers and developers to test more complex products earlier in the design process with users where alternative approaches are less feasible due to complexity and cost.

While the role of a designer can be reductively described as “stylist,” I think the true value we bring to a team is as storytellers, both outwardly to the customer/target user and internally. VR enables us to share virtual models without “CAD scale blindness” and to collaborate more seamlessly even while remote. Having a VR headset brings even remote collaborators together and immerses them in a 3D virtual experience, meaning there is less misunderstanding and less room for interpretation than with just a concept sketch, 2D render, or even a 3D CAD model on a screen.

As this technology continues to mature and becomes more accessible, I see the use of VR as an increasingly valuable tool for designers. Where paper and markers gave way to Cintiqs and iPads, I could see CAD modeling and user testing making room for VR modeling, collaboration, and design evaluation. We are entering a new frontier for design and media with VR that will undoubtedly influence how we live and work. Pick up a headset and explore the possibilities for yourself. There is plenty of undiscovered opportunity and impact to be harnessed with this new technology!


Author

  • Nikko Van Stolk

    Lead Industrial Designer

    Interested in evolving processes with new capabilities and new technology. A proven track record of experience working with surgical and industrial robotics. A strong storyteller and team leader who takes the initiative, constantly seeking to exceed client expectations. A good people person and collaborator, able to wade into the thickest CAD assemblies with large engineering teams, with deep experience developing and executing human factors research, and facilitating and engaging in creative problem solving and brainstorms with fellow designers. A full designer’s toolkit of sketching, rendering, CAD modeling/surfacing, and DFM.