Upgrading and extending our neuromarketing capabilities

Project Title: Upgrading and extending our neuromarketing capabilities

Project Lead: James Coyle

Email: james.coyle@miamioh.edu

Phone: (513) 529-0483

Affiliation: FSB

Other Team Member Names: Phill Alexander, Neil Brigden

Project Details: The purpose of the project is two-fold, and both objectives are equally important. 1) There is a great deal of demand for our current desktop eye tracker; we are asking for support for a second desktop eye tracker to meet this demand. 2) To extend our expertise in eye tracking and neuromarketing, we are asking for support for a related neuromarketing tool: facial expression analysis. Every year, more and more students and faculty from every School and College on campus turn to eye tracking technology to better understand how people visually process images and digital interfaces. We want to meet these growing needs. It is also important that we give students ways to more accurately understand affective reactions to imagery. In this way, facial expression analysis complements what we can learn from eye tracking.

Problem Project Attempts to Solve: Currently, we can determine how people visually process a wide range of images, interfaces, and other visual stimuli. Thanks to a recent student tech fee award, this now includes interfaces viewed on smartphones and tablets. To complement this physiological data, we rely on self-report data to better understand why people visually process the way they do. Self-report data can be problematic. With facial expression analysis software, we will have a powerful new way of interpreting people's reactions and attitudes toward visual stimuli. We will be able to go beyond interpreting what people were looking at and for how long to also understand how they were feeling as they viewed it. While facial expression analysis has long been a common way of measuring affect, until recently it has been hugely costly in terms of the time it takes to judge facial reactions. Facial expression analysis modules, like the one included in our proposal, take data from cameras in our eye tracking hardware and digitally code facial reactions. In addition, demand for our desktop eye tracker has grown to the point that we often have to tell students to delay their research because our one desktop eye tracker is in use. Thus, we are also asking for a second desktop eye tracker in this proposal.

Professors in the following classes see the eye tracking and facial expression analysis technologies discussed in this proposal as an opportunity to enhance their classrooms:

ART 351 Design Systems (20 students): Students can use the technologies to analyze the materials they create in natural settings and on multiple formats.
ART 354 3-D Design/Multi-Disciplinary Studio (40 students): Students can use the technologies to analyze the materials they create in natural settings and on multiple formats.
ART 452 Senior Degree Project (20 students): Students can use the technologies to analyze the materials they create in natural settings and on multiple formats.
ART 453 Highwire Brand Studio (20 students): In this capstone class, marketing and graphic design students design a communications campaign for a client. As part of that process, students can use the technologies to evaluate the effectiveness of their campaign materials in natural settings and on multiple formats.
ART 650 Core Design Studio (currently 6 students; projected to have 9-12 next year and 20-24 in two years): The class currently includes an eye tracking theme. Students would be able to extend their study with hands-on experience with the hardware and software.
ENG/IMS 426/526 Developing and Publishing Digital Books (20-25 students): Students can use the technologies to assess the digital reading experience of materials designed for the course's major project.
ENG/IMS 411 Visual Rhetoric (2 sections per semester; 25 students per section): Students can use the technologies to evaluate a website design, one of the course's major projects.
IMS 413 Usability and Digital Media (1-2 sections; 25 students per section): Students evaluate the effectiveness of an organization's interface and then develop a prototype that improves the user experience. Increasingly, clients need feedback on smartphone and tablet apps. The technologies requested in this proposal would greatly improve our students' ability to provide that feedback and to evaluate the prototypes they design.
IMS 466 Critical Game Development (12 students): Students can use the technologies to evaluate gameplay in the games they develop.
IMS 487 Game Design and Implementation (20 students): Students can use the technologies to evaluate gameplay in the games they design.
MKT 335 Marketing Research (5-6 sections per semester; 25-35 students per section): Students can use the technologies to better understand reactions to paid marketing communications; all students in all sections of this class currently work through a two-week eye tracking module using our eye tracking technology.
MKT 435 Branding and Integrated Marketing Communication (3 sections; 28 students per section): Students can use the technologies to evaluate the effectiveness of materials developed for class projects.
PSY 375 Perception, Action, & Cognition (20 students): Students in this class can use the technologies to better understand visual processing of and affective reactions to a variety of stimuli.
PSY 453/553 Human Factors and Ergonomics (20 students): Students in this class can use the technologies to better understand human factors theory and principles by studying visual processing of and affective reactions to a variety of stimuli.

FINAL NOTE: About 550 students per semester (roughly 1,100 per year) are enrolled in the classes listed above and can benefit from eye tracking and facial expression analysis.

Does this project focus on Graduate Studies?: No

Does it meet tech fee criteria?: The combination of eye tracking and facial expression analysis gives students a remarkably innovative way to better understand behavior toward, and attitudes about, the digital interfaces and communications that govern how we work, learn, study, and play. With very little intrusion, students will be better prepared to assess and design effective digital materials and interfaces by getting accurate biometric feedback from research participants. Importantly, students will be able to conduct their research in the field, where people naturally encounter these images and interfaces.

How will you assess the project?: Because classes will vary widely in what faculty hope to achieve by using these technologies, Professor Jim Coyle, who manages the current eye tracking technology, will meet with individual professors to develop a plan for assessing how well learning goals are achieved. This approach has worked well with the marketing research classes: each semester, Professor Coyle meets with faculty teaching sections of that class to review procedures and discuss learning outcomes.

Have you received tech fee funding in the past?: Yes

What results were achieved?: I applied for and received Tech Fee grants in 2012 ($45,058) and 2015 ($53,010), and I submitted final reports for both. Because of the funding, we have been better able to serve a wide range of student projects. For example, with the support we received last year, students can finally evaluate how users visually process interfaces on smartphones and tablets. As a result, we continue to see rapid growth of eye tracking across curricula. For example, every semester students from all sections of the Marketing Research class (MKT 335) in the Farmer School of Business use the technology as they participate in a two-week eye tracking module; about 125-175 students will participate in the Spring 2016 semester and between 150 and 210 next Fall. Another example is the number of capstone classes that typically use the technology. From the AIMS capstone to Highwire, students from across the University count on using eye tracking to develop communications campaigns and digital solutions for client projects. IMS 413 (about 80 students per year), a class entirely devoted to the study of usability, uses the technology extensively. Last semester, students on two client projects in the class were able to analyze how users of the clients' apps visually process digital content. This marked the first time the study of apps with eye tracking was possible, and it was possible only because the 2015 proposal was funded.

Did you submit a final report?: Yes

What happens to this project in year two?: In subsequent years, we will need to pay for service contracts that will also include software upgrades. Our expectation is that these ongoing costs will be shared across departments that use the technologies. We now have a critical mass of classes in several academic areas that benefit from the technology such that the service costs will be relatively low.

Software: Tobii Enterprise software (Tobii); iMotions software (iMotions), $29,135

Hardware: Tobii Enterprise hardware (Tobii); Dell Precision Mobile Workstation M6500 Advanced (Dell), $41,610

Contracts: Shipping for eye tracker, $587.50

Total Budget: $71,332.50
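As a quick check, the Software, Hardware, and Contracts line items above sum to the requested total:

\[
\$29{,}135 + \$41{,}610 + \$587.50 = \$71{,}332.50
\]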

Comments: The online form only allows me to enter two other project team members. In addition to Phill Alexander and Neil Brigden, who are listed on the form, the following faculty have also asked to be included as team members:
Matt Board
Tim Lockridge, English (CAS)
Mike McCarthy, Marketing (FSB)
Silas Munro, Graphic Design (CCA)
Bob DeSchutter (CEHA and AIMS)
Jay Smart, Psychology (CAS)