By Ronnie Wendt

The event was executed flawlessly, leaving the planners and stakeholders with a sense of accomplishment. But planners are not organizing events for themselves. They are planning them for attendees and the organizations they serve.

Beth Schnabel, director of Strategic Events for Community Financial Credit Union in Plymouth, Mich., says it’s an easy mistake to make. “Planners,” she says, “are so close to everything that it’s hard to look through an objective lens. We need to recognize the importance of attendee feedback because we cannot be everywhere at once.”

According to her, the way feedback is evaluated is just as important as collecting it in the first place. Establishing measurable goals for each event is crucial, and evaluating its success before, during and after the event is a must, she says.

Measuring the success of an event requires evaluating what went right and pinpointing areas that need improvement. A comprehensive analysis can include everything from attendee surveys and social media metrics to staff debriefings, she says.

WHY ARE EVENT EVALUATIONS IMPORTANT?

Event evaluations are vital because they set the tone for future events, adds Brittany Nelson, Meeting & Events manager for the American Optometric Association. In this role, Nelson and her team oversee nearly 400 events, evaluating each carefully to continually improve.

The organization’s annual Optometry Meeting hosts over 5,000 attendees. Nelson places great importance on event evaluations for this meeting. “We scrutinize all feedback against measurable goals to determine how we move forward for the next one,” she says. “We closely collaborate with many of our internal departments including our education, industry relations, and communications/marketing teams to achieve a common agenda of success.”

She emphasizes that attendee feedback is a valuable tool for identifying successes and opportunities for growth, enabling these teams to improve future events. The evaluations capture attendance data and provide insights into how attendees perceived the event.

“We look at total attendance,” she says. “We naturally have a goal to increase attendance. But we also want feedback that checks on event quality. It’s nice to measure quantity, but quality is vitally important. Without quality, many people may attend but not return next year.”

Schnabel also says she values attendee feedback. “I’m hyper-sensitive to survey responses,” she says. “I always want to be sure we’re creating an event where everyone who attends feels a sense of belonging and that the event is very accessible to everyone. If there was an obstacle that prevented someone from participating, I want to know about it.”

As important as event evaluations are, Schnabel advises planners not to take comments personally or view them as an affront. “You cannot see the comments as a personal attack on you, but rather look at the information from the perspective of ‘How can I learn and grow from them?’”

SET GOALS

Event evaluations cannot take place without setting measurable goals upfront, adds Schnabel.

Schnabel credits the Event Design Collective through Meeting Professionals International (MPI) for solidifying the importance of asking the right questions of all stakeholders before planning begins. “You need to ask the right questions of the leadership team to make sure you clearly understand the goals and objectives for the event,” she explains.

She recommends questions like: What are we trying to change? What are we trying to accomplish? What outcomes would make this a successful event? “There always needs to be a purpose, that call to action,” she says.

Schnabel suggests using responses to these questions to measure success and identify areas for improvement. “You can take these goals and look for ways to improve. Later, you can go back to your leadership team and say, ‘Here was the event goal, and here is how we met it.’ There are many layers to success, and you need to measure all of them. But the main question always has to be, ‘What was our objective, and did we meet it?’”

WHEN TO EVALUATE

There are different times to evaluate an event — not just when it ends. Event evaluations can occur during the pre-event phase, during the event and after it ends, Nelson says.

The pre-event phase involves budgeting and planning details — such as venues, speakers and ticket prices. The evaluations may consider the previous budget and adherence to it, and audience preferences for meeting type and communication methods.

During the event, evaluations include noting what works and what doesn’t, along with monitoring social media in real time.

Post-event evaluations occur after the event concludes. This is the point when planners can assess the event in its entirety via post-event surveys, budget comparisons, ROIs and attendance numbers.

Schnabel explains that the magnitude and scope of an event determine the timing of evaluations. For larger-scale, signature events, she recommends pre-event, mid-event and post-event evaluations. “This ensures the team is aligned on what we are looking to accomplish,” she says. “I really like to get anecdotal feedback from attendees during the event, then lean towards online post-event surveys for the rest.”

For smaller events, she engages in pre- and post-event evaluations. “It’s always good to hit pause and ask, ‘Are we doing it this way because this is the way it’s always been done?’” she says. “Or are we doing it this way because it helps us achieve the goals and objectives of the event?”

The American Optometric Association evaluates most events after the fact, Nelson says. Evaluations of individual speakers might happen during the event, but attendee-wide surveys occur afterward. “We give attendees time to reflect and answer based on their experiences,” Nelson says. “Then we apply that information to future events.”

The American Optometric Association live tweets during large events with selected hashtags. “This lets us see how people are feeling and their overall engagement and feedback,” she says. “We work with our communications/marketing team to evaluate these things.”

Community Financial Credit Union also considers social media responses. The marketing team focuses on social media analytics, while Schnabel takes anecdotal responses into account. “I like to see attendees eagerly sharing their experiences and saying they cannot wait to come back next year. They are creating buzz around the event and a little fear of missing out for those who were not there,” she says.

GATHERING INFO

Qualitative feedback is based on the opinions and accounts of attendees regarding the event. Schnabel and Nelson both say they gather this information online, choosing digital data collection over paper-based methods.

In a previous position with a different organization, Schnabel partnered with EventMobi to survey attendees through an event app. “Attendees just click on a link in the app to respond,” Schnabel says. “I like to keep the survey as simple as possible. So, unless a speaker requested a specific question, I stick to asking things like, ‘What worked? What didn’t?’ and provide space for additional comments. If the surveys get too granular, you will lose people and not get the feedback you are hoping for.”

Schnabel also has attendees rate their experiences on a scale of one to 10, and anyone who gives a score of one through six is asked for an explanation. “You always want to get a little more information about that low score,” she says.

Nelson sends out an email to attendees following the event to ask about housing, registration, the keynote speaker, exhibit hall, general and closing sessions, and educational opportunities. “We also want to ask them what influenced their decision to attend in the first place,” she says.

Leaving space in the survey for open-ended responses yields usable testimonials, according to Schnabel. “I always ask for their permission to share their responses within the survey itself,” Schnabel says.

She adds that it’s best to send the survey immediately after the event ends, as response rates drop the longer you wait. “That way it’s fresh in their minds,” she says. “I always put a time in the survey too, saying, ‘It will take you less than a minute to finish this survey.’ I think it’s really helpful for people to understand the survey will not take 15 minutes of their time.”

Community Financial Credit Union has also used a QR code to gather attendee responses. However, while some events splash that QR code everywhere during the event, Schnabel prefers to push it out through the app as the event ends.

“When you have the survey open throughout the event, you get a lot of those people who pop on to share one bad experience. Then the results do not look at the event as a whole,” she explains. “I would prefer attendees rate their entire experience rather than just one part of it.”

REMEMBER TO TRACK ATTENDANCE

An event needs attendees, so it’s vital to track attendance. As simple as it seems, Schnabel says some planners may overlook this critical step.

“We target registration numbers and our attendance ratios to make sure we are hitting our numbers,” she says. “It’s really important when looking at the overall picture to understand how many people committed their time to spend with us. It is a huge measure of success.”

Nelson says the American Optometric Association always analyzes total attendance but considers it in tandem with survey responses and other data. “Our events have doctors, student doctors, practice staff, paraoptometrics, including our staff here at AOA comprised of our board members, executive team and interdepartmental staff members, so we like to see the percentage of each that are weighing in.”

Attendance info helps the association customize future events. For example, if their numbers show that doctors send staff with purchasing power instead of attending themselves, they might consider including education for both doctors and staff the next year.

WRAP UP WITH A TEAM DEBRIEFING

A team debriefing is a critical part of event evaluation. At this meeting, planners should discuss the numbers and data and ask the planning and event teams for their perspectives.

Nelson conducts several debriefings a few weeks after the event, while things are still fresh in everyone’s minds: one for each department, including the student sector, education sector, industry relations and the executive team. A final debriefing includes everyone on staff. “This helps us understand what worked well for the entire team and what didn’t and examines how we can work together better in the future,” she says.

The teams review all responses and apply what they learn to future events. Nelson says, “The team looks at what we spent, participation and overall feedback. For example, our wellness classes. How many people actually visited these classes and participated? What was their feedback? How much did we spend on it? Do we feel it was an overall positive for those who participated? Was attendance lower than expected for these classes? Maybe we didn’t market them well enough or maybe we need to reevaluate if we want to do it next year and if we do, do we need to change our strategy?”

Schnabel says successful debriefings focus on two questions: “What were our ‘did wells’ and where are our ‘do betters’? Reframe your thoughts from ‘that was a failure’ to ‘that was a growth opportunity and we added a new tool in our toolkit.’”

Successful events don’t just happen, they both agree. Planners must lean on their team, understand their attendees and really listen to both. “Listening is what will bring your event to the next level,” Nelson concludes.


Ronnie Wendt is a freelance writer based in Minocqua, Wis., who writes for a variety of publications throughout the Midwest.