Conference Sessions and Reviews

Over the last few years, I’ve been invited several times to give a session at a conference. It’s a fun way to share what I’ve learned and to engage with other engineers from around the world.

But public speaking is a skill you must practice and can improve over time (I hope). The thing you need most in order to improve is good feedback.

What didn’t work

For years, the only ways I gathered feedback were approaching attendees after the session or creating my own feedback form.

A personal feedback form, shared with the audience through a QR code, never really worked for me. It seems attendees don’t have or take the time between sessions to give feedback, don’t trust random QR codes, or (let’s hope) enjoyed the session and saw no reason to give feedback at all. While that last reason sounds fine for the speaker, it really isn’t. First, if you get up on stage and talk for 45 minutes, no matter how confident you are, some positive feedback is nice at some point. Second, neither method lets the feedback flow back to the conference organizers.

Another perspective: conference organizers

I have now served on a program committee a few times, and within minutes I found myself looking for speaker evaluations from the previous year. Who should we invite again, and who should we perhaps not invite again? Quality session reviews help make those decisions.

Now, at this point it is important to understand that a quality review is more than a red or green card, or a single star rating. For example, a lot of one-star ratings can also mean that a session topic did not fit the conference theme, for which the speaker can’t be faulted. Or that the room was way too hot; again, not the speaker’s fault.

High-quality feedback

High-quality feedback rates a few different aspects: the speaker’s performance, topic fit, and session conditions. Ideally, attendees are also encouraged to write one or two lines of text to explain their ratings.

With information like this, conference speakers can improve, and conference organizers can make better speaker selections.

So, what then?

During my most recent sessions, I piloted two different feedback systems: one from Sessionize, the other from SpreaView.

I’m pretty impressed by SpreaView. It is activated (for commercial conferences: purchased) by the organizers, so all speakers at the conference can use it for their sessions.

As a speaker, it provides me with a single pane into all the reviews I’ve received across all the conferences that use SpreaView. For each session I download the QR code, which has a recognizable SpreaView look and feel, so attendees will recognize it immediately.

Attendees can then rate my session on a few aspects, including content, delivery, and interaction. On top of that, it also gives me the conference averages of all these ratings, so I know how I’m doing compared to other speakers, which helps me understand what a rating means in context. Finally, I get to see the top 10 best-rated sessions, so I can see what those other nine (just kidding 😉) speakers did well, try to catch their sessions in the future, and learn from them.

As a program committee member, I now have all this information as well. Both sides get precisely the same information, with the organizer seeing the details of every session at the conferences they organize.

One small point of improvement: add a rating for topic selection, so that program committees know how they are doing.

Sessionize is a SaaS product that lets conference organizers publish a call for speakers and perform speaker selection. Like SpreaView, it’s a paid product for commercial events. Recently, Sessionize added support for generating QR codes that speakers can incorporate into their presentations. There are a few differences with SpreaView, though: the Sessionize QR code has no outline or header attached, so it’s less appealing and won’t have a consistent appearance across all sessions at a conference. A missed opportunity, if you ask me.

As a speaker, I also get far less information from Sessionize than I get from SpreaView: I can’t compare my results to those of other sessions, as there is no top 10 and no averages. Sessionize doesn’t mention it on their website, and I haven’t used Sessionize Feedback as part of a program committee yet, but I do assume they also share the results with conference organizers. In my opinion, that’s something they should state on their website.

Conclusion

All in all, I enjoy the SpreaView system more: a better interface, more detail, and better comparison options. But either way, I’m happy that more options are emerging to gather attendee feedback in a structured way. I hope we will see these systems used at more conferences!
