Bring Trip Feedback Survey onto Mountaineers.org and include course surveys SD31/38
President Dan Lauren created a trip survey mechanism that operates on a platform separate from Mountaineers.org but provides invaluable information for Leaders about how their trips were received. It would be great to surface this information for Leaders on Mountaineers.org itself rather than on another platform with a separate log-in. This could then be expanded to courses, allowing committees to analyze student feedback efficiently through the website as well.
See https://www.mountaineers.org/blog/2-4-books-feedback-search for more info.
Jimmy Klansnic commented
Many good observations appear in the list. I particularly wanted to put a vote for John Ohlson's "(B) Activity questions - each activity (climbing, scrambling, sea kayaking....) has one set of questions which apply for all courses ....in their domain. (Later Phase could permit different questions for different courses, but let's walk before we try to run) ", which I think dovetails with Peter Hendrickson's vote for "piping". This way a response to a given activity area would not be onerously long with inapplicable questions.
It does make sense for Leader and Course surveys to be captured in a common DB for best visibility and use (e.g. by the Safety Committee), and to avoid sending out multiple surveys for the same event.
Brent Colvin commented
We too use Survey Monkey (sometimes Google Forms) in our Basic Climbing class. I'm not sure how generic this can be and still be useful from a course administration standpoint: we request feedback on all lectures and field trips and on instructor effectiveness, and when new techniques are introduced we would like to get feedback on those as well. For specificity we allow a 1-10 rating scale. To get the most candid feedback we also allow the survey to be submitted anonymously, but we encourage names and participation by entering names into a prize drawing (usually a NW Forest Pass). This is a great way for course coordinators to get an idea of what is working well and what might need more attention. Free-form comments are very important; we've had very interesting feedback that's given us a lot of insight. I've never thought of the Managing or Safety Committees as audiences for the responses, and I'm not sure the survey intent is the same from a course coordinator perspective.
Bill Coady commented
I'm a little late to the comments page, but having read all the prior comments I believe each writer has some good ideas. My summary is:
** Start simple, as many writers suggested. One survey format for all trips. One for courses.
** Short: shouldn't take more than 3-5 minutes.
** Choices numbered 1-4 or 1-5; we don't need 1-10. We do need an "N/A" (not applicable) choice.
** Questions should allow the respondent to comment on trip planning by the leader, the leader's leadership style (would I go with this person again?), safety, and trip ambiance or aesthetics (is this a good destination or trip?). Was the course well run? Was it worth the money and time investment? Was I, the student, respected and encouraged? Etc.
** There should be some mechanism for free-form comments that can be either public or perhaps private to the survey administrator or committee chair.
Becca Polglase commented
I would like to see in people's profiles a "my forms and surveys" page, similar to what many advocacy organizations have. When a participant completes a course or trip, that course/trip's survey automatically shows up on their page, and disappears once it's complete. They could also get an email notifying them of it. By going through our website and not an email server, we eliminate the great annoyance of email providers (e.g. Comcast) blocking people from surveys. Once this functionality works, we could potentially do elections voting through people's profiles as well, which to me makes all the sense in the world.
David Shema commented
I currently "harvest" the Trip Participant Survey responses, and distribute these responses to the Safety Committee and the Managing Committee (for further distribution within the Branches).
Speaking as a member of the Safety Committee (but NOT for the committee), I would like to have a question on "Was there a Safety Concern - Yes/No" with another field for a detailed explanation, for both normal trips and field trips. This would be another mechanism available to the Safety Committee to learn about incidents. (Incidents seem to be under-reported by leaders.)
There should be a privacy mechanism for participants who do not wish to have their names publicly associated with their comments. I have been approached by participants and told of incidents that were not otherwise reported. They would not fill out the Trip Participant survey at all for fear that their name would become known to the leader. Currently, I remove participant names before distribution. Identifying information still is available in the raw survey responses.
Leaders should have a method of viewing responses for their own trips/activities. Committee leaders should be able to view responses for all the trips/activities sponsored by their committee. Participants should have the option to block their responses from trip leaders (but not from committee leaders).
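The visibility and blocking rules described above amount to a simple access check. A minimal sketch in Python, with hypothetical field and role names (nothing here reflects an actual Mountaineers.org or Salesforce schema):

```python
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    trip_leader: str           # leader of the trip this response concerns
    committee: str             # committee sponsoring the trip
    blocked_from_leader: bool  # respondent chose to hide this from the leader

def can_view(viewer: str, role: str, resp: SurveyResponse,
             viewer_committee: str = "") -> bool:
    """Visibility rules as proposed: committee leaders see every response
    for their committee's trips (blocking does not apply to them); trip
    leaders see responses for their own trips unless the respondent
    blocked them; everyone else sees nothing."""
    if role == "committee_leader":
        return viewer_committee == resp.committee
    if role == "trip_leader":
        return viewer == resp.trip_leader and not resp.blocked_from_leader
    return False
```

The key design point is that the block flag is consulted only on the trip-leader path, matching the "but not committee leaders" caveat.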
Carefully design the responses so that many of the questions are suitable for statistical analysis, with follow-up questions accepting detailed explanations.
Start small and simple, but design the survey mechanisms to allow for flexibility. Most projects like this morph into something unexpected and much more useful.
Finally, more thought needs to be given as to how to use any information collected. As far as I can tell, Seattle activities do not know what to do with the data.
Past Seattle Climbing Chair, Safety Committee member
Heidi Walker commented
Looking at this, I'm thinking the Hiking Committee will be happy with a general course-satisfaction survey; if course coordinators wish to, they can create something more comprehensive.
Seattle Hiking Chair
This functionality may also be able to accommodate the various forms and election processes that could be incorporated into this same piece of development (youth forms, branch elections, etc.).
Brett Dyson commented
chris - thank you for writing!
i suggest if you want feedback, do not create barriers. having us go to some forum and sign in is a barrier, and definitely a barrier to constructive criticism.
speaking as a sig leader for many years: we have been given little to no feedback, and there are many things that would be helpful:
are our students signed up for their field trips? have our students completed them? did they struggle on anything? did they complete their climbs? when? did they do nav? wfa? did they graduate?
we have, and continue to, fly blind. no feedback. i am skeptical of surveys, as the questions asked skew the results. still, poor feedback is at least something :>)
ps – i have a masters with information management as my area of concentration and i have done a fair amount of programming. from my experience, if it “will cost money and cause a bottleneck when people want to modify the survey,” something is wrong with the people writing the specs, the programmers or both.
Absolute gold. Everyone. Thank you. This input is appreciated so very much. I can't guarantee that every suggestion can be incorporated in its entirety, but I guarantee that every effort will be made. Technical limitations and other considerations will come up, but knowing all of this great thought before going into the development process lets some very smart minds on the technical end do the best work possible without spending $100 on a $5 problem (I personally think this is a $71 issue though, but you get what I mean).
Do I read the general trend as being "keep it relatively short"? Are there suggestions for specific factors to ask about in such surveys besides what's been posted here? We have (1) the technical issues people mention, which I've described in a variety of venues (how to provide flexibility while still letting the feedback reach the people who can use it), and (2) the question of what data to pursue at this point. Keep it coming.
John Ohlson commented
1. Lots of good ideas in these comments. My version follows; it uses some of the ideas already mentioned plus some other material.
2. Keep it simple - 5 minutes max total time to respond to the basic stuff, excluding time for any free form comments.
3. I suggest 3 sections:
(A) General questions - everybody answers these (course, course leader if known, overall value, overall quality...). I would hope that much of this will be set up for them, by sending an email at the end of the course which solicits input once. There is an argument for simplicity in not letting them change their initial comments; allowing later changes is perhaps not affordable and maybe not desirable.
(B) Activity questions - each activity (climbing, scrambling, sea kayaking....) has one set of questions which apply for all courses ....in their domain. (Later Phase could permit different questions for different courses, but let's walk before we try to run)
(C) Free form text for anything the responder wants to say.
4. I spent 20 years as a Professor before going to industry. I had a very simple survey that I used for most of those years, and it gave me everything I needed. The kinds of questions are pretty straightforward. Don't burden the responder with detailed questions; half a dozen questions can gather a lot of information. (Was the instructor prepared? Boring? At the right level? Were the materials good?) You also don't need 5-10 numerical response choices; I usually used 3 or 4. The statistical averaging gives you what you need to know. Everyone thinks they can generate a survey, but look at most of them done by non-professionals - they are too detailed, hard to understand, and take way too long to fill out. Whoever generates ours should have some experience with teaching and surveys - we have lots of teachers and professors in our ranks - get them in on the definition, or at least get them to comment on the drafts.
5. Readers of the data need to be able to sort and look for info by course name, climb name, leader name, activity committee, dates....... Start simple with the two most important of these which are course name and date.
6. There is the issue of who can enter information. It cannot be totally open for anyone to write as many times as they want - that would skew the data. It must clearly be restricted to the participants, with one response each.
7. You might want to establish a separate email address to which people could report serious issues that are more private; this would be processed like a complaint desk. The responder would be asked to use good judgment about whether to write the issue in the free form or to send it as a separate item via email. However, someone will have to read those and respond if necessary, so resources would be needed. Businesses get sued if they do not read and respond to serious issues like harassment, so there are obligations to consider. Above all, do not give someone the opportunity to complain about serious matters unless you intend to read and act upon what comes in!
8. There is the issue of who can see this information and how does this affect what the responders might be willing to say. Some Universities let everyone see the information. Others restrict it in various ways. I tend to favor it being open to instructors and committees, but not to members generally and definitely not to the public.
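The one-response-per-participant restriction in item 6 could be enforced with a simple keyed store. A minimal sketch, using hypothetical event and member identifiers rather than any existing site schema:

```python
def record_response(store: dict, event_id: str, member_id: str,
                    answers: dict) -> bool:
    """Accept at most one response per (event, member) pair.

    Repeat submissions for the same pair are rejected, so no one can
    write multiple times and skew the statistical averages."""
    key = (event_id, member_id)
    if key in store:
        return False  # this member already responded to this event
    store[key] = answers
    return True
```

In a real system the store would be a database constraint (a unique index on the event/member pair) rather than an in-memory dict, but the rule is the same.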
Peter Hendrickson commented
I don't understand why piping cannot be used to expose follow-on items relevant to the specific activity types. That's a vote for option #2 with the piping option. I also strongly recommend that the number of items be restricted, in any case. We've got to think really hard about:
• Who the audiences are for the data
• How the data will be used
• What analysis engine will be employed to generate both data files and dashboards
• How folks might be able to query the data with default and custom queries
• Who will manage this nascent monster
• What kind of working group to convene to come up with common data elements, business rules, data dictionary...
I believe this is more than a feedback loop. Seems to me this leads to the creation of a dynamic DB which both provides contemporaneous feedback and a repository for program evaluation and/or research.
Cheryl Talbert commented
I'm sure that a generic course survey could satisfy most of the feedback needs of everyone running a course, and SurveyGizmo is out there to offer more course-specific information. But we certainly don't want to hit one group of course participants with two surveys, so it would be good to start with the generic survey and see where that gets us first.
Here are the major things I would want to know:
* What was the course
* Who was the instructor
* How easy/difficult was it to find the class and register for the session(s) you needed? Suggestions to make it easier?
* Was there enough space available in lectures and field trips for you to find a spot?
* Was the material included in the course relevant - did it cover the things you felt you needed to know? Significant omissions? Time spent on things you thought you didn't really need?
* Was the material clearly presented? Improvement suggestions?
* Were the presentation materials concise and effective? Improvement suggestions?
* How knowledgeable/proficient was the instructor about the subject? Able to answer detail questions, convey personal examples?
* For a field class:
** Were the field exercises well targeted to teach the skills you expected to learn from the class?
** Were the field exercises run with a high standard of safety?
** Were the field exercises run efficiently? Instructions clear?
** Same question as above about instructor proficiency
** How well did instructions provided before the class allow you to prepare?
* How well did instructors and other support personnel represent the values of the club in terms of professionalism? Appropriate behavior? Listening? Interpersonal skills? Ethics? Anything that made you uncomfortable?
* How would you rate the value you received against the cost of the class?
Melinda Moree commented
Maybe a combination of what was proposed in the email, but keeping it all on Mountaineers.org. Is there a way that everyone could receive a more general list of 4-5 questions, and each committee or group could then add a few questions specific to their activity (assuming those don't change)? If someone could check their group at the beginning of the survey and then see only the questions specific to that activity, it would avoid the lengthy-survey problem of lots of non-relevant questions. I think the sea kayaking group could come up with the few kayaking-specific questions we would want asked across all activities. Then if someone wants something much more specific, they could use a separate survey instrument. I see a lot of benefit for The Mountaineers as a whole in having consistent feedback on all activities, to look for systemic trends. If we did this, there could also be creative ways to encourage participation, like drawings for free memberships or donated gear where completing the survey gives you an entry ticket. Thanks for asking!
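The flow described here (a short general block for everyone, plus only the block matching the respondent's chosen activity) is essentially a lookup, which is also what the piping suggestions elsewhere in this thread describe. A hypothetical illustration in Python; the question text and activity names below are placeholders, not an actual Mountaineers question bank:

```python
# Short general block every respondent sees.
GENERAL_QUESTIONS = [
    "Overall, how satisfied were you with this activity? (1-4)",
    "Would you participate with this group again? (1-4)",
    "How easy was it to register? (1-4)",
]

# One fixed block per activity area; all courses in that domain share it.
ACTIVITY_QUESTIONS = {
    "sea kayaking": ["Was the rescue-practice instruction adequate? (1-4)"],
    "climbing": ["Was belay technique covered in enough depth? (1-4)"],
}

def build_survey(activity: str) -> list[str]:
    """Pipe in only the block matching the respondent's activity, so
    nobody wades through another domain's questions."""
    return GENERAL_QUESTIONS + ACTIVITY_QUESTIONS.get(activity, [])
```

An activity with no specific block simply gets the general questions, which keeps every survey short.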
Margot Tsakonas commented
The shorter, more "generic" course survey works for me. (I have chaired activities 4 or 5 times now and run courses maybe 6 or 8 times.) If needed, we (individual committees) can supplement with more course-specific questions. No point in over-developing and spending a lot of money trying to anticipate every last possible question that anyone would ever want... :-)
Surveys are great but just chatting with students about what is going well, what is not etc. is very helpful also. I have found that when we establish the right atmosphere at the beginning, people are not reluctant later to give some honest input.
As for the trip leader feedback surveys, they are important to me and to other leaders I have spoken with, and I value seeing the feedback. I agree with Dan that if a participant checks that it is OK to share, we should keep sending those to trip leaders (or making them accessible). Committees need to be able to see feedback for all the leaders leading for their activity.
If I get more time I may chime in with some more specifics but just wanted to get this general point of view posted while it is fresh in my inbox.
Felicia Wibowo commented
For the Sea Kayaking Basic class, I, Felicia Wibowo, have created surveys in Survey Monkey. Like every survey, participation is the biggest obstacle. For classroom sessions, we need to know whether we give them enough material to digest, which topics were missing, and whether the presenter presents the material in a clear and concise manner.
For required activities associated with the course, we want to know whether they were given enough knowledge (mainly self and assisted rescue) by the instructor so they can practice the skills themselves.
At the end we would also like to know what kinds of continuing courses/seminars and trips they would like us to offer.
I have given the link and Survey Monkey user ID/password to instructors and presenters so they can check the survey results.
Theoretically, those who have the user ID/password will be able to change/add/delete the questions in Survey Monkey. In practice, though, I will likely solicit their opinions and make the changes myself.
...and add a course-survey component to this project as well. This will allow people to get feedback about all of the many products of their volunteerism (courses and activities), align us with best practices, and create a mechanism for quality improvement efforts at every level of the organization.
[from Dan Lauren]
1. Participant Trip Survey – all participants on a trip receive a survey the following week asking for feedback on how the trip went, leader ratings, and safety concerns. The survey is not anonymous; each response is tied to that trip and that person to assist in follow-up as needed. Survey results are accessible to a controlled list such as the Managing Committee, Safety Committee, and Branch activity chairs and admins. Integrate into Salesforce reporting for volunteer tracking and dashboards.
2. Leader feedback page – when a leader has received feedback on one of their trips via the Participant Trip Survey process, and the respondent has indicated that it is OK to share that response with the leader, the leader will be able to see those survey responses for their trips. Integrate into Salesforce reporting for volunteer tracking and dashboards.
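The share-with-leader rule in item 2 is, at its core, a filter over the response set. A minimal sketch, assuming hypothetical field names (`trip_id`, `ok_to_share`) rather than the actual Salesforce schema:

```python
def leader_feedback(responses: list[dict], leader_trips: set[str]) -> list[dict]:
    """Return only the responses a given leader may see: responses for
    that leader's own trips where the respondent checked that sharing
    with the leader is OK."""
    return [r for r in responses
            if r["trip_id"] in leader_trips and r["ok_to_share"]]
```

The full, unfiltered set would remain visible only to the controlled list in item 1; the leader-facing page would always pass through a filter like this one.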