As a student enrolled in the MBA program at Western Kentucky University, I was tasked, along with four other students, to work with the Suzanne Vitale Clinical Education Complex (CEC) on a survey administration project to help them achieve directed growth. This was a practicum assignment intended to give us some real-world experience, and I can certainly attest that it did just that.
The CEC is a community-based clinic that houses various programs assisting children and young adults with developmental delays. Our objective, as dictated by the CEC, was to create a community needs assessment to determine how effectively the CEC was meeting the needs of our specific community.
The main problem the CEC faced was that it was growing quickly while having only limited resources. Our job, through the community needs assessment, was to determine what the community needed most, so that the CEC could direct its growth efficiently based on the community's prioritization of needs.
It is also worth noting that the groups we were placed in lasted the duration of our MBA program, so we had already gotten to know each other fairly well. It is safe to say that we had surpassed the socialization stage and were gelling as a group. We had been placed in our groups based on our specializations, as determined by our undergraduate degrees, and it had always been the case that we operated without a clear-cut leader. We felt it was best for the person with the most relevant knowledge to take the reins whenever the situation arose.
After a couple of group meetings, we came to a consensus that the most efficient and effective way to take the temperature of the community’s needs was to simply administer a survey. This is the point where we had to make a decision: Do we go for volume by sending out as many surveys as possible, or do we purposely bias our survey pool because of the nature of our community subset (people who care for children or young adults with special needs)? This would affect how we went about the survey administration.
We decided that we would purposefully target that specific subset for our survey sample, as it would yield the most relevant, experience-based results. For us, it was much more beneficial to have fewer, more insightful survey responses than a large volume of surveys filled out by those largely unfamiliar with the topic.
As for the survey itself, we included both qualitative and quantitative items. Most of the questions were Likert-scale items, with an intentional omission of the "indifferent" option so as to prevent any lukewarm answers. We also listed some general needs (based on what the CEC's current programs offered) and had participants rank them in order of importance. This was the crux of our survey, as we could use these results to help us prioritize growth for the CEC.
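Forced-ranking results like these are commonly aggregated by averaging each item's rank across respondents, with the lowest mean rank indicating the highest priority. Here is a minimal Python sketch of that approach; the need categories and rankings are purely illustrative, not the CEC's actual data or our actual method:

```python
# Hypothetical example: aggregating forced-ranking survey responses by mean rank.
# The need names and responses below are made up for illustration only.

# Each response lists needs in order of importance (position 1 = most important).
responses = [
    ["speech therapy", "early intervention", "autism diagnostics", "family counseling"],
    ["early intervention", "speech therapy", "family counseling", "autism diagnostics"],
    ["speech therapy", "autism diagnostics", "early intervention", "family counseling"],
]

def average_ranks(responses):
    """Return each need's mean rank across responses (lower = higher priority)."""
    totals = {}
    for ranking in responses:
        for rank, need in enumerate(ranking, start=1):
            totals[need] = totals.get(need, 0) + rank
    return {need: total / len(responses) for need, total in totals.items()}

# Sort needs from highest to lowest priority (ascending mean rank).
priorities = sorted(average_ranks(responses).items(), key=lambda kv: kv[1])
for need, avg in priorities:
    print(f"{need}: {avg:.2f}")
```

Mean rank is simple and easy to explain to stakeholders, though it treats the gaps between rank positions as equal, which is an assumption worth flagging in any report.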
The survey was administered through hard copies and also via an electronic survey service called Qualtrics. In terms of response rate, we hovered around 30%, which is exceptional in most cases. We attributed this to the bias of our survey pool, which was predisposed to care about the topic.
Overall, the survey administration project proved to be a great experience. It was my first real taste of applied research, and we were able to provide a detailed report to the CEC, who in turn implemented a strategy based on our findings and recommendations. It was quite satisfying to see business applications make a difference in the community.