Access to Academic Materials for Post-Secondary Students with Print Disabilities

CONSTRUCTION AND DISSEMINATION OF THE SURVEYS

Objectives

The central objective set out at the start of this research project was to gather information on the accessibility, availability, timeliness and quality of educational materials in alternate formats for post-secondary students with print disabilities. The surveys used in our research were constructed with this objective in mind. The questions were formulated to help NEADS identify gaps in the process of producing alternate formats for students. They were also designed to allow respondents to suggest ways in which the process could be improved, thereby informing recommendations intended to streamline the delivery and improve the quality of these materials.

Production

The project was developed by the National Educational Association of Disabled Students in close consultation with the Council on Access to Information for Print Disabled Canadians and the Learning Disabilities Association of Canada. Other partner groups were then invited to join a steering committee. In consultation with the NEADS board of directors and the steering committee, it was decided to begin developing the Access to Academic Materials surveys in the winter of 2004. The goals of the project were to understand the current state of alternate format production in Canada and the opinions of students and service providers regarding the standards, quality, and effectiveness of the various technologies and services available to students with print disabilities at Canadian post-secondary institutions. NEADS hired a consultant, Dr. Liam Kilmurray, to construct the surveys and to produce a statistical package for their analysis. The surveys for this project built upon the 1999 NEADS survey, Working Towards A Co-ordinated National Approach to Services, Accommodations and Policies for Post-Secondary Students with Disabilities: Ensuring Access to Higher Education and Career Training, but focused specifically on students with print disabilities. The steering committee aided in the construction of the surveys. Its membership represents a core of expertise in the area of library services, alternate formats and technologies, and includes students with disabilities, service providers, and representatives of stakeholder organizations. The steering committee provided invaluable advice regarding the construction of the project surveys.

The steering committee met on several occasions with the national co-ordinator and the consultants to identify areas of concern and interest, and to clarify questions and definitions. It was through this process that the surveys emerged. Also invaluable in the process was an online project forum that enabled participants (the steering committee, national co-ordinator and consultants) to exchange ideas and information. An essential element in the construction of the surveys was pilot testing. Six students and five service providers participated in this aspect of the research: they completed the surveys and returned them with comments, and on the basis of this feedback changes were made and final versions of both surveys were developed. Alternate format student questionnaires were professionally produced by T-Base Communications.

Dissemination

Once the surveys were finalized, the NEADS list of disability service providers at post-secondary institutions was used to contact schools about participating in the project research. The contact person at each institution (usually the service provider, occasionally the librarian) was telephoned and asked to provide the number of students with print disabilities at that institution. The contact person was also asked to indicate how many copies of the survey they would like to receive in each format; for example, how many Braille, large print, diskette, or cassette tape versions they would need. From this information, and after a round of follow-up phone calls and contacts, the numbers of the various survey formats were collated and a master list was constructed from which the surveys would be mailed out.

Over a period of several weeks, staff at the NEADS office at Carleton University organized a mass mail-out of surveys to the participating institutions. The objective of the initial phone-around was to enlist service providers at the schools to complete the service provider questionnaire and to distribute student surveys to the students with print disabilities registered at their offices. Students and service providers were asked to complete the survey within one week of receiving it. Online versions of both surveys were also made available through the NEADS website.

While waiting for the surveys to be returned, the process of constructing the statistical files for eventual analysis of the data proceeded. SPSS was chosen as the main software for entering and analyzing the data. Once the surveys were returned, the data were entered, and the consultant prepared several presentations of preliminary data to iron out any discrepancies that emerged. The final stage was the production of this report, in which the statistical data are presented in charts and graphs, and the open-ended and commentary information is analyzed and used to complement the statistical findings.

Methodology

The methodological approach to this survey was straightforward: surveys were sent out on a convenience sample basis. That is, a list of institutions across Canada was constructed from NEADS’ disability service provider list, and the service providers (or librarians) were contacted and asked how many surveys they felt would be needed for the participation of the students at their institution who fit the profile of print disability. From the numbers supplied by the colleges and universities, and the breakdown of survey types requested (e.g. Braille, large print, cassette, French language), packages were assembled and sent directly to the service providers for dissemination.

With respect to the response rates to the surveys, it should be noted that this methodology carried the risk of overestimating the target student population. There were also instances in which an institution simply requested some of each available survey format, apparently with the intention of disseminating them to students as particular formats were requested. This occasionally resulted in more surveys being mailed out than could ever be returned by the institutions concerned. This is one of the inevitable results of a convenience-based dissemination methodology. While it does not affect the validity of the responses received, it does lower the apparent response rate when measured against the number of surveys mailed out.

General Overview

A total of 2,613 surveys were sent out to students and service providers.

Forty-nine institutions participated in the student survey, with 130 respondents. Fifty-five institutions participated in the service provider survey, with 67 respondents, two of whom did not identify their institution.

The total number of responses received to both of the ‘Access to Academic Materials’ surveys was 197. Of this number, 67 were returned service provider surveys and 130 were student surveys. For both surveys, we received responses from universities, CEGEPs, technical vocational institutions and community colleges; seven respondents identified themselves as being from university colleges (primarily in BC). These institutions ranged from the largest universities to the smallest community colleges. Responses to the student survey were received from 49 separate institutions. For the service provider survey, 55 separate institutions were represented; two respondents did not identify their institution.
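As an illustration only of the response-rate caveat discussed under Methodology, the crude combined response rate implied by these totals can be calculated as follows. This is a rough sketch, not part of the original analysis: the report does not break the 2,613 mailed surveys down by questionnaire type, so separate student and service provider rates cannot be derived from these figures alone.

    # Rough illustration (not part of the original analysis): crude combined
    # response rate implied by the totals reported in this section.
    surveys_mailed = 2613                   # total surveys sent out
    responses_received = 130 + 67           # student + service provider returns
    rate = 100 * responses_received / surveys_mailed
    print(f"Combined response rate: {rate:.1f}%")   # roughly 7.5%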

Students from Ontario represented the largest share of student respondents (48.46%), followed by Alberta (13.08%), Quebec (10.77%), and British Columbia (9.23%). The highest number of service provider responses came from Ontario (25.76%), followed by Quebec (18.18%), British Columbia (16.67%), and Alberta (15.15%).

Limitations of the Research

As with all surveys, what the research is able to say is bounded by the information provided by the respondents. Statistical analysis can combine the responses of many people and compare, contrast and cross-tabulate various areas of information; in the final analysis, however, this study is limited by both the responses provided and the questions asked. One of the issues to emerge from the student profiles was that more students reported a learning disability (47%) than reported being blind or visually impaired (21%). The higher number of students with learning disabilities was also evident in the CILS report, where some two-thirds of students have a learning disability and one-third a visual disability (see Appendix 2A, ‘student profiles’). A similar situation prevailed in the 1999 NEADS survey, where students with learning disabilities outnumbered those with print disabilities by a ratio of over two to one. That said, the preferred formats identified by student respondents were affected by the types of disabilities represented in our respondent group. While students with learning disabilities and blind/visually impaired students may often use and benefit from the same types of electronic format materials, students with learning disabilities almost never use Braille texts unless they are also blind.

In addition, there were many questions in both surveys where the respondent could check more than one box, that is, report more than one response. For example, under question 12 of the student survey (please state the nature of your disability/disabilities), a respondent might have both a medical disability and a visual disability, or a learning disability and a mobility disability. In such cases, the number of responses to a question may be higher than the number of respondents; to represent this statistically, we added up the total responses and expressed each category as a percentage of that total, as was done with question 12 of the student survey. The end result of allowing multiple responses is that we can get closer to identifying the respondents, who are not forced to choose between categories in an either/or scenario but can identify the range of disabilities, or technologies, that best describes them and their needs. Despite any limitations of this survey, the reported number of students with disabilities attending the institutions of service provider respondents is 22,250, of whom 4,218 were identified as students with print disabilities. The survey data are therefore based on a very broad population.
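A minimal sketch of the multiple-response tabulation described above is given below. It does not reproduce the project’s actual SPSS procedure, and the category names and answers are hypothetical; it simply shows how each category becomes a percentage of the total responses rather than of the number of respondents.

    # Minimal sketch (hypothetical data, not the project's SPSS procedure):
    # tabulating a check-all-that-apply question by summing all responses and
    # expressing each category as a percentage of that total.
    responses = [
        {"learning"},                 # respondent reporting one disability
        {"visual", "mobility"},       # respondent reporting two disabilities
        {"learning", "medical"},
    ]

    counts = {}
    for selected in responses:
        for category in selected:
            counts[category] = counts.get(category, 0) + 1

    total_responses = sum(counts.values())   # 5 responses from 3 respondents
    for category, count in sorted(counts.items()):
        print(f"{category}: {100 * count / total_responses:.1f}% of all responses")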
