For over a year, The Public Library Blueprints blog has used Public Library Annual Report (PLAR) data to demonstrate different methods for data analysis and visualization. Hopefully, readers have found a piece or two of PLAR data to spark their interest or even inform a decision. We also want to recognize that library staff collect much of their own data on programs and services specific to their library. When libraries want to hear directly from the communities they serve, what data collection method will they often turn to? Surveys, of course! So, let’s take a quick break from data analysis and visualization and return to this tool for gathering library data. Even flawless analysis and impeccable visualizations will fall short if the survey was not built with accessibility in mind.
The validity of a library survey depends on its ability to gather data that accurately depict the communities served by the library. These communities include people with disabilities. Recent data shows that around 13% of the population of the United States is living with a disability, so the value of an accessible survey cannot be overstated. Accessible surveys are designed so people with a range of backgrounds and abilities can fill out the survey without undue inconvenience, ensuring their opinions are heard. A survey that is not accessible prevents some people from taking it, excluding people from historically marginalized groups, such as those with a disability. Basing decisions and planning on such a data set can perpetuate systems and processes that disregard an entire segment of the population. Creating accessible surveys allows library staff to hear from all the communities they serve.
Fundamentals of Accessible Surveys
In addition to producing representative data sets, accessible surveys are also necessary to comply with the law. This year, digital accessibility compliance becomes mandatory for state and public entities. Making a fully accessible document can feel overwhelming at first, so in this post we’ve broken down survey accessibility into three parts:
- Clear features
- Welcoming language
- Navigable for all
Clear features
Three features can greatly affect a survey’s accessibility: text, design, and images.
Text
Regardless of whether a survey is printed or online, it’s necessary to consider its text size, font, and color contrast. Luckily, the ability to control these characteristics is standard in any online survey builder. Obviously, text needs to be large enough to read easily, but what counts as “large enough”? Since this will vary from person to person, it’s best to err on the larger side, likely 12 point or greater depending on the font used. If the survey is distributed online, ensuring the platform allows people to magnify the page to their preferred size is also a great solution.
Simple fonts are the most accessible, so skip the elaborate curlicues on surveys no matter how pretty they might be. For online surveys, sans serif fonts are considered easier to read. These fonts lack serifs, the small lines attached to the ends of letters, which are generally reserved for large blocks of printed text. It’s easy to find lists of accessible fonts online, so even when limited to simple fonts, it’s still possible to find an option that matches a survey’s style.
Design
Beyond the text itself, how a survey is designed also impacts its readability. Leaving an appropriate amount of white space around questions and answer options guides the respondent through the survey and reduces the chance that survey takers become overwhelmed and overlook something. Leaving enough space is especially important for open-ended responses on printed surveys: we all have differently sized handwriting, and a cramped answer box could discourage some people from sharing everything they might want to.
Another aspect of design that has a large impact on accessibility is the use of color. There must be sufficient color contrast between a survey’s text and its background. Black text on a white background is the most accessible choice. If other colors are preferred, there are many resources online for checking color contrast. Previous Library Research Service (LRS) posts have also covered accessible color combinations.
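For survey builders who work directly with a web page’s colors, it can help to see how contrast is actually measured. Below is a minimal sketch, in Python, of the contrast-ratio calculation defined in WCAG 2.x; the function names and example hex colors are our own, but the formula and the 4.5:1 threshold for normal-size text come from the guidelines.

```python
# Minimal sketch of the WCAG 2.x contrast-ratio calculation.
# Function names and example colors are illustrative; the formula and the
# 4.5:1 threshold for normal-size text come from the WCAG guidelines.

def relative_luminance(hex_color: str) -> float:
    """Convert a hex color like '#1A1A1A' to WCAG relative luminance."""
    hex_color = hex_color.lstrip("#")
    channels = [int(hex_color[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in channels]
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground: str, background: str) -> float:
    """Contrast ratio between two colors; WCAG AA asks for at least 4.5:1 for body text."""
    l1 = relative_luminance(foreground)
    l2 = relative_luminance(background)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives 21:1, the maximum possible ratio.
print(round(contrast_ratio("#000000", "#FFFFFF"), 1))
# A light gray on white falls well below the 4.5:1 threshold for body text.
print(round(contrast_ratio("#AAAAAA", "#FFFFFF"), 1))
```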
Images
Surveys don’t generally include many important images, but it’s worth noting that images are only accessible if they include alternative text. Alternative text is a short description of what an image shows, enabling screen readers to describe it to their users. Because images are not often a key survey component, and great resources already exist on writing alternative text, we won’t go into much detail in this post. Alternative text is not required for purely decorative images; these can be marked as decorative (or given empty alternative text) so screen readers skip over them. Logos on online surveys should have alternative text that states the name of the organization.
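If a survey lives on a web page you control, a quick automated scan can flag images missing alternative text before the survey goes out. The sketch below assumes the survey’s HTML is saved locally (the file name is hypothetical) and uses the BeautifulSoup library; most survey platforms won’t require this, but it illustrates what an alt-text check looks like.

```python
# Sketch: flag <img> elements with missing or empty alt text in a survey page.
# Assumes the survey HTML is saved locally as "survey.html" (hypothetical name)
# and that the beautifulsoup4 package is installed.
from bs4 import BeautifulSoup

with open("survey.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

for img in soup.find_all("img"):
    alt = img.get("alt")
    if alt is None:
        print(f"Missing alt attribute: {img.get('src')}")
    elif alt.strip() == "":
        # Empty alt text is acceptable only if the image is purely decorative.
        print(f"Empty alt text (decorative?): {img.get('src')}")
```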
Welcoming language
Plain and Simple
As vital as it is for a survey to be visually accessible, being easily understandable is equally important. A representative data set stems from a survey taken by people of all different ages, education levels, and perspectives. So even if library staff understand the survey, members of the public may not. Acronyms that are familiar to library staff are likely to trip up respondents and could cause them to skip or misinterpret questions. Avoid acronyms, and use plain language free of jargon. Keeping questions short and simple will lead to more accurate data and make the survey welcoming to more people. This is a matter of accessibility as well as basic courtesy. A questionnaire with confusing questions can make users uncomfortable or frustrated.
Sensitivity
A clear question can still deter survey takers if it reads as biased, assumptive, uninformed, or offensive. Naturally, certain survey topics will require greater care than others. For instance, a short evaluation form for a children’s program probably doesn’t run much risk of offending, but the library may ask for demographic information within the survey to check whether it’s collecting a representative sample of responses. Demographic questions and answer options need to be considered carefully. These types of questions should include all possible answer options, refer to groups of people by their preferred names, protect people’s identities, and make it clear that self-identifying is optional.
There are also times when close attention must be paid to the language throughout an entire survey. For example, at LRS we are conducting research on library services to incarcerated people through the PRISM project. When creating a survey for people who are or were incarcerated, we needed to ensure that we used trauma-informed language. The Department of Corrections (DOC) uses many unfamiliar terms, as well as words that carry different meanings in that context. To make the survey understandable and ensure it would not cause harm, we enlisted sensitivity readers who had experience with the DOC. They helped us evaluate our language through the lens of their first-hand experiences.
The key to accessible surveys is remembering that everyone has a different set of perceptions and experiences. What is clear to you might not be clear to survey respondents. It’s also important for the survey builder to know the survey topic well, to ensure each question is necessary and will lead to valuable data. By asking people to fill out a survey, you are asking for their time. A smooth and easy survey experience will yield more complete and accurate data. It’s also considerate of people’s time to share what the survey is about and how long it will take at the beginning of the survey.
Navigable for all
We discussed earlier how screen readers interpret images by reading alternative text, which enables people with vision impairments to understand the image content. But screen readers read much more than alternative text. They are an essential tool that allows people with certain disabilities to navigate through everything displayed on a screen. Unfortunately, screen readers are often limited by how a page is set up. For example, a survey in PDF format must have all of its elements “tagged,” and items must be set in the proper reading order to make sense when interpreted by a screen reader.
This means that, when building a survey that will be taken on a screen, the layout and formatting of the page should be considered. Web pages and documents must have correct formatting applied to titles and headings; with proper headings in place, screen reader users can quickly navigate through the document. The layout of a page also determines whether a survey is navigable using only keyboard controls, a necessity for accessible online surveys because people with mobility limitations may not be able to use a mouse. Using only a keyboard, survey takers should be able to move between sections and individual questions and to select or input answers. Exactly how to ensure a survey is navigable will depend on the survey platform used. Survey builders such as Google Forms or Microsoft Forms have built-in accessibility features and are compliant with the Web Content Accessibility Guidelines (WCAG). Using a survey-building tool that prioritizes accessibility, tests surveys for accessibility, and practices continuous improvement will streamline the process. But there are still things survey builders should always be aware of and check themselves.
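One such self-check, for surveys published as web pages you control, is to list the page’s headings in document order and confirm the levels don’t skip around, since many screen reader users navigate by headings. The following is a rough sketch under the assumption that the survey page is available as a local HTML file (the file name is hypothetical), again using BeautifulSoup.

```python
# Sketch: print a page's headings in document order and flag skipped levels
# (for example, jumping from <h1> straight to <h3>), which can make screen
# reader navigation confusing. "survey.html" is a hypothetical local file.
from bs4 import BeautifulSoup

with open("survey.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

previous_level = 0
for heading in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
    level = int(heading.name[1])
    text = heading.get_text(strip=True)
    marker = "  <- skips a level" if previous_level and level > previous_level + 1 else ""
    print(f"{'  ' * (level - 1)}h{level}: {text}{marker}")
    previous_level = level
```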
Even on the most accessible survey platforms, there may be types of questions that are difficult or impossible to make accessible. Anything that could be difficult for screen readers to use should be avoided altogether. This includes (but is not limited to):
- Questions that incorporate images, videos, or maps
- Drag-and-drop answer formats
- Slider question and answer formats
- Questions that require the respondent to use a mouse to draw or write something, such as a signature
Even with the correct formatting and question types, it may still take people with disabilities longer to navigate a survey. But there is information we can incorporate into a survey to ease the experience. For example, if a question requires an answer, or an answer in a specific format, state that clearly within the question; navigating the survey only to be asked to go back and fix an answer can be cumbersome for users of assistive technology. If possible, include an option to save and continue later so people can take a break from the survey if needed. These suggestions are just a couple of examples of how accessibility benefits all people taking a survey, not just people with disabilities.
Considering Communities
A survey builder should also take into account the different languages spoken in the communities being surveyed. You may need to have a survey translated into Spanish and/or other languages spoken in the area to receive truly representative answers. Also, be sure to pay attention to the variety of devices people will have access to. A survey may be accessible on a laptop but unreadable on a mobile device if the design is not adaptable to different screen sizes. Not everyone will have a personal laptop or computer to use. Many people may prefer to use their phone regardless, so a survey must stay accessible across platforms when distributed online.
The general guidelines shared above are a starting point to creating accessible surveys, but there may be times when additional steps need to be taken to ensure accessibility for certain target populations. For instance, when distributing the Colorado Talking Book Library (CTBL) biennial evaluation to patrons, LRS collaborates with CTBL’s staff to give patrons the option to call the library and fill out the survey with the help of a staff member. For CTBL patrons, having this option available is necessary to ensure we hear from as many patrons as possible.
Final Thoughts
As we all work towards creating accessible surveys and materials, there’s a lot to learn. Nobody is going to implement accessible practices flawlessly from the beginning. But we all benefit when we do our best to ensure we’re hearing from a representative sample of all the communities we serve. At LRS we’ve surveyed many diverse populations, but we are still learning and refining our processes with each new survey we make. Our last piece of advice is to develop a habit of considering accessibility from the very beginning of a project. Surveys will be more successful if accessibility is prioritized from the start and kept front of mind every step of the way.
LRS’s Colorado Public Library Data Users Group (DUG) mailing list provides instructions on data analysis and visualization, LRS news, and PLAR updates. To receive posts via email, please complete this form.