If you know me, it’s no secret I’m a big fan of Nearpod. This article is a response to a post on Twitter by Wendy Torres, and I think she is doing an incredible job advocating for all learners (and instructors), specifically highlighting some aspects of accessibility that, in spite of actually being federal law, are largely overlooked.
I have two very different responses to Wendy’s assessment of Nearpod’s accessibility. Today I will write in defense of Nearpod and explain why, in many respects, I think it is very accessible.
However, in my next article, which I hope to write tomorrow (though it is New Year’s Eve, so it will most likely take me a few days), I will switch sides and completely agree with her assessment!
I can do this because accessibility is such a huge and complex thing, and it is just one very important piece of a very complex process we call education.
So let’s begin.
Special Educators
When I was a special education teacher, I would sometimes assess my job as, quite frankly, impossible to actually do. The number of demands simply exceeds the time a special education teacher has to accomplish them. I was fortunate enough to teach content that I actually knew well, so co-planning with my general education partners was very equitable. But planning accommodations and modifications is just one of many pieces of the job. Without going into an exhaustive list of all of the demands of being a special education teacher, suffice it to say, something had to give somewhere. One either excelled at case management, instruction, or differentiation, among other things, but no one was honestly expected to master all of them all of the time.
In fact, I hope that this time has passed, but when I left the classroom, there were people who would conduct observations, and in their minds the highest compliment they could give was to say, “I couldn’t tell who was the regular education teacher and who was the special education teacher.” Of course I knew what they were getting at: the seamless teamwork. But think this through. If that’s the case, who is doing the accommodating? Who is doing the intervention? Who is gathering the mountains of data required by most IEPs? In short, who is fulfilling the diverse, individualized demands of all of the IEPs in the room?
All Means All
This is what my mind flashed back to as I read Wendy Torres’ blunt assessment of Nearpod’s accessibility scorecard. I was struck by just how difficult it is to do the right thing. That is, to create and maintain educational tools that are 100% equitable, that are accessible to ALL. (For example, “all” means the teachers creating the learning experiences as well as the students who will use them. I don’t always consider accessibility for the creators. Those two links are to her tweets linking to those assessments.)
Nearpod Reports
And yet, I frequently hold up Nearpod as an exceedingly accessible learning platform. When I introduce educators to Nearpod, I often confess my least favorite piece is the student reports you get after a session. Nearpod has many formative assessment tools built in, but I do not think of Nearpod as an assessment tool primarily. It is an instructional platform that includes formative assessment, but those assessment results are most useful during a lesson, not afterward. Nearpod’s reports are really a victim of how robust a platform it is.
Think about it this way. If I give you all multiple choice questions, my report will look fabulous in a spreadsheet. Even if I sprinkle in some open-ended questions, a spreadsheet still looks good. It’s just that some columns will be very wide, because you typed an answer into them. Now, however, let’s add drawings, matching activities, audio recordings, video recordings, and fill in the blank questions! Obviously this is not going to look good in a spreadsheet. In fact, try to even imagine a succinct report of all students’ results with such a variety of question types. What would it even look like? So the fact that I personally don’t happen to like Nearpod’s after session reports is a direct result of the incredible variety of question types Nearpod offers.
Immersive Reader
One component of accessibility is the delivery of content to students. Another is the way students can respond to show what they have learned.
Nearpod was one of the first platforms outside of Microsoft Office/Office 365 to adopt Immersive Reader. That means much of the content of your lessons can be accessed by a majority of students. All ten of the assessment-type slides support Immersive Reader, which is an entire suite of reading accessibility tools.
It is very much aligned to UDL (Universal Design for Learning). Immersive Reader is universally available to all users, and each can choose which components to turn on or off. There are several features that address visual components of reading. Several address decoding, several address comprehension, and it also translates into over 80 different languages and dialects. It’s called “Immersive” Reader because it removes all extraneous information from the screen, showing just the text to be read. That matters much more in other applications than it does in Nearpod, but the feature can be helpful for students with attention issues. Immersive Reader addresses more than just students who struggle with decoding.
Slide Creation
If the teacher creates all of their slides within Nearpod, then all of the Immersive Reader functions are available for all of the text in those slides. If the teacher uses the Nearpod add-on for Google Slides, then the text also opens in Immersive Reader in Nearpod.
Side note
This is a bit of a mystery to me. Immersive Reader is a Microsoft product that started in OneNote. As such, OneNote is the only place where Immersive Reader reads text from images. Everywhere else, it reads text, not images of text. After Microsoft added Immersive Reader to all of their products, they gave it away to any other company that wanted to use it, and Nearpod was one of the first. Somehow, Nearpod and/or Google figured out how to get Immersive Reader to read text from an image of a slide when you send your slides from Google Slides to Nearpod, using the Nearpod Add-on for Slides. So this Microsoft product works better in Nearpod via Google than it does in many Microsoft products. Go figure.
Audio Recording in Nearpod
Teachers can also add audio recordings to pretty much any slide type in Nearpod. Seven of the ten question types allow you to choose from six different types of “reference media”: you can add an image, a video, a website, a PDF, uploaded audio, or recorded audio. So if you don’t have an obvious need for other reference media (e.g., students needing to see a chart or graph to answer the question), you can record yourself reading the question and answer choices aloud as your reference media, so students can hear the question read aloud (in their teacher’s voice) without even having to open Immersive Reader.
One of the main tenets of Universal Design for Learning is providing multiple means of representation of content. Nearpod simply has so many different ways to present content that it is easy to apply this UDL principle.
Draw-It
Another central tenet of Universal Design for Learning is multiple means of action and expression. As I have already stated, there are ten different formative assessment types within Nearpod. My personal favorite is the Draw-It activity, because there are so many different use cases for it, as well as so many different options for students to respond. They can choose to upload an image, use inking to draw a response, type a response, or handwrite a response. Really, they don’t even have to choose between them, because they can use more than one of these response types to answer a single question. If they’re using a device with a camera, they can even take a picture of work they have done non-digitally and upload that as their response. There are just so many options for student voice and choice.
Touch Screens?
The matching activity doesn’t require the use of a mouse or keyboard; the options can simply be tapped. Of course, this is dependent on students having a touchscreen device. If they do, however, then fill in the blank works the same way. You can click and drag with the mouse, or simply press and drag with your finger on a touchscreen device. (In my next article, I will counter this idea by pointing out the consequences of assuming that all students have touchscreen devices and the ability to use them!)
Once teachers had the ability to add audio to slides, we all began asking Nearpod to give students the same ability. Eventually, that ability became available in open-ended questions. When a teacher creates an open-ended question slide in Nearpod, they can toggle on the ability for students to record an audio response rather than typing a response. That’s one way to differentiate a Nearpod lesson: you can make two copies of the lesson, toggle that feature on in one copy, and leave it turned off in the other. If you have one small group that needs the audio record feature, and another small group that you specifically want to type a response, Nearpod makes that easy to do.
Video Responses in Nearpod
Of course, as soon as students were able to record an audio response, we immediately demanded that they should also be able to record a video response! Instead of reinventing the wheel, Nearpod partnered with Flipgrid, a very fun and robust tool for recording videos. The integration of a Flipgrid slide into Nearpod is tighter than the simple web slides, where you enter the URL for a website; Flipgrid functions completely within Nearpod’s platform. This is helpful for teachers when looking through reports and assessing student learning within Nearpod, and is also helpful for students in terms of organization, because they are able to stay right inside of the Nearpod platform and do not have to navigate additional windows or tabs.
In terms of accessibility, Nearpod excels at providing multiple ways to deliver content to students, with some accommodations built in. In addition, it provides multiple ways for students to respond, with more accommodations built in. It does fall on the teacher to design Nearpod lessons and use the available tools deliberately and appropriately for the students they have in their classes.
So where does Nearpod fall short?
As Wendy Torres so clearly points out in her Web 2.0 Accessibility scorecards, Nearpod falls short in two categories. This is true both for teachers as creators and students as participants. It does not behave well for students who must navigate with keyboard commands and shortcuts rather than using a mouse. It also does not play well with screen readers. The temptation for teachers, therefore, is to look at the students in their classes and declare, “I don’t have any students who need those features! So I don’t need to do those things!” That’s what I plan on writing about in my next article!
Summary
So my argument in favor of Nearpod’s accessibility features is that for students with the most common disabilities, the most frequent reasons for having an IEP, Nearpod meets their needs. Nearpod addresses decoding issues, some visual issues, and translation for English language learners; gives options for non-text or non-written responses; and has the ability to provide student choice.
Regarding shortcomings with respect to keyboard commands and screen readers, this is similar to my dislike of the post-session reports: Nearpod is a victim of its own success. Because it aspires to have so many options for delivering content and so many options for student responses, it has created a platform that is very difficult to navigate with keyboard commands and screen readers. Ironically, it is also likely that Nearpod’s efforts to provide the types of accommodations I’ve listed in this article have contributed to that difficulty.