Sample questions are formatted as mini-vignettes: a problem is posed, followed by the familiar A/B/C/D range of possible responses. For example:
Marisol created a presentation about recycling plants for her science class. She
showed it to her classmate Keith before she presented it to her class. In his
feedback, Keith noted that some parts of the presentation didn’t sound natural,
and Marisol showed him where she’d copied and pasted from various Web sites.
What should Keith do?
A "feedback" option can be switched on by the administrator to provide a rationale for the correctness or incorrectness of each item, adding some formative-assessment zing.
The scenario approach seems stronger for the ethics-driven digital citizenship kinds of questions: what would or should you do, rather than what can you do. Even so, the setup necessarily forces a "correct" answer, and as we all know, ethical issues are seldom resolved so neatly.

The updated NETS-S scrunches the old focus on technology operations and concepts into a single standard. With the bold new emphasis on creativity and innovation, communication and collaboration, and the other 21st-century skills that resist understanding in the old-school definition of "skills," I wonder how close to the mark a multiple-choice assessment can really get.
I wonder, too, how Atomic Learning has factored in readability. Are different versions of the instrument available for kids reading at different grade levels?
Does your district use Atomic Learning? The Tech Skills Assessment could be a valuable tool for understanding students' progress toward NETS-S at the school or district level. At the very least, it could help start some good conversations. Check it out and let us know what you think.