Let me touch my words again!
I love my Kindle. It’s a great little eReader. The battery lasts a month and I can read all the books in the world on it. After traveling to multiple countries and living abroad with it, my arm and shoulder muscles have to thank Amazon for allowing me to leave the burden of physical books behind.
But the reason I love my specific Kindle is that I can still touch it like a book.
Image Courtesy of The Verge.
There’s a worrying trend I’ve noticed in eReaders: the disappearance of physical buttons. The new line of Kindles (2014–2015) is purely touch-based and has exactly one button: the power button. The reason I love the Kindle I have (5th Gen basic ad-supported model) is that it has physical page-turn buttons, and when I touch the screen…absolutely nothing happens. In the age of multi-touch and gesture-based devices, I have to echo the words of Dr. Ian Malcolm.
“Yeah, yeah, but your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.”
Being able to move your fingers across a page or screen, pointing at specific difficult words, is a huge part of learning to read and of acquiring literacy and conceptual understanding. For children and adults alike, simply touching words is something we cannot do anymore, for when we do, the page turns, a menu appears, or a definition pops up. To concentrate on a specific word or sentence, all we can do now is stare at it.
Spitzer (2013) goes so far as to argue that without rich tactile experiences, children’s normal ability to interface with the physical world may be stunted. Sure, a flat glass or e-ink screen offers a tactile experience of a sort, but it has no variation and no texture, and this is where physical buttons can come in. Feeling the press and hearing the click adds sensory stimulation to a literally flat experience.
In a study by Macken and Ginns (2014), students who were not allowed to touch a paper diagram learned less from it. Those who kept their hands still performed worse than those who gestured in the air above the page, who in turn performed worse than those who were allowed to touch the words directly.
The same was true for learning from geometry worked examples (Hu, Ginns, & Bobis, 2014). This concept is called embodied cognition: the idea that what we do with our bodies is inextricably linked with our cognitive processes, including information processing and learning. These studies provide empirical evidence that learning and recall of information, and thus engagement with content, suffer when we are unable to physically interact with our devices.
Now, I have an iPad and a Kindle, and whenever I read on my iPad, there’s a voice in the back of my head saying ‘don’t touch it, don’t touch it, don’t touch it,’ because when I do, I accidentally turn the page. I discovered very quickly that when I want to run my finger underneath a difficult sentence, this gesture, by default in many apps, starts the highlighting process. No! That’s not what I wanted!
I am not a product designer, so it’s not my job to make suggestions, just to illustrate the detrimental effect that this can have on children and adults using eReaders with touch-based screens. Perhaps physical buttons don’t need to come back; instead, manufacturers could give users the option to turn off full-screen touch responsiveness, or to opt for small page-turn buttons at the bottom, top, or sides of the screen. I know the top-tier Kindle Voyage has a touch-responsive bezel, but it doesn’t have the ability to turn off touch responsiveness on the screen.
Everyone loves touching their books. It’s one of the most enjoyable aspects of reading for me, and for many others I’ve talked with. Watch any child read a book, or watch any adult read a book or journal article that is challenging for them to understand. By having a touch screen, we are now being trained to avoid touching the words in our books, reducing the area we’re allowed to touch to just the margins.
Limiting this ability has been shown to be detrimental to learning and conceptual understanding. I get that Amazon and other manufacturers have a responsibility to create products that their consumers love. But they also have to make sure that we don’t all become dumber because our devices prioritize aesthetics and the latest technologies over what has helped us understand words and concepts since we first put pen to paper thousands of years ago.
Hu, F.-T., Ginns, P., & Bobis, J. (2014). Does tracing worked examples enhance geometry learning? Australian Journal of Educational & Developmental Psychology, 14, 45–49.
Macken, L., & Ginns, P. (2014). Pointing and tracing gestures may enhance anatomy and physiology learning. Medical Teacher, 36(7), 596–601. http://doi.org/10.3109/0142159X.2014.899684
Spitzer, M. (2013). To swipe or not to swipe? — The question in present-day education. Trends in Neuroscience and Education, 2(3–4), 95–99. http://doi.org/10.1016/j.tine.2013.09.002
(cross post with http://stoosepp.com/2015/07/07/let-me-touch-my-words-again/)