Do eReaders really need Touch Screens?
Kindles are great little eReaders. The battery lasts a month and you can read all the books you want on it. After traveling to multiple countries and living abroad with mine, my muscles have to thank Amazon for letting me leave the burden of physical books behind.
But there’s a worrying trend in eReaders: the disappearance of physical buttons. The new line of Kindles (2014–2015) is purely touch-based and has one physical button: the power button. Older Kindles have physical page-turn buttons, and when you hold the device and rest your thumb on the screen, the best thing happens: NOTHING.
In the age of multi-touch and gesture-based devices, I have to echo the words of Dr. Ian Malcolm.
“Yeah, but your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.”
Why is a lack of touch on an eReader screen important?
Being able to move your fingers across a page or screen, pointing at specific difficult words is a huge part of learning to read, and acquiring literacy and conceptual understanding. Even for those who read voraciously, if we come across a word we’ve never seen before, our first instinct is to place a finger under the word on the page so we can give it more of our attention and figure out what it means.
For children and adults alike, simply pointing at words is something we cannot do on newer eReaders, for when we do, the page turns, a menu appears or a definition pops up. Sometimes we want to figure it out on our own. To concentrate on a specific word or sentence, we can now only stare at it among a sea of other letters, words and phrases, which creates a cognitive burden.
Spitzer (2013) makes the argument that without a tactile experience, a growing child’s ability to interface with the physical world may be stunted, because touch screens provide no texture or context for the task being performed. A flat screen of glass or E-Ink does little to assist in their development, and while physical buttons don’t add much, they do add something: movement, sound and a change in visual stimuli along with it.
More closely related to the pointing issue, a study by Macken and Ginns (2014) found that when students were not allowed to gesture in any way, their learning suffered. These students performed lower than those who could gesture in the air, and even further behind students who were allowed to touch the paper and run their fingers along the page.
The same was true for learning from geometry worked examples (Hu, Ginns & Bobis, 2014), where students were asked to trace the lines of geometric shapes with a finger. This concept of linking physical action to learning and memory processes is called Embodied Cognition: the idea that what we do with our bodies is inextricably linked with our cognitive processes. These studies provide empirical evidence that learning and recall of information, and thus engagement with information, suffer when we are unable to physically interact with our materials.
Many users love the touch definitions available on most eReaders. They are especially helpful for those learning to read, or those learning another language. I personally use these features daily on my iPad AND my non-touch Kindle, but whenever I read on my iPad, there’s a voice in the back of my head saying ‘don’t touch it, don’t touch it, don’t touch it,’ because when I do, I accidentally turn the page. I discovered very quickly that when I want to run my finger underneath a difficult sentence, this gesture, in many apps, starts the highlighting process. No! That’s not what I wanted!
Perhaps physical buttons don’t need to come back. Instead, manufacturers could give users the option to turn off full-screen touch responsiveness and opt for a smaller set of virtual page-turn buttons somewhere on the screen. The top-tier Kindle Voyage has a touch-responsive bezel, but currently offers no way to turn off touch responsiveness on the screen itself.
eReader or Tablet? Pick one and stick with it.
Everyone loves touching their print books. It’s one of the most enjoyable aspects of reading for many of us. Watch any child read a book with a parent, or watch any adult read a book or journal article that is challenging for them to understand; inevitably they touch it, pointing to the words or running their fingers under the lines of text they are having trouble with. With a touch screen, we are now being trained to avoid touching the words in our books and magazines.
Adding touch, and as a byproduct limiting our ability to point to words, may be detrimental to learning, retention and conceptual understanding, as evidenced in the literature. Amazon and other manufacturers want to create products their consumers love, but are now faced with a challenge:
How do they update their products with new features readers will love, while still allowing for traditional physical experiences that have helped us learn and connect with the written word for hundreds and hundreds of years?
Hu, F.-T., Ginns, P., & Bobis, J. (2014). Does tracing worked examples enhance geometry learning? Australian Journal of Educational & Developmental Psychology, 14, 45–49.
Macken, L., & Ginns, P. (2014). Pointing and tracing gestures may enhance anatomy and physiology learning. Medical Teacher, 36(7), 596–601. http://doi.org/10.3109/0142159X.2014.899684
Spitzer, M. (2013). To swipe or not to swipe? — The question in present-day education. Trends in Neuroscience and Education, 2(3–4), 95–99. http://doi.org/10.1016/j.tine.2013.09.002