Tag Archives: evaluation

Large-scale analysis of mobile text entry

There will be one billion smartphone users in 2013, and most of them will need some form of text entry. To help people enter text on mobile devices, we set out to study how people type with a large number of participants. We therefore developed a typing game that records how users touch the standard Android keyboard in order to investigate their typing behaviour. We published the typing game Type It! on the Android Market. The game was installed by 72,945 players and enabled us to collect 47,770,625 keystrokes from around the world.

Using this data we identified three approaches to improving text entry on mobile phones. As we found a systematic skew in users’ touch distribution, we derived a function that compensates for this skew by shifting touch events. In addition, we shifted the keys’ labels upwards and visualized the position where users touch the keyboard. By updating the game we conducted an experiment that investigates the effect of these three approaches. Results based on a further 6,603,659 keystrokes and 13,013 installations show that visualizing the touched position using a simple dot decreases the error rate of the Android keyboard by 18.3% but also decreases typing speed by 5.2%, with no positive effect on learnability. The Android keyboard outperforms the control condition, but the derived shift function further improves performance by 2.2% and decreases the error rate by 9.1%. We argue that the shift function can improve existing keyboards at no cost.
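
To illustrate the idea of the shift function, here is a rough, hypothetical Python sketch (the offsets and key coordinates are invented for illustration; the actual compensation function in the paper is derived from the logged keystrokes):

```python
# Hypothetical sketch of touch-shift compensation: touches tend to land
# systematically offset from the intended key centre, so before hit testing
# we shift each touch event by a learned offset. The offset values and key
# layout below are made up for illustration, not taken from the paper.

def compensate_touch(x, y, dx=0.0, dy=-4.0):
    """Shift a raw touch point by a learned offset (pixels, y grows downward)."""
    return x + dx, y + dy

def nearest_key(x, y, key_centres):
    """Return the key whose centre is closest to the (compensated) touch."""
    return min(key_centres, key=lambda k: (k[1][0] - x) ** 2 + (k[1][1] - y) ** 2)[0]

keys = [("q", (15, 30)), ("a", (25, 70)), ("z", (35, 110))]
raw = (30, 92)  # user aimed at 'a' but touched low; uncompensated this hits 'z'
print(nearest_key(*compensate_touch(*raw), key_centres=keys))  # → a
```

The point of the sketch is that the compensation is invisible to the user: the rendered keyboard stays the same, only the interpretation of touch events changes, which is why such a function can be added to existing keyboards at no cost.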

Our paper with the lengthy title ‘Observational and Experimental Investigation of Typing Behaviour using Virtual Keyboards on Mobile Devices’, which describes this work, has recently been accepted at CHI 2012.

Type It! – an Android game that challenges your texting abilities

Type It! is a game for the Android platform that is all about speed and quick fingers. It challenges (and hopefully improves) your texting abilities. The player’s task is to enter the words that appear on screen as fast as possible: the faster you type, the more points you get. You have to touch and type as fast as you can to see if you can beat all the levels, and you might even improve your dexterity while trying to top the high score list.

This game is part of our research on touch performance on mobile devices and also part of my work as a PhD student. While users play the game we measure where they hit the screen and how fast they are. By combining this information with the position of the keyboard we can estimate how easy each key is to touch. Based on this data we hope to be able to predict users’ performance with different keys and character sequences. We plan to derive a corresponding model, which could then be used to improve the virtual keyboards of current smartphones.
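
To give an impression of the kind of per-key analysis we have in mind, here is a hypothetical Python sketch (the key layout and log records are invented; the real analysis runs on the keystrokes logged by the game):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sketch: given logged records of (intended key, touch x, touch y)
# and known key centres, estimate each key's systematic touch offset. A large
# mean offset or spread would suggest the key is hard to hit accurately.
# All coordinates and records below are invented for illustration.

key_centres = {"a": (25, 70), "s": (55, 70)}

log = [("a", 24, 75), ("a", 27, 73), ("s", 53, 76), ("s", 58, 74)]

def mean_offset(log, centres):
    """Mean (dx, dy) of touches relative to each key's centre."""
    dx, dy = defaultdict(list), defaultdict(list)
    for key, x, y in log:
        cx, cy = centres[key]
        dx[key].append(x - cx)
        dy[key].append(y - cy)
    return {k: (mean(dx[k]), mean(dy[k])) for k in dx}

print(mean_offset(log, key_centres))  # per-key systematic offsets
```

With enough players, such per-key statistics could feed the kind of predictive model mentioned above.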

We hope that we can collect data from thousands of players. That would enable us to derive information that is valid not only for a small number of people but for every user. We are, however, not interested in your contact list, browsing history, or phone number. Okay – if you are good looking I might be interested in your phone number, but I don’t want to collect such data automatically ;). In general, we neither want nor need data that could identify individuals. Thus, we do not collect any such personal information.

Type It! is available for Android 2.1 and above. You can have a look at users’ comments and the game’s description on AppBrain or install it directly on your Android phone from the Market.

Evaluation of our HCI lecture

We conducted an evaluation of our lecture and lab on Human-Computer Interaction. The aim of the study was to improve the lecture in the future. We collected qualitative feedback from nine students using a questionnaire. Overall, the participants appreciated the practical projects and the lecture itself. They criticized the weekly presentations about the ongoing practical project as well as the room, and recommended a larger room and project presentations only every second week.

Motivation and Background

This year we gave the lecture and lab for the third time. Like most lecturers, we were never formally trained in teaching and base our work only on assumptions and personal experience. While we have been happy with the overall results of the lecture and the practical part, until now we did not have tangible data about the students’ opinions.

Our HCI lecture is split into two parts. We give lectures about the usual topics of an HCI course along the user-centred design process: for example, how to collect requirements, different kinds of prototypes, usability evaluations, and how to design and interpret experiments. The practical part runs in parallel with the lecture. At the beginning of the semester, PhD students from our group present a number of topics. The students pick one topic and form groups of 2-4. During the term the groups work on their projects along the user-centred design process and present their progress in weekly presentations. At the end of the semester the students present their project to our group and interested guests in a final presentation and take an oral exam.

Method

As the aim of the study was to improve the lecture in the future, we focused on qualitative feedback. We compiled a questionnaire with the following four questions (we actually asked them in German):

  • What did you like about the course?
  • What did you not like about the course?
  • How would you change the course?
  • Do you have additional comments?

We did not ask demographic or similar questions in order to keep the results anonymous.

We distributed the questionnaire to all students of the course who were present during the last lecture (about 20) and collected the forms afterwards. While we asked the students to fill in the questionnaire, we also told them that they were free not to.

Results

In total we collected 9 questionnaires, a return rate of about 50%. Most participants answered the first three questions, but no one gave additional comments. After collecting the questionnaires we sorted the data by question, clustered the statements by topic, and translated them into English. In the following we provide an overview of the results, grouped by the first three questions.

What did they like about the course?

Four participants wrote that they liked the lecture. They stated that it is a “good lecture”, appreciated the “very good content of the lecture” and that the “content is well conveyed”. Four participants also liked the hands-on work. Participants explicitly mentioned “the large amount of practical work”, the “practical work” and the “practical experience”. Two students highlighted the structure of the lecture and two others mentioned “new technologies” and the diversity of the projects. One participant highlighted the support by the supervisors when working on the practical project.

What did they not like about the lecture?

Five participants criticized the weekly presentations of the projects’ progress. They stated that there were “too many presentations” and that “5 minutes is too short for the presentations”, even though we actually scheduled 10 minutes for each presentation plus time for questions and comments. Three participants commented on the lecture room, criticizing that it is too small; one of them also criticized the low quality of the projector. One participant criticized that the lecture is not always relevant for the practical project, and another the synchronization between the lecture and the practical work. One participant mentioned that the lecturers did not always upload their slides to the learning management system on time.

How would they change the lecture?

Participants recommended changing four aspects of the course. Four participants recommended fewer presentations of the ongoing work (e.g. “presentations only every second week”) or more interaction between the groups. Three participants recommended a better room; in particular, they requested a room with ventilation or simply a bigger room. One of them also recommended a larger projector. For the lecture itself, one participant requested a short description for each session and another recommended making the lecture “even more interactive”. One participant stated that “the practical part (projects) could eventually be reduced”.

Limitations

We collected feedback from only nine of about 20 students, so the results come from self-selected participants. We assume that this could have biased the feedback towards the positive. Participants also had only limited time to fill in the questionnaire, so we might have collected only superficial feedback.

Conclusions

Overall, the participants appreciated the lecture and in particular the practical work. They did not like the weekly presentations about the ongoing practical work and recommended reducing their number, for instance to one presentation every second week. Participants also disliked the technical resources of the course, in particular the room and the projector, and recommended a larger, ventilated room.

While the return rate is only around 50% and the results might be biased by self-selection, we assume that they can still provide insights for future courses. For example, we will try to organize a bigger room with a built-in projector. One particular aspect that caught our attention is the critique of the weekly project presentations. We originally structured the course with fewer students in mind, and the current structure might not scale well with an increasing number of students. We will consider reducing the number of project presentations as requested by the students. This might also help the lecture scale to a slightly larger group of students.

Hit the Rabbit!

Fight the dreadful rabbits and crush them with your holy thumb. The shooting season begins with my first game in the Android Market. Your job is to hit as many rabbits as possible. Pan the background around to find these evil creatures and hit them with a lusty touch. You can show your skills in different levels that force you to hurry up, and the time trial mode adds even more variety as you fight against the clock.

You can download the latest version from the Android Market – and don’t forget to give it a proper rating if you like it. Please leave a comment if you have criticism or recommendations, in particular ideas to improve the game. It’s my first game (ever), so please be gentle with me. You can find the game in the Market, and you can also have a look at the description and screenshots.

Push the study to the market

My student Torben has just published his Android augmented reality app SINLA in the Android Market. Our aim is not only to publish a cool app but also to use the market for a user study. The application is similar to Layar and Wikitude, but we believe that the small mini-map found in existing applications (the small map you see in the lower right corner of the image below) might not be the best way to show users objects that are currently not in the focus of the camera.

We developed a different visualization for what we call “off-screen objects” that is inspired by off-screen visualizations for digital maps and by navigation techniques in virtual reality. It is based on arrows pointing towards the objects, arranged on a circle in a 3D perspective. Check out the image below to get an impression of how it looks.
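
To give an idea of how such arrows can be positioned, here is a simplified 2D Python sketch (all names and values are invented for illustration; SINLA itself works with real compass and location data in a 3D perspective):

```python
import math

# Simplified 2D sketch of placing off-screen arrows on a circle: each arrow
# sits at the bearing from the user to the object, measured relative to the
# direction the camera is facing. Positions and headings are invented.

def arrow_angle(user, obj, camera_heading_deg):
    """Angle in degrees (clockwise from 'up') at which the arrow sits on the circle."""
    dx, dy = obj[0] - user[0], obj[1] - user[1]
    bearing = math.degrees(math.atan2(dx, dy))  # 0° = north, clockwise
    return (bearing - camera_heading_deg) % 360

def arrow_position(angle_deg, radius=100):
    """(x, y) offset of the arrow on the on-screen circle (y grows upward)."""
    rad = math.radians(angle_deg)
    return radius * math.sin(rad), radius * math.cos(rad)

# An object due east of the user while the camera faces north:
ang = arrow_angle(user=(0, 0), obj=(10, 0), camera_heading_deg=0)
print(ang)                  # ≈ 90: the arrow appears on the right edge of the circle
print(arrow_position(ang))  # ≈ (100, 0)
```

As the user turns, the camera heading changes and the arrows rotate around the circle, always pointing towards their objects.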

It’s our first attempt to use a mobile market to get feedback from real end users. We compare our visualization technique with the more traditional mini-map. At the moment we collect only very little information from users because we’re afraid that we might deter them from providing any feedback at all. However, I’m thrilled to see whether we can draw any conclusions from the feedback we get through the applications. I assume that this is a new way to do evaluations that will become more important in the future.