How does the time needed to read a sentence scale with the number of characters? Or does this time scaling depend on something more than just character count?
For example, let $X$ be the number of characters and $F(X)$ the number of seconds needed to read an entire sentence that is $X$ characters long. There are three options:
Linear $F(X) \in \Theta(X)$: Is the relationship linear? That is, if it takes 0.5 seconds to read "hello" (5 characters), it would take about 1.1 seconds to read "hello world" (11 characters), e.g. $F(X) = 0.1 X$
Sublinear $F(X) \in o(X)$: Or does it grow more slowly than that? That is, as people work through a long sentence, their reading speed increases (a deliberately extreme example: $F(X) = 1 / X$)
Super-linear $F(X) \in \omega(X)$: Or is it, say, quadratic? That is, each additional character adds more time than the previous one (example: $F(X) = X^2$)
Which of these three options best describes reading speed as a function of text length? Is the specific functional form of this relationship known?
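To make the three growth classes concrete, here is a small Swift sketch (Swift because the motivation below is an iPhone app) that evaluates one representative of each class on a few sample message lengths. All constants are hypothetical, chosen only to keep the outputs in a readable range, and $\sqrt{X}$ stands in for the sublinear case since the $1/X$ example above is intentionally extreme:

```swift
import Foundation

// Illustrative only: one representative of each growth class.
// The constants 0.1, 0.5, and 0.001 are hypothetical, not measured.
let linear:      (Double) -> Double = { x in 0.1 * x }              // Θ(x)
let sublinear:   (Double) -> Double = { x in 0.5 * x.squareRoot() } // o(x)
let superlinear: (Double) -> Double = { x in 0.001 * x * x }        // ω(x)

for chars in [5.0, 50.0, 500.0] {
    print(String(format: "%5.0f chars  linear: %5.1fs  sublinear: %5.1fs  superlinear: %6.1fs",
                 chars, linear(chars), sublinear(chars), superlinear(chars)))
}
```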
Motivation
I am creating an iPhone app in which I need to show transient confirmation messages. For example, when a user submits a comment, I pop up a message saying "thanks for submitting your comment", which fades away shortly afterward. Such transient messages appear throughout the app.
What I would like to do is calculate the optimal time to show each message based on the number of characters it contains. I want users to have enough time to read the message comfortably, but not so long that the message becomes annoying. The purely UX version of this question is here:
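If the linear model turns out to be the right one, the calculation I have in mind would look something like the minimal Swift sketch below. It assumes $F(X) = a + bX$ with a cap; all three constants are placeholders to be tuned, not measured values, and `showConfirmation` with its `UILabel` parameter is a hypothetical helper:

```swift
import UIKit

// A minimal sketch assuming the linear model F(X) = a + bX holds.
// All three constants are hypothetical placeholders, not measured values.
func displayDuration(for message: String) -> TimeInterval {
    let baseDuration: TimeInterval = 1.5         // floor so short messages stay readable (the a)
    let secondsPerCharacter: TimeInterval = 0.05 // per-character reading cost (the b)
    let maximumDuration: TimeInterval = 7.0      // cap so long messages do not linger
    let raw = baseDuration + secondsPerCharacter * Double(message.count)
    return min(raw, maximumDuration)
}

// Usage: show a confirmation label, then fade it out after the computed
// delay. `label` is a hypothetical UILabel already in the view hierarchy.
func showConfirmation(_ message: String, in label: UILabel) {
    label.text = message
    label.alpha = 1.0
    UIView.animate(withDuration: 0.3,
                   delay: displayDuration(for: message),
                   options: []) {
        label.alpha = 0.0
    }
}
```

The cap reflects the UX goal above: past some length it is better to truncate or redesign the message than to keep it on screen longer.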