
Today you need the @ character in many places, most notably in email addresses. I suspect that when the syntax for email addresses was defined, the sign was already supported in character sets and on keyboards, which made it easy to adopt for addresses.

Typewriters do not have an @ sign, so when and why was the @ sign added to computers before it became important in various types of user handles on the Internet?

allo
  • Wrong! The @ key was on typewriters in the 1940s. – Chenmunka Jun 13 '23 at 12:31
  • Fun fact: In Hebrew, it is colloquially known as שְׁטְרוּדֶל (shtrúdel), due to the visual resemblance to a cross-section cut of a strudel cake (wikipedia ref) – Jonathan Jun 14 '23 at 10:27
  • "Typewriters do not have an @ sign" I'm frankly confused as to why you would have believed this to be the case, or generalize it so bluntly as if it didn't require comment. A bit of searching easily finds me images of typewriters with keys to produce an @ symbol, e.g.. Granted, they don't generally use shift-2. – Karl Knechtel Jun 14 '23 at 15:12
  • @KarlKnechtel "Granted, they don't generally use shift-2." - Funnily enough, the @'s on those example typewriters are in the exact same place they are on my ISO BrE keyboard. – Nick is tired Jun 14 '23 at 15:24
  • All the typewriters I've ever used since the early 60s had an @ sign. They didn't all have a 1 or 0 but they had an @ sign. – cup Jun 14 '23 at 18:15
  • @Jonathan, we call it “zavináč” (rollmops, https://cs.wikipedia.org/wiki/Zavin%C3%A1%C4%8D_(j%C3%ADdlo) on Czech Wikipedia) in Czech. – jiwopene Jun 14 '23 at 19:00
  • @Jonathan: Dutch calls it a "little monkey tail" (which starts with an "a" in Dutch: "apenstaartje") – Flater Jun 14 '23 at 23:12
  • A YouTube video about the @ character: https://www.youtube.com/watch?v=MjE03a8PGko – md2perpe Jun 15 '23 at 05:08
  • Somewhat related https://retrocomputing.stackexchange.com/questions/5854/when-did-the-tilde-first-start-to-appear-on-standard-keyboards – Freiheit Jun 16 '23 at 13:51
  • To justify the OP, @ was present on typewriters in the English-speaking countries but not necessarily in other parts of the world where neither the symbol nor its usual meaning was used. Granted, the computer was invented -- mostly -- in those countries but the original question was probably written by somebody living in a different country. – Gábor Jun 16 '23 at 17:41

3 Answers


The @ symbol was present on typewriters a long time before computers were invented, see for example this 1889 Hammond typewriter. In English-writing countries, the symbol was already used in commercial settings for prices: “3 apples @ $1”. Some other languages also used it, with different meanings.

Closer to computers, @ was commonly present in International Telegraph Alphabet 2 implementations (see also Gil Smith’s Teletypewriter Communication Codes), so teleprinter keyboards typically supported it too. The symbol is also part of EBCDIC and ASCII, among many other early character codes, and thus supported on any device supporting one of those. See Coded Character Sets, History and Development for details; @ was included in ASCII-63 as a “commercial usage” symbol, in position 0x40 just before the alphabetic characters because it was intended for replacement by “à” in countries such as France and Italy. It was moved to 0x60 in ASCII-65 and back again in ASCII-67.
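The code-point details above can be checked directly. Here is a minimal Python sketch; the `french_variant` mapping is a purely illustrative stand-in for an ISO 646-style national replacement, not an actual variant table:

```python
# "@" sits at 0x40 in ASCII-67, immediately before "A" at 0x41,
# at the head of the upper-case alphabet column.
assert ord("@") == 0x40
assert ord("A") == 0x41

# 0x60 is the position @ briefly occupied in ASCII-65; in ASCII-67
# that slot went to the grave accent instead.
assert ord("`") == 0x60

# National variants could reassign position 0x40, e.g. to "à" for
# French text (illustrative substitution only):
french_variant = {0x40: "à"}
text = bytes([0x33, 0x40, 0x24, 0x31])          # "3@$1" in ASCII
decoded = "".join(french_variant.get(b, chr(b)) for b in text)
print(decoded)  # → 3à$1
```

This also shows why the commercial-at and "à" readings could coexist: a price line like "3 @ $1" stays intelligible either way.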

Stephen Kitt
  • Yes, as a kid many years ago, using @ was the way we were taught to write prices of groceries and other things. Read as "at", yes. :) – paul garrett Jun 14 '23 at 16:40
  • What I always wondered was why we needed a symbol to abbreviate a word that's only 2 letters to begin with - like how much harder was it to write "3 apples at $1"? But that's a better question for Linguistics, as it long predates computers. – Darrel Hoffman Jun 14 '23 at 18:13
  • @DarrelHoffman Your comment made me think of '&' which is a ligature of 2 letters, 'et'. Both @ and & are probably just faster to write by hand and so they became the norm. – Hans Kilian Jun 15 '23 at 07:03
  • Notably, the cited Wikipedia article shows that @ is in the expected place in ASCII-63 but not in ASCII-65. Apart from that, there are a few 6-bit character sets which omit it, e.g., referring to Wikipedia's "Six-bit" and "BCD_(character_encoding)" articles, some from CDC. – Mark Morgan Lloyd Jun 15 '23 at 11:01
  • Interesting. None of the typewriters I've seen (which are not that many models, though) had an @ sign, but probably it is a question of which kind of business a certain model was built for. – allo Jun 15 '23 at 16:06
  • @allo: Some typewriters were designed to allow some of the key caps and type bars to be swapped. My grandfather's typewriter is designed that way, though I don't know the whereabouts of any alternative type bars. I don't remember if the type bars that are installed had @, but if not I suspect that there was probably a type bar available with that symbol. – supercat Jun 15 '23 at 16:43
  • @DarrelHoffman I think that @ implies the specific meaning 'at a unit cost of', i.e. 3 apples at $1 each, whereas '3 apples at $1' could be taken to mean $1 for three apples. – nekomatic Jun 16 '23 at 08:53
  • My mother taught touch typing at a girls’ school in the 1970s. The @ symbol was called “commercial-at”. – user7761803 Jun 16 '23 at 11:40

The AT symbol is a US development in use at least since the early 1800s, denoting single-unit prices in commercial texts (*1). It was so widely used that early typewriters of the 1880s added it as a distinct glyph (*2).

In telecommunication it was present in the US variant (US TTY) of the International Telegraph Alphabet No. 2 (ITA2) since the late 1920s, where it replaced the ampersand/et (&).

In computing it has been present at least since the 1950s with the FIELDATA code (*3), itself one of the predecessors of ASCII and, in turn, of the now-prevalent Unicode.


*1 - Itself possibly a derivation of the French à used in the same context

*2 - While we like to think of typewriters foremost as tools for authors, or at least for letters, their most important early usage was in commercial bookkeeping. New technology always has been, and always will be, used first where the money is to buy it - no matter how much more visible other applications are to the common eye.

*3 - Which offered both @ and &.

Raffzahn
  • A couple of details that are probably not coincidental: (1) The @ sign was often used to serve as a "bulky blob" glyph when printing large banners; (2) The Apple I used a blinking @ sign as a cursor; (3) Some Hazeltine terminals which didn't support lowercase implemented character code 64 as a solid rectangle, which they in turn used as a cursor (perhaps they were designed to use a blinking @, but someone thought a blinking rectangle would look better?). – supercat Jun 13 '23 at 15:12
  • Not just a US development. See my Answer. I concur with most of what you wrote, though. – RichF Jun 13 '23 at 18:20
  • @supercat the @ as cursor comes simply from being the glyph produced by the character generator at codepoint 00. – Raffzahn Jun 13 '23 at 19:29
  • @Raffzahn: The display of the cursor on the Apple I is done by overwriting the contents of the display at the cursor position; I would think the easiest way to accomplish that would be to overwrite the cursor position with a character supplied from software, but I wouldn't be terribly surprised if there's dedicated hardware, given that even the handling of newlines is done by hardware rather than software. Though I just remembered some more details about the block character and realized I was wrong about @. The terminal room had two kinds of Hazeltines as well as DECwriters, and... – supercat Jun 13 '23 at 19:44
  • ...program characters which appeared as a solid block on the Hazeltine which supported lowercase showed up as @ on the one which didn't, but as a grave on the DECwriters, meaning that would have been code 0x60 rather than 0x40, though it showed up the same as 0x40 on the Hazeltines which didn't support lowercase. – supercat Jun 13 '23 at 19:46
  • @manassehkatz-Moving2Codidact what a coincidence - I was just reminiscing on another question about a terminal that used core memory, and it was a Hazeltine. Today I live about a mile from Lake Hazeltine, and I've always wondered if that's where the name originated. – Mark Ransom Jun 15 '23 at 04:07
  • @manassehkatz-Moving2Codidact I got curious and decided to research Hazeltine - it was a New York company, absolutely no connection to Minnesota where I live. – Mark Ransom Jun 15 '23 at 04:19

By coincidence I just happened to watch a YouTube video yesterday:

I found it very informative. It dates the symbol back to over 2000 years ago.

I second what others have said about @ being on typewriters for a very long time. American ones, anyway; I don't know about typewriters in other countries. It could be like the # symbol, which we used to pronounce as "pound" or "sharp". British typewriters, though, replaced it with their pound symbol, £. I guess by the time people started thinking of # as "hashtag", typewriters were already passé.

RichF
  • Not really. For one, all the origin stories he cites have been debunked; but also, Europe always used - and still uses - the French à instead of the US @. It further completely screws up the story about computing, like showing a Univac machine and claiming that the @ symbol on the tty shown could not be used. Total bogus, as Univac machines of all machines used FIELDATA, which included @. – Raffzahn Jun 13 '23 at 19:28
  • Wasn't ‘#’ called ‘hash’ long before Twitter et al? (I assumed that the term ‘hashtag’ was coined for a tag indicated by a hash symbol.) – gidds Jun 13 '23 at 22:15
  • @gidds: The # symbol has gone by a wide variety of names. My favourites are probably "octothorpe" and "pigpen". – Greg Hewgill Jun 13 '23 at 22:21
  • @gidds yes, among many other things (and arguably ♯, sharp, is not the same as #, hash — any more than ♭, flat is the same as b) – hobbs Jun 14 '23 at 05:31
  • @Raffzahn I don't see how you can say the origin stories are debunked when there are literal documents to support them. Maybe the use of @ for a unit of price originates in the USA, but the symbol itself definitely predates the 19th century. – JeremyP Jun 14 '23 at 08:48
  • @hobbs nothing arguable about it. The sharp and the hash are different symbols. They don't even look the same ♯,# and they have different Unicode values. – JeremyP Jun 14 '23 at 08:51
  • @JeremyP now you need to make up your mind. ♯ and # are clearly the same symbol - way more than various of the explanations about @. One's the cursive version of the other. There are Japanese and even Sumerian symbols looking much like Latin letters. They as well predate them; still, no relation at all. Same here. If it looks a bit like a duck but doesn't quack, then it's something different. But you don't have to listen to me; Dieter Zimmer spent a whole chapter (p. 133-142) of his 2000 book 'Die Bibliothek der Zukunft' (https://dl.acm.org/profile/81350572099) on all those details. – Raffzahn Jun 14 '23 at 09:15
  • @JeremyP : it's true that some symbols are different, like ♯ and #, or ♭, and b, but in the time of mechanical typewriters they often saved on the number of keys and gears by omitting similar symbols. Many typewriters don't even have a distinct 1 and l (lower-case L). – vsz Jun 14 '23 at 09:44
  • @Raffzahn ♯ and # may look very similar, but they have completely different meanings, origins, and code points, as well as a subtly but significantly different look. (Sharp uses strictly vertical lines, with slanted horizontals; hash is usually the other way around.) Any confusion between them is, as vsz says, historical due to the lack of a separate character on typewriters, computer keyboards, and most pre-Unicode character encodings. (contd…) – gidds Jun 14 '23 at 09:52
  • …Compare, for example, the Latin small letter O o, the Greek small letter omicron ο, the Cyrillic small letter O о, the Latin letter small capital O ᴏ, the ring operator ∘, the degree sign °, the masculine ordinal indicator º, the ring above ˚, the Hebrew mark Masora circle ֯, the ring point ⸰… Are they all ‘clearly the same symbol’? – gidds Jun 14 '23 at 09:53
  • Heh, this reminds me of a series of comments on one of the English Language Exchanges, where people were arguing over "a" and "an" being the same word or not. (I'm in the 'not' group on that one.) The # and @ symbols may have had different meanings over time to different societies, but the fact is, they were useful characters on a typewriter keyboard. – RichF Jun 14 '23 at 13:42
  • But why were hash functions called "hash functions"? – Jeremy Boden Jun 14 '23 at 17:02
  • @JeremyBoden Wikipedia has some information on the origin of the term: https://en.wikipedia.org/wiki/Hash_function#History. The short version is we don't know the precise origins, it's very likely derived from the non-technical meaning "to chop up", and the term has been in wide use in programming since at least the 50s. – Chuu Jun 14 '23 at 20:40
  • Please do not post link-only answers, especially ones that force us to watch a video rather than reading text. – Federico Poloni Jun 15 '23 at 22:29
  • @FedericoPoloni The link paragraph was simply an introduction to the longer paragraph below it. I am sorry if it distracted you. The point of the first paragraph was simply to indicate that the @ symbol (or variants) have been around a very long time, over 2000 years. – RichF Jun 16 '23 at 00:52
  • @gidds blame Unicode for that mess. Letters are obviously different from, say, a degree symbol, but one ο letter would suffice. Small capitals should not even be present in Unicode at all, as they are not different symbols, but different formatting. – Trang Oul Jun 16 '23 at 09:30
  • @gidds which part of the video indicates that the symbol has been around for over 2000 years? The earliest reference I see in the video is 1345, which will be less than 2000 years in the past for another 1322 years. – Stephen Kitt Jun 17 '23 at 19:08
  • @gidds Well, all true, but for one, the arguments repeated in that cited video are about similar looks, not meaning, which is a false trail and for that reason long debunked. The origin here lies in meaning and variation over time when reproducing a symbol not used in the regular writing of a language. In addition, having lines slanted or not is part of writing style/font; they are only relevant distinctions within the same font, not across different situations. Having a unified system is a rather new thing. Last but not least, yes, they are all circles, so essentially ... – Raffzahn Jun 19 '23 at 09:34
  • @gidds ... the same glyph. Distinction is only needed when used in the same context. When typewriters didn't have a key for zero, everyone was using the letter 'O' (often as lower case). It was not seen as a replacement - same for using I for 1 (still common in the US). Common way into the 1970s. It wasn't 'lack of characters' but a missing need for distinction. That only arose with data processing, where distinct symbols simplify handling. The change of handling made automated (not necessarily human) distinction necessary. Using today's context to look at past usage is rarely a good idea. – Raffzahn Jun 19 '23 at 09:51
  • @RichF As so often, grouping is up to whoever does the grouping and for what reason. Like function vs. expression? I'm rather on the other side, as 'an' is just a variant pronunciation of 'a' depending on its surroundings. What is often forgotten is that spoken English concatenates words even more than German - it's just way less visible in writing. a/an is one of the exceptions. Also helpful to remember that those little variations are offspring of the Old High German 'ein' as indefinite article, necessary when the stressing of certain vowels was reduced. – Raffzahn Jun 19 '23 at 10:09
  • @StephenKitt, you are correct. I misremembered what I had seen. – RichF Jun 19 '23 at 13:07