Your question seems to assume that nothing can be used until it has been formalized. Often the opposite is true: things are sometimes formalized AFTER they have become widely used.
Early computers processed characters as groups of binary digits (6, 7, or 8 bits), as provided by early peripherals such as teleprinters, card punches, and punched-tape readers and writers. Many card punches were used for input of software programs, and these punches used a six-bit binary set of codes, encoded as holes in a punched card. If you had 20 early machines, each individual card punch or computer might have had its own completely non-standard encoding for those codes. Hollerith, an early innovator, had its own format, as did others. I guess Baudot code wins over Hollerith's cards on priority (Baudot 1870, Hollerith 1890).
In the IBM world, EBCDIC formally codified (in 1963) what predated it considerably: the punched-card peripherals that used the six-bit binary codepoints later extended and codified as EBCDIC date back to the late 1950s. Similarly, there must have been proto-ASCII terminal or teletype devices in use before their codes were formally codified.
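To make the "same character, different codified encodings" point concrete, here is a minimal sketch using Python's built-in codecs. The code page name `cp037` (an EBCDIC variant) is an assumption of convenience for illustrating EBCDIC; the 0xC1 and 0x41 values are the standardized EBCDIC and ASCII codes for the letter A.

```python
# The same letter 'A' under two later-codified encodings.
# cp037 is IBM's EBCDIC (US/Canada) code page; ascii is 7-bit ASCII.
text = "A"
ebcdic_byte = text.encode("cp037")[0]   # EBCDIC 'A'
ascii_byte = text.encode("ascii")[0]    # ASCII 'A'
print(hex(ebcdic_byte))  # 0xc1
print(hex(ascii_byte))   # 0x41
```

Before these codes were standardized, each vendor's devices could (and did) assign such values arbitrarily; standardization just froze one assignment in place.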
A standard character set starts out as a single device's encoding, which then becomes an ad-hoc standard that others follow, and which later gets a name like EBCDIC or ASCII.
So in addition to whatever the early teleprinters used, the various binary encodings used in card punches could be considered. As others have said, the teletype, though it predates the computer, also needs character encodings, although Morse code is not strictly comparable to the systems used in digital computing: Morse was intended for human-to-human communication over radio or wired telegraph lines.
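The incomparability can be shown in a short sketch: Morse assigns variable-length symbol sequences per letter (short codes for common letters, handy for human operators), while machine codes like ASCII give every character the same fixed width. The small Morse table below is just a sample for illustration.

```python
# Variable-length Morse vs. fixed-width (7-bit) ASCII.
morse = {"E": ".", "A": ".-", "S": "...", "Q": "--.-"}

for ch, code in morse.items():
    # Morse length varies per letter; the ASCII representation is always 7 bits.
    print(ch, code, "morse length:", len(code), "ascii:", format(ord(ch), "07b"))
```

Fixed-width codes are what make simple digital hardware (fixed-size registers, card columns) practical, which is why computer peripherals went that way rather than adopting Morse.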
This is how Wikipedia says the same thing:

    EBCDIC descended from the code used with punched cards and the corresponding six-bit binary-coded decimal code used with most of IBM's computer peripherals of the late 1950s and early 1960s.
Here's a fun read for historical computing information about programming computers (in FORTRAN, etc.) using punched cards. The encoding of the characters can be seen clearly in the pictures in this article: http://www.columbia.edu/cu/computinghistory/fisk.pdf
– Warren P – 2013-01-16T19:23:15.453

It's probably mentioned in some of the links in the answers, but one of the first programming codes (not really a "character" code) was invented by Basile Bouchon (1725) and improved by Jacquard for programming looms. https://en.wikipedia.org/wiki/Basile_Bouchon https://en.wikipedia.org/wiki/Jacquard_weaving
– Joe – 2013-01-21T23:30:47.827