Language and Technology
Definitions of 'technology'
- 1. Theoretical knowledge of industry and the industrial arts. 2. The application of science to the arts. (Funk and Wagnall's New Practical Standard Dictionary, New York, 1946)
- Science of the industrial arts; practical arts collectively. (Pocket Oxford Dictionary, Oxford, 1969)
- The methods for using scientific discoveries for practical purposes, esp. in industry. (Cambridge Dictionary of American English, Cambridge, 2002)
- Knowledge, equipment, and methods that are used in science and industry. (Cambridge Learner's Dictionary, Cambridge, 2002)
- New machines, equipment and ways of doing things that are based on modern knowledge about science and computers. (Longman Dictionary of Contemporary English, Harlow, 2003)
- 1. The application of scientific knowledge for practical purposes. 2. The branch of knowledge concerned with applied sciences. (Compact Oxford Dictionary Online, Oxford, 2004)

Why is technology relevant to language?
All technology influences language, in ways that are not always obvious. The development of transport systems, for example, leads people to move around, so that language forms used in regional varieties may move into other regions. We use a metaphor such as "all guns blazing" to suggest an action performed with energy or aggression - so the technology of weapons extends the usage of everyday speech or writing. Since technology is a means to extend human reach, it is necessarily connected to language, in the sense that both natural languages and technologies enable us to do all sorts of things in almost any area of human activity. For example, we use aeroplanes to fly people and goods around the world, and we try to make this safer and more efficient by developing an air-traffic control system. That's language and technology working together for the common good. (And English is the language used in that system globally.) With the rise in popularity of real-time text-based communications - such as Facebook, Twitter, instant messaging, email, online gaming services, chat rooms, discussion boards and mobile phone text messaging (SMS) - came the emergence of a new text language tailored to the immediacy and compactness of these new communication media.
While it does seem incredible that there are thousands of texting abbreviations, different chat abbreviations are used by different groups of people when communicating online. For example, people playing online games are likely to use chat abbreviations that differ from those used by someone updating a financial blog's Twitter feed.
Key Terms Glossary
Electronic text and digitized information:
In every case "digitized" information is really a series of 1s and 0s in the binary machine code that enables a computer or other device to represent the information in some other format, so that humans can experience it - such as an image (still or moving), a high-fidelity audio track or a text document that we can read, write or edit within the interface of a word processor, text editor or instant messenger.
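As a minimal sketch (in Python, with an illustrative two-letter string), the point that digitized text "is really a series of 1s and 0s" can be made concrete by encoding a string to bytes and printing each byte's bits:

```python
# Minimal illustration: "digitized" text is ultimately a sequence of bits.
# We encode a short string to bytes (UTF-8) and show each byte's binary form.

text = "Hi"
encoded = text.encode("utf-8")  # the bytes a computer actually stores

bits = " ".join(f"{byte:08b}" for byte in encoded)
print(bits)  # → 01001000 01101001

# Reversing the process recovers the human-readable form.
decoded = encoded.decode("utf-8")
print(decoded)  # → Hi
```

The same round trip - encode to bits, decode back for a human reader - underlies images, audio and every other digitized format, only with different encoding rules.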
Technologists have found many new possibilities because digital information can be used across a wide range of devices that are inexpensive to manufacture. (At an even more technical level, this is because these devices use principles of solid-state physics, so there are no moving parts.) Some of the most popular applications of digitized information are very closely modelled on analogue technologies - such as voice telephony, and TV and radio broadcasting (indeed the broadcasting part of the process is not changed; the difference lies in the nature of what is broadcast - the same ultra-high frequency and very-high frequency radio waves now carry signals that are decoded as digital information by the receiving device). Recording to CD, DVD, mp3 players and hard drives also mimics recording to audio and videotape.
While Tim Shortis may be right to single out text as a most important language form that digital information can represent, I suggest that speech is not far behind. Information technology can convert any audio source into digital information (and reverse the process), so we can use this for
- recording speech (or music, or birdsong),
- interacting with it (for example by mixing tracks, or altering the audio properties) and
- relaying it instantly (as in Internet telephony, or, more crudely, an audio facility in an Instant Messenger program).
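The first of those steps - converting an audio source into digital information - amounts to sampling an analogue signal at regular intervals and quantizing each sample to an integer. A minimal sketch in Python (the 440 Hz tone, sample rate and 16-bit range are illustrative values, not taken from the text):

```python
# Sketch: digitizing audio = sampling an analogue waveform at regular
# intervals and quantizing each sample to an integer (16-bit range here).
import math

sample_rate = 8000   # samples per second (illustrative)
frequency = 440.0    # an example tone (concert A)
duration = 0.001     # one millisecond of signal

samples = [
    int(32767 * math.sin(2 * math.pi * frequency * n / sample_rate))
    for n in range(int(sample_rate * duration))
]
print(samples)  # 8 integers approximating the waveform
```

Playing the sound back is the reverse process: the integers are converted back into a continuously varying voltage for a loudspeaker.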
Application software
The modern personal computer (Apple Mac or Windows or Linux PC, say) began life as a relatively high-cost device for business productivity, as did many of the add-on devices we refer to collectively as peripherals - printers and scanners, for example. In a very short time the manufacturers' and resellers' competition for business customers made the price affordable for home users, which in turn influenced the development of the PC to include features for entertainment, and later for personal/social (not work-related) communication.
The first computer applications were specifically intended for business activity. The spreadsheet (invented in 1979 by Dan Bricklin who never patented it) was largely responsible for the introduction of personal computers in many areas of business. Over time these applications have been adopted by users in other contexts, and adapted for such use. (So the spreadsheet becomes both a powerful tool for business users, with a high price tag, and a more customized tool for managing domestic accounts and paying bills, with a smaller price tag to match the customer's wallet or purse.)
In the 21st century, the uses of office productivity software (spreadsheets, database management programs and word-processing) are widespread, almost universal, in business, commerce, education and administration. Increasingly, for the domestic user the important applications are those used for leisure, entertainment and communication. This does not mean what are often loosely called "computer games" - which are a more defined and specific kind of application. Typically these include:
- Browsers - for viewing documents on the World Wide Web and interacting with them.
- Image editing and album programs.
- Communications - e-mail, instant messaging and message boards.
- Media players - for playing audio and video files, and organizing them in collections and play lists.
- File-sharing applications - for exchanging data files with other users over networks, especially the Internet.
When we link digital information from one process to another, while using a computer, then we are almost certainly using one or more applications.
Asynchronous vs. Synchronous Execution
When you execute something synchronously, you wait for it to finish before moving on to another task. When you execute something asynchronously, you can move on to another task before it finishes.
That being said, in the context of computers this translates into executing a process or task on another "thread." A thread is a series of commands--a block of code--that exists as a unit of work. The operating system can manage multiple threads and assign a thread a piece ("slice") of processor time before switching to another thread to give it a turn to do some work. At its core (pardon the pun), a single processor can only execute one command at a time--it has no concept of doing two things at once. The operating system simulates concurrency by allocating slices of time to different threads.
Now, if you introduce multiple cores/processors into the mix, then things CAN actually happen at the same time. The operating system can allocate time to one thread on the first processor, then allocate the same block of time to another thread on a different processor.
All of this is about allowing the operating system to manage the completion of your task while you can go on in your code and do other things. Asynchronous programming is a complicated topic because of the semantics of how things tie together when you can do them at the same time.
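A minimal sketch of the distinction in Python's asyncio (the `fetch` name and the one-second delays are hypothetical stand-ins for any slow task, such as a download or a database query): two tasks started asynchronously overlap, so the total wait is roughly the longest single delay rather than the sum of both.

```python
# Sketch: two slow tasks run concurrently with asyncio.
# fetch() is a hypothetical stand-in for any slow operation.
import asyncio
import time

async def fetch(name, seconds):
    await asyncio.sleep(seconds)  # yields control instead of blocking
    return f"{name} done"

async def main():
    start = time.perf_counter()
    # gather() starts both tasks before waiting, so they overlap:
    # total time is roughly max(1, 1) seconds, not 1 + 1.
    results = await asyncio.gather(fetch("a", 1), fetch("b", 1))
    elapsed = time.perf_counter() - start
    print(results, f"in {elapsed:.1f}s")

asyncio.run(main())
```

Calling the same two operations synchronously (one `await` completing before the next begins, or plain blocking calls) would take about two seconds, which is the "wait for it to finish before moving on" behaviour described above.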
Theorists
Schegloff's canonical sequence (1986)
1) Summons/ answer - opens the channel of communication.
2) Identification/ recognition - not needed in face-to-face conversation.