Linda Myers' article "Approaches to computer writing classrooms" really struck a chord with me. I had the opportunity to teach in a lab this morning, and I couldn't help noticing how much more demanding it felt. The brief instructions I tried to provide at the beginning of class seemed to fall to the floor just after the words left my mouth. I looked out across the room, and all I saw was a sea of Apple computer monitors with little patches of hair just barely rising above them. I could barely see my students, and they couldn't really see - or hear - me very well either. I quickly realized that my usual strategy of standing at the front of the room and "broadcasting" instructions to the whole class didn't work in a proscenium-style computer lab. I ended up running around the room restating the instructions individually for students who couldn't hear me when I addressed the whole class. It was all very inefficient.
While I have taught in computer labs several times before - I used to teach in one once a week at UMD - I haven't really had a clear sense of the particular demands each environment placed on me as a teacher. Perhaps because I teach in a lab less frequently now than I used to, the differences really came into sharp relief this morning after class when I felt exhausted after running around trying to keep my students on task.
I think that in future lab sessions I will prepare instructions ahead of time and post them online. This will hopefully allow students to work at their own pace by reading the instructions and completing the exercises on their own. Since the lab doesn't really foster a good teacher-addressing-all-students-at-once approach, I will simply transform my words into documents that effectively address each student on a one-to-one basis.
Wednesday, September 26, 2007
Friday, September 21, 2007
Online Language
In his article, "Meeting the Paradox of Computer-Mediated Communication in Writing Instruction," Stuart Blythe discusses, among other things, the unique features of language as it's used in online communication. He says, "Linguistically, CMC can be characterized as a hybrid that sits somewhere between talk and writing" (119). Indeed, CMC has no real-world equivalent. While chat room communication may take on some characteristics of a face-to-face discussion, it has some distinct differences. For example, we lose the benefit of the non-verbal cues that we rely on during a face-to-face chat. However, it's not the same as any other form of written correspondence either. When we send an instant written message to someone, we expect a reply in pretty short order - usually in a matter of seconds. Also, the typing demands of instant messaging are creating a new type of virtual shorthand - lol, brb, U, etc. The pressure of matching the response time of verbal communication with typewritten communication actually seems to be reducing written words to their essential communicative cues. One could make a fairly convincing argument that what is occurring here is the formation of a sort of pidgin language. Rather than a language developing out of the communication needs of speakers of differing languages attempting to interact, we have a language drawing on several modes of communication to meet the needs of an entirely new and unique mode of communication.
Indeed, I have already seen several instances of virtual language interference in the writing of my freshmen, here at BG and elsewhere. Some of them occasionally lose track of the mode they are working in and use abbreviations like "U" instead of "you" or "B" instead of "be" in their academic writing. While this was pretty alarming to me when I first encountered it, I have to admit that I find it kind of fascinating at the same time. It's amazing to witness and even participate in all of these new communication circumstances as they act on language and force it to do new things. Many argue that this kind of language on the net - and its increasing appearances off the net - is diminishing the richness of the English (or any other) language, but I would argue that the expressive potential of language is actually increasing. Before CMC, language never had to do what it is doing now; the truncated expressions and other emerging features of CMC are not rotting away the existing expressive power of language, they are simply adding new expressive techniques to meet the needs of a new mode of communication.
While Blythe does briefly touch on this in his article, David Crystal's book Language and the Internet provides a more thorough and insightful study of this emerging language of the Internet.
Friday, September 14, 2007
Technology: Being the First is Being the Best
I think Inman's point about the relationship between technological practices and President Kennedy's insistence that America be first to the moon has significant implications for how we think about technology in general. When I think hard about what landing on the moon really meant for us as a nation, I see no direct benefit. The benefits are all psychological. Did landing on the moon first really prevent Russia from attacking us? Probably not. Did we receive any military advantage at all from landing on the Moon first? Again, probably not.
I guess my point is that the impact of "us" landing on the moon before "them" didn't have a physical consequence in the same way, for example, that creating the first atomic bomb did. I do think, however, that there were definitely cultural consequences (good and bad) that resulted from the event. Perhaps as a carryover from the bombs that ended World War II, I think that America needed reassurance that we were still the strongest, and in order to do that we had to be "first." Thus, getting there first assured Americans that we were still invulnerable to the instability of the rest of the world. I think this mentality has rippled forward in time in parallel with technological advancement. In other words, I think that part of the drive for technological advancement, or even the drive of consumers to have the latest technology, derives from the need to be better or the need to be a step ahead of the "others."
Indeed, there seems to be an embedded sense of "us" and "them" in technological implementations. For example, many universities market themselves as having more or better technology available to students compared to other universities. Hardware and software producers also use this paradigm of thought - "Our laptops are 1.3% faster than Dell's." It always comes down to "us" and "them" as well as "more," "faster," "smaller," "lighter," etc.
While this mentality seems to have fueled great technological innovation over the past several decades, I think it is important to also think about the consequences of this way of thinking. At what cost - environmental, geo-political, humanitarian, etc. - do these great technological innovations come to us? Also, considering these costs, do we have a greater responsibility to use these innovations to create some sort of net gain? I think this has important implications for us as educators. When our institutions insist on investing and reinvesting in technological upgrades, we almost have an obligation to make the most constructive uses of these technologies possible.
Friday, September 7, 2007
Thoughts on Digital Literacies in the University
While there were many things competing for my attention in this week's readings, there is one particular thing that still lingers in my thoughts as we look forward to week four. In Cynthia Selfe's intriguing case study of David's experience in college, she makes the point that David did not succeed because his literacies were not valued by the instructors who were teaching his courses. They did not appreciate his abilities in Web "design" and were only interested in his ability to master the literacies they were addressing in their courses. Selfe seems to fault his instructors because they "failed to take advantage of, build on, and even to recognize, in some cases, the literacy strengths he [brought] to the classroom" (51). While Selfe's overall point that the academy should recognize and more prominently value digital literacies is quite valid, I think her faulting of David's instructors is a little unjust.
Selfe describes David's literacies in the following way:
"David was confident in using several word-processing packages like Microsoft Word to compose documents; WebChat to speak with others synchronously on the World Wide Web; Poser, Bryce, and Photoshop to create various kinds of representations; and HTML, Java, and Shockwave to design Web documents" (45-46).
While she does use verbs like "compose," "speak," "create," and "design," which imply that these are intellectual literacies, her overall description seems to more explicitly express that David was simply proficient at using these pieces of software and coding languages. The active verb "using" seems to trump the verbs that follow it. She spends little time discussing how proficient David became in the "design" or "crafting" aspects of using these tools, other than to say that he was being paid by certain organizations for his work. Isn't there a difference between knowing how to "build" something and knowing how to "design" something? Do architects and building contractors not employ drastically different "literacies" as they engage in their professions? Given that David was attending a university, is there not the expectation that he develop multiple literacies (not just digital ones) and that these literacies should involve something more advanced than just basic instrumental usage of tools? Would we be satisfied if our freshman writing courses only expected students to have proficiency in using Microsoft Word without knowing much about the actual craft of writing?
I do not intend for this to undermine Selfe's larger point that we should place a higher value on digital literacies in our curricula and recognize and foster them in our students. But I think we need to be careful not to displace the more intellectually demanding literacies of college with basic lessons of "know-how." When so many of us struggle to keep up with the basic "know-how" of emerging technologies that might be applicable to a college writing course, it is easy to overvalue a student's proficiency at knowing how to use tools to create a Web site. Nevertheless, I think it's crucial that we expect creative use of technological tools in order to foster more advanced digital (and other) literacies than just basic "know-how." Furthermore, we should not allow our classes to become consumed with addressing digital literacies. While they are increasingly important, traditional academic writing literacies are still equally valid. Thus, I don't think we should simply allow students to substitute their digital literacies (however developed they may be) for the more traditional writing literacies we commonly teach in freshman writing. In fact, addressing them both (digital and traditional), and looking at how they relate and interact, would probably be the most productive and interesting way to approach this situation.