So someone brought up an excellent question in class the other day. Before I pose it, let me provide some background info.
If you are not in EDTS 523 (the class I am referring to above), you probably need to know what it’s about to understand the context. It’s a graduate-level course about using technology in the classroom. I’d say the premise of the class is the notion that technology needs to be used more in the classroom so as to engage students. There are so many things “out there” that I never knew about. Ever heard of Wordle? How about NeoOffice? Look them up. It’s really interesting stuff. I’m sure you already know that “kids these days” are tech geniuses. My 18-year-old cousin has built a computer. Yeah, you read that correctly: BUILT a computer. I don’t even know what a hard drive is, and I’m only four years older than him. But back to the point: the question.
“Is using too much technology in the class a bad thing? Some or most of the students will be going off to college someday, where they will be subjected mainly to oral- and reading-based knowledge acquisition.”
Good question. Any answers anyone? Here’s mine:
Yes and no. I think the problem with this question is that it assumes uniformity on the part of the learner. Ever talked about metacognitive theory with a third grader? Do you even know what that is? If you’re not in education, you may not. Metacognition is, basically, thinking about thinking. It’s one of the educator’s many responsibilities to teach metacognitive skills to their pupils. As students progress through the K-12 gauntlet, they become more aware of how they learn best. But you know as well as I do that a youngster is not going to pick up a 236-page book on Napoleon and go, “Oh goodie! I love reading dry, boring, never-ending non-fiction!” So how do they learn best? Bright colors and flashing lights? Not exactly. Students learn best when they are actively engaged with the material they are trying to learn. It’s not that they don’t like to read: I’ve probably read the first Harry Potter book 17 times (seriously). But they like to use their imagination and hands to problem-solve. Some college kids like doing this too; that’s why there are majors like biology, where students subject themselves to the torture of learning the molecular structure of every amino acid. My friend Shadman did it; he said it was “easy” and “fun”… I’m just going to trust him on that. They say we learn 90% of what we teach and only 10% of what we read. To me, that means 90% of what students learn, they should teach themselves.
By the time you reach college, you’ve got a pretty good idea of who you are, academically at least. You can concentrate better, you know whether lectures or reading work better for you, and you know how to get an A without ever reading a single book. Is that to say a college student wouldn’t have fun or learn as much playing an interactive game like Oregon Trail? No. But they are more adept at learning.
I guess what I’m trying to say is that kids need to play to learn. It’s not their fault they can’t listen to someone lecture for more than ten minutes; it’s part of being a kid. But college-age students are more settled, more mature, and more capable of longer stretches of learning and specialization. You need to teach in the way that’s most effective and most practical. Having computers for an Economics 101 class with over 400 students is not practical. That’s not to say the students wouldn’t benefit, but that’s the nature of the college intro-class beast. If you’re a teacher, teach the best way you can. And if you’re a K-12 teacher afraid of using technology, I’ve got bad news for you: you have to adapt to succeed. That applies to you and your students…