Monthly Archives: November 2014

Illusions of a good grade: Effort or luck?


Submitted by Melanie Paige Moore

Article Reference

Buckelew, S. P., Byrd, N., Key, C. W., Thornton, J., & Merwin, M. M. (2013). Illusions of a good grade: Effort or luck? Teaching of Psychology, 40, 134-138.

Article DOI

doi:10.1177/0098628312475034

Summary of Article

Previous research has demonstrated that students exhibit a self-enhancement bias by overestimating their end-of-course grades, on average by about one letter grade. Self-enhancement beliefs may be maladaptive and can result in poor academic outcomes for students with unrealistic expectations. Students with high grade point averages are better than their counterparts at predicting final grades. Attributions about grades also matter: students who make internal attributions (e.g., individual effort) predict their grades more accurately than students who make external attributions (e.g., luck). Students with high achievement motivation attribute success to effort, whereas those with low achievement motivation attribute failure to lack of ability.

This study tested the relationship between accuracy of grade predictions, the self-enhancement bias, and student attributions about grade performance. Grade predictions were assessed at two points in the course semester: the second week (Time 1) and the final week (Time 2). Students also completed a 4-item questionnaire on attributions and gave permission to have their ACT scores obtained from the university records office. Results revealed that students did in fact demonstrate the self-enhancement bias by overestimating their grades, especially at Time 1. ACT scores were significantly correlated with predicted grades at Time 1 and with actual attained grades. However, students with low ACT scores were less accurate in their grade predictions, demonstrated a higher self-enhancement bias, and made greater external attributions about grades.
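To make the key quantities concrete, here is a minimal Python sketch of how a self-enhancement bias score (predicted grade minus actual grade) and a prediction-ACT correlation could be computed. All grades and ACT scores below are invented for illustration; none of these numbers come from the article.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical students: grades on a 4-point scale (A = 4.0 ... F = 0.0)
predicted = [4.0, 3.5, 4.0, 3.0, 3.5, 4.0]
actual    = [3.0, 3.5, 3.0, 2.0, 3.0, 3.5]
act_score = [24, 30, 22, 18, 26, 29]

# Self-enhancement bias: positive values mean overestimation
bias = [p - a for p, a in zip(predicted, actual)]
print(f"mean bias: {sum(bias) / len(bias):.2f} grade points")
print(f"r(ACT, predicted) = {pearson_r(act_score, predicted):.2f}")
```

With these made-up numbers the mean bias comes out to about two-thirds of a letter grade, in the same spirit as the roughly one-letter-grade overestimation the literature reports.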

Discussion Questions

  1. Self-enhancement bias was defined in this study as the discrepancy between anticipated grades and actual grades. The article listed reasons why this discrepancy exists (e.g., attributions); what are some other reasons students overestimate their expected end-of-course grade?
  2. The self-serving bias is the tendency to make external attributions about negative outcomes. What types of strategies would you implement in class to reduce the self-serving bias? How might you structure or change your lectures, syllabus, classroom environment, etc.?
  3. For those who teach or TA, have you had experiences with students where you felt the self-enhancement bias or self-serving bias had occurred? For those who don’t teach or TA, have you had any personal experiences in which you witnessed the self-enhancement bias or self-serving bias in an academic environment?

Diigo


Submitted by Courtney Simpson

For this assignment, I created an account on Diigo. Diigo stands for “Digest of Internet Information, Groups, and Other stuff.” It is a website that allows you to bookmark, tag, and annotate other webpages and save them for future reference. Basically, you can create a personal library of information available on the web. Furthermore, it has a social component that allows you to share information with and get information from others. You can follow people and build friend lists that allow you to find resources from people you know, and you can build different “groups” of people that you want to share resources with. In a group, each member can add, browse, and search the content. Additionally, group members can interact with on-the-page annotations.

This element of Diigo could be very beneficial for a class: you could create a group for all the students in a class and provide a library of useful tools and resources they can use to further their learning. You can highlight and annotate parts of the text you think are important so the students know what to focus on. Moreover, the class could all read the same article and comment on and discuss the article right on the page. Group sticky notes and group forums are available that allow people to interact with one another and discuss their ideas about the information. This component could facilitate students learning from one another.

 

The annotation component of Diigo is quite extensive, and this is very beneficial if you are someone like me who likes to use lots of highlighters and mark up text. My favorite feature is that you are able to highlight in four different colors! Additionally, you can add sticky notes to webpages or articles, and they can be either tied to a highlight or freely positioned. Furthermore, you can take a screenshot and capture part of a page. This feature allows you to work with it visually as an image, and you can mark it up with colorful text, arrows, and shapes. The image is saved and linked back to the original article, and you can add a description and tags. Diigo archives all the webpages you save so you do not lose content if something is deleted.

https://www.diigo.com/ 

Overall, Diigo appears to be an awesome tool that would be helpful not only for teaching, but for personal organization as well. It is available as an app, so you are able to access all your information and resources on your phone or iPad. The one downside of Diigo is that the free version does not allow you to store or annotate PDFs. To do so, you must pay $5-6 per month. You can, however, save links to PDFs. While this is unfortunate, it is nice to be able to highlight and mark up different webpages and save them for future reference. I currently have a mess of bookmarks, and I think I will start using Diigo to organize information I come across online.

Socrative (Smartphone Clicker)


Submitted by Tennisha Riley

I am taking some of the Preparing Future Faculty courses to earn the PFF distinction. I believe the program is run in conjunction with the ALT Lab. The course I took last semester was ‘Technology in Higher Education,’ and it offered some really good resources and examples of using technology to improve instruction. One of the ideas I really liked was engaging students in the course through active feedback. Most instructors use ‘clickers’ to do this, but clickers are expensive and do not really get their full use. One of the resources they provided was Socrative. It is a web-based platform that allows you to ask quick multiple-choice and short-answer questions in class. BUT instead of using a clicker, students can use a smartphone or any electronic device and download the app.

Students typically bring an electronic device to class anyway, so I thought this was a great way to get students more involved through technology. On their website they have tons of resources shared by instructors who are using Socrative. You can use this feedback in multiple ways. Some examples:

– Create a wordle! This one gets an exclamation point because it is one of my favorites. I think it would be great to get students’ feedback on a prompt like “what word comes to mind when you think about adolescence?” and then present the responses to them visually.

– Voting. Students can vote on pre-constructed answers or peer answers.

– Short answer. I think this is a good way to get students who are not as vocal involved. Aside from reading the answer aloud, you could always ask them to explain why they decided upon an answer.

– Reports. You can get instant feedback on whether students understand the material.

I wish I could say how user-friendly it is, but I do not have a teaching assignment this semester. However, if you all want to humor me, I can give you my classroom link and we can start a wordle!

Weblinks below:

http://www.socrative.com/index.php

http://www.socrative.com/resources.php

Classroom Blogs


Submitted by Stephen Molitor

For my technology assignment, I decided to explore the potential of using blogs as part of a course. Although you could encourage each student to make their own blog as part of a major assignment, I decided to practice building a single blog that the entire class could use. For this assignment, I took advantage of the “rampages” website that VCU has purchased; rampages is essentially a premium version of WordPress, so you have access to some extra bells and whistles. I learned several important lessons when attempting to construct my practice blog; some of them were positive and some of them… were valuable learning experiences.

I first attempted to simply dive into a blog using all of the default settings. This was an extremely poor choice. I wasn’t happy with the default settings, tried to change my general blog theme, and messed up the format of the blog so much that I had to start the process over. If you plan on using blogs in your own courses and are not familiar with the mechanics of WordPress, I cannot emphasize enough how valuable the introductory tutorials are. They provide you with some great information, like “play with different blog themes before adding any substantial detail to your blog.”

Once I had my theme nailed down, building the structure of the blog was actually pretty quick. I decided to make a practice blog for a history of psychology class because I felt it lent itself pretty easily to a blog format. I built several pages for my blog, treating each one like an individual assignment page where students could supply their own posts. For example, the first page I built was a biography page where students could submit summaries of different psychologists. One nice feature of the rampages setup is that you can allow your students to have varying levels of control over the blog, from simply reading and commenting on posts to submitting their own content. While initially creating your blog takes some finesse and patience, putting together a blog post is pretty straightforward. You could have students submit a practice post to make sure they understand the process, but you likely won’t need to devote much instruction time to the ins-and-outs of posting. I would recommend encouraging students to include photos, videos, or external links in their posts. It keeps the blog from simply becoming a wall of words and can connect students to other online resources.

I think there is some great potential for blogs as a component of a psychology course, especially for courses that tend to be a little heavier on surface-level content, like introductory courses. Students can build a blog through individual and group assignments, and the amount of information added to the blog can turn it into a great review tool when it comes time for an exam. I also liked the idea the folks from the ALT Lab brought up of making the blog available even after the course is done. It can be a quality resource for students as they complete other courses, and it can be clear and explorable evidence of an instructor’s incorporation of technology into the classroom.

Here’s the link to the practice blog I made: http://rampages.us/molitorsj/

Remind Smartphone App


Submitted by Chelsea Hughes

The Remind smartphone app is a FREE classroom technology that allows teachers to communicate safely and effectively with students and, for younger age groups, their parents. The app begins with the teacher, who creates a unique code for their class. Then, using that code, students can sign up. Features of the app include the following:

– Text-based communication: Teachers can send mass messages to the class, and the app can be synced to either your email or your standard texting. It does not share your phone number. I really like this aspect, since so many students don’t check their emails regularly! Messages are also stored and easily accessed as a full message history.

– Stamps: Stamps allow you to receive feedback on your communications. Remind will even organize the output data for you. This would be a great way to conduct smaller evaluations, or launch mini quizzes for class. Stamps also track when individuals have seen the message.

-Scheduling: Because the app has a built-in calendar, you can schedule tasks for your students to see (exam due dates, etc.). You can also schedule messages to be sent at specific times. I think this would be particularly helpful for out-of-class tasks that the students have to do. If, for example, they are supposed to attend a presentation that is outside of normal class hours, you could schedule a reminder message to be sent out.

-Attachments & Voice Clips: The app doesn’t just limit you to text. You can attach pictures and PDFs, record voice clips, and more. And, just like the messages, you can keep track of who has seen and interacted with the attachments.

Overall, I think this app is an excellent way to facilitate communication between students and teachers. I think this would be most helpful in smaller classes, or perhaps practicum-based or service-learning classes (in which consistent communication is important). Like many other classroom technologies, I think it does well to consolidate information and provide convenient access and storage of that information. Because of its multiple functions and versatility, it seems like every class could use Remind’s features to some extent. In the future, I’d like to utilize this tool for a class I’m teaching!

https://www.remind.com/learn-more

Available on iOS, Android, Tablets, and Computers.

Jing Screencasts


Submitted by Athena Cairo

For the technology activity I decided to look at using Jing as a way to create short screencast SPSS tutorials for my Methods students to use. I found in class I was having to go over how to do basic tasks several times, so I thought it would be helpful in the future to have an external resource to point students toward instead of spending extra lab time going over how to use SPSS.

When you download Jing, the first thing that pops up is the option to watch a tutorial. I found this helpful, as the program itself is very minimalistic and does not have many explicit instructions, other than labels for your buttons.

Once you open Jing, a yellow sun icon moves to the top center of your browser. It fades slightly behind whatever you’re working on, which is also useful since you can activate it whenever you want to start a screencast or take a screenshot while it remains unobtrusive.

On the sun icon you have the options to start a new screencast, look at screencasts you’ve made, and change the general options for the program. If you opt to start a new screencast, it will open up another small toolbar at the bottom of your screen as well as a crosshair centered at your mouse pointer. You can use the crosshair to click and drag a box around whatever you want to capture. You can also click in the center of your screen to automatically capture the whole desktop screen.

Then, you choose whether to take a picture or video. If choosing a video, you can test your microphone volume before recording. Then, once you tell it to start, you can just begin recording and talk while you demonstrate the tutorial. After you finish making the video, the video/picture is also saved to your Jing account. You can immediately upload it to Screencast.com and generate a URL, but at this point I ran into difficulties because the videos wouldn’t upload automatically and create a URL. Instead, I had to go to my Screencast.com account and manually upload the files that had been saved to my computer.

One thing I found a little frustrating was that I couldn’t get the microphone tester to appear again after the first time I made a video; it only seems to appear once. After my first video, the sound was still way too low, so I ended up having to just keep making short sound clips to test the volume after that. It also would have been nice if there were some video editing capabilities: if I made a mistake while talking, I would have to go back and create a whole new video. However, apparently if you have access to Camtasia, or other movie editing software like iMovie, you might be able to use that to edit your video.

One thing that is important to remember is that after you have set up the viewing screen for what you want the screencast to capture, you can’t do anything on the browser behind the viewer until you hit Play, or disable the viewer. A few times I realized that I needed to re-do something in SPSS (like get rid of something in the output or windows), and I had to close Jing in order to do that since the browser was frozen.

The video quality of the finished product seems good. However, the screencasts require Flash to view, so the videos might be difficult for people on iPhones or iPads to view.

Overall I found Jing very easy to use, and I expect to keep using it as a quick tool for making videos, especially for short tutorials like working with statistical programs, library databases, or other programs. I would definitely recommend it to those of us who might want to give our students short tutorials on a program, or navigating a website, that we may not want to spend extra time in class revisiting.

Here are my videos:

Creating new variables in SPSS: http://www.screencast.com/t/8bhcBecdAG

Bivariate correlations in SPSS: http://www.screencast.com/t/fSqPlZ5vDd

Online Academic Integrity


Submitted by Athena Cairo

Article Reference

Mastin, D. F., Peszka, J., & Lilly, D. R. (2009). Online academic integrity. Teaching of Psychology, 36, 174-178. DOI: 10.1080/00986280902739768

Article DOI

10.1080/00986280902739768

Summary of Article

In this study, the authors investigated the extent to which students generally would tend to cheat on an online extra-credit assignment, and whether the time of the semester or signing an honor pledge might moderate these tendencies.

Background

In some older surveys of college students (1, 2), between 40% and 83% of students reported engaging in academic dishonesty at some point in their college career. Between 2% and 13% of students reported having cheated in a traditional lecture course in which they were currently enrolled (3). Although slightly more than half of faculty and students think that it would be easier to cheat in an online setting (4), only 3% of students admitted to cheating in online courses (5). Additionally, reports suggest schools that have honor codes tend to have lower rates of cheating among the student body (6). In light of these points, the authors conducted an experiment to investigate the rates of cheating in their student body, the effect of writing an honor pledge with the assignment (experimental condition), and the effect of the time of the semester on cheating rates (moderating variable). Specifically, the researchers hypothesized that (1) being asked to agree to an honor pledge would discourage cheating, and (2) students would be more likely to cheat later in the semester than earlier in the semester.

Methods

Participants were 439 undergraduate students taking an Intro to Psych class. Participants were tested over the course of three different time points, September 2005 (n = 141), May 2005 (n = 124), and May 2006 (n = 174).

Participants were told they could receive up to 10 bonus points for participating in the study, and were told it was a pilot study of a motor task. Upon signing up to participate, P’s were assigned to one of three pledge conditions: no pledge, check-mark pledge, or typed-out pledge.

P’s accessed the motor task online, which was designed to be especially difficult (so students would have an incentive to cheat). P’s were told their points would be tied to their performance on the motor task, but also that the page could not track their performance on the task.

P’s completed the motor task by hitting a computer key when the correct number appeared on the screen. At the end of the task, participants reported their number of hits in a text box, allowing them to cheat by over-reporting their successful hits. P’s were debriefed 7 days after participation.

Results

  • Fourteen percent of participants cheated by over-reporting their hits. Overall, participants reported better performance than they actually obtained; t(438) = –5.37, p < .05, d = .26.

  • Honor pledge conditions had no effect on cheating frequencies across all P’s, nor did they predict greater severity of cheating among those who did cheat.
  • Participants were twice as likely to cheat at the end of the semester as at the beginning, χ2(2, N = 423) = 6.41, p < .05, Cramer’s V = .12. Time of semester (end vs. beginning) also predicted greater severity of cheating.
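The chi-square comparison in the last bullet can be illustrated in outline. The sketch below computes a chi-square statistic of independence by hand on a cheating-by-time-of-semester table; the counts are invented (the article does not report its full contingency table here), so only the method, not the numbers, reflects the study.

```python
def chi_square(table):
    """Chi-square statistic of independence for a contingency table
    given as a list of rows (rows: groups, columns: outcome counts)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: [cheated, honest] early vs. late in the semester
observed = [
    [10, 90],   # early semester: 10% cheated
    [20, 80],   # late semester: twice the cheating rate
]
print(f"chi-square = {chi_square(observed):.2f}")  # prints "chi-square = 3.92"
```

Comparing the statistic to a chi-square distribution (df = 1 for this 2x2 table) would then give the p value; the study's own test had df = 2 because it compared three data-collection periods.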

References:

(1) Bunn, Caudill, & Gropper, 1992; (2); Davis, Grover, Becker, & McGregor, 1992; (3) Kerkvliet & Sigmund, 1999; (4) Kennedy, Nowak, Raghuraman, Thomas, & Davis, 2000; (5) Grijalva, Nowell, & Kerkvliet, 2006; (6) McCabe, Trevino, & Butterfield, 2002

Discussion Questions

  1. Many students and faculty think it’s easier to cheat online than in a traditional lecture course. How could you as an instructor help prevent cheating on different types of assignments (e.g., research papers, tests, online quizzes) in both an online and a traditional lecture course?
  2. Do you think that having students write/sign an honor pledge helps prevent cheating at VCU? Might this depend on different classroom contexts or types of assignments?
  3. Additionally, how can we help students not feel as stressed and compelled to cheat toward the end of the semester?

    Bonus question: Say you find a student who severely cheats on or plagiarizes an assignment. What would you do? What if the student was someone you liked or knew was going through difficult circumstances?

The role of feedback during academic testing: the delay retention effect revisited


Submitted by Chelsea Hughes

Article Reference

Dihoff, R. E., Brosvic, G. M., & Epstein, M. L. (2003). The role of feedback during academic testing: The delay retention effect revisited. The Psychological Record, 53(4), 533-548.

Article DOI

Summary of Article

THE STUDY:

Participants: 33 male, 62 female undergrad psychology students
Procedure: 5 multiple-choice quizzes throughout the semester. Final exam consisted of 50 items – 10 items randomly selected from each quiz.
Conditions (quiz format): no feedback (traditional answer sheet); no feedback (Scantron answer sheet); end-of-test feedback; delayed feedback (24 hours); immediate feedback (IF-AT, the Immediate Feedback Assessment Technique).

RESULTS:

# of items recalled from quizzes: IF-AT > End-of-test & Delayed > Traditional & Scantron
% of correctly identified initial errors: IF-AT > Delayed > End-of-test
% of correctly identified initial responses: IF-AT > Delayed > End-of-test
Mean confidence rating after test: IF-AT > End-of-test > Delayed
% selecting same incorrect response: Traditional & Scantron > End-of-test > Delayed > IF-AT

CONCLUSIONS:

“Immediate feedback promotes recall, the most accurate identification of initial responses, increased confidence in answers, and reduces perseverative incorrect responding.”

Discussion Questions

  1. What are your experiences with the conditions, or learning methods, presented in this article? What did you find useful, and what did you find not useful?
  2. This study only utilizes quizzes as a method of testing knowledge throughout the semester. What other methods could you use, while still incorporating immediate feedback?
  3. An interesting aspect of this article is the focus not only on retention, but also on perseverative incorrect responding and remembering which answer you put first. Both of these address the important issue of “Why did I get this wrong in the first place?” Short of addressing it on a one-by-one basis, how can you incorporate this important aspect of learning in the classroom?