POSSE evaluation

Introduction

This document is meant to serve as the basis for a public discussion of getting an evaluation effort going for POSSE.

Status

(Provided by Heidi Ellis on 17 May 2010) Greg Hislop and Heidi Ellis have provided a draft of a post-POSSE survey for obtaining attendee opinions and attitudes. Steps are underway to gain IRB approval for the Worcester State POSSE.

File:POSSE Post Survey.pdf

(Provided by Greg Hislop on 27 April 2010) Greg Hislop and Heidi Ellis have been continuing an effort to draft some survey instruments that can be used for this summer's POSSEs. Karl Wurst is now getting involved in that conversation to be sure that we will have something useful in time for the next POSSE, which Karl is hosting at Worcester State. Drafts will be posted to TOS for general comment, too, as soon as we have something worth reviewing.

Evaluation Questions

The evaluation will seek to determine:

  • How successful are POSSE attendees at adding OSS instruction to their curricula and what are the enablers and inhibitors of this adoption?
    • This is a reasonable focus, adding in factors like how they do it (as lectures in an existing course, creation of new courses, etc.). SJ
  • How effective are the POSSEs at preparing instructors to teach students to contribute to OSS?
    • This is a long-term study with a lot of variables. Also, how do you define how well you prepped them? Merely the above? Student success in their courses? The number of students who become active contributors? sj
  • What is the impact of OSS instruction on student learning? (a) learning about OSS, and (b) learning about general computing concepts.
    • See above. Also, it is hard to judge how well they learned general computing concepts unless you know the students had no computing coursework in high school or college before taking the OSS course. sj
  • Are there additional questions that we should try to answer to demonstrate benefits of POSSEs for Red Hat?
    • What other benefits does Red Hat want proven? The number of universities picking up software or support after a POSSE instructor launches a course? PR benefit, assuming Red Hat puts effort into promoting POSSEs? sj
  • What is the impact of OSS instruction on participation in OSS projects, where contribution is defined as ...? (We still need a definition of contribution.)
  • What percentage of the folks reach the foothills? The first plateau? (using the "contributor mountain" analogy; link needed)
  • Where do they fall off the mountain and why?
    • I think all of the above require first basic research on OSS development and community in general, and then, much later, a look at whether students who have had formal OSS instruction as part of their college experience do better than those who didn't get it. sj

Potential subjects

Instructors who attend a POSSE

What to measure:

  • attitude toward OSS
  • motivation to teach OSS
  • confidence in ability to teach OSS
  • OSS knowledge and skill
  • OSS experience
  • experience teaching about OSS

When:

  • pre-POSSE
  • immediately post-POSSE
  • 6-12 months post-POSSE
  • 12-24 months post-POSSE (turning a desire to teach into a seminar or class can be slow going in some institutions)

[MCJ] A few questions:

  • What level of rigor are you interested in here? Do you want a survey that you can use internally, or that will have rigor if you want to publish about it later?
    • Rigor should be as if it were going to be disseminated at conferences, at least. With the summer cadres concluded, and something like 40ish professors and 5-6 sites between the initial POSSE and those completed by September or October, it should be attractive to FIE as a work-in-progress paper/presentation for their fall 2011 conference. As a survey to help shape future POSSEs, promote other efforts, etc., it will be useful, and the stats alone may be helpful to some profs trying to convince their institutions to host one or consider it appropriate professional development. SJ
  • By doing a pre-test on attitude and motivation, you will likely influence responses when the same instrument is re-issued later. You might look at the Intrinsic Motivation Inventory (IMI) for a more detailed exploration of what you seem to be suggesting.
  • You are unlikely to have an n large enough to reach statistical significance (see the power-analysis sketch after this list).
    • See above. Also, the real significance for colleges would be in students' success in careers in open source, which only happens after the post-POSSE profs start tracking their students' success post-graduation. (say it three times fast, post-posse-profs) SJ
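
To make the sample-size concern above concrete, here is a minimal power-analysis sketch in Python. It assumes the statsmodels library is available; the "medium" effect size (Cohen's d = 0.5) and the 40-attendee split are illustrative numbers, not figures from the POSSE plan.

    # Rough power analysis for comparing two groups of survey scores.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()

    # Participants needed per group to detect a medium effect
    # (Cohen's d = 0.5) at alpha = 0.05 with 80% power.
    n_needed = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
    print(f"n per group for d = 0.5: {n_needed:.0f}")  # about 64 per group

    # Conversely: the power available if roughly 40 attendees
    # split into two groups of 20.
    power = analysis.solve_power(effect_size=0.5, nobs1=20, alpha=0.05)
    print(f"power with 20 per group: {power:.2f}")  # roughly 0.34

In other words, cohorts of this size can describe trends and shape future POSSEs, but detecting modest effects at conventional significance levels would take far more participants.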

Would it be better to be capturing rich documentation about your faculty -- an interview, short essay answers to these questions that can be subjected to document analysis later, etc. -- at this stage? I know people seem to like numbers, but how do you measure "attitude" or "teaching experience"? I think that's challenging, and ultimately, does it measure the effectiveness of the POSSE process? (It might... but I feel like you have way too many questions flying around and not a clear enough statement of what story you want to tell through your data.)

    • I think doing both is best. SJ

I'd also like to recommend "Multi-Institutional, Multi-National Studies in CSEd Research: Some design considerations and trade-offs", as I think the challenges of getting good data in this space are similar.

Instructors who have not attended a POSSE

These instructors will be a control group to compare to the POSSE attendees.

What to measure:

  • attitude toward OSS
  • motivation to teach OSS
  • confidence in ability to teach OSS
  • OSS knowledge and skill
  • OSS experience
  • experience teaching about OSS (or a subset of this)

When: any time, but particularly at a point when a fellow faculty member is trying to introduce OSS in teaching

[MCJ] You're proposing a cross-cultural A/B comparison study. These are hard to do well. Any good statistician will probably throw your comparative survey out if you're not careful. This is also an area where I have little to no experience, precisely because this kind of study is hard to do well; it requires care and planning. For lack of anywhere else to start, I've pointed to the wiki page on the subject (Cross-cultural studies).

In essence, you seem to be interested in questions of culture across two cohorts -- faculty who are apparently interested in TOS (because they are attending a POSSE), and those who are not (because they aren't). There is a lot of self-selection bias, and your control group never had the opportunity to take part... so, are they really a control?

Again, I'm not trying to shoot things down before they start, but I am asking questions (because I don't know the answers, and I know this is a subtle space).

    • I assume this means folks who are teaching OSS without having had the opportunity to attend a POSSE. People who aren't interested in teaching OSS won't. What you probably want to compare is what their backgrounds were before they started, and what their successes, frustrations, and failures were, versus folks who went through a POSSE. Different animal. SJ

Students

Students who have or have not been exposed to OSS learning materials delivered by an instructor who has attended a POSSE (two groups).

What to measure:

  • attitude toward OSS
  • OSS knowledge and skills
  • ability to contribute to an OSS project
  • attitude toward computing as a career
  • general computing knowledge and skills

When: a quasi-experimental design comparing students in two groups (those who do and do not learn about OSS), measured at two points in time: before and after that learning.

    • This is a really long-term study. You need the folks who have been through POSSEs to go home, find a way to add seminars or classes, get students through them, and then start asking the students about their experience. As the first offering is likely to be the shakiest, you may not even want to do this until the prof has had more than one cohort of students go through the material. IRB approval is needed too, unless the school, like RIT, considers surveys taken as part of a course acceptable without IRB approval. SJ

Notes

We should also look at instructors of POSSEs, especially since the POSSE structure is to have attendees turn around and become instructors.

How do we take cultural factors into account? The initial POSSEs are quite international.

    • Suggest you chat with Tom Edwards of Englobe. His current consulting practice is centered around the games industry, but he's been doing assessment of cultural-factors work for close to two decades. Good guy. SJ

How do we take disciplinary factors into account, or are we restricting ourselves to CS-focused POSSEs and CS instructors?

    • Depends on how you define contributor and contributions. SJ

Approach

The central participants are the instructors who attend a POSSE. A key will be to enlist their help in doing the evaluation. In part, this can be encouraged by setting some level of participation in the evaluation as an expectation in return for being able to attend the POSSE. While this would be done in a low-key fashion, and not actually enforced, it should have some value in creating a group norm.

Also, evaluation is in the self-interest of participants. It is helpful to them to be able to show the impact of changes in their teaching, and there will be opportunities for them to participate in publications coming from the evaluation.

The initial level of this participation by POSSE attendees would cover their own responses to the evaluation (pre-POSSE, immediately post-POSSE, and delayed post-POSSE). A second level would be to enlist their participation in an evaluation that extends to other faculty at their institution and to their students.

Methods

Surveys are the most likely instruments for most of the measurement. These should be prepared for both paper and online administration, and should include both Likert-scale questions and open-ended questions that solicit comments. Structured interviews might be useful as an additional approach, particularly for the delayed post-POSSE measurement, to get more detail on what actually happens when faculty return to their own institutions.
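
As an illustration of how the Likert blocks could be scored, here is a short Python sketch that aggregates a hypothetical set of items and checks their internal consistency with Cronbach's alpha. The item names and responses are invented for the example, not taken from the draft survey.

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for a DataFrame whose columns are Likert items."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical responses: 1 = strongly disagree ... 5 = strongly agree
    responses = pd.DataFrame({
        "oss_attitude_1": [4, 5, 3, 4, 5],
        "oss_attitude_2": [4, 4, 3, 5, 5],
        "oss_attitude_3": [3, 5, 2, 4, 4],
    })
    print(f"attitude scale alpha: {cronbach_alpha(responses):.2f}")  # ~0.88

An alpha of roughly 0.7 or higher is the usual rule of thumb for treating a block of items as a single scale; below that, the items should be revised rather than averaged together.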

For student learning, a quasi-experimental design may make it possible to compare students who have and have not been exposed to OSS teaching.
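
A minimal sketch of how that comparison might be analyzed, assuming pre- and post-test scores are collected for both groups (the file and column names here are hypothetical): regress post-test scores on pre-test scores plus group membership, ANCOVA-style, so that the group coefficient estimates the effect of OSS instruction adjusted for where students started.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Expected columns: student_id, group ("oss" or "control"), pre, post
    df = pd.read_csv("student_scores.csv")

    # ANCOVA-style model: the C(group) coefficient is the difference in
    # post-test performance between groups, adjusted for pre-test score.
    model = smf.ols("post ~ pre + C(group)", data=df).fit()
    print(model.summary())

Adjusting for the pre-test matters because the groups are not randomized; it partially compensates for pre-existing differences, though, as noted above, it cannot remove self-selection bias entirely.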

For instructors, comparable groups that have and have not attended POSSEs could be compared.

First Steps

What has been done so far? Is there a pre- or post-POSSE evaluation? (Answer: nothing concrete yet; we have anecdotal data, but nothing usable, because we have not done an IRB yet.)

The obvious target would be to try to start this process for the POSSEs coming up this summer. At minimum, we could do a pre- and post-POSSE evaluation and try to add to these POSSEs an element of encouraging attendees to participate in an ongoing evaluation effort when they return to their home institutions.

If we enlist faculty to collect data on students, we will need to figure out the simplest way to handle the IRB (Institutional Review Board) issues. We should start on that now. See POSSE IRB.