Page 2 of 2 FirstFirst 12
Results 21 to 39 of 39
  1. #21
    Bountifully Enmeshed
    Quote Originally Posted by rfisher View Post
    Based on everything I've read, it seems it's actually quite difficult to quantify critical thinking, and teaching it is even more difficult to quantify. It's easy to make esoteric definitions of what it should be; not so easy to teach it or to determine how to improve the results. Our nursing program decided to use the standardized ATI test for incoming students and then readminister the exam when they graduated, the theory being that scores would show progression. The reality is that the scores were essentially the same, and some students' scores were actually lower. And the relationship to academic success in the program was statistically unreliable.
    That's pretty much what I would expect--and will continue to expect the more that we are pushed to assess students in some sort of objectively quantifiable way. It really can't be done; it takes human judgment to determine what a main argument is, whether or not students can identify it, how effectively they can address the issue raised, etc.

    And it's not like your school is unique in this outcome, either: http://www.msnbc.msn.com/id/41136935...learning-much/

    I've been to two assessment seminars this year and the impression I am getting is that grading papers is all fine and good and important and all, but what is really being sought is some sort of foolproof system in which everyone teaches the same stuff and could give the same tests with the same outcomes because the answers are either right or wrong.

    When someone points out that the only way to assess critical thinking is to analyze written or oral discussions, well ... there must be a better way. I haven't been told what it is, exactly, but everyone seems sure that it can be done. They want a standard that students can grasp easily but can't argue with--something like a math test only not, you know, math. And with the same standards for everyone (which doesn't happen in math, either).
    "The secret to creativity is knowing how to hide your sources."-- Albert Einstein.

  2. #22
    GPF Barcelona here I come
    Here in Canada there are signs of a movement away from standardization and testing. The medical schools (McGill) have removed MCAT testing, and the vet school has as well. I actually did a paper on 'unschooling at the graduate level.'
    Thanks to PI .. I discovered I'm actually a Nontheist

    "Love is better than Anger, Hope is better than fear" Jack Layton 1950-2011

  3. #23
    Satisfied skating fan
    Quote Originally Posted by Prancer View Post
    That's pretty much what I would expect--and will continue to expect the more that we are pushed to assess students in some sort of objectively quantifiable way. It really can't be done; it takes human judgment to determine what a main argument is, whether or not students can identify it, how effectively they can address the issue raised, etc.

    And it's not like your school is unique in this outcome, either: http://www.msnbc.msn.com/id/41136935...learning-much/

    I've been to two assessment seminars this year and the impression I am getting is that grading papers is all fine and good and important and all, but what is really being sought is some sort of foolproof system in which everyone teaches the same stuff and could give the same tests with the same outcomes because the answers are either right or wrong.

    When someone points out that the only way to assess critical thinking is to analyze written or oral discussions, well ... there must be a better way. I haven't been told what it is, exactly, but everyone seems sure that it can be done. They want a standard that students can grasp easily but can't argue with--something like a math test only not, you know, math. And with the same standards for everyone (which doesn't happen in math, either).
    Exactly. We have to do this (and the Department of Education is pushing it), but nobody really knows how. Our University spent three years deciding how to redo the general ed curriculum to include critical thinking. The upshot was a lot of dithering and a final decision to just make some classes meet the criteria. They all patted themselves on the back for a job well done until this spring, when the assessment monster arrived. Now they actually have to do something. I called my accreditation agency, most of whom have master's degrees in education rather than imaging, and asked what they wanted. I got doublespeak which essentially meant: we don't know, but we know we want it and are hoping you can figure it out. I ask my faculty, and everybody knows what it should mean and gives me the "I don't understand what the problem is" look. Then I ask for input on exactly how to quantify this. How are we to provide evidence that students demonstrate progression? And if they don't, what are we going to do about it? Give them more assessments? More tests? More papers? How are we supposed to address the fact that they either think or they do rote learning? I get blank looks and mumbles that that is my job, not theirs. One went so far as to say, if we don't meet the benchmarks, just lower them till we do, and then you don't have to figure out how to address the issue before the fact. I told her, if you don't know what you are doing and can't establish how you are going to do it before the fact, how the hell do you think you'll figure it out after? Insight from the gods?

    And PML at their including U Charleston in that study. Why did they beef up writing in nursing and biology? Because over 75% of their master's theses--master's, mind you--were rejected due to poor writing and worse research. I was at a Dean's meeting at our University where they were discussing this, and they were shocked when they realized their own statistics were no better. So they instituted writing across the curriculum as the solution. It's a joke. I just jumped through the hoops to get two of my classes designated as WAC classes. The hoops dumb down the process, not make it better. I had to add all sorts of nonsense. The upshot is, the committee was thrilled with the result even though I think it's stupid.
    Last edited by rfisher; 06-08-2012 at 03:08 AM.
    Those who never succeed themselves are always the first to tell you how.

  4. #24

    Quote Originally Posted by Prancer View Post
    They want a standard that students can grasp easily but can't argue with--something like a math test only not, you know, math. And with the same standards for everyone (which doesn't happen in math, either).
    Even in math or similar subjects, where there is often a well-defined right answer, there is still a lot of subjectivity because it isn't just about getting the right answer, it is about the process of getting the right answer. So we often give lots of partial credit, which is inherently subjective. What are the most important steps in formulating and solving the problem and how many points are they worth?

    I have some colleagues who give no partial credit, while others like myself give lots of partial credit. So, if a student works a problem through and gets the answer 2x4+1, which is correct, but makes a stupid error and writes the final answer as 7, some profs would give 0 credit, while others would give full credit. Which is the correct allocation of points? My colleagues who give 0 credit can give compelling arguments that their approach is best (they've had many years of defending their policy to students). You know, one minor calculation error and the bridge will fall down. But I view things quite differently.
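    A step-based partial-credit scheme like the one described here can be sketched as a tiny rubric, where each step of the solution carries its own weight. The step names and point values below are purely illustrative, not anyone's actual grading policy; the point is that choosing them is exactly where the subjectivity lives.

```python
# Illustrative sketch of step-based partial credit for one math problem.
# The steps and weights are hypothetical; every grader would pick their
# own, which is where the subjectivity comes in.

RUBRIC = {
    "sets up the equation correctly": 4,
    "applies the solution method correctly": 4,
    "states the final answer correctly": 2,
}

def score(steps_earned):
    """Sum the points for the steps the grader judged correct."""
    return sum(RUBRIC[step] for step in steps_earned)

# A student who works the whole problem correctly but botches the final
# arithmetic still earns 8 of 10 points under this scheme...
partial = score(["sets up the equation correctly",
                 "applies the solution method correctly"])

# ...while an all-or-nothing grader awards 0 for the very same work.
all_or_nothing = 10 if partial == 10 else 0

print(partial, all_or_nothing)  # 8 0
```

    Same student, same exam, two defensible scores--which is the point being made about "objective" math grading.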

    Then there was the time my husband took graduate-level math stats and was asked on an exam for the variance of the sample mean for a particular setting. He had the right answer (he happened to know it from other stats classes), but he was given no credit because he didn't show any work. He was annoyed, but I was a bit amused, because everyone knows math stats is all about the process, i.e., why that is the correct equation for the variance of the sample mean.
    Creating drama!

  5. #25
    Bountifully Enmeshed
    Quote Originally Posted by jeffisjeff View Post
    Even in math or similar subjects, where there is often a well-defined right answer, there is still a lot of subjectivity because it isn't just about getting the right answer, it is about the process of getting the right answer. So we often give lots of partial credit, which is inherently subjective. What are the most important steps in formulating and solving the problem and how many points are they worth?
    I know this, but every time I make this point, I am shot down in flames by people who are convinced that a math test is a math test and all the answers are right or wrong.

    I always use the example of two math professors I know, one of whom grades all homework assignments, never curves, and gives incredibly difficult tests, while the other never grades homework, always curves, and gives take-home exams. Believe it or not, their course GPAs are a little different.

    And I get "Yes, but the answers! The answers are either right or wrong!"



    I was tearing my hair out over assessments after the seminars, partly because of what I heard and partly because the state had handed down a set of objectives we have to meet, and I had and still have no idea how to do what they want in a coherent, cohesive sort of way. Then I went to the department meeting and found out that everyone is just putting the state stuff on their syllabi and then completely ignoring it and doing whatever they want.

    I would think in an applied field like rfisher's, there wouldn't be a whole lot of demand within the course for critical thinking--not because there isn't thinking involved, but because "critical thinking" is generally defined in very academic ways; critical thinking is all about objectively analyzing subjective information in a scholarly fashion. It's a different kind of assessment.

    I've heard that the nursing program at my school is planning to do their own writing courses. Yeah, that'll be good.

  6. #26
    Registered User
    Quote Originally Posted by Prancer View Post
    I went to the department meeting and found out that everyone is just putting the state stuff on their syllabi and then completely ignoring it and doing whatever they want.
    Very true

    I don't find it all that difficult, because it's basically using a mathematical rubric to quantify what I do anyway. Since I've always used point systems for grading, it's not that difficult to slice and dice it with particular goals, objectives, and benchmarks.

    The reality, though, is teachers tend to have a sense of what a student has earned, and what skill sets they have mastered at what level, so the rubrics are basically fudged to arrive at the subjective grade.

    But it helps the number crunchers out there to quantify that way, so all is well.
    Quote Originally Posted by Prancer View Post
    I would think in an applied field like rfisher's, there wouldn't be a whole lot of demand within the course for critical thinking--not because there isn't thinking involved, but because "critical thinking" is generally defined in very academic ways; critical thinking is all about objectively analyzing subjective information in a scholarly fashion. It's a different kind of assessment.
    That's why I'm not exactly sure what rfisher is looking for since these things tend to be very specific to particular programs. No one here is going to be able to come up with specific benchmarks for a particular radiology program.

  7. #27
    Satisfied skating fan
    Quote Originally Posted by Prancer View Post

    I would think in an applied field like rfisher's, there wouldn't be a whole lot of demand within the course for critical thinking--not because there isn't thinking involved, but because "critical thinking" is generally defined in very academic ways; critical thinking is all about objectively analyzing subjective information in a scholarly fashion. It's a different kind of assessment.

    I've heard that the nursing program at my school is planning to do their own writing courses. Yeah, that'll be good.
    One would think that, except JRCERT has declared that critical thinking is indeed a desired goal, because, well, it's a higher-education buzzword. All I can do is ascertain a student's ability to adapt to different situations. The problem is that either a student can do this or they can't, and nothing we attempt to do will change that. Which means the outcome assessment is essentially predetermined and will not demonstrate growth. I know by the end of a student's first year exactly what type of imaging they should do. Five percent would make excellent trauma techs in a level 1 trauma center. They have the innate ability to assess a situation and make immediate adjustments to the norm. The other 95% need to work in an outpatient facility where there is little variation in the patient. I was in the 5% as a practicing radiographer. I loved trauma and the challenge it presented. I like innovation. The entire rest of my facility are in the 95%. They don't do well with change. At. All.
    I can find a way to deal with JRCERT; however, I also have to answer to the University as a whole, and they want assessments that are like the others. Rubrics are easy. Grading is simple. However, I'm not dealing with just individual courses. I have to do an assessment of the Program as a whole, which is a whole different ballgame.

    I suspect most of the faculty at the university are planning to do what they are doing at yours, except the department chairs will indeed have to compile that data. It's going to be collected at the course level as well as the program level, and the program level has to set different benchmarks that use similar assessment tools in different courses over the progression of a degree that will demonstrate the goal. At least our program has always had to do learning outcomes at the program level. Most of the University has not and is trying to ignore the new mandates. Department chairs are in a panic because funding is going to be tied to these outcomes.

    And again, Agal, I'm not talking about benchmarks. They are what you want them to be and are just a number. I'm talking about the assessment tool you use, not the arbitrary percent you decide is good, bad, or indifferent. They aren't the same thing. It's not what the answer is, but how you derive the answer that I'm interested in, because the answer is meaningless if the process isn't valid. I don't think there is a valid process for imparting critical thinking, but the people who determine if my program is accredited want me to say one of our program goals is that students will demonstrate critical thinking skills. A good goal, but how do I assure that they do when all the assessments indicate some do, some don't, and the program doesn't make those that don't do better?
    Last edited by rfisher; 06-08-2012 at 03:49 AM.

  8. #28
    Bountifully Enmeshed
    Quote Originally Posted by agalisgv View Post
    To me I don't find it all that difficult because it's basically using a mathematical rubric to quantify what I do anyway. Since I've always used point systems for grading, it's not that difficult to slice and dice it with particular goals, objectives, and benchmarks.
    The issue I am coming up against is the subjectivity factor. It's very easy to assign X points to "clear, arguable thesis," but the determination of "clear, arguable" is up to the professor.

    This means, I have been told, that while the professor grasps what a clear and arguable thesis statement is, the student may not. Students who do not understand the precise nature of the assessment will not be able to fulfill the requirement.

    Quote Originally Posted by agalisgv View Post
    That's why I'm not exactly sure what rfisher is looking for since these things tend to be very specific to particular programs. No one here is going to be able to come up with specific benchmarks for a particular radiology program.
    That is going to be tough; our benchmarks have to be tied to specific curriculum.

  9. #29
    Registered User
    Quote Originally Posted by Prancer View Post
    The issue I am coming up against is the subjectivity factor. It's very easy to assign X points to "clear, arguable thesis," but the determination of "clear, arguable" is up to the professor.

    This means, I have been told, that while the professor grasps what a clear and arguable thesis statement is, the student may not. Students who do not understand the precise nature of the assessment will not be able to fulfill the requirement.
    For us, we list subcriteria that explain a bit, but you're right, there's always a subjective element.

    For me it really hasn't been an issue, but that may be because of the uniqueness of my field and institution. The hardest thing for me is coming up with the graphics in which to insert the numbers.

    Yes, I'm innotechnic
    Quote Originally Posted by rfisher View Post
    Our nursing program decided to use the standardized ATI test for incoming students and then readminister the exam when they graduated, the theory being that scores would show progression. The reality is that the scores were essentially the same, and some students' scores were actually lower.
    Wouldn't that be a fault in the program's curriculum then? If students score lower than when they first entered, it would seem a curricular issue rather than an assessment issue.

  10. #30

    I came up with this handy essay formula during my university days.

    Intro: "The conventional wisdom is...." or "There are 2/3/4 schools of thought with respect to..."

    Then summarize those arguments with both academic and real life sources.

    Then begin your argument/thesis: "However, such conventional wisdom should be reconsidered because of the following reasons..." or "Such conventional wisdom has led to the following awful tragedies..." or "Theory A is better than Theory B because..." or "Theory A and Theory B have their advantages and disadvantages, but when taken together...BRILLIANCE!"

    Your argument is then supported by academic and real life sources (so, even if you're questioning the status quo, you rely on studies and people who are part of that status quo).

    And then conclusion...

    "My brilliant argument/thesis has academic, social, and economic implications..."

    Then BS those academic (more studies to fund!), social (no more racism!) and economic (more jobs, more money!) implications.

    That is critical thinking.

  11. #31

    Quote Originally Posted by rfisher View Post
    Except that isn't a measurable skill. The STAT order always takes precedence. We don't really teach triage as that isn't within the radiographer's scope of practice. Your manager was actually in non-compliance with the ARRT.
    Problem was, they were all submitted as STAT, so the rad staff had to sort them out. Which STAT is more STAT? This was a large outpatient center, and at times, there'd be a dozen people waiting, all dependent on films to determine what happened to them next.
    AceOn6, the golf loving skating fan

  12. #32
    Satisfied skating fan
    Quote Originally Posted by Aceon6 View Post
    Problem was, they were all submitted as STAT, so the rad staff had to sort them out. Which STAT is more STAT? This was a large outpatient center, and at times, there'd be a dozen people waiting, all dependent on films to determine what happened to them next.
    Outpatients aren't STAT. That isn't triage based on a patient's critical needs; that's a scheduling issue for management, who should clearly have gone to senior management and requested additional equipment and technologists. I actually cover things like this in our management track, but it's not something we would deal with in the general radiography curriculum.

  13. #33

    Quote Originally Posted by rfisher View Post
    Outpatients aren't STAT. That isn't triage based on a patient's critical needs; that's a scheduling issue for management, who should clearly have gone to senior management and requested additional equipment and technologists. I actually cover things like this in our management track, but it's not something we would deal with in the general radiography curriculum.
    True, this was a funked up system, as it was a blended building. No one took any time to explain to the docs that STAT in the O/P rad center wasn't the same as STAT in the I/P imaging department. Docs went back and forth, so they tended to be in I/P mode most of the time, and if they ordered it STAT, they were always calling down to see what the problem was. Of course, it never occurred to them that 11 other docs were also ordering STAT. Real world, unfortunately.

    Back to the critical thinking thing... can you do anything around patient flow management or cultural competence, or are the students not at that level?

  14. #34

    Quote Originally Posted by Prancer View Post
    I know this, but every time I make this point, I am shot down in flames by people who are convinced that a math test is a math test and all the answers are right or wrong.
    Ugh. The idea that these things can be objectively quantified is silly. Quantified, sure. But objectively? Of course not. Failure to acknowledge the subjectivity of assessments is a dangerous trait in an educator or administrator.

  15. #35
    Registered User
    Quote Originally Posted by Aceon6 View Post
    Back to the critical thinking thing... can you do anything around patient flow management or cultural competence, or are the students not at that level?
    If I understand rfisher's original question, it was related to a new Department of Education standard requiring all courses to have a measurable objective goal, including those that are a bit "soft" or esoteric in nature.

    She is able to place those measurements on the clinical courses, and her program has had to have those objectives in place as regulated by the state boards of the profession and the professional regulation of radiologic technologists. Plus, she has been writing those for years, whereas the new regulations must be implemented as mandated by the DoE by this fall, or the course will be removed from the University's program.

    For as much as rfisher wants to be a hard-nosed professor, she is probably trying to help her colleagues write some measurable outcomes with a critical thinking component and was asking if we knew about any. Just my assessment of the situation.

  16. #36
    Bountifully Enmeshed
    Quote Originally Posted by jeffisjeff View Post
    Ugh. The idea that these things can be objectively quantified is silly. Quantified, sure. But objectively? Of course not. Failure to acknowledge the subjectivity of assessments is a dangerous trait in an educator or administrator.
    It's not being ignored; on the contrary, it's considered a major problem.

    I am not explaining this well because I am somewhat confused myself, but as I understand it, the problem is that one of the major purposes of the whole objective-assessment-rubric-assignment approach is that it is supposed to precisely clarify the purpose and expectations for a particular assignment, in part to avoid confusion on the part of the students.

    So, if I tell students to write a research paper, for example, I break the assignment down into its component elements in a rubric. One of those elements would be the thesis statement. So I would say "clear, effective thesis" and that would be worth X points, and I would define the thesis in a breakdown, just as agalisgv said, something like this:

    Thesis clearly identifies subject
    Thesis clearly identifies position
    Thesis clearly predicts course of paper
    Thesis is appropriate for audience

    Theoretically, this should identify for the student all the things that must be done in order to earn full points for the thesis statement. The expectation was that this method would reduce student confusion and the perception of subjectivity in the assessment process.
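    To see why spelling the rubric out doesn't remove the subjectivity, the thesis-statement breakdown above can be written as a small scoring sketch. The criteria come straight from the list; the point values and the two graders' ratings are invented for the example, and the 0.0-1.0 judgment for each "clearly" is still entirely up to the grader.

```python
# Sketch of the thesis-statement rubric above. Criteria are from the
# post; weights and grader ratings are invented for illustration.

CRITERIA = {
    "clearly identifies subject": 5,
    "clearly identifies position": 5,
    "clearly predicts course of paper": 5,
    "appropriate for audience": 5,
}

def thesis_score(judgments):
    """judgments maps each criterion to a grader's 0.0-1.0 rating."""
    return sum(CRITERIA[c] * judgments[c] for c in CRITERIA)

# Two graders reading the same thesis statement can legitimately
# disagree on every "clearly" and every "appropriate":
grader_a = thesis_score({"clearly identifies subject": 1.0,
                         "clearly identifies position": 1.0,
                         "clearly predicts course of paper": 0.5,
                         "appropriate for audience": 1.0})
grader_b = thesis_score({"clearly identifies subject": 1.0,
                         "clearly identifies position": 0.5,
                         "clearly predicts course of paper": 0.5,
                         "appropriate for audience": 0.5})
print(grader_a, grader_b)  # 17.5 12.5
```

    The rubric makes the arithmetic objective, but every number fed into it is still a judgment call--which is exactly what the students noticed.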

    What has happened, however, is that student complaints about subjectivity in assessment have increased, not decreased. Some are inclined to blame this on the increase in student complaining in general, but it seems to me that this should have been anticipated, as it does NOT make grading less subjective; au contraire, the subjectivity of the grading criteria is now explicitly spelled out. What is "clearly"? And what is "appropriate"?

    It is what I say it is, essentially. As PL said at the beginning of the thread, I know it when I see it.

    So the question at my assessment seminars was, how do we eliminate that kind of subjectivity in assessment and make the process more objective and standardized?

    I don't see how it can be done. In the end, it comes down to my judgment. That's essentially what they pay me for--my educated judgment of student work. But as we have seen in K-12 education, teachers are not trusted to judge student work on their own. It's all creeping upward. I predict that we will see some sort of standardized terminal assessment for degree programs before I escape, er, retire.

  17. #37

    Quote Originally Posted by Prancer View Post
    In the end, it comes down to my judgment. That's essentially what they pay me for--my educated judgment of student work. But as we have seen in K-12 education, teachers are not trusted to judge student work on their own.
    QFT. Either trust me to do my job or fire me.

  18. #38
    Satisfied skating fan
    Quote Originally Posted by Prancer View Post
    I predict that we will see some sort of standardized terminal assessment for degree programs before I escape, er, retire.
    Yes. That is the next step in the particular format our affiliate has opted to use. They call it Degree Profile. It's going to be tied to federal funding. What some faculty don't seem to have realized is that if you don't find assessment tools for your class that can be tied into program- and university-wide goals and outcome assessments, your class will be considered unnecessary and eliminated. Meaning your job is on the line. Our programs are better off than most non-health-care programs because we did a lot of this already. My problem is that I not only have to meet programmatic accreditation by JRCERT, I also have to meet the affiliate university's requirements, and sometimes there is a real disconnect between the two. It takes real critical thinking on my part to accommodate both. That, and dealing with a faculty that is entrenched in how things used to be and why do we have to change anything. Just today I heard, "The technologists in the imaging department won't fill out the student evaluation forms... they just mark it all good." So I said, then we need to develop a different assessment tool that we can use. The reply: "Why do we need to change anything? The forms are just fine and we don't need to do anything different." It can't be both!!!
    Online and distance courses are facing new guidelines as of this fall. Many of these classes are going to be reclassified as correspondence courses, and federal student aid will not cover them. There must be 10 seminars being offered over the summer to ensure any online class meets the new guidelines. The University cannot afford to lose that money.

  19. #39

    Haunting the Princess of Pink since 20/07/11...

