Teachers: Define critical thinking

Discussion in 'Off The Beaten Track' started by rfisher, Jun 7, 2012.

  1. rfisher

    rfisher Satisfied skating fan

    I'm in the process of revising all our assessment forms. Critical thinking is one of those new higher education buzzwords that people like to throw around and which nobody seems to really be able to define. Our affiliate university has even designated several courses as CT courses, although the criteria for doing so are ambiguous. Intro to Sociology is critical thinking but Intro to Anthropology is not, even though they are essentially the same. Go figure.

    In any event, one of my accreditation agency's pet goals, along with the university's, is "students will develop critical thinking skills." What exactly is critical thinking, how do we know if the student has developed those skills, and if they haven't, how do we ensure that they do? I have a PhD in anthropology and I have no idea how an introduction to sociology course would accomplish that goal. :lol: I have some idea how CT can apply to our more focused scientific curriculum, but I'd like to know what others are doing with this in terms of defining the concept, measurement and assessment. The Department of Education will be the bane of my existence. I used to think it would be the students, but they were just a warm-up exercise.
     
  2. PrincessLeppard

    PrincessLeppard Pink Bitch

    Critical thinking is like pornography: I know it when I see it. :p

    I realize that isn't helpful :slinkaway
     
  3. numbers123

    numbers123 Well-Known Member

    Perhaps the critical thinking for sociology would include: Discuss how sociological models affect current political parties and voting results. Determine how societal attitudes towards groups of people affect perceptions of poverty.

    It's been a few years since I was in college or grad school, but is sociology where one discusses ethical models?
     
  4. nlyoung

    nlyoung Active Member

    As a university history professor, I would argue that "critical thinking" should be an integral component of any course in any discipline (unfortunately, critical thinking tends to be missing in high school education as well, which is part of the problem). Evaluation is fairly straightforward, as it is obvious in all written work as well as in group discussions. Using history as an example, at its most basic level, critical thinking requires that a student provide analysis of a question rather than a simple summary of events in either their written work or oral discussion. One needs to be able to explain how one arrived at a conclusion, using those "facts" that support an argument as well as being able to address points that perhaps work against it. It's really the difference between "A" work and "C" work, at least in my discipline.
     
  5. Aceon6

    Aceon6 Get off my lawn

    To me, critical thinking is the ability to properly integrate information. Can the student apply existing knowledge and research methods to challenge new information? Can the student use new information to alter an approach or a way of thinking? Can the student apply knowledge to inform an area of investigation? That sort of thing.
     
  6. numbers123

    numbers123 Well-Known Member

    To me, critical thinking is taking a situation, being able to argue both for and against it, and arriving at a choice founded on the consequences of the action.
    In high school, I thought that debate clubs provided an excellent opportunity to develop critical thinking. You needed to be prepared to argue both sides, with points and the consequences of either side. Hot topics when I was going to school: the military draft, the voting age, legalization of certain drugs (yes, even in the late '60s and early '70s it was a debate).
    Which is why I asked about ethical models - one could develop an essay requirement or even a debate in class to determine if the metric was "met."
     
  7. rfisher

    rfisher Satisfied skating fan

    Bint. Why yes, I'll put that in the assessment plan: I know it when I see it--you'll just have to trust me.

    Just you wait, Missy. High school teachers will have to do this soon enough and Prancer and I will laugh at you.
     
  8. agalisgv

    agalisgv Well-Known Member

    Critical thinking comes from the same root as critique. So basically one is asking students to be capable of engaging in critique. That means:

    - being able to identify the main and subsidiary arguments being made
    - what is the evidence in support of those arguments
    - what is the strength of those arguments
    - what are the flaws of the main and subsidiary arguments
    - what is the student's independent assessment of the topic raised in the primary and subsidiary arguments
    - what evidence does the student provide in support of his/her assessment
    - is the student able to integrate his/her analysis with the arguments from other scholars
    - what are the theoretical and practical implications of both the initial arguments raised and the arguments offered by the student
    - how do those implications impact the scholarly discipline and larger sociopolitical debates
    - how clearly expressed is all of the above by the student

    Those would be the central questions and methods of evaluating critical thinking.
     
  9. Artemis@BC

    Artemis@BC Well-Known Member

    Been there, done that! I've been working with various high school humanities curriculum development processes for years, and the issue of how we define and assess critical thinking always comes up.

    One conclusion we almost always come to is that, since "critical thinking" is a somewhat amorphous concept, it's sometimes easier to think of it in the slightly more defined way of "critical thinking skills." And of course it's much easier to assess and evaluate "skills" than "thinking."

    From one curriculum I worked on, critical thinking skills include:

    ~ demonstrating skills of critical analysis (e.g., questioning, imagining, experiencing, hypothesizing, inferring, predicting, comparing, classifying, verifying, identifying relationships and patterns, extrapolating, using analogies, creating metaphors, recognizing contradictions, identifying the use of rhetoric, summarizing, drawing conclusions, defending a position, reflecting, reassessing a position)
    ~ developing pertinent questions to define the topic, issue, or situation
    ~ identifying connections among
    - their own and others’ experiences
    - local and global issues and events
    - past and present events and situations (e.g., causal connections, similarities)
    - a range of points of view on the topic, issue, or situation
    ~ making reasoned judgments (e.g., logical, based on evidence) about an issue, situation, or topic
    ~ citing evidence to justify their position

    These are all in addition to more basic research and media analysis skills, which of course are foundational to critical thinking/critical analysis (as well as being easier to define!).
     
  10. PrincessLeppard

    PrincessLeppard Pink Bitch

    Please. We've been doing this crap the entire nine years I've been teaching. We just haven't hit critical thinking yet.

    Have you had to define your essential learnings and enduring understandings yet?

    :watch:
     
  11. susan6

    susan6 Well-Known Member

    Critical thinking ability is the difference between a lab tech/staff scientist/master's degree and a doctorate. A master's degree student can go into the lab and collect a vast quantity of data. But they lack the critical thinking ability to determine the quality of the data, to understand how pieces of data relate to each other, how they compare/relate to other published data, and what the data indicate in terms of future experiments that will need to be carried out to support and expand upon a developing hypothesis.
     
  12. rfisher

    rfisher Satisfied skating fan

    Yes, only the HLC is calling them degree qualifications profiles, and we have to identify the areas of learning (specialized knowledge, intellectual skills, applied learning, etc.), associate program goals with each profile, identify courses which will meet the individual program learning goals--of which there have to be a minimum of two at different points in the curriculum, which will allow us to assess progression of the outcome--then set benchmarks to measure that progression and develop associated rubrics for each. :lol: And that is at the program level. Each individual course now has to tie itself back into a specific program goal and establish the learning outcome, assessment tool and benchmark.

    We call those enduring understandings "life-long learning," and they're also a bitch to quantify. It's easy to write some vague goal. It's another thing to quantify it. My faculty have plenty of input on what these mean and zero input on how to assess them. I need tools and benchmarks. Exams are useless as an assessment tool because you can make the results of an exam whatever you want them to be. I can't get that point across to the faculty either. :wall:
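    If it helps to picture the nesting, here's a rough sketch of the structure they're asking for. Course numbers, tools and targets are all hypothetical, just to show how the pieces hang together:

        from dataclasses import dataclass

        @dataclass
        class Assessment:
            course: str      # where in the curriculum the outcome is measured
            tool: str        # the instrument, e.g., a rubric-scored case analysis
            benchmark: str   # the target the program sets for itself

        @dataclass
        class ProgramGoal:
            profile: str     # area of learning (specialized knowledge, etc.)
            goal: str
            assessments: list  # at least two, at different points in the curriculum

        ct_goal = ProgramGoal(
            profile="Intellectual skills",
            goal="Students will demonstrate critical thinking",
            assessments=[
                Assessment("RAD 210", "case-study analysis scored with a rubric",
                           "mean rubric score >= 2 of 4"),
                Assessment("RAD 410", "capstone image critique, same rubric",
                           "mean rubric score >= 3 of 4"),
            ],
        )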

    Critical thinking is one of the program goals and it has to have a specific assessment tool and benchmark. It can't be something we know when we see it. The problem is the assessment and the benchmarks. It was all well and good when it was a general concept, but now it has to be a measurable outcome. There are no "arguments" for the student to debate or sides to pick in medical imaging. This is a very defined science. It's a challenge to bring evidence-based practice into a discussion for my senior students that relates to the role of the radiographer. It's even difficult to do this at the level of the radiologist (physician), since it is the clinician who orders imaging studies and not the radiologist. I know because there are about two articles that even make the attempt, and I have to explain them to the students because they haven't a clue what the author is trying to say. :lol:
    We have to quantify the difference between students. One of our goals is that the student will be clinically competent. This is a piece of cake. I have multiple ways to assess this learning outcome. Quantifying critical thinking, and more importantly progression of critical thinking, is much more challenging.

    Finding a quantifiable assessment tool is the real challenge here. So far, none of you have identified that. I can tell you these new requirements coming down from the DoE have caused havoc on our campus. At least clinical classes in the College of Health Professions have a starting point, because we have to do this for our national accreditation bodies. I feel for departments like history or English who have not had to do this until now. One thing our university is doing is if a specific course does not tie into a specific program goal, it's likely to be eliminated. I don't have a problem with that, but many departments will.
     
  13. Aceon6

    Aceon6 Get off my lawn

    One thing that might be measurable for critical thinking is the student's ability to review a list of patients awaiting imaging and prioritize those patients based on specific criteria. I know in my old hospital, all the MDs ordered Stat, so everything was Stat and the Radiology manager had to prioritize the patients herself. If there were no clinical criteria for choosing patient A over patient B, she'd do it using her own twisted system that allowed her techs to know what was going on without giving it away to the docs.
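    A toy version of what that exercise could look like if you wanted to score it. The locations, weights, and acuity numbers here are all invented, purely for illustration:

        # Rank waiting patients by explicit criteria, highest priority first.
        LOCATION_WEIGHT = {"ED": 3, "inpatient": 2, "ambulatory": 1}

        patients = [
            {"name": "A", "location": "ambulatory", "acuity": 2, "wait_min": 50},
            {"name": "B", "location": "ED", "acuity": 5, "wait_min": 10},
            {"name": "C", "location": "inpatient", "acuity": 3, "wait_min": 30},
        ]

        def priority(p):
            # Sort by location first, then acuity, then who has waited longest.
            return (LOCATION_WEIGHT[p["location"]], p["acuity"], p["wait_min"])

        for p in sorted(patients, key=priority, reverse=True):
            print(p["name"])  # B, then C, then A

    A student's ordering could then be compared against the answer key and scored as a measurable exercise.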
     
  14. rfisher

    rfisher Satisfied skating fan

    Except that isn't a measurable skill. The STAT order always takes precedence. :lol: We don't really teach triage as that isn't within the radiographer's scope of practice. Your manager was actually in non-compliance with the ARRT.
     
  15. numbers123

    numbers123 Well-Known Member

    Except a stat order in the ED has a higher priority than a stat order in the ambulatory care clinic. One reflects a critical patient; the other is just a priority level for the patient's and doctor's convenience (get in and out as soon as possible).

    I do think the abstract courses such as literature, sociology, etc. have a harder time developing quantitative measurements. Courses that are built on qualitative designs are extremely difficult to measure.

    For the DoE to expect hard data for measurement of critical thinking skills in those courses is :rolleyes:
     
  16. rfisher

    rfisher Satisfied skating fan

    True. The ED always has top priority; however, it's seldom that the radiographer has to make that choice unless there is only one of you. :lol: We just had a mock trauma simulation for a bunch of high school kids. The med-flight crew landed, EMS did a mock field assessment, and the respiratory and nursing faculty did a mock ER treatment. Calling for x-ray was part of the simulation. The order was for a chest, cervical spine and femur. I had chest pains when the two imaging students imaged the femur first. Clearly a lack of critical thinking skills. However, when I turned to the faculty member who teaches procedures to ask what happened, I learned that he'd never actually discussed trauma with them in class, yet we expect them to know this. This is a major deficiency in his class, and we had an immediate discussion of how this is going to be changed forthwith. Now I just have to figure out where within the overall program assessment to include and measure the results.

    Sadly, I also just read the only four articles on teaching and assessment of critical thinking among radiography students, and it seems that although all four authors thought they were teaching it in their programs, their assessment results came in below 60% on their indicators, which is an epic fail. Their conclusion was that programs need to find a better methodology for teaching and assessing critical thinking skills. Well, duh. :rolleyes: That was helpful research.
     
  17. skatesindreams

    skatesindreams Well-Known Member

    All of this describes my definition.
    I agree that these skills are sorely missing in education today.
    TPTB say they want students to be able to apply this.

    However, I've always doubted it, as society doesn't really want people to question authority or the status quo.
     
  18. agalisgv

    agalisgv Well-Known Member

    You basically build it into your grading rubric. If you give exams, you identify which questions involve critical thinking, and build measurements around that. If you do essays, you assign points for the things I outlined, and build measurements around that.
    You *can* fudge the results, but a well-done exam is supposed to measure quantifiable results. It may be more of an issue with the quality of the exam.
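    For instance, here's a bare-bones sketch of what I mean by slicing an essay into measurable pieces. The criteria are condensed from my list upthread; the point values are invented:

        # Minimal essay rubric; each criterion is worth up to 4 points.
        RUBRIC = {
            "identifies main and subsidiary arguments": 4,
            "evaluates evidence and strength of arguments": 4,
            "offers an independent, evidence-backed assessment": 4,
            "integrates analysis with other scholars": 4,
            "clarity of expression": 4,
        }

        def score(marks):
            # marks maps criterion -> points awarded, capped at each maximum
            earned = sum(min(marks.get(c, 0), mx) for c, mx in RUBRIC.items())
            return earned, sum(RUBRIC.values())

        earned, possible = score({
            "identifies main and subsidiary arguments": 3,
            "evaluates evidence and strength of arguments": 2,
            "offers an independent, evidence-backed assessment": 3,
            "integrates analysis with other scholars": 1,
            "clarity of expression": 4,
        })
        print(f"{earned}/{possible}")  # 13/20 -- a number the bean counters can track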
    To be fair, you asked for a definition of critical thinking, and people supplied that. Now it sounds like you want people to write your program's benchmarks, and obviously no one can do that because it's specific to your program.

    It's not that difficult to quantify critical thinking or progression of it IMO. Personally I think it's more time-consuming than anything. But perhaps it's different for technical fields.
     
  19. rfisher

    rfisher Satisfied skating fan

    Based on everything I've read, it seems it's actually quite difficult to quantify critical thinking. And teaching it is even more difficult to quantify. It's easy to craft esoteric definitions of what it should be. Not so easy to teach it or to determine how to improve the results. Our nursing program decided to use the standardized ATI test for incoming students and then re-administer the exam when they graduated, the theory being that scores would show progression. The reality is the scores were essentially the same, and some students' scores were actually lower. And the relationship to academic success in the program was statistically unreliable.
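    (For what it's worth, testing "progression" that way boils down to a paired comparison of entry and exit scores. A minimal sketch with made-up numbers, using scipy's paired t-test; a flat result like nursing got looks like this:)

        # Paired comparison of entry vs. exit test scores (all numbers invented).
        from scipy.stats import ttest_rel

        entry_scores = [62, 71, 55, 68, 74, 60, 66, 59]
        exit_scores = [64, 69, 54, 70, 73, 62, 65, 61]

        t_stat, p_value = ttest_rel(exit_scores, entry_scores)
        print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
        # A large p-value means no detectable progression -- exactly the null result.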

    I have excellent critical thinking skills and always have. I'm not a linear thinker and problem solving is easy for me. I get what the theory is. It's the application that frustrates me. Personally, I don't think this can be taught. And, no, I don't need people to set benchmarks for me. :lol: The benchmark isn't the problem. It's the assessment tool. I've looked at multiple programs' examples and frankly, the assessments and data they've collected are worthless and tell you nothing new. They are just making up pointless data and pretending it's meaningful. When you look closer, there is no science to the results. If it's not meaningful, it's pointless.

    I can make up pointless assessments, gather data, and analyze and interpret it sufficiently for the accreditation requirements. But I despise pointless data analysis and pretending the results mean something when I know they don't. I got a master's thesis and a doctoral dissertation by taking apart previous studies that had done exactly that. When I threw out their results and reanalyzed the data, I arrived at an entirely different conclusion based on actual data rather than inference. This is making my eye twitch because I want solid science and not crap. I should have stayed in field biology. I hate educational theory.
     
  20. 4rkidz

    4rkidz GPF Barcelona here I come

    This is also how I understand CT as it applies to the students I teach.. from a cooking perspective - we provide the ingredients.. but can the student apply the skills attained and create a unique dish ;) Not just the continual regurgitation of other people's work... :blah:
     
  21. Prancer

    Prancer The "specialness" that is Staff Member

    That's pretty much what I would expect--and will continue to expect the more that we are pushed to assess students in some sort of objectively quantifiable way. It really can't be done; it takes human judgment to determine what a main argument is, whether or not students can identify it, how effectively they can address the issue raised, etc.

    And it's not like your school is unique in this outcome, either: http://www.msnbc.msn.com/id/4113693.../t/report-college-students-not-learning-much/

    I've been to two assessment seminars this year and the impression I am getting is that grading papers is all fine and good and important and all, but what is really being sought is some sort of foolproof system in which everyone teaches the same stuff and could give the same tests with the same outcomes because the answers are either right or wrong.

    When someone points out that the only way to assess critical thinking is to analyze written work or oral discussions, well ... there must be a better way. I haven't been told what it is, exactly, but everyone seems sure that it can be done. They want a standard that students can grasp easily but can't argue with--something like a math test, only not, you know, math. And with the same standards for everyone (which doesn't happen in math, either).
     
  22. 4rkidz

    4rkidz GPF Barcelona here I come

    Here in Canada there are signs of movement away from standardization and testing.. the medical schools (McGill) have dropped MCAT testing.. the vet school also.. I actually did a paper on 'unschooling at the graduate level' :lol:
     
  23. rfisher

    rfisher Satisfied skating fan

    Exactly. We have to do this (and the Department of Education is pushing this), but nobody really knows how. Our university spent three years deciding how to redo the general ed curriculum to include CT. The upshot was a lot of dithering and a final decision to just designate some classes as meeting the criteria. They all patted themselves on the back for a job well done until this spring, when the assessment monster arrived. Now they actually have to do something.

    I called my accreditation agency, most of whom have master's degrees in education rather than imaging, and asked what they wanted. I got doublespeak which essentially meant: we don't know, but we know we want it, and we're hoping you can figure it out. I ask my faculty, and everybody knows what it should mean and gives me the "I don't understand what the problem is" look. Then I ask for input on exactly how to quantify this. How are we to provide evidence that students demonstrate progression? And if they don't, what are we going to do about it? Give them more assessments? More tests? More papers? How are we supposed to address the fact that they either think or they fall back on rote learning? I get blank looks and mumbles that that is my job, not theirs. One went so far as to say, if we don't meet the benchmarks, just lower them till we do, and then you don't have to figure out how to address the issue before the fact. I told her, if you don't establish how you are going to do it before the fact, how the hell do you think you'll figure it out after? Insight from the gods?

    And PML at their including U Charleston in that study. Why did they beef up writing in nursing and biology? Because over 75% of their master's theses--master's, mind you--were rejected due to poor writing and worse research. I was at a deans' meeting at our university where they were discussing this, and they were shocked when they realized their own statistics were no better. So they instituted writing across the curriculum as the solution. It's a joke. I just jumped through the hoops to get two of my classes designated as WAC classes. The hoops dumb the process down rather than make it better. I had to add all sorts of nonsense. The upshot is, the committee was thrilled with the result even though I think it's stupid.
     
  24. jeffisjeff

    jeffisjeff Well-Known Member

    Even in math or similar subjects, where there is often a well-defined right answer, there is still a lot of subjectivity because it isn't just about getting the right answer, it is about the process of getting the right answer. So we often give lots of partial credit, which is inherently subjective. What are the most important steps in formulating and solving the problem and how many points are they worth?

    I have some colleagues who give no partial credit, while others like myself give lots of partial credit. So, if a student works a problem through and gets the answer 2x4+1, which is correct, but makes a stupid error and writes the final answer as 7, some profs would give 0 credit, while others would give full credit. Which is the correct allocation of points? My colleagues who give 0 credit can give compelling arguments that their approach is best (they've had many years of defending their policy to students :p). You know, one minor calculation error and the bridge will fall down. But I view things quite differently.

    Then there was the time my husband took graduate-level math stats and was asked on an exam for the variance of the sample mean for a particular setting. He had the right answer (he happened to know it from other stats classes) but he was given no credit because he didn't show any work. He was :mad: but I was a bit :lol: because everyone ( ;) ) knows math stats is all about the process, i.e., why is that the correct equation for the variance of the sample mean.
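    (For anyone curious, the "process" in question is only a few lines, assuming independent observations with common variance \sigma^2:)

        \operatorname{Var}(\bar{X})
          = \operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
          = \frac{1}{n^2}\sum_{i=1}^{n}\operatorname{Var}(X_i)
          = \frac{n\sigma^2}{n^2}
          = \frac{\sigma^2}{n}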
     
  25. Prancer

    Prancer The "specialness" that is Staff Member

    I know this, but every time I make this point, I am shot down in flames by people who are convinced that a math test is a math test and all the answers are right or wrong.

    I always use the example of two math professors I know, one of whom grades all homework assignments, never curves, and gives incredibly difficult tests, while the other never grades homework, always curves, and gives take-home exams. Believe it or not, their course GPAs are a little different :p.

    And I get "Yes, but the answers! The answers are either right or wrong!"

    :mad:

    I was tearing my hair out over assessments after the seminars, partly because of what I heard and partly because the state had handed down a set of objectives we have to meet, and I had and still have no idea how to do what they want in a coherent, cohesive sort of way. Then I went to the department meeting and found out that everyone is just putting the state stuff on their syllabi and then completely ignoring it and doing whatever they want :lol:.

    I would think in an applied field like rfisher's, there wouldn't be a whole lot of demand within the course for critical thinking--not because there isn't thinking involved, but because "critical thinking" is generally defined in very academic ways; critical thinking is all about objectively analyzing subjective information in a scholarly fashion. It's a different kind of assessment.

    I've heard that the nursing program at my school is planning to do their own writing courses. Yeah, that'll be good.
     
  26. agalisgv

    agalisgv Well-Known Member

    Very true

    To me it isn't all that difficult, because it's basically using a mathematical rubric to quantify what I do anyway. Since I've always used point systems for grading, it's not that difficult to slice and dice it with particular goals, objectives, and benchmarks.

    The reality, though, is teachers tend to have a sense of what a student has earned, and what skill sets they have mastered at what level, so the rubrics are basically fudged to arrive at the subjective grade.

    But it helps the number crunchers out there to quantify that way, so all is well.
    That's why I'm not exactly sure what rfisher is looking for since these things tend to be very specific to particular programs. No one here is going to be able to come up with specific benchmarks for a particular radiology program.
     
  27. rfisher

    rfisher Satisfied skating fan

    One would think that, except JRCERT has declared that critical thinking is indeed a desired goal, because, well, it's a higher education buzzword. All I can do is ascertain a student's ability to adapt to different situations. The problem is that either a student can do this or they can't, and nothing we attempt to do will change that. Which means the outcome assessment is essentially predetermined and will not demonstrate growth. I know by the end of a student's first year exactly what type of imaging they should do. Five percent would make excellent trauma techs in a level 1 trauma center. They have the innate ability to assess a situation and make immediate adjustments to the norm. The other 95% need to work in an outpatient facility where there is little variation in the patient. I was in the 5% as a practicing radiographer. I loved trauma and the challenge it presented. I like innovation. The entire rest of my faculty are in the 95%. They don't do well with change. At. All.
    I can find a way to deal with JRCERT; however, I also have to answer to the university as a whole, and they want assessments that are like the others'. Rubrics are easy. Grading is simple. However, I'm not dealing with just individual courses. I have to do an assessment of the program as a whole, which is a whole different ballgame.

    I suspect most of the faculty at the university are planning to do what they are doing at yours, except the department chairs will indeed have to compile that data. It's going to be collected at the course level as well as the program level, and the program level has to set different benchmarks that use similar assessment tools in different courses over the progression of a degree to demonstrate the goal. At least our program has always had to do learning outcomes at the program level. Most of the university has not, and those departments are trying to ignore the new mandates. Department chairs are in a panic because funding is going to be tied to these outcomes.

    And again, Agal, I'm not talking about benchmarks. They are what you want them to be and are just a number. I'm talking about the assessment tool you use, not the arbitrary percentage you decide is good, bad or indifferent. They aren't the same thing. It's not what the answer is, but how you derive the answer that I'm interested in, because the answer is meaningless if the process isn't valid. I don't think there is a valid process for imparting critical thinking, but the people who determine if my program is accredited want me to say one of our program goals is that students will demonstrate critical thinking skills. A good goal, but how do I assure that they do, when all the assessments indicate some do, some don't, and the program doesn't make those who don't any better?
     
  28. Prancer

    Prancer The "specialness" that is Staff Member

    The issue I am coming up against is the subjectivity factor. It's very easy to assign X points to "clear, arguable thesis," but the determination of "clear, arguable" is up to the professor.

    This means, I have been told, that while the professor grasps what a clear and arguable thesis statement is, the student may not. Students who do not understand the precise nature of the assessment will not be able to fulfill the requirement.

    That is going to be tough; our benchmarks have to be tied to specific curriculum.
     
  29. agalisgv

    agalisgv Well-Known Member

    For us, we list subcriteria that explain a bit, but you're right there's always a subjective element.

    For me it really hasn't been an issue, but that may be because of the uniqueness of my field and institution. The hardest thing for me is coming up with the graphics in which to insert the numbers :shuffle:.

    Yes, I'm innotechnic :slinkaway
    Wouldn't that be a fault in the program's curriculum then? If students score lower than when they first entered, it would seem a curricular issue rather than an assessment issue.
     
  30. manhn

    manhn Well-Known Member

    I came up with this handy essay formula during my university days.

    Intro: "The conventional wisdom is...." or "There are 2/3/4 schools of thought with respect to..."

    Then summarize those arguments with both academic and real life sources.

    Then begin your argument/thesis: "However, such conventional wisdom should be reconsidered because of the following reasons..." or "Such conventional wisdom has led to the following awful tragedies..." or "Theory A is better than Theory B because..." or "Theory A and Theory B have their advantages and disadvantages, but when taken together...BRILLIANCE!"

    Your argument is then supported by academic and real life sources (so, even if you're questioning the status quo, you rely on studies and people who are part of that status quo).

    And then conclusion...

    "My brilliant argument/thesis has academic, social, and economic implications..."

    Then BS those academic (more studies to fund!), social (no more racism!) and economic (more jobs, more money!) implications.

    That is critical thinking.