Teachers: Define critical thinking

rfisher
06-07-2012, 08:31 PM
Except a stat order in the ED has a higher priority than a stat order in the ambulatory care clinic. One is critical; the other is a level of priority for patient and doctor (get in and out as soon as possible).

True. The ED always has top priority; however, it's seldom that the radiographer has to make that choice unless there is only one of you. :lol: We just had a mock trauma simulation for a bunch of high school kids. The med-flight crew landed, EMS did a mock field assessment, and the respiratory and nursing faculty did a mock ER treatment. Calling for x-ray was part of the simulation. The order was for a chest, cervical spine, and femur. I had chest pains when the two imaging students imaged the femur first. Clearly a lack of critical thinking skills. However, when I turned to the faculty member who teaches procedures to ask what happened, I learned that he'd never actually discussed trauma with them in class, yet we expect them to know this. This is a major deficit in his class, and we had an immediate discussion of how this is going to be changed forthwith. Now, I just have to figure out where within the overall program assessment to include and measure the results.

Sadly, I also just read the only four articles on the teaching and assessment of critical thinking among radiography students, and it seems that although all four authors thought they were teaching it in their programs, their assessments came in below 60% on their indicators, which is an epic fail. Their conclusion was that programs need to find a better methodology for teaching and assessing critical thinking skills. Well, duh. :rolleyes: That was helpful research.

skatesindreams
06-08-2012, 12:49 AM
Been there, done that! I've been working with various high school humanities curriculum development processes for years, and the issue of how we define and assess critical thinking always comes up.

One conclusion we almost always come to is that, since "critical thinking" is a somewhat amorphous concept, it's sometimes easier to think of it in the slightly more defined way of "critical thinking skills." And of course it's much easier to assess and evaluate "skills" than "thinking."

From one curriculum I worked on, critical thinking skills include:

~ demonstrating skills of critical analysis (e.g., questioning, imagining, experiencing, hypothesizing, inferring, predicting, comparing, classifying, verifying, identifying relationships and patterns, extrapolating, using analogies, creating metaphors, recognizing contradictions, identifying the use of rhetoric, summarizing, drawing conclusions, defending a position, reflecting, reassessing a position)
~ developing pertinent questions to define the topic, issue, or situation
~ identifying connections among
- their own and others’ experiences
- local and global issues and events
- past and present events and situations (e.g., causal connections, similarities)
- a range of points of view on the topic, issue, or situation
~ making reasoned judgments (e.g., logical, based on evidence) about an issue, situation, or topic
~ citing evidence to justify their position

These are all in addition to more basic research and media analysis skills, which of course are foundational to critical thinking/critical analysis (as well as being easier to define!).

All of this describes my definition.
I agree that these skills are sorely missing in education today.
TPTB say they want students to be able to apply this.

However, I've always doubted it, as society doesn't really want people to question authority or the status quo.

agalisgv
06-08-2012, 12:52 AM
I do think the abstract courses such as literature, sociology, etc. have a harder time in developing quantitative measurements. Courses that are built on qualitative designs are extremely difficult to measure.

You basically build it into your grading rubric. If you give exams, you identify which questions involve critical thinking, and build measurements around that. If you do essays, you assign points for the things I outlined, and build measurements around that.
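
As a rough sketch of what I mean for the exam case -- the question tags, point values, and the 75% benchmark below are all invented for illustration, not from any real program:

```python
# Hypothetical sketch: tag which exam questions involve critical
# thinking, then build a measurement around just those questions.
# Question IDs, points, and the benchmark are invented.

# (question_id, points_possible, involves_critical_thinking)
EXAM = [
    ("q1", 5, False),   # recall
    ("q2", 10, True),   # interpret a case and justify a choice
    ("q3", 5, False),   # recall
    ("q4", 10, True),   # evaluate conflicting evidence
]

BENCHMARK = 0.75  # program-defined target on critical-thinking items

def ct_score(earned):
    """Fraction of critical-thinking points earned on one exam."""
    possible = sum(pts for _, pts, ct in EXAM if ct)
    got = sum(earned[qid] for qid, _, ct in EXAM if ct)
    return got / possible

students = {
    "A": {"q1": 5, "q2": 9, "q3": 4, "q4": 6},
    "B": {"q1": 4, "q2": 5, "q3": 5, "q4": 4},
}

for name, earned in students.items():
    s = ct_score(earned)
    verdict = "meets" if s >= BENCHMARK else "below"
    print(f"{name}: CT score {s:.0%} ({verdict} benchmark)")
```

Essays work the same way: assign points to each criterion, and total the critical-thinking criteria separately from the rest.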

Exams are useless as an assessment tool because you can make the results of an exam whatever you want them to be.

You *can* fudge the results, but a well-done exam is supposed to measure quantifiable results. It may be more of an issue with the quality of the exam.
There are no "arguments" for the student to debate or select a side in medical imaging. This is a very defined science.

We have to quantify the difference between students. One of our goals is that the student will be clinically competent. This is a piece of cake. I have multiple ways to assess this learning outcome. Quantifying critical thinking, and more importantly progression of critical thinking is much more challenging.

Finding a quantifiable assessment tool is the real challenge here. So far, none of you have identified that.

To be fair, you asked for a definition of critical thinking, and people supplied that. Now it sounds like you want people to write your program's benchmarks, and obviously no one can do that because it's specific to your program.

It's not that difficult to quantify critical thinking or progression of it IMO. Personally I think it's more time-consuming than anything. But perhaps it's different for technical fields.

rfisher
06-08-2012, 01:23 AM

It's not that difficult to quantify critical thinking or progression of it IMO. Personally I think it's more time-consuming than anything. But perhaps it's different for technical fields.

Based on everything I've read, it seems it's actually quite difficult to quantify critical thinking. And teaching it is even more difficult to quantify. It's easy to make esoteric definitions of what it should be. Not so easy to teach or to determine how to improve the results. Our nursing program decided to use the standardized ATI test for incoming students and then readminister the exam when they graduated, the theory being that scores would show progression. The reality is the scores were essentially the same and some students' scores actually were lower. And the relationship to academic success in the program was statistically unreliable.
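
This is roughly the kind of pre/post check involved -- a minimal sketch with invented scores (not the nursing program's actual data), using SciPy's paired t-test plus a correlation against program GPA:

```python
# Hypothetical sketch of a pre/post progression check like the ATI
# example above. All numbers are invented for illustration.
import numpy as np
from scipy.stats import pearsonr, ttest_rel

pre  = np.array([61, 70, 58, 66, 73, 64, 69, 60])  # scores at entry
post = np.array([63, 68, 57, 69, 72, 66, 70, 59])  # scores at graduation
gpa  = np.array([3.1, 3.6, 2.8, 3.4, 3.7, 3.0, 3.5, 2.9])  # program GPA

# Paired t-test: did scores move at all between entry and graduation?
t_stat, p_val = ttest_rel(post, pre)
print(f"mean change = {np.mean(post - pre):+.2f}, p = {p_val:.3f}")

# Does the change in score track academic success in the program?
r, p_r = pearsonr(post - pre, gpa)
print(f"r(change, GPA) = {r:.2f}, p = {p_r:.3f}")
```

If the mean change is near zero and the correlation is weak and non-significant, you have exactly the result our nursing program got: no demonstrable progression and no reliable relationship to academic success.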

I have excellent critical thinking skills and always have. I'm not a linear thinker, and problem solving is easy for me. I get what the theory is. It's the application that frustrates me. Personally, I don't think this can be taught. And, no. I don't need people to set benchmarks for me. :lol: The benchmark isn't the problem. It's the assessment tool. I've looked at multiple programs' examples and frankly, the assessments and data they've collected are worthless and tell you nothing new. They are just making up pointless data and pretending it's meaningful. When you look closer, there is no science to the results. If it's not meaningful, it's pointless. I can make up pointless assessments, gather data, and analyze and interpret it sufficiently for the accreditation requirements. I despise pointless data analysis and pretending the results mean something when I know they don't. I got a master's thesis and a doctoral dissertation by taking apart previous research that had done exactly that. When I threw out their results and reanalyzed the data, I arrived at an entirely different conclusion based on actual data rather than inference. This is making my eye twitch because I want solid science and not crap. I should have stayed in field biology. I hate educational theory.

4rkidz
06-08-2012, 02:20 AM
To me, critical thinking is the ability to properly integrate information. Can the student apply existing knowledge and research methods to challenge new information? Can the student use new information to alter an approach or a way of thinking? Can the student apply knowledge to inform an area of investigation? That sort of thing.

This is also how I understand CT as it applies to the students I teach.. from a cooking perspective - we provide the ingredients.. but can the student apply the skills attained and create a unique dish ;) Not just the continual regurgitation of other people's work... :blah:

Prancer
06-08-2012, 02:25 AM
Based on everything I've read, it seems it's actually quite difficult to quantify critical thinking. And teaching it is even more difficult to quantify. It's easy to make esoteric definitions of what it should be. Not so easy to teach or to determine how to improve the results. Our nursing program decided to use the standardized ATI test for incoming students and then readminister the exam when they graduated, the theory being that scores would show progression. The reality is the scores were essentially the same and some students' scores actually were lower. And the relationship to academic success in the program was statistically unreliable.

That's pretty much what I would expect--and will continue to expect the more that we are pushed to assess students in some sort of objectively quantifiable way. It really can't be done; it takes human judgment to determine what a main argument is, whether or not students can identify it, how effectively they can address the issue raised, etc.

And it's not like your school is unique in this outcome, either: http://www.msnbc.msn.com/id/41136935/ns/us_news-education/t/report-college-students-not-learning-much/

I've been to two assessment seminars this year and the impression I am getting is that grading papers is all fine and good and important and all, but what is really being sought is some sort of foolproof system in which everyone teaches the same stuff and could give the same tests with the same outcomes because the answers are either right or wrong.

When someone points out that the only way to assess critical thinking is to analyze written or oral discussions, well...there must be a better way. I haven't been told what it is, exactly, but everyone seems sure that it can be done. They want a standard that students can grasp easily but can't argue with--something like a math test only not, you know, math. And with the same standards for everyone (which doesn't happen in math, either).

4rkidz
06-08-2012, 02:31 AM
Here in Canada there are signs of a movement away from standardization and testing.. the medical schools (McGill) have removed MCAT testing.. the Vet school also.. I actually did a paper on 'unschooling at the graduate level' :lol:

rfisher
06-08-2012, 02:49 AM
That's pretty much what I would expect--and will continue to expect the more that we are pushed to assess students in some sort of objectively quantifiable way. It really can't be done; it takes human judgment to determine what a main argument is, whether or not students can identify it, how effectively they can address the issue raised, etc.

And it's not like your school is unique in this outcome, either: http://www.msnbc.msn.com/id/41136935/ns/us_news-education/t/report-college-students-not-learning-much/

I've been to two assessment seminars this year and the impression I am getting is that grading papers is all fine and good and important and all, but what is really being sought is some sort of foolproof system in which everyone teaches the same stuff and could give the same tests with the same outcomes because the answers are either right or wrong.

When someone points out that the only way to assess critical thinking is to analyze written or oral discussions, well...there must be a better way. I haven't been told what it is, exactly, but everyone seems sure that it can be done. They want a standard that students can grasp easily but can't argue with--something like a math test only not, you know, math. And with the same standards for everyone (which doesn't happen in math, either).

Exactly. We have to do this (and the Department of Education is pushing it), but nobody really knows how. Our University spent three years deciding how to redo the general ed curriculum to include CT. The upshot was a lot of dithering and a final decision to just make some classes meet the criteria. They all patted themselves on the back for a job well done until this spring, when the assessment monster arrived. Now they actually have to do something. I called my accreditation agency, most of whom have master's degrees in education rather than imaging, and asked what they wanted. I got doublespeak which essentially meant: we don't know, but we know we want it and are hoping you can figure it out. I ask my faculty, and everybody knows what it should mean and gives me the "I don't understand what the problem is" look. Then I ask for input on exactly how to quantify this. How are we to provide evidence that students demonstrate progression? And if they don't, what are we going to do about it? Give them more assessments? More tests? More papers? How are we supposed to address the fact that they either think or they do rote learning? I get blank looks and mumbles that that is my job, not theirs. One went so far as to say, if we don't meet the benchmarks, just lower them till we do, and then you don't have to figure out how to address the issue before the fact. I told her, if you can't establish before the fact how you are going to do something, how the hell do you think you'll figure it out after? Insight from the Gods?

And PML at their including U Charleston in that study. Why did they beef up writing in nursing and biology? Because over 75% of their master's theses, master's mind you, were rejected due to poor writing and worse research. I was at a Dean's meeting at our University where they were discussing this, and they were shocked when they realized their own statistics were no better. So, they instituted writing across the curriculum as the solution. It's a joke. I just jumped through the hoops to get two of my classes designated as WAC classes. The hoops dumb down the process rather than make it better. I had to add all sorts of nonsense. The upshot is, the committee was thrilled with the result even though I think it's stupid.

jeffisjeff
06-08-2012, 02:58 AM
They want a standard that students can grasp easily but can't argue with--something like a math test only not, you know, math. And with the same standards for everyone (which doesn't happen in math, either).

Even in math or similar subjects, where there is often a well-defined right answer, there is still a lot of subjectivity because it isn't just about getting the right answer, it is about the process of getting the right answer. So we often give lots of partial credit, which is inherently subjective. What are the most important steps in formulating and solving the problem and how many points are they worth?

I have some colleagues who give no partial credit, while others like myself give lots of partial credit. So, if a student works a problem through and gets the answer 2x4+1, which is correct, but makes a stupid error and writes the final answer as 7, some profs would give 0 credit, while others would give full credit. Which is the correct allocation of points? My colleagues who give 0 credit can give compelling arguments that their approach is best (they've had many years of defending their policy to students :P). You know, one minor calculation error and the bridge will fall down. But I view things quite differently.

Then there was the time my husband took graduate-level math stats and was asked on an exam for the variance of the sample mean in a particular setting. He had the right answer (he happened to know it from other stats classes) but he was given no credit because he didn't show any work. He was :mad: but I was a bit :lol: because everyone ( ;) ) knows math stats is all about the process, i.e., why that is the correct equation for the variance of the sample mean.
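
For anyone who wants the process he should have shown: for n independent observations X_1, ..., X_n with common variance sigma^2, the derivation is a one-liner:

```latex
% Variance of the sample mean for i.i.d. X_i with Var(X_i) = \sigma^2.
% The second equality uses independence (variance of a sum = sum of variances).
\operatorname{Var}(\bar{X})
  = \operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)
  = \frac{1}{n^{2}}\sum_{i=1}^{n}\operatorname{Var}(X_i)
  = \frac{n\sigma^{2}}{n^{2}}
  = \frac{\sigma^{2}}{n}
```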

Prancer
06-08-2012, 03:11 AM
Even in math or similar subjects, where there is often a well-defined right answer, there is still a lot of subjectivity because it isn't just about getting the right answer, it is about the process of getting the right answer. So we often give lots of partial credit, which is inherently subjective. What are the most important steps in formulating and solving the problem and how many points are they worth?

I know this, but every time I make this point, I am shot down in flames by people who are convinced that a math test is a math test and all the answers are right or wrong.

I always use the example of two math professors I know, one of whom grades all homework assignments, never curves, and gives incredibly difficult tests, while the other never grades homework, always curves, and gives take-home exams. Believe it or not, their course GPAs are a little different :P.

And I get "Yes, but the answers! The answers are either right or wrong!"

:mad:

I was tearing my hair out over assessments after the seminars, partly because of what I heard and partly because the state had handed down a set of objectives we have to meet, and I had and still have no idea how to do what they want in a coherent, cohesive sort of way. Then I went to the department meeting and found out that everyone is just putting the state stuff on their syllabi and then completely ignoring it and doing whatever they want :lol:.

I would think in an applied field like rfisher's, there wouldn't be a whole lot of demand within the course for critical thinking--not because there isn't thinking involved, but because "critical thinking" is generally defined in very academic ways; critical thinking is all about objectively analyzing subjective information in a scholarly fashion. It's a different kind of assessment.

I've heard that the nursing program at my school is planning to do their own writing courses. Yeah, that'll be good.

agalisgv
06-08-2012, 03:27 AM
I went to the department meeting and found out that everyone is just putting the state stuff on their syllabi and then completely ignoring it and doing whatever they want :lol:.

Very true

To me I don't find it all that difficult because it's basically using a mathematical rubric to quantify what I do anyway. Since I've always used point systems for grading, it's not that difficult to slice and dice it with particular goals, objectives, and benchmarks.

The reality, though, is teachers tend to have a sense of what a student has earned, and what skill sets they have mastered at what level, so the rubrics are basically fudged to arrive at the subjective grade.

But it helps the number crunchers out there to quantify that way, so all is well.
I would think in an applied field like rfisher's, there wouldn't be a whole lot of demand within the course for critical thinking--not because there isn't thinking involved, but because "critical thinking" is generally defined in very academic ways; critical thinking is all about objectively analyzing subjective information in a scholarly fashion. It's a different kind of assessment.

That's why I'm not exactly sure what rfisher is looking for since these things tend to be very specific to particular programs. No one here is going to be able to come up with specific benchmarks for a particular radiology program.

rfisher
06-08-2012, 03:30 AM
I would think in an applied field like rfisher's, there wouldn't be a whole lot of demand within the course for critical thinking--not because there isn't thinking involved, but because "critical thinking" is generally defined in very academic ways; critical thinking is all about objectively analyzing subjective information in a scholarly fashion. It's a different kind of assessment.

I've heard that the nursing program at my school is planning to do their own writing courses. Yeah, that'll be good.

One would think that, except JRCERT has declared that critical thinking is indeed a desired goal, because, well, it's a higher education buzzword. All I can do is ascertain a student's ability to adapt to different situations. The problem is that either a student can do this or they can't, and nothing we attempt to do will change that. Which means the outcome assessment is essentially predetermined and will not demonstrate growth. I know by the end of a student's first year exactly what type of imaging they should do. Five percent would make excellent trauma techs in a level 1 trauma center. They have the innate ability to assess a situation and make immediate adjustments to the norm. The other 95% need to work in an outpatient facility where there is little variation in the patient. I was in the 5% as a practicing radiographer. I loved trauma and the challenge it presented. I like innovation. The entire rest of my facility are in the 95%. They don't do well with change. At. All.

I can find a way to deal with JRCERT; however, I also have to answer to the University as a whole, and they want assessments that are like the others. Rubrics are easy. Grading is simple. However, I'm not dealing with just individual courses. I have to do an assessment of the Program as a whole, which is a whole different ballgame.

I suspect most of the faculty at the university are planning to do what they are doing at yours, except the department chairs will indeed have to compile that data. It's going to be collected at the course level as well as the program level, and the program level has to set different benchmarks that use similar assessment tools in different courses over the progression of a degree that will demonstrate the goal. At least our program has always had to do learning outcomes at the program level. Most of the University has not and is trying to ignore the new mandates. Department chairs are in a panic because funding is going to be tied to these outcomes.

And again, Agal, I'm not talking about benchmarks. They are what you want them to be and are just a number. I'm talking about the assessment tool you use. Not the arbitrary percent you decide is good, bad, or indifferent. They aren't the same thing. It's not what the answer is, but how you derive the answer, that I'm interested in. Because the answer is meaningless if the process isn't valid. I don't think there is a valid process for imparting critical thinking, but the people who determine whether my program is accredited want me to say that one of our program goals is that students will demonstrate critical thinking skills. A good goal, but how do I assure that they do when all the assessments indicate some do, some don't, and the program doesn't make those who don't any better?

Prancer
06-08-2012, 03:44 AM
To me I don't find it all that difficult because it's basically using a mathematical rubric to quantify what I do anyway. Since I've always used point systems for grading, it's not that difficult to slice and dice it with particular goals, objectives, and benchmarks.

The issue I am coming up against is the subjectivity factor. It's very easy to assign X points to "clear, arguable thesis," but the determination of "clear, arguable" is up to the professor.

This means, I have been told, that while the professor grasps what a clear and arguable thesis statement is, the student may not. Students who do not understand the precise nature of the assessment will not be able to fulfill the requirement.


That's why I'm not exactly sure what rfisher is looking for since these things tend to be very specific to particular programs. No one here is going to be able to come up with specific benchmarks for a particular radiology program.

That is going to be tough; our benchmarks have to be tied to specific curriculum.

agalisgv
06-08-2012, 03:55 AM
The issue I am coming up against is the subjectivity factor. It's very easy to assign X points to "clear, arguable thesis," but the determination of "clear, arguable" is up to the professor.

This means, I have been told, that while the professor grasps what a clear and arguable thesis statement is, the student may not. Students who do not understand the precise nature of the assessment will not be able to fulfill the requirement.

For us, we list subcriteria that explain a bit, but you're right, there's always a subjective element.

For me it really hasn't been an issue, but that may be because of the uniqueness of my field and institution. The hardest thing for me is coming up with the graphics in which to insert the numbers :shuffle:.

Yes, I'm innotechnic :slinkaway
Our nursing program decided to use the standardized ATI test for incoming students and then readminister the exam when they graduated, the theory being that scores would show progression. The reality is the scores were essentially the same and some students' scores actually were lower.

Wouldn't that be a fault in the program's curriculum then? If students score lower than when they first entered, it would seem a curricular issue rather than an assessment issue.

manhn
06-08-2012, 04:06 AM
I came up with this handy essay formula during my university days.

Intro: "The conventional wisdom is...." or "There are 2/3/4 schools of thought with respect to...

Then summarize those arguments with both academic and real life sources.

Then begin your argument/thesis: "However, such conventional wisdom should be reconsidered because of the following reasons..." or "Such conventional wisdom has led to the following awful tragedies..." or "Theory A is better than Theory B because..." or "Theory A and Theory B have their advantages and disadvantages, but when taken together...BRILLIANCE!"

Your argument is then supported by academic and real life sources (so, even if you're questioning the status quo, you rely on studies and people who are part of that status quo).

And then conclusion...

"My brilliant argument/thesis has academic, social, and economic implications..."

Then BS those academic (more studies to fund!), social (no more racism!) and economic (more jobs, more money!) implications.

That is critical thinking.