Now to look at some applications for school . . .
One immediate thought is to employ a scaffolded approach in which students learn to apply new skills in a low-stakes environment that encourages them to raise doubts and ask questions. Instead of asking students to immediately apply those skills to an assignment that requires a finished (and graded) product, we should let them practice data evaluation as a discrete skill. An analogy would be a student of tennis who practices a hundred serves a day: she is not applying this skill to win a match (a high-stakes endeavor), so the practice session is a low-stakes opportunity to explore styles and tinker with her mechanics until she is comfortable with her ability to make good serves. And just as a coach would work with that tennis player, a teacher could provide the equivalent guidance for a student developing data evaluation skills.
As for particular skills, a primary one would be the ability to clearly state the goal of the research by defining, in specific terms, what the student is trying to prove or disprove. A goal like ‘finding out more about the Civil War’ is too vague and general to provide much guidance for most students. Instead, students should be able to articulate a series of goals like ‘Where were the majority of Civil War battles fought?’, ‘Which Union generals were most effective?’, or ‘What hardships did people living in the Confederate states face?’ Students should engage in a form of backwards design wherein they gain a clear understanding of what their finished product should look like before they begin to gather data.
After clarifying particular research goals and sub-goals, students should embark on a data-gathering journey. This journey may start with brainstorming potential sources of data, and should be marked by numerous roadmap meetings along the way with peers or teachers, at least until students have gained fluency with the process. When a piece of data is found, its content should be studied in light of the goals of the project. If the data does not contain information relevant to the project goals, it can be discarded. If it is relevant to the goals, it becomes part of the data that the student must then evaluate for quality.
To repeat, data should first be examined for relevance; if it passes the relevance test, it then gets evaluated for quality, which will be the subject of the next entry.
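As a rough, purely illustrative sketch of that two-step process, the snippet below (in Python) shows relevance being checked first, with only the surviving items passed along for quality evaluation. The goal keywords, sample data, and helper functions are hypothetical placeholders, not part of any actual classroom exercise.

```python
# A minimal sketch of the relevance-first filter described above.
# Only data that passes the relevance test moves on to quality review.

def is_relevant(item, keywords):
    """Crude relevance check: does the item mention any goal keyword?"""
    notes = item["notes"].lower()
    return any(keyword.lower() in notes for keyword in keywords)

def keep_for_quality_review(found_data, keywords):
    """Discard irrelevant items; the rest still need a quality evaluation."""
    return [item for item in found_data if is_relevant(item, keywords)]

# Hypothetical example: a goal about where Civil War battles were fought.
goal_keywords = ["battle", "battlefield"]
found_data = [
    {"source": "battle atlas", "notes": "Maps showing where major battles were fought"},
    {"source": "biography", "notes": "Lincoln's early life in Kentucky"},
]
print(keep_for_quality_review(found_data, goal_keywords))  # keeps only the atlas entry
```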
Wednesday, September 24, 2008
Sources with Scientists, Part 4
School thoughts on this topic . . .
The type of research that Katie and Bruce perform is very much about knowledge construction, where they build meaning out of a situation that does not have a predetermined end result. One thing I will be looking to document is whether other fields require this type of research as well.
I contrast the knowledge construction model with many of the research projects I recognize from years of working in schools, primarily at the middle school level. School research projects often take the form of a scavenger hunt, where students gather and organize clues from various research sources to complete an assignment that meets an expected set of outcomes.
For example, a research paper on Abraham Lincoln is assessed on how well a student covers certain bits of data: Lincoln’s early life, his political career, the events of the Civil War, his assassination, and so on. There are certain expected elements of the finished product.
So we are looking at two types of research projects that are quite different in their approaches. One primarily requires convergent research skills while the other builds in a divergent way; one reports facts while the other constructs a story; one is guided discovery, the other self-discovery.
Perhaps an early step in planning research reports in school will be to determine the goals of the project and decide whether those goals are best met through the knowledge construction model or the scavenger hunt model.
More to come . . .
Labels: evaluating sources, literacy skills, new literacies, research
Thursday, September 18, 2008
Sources with Scientists, Part 3
The previous post on evaluating sources from a scientist’s perspective dealt with the sources themselves . . . now we’ll look at the skills of the scientist.
In order to effectively evaluate the worthiness of data, the scientist must be able to:
• Understand the limits of data
• Clarify the purpose of collecting the data
• Build a framework for understanding what’s needed
• Employ the process of ‘brain-dropping’ (finding and setting aside nuggets of data for later consolidation)
• Create flexible outlines
• Evaluate visual data
• Recognize bias
• Deconstruct sources
Over time, we will examine and process these skills, especially as they relate to the classroom. We will also be looking for consistency of skills in the data gathered from researchers in other fields.
Monday, September 15, 2008
Sources with Scientists, Part 2
We covered a number of areas in our discussion, one of which was what to consider when assessing data. In other words, what do we want to know about the data before deciding whether it is worth using?
Here are some of the considerations:
Age of the data
• What has changed since the data was published?
• Is there newer data on the same topic?
Qualifications of whoever supplied the data
• Background
• Reputation
• Potential bias
• Credibility
Closeness of the source material to the raw data
• How is the source using the raw data?
• Is the data research-based and cited?
• Can the data be confirmed?
Clarity of the data
Volume and specificity of the information
Depth and sophistication of the data
In a school setting, perhaps a prerequisite research skill would be assessing data sources before having to use them. One approach would be to create a rubric that allows students to ‘grade’ a data source, as a way of developing a more critical eye toward the data they collect rather than assuming that all sources are of equal value.
A further thought is that this evaluative process may be done through a ‘low-stakes’ exercise, to help kids gain comfort with the process without feeling that asking questions or making mistakes will result in penalties.
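To make the rubric idea a bit more concrete, here is a minimal sketch of what a source-grading rubric might look like if written out in Python. The 0–3 scale, the simple averaging, and the sample ratings are hypothetical choices for illustration; the criteria simply restate the considerations listed above.

```python
# A hypothetical source-evaluation rubric based on the considerations above.
# Students rate a source from 0 (poor) to 3 (strong) on each criterion.

CRITERIA = [
    "age of the data / availability of newer data",
    "qualifications of the supplier (background, reputation, bias, credibility)",
    "closeness to the raw data (research-based, cited, confirmable)",
    "clarity of the data",
    "volume and specificity of the information",
    "depth and sophistication of the data",
]

def grade_source(ratings):
    """Average the 0-3 ratings into a single grade for the source."""
    if len(ratings) != len(CRITERIA):
        raise ValueError("Expected one rating per criterion")
    return sum(ratings) / len(CRITERIA)

# Hypothetical example: a student grades an encyclopedia article.
ratings = [2, 3, 1, 3, 2, 2]
print(f"Source grade: {grade_source(ratings):.1f} / 3")
```

A grade like this would not replace the student’s judgment; it simply forces a pass over each criterion before the source is trusted.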
Still more to come . . .
Friday, September 12, 2008
Sources with Scientists, Part 1
Returning to the project of evaluating data sources, I had a fruitful interview yesterday with two scientists, Katie & Bruce. We discussed the processes involved in their work, as well as applications to school. To reiterate, my objective is to explore how to teach students the process of evaluating data sources, since so much data is available to them.
One of the areas that arose from our discussion is determining the purpose of a data collection exercise. Two emerging thoughts: sometimes data collection is meant to meet discrete goals, and other times it is meant to construct understanding.
In the first case, we have an assignment akin to a scavenger hunt, where we want students to find particular bits of information that satisfy a standard set of knowledge goals. An example would be asking students to file a report on the Battle of Gettysburg that includes the setting, the duration of the battle, the number of casualties, the immediate results of the battle, and so forth: a fact-driven exercise.
In the case of constructing understanding, the goal is less definite and more open to interpretation, and there are multiple conclusions that may be reached. An example would be a research report in which a student must support or refute the idea that Gettysburg was the turning point of the Civil War.
Without a doubt, it is possible, and even common, for school research projects to incorporate both kinds of thinking within a single report. But by recognizing two distinct processes, we can better start to prepare students for two types of data-collection even before we begin the process of evaluating sources.
More to come . . .
Friday, September 5, 2008
Evaluating Sources
Here is a further idea for studying how to equip students with the new literacy skills necessary for a global world: I will be speaking with people in the working world who regularly need to evaluate data sources, in order to study the thought processes they engage in.
For example, a journalist receives many bits of data while preparing a story but must discard some because they don’t meet a professional standard or are not relevant to the story. I’m less concerned with the procedures of source evaluation than with the underlying thought processes and the corresponding literacy skills that are engaged.
By interacting with people in a number of diverse fields, I hope to be able to use the backwards design process to develop a program of literacy enrichment for today’s students. I have my first research partner on board, and am working on lining up several more.
More as this develops . . .