Envirothon Test Writing Recommendations

Description of the Envirothon Testing Experience: Good teamwork, cooperative decision making, a free exchange of ideas, and information pooling are important. All test stations are staffed by a monitor. Each of the five station tests will be 40 minutes in length.

One of the outstanding aspects of the Envirothon competition is its emphasis on hands-on, problem-solving activities. In addition, we want to encourage questions that involve site assessment. The intent is to make the most of the testing sites while actively involving participants.

Missouri Envirothon Test Writing Guidelines:

Each of you needs 40 points worth of questions for your primary site, and 15 points worth of questions for the other sites.

All of the questions you write will be within your area of expertise.

For example, the forester will write 40 points of forestry questions for the primary Forestry site, plus 15 points of forestry questions for each of the Current Issue, Aquatics, Wildlife, and Soils sites. This ensures there is an opportunity for forestry questions in a variety of landscape settings. You may tie the forestry question to the issue at each station (for example, a forestry question based on the soil survey at the Soils site, since soil surveys will most likely be available there), but it is not required.

After the first draft of questions is submitted, any questions that seem to be duplicates will be addressed.

Each testing station will present numerous opportunities for teams to be active. Reading maps, interpreting charts and graphs, using mathematical formulas, using keys, and locating information in resource manuals are all question activities that demand critical thinking from the team. Engaging teams physically and challenging them mentally using some of the suggestions given here will significantly improve the quality of station tests.

Bloom’s Taxonomy

Bloom’s taxonomy is a system of categorizing thought processes into six levels as follows:

  1. Knowledge – Knowledge-level questions do not test in-depth understanding and should be used sparingly when constructing Envirothon test questions. Tested information may include factual data, definitions, or observations. (words used include when, where, who, what, and define)
  2. Comprehension – These questions ask students to consider factual information they have learned and interpret it. Students are required to make comparisons and interpret graphs, tables, charts, and even cartoons. (words used include compare, contrast, describe, show, and explain)
  3. Application – Application-level questions require students to give solutions to problems. (words used include solve, which, use, classify, choose, how much, and what is)
  4. Analysis – These higher-level questions test how deep a student's understanding is. A student must show understanding of the parts of an entire concept. (words used include analyze, support, provide evidence, identify reasons, why, and provide conclusions)
  5. Synthesis – These questions have more than one correct answer or perspective. Students are required to analyze information and give explanations. The Oral Problem Competition is an example of a synthesis problem in which teams are required to comprehend concepts, apply solutions, and analyze information. (words used include write, predict, develop, design, synthesize, produce, solve, devise, and construct)
  6. Evaluation – These questions do not have one correct answer. They ask students to make judgments on ideas, solutions, methods, or even products. Answers that provide reasons for the evaluation demonstrate knowledge and understanding of the topic, requiring the use of all the previous levels of thought. (words used include assess, decide, judge, argue, what is your opinion, appraise, do you agree or disagree, and give an evaluation)

Suggested Guidelines for % Category Breakdown for Station Tests

Questions written for station tests can fall into many categories, including current resource issues, technical skills, and management planning. One way to break down a station test so that it tests student knowledge efficiently across all categories and topics is charted below.

Category – % of Questions on Each Station Test

Terminology – 5%
Identification – 10%
Equipment/Career Information – 5%
BMP and Management Planning – 30%
Problem Solving and Technical Skills – 50%
Site Specific Questions – 30%
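As a quick sanity check, the percentage breakdown above can be converted into point values for a 40-point primary-site test. This is a hypothetical sketch: it assumes the percentages apply to the 40-point total, and that the site-specific category (30%) overlaps the other five, which themselves sum to 100%.

```python
# Hypothetical sketch: convert the suggested category percentages into
# point values for a 40-point primary-site station test.
TOTAL_POINTS = 40

breakdown = {
    "Terminology": 5,
    "Identification": 10,
    "Equipment/Career Information": 5,
    "BMP and Management Planning": 30,
    "Problem Solving and Technical Skills": 50,
}

# The five categories above sum to 100%; "Site Specific Questions" (30%)
# appears to be an overlapping attribute of questions rather than a
# sixth slice of the total.
points = {cat: pct * TOTAL_POINTS / 100 for cat, pct in breakdown.items()}

for cat, pts in points.items():
    print(f"{cat}: {pts:g} points")
```

So, for example, a 40-point station test would carry about 2 points of terminology and 20 points of problem solving and technical skills.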

 

Some common problems that occur in test development:

  • Test does not match the learning objectives.
    (example: Use your knowledge of birds to identify the following frog calls.)
  • Test is more like "Trivial Pursuit" than real understanding.
    (example: What toothpaste is recommended by most foresters?)
    The goal is to find those students who have skill in critical thinking. Try to make questions short-answer, site-specific, case-study, and/or problem-solving. Do not use true/false questions. Multiple choice, matching, or fill-in-the-blank questions can be used, but make sure they are based on problem solving, the application of a skill or knowledge, or the understanding of a concept, not just rote memorization.
     

  • Questions are confusing
    (example: If a and b and also sometimes c when d is present, what is the result?)
    Use common sense and have someone check and test the questions to make sure they are clear, fair, and appropriate.
     

  • Lack of hands-on/problem solving
    (example: Welcome to the Missouri Envirothon in the beautiful rural hills. Everything you need is found on your paper…from time to time enjoy the view.)
     
  • Unequal access to materials
    (example: Use this key to identify 100 rare plants. We only have one key so share it with the other teams.)
    Make sure you have ample equipment for each team and that students have equal access to resources, equipment, or hands-on stations. The easiest approach is to have one of each item per team, or to have teams rotate between stations with an equal amount of time at each. Unequal access is the most common cause of grievances at the national Envirothon.
     
  • Questions with an "agenda" (usually unintentional)
    (Which is the best approach to the problem?: The one proposed by the rabid environmentalist or the one proposed by the greedy business conglomerate?)
Multiple Choice Questions: Suggested Guidelines

    On the negative side, multiple-choice questions are difficult and time-consuming to construct well, especially when assessing higher levels of thinking, and they do not evaluate how well students are able to communicate their understanding. Considering the number of tests that must be graded in a short period of time, however, multiple-choice questions can be scored much more quickly than other formats.

    1. The stem should not be written in the form of an unfinished sentence. It should be meaningful by itself and ask a question (who, what, where, when, why, how, which) or present a problem.
    2. Avoid using negative questions or statements in the stem or response, as they tend to be ambiguous and confusing.
    3. Do not give grammatical clues to the correct answer. Using the article "a" or "an" at the end of a stem indicates whether the answer starts with a vowel or consonant.
    4. Write stems that have only one correct answer, but make the distracters plausible.

      - write the correct response first, then generate 3-4 reasonable alternatives

      - write alternative responses of roughly equal length and parallel construction

      - arrange the alternative responses in alphabetical order to avoid establishing a pattern
       

    5. Use the responses "all of the above" or "none of the above" sparingly or not at all.
    6. Place the entire item (stem and alternative responses) on the same page. Use upper case letters before each of the responses.
    7. Make a deliberate effort to stress comprehension, application, analysis, synthesis, and evaluation when you write questions. Guard against writing too many knowledge-level questions.
Matching Questions: Suggested Guidelines

      In general, matching items consist of a column of stimuli presented on the left side of the exam page and a column of responses placed on the right side of the page. Students are required to match the response associated with a given stimulus. However, it is difficult to write reliable matching items, and this type of question can be subject to guessing.

      1. Include directions, which clearly state the kind of relationship you are testing, and the basis for matching the stimuli with the responses. Explain whether or not a response can be used more than once and indicate where to write the answer.
      2. Use only homogenous material in matching items.
      3. Arrange the list of responses in some systematic order if possible (e.g., chronological, alphabetical).
      4. Avoid grammatical or other clues to the correct response.

      Other Tips:
      - Keep matching items brief, limiting the list of stimuli to fewer than 10.
      - Include approximately 2 or 3 more responses than stimuli, which makes it difficult to mark correct matches through the process of elimination.

Fill In the Blank/Short Answer Questions: Suggested Guidelines

      Fill-in-the-blank/short-answer questions can minimize guessing as compared to multiple choice or matching, but they can be ambiguous and difficult to construct so that the desired response is clearly indicated. They can also be difficult to score if the question allows more than one correct answer.

      1. For fill in the blank, omit only significant words from the statement.
      2. Do not omit so many words from the statement that the intended meaning is lost.
      3. Avoid grammatical or other clues to the correct response, such as: a, an, he, or she.
      4. Be sure to list all possible correct answers in the answer rubric.
      5. To minimize answer clues for fill in the blank, make the blanks of equal length.
      6. When possible, place the blank at the end of the statement, after the student has been given a clearly defined problem.
        (example: Instead of the question, "A ___________ is a species that has significant influence on many other species of animals.", a better choice would be: "What type of species has a significant influence on many other species of animals? ________________")
      7. Avoid lifting text from study materials or other resources to avoid memorized answers.
      8. Limit the desired response to a single word or phrase.

Scoring Tests

      Active questioning will most likely take more time to grade, so you may want to have a number of people assist you in scoring. Each question needs to be scored by the same judge on every test. If you have 4 people assisting in grading, one can score questions 1-3 for each team while another scores questions 4-6 for each team, and so on. This ensures consistency in scoring.
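The same-judge-per-question rule can be sketched as assigning contiguous blocks of question numbers to graders. This is a hypothetical illustration; the function name, grader labels, and question counts are invented for the example.

```python
def assign_questions(num_questions, graders):
    """Split question numbers into contiguous blocks, one per grader,
    so the same grader scores a given question on every team's test."""
    per_grader = -(-num_questions // len(graders))  # ceiling division
    assignments = {}
    for i, grader in enumerate(graders):
        start = i * per_grader + 1
        end = min(start + per_grader - 1, num_questions)
        if start <= num_questions:
            assignments[grader] = list(range(start, end + 1))
    return assignments

# With 12 questions and 4 assisting graders, each grader scores the
# same block of 3 questions on every team's test.
print(assign_questions(12, ["Grader 1", "Grader 2", "Grader 3", "Grader 4"]))
```

With 12 questions and 4 graders, the first grader scores questions 1-3 on every test, the second scores 4-6, and so on, matching the division described above.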

      Develop objective criteria for judging answers. This could be a range of acceptable answers (however, you must also allow for novel answers that fit the problem presented) or you can use specific criteria for judging each aspect of the answer.

      Example:

      Question: Explain why the Red Cedar River may recharge the area’s aquifers at the highest water level or drain area aquifers at the lowest water levels. (4 points)

      Answer: A seasonal water budget simply shows that water flows from higher elevations to lower ones. During high river levels, the top of the aquifer is lower than the river and water goes into the area aquifer. During the lowest river levels, the top of the aquifer is above the river and the aquifer flows into the river, making the river both a recharge and a discharge area.

      (Rubric: Four points total: 1 point each for a) at high river levels, water goes into the aquifer, and b) at low river levels, the aquifer flows into the river; 2 points for c) designating the river as both a recharge and discharge area.)

      Example:

      Question: Is this site suitable habitat for the Red Wolf? (3 points)

      Answer: This site is not suitable habitat for the Red Wolf because its major food sources cannot survive here.

      (Rubric: Three points total: 2 points for a) determining whether the site is suitable Red Wolf habitat, and 1 point for b) the explanation why.)
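A rubric of this kind amounts to a checklist of criteria with point values. As a minimal sketch (the function and criterion labels here are hypothetical, paraphrasing the Red Cedar River rubric above), a grader's marks can be totaled like this:

```python
def score(rubric, criteria_met):
    """Total the points for the rubric criteria a grader marked as met."""
    return sum(points for criterion, points in rubric if criterion in criteria_met)

# Rubric for the Red Cedar River question above (4 points total).
red_cedar_rubric = [
    ("high river level: water flows into the aquifer", 1),
    ("low river level: aquifer flows into the river", 1),
    ("river is both a recharge and a discharge area", 2),
]

# A team that covered the first and third criteria earns 3 of 4 points.
print(score(red_cedar_rubric, {
    "high river level: water flows into the aquifer",
    "river is both a recharge and a discharge area",
}))
```

Writing the rubric down as explicit criterion/point pairs before grading begins also makes it easier for multiple judges to score consistently.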