2012). The guidance suggests using formal ranking methods such as the International Union for Conservation of Nature (IUCN) Red List Categories and Criteria (IUCN 2001, 2010), the Florida Fish and Wildlife Conservation Commission's taxa ranking system (Millsap et al. 1990), and the NatureServe conservation status evaluation tool (NatureServe 2012a, Master et al. 2012, Faber-Langendoen et al. 2012). Benefits of using more uniform methods include consistency of information and the ability to share data across organizations (Salafsky et al. 2008).

The 2015 SWAP Revision Technical Team formed a Ranking Criteria Work Group (Work Group) to review and evaluate ranking metrics and prioritization tools. The Work Group was composed of biologists from the WRC who were tasked with developing recommendations for a method to identify SGCN and to prioritize conservation efforts on behalf of species. In addition to reviewing the evaluation methods recommended by AFWA-TWW (noted above), the Work Group also considered methods described by the Convention on International Trade in Endangered Species (CITES 2011), the American Fisheries Society (Deacon et al. 1979, Jelks et al. 2008), the Partners in Flight Species Assessment Process (Beissinger et al. 2000), and assessments of various categorization systems conducted by de Grammont and Cuarón (2006) and Arponen (2012).

Based on the results of their review and assessment, the Work Group members determined that adopting and modifying selected ranking criteria and scoring metrics described by IUCN, Millsap et al. (1990), and NatureServe, combined with original criteria and metrics to capture knowledge gaps and management concerns, would best meet North Carolina's WAP goals for identifying SGCN and prioritizing conservation efforts. The Work Group also adopted the 10-point scoring system described in Millsap et al. (1990) because its application is similar to the ranking criteria proposed in this white paper, and because a statistical analysis of their results conducted by Millsap et al. (1990) indicated that the metrics and scoring system were robust and that selection bias was minimal.

Members of the Work Group coordinated with biologists at the NC Natural Heritage Program (NCNHP) to determine whether any information used in the NatureServe evaluation tool would be compatible with the proposed ranking criteria. It was determined that this information is not uniformly available across all taxa groups or for species that are not tracked for reporting to NatureServe; however, the NCNHP will provide data for those species that are tracked in its database system. The NCNHP requested that the metrics be designed so that ranking criteria data can augment the information used in designating state-level rankings as reported by NatureServe. As a result of these coordination efforts, the Work Group adopted answer scales that utilize the NatureServe evaluation tool for several metrics that address conservation concerns (NatureServe 2012a).

Other coordination efforts included a request to faculty and staff of the North Carolina Cooperative Fish & Wildlife Research Unit and staff of the Biodiversity and Spatial Information Center at NC State University (NCSU) to review the draft ranking criteria metrics. The request asked for comments on whether statistical analysis would be needed to reduce bias in the evaluation process. Their recommendations include